• ragebutt@lemmy.dbzer0.com · 2 days ago

    AR glasses really don’t make sense until there’s some kind of non-surgical neural interface. If I can control them with my thoughts? Awesome. If I have to walk around the city talking to myself like a crazy person? It’s the introduction of the Bluetooth earpiece all over again.

    But then, I absolutely do not want to give Apple, Google, Meta, etc. access to my brain.

    • Iron Sight OS@lemmy.worldM · 51 minutes ago

      I’m working on a way to control them without a brain interface, and without the clunky methods used today (gestures, peripherals, tapping the side of the glasses, etc.). Wish me well!

    • StitchIsABitch@lemmy.world · 2 days ago

      I don’t think that’s quite true. Of course it would make things easier, but there are loads of applications where a smartphone “controller” would work.

      Like setting a route in Google Maps and having navigation steps shown on your glasses. Doing hands-free video calls. Live translation of a conversation. Or simply notification popups.

      As long as it’s an ongoing process, you simply take out your phone, start the app/program, and voilà! Until we develop a neural interface, the glasses would just be more of a display.

    • jjagaimo@sh.itjust.works · 2 days ago

      From other companies I’ve seen touch bars on the temples and a Bluetooth ring that can be scrolled/tapped.