• ragebutt@lemmy.dbzer0.com · 3 days ago

    AR glasses really don’t make sense until there’s a way to develop some kind of non-surgical neural interface. If I can control them with my thoughts? Awesome. If I have to walk around the city talking to myself like a crazy person? It’s the introduction of the Bluetooth earpiece all over again.

    But then it’s like, I absolutely do not want to give Apple, Google, Meta, etc. access to my brain.

    • Iron Sight OS@lemmy.world (M) · 1 day ago
      I’m working on a way to control them without a brain interface and also without the clunky methods in use today (gestures, peripherals, tapping the side of the glasses, etc.). Wish me well!

      • ragebutt@lemmy.dbzer0.com · 1 day ago

        I wish you super well. I would love it if you could share more, as I’m extremely interested in cog-sci and UX research stuff, but if you can’t or are uncomfortable sharing, I totally understand.

        A shot-in-the-dark guess based on an idea I had a while back when thinking about this: eye tracking?? I did a study on autism a few years ago where we used glasses that could do eye tracking. I didn’t realize you could do it so unobtrusively, and with the tech a bit more matured I could see it being a viable option, if you could software-magic your way around general eye movements and blinking and shit.
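
        To give a sense of what that “software magic” might involve, here’s a minimal Python sketch of dwell-based gaze selection: blinks and saccades are filtered out, and only a deliberate fixation fires a “click”. All thresholds and names are invented for illustration, not taken from any real tracker:

        ```python
        import math

        # Illustrative thresholds, invented for this sketch.
        DWELL_SECONDS = 0.6  # gaze must settle this long to count as a "click"
        DWELL_RADIUS = 1.5   # degrees of wobble allowed within one fixation
        BLINK_CONF = 0.2     # tracker confidence below this is treated as a blink

        class DwellSelector:
            """Turns a noisy gaze stream into discrete selections.

            Feed it (timestamp_s, x_deg, y_deg, confidence) samples; it
            returns the fixated point once the gaze has dwelled long
            enough, else None.
            """

            def __init__(self):
                self.anchor = None       # (x, y) where the fixation started
                self.dwell_start = None  # timestamp when it started

            def feed(self, t, x, y, conf):
                if conf < BLINK_CONF:
                    return None  # blink/track loss: skip, keep the dwell alive
                if (self.anchor is None
                        or math.hypot(x - self.anchor[0],
                                      y - self.anchor[1]) > DWELL_RADIUS):
                    # Gaze jumped (saccade): start timing a new fixation here.
                    self.anchor, self.dwell_start = (x, y), t
                    return None
                if t - self.dwell_start >= DWELL_SECONDS:
                    target, self.anchor = self.anchor, None  # fire, then re-arm
                    return target
                return None
        ```

        Fed samples at, say, 60 Hz, this emits one selection per deliberate fixation while ordinary scanning and blinking fall through.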

        Again though totally get it if you do not want to discuss!!

        • lud@lemm.ee · 1 day ago

          Doesn’t the Vision Pro use eye tracking extensively? IIRC you can type on the on-screen keyboard by just looking at the right key and doing something with your fingers.

          • ragebutt@lemmy.dbzer0.com · 1 day ago

            It does, and so do some VR headsets. I’ve never used a Vision Pro bc I don’t know anyone irl who is a total goober. I have used a PlayStation VR2, and its eye tracking is lackluster, but in general the PSVR2 seems like Sony shit it out and just kind of gave up. When you configure the eye tracking it’s clearly fairly capable, but in games it doesn’t do much of anything.

            I do feel like Apple would probably have given it a bit more polish, although I also feel like it probably wouldn’t work as smoothly as they describe. Like, you look at “s” and continually type “a” or “z”, and the finger gesture keeps not gesturing. I did try a new Apple Watch with the feature where you could tap your index finger and thumb to confirm dialogs, and it worked maybe 60% of the time. When it worked it worked really well, even with the gesture being small and covert; when it didn’t, even a super dramatic, big gesture would just not register.
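
            That “look at s, type a” failure is basically a precision problem, and one plausible mitigation (a rough sketch, not necessarily what Apple does) is hysteresis: snap the gaze to the nearest key, but only switch keys once the gaze has moved decisively onto a neighbor. The toy key layout and switch factor below are made up:

            ```python
            KEYS = {"a": (0.0, 0.0), "s": (1.0, 0.0), "z": (0.2, 1.0)}  # toy layout
            SWITCH_FACTOR = 1.4  # a new key must be this much closer to win

            def snap_with_hysteresis(gaze_xy, current_key):
                """Pick the key the gaze selects, with sticky behavior.

                Plain nearest-key snapping flickers between neighbors when
                the gaze sits near a boundary; requiring the new key to be
                clearly closer than the current one keeps the selection
                stable under jitter.
                """
                gx, gy = gaze_xy

                def dist(key):
                    kx, ky = KEYS[key]
                    return ((gx - kx) ** 2 + (gy - ky) ** 2) ** 0.5

                nearest = min(KEYS, key=dist)
                if current_key not in KEYS:
                    return nearest  # nothing selected yet
                if dist(current_key) > SWITCH_FACTOR * dist(nearest):
                    return nearest  # gaze moved decisively onto a new key
                return current_key  # small jitter: stay put
            ```

            A confirm gesture (the pinch) then only has to commit whichever key is currently stuck, instead of racing the gaze.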

            But either way, the issue for glasses is fitting the eye-tracking hardware, the associated CPU power, and the battery to run it all into a somewhat reasonable glasses frame. A big goofy VR helmet has room to spare.

            For reference, these weren’t what we used in the study (what we had were far more hacky-looking; I don’t think they were a commercial product), but they were similar-ish:

            https://www.tobii.com/products/eye-trackers/wearables/tobii-pro-glasses-3

    • StitchIsABitch@lemmy.world · 3 days ago

      I don’t think that’s quite true. Of course it would make things easier, but there are loads of applications where a smartphone “controller” would work.

      Like setting a route on Google Maps and having navigation steps shown on your glasses. Doing hands-free video calls. Live translation of a conversation. Or simply notification popups.

      As long as it’s an ongoing process, you simply take out your phone, start the app/program, and voilà! The glasses would just be more of a display until we develop a neural interface.
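
      To make that split concrete, here’s a toy Python sketch of the phone side, assuming a hypothetical glasses display that accepts one JSON message per line over a local socket; the address, port, and message format are all invented for illustration:

      ```python
      import json
      import socket

      # Hypothetical endpoint exposed by the glasses, invented for illustration.
      GLASSES_ADDR = ("192.168.1.50", 9900)

      def push_to_glasses(kind, text):
          """Send one display update from the phone app to the glasses.

          The glasses act as a dumb display: they render whatever the
          phone sends and run no app logic of their own.
          """
          msg = json.dumps({"kind": kind, "text": text}) + "\n"
          with socket.create_connection(GLASSES_ADDR, timeout=2) as sock:
              sock.sendall(msg.encode("utf-8"))

      # A navigation app would call this on each route update, e.g.:
      # push_to_glasses("nav", "In 200 m, turn left onto Main St")
      ```

      All the app logic, connectivity, and heavy compute stay on the phone, which is why this class of use case doesn’t need a neural interface.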

    • jjagaimo@sh.itjust.works · 3 days ago

      From other companies I’ve seen touch bars on the temples and a Bluetooth ring that can be scrolled/tapped.