
FOVE VR - VR Headset with Eye Tracking - Kickstarter / Pre-Order

Discussion in 'AR/VR (XR) Discussion' started by jashan, May 20, 2015.

  1. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    In my opinion, eye tracking directly integrated into headsets will be fairly awesome for quite a few reasons:
    • Multiplayer - have the avatar's eyes actually follow what the player is looking at.
    • Control - look at a button, snap your finger and it will be clicked (see the sketch after this list).
    • Rendering optimizations - render the area the user is looking at in full resolution while keeping everything outside the center of their gaze low-res (if that's fast enough, people won't notice).
    Of course, the rendering optimizations in particular need engine support, probably even GPU support, so that's something that will take a little while to actually work.
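
    Not FOVE's actual SDK, obviously - but just to make the "look at a button, snap your finger" idea concrete, here's a minimal Unity sketch. It assumes some eye-tracking plugin gives you a world-space gaze ray (the GetGazeRay() method below is just a placeholder that uses the head's forward direction), raycasts it against colliders, and forwards a click to whatever the user is looking at when a confirm input fires:

    Code (CSharp):
        using UnityEngine;
        using UnityEngine.EventSystems;

        // Hypothetical sketch: "look at a button, press confirm, and it gets clicked".
        // GetGazeRay() stands in for whatever the eye-tracking SDK actually exposes.
        public class GazeClicker : MonoBehaviour
        {
            public Camera hmdCamera;            // the VR camera
            public float maxGazeDistance = 10f;

            void Update()
            {
                Ray gazeRay = GetGazeRay();

                RaycastHit hit;
                // The target needs a collider for the physics raycast to see it.
                if (Physics.Raycast(gazeRay, out hit, maxGazeDistance))
                {
                    // Fire on a separate "confirm" input (finger snap, controller button, ...).
                    if (Input.GetButtonDown("Submit"))
                    {
                        // Forward a click to anything on the hit object that handles pointer clicks.
                        ExecuteEvents.Execute(hit.collider.gameObject,
                            new PointerEventData(EventSystem.current),
                            ExecuteEvents.pointerClickHandler);
                    }
                }
            }

            // Placeholder: without an eye tracker, just use the head's forward direction.
            // A real integration would return the tracked gaze direction here.
            Ray GetGazeRay()
            {
                return new Ray(hmdCamera.transform.position, hmdCamera.transform.forward);
            }
        }

    A real integration would replace GetGazeRay() with the headset SDK's gaze data and probably use a gaze cursor plus dwell time instead of a hard button press.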

    There's currently a Kickstarter where you can support and pre-order a headset that has built-in eye tracking. My understanding is that the company behind it has been doing eye tracking for quite some time, so that's their area of expertise. IIRC, they had a "demo" where a guy who couldn't move his hands played the piano with his eyes - and that worked surprisingly well.

    Delivery is scheduled for May 2016 - so that's still quite a while from now (and I don't know how realistic that is). What I'd hope is that Unity will support this technology as much as possible, as early as possible.

    Here's the link to the Kickstarter:

    FOVE:The World's First Eye Tracking Virtual Reality Headset

    The Earliest Bird tier is already gone; Early Bird still has 322 of 400 left. Right now they're at $169,236
    of $250,000, with 44 days to go.
     
    ZiadJ likes this.
  2. Gruguir

    Gruguir

    Joined:
    Nov 30, 2010
    Posts:
    340
    Eye tracking should really be considered by the other competitors. It opens up a new world of possibilities for control input and game mechanics - more important to me than potential rendering optimizations.
     
    jashan likes this.
  3. Veggie

    Veggie

    Joined:
    Apr 3, 2013
    Posts:
    5
    Eye tracking should be linked into the camera focal point - that is, if you look left, the scene camera should match the change of view angle. Not having this is the basis of motion sickness. Current systems have the viewer's eyes locked forward. This unnatural, fixed-eyeball restriction causes a lot more head movement - and therefore inner-ear/balance movement - than is natural; the result is motion sickness.
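
    For what it's worth, one way to read "linked into the camera focal point" is driving the focus distance of a depth-of-field effect from wherever the user is looking (the view direction itself is already driven by head tracking). Here's a rough sketch of that reading, again with a placeholder gaze ray; the class only computes a focus distance, which you'd feed into whatever depth-of-field effect your project uses:

    Code (CSharp):
        using UnityEngine;

        // Hypothetical sketch: derive a focus distance from the user's gaze.
        // GetGazeRay() is a placeholder for whatever the eye-tracking SDK provides.
        public class GazeFocus : MonoBehaviour
        {
            public Camera hmdCamera;
            public float defaultFocusDistance = 5f;   // used when the gaze hits nothing
            public float smoothing = 8f;              // higher = snappier refocus

            // Feed this into your depth-of-field effect every frame.
            public float FocusDistance { get; private set; }

            void Update()
            {
                Ray gazeRay = GetGazeRay();

                RaycastHit hit;
                float target = defaultFocusDistance;
                if (Physics.Raycast(gazeRay, out hit, 100f))
                {
                    target = hit.distance;
                }

                // Smooth the change so small eye jitters don't make the focus "pump".
                FocusDistance = Mathf.Lerp(FocusDistance, target, Time.deltaTime * smoothing);
            }

            // Placeholder: without an eye tracker, fall back to the head's forward direction.
            Ray GetGazeRay()
            {
                return new Ray(hmdCamera.transform.position, hmdCamera.transform.forward);
            }
        }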
     
    Last edited: Jun 4, 2015
  4. hardcoded2

    hardcoded2

    Joined:
    Nov 28, 2012
    Posts:
    6
    The FOVE demo shooting spaceships "feels" really right.

    Eye tracking as a way to augment existing control schemes seems like the right way to handle navigating 3D space, since almost everything else feels a little wrong. The shooter demo was the first time, other than with the Vive, that it felt like I wasn't struggling against the input method.
     
  5. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307