
Vive Pro eye - Unity retina tracking Examples?

Discussion in 'AR/VR (XR) Discussion' started by trappist-1, Jul 8, 2019.

  1. trappist-1


    Mar 26, 2017
    Saved up some dough and am ready to purchase a Vive Pro eye (for experimentation).

    After searching for "vive pro eye unity example" not much information came up.

    Could someone please help me find any examples (web sites) that show how to incorporate eye tracking into Unity + SteamVR?

    I was wondering if it's included as part of the SteamVR 2.0 asset?
    How do I read in the retina position data (for individual eyes)?
    How do I read in blink data (for individual eyes)?

    Please help me understand how difficult this could be. I need to feel confident about dropping $1,500.00!
    Last edited: Jul 8, 2019
  2. MariusRu


    Oct 15, 2015
    Look for "VIVE SRanipal SDK"
    trappist-1 likes this.
  3. MariusRu


    Oct 15, 2015
    I just tried it myself, so let me add some more detail. You need to
    - go to
    - download and install VIVE_SRanipalInstaller_1.0.3.0.msi
    - start a new Unity project and import SteamVR
    - download and unzip the SDK from the website, and import Vive-SRanipal-Unity-Plugin from the folder as a custom package into your Unity project

    From what I've seen, the eye-tracking quality is similar to our old research-grade product, which we bought for 20k two years ago. I'm quite impressed.
    If you intend to process the data further, e.g. with regard to regions of interest, check out my repo on GitHub:
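    To answer the gaze and blink questions above: here is a minimal polling sketch, assuming the `SRanipal_Eye` API from the v1 Unity plugin (`GetGazeRay` and `GetEyeOpenness` — verify the exact signatures against the plugin version you download, as they may differ). It requires the Unity runtime and the SRanipal framework running, so treat it as a starting point, not a drop-in script.

```csharp
using UnityEngine;
using ViveSR.anipal.Eye; // from the Vive-SRanipal-Unity-Plugin package

// Attach to the head (e.g. the camera rig) to poll eye data every frame.
public class GazeLogger : MonoBehaviour
{
    private void Update()
    {
        // Combined gaze ray, reported in local (head) space.
        if (SRanipal_Eye.GetGazeRay(GazeIndex.COMBINE, out Vector3 origin, out Vector3 direction))
        {
            // Transform into world space for visualization.
            Debug.DrawRay(transform.position + transform.rotation * origin,
                          transform.rotation * direction * 2f, Color.green);
        }

        // Per-eye openness: 1 = fully open, 0 = closed (a blink).
        if (SRanipal_Eye.GetEyeOpenness(EyeIndex.LEFT, out float leftOpenness) &&
            SRanipal_Eye.GetEyeOpenness(EyeIndex.RIGHT, out float rightOpenness))
        {
            Debug.Log($"openness L={leftOpenness:F2} R={rightOpenness:F2}");
        }
    }
}
```

    The getters return `false` when no valid data is available (e.g. tracker not running or eye not detected), so checking the return value avoids acting on stale values.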
    Shizola and trappist-1 like this.
  4. trappist-1


    Mar 26, 2017
    Wow, thank you so much for that install guide! I was able to get through all the setup and even ran the EyeSample scene in Unity on my Vive Pro (not a Vive Pro Eye)!!

    I ordered the Vive Pro eye kit during lunch, so the hardware is on the way!

    Hey just wondering, do you know of any way to somehow "crudely" emulate the eye data so I can start rigging my avatars and getting the basics out of the way before the hardware even arrives?

    I know I should be patient, but getting something written beforehand would make delivery day that much sweeter!
  5. trappist-1


    Mar 26, 2017
    I examined the code in `SRanipal_AvatarEyeSample` and managed to create the necessary overrides that let me manipulate the avatar head in the EyeSample scene, emulate very simple Vive Pro Eye states, and change the gaze target position. I hope this helps me quickly integrate the SDK into my game without having to wait for the delivery guy.

    From what I have gathered, the `SRanipal_AvatarEyeSample` script runs an `Update()` loop, and in that loop it calls both `UpdateGazeRay()` and `UpdateEyeShapes()`.

    `UpdateGazeRay()`:
    • Responsible for updating the forward lookAt target coordinate for each eye.
    • Can be overridden by supplying a custom forward vector (see `gazeDirectionCombinedLocalManualOverride`).

    Manually override the gaze target position:

    Code (csharp):
    [SerializeField] Vector3 gazeDirectionCombinedLocalManualOverride = Vector3.forward;
    public void UpdateGazeRay(Vector3 gazeDirectionCombinedLocal)
    {
        // Place this function in SRanipal_AvatarEyeSample and comment out the original.
        // The tracked gazeDirectionCombinedLocal parameter is ignored in favor of the manual override.
        for (int i = 0; i < EyesModels.Length; ++i)
        {
            Vector3 target = EyeAnchors[i].transform.TransformPoint(gazeDirectionCombinedLocalManualOverride);
            Debug.DrawLine(EyeAnchors[i].transform.position, target, Color.cyan);
            EyesModels[i].LookAt(target);
        }
    }

    In the above you see the forward vectors (gazeDirectionCombinedLocalManualOverride) projected from each EyeAnchor. An EyeAnchor is simply an empty GameObject created at start that becomes a child of the head. It resides at the local position of the corresponding eye and serves as the starting point of the forward projection for that eye's lookAt target.
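    The anchor setup described above could look roughly like this. This is a sketch only: the class and field names are illustrative and not the sample's exact ones, so compare against the actual `SRanipal_AvatarEyeSample` source.

```csharp
using UnityEngine;

// Sketch of the EyeAnchor setup: one empty GameObject per eye, created at
// start, parented to the head, and placed at each eye's local position.
public class EyeAnchorSketch : MonoBehaviour
{
    public Transform[] EyesModels;   // the avatar's eye transforms (assumed assigned in the Inspector)
    private GameObject[] EyeAnchors;

    private void Start()
    {
        EyeAnchors = new GameObject[EyesModels.Length];
        for (int i = 0; i < EyesModels.Length; ++i)
        {
            EyeAnchors[i] = new GameObject("EyeAnchor_" + i);
            // Child of the head (this transform); 'false' keeps local coordinates.
            EyeAnchors[i].transform.SetParent(transform, false);
            EyeAnchors[i].transform.localPosition = EyesModels[i].localPosition;
            EyeAnchors[i].transform.localRotation = Quaternion.identity;
        }
    }
}
```

    Because each anchor is a child of the head, `TransformPoint()` on it (as in the override above) converts the local forward vector into a world-space lookAt target that follows head movement automatically.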

    `UpdateEyeShapes()`:
    • This call is responsible for controlling the amount of blink and brow movement on the avatar's face and eyelashes.
    • It polls the current blink weightings from the eye trackers and updates the weights of the blendshapes accordingly.
    • It iterates over the `EyeShapeTables` array and, for each lookup table in that array, calls `RenderModelEyeShape()`, which retrieves and sets the actual blendshape weights of the avatar.
    • It's important to note the condition where an entry in eyeShapeTable.eyeShapes equals `EyeShape.Eye_Left_Blink` or `EyeShape.Eye_Right_Blink` versus when it equals something like an eyebrow shape, as the two are handled differently.

    Manually override blend shape weights:

    Code (csharp):
    [SerializeField][Range(0, 1)] float blinkWeightManualOverride = 0;
    [SerializeField][Range(0, 1)] float browWeightManualOverride = 0;
    public void UpdateEyeShapes(Dictionary<EyeShape, float> eyeWeightings)
    {
        // Place this function in SRanipal_AvatarEyeSample and comment out the original.
        foreach (var table in EyeShapeTables)
        {
            for (int i = 0; i < table.eyeShapes.Length; ++i)
            {
                EyeShape eyeShape = table.eyeShapes[i];
                if (eyeShape == EyeShape.Eye_Left_Blink || eyeShape == EyeShape.Eye_Right_Blink)
                {
                    eyeWeightings[eyeShape] = blinkWeightManualOverride;
                }
                else
                {
                    eyeWeightings[eyeShape] = browWeightManualOverride;
                }
            }
            RenderModelEyeShape(table, eyeWeightings);
        }
    }

    In the above you can see that the eyelids and eyebrows are separate controls, and the various arrangements that can be achieved.


    I am not sure if the overrides I have created will work for all situations, but I think they may suffice to get basic gaze tracking going.
    Last edited: Jul 9, 2019