
Question Data Analysis of Eye Data with Motion Data in VR 3D coordinates

Discussion in 'VR' started by qttsc2021, Aug 15, 2022.

  1. qttsc2021

    qttsc2021

    Joined:
    Mar 29, 2022
    Posts:
    24
    Hello!

    I am working on a VR experiment using the HTC Vive Pro Eye with the SRanipal SDK in Unity. The experiment involves a pedestrian who needs to decide whether it is safe to cross the road based on different vehicle approach speeds, crosswalk geometric designs, deceleration rates, etc. If they feel it is safe, they should cross the road.

    I can collect motion data (position and rotation of the HMD in x, y, z) and eye data. Is there a way to analyze the eye data together with the motion data? For example, a 3D heatmap of the VR environment showing where the pedestrian was looking during the experiment, or any other way of analyzing the eye gaze data?

    I have attached an example of eye data I can collect and a picture of one scenario of my VR environment for reference.

    If anyone can help me on this, it would be highly appreciated. Thank you! :)

    Attached Files:
    3.1-Control Pedestrian view.PNG

    Last edited: Aug 15, 2022
  2. Shane_Michael

    Shane_Michael

    Joined:
    Jul 8, 2013
    Posts:
    158
    The gaze origin and direction are given in local space, so you would need to use the camera's Transform with TransformPoint() and TransformDirection() to convert them into world space. Then use the Physics API to cast that ray into the scene.
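    For example, something roughly like this (untested sketch; "hmdCamera" and the argument names are just placeholders for the camera and the values you are already logging, and note that SRanipal reports the gaze origin in millimetres):

    Code (CSharp):
    // Rough sketch: convert a locally-reported gaze ray to world space and cast it into the scene.
    // Assumes localGazeOriginMm is in millimetres and localGazeDirection is normalized, both relative to the HMD.
    using UnityEngine;

    public class GazeToWorldExample : MonoBehaviour
    {
        public Camera hmdCamera;   // the VR camera (head)

        public void CastGaze(Vector3 localGazeOriginMm, Vector3 localGazeDirection)
        {
            Transform head = hmdCamera.transform;

            // Gaze origin is reported in millimetres in head-local space; convert to metres first.
            Vector3 worldOrigin = head.TransformPoint(localGazeOriginMm * 0.001f);
            Vector3 worldDirection = head.TransformDirection(localGazeDirection).normalized;

            if (Physics.Raycast(worldOrigin, worldDirection, out RaycastHit hit, 100f))
            {
                // hit.point is the world-space gaze point you can accumulate for a heatmap.
                Debug.Log("Gaze hit " + hit.collider.name + " at " + hit.point);
            }
        }
    }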

    From there you will have a series of gaze points which you can accumulate into a 3D texture. Sample that texture in the shader used for the environment to visualize the heatmap. Additionally, if you log the pedestrian's head position, you can also use the point array (interpolating between the two nearest points in time to drive a simple distance calculation in a shader) to replay where they were looking at any point during the trial.
     
    qttsc2021 likes this.
  3. qttsc2021

    qttsc2021

    Joined:
    Mar 29, 2022
    Posts:
    24
    Hi @Shane_Michael
    Thank you so much for your response!
    I was just wondering, is it still possible to transform the gaze origin and direction from local space to world space after the trials? The main reason I am asking is that the experiment is ongoing and I have already collected eye data for 10 people. Or do I have to start again and implement the TransformPoint() and TransformDirection() operations in my eye tracking script?
     
  4. Shane_Michael

    Shane_Michael

    Joined:
    Jul 8, 2013
    Posts:
    158
    As long as you were logging the HMD rotation/position data, you can definitely do the math after the fact. The easiest approach would be to instantiate a dummy Transform, iterate through the points, set its position/rotation to match the logged head pose for each sample, and use the TransformPoint/TransformDirection functions on that Transform.
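    Roughly like this (untested; the GazeSample struct and the log-parsing step are placeholders for however you stored your data):

    Code (CSharp):
    // Offline sketch: replay logged head poses through a dummy Transform to convert
    // the already-recorded local gaze rays into world space.
    using System.Collections.Generic;
    using UnityEngine;

    public struct GazeSample                 // hypothetical container for one logged row
    {
        public Vector3 headPosition;         // logged HMD position (world space, metres)
        public Quaternion headRotation;      // logged HMD rotation
        public Vector3 gazeOriginMm;         // logged gaze origin (local, millimetres)
        public Vector3 gazeDirection;        // logged gaze direction (local, normalized)
    }

    public class OfflineGazeConverter : MonoBehaviour
    {
        public List<GazeSample> samples;     // fill this from your own log parser

        void Start()
        {
            // Dummy transform that stands in for the head during replay.
            Transform head = new GameObject("DummyHead").transform;

            foreach (GazeSample s in samples)
            {
                head.SetPositionAndRotation(s.headPosition, s.headRotation);

                Vector3 worldOrigin = head.TransformPoint(s.gazeOriginMm * 0.001f);
                Vector3 worldDirection = head.TransformDirection(s.gazeDirection).normalized;

                if (Physics.Raycast(worldOrigin, worldDirection, out RaycastHit hit, 100f))
                {
                    // World-space gaze point for this sample.
                    Debug.Log(hit.point.x + "," + hit.point.y + "," + hit.point.z);
                }
            }
        }
    }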
     
  5. qttsc2021

    qttsc2021

    Joined:
    Mar 29, 2022
    Posts:
    24
    Hi @Shane_Michael
    Is it possible to obtain a video recording of the ray cast by the Physics API into the scene for each run?
    Also, I am a beginner in Unity and I am unsure how to use the TransformPoint() and TransformDirection() operations to convert the gaze origin and direction from local to world space. Can you please elaborate on how I can do that? I have attached my script that obtains the raw eye data for your reference. Thank you. :)

    Code (CSharp):
    using System.Collections;
    using System.Runtime.InteropServices;
    using UnityEngine;
    using System;
    using System.IO;
    using ViveSR.anipal.Eye;
    using ViveSR.anipal;
    using ViveSR;

    /// <summary>
    /// Example usage for eye tracking callback.
    /// Note: the callback runs on a separate thread and reports at ~120 Hz.
    /// Unity is not thread-safe, so no UnityEngine API may be called from within the callback thread.
    /// </summary>
    public class EyeTrackingRecordData : MonoBehaviour
    {
        // ********************************************************************************************************************
        //
        //  Define user ID information.
        //  - The developers can define the user ID format such as "ABC_001". The ID is used for the name of the text file
        //    that records the measured eye movement data.
        //
        // ********************************************************************************************************************
        public static int UserID = 11;   // Always change Participant Number for every participant
        public static int scenario = 1;  // Always change Scenario Number for every scenario
        //public string UserID;          // Define ID number such as 001, ABC001, etc.
        public string Path = Directory.GetCurrentDirectory();
        //string File_Path = Directory.GetCurrentDirectory() + "\\video_" + UserID + ".txt";
        // Note: this must match the file name written in Data_txt() and EyeCallback() below,
        // otherwise the duplicate-ID check in InputUserID() never triggers.
        public string File_Path = "eyedata_" + "P" + UserID + "_" + "S" + scenario + ".txt";


        // ********************************************************************************************************************
        //
        //  Parameters for time-related information.
        //
        // ********************************************************************************************************************
        public static int cnt_callback = 0;
        public int cnt_saccade = 0, Endbuffer = 3, SaccadeTimer = 30;
        float Timeout = 1.0f, InitialTimer = 0.0f;
        private static long SaccadeEndTime = 0;
        private static long MeasureTime, CurrentTime, MeasureEndTime = 0;
        private static float time_stamp;
        private static int frame;

        // ********************************************************************************************************************
        //
        //  Parameters for eye data.
        //
        // ********************************************************************************************************************
        private static EyeData_v2 eyeData = new EyeData_v2();
        public EyeParameter eye_parameter = new EyeParameter();
        public GazeRayParameter gaze = new GazeRayParameter();
        private static bool eye_callback_registered = false;
        private static UInt64 eye_valid_L, eye_valid_R;                 // The bits explaining the validity of eye data.
        private static float openness_L, openness_R;                    // The level of eye openness.
        private static float pupil_diameter_L, pupil_diameter_R;        // Diameter of pupil dilation.
        private static Vector2 pos_sensor_L, pos_sensor_R;              // Positions of pupils.
        private static Vector3 gaze_origin_L, gaze_origin_R;            // Position of gaze origin.
        private static Vector3 gaze_direct_L, gaze_direct_R;            // Direction of gaze ray.
        private static float frown_L, frown_R;                          // The level of the user's frown.
        private static float squeeze_L, squeeze_R;                      // The level showing how tightly the eye is closed.
        private static float wide_L, wide_R;                            // The level showing how widely the eye is open.
        private static double gaze_sensitive;                           // The sensitivity factor of the gaze ray.
        private static float distance_C;                                // Distance from the central point of the right and left eyes.
        private static bool distance_valid_C;                           // Validity of the combined data of the right and left eyes.
        public bool cal_need;                                           // Calibration judge.
        public bool result_cal;                                         // Result of calibration.
        private static int track_imp_cnt = 0;
        private static TrackingImprovement[] track_imp_item;

        //private static EyeData eyeData = new EyeData();
        //private static bool eye_callback_registered = false;

        //public Text uiText;
        private float updateSpeed = 0;
        private static float lastTime, currentTime;


        // ********************************************************************************************************************
        //
        //  Start is called before the first frame update. The Start() function is performed only one time.
        //
        // ********************************************************************************************************************
        void Start()
        {
            //File_Path = Directory.GetCurrentDirectory() + "\\Assets" + UserID + ".txt";
            InputUserID();                              // Check if a file with the same ID exists.
            //Invoke("SystemCheck", 0.5f);                // System check.
            //SRanipal_Eye_v2.LaunchEyeCalibration();     // Perform calibration for eye tracking.
            //Calibration();
            //TargetPosition();                           // Implement the targets on the VR view.
            //Invoke("Measurement", 0.5f);                // Start the measurement of ocular movements in a separate callback function.
        }


        // ********************************************************************************************************************
        //
        //  Checks if a file with the same user ID already exists. If so, you need to change the UserID.
        //
        // ********************************************************************************************************************
        void InputUserID()
        {
            Debug.Log(File_Path);

            if (File.Exists(File_Path))
            {
                Debug.Log("File with the same UserID already exists. Please change the UserID in the C# code.");

                //  When the same file name is found, we stop playing Unity.
                if (UnityEditor.EditorApplication.isPlaying)
                {
                    UnityEditor.EditorApplication.isPlaying = false;
                }
            }
            else
            {
                Data_txt();
            }
        }


        // ********************************************************************************************************************
        //
        //  Check if the system works properly.
        //
        // ********************************************************************************************************************
        void SystemCheck()
        {
            if (SRanipal_Eye_API.GetEyeData_v2(ref eyeData) == ViveSR.Error.WORK)
            {
                Debug.Log("Device is working properly.");
            }

            if (SRanipal_Eye_API.GetEyeParameter(ref eye_parameter) == ViveSR.Error.WORK)
            {
                Debug.Log("Eye parameters are measured.");
            }

            //  Check again whether the initialisation of the eye tracking functions succeeded. If not, we stop playing Unity.
            Error result_eye_init = SRanipal_API.Initial(SRanipal_Eye_v2.ANIPAL_TYPE_EYE_V2, IntPtr.Zero);

            if (result_eye_init == Error.WORK)
            {
                Debug.Log("[SRanipal] Initial Eye v2: " + result_eye_init);
            }
            else
            {
                Debug.LogError("[SRanipal] Initial Eye v2: " + result_eye_init);

                if (UnityEditor.EditorApplication.isPlaying)
                {
                    UnityEditor.EditorApplication.isPlaying = false;    // Stops the Unity editor.
                }
            }
        }

        // ********************************************************************************************************************
        //
        //  Calibration is performed if necessary.
        //
        // ********************************************************************************************************************
        void Calibration()
        {
            SRanipal_Eye_API.IsUserNeedCalibration(ref cal_need);           // Check the calibration status. If needed, we perform the calibration.

            if (cal_need == true)
            {
                result_cal = SRanipal_Eye_v2.LaunchEyeCalibration();

                if (result_cal == true)
                {
                    Debug.Log("Calibration is done successfully.");
                }
                else
                {
                    Debug.Log("Calibration failed.");
                    if (UnityEditor.EditorApplication.isPlaying)
                    {
                        UnityEditor.EditorApplication.isPlaying = false;    // Stops the Unity editor if the calibration failed.
                    }
                }
            }

            if (cal_need == false)
            {
                Debug.Log("Calibration is not necessary.");
            }
        }

        // ********************************************************************************************************************
        //
        //  Create a text file and the header names of each column to store the measured eye movement data.
        //
        // ********************************************************************************************************************
        void Data_txt()
        {
            string variable =
            "time(100ns)" + "," +
            "time_stamp(ms)" + "," +
            "frame" + "," +
            "eye_valid_L" + "," +
            "eye_valid_R" + "," +
            "openness_L" + "," +
            "openness_R" + "," +
            "pupil_diameter_L(mm)" + "," +
            "pupil_diameter_R(mm)" + "," +
            "pos_sensor_L.x" + "," +
            "pos_sensor_L.y" + "," +
            "pos_sensor_R.x" + "," +
            "pos_sensor_R.y" + "," +
            "gaze_origin_L.x(mm)" + "," +
            "gaze_origin_L.y(mm)" + "," +
            "gaze_origin_L.z(mm)" + "," +
            "gaze_origin_R.x(mm)" + "," +
            "gaze_origin_R.y(mm)" + "," +
            "gaze_origin_R.z(mm)" + "," +
            "gaze_direct_L.x" + "," +
            "gaze_direct_L.y" + "," +
            "gaze_direct_L.z" + "," +
            "gaze_direct_R.x" + "," +
            "gaze_direct_R.y" + "," +
            "gaze_direct_R.z" + "," +
            "gaze_sensitive" + "," +
            "frown_L" + "," +
            "frown_R" + "," +
            "squeeze_L" + "," +
            "squeeze_R" + "," +
            "wide_L" + "," +
            "wide_R" + "," +
            "distance_valid_C" + "," +
            "distance_C(mm)" + "," +
            "track_imp_cnt" +
            Environment.NewLine;

            File.AppendAllText("eyedata_" + "P" + UserID + "_" + "S" + scenario + ".txt", variable);
        }


        void Update()
        {
            if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING) return;

            if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == true && eye_callback_registered == false)
            {
                SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
                eye_callback_registered = true;
            }
            else if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == false && eye_callback_registered == true)
            {
                SRanipal_Eye_v2.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
                eye_callback_registered = false;
            }

            float timeNow = Time.realtimeSinceStartup;
        }


        private void OnDisable()
        {
            Release();
        }

        void OnApplicationQuit()
        {
            Release();
        }

        /// <summary>
        /// Release the callback thread when disabled or quitting.
        /// </summary>
        private static void Release()
        {
            if (eye_callback_registered == true)
            {
                SRanipal_Eye_v2.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
                eye_callback_registered = false;
            }
        }

        /// <summary>
        /// Required class for IL2CPP scripting backend support.
        /// </summary>
        internal class MonoPInvokeCallbackAttribute : System.Attribute
        {
            public MonoPInvokeCallbackAttribute() { }
        }

        /// <summary>
        /// Eye tracking data callback thread.
        /// Reports data at ~120 Hz.
        /// The MonoPInvokeCallback attribute is required for the IL2CPP scripting backend.
        /// </summary>
        /// <param name="eye_data">Reference to the latest eye_data</param>
        [MonoPInvokeCallback]
        private static void EyeCallback(ref EyeData_v2 eye_data)
        {
            EyeParameter eye_parameter = new EyeParameter();
            SRanipal_Eye_API.GetEyeParameter(ref eye_parameter);

            eyeData = eye_data;
            // do stuff with eyeData...

            //lastTime = currentTime;
            //currentTime = eyeData.timestamp;

            MeasureTime = DateTime.Now.Ticks;
            time_stamp = eyeData.timestamp;
            frame = eyeData.frame_sequence;
            eye_valid_L = eyeData.verbose_data.left.eye_data_validata_bit_mask;
            eye_valid_R = eyeData.verbose_data.right.eye_data_validata_bit_mask;
            openness_L = eyeData.verbose_data.left.eye_openness;
            openness_R = eyeData.verbose_data.right.eye_openness;
            pupil_diameter_L = eyeData.verbose_data.left.pupil_diameter_mm;
            pupil_diameter_R = eyeData.verbose_data.right.pupil_diameter_mm;
            pos_sensor_L = eyeData.verbose_data.left.pupil_position_in_sensor_area;
            pos_sensor_R = eyeData.verbose_data.right.pupil_position_in_sensor_area;
            gaze_origin_L = eyeData.verbose_data.left.gaze_origin_mm;
            gaze_origin_R = eyeData.verbose_data.right.gaze_origin_mm;
            gaze_direct_L = eyeData.verbose_data.left.gaze_direction_normalized;
            gaze_direct_R = eyeData.verbose_data.right.gaze_direction_normalized;
            gaze_sensitive = eye_parameter.gaze_ray_parameter.sensitive_factor;
            frown_L = eyeData.expression_data.left.eye_frown;
            frown_R = eyeData.expression_data.right.eye_frown;
            squeeze_L = eyeData.expression_data.left.eye_squeeze;
            squeeze_R = eyeData.expression_data.right.eye_squeeze;
            wide_L = eyeData.expression_data.left.eye_wide;
            wide_R = eyeData.expression_data.right.eye_wide;
            distance_valid_C = eyeData.verbose_data.combined.convergence_distance_validity;
            distance_C = eyeData.verbose_data.combined.convergence_distance_mm;
            track_imp_cnt = eyeData.verbose_data.tracking_improvements.count;
            //track_imp_item = eyeData.verbose_data.tracking_improvements.items;

            //  Convert the measured data to a string to write into the text file.
            string value =
                MeasureTime.ToString() + "," +
                time_stamp.ToString() + "," +
                frame.ToString() + "," +
                eye_valid_L.ToString() + "," +
                eye_valid_R.ToString() + "," +
                openness_L.ToString() + "," +
                openness_R.ToString() + "," +
                pupil_diameter_L.ToString() + "," +
                pupil_diameter_R.ToString() + "," +
                pos_sensor_L.x.ToString() + "," +
                pos_sensor_L.y.ToString() + "," +
                pos_sensor_R.x.ToString() + "," +
                pos_sensor_R.y.ToString() + "," +
                gaze_origin_L.x.ToString() + "," +
                gaze_origin_L.y.ToString() + "," +
                gaze_origin_L.z.ToString() + "," +
                gaze_origin_R.x.ToString() + "," +
                gaze_origin_R.y.ToString() + "," +
                gaze_origin_R.z.ToString() + "," +
                gaze_direct_L.x.ToString() + "," +
                gaze_direct_L.y.ToString() + "," +
                gaze_direct_L.z.ToString() + "," +
                gaze_direct_R.x.ToString() + "," +
                gaze_direct_R.y.ToString() + "," +
                gaze_direct_R.z.ToString() + "," +
                gaze_sensitive.ToString() + "," +
                frown_L.ToString() + "," +
                frown_R.ToString() + "," +
                squeeze_L.ToString() + "," +
                squeeze_R.ToString() + "," +
                wide_L.ToString() + "," +
                wide_R.ToString() + "," +
                distance_valid_C.ToString() + "," +
                distance_C.ToString() + "," +
                track_imp_cnt.ToString() +
                //track_imp_item.ToString() +
                Environment.NewLine;

            File.AppendAllText("eyedata_" + "P" + UserID + "_" + "S" + scenario + ".txt", value);
        }
    }
     
  6. Shane_Michael

    Shane_Michael

    Joined:
    Jul 8, 2013
    Posts:
    158
    I don't have time to look through code, but I found a reference here for visualizing eye rays.

    You may have to modify it to use the data from your logs instead of reading the data from the device in real-time, and then attach it to an arbitrary model to represent the head (instead of the actual camera).

    From there you can drop a camera elsewhere in your scene, and screen capture it as it plays through the data.
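    As a rough sketch (untested; it reuses the hypothetical GazeSample struct from my earlier sketch and assumes a LineRenderer on the dummy head object), you could step through the log and draw the current gaze ray each frame, then record the view from that second camera:

    Code (CSharp):
    // Replay sketch: steps through logged samples at a fixed rate and draws the current
    // gaze ray with a LineRenderer so a separate camera can record the playback.
    using System.Collections.Generic;
    using UnityEngine;

    [RequireComponent(typeof(LineRenderer))]
    public class GazeReplay : MonoBehaviour
    {
        public List<GazeSample> samples;       // parsed from the log file
        public float samplesPerSecond = 120f;  // SRanipal callback rate

        private LineRenderer line;
        private float playhead;

        void Start()
        {
            line = GetComponent<LineRenderer>();
            line.positionCount = 2;
        }

        void Update()
        {
            if (samples == null || samples.Count == 0) return;

            playhead += Time.deltaTime * samplesPerSecond;
            GazeSample s = samples[Mathf.Min((int)playhead, samples.Count - 1)];

            // Move the dummy head to the logged pose, then rebuild the world-space ray.
            transform.SetPositionAndRotation(s.headPosition, s.headRotation);
            Vector3 origin = transform.TransformPoint(s.gazeOriginMm * 0.001f);
            Vector3 direction = transform.TransformDirection(s.gazeDirection).normalized;

            Vector3 end = Physics.Raycast(origin, direction, out RaycastHit hit, 100f)
                ? hit.point
                : origin + direction * 100f;

            line.SetPosition(0, origin);
            line.SetPosition(1, end);
        }
    }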
     
  7. qttsc2021

    qttsc2021

    Joined:
    Mar 29, 2022
    Posts:
    24
    Hi @Shane_Michael !
    I finally managed to get the 3D world-space coordinates of the eye gaze. Thank you so much for your suggestions. I am just a bit confused about how to create the 3D texture (heatmap) from the gaze points I collected. Can you please elaborate on how to achieve this, since I am new to Unity? Looking forward to your response! :)
     
  8. qttsc2021

    qttsc2021

    Joined:
    Mar 29, 2022
    Posts:
    24
    For your reference, an example of the eye data I can collect in 3D world-space coordinates is shown in the image below.

    Sample Eye Gaze 3D coordinates.PNG
     
  9. Shane_Michael

    Shane_Michael

    Joined:
    Jul 8, 2013
    Posts:
    158
    I would calculate the bounding volume for your points, and then create a 3D texture with enough resolution to sufficiently capture the data. Any modern GPU has enough memory and performance that you can likely afford to round up, but 3D textures can really balloon in size. I might start with 128 x 64 x 256 given your data, and bump it up if it looks too blocky. It's useful to use a single channel texture (i.e. only red) to save memory.

    From there it is fairly straightforward to loop through each point and, for each texel in the 3D texture, calculate the distance, apply some falloff function if you want, and accumulate it in an array. Once you've added up all the contributions, normalize the array (i.e. divide by the max value) and assign the values to the texels of the 3D texture.
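    A rough sketch of that bake step (untested; "gazePoints", the bounds, and the falloff radius are placeholders you would fill in from your own data):

    Code (CSharp):
    // Offline bake sketch: accumulate Gaussian falloff from every gaze point into a
    // single-channel 3D texture covering the bounding volume. This is O(points x texels),
    // so it is meant as a one-off bake, not something to run every frame.
    using System.Collections.Generic;
    using UnityEngine;

    public class GazeHeatmapBaker : MonoBehaviour
    {
        public List<Vector3> gazePoints;                              // world-space gaze hit points
        public Vector3 boundsMin, boundsMax;                          // bounding volume of the scene/points
        public Vector3Int resolution = new Vector3Int(128, 64, 256);
        public float falloffRadius = 0.5f;                            // metres; larger = smoother heatmap

        public Texture3D Bake()
        {
            float[] values = new float[resolution.x * resolution.y * resolution.z];
            Vector3 size = boundsMax - boundsMin;
            float maxValue = 0f;

            for (int z = 0; z < resolution.z; z++)
            for (int y = 0; y < resolution.y; y++)
            for (int x = 0; x < resolution.x; x++)
            {
                // World-space centre of this texel.
                Vector3 texelPos = boundsMin + Vector3.Scale(
                    new Vector3((x + 0.5f) / resolution.x,
                                (y + 0.5f) / resolution.y,
                                (z + 0.5f) / resolution.z), size);

                float sum = 0f;
                foreach (Vector3 p in gazePoints)
                {
                    float d = Vector3.Distance(texelPos, p);
                    sum += Mathf.Exp(-(d * d) / (falloffRadius * falloffRadius));  // Gaussian falloff
                }

                int index = x + y * resolution.x + z * resolution.x * resolution.y;
                values[index] = sum;
                maxValue = Mathf.Max(maxValue, sum);
            }

            // Normalize and copy into a single-channel (red only) 3D texture.
            Texture3D tex = new Texture3D(resolution.x, resolution.y, resolution.z, TextureFormat.RFloat, false);
            Color[] colors = new Color[values.Length];
            for (int i = 0; i < values.Length; i++)
                colors[i] = new Color(values[i] / Mathf.Max(maxValue, 1e-6f), 0f, 0f, 1f);
            tex.SetPixels(colors);
            tex.Apply();
            return tex;
        }
    }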

    From there, inside your shader, you use the world position of each fragment to sample the 3D texture at the correct place (i.e. with respect to the bounding volume of your 3D texture) and use that value to index into a color gradient lookup table (your classic heat map) to generate the heatmap output. Add it to a lit shader as an emission channel, or output the value directly in an unlit shader. This is easier if you are using URP/HDRP and can use Shader Graph; with the built-in pipeline you'd need to write a surface shader.

    Another option is to put all your points in an array that is passed into a shader, and have your shader accumulate the distances at runtime to generate the heatmap. This wouldn't scale well as you get more and more points, but requires less precomputation and might be less work to set up. Would still require getting comfortable with Shader Graph, though.
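    For that runtime variant, uploading the points could look something like this (untested; "_GazePoints" and "_GazePointCount" are made-up property names that your shader would have to expose, and shader arrays have a fixed maximum size, so it only works for a limited number of points):

    Code (CSharp):
    // Runtime sketch: push the gaze points into the environment material so the shader
    // can accumulate distance-based falloff per fragment instead of baking a 3D texture.
    using System.Collections.Generic;
    using UnityEngine;

    public class GazeHeatmapUploader : MonoBehaviour
    {
        public Material environmentMaterial;   // material using the heatmap shader
        public List<Vector3> gazePoints;       // world-space gaze hit points

        void Start()
        {
            int count = Mathf.Min(gazePoints.Count, 1023);
            Vector4[] packed = new Vector4[1023];   // shader-side arrays have a fixed size
            for (int i = 0; i < count; i++)
                packed[i] = gazePoints[i];

            environmentMaterial.SetVectorArray("_GazePoints", packed);
            environmentMaterial.SetInt("_GazePointCount", count);
        }
    }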
     
  10. archhp

    archhp

    Joined:
    Jun 5, 2019
    Posts:
    1
    Hello
    I am working on a project similar to the above using an HTC Vive Pro Eye and the SRanipal SDK for a behavioral study.
    I am trying to save the eye tracking data to a CSV file and visualize the gaze data using a heatmap.

    I saw the script that you had shared and the link to the gaze rays. I wanted to ask if you might be willing to share some information on how you resolved the transformation to world space and if you were able to visualize the eye rays.

    Any help would be much appreciated!
     
  11. may883520zxm

    may883520zxm

    Joined:
    Nov 25, 2022
    Posts:
    2
    I would like to ask: when you convert from local to global space, why do the left and right eye gaze directions need their x value multiplied by -1, while the combined gaze does not? I got this from the GazeRaySample code, but I don't think that's right. I couldn't understand why the combined eye doesn't need the -x.