
Anyone tried to integrate a 3D sensor into a VR project?

Discussion in 'AR/VR (XR) Discussion' started by David_Orbbec3D, May 21, 2015.

  1. David_Orbbec3D

    David_Orbbec3D

    Joined:
    May 21, 2015
    Posts:
    2
    Hello,

My team is currently working on our in-house developed 3D sensor. We think gaming/VR may be a very important application area for 3D sensors.

We are now working on Unity support for our hardware, and we would like to hear what experienced developers have to say.

Is there any existing high-quality project involving Unity + VR + a 3D sensor?
    What kind of hardware specs do you need if you want to integrate a 3D sensor into a VR game?
    What kind of API/functions (related to the sensor) do you think are necessary?

I would appreciate it if you could share some of your opinions.
     
  2. Pascal-Serrarens

    Pascal-Serrarens

    Joined:
    Jan 14, 2013
    Posts:
    40
I do a lot of work with 3D sensors, mainly for avatar control, for my product InstantVR. Although the devices are not very generic (Kinect, Leap Motion, Razer Hydra, Rift, etc.), I do have a generic representation of tracking points which can be used in combination with a generic 3D sensor like yours.

A tracking target for me is just a transform with 6DOF (position, rotation). The following points are important to me:
    1. Tracking range. What is the range in which the tracking works? Leap Motion has a small range, Kinect/Rift are better, but you actually want a range similar to the Lighthouse/HTC Vive (15m x 15m x 3m).
    2. Precision. This is the resolution of the tracking. We are often looking at sub-mm and better than 0.1-degree tracking.
    3. Frequency. For VR, you want the update frequency to be higher than the refresh rate of the HMD. The tendency is that this goes up to 120Hz, so tracking of objects needs to reach that same frequency in order to look smooth.
    4. Accuracy. This is the difficult part. All tracking devices are relative to a certain point. When you use just one kind of 3D sensor, calibration of the sensor is easy, but when you use different kinds of sensors together it gets difficult. For instance, if you track your hand with Leap Motion and you are holding an object in your hand which has its own tracking sensor, you do not want that object positioned 10 cm to the left of your hand. I have worked hard to get this right for the currently supported devices, but it still needs to get better. If you want to make things easier for VR, I suggest coming up with a way to calibrate to the HMD automatically. Two examples of this:
    - The Leap Motion is mounted on the HMD, so the reference point is on the HMD and there is no need to calibrate the positions between them.
    - The Sixense STEM puts an additional tracker on the HMD so that it knows where the HMD is relative to the other tracking points. This makes calibration very easy at the cost of an additional tracker.
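    To illustrate the HMD-mounted case: when the sensor is rigidly attached to the HMD, a pose reported in sensor space can be brought into world space by composing it with the HMD pose and the fixed mounting offset. This is only a minimal sketch with hypothetical names (`SensorCalibration`, `SensorToWorld`, the mount-offset parameters), not part of any existing SDK, and it assumes a unit-scale HMD transform:

    ```csharp
    using UnityEngine;

    public static class SensorCalibration
    {
        // hmd:            the tracked HMD transform (e.g. the main camera)
        // mountOffsetPos/Rot: the fixed sensor pose relative to the HMD,
        //                  measured once from the physical mounting
        // sensorPos/Rot:   the pose the sensor reports in its own space
        public static void SensorToWorld(
            Transform hmd,
            Vector3 mountOffsetPos, Quaternion mountOffsetRot,
            Vector3 sensorPos, Quaternion sensorRot,
            out Vector3 worldPos, out Quaternion worldRot)
        {
            // Sensor pose in world space = HMD pose composed with mounting offset.
            Quaternion sensorWorldRot = hmd.rotation * mountOffsetRot;
            Vector3 sensorWorldPos = hmd.TransformPoint(mountOffsetPos);

            // Tracked object in world space = sensor world pose composed
            // with the pose reported by the sensor.
            worldRot = sensorWorldRot * sensorRot;
            worldPos = sensorWorldPos + sensorWorldRot * sensorPos;
        }
    }
    ```

    Because the offset is fixed by the mounting, no per-session calibration step is needed; the HMD's own tracking carries the sensor's reference frame along with it.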

    About the API:
    - I want to know the position/rotation
- I want to know whether the tracker is available (it may be switched off or unplugged)
    - Every tracker needs to be identified such that I can distinguish between them.
    There may be more, but that is what I could think of at the moment.
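    Those three requirements could be sketched as a minimal C# interface; all names here (`ITracker`, `TrackedObject`) are illustrative assumptions, not an existing API:

    ```csharp
    using UnityEngine;

    // Minimal tracker API covering the three points above.
    public interface ITracker
    {
        int Id { get; }              // stable identifier to distinguish trackers
        bool IsAvailable { get; }    // false when switched off or unplugged
        Vector3 Position { get; }    // 6DOF pose: position...
        Quaternion Rotation { get; } // ...and rotation
    }

    // Example consumer: drive a Transform from a tracker each frame.
    public class TrackedObject : MonoBehaviour
    {
        public ITracker tracker;

        void Update()
        {
            if (tracker == null || !tracker.IsAvailable)
                return; // hold the last pose while the tracker is missing
            transform.SetPositionAndRotation(tracker.Position, tracker.Rotation);
        }
    }
    ```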

Actually, I am really looking forward to your solution, because I am currently thinking about supporting tracking of arbitrary objects besides body parts.
     
    David_Orbbec3D likes this.
  3. David_Orbbec3D

    David_Orbbec3D

    Joined:
    May 21, 2015
    Posts:
    2