Question Attaching Main Camera to a livestreamed avatar from Xsens

Discussion in 'VR' started by unity_dHnDr_Fk_qs2eA, Apr 19, 2022.

  1. unity_dHnDr_Fk_qs2eA

    Joined:
    Nov 1, 2020
    Posts:
    9

    https://answers.unity.com/questions/1823076/attaching-main-camera-to-a-livestreamed-avatar-fro.html

    "Hi! I am developing a multiplayer VR game in Unity. I am streaming live body motion from the MVN Analyze into Unity, and this works fine. My problem is when I want to mount the Main Camera in my XRRig into the MVN avatar's head. I want to create a setup which is similar to this YouTube video : https://www.xsens.com/news/lets-go-full-immersive-vr - except that I don't stream finger data, and use the HTC controllers instead.

    We are using the HTC Vive headset and controllers, as well as the MVN Awinda body motion capture sensors. I have tried to set the Main Camera as a child of the Avatar and its head in many different ways, and it seems to spawn far away from the head in play mode even though I place it correctly beforehand. I've also tried to write a transform script that sets the main camera's position and rotation. Any advice on how to tackle this?"
    I found the question above and have read the answer quoted below, but I don't understand it completely. Could anyone explain how to put that answer into practice? (See the sketch after this post.)

    "usually you just need to calculate the offset in runtime, via adding a parent object...etc. Or simply override the transformation in late update..etc."
    Best regards
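
    The quoted answer names two ways to keep the Main Camera on the streamed avatar's head. Below is a minimal, untested sketch of the second one, overriding the camera placement in LateUpdate. It assumes a standard XR Rig setup where the HMD drives the Main Camera's local pose, and that the MVN plugin has already updated the avatar by the time LateUpdate runs; the script name FollowAvatarHead and the fields avatarHead, xrRig and mainCamera are illustrative and must be assigned in the Inspector.

    Code (CSharp):
    using UnityEngine;

    public class FollowAvatarHead : MonoBehaviour
    {
        // Illustrative fields; assign them in the Inspector.
        public Transform avatarHead;   // head bone of the live-streamed MVN avatar
        public Transform xrRig;        // root of the XR Rig (ancestor of the Main Camera)
        public Transform mainCamera;   // the HMD-tracked Main Camera inside the rig

        void LateUpdate()
        {
            if (avatarHead == null || xrRig == null || mainCamera == null)
                return;

            // The headset overwrites the Main Camera's local pose each frame,
            // so the camera itself cannot simply be parented or repositioned.
            // Instead, shift the whole rig by whatever offset remains between
            // the tracked camera and the avatar's head, so the camera ends up
            // at the head while HMD rotation keeps working as usual.
            Vector3 offset = avatarHead.position - mainCamera.position;
            xrRig.position += offset;
        }
    }

    The first suggestion in the answer, adding a parent object, amounts to the same idea: measure the camera-to-head offset once (for example at startup) and apply it to an empty parent of the XR Rig, instead of correcting it every frame as above.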