
Need help solving a problem with the rotation of tracked objects

Discussion in 'AR/VR (XR) Discussion' started by carvinx, Oct 9, 2015.

  1. carvinx

    carvinx

    Joined:
    Mar 19, 2014
    Posts:
    34
    Hello, everyone

    I am looking for a solution to the following problem:

    I am using OptiTrack Motive to track markers worn on my finger, and Unity to create the 3D objects.

    And MiddleVR (free edition) for the 3D classes.

    If I move my finger, the 3D finger object moves with it. That works.
    But sometimes, if I rotate my finger 90 or 180 degrees, the 3D finger object jumps to the opposite side and rotates 180 degrees back...

    Video of the problem:
    http://de.tinypic.com/player.php?v=dqmxiu&s=8#.Vhf0Xvntmko

    Do you have any idea about my problem?

    Thank you!

    regards,

    Carvin
     
  2. DrBlort

    DrBlort

    Joined:
    Nov 14, 2012
    Posts:
    72
    Could be any number of things. I'd try to isolate the cause first by checking:

    - Is it the tracking software that's sending bad position/rotation data? (A quick way to check is to log the raw values each frame; see the sketch below.)
    - Is it my code that's rotating the 3D object?
    - Maybe it's a sign problem (minus/plus the same angle value): are you using Euler angles instead of quaternions?
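
    One way to narrow down the first point is something like this (just a sketch, the names are made up; use whatever call your tracking plugin actually provides):

    Code (CSharp):
    using UnityEngine;

    // Minimal diagnostic sketch: attach to any object and log the incoming
    // orientation every frame. GetTrackedRotation() is a placeholder for
    // whatever call your tracking plugin actually exposes.
    public class TrackingLogger : MonoBehaviour
    {
        Quaternion GetTrackedRotation()
        {
            return Quaternion.identity; // TODO: replace with the real tracking call
        }

        void Update()
        {
            Quaternion raw = GetTrackedRotation();
            Debug.Log("raw quat: " + raw + "  euler: " + raw.eulerAngles);
            // If these values jump by ~180 degrees while you rotate the finger slowly,
            // the flip is already in the incoming data; if they stay smooth, the
            // problem is in the conversion applied afterwards.
        }
    }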

    Hope it helps!
     
  3. carvinx

    carvinx

    Joined:
    Mar 19, 2014
    Posts:
    34
    Hi, DrBlort

    1. The tracking software is OptiTrack Motive; it sends correct position and rotation data for the markers.
    2. What do you mean by my code? I am using 3rd party code to read the data from Motive into Unity.
    3. I am using quaternions...

    Do you have any ideas?
     
  4. DrBlort

    DrBlort

    Joined:
    Nov 14, 2012
    Posts:
    72
    Hey,

    About 2, I meant the question as if you were asking it yourself. Bad phrasing on my part :)

    If I'm not misunderstanding, you're not applying the position/rotation from the OptiTrack data (given by that 3rd party software) directly to your model.

    So the questions you should be checking are:
    Is the 3rd party software doing a conversion between the OptiTrack data and Unity rotations? If so, check whether there's a way to read it raw and see what happens.
    If instead you're doing the conversion yourself, check whether there's a wrong assumption in it.
     
  5. carvinx

    carvinx

    Joined:
    Mar 19, 2014
    Posts:
    34
    Hi, DrBlort

    Thank you for your answer again.

    I am using the class from the 3rd party code to apply the tracking transform to my Unity (finger) objects directly, I am sure about that.

    But I am not sure whether the function that passes the position and rotation data to my finger objects prevents the problem shown in my video... because it also uses the eulerAngles function...

    In the 3rd party code, the following function of the class reads the data from Motive:

    Code (CSharp):
    // Unpack RigidBody data
    private static void ReadRigidBody(Byte[] b, ref int offset, OptiTrackRigidBody rb)
    {
        try
        {
            int[] iData = new int[100];
            float[] fData = new float[100];

            // RB ID
            Buffer.BlockCopy(b, offset, iData, 0, 4); offset += 4;
            //int iSkelID = iData[0] >> 16;          // hi 16 bits = ID of bone's parent skeleton
            //int iBoneID = iData[0] & 0xffff;       // lo 16 bits = ID of bone
            rb.ID = iData[0]; // already have it from data descriptions

            // RB pos
            float[] pos = new float[3];
            Buffer.BlockCopy(b, offset, pos, 0, 4 * 3); offset += 4 * 3;
            rb.position.x = pos[0]; rb.position.y = pos[1]; rb.position.z = pos[2];

            // RB ori
            float[] ori = new float[4];
            Buffer.BlockCopy(b, offset, ori, 0, 4 * 4); offset += 4 * 4;
            rb.orientation.x = ori[0]; rb.orientation.y = ori[1]; rb.orientation.z = ori[2]; rb.orientation.w = ori[3];

            // number of rigid body markers
            Buffer.BlockCopy(b, offset, iData, 0, 4); offset += 4;
            int nMarkers = iData[0];

            // marker positions (3 floats each), read and skipped
            Buffer.BlockCopy(b, offset, fData, 0, 4 * 3 * nMarkers); offset += 4 * 3 * nMarkers;

            // marker IDs (unused here)
            Buffer.BlockCopy(b, offset, iData, 0, 4 * nMarkers); offset += 4 * nMarkers;

            // marker sizes (unused here)
            Buffer.BlockCopy(b, offset, fData, 0, 4 * nMarkers); offset += 4 * nMarkers;

            // mean marker error (unused here)
            Buffer.BlockCopy(b, offset, fData, 0, 4); offset += 4;
        }
        catch (Exception e)
        {
            Debug.LogError(e.ToString());
        }
    }
    And another class uses the following functions to get the positions and rotations from it:

    Code (CSharp):
    public Vector3 getPosition(int rigidbodyIndex)
    {
        if (OptitrackManagement.DirectMulticastSocketClient.IsInit())
        {
            DataStream networkData = OptitrackManagement.DirectMulticastSocketClient.GetDataStream();
            Vector3 pos = origin + networkData.getRigidbody(rigidbodyIndex).position * scale;
            pos.x = -pos.x; // not really sure if this is the best way to do it
            //pos.y = pos.y; // these may change depending on your configuration and calibration
            //pos.z = -pos.z;
            return pos;
        }
        else
        {
            return Vector3.zero;
        }
    }

    public Quaternion getOrientation(int rigidbodyIndex)
    {
        // should add a way to filter it
        if (OptitrackManagement.DirectMulticastSocketClient.IsInit())
        {
            DataStream networkData = OptitrackManagement.DirectMulticastSocketClient.GetDataStream();
            Quaternion rot = networkData.getRigidbody(rigidbodyIndex).orientation;

            // change the handedness from motive
            //rot = new Quaternion(rot.z, rot.y, rot.x, rot.w); // depending on calibration

            // Invert pitch and yaw
            Vector3 euler = rot.eulerAngles;
            rot.eulerAngles = new Vector3(euler.x, -euler.y, euler.z); // these may change depending on your calibration

            return rot;
        }
        else
        {
            return Quaternion.identity;
        }
    }
    How should I use only quaternions instead of Euler angles???

    Thank you very much.

    regards, Carvin
     
    Last edited: Oct 10, 2015
  6. DrBlort

    DrBlort

    Joined:
    Nov 14, 2012
    Posts:
    72
    I would change the second function (the one currently modifying the rotation to invert the Y angle) so that it returns the original rotation, like this:

    Code (CSharp):
    public Quaternion getOrientation(int rigidbodyIndex)
    {
        // should add a way to filter it
        if (OptitrackManagement.DirectMulticastSocketClient.IsInit())
        {
            DataStream networkData = OptitrackManagement.DirectMulticastSocketClient.GetDataStream();
            Quaternion rot = networkData.getRigidbody(rigidbodyIndex).orientation;

            // change the handedness from motive
            //rot = new Quaternion(rot.z, rot.y, rot.x, rot.w); // depending on calibration

            // Invert pitch and yaw
            //Vector3 euler = rot.eulerAngles;
            //rot.eulerAngles = new Vector3(euler.x, -euler.y, euler.z); // these may change depending on your calibration

            return rot;
        }
        else
        {
            return Quaternion.identity;
        }
    }
    This way you'll have the original rotation value. Check what the original rotation looks like, and then, if you have to change it (very likely; I'm guessing that was the motivation for adding the lines inverting the Y angle), I'd multiply it by a second quaternion whose rotation compensates for it.

    Something like this (obviously untested, and your variable names will surely be different):
    Code (CSharp):
    Quaternion myDeviceThingOrientation = theDevice.getOrientation(index);

    // This supposedly will have the same effect as the commented-out lines in the method
    myDeviceThingOrientation *= Quaternion.Euler(0.0f, 180.0f, 0.0f);
    As a brief explanation, with my limited knowledge of quaternions: you "add" rotations by multiplication, which is what the *= is doing (multiplying in place; you could expand it to myDeviceThingOrientation = myDeviceThingOrientation * Quaternion.Euler(0.0f, 180.0f, 0.0f); if you like, it would be the same).
    Keep in mind that if you multiply several rotations together, the order is important.
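
    For example (untested, just to illustrate why the order matters): putting the offset on the right applies it in the tracked object's local axes, while putting it on the left applies it around the world axes, and the two results are generally different rotations.

    Code (CSharp):
    Quaternion tracked = theDevice.getOrientation(index);          // same hypothetical names as above
    Quaternion offset  = Quaternion.Euler(0.0f, 180.0f, 0.0f);

    Quaternion localOffset = tracked * offset;   // offset applied in the tracked object's local space
    Quaternion worldOffset = offset * tracked;   // offset applied around the world Y axis
    // localOffset and worldOffset are usually not the same rotation.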

    It could be that you'll have to add a conditional, something like "if the Y angle of the rotation is > 180, then rotate, else do nothing", but that will depend on the effect of the rotation sent by the device.
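
    As a rough sketch of that conditional (untested; whether 180 on Y is the right threshold and offset depends on what the device actually reports):

    Code (CSharp):
    Quaternion rot = theDevice.getOrientation(index);   // hypothetical names again

    // Only compensate when the reported yaw is in the suspicious range.
    if (rot.eulerAngles.y > 180.0f)
    {
        rot *= Quaternion.Euler(0.0f, 180.0f, 0.0f);
    }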
     
  7. carvinx

    carvinx

    Joined:
    Mar 19, 2014
    Posts:
    34
    Hi, DrBlort
    Thank you for your answer :)

    Sorry, I forgot to post another function I use to update the positions and orientations of my finger objects, as a script component, I mean.

    Like the following code:

    Code (CSharp):
    using UnityEngine;
    using System.Collections;

    public class OptiTrackObject : MonoBehaviour {

        public int rigidbodyIndex;
        public Vector3 rotationOffset;

        // Use this for initialization
        void Start () {

        }

        // Update is called once per frame
        void Update () {
            Vector3 pos = OptiTrackManager.Instance.getPosition(rigidbodyIndex);
            Quaternion rot = OptiTrackManager.Instance.getOrientation(rigidbodyIndex);
            rot = rot * Quaternion.Euler(rotationOffset);
            this.transform.position = pos;
            this.transform.rotation = rot;
        }
    }
    It was the original code from the 3rd party program; I have already adjusted the positions with some offsets for my project...

    Thanks ;)
     
  8. DrBlort

    DrBlort

    Joined:
    Nov 14, 2012
    Posts:
    72
    Well, I think it's still a good idea to remove the modification to the original rotation in the 3rd party software, and apply those offsets yourself in your own code, once you know what it's doing and why (I mean the conditional code that modifies the rotation depending on what's happening).

    EDIT:
    One check for that conditional code that I can think of would be to see whether the current rotation differs from the previous frame's by more than, say, 90 degrees (whatever value is reasonable), because you know the finger couldn't have rotated that much in such a short time.
    That could indicate the need to apply the offset.
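
    Something along these lines, as a sketch (untested, reusing the names from your OptiTrackObject script; whether a plain 180-degree yaw offset is the right compensation depends on what the flip actually looks like):

    Code (CSharp):
    private Quaternion lastRotation = Quaternion.identity;   // remembered between frames

    void Update () {
        Quaternion current = OptiTrackManager.Instance.getOrientation(rigidbodyIndex);

        // A real finger can't rotate ~90+ degrees in a single frame, so a jump
        // that large is probably the flip; compensate before applying it.
        if (Quaternion.Angle(lastRotation, current) > 90.0f)
        {
            current *= Quaternion.Euler(0.0f, 180.0f, 0.0f);
        }

        lastRotation = current;
        this.transform.rotation = current;
    }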
     
    Last edited: Oct 10, 2015
  9. carvinx

    carvinx

    Joined:
    Mar 19, 2014
    Posts:
    34
    Hi, DrBlort.

    Thanks again!!!

    The 3 classes I posted are just the original code; I have modified them for my own project.

    In the function "public Quaternion getOrientation(int rigidbodyIndex)" I only changed rot.eulerAngles = new Vector3(euler.x, -euler.y, euler.z); TO rot.eulerAngles = new Vector3(-euler.x, -euler.y, euler.z); because Motive and Unity have different coordinate systems.
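
    (I also wonder whether the handedness change could be done directly on the quaternion components instead of going through eulerAngles. Just a guess, and the sign pattern depends on which axis Motive mirrors relative to Unity, but if it is the X axis, matching the pos.x = -pos.x in getPosition, it might look like this:)

    Code (CSharp):
    // Sketch only: for a mirrored X axis, negate the other two imaginary components
    // and keep x and w, instead of flipping Euler angles.
    Quaternion q = networkData.getRigidbody(rigidbodyIndex).orientation;
    Quaternion converted = new Quaternion(q.x, -q.y, -q.z, q.w);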

    As you suggested, I modified the getOrientation(int rigidbodyIndex) function so it does not use eulerAngles and just returns rot.
    I also changed rot = rot * Quaternion.Euler(rotationOffset); TO rot *= Quaternion.Euler(180.0f, 180.0f, 0) or rot *= Quaternion.Euler(-180.0f, -180.0f, 0)... But my problem is still there: the finger objects are not at the position I wanted, and they still flip when I rotate my finger...

    Have you seen my video?

    Thanks!

    Carvin
     
  10. DrBlort

    DrBlort

    Joined:
    Nov 14, 2012
    Posts:
    72
    I've seen it, yes. Maybe the object center (the pivot point) is not correct, or OptiTrack Motive assumes a different hierarchy than what you have in your scene, i.e. your object is parented differently.
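
    If the pivot turns out to be the issue, one workaround (just a sketch, with made-up names) is to apply the tracked pose to an empty parent placed at the rigid body's pivot and keep the visible mesh as an offset child, so the rotation happens around the marker pivot instead of the mesh origin:

    Code (CSharp):
    // Hypothetical setup code: "fingerMesh" is your existing finger model,
    // "pivotToMeshOffset" is the measured offset between the rigid body pivot
    // and the mesh origin.
    GameObject pivot = new GameObject("TrackedFingerPivot");
    pivot.AddComponent<OptiTrackObject>();               // the script that applies the tracked pose

    fingerMesh.transform.SetParent(pivot.transform, false);
    fingerMesh.transform.localPosition = pivotToMeshOffset;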

    Try the conditional adjustment for when the rotation changes too much from one frame to the next.
     
  11. carvinx

    carvinx

    Joined:
    Mar 19, 2014
    Posts:
    34
    Hi, DrBlort

    Thanks for your answer. The finger objects don't have any parent. As you suggested, I will check how the angles change between frames when I rotate my finger...
     
  12. mash02

    mash02

    Joined:
    Jun 16, 2021
    Posts:
    1
    Hi everyone,
    I have a similar problem. I searched the internet but wasn't lucky enough to find a solution. It would be nice if anyone could help me with solutions, suggestions or any guidelines.

    Problem: I have a physical pen that is tracked by OptiTrack (Motive 2.3.5) and shown in Unity VR. The pen's translation along the X, Y, Z axes works fine, but the problem is in the rotation. When the rotation angle is below 180 degrees everything is fine, but if the pen rotates more than 180 degrees around the Y axis, the rotation gets an extra offset on each axis.
    For example: if I rotate the pen 180 degrees around the Y axis and then rotate it in any direction, it does not rotate as expected (it jumps, or an extra angle is added, similar to this problem).

    My setup in Unity: Parent object > Penbody (child).
    Script: For rotation I use quaternions; no Euler angles are used in the code.
    I have a custom script that calibrates the OptiTrack coordinates to the Unity coordinates.
    I use this code: https://github.com/felixkosmalla/un...r/Assets/Scripts/Calibration/VRCalibration.cs

    The following theory is a great help for finding the optimal rotation in 3D:
    https://nghiaho.com/?page_id=671
    It's a bit weird that translation and rotation work fine until the pen rotates past 180 degrees around the Y axis.

    OptiTrack coordinates of the tracked pen, as displayed in VR:

    Code (CSharp):
    void UpdatePose()
    {
        OptitrackRigidBodyState rbState = StreamingClient.GetLatestRigidBodyState( RigidBodyId, NetworkCompensation );
        if ( rbState != null )
        {
            // this.transform.localPosition = rbState.Pose.Position;
            // this.transform.localRotation = rbState.Pose.Orientation;

            // Raw coordinates from OptiTrack
            Vector3 penOptiTrackPosition = rbState.Pose.Position;
            Quaternion penOptiTrackRotation = rbState.Pose.Orientation;
            penPosition = new Vector3(penOptiTrackPosition.x, penOptiTrackPosition.y, penOptiTrackPosition.z);
            penOrientation = new Quaternion(penOptiTrackRotation.x, penOptiTrackRotation.y, penOptiTrackRotation.z, 1);

            transform.localPosition = penPosition;
            transform.localRotation = penOrientation;

            // print("Pen rotation: " + penOrientation.eulerAngles.y);
        }
    }
    For the coordinate calibration, I used the following implementation:
    1. Theory: https://nghiaho.com/?page_id=671
    2. public void svd_matrix_algorithm() from this: https://github.com/felixkosmalla/un...r/Assets/Scripts/Calibration/VRCalibration.cs