
Combining Free Finger Movement (Oculus Touch) with Inverse Kinematics Rigged Player Model

Discussion in 'AR/VR (XR) Discussion' started by unity_3drapidsolutions, Nov 21, 2017.

  1. unity_3drapidsolutions

    Joined:
    Nov 18, 2017
    Posts:
    21
    Hey all,

    This is my first post, and I am a relatively new Unity developer. I'm looking for guidance on how to combine (1) the free finger-movement animations of the Oculus 'hands', easily loaded as a prefab, with (2) full-body movement driven by inverse kinematics on a rigged (i.e. skeletonized) player model.

    In my ideal setup, I would align the rigged skeleton model with the player controller so the camera view is projected slightly in front of the model's eyes, and inverse kinematics would target the hands and head so the body follows the player's movement.
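    To make the setup above concrete, here is a minimal sketch of how the hand/head targeting could work using Unity's built-in Animator IK on a Humanoid rig. This assumes a Humanoid avatar with "IK Pass" enabled on the animation layer; the target fields (leftHandTarget, etc.) are placeholders that would be wired to the tracked controller and HMD anchors, not part of any Oculus API:

    ```csharp
    using UnityEngine;

    // Sketch: pull a humanoid rig's hands and head toward tracked transforms
    // each frame via Unity's Animator IK callback.
    [RequireComponent(typeof(Animator))]
    public class BodyIKFollower : MonoBehaviour
    {
        public Transform leftHandTarget;   // e.g. left Touch controller anchor
        public Transform rightHandTarget;  // e.g. right Touch controller anchor
        public Transform headTarget;       // e.g. HMD / center-eye anchor

        Animator animator;

        void Awake()
        {
            animator = GetComponent<Animator>();
        }

        // Called by Unity when the layer has "IK Pass" enabled.
        void OnAnimatorIK(int layerIndex)
        {
            if (leftHandTarget != null)
            {
                animator.SetIKPositionWeight(AvatarIKGoal.LeftHand, 1f);
                animator.SetIKRotationWeight(AvatarIKGoal.LeftHand, 1f);
                animator.SetIKPosition(AvatarIKGoal.LeftHand, leftHandTarget.position);
                animator.SetIKRotation(AvatarIKGoal.LeftHand, leftHandTarget.rotation);
            }
            if (rightHandTarget != null)
            {
                animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 1f);
                animator.SetIKRotationWeight(AvatarIKGoal.RightHand, 1f);
                animator.SetIKPosition(AvatarIKGoal.RightHand, rightHandTarget.position);
                animator.SetIKRotation(AvatarIKGoal.RightHand, rightHandTarget.rotation);
            }
            if (headTarget != null)
            {
                // Aim the head along the HMD's forward direction.
                animator.SetLookAtWeight(1f);
                animator.SetLookAtPosition(headTarget.position + headTarget.forward);
            }
        }
    }
    ```

    Note this only positions the wrists and head; individual finger poses are not covered by AvatarIKGoal, which is part of why the Oculus hand prefab's finger animations are a separate question.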

    However, what I have so far is a rigged player model that moves with the player, with the hands of the model roughly approximating Oculus' built-in glowing blue hands - which have great built-in finger animations.

    Do I need to add additional bones/skeleton components to the fingers of the model and abandon Oculus' built in hands?

    Would I need to make my own finger animations, or can I use the built-in ones from Oculus' prefab?

    The reason I want full player models is that my application is a co-op game, and I want each player to have a full physical body. I recognize an alternative may be to have each player see themselves as floating hands, which already have the grab animations built in, while other players see a full body.

    Thanks for your time and help. I want to get this right before moving on to the more complex game components.

    J