Determining Player "Forward"

Discussion in 'VR' started by Habitablaba, Oct 12, 2018.

  1. Habitablaba

    Habitablaba

    Joined:
    Aug 5, 2013
    Posts:
    136
    I'm looking for some suggestions for ways to infer what the player's "forward" vector is in a room-scale 360 experience.
    I'm not currently worried about extreme edge cases, especially if it sounds like the user may be intentionally not doing the intended thing, but I am interested in getting a decently accurate forward vector when they could be looking or facing any direction.
    The only real given is that, if they are behaving correctly, both of their hands should be "forward" of their head.
    I also don't care about verticality of the vector, so it'll end up being projected onto the X/Z plane and normalized.

    I'm being intentionally vague as to the interaction that is happening, because I'm really in search of a general solution that could work regardless of the interaction (excluding edge cases). So if it were punching, "arm swing" walking, Creed's "speed bag" locomotion, saber swinging, etc. ideally the same solution could be used for each of these.
    It's also important that sensor setup and room orientation not be taken into account. I'm looking for something that could work in a large space with 360 tracking.

    I've considered:
    • Grabbing an average position of the hands, getting the vector from the head to this new position
    • Grabbing the vector from the head to the dominant hand
    • Dropping a raycast to the floor and using the hit geometry's forward vector
    • Getting a complicated IK rig setup, potentially hiding the geometry, and using the rig's hip placement/orientation to infer "forward"
    None of these quite feel right to me, although the average-hand-position vector feels the right-est.
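    The average-hand-position option boils down to a few lines of vector math. A rough sketch in plain Python (Unity's Vector3 math would do the same; the positions here are hypothetical (x, y, z) tuples standing in for the head and controller transform positions):

    ```python
    import math

    def flat_forward(head, left_hand, right_hand):
        """Head-to-average-hands vector, projected onto the XZ plane and normalized."""
        # Midpoint of the two hand positions.
        mid = tuple((l + r) / 2 for l, r in zip(left_hand, right_hand))
        # Vector from the head to that midpoint, dropping Y (XZ projection).
        dx, dz = mid[0] - head[0], mid[2] - head[2]
        length = math.hypot(dx, dz)
        if length < 1e-6:
            return (0.0, 0.0)  # hands directly above/below the head; no usable forward
        return (dx / length, dz / length)

    # Head at eye height, both hands held out in front:
    print(flat_forward((0, 1.7, 0), (0.3, 1.2, 0.5), (-0.3, 1.2, 0.5)))
    ```

    The degenerate case (hands directly over the head) is the kind of edge case mentioned above; here it just returns a zero vector, but in practice you'd probably fall back to the last good value or the headset forward.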

    Is this a solved problem that I'm just not searching hard enough for? Anybody else have interesting solutions to this problem? Am I being too vague and should really just seek out a solution for a much more specific use-case and be happy with that?
     
    Last edited: Oct 12, 2018
  2. swanijam

    swanijam

    Joined:
    Nov 14, 2016
    Posts:
    23
    If by forward you mean the likely direction of the torso, you probably can't get a reliable vector, since for most arrangements of the hands and head there are multiple possible poses. What might come close, though, is the line perpendicular (on the XZ plane) to the average line through the head and hand positions.
    [Attached image upload_2018-11-1_11-27-0.png: grey line is the average line, blue ray is the forward vector]

    Of course, this vector would rotate a lot if they're doing a running motion, so you might do a few things to limit its variability or rate of change. You could average the head camera's forward with that blue ray, or average the forward vector over the last 1 or 2 seconds, or both. You could also influence it by their direction of motion if they're running, or the direction of the nearest enemy if they're fighting. I would stay away from doing this with a complicated IK rig setup, because it'll be a lot of work and probably be only a little better than the estimate you could do with math.
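    One way to sketch that "perpendicular to the average line" idea is a small 2D principal-axis fit: find the dominant direction of the head and hand points on the XZ plane, take its perpendicular, and pick whichever sign agrees with the headset forward. Plain Python, with all inputs hypothetical:

    ```python
    import math

    def torso_forward(points_xz, head_forward_xz):
        """Perpendicular to the best-fit line through the head/hand points on the
        XZ plane, signed so it roughly agrees with the headset's forward."""
        n = len(points_xz)
        mx = sum(p[0] for p in points_xz) / n
        mz = sum(p[1] for p in points_xz) / n
        # 2x2 covariance terms of the point cloud.
        sxx = sum((p[0] - mx) ** 2 for p in points_xz)
        szz = sum((p[1] - mz) ** 2 for p in points_xz)
        sxz = sum((p[0] - mx) * (p[1] - mz) for p in points_xz)
        # Angle of the principal axis (the grey "average line").
        theta = 0.5 * math.atan2(2 * sxz, sxx - szz)
        line = (math.cos(theta), math.sin(theta))
        # Its perpendicular has two choices; keep the one facing head-forward.
        perp = (-line[1], line[0])
        if perp[0] * head_forward_xz[0] + perp[1] * head_forward_xz[1] < 0:
            perp = (-perp[0], -perp[1])
        return perp
    ```

    The smoothing suggested above would then just be a running average (or lerp per frame) of this vector over the last second or two.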
     
  3. Innovine

    Innovine

    Joined:
    Aug 6, 2017
    Posts:
    522
    Another idea is to take the head-to-left-hand vector and the head-to-right-hand vector and find the vector in the middle. Then lerp between that and the headset's forward vector, adjusting the lerp factor to taste.
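    A quick sketch of that bisector-plus-lerp idea in plain Python (the sum of two unit vectors bisects the angle between them; names and the `t` factor are illustrative, not from any particular API):

    ```python
    import math

    def normalize(v):
        l = math.hypot(v[0], v[1])
        return (v[0] / l, v[1] / l) if l > 1e-6 else (0.0, 0.0)

    def blended_forward(head, left_hand, right_hand, head_forward_xz, t=0.5):
        """Bisector of head-to-left and head-to-right on the XZ plane,
        lerped with the headset forward. t=0 is hands only, t=1 is headset only."""
        to_left = normalize((left_hand[0] - head[0], left_hand[2] - head[2]))
        to_right = normalize((right_hand[0] - head[0], right_hand[2] - head[2]))
        # Adding the two unit vectors gives the angle bisector.
        middle = normalize((to_left[0] + to_right[0], to_left[1] + to_right[1]))
        hf = normalize(head_forward_xz)
        # Plain lerp then renormalize (a slerp would also work here).
        return normalize((middle[0] * (1 - t) + hf[0] * t,
                          middle[1] * (1 - t) + hf[1] * t))
    ```

    In Unity you'd likely use Vector3.Slerp on the flattened vectors instead, but the shape of the calculation is the same.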