
Motion Controllers (Vive or Oculus) without plug-ins (SteamVR or Oculus Tools)

Discussion in 'AR/VR (XR) Discussion' started by plmx, Dec 29, 2016.

  1. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    258
    Is it possible to use the motion controllers (Vive) or the Oculus Touch controllers with native Unity 5.5 - i.e. without the SteamVR asset or the Oculus Tools?

    I noticed that the Vive HMD works fine without SteamVR, but I'm not sure how and where to add controller support, as they don't "simply work".

    Any pointers would be greatly appreciated.
     
  2. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    258
  3. NickAtUnity

    NickAtUnity

    Unity Technologies

    Joined:
    Sep 13, 2016
    Posts:
    84
    plmx likes this.
  4. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    258
    Hi Nick,

    thanks for the answer!

    I implemented input yesterday, though it took me quite some time to figure it out. The precise steps could be more explicit in the docs, I think. I have a working implementation, but I am not sure it is the right way - so, can you confirm that the best way to map the buttons is to:
    • for plain pressable buttons, create a "Key or Mouse Button" axis in the Input Manager, and then use "joystick button <buttonNr>" as the positive button in the axis, where buttonNr comes from the page you linked ("Unity Button ID")
    • for 0-1 inputs like the trigger, and for the thumbsticks/touchpad, create a "Joystick Axis" axis in the Input Manager, and then set the axis to <axisNr>, where axisNr comes from the page you linked ("Unity Axis ID")
    ...?
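
    For reference, reading those two kinds of mappings in a script might look like the sketch below. The axis names "TriggerAxis" and "GripButton" are assumptions - they have to match whatever entries you created in the Input Manager (a "Joystick Axis" entry and a "Key or Mouse Button" entry, respectively):

    ```csharp
    using UnityEngine;

    public class ControllerInputExample : MonoBehaviour
    {
        void Update()
        {
            // 0-1 analog value from a "Joystick Axis" Input Manager entry
            // (axis number taken from the "Unity Axis ID" column)
            float trigger = Input.GetAxis("TriggerAxis");

            // pressable button from a "Key or Mouse Button" entry whose
            // positive button is "joystick button <Nr>" ("Unity Button ID")
            if (Input.GetButtonDown("GripButton"))
                Debug.Log("Grip pressed, trigger at " + trigger);
        }
    }
    ```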

    Since I am currently trying to fully support both Vive and Oculus with native Unity, some follow-up questions for you:
    • How do I trigger haptic feedback on the controllers?
    • How can I get the velocity of the controllers? This is needed for throwing objects (at least the SteamVR plugin uses it for throwing; maybe there is another way).
    • The trigger axis only reports 0 or 1 for me - no intermediate values.
    • While loading scenes, at least SteamVR with the Vive drops back into the "This is real" compositor. The SteamVR plugin offers a way to set a custom loading screen. Is this possible in stock Unity?
    Thanks!
     
    Last edited: Jan 4, 2017
  5. williamj

    williamj

    Unity Technologies

    Joined:
    May 26, 2015
    Posts:
    94
    @plmx,

    The way you set up button and axis input is correct, sorry it was such a pain!

    As for your other questions:

    - Haptic feedback is not supported natively, yet.

    - In our public releases, querying velocity of the controllers is not native but is coming very very soon.

    - Hmmmm, axis of the triggers should return intermediate values... Perhaps check that you're not casting the value to an int. If that's not the case, try playing around with the Gravity, Dead, and Sensitivity properties on the associated input element in the Input Manager.

    - I'm not sure! I'll ask around.

    -Will

    Unity QA
     
    plmx likes this.
  6. williamj

    williamj

    Unity Technologies

    Joined:
    May 26, 2015
    Posts:
    94
    plmx likes this.
  7. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    258
    @williamj Thanks very much for your detailed answers, very much appreciated!! It seems like you want to include all of the functionality I mentioned in Unity itself at some point, which is great!

    Do you already have a release planned for haptics? That, along with velocity, would be the most important for me.

    Regarding the axis: It's a float, but setting the sensitivity to 1 did the trick. Thanks!
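
    For anyone else debugging a trigger axis, it can help to compare Input.GetAxis with Input.GetAxisRaw: GetAxis applies the Input Manager's Gravity/Sensitivity processing mentioned above, while GetAxisRaw skips it, so comparing the two separates a mapping problem from a processing problem. A minimal sketch ("TriggerAxis" is an assumed Input Manager entry name):

    ```csharp
    using UnityEngine;

    public class TriggerAxisCheck : MonoBehaviour
    {
        void Update()
        {
            // Processed value (subject to the entry's Sensitivity/Gravity)
            float processed = Input.GetAxis("TriggerAxis");

            // Unprocessed value straight from the device
            float raw = Input.GetAxisRaw("TriggerAxis");

            Debug.LogFormat("processed={0:F2} raw={1:F2}", processed, raw);
        }
    }
    ```

    If raw shows intermediate values but processed snaps to 0/1, the Input Manager settings are the culprit rather than the mapping.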
     
    Last edited: Jan 5, 2017
  8. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    258
    @williamj @NickAtUnity One more thing here is the size of the play area for room-scale mode. Is there a way to get that in stock Unity? Planned?
     
  9. NickAtUnity

    NickAtUnity

    Unity Technologies

    Joined:
    Sep 13, 2016
    Posts:
    84
    For both haptics and play area, we are currently looking at how best to expose that via native Unity APIs. We have no timeline in place, but are doing the R&D to figure out how we can give developers access to those systems in a way that will work smoothly and consistently with various SDKs.
     
    Last edited: Jan 11, 2017
    plmx likes this.
  10. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    258
    Thanks!
     
  11. bzor

    bzor

    Joined:
    May 28, 2013
    Posts:
    31
    sorry to revive this thread, but it looks like you have a good handle on what I'm dealing with now..

    I'm trying to map the Touch buttons in the Input Manager, and I'm having some issues reading all of the buttons. I followed what plmx suggested, but here are the mappings I'm getting:

    Button.One is "A" which is "joystick button 0" in the chart
    - "joystick button 0" actually fires when I press "B"

    Button.Two is "B", but the Unity Button ID isn't in the chart

    Button.Three is "X" which is "joystick button 2" in the chart
    - "joystick button 2" actually fires when I press "Y"

    Button.Four is "Y" but the Unity Button ID isn't in the chart

    Button.PrimaryThumbstick and Button.SecondaryThumbstick work as expected (8/9)


    So with this setup I can read B and Y but not A and X. Super confusing - am I missing something obvious here? Thanks!! I'm on Unity 5.5 with Rift/Touch and the OpenVR/SteamVR plugin.
     
    minersail likes this.
  12. buestad

    buestad

    Joined:
    Oct 3, 2012
    Posts:
    11
    I am working on implementing the Vive controller buttons now using just Unity's native input. Most of it is OK, but I am not happy with the trigger button. Button IDs 14 (left) and 15 (right) go true before the physical click on the Vive controller. The float value from the corresponding axes 9 (left) and 10 (right) is 0.25 when the button goes true and 0.2 when it goes false. The float value goes all the way up to 0.9 before the click is heard; when the click happens, the value jumps to 1.0. I can check whether the value is 1.0 to detect the actual click, but it would be nicer if the Unity implementation used the physical click for the bool value of button IDs 14 and 15 instead of the 0.2-0.25 hysteresis.
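
    The workaround described above (treating the axis reaching 1.0 as the click) could be sketched like this; "LeftTriggerAxis" is an assumed Input Manager entry mapped to axis 9, and the 0.99 threshold is a guard against float comparison rather than an official value:

    ```csharp
    using UnityEngine;

    public class ViveTriggerClick : MonoBehaviour
    {
        bool wasClicked;

        void Update()
        {
            // The hard mechanical click reports 1.0; values below that are
            // just trigger travel. Compare with a small tolerance.
            bool clicked = Input.GetAxis("LeftTriggerAxis") >= 0.99f;

            if (clicked && !wasClicked)
                Debug.Log("Trigger clicked");

            wasClicked = clicked;
        }
    }
    ```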
     
  13. SuppleTeets

    SuppleTeets

    Joined:
    Aug 22, 2014
    Posts:
    23
    This thread was super helpful!

    Also, is there a clever way to detect when controllers are active/inactive? Like, when they're tracking vs when they are not? I'm trying to switch to full native and I can't seem to find that one. (I'm on 5.6.0f3)

    EDIT:
    Looks like in 2017.1b there's VRNodeState.tracked which contains the goods. Guess I gotta go beta for it.
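
    A sketch of how that API could be used in 2017.1 (note the namespaces were UnityEngine.VR at the time and later became UnityEngine.XR; node types other than the hands are also reported, so filter for the ones you care about):

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.VR; // UnityEngine.XR in later versions

    public class ControllerTrackingState : MonoBehaviour
    {
        readonly List<VRNodeState> nodeStates = new List<VRNodeState>();

        void Update()
        {
            // Fills the list with the current state of all tracked nodes
            InputTracking.GetNodeStates(nodeStates);

            foreach (var state in nodeStates)
            {
                if (state.nodeType == VRNode.LeftHand ||
                    state.nodeType == VRNode.RightHand)
                {
                    Debug.Log(state.nodeType + " tracked: " + state.tracked);
                }
            }
        }
    }
    ```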
     
    Last edited: Apr 13, 2017
  14. RealSoftGames

    RealSoftGames

    Joined:
    Jun 8, 2014
    Posts:
    220
    Sorry to necro this thread, but I seem to be running into similar problems on Unity 5.6, trying to drop SteamVR and hook in my own implementation. I can't get the touch buttons to work on any of the controllers, and as bzor mentioned, the mappings don't match the documentation.
     
    bzor likes this.
  15. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    258
    We gave up on native Unity VR support and are now using SteamVR and OculusTools. There are just too many things that do not work, or do not work correctly, to implement it natively. Real deal-breakers (in 5.6):
    Some more minor things which are still annoying:
    Unless Unity addresses these, we are sticking to the plug-ins (and a custom-made wrapper).

    Sorry I can't be of more help here.
     
    Last edited: May 15, 2017
    Gruguir and bzor like this.
  16. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    A true cross platform API for VR with Unity is still a ways off. If you take a look at the 2017 betas you will see that VR.InputTracking does have quite a few additions but still not comprehensive. In general, Unity has stated that they are looking to get things fairly universal around the time of their new input system which is still very experimental and will only result in another experimental build later this summer. Realistically speaking I think we can count on at least another year of using the utilities from each vendor.
     
  17. Juanola_

    Juanola_

    Joined:
    Sep 29, 2015
    Posts:
    10
    Hi! Is there any update on the haptic feedback using the Unity VR API?
     
  18. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
  19. Juanola_

    Juanola_

    Joined:
    Sep 29, 2015
    Posts:
    10
  20. R1PFake

    R1PFake

    Joined:
    Aug 7, 2015
    Posts:
    263
    Can someone please update the https://docs.unity3d.com/Manual/OpenVRControllers.html page to make this clearer? I had the same "problem": the page nowhere mentions that you have to enter "joystick button <Nr>". I thought I only had to enter the number, and it took me a while to find an answer.
     
  21. Gruguir

    Gruguir

    Joined:
    Nov 30, 2010
    Posts:
    322
    Is there any news since? I started working on XR plugins (input wrapper/device switcher..) after seeing people, including myself, struggling with proprietary SDKs, and native XR integration is really helpful for cross-platform needs. But it looks like most users still rely on OpenVR/Oculus etc. for input, since the native integration is not complete.
     
    SuppleTeets likes this.
  22. cdytoby

    cdytoby

    Joined:
    Nov 19, 2014
    Posts:
    158
    I'm doing this right now, and I also have trouble implementing controller axes, because there is no way to get an axis float by ID.

    Also, for OpenVR, is there a way to detect which VR system is in use? At least for the controllers it is not possible to tell whether I am using an HTC Vive controller or an Oculus Touch controller.
     
  23. Gruguir

    Gruguir

    Joined:
    Nov 30, 2010
    Posts:
    322
    You can check device and controller names.
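
    For example, a sketch of checking those names with stock Unity APIs (the exact strings returned vary by runtime and hardware, so treat the values shown in comments as illustrations, not guarantees):

    ```csharp
    using UnityEngine;
    using UnityEngine.VR; // UnityEngine.XR in later versions

    public class DeviceNameCheck : MonoBehaviour
    {
        void Start()
        {
            // Name of the loaded VR SDK, e.g. "OpenVR" or "Oculus"
            Debug.Log("VR device: " + VRSettings.loadedDeviceName);

            // Model string of the attached headset
            Debug.Log("HMD model: " + VRDevice.model);

            // Connected controller names as seen by the input system
            foreach (string joystick in Input.GetJoystickNames())
                Debug.Log("Joystick: " + joystick);
        }
    }
    ```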
     
  24. Nigey

    Nigey

    Joined:
    Sep 29, 2013
    Posts:
    1,115
    Slight necro here, but I'm having trouble with even the basics of getting the triggers working through the Input Manager. Can anyone see where I'm going wrong (I've been on this for 2 days now)?

    I have this Input Manager setup [screenshot], combined with this code:

    Code (CSharp):
        public const string MenuFormat = "Vive {0} - Menu Button";
        public const string TrackpadPressFormat = "Vive {0} - Trackpad Press";
        public const string TrackpadTouchFormat = "Vive {0} - Trackpad Touch";
        public const string TrackpadVerticalFormat = "Vive {0} - Trackpad Vertical";
        public const string TrackpadHorizontalFormat = "Vive {0} - Trackpad Horizontal";
        public const string TriggerTouchFormat = "Vive {0} - Trigger Touch";
        public const string TriggerSqueezeFormat = "Vive {0} - Trigger Squeeze";
        public const string GripSqueezeFormat = "Vive {0} - Grip Squeeze";

        public bool IsUseButtonTouchPressed { get { return Input.GetButtonDown(string.Format(TriggerTouchFormat, SideFormat)); } }
        public bool IsUseButtonTouchReleased { get { return Input.GetButtonUp(string.Format(TriggerTouchFormat, SideFormat)); } }
        public bool IsUseButtonUp { get { return Input.GetButtonUp(string.Format(TriggerSqueezeFormat, SideFormat)); } }
        public bool IsUseButtonDown { get { return Input.GetButtonDown(string.Format(TriggerSqueezeFormat, SideFormat)); } }
        public bool IsUseButtonHeld { get { return Input.GetButton(string.Format(TriggerSqueezeFormat, SideFormat)); } }
    With this, both IsUseButtonTouchPressed and IsUseButtonTouchReleased flip between true and false constantly while the trigger is held down slightly, and IsUseButtonDown and IsUseButtonUp never fire at all.

    Does anyone know what's going on? The Input Manager just seems like a total enigma to me.
     
    Last edited: Jul 17, 2018