
Using the Unity Event System for VR

Discussion in 'AR/VR (XR) Discussion' started by ibrews, May 22, 2017.

  1. ibrews

    ibrews

    Joined:
    Mar 21, 2013
    Posts:
    31
    Hello!

    I really like how the GoogleVR SDK easily utilizes events like PointerEnter, PointerExit, and PointerClick, and was hoping to extend that functionality to a desktop VR experience that I had already built. Unfortunately, it looks like PointerClick doesn't follow the typical "Fire1" input mappings beyond a mouse click, so I can't get PointerClick to respond to a button pressed on the VR game controller.

    I'm trying to keep my scripts as simple as possible since I plan on using this project to teach others. Does anyone know the easiest way to either:
    1) Extend the functionality of PointerClick to accept additional inputs (like, say, the B button on an Oculus Touch controller)
    or
    2) Create a new event that can be handled by the Event System (like, say, VRPointerClick, which could then map to everything Fire1 does as well as GVRController.ClickButton).

    Thank you!
     
  2. catox

    catox

    Joined:
    Dec 19, 2014
    Posts:
    15
    I think what you need to do is extend the input module of the EventSystem, because that's the part that does the mapping. Check the Unity docs for more.
     
  3. Selzier

    Selzier

    Joined:
    Sep 23, 2014
    Posts:
    652
    Once you find what input the VR controller responds to, you can manually inject a PointerClick event via script. I show it in this tutorial at about 13:30:
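    For reference, a minimal sketch of that kind of manual injection (not the tutorial's exact code; the "Fire1" binding and the camera field are assumptions):

    Code (CSharp):
    ```csharp
    using UnityEngine;
    using UnityEngine.EventSystems;

    // Sketch: poll the button your VR controller maps to (assumed "Fire1" here),
    // raycast from the camera, and hand a PointerClick to whatever was hit.
    public class ManualClickInjector : MonoBehaviour
    {
        public Camera vrCamera; // camera to raycast from

        void Update()
        {
            if (Input.GetButtonDown("Fire1"))
            {
                RaycastHit hit;
                Ray ray = new Ray(vrCamera.transform.position, vrCamera.transform.forward);
                if (Physics.Raycast(ray, out hit))
                {
                    var pointerData = new PointerEventData(EventSystem.current);
                    // ExecuteHierarchy bubbles the event up to the first parent
                    // that implements IPointerClickHandler.
                    ExecuteEvents.ExecuteHierarchy(hit.collider.gameObject,
                        pointerData, ExecuteEvents.pointerClickHandler);
                }
            }
        }
    }
    ```
    The hit object (or one of its parents) needs a collider and an IPointerClickHandler implementation for this to do anything.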
     
    ibrews likes this.
  4. ibrews

    ibrews

    Joined:
    Mar 21, 2013
    Posts:
    31
    Thanks @Selzier! Love your youtube channel btw. That line of code about the Execute command definitely helps me a lot. However, I'm trying to have something that works across mobile and desktop within the same scene, so as far as I know, using the GVR SDK isn't an option since a lot of that functionality depends on having your platform set to Android.

    If I don't use the GVR SDK, can I achieve similar results with the standard Unity Event System and a Physics Raycaster component to read PointerEnter and PointerExit? I just tested and it almost seems to work... but the raycast seems very inconsistent, and definitely not centered on my view. It also seems like Unity by default uses the location of the mouse in the Game window to read PointerEnter and PointerExit. Am I missing something?

    Thanks!
     
    ROBYER1 likes this.
  5. Selzier

    Selzier

    Joined:
    Sep 23, 2014
    Posts:
    652
    You would need to write your own custom Input Module (a gaze input module) to replace Unity's Standalone Input Module. That's what I did with Mobile VR Interaction Pack, so I don't have to use the GoogleVR SDK anymore:
    https://www.assetstore.unity3d.com/en/#!/content/82023

    That does not require Android or anything; it should work with any Unity project. I don't have a Vive or Rift so I can't test with those devices, but it should function the same, casting a ray out from the camera's position for PointerEnter/Exit events, etc.

    Different Unity versions place the "center of screen" differently when VR mode is enabled in the build options, so you may need to get the center position from the screen width and height, or from the VREyeTexture width and height. It just depends on the Unity version and whether VR mode is enabled.
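    The bare bones of a gaze module like that could look something like this (a sketch only, not the asset's code; how enter/exit are dispatched here is simplified compared to Unity's own modules):

    Code (CSharp):
    ```csharp
    using UnityEngine;
    using UnityEngine.EventSystems;

    // Sketch of a gaze input module: replaces StandaloneInputModule, casts a ray
    // straight out of the camera each frame, and raises PointerEnter/Exit when
    // the gazed-at object changes.
    public class SimpleGazeInputModule : BaseInputModule
    {
        public Camera gazeCamera;        // HMD camera to cast from
        private GameObject currentTarget; // object gazed at last frame

        public override void Process()
        {
            RaycastHit hit;
            Ray ray = new Ray(gazeCamera.transform.position, gazeCamera.transform.forward);
            GameObject newTarget = Physics.Raycast(ray, out hit)
                ? hit.collider.gameObject
                : null;

            if (newTarget != currentTarget)
            {
                var data = new PointerEventData(eventSystem);
                if (currentTarget != null)
                    ExecuteEvents.ExecuteHierarchy(currentTarget, data, ExecuteEvents.pointerExitHandler);
                if (newTarget != null)
                    ExecuteEvents.ExecuteHierarchy(newTarget, data, ExecuteEvents.pointerEnterHandler);
                currentTarget = newTarget;
            }
        }
    }
    ```
    Put this on the EventSystem object in place of the Standalone Input Module so the mouse no longer drives pointer events.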
     
    ibrews likes this.
  6. ibrews

    ibrews

    Joined:
    Mar 21, 2013
    Posts:
    31
    Gotcha. Thanks so much for the help!
     
  7. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    As far as the mouse goes, if you still have the Standalone Input Module on your event system, the mouse will still drive your pointer events from the main camera. Writing your own input module and removing the standard one would prevent this from happening. You'll of course need to raycast from either the controller or your head instead, and call ExecuteEvents for the pointer events yourself.
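    To illustrate the controller-ray variant of that (hypothetical names; "VRClick" is an assumed input axis, and a real module would also track press/drag state):

    Code (CSharp):
    ```csharp
    using UnityEngine;
    using UnityEngine.EventSystems;

    // Sketch: raycast from a tracked controller's transform instead of the
    // mouse, and dispatch the down/up/click sequence yourself, roughly the
    // order Unity's StandaloneInputModule uses.
    public class ControllerPointer : MonoBehaviour
    {
        public Transform controller; // tracked controller transform

        void Update()
        {
            RaycastHit hit;
            if (!Physics.Raycast(controller.position, controller.forward, out hit))
                return;

            GameObject target = hit.collider.gameObject;
            var data = new PointerEventData(EventSystem.current);

            if (Input.GetButtonDown("VRClick"))
                ExecuteEvents.ExecuteHierarchy(target, data, ExecuteEvents.pointerDownHandler);

            if (Input.GetButtonUp("VRClick"))
            {
                ExecuteEvents.ExecuteHierarchy(target, data, ExecuteEvents.pointerUpHandler);
                ExecuteEvents.ExecuteHierarchy(target, data, ExecuteEvents.pointerClickHandler);
            }
        }
    }
    ```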
     
    ibrews likes this.
  8. ibrews

    ibrews

    Joined:
    Mar 21, 2013
    Posts:
    31
    Ahhh that helps to clarify a few things. Thank you @greggtwep16 !
     
  9. adammarcwilliams

    adammarcwilliams

    Joined:
    May 12, 2017
    Posts:
    6
  10. ippdev

    ippdev

    Joined:
    Feb 7, 2010
    Posts:
    3,853
    Thanks. Just what I was looking for. Now I need to figure out how to get the click action to work with several different UI and 3D object raycast systems.
     
    ba55yunky and Selzier like this.
  11. ippdev

    ippdev

    Joined:
    Feb 7, 2010
    Posts:
    3,853
    In ExecuteEvents.Execute I was able to use
    Code (CSharp):
    ExecuteEvents.Execute(gameObject, new OVRPointerEvents(EventSystem.current), ExecuteEvents.pointerClickHandler);
    which I assume replaces the Unity pointer with the HMD's pointer. If I want to simply click the button or raycast to the 3D object I am looking at by pressing A on the gamepad, how is that done?
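    One possible shape for that, sketched out (OVRPointerEvents is the custom event-data type from the snippet above; the gamepad's A button is JoystickButton0 in Unity's legacy input, and Camera.main is assumed to be the HMD camera):

    Code (CSharp):
    ```csharp
    using UnityEngine;
    using UnityEngine.EventSystems;

    // Sketch: on the gamepad's A press, raycast from the HMD camera and forward
    // a PointerClick to the 3D object being looked at.
    public class GamepadGazeClick : MonoBehaviour
    {
        void Update()
        {
            if (Input.GetKeyDown(KeyCode.JoystickButton0)) // gamepad A button
            {
                RaycastHit hit;
                Transform head = Camera.main.transform;
                if (Physics.Raycast(head.position, head.forward, out hit))
                {
                    ExecuteEvents.Execute(hit.collider.gameObject,
                        new PointerEventData(EventSystem.current),
                        ExecuteEvents.pointerClickHandler);
                }
            }
        }
    }
    ```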
     
    ROBYER1 likes this.