
Question: Handling pointer events on render texture

Discussion in 'UI Toolkit' started by CSStudentsGoe, Aug 18, 2021.

  1. CSStudentsGoe

    Joined:
    Feb 17, 2020
    Posts:
    7
    Hi there,

    I'm currently adapting my UI for VR, so I'm using a render texture on a quad inside the game world to render my UI elements.

    I was wondering how I could detect pointer events on the render texture in order to interact with my UI with a VR controller.

    Would it be a good approach to convert world coordinates of a RaycastHit on the quad to local coordinates of the texture and then dispatch an event? What about hovering?

    Any input appreciated! :)
     
  2. antoine-unity

    Unity Technologies

    Joined:
    Sep 10, 2015
    Posts:
    792
    Hi! Can you confirm you are using UI Toolkit and not the UGUI/Canvas system?
     
  3. CSStudentsGoe

    Joined:
    Feb 17, 2020
    Posts:
    7
  4. antoine-unity

    Unity Technologies

    Joined:
    Sep 10, 2015
    Posts:
    792
    Are you using the UI Toolkit package?
     
  5. CSStudentsGoe

    Joined:
    Feb 17, 2020
    Posts:
    7
    I'm using the version of UI Toolkit that is bundled with Unity 2021.2.
     
  6. antoine-unity

    Unity Technologies

    Joined:
    Sep 10, 2015
    Posts:
    792
    Hi, I have converted a sample from the preview package, which works with 2021.2.

    It shows how to render the UI into a texture (through the PanelSettings configuration), how to apply that texture to 3D objects, and how to remap coordinates from these objects' surfaces to the UI space (UITextureProjection script).

    I can't comment on the specifics of getting VR input to work. You would likely need an EventSystem in your scene with the latest version of the new Input System package (pre-release). As long as there is a configuration for Pointer Events in there, the input should work.
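
    For illustration, a minimal sketch of that remapping, in the spirit of the attached sample (this is not the sample itself; the class and field names are placeholders, and it assumes the quad has a MeshCollider and the panel renders into PanelSettings.targetTexture):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.UIElements;

    public class RenderTexturePointerRemap : MonoBehaviour
    {
        public PanelSettings panelSettings; // panel that renders into a RenderTexture
        public Camera mainCamera;

        void OnEnable()
        {
            // The panel calls this to translate screen positions to panel positions.
            panelSettings.SetScreenToPanelSpaceFunction(ScreenToPanel);
        }

        Vector2 ScreenToPanel(Vector2 screenPosition)
        {
            var invalidPosition = new Vector2(float.NaN, float.NaN);

            // Ray from the camera through the pointer; the mouse position is used
            // directly because the incoming screenPosition's origin convention varies.
            Ray cameraRay = mainCamera.ScreenPointToRay(Input.mousePosition);
            if (!Physics.Raycast(cameraRay, out RaycastHit hit, 100f))
                return invalidPosition; // NaN means "pointer is not over the panel"

            // textureCoord requires a MeshCollider on the quad.
            Vector2 uv = hit.textureCoord;
            uv.y = 1f - uv.y; // UI Toolkit panels use a top-left origin

            var rt = panelSettings.targetTexture;
            return new Vector2(uv.x * rt.width, uv.y * rt.height);
        }
    }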
     

    Attached Files: [converted sample package]
  7. DrViJ

    Joined:
    Feb 9, 2013
    Posts:
    164
    Hello! Sorry for bumping this thread, and thanks for the example; it works perfectly! Just to make sure, is this the recommended way to implement an in-game/world-space GUI?
     
  8. antoine-unity

    Unity Technologies

    Joined:
    Sep 10, 2015
    Posts:
    792
    If you need a 3D perspective and other scene rendering features, then yes, that is currently the only (and recommended) way to go with UI Toolkit.
     
  9. Stranger-Games

    Joined:
    May 10, 2014
    Posts:
    393
    I have used your package, which worked great with the Input System UI module. But when I added an XROrigin, set it up, and changed to the XR UI Input Module, it stopped working.
    I tried adding a world-space canvas, and that works in VR.
    I hope there will be an official way to use UI Toolkit within VR.
    In my VR game the player can use their phone, and building the UI within the phone will be a lot cooler and more realistic, and I believe a lot lighter to render, if UI Toolkit is used.
     
  10. Stranger-Games

    Joined:
    May 10, 2014
    Posts:
    393
    Actually, I would prefer to do that using a custom collider at the tip of the index finger. When this collider hits the phone screen (which shows the UI Toolkit render texture), I want to send that event to UI Toolkit. I guess the only way to do it is to write my own event system or extend the XR UI Input Module, right?
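
    For illustration, a rough sketch of that fingertip detection, assuming a short raycast from the fingertip toward the screen quad (the layer mask and the SendPointerToPanel helper are hypothetical placeholders, not part of any package):

    Code (CSharp):
    using UnityEngine;

    public class FingertipPointer : MonoBehaviour
    {
        public LayerMask phoneScreenLayer; // layer of the quad showing the render texture
        public float touchDistance = 0.02f; // ~2 cm reach past the fingertip

        void FixedUpdate()
        {
            // Short ray along the finger; more precise than raw trigger contacts.
            var ray = new Ray(transform.position, transform.forward);
            if (Physics.Raycast(ray, out RaycastHit hit, touchDistance, phoneScreenLayer))
            {
                // textureCoord requires a MeshCollider on the screen quad.
                // Remap the UV to panel pixels and dispatch it to the UI,
                // e.g. as in the event-firing sketch later in the thread.
                SendPointerToPanel(hit.textureCoord);
            }
        }

        void SendPointerToPanel(Vector2 uv)
        {
            // Hypothetical: convert uv to panel coordinates and send a pointer event.
        }
    }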
     
  11. SimonDufour

    Unity Technologies

    Joined:
    Jun 30, 2020
    Posts:
    609
    I think you will want to look into this.
    [Attached GIF: whack a mole 3d.gif]
     

  12. StripeGuy

    Joined:
    Dec 30, 2016
    Posts:
    52
    I was really excited when I saw the title, thinking I had found an answer to my question, but this seems to do the exact opposite of what I'd like.

    My whole app uses UI Toolkit as an overlay, and I was wondering if there's a way to interact with the 3D world (the old UI system, for example) through a render texture within my UI Toolkit app?
     
  13. SimonDufour

    Unity Technologies

    Joined:
    Jun 30, 2020
    Posts:
    609
    Then you need to look at CameraTransformWorldToPanel in RuntimePanelUtils to go from world space to a position on the UI.

    To know which 3D element is "under the mouse", you would use a regular raycast.
    To go from the UI to the 3D world, you usually take the mouse coordinate during the interaction and call Camera.ScreenToWorldPoint.
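
    A minimal sketch of both directions (the class and field names are illustrative, not from the API):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.UIElements;

    public class WorldUiBridge : MonoBehaviour
    {
        public UIDocument uiDocument;
        public Camera mainCamera;

        // World -> UI: e.g. to place a UI marker over a 3D object.
        public Vector2 WorldToPanel(Vector3 worldPosition)
        {
            return RuntimePanelUtils.CameraTransformWorldToPanel(
                uiDocument.rootVisualElement.panel, worldPosition, mainCamera);
        }

        // UI -> world: which 3D object is under the mouse?
        public bool TryPickWorldObject(out RaycastHit hit)
        {
            Ray ray = mainCamera.ScreenPointToRay(Input.mousePosition);
            return Physics.Raycast(ray, out hit);
        }
    }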
     
  14. bdovaz

    Joined:
    Dec 10, 2011
    Posts:
    1,068
    @SimonDufour @antoine-unity we have tested your example and it works perfectly with the mouse, but in XR with controllers it does not work well, because the Vector2 that arrives in the following function contains very strange negative values:

    https://docs.unity3d.com/ScriptReference/UIElements.PanelSettings.SetScreenToPanelSpaceFunction.html

    I assume this is a bug, isn't it? It seems that the part of the code that checks the eventSource only takes touch, pen, and mouse into account, and I guess that's part of the problem.

    I can only see this code in the 2023.x branches (I am on 2022 LTS), so I don't know whether it is the same there or not.

    https://github.com/Unity-Technologi...System.InputForUIProcessor.cs#L132C50-L132C62

    Is there any way to work around this problem? We have tried ignoring that Vector2 and instead tracking at every moment which XR controller the Input System is processing, so we can switch the controller pivot transform accordingly.

    The rest of the code (not shown here) is from your own example:

    Code (CSharp):
    // Requires: using UnityEngine.InputSystem; using UnityEngine.InputSystem.XR;
    private XRController currentController = null;

    Code (CSharp):
    InputSystem.onEvent += (eventPtr, device) =>
    {
        // Remember which XR controller the Input System processed last.
        if (device is XRController controller)
        {
            currentController = controller;
        }
        else
        {
            currentController = null;
        }
    };

    Code (CSharp):
    panel.SetScreenToPanelSpaceFunction(ScreenCoordinatesToRenderTexture);

    Code (CSharp):
    private Vector2 ScreenCoordinatesToRenderTexture(Vector2 screenPosition)
    {
        // NaN tells UI Toolkit the pointer is not over the panel.
        if (currentController == null) return new Vector2(float.NaN, float.NaN);

        // Aim from the pivot of whichever controller was seen last.
        Transform pointerPivot;

        if (XRController.leftHand == currentController)
        {
            pointerPivot = myLeftControllerPivot;
        }
        else
        {
            pointerPivot = myRightControllerPivot;
        }

        Ray ray = new(pointerPivot.position, pointerPivot.forward);

        if (Physics.Raycast(ray, out RaycastHit hitInfo))
        {
            return GetPositionFromHit(hitInfo);
        }

        return new Vector2(float.NaN, float.NaN);
    }
    What happens is that with one controller active it works, but with both active at the same time it is as if they compete within the same frame and only one of them is processed (in our case the right one).

    Is there any solution to this problem? Thank you.
     
    Last edited: Feb 18, 2024
  15. uBenoitA

    Unity Technologies

    Joined:
    Apr 15, 2020
    Posts:
    224
    Hi bdovaz,

    Support for XR in Unity 2022 and below is only available via the Input System package. If you add an EventSystem to your scene and add the InputSystemUIInputModule to it, then you should be able to reassign any of your standard input actions to the device that best suits your needs, including, I believe, XR.

    Calling SetScreenToPanelSpaceFunction the way you did (ignoring the screenPosition argument) might get you the right XR coordinates with a few fixes, but you would still only get events when the mouse, touch, or pen is also moving or clicking, unless you use the InputSystemUIInputModule actions mentioned above. If you do, then you should be able to use the screenPosition argument sent to you by the XR input directly.

    As for the weird negative values, that's an unfortunate consequence of the y coordinate being encoded from the top of the screen in some parts of Unity and from the bottom in others. Since UI Toolkit is, more than anything else, a replacement for IMGUI, we adopted the top-left origin, whereas uGUI's EventSystem, living in the GameObject world, uses bottom-left. If you read the ScreenCoordinatesToRenderTexture method included in the unityPackage Simon sent you, you should have a good model of where to flip the y axis to get the values you need.
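
    As a tiny sketch of that flip (panelHeight here stands for your panel or RenderTexture height; the names are illustrative):

    Code (CSharp):
    // uGUI/EventSystem y grows from the bottom; UI Toolkit y grows from the top.
    float uiToolkitY = panelHeight - eventSystemY;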

    I hope this helps!

    Benoit
     
  16. bdovaz

    Joined:
    Dec 10, 2011
    Posts:
    1,068
    Have you tested this? I don't see any thread or post about it anywhere.

    @uBenoitA and what if we use XRI? We would need to use:

    https://docs.unity3d.com/Packages/c...R.Interaction.Toolkit.UI.XRUIInputModule.html

    Is it still compatible?
     
    Last edited: Mar 5, 2024
  17. uBenoitA

    Unity Technologies

    Joined:
    Apr 15, 2020
    Posts:
    224
    I must say, this goes beyond my expertise; I don't have XR hardware available to try it. All we do in UI Toolkit is relay the input events we get from the Input System actions. If it's possible to use XRI with uGUI by adding an EventSystem with the appropriate InputModule component for XR, then in principle that same setup should work with UI Toolkit too.

    Unfortunately I can't help you more than that. If that doesn't help, maybe you can get more answers by asking in the Input or XR forums too?
     
  18. bdovaz

    Joined:
    Dec 10, 2011
    Posts:
    1,068
    @uBenoitA do you know how I should configure it so that XRI (through ray/direct interactors in my case) is compatible with UI Toolkit using @SimonDufour's approach? (https://forum.unity.com/threads/handling-pointer-events-on-render-texture.1158272/#post-8708277) Although you suggest asking in the Input or XR forums, my problem is actually with UI Toolkit, which is why I ask here. Below I mention members of the XRI team who I see are active on the forum, in case they can help.

    The XRI input module I am using (the default one, changing only Active Input Mode to Input System Actions):

    [Screenshot: XR UI Input Module inspector settings]

    I am unsure whether I have to map something in the UI actions, because by default the InputActionAsset that comes with the Starter example only configures those UI actions for mouse, pen, and touch. But since the component has that "input devices" section with XR input, the doubt naturally arises.

    And the documentation for this component is not very extensive; it does not go into depth on each option.

    https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@2.5/manual/ui-setup.html

    cc @VRDave_Unity @ericprovencher @unity_andrewc
     
  19. bdovaz

    Joined:
    Dec 10, 2011
    Posts:
    1,068
    Please, no radio silence; an answer would be appreciated.

    Unlocking this use case (XR interaction with UI Toolkit through a render texture) is very important for a lot of people while the real world-space support is still coming in 2025/2026...

    cc @uBenoitA @VRDave_Unity @ericprovencher @unity_andrewc

    The screen coordinates coming from:

    https://docs.unity3d.com/ScriptReference/UIElements.PanelSettings.SetScreenToPanelSpaceFunction.html

    are very weird; something has to be going on with the input. That's why I need help configuring it correctly, in case it's not a Unity bug, because as @uBenoitA says he hasn't really tested it, so we don't know whether it works or not.

    [Screenshot: logged screen coordinates]
     
    Last edited: Mar 14, 2024
  20. ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    336
    I'm happy you're keen on integrating UITK support with XRI. Unfortunately, given that we still do not have official world-space UI support, it's not something our team has gotten very far in exploring, and I cannot assist with the issue you're observing.

    I will say that I've started working with @uBenoitA on exploring how this integration will look when we do support world-space UI, and on helping shape the data and input models for 3D interaction with UITK, but it's very early and we don't have plans for official support via the render texture route you've taken.
     
  21. bdovaz

    Joined:
    Dec 10, 2011
    Posts:
    1,068
    @ericprovencher but can you at least test it? I can attach the project by creating an issue in the Issue Tracker.

    I mean, I want to know whether what I'm attempting is possible somehow. I'm not asking for world-space support like the world-space canvas in uGUI, but for support through a render texture, which I see is supported at least without XR. The 3D UI Toolkit use case with a render texture via keyboard and mouse has worked since 2021 with @antoine-unity's example: https://forum.unity.com/threads/handling-pointer-events-on-render-texture.1158272/#post-7432208

    Apart from that, I have seen this other example (to try something else), but it does not allow me to interact with two controllers at the same time:

    https://gist.github.com/RoxDevvv/83215ae2fe45c5e7416521fe1697fb03

    It is as if the SetScreenToPanelSpaceFunction method is not called when it should be, and the left and right controllers conflict if I use them at the same time.

    Thanks.
     
  22. SeanBannister

    Joined:
    Jul 13, 2020
    Posts:
    23
  23. bdovaz

    Joined:
    Dec 10, 2011
    Posts:
    1,068
  24. antoine-unity

    Unity Technologies

    Joined:
    Sep 10, 2015
    Posts:
    792
    Indeed, at the moment the SetScreenToPanelSpaceFunction method is not sufficient to correctly support VR. More logic is needed to correctly interpret input devices on these platforms and send them appropriate UI events.

    I imagine it might be possible to make it work by firing events into the UI yourself, but we haven't tried this ourselves, as we are already looking at supporting XR with the actual world-space implementation that is in progress (which doesn't require the SetScreenToPanelSpaceFunction override).
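
    For what it's worth, a heavily hedged sketch of what firing events yourself could look like, using the public Touch-based GetPooled overload; this is an untested assumption, not an officially supported recipe, and panelPosition is assumed to already be remapped to panel space (e.g. as in the earlier sample):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.UIElements;

    public static class SyntheticPointer
    {
        // Sends a synthetic pointer-down to the panel of the given UIDocument.
        public static void SendPointerDown(UIDocument document, Vector2 panelPosition)
        {
            var touch = new Touch
            {
                fingerId = 0,
                position = panelPosition, // assumed: already in panel space
                phase = TouchPhase.Began,
            };

            using (var evt = PointerDownEvent.GetPooled(touch, EventModifiers.None))
            {
                document.rootVisualElement.SendEvent(evt);
            }
        }
    }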
     
  25. bdovaz

    Joined:
    Dec 10, 2011
    Posts:
    1,068
    @antoine-unity but that's at least 2 years away... I expected some workaround to unlock this use case with 2022 LTS or 2023/6...

    "I imagine it might be possible to make it work by firing events into the UI yourself"

    Can you explain that a bit more? Thanks.