
UI and multiple displays

Discussion in 'UGUI & TextMesh Pro' started by casimps1, Feb 11, 2016.

  1. casimps1

    casimps1

    Joined:
    Jul 28, 2012
    Posts:
    254
    I'm working on a project in Unity 5.3 w/ multiple display support. I'm rendering a separate canvas to each display in fullscreen native resolution. The idea is that one display is a touchscreen for input while the other is a regular non-touch display for output.

    Everything works great in the editor. When running a build, everything displays well, but no input from the touchscreen is registered when touching UI buttons.

    I found this post from an old beta thread that seemed to say that input didn't work on multiple displays, but the thread hasn't been updated in several months.

    Is there any update on this? Is it just impossible to get this working currently? Are there any workarounds?
     
  2. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    We disabled multiple display support in 5.3 as it had a bug that caused non-native resolutions to be reported incorrectly. This broke all UI, so we had to disable it. Unfortunately it is still not fixed and will not be fixed in 5.4. We are waiting on the multiple display team, but it's not a simple fix. So for the moment, multiple displays and UI do not work together. Once it's fixed we will backport it to a patch release if possible.
     
    KAYUMIY likes this.
  3. casimps1

    casimps1

    Joined:
    Jul 28, 2012
    Posts:
    254
    So, if I need to develop a multi-display app designed for two 1080p monitors, for example, is the best workaround to just build a single canvas at 1920x2160 resolution (1080p x 2) and rely on the OS virtual desktop to stretch the app across multiple displays?

    I've done this in the past and it seemed to work OK. But when I saw that proper multi-display support was added in 5.3, I was eager to give it a try.
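    (For reference, a rough sketch of that spanning setup, assuming the OS stacks the two 1080p displays vertically so the combined desktop is 1920x2160; the class name is just illustrative.)

    Code (csharp):
    using UnityEngine;

    // Minimal sketch of the single-window spanning workaround: one window covers
    // the combined desktop of both monitors and a single canvas spans it.
    // Assumes the two 1080p displays are stacked vertically in the OS display settings.
    public class SpanTwoDisplays : MonoBehaviour
    {
        void Start()
        {
            // Request a window tall enough to cover both stacked 1080p monitors.
            // Fullscreen is disabled so the window is not clamped to a single display.
            Screen.SetResolution(1920, 2160, false);
        }
    }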
     
  4. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    Well, you could use 5.3.0, which had it, and wait until it's fixed before upgrading. If you are using native resolution for both screens then it should work.
     
  5. casimps1

    casimps1

    Joined:
    Jul 28, 2012
    Posts:
    254
    I've been testing builds w/ 5.3.1 and they've been working and displaying properly. It's just that input (registering touches on Unity UI buttons) hasn't been working.

    Are you saying input works properly in 5.3.0? I am planning on full native res for both screens, so, if that's true, it's certainly a possible solution for me.
     
  6. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
  7. psarrett

    psarrett

    Joined:
    Aug 13, 2015
    Posts:
    1
    Any update on this issue? Is it fixed in the latest release (5.4.0f3)?
     
  8. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    No, unfortunately. Significant issues were found in the multiple display system, so it's with the Graphics team at the moment. I'll chase it up and see what the status is.
     
  9. James-Williams

    James-Williams

    Joined:
    Aug 30, 2016
    Posts:
    26
    Karl, were you able to get an update on this patch or its release date? My team is working on a project which uses multiple screens and would like to get some info or possible workarounds.
     
  10. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    This issue is not fixed yet. I need to chase it up with the graphics team, but don't expect a fix for some time.
    How is your multiple display system going to work? How many screens, and will the UI be on all of them?
     
  11. James-Williams

    James-Williams

    Joined:
    Aug 30, 2016
    Posts:
    26
    There are 4 screens. 1 screen is a touch screen which will feature the UI and the other 3 are regular monitors which will simply show gameplay.
     
  12. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    If you make the touch screen the primary monitor then you should be fine.
     
  13. Paradox_Games

    Paradox_Games

    Joined:
    Aug 12, 2014
    Posts:
    1
    Is there any update on this bug? I'm currently running 5.5.0f3 and it seems multi-display input still isn't working. Unlike the OP, I require input through both displays.
     
  14. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    It depends on the platform. Windows works fine. OS X and Linux should work; however, the input will be sent to all displays, not just the relevant one.
     
  15. Shrikky23

    Shrikky23

    Joined:
    Jun 30, 2013
    Posts:
    24
    I am trying to have the gameplay on one monitor and the UI on a touch screen. Does it work with Unity 5.5? I followed the multi-display tutorial, but it says the display count is 1. It doesn't even detect my second monitor.
     
  16. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    Yes, it should be no problem for what you want. What platform is this on? The editor will always say 1 display, but it will still work; you just don't need to activate displays in the editor. The player should return the correct number of displays.
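    For anyone following along, a minimal sketch of the usual pattern in a player build, assuming the extra displays just need activating (the class name is illustrative):

    Code (csharp):
    using UnityEngine;

    // Minimal sketch: log the display count and activate the extra displays.
    // In the editor Display.displays.Length always reports 1, so the loop does nothing there;
    // in a standalone player it returns the real number of connected displays.
    public class ActivateDisplays : MonoBehaviour
    {
        void Start()
        {
            Debug.Log("Connected displays: " + Display.displays.Length);

            // Display 0 (the primary) is always active; activate any additional ones.
            for (int i = 1; i < Display.displays.Length; i++)
            {
                Display.displays[i].Activate();
            }
        }
    }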
     
  17. Shrikky23

    Shrikky23

    Joined:
    Jun 30, 2013
    Posts:
    24
    Hi Karl,
    Thanks for the reply; it works fine now. I did notice some issues, though:
    1. The input doesn't work properly: I need to find out which window actually accepts input (let's say I have 3 Game tabs with Target Display 1, 2, 3). I want the input to be accepted on Target Display 1 consistently, which is not happening in the editor. Also, I cannot maximize the Game tab that is docked in the editor when I have multiple Game tabs open. In a built exe it works fine (to the best of my knowledge; I did not test it intentionally).

    2. When I select a Target Display X, the UI on Target Displays Y and Z changes temporarily. I believe it is trying to apply the aspect ratio of the Target Display selected in the editor to all the Game tabs?

    3. Now that I have the game on Target Display 1, UI X on Target Display 2, and UI Y on Target Display 3, I am trying to create a common canvas that can be rendered on both Target Display 2 and Target Display 3, as these players share some common data. How can I create a canvas and render it on 2 of my 3 target displays? (Note: I am not using a camera for rendering the UI; it is all Screen Space - Overlay and I choose the target display.)

    Thank you for getting back to me so quickly. I appreciate that.
     
    Last edited: May 17, 2017
  18. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    There is currently no way to do this. You need to either use a world-space UI (and use layers to control who sees it) or have a canvas per camera (see the sketch below).

    The editor has limited multiple display support. It does not know which GameView/displayId is being interacted with, so the input goes to all of them. It's just something that needs more work on our end. The only platform that can currently distinguish between the inputs is the Windows Standalone Player. We have plans to address this in the future.
    Yes, I think this is an editor-only issue. Can you file a bug report?
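    A minimal sketch of the canvas-per-display route for the shared UI, assuming the shared UI lives in a Screen Space - Overlay canvas prefab (the field names and display indices are illustrative); keeping the two copies' contents in sync is still up to your own code:

    Code (csharp):
    using UnityEngine;

    // Minimal sketch of "one canvas per display": the shared UI is a prefab that is
    // instantiated once per target display. Prefab reference and indices are illustrative.
    public class SharedCanvasPerDisplay : MonoBehaviour
    {
        public Canvas sharedCanvasPrefab;        // Screen Space - Overlay canvas prefab
        public int[] targetDisplays = { 1, 2 };  // e.g. Display 2 and Display 3 (0-based)

        void Start()
        {
            foreach (var displayIndex in targetDisplays)
            {
                var copy = Instantiate(sharedCanvasPrefab);
                copy.targetDisplay = displayIndex; // render this copy on the given display
            }
        }
    }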
     
    Last edited: May 18, 2017
  19. Shrikky23

    Shrikky23

    Joined:
    Jun 30, 2013
    Posts:
    24
    Okay, I'll make a project and send it once I get home from work. I really hope we could duplicate a canvas across multiple screens; that would be epic. For now I created a world-space canvas for the UI and placed it in front of my TargetCam1 and TargetCam2 (which are at 0,0,0), so my 3D UI is effectively overlaying my 2D UI to get a common shared UI between the target displays. Now that VR is getting popular, multi-display is getting really common.

    Thanks for such a quick response :)
     
    karl_jones likes this.
  20. zombience

    zombience

    Joined:
    Mar 29, 2012
    Posts:
    57
    Hi there, is using UI with mouse input on multiple displays stable now?
    I am currently running a 2-display build with the main output on display 1 and a smaller output at a different resolution on display 2 that contains the UI. The output looks fine, but I cannot interact with the UI on display 2. In some scenarios it seems possible that display 2 is getting mouse clicks and propagating them to the UI, but the coordinates are completely wrong, so I cannot interact with the UI.

    The coordinate offset is just a hunch; I have only seen a symptom that seemed to suggest offset coordinates once or twice, and I cannot reliably reproduce it.

    Any insight on the status of multi display mouse input? Anyone else have this working properly?
     
  21. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    What platform are you building to? Are you using a Graphic Raycaster or a Physics Raycaster?
     
  22. zombience

    zombience

    Joined:
    Mar 29, 2012
    Posts:
    57
    I am working on Windows 10 and building to Windows 10 desktop. I am using the Graphic Raycaster component on the canvas element.
     
  23. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    That should work. Can you file a bug report with a sample project?
     
  24. imranbinazhar

    imranbinazhar

    Joined:
    Apr 11, 2015
    Posts:
    12
    Hello @karl_jones, I am using Unity 2018.2.11f1 on macOS. I have 2 displays with 2 cameras; the main camera is assigned to display 1 and the other camera to display 2. I have attached the same script to both cameras; it raycasts and prints the name of the hit GameObject. I have also placed two cubes, one in front of each camera. Both work fine in the Unity editor. The problem I am facing is that the main display shows the name of the hit object, but it does not work on the second display. Will you please help me with this problem?

    I have even used separate canvases with different settings. In other words, no type of input is working on the second display.
     
    Last edited: Mar 11, 2019
  25. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    Are these Physics Raycasters?
    Multiple display support for those was added in 2018.3.7. For Graphic Raycasters it should work in 2018.2; if it does not, please file a bug report.
     
  26. imranbinazhar

    imranbinazhar

    Joined:
    Apr 11, 2015
    Posts:
    12
    Wow, that speed of reply <3. Thanks for such a fast reply. Yeah, those were Physics Raycasters. I'll switch to 2018.3.7 or later then. Thanks again.
     
    karl_jones likes this.
  27. imranbinazhar

    imranbinazhar

    Joined:
    Apr 11, 2015
    Posts:
    12
    @karl_jones Hey man! I have downloaded Unity3D 2018.3.8f1 on Mac and set everything up like I did before. It seems like it has the same problem with both UI and physics raycasts. I have tried every canvas setting.
     
  28. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    Could you provide an example project so I can try?
     
  29. imranbinazhar

    imranbinazhar

    Joined:
    Apr 11, 2015
    Posts:
    12
    @karl_jones attached is the example project of Unity3D 2018.3.8f1 .
     

    Attached Files:

    • Proj.zip
      File size:
      22.9 KB
      Views:
      391
  30. imranbinazhar

    imranbinazhar

    Joined:
    Apr 11, 2015
    Posts:
    12
    @karl_jones Hey there mate. Have you had a chance to check the example project given above?
     
  31. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    Your project is using Physics.Raycast; I meant that the UI `Physics Raycaster component` supports multiple displays in 2018.3.
    You will need to make some changes to your script.

    This fixes the issue: we need to convert the mouse position into a relative one, but only when not in the editor. Display.RelativeMouseAt doesn't work in the editor :(

    Code (csharp):
    using UnityEngine;
    using UnityEngine.UI;

    public class SelectorScr : MonoBehaviour
    {
        public Camera myCam;
        private RaycastHit hit;
        private Ray ray;
        public Text selectedGOName;

        private void Update()
        {
            if (Input.GetMouseButtonDown(0))
            {
                #if UNITY_EDITOR
                var mousePosInScreenCoords = Input.mousePosition;
                #else
                // Convert the global mouse position into a relative position for the current display
                var mousePosInScreenCoords = Display.RelativeMouseAt(Input.mousePosition);
                #endif

                // z is the display Id
                if (mousePosInScreenCoords.z == myCam.targetDisplay)
                {
                    ray = myCam.ScreenPointToRay(mousePosInScreenCoords);

                    if (Physics.Raycast(ray, out hit))
                    {
                        hit.collider.gameObject.SetActive(false);
                    }
                }
            }
        }
    }
     
    Last edited: Mar 12, 2019
    Circool likes this.
  32. imranbinazhar

    imranbinazhar

    Joined:
    Apr 11, 2015
    Posts:
    12
    Thanks a lot for your help; it's really helpful. One more thing, Karl: were you able to use UI buttons on the second display as well?
     
  33. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    This is what I see
    Untitled.png
     
  34. imranbinazhar

    imranbinazhar

    Joined:
    Apr 11, 2015
    Posts:
    12
    Yeah, I am seeing the same thing at my end too. Were you able to click the buttons at the bottom of both screens? At my end, I am only able to click the button on a single display (the left screen's button). If yes, kindly tell me your Unity version and the platform you are building on.
     
    Last edited: Mar 13, 2019
  35. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    Yes it works fine for me on Windows 10 with Unity 2018.3.8f1. Are both your screens the same resolution?
     
  36. imranbinazhar

    imranbinazhar

    Joined:
    Apr 11, 2015
    Posts:
    12
    No, they are different resolutions.
     
  37. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    It's possible there is a bug when using different resolutions. Can you file a bug report?
     
  38. imranbinazhar

    imranbinazhar

    Joined:
    Apr 11, 2015
    Posts:
    12
    Yeah sure.
     
    karl_jones likes this.
  39. StudioEvil

    StudioEvil

    Joined:
    Aug 28, 2013
    Posts:
    66
    m4d, zyzyx and tabulatouch like this.
  40. tabulatouch

    tabulatouch

    Joined:
    Mar 12, 2015
    Posts:
    23
    We have the same issue; multi-display is a core feature of our current development.
    Hope this fix gets priority!
     
    m4d, mkanevsky and zyzyx like this.
  41. JG-Denver

    JG-Denver

    Joined:
    Jan 4, 2013
    Posts:
    77
    m4d likes this.
  42. m4d

    m4d

    Joined:
    Jan 25, 2012
    Posts:
    27
    We're having the same issues here and are desperately longing for a fix.
     
  43. aalmada

    aalmada

    Joined:
    Apr 29, 2013
    Posts:
    21
    I'm developing an app that runs across 3 large multi-touch displays. I added the following script just to get all touch point coordinates:

    Code (CSharp):
    using System.Text;
    using UnityEngine;
    using UnityEngine.UI;

    public class TouchHandler : MonoBehaviour
    {
        Text textComponent;

        void Awake()
        {
            textComponent = GetComponent<Text>();
        }

        void Update()
        {
            var text = new StringBuilder();
            for (var index = 0; index < Input.touchCount; index++)
            {
                text.AppendLine($"{index} - {Input.GetTouch(index).position}");
            }
            textComponent.text = text.ToString();
        }
    }
    When I touch the first display, I get the coordinates for my 10 fingers. When I touch the other displays, nothing happens.

    I stretched the Windows-native Paint app across the 3 displays and the touch works on all of them.

    Is there a limitation for touch?
     
  44. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    Could you please file a bug report so we can investigate?
     
    aalmada likes this.
  45. aalmada

    aalmada

    Joined:
    Apr 29, 2013
    Posts:
    21
    Hi @karl_jones

    I filed the bug but unfortunately it's still set as "open"...

    As described, our touch system works fine on the Windows desktop but, in Unity, it only works on one display. Since then, I've found that it only works on the main display.

    Unfortunately, merging the displays is not an option for us.
     
  46. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    Do you have the bug number so I can take a look?
     
  47. aalmada

    aalmada

    Joined:
    Apr 29, 2013
    Posts:
    21
  48. karl_jones

    karl_jones

    Unity Technologies

    Joined:
    May 5, 2015
    Posts:
    8,230
    Thanks. QA will take a look. I suspect this is a limitation with our input system more than with multiple displays. I know that our new Input System has plans to support multiple display touch, but I'm not sure about the current system's limitations. I will speak to the input team.
     
  49. aalmada

    aalmada

    Joined:
    Apr 29, 2013
    Posts:
    21
    @karl_jones Thanks for the help!

    I just now added UI buttons and they work fine. The issue seems to be only with the Input.GetTouch() API. Could it be that they are using different Windows APIs?

    I can't update the issue. Can you please update the team on this finding?
     
    Last edited: Oct 9, 2019
  50. aalmada

    aalmada

    Joined:
    Apr 29, 2013
    Posts:
    21
    @karl_jones Sorry, but any luck? How can I add more info to a reported issue? Do I have to submit another one?