Using Unity 2018.3.11f1, I am building an application that spans two separate screens. How can I determine which screen a touch (or mouse click) occurred on? My attempt is below. The problem is that for this to work, I need a way to determine which camera/display to use for the raycast hit test. How do I do that?

Code (CSharp):

    // runs if the left/primary mouse button was pressed, held, or released this frame
    if (Input.GetMouseButton(0) || Input.GetMouseButtonDown(0) || Input.GetMouseButtonUp(0))
    {
        // get both cameras (these lookups should really be cached in Start())
        Camera display1Camera = GameObject.Find("Display1Camera").GetComponent<Camera>();
        Camera display2Camera = GameObject.Find("Display2Camera").GetComponent<Camera>();

        // This is what I am missing
        Camera displayActiveCamera = MagicFunctionThatDeterminesWhetherClickWasOnScreen1OrScreen2(Input, display1Camera, display2Camera);

        // create a ray from the active camera through the click position
        Ray ray = displayActiveCamera.ScreenPointToRay(Input.mousePosition);

        // check if the ray hit anything on the layers in touchInputMask
        // (note the layer mask must be the fourth argument; as the third
        // argument it would be treated as the max distance)
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, Mathf.Infinity, touchInputMask))
        {
            // get the game object that was hit
            GameObject recipient = hit.transform.gameObject;

            // interact with it
            if (Input.GetMouseButtonDown(0))
            {
                recipient.SendMessage("OnTouchDown", hit.point, SendMessageOptions.DontRequireReceiver);
            }
        }
    }
Do you run one app that is split or stretched over two screens? If it is one app, why not check the click position against the resolution: if x is greater than resolution/2, it's on the right screen, or whatever split you need.
Yes, this is one app that is using two screens. I am using two separate cameras, one with Target Display 1 and the other with Target Display 2. Input.mousePosition is returned relative to the current screen, i.e. a check along the lines of if (Input.mousePosition.x > screen1.width) { currentScreen = 2; } will not work, because the x position returned by Input.mousePosition will never be greater than the width of the screen it is on. I got it working for mouse clicks based on the approach discussed in https://answers.unity.com/questions/1349602/detect-the-display-of-current-mouse-position-windo.html and https://stackoverflow.com/questions...le-display-click-works-only-on-maincamera-tag. However, I still need to get it working for touch events, i.e. which screen did the touch occur on?
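For anyone finding this thread later, here is a sketch of the mouse-click approach from those links, using Display.RelativeMouseAt. It assumes the two camera references from the snippet above (display1Camera, display2Camera); note RelativeMouseAt only works on Windows standalone builds and returns Vector3.zero in the Editor and on unsupported platforms:

    // Display.RelativeMouseAt maps the system-wide mouse position to
    // (x, y, displayIndex): x/y are relative to that display, z is its index.
    Vector3 relativePos = Display.RelativeMouseAt(Input.mousePosition);

    if (relativePos == Vector3.zero)
    {
        // Unsupported platform (e.g. the Editor): fall back to the raw
        // position and assume Display 1.
        relativePos = Input.mousePosition;
    }

    // z == 0 means Display 1, z == 1 means Display 2
    int displayIndex = (int)relativePos.z;
    Camera displayActiveCamera = (displayIndex == 1) ? display2Camera : display1Camera;

    // x and y are already relative to the chosen display
    Ray ray = displayActiveCamera.ScreenPointToRay(relativePos);

This still does not help for touch, though, since touch positions don't go through RelativeMouseAt.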
When you raycast on a touch you could check for a transparent blocking object and get the camera from that. Not sure if the EventSystem itself holds an option for detecting the currently "touched" camera.
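A rough sketch of that blocker idea, under the assumption that each display's objects live on their own layer (so a ray from the wrong camera can't accidentally hit the other display's content). CameraForTouch and the per-camera mask pairing are hypothetical names, not a Unity API:

    // Try the touch position against each camera in turn; the first camera
    // whose ray hits something on its own display's layer (e.g. a transparent
    // full-screen blocker behind that display's content) is the touched one.
    Camera CameraForTouch(Vector2 touchPosition, Camera[] cameras, LayerMask[] masks, out RaycastHit hit)
    {
        for (int i = 0; i < cameras.Length; i++)
        {
            Ray ray = cameras[i].ScreenPointToRay(touchPosition);
            if (Physics.Raycast(ray, out hit, Mathf.Infinity, masks[i]))
            {
                return cameras[i];
            }
        }
        hit = default(RaycastHit);
        return null;
    }

Without the per-display layer separation this is ambiguous, because the same screen-space touch position would produce a valid ray from both cameras.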