I want one camera rendering to a RenderTexture, and another camera that simply blits that RT to the screen. So I have multiple cameras (only one is the MainCamera). At the same time, I want to handle mouse events (OnMouseOver, OnMouseEnter, OnMouseExit, ...) on the game objects rendered into the RenderTexture.

I tried and searched for many ways to do this, and finally settled on adding a PhysicsRaycaster to the RT camera. With that, my game object (a cube) receives events from the EventSystem (OnPointerEnter, OnPointerExit, OnDrag, ...). However, I found a case where this fails: when the RenderTexture's size differs from the actual screen size, the events all fire at an incorrect mouse position. I'm not sure whether this is a bug, or whether PhysicsRaycaster simply doesn't work well with a RenderTexture.

I made a small test project; the demo scene is TestCamera2.unity. Initially, ScreenCamera generates an RT at the same size as the screen. The following screenshot shows in the log that the pointer enter/exit events fire at an incorrect cursor position (not on the cube but beside it) once the window is resized and the RT size no longer matches the screen size.

Is there any suggestion to solve this issue? Or should this be reported as a bug?
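One workaround I'm considering: since the mismatch only appears when the RT size diverges from the screen size, I could rescale the pointer position from screen space into RT space before the raycast (e.g., in a custom PhysicsRaycaster subclass that remaps eventData.position). I haven't confirmed this is the right fix; here is just the remapping arithmetic sketched in Python, with function and parameter names of my own invention:

```python
def screen_to_rt(pointer_x, pointer_y, screen_size, rt_size):
    """Remap a screen-space pointer position into RenderTexture space
    by proportional scaling (assumes the RT is displayed full-screen)."""
    screen_w, screen_h = screen_size
    rt_w, rt_h = rt_size
    return (pointer_x * rt_w / screen_w, pointer_y * rt_h / screen_h)

# Screen is 1920x1080 but the RT stayed at 960x540 after a resize:
# a pointer at screen pixel (1200, 600) maps to RT pixel (600, 300).
print(screen_to_rt(1200, 600, (1920, 1080), (960, 540)))  # → (600.0, 300.0)
```

In Unity terms, the idea would be to feed the rescaled position into the RT camera's ray computation instead of the raw screen position, since that camera's pixel rect is the RT size rather than the window size. Whether that's the intended way to use PhysicsRaycaster with a RenderTexture is exactly what I'm unsure about.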