There is a very strange bug when setting eyeTextureResolutionScale to a value other than 1 at runtime on the Oculus Go.

I have a simple way to detect input in my VR scene: I cast a ray from the Oculus Go controller, and when it hits a collider with a specific script, it fires an event. But the same script also implements the OnMouseDown method, which fires the same event, so I can test in the Unity editor without having to build and run again and again.

If eyeTextureResolutionScale = 1, everything works fine on the Oculus Go. But when eyeTextureResolutionScale has any other value, Unity or Oculus casts a ray whose direction changes depending on where you are looking, and that ray sends input signals like a mouse. (The ray I cast from the controller works perfectly; it is not the problem.) So if you press any of the controller's buttons while looking around, OnMouseDown will get called (as long as you are looking at one particular spot). It will even treat a swipe on the controller's touchpad as an OnMouseDown! The strangest part is that this only happens when eyeTextureResolutionScale is different from 1.

If you want to test it yourself, just place a big cube with a BoxCollider near the camera and attach a script with an OnMouseDown method. With eyeTextureResolutionScale = 1, pressing the controller while looking around does not call OnMouseDown. But with eyeTextureResolutionScale = 1.7 (for example), pressing the controller while looking around will call OnMouseDown.

PS: To change eyeTextureResolutionScale at runtime you have to use
Code (csharp):
UnityEngine.XR.XRSettings.eyeTextureResolutionScale = 1.7f;
You can set it in Start(), or create buttons to change it yourself at runtime, or whatever you want.

Hope this post helps someone so they don't lose countless hours trying to figure out why there are invisible buttons in their scene.
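For reference, here is a minimal sketch of the repro described above. It assumes a standard MonoBehaviour attached to the test cube; the class name and log message are mine, and the Debug.Log stands in for whatever event your controller raycast normally fires:

```csharp
using UnityEngine;

// Attach this to a big cube with a BoxCollider placed near the camera.
public class InvisibleButtonRepro : MonoBehaviour
{
    void Start()
    {
        // Any value other than 1 reproduces the bug.
        // Note the 'f' suffix: eyeTextureResolutionScale is a float.
        UnityEngine.XR.XRSettings.eyeTextureResolutionScale = 1.7f;
    }

    // With the scale at 1 this never fires from the Go controller;
    // with 1.7 it fires when you press a button (or swipe the
    // touchpad) while looking at the cube.
    void OnMouseDown()
    {
        Debug.Log("OnMouseDown fired -- phantom input when scale != 1");
    }
}
```

Switch the value in Start() back to 1f (or expose it on a button) to confirm the phantom input disappears.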