Hi guys! As we all know, VR engines apply a horizontal offset to the eye cameras to achieve the stereoscopic effect. At a very high level of abstraction, this means the same image appears slightly shifted to the right in the left camera and to the left in the right camera. In my case, I'd like to know the exact value of that shift in texture pixels.

I'm working with an HTC Vive, which has a display size of 1512x1680 per eye; the rendered textures are the same size by default. Roughly measured by directly comparing two saved RenderTextures, the shift is about 40-60 px. How can I get that value programmatically?

There are several open discussions about left/right eye camera positions that use XR.InputTracking.GetLocalPosition(Rotation) and cameraToWorldMatrix, but I don't really understand whether that's the right way to get the desired shift in pixels, or whether I should use a different approach.

Here's a visual example: the black boxes in these images are located at the same positions (in pixel coordinates of the 1512x1680 image), but they cover different areas of the same picture, because the VR engine shifted the pictures left and right to produce the stereoscopic effect.
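For context, here is my rough back-of-the-envelope math so far. If the shift is just stereo disparity, then for a point at depth d it should be roughly IPD * f_px / d, where f_px is the focal length in pixels (recoverable from the horizontal FOV, or from the camera's projection matrix element [0,0] times half the texture width). This is only a sketch: the IPD and FOV values below are assumed typical Vive numbers, not values read from my headset, and it assumes symmetric per-eye frusta (the Vive's are actually slightly asymmetric).

```python
import math

def pixel_disparity(depth_m, ipd_m=0.063, width_px=1512, hfov_deg=110.0):
    """Estimate the horizontal stereo shift (in pixels) between the two
    eye textures for a point at depth_m metres.

    ipd_m and hfov_deg are assumptions here; in practice they should be
    read from the headset / the eye camera's projection matrix.
    """
    # focal length in pixels, derived from the horizontal field of view
    f_px = (width_px / 2) / math.tan(math.radians(hfov_deg) / 2)
    # each eye is offset by ipd/2 from center; the total left-vs-right
    # shift of a point at depth d is ipd * f_px / d
    return ipd_m * f_px / depth_m

for d in (0.5, 1.0, 2.0, 5.0):
    print(f"depth {d:4.1f} m -> shift {pixel_disparity(d):5.1f} px")
```

With these assumed numbers the shift comes out around 30-70 px for content at 0.5-1 m, which is at least in the same ballpark as the 40-60 px I measured, so maybe this is the right direction. But I'd still like to know the proper programmatic way to get it.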