Hi, I have an application that shows portals into different worlds. It does so by using render textures, with a total of six cameras in different locations, all moving as the player moves. This works well on a computer screen, but if I hook it up to Windows Mixed Reality it doesn't look good at all. Is there something I need to know about using render textures in VR? Cheers
Could you elaborate on how you are using the render textures, your camera setup, and what issues you're seeing when using WMR? VR does require special considerations, so depending on your usage and setup, check out: 1) https://unity3d.com/learn/tutorials/topics/xr/rendering-vr 2) https://docs.unity3d.com/Manual/SinglePassStereoRendering.html
Hi @nienokoue, I have a similar issue: I use a custom portal shader, as found in the portal GitHub project, which works nicely in LWRP, but in VR with single-pass rendering it only outputs to one eye, causing massive flickering. Were you able to get your issue solved, and if so, would you share the result? @Benjams, your input on getting the above GitHub project running in VR would be highly appreciated. It would also be a nice showcase of how to get custom shaders working in VR.
I recommend you familiarize yourself with VR rendering (https://unity3d.com/learn/tutorials/topics/xr/rendering-vr) and then use the stereo-rendering helper functions (https://docs.unity3d.com/Manual/SinglePassStereoRendering.html) to upgrade your shaders to be stereo-aware. You may also need to adjust your portal rendering camera to render in stereo, or set up a portal-camera/renderTexture pair for each eye and render into the correct slice of the eye texture when doing your portal shader pass. Alternatively, you may instead find it useful to use the stencil buffer for this kind of effect.
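To illustrate what "stereo-aware" means in practice, here is a rough sketch of a fragment pass that samples a screen-space render texture using Unity's single-pass stereo macros (built-in pipeline; `_PortalTex` is a hypothetical texture name — adapt to your actual portal shader):

```hlsl
// Minimal stereo-aware vert/frag pair for sampling a screen-space texture
// under single-pass stereo (macros from UnityCG.cginc).
struct appdata
{
    float4 vertex : POSITION;
    UNITY_VERTEX_INPUT_INSTANCE_ID
};

struct v2f
{
    float4 pos : SV_POSITION;
    float4 screenPos : TEXCOORD0;
    UNITY_VERTEX_OUTPUT_STEREO   // carries the eye index to the fragment stage
};

UNITY_DECLARE_SCREENSPACE_TEXTURE(_PortalTex); // eye-aware texture declaration

v2f vert (appdata v)
{
    v2f o;
    UNITY_SETUP_INSTANCE_ID(v);
    UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);  // select the correct eye
    o.pos = UnityObjectToClipPos(v.vertex);
    o.screenPos = ComputeScreenPos(o.pos);     // screen-space UVs
    return o;
}

fixed4 frag (v2f i) : SV_Target
{
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i);
    float2 uv = i.screenPos.xy / i.screenPos.w;
    // Remap the UVs into the correct half/slice of the double-wide
    // or texture-array eye texture for the current eye.
    return UNITY_SAMPLE_SCREENSPACE_TEXTURE(_PortalTex,
        UnityStereoTransformScreenSpaceTex(uv));
}
```

A shader that samples the texture with plain screen UVs (no `UnityStereoTransformScreenSpaceTex`) is exactly the kind that looks fine on a monitor but renders wrong in one eye under single-pass stereo.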
Playing in the editor with your device attached and the appropriate VR SDK selected in your player settings should work for most devices. You might also prefer to use the MockHMD as the VR SDK. The MockHMD is a good choice if you would rather not rely on having a device connected while developing your project.
@Benjams In the past I used to live-connect my device with the Vive, and that worked nicely, but now I have an Oculus Quest and I haven't figured out yet how to get that to work. Do you know if it is supposed to work?
Maybe check the format you are using for your render textures? I was using them on the Oculus Quest and had weird glitches with the ARGBFloat format. Downgrading the format to RHalf solved my issue (I only needed a low-res R channel). @rwetzold I think the "play and test" function is not currently available on the Quest. Oculus and Unity are planning to make it possible, since the Quest will be usable to play PC VR games by plugging it in. For now, it's just build and run.
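For reference, that format change is just the `RenderTextureFormat` argument when you create the texture. A minimal sketch, assuming you create the portal render textures from script (`PortalTextureSetup` and the 256x256 size are made-up for illustration); it's worth checking `SystemInfo.SupportsRenderTextureFormat` on the target device:

```csharp
using UnityEngine;

public class PortalTextureSetup : MonoBehaviour // hypothetical helper
{
    public Camera portalCamera;

    void Start()
    {
        // RHalf = one 16-bit float channel; far lighter than ARGBFloat
        // (four 32-bit channels) and enough for an R-only portal texture.
        var format = RenderTextureFormat.RHalf;
        if (!SystemInfo.SupportsRenderTextureFormat(format))
            format = RenderTextureFormat.Default; // fall back if unsupported

        var rt = new RenderTexture(256, 256, 16, format);
        portalCamera.targetTexture = rt;
    }
}
```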