Single pass anaglyph stereo image effect (red+blue)

Discussion in 'Image Effects' started by cecarlsen, Aug 24, 2017.

  1. cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    862
    I am attempting to write an image effect that combines the left and right eyes into an anaglyph 3D image (red+blue). Because I want this to work in single-pass stereo mode, I use a single camera, chose the VR SDK 'Split Stereo Display (non head-mounted)', and set 'Stereo Rendering Method' to 'Single Pass'.

    EDIT:

    My main issue is that when I assign a RenderTexture to cam.targetTexture, cam.stereoEnabled flips to false. If stereo cameras supported a targetTexture, then I could render at double width and Blit the texture to screen, sampling from the left and right halves. So at present it seems that I am limited to rendering stereo at half width (if I want single pass to work).
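
    A minimal repro of what I am seeing (the script and the texture size are just an example):

    Code (CSharp):
    using UnityEngine;

    // Minimal repro sketch: assigning a targetTexture to the stereo camera
    // makes stereoEnabled flip to false, as described above.
    [RequireComponent( typeof( Camera ) )]
    public class StereoTargetTextureRepro : MonoBehaviour
    {
        void Start()
        {
            Camera cam = GetComponent<Camera>();
            Debug.Log( "Before: stereoEnabled = " + cam.stereoEnabled ); // true in single pass stereo

            // Double-wide render target (example size) where I would like left+right to end up.
            RenderTexture rt = new RenderTexture( 3840, 1080, 24 );
            cam.targetTexture = rt;

            Debug.Log( "After: stereoEnabled = " + cam.stereoEnabled ); // now false
        }
    }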

    Findings:
    – The VR SDK 'Stereo Display (non head-mounted)' only supports outputting directly to displays from an exported app.
    – The VR SDK 'Split Stereo Display (non head-mounted)' renders to two textures (RTEyeTextureLeft0 and RTEyeTextureRight0) if the system does not support single pass, and to one texture (RTEyeTextureDoubleWide0) if it does. But Unity's API does not expose a runtime indicator that tells us which of the two it is doing.

    How can I render a single-pass stereo image to one RenderTexture at a specific size (3840x1080), and then Blit that texture to the screen at another size (1920x1080), reading from the left and right halves to combine them?

    This is easy if you use two cameras, but as I understand the docs, you need to use a single camera to support single-pass stereo.
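
    Roughly what I am after, assuming I could get hold of the double-wide stereo texture (the combine shader, here called 'Hidden/AnaglyphCombine', is just a placeholder name for something I would still have to write):

    Code (CSharp):
    using UnityEngine;

    // Sketch of the effect. It assumes 'Hidden/AnaglyphCombine' is a small shader
    // (yet to be written) that reads the red channel from the left half of the source
    // and the green+blue channels from the right half, and outputs the combined color.
    [RequireComponent( typeof( Camera ) )]
    public class AnaglyphCombineEffect : MonoBehaviour
    {
        Material _material;

        void Awake()
        {
            _material = new Material( Shader.Find( "Hidden/AnaglyphCombine" ) ); // placeholder shader name
        }

        void OnRenderImage( RenderTexture source, RenderTexture destination )
        {
            // 'source' would ideally be the 3840x1080 double-wide stereo texture.
            // The shader samples uv * (0.5, 1) for the left eye and uv * (0.5, 1) + (0.5, 0)
            // for the right eye, and writes them to red and cyan respectively.
            Graphics.Blit( source, destination, _material );
        }
    }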
     
    Last edited: Aug 25, 2017
  2. Nateply

    Joined:
    May 20, 2014
    Posts:
    46
    I've been wondering the same thing. I am attempting to build a simple educational game or two for use at my school (8-11 year olds), where I teach. We have low-end PCs in our computer lab that barely achieve an acceptable framerate with one camera and relatively simple scenery. Oculus, Vive, etc. are out of the question due to cost. Cheap 3D, even if on desktop and using red/blue glasses, would be a plus. I've tried the red/blue anaglyph approach using two cameras, but it cuts the framerate in half. Single-pass VR would appear to do the trick if there were a way to intercept, filter, and render the passes to screen. Maybe a post-processing image effect would work, but I'm not nearly enough of a shader guru to write one. It would be interesting to see what you find out about this.
     
  3. cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    862
    If only Unity's stereo camera supported setting a target RenderTexture (or two). *Sigh*
     
  4. cyenketswamy

    Joined:
    Jul 24, 2015
    Posts:
    41
    Use the built-in Particles/Standard Unlit shader, with Rendering Mode set to Modulate and Color Mode set to Multiply.

    Create two images in external photo-editing software, one cyan and one red. Apply these to plane objects in Unity using the above shader. Create another plane in front of the red one and apply some anaglyph image to it.

    If you look through the filters, they behave exactly like physical anaglyph glasses. Place each filter in front of one of two cameras in your scene and configure the cameras to render to the left and right eyes on your VR headset.
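
    Something along these lines (a rough sketch; the filter materials are assumed to be set up in the Inspector with the Particles/Standard Unlit shader, Rendering Mode = Modulate, Color Mode = Multiply, and the red/cyan images assigned as their textures):

    Code (CSharp):
    using UnityEngine;

    // Rough sketch: put one filter quad in front of each camera, so each camera
    // looks through its own "anaglyph lens", and send each camera to its own eye.
    public class AnaglyphFilterSetup : MonoBehaviour
    {
        public Camera leftCamera;
        public Camera rightCamera;

        // Materials using the built-in Particles/Standard Unlit shader, configured
        // in the Inspector as described above (red filter and cyan filter).
        public Material redFilterMaterial;
        public Material cyanFilterMaterial;

        void Start()
        {
            // Each camera renders to its own eye on the headset.
            leftCamera.stereoTargetEye = StereoTargetEyeMask.Left;
            rightCamera.stereoTargetEye = StereoTargetEyeMask.Right;

            CreateFilter( leftCamera, redFilterMaterial );
            CreateFilter( rightCamera, cyanFilterMaterial );
        }

        void CreateFilter( Camera cam, Material filterMaterial )
        {
            GameObject quad = GameObject.CreatePrimitive( PrimitiveType.Quad );
            quad.transform.SetParent( cam.transform, false );
            // Just in front of the camera's near plane; scale it up so it covers the whole view.
            quad.transform.localPosition = new Vector3( 0f, 0f, cam.nearClipPlane + 0.01f );
            quad.GetComponent<Renderer>().material = filterMaterial;
        }
    }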