
Graphics "VoluMedic For Unity", Direct Volume Renderer

Discussion in 'Tools In Progress' started by UlfvonEschlauer, Feb 23, 2022.

  1. UlfvonEschlauer

    UlfvonEschlauer

    Joined:
    Dec 3, 2014
    Posts:
    127
    Hello!
    I wanted to share a little of what I have been working on lately. My company develops tools for the rendering, processing and analysis of volumetric data from modalities like computed tomography, MRI, laser microscopy, etc.
    Our current flagship product is VoluMedic for LightWave3D. I am currently developing a cinematic-quality volume renderer for Unity. Right now this is for internal use only (so don't expect to find it in the Asset Store any time soon):



    The video above features raytraced soft shadows, raytraced ambient occlusion and volumetric subsurface scattering. The render textures are composited in my own custom compositor after some filtering, etc.
    It technically renders at about 35 to 40 fps on my GTX 1060, but I had to use some dirty tricks to allow animation from the Timeline, which lowered the frame rate somewhat.
    I also wrote a custom Depth of Field effect, which can be seen in the Drosophila rendering.
    Let me know what you think!
     


    julienkay likes this.
  2. one_one

    one_one

    Joined:
    May 20, 2013
    Posts:
    621
    Woah, that's beautiful! You mention that render textures are composited - what does that mean exactly? Are you doing some preprocessing steps for subsurface scattering and AO? Either way, that's really nice - personally, I think high-quality volume rendering would be a tremendously helpful tool in Unity for all those medical mixed reality assets (or at least those that run on a PC - I'm not expecting this to run on a mobile device).
     
    UlfvonEschlauer likes this.
  3. UlfvonEschlauer

    UlfvonEschlauer

    Joined:
    Dec 3, 2014
    Posts:
    127
    Thanks for the kind words! To answer your question: the volume renderer outputs multiple render textures (diffuse color, specular, subsurface scattering, drop shadows, etc.), and those are then post-processed in various ways before being combined into the final result in a compositing shader. There are other steps along the way and afterwards, e.g. a Depth of Field filter.
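    Just to illustrate the general idea (a minimal sketch, not my actual pipeline - the shader name and texture property names below are made up): in the built-in render pipeline the separate outputs could be fed into a compositing material and blitted onto the final target.

    Code (CSharp):
    using UnityEngine;

    // Minimal sketch: combine several volume-renderer outputs in a compositing
    // shader. "Hidden/VolumeComposite" and the texture property names are
    // placeholders, not the real pipeline.
    public class VolumeCompositor : MonoBehaviour
    {
        public RenderTexture diffuse;     // diffuse color pass
        public RenderTexture specular;    // specular pass
        public RenderTexture scattering;  // subsurface scattering pass
        public RenderTexture shadows;     // drop shadow pass

        Material compositeMat;

        void Start()
        {
            compositeMat = new Material(Shader.Find("Hidden/VolumeComposite"));
        }

        // Composite the (already filtered) passes over the camera image.
        void OnRenderImage(RenderTexture src, RenderTexture dst)
        {
            compositeMat.SetTexture("_DiffuseTex", diffuse);
            compositeMat.SetTexture("_SpecularTex", specular);
            compositeMat.SetTexture("_ScatterTex", scattering);
            compositeMat.SetTexture("_ShadowTex", shadows);
            Graphics.Blit(src, dst, compositeMat);
        }
    }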
     
    one_one likes this.
  4. one_one

    one_one

    Joined:
    May 20, 2013
    Posts:
    621
    Sounds great - so how long does it take to calculate those render textures? Is cutting through the volume or changing the lighting angle possible in real time? Or real-time-ish (< 1 second calculation time for deferred updates)?
     
    UlfvonEschlauer likes this.
  5. UlfvonEschlauer

    UlfvonEschlauer

    Joined:
    Dec 3, 2014
    Posts:
    127
    The 3D textures are calculated once during import of the dataset. That can take anywhere from a few seconds to 3 minutes or so, depending on the size and type of the dataset (DICOM is a bit slower than my own file format).
    The rest runs completely in real time. There is no delay for animation, lighting calculations, transfer function changes, or bounding adjustments.
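    To give a rough idea of that import step (a simplified sketch, not my actual importer - the single-channel format and the way the voxel data arrives are assumptions):

    Code (CSharp):
    using UnityEngine;

    // Sketch: build the Texture3D once when the dataset is imported; the
    // raymarching shader then samples it every frame. The voxel array is
    // assumed to already be decoded (e.g. from DICOM) into 8-bit intensities.
    public static class VolumeImport
    {
        public static Texture3D CreateVolumeTexture(byte[] voxels, int w, int h, int d)
        {
            var tex = new Texture3D(w, h, d, TextureFormat.R8, mipChain: false);
            tex.wrapMode = TextureWrapMode.Clamp;
            tex.filterMode = FilterMode.Bilinear;
            tex.SetPixelData(voxels, 0);    // upload the raw voxel data
            tex.Apply(updateMipmaps: false, makeNoLongerReadable: true);
            return tex;
        }
    }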
    When it is not rendering to PNG files with Unity's Recorder, it runs at about 20 fps (because the update loop has to run all the time so the Timeline gets updated). If I were to comment out that update code, it would be 35 to 40 fps.

    The Recorder's PNG file saver is pretty slow at writing out files (could be my hard drive too), and the frame rate drops to about 7 fps.
    When I have a moment, I will make a short video showing how it works.
     
    Last edited: Mar 27, 2022
    one_one likes this.
  6. one_one

    one_one

    Joined:
    May 20, 2013
    Posts:
    621
    Alright, that would be very interesting! I suppose rendering times would more or less double for stereoscopic rendering, like with VR?
     
  7. UlfvonEschlauer

    UlfvonEschlauer

    Joined:
    Dec 3, 2014
    Posts:
    127
    Yeah, it would be quite slow in VR. Also note that VR could interfere with the current post-processing pipeline. Are you interested in a collab of sorts?
     
    one_one likes this.
  8. one_one

    one_one

    Joined:
    May 20, 2013
    Posts:
    621
    I wrote you a PM!
     
  9. julienkay

    julienkay

    Joined:
    Nov 12, 2013
    Posts:
    170
    The video looks amazing!

    Since you've mentioned it: I've had really good experience using AsyncGPUReadback and then doing the encoding/writing to disk on background threads for this type of stuff.
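    Roughly what I mean (just a sketch, not production code - output path, pixel format and error handling are simplified):

    Code (CSharp):
    using System.IO;
    using System.Threading.Tasks;
    using UnityEngine;
    using UnityEngine.Experimental.Rendering;
    using UnityEngine.Rendering;

    // Sketch of the AsyncGPUReadback approach: request the frame from the GPU
    // asynchronously, then encode and write the PNG on a worker thread so the
    // render loop is not stalled by disk I/O.
    public class AsyncFrameSaver : MonoBehaviour
    {
        public RenderTexture source;
        int frameIndex;

        void LateUpdate()
        {
            int index = frameIndex++;
            int width = source.width;
            int height = source.height;

            AsyncGPUReadback.Request(source, 0, TextureFormat.RGBA32, request =>
            {
                if (request.hasError) return;

                // Copy out of the temporary readback buffer before leaving the callback.
                byte[] raw = request.GetData<byte>().ToArray();

                Task.Run(() =>
                {
                    // EncodeArrayToPNG works on a plain array, so it can run off the
                    // main thread (worth verifying on your Unity version).
                    byte[] png = ImageConversion.EncodeArrayToPNG(
                        raw, GraphicsFormat.R8G8B8A8_UNorm, (uint)width, (uint)height);
                    Directory.CreateDirectory("Frames");
                    File.WriteAllBytes($"Frames/frame_{index:D5}.png", png);
                });
            });
        }
    }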
     
    one_one likes this.