
Question Vertex displacement as a "camera filter" (noob)

Discussion in 'Shaders' started by mannyhams, Sep 12, 2021.

  1. mannyhams


    Joined:
    Feb 6, 2015
    Posts:
    34
    I would like to apply a vertex displacement aesthetic to all rendered game objects relative to the camera (the Animal Crossing camera effect).

    I haven't been able to find a really nice solution for this, but as the title states, I'm quite new to graphics programming. Advice?

    Potential solutions I'm aware of:

    1. Camera shader replacement
    This works but also blows away other custom shaders I'm using for various other things (e.g. water planes, terrain texture resolution which accounts for blending with adjacent terrain, etc).

    2. Include vertex displacement logic in all the shaders
    I guess this works, and I guess I can use a custom cginclude for code reuse (I think? Not finding official documentation on this...). Perhaps this is the right way?

    3. Displace the vertex data on the CPU side
    I don't want to do this because it unnecessarily affects game logic to achieve a strictly aesthetic outcome.

    4/5/6.. ???
    Probably there are many options I'm not able to see, and probably one of these is best :)

    Thanks in advance for sharing your time and brainpower!
     
  2. Invertex


    Joined:
    Nov 7, 2013
    Posts:
    1,550
    Option 2 is what you want. Yes, you can put the vertex bend code into a CGINC file, reference that, and just add the function to each shader's vertex program.

    Modifications you make in a shader's vertex program only exist during that vert/frag pass. Any further shader passes will not see the modification, so you have to add it to all of your shaders.
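For reference, a minimal sketch of what such a shared CGINC file could look like. Everything here (the file name, the _BendAmount uniform, the quadratic falloff) is illustrative, not from an official Unity include:

```hlsl
// CameraBend.cginc -- hypothetical shared include for the "curved world" bend.
// Assumes the including shader also pulls in UnityCG.cginc, which provides
// unity_ObjectToWorld, unity_WorldToObject and _WorldSpaceCameraPos.
#ifndef CAMERA_BEND_INCLUDED
#define CAMERA_BEND_INCLUDED

float _BendAmount; // how strongly the world curves away from the camera

// Takes an object-space vertex position, pushes it down based on its
// horizontal distance from the camera (quadratic falloff), converts back.
float4 BendVertex(float4 vertexOS)
{
    float4 worldPos = mul(unity_ObjectToWorld, vertexOS);
    float2 delta = worldPos.xz - _WorldSpaceCameraPos.xz;
    worldPos.y -= dot(delta, delta) * _BendAmount;
    return mul(unity_WorldToObject, worldPos);
}

#endif
```

Each shader then adds an #include for this file and calls BendVertex(v.vertex) in its vertex function.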

    Though... technically, with the new direct mesh modification GraphicsBuffer feature in 2021.2, you could use a compute shader to warp all the meshes' vertices on the GPU side, avoiding a costly CPU-to-GPU transfer each frame. This would let you avoid modifying all the shaders you use, at the cost of another vertex pass (done in compute) over every mesh rendered in a frame.

    There's an example project here you can use for reference if you want to try that way: https://github.com/keijiro/NoiseBall6
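A rough C# sketch of that GraphicsBuffer route, using Unity 2021.2's Mesh.GetVertexBuffer API. The component name, buffer/parameter names, and the assumption that kernel 0 is the warp kernel are all made up for illustration; see the linked repo for a complete, working version:

```csharp
using UnityEngine;

// Hypothetical component: lets a compute shader warp this mesh's
// vertex buffer in place on the GPU, with no CPU readback.
public class GpuVertexWarp : MonoBehaviour
{
    public ComputeShader warpShader; // assumed to have its warp logic in kernel 0

    GraphicsBuffer vertexBuffer;

    void OnEnable()
    {
        var mesh = GetComponent<MeshFilter>().mesh;
        // Ask Unity to expose the vertex buffer for raw compute access.
        mesh.vertexBufferTarget |= GraphicsBuffer.Target.Raw;
        vertexBuffer = mesh.GetVertexBuffer(0);
        warpShader.SetBuffer(0, "_Vertices", vertexBuffer);
        warpShader.SetInt("_VertexCount", mesh.vertexCount);
        warpShader.SetInt("_VertexStride", mesh.GetVertexBufferStride(0));
    }

    void Update()
    {
        // One thread per vertex, 64 threads per group (rounded up).
        warpShader.Dispatch(0, (vertexBuffer.count + 63) / 64, 1, 1);
    }

    void OnDisable() => vertexBuffer?.Dispose();
}
```

Note the vertex layout (where position sits within each vertex, and the stride) depends on the mesh, which is why the stride is passed to the kernel rather than hard-coded.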
     
  3. mannyhams


    Joined:
    Feb 6, 2015
    Posts:
    34
    Gotcha, thanks!

    Given the duplication inherent in option 2, is it expected that shader code becomes bloated with app-global logic like this? I suppose this can be combated with liberal use of CGINC includes.

    This sounds good but I must admit I don't fully understand the problem or solution :D But I am on 2021.2 so I'll have a look at the linked repo in the morning, thanks for pointing this out!
     
  4. mannyhams


    Joined:
    Feb 6, 2015
    Posts:
    34
    Oh, the GraphicsBuffer approach looks neat. A few questions:
    1. The compute shader in the linked repo doesn't seem to pass the buffer to downstream shaders. How would that work? Is the mesh data updated automatically in downstream shaders because we specify the type of the graphics buffer?
    2. I don't understand this. Don't we still pass mesh data to the GPU each frame so the shader pipeline can do its work (compute + vert/frag)? Is there caching going on that I've missed?
     
    Last edited: Sep 13, 2021
  5. Invertex


    Joined:
    Nov 7, 2013
    Posts:
    1,550
    It's passed downstream because we are modifying the actual buffer where the mesh data is stored on the GPU side. The CPU only sends the mesh data to the GPU once (or when modifications are made on the CPU side and need to be re-sent); otherwise it just sits in GPU memory, used by shaders each frame, until the game says it no longer needs it and clears the buffer.

    Yes like mentioned above. It would be quite wasteful if we had to send all of the scene's geometry each frame :p
    All that's sent are draw calls with pointers to graphics memory where the mesh data is stored.

    There is one detail I left out with this compute shader approach though... Since you're modifying the source data on the GPU side, your warping will end up being progressive (each frame would warp the already-warped result). So you'd likely need to put a script on each rendered object that copies the mesh data in OnEnable, so your compute shader can read that copy as the "initial state" to warp from, outputting the results into the actual rendered mesh buffer. This would mean doubling the memory your geometry uses.

    So while this system would be shader agnostic and plug-and-play once made, it has a bigger footprint. If you can manage it, I'd definitely just go the CGINC route: add the #include to each shader and call the geometry-warp function from each shader's vertex program.

    edit: Actually... thinking about the initial-state geometry issue... We could reduce the footprint and complexity by storing the initial-state positions in the UV3/4 channels (or even 5/6), since most meshes won't need them. This would mean only dealing with the primary geometry pointer...
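To make the "warp from an initial state" idea concrete, here's a hedged compute-kernel sketch. For simplicity it reads the unwarped positions from a separate StructuredBuffer (the OnEnable copy) rather than from UV channels, assumes position is the first 12 bytes of each vertex, and treats positions as world-space; all names and those layout assumptions are illustrative:

```hlsl
// Warp.compute -- illustrative kernel; check your mesh's real vertex layout.
#pragma kernel Warp

RWByteAddressBuffer _Vertices;     // the mesh's live vertex buffer (raw access)
StructuredBuffer<float3> _Source;  // unwarped copy made in OnEnable
uint _VertexCount;
uint _VertexStride;                // bytes per vertex in _Vertices
float3 _CameraPos;
float _BendAmount;

[numthreads(64, 1, 1)]
void Warp(uint id : SV_DispatchThreadID)
{
    if (id >= _VertexCount) return;

    // Always warp from the original position, so the bend never compounds.
    float3 p = _Source[id];
    float2 d = p.xz - _CameraPos.xz;
    p.y -= dot(d, d) * _BendAmount; // quadratic "curved world" falloff

    // Overwrite the position at the start of this vertex's record.
    _Vertices.Store3(id * _VertexStride, asuint(p));
}
```

Reading from the untouched copy each dispatch is what fixes the progressive-warp problem described above.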
     
  6. mannyhams


    Joined:
    Feb 6, 2015
    Posts:
    34
    @Invertex Awesome thank you, this is filling in tons of gaps for me.

    Oh! I was unaware that the GPU hangs on to geometry data which is sent over from the CPU, I thought the default behavior was to transfer that data over to the GPU every frame. That's very good to know :)

    Ah yes, makes sense. Using the UV3+ channels could work, though I'm sure I'll regret it as soon as I need those channels for something else. I guess this is the fun of graphics programming... deciding what to spend your limited budgets on :)

    After posting yesterday I went ahead with the CGINC + #include route, which works but is less performant. I think this is because I was lazy with assets that use the Unity Standard shader: instead of actually extending the Standard shader (which looks convoluted), I just created a new surface shader, left all the default logic in there, and added my vertex manipulation to it. I'll probably time-box an attempt to extend the actual Standard shader tomorrow and see if that helps.
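For anyone following along, a sketch of what that surface-shader wiring can look like, assuming a shared include named CameraBend.cginc exposing a BendVertex(float4) function (both names are illustrative, not from this thread):

```hlsl
Shader "Custom/BentDiffuse"
{
    Properties
    {
        _MainTex ("Albedo", 2D) = "white" {}
        _BendAmount ("Bend Amount", Float) = 0.002
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        // vertex:vert hooks our function in; addshadow regenerates the
        // shadow caster pass so shadows follow the displaced geometry.
        #pragma surface surf Standard vertex:vert addshadow
        #include "CameraBend.cginc"

        sampler2D _MainTex;
        struct Input { float2 uv_MainTex; };

        void vert(inout appdata_full v)
        {
            v.vertex = BendVertex(v.vertex); // apply the camera-relative bend
        }

        void surf(Input IN, inout SurfaceOutputStandard o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```

Without addshadow, the displaced mesh would still cast shadows from its original, un-bent shape.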

    Thanks again!