
Question SOLVED: Mimic Shader Graph vertex displacement in a MonoBehaviour

Discussion in 'Universal Render Pipeline' started by gfas, Dec 7, 2022.

  1. gfas

    gfas

    Joined:
    Dec 30, 2020
    Posts:
    9
    UPDATE: I had mislabeled my shader variables, so the software and shader versions had 2 params switched.

    Hi, I have a vertex-displacement Shader Graph that translates the world coordinates according to some projection.


    Now I'd like to render an object "normally" on the surface of the displaced object, so I'm trying to mimic this shader's logic in a MonoBehaviour's Update() call to position my objects.

    The custom function matches the shader exactly, but I'm having trouble figuring out how to replace the shader graph's Position(world) and Transform(world-> object) nodes.

    I've tried various combinations of TransformPoint and InverseTransformPoint, both with my rendered object's transform and with the main camera's transform.

    Let's say my object's mesh is just a point and it renders somewhere on screen. What code do I need on a Unity GameObject rendering as a point (or a small cube/sphere) to end up at the same on-screen pixel, given that I have `Vector3 CustomFunction(Vector3 p)` implemented already?
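
    For reference, this is roughly the shape of what I have so far (a sketch; `sourcePoint` is a hypothetical reference transform I added for illustration, and CustomFunction stands in for my already-ported shader function):

    ```csharp
    using UnityEngine;

    public class DisplacedMarker : MonoBehaviour
    {
        // Undisplaced reference position; never moved at runtime.
        public Transform sourcePoint;

        // Assumed: the same math as the Shader Graph custom function,
        // already ported to C#.
        Vector3 CustomFunction(Vector3 p)
        {
            // ... projection math identical to the shader ...
            return p;
        }

        void Update()
        {
            // This is the part I'm unsure about: which (if any)
            // TransformPoint / InverseTransformPoint calls belong
            // around this line to match the shader's Position(world)
            // and Transform(world -> object) nodes?
            transform.position = CustomFunction(sourcePoint.position);
        }
    }
    ```
    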
     
    Last edited: Dec 10, 2022
  2. fleity

    fleity

    Joined:
    Oct 13, 2015
    Posts:
    235
    weird but okay

    Since your displacement function works in world space, you can just pass a GameObject's transform.position into your custom function (the C# version); that gives you the displaced position in world space, which you then set on a transform again.
    That should be enough, tbh, because on the GameObject side you can skip the whole object→world and world→object round trip (plus the implicit object→world you don't see later in the vertex function) that has to be done for vertices.

    The issue might be that you cannot set this up with only one GameObject. You need a source position/GameObject that does not move relative to the original mesh, and an output GameObject that receives the displaced position, to prevent the displacement from accumulating every time it is calculated.
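
    Something like this (an untested sketch; `source` and `output` are the two GameObjects I mean, and CustomFunction is assumed to be your ported shader function):

    ```csharp
    using UnityEngine;

    public class DisplacementFollower : MonoBehaviour
    {
        // Undisplaced reference; fixed relative to the original mesh.
        public Transform source;
        // The object you actually render; receives the displaced position.
        public Transform output;

        // Assumed: your C# port of the shader's custom function.
        Vector3 CustomFunction(Vector3 worldPos)
        {
            // ... same projection math as the shader ...
            return worldPos;
        }

        void Update()
        {
            // Always displace from the fixed source, never from the
            // output itself, so the displacement never accumulates.
            output.position = CustomFunction(source.position);
        }
    }
    ```
    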
     
  3. gfas

    gfas

    Joined:
    Dec 30, 2020
    Posts:
    9
    :)

    Thank you for your answer, you're right about the separate "source" object and that's what I'm doing.

    However, there must be some additional transform somewhere, as the shader version and the software version don't quite match up.
    The software version looks "scaled up", and I can get it to map a little closer to the shader version if I simply divide the projected position by around 3f.

    My scene setup is also very simple, the displaced object originates at world (0, 0, 0) and the camera is set up to look directly down the Z axis. So I might be missing some transform that just looks like scaling in this simplified scenario?

    Another, somewhat related question: the displaced mesh is actually copied and tiled, and it looks fine, but in the screenshot above, if I bypass the world→object Transform node and output the world coordinates directly, it introduces small gaps between the mesh copies. Bypassing the last Transform node in the shader appears to scale the vertices away from the origin.

    None of the objects in my hierarchy have any scaling, and the tiling copies are offset exactly to match the mesh vertices.

    I'm almost certain my software displacement is missing some transform that the shader version applies implicitly.
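
    For what it's worth, my current understanding is that Shader Graph's Transform(World → Object) node corresponds roughly to this on the C# side (a hedged sketch, not verified against the generated shader code):

    ```csharp
    using UnityEngine;

    public class TransformComparison : MonoBehaviour
    {
        void Example(Vector3 worldPos, Transform meshTransform)
        {
            // Transform(World -> Object) in Shader Graph uses the full
            // inverse model matrix, i.e. it undoes translation, rotation
            // AND scale -- InverseTransformPoint does the same:
            Vector3 objectPos = meshTransform.InverseTransformPoint(worldPos);

            // By contrast, a manual subtraction only undoes translation,
            // ignoring rotation and scale. If the mesh sits at the origin
            // with no rotation, mixing these two up shows up purely as an
            // apparent uniform scale -- which matches what I'm seeing.
            Vector3 offsetOnly = worldPos - meshTransform.position;
        }
    }
    ```
    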