Procedural Stochastic Texturing prototype

Discussion in 'Graphics Experimental Previews' started by thomasdeliot, Feb 12, 2019.

  1. thomasdeliot

    thomasdeliot

    Unity Technologies

    Joined:
    Feb 12, 2019
    Posts:
    14
    This is the feedback thread for our research prototype implementing Procedural Stochastic Texturing. Use this space to ask questions, offer feedback, and show your results if you try it out!

    https://blogs.unity3d.com/2019/02/14/procedural-stochastic-texturing-in-unity/

    Our goal is to implement this technique as a new sampler node in ShaderGraph at some point, offering support for custom shaders and the new rendering pipelines.
     
    Last edited: Feb 14, 2019
  2. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    789
    What is Procedural Stochastic Texturing?
    Did a Google search and the only thing I saw was something that looks like a better version of triplanar.
     
    id0 likes this.
  3. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,357
    twobob and Grimreaper358 like this.
  4. id0

    id0

    Joined:
    Nov 23, 2012
    Posts:
    454
    Does this method solve the problems with parallax (height) maps when rotating an object by 45°? Triplanar blends textures from different angles, and then some artifacts happen.
     
    Last edited: Feb 14, 2019
  5. JakubSmaga

    JakubSmaga

    Joined:
    Aug 5, 2015
    Posts:
    417
  6. thomasdeliot

    thomasdeliot

    Unity Technologies

    Joined:
    Feb 12, 2019
    Posts:
    14
    Yes, the thread went up a bit before the blog post, thanks for linking it. I'll edit the top post.
     
    JakubSmaga likes this.
  7. jaelove

    jaelove

    Joined:
    Jul 5, 2012
    Posts:
    302
    Does it work in Unity 5.6?
     
  8. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,753
    Woah??? Any performance comparison with and without?
     
  9. thomasdeliot

    thomasdeliot

    Unity Technologies

    Joined:
    Feb 12, 2019
    Posts:
    14
    Currently it works with the latest Unity 2018 releases in the built-in render pipeline. Since the plugin plugs into parts of the Standard Shader code, and some of that has evolved since the initial release in Unity 5, it's not working with older versions as-is.

    I will be backporting it in the coming days as separate packages for older popular versions of Unity, mainly Unity 5.6!
     
    Last edited: Feb 14, 2019
    twobob, ceebeee and jaelove like this.
  10. thomasdeliot

    thomasdeliot

    Unity Technologies

    Joined:
    Feb 12, 2019
    Posts:
    14
    We have performance comparisons on page 20 of the Technical chapter:
    https://drive.google.com/file/d/1QecekuuyWgw68HU9tg6ENfrCTCVIjm6l/view

    As a quick overview, it uses three input texture fetches in the fragment shader and some additional math, so the cost is basically similar to that of Triplanar Mapping.
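
    To give a rough idea of what those fetches and that math look like, here is a simplified HLSL sketch (not the exact prototype code; _GaussianTex, _InvLUT and the helper that produces the triangle offsets/weights are placeholder names, and LOD/compression handling is omitted):

    Code (CSharp):
    // Simplified sketch of the per-pixel work, not the exact prototype code.
    // Assumes the triangle-grid step has already produced three random UV offsets
    // and their blend weights. _GaussianTex is the pre-processed (Gaussianized)
    // input texture and _InvLUT its inverse-histogram lookup table.
    float3 StochasticSample(sampler2D _GaussianTex, sampler2D _InvLUT, float2 uv,
                            float2 offset1, float2 offset2, float2 offset3, float3 w)
    {
        // Use gradients of the original UV so mip selection stays continuous
        // across the virtual-triangle borders.
        float2 duvdx = ddx(uv);
        float2 duvdy = ddy(uv);

        // The three input texture fetches mentioned above.
        float3 g1 = tex2Dgrad(_GaussianTex, uv + offset1, duvdx, duvdy).rgb;
        float3 g2 = tex2Dgrad(_GaussianTex, uv + offset2, duvdx, duvdy).rgb;
        float3 g3 = tex2Dgrad(_GaussianTex, uv + offset3, duvdx, duvdy).rgb;

        // Variance-preserving blend of the Gaussianized values.
        float3 G = w.x * g1 + w.y * g2 + w.z * g3;
        G = (G - 0.5) * rsqrt(dot(w, w)) + 0.5;

        // Map back to the original color histogram through the inverse LUT.
        float3 color;
        color.r = tex2D(_InvLUT, float2(G.r, 0.5)).r;
        color.g = tex2D(_InvLUT, float2(G.g, 0.5)).g;
        color.b = tex2D(_InvLUT, float2(G.b, 0.5)).b;
        return color;
    }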
     
    twobob likes this.
  11. jaelove

    jaelove

    Joined:
    Jul 5, 2012
    Posts:
    302
    Thank you for still continuing to support 5.6
     
  12. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,753
    Very nice, this could be useful for terrains and natural materials.
     
    thomasdeliot likes this.
  13. trilobyteme

    trilobyteme

    Joined:
    Nov 18, 2016
    Posts:
    309
    When I download from GitHub and install into my project on Unity 2018.3.5f1 on Mac (running Mojave), I get two hard-stop console errors that will not clear...

    Assets/Shaders/Procedural Stochastic Texturing/Editor/StandardStochasticShaderGUI.cs(585,11): error CS1644: Feature `byref locals and returns' cannot be used because it is not part of the C# 4.0 language specification

    Assets/Shaders/Procedural Stochastic Texturing/Editor/StandardStochasticShaderGUI.cs(587,12): error CS1644: Feature `byref locals and returns' cannot be used because it is not part of the C# 4.0 language specification

    When I create a new material and choose StandardStochastic from the drop-down (it's in the root of the shader hierarchy by default), I don't get something that looks like what's pictured in the blog, but something like this...
    stochasticfail.png

    Any ideas on how I can fix that?
     
    Kanda likes this.
  14. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    While it's obviously a good fit for terrain, it's also not going to be feasible if using a tiled caching system... Any way to make them tango, perhaps by applying it to the combined baked texture?
     
  15. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,407
    I don't actually think it's a good fit for terrain; I tried this technique a while back in MicroSplat and abandoned it - it just doesn't look good on most textures. In fact, the use cases where this does look good are few, IMO.

    If, for instance, you have a highly stochastic texture (like concrete) which will be viewed up close and far away, then it will work well. But if the texture is highly stochastic to begin with, it's not that hard to get it to tile well, and not that hard to break up the tiling with other techniques. I have yet to see a real-world example where this is a real win, but I am happy to be proven wrong (i.e. in today's market of multi-gig gfx cards, 2k textures being the norm, and multiple samples and heavy ALU usage being plausible even on low-end platforms like mobile).

    So what to do instead? I'd offer these options as all being cheaper and better looking:

    - Texture Clustering with 1/2 size textures
    + Less total texture memory. Uses 3 texture samples and one LUT sample, but doesn't use nearly the ALU this technique does
    + Better randomization possible
    + Textures do not require a stochastic nature (though cannot be fully regular)
    - Requires designing multiple textures


    - Classic macro/detail texturing. Essentially using one texture to vary the other. So you tile your stochastic texture as normal, but use a second map to vary its luminosity or normal (see the sketch after this list).
    + Dirt cheap: one extra texture sample and a multiply on your UVs.
    + Can often use lower-res textures
    + Works with non-stochastic textures
    - May still see tiling in the stochastic texture to some degree
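
    A minimal sketch of that macro-variation option (my own illustration; _MainTex, _VariationTex and the parameters are placeholder names):

    Code (CSharp):
    // Sketch of classic macro variation: one extra, low-frequency sample modulates
    // the tiled albedo's luminosity so repeats read differently from a distance.
    float3 MacroVariedAlbedo(sampler2D _MainTex, sampler2D _VariationTex,
                             float2 uv, float macroScale, float strength)
    {
        float3 albedo = tex2D(_MainTex, uv).rgb;
        // The "multiply on your UVs": sample the variation map at a much lower tiling rate.
        float macro = tex2D(_VariationTex, uv * macroScale).r;
        // Centering around 1.0 keeps the average brightness roughly unchanged.
        return albedo * lerp(1.0, macro * 2.0, strength);
    }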

    This is just my opinion, of course- I'm happy to see the community come up with some real-world use cases where this technique might be worth the cost.
     
    Last edited: Feb 14, 2019
    Flurgle, Ericsheng, hopeful and 3 others like this.
  16. id0

    id0

    Joined:
    Nov 23, 2012
    Posts:
    454
  17. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    I had a similar discussion half an hour ago; I felt that this isn't really a problem for most surfaces - you just use a bigger texture with more variation from the outset. But terrain remains the only place where I find tiling to actually become a problem, and Unity's doing cached terrain texturing soon, and I hope that's not going to be a problem.

    Really like your ideas Jason. Will probably pursue something along those lines... (HDRP, else I'd be using MicroSplat already).
     
  18. trilobyteme

    trilobyteme

    Joined:
    Nov 18, 2016
    Posts:
    309
    That sorted it, thanks @id0
     
  19. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,407
    I mean, generally cached terrain just renders out the inputs to the lighting equation and caches it in virtual texture, such that you don't need to render the full shader every frame. Unless something is animated, it should work with any shader technique that supports outputting those channels. I'll eventually port MicroSplat to HDRP, but I'm waiting for it to be more stable and demand to pick up; the LWRP module for MicroSplat sells a few copies, but nothing near the work required to port it or maintain it while Unity is changing it all the time.
     
    one_one, Kirsche, hippocoder and 2 others like this.
  20. trilobyteme

    trilobyteme

    Joined:
    Nov 18, 2016
    Posts:
    309
    Now that it's working, I'm trying to do some tests. I have noticed that if I change Stochastic Inputs to Albedo (from nothing) and click Apply, the 'Pre-processing textures for stochastic sampling' progress bar appears and just hangs in limbo. It effectively kills Unity, I can't save or open another scene or project, and can't even quit Unity. I have to manually force quit the Editor and start over.
     
  21. trilobyteme

    trilobyteme

    Joined:
    Nov 18, 2016
    Posts:
    309
    +1 to what Jason's said.

    HDRP is still new and in flux, and for many developers (as well as content creators) it's premature to start investing the time and energy into supporting it. Once it's stable and comes out of beta, developer interest and demand will follow.
     
  22. cubrman

    cubrman

    Joined:
    Jun 18, 2016
    Posts:
    367
    Just came to say that this thing looks totally badass. Great work!
     
    thomasdeliot likes this.
  23. thomasdeliot

    thomasdeliot

    Unity Technologies

    Joined:
    Feb 12, 2019
    Posts:
    14
    Thanks for noticing the .NET 4 requirement, that's a mistake, will fix it.

    Your bug is interesting, what size of textures are you using? The pre-process can take minutes for large textures even with a good CPU (a warning should mention that), which might look like a stall.
     
    Last edited: Feb 15, 2019
  24. misher

    misher

    Joined:
    Apr 22, 2016
    Posts:
    19
    I think it is a very useful technique. I will try to use it for applications where I need to load models dynamically, as it can help to reduce the size of asset bundles or glTF files significantly.
    Here is a good example of using small textures for big surfaces while still getting an HD look:
     
    JoeStrout, thomasdeliot and id0 like this.
  25. trilobyteme

    trilobyteme

    Joined:
    Nov 18, 2016
    Posts:
    309
    I've tried with 512 textures, 1024 textures, and 2048 textures (the 1024 and 2048 were scaled down with overrides after seeing the alert message).

    It initially worked a couple of times, but hasn't worked most of the time after SEVERAL hours of testing. I gave it time (in a couple of cases I let it go for an hour before coming back). Activity Monitor does not indicate that Unity's busy doing much, giving me the impression that it's stalling or failing. I'm on a 12-core Xeon 2.7GHz with 64GB RAM.

    When it worked, it was promising, but if it's this inconsistent and unstable in the editor, I can't begin to imagine it being something to use in production. I fully anticipated that a preview would be quirky, but this mostly failed. I spent more time tinkering with this than I should have (waiting on another project to be ready for me to dive in for more work); lesson learned the hard way, I guess.
     
  26. thomasdeliot

    thomasdeliot

    Unity Technologies

    Joined:
    Feb 12, 2019
    Posts:
    14
    Since this is a prototype in the meantime before a more thorough implementation, it has not been tested on many configurations other than mine. If I understood correctly, it hangs while pre-processing without any error message in the console? Are you trying it out in a clean project? If you could send me an archive of your project, that would be helpful for my own work on this!
     
    Last edited: Feb 15, 2019
  27. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,407
    I've been playing with this a bit this morning. Perhaps something was off when I implemented this, or some improvements have been made since the original paper, but I'm seeing some decent results on some of the terrain textures that didn't work very well before.

    It also seems like the hashing function could be moved into a LUT instead of the current implementation, which would get the cost much closer to Texture Clusters since you could look up all 3 random values in a single sample; this would limit the randomization to some domain size, but that's likely OK for a lot of use cases.
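
    For illustration, one way a precomputed hash could look (a sketch only, assuming a small repeat-wrapped, point-filtered random texture _HashLUT; packing all three vertices into a single fetch would additionally need a per-cell layout or a gather, which is not shown):

    Code (CSharp):
    // Sketch: replace the arithmetic hash with a point-sampled random-offset LUT.
    // _HashLUT is assumed to be a small RG texture of uniform random values,
    // point filtered and repeat wrapped, so each triangle-grid vertex gets a
    // stable random UV offset. Wrapping limits randomization to the LUT's domain.
    #define HASH_LUT_SIZE 64.0

    float2 HashFromLUT(sampler2D _HashLUT, int2 vertexId)
    {
        float2 lutUV = (float2(vertexId) + 0.5) / HASH_LUT_SIZE;  // sample texel centers
        return tex2Dlod(_HashLUT, float4(lutUV, 0, 0)).rg;
    }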
     
  28. thomasdeliot

    thomasdeliot

    Unity Technologies

    Joined:
    Feb 12, 2019
    Posts:
    14
    Thanks for the feedback, I had not considered pre-computing the hashing function, that's something I'll have to try!
     
  29. trilobyteme

    trilobyteme

    Joined:
    Nov 18, 2016
    Posts:
    309
    Correct, it hung without any errors (or ability to cancel). Most of the time the console was clean; other times it threw errors related to shader keywords. It was a nearly clean project, one that had been set up to do some testing on another asset (and had my materials/textures folders). That testing had been finished and the project was due to be nuked, so I used it as a safe place to try and do testing. After wasting way too many hours yesterday, it's been deleted and I've moved on.
     
  30. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,407
    @thomasdeliot

    Hi Thomas, I might have botched something when converting your implementation over to my system, but I'm seeing a flattening of the normals which really muddies the image.

    Tiling:




    With stochastic Sampling:

    Now, the main difference is that I pack my normals with smoothness/ao as well, in the following format:

    R(Smoothness)
    G(Normal X)
    B(AO)
    A(Normal Y)

    I noticed that it looks like your code packs the normals as an RGB value instead of the encoding Unity normally uses, which I think means you never read from an alpha channel in your example?

    Code (CSharp):
    float4 color;
    color.r = UNITY_SAMPLE_TEX2DARRAY_LOD(_NormalSAOInv, float3(G.r, LOD, uv.z), 0).r;
    color.g = UNITY_SAMPLE_TEX2DARRAY_LOD(_NormalSAOInv, float3(G.g, LOD, uv.z), 0).g;
    color.b = UNITY_SAMPLE_TEX2DARRAY_LOD(_NormalSAOInv, float3(G.b, LOD, uv.z), 0).b;
    color.a = G.a; // UNITY_SAMPLE_TEX2DARRAY_LOD(_NormalSAOInv, float3(G.r, LOD, uv.z), 0).a;
    I noticed that the gradient was missing from the alpha channel and it's all white. So I'm pretty sure this is where the issue is, as I had to comment out the LUT lookup to get the normals to look anything like the originals.
     
  31. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,407
    So, I worked on this more this weekend. Anti-tiling techniques are something I really enjoy working on, so I couldn't really stop.

    MegaSplat uses a technique very similar to this one for its splat mapping. This came from its original implementation, where splat map data is stored in the vertices and barycentric weights are used to un-interpolate the texture indices it uses on each face. When adapting this to Unity terrains, where the vertex data is changing dynamically due to LODs, I couldn't exactly use the triangle structure for this. Instead, I treat each pixel in the control map as a virtual quad, determine which side of the quad I'm on, and create a virtual triangle for the barycentric blending.

    This virtual triangle concept is used in this paper as well, and made me remember this. The main benefit, in my mind, of this technique is the idea of blending 3 texture samples from three virtual triangles together, with those triangles being constructed from a simplex noise function and thus providing varying rotation and position to break up the pattern. This is actually the easy part; the hard part of the paper is resolving that blend in a way which is consistent with the original texture. The way this is achieved involves a lot of math, preprocessing the textures into a different format, as well as generating lookup tables that must be sampled. It has further complications due to texture compression formats. IMO, this is a lot of downsides to the technique.
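
    For reference, the virtual-triangle construction being described looks roughly like this (a sketch adapted from the publicly available sample code for the technique; the exact constants, orientation and hash in the Unity prototype may differ):

    Code (CSharp):
    // Sketch: skew UV space into a simplex triangle grid, find the containing
    // triangle's three vertex IDs and the barycentric weights inside it.
    void TriangleGrid(float2 uv,
                      out float w1, out float w2, out float w3,
                      out int2 vertex1, out int2 vertex2, out int2 vertex3)
    {
        uv *= 3.464;  // 2 * sqrt(3): controls the virtual triangle size
        const float2x2 gridToSkewedGrid = float2x2(1.0, 0.0, -0.57735027, 1.15470054);
        float2 skewedCoord = mul(gridToSkewedGrid, uv);

        int2 baseId = int2(floor(skewedCoord));
        float3 temp = float3(frac(skewedCoord), 0);
        temp.z = 1.0 - temp.x - temp.y;
        if (temp.z > 0.0)
        {
            w1 = temp.z; w2 = temp.y; w3 = temp.x;
            vertex1 = baseId; vertex2 = baseId + int2(0, 1); vertex3 = baseId + int2(1, 0);
        }
        else
        {
            w1 = -temp.z; w2 = 1.0 - temp.y; w3 = 1.0 - temp.x;
            vertex1 = baseId + int2(1, 1); vertex2 = baseId + int2(1, 0); vertex3 = baseId + int2(0, 1);
        }
    }

    // Each vertex ID is then hashed into a random UV offset for its fetch.
    float2 HashVertex(int2 v)
    {
        float2 p = float2(v);
        return frac(sin(mul(float2x2(127.1, 311.7, 269.5, 183.3), p)) * 43758.5453);
    }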

    So my thought was this: the simplex triangles provide the randomization we want, but the blending of that data produces a lot of unwanted complexity. So is there a simpler way to solve the blending issue?

    My answer was to switch to a height-map-based blend. Compared to the one used in the paper:

    The upside (vs Stochastic Sampling):
    - No (slow) preprocessing of textures required
    - No adjustments for texture compression format
    - No look up table textures needed
    - Less computations in the shader
    The downside:
    - Requires a height map for the blend (can generate acceptable data based off luminosity or normals)
    - Has a bias towards the higher portions of the surface

    The other technique to compare this to is the texture clustering system in MicroSplat, which uses a noise function to blend between 2 or 3 sets of textures using a height-map-based blend.

    The upside (vs MicroSplat Texture Clusters)
    - No need to author texture variations
    - Less memory needed to store extra textures
    The downside:
    - Less variation than using multiple textures
    - Cannot adjust performance characteristics (MicroSplat can do 2 cluster mode, which is 2 samples instead of 3)

    Now some screen shots:

    Tiled:


    Texture Cluster (3 Layer)


    Stochastic Height Blend:



    Tiled:


    Cluster (3 layer)


    Stochastic Height Blend:


    Here's a close up of the grass and rock areas intersecting:

    Tiled:


    Cluster3:


    Stochastic Height Blend:




    Overall, I think it's a nice technique, as it requires little shader code and no preprocessing of the texture data, or messing with compression formats. I'll be shipping this code in the MicroSplat Texture Cluster module soon.

    I also want to spend a bit more time with the original technique. One downside to using a height blend operator is that peaks tend to dominate the resulting image. As you can see in this last image set, the stones become a lot busier than in the other two images. I suspect that the original technique presented here will have less of that issue, though with the added complexity I'm not sure it's worth it.
     
    ekakiya, camta005, hippocoder and 2 others like this.
  32. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,407
    Here is an apples-to-apples comparison of tiled, procedural stochastic sampling using Unity's implementation, and procedural stochastic sampling using a height blend operator (my implementation):

    Tiled:


    Stochastic (unity):


    Stochastic Height Blend (mine):



    As expected, I think Unity's implementation leaves more of the valleys intact, which preserves the original texture a bit better, but the two are more similar than I would have expected. I might try biasing the height operator in some manner.
     
  33. sigvald

    sigvald

    Joined:
    Jan 27, 2014
    Posts:
    3
    This tool is great and we could use it for a big project.
    There is one thing, though, that would make it even better: handling multiple selection.
    In our case, we are procedurally generating textures in another tool and need to apply this shader to materials using our textures.
    To do that, we created a menu item that checks for textures in a folder, creates materials in another one, applies this shader to the materials, and sets each material's "_MainTex" texture.
    The problem is that to trigger the "Apply" function, we would have to tweak the StandardStochasticShaderGUI class ourselves, since selecting more than one material at a time and clicking "Apply" results in the same texture being applied to all of them :/

    Anything you could do about it?
     
  34. thomasdeliot

    thomasdeliot

    Unity Technologies

    Joined:
    Feb 12, 2019
    Posts:
    14
    You are right in detecting the problem here: in my code I hard-coded the setup of the inputs to match Unity's Standard Shader, as a necessity for having something that behaves like it. Because we want to avoid some artifacts with strongly correlated 3D data (albedo and normal), we do a specific optimization for those two inputs (decorrelate the 3D input space). So the XYZ normal format is hard-coded in this prototype.

    It is possible to discard this optimization and process your texture setup as 4 independent channels, with a 4-channel LUT as well. The "#region SPECULAR MAP" part of the code in the GUI script is an example of this.

    Luckily, a ShaderGraph implementation will be much more flexible on this for people who don't write custom shaders in code. Basically, this ShaderGraph sampling node would offer two modes: correlated 3D albedo/normal inputs, or other types of input where the 4 channels get processed independently.
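
    For illustration, the 4-independent-channels mode boils down to inverting each channel of the blended Gaussian value through its own channel of a 4-channel LUT (a sketch only; the prototype's actual LUT is a texture array with a LOD dimension, omitted here, and _InvLUT4 is a placeholder name):

    Code (CSharp):
    // Sketch: treat a packed input (e.g. R=smoothness, G=normal X, B=AO, A=normal Y)
    // as four independent channels, each with its own inverse-histogram curve
    // stored in one channel of _InvLUT4.
    float4 InvertChannelsIndependently(sampler2D _InvLUT4, float4 G)
    {
        float4 color;
        color.r = tex2D(_InvLUT4, float2(G.r, 0.5)).r;
        color.g = tex2D(_InvLUT4, float2(G.g, 0.5)).g;
        color.b = tex2D(_InvLUT4, float2(G.b, 0.5)).b;
        color.a = tex2D(_InvLUT4, float2(G.a, 0.5)).a;
        return color;
    }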
     
    Last edited: Feb 18, 2019
  35. thomasdeliot

    thomasdeliot

    Unity Technologies

    Joined:
    Feb 12, 2019
    Posts:
    14
    Thanks for showing your experimentations. Height-based blending is actually a great idea for materials with strong height variations that never occurred to me! Great for lots of terrain uses.

    I'm curious how you do the blending based on the three sampled heights. I don't see signs of the usual linear blending artifacts in your rock example, but you probably have some kind of soft blending between samples when the heights are close together (or do you just choose the highest and discard the other two?)
     
  36. Bodyclock

    Bodyclock

    Joined:
    May 8, 2018
    Posts:
    171
    @jbooth Is this something that you will also implement in Megasplat?
     
  37. thomasdeliot

    thomasdeliot

    Unity Technologies

    Joined:
    Feb 12, 2019
    Posts:
    14
    I have reproduced your use case to see why this happens and unfortunately I don't see a way of having the Apply button work correctly with multiple materials selected, due to how the editor scripts work in the engine. I'll send you a message if I can think of a solution!
     
  38. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,407
    It's a height-based blend, so basically it's only linear in a very small region where the height maps are similar.

    Code (CSharp):
    half3 BaryWeightBlend(half3 iWeights, half tex0, half tex1, half tex2, half contrast)
    {
        // compute weight with height map
        const half epsilon = 1.0f / 1024.0f;
        half3 weights = half3(iWeights.x * (tex0 + epsilon),
                              iWeights.y * (tex1 + epsilon),
                              iWeights.z * (tex2 + epsilon));

        // Contrast weights
        half maxWeight = max(weights.x, max(weights.y, weights.z));
        half transition = contrast * maxWeight;
        half threshold = maxWeight - transition;
        half scale = 1.0f / transition;
        weights = saturate((weights - threshold) * scale);
        // Normalize weights.
        half weightScale = 1.0f / (weights.x + weights.y + weights.z);
        weights *= weightScale;
        return weights;
    }
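    And a quick usage sketch (variable names are placeholders) - the returned weights just resolve the three stochastic taps:

    Code (CSharp):
    // Resolve three stochastic samples with the height-based weights.
    half3 w = BaryWeightBlend(baryWeights, height0, height1, height2, _BlendContrast);
    half4 albedo = albedo0 * w.x + albedo1 * w.y + albedo2 * w.z;
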
    So technically the linear blend is still there, just only in very small regions based on the contrast value. MegaSplat basically uses this same idea to blend textures across a face; each vertex stores a texture index, and the barycentric weights are used to blend across that face with a height-based resolve. Thus each pixel only needs to sample 3 texture sets, even when hundreds are used on the terrain - and since hundreds of textures can be used, this gave rise to the idea of texture clustering (using similar textures blended together at each vertex to represent a single surface). In MegaSplat, this is all handled on the tool side when painting and selecting textures - clusters are a first-class concept which can use noise, slope, or height to select which texture gets chosen for a vertex when painting the mesh/terrain.

    When I adapted this to Unity terrains I didn't have vertices (CLOD makes them unreliable), so I use a texture and treat each pixel in the control texture as a small quad, and reconstruct the barycentric weights in basically the same way that you do in this technique. Once I realized the similarity, I realized the height blend should work fine, which basically removes about 95% of the complexity from the technique.

    I was thinking that in a more traditional shader, which can't necessarily rely on height data, luminosity or some other formulation (height from normal) could stand in for height and work pretty well. In my opinion, the preprocessing cost, extra lookup textures, and code required for specific texture compression formats put a very high complexity cost on this technique, and all of that complexity is only for the blending operator. The basic idea of using a distorted triangle grid to grab texture chunks from is solid, though.
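
    For example, a stand-in height could be as simple as albedo luminance (sketch only):

    Code (CSharp):
    // Sketch: derive a pseudo-height from albedo luminance when no height map exists.
    half PseudoHeight(half3 albedo)
    {
        return dot(albedo, half3(0.299, 0.587, 0.114));
    }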

    Oh, one other suggestion would be to allow some control over the virtual triangle size. I have found with texture clustering that letting more of the full texture show, so there is a minimal amount of tiling (say 125%), is actually very beneficial in retaining the intended look of the source texture.
     
    hippocoder likes this.
  39. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,407
    And btw- the asset store shipped the new version of this module this morning, if anyone wants to play with it.
     
    transat and camta005 like this.
  40. Pr0x1d

    Pr0x1d

    Joined:
    Mar 29, 2014
    Posts:
    41
    Is there a date for the Shader Graph implementation release? It would be really cool to have it on the procedural planet generator I am working on, which uses Shader Graph. When I get up close, the triplanar does not look as good as it could. Thanks.

    Just a quick image of how the planet looks right now.
    upload_2019-2-18_18-10-3.png
     
    PutridEx likes this.
  41. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,753
    @thomasdeliot is it possible to use this function at runtime? I would like to integrate this into Materialize.
     
  42. YuriyPopov

    YuriyPopov

    Joined:
    Sep 5, 2017
    Posts:
    227
    So just a quick question: how hard would it be to redo this technique in HDRP?
     
  43. Cascho01

    Cascho01

    Joined:
    Mar 19, 2010
    Posts:
    1,347
    I rate this tool as one of the best Unity developments of the last few months.
    No more visible tiling on large surfaces like grass planes. Finally!
    The best thing is that @thomasdeliot mentioned in the blog that there's a node planned for ShaderGraph!
     
    newguy123 and hopeful like this.
  44. ReadyPlayGames

    ReadyPlayGames

    Joined:
    Jan 24, 2015
    Posts:
    49
    This is completely amazing. I did a test with a 10x tiled material and the results are wonderful.

    My only "complaint" is having to hit the Apply button each time I change things that seemingly aren't "calculated" (like adding a secondary albedo map), but I imagine avoiding that would be crazy to pull off, or impossible.
     

    Attached Files:

  45. jaelove

    jaelove

    Joined:
    Jul 5, 2012
    Posts:
    302
    5.6 version still coming?
     
    twobob likes this.
  46. HiWill

    HiWill

    Joined:
    Jun 2, 2013
    Posts:
    18
    upload_2019-2-28_16-25-34.png
    I got this line with stochastic mode, how do I fix it?
     
  47. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,407
    Switch the sampler to gradient; this will be automatic in the next update.
     
  48. thomasdeliot

    thomasdeliot

    Unity Technologies

    Joined:
    Feb 12, 2019
    Posts:
    14
    I've just uploaded an alternate release for Unity 5.6.6 on GitHub and pushed the code to a new branch. I have not tested exactly which versions of Unity it will be compatible with, but everything around 5.6 will probably be fine!
     
    twobob and JoeStrout like this.
  49. twobob

    twobob

    Joined:
    Jun 28, 2014
    Posts:
    2,058
    Hug you
     
  50. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,407
    So last night I wrote this for Unity's new Shader Graph:



    This afternoon I found out they've removed the ability to write custom nodes in 2019.1...
     
    fherbst and Flurgle like this.