[Best Tool Asset Store Award] Amplify Shader Editor - Node-based Shader Creation Tool

Discussion in 'Assets and Asset Store' started by Amplify_Ricardo, Sep 13, 2016.

  1. Little_Gorilla

    Little_Gorilla

    Joined:
    Oct 11, 2013
    Posts:
    69
    That did the trick thank you!
     
  2. ColtonK_VitruviusVR

    ColtonK_VitruviusVR

    Joined:
    Nov 27, 2015
    Posts:
    148
    We're building the game using SDK 6 but the TestKit is running 6.5. (6.5 is not compatible with version 2017.4 of Unity)

    Cheers,
    Colton
     
    Last edited: Jul 4, 2019
  3. Poupi

    Poupi

    Joined:
    Jan 11, 2012
    Posts:
    104
    I tested with Unity 2019.1.9f1 on a clean project (only with the toon shader compiled with ASE 1.6.8.005) and I still have the issue. I forgot to mention that it only happens when the color mode is on Linear.

    I zipped a very simple repro project so you can test and see if you can reproduce the issue.
    Thanks,
    Robin
     

    Attached Files:

  4. Somawheels

    Somawheels

    Joined:
    Apr 5, 2019
    Posts:
    10
    Hello,
    I am having some problems getting pixel world position in my post-processing shader.
    I am trying to make a post-process shader which projects a texture using worldspace as UVs. My shader looks like this:
    upload_2019-7-7_18-58-23.png

    The shader produces this result:
    upload_2019-7-7_18-52-28.png
    upload_2019-7-7_18-56-39.png

    At first glance this may seem fine; however, upon closer inspection you will notice grey pixelization around the edges of the alpha-masked foliage. This creates a very noticeable and ugly buzzing effect when the camera moves.

    The issue is also present on the edges of geometry, however it is much less apparent.

    What could be causing this issue?

    Edit:
    Also, when I look at the foliage from afar, it becomes pure grey pixels.
    upload_2019-7-7_19-47-13.png
     
    Last edited: Jul 7, 2019
  5. bigbenmatt

    bigbenmatt

    Joined:
    May 30, 2019
    Posts:
    10
    DepthFadeNotWorking.jpg Hi, I've been trying to get depth fade working for a basic shoreline and can't for the life of me figure out why it's not working. I went to Unity LWRP, then came back to 2018.3.8f1, and now it's not working. It doesn't look like it's reading the depth texture correctly, as you can see from my very rudimentary screenshots. Any help would be greatly appreciated :)
     
  6. Cleverlie

    Cleverlie

    Joined:
    Dec 23, 2013
    Posts:
    205
    Hi guys, I want to bring to life an idea I had, but I'm not sure if I will be able to do it with Amplify so any help will be appreciated.

    So I have to do a 360 VR experience, and for that the artists will provide me with full 360 renders of different environments, interiors and exteriors. This is something I did in the past: I create two spheres and an unlit shader that renders the texture on the spheres, replacing the mesh's UV projection with a custom per-pixel sphere projection of the texture. I then use this shader to render the left 360 image on the L-Sphere, which is on a layer that only the left-eye camera sees, and vice versa for the right-eye camera.

    This was my approach in a previous project and it worked fine, although I would love to know if this is possible with only one sphere, making the shader render different textures for each eye (in stereo rendering) in one single material. That is my first question.

    My second question is that I want to take this approach one step further: I'll ask the artists to not only render the scene as a 360 image, but also give me a 360 depth map of the scene. I want to use this depth map to override (or write to) the depth buffer in Unity, so I can have actual realtime 3D objects in the scene occluded by the 360 image. I also want to learn how to do this for another project where I want some sprites that write depth information to the depth buffer, to achieve the same effect as in The Sims 1, where the furniture is all 2D sprites but everything has "depth" and can occlude each other, simulating 3D per-pixel z-testing.

    Here is what I mean by The Sims 1 technique:



    and I think you do something similar for the Amplify Impostors since they have a full depth representation of the original mesh.

    Does anyone have any ideas on how to achieve this? Thanks in advance!
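    Not ASE-specific, but for reference, the per-pixel sphere projection described above boils down to mapping the view direction to lat-long (equirectangular) UVs. A minimal sketch of that math in plain C# (names are illustrative; in the actual shader this runs per fragment on the world-space view direction):

```csharp
using System;

public static class EquirectSketch
{
    // Map a normalized view direction to equirectangular (lat-long) UVs,
    // i.e. the per-pixel sphere projection described above.
    public static (double u, double v) DirectionToUV(double x, double y, double z)
    {
        double u = 0.5 + Math.Atan2(x, z) / (2.0 * Math.PI); // longitude -> [0, 1]
        double v = 0.5 + Math.Asin(y) / Math.PI;             // latitude  -> [0, 1]
        return (u, v);
    }
}
```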
     
  7. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    1,387
    Hello,

    We ran a few tests and it seems we can replicate it to some extent. Not entirely sure what could be causing it but we will investigate.

    Thanks for the heads up!


    You'll have to do your own ddx/ddy calculations in order to avoid those artifacts, set your Texture Sample to "Derivatives" to reveal the necessary input ports.

    DDX Node
    DDY Node

    Seems to be working, what values do you have on the node?

    upload_2019-7-9_11-48-49.png



    That's an interesting case but I'm afraid I'm not entirely sure if you can have that type of eye-side rendering control using only ASE; could be wrong here, it's something that requires additional investigation.

    Regarding the depth, you can offset it but, for what you describe, you will have to create your own shader template if you want to override it with your own texture.

    I would recommend discussing it in our discord, I believe one of our users was building something very similar.

    Amplify Creations Discord
     
    Somawheels likes this.
  8. summerian

    summerian

    Joined:
    Jul 6, 2014
    Posts:
    128
    EDIT: Never mind. I had to set the camera to Deferred rendering.

    I was going to make a water shader with foam and thought I could use the Force Shield example as inspiration especially regarding the intersection part.

    It seems that the example is broken on my computer though. macOS with Radeon Pro Vega 56 8 GB. Unity version 2018.3.12f1




    This is what it should look like


    from the youtube video





    Any idea how I can make intersection work?
     
    Last edited: Jul 9, 2019
  9. bigbenmatt

    bigbenmatt

    Joined:
    May 30, 2019
    Posts:
    10
    So I figured it out. The camera was in Forward, so it was not rendering a depth texture by default. I had to write a little script to enable the depth texture on that specific camera and the problem was solved :)

    Capture.PNG
     
  10. Somawheels

    Somawheels

    Joined:
    Apr 5, 2019
    Posts:
    10
    Thank you so much. The mip-maps are indeed the problem. However, I still don't understand how to compute them with ddx and ddy. Do I need the world normals? If so, how do I get those in a post-process?

    Thank you very much for your help
     
  11. summerian

    summerian

    Joined:
    Jul 6, 2014
    Posts:
    128
    Having some trouble with a shader on iOS. It looks like the intersection (generating the foam at the coastline) is bleeding through the globe in a pixelated format. So coastlines from the other side are visible.

    Here's how the shader looks in the editor:


    Here's how it looks on device:


    Here are the settings:


    Any hints on how to fix it would be appreciated.

    Edit: Maybe it's because deferred rendering is not really supported by mobile devices.
     
    Last edited: Jul 10, 2019
  12. Horus_Sungod42

    Horus_Sungod42

    Joined:
    Oct 30, 2014
    Posts:
    67
    Hi y'all, I'm having an issue in amplify shaders:

    I want to use the grab pass to make a water effect (by scrolling a couple of normal maps), and in the built-in shader graph, there is an input in the Tex Coordinate node called UV. That lets me use the screen position to remap the grab pass on the model.

    However, in Amplify there is no UV input in the Coordinate node that would easily let me overwrite the UV.
    Is there a way to plug the screen position into a Tex Coordinate node in Amplify, as I'd do in the Shader Graph?


    This seems to do what I want:
    http://wiki.amplify.pt/index.php?title=Unity_Products:Amplify_Shader_Editor/Compute_Screen_Pos

    BUT, I still need to blend the normal maps to have the water distortion, and have not been able to figure out how to do so.


    boop.jpg
    The top image: in the shadergraph, the glass cube distorts only what is behind it.
    The bottom image: in Amplify, the entire grab pass of the image is mapped on the plane instead of being screen based, leading to strange interactions.
     
  13. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    1,387
    Hey there!

    Are you running the Editor with Metal?

    Edit: In that case, it could be just a matter of forcing the Camera to render the Depth texture in Forward, as mentioned below.

    Ah yes, my apologies I should have mentioned that earlier.

    Here's a reference script for anyone experiencing the same issue; you just have to call camera.depthTextureMode |= DepthTextureMode.Depth; on Awake.

    Code (CSharp):
    using UnityEngine;
    public class SetCameraDepth : MonoBehaviour
    {
        private void Awake()
        {
            GetComponent<Camera>().depthTextureMode |= DepthTextureMode.Depth;
        }
    }


    Sorry about that, I should have been more specific; one of our developers provided additional input as follows:
    You'll have to open the Shader Function used (Reconstruct World Position From Depth), change the mip type to Derivative, place DDX and DDY nodes plugged into the sampler, and plug the XY Screen Position into their inputs.
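    For intuition on why those derivatives matter: the GPU picks the mip level from how fast the UVs change per screen pixel, which is exactly what the DDX/DDY inputs supply. A rough sketch of that selection math in plain C# (simplified from what hardware actually does):

```csharp
using System;

public static class MipSketch
{
    // Approximate mip selection from screen-space UV derivatives:
    // the larger the UV step per screen pixel (in texels), the higher the mip.
    public static double MipLevel(double dudx, double dvdx, double dudy, double dvdy, int texSize)
    {
        double dx = Math.Sqrt(dudx * dudx + dvdx * dvdx) * texSize; // texels per pixel, x
        double dy = Math.Sqrt(dudy * dudy + dvdy * dvdy) * texSize; // texels per pixel, y
        return Math.Max(0.0, Math.Log(Math.Max(dx, dy), 2.0));
    }
}
```

Reconstructed world positions have huge derivative spikes along depth discontinuities (the foliage edges above), which forces the lowest mips and produces the grey fringing; feeding well-behaved Screen Position derivatives avoids that.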

    Let us know if you run into any specific issues.

    Can you share a sample for further examination on our side?



    Hey there, have you tried using the Grab Screen Position Node? I recommend checking our Refracted Shadows sample for a specific example.

    Can you elaborate on the normal map use?

    Thanks!
     
    Last edited: Jul 10, 2019
  14. KnuckleCracker

    KnuckleCracker

    Joined:
    Dec 27, 2011
    Posts:
    63
    Regarding the Texture Array node. The docs indicate the "tex" port can be used. However, I get a shader compilation error when I hook it up: "Shader error in 'MyShader': undeclared identifier 'sampler_Texture0' at line 31 (on d3d11)".

    I have locked the tex node to "Locked to Texture 2D Array". I've also included a default texture array on the tex node. It looks fine in the node preview; however, the generated shader has the error.

    The problem seems to be that the generated shader code still has "Texture0" defined as a "2D" rather than "2DArray" property and also doesn't call "uniform UNITY_DECLARE_TEX2DARRAY( _TextureArray0)".

    Now, I can mess around with changing the Texture Array node to "Reference" mode, define another texture array node, and set it up as the reference to get the shader to generate more correct code. But if that's the route to take, then it would seem the tex input port on the Texture Array node shouldn't be present.
     
    Last edited: Jul 10, 2019
  15. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    1,387
    Hey there,

    Not sure what could be happening, can you share an example for further examination?

    What's your current Unity and ASE version, and renderer used?

    Thanks!
     
  16. Horus_Sungod42

    Horus_Sungod42

    Joined:
    Oct 30, 2014
    Posts:
    67
    Cool, that worked! My mistake was plugging the screen position plus the normal for the waves into the offset of a Tex Coordinate node; instead, it should just be plugged directly into the UV of the grab pass!

    Wack.
     
  17. KnuckleCracker

    KnuckleCracker

    Joined:
    Dec 27, 2011
    Posts:
    63
    Head slap.... I had a virtual texture object hiding out in my graph rather than a texture object (I had been swapping things around, and the boxes look the same in the graph). If I use a Texture Object and force the cast to Texture 2D Array, the shader generates correctly. So it was the virtual texture object that was causing the error when connected to the texture array's tex input.

    Not that it matters, but I'm using Unity 2018.2.21f1 and ASE 1.6.8.

    On a slightly tangential topic; Is there a way to create a custom expression node that accepts as input the output of a texture object node that is cast to a Texture 2D Array? I'd like to do some custom manipulation of an array texture in a custom expression node.
    ASE1.jpg
     
    Last edited: Jul 11, 2019
  18. BOXOPHOBIC

    BOXOPHOBIC

    Joined:
    Jul 17, 2015
    Posts:
    199
    @Amplify - Hi. I'm still getting the same issue with the PPTemplate in VR, latest ASE/Unity 2019.1.9.
    I have this simple scene and ASE setup:
    upload_2019-7-11_18-32-27.png
    upload_2019-7-11_18-32-54.png

    I tested the scene on Android, both multi-pass and single-pass; the result is always the same (don't mind the inverted color, I just used linear depth):
    66246441_1604719386328611_3209284427830001664_n.png

    And here is a screenshot from the Unity editor:
    upload_2019-7-11_18-35-9.png

    I tried a lot of stuff but I couldn't find a solution yet. If I find something, I will let you know.

    Edit: I just found this article; it has a lot of useful info:
    https://unity3d.com/how-to/XR-graphics-development-tips
     
    Last edited: Jul 11, 2019
  19. lorddesu

    lorddesu

    Joined:
    Aug 20, 2014
    Posts:
    25

    Attached Files:

    Last edited: Jul 12, 2019
  20. Pancar

    Pancar

    Joined:
    Mar 11, 2013
    Posts:
    22
    Hi
    When I try to use a PPS shader with the PPS Tool, my screen looks like this: Screen Shot 2019-07-12 at 09.22.15.png

    I imported the PPStackTemplates package; before importing it, I saw a triangle in front of the camera.
     
  21. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    1,387
    Good to know!

    Ah good to know you found the cause.


    We're actually using a Unity macro that declares the array and sampler, so you might need to dig a bit deeper to do the same with a custom expression.
    We're using:
    UNITY_DECLARE_TEX2DARRAY to declare the array,
    UNITY_SAMPLE_TEX2DARRAY to fetch it.

    You could use the Texture2DArray type with the custom expression, but you'll need to find out how Unity is doing it in order to replicate how it's fetched; we don't have this information readily available as we're simply using the macro.


    Hey there,

    Just to be sure, did you pick up the latest version from our website?

    I will pass this on to the developer that handled the initial request.


    Hello, a good place to start is the Remap Node; you can see it in action in the video below.



    You also have a variety of blending nodes at your disposal, such as Lerps, Blend Operations, and Weighted Blends, to name a few.
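    For reference, the Remap node's operation is just a linear range conversion; a minimal sketch in plain C#:

```csharp
public static class RemapSketch
{
    // Linear range remap: maps v from [oldMin, oldMax] to [newMin, newMax],
    // the same operation the Remap node performs per component.
    public static double Remap(double v, double oldMin, double oldMax, double newMin, double newMax)
    {
        return newMin + (v - oldMin) / (oldMax - oldMin) * (newMax - newMin);
    }
}
```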

    That's odd, do you see the same issue on a project without Vuforia?
    What's your current Unity version?

    Thanks!
     
    Last edited: Jul 12, 2019
  22. Pancar

    Pancar

    Joined:
    Mar 11, 2013
    Posts:
    22
    Hi,
    I created a new project without Vuforia (Unity 2018.8.3f1, ASE v1.6.8 rev 00); screenshots here
     

    Attached Files:

  23. summerian

    summerian

    Joined:
    Jul 6, 2014
    Posts:
    128
    I'm trying to pixelate my opacity output, but I have trouble converting the pixelated output to the correct format.

    This works:


    This doesn't work:


    Is there a way to format the output from the "Step (Input)" node so Opacity accepts it?

    Thanks!
     
  24. KnuckleCracker

    KnuckleCracker

    Joined:
    Dec 27, 2011
    Posts:
    63
    Yeah, it seems ASE will need to add support for passing a texture array to a custom expression for it to work (add another item to the Input Type dropdown). I could almost do it using the custom field, but the need for parentheses disallows it.

    To receive a texture array in a function, Unity macros are also used. As an example, if you put in a triplanar shading node, set it to receive a texture array, then hook up a texture array, this is the code that ASE generates in the shader:

    To call the function, the UNITY_PASS_TEX2DARRAY macro is used:
    float4 triplanar61 = TriplanarSamplingSFA( UNITY_PASS_TEX2DARRAY(_TextureArray0), ...

    The function receives using this macro:
    inline float4 TriplanarSamplingSFA( UNITY_ARGS_TEX2DARRAY( topTexMap ), ....

    There's no way I can enter anything in the custom type field to get ASE to generate "UNITY_ARGS_TEX2DARRAY( topTexMap )" in the custom expression function (because of the parentheses after the texture array name).

    However, this should in principle be possible, but only by adding some support to ASE.

    If anyone knows some other way to pass and receive the texture array, I'm all ears. I can't figure out how to use "Texture2DArray" as a type and get Unity to compile the shader. I've only gotten the UNITY_PASS and UNITY_ARGS macros to work.
     
  25. bigbenmatt

    bigbenmatt

    Joined:
    May 30, 2019
    Posts:
    10
    This might not be strictly an ASE issue, but I'm going to ask anyway. I have a custom terrain material (made in ASE) with a property I want to animate at runtime. Has anyone done this before? Is there any way to do it? How do you reference a terrain material via animation or script? I could use global parameters, but I don't really want to do that.

    Any advice would be great!

    Ben
     
  26. Somawheels

    Somawheels

    Joined:
    Apr 5, 2019
    Posts:
    10
    upload_2019-7-13_0-33-50.png

    Hello, I did this just as you described; however, it did not fix the issue. And before you ask: yes, I did re-compile both the material function and the shader that is using the function. Have you any other suggestions?
    Thank you
     
  27. MiniDarkOF

    MiniDarkOF

    Joined:
    May 4, 2017
    Posts:
    15
    Hello everyone, I've always wanted to buy the Amplify Shader Editor (ASE), but with recent Unity updates I've seen that Unity now has its own shader editor. Is it still worth buying ASE?

    My english is a bit rusty.
    Thanks Anyway,
    - Emanuel Messias, R.d.S
     
  28. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    1,387
    Just to be sure we have the same exact configuration on our side, what's your current PPS version listed in the package manager?

    Thank you for elaborating, we appreciate it. Upon further investigation, it does seem to be a bit problematic as different platforms have specific requirements. We will indeed have to add this on our side, we've opened a ticket which we hope to tackle soon.

    It does seem like the terrain material property changes are not being picked up, given that you can't access the renderer as you would with regular objects. I'm not entirely sure what to recommend, as we've never really come across this limitation, but the best way around it would probably be to add a material control script, so to speak, to your terrain that would allow you to animate those properties instead. Is this something you feel you could try on your side?
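    A minimal sketch of such a control script, assuming a custom terrain material with a float property ("_SnowAmount" here is purely a placeholder name):

```csharp
using UnityEngine;

// Hypothetical "material control script": exposes a float that an Animator
// (or other scripts) can drive, and forwards it to the terrain's material.
[RequireComponent(typeof(Terrain))]
public class TerrainMaterialController : MonoBehaviour
{
    [Range(0f, 1f)] public float snowAmount; // animate this field

    private Material m_material;

    private void Start()
    {
        var terrain = GetComponent<Terrain>();
        // Instantiate a copy so runtime changes don't write back to the asset.
        m_material = Instantiate(terrain.materialTemplate);
        terrain.materialTemplate = m_material;
    }

    private void Update()
    {
        m_material.SetFloat("_SnowAmount", snowAmount);
    }
}
```

On older Unity versions you may also need the terrain's material type set to Custom for materialTemplate to take effect.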

    Can you share a sample so that one of our ASE developers can tackle it directly?

    Thanks for asking!

    We're a bit biased of course, but we would have to say yes, absolutely. ASE continues to be improved at a fast rate and remains an open (full source) and flexible solution for shader development in Unity. We're not constrained by the overall Unity roadmap; this is an editor built for the requirements of its community.

    More than an editor, you'll get responsive support, over 60 samples to get you started, HD and LW SRP support, Legacy Rendering support (Unity's editor is limited to SRP), Shader Functions, Shader Templates, a Custom Node API, and a set of learning resources.

    Be sure to join our Discord community for realtime discussions with other ASE users.
     
  29. BOXOPHOBIC

    BOXOPHOBIC

    Joined:
    Jul 17, 2015
    Posts:
    199
    I found the problem; it's not on your side. The depth is only sampled correctly if the PostProcessEvent is set to BeforeTransparent in the post-processing script, and, for both _MainTex and _CameraDepthTexture, the Screen Position is connected to the UV port.
    upload_2019-7-15_17-0-14.png

    Hope it helps :)
     
  30. e199

    e199

    Joined:
    Mar 24, 2015
    Posts:
    99
    Hi, I'm making an outline effect, here is the simplified graph. I'm using LWRP.
    upload_2019-7-15_17-42-29.png

    When I try to connect the inverted normal:
    upload_2019-7-15_17-44-23.png

    How can I invert the faces in the shader?
    ----------
    Update: it works if I switch the shader from Unlit to PBR.
     
    Last edited: Jul 15, 2019
  31. bigbenmatt

    bigbenmatt

    Joined:
    May 30, 2019
    Posts:
    10
    This is something I can definitely do on my side; I just wanted to see if there was another example of it. Thanks for your help :)
     
  32. bigbenmatt

    bigbenmatt

    Joined:
    May 30, 2019
    Posts:
    10
    Does ASE support Gradient Nodes a la Shadergraph?
     
  33. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    1,387
    Thanks for the heads up, we really appreciate it.
    I will pass this on to the ASE developers.

    Interesting, we will look into it asap.
    Do you mean which faces are culled? You can adjust that in the SubShader parameters under "Cull Mode".

    Happy to hear it. Afraid not but be sure to let us know if you run into any issues.


    While you can achieve similar results, ASE currently does not provide Gradient Nodes; this is definitely something we are considering for the near future.

    Be sure to check our 3-color gradient example.
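    For anyone curious, the 3-color gradient idea reduces to two chained lerps per channel; a minimal sketch in plain C# (the same wiring works with Lerp nodes in a graph):

```csharp
public static class GradientSketch
{
    private static double Lerp(double a, double b, double t) => a + (b - a) * t;

    // Three-stop gradient: A -> B over t in [0, 0.5], B -> C over [0.5, 1].
    // Apply per color channel.
    public static double Gradient3(double a, double b, double c, double t)
    {
        return t < 0.5 ? Lerp(a, b, t * 2.0) : Lerp(b, c, t * 2.0 - 1.0);
    }
}
```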

    Thanks!
     
    Last edited: Jul 16, 2019
  34. summerian

    summerian

    Joined:
    Jul 6, 2014
    Posts:
    128
    Is there a way to convert COLOR to SAMPLER2D?
     
  35. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    1,387
    I'm afraid not, perhaps we can help achieve what you're looking for in another manner. Please do elaborate ;)
     
  36. summerian

    summerian

    Joined:
    Jul 6, 2014
    Posts:
    128
  37. cyuxi

    cyuxi

    Joined:
    Apr 21, 2014
    Posts:
    35
    Hi,
    We are trying to make shaders for terrain.
    Our project requirements and details in brief:
    - Runs on iOS (iPhone 7 or above)
    - The terrain should support at least 8 splats (2 control, 8 color, 8 normal)
    - Unity 2018.3.14 LWRP (4.xx)

    We have made the terrain with Gaia and CTS. It is 3.5 km in size and has 4 splats, which runs at a steady 30+ framerate on device. However, it seems CTS's shader is a bit heavy, so we decided to try making a custom shader with ASE.

    We followed the Terrain shader guide, but the SimpleTerrainFirstPass node keeps popping up an error log:
    Shader error in 'ASESampleShaders/Terrain/SimpleTerrainFirstPass': unrecognized identifier 'Input' at line 149

    where in code:
    void SplatmapFinalColor( Input SurfaceIn , SurfaceOutputStandard SurfaceOut , inout fixed4 FinalColor )

    I am not good at programming, so I just tried my luck by replacing 'Input' with InputData, which eliminated the error log, but I have no idea what to replace SurfaceOutputStandard with...

    And both examples, SimpleTerrain and TerrainSnowCoverage, went down due to errors. So we currently lack decent references and information regarding the workflow of making terrain shaders.

    So, is making terrain shaders such as multi-pass 4+ splatmaps still feasible with ASE?
    Looking forward to any suggestions and advice. Thanks!
     
    Last edited: Jul 18, 2019
  38. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    1,387
    Sorry about that, not sure how I missed it!

    The Tex input on your Texture Samples expects a Texture Object node; in this case, you would likely need to pixelate the actual noise texture instead of plugging it in there. We would be happy to look at a sample.


    ASE is a good choice for that but, based on the shader name listed above, it seems that you might be using a sample made for legacy rendering. Have you had the chance to check the included Terrain LWRP sample? (Package "LW SRP Samples")

    Looking forward to your reply!
     
  39. mortenblaa

    mortenblaa

    Joined:
    Sep 3, 2014
    Posts:
    9
    Hi, I don't know if this is the place to report this, but I believe I have run into a bug with LWRP, instanced properties, and material property blocks. The shader is pink when setting a material property block, and it gives an error during compilation:

    Code (csharp):
    Shader error in 'MyUnlit': undeclared identifier 'MyUnlitArray' at line 106 (on d3d11)

    Compiling Fragment program with INSTANCING_ON
    Platform defines: UNITY_ENABLE_REFLECTION_BUFFERS UNITY_USE_DITHER_MASK_FOR_ALPHABLENDED_SHADOWS UNITY_PBS_USE_BRDF1 UNITY_SPECCUBE_BOX_PROJECTION UNITY_SPECCUBE_BLENDING UNITY_ENABLE_DETAIL_NORMALMAP SHADER_API_DESKTOP UNITY_COLORSPACE_GAMMA UNITY_LIGHT_PROBE_PROXY_VOLUME UNITY_LIGHTMAP_FULL_HDR
    Steps to reproduce (Unity 2019.1.10f1 and ASE 1.6.8):
    1. Set Unity to use a Lightweight Render Pipeline asset.
    2. Create a new Amplify Shader.
    3. Set the Shader Type to "Lightweight Unlit" (also applies to "Lightweight PBR").
    4. Create a Color node and set the type to "Instanced Property".
    5. Connect the Color node to Color output in shader template.
    6. Compile the shader.
    7. Create a new material and assign it to multiple objects.
    8. Enable "GPU Instancing" on the material.
    9. Apply the following script to each object:
    Code (CSharp):
    using UnityEngine;

    public class SetMaterialPropertyBlock : MonoBehaviour
    {
        private MaterialPropertyBlock m_block;

        private void Start()
        {
            m_block = new MaterialPropertyBlock();
            var rend = GetComponent<Renderer>();

            rend.GetPropertyBlock(m_block);
            m_block.SetColor("_Color", Random.ColorHSV());
            rend.SetPropertyBlock(m_block);
        }
    }
    Funnily enough, using the legacy "Unlit" shader type works fine. Is it OK to use the legacy Unlit shader type with the LWRP?
     
  40. bigbenmatt

    bigbenmatt

    Joined:
    May 30, 2019
    Posts:
    10
    Hiya

    Could you demonstrate how to use the _CameraOpaqueTexture in the LWRP? I've included the render feature but can't seem to access the texture in a Sampler2D :? Does the shader have to be transparent to get access to that texture? I've tried using the Grab Screen Color node, but I can't access the parameter to change it to _CameraOpaqueTexture.

    Ben
     
  41. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    1,387
    Greetings,

    I would recommend updating your ASE version as we recently corrected a few issues that could be related. Should the problem persist, we could really use a sample with the problem present for further examination on our side. We don't recommend using legacy shaders with LWRP, I'm not entirely sure what type of problems could arise in your specific situation.

    Be sure to get it directly from our website: Amplify Product Downloads

    What's your LWRP version?


    We've actually made a small update to allow for that, so be sure that you're using the latest ASE version; I recommend picking it up from our website. To get the Opaque texture, simply enable it in your Lightweight Render Pipeline asset and use the Grab Screen Color node; do set the queue to Transparent on the left, but you won't need to fetch the texture by name.

    Let us know if you run into any issues, thanks!
     
  42. Tracecat

    Tracecat

    Joined:
    Mar 20, 2014
    Posts:
    15
    Hi!

    Is there a way to select an HDRP StackLit master node from a template? I just downloaded Amplify and couldn't find it. That would be nice, because I would like to incorporate some elements from Unity's "measured materials" library into Amplify shaders.
     
  43. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    1,387
    Not at the moment, definitely something to consider for future HD related updates.

    Apologies for the inconvenience.
     
  44. DGordon

    DGordon

    Joined:
    Dec 8, 2013
    Posts:
    434
    We have a project that's basically what you're asking about. I had actually asked the same question about a single shader for left/right eye textures on here, and the answer is yes. We replaced our two-sphere approach with a system based on 1 shader on 1 sphere written in ASE. Definitely doable. It ended up evolving into a whole custom system using color masks for collision detection, depth masks for cursor distance, animations, etc., but yeah, it started with your exact question of whether it's possible to use 1 sphere and 1 shader for the left/right eye :).

    I also use a depth mask to determine the pointer's distance from the camera, so it simulates 3d pretty well.

    Attached is a screenshot from ... some version ... of our shader. We needed to be able to turn the VR effect on/off with a fade effect (project stuff, don't worry about why), and also needed to be able to change which eye is shown in the editor ... so if you don't need that, ignore it. The main thing is unity_StereoEyeIndex being used to determine which texture is shown.

    I wrote a pretty extensive C# script to control and automate all of this, including creating materials, setting positions in the world based on pixel x/ys that map to Photoshop, and so on. You definitely don't need to stick with two spheres and two materials for each image :).
     

    Attached Files:

    Last edited: Jul 19, 2019
    Cleverlie and Amplify_Ricardo like this.
  45. Pancar

    Pancar

    Joined:
    Mar 11, 2013
    Posts:
    22
     
  46. cyuxi

    cyuxi

    Joined:
    Apr 21, 2014
    Posts:
    35
    Hi Ricardo, thank you for pointing out the right sample for us. We are currently looking into the FourSplatsFirstPass function to see if we can make some modifications. The structure of the function is very clean to read and learn from; so far so good, but a small part about calculating tangents is confusing: if you multiply by 0, wouldn't it always produce 0?
    2019722-星期一-103744.jpg
     
  47. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    1,387
    It does now! Pick up the latest version from our website.

    That triangle would probably appear if you attempted to generate a PPS effect using an incorrect shader type; I'm assuming that's not the case, but I was unable to replicate the problem otherwise. Have you had the chance to check the included Mosaic PPS sample? If so, is the issue also present there?

    Are you simply trying to convert the Sobel example?

    That's actually a hack that lets ASE know that we need both Vertex Normals and Vertex Tangents to be present in your shader's Vertex Data. It's actually discarded, as the tangent calculations required are done further ahead in the graph using a Custom Expression. This hack of sorts shouldn't be required in more recent versions; we may revisit this later on.

    Thanks!
     
  48. Pancar

    Pancar

    Joined:
    Mar 11, 2013
    Posts:
    22
    Hi Ricardo, the Mosaic sample works correctly, and yes, I just converted the Sobel example. Do you have any advice for this case?
     
  49. mortenblaa

    mortenblaa

    Joined:
    Sep 3, 2014
    Posts:
    9
    I tried downloading the latest version from your website, but the problem still exists. I've now tried on both Windows 10 and macOS 10.14.5.

    Unity 2019.1.10f1
    Lightweight RP v5.7.2
    Amplify Shader Editor v1.6.8.007 (16th July 2019)

    Just to repeat:
    Create a material with an "instanced property", like a color property. Enable GPU instancing on the material. Change the color property using a material property block.

    This is the error on macOS
    Code (CSharp):
    Shader error in 'Unlit': undeclared identifier 'UnlitArray' at line 106 (on metal)

    Compiling Vertex program with INSTANCING_ON
    Platform defines: UNITY_ENABLE_REFLECTION_BUFFERS UNITY_USE_DITHER_MASK_FOR_ALPHABLENDED_SHADOWS UNITY_PBS_USE_BRDF1 UNITY_SPECCUBE_BOX_PROJECTION UNITY_SPECCUBE_BLENDING UNITY_ENABLE_DETAIL_NORMALMAP SHADER_API_DESKTOP UNITY_COLORSPACE_GAMMA UNITY_LIGHT_PROBE_PROXY_VOLUME UNITY_LIGHTMAP_FULL_HDR
     
  50. cyuxi

    cyuxi

    Joined:
    Apr 21, 2014
    Posts:
    35
    2019722-星期一-221514.jpg

    Hi Ricardo, Thanks for your advice.

    I have made some progress today by adding a height blend feature to the function. The result looks OK, and all 8 splats with normals are rendered; the only missing part is associating the HeightBlend parameter across the FirstPass and the AddPass, so currently there is no Height Blend on the border between the two passes, only Control Blend.

    I tried applying the HeightBlendWeight to the Alpha, but that produces blacked-out areas with pixelated aliasing on the AddPass side. I couldn't figure out why it turns black with Additive. I also tried switching the blend mode to alpha blend and did some random tweaks to try my luck... none was right... Maybe I need some fresh air... or maybe there is a trick to turn this around that I just missed...
    If you don't mind, please have a look at the sample project I uploaded. Thanks!
    (P.S. The project settings are already set; it can simply be opened with Unity 2018.3.14 as a project.)
    2019722-星期一-221537.jpg
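    Not a fix for the AddPass issue specifically, but for reference, a typical height-based blend sharpens the control weights by each layer's height and then renormalizes; the renormalization is easy to lose when the layers are split across two passes, which can drive weights to zero (black). A minimal sketch in plain C# (parameter names are illustrative, not from the sample):

```csharp
using System;

public static class HeightBlendSketch
{
    // Sharpen splat control weights by per-layer height: the tallest layer
    // wins near transitions; 'sharpness' controls the blend band width.
    public static float[] HeightBlend(float[] control, float[] height, float sharpness)
    {
        var w = new float[control.Length];
        float maxW = float.MinValue;
        for (int i = 0; i < w.Length; i++)
        {
            w[i] = control[i] * height[i];
            maxW = Math.Max(maxW, w[i]);
        }
        float sum = 0f;
        for (int i = 0; i < w.Length; i++)
        {
            // Keep only weights within 'sharpness' of the strongest layer.
            w[i] = Math.Max(w[i] - maxW + sharpness, 0f);
            sum += w[i];
        }
        // Renormalize so the weights still sum to 1; skipping this, or doing
        // it per pass with different totals, is one way to end up with black.
        for (int i = 0; i < w.Length; i++) w[i] /= sum;
        return w;
    }
}
```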
     

    Attached Files:

    Last edited: Jul 22, 2019