Discussion in 'Assets and Asset Store' started by Amplify_Ricardo, Sep 13, 2016.
That did the trick thank you!
We're building the game using SDK 6 but the TestKit is running 6.5. (6.5 is not compatible with version 2017.4 of Unity)
I tested with Unity 2019.1.9f1 on a clean project (only with the toon shader compiled with ASE 1.6.8.005) and I still have the issue. I forgot to mention that it only happens when the color space is set to Linear.
I zipped a very simple repro project so you can test and see if you can reproduce the issue.
I am having some problems getting pixel world position in my post-processing shader.
I am trying to make a post-process shader which projects a texture using worldspace as UVs. My shader looks like this:
The shader produces this result:
At first glance this may seem fine; however, upon closer inspection you will notice there is a grey pixelization around the edges of the alpha-masked foliage. This creates a very noticeable and ugly buzzing effect when the camera moves.
The issue is also present on the edges of geometry, however it is much less apparent.
What could be causing this issue?
Also, when I look at the foliage from afar, it becomes pure grey pixels.
Hi, I've been trying to get depth fade working for a basic shoreline and can't for the life of me figure out why it's not working. I went to Unity LWRP, then came back to 2018.3.8f1, and now it's not working. It doesn't look like it's reading the depth texture correctly, as you can see from my very rudimentary screenshots. Any help would be greatly appreciated.
Hi guys, I want to bring to life an idea I had, but I'm not sure if I will be able to do it with Amplify so any help will be appreciated.
So I have to do a 360 VR experience, and for that the artists will provide me with full 360 renders of different environments, interiors and exteriors. This is something I've done in the past: I create two spheres and make an unlit shader that renders the texture on them, replacing the mesh's UV projection with a custom per-pixel sphere projection for the texture. I then use this shader to render the left 360 image on the L-Sphere, which is on a layer that only the left-eye camera sees, and vice versa for the right-eye camera.
This was my approach in a previous project and it worked fine, although I would love to know if this is possible with only one sphere, making the shader render different textures for different eyes (in stereo rendering), all in one single material. This is my first question.
My second question is that I want to take this approach one step further: I'll ask the artists to not only render the scene as a 360 image, but also give me a 360 depthmap of the scene. I want to use this depthmap to override (or write to) the depth buffer in Unity, so that actual realtime 3D objects in the scene are occluded by the 360 image. I also want to learn how to do this for another project where I want some sprites to write depth information to the depth buffer, to achieve the same effect as in The Sims 1, where the furniture is all 2D sprites, but each has "depth" and can occlude the others, simulating 3D with per-pixel z-testing.
Here is what I mean by the Sims 1 technique:
and I think you do something similar for the Amplify Impostors since they have a full depth representation of the original mesh.
Anyone have any ideas on how to achieve this?? thanks in advance!
We ran a few tests and it seems we can replicate it to some extent. Not entirely sure what could be causing it but we will investigate.
Thanks for the heads up!
You'll have to do your own ddx/ddy calculations in order to avoid those artifacts, set your Texture Sample to "Derivatives" to reveal the necessary input ports.
Seems to be working, what values do you have on the node?
That's an interesting case but I'm afraid I'm not entirely sure if you can have that type of eye-side rendering control using only ASE; could be wrong here, it's something that requires additional investigation.
Regarding the depth, you can offset it but, for what you describe, you will have to create your own shader template if you want to override it with your own texture.
I would recommend discussing it in our discord, I believe one of our users was building something very similar.
Amplify Creations Discord
EDIT: Never mind. I had to set the camera to Deferred rendering.
I was going to make a water shader with foam and thought I could use the Force Shield example as inspiration especially regarding the intersection part.
It seems that the example is broken on my computer though. macOS with Radeon Pro Vega 56 8 GB. Unity version 2018.3.12f1
This is what it should look like
from the youtube video
Any idea how I can make intersection work?
So I figured it out. The camera was in Forward mode, so it wasn't rendering a depth texture by default. I had to write a little script to make that specific camera render the depth texture, and problem solved.
Thank you so much. The mip-maps are indeed the problem. However, I still don't understand how to compute them with ddx and ddy. Do I need the world normals? If so, how do I get those in a post-process?
Thank you very much for your help.
Having some trouble with a shader on iOS. It looks like the intersection (generating the foam at the coastline) is bleeding through the globe in a pixelated format. So coastlines from the other side are visible.
Here's how the shader looks in the editor:
Here's how it looks on device:
Here are the settings:
Any hints on how to fix it would be appreciated.
Edit: Maybe it's because deferred rendering is not really supported by mobile devices.
Hi y'all, I'm having an issue in amplify shaders:
I want to use the grab pass to make a water effect (by scrolling a couple of normal maps), and in the built-in shader graph there is an input in the Tex Coordinate node called UV that lets me use the screen position to remap the grab pass on the model.
However, in Amplify there is no UV input in the Coordinate node that would easily let me overwrite the UV.
Is there a way to plug screen position in a Tex Coordinate node in Amplify, as I'd do in the shadergraph?
This seems to do what I want:
BUT, I still need to blend the normal maps to have the water distortion, and have not been able to figure out how to do so.
The top image: in the shadergraph, the glass cube distorts only what is behind it.
The bottom image: in Amplify, the entire grab pass of the image is mapped onto the plane instead of being screen-based, leading to strange interactions.
Are you running the Editor with Metal?
Edit: In that case, it could be just a matter of forcing the Camera to render the Depth texture in forward, as mentioned below.
Ah yes, my apologies I should have mentioned that earlier.
Here's a reference script for anyone experiencing the same issue; you just have to call camera.depthTextureMode |= DepthTextureMode.Depth; on Awake.
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class SetCameraDepth : MonoBehaviour
{
    private void Awake()
    {
        // Force the camera to render a depth texture even in Forward rendering.
        GetComponent<Camera>().depthTextureMode |= DepthTextureMode.Depth;
    }
}
Sorry about that, I should have been more specific; one of our developers provided additional input as follows:
You'll have to open the Shader Function used (Reconstruct World Position From Depth) and change the mip type to Derivative, place DDX and DDY nodes plugged into the sampler, and plug the XY Screen Position into their inputs.
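For reference, the plain HLSL equivalent of that node setup looks roughly like this (just a sketch; _ProjectedTex and the UV names are placeholders for whatever your graph actually uses):

```hlsl
// UVs reconstructed from depth jump at geometry edges, so automatic mip
// selection picks far-too-small mips there (the grey fizzing pixels).
// Sampling with explicit derivatives of the continuous screen UVs avoids it.
sampler2D _ProjectedTex; // placeholder texture name

float4 SampleProjected(float2 worldUV, float2 screenUV)
{
    return tex2Dgrad(_ProjectedTex, worldUV, ddx(screenUV), ddy(screenUV));
}
```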
Let us know if you run into any specific issues.
Can you share a sample for further examination on our side?
Hey there, have you tried using the Grab Screen Position Node? I recommend checking our Refracted Shadows sample for a specific example.
Can you elaborate on the normal map use?
Regarding the Texture Array node. The docs indicate the "tex" port can be used. However, I get a shader compilation error when I hook it up: "Shader error in 'MyShader': undeclared identifier 'sampler_Texture0' at line 31 (on d3d11)".
I have locked the tex node to "Locked to Texture 2D Array". I've also included a default texture array on the tex node. It looks fine in the node preview. However, the generated shader has the error.
The problem seems to be that the generated shader code still has "Texture0" defined as a "2D" rather than "2DArray" property and also doesn't call "uniform UNITY_DECLARE_TEX2DARRAY( _TextureArray0)".
Now I can mess around with changing the Texture Array node to be mode "Reference", define another texture array node and set it up as the reference to get the shader to generate more correct code. But if that's the route to take, then the tex input port on the Texture Array shouldn't be present, it would seem.
Not sure what could be happening, can you share an example for further examination?
What's your current Unity and ASE version, and renderer used?
Cool, that worked! My mistake was plugging the screen position plus the wave normal into the Offset of a Tex Coordinate node; instead, it should just be plugged directly into the UV of the grab pass!
Head slap.... I had a virtual texture object hiding out in my graph rather than a texture object (had been swapping things around and the boxes look the same in the graph). If I use a Texture Object, and force the cast to Texture 2D Array, the shader generates correctly. So it was the virtual texture object that was causing the error when connected to the texture array tex input.
Not that it matters, but I'm using unity 2018.2.21f1 and ASE 1.6.8.
On a slightly tangential topic; Is there a way to create a custom expression node that accepts as input the output of a texture object node that is cast to a Texture 2D Array? I'd like to do some custom manipulation of an array texture in a custom expression node.
@Amplify - Hi. I'm still getting the same issue with PPTemplate in VR, latest ASE/unity 2019.1.9.
I have this simple scene and ASE setup:
I tested the scene on Android, both multi-pass and single-pass; the result is always the same (don't mind the inverted colors, I just used linear depth):
And here is a screenshot from the unity editor:
I tried a lot of stuff but I couldn't find a solution yet. If I find something, I will let you know.
edit: I just found this article, it has a lot of useful info in there:
Hello. I'm trying to create a recolor effect for a 2D sprite. I need to extract the dark, bright, and mid tones of a grayscale image, then remap each one to a custom color, then blend it all back together properly. Exactly like the After Effects Tritone/Tint effects: https://helpx.adobe.com/after-effects/using/color-correction-effects.html#tritone_effect
Can you help me please?
When I try to use a PPS shader with the PPS Tool, my screen looks like this.
I imported the PPStackTemplates; before importing this package, I saw a triangle in front of the camera.
Good to know!
Ah good to know you found the cause.
We're actually using a Unity Macro that declares the array and sampler so you might need to dig a bit deeper to do the same with a custom expression.
UNITY_DECLARE_TEX2DARRAY To declare the array
UNITY_SAMPLE_TEX2DARRAY To fetch it.
You could use the Texture2DArray type with the custom expression but you'll need to find out how Unity is doing it in order to replicate how it's fetched; we don't have this information readily available as we're simply using the Macro.
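For context, a minimal sketch of how those macros are used in shader code (_MyArray is just a placeholder name):

```hlsl
// UNITY_DECLARE_TEX2DARRAY expands to a Texture2DArray plus its sampler
// state on platforms that support texture arrays.
UNITY_DECLARE_TEX2DARRAY(_MyArray);

float4 SampleSlice(float2 uv, float slice)
{
    // The third coordinate selects the array slice to sample.
    return UNITY_SAMPLE_TEX2DARRAY(_MyArray, float3(uv, slice));
}
```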
Just to be sure, did you pick up the latest version from our website?
I will pass this on to the developer that handled the initial request.
Hello, a good place to start is the Remap Node; you can see it in action in the video below.
You also have a variety of blending nodes at your disposal such as Lerps, Blend Operations, and Weighted Blends to name a few.
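To illustrate the idea behind a Tritone-style remap, here is a rough shader-code sketch of what such a graph would compute (all property names here are placeholders):

```hlsl
// Tritone sketch: remap grayscale tones to three user colors,
// shadows -> midtones -> highlights, based on luminance.
float4 _ShadowColor;
float4 _MidtoneColor;
float4 _HighlightColor;

float3 Tritone(float3 rgb)
{
    float lum = dot(rgb, float3(0.299, 0.587, 0.114)); // luminance
    return lum < 0.5
        ? lerp(_ShadowColor.rgb, _MidtoneColor.rgb, lum * 2.0)
        : lerp(_MidtoneColor.rgb, _HighlightColor.rgb, (lum - 0.5) * 2.0);
}
```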
That's odd, do you see the same issue on a project without Vuforia?
What's your current Unity version?
I created a new project without Vuforia (Unity 2018.8.3f1, ASE v1.6.8 rev 00), screenshots here
I'm trying to pixelate my Opacity output, but I'm having trouble converting the pixelated output to the correct format.
This doesn't work:
Is there a way to format the output from the "Step (Input)" node so Opacity accepts it?
Yeah, it seems ASE will need to add support for passing a texture array to a custom expression for it to work (add another item to the Input Type dropdown). I could almost do it using the custom field, but the need for parentheses disallow it.
To receive a texture array in a function, Unity macros are also used. As an example, if you put in a triplanar node, set it to receive a texture array, then hook up a texture array, this is the code that ASE generates in the shader:
To call the function, the UNITY_PASS_TEX2DARRAY macro is used:
float4 triplanar61 = TriplanarSamplingSFA( UNITY_PASS_TEX2DARRAY(_TextureArray0), ...
The function receives using this macro:
inline float4 TriplanarSamplingSFA( UNITY_ARGS_TEX2DARRAY( topTexMap ), ....
There's no way I can enter anything in the Custom Type field to get ASE to generate "UNITY_ARGS_TEX2DARRAY( topTexMap )" in the custom expression function (because of the parentheses after the texture array name).
However, this should in principle be possible, but only by adding some support into ASE.
If anyone knows some other way to pass and receive the texture array, I'm all ears. I can't figure out how to use the "Texture2DArray" as a type and get unity to compile the shader. I've only gotten the UNITY_PASS and UNITY_ARGS macros to work.
This might not be strictly an ASE issue but I'm going to ask anyway. I have a custom terrain material (made in ASE) with a property I want to animate at runtime. Has anyone done this before? Is there any way to do this? How do you reference a terrain material via animation or script? I could use global parameters but I don't really want to do that.
Any advice would be great!
Hello, I did this just as you described, however it did not fix the issue. And before you ask, yes I did re-compile both the material function and the shader that is using the function. Have you any other suggestions?
Hello everyone, I've always wanted to buy the Amplify Shader Editor (ASE), but recently with Unity updates, I've seen that now Unity has its own shader editor, is it still worth buying ASE?
My English is a bit rusty.
- Emanuel Messias, R.d.S
Just to be sure we have the same exact configuration on our side, what's your current PPS version listed in the package manager?
Thank you for elaborating, we appreciate it. Upon further investigation, it does seem to be a bit problematic as different platforms have specific requirements. We will indeed have to add this on our side, we've opened a ticket which we hope to tackle soon.
It does seem like the terrain material property changes are not being picked up given that you can't access the renderer as you would with regular objects. I'm not entirely sure what to recommend as we've never really come across this limitation but the best way around this issue would probably be to add a material control script, so to speak, to your terrain that would allow you to animate those properties instead. Is this something you feel you could try on your side?
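As a sketch of that kind of material control script, assuming a custom terrain material with a float property named "_SnowAmount" (substitute your own property name):

```csharp
using UnityEngine;

[RequireComponent(typeof(Terrain))]
public class TerrainMaterialAnimator : MonoBehaviour
{
    // Animate this field with an Animator, a tween, or from other scripts.
    [Range(0f, 1f)] public float snowAmount;

    private Terrain terrain;

    private void Awake()
    {
        terrain = GetComponent<Terrain>();
    }

    private void Update()
    {
        // materialTemplate is the material asset itself, so changes persist;
        // assign an instantiated copy first if you don't want to edit the asset.
        terrain.materialTemplate.SetFloat("_SnowAmount", snowAmount);
    }
}
```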
Can you share a sample so that one of our ASE developers can tackle it directly?
Thanks for asking!
We're a bit biased of course, but we would have to say yes, absolutely. ASE continues to be improved at a fast rate and remains an open (full source) and flexible solution for shader development in Unity. We're not constrained by the overall Unity roadmap; this is an editor built for the requirements of its community.
More than an editor, you'll get responsive support, over 60 samples, HD and LW SRP support, Legacy Rendering support (Unity's editor is limited to SRP), Shader Functions, Shader Templates, a Custom Node API, and a set of learning resources to get you started.
Be sure to join our Discord community for realtime discussions with other ASE users.
I found the problem; it's not on your side. The depth is only sampled correctly if the PostProcessEvent is set to BeforeTransparent in the post-processing script, and the Screen Position is connected to the UV port for both _MainTex and _CameraDepthTexture.
Hope it helps
Hi, I'm making an outline effect, here is the simplified graph. I'm using LWRP.
When I try to connect inverted normal:
How can I invert the faces in the shader?
Update. It works if I switch shader from unlit to pbr.
This is something I can definitely do on my side; I just wanted to see if there was another example of this. Thanks for your help!
Does ASE support Gradient Nodes a la Shadergraph?
Thanks for the heads up, we really appreciate it.
I will pass this on to the ASE developers.
Interesting, we will look into it asap.
Do you mean which faces are culled? You can adjust that in the SubShader parameters under "Cull Mode".
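For reference, in the generated shader this corresponds to ShaderLab's culling state; a minimal sketch:

```shaderlab
Shader "Custom/CullExample"
{
    SubShader
    {
        // Back is the default; Front renders only back faces (what
        // inverted-hull outline techniques rely on); Off renders both sides.
        Cull Front

        // ... passes as generated by ASE ...
    }
}
```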
Happy to hear it. Afraid not but be sure to let us know if you run into any issues.
While you can achieve similar results, ASE currently does not provide Gradient Nodes; this is definitely something we consider including in the near future.
Be sure to check our 3-color gradient example.
Is there a way to convert COLOR to SAMPLER2D?
I'm afraid not; perhaps we can help you achieve what you're looking for in another manner. Please do elaborate.
I tried to describe it as good as possible in this post https://forum.unity.com/threads/bes...der-creation-tool.430959/page-97#post-4739774
We are trying to make shaders for terrain,
Our project requirements and details in brief:
- Run on iOS (iPhone 7 or above)
- Terrain should support at least 8 splats (2 control, 8 color, 8 normal)
- Unity 2018.3.14, LWRP (4.xx)
We have made the terrain with Gaia and CTS; it is 3.5 km in size and has 4 splats, which runs at a steady 30+ framerate on device. However, CTS's shader seems a bit heavy, so we decided to give making a custom shader with ASE a try.
We followed the Terrain shader guide, but the SimpleTerrainFirstPass node keeps popping this error:
Shader error in 'ASESampleShaders/Terrain/SimpleTerrainFirstPass': unrecognized identifier 'Input' at line 149
where in code:
void SplatmapFinalColor( Input SurfaceIn , SurfaceOutputStandard SurfaceOut , inout fixed4 FinalColor )
I am not good at programming, so I just tried my luck by replacing 'Input' with InputData, which eliminated the error, but I have no idea what to replace SurfaceOutputStandard with...
And both the SimpleTerrain and TerrainSnowCoverage examples break with errors, so we currently lack decent references and information on the workflow for making terrain shaders.
So is ASE still a promising option for terrain shaders, such as multi-pass shaders with 4+ splatmaps?
Looking forward to any suggestions and advice. Thanks!
Sorry about that, not sure how I missed it!
The Tex input on your Texture Samples expects a Texture Object node; in this case you would likely need to pixelate the actual noise texture instead of plugging it there. We would be happy to look at a sample.
ASE is a good choice for that but, based on the shader name listed above, it seems that you might be using a sample made for legacy rendering. Have you had the chance to check the included Terrain LWRP sample? (Package "LW SRP Samples")
Looking forward to your reply!
Could you demonstrate how to use the _CameraOpaqueTexture in LWRP? I've included the render feature but can't seem to access the texture in a Sampler2D. Does the shader have to be transparent to get access to that texture? I've tried using the Grab Screen Color Node, but I can't access the parameter to change it to _CameraOpaqueTexture.
I would recommend updating your ASE version as we recently corrected a few issues that could be related. Should the problem persist, we could really use a sample with the problem present for further examination on our side. We don't recommend using legacy shaders with LWRP, I'm not entirely sure what type of problems could arise in your specific situation.
Be sure to get it directly from our website: Amplify Product Downloads
What's your LWRP version?
We've actually made a small update to allow for that so be sure that you're using the latest ASE version, I recommend picking it up from our website. To get the Opaque texture, simply enable it in your Lightweight Render Pipeline Asset and use the Grab Screen Color Node; do set the queue to transparent on the left but you won't need to fetch the texture by name.
Let us know if you run into any issues, thanks!
Is there a way to select an HDRP StackLit master node from a template? I just downloaded Amplify and couldn't find it. That would be nice because I would like to incorporate some elements from Unity's "measured materials" library into Amplify shaders.
Not at the moment, definitely something to consider for future HD related updates.
Apologies for the inconvenience.
We have a project that's basically what you're asking about. I had actually asked the same question about the single shader for left/right eye textures on here, and the answer is yes. We replaced our two-sphere approach with a system based on 1 shader on 1 sphere written in ASE. Definitely doable. It ended up evolving into a whole custom system using color masks for collision detection, depth masks for cursor distance, animations, etc... but yeah, it started with your exact question of whether it's possible to use 1 sphere and 1 shader for the left/right eye.
I also use a depth mask to determine the pointer's distance from the camera, so it simulates 3d pretty well.
Attached is a screenshot from ... some version ... of our shader. We needed to be able to turn the VR effect on/off with a fade effect (project stuff, don't worry about why), and also needed to be able to change which eye is shown in the editor ... so if you don't need that, ignore it. The main thing is unity_StereoEyeIndex being used to determine which texture gets shown.
Wrote a pretty extensive c# script to control and automate all of this, including creating materials, setting the position in the world based on pixel x/ys that map to photoshop, and so on. You definitely don't need to stick with two spheres with two materials for each image .
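For anyone hunting for the core of that trick, the per-eye texture selection boils down to something like this (_LeftTex/_RightTex are placeholder properties):

```hlsl
sampler2D _LeftTex;   // 360 render for the left eye
sampler2D _RightTex;  // 360 render for the right eye

float4 SampleEye(float2 uv)
{
    // unity_StereoEyeIndex is 0 for the left eye and 1 for the right eye
    // during stereo rendering, so it can pick between the two textures.
    return lerp(tex2D(_LeftTex, uv), tex2D(_RightTex, uv),
                (float)unity_StereoEyeIndex);
}
```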
Hi Ricardo, thank you for pointing out the right sample for us. We are currently looking into the FourSplatsFirstPass function to see if we can make some modifications. The structure of the function is very clean to read and learn from, so far so good, but one small part about calculating tangents is confusing: if it multiplies by 0, wouldn't it always produce 0?
It does now! Pick up the latest version from our website.
That triangle would probably appear if you attempted to generate a PPS effect using an incorrect shader type; I'm assuming that's not the case but I was unable to replicate the problem otherwise. Have you had the chance to check the included Mosaic PPS sample? If so, is the issue also present there?
Are you simply trying to convert the Sobel example?
That's actually a hack that lets ASE know that we need both Vertex Normals and Vertex Tangents to be present in your shader's Vertex Data. It's actually discarded, as the required tangent calculations are done further ahead in the graph using a Custom Expression. This hack of sorts shouldn't be required in more recent versions; we may revisit this later on.
Hi Ricardo, the Mosaic sample is working correctly and yes, I just converted the Sobel example. Do you have any advice for this case?
Hi Ricardo, Thanks for your advice.
I have made some progress today by adding a height blend feature to the function. The result looks OK, all 8 splats with normals are rendered; the only missing part is associating the HeightBlend parameter across the FirstPass and the AddPass, so currently there is no Height Blend on the border between the two passes, only Control Blend.
I tried applying the HeightBlendWeight to the Alpha, but that produces blacked-out areas with pixelated aliasing on the AddPass side. I couldn't figure out why I get a black result with Additive. I also tried switching the blend mode to alpha blend and did some random twists to try my luck... none was right... Maybe I need some fresh air... or maybe there is a trick to turn this around that I just missed...
If you don't mind, please have a look at the sample project I uploaded. Thanks!
(P.S. The project settings are already set; it can simply be opened as a project with Unity 2018.3.14.)
Hi everyone! I'm very new to using Amplify Shaders and I'm struggling with something I think is very basic. I tried to look in the documentation and for tutorials, but I still haven't figured it out.
I want to create a circle mask that works in screen space: the circle is in the middle of the screen and only shows the content in that area. My issue is with creating the mask and placing it in the middle of the screen... do you know of any tutorials or resources I can look at to achieve this effect? Thanks so much!
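In case it helps frame the graph, here is a rough shader-code sketch of a centered screen-space circle mask (_Radius is a placeholder property, expressed as a fraction of screen height):

```hlsl
float _Radius; // placeholder, e.g. 0.25

float CircleMask(float4 screenPos)
{
    float2 p = screenPos.xy / screenPos.w - 0.5; // center of screen at (0,0)
    p.x *= _ScreenParams.x / _ScreenParams.y;    // correct aspect, keep it round
    return step(length(p), _Radius);             // 1 inside the circle, 0 outside
}
```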
Ah good, then it's more of a conversion issue here. I'll have to check it out; I don't think it's something you can convert just by changing the shader type, but I will check with the developer who created it.
That's odd, seems to be working on my end using our sample. Just to be absolutely sure, you're using a default shader with just the color, correct?
The ASE developer is aware of the reported issue and will run additional tests as soon as possible.
We appreciate your patience, thanks!
I'll have to pass that on to the developer; it's not something I have tackled before.