Discussion in 'Assets and Asset Store' started by Amplify_Ricardo, Sep 13, 2016.
It's hard to create a custom material editor.
I guess I'll just have to wait for news on the bug, then. :(
Ah yes, the Inverse Lerp was a user request; we regularly use our own Shader Function system to add new content to ASE.
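For anyone unfamiliar with the node: Inverse Lerp is just a standard lerp solved for t. A quick Python sketch of the underlying math (function names are mine, not ASE's):

```python
def lerp(a, b, t):
    # Standard linear interpolation: blends a toward b as t goes from 0 to 1.
    return a + (b - a) * t

def inverse_lerp(a, b, v):
    # Reverses lerp: given a value v between a and b, returns the t
    # for which lerp(a, b, t) == v. Undefined when a == b.
    return (v - a) / (b - a)
```

In shader terms this is the normalization step you'd use to remap, say, a height value in [a, b] into a [0, 1] mask.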
Should be negligible, I'll confirm with the devs.
It can be challenging, especially because there isn't much official documentation out there.
I recommend checking this:
Hey there, apologies for the delay, the update is now available. Please update, re-save your shader, and let us know if the problem persists.
Is there a way to modify the Z-depth using textures like how Impostors can make a quad write a 3D depth into the scene and intersect with other objects and cast proper shadows?
Hi, is it possible to use ASE-written shaders for Post Processing V2 effects in a Unity project without ASE? Just copying the shader and C# code written by the tool to another project does run, but with issues in the expected on-screen effect: it's not shown in the Scene view, and the texture scrolls off rather than wrapping. (Context: I've been using ASE to make shaders for group hobby projects, committing just the shader code so I don't inadvertently share ASE itself in source control.) Thanks
Possibly, but it's not something we have on the editor side as an option, some work is required.
Image Effects made with ASE do not need the editor. What seems to be the problem, and what files are you copying along? Removing ASE by itself should have no effect whatsoever on your shader; could you be missing one of our default textures?
I made this waterfall shader and the mesh still casts shadows despite having it turned off; I only have Receive Shadows on. What am I missing? *EDIT: Nvm, I figured it out
I've copied the shader code, the required texture file (not an ASE one), and the C# file generated by ASE for working with post-process shaders into a different project that has never had ASE installed. Both projects are on the same Unity and Post Processing V2 stack versions.
In the original project the post-process effect shows in the Scene view, and the panned texture covers the whole screen and loops continually as expected. In the non-ASE project I can assign the post-process effect, but it only shows in the Game view, and the texture doesn't cover the whole screen and scrolls off the bottom rather than looping round.
That is odd, thanks for elaborating. Have you tried importing ASE into that project for a quick test? Just to be sure; the editor itself does not touch anything at all, which is why it's quite odd.
I would request that you send us the shader, script, and textures, directly to firstname.lastname@example.org so that we can further examine the problem; a screenshot of it working and not working would also be very helpful.
Thanks, everything works fine!
When I went to reimport and set up to get your requested examples it is now working! So no idea what's going on there but I can get them into club projects so I'm happy, thanks
I am targeting PowerVR on iOS, so I would like to set precision to 'half', but I heard that Unity casts and recasts between 'half' and 'float' during runtime, erasing this optimization. Is this true?
Hey there, what was the context?
This thread made me rethink it:
I understand vertex and most vector stuff will be float, but if Unity is recasting pixel depth then it seems silly to bother setting precision.
How can I make sure I get normals based on world space (e.g. plugging in a (0,1,0) Vector3 would always make the normals face upwards) with the spherical billboard option? No matter what I do, rotating the object changes the normals of the model as well.
I would refer to this page, hardware plays a part: https://docs.unity3d.com/Manual/SL-DataTypesAndPrecision.html
Can you show us your setup?
Thanks for digging that up. It was this bit from the PowerVR dev docs that originally caught my attention, but I think this is a futile effort or at least it's better to focus on something else for now.
"While all mobile architectures are great at half-precision computation, PowerVR is exceptionally good at it, so make sure you use it wherever you can. Using half-precision (FP16) in shaders can result in a significant improvement in performance over high precision (FP32). This is due to the dedicated FP16 Sum of Products (SOP) arithmetic pipeline, which can perform two SOP operations in parallel per cycle, theoretically doubling the throughput of floating point operations. The FP16 SOP pipeline is available on most PowerVR Rogue graphics cores, depending on the exact variant."
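To make the trade-off concrete, Python's struct module can round-trip a value through IEEE 754 half precision, showing where FP16 starts losing exact integers (purely illustrative, nothing PowerVR-specific):

```python
import struct

def to_half(x):
    # Round-trip a float through IEEE 754 half precision (FP16, struct format 'e').
    return struct.unpack('<e', struct.pack('<e', x))[0]

# FP16 has 11 bits of effective mantissa, so integers are exact only up to 2048;
# beyond that, values snap to the nearest representable half-float.
```

That resolution is plenty for colors and normalized vectors, which is why half is usually safe in fragment shaders but risky for world positions or UVs tiled across large surfaces.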
By setting (0,1,0) in the Local Vertex Normal, you're assigning a new normal in Object Space; having the Billboard active will affect this value in order to stay consistent with the vertex adjustment.
"How can I make sure I get normals based on world space"
This is always local/object space, not world space; what exactly are you trying to achieve? Do you have a reference for the effect, perhaps a shot?
I suppose we could consider adding an option to the Billboard that would prevent it from adjusting the normal but we'd like to be sure this is what you need.
I'm defining a rolling cloud shape in world space, and I use a model of stacked planes with billboards that always face the camera to simulate something like raymarching, but as you can see the normals also change depending on the viewing angle or the orientation of the object. And it creates these normal changes no matter what I plug in there.
Amplify version: 1.8.8.000
Unity version: 2020.1.14f1
Renderer: Whatever the default is
Lighting setup: 3 directional lights, no shadows. 1 baked reflection probe.
Using post-process layer
Render type is set to opaque, see attached
Hello everyone. Does anyone know how to make the Screen Position node return the center instead of the lower-left corner?
Thanks for the additional details, we're going to add a Normal toggle to the Billboard parameters; should resolve the issue. We'll notify you directly when it's available.
Thanks for the additional details.
-Are you using baked lighting? Please show us your actual Lighting Window, screenshot would be great.
-We can't examine the shader, the ASE metadata has been removed. Is there a specific reason for this?
Please feel free to reach out to us via email@example.com if you prefer.
Hey there, we'll need some additional context.
Amazing! Thank you.
I am creating a procedural shader that can be rotated and offset to the center of the screen.
Also, is it possible for you to make a version of the Atan2 node whose range is [0,1]?
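A common way to get an Atan2 in [0, 1] is to divide the standard [-π, π] result by 2π and add 0.5; in Python terms:

```python
import math

def atan2_01(y, x):
    # math.atan2 returns an angle in [-pi, pi]; scale by 1/(2*pi)
    # and shift by 0.5 to remap it into [0, 1].
    return math.atan2(y, x) / (2.0 * math.pi) + 0.5
```

In ASE you could wrap the same remap into a Shader Function around the stock Atan2 node.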
Hello, does anyone know how to import a function file and use it with Custom Expressions?
I am using the Universal/PBR shader, which doesn't contain the "Additional Directives" section on the main node.
There is no description for "Normal Create".
Please let me know where to find it.
This node has the following note.
"This node requires Amplify Texture to be imported into the current project."
Are there any other similar nodes?
If there is a list somewhere, someone please let me know.
It would be helpful to me.
So no one knows how to solve my problem? I posted the shader code above.
I was actually using UV Set 4. The frame was titled "UV5 Selection"; changing the frame title to "UV4 Selection" of course did nothing. At a loss on how to fix this.
You can create a Shader Function from this node network, but I will pass it on to the devs.
We're actually going to add that option soon but you can achieve the same results by adding your own CGINC and calling it from a custom expression.
What shader type are you using, our own URP PBR? Directives can be found here:
Yes, some of the more recent shader functions do not have a page yet. In any case, you can double-click the node to see what's being done. It's similar to the Normal Create node in Shader Graph.
Similar, for what purpose?
That node is very specific, it's a legacy node used by one of our Plugins(Amplify Texture) that has since been retired from the store.
Replies can take some time, especially on weekends. For quick questions, I definitely recommend using our Discord server, we have specific channels for ASE discussion.
You're using the Rotator node, you have to input the center offset value directly in the Anchor input.
What are you referring to by "frame"? Did you re-save your shader?
Just saw that your shader was made with 1.8.7, update, change the value, re-save your shader.
Removed my previous answer; this node is still using UV5.
Manipulations with center offset in Anchor input do not give the desired result.
Improved my volumetric explosion shader. No scripts, no extra libraries, no post processing tricks, correct depth sorting with solid objects and fully modifiable flame / opacity. Also pretty lightweight on the GPU.
Thanks, that fixed it.
But I cannot seem to update beyond v1.8.7 rev 12. I tried deleting my AmplifyShaderEditor folder and redownloading, but it still keeps installing v1.8.7 rev 12.
Edit: Downloading directly from your website fixed it.
Hiya, how's things? I have a few questions:
[edited] - emailed
I need to access light attenuation in URP; it seems to be missing. I would be using height and normals to add extra self-shadowing, and that's not possible in URP...
I need to access the final pixel colour (this is important) so I can add dither / in-shader grading etc. at the shader level, not the post level, so it has to be the pixel after lighting... (see https://forum.unity.com/threads/ton...rocessing-for-oculus-quest-or-mobile.1043182/) - also in URP
Where can I find bent normal support in URP? I have been looking... I have great bent normal and AO baking capabilities here.
Please let me know how I can achieve these in amplify as they're the only things preventing me using ASE instead of custom code.
Hi again, are there plans to support the creation of Post-Process shaders like built-in but for HDRP?
Anyone have any examples of a hair shader with anisotropic specular highlights? Right now I'm using Transparent with Cull Mode: Off, but I'm getting a see-through effect. With anything else, either I get a cutout effect on the transparency, or the hair is not "full". Any tips or examples would be greatly appreciated!
Check "Unpack Normal Map" in the "Texture Sample" node and save it.
When I open the editor again, the "Unpack Normal Map" is unchecked.
Am I using it incorrectly?
save & close
I think you need to change the "Default Texture" to Bump. Then check the Unpack Normal.
thank you for the advice!
I tried it, but the condition didn't change.
After trying a little more, I've noticed.
If I open the Shader in an editor, the reported symptom does not occur.
When I open Material in an editor, the reported symptom occurs.
With the Material open in the editor, manipulate the values from the Inspector (for example, Tiling or Render Queue). This will immediately uncheck "Unpack Normal Map" in the "Texture Sample" node.
I meant the center of the texture, you still have to account for position.
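For anyone following along: rotating around an arbitrary center is translate, rotate, translate back. A minimal Python sketch of the math (my own helper, not ASE code):

```python
import math

def rotate_around(u, v, angle, cx, cy):
    # Move the pivot (cx, cy) to the origin, apply a standard 2D rotation,
    # then move the result back.
    du, dv = u - cx, v - cy
    c, s = math.cos(angle), math.sin(angle)
    return (du * c - dv * s + cx, du * s + dv * c + cy)
```

Once the screen position has been shifted so its origin sits where you want, that pivot is what goes into the Rotator's Anchor input.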
You might have experienced a Unity Package Manager cache problem there, deleting those files could help according to other users. (unrelated to our editor)
You'll need to use the SRP shader function for Additional Lights when doing Custom Lighting stuff with URP; this is demonstrated in the video below. We're using LWRP there, but it's the same for URP. (sample in the description)
Try using a custom expression for the final color; check our URP terrain sample.
Bent Normals on URP, not sure if this is possible at the moment; has Unity put something out for URP with this?
Yes, for a future ASE version; we have it partially working but it's not something we can publish at the moment.
Maybe a Masked (cutout) alpha multiplied by a simple dither could do the trick; transparent surfaces are problematic. Is that a single mesh?
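To sketch the dither idea (a hedged illustration, not ASE node output): ordered dithering compares alpha against a repeating screen-space threshold pattern, so a cutout shader can fake partial transparency. In Python:

```python
# Normalized 2x2 Bayer thresholds; real shaders often use 4x4 or 8x8 patterns.
BAYER_2X2 = [[0.125, 0.625],
             [0.875, 0.375]]

def dithered_cutout(alpha, px, py):
    # Keep the pixel only when alpha beats the local threshold, so a
    # 50%-alpha surface keeps half its pixels in a fixed screen pattern.
    return alpha >= BAYER_2X2[py % 2][px % 2]
```

Because every surviving pixel is fully opaque, depth writes and sorting behave like any other cutout surface, which is the appeal over true transparency.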
What's happening there is that you're adding textures that are not normal maps, so the Sampler node automatically changes to adapt. This is not a problem if the shader is closed, but if it's open it will change; try it: add a regular texture and then a normal map.
Don't place textures that are not Normals on a Texture Sample node with Unpack checked, that's just going to give you incorrect results. What's your goal here?
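For context on what "unpacking" does: a normal map stores each [-1, 1] component remapped into the [0, 1] color range, and unpacking reverses that (ignoring compressed DXT5nm-style encodings). Roughly, in Python:

```python
def unpack_normal(r, g, b):
    # Reverse the [0, 1] -> [-1, 1] storage remap: n = 2 * c - 1 per channel.
    return (2.0 * r - 1.0, 2.0 * g - 1.0, 2.0 * b - 1.0)

# The typical flat normal-map blue (0.5, 0.5, 1.0) unpacks to (0, 0, 1);
# feeding a non-normal texture through this yields meaningless vectors.
```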
Heya! Thanks for the video (fun vid!), but custom lighting isn't available in URP from what I can see. That final slot doesn't exist in URP; perhaps I missed a step!
Is there a place I can put in code for editing the final pixel colour after lighting etc. has been done? I must have missed it, I'm tired, or maybe it's not showing up in the URP version.
You might have missed a bit; you'd use the Unlit URP shader type for custom lighting, along with the SRP Additional Light node as demonstrated, which is required for URP.
Regarding the Bent Normal, you'd probably need to create your own URP Template (which you can base off our own) in order to modify how the Normal is used. I'll consult with the devs for additional details in the morning.
My mistake there: unlike the built-in terrain shader, the URP terrain version does not use a Custom Expression. Let me get back to you on that.
I see.. I think we crossed wires! I want to use Unity's lighting but alter their attenuation so that I don't have to rebuild their lighting, but modify their attenuation with my own from bent normal/ao + my own stuff.
As for the final pixel, that's something really, really great for mobile and VR. It means we can tone map, dither, and grade in model (shader) space instead of post-process. As you can imagine, a very big deal for VR.
Thanks for following up, I guess my use cases are nontypical.
Instead of creating a dedicated texture, I just wanted to reuse the texture.
Thanks for the advice!
Ah, in that case it's a bit beyond our editor; you really need to dig into their code. Perhaps our URP Lit template could help you get started implementing your own, though we wouldn't recommend it lightly.
Regarding the final color:
Turns out our template does not allow for that; the function that does the final lighting calculations is called after all the input code is generated, meaning that we don't have access to that final value in our current inputs.
The only way around this is to, once more, edit the actual template file. e.g. create a new input port for code declared after the final lighting calculations; provided that you know which data to access, you'd then use a custom expression for whatever manipulation you may require. (based on dev feedback, this has not been tested)
We're talking about the Texture Sampler auto-adjustment, what's the context of what you're referring to?
There are ways to re-use a texture, what's your goal? Please elaborate.
How do I recompile all shaders in one click?