
Shader Forge - A visual, node-based shader editor

Discussion in 'Assets and Asset Store' started by Acegikmo, Jan 11, 2014.

  1. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    You can't customize or create custom passes in SF at the moment, so your best bet would be to use two meshes in the same location, or to make two shaders in SF and paste them together into one file manually, though you might run into some issues.
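
    (As a rough sketch of what the hand-merged result tends to look like - this is not actual SF output, and the names and pass bodies below are placeholders - two generated shaders become two Pass blocks inside one SubShader, with their Properties merged at the top:)

    Shader "Custom/TwoPassCombined" {   // placeholder name
        Properties {
            _MainTex ("Texture", 2D) = "white" {}
            _OutlineColor ("Outline Color", Color) = (0,0,0,1)
        }
        SubShader {
            Tags { "RenderType"="Opaque" }
            // Pass 1: body copied from the first generated shader
            Pass {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"
                sampler2D _MainTex;
                struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };
                v2f vert (appdata_base v) {
                    v2f o;
                    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                    o.uv = v.texcoord.xy;
                    return o;
                }
                fixed4 frag (v2f i) : COLOR { return tex2D(_MainTex, i.uv); }
                ENDCG
            }
            // Pass 2: body pasted in from the second generated shader
            Pass {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"
                fixed4 _OutlineColor;
                float4 vert (appdata_base v) : SV_POSITION {
                    // e.g. a slightly inflated hull as a cheap second pass
                    return mul(UNITY_MATRIX_MVP, float4(v.vertex.xyz * 1.02, 1.0));
                }
                fixed4 frag () : COLOR { return _OutlineColor; }
                ENDCG
            }
        }
    }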
     
  2. dogzerx2

    dogzerx2

    Joined:
    Dec 27, 2009
    Posts:
    3,971
    That's ok! :)

    Man this editor is amazing!!

    Hey... is there any chance I could make a transparent material with correct Z ordering? (It's for a 2D game, but I get some bad Z ordering with the default transparent material.) Or maybe I can control the Z order with the Z position?
     
    Last edited: Feb 3, 2014
  3. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
  4. dogzerx2

    dogzerx2

    Joined:
    Dec 27, 2009
    Posts:
    3,971
    I think I can work with just the clip transparency! It'll hardly show the difference!

    Although... on a scale from 1 to 10, what are the chances that Shader Forge can add a secondary "Main node", that works as a second pass? And this happens in the next couple of months?

    :3~

    Anyway, this editor opens so many doors to Unity game developers! I've only been messing with it for an hour and the possibilities are endless!
     
    Last edited: Feb 3, 2014
  5. kenshin

    kenshin

    Joined:
    Apr 21, 2010
    Posts:
    940
    Yes, it happens with high-poly meshes too, but I discovered another thing: the problem is reduced if the camera is very far away from the object.
    Let me know if you need additional info or tests.

    PS: Thanks for this useful update! :)
     
  6. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Well that's something that would require *massive* restructuring, simply because the main node doesn't generate just one pass, it can generate about 6 passes at most, depending on your settings, so it's not a 1:1 mapping.

    I have thought about making a "manual mode", in which you set up both passes and the vert/frag split manually, but it would be a long way down the road, so don't wait for it (sadly)
     
  7. dogzerx2

    dogzerx2

    Joined:
    Dec 27, 2009
    Posts:
    3,971
    Oh well, never mind then! :p I might end up winging it; maybe I can add a second pass manually by script. :-0
     
  8. Airborn-Studios

    Airborn-Studios

    Joined:
    Oct 31, 2012
    Posts:
    30
    I know your question was regarding Shader Forge, I just wanted to show you what you are doing so you understand it better.
    What you will need is any of the three inputs I gave you; looking at the node list I could easily find "Scene Color" and "Scene Depth" :)
    Those should be your inputs to blur.

    @Acegikmo: how would I go about using a normal map as an input to, for instance, construct the Lambert shading? I know I should use Normal Direction and set it to perturbed to use the normal input of the shader - but what if I want to use a different normal map, or say a MIP level of the original normal map? I can't just input the normal map as I would do in UDK? Ah, got it, I just need to transform it from tangent to world space?
     
    Last edited: Feb 3, 2014
  9. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    What do you want to do? Generally you'll only want to push normal maps into the normal input of the main node, which is usually enough :)
     
  10. Becoming

    Becoming

    Joined:
    May 19, 2013
    Posts:
    781
    Man, it feels good to zoom out; taking a step back does so much for the overview! Thanks!

    edit: were the cables of the selected node highlighted before too, or is this a new feature that's not mentioned in the changelog? I'm not sure if it's new or if I just didn't notice it until now :D
     
    Last edited: Feb 4, 2014
  11. Doddler

    Doddler

    Joined:
    Jul 12, 2011
    Posts:
    269
    So I have a bit of an issue I'm working on. I'm working on a crossfade shader for textures with alpha transparency. The core of it is that it fades between two textures in such a way that it looks like the scene with the destination texture is being faded in on top of the original scene. It's kind of a mess, but basically, I use the following to get the current color from the screen:



    This works most of the time, and the results are what you'd expect. However, I've run into an issue where, if I use the shader on an object rendered by a second camera that's not being cleared beforehand (used for rendering UI/effects over the scene), the scene color that's being returned has its UVs flipped vertically. I can't for the life of me figure out why it would be flipped, though; perhaps something to do with the way textures are handled internally? Any idea why Scene Color would give the wrong values?
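
    (For reference, a hedged sketch of the kind of flip check that's commonly used for this symptom on Direct3D; screenPos and _GrabTexture are assumed names, not the code from the post above:)

    // _ProjectionParams.x is -1 when Unity renders with a flipped projection
    // (common with render textures on Direct3D), so the V coordinate can be
    // inverted before sampling the grabbed scene color.
    float2 sceneUV = i.screenPos.xy / i.screenPos.w;   // assumed: screenPos from ComputeScreenPos()
    if (_ProjectionParams.x < 0)
        sceneUV.y = 1.0 - sceneUV.y;
    fixed4 sceneColor = tex2D(_GrabTexture, sceneUV);  // assumed: GrabPass { "_GrabTexture" } earlier in the shader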
     
  12. FPires

    FPires

    Joined:
    Jan 5, 2012
    Posts:
    151
    This plugin is amazing. This last update made it even more amazing.

    Here's a shader I made with Shaderforge for the Slime enemy of my RPG:




    I have some requests:

    1) When I use the Depth Blend in a shader, SSAO stops working both in Forward and Deferred. Not sure if this is an issue with the generated shaders or a general Unity problem. Is there any way to fix it?

    2) Add a mode to convert a normal map texture set as a standard texture to Unity's normal map. I managed to do it with the existing nodes but it takes a lot of space.

    3) Add a way to control the normal map intensity.


    Anyway, fantastic work!
     
    Deon-Cadme likes this.
  13. Airborn-Studios

    Airborn-Studios

    Joined:
    Oct 31, 2012
    Posts:
    30
    There are plenty of use cases; say I want to create an eye shader with a different normal map for the cornea/specular (bulging out) than for the diffuse/iris (bulging in, with extra detail for the iris, pupil etc.),

    or a pre-integrated skin shader that uses MIP levels of the normal map to control how much the translucency blurs it.

    Which brings me to a new question: I can copy a 2D sampler, but can I instantiate that texture?

    and thank you sooooo much for the zoom function!
     
  14. donzen

    donzen

    Joined:
    Oct 24, 2009
    Posts:
    54
    Cool,
    you should use the fresnel node to achieve a wet look.
     
  15. RodolfoLasparri1

    RodolfoLasparri1

    Joined:
    Feb 24, 2013
    Posts:
    14
    That looks pretty good, but it's a bit too contrasty. I tried turning it into a half-lambert shader, something like this (from a UDK Team Fortress 2 shader tutorial):

    http://media.moddb.com/images/articles/1/29/28341/auto/002.jpg

    but I didn't know where to plug in the Light Attenuation. :confused: Sorry if this is super basic, I'm a Unity shader newbie.
     
  16. TRIAX GAME STUDIOS

    TRIAX GAME STUDIOS

    Joined:
    Nov 27, 2013
    Posts:
    49
    HI!

    @Acegikmo!

    FEATURE REQUESTS:

    1) We need a way to select a node with Ctrl+Click that also selects all nodes parented to it, so we can copy large trees easily. I think we need this even before the nested-nodes feature.

    2) The possibility to have two Shader Forge windows running at the same time, with copy/paste of nodes between windows, would be good - and is also needed even before nested nodes, as it gives us an alternative way to store and debug snippets.

    3) We should have a mask node where we can select up to 4 custom colors and output just those colors. This would let us use CMYK as well as RGBA masks, so we'd have support for CMYK splat-map images.

    4) Screenshots of really large node trees don't work, as they go beyond Unity's 4000 px limitation?

    4.1) Blur and distort nodes?

    -----

    QUESTIONS:

    5) Is there a way to calculate convexity?

    6) Is there a way to calculate occlusion?

    7) Is there a way to calculate geometry boundaries? Showing the wireframe and smoothly expanding it...

    8) Is there any node that emulates UDK's coordinates / object radius? http://goo.gl/BrQY6t

    9) Approximated translucency - is it possible? http://goo.gl/dgVeJR http://goo.gl/XBCdoh

    10) Can we do six-planar projected textures?


    Can you show us node setups for 5, 6, 7, 8, 9 and 10?


    As a customer, I would like an answer to my questions, if that's possible for you.


    THANK YOU SO MUCH!


    TRISH DIAN / TRIAX GAME STUDIOS technical developer
     
    Last edited: Feb 4, 2014
  17. Airborn-Studios

    Airborn-Studios

    Joined:
    Oct 31, 2012
    Posts:
    30
    As this image was created by me, I can say - all you need is in there ;)

    Half lambert is

    (Normal dot Light) * 0.5 + 0.5
    or
    (N.L) * 0.5 + 0.5
    but if you want to make the shading even flatter, I've recently tested

    (N.L) * X + (1 - X)
    where X can be 0.5 to create a half lambert, or something lower or higher to control the contrast of the shading.
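
    (In raw Cg that's just a couple of lines; a minimal sketch, assuming n and l are the normalized world-space normal and light direction, and wrap is a 0-1 float:)

    float ndotl       = dot(n, l);                    // standard Lambert term
    float halfLambert = ndotl * 0.5 + 0.5;            // classic half-lambert
    float wrapped     = ndotl * wrap + (1.0 - wrap);  // wrap = 0.5 reproduces the line above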
     
  18. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Nice one! Looks great :)

    I thought I had gotten rid of that by now, but apparently not. I'll have to look into this. Could you open a request on the feedback page?

    Not sure what you mean, could you elaborate?

    This can already be done: use a Lerp node, plug the normal map into B, a Vector 3 of (0,0,1) into A, and a slider between 0 and 1 into T :)
    That said, normal intensity should be handled in the texture itself to save instruction counts, but it's of course not always possible to do that if you want dynamic shaders etc.!
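
    (Written out as code, that node setup is roughly the following, with _BumpMap and _NormalIntensity as assumed property names:)

    // Blend the sampled tangent-space normal toward the flat normal (0,0,1).
    float3 n = UnpackNormal(tex2D(_BumpMap, i.uv));
    n = normalize(lerp(float3(0, 0, 1), n, _NormalIntensity));  // 0 = flat, 1 = full strength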

    Glad you like SF!

    Yes, this is what you use the Texture Asset node for :)
    As for MIP sampling, plug in a value somewhere between 0 and 8 into the MIP input of the Texture2D nodes
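
    (In code this corresponds to a tex2Dlod call, roughly as below; _MipLevel is an assumed float property, and explicit MIP sampling needs shader model 3.0:)

    // The .w component of the coordinate selects the MIP level to sample.
    float4 col = tex2Dlod(_BumpMap, float4(i.uv, 0, _MipLevel));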

    I'm glad you like the zoom function! Took a long time and had to fix a bunch of things to get it sorted, but now it finally works!

    The node screenshot I sent is already a "half-lambert"-supporting shader; the light falloff will look like the ramp texture provided, so if the light goes from start to finish, it will have that look :)

    Possibly, shouldn't be too hard to implement. That said, this should be pretty much remedied now that you can both zoom and box select, no?

    Yeah maybe, though it would require quite a lot of rewriting of code.

    You should be able to use the component mask node for that :)
    Though I'm not sure what this has to do with color spaces like CMYK

    Seems to work for me. Did you use the screenshot button?

    You can distort by offsetting and altering UVs before you plug them into another texture. Blurring can be done by sampling other MIP levels, though it might not look good enough in some cases.

    Nope, perhaps as a post-process shader, but not on the surface itself. You'll need to bake this beforehand

    Same as the question above, though it depends on what occlusion you're talking about

    Nope

    You can somewhat do that now, but not based on object scale, but rather based on world coordinates instead. There's no node for object radius/scale yet, so I'll have to add that at some point!

    Using custom lighting, yes
     
  19. GermyGames

    GermyGames

    Joined:
    May 20, 2012
    Posts:
    38
    In my case I'm lacking a lot of the fundamentals of working with shaders, so most of it can be pretty confusing :p I've learned a bit of Shaderlab and CG, but there's a lot of basic learning needed in my case.

    The one biggest suggestion I would have is documentation of what kind of data a node can take in, and can output. A lot of times I find myself scratching my head as to why I can't mix certain nodes together. Most of the time, it's something silly like multiplying a UV by an RGB value and expecting it to still feed into a UV input, but other times I find myself just guessing at what data type a node is looking for and will spit out.

    As far as converting between UDK and Shaderforge goes, I think the most important thing is to highlight the differences in naming and listing off what features work differently between the two.

    -----------------------------------------------

    That aside, I'm currently trying to figure out how to achieve an effect. I want to distort a texture using another texture so that I can get a ripply/melting look that can be animated.

    Here's what I have so far:
    $Screen Shot 2014-02-04 at 10.27.42 AM.png

    It looks okay, but I have to divide the multiplied UV by some arbitrary value to get the texture to be centered and scaled correctly. This number also seems entirely dependent on the size of the texture.

    If I don't divide the UV, I only see the bottom left corner of the texture. Is there a better way to achieve this effect, or is there a way for me to reposition and re-scale the texture by the amount that it was offset by?
     
  20. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    When dealing with distortion, you're usually better off using textures that are vectorized, such as normal maps. I'm not sure which texture you're using now, but a white value will offset and scale it diagonally, whereas a normal map would perturb it in various directions locally, without a general offset.
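
    (A minimal sketch of that kind of vectorized distortion, assuming _DistortTex, _DistortAmount and _MainTex as property names:)

    // The RG channels of an unpacked normal map are centered around zero, so they
    // push the UVs in local directions instead of offsetting the whole texture.
    float2 offset = UnpackNormal(tex2D(_DistortTex, i.uv)).rg * _DistortAmount;
    fixed4 col    = tex2D(_MainTex, i.uv + offset);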
     
  21. TRIAX GAME STUDIOS

    TRIAX GAME STUDIOS

    Joined:
    Nov 27, 2013
    Posts:
    49
    HI ACE!
    Thanks for the product support...
    (I needed to know whether some things are possible before I keep trying them.)

    If we had a way to select ANY color in Shader Forge - in this case an exact CMYK color - instead of just RGBA:
    - It would become much easier to do splat maps.
    - For example, you can bake CMYK splat maps in 3ds Max (or any 3D package that doesn't export alpha when baking) and use the colors directly, instead of going through the pain of building RGB + A manually in Photoshop. Overall, making RGBA splat-map images is a lot more difficult and time-consuming than CMYK.
    - If we could choose, select and separate a CMYK output from images directly in Shader Forge, it would make any production with heavy splat-map baking at least 10% easier.

    If we had a color picker and were able to select custom colors from the RGB/XYZ world, local and normal outputs, we could actually find more planar levels in one go, instead of relying on mathematical computations like X*Y/Z to find, for example, the edges or convexity of a geometry by sampling world-to-local-to-normal coordinates. We would just need to choose the color ranges that define the region of the normal we want to affect, and that way we could even do it visually. We can also do it with math, but that gets crazy, whereas simply picking colors makes it much easier to find the exact diagonals in normal, world and local space!

    A color picker next to each of the 4 mask outputs would be awesome for finding and selecting exact baked vertex colors, and in general for having unlimited color masks that we can sample from one texture - for example, 16 different color-mask splats from a single texture, just by baking individual color overlays in the 3D package and choosing the exact color with the color-picker interface in Shader Forge.

    In Shader Forge this could be done just by adding a color picker to each of the 4 outputs, to select exactly which color we want to find and use as a mask, instead of just the traditional fixed RGBA outputs...

    ^^

    Using custom lighting, I'm sure we would be able to invert the light falloff, convert world space to local, do some math on the normals, subtract from the diffuse, and have a complete (fake) ambient occlusion effect (the inverse of the lighting plus normal highlights)...

    If you were able to make lighting be cast from spheres, as you demonstrated, you should also be able to make shader-based, geometry-aware (fake) occlusion...

    We would just need to be able, in custom lighting, to choose one exclusive light that casts the occlusion effect, so only that single light contributes to and controls it. We should be able to use both custom lighting (to drive the fake occlusion) and common lighting at the same time...

    CONGRATULATIONS ON THE 0.23 RELEASE! IT'S A HUGE LEAP FORWARD!
    Thanks for listening and responding to all your customers' requests with so much effort and enthusiasm :)

    BEST REGARDS

    TRISH DIAN / Triax Game Studios
     
  22. sabba2u

    sabba2u

    Joined:
    Feb 3, 2012
    Posts:
    42
    I was wondering if there was a method to create a blur as well as how to create a cut off based on alpha like in this example:

    http://www.patrickmatte.com/stuff/physicsLiquid/

    We are close to finishing up our game, and SF has come at the right time when we are optimizing our water physics for H2FLOW.

    Right now we have a camera rendering to a render texture. It has a blur camera image effect with a mask, as well as a contrast image effect. If we could cut those down and do it all in the shader, we could get our game to run at 30 fps on much slower devices. Any ideas?
     
  23. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Well, the blur image effect is already a shader, so I don't think it will make much of a difference really.

    Plus, you can't do multi-pass blurring in SF at the moment, so a blur effect will most likely be quite expensive to do, and take a long time to create, as blur methods usually utilize for loops, which you can't do in SF at the moment
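
    (The hard cut-off half of the linked effect is cheap on its own, though; a minimal sketch, assuming the pre-blurred particles are already available in _MainTex and _Cutoff is a 0-1 property:)

    // Threshold the (already blurred) alpha to get the hard, liquid-style edge.
    fixed4 col = tex2D(_MainTex, i.uv);
    clip(col.a - _Cutoff);   // discards pixels whose alpha is below the threshold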
     
  24. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Sabba2u! :D Still working on the fluid eh?

    At some point you would have to output the blurred pixels to a texture and then pass it back in. You could try just outputting blurred/clamped particles in one go and see how it looks, but I think ideally you need to blur across multiple particles which requires all particles to be output first.
     
  25. Becoming

    Becoming

    Joined:
    May 19, 2013
    Posts:
    781
    I want to scale an object with vertex offset by multiplying the world position by a value. That works, but of course the scale will be skewed when moving the object away from the scene origin. To counteract this effect I want to subtract the object position from the world position, but it breaks the shader for a reason unknown to me - it turns pink. What is the reason?
     
  26. voxi

    voxi

    Joined:
    Dec 3, 2013
    Posts:
    60
    Found another project that complements this one nicely :)

    http://catlikecoding.com/unity/products/numberflow/

    Lets you generate lots of randomized, procedural noise in real time.

    Shader Forge + Numberflow = No bitmaps to download. You can make a lot of things with just a few functions.


    My question now:

    Will Shader Forge ever have nodes to generate Perlin, Voronoi or simplex noise? Please forgive me if this already exists.

    This application is based on noise, you can make thousands of textures :
    http://neotextureedit.sourceforge.net/
     
  27. FPires

    FPires

    Joined:
    Jan 5, 2012
    Posts:
    151
    Some of us like to use normal maps with an alpha map packed in, so instead of setting the texture to "Normal map" in Unity we have to import it as a standard texture and convert it to the normal map format in the shader (taking texture*2-1 and reconstructing b as sqrt(1 - r^2 - g^2), if I recall correctly).

    It's all doable with the current nodes, so my suggestion was more of a quality-of-life thing!
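
    (The conversion is only a few instructions; a minimal sketch, assuming the map is imported as a plain RGBA texture and sampled as _NormalAsColorTex:)

    float4 packedNormal = tex2D(_NormalAsColorTex, i.uv);   // texture NOT marked as "Normal map"
    float3 n;
    n.xy = packedNormal.rg * 2.0 - 1.0;                     // remap 0..1 -> -1..1
    n.z  = sqrt(saturate(1.0 - dot(n.xy, n.xy)));           // reconstruct b so the vector stays unit length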
     
  28. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    That's because world position is the position of the current *vertex*, but you're trying to access the position of it before you've moved it, and it's expecting it to have already been moved, so it breaks.

    You'll want to use object position instead, which is the world position of the object :)
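
    (A minimal sketch of the corrected vertex offset, assuming _Scale is a float property and using the built-in _Object2World matrix:)

    float3 worldPos  = mul(_Object2World, v.vertex).xyz;           // world position of the vertex
    float3 objectPos = mul(_Object2World, float4(0, 0, 0, 1)).xyz; // world position of the object pivot
    float3 offset    = (worldPos - objectPos) * (_Scale - 1.0);    // scales around the pivot; plug into Vertex Offset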

    Glad it works well!

    Generally you'll want to use textures, such as those from Numberflow, when doing Perlin noise and similar. I'm not sure if it's really possible or viable to generate noise at the GPU level; firstly because there's no random-value function, so you'll need to use a noise texture anyhow, and secondly because I haven't found much documentation on how to achieve these in real time.

    Also, I noticed you haven't responded to your review on the asset store, assuming you are "Voxi 3D"! If you're not, then nevermind :)

    Ah, right, this has been requested before too: http://shaderforge.userecho.com/topic/291454-unpack-normal-node/
    I might be able to throw this in for 0.24 :)
     
  29. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    620
    Thank you very, very much for the zoom function!
    I am not going to go as far as offering you my first born - but you get the point :)

    @Airborn-Studios:
    Neox a Unity User now? Kewl. :)
     
  30. laurenstjong

    laurenstjong

    Joined:
    Feb 28, 2013
    Posts:
    6
    Dear Ace,

    I really enjoy working with the ShaderForge. There is one thing bothering me at the moment though.

    As I work in Unity, I tend to navigate my scenes, the animation editor and other editors using Alt + left/right/middle mouse button, and I center selections by hitting F. I also select elements using Ctrl-click and Shift-click, and click-drag to select or deselect multiple nodes at once.

    edit: since a lot of artists use Wacom tablets for navigating and working in 3D packages, the scroll wheel as the only way to zoom out is also not ideal, because a tablet pen has no scroll wheel (which makes the Alt + mouse combination even more relevant).

    Is there any way the navigation in your editor could be brought closer to how I am used to navigating in Unity?

    I hope I am not the only one in this... otherwise I will have to learn and adjust :)

    Thank you so much for your awesome work
    Cheers
    Laurens
     
    Last edited: Feb 4, 2014
  31. FPires

    FPires

    Joined:
    Jan 5, 2012
    Posts:
    151
  32. Doddler

    Doddler

    Joined:
    Jul 12, 2011
    Posts:
    269


    Finally got my crossfade shader working. Shader Forge's Scene Color doesn't correct for the half-pixel offset issue you get in DirectX though, so half the shader is fighting to make it work well in a pixel-perfect environment on a Windows system. That's not really a Shader Forge problem, however; it's still an absolute joy to work with.

    I had a question though. Is it possible to specify a target shader version? I can edit the shader target manually from 3 to 2, and it runs fine, but I have to do this each time I open the shader. Admittedly not all the features would work properly under a 2.0 target, but it would be nice to specify either way since my project is targeting 2.0 devices.
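
    (For reference, the manual edit I mean is just the target pragma inside the generated CGPROGRAM block, which SF rewrites every time it saves:)

    #pragma target 2.0   // was: #pragma target 3.0 in the generated shader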
     
  33. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    I'll add a 2.0 force checkbox in 0.24 :)
    (It will be an experimental feature, which means I won't provide any bugfixing/support for it, unless the feature itself is completely broken)
     
  34. roach_779

    roach_779

    Joined:
    Aug 2, 2012
    Posts:
    37
    Shader Forge creates shader code, correct? If so, would the script be easily readable and tweakable in code? Sorry if this is a double post.
     
  35. Doddler

    Doddler

    Joined:
    Jul 12, 2011
    Posts:
    269
    Awesome, thanks! :)
     
  36. Steven

    Steven

    Joined:
    Dec 23, 2009
    Posts:
    43
    Hey Acegikmo - I am the nitwit with the vertex colours AO issue. It seems Blender does vertex colours backwards, just like everything else it does. In Blender, if I paint in blue, it shows up in the red/green channels (hence my blue channel being empty and my AO not working). If I paint a combination of red/green (yellow), it appears in the blue channel.

    Basically the point of this post is to let you know that Blender users get a weird result because of Blender, and to ask which program you did your demo object's vertex colours in.


    EDIT - I have continued looking into this and found out that Blender doesn't export an alpha channel for the vertex colours; could this throw off the vertex colour node?
     
    Last edited: Feb 5, 2014
  37. jcarpay

    jcarpay

    Joined:
    Aug 15, 2008
    Posts:
    561
    Thanks!
     
  38. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Yep, you can read and alter the code after SF has written it :)
    That said, your edited code will be regenerated by SF if you open it after altering.

    Strange, it seems reversed then yeah. You could add a One Minus node and plug the vertex color into it. However, this should already be sorted in Blender, strange.
    I personally use Maya for my modeling/vertex painting needs, which works great :)
     
  39. GermyGames

    GermyGames

    Joined:
    May 20, 2012
    Posts:
    38
    Thanks! That got me over that hurdle. I'm now trying to get the shader to work with Unity's built in sprites. It seems to still move the texture around instead of distorting it. I've tried the shader with 2D Toolkit sprites, which works fine, but it doesn't seem to work correctly with the native Unity 2d system. Do you know why?
     
  40. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    I haven't looked into their 2D shaders yet, so I suspect there are things in there I haven't accounted for yet
     
  41. GermyGames

    GermyGames

    Joined:
    May 20, 2012
    Posts:
    38
    Alrighty. Another question: How would I go about infinitely tiling together a texture that's moving with a panner?
     
  42. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    UV coords wrap around if the texture import settings are set to Repeat instead of Clamp. ??
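
    (Equivalently in code: with Repeat the panned UVs wrap automatically, and frac() makes the wrap explicit either way; _PanSpeed is an assumed float2 property.)

    float2 pannedUV = frac(i.uv + _Time.g * _PanSpeed); // _Time.g is time in seconds; frac() wraps into 0..1
    fixed4 col      = tex2D(_MainTex, pannedUV);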
     
  43. GermyGames

    GermyGames

    Joined:
    May 20, 2012
    Posts:
    38
    That's the ticket, thanks!
     
  44. voxi

    voxi

    Joined:
    Dec 3, 2013
    Posts:
    60
    I bought an asset store plugin called Vpaint for vertex painting. It works as advertised, just not with Shader Forge for some reason.
    https://www.assetstore.unity3d.com/#/content/9100

    The vertex colors seem to render properly in the diffuse node, but they don't work for masking for some reason. Opening the FBX in Blender, the colors are all perfect.

    It seems like the channels are swapped; does anyone have a workaround for this?

    Hopefully it is a cheap solution, I don't have any money left after buying Vpaint for this trivial task.

    I am going to write a review on the Vpaint store page warning that it does not work with Shader Forge, so no one else makes this mistake.

    Thanks in advance!


    EDIT: I had the wrong shader in the viewport model. PEBCAK
     
    Last edited: Feb 5, 2014
  45. RodolfoLasparri1

    RodolfoLasparri1

    Joined:
    Feb 24, 2013
    Posts:
    14
    ah that is great, I got it working, many thanks :)
     
  46. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Shader Forge doesn't handle vertex colors in any weird way, it's a direct read from the mesh vertex colors, so I'm not sure what could be wrong.

    Are you sure the masking didn't work as you expected? How did it look?
     
  47. voxi

    voxi

    Joined:
    Dec 3, 2013
    Posts:
    60
    Hi, and thanks,

    Seems that it is a Unity error. The Unity log never says anything about PEBKAC when it occurs.

    I had the wrong shader in the viewport. (All Unity's fault. I never make such mistakes... ever!)

    Sorry for the false alarm, I went back and triple-checked everything and found my mistake. I think I need a break today. I hear there is this place called "Outside" where there is sun, sky, and sometimes clouds.
     
  48. Doddler

    Doddler

    Joined:
    Jul 12, 2011
    Posts:
    269
    I'm sorry if I'm missing anything obvious on it, but is there an explanation for these stats anywhere?

     
  49. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Yes!
    Right here:

    This is the instruction counter bar, where you can roughly get an idea of how expensive your shader will be.

    From left to right:
    Instructions per-vertex
    Instructions per-pixel
    Texture samples per-pixel
    Texture samples per-vertex
     
  50. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,156
    Is there an effective way I could batch some textures in shader forge? Like, could I store my shaded texture and lit texture in the same texture2D asset and handle all that internally? I know that for some things (like monochromatic data) I can store up to three textures if I just separate the RGB values properly, but I feel like this could help reduce some overhead for my needs.
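
    (Roughly what I mean, as a hedged sketch - three grayscale maps packed into one texture's channels and split apart in the shader; _PackedTex and the mask names are just for illustration:)

    fixed4 packed     = tex2D(_PackedTex, i.uv);
    fixed  litMask    = packed.r;   // first grayscale map, e.g. the lit texture's mask
    fixed  shadedMask = packed.g;   // second grayscale map, e.g. the shaded texture's mask
    fixed  extraMask  = packed.b;   // third channel, free for any other monochrome data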