Convert to SRP HD

Discussion in '2018.1 Beta' started by Elecman, Jan 26, 2018.

  1. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    Sorry if this has been asked before (can't find anything) but how do I convert my existing project to the SRP HD pipeline?
     
    OfficialHermie and TooManySugar like this.
  2. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    Hi,

    I wrote a general answer, as this question will come up very often.

    Short version:
    There is no automatic way to convert from Built-in to HD. We only provide a facility to re-assign a few textures bound to the Standard shader to the HD Lit shader:
    Edit -> Render Pipeline -> Upgrade -> High Definition -> Upgrade Standard Materials to Lit Materials (note: the path may change in the future). The visuals will NOT MATCH. It is necessary to rework the assets and the lighting to take advantage of HD.

    Keep in mind HD is still experimental. Later we will try to provide a few more conversion tools, but we can't expect them to match; they just give you a starting point for reworking your data.

    Long version:

    Built-in Unity contains various mistakes regarding PBR and sRGB/gamma conversion for lighting, and HD also uses inverse square attenuation. For HD everything has been fixed, but as a result the visuals will no longer match.

    Main advice: when converting a project to HD, you should not aim at matching the visuals of built-in Unity, as those are exactly what we have attempted to fix. You should redo your lighting and assets to look good with HD.

    Also, the HD Lit shader packs ambient occlusion and the detail mask inside the metal/smoothness texture, so reusing the same texture as the Standard shader may introduce mistakes.

    Here is a list of behaviors that diverge from built-in Unity and that can help if you are doing a project conversion:

    • HD relies on a metric system: 1 unit == 1 m. This scale needs to be respected for lighting to work correctly.

    • Better pre-integration of cubemaps with the GGX BRDF. Built-in Unity pre-integrates cubemaps with the NDF, then applies a tweaked Fresnel term with roughness. HD pre-integrates the DFG term and applies it to the cubemap pre-integrated with the NDF. This better matches the reference (the reference being brute-force integration with the full BRDF (DFG)).

    • Pre-integration of GI (lightmaps/light probes) with Disney diffuse. Built-in Unity does nothing. HD applies the pre-integrated DFG term in a post step. This better matches the reference (the reference being brute-force integration with the full BRDF (DFG)).

    • HD light attenuation is inverse square falloff and uses linear intensity. There is a smooth terminator function to force the attenuation to 0 at the range boundary, and there is also an option to not apply the attenuation at all. Built-in Unity uses a texture to store the attenuation with a special falloff formula and uses gamma intensity. Going from gamma intensity to linear intensity is done with linear_intensity = gamma_intensity^0.4545. Note: as the attenuation is different, even using this formula gives no guarantee of a match. HD light intensity must be way higher than in built-in Unity. (See the sketch after this list.)

    • HD spot light attenuation uses 2 angles (inner angle and outer angle) to control the spot attenuation. Built-in Unity uses only one.

    • HD correctly performs a divide by PI of the whole BRDF. Built-in Unity has an inconsistency in the BRDF between specular and diffuse: specular is divided by PI but diffuse is not. This means that whatever effort is made to match lighting between vanilla and HD, one of the components will be off by a factor of PI.

    • HD interprets the influence parameter for Reflection Probes in a more artist-friendly way. Influence is inner influence (meaning the transition happens inside the volume). Built-in uses outer influence. This has been switched because artists prefer to set up their volume and then simply tweak the transition size. With outer influence, they need to update the volume size whenever they tweak the transition size (for indoor setups).

    • HD uses camera-relative rendering. Built-in Unity doesn't support it. It means that lights/objects sent to shaders in HD have a different translation.

    • The metal/smoothness and specular/smoothness workflows are handled inside the same Lit material.

    • The additive blend mode applies opacity, unlike the Standard shader.

    • DoubleSidedGI is automatically coupled to the two-sided lighting flag and is not exposed in the UI.

    • MotionVectors work as in built-in Unity (skinned or rigid transforms render motion vectors), but in addition, in HDRP, if a shader has an enabled MotionVector pass (velocity pass), it will render into the buffer independently of moving or skinning (so it handles vertex animation).

    Important:

    • HD uses camera-relative rendering.
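
    To make the light intensity point above concrete, here is a minimal sketch. It is not HDRP source code: the conversion formula is the one quoted above, and the windowing function is a commonly used form; HD's exact attenuation code may differ.
    Code (csharp):
    // Minimal sketch, not HDRP source.
    float ConvertGammaIntensityToLinear(float gammaIntensity)
    {
        // linear_intensity = gamma_intensity^0.4545 (a starting point only, no visual match guaranteed).
        return pow(gammaIntensity, 0.4545);
    }

    float InverseSquareAttenuation(float dist, float range)
    {
        float atten = 1.0 / max(dist * dist, 0.0001);
        // Smooth window that forces the attenuation to reach 0 at the range boundary
        // (a common form; HD's terminator function is not necessarily this one).
        float window = saturate(1.0 - pow(dist / range, 4.0));
        return atten * window * window;
    }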
     
    Last edited: Jan 29, 2018
    dog_funtom, m4d, cubrman and 7 others like this.
  3. JakubSmaga

    JakubSmaga

    Joined:
    Aug 5, 2015
    Posts:
    417
    To put it more briefly: is it possible to convert our projects to HDRP completely, or do we need to create new projects with the HDRP template and redo everything?
     
    Last edited: Jan 29, 2018
  4. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    Thanks for the writeup. I have Unity 2018.1.04b and there is no Edit -> Render Pipeline menu. It could have something to do with the convoluted method of downloading a GIT repo, fixing dependencies, etc.; I never got that part to work. Could that be the cause? Why not make it easier to test?
     
  5. JakubSmaga

    JakubSmaga

    Joined:
    Aug 5, 2015
    Posts:
    417
    Edit -> Project Settings -> Graphics, first field.
     
  6. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    Hmm... the template version is an old one (maybe 2 months old); I'm not sure where it was located at that time.
     
  7. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    HD is still experimental, so you shouldn't start a production on it. The template is pretty old, and when we update it your project will break (various settings have moved, and there are different ways to author them).

    Converting a project with our script just means the Standard shader is swapped for the Lit shader and a few textures are assigned automatically. That's all. It doesn't handle the lighting or the settings (like the sky settings). So what you really save by "converting" your level is the level layout plus the texture assignments.
    I would recommend starting from scratch if you can afford it, but many people can't.
     
    JakubSmaga likes this.
  8. JakubSmaga

    JakubSmaga

    Joined:
    Aug 5, 2015
    Posts:
    417
    Is there a recommended time/date/HDRP version when we will be able to start working on our new HDRP projects without them breaking after an SRP/HDRP update?
     
  9. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,982
    When it's actually released?
     
  10. JakubSmaga

    JakubSmaga

    Joined:
    Aug 5, 2015
    Posts:
    417
    I mean, I know that. But probably no one wants to wait 5-9 months just to start working on their new project or convert their current one.
    Surely at some point there will be an experimental build, close to the final release, that just works and won't break; that's why I was asking.

    I'm just excited about this feature and I want to get my project on it as fast as I can.
     
  11. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    We don't recommend starting a production project on HD before GDC 2018;
    we have not yet defined a release date for HD. The goal of the experimental version is to experiment with toy projects.
     
    DanielDickinson and JakubSmaga like this.
  12. colincw

    colincw

    Joined:
    Jan 21, 2017
    Posts:
    4
    Why pack AO with metal/rough and not height? I feel like baking AO into textures is kind of outdated and now it's either dynamic or handled by the lightmapper. Is it just to bring it in line with UE?
     
    OfficialHermie likes this.
  13. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    There is no way to satisfy all users with a default packing; every project has its own needs.
    The best option would be to let you select the channel dynamically in the UI, but this implied more code complexity and edge cases. Alternatively we could develop a packing tool in the texture importer, but there is no plan for that for now.
    Shader Graph is coming to help with that; with it, you will have the freedom to choose your packing.

    Feel free to edit the shader code to use the channel that you need. It all happens in LitDataIndividualLayer.hlsl.
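
    As a purely hypothetical illustration (this is not the actual LitDataIndividualLayer.hlsl source, and the variable names may differ between versions), the kind of edit meant here boils down to changing which channel of the mask map feeds each field, assuming the default packing of metallic/AO/detail mask/smoothness in one texture:
    Code (csharp):
    // Hypothetical sketch, not the real HDRP code.
    float4 maskMap = SAMPLE_TEXTURE2D(_MaskMap, sampler_MaskMap, uv);
    surfaceData.metallic             = maskMap.r;
    surfaceData.ambientOcclusion     = maskMap.g;  // swap .g for whichever channel your own packing uses
    surfaceData.perceptualSmoothness = maskMap.a;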
     
    theANMATOR2b likes this.
  14. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    But it is still required to download the appropriate version from GIT, then download a custom post processing folder, right?

    By the way, it would be nice to see some side by side comparisons of the regular Unity look and the HD look.
     
    Last edited: Jan 31, 2018
  15. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    One other question. Is it possible to have an end user config option to choose between the different render pipelines or does it require maintaining two completely separate projects and executables?
     
  16. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    Yes.

    With the new Unity Hub you can select an HD template (it is still using an old HD release, however) that does this for you.

    Unity users will certainly come up with comparisons at some point, as people do for Unity vs Unreal.

    There is the template system mentioned above that allows you to start on a chosen render pipeline.

    You shouldn't switch render pipeline during a project (unless your render pipeline is a fork of another one).
    That is not the purpose. We are not aiming at comparing render pipelines with each other; you must choose a render pipeline at the beginning of your project based on the project's needs. If you target all platforms, choose Lightweight. If you know that all your target platforms will be compute shader capable and you are aiming at high-end graphics, choose HD.
     
  17. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,982
    And eventually, pick from the many (soon to be) community-authored SRPs that will undoubtedly come out!

    Probably the most exciting part for me is that the rendering pipeline will be less in the hands of the developers and more in the hands of the community, which means that if we don't like X, Y, or Z, we can finally just add it ourselves!
     
    hippocoder likes this.
  18. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Yes, it's the only sane way. Have to resist casual users demanding it be everything, because - duh - we already have that and don't want the same mistakes.
     
    Elecman likes this.
  19. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    So how do I convert my project to the low-end pipeline? What things should I take into account?
     
  20. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    Better to create another forum thread for this one. It is better defined and we have a material upgrader for it, but it would not be appropriate to answer here, as the information would get lost given this topic is "Convert to SRP/HD".
     
    hippocoder likes this.
  21. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,982
    Given what we have seen in the blog posts about writing SRPs and the new pipelines, I really don't think it's outside the realm of possibility to create a program (either official or community-authored) that will "translate" shaders from the standard pipeline to one of the new pipelines. There will be definite detail loss, and lots of cases that are difficult to cover, but I am sure getting something that effectively converts from one to the other is possible, as long as 1:1 is not what you're trying to achieve (and it shouldn't be if you're changing the pipeline).

    I'm going to be playing around with this idea. Is anyone else going to have a shot at something like this, or interested in it?
     
    Last edited: Feb 1, 2018
    theANMATOR2b likes this.
  22. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    Ok, here it is: https://forum.unity.com/threads/convert-to-srp-low-end.515291/
     
  23. SuperPingu

    SuperPingu

    Joined:
    Feb 3, 2016
    Posts:
    1
    Hi,
    it's great to have new customizable rendering pipelines, but I have a few questions about the Built-in one:

    Will those mistakes be fixed someday?
    Is the Built-in pipeline expected to disappear or be converted to a scriptable one?
     
  24. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    >Will those mistakes be fixed someday?
    No, because it would break existing projects...

    >Is the Built-in pipeline expected to disappear or be converted to a scriptable one?
    I have no answer to provide here.
     
  25. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    How do I convert a custom shader to be compatible with the HD pipeline?
     
  26. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    If this is an unlit shader, there is not too much of a problem. Here is a minimal unlit shader helper for HDRP:
    Code (csharp):

    EasyHDRP.hlsl

    // Declares appdata/v2f and a generic vertex function. Optional attributes are
    // enabled through the MESH_HAS_* defines, and SHADER_CUSTOM_VERTEX lets the
    // including shader hook custom vertex code (run in world space).
    struct appdata
    {
        float4 vertex : POSITION;
    #ifdef MESH_HAS_UV
        float4 uv : TEXCOORD0;
    #endif
    #ifdef MESH_HAS_UV2
        float4 uv2 : TEXCOORD1;
    #endif
    #ifdef MESH_HAS_UV3
        float4 uv3 : TEXCOORD2;
    #endif
    #ifdef MESH_HAS_UV4
        float4 uv4 : TEXCOORD3;
    #endif
    #ifdef MESH_HAS_NORMALS
        float3 normal : NORMAL;
    #endif
    #ifdef MESH_HAS_TANGENT
        float4 tangent : TANGENT;
    #endif
    #ifdef MESH_HAS_COLOR
        float4 color : COLOR;
    #endif
    };

    struct v2f
    {
        float4 vertex : SV_POSITION;
    #ifdef MESH_HAS_UV
        float4 uv : TEXCOORD0;
    #endif
    #ifdef MESH_HAS_UV2
        float4 uv2 : TEXCOORD1;
    #endif
    #ifdef MESH_HAS_UV3
        float4 uv3 : TEXCOORD2;
    #endif
    #ifdef MESH_HAS_UV4
        float4 uv4 : TEXCOORD3;
    #endif
    #ifdef MESH_HAS_NORMALS
        float3 normal : NORMAL;
    #endif
    #ifdef MESH_HAS_TANGENT
        float4 tangent : TANGENT;
        float4 bitangent : TEXCOORD5;
    #endif
    #ifdef MESH_HAS_COLOR
        float4 color : TEXCOORD4;
    #endif
    };

    #ifdef SHADER_CUSTOM_VERTEX
    v2f SHADER_CUSTOM_VERTEX(v2f i);
    #endif

    v2f vert(appdata v)
    {
        v2f o;
    #ifdef MESH_HAS_UV
        o.uv = v.uv;
    #endif
    #ifdef MESH_HAS_UV2
        o.uv2 = v.uv2;
    #endif
    #ifdef MESH_HAS_UV3
        o.uv3 = v.uv3;
    #endif
    #ifdef MESH_HAS_UV4
        o.uv4 = v.uv4;
    #endif
    #ifdef MESH_HAS_NORMALS
        o.normal = TransformObjectToWorldDir(v.normal);
    #endif
    #ifdef MESH_HAS_TANGENT
        o.tangent = float4(TransformObjectToWorldDir(v.tangent.xyz), v.tangent.w);
        o.bitangent = float4(cross(o.normal, o.tangent.xyz), 0.0);
    #endif
    #ifdef MESH_HAS_COLOR
        o.color = v.color;
    #endif
        // Transform local to world before custom vertex code
        o.vertex.xyz = TransformObjectToWorld(v.vertex.xyz);
    #ifdef SHADER_CUSTOM_VERTEX
        o = SHADER_CUSTOM_VERTEX(o);
    #endif
        o.vertex.xyz = GetCameraRelativePositionWS(o.vertex.xyz);
        o.vertex = TransformWorldToHClip(o.vertex.xyz);
        return o;
    }


    SimpleTextureFade.shader

    Shader "VFX Demo/VFX Static/Simple Texture Fade"
    {
        Properties
        {
            _MainTex("Texture", 2D) = "white" {}
            _Fade("Fade", Range(0.0,1.0)) = 1.0
            _Tile("Tile", Vector) = (1.0,1.0,1.0,1.0)
        }
        HLSLINCLUDE
        #pragma target 4.5
        #define MESH_HAS_UV
        #define SHADER_CUSTOM_VERTEX customVert
        #include "CoreRP/ShaderLibrary/common.hlsl"
        #include "HDRP/ShaderVariables.hlsl"
        #include "EasyHDRP.hlsl"
        sampler2D _MainTex;
        float _Fade;
        float4 _Tile;
        v2f customVert(v2f i)
        {
            i.uv.xy *= _Tile.xy;
            return i;
        }
        float4 frag(v2f i) : SV_Target
        {
            float4 col = tex2D(_MainTex, i.uv.xy);
            return col * _Fade;
        }
        ENDHLSL
        SubShader
        {
            Tags { "Queue" = "Transparent" }
            Pass
            {
                Name ""
                Tags{ "LightMode" = "ForwardOnly" }
                Blend One One
                ZWrite off
                HLSLPROGRAM
                    #pragma vertex vert
                    #pragma fragment frag
                ENDHLSL
            }
            Pass
            {
                Name ""
                Tags{ "LightMode" = "DepthForwardOnly" }
                HLSLPROGRAM
                    #pragma vertex vert
                    #pragma fragment frag
                ENDHLSL
            }
        }
    }
     

    Attached Files:

    tatoforever and hippocoder like this.
  27. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    If you try to do a new lighting model, then it is way more complex and we haven't written documentation yet. You will need to create a forward shader with 2 passes, ForwardOnly and DepthForwardOnly, and write all the evaluate functions and various others that are declared in Lit.hlsl for your lighting model. Documentation is to come later. If you want your model to work in deferred, it needs to replace the Lit shader, which I don't recommend; it is better to modify it directly.
     
  28. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    How is a specific rendering order done? I have a sword and shield that sometimes need to be rendered on top of some, but not all, of the other objects, and sometimes they don't... due to clipping issues.

    I cannot render them without depth or the models render incorrectly. They also need shadows, so it's not an easy solution with built-in.

    I wonder if HD offers a simpler way to control these situations. Thanks.
     
  29. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    What you describe sounds similar to first-person-shooter hands/weapons (just a guess). In that case there are various ways to script it (do a specific foreground pass, reserve a range in Z for this, or use a specific render queue). But we don't provide a solution for this in HD; you need to implement it yourself.
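
    For the "specific render queue" option, a generic ShaderLab sketch (not an HD-provided feature; the queue offset is arbitrary, and shadows and depth interactions still need your own handling) would draw the weapon after the environment and skip the depth test:
    Code (csharp):
    // Generic ShaderLab sketch of the "specific render queue" idea, to be filled in
    // with a vertex/fragment program like the examples earlier in this thread.
    SubShader
    {
        Tags { "Queue" = "Geometry+100" }        // draw after the regular opaque geometry
        Pass
        {
            Tags{ "LightMode" = "ForwardOnly" }
            ZTest Always                         // never clipped by already-drawn scene geometry
            ZWrite On
            HLSLPROGRAM
                #pragma vertex vert
                #pragma fragment frag
            ENDHLSL
        }
    }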
     
    hippocoder likes this.
  30. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    Thanks for the example. Will try to make it work. My shader heavily abuses the standard mesh data (colors, normals, etc.) so I hope that won't pose any problems.
     
  31. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    Ok, I got the shader to compile, but it doesn't draw anything. Can you see anything wrong with it right off the bat?

    By the way, how is fog handled now?

    Code (CSharp):

    Shader "Lights/Omnidirectional"{

        Properties{

            _MainTex ("Light Texture", 2D) = "white" {}
            [KeywordEnum(Doughnut, Equal)] _Lobe ("Radiation pattern", Float) = 0
            _MinPixelSize ("Minimum screen size", FLOAT) = 5.0
            _Attenuation ("Attenuation", Range(0.01, 1)) = 0.37
            _BrightnessOffset ("Brightness offset", Range(-1, 1)) = 0
        }

        SubShader{

            Tags { "Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent"}
            Blend SrcAlpha One
            AlphaTest Greater .01
            ColorMask RGB
            Lighting Off ZWrite Off

            Pass{

                //SRP conversion:
                //Added:
                HLSLINCLUDE
                #pragma target 4.5
                #include "CoreRP/ShaderLibrary/common.hlsl"
                #include "HDRP/ShaderVariables.hlsl"

                //SRP conversion:
                //Removed:
                //#include "UnityCG.cginc"

                #include "lightFunctions.cginc"

                #pragma vertex vert
                #pragma fragment frag
                #pragma multi_compile _LOBE_DOUGHNUT _LOBE_EQUAL
                #pragma multi_compile_fog //Enable fog
                #pragma glsl_no_auto_normalization

                uniform sampler2D _MainTex;
                float _MinPixelSize;
                float angle;
                float _BrightnessOffset;
                float _Attenuation;

                //These global variables are set from a Unity script.
                float _ScaleFactor;
                float _GlobalBrightnessOffset;

                struct vertexInput {

                    float4 center : POSITION; //Mesh center position is stored in the position channel (vertices in Unity).
                    float4 corner : TANGENT; //Mesh corner is stored in the tangent channel (tangent in Unity). The scale is stored in the w component.
                    float4 normal : NORMAL; //Rotation forward vector is stored in the Normal channel (normals in Unity).
                    float2 uvs : TEXCOORD0; //Texture coordinates (uv in Unity).
                    float2 RGback : TEXCOORD1; //RG(B) back color is stored in a UV channel (uv2 in Unity).
                    float2 up : TEXCOORD2; //Rotation up vector is stored in a UV channel (uv3 in Unity).
                    float2 z : TEXCOORD3; //Rotation up vector (3rd component) is stored in a UV channel (uv4 in Unity). The x component is used for the blue component of the back color.
                    half4 RGBAfront : COLOR; //The front color is stored in the Color channel (colors32 in Unity). This channel is used as an ID for strobe lights.
                };


                struct vertexOutput{

                    float4 pos : SV_POSITION;
                    float2 uvs : TEXCOORD0;
                    half4 color : COLOR;

                    //This is not a UV coordinate but it is just used to pass some variables
                    //from the vertex shader to the fragment shader. x = gain.
                    float2 container : TEXCOORD1;

                    //Enable fog.
                    UNITY_FOG_COORDS(2)
                };


                vertexOutput vert(vertexInput input){

                    vertexOutput output;
                    float gain;
                    float distanceGain;
                    float angleGain;
                    float dotProduct;
                    float scale;

                    //Get a vector from the vertex to the camera and cache the result.
                    float3 objSpaceViewDir = ObjSpaceViewDir(input.center);

                    //Get the distance between the camera and the light.
                    float distance = length(objSpaceViewDir);

                    //Use a fixed alpha because we use the alpha value to modulate the gain instead.
                    half4 RGBAfont = half4(input.RGBAfront.r, input.RGBAfront.g, input.RGBAfront.b, 0.2h);

                    #if _LOBE_EQUAL
                        angleGain = GetEqualLobe();
                        output.color = RGBAfont;
                    #else
                        //Compose the back color. There is no float4 slot left, so we use UV channels to store the colors.
                        float4 RGBAback = float4(input.RGback.x, input.RGback.y, input.z.x, 0.2f);

                        float3 viewDir = normalize(objSpaceViewDir);

                        //Compose the up vector. A UV channel only holds two floats,
                        //so we need to fetch the z coordinate from a third UV channel.
                        float3 upVector = float3(input.up.x, input.up.y, input.z.y);

                        //TODO: figure out a way to get the rotated normal from elsewhere instead.
                        float3 rotatedNormal = ProjectVectorOnPlane(upVector, viewDir);
                        dotProduct = dot(viewDir, rotatedNormal);

                        //Use a Phase Function to simulate the light lens shape and its effect it has on the light brightness.
                        angleGain = GetRoundLobe(dotProduct);
                        //angleGain = GetEggLobe(dotProduct);
                        //angleGain = GetEqualLobe();
                        //angleGain = GetTearDropLobe(dotProduct);

                        float forwardDotProduct = dot(viewDir, input.normal);

                        //Create a smooth transition between the two colors.
                        //output.color = Get015DegreeTransition(forwardDotProduct, RGBAfont, RGBAback);
                        //output.color = Get03DegreeTransition(forwardDotProduct, RGBAfont, RGBAback);
                        //output.color = Get06DegreeTransition(forwardDotProduct, RGBAfont, RGBAback);
                        output.color = Get10DegreeTransition(forwardDotProduct, RGBAfont, RGBAback);
                    #endif

                    //Calculate the scale. If the light size is smaller than one pixel, scale it up
                    //so it remains at least one pixel in size.
                    scale = ScaleUp(distance, _ScaleFactor, input.corner.w, angleGain, _MinPixelSize);

                    //Get the vertex offset to shift and scale the light.
                    float4 offset = GetOffset(scale, input.corner);

                    //Place the vertex by moving it away from the center.
                    //Rotate the billboard towards the camera.
                    output.pos = mul(UNITY_MATRIX_P, float4(UnityObjectToViewPos(input.center), 1.0f) + offset);

                    //Far away lights should be less bright. Attenuate with the inverse square law.
                    distanceGain = Attenuate(distance, _Attenuation);

                    //Merge the distance gain (attenuation), angle gain (lens simulation), and light brightness into a single gain value.
                    //Note that the individual light brightness is stored in the front color alpha channel.
                    gain = MergeGain(distanceGain, angleGain, _GlobalBrightnessOffset, input.RGBAfront.a + _BrightnessOffset);

                    //Send the gain to the fragment shader.
                    output.container = float2(gain, 0.0f);

                    //UV mapping.
                    output.uvs = input.uvs;

                    //Enable fog.
                    UNITY_TRANSFER_FOG(output, output.pos);

                    return output;
                }

                //SRP conversion:
                //Note: not sure whether to use COLOR or SV_Target
                half4 frag(vertexOutput input) : COLOR{

                    //Compute the final color.
                    //Note: input.container.x fetches the gain from the vertex shader. No need to calculate this for each fragment.
                    half4 col = 2.0h * input.color * tex2D(_MainTex, input.uvs) * (exp(input.container.x * 5.0h));

                    //Enable fog. Use black due to the blend mode used.
                    UNITY_APPLY_FOG_COLOR(input.fogCoord, col, half4(0,0,0,0));

                    return col;
                }
                ENDHLSL
            }
        }
    }
     
    Last edited: Feb 12, 2018
  32. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    Please follow the example of the shader I provided: you need to declare the passes (i.e. the "LightMode" = "ForwardOnly" tag is important) and you need the camera-relative position.

    1. Tags { "Queue" = "Transparent" }
    2. Pass
    3. {
    4. Name ""
    5. Tags{ "LightMode" = "ForwardOnly" }
    6. Blend One One
    7. ZWrite off
    8. HLSLPROGRAM
    9. #pragma vertex vert
    10. #pragma fragment frag
    11. ENDHLSL
    12. }
    13. Pass
    14. {
    15. Name ""
    16. Tags{ "LightMode" = "DepthForwardOnly" }
    17. HLSLPROGRAM
    18. #pragma vertex vert
    19. #pragma fragment frag
    20. ENDHLSL
    21. }
    22. }


    For the fog: for opaque this is handled automatically with a full screen pass (add a SceneSettings and add a fog component). For transparent it will be handled automatically too... (see EvaluateAtmosphericScattering() in ShaderPassForwardUnlit if you want to see the implementation details).
     
  33. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    Thanks for clearing that up. Here are some more issues I found.

    I assume that UnityCG.cginc is deprecated with the release of the SRP (it causes compile errors), but this helper function is missing (maybe more):
    Code (CSharp):
    ObjSpaceViewDir()
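    For reference, the version I copied over (renamed ObjSpaceViewDir2() in the attempt below) is roughly this; whether unity_WorldToObject and _WorldSpaceCameraPos still resolve to the right values under the HDRP includes and camera-relative rendering is exactly what I'm unsure about:
    Code (CSharp):
    //Copied from UnityCG.cginc: camera position transformed into object space,
    //minus the vertex position.
    float3 ObjSpaceViewDir2(float4 v)
    {
        float3 objSpaceCameraPos = mul(unity_WorldToObject, float4(_WorldSpaceCameraPos.xyz, 1.0)).xyz;
        return objSpaceCameraPos - v.xyz;
    }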

    My new conversion attempt compiles, but it still doesn't work correctly. There seems to be a problem with the billboard world/screen space conversion, as the billboards show up incorrectly on screen or in the distance. Before, I was using this function:
    Code (CSharp):
    output.pos = mul(UNITY_MATRIX_P, float4(UnityObjectToViewPos(input.center), 1.0f) + offset);
    I converted it like so (incorrectly, I am sure):
    Code (CSharp):
    output.pos.xyz = TransformObjectToWorld(input.center.xyz);
    output.pos = mul(UNITY_MATRIX_P, float4(output.pos.xyz, 1.0f) + offset); //Rotate billboard towards camera. Not sure how to convert this to SRP.
    output.pos.xyz = GetCameraRelativePositionWS(output.pos.xyz);
    output.pos = TransformWorldToHClip(output.pos.xyz);
    Another question. I see that the SRP shader version uses two passes. My original only uses one pass. Isn't this bad for performance?

    New shader attempt:
    Code (CSharp):

    Shader "Lights/Omnidirectional"{

        Properties{

            _MainTex ("Light Texture", 2D) = "white" {}
            [KeywordEnum(Doughnut, Equal)] _Lobe ("Radiation pattern", Float) = 0
            _MinPixelSize ("Minimum screen size", FLOAT) = 5.0
            _Attenuation ("Attenuation", Range(0.01, 1)) = 0.37
            _BrightnessOffset ("Brightness offset", Range(-1, 1)) = 0
        }

        //SRP conversion:
        //Added:
        HLSLINCLUDE
        #pragma target 4.5
        #define SHADER_CUSTOM_VERTEX customVert
        #include "CoreRP/ShaderLibrary/common.hlsl"
        #include "HDRP/ShaderVariables.hlsl"

        //SRP conversion:
        //Removed:
        //#include "UnityCG.cginc"

        #include "lightFunctions.cginc"

        //SRP conversion:
        //Removed:
        //#pragma vertex vert
        //#pragma fragment frag

        #pragma multi_compile _LOBE_DOUGHNUT _LOBE_EQUAL
        #pragma glsl_no_auto_normalization

        //SRP conversion:
        //Removed:
        //#pragma multi_compile_fog

        uniform sampler2D _MainTex;
        float _MinPixelSize;
        float angle;
        float _BrightnessOffset;
        float _Attenuation;

        //These global variables are set from a Unity script.
        float _ScaleFactor;
        float _GlobalBrightnessOffset;

        struct vertexInput {

            float4 center : POSITION; //Mesh center position is stored in the position channel (vertices in Unity).
            float4 corner : TANGENT; //Mesh corner is stored in the tangent channel (tangent in Unity). The scale is stored in the w component.
            float4 normal : NORMAL; //Rotation forward vector is stored in the Normal channel (normals in Unity).
            float2 uvs : TEXCOORD0; //Texture coordinates (uv in Unity).
            float2 RGback : TEXCOORD1; //RG(B) back color is stored in a UV channel (uv2 in Unity).
            float2 up : TEXCOORD2; //Rotation up vector is stored in a UV channel (uv3 in Unity).
            float2 z : TEXCOORD3; //Rotation up vector (3rd component) is stored in a UV channel (uv4 in Unity). The x component is used for the blue component of the back color.
            half4 RGBAfront : COLOR; //The front color is stored in the Color channel (colors32 in Unity). This channel is used as an ID for strobe lights.
        };

        struct vertexOutput{

            float4 pos : SV_POSITION;
            float2 uvs : TEXCOORD0;
            half4 color : COLOR;

            //This is not a UV coordinate but it is just used to pass some variables
            //from the vertex shader to the fragment shader. x = gain.
            float2 container : TEXCOORD1;
        };

        vertexOutput vert(vertexInput input){

            vertexOutput output;
            float gain;
            float distanceGain;
            float angleGain;
            float dotProduct;
            float scale;

            //SRP conversion:
            //Missing function ObjSpaceViewDir2(), so copied from UnityCG.cginc
            float3 objSpaceViewDir = ObjSpaceViewDir2(input.center); //Get a vector from the vertex to the camera and cache the result.

            //Get the distance between the camera and the light.
            float distance = length(objSpaceViewDir);

            //Use a fixed alpha because we use the alpha value to modulate the gain instead.
            half4 RGBAfont = half4(input.RGBAfront.r, input.RGBAfront.g, input.RGBAfront.b, 0.2h);

            #if _LOBE_EQUAL
                angleGain = GetEqualLobe();
                output.color = RGBAfont;
            #else
                //Compose the back color. There is no float4 slot left, so we use UV channels to store the colors.
                float4 RGBAback = float4(input.RGback.x, input.RGback.y, input.z.x, 0.2f);

                float3 viewDir = normalize(objSpaceViewDir);

                //Compose the up vector. A UV channel only holds two floats,
                //so we need to fetch the z coordinate from a third UV channel.
                float3 upVector = float3(input.up.x, input.up.y, input.z.y);

                //TODO: figure out a way to get the rotated normal from elsewhere instead.
                float3 rotatedNormal = ProjectVectorOnPlane(upVector, viewDir);
                dotProduct = dot(viewDir, rotatedNormal);

                //Use a Phase Function to simulate the light lens shape and its effect it has on the light brightness.
                angleGain = GetRoundLobe(dotProduct);
                //angleGain = GetEggLobe(dotProduct);
                //angleGain = GetEqualLobe();
                //angleGain = GetTearDropLobe(dotProduct);

                float forwardDotProduct = dot(viewDir, input.normal.xyz);

                //Create a smooth transition between the two colors.
                //output.color = Get015DegreeTransition(forwardDotProduct, RGBAfont, RGBAback);
                //output.color = Get03DegreeTransition(forwardDotProduct, RGBAfont, RGBAback);
                //output.color = Get06DegreeTransition(forwardDotProduct, RGBAfont, RGBAback);
                output.color = Get10DegreeTransition(forwardDotProduct, RGBAfont, RGBAback);
            #endif


            //Calculate the scale. If the light size is smaller than one pixel, scale it up
            //so it remains at least one pixel in size.
            scale = ScaleUp(distance, _ScaleFactor, input.corner.w, angleGain, _MinPixelSize);

            //Get the vertex offset to shift and scale the light.
            float4 offset = GetOffset(scale, input.corner);

            //SRP conversion:
            //Instead of this:
            //output.pos = mul(UNITY_MATRIX_P, float4(UnityObjectToViewPos(input.center), 1.0f) + offset); //Place the vertex by moving it away from the center, rotate the billboard towards the camera.
            //Use this instead, but not sure if it is correct...
            output.pos.xyz = TransformObjectToWorld(input.center.xyz);
            output.pos = mul(UNITY_MATRIX_P, float4(output.pos.xyz, 1.0f) + offset); //Rotate billboard towards camera. Not sure how to convert this to SRP.
            output.pos.xyz = GetCameraRelativePositionWS(output.pos.xyz);
            output.pos = TransformWorldToHClip(output.pos.xyz);

            //Far away lights should be less bright. Attenuate with the inverse square law.
            distanceGain = Attenuate(distance, _Attenuation);

            //Merge the distance gain (attenuation), angle gain (lens simulation), and light brightness into a single gain value.
            //Note that the individual light brightness is stored in the front color alpha channel.
            gain = MergeGain(distanceGain, angleGain, _GlobalBrightnessOffset, input.RGBAfront.a + _BrightnessOffset);

            //Send the gain to the fragment shader.
            output.container = float2(gain, 0.0f);

            //UV mapping.
            output.uvs = input.uvs;

            //SRP conversion:
            //Removed:
            //UNITY_TRANSFER_FOG(output, output.pos);

            return output;
        }

        //SRP conversion:
        //Note: not sure whether to use COLOR or SV_Target
        half4 frag(vertexOutput input) : COLOR{

            //Compute the final color.
            //Note: input.container.x fetches the gain from the vertex shader. No need to calculate this for each fragment.
            half4 col = 2.0h * input.color * tex2D(_MainTex, input.uvs) * (exp(input.container.x * 5.0h));

            //SRP conversion:
            //Removed:
            //UNITY_APPLY_FOG_COLOR(input.fogCoord, col, half4(0,0,0,0)); //Use black due to the blend mode used.

            return col;
        }
        ENDHLSL

        SubShader{

            Tags {"RenderType"="Transparent"}

            Pass
            {
                Name ""
                Tags{ "LightMode" = "ForwardOnly" }
                Blend SrcAlpha One
                AlphaTest Greater .01
                ColorMask RGB
                Lighting Off
                ZWrite Off
                HLSLPROGRAM
                    #pragma vertex vert
                    #pragma fragment frag
                ENDHLSL
            }

            Pass
            {
                Name ""
                Tags{ "LightMode" = "DepthForwardOnly" }
                HLSLPROGRAM
                    #pragma vertex vert
                    #pragma fragment frag
                ENDHLSL
            }
        }
    }
     
  34. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    Is there anyone who can shed some light on the camera transform issue?
     
  35. tatoforever

    tatoforever

    Joined:
    Apr 16, 2009
    Posts:
    4,369
    @Elecman,
    The DepthForwardOnly pass will be used only when there's a depth pre-pass.
     
  36. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    Yeah, I figured that, but since I was slapped on the hand, I figured I'd better do what SebLagarde said.
     
    theANMATOR2b likes this.
  37. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,788
    How about using a custom projection matrix?
     
  38. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    Since RC1 is available, I wondered if there is any manual available on how to write shaders for the HD SRP. So far I can't find anything, which is rather odd.
     
    mephistonight likes this.
  39. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    995
    I think that HD SRP will also get Shader Graph later.
     
  40. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    Hi,

    there is no documentation on how to write a shader for now; our focus is on user documentation first (which will take a few months).

    If you do an unlit shader, the shader provided above is sufficient.

    If you want to write a lit shader, then things start to get complex. We don't have surface shaders for HD (unlike built-in Unity), meaning that when writing a shader for HD you need to consider all the lighting and how to fit into the current framework. There is a lot to write and do, as we want our shaders to have debug functionality, be compatible with all our light types (including area lights), and so on.

    Here I provide you with a "template" (one for the UI side, the other for the shader side). It is not really a template, just a bunch of files copy/pasted from the Lit shader where most of the code has been removed, and which use forward rendering (compatible with deferred rendering). You can use it as a start for your own lit shader. To use it, you need to uncompress both zip files (they must be located in the Material directory), and in Material.hlsl you need to add this new material to the list:


    #ifdef UNITY_MATERIAL_LIT
    #include "Lit/Lit.hlsl"
    #elif defined(UNITY_MATERIAL_UNLIT)
    #include "Unlit/Unlit.hlsl"
    #elif defined(UNITY_MATERIAL_STACKLIT)
    #include "StackLit/StackLit.hlsl" <===
    #endif

    Note: We will not provide support for this "custom template", and we are still evolving the API, so new versions can break the shader.

    >Since RC1 is available,
    HD is still experimental until 2018.3
     

    Attached Files:

  41. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    934
    Yes, shader graph is the next step for HDRP
     
  42. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,374
    Thanks for the example. The main point my shader breaks on is the camera-relative rendering. I have an unlit emissive billboard shader which does some non-standard vertex manipulation, so it is causing problems.

    In my case, I won't be using Shader Graph, because that would mean rewriting my shaders from scratch using a system I am not familiar with. It is likely faster to just do the code conversion.