Feedback Wanted: Scriptable Render Pipelines

Discussion in 'Graphics Experimental Previews' started by Tim-C, May 9, 2017.

  1. elbows


    Nov 28, 2009
    Thanks for all the quality info, as always, in your recent posts.

    Will we be able to tell fairly easily which commit breaks master's compatibility with the public 2018.2 betas? E.g. will the last compatible version get tagged/released, or will the breaking change at least be mentioned in the commit comments?
  2. Reanimate_L


    Oct 10, 2009
    @SebLagarde Also, can we get random/stochastic sampling for the volumetric fog? Right now the pulsing when the camera is moving is too noticeable, as is the sample banding.
  3. relativegames


    Nov 22, 2013
    I'm now on Unity 2018.1.0f1 and I still get this with the GitHub files. Is this feature actually going to work with 2018.1, or is it pushed to 2018.2?

    UPDATE: I installed 2018.2.0b2 too, and now I get:
    Assembly has reference to non-existent assembly 'com.unity.postprocessing.Editor' (Assets/ScriptableRenderPipeline-master/Tests/Scripts/Editor/Tests_Editor.asmdef)

    Still hoping for a working sample project...
    Last edited: Apr 28, 2018
  4. Jean-Moreno


    Jul 23, 2012
    Are there any #define directives planned for SRP - and to a broader extent for each package - to check whether a specific package has been installed? I couldn't find any.

    e.g. I have a custom script deriving from LightweightShaderGUI.cs, which only exists within the Lightweight Pipeline package, so something like #if LIGHTWEIGHT_PIPELINE would prevent the script from breaking when Lightweight isn't installed.
    Even better, include the version number and make it similar to UNITY_2018_1_OR_NEWER.
    Maybe that could be automated for every package based on name + version.

    Unless there are other guidelines for this?
    My current workaround is to include a specific .unitypackage file that users need to unpack when they use Lightweight (this is for my Asset Store shaders).
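    In the meantime, one possible workaround sketch (this is not an official Unity mechanism; the assembly scan and the LIGHTWEIGHT_PIPELINE define name are assumptions) is an editor script that registers a project-wide scripting define whenever a Lightweight-only type is present:

```csharp
#if UNITY_EDITOR
using System;
using System.Linq;
using UnityEditor;

// Hypothetical workaround: scan loaded assemblies for a Lightweight-only type
// and keep a "LIGHTWEIGHT_PIPELINE" scripting define in sync with its presence.
[InitializeOnLoad]
static class LightweightPipelineDefine
{
    const string Define = "LIGHTWEIGHT_PIPELINE";

    static LightweightPipelineDefine()
    {
        // Look for the type by simple name; the exact assembly name is unknown here.
        bool installed = AppDomain.CurrentDomain.GetAssemblies()
            .SelectMany(a => { try { return a.GetTypes(); } catch { return Type.EmptyTypes; } })
            .Any(t => t.Name == "LightweightShaderGUI");

        var group = EditorUserBuildSettings.selectedBuildTargetGroup;
        var defines = PlayerSettings.GetScriptingDefineSymbolsForGroup(group)
            .Split(';').Where(d => d.Length > 0).ToList();

        if (installed && !defines.Contains(Define))
            defines.Add(Define);
        else if (!installed && defines.Contains(Define))
            defines.Remove(Define);
        else
            return; // nothing to change

        PlayerSettings.SetScriptingDefineSymbolsForGroup(group, string.Join(";", defines.ToArray()));
    }
}
#endif
```

    This keeps `#if LIGHTWEIGHT_PIPELINE` usable in user scripts, at the cost of a domain-reload scan.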
    GameDevCouple_I likes this.
  5. Grimreaper358


    Apr 8, 2013
    You need Post Processing V2 in your project - I would recommend getting it from the Package Manager.
    If you just want to use the HD Render Pipeline, then I would also recommend getting it from the Package Manager, or creating a new project with the HDRP template - everything will be set up for you.
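    For reference, the Package Manager route amounts to a dependency entry in Packages/manifest.json; a minimal sketch (the version string here is only illustrative - use whatever version the Package Manager lists):

```json
{
  "dependencies": {
    "com.unity.postprocessing": "2.0.1-preview"
  }
}
```

    Unity resolves the package on the next domain reload after the manifest is saved.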
  6. TJHeuvel-net


    Jul 31, 2012
    Yeah, half suggestion half question. Just a bit confused to see some of the same functionality in another UI element.
  7. Drezzel


    Jul 10, 2017
    There is an issue with object motion vectors: unity_MatrixPreviousM is not updated correctly. It seems the matrix is stuck at a value that is not the previous frame's transform.

    To reproduce the bug:
    1. Press play
    2. Debugging -> Fullscreen Debug Mode -> MotionVectors
    3. Move an object
  8. SebLagarde


    Unity Technologies

    Dec 30, 2015
    Thanks for reporting this problem. We are aware of it and we have a fix, but it is on the C++ side. It is part of 2018.2 (the beta containing the fix is not out yet); we will try to backport it.
    P_Jong likes this.
  9. Drezzel


    Jul 10, 2017
    Thanks for the answer.

    There is another issue: enableRandomWrite does not affect textures created with a CommandBuffer.

    Here is the code:
    C# SRP
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Experimental.Rendering;

    [ExecuteInEditMode]
    public class BasicAssetPipe : RenderPipelineAsset
    {
    #if UNITY_EDITOR
        [UnityEditor.MenuItem("SRP-Demo/01 - Create Basic Asset Pipeline")]
        static void CreateBasicAssetPipeline()
        {
            var instance = ScriptableObject.CreateInstance<BasicAssetPipe>();
            UnityEditor.AssetDatabase.CreateAsset(instance, "Assets/BasicAssetPipe.asset");
        }
    #endif

        protected override IRenderPipeline InternalCreatePipeline()
        {
            return new BasicPipeInstance();
        }
    }

    public class BasicPipeInstance : RenderPipeline
    {
        Material m_RWTextureMaterial;

        public BasicPipeInstance()
        {
            m_RWTextureMaterial = CoreUtils.CreateEngineMaterial("SRP/RWTextureShader");
        }

        public override void Dispose()
        {
            base.Dispose();
            uav_renderTexture.Release();
            CoreUtils.Destroy(uav_renderTexture);
            uav_renderTexture = null;
            CoreUtils.Destroy(m_RWTextureMaterial);
        }

        private static RenderTexture uav_renderTexture;

        public override void Render(ScriptableRenderContext context, Camera[] cameras)
        {
            base.Render(context, cameras);

            foreach (var camera in cameras)
            {
                context.SetupCameraProperties(camera);

                var cmd = new CommandBuffer();

                var color_id = Shader.PropertyToID("color");
                var color_rti = new RenderTargetIdentifier(color_id);
                var color_rtd = new RenderTextureDescriptor(512, 512, RenderTextureFormat.ARGB32, 0);
                cmd.GetTemporaryRT(color_id, color_rtd, FilterMode.Bilinear);

                // Create a RenderTextureDescriptor for both UAV targets
                var uav_rtd = new RenderTextureDescriptor(512, 512, RenderTextureFormat.RInt, 0);
                uav_rtd.enableRandomWrite = true;

                // Set up the command buffer UAV
                var uav_commandBuffer_id = Shader.PropertyToID("uav_commandBuffer");
                var uav_commandBuffer_rti = new RenderTargetIdentifier(uav_commandBuffer_id);
                cmd.GetTemporaryRT(uav_commandBuffer_id, uav_rtd, FilterMode.Bilinear);

                // Set up the RenderTexture UAV
                if (uav_renderTexture != null && (uav_renderTexture.width != uav_rtd.width || uav_renderTexture.height != uav_rtd.height || uav_renderTexture.format != uav_rtd.colorFormat))
                {
                    uav_renderTexture.Release();
                    CoreUtils.Destroy(uav_renderTexture);
                    uav_renderTexture = null;
                }
                if (uav_renderTexture == null)
                {
                    uav_renderTexture = new RenderTexture(uav_rtd);
                    uav_renderTexture.hideFlags = HideFlags.HideAndDontSave;
                    uav_renderTexture.name = "uav_renderTexture";
                    uav_renderTexture.Create();
                }
                var uav_renderTexture_rti = new RenderTargetIdentifier(uav_renderTexture);

                // Write the UAVs
                cmd.SetRandomWriteTarget(1, uav_commandBuffer_rti);
                cmd.SetRandomWriteTarget(2, uav_renderTexture_rti);
                cmd.Blit(null, color_id, m_RWTextureMaterial, 0);
                cmd.ClearRandomWriteTargets();
                cmd.SetGlobalTexture("SRV_FROM_COMMANDBUFFER", uav_commandBuffer_rti);
                cmd.SetGlobalTexture("SRV_FROM_RENDERTEXTURE", uav_renderTexture_rti);

                // Show results
                cmd.Blit(color_id, BuiltinRenderTextureType.CameraTarget, m_RWTextureMaterial, 1);
                cmd.SetRenderTarget(BuiltinRenderTextureType.CameraTarget);

                // Release temp textures
                cmd.ReleaseTemporaryRT(color_id);
                cmd.ReleaseTemporaryRT(uav_commandBuffer_id);

                context.ExecuteCommandBuffer(cmd);
                cmd.Release();
                context.Submit();
            }
        }
    }
    Shader Code
    Code (CSharp):
    Shader "SRP/RWTextureShader"
    {
        Properties
        {
        }
        CGINCLUDE
        #include "UnityCG.cginc"
        #pragma target 5.0

        struct appdata
        {
            float4 vertex : POSITION;
            float2 uv : TEXCOORD0;
        };

        struct v2f
        {
            float2 uv : TEXCOORD0;
            float4 vertex : SV_POSITION;
        };

        v2f vert (appdata v)
        {
            v2f o;
            o.vertex = UnityObjectToClipPos(v.vertex);
            o.uv = v.uv;
            return o;
        }

        float4 decode(uint val)
        {
            return float4(float((val & 0x000000FF)),
            float((val & 0x0000FF00) >> 8U),
            float((val & 0x00FF0000) >> 16U),
            float((val & 0xFF000000) >> 24U)) * (1.0 / 255.0);
        }

        uint encode(float4 val)
        {
            val *= 255.0;
            return (uint(val.w) & 0x000000FF) << 24U |
            (uint(val.z) & 0x000000FF) << 16U |
            (uint(val.y) & 0x000000FF) << 8U |
            (uint(val.x) & 0x000000FF);
        }

        RWTexture2D<int> UAV_FROM_COMMANDBUFFER : register(u1);
        RWTexture2D<int> UAV_FROM_RENDERTEXTURE : register(u2);

        Texture2D<int> SRV_FROM_COMMANDBUFFER;
        Texture2D<int> SRV_FROM_RENDERTEXTURE;

        float4 write_uav (v2f i) : SV_Target
        {
            uint2 coord = i.vertex.xy;
            float4 color = float4(i.uv, 0.0, 1.0);
            int encoded_color = asint(encode(color));
            UAV_FROM_COMMANDBUFFER[coord] = encoded_color;
            UAV_FROM_RENDERTEXTURE[coord] = encoded_color;
            return 0.0;
        }

        float4 show_result (v2f i) : SV_Target
        {
            uint2 coord = i.uv * 512;
            float4 result_uav_commandBuffer = decode(asuint(SRV_FROM_COMMANDBUFFER[coord]));
            float4 result_uav_renderTexture = decode(asuint(SRV_FROM_RENDERTEXTURE[coord]));
            if (i.uv.x < 0.5)
            {
                return result_uav_commandBuffer;
            }
            else
            {
                return result_uav_renderTexture;
            }
        }
        ENDCG

        SubShader
        {
            Tags { "RenderType"="Opaque" }
            Pass
            {
                ColorMask 0
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment write_uav
                ENDCG
            }
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment show_result
                ENDCG
            }
        }
    }
    Expected result: (screenshot attached)
    Current result: (screenshot attached)
  10. Drezzel


    Jul 10, 2017
    Also, the CameraType value of the "Preview Camera" is CameraType.Game instead of CameraType.Preview.
    Here is the code to test it:
    Code (CSharp):
    if (camera.name.ToLower().Contains("preview"))
    {
        Debug.Log("CameraType: " + camera.cameraType.ToString() + " Name: " + camera.name);
    }
  11. Reanimate_L


    Oct 10, 2009
    @SebLagarde Is there decal:add and decal:blend support in HDRP, like in the built-in Surface Shaders? This is useful for geometry decals.
  12. konsic


    Oct 19, 2015
    Is the default 2018.1 rendering (without LW or HD SRP) as stable as 2017.4?
    Can I make a game without LW?
    Last edited: May 2, 2018
  13. Grimreaper358


    Apr 8, 2013
    Yea, Unity won't get rid of the current rendering pipeline yet.
  14. SebLagarde


    Unity Technologies

    Dec 30, 2015
    No. HDRP has its own decal system. This system supports volumetric decals but not mesh decals.
    Keep in mind that HDRP is very different from built-in Unity and many systems aren't compatible.
  15. SebLagarde


    Unity Technologies

    Dec 30, 2015
    Thanks, we will investigate this issue.

    >Also the CameraType value of "Preview Camera" is CameraType.Game instead of CameraType.Preview

    I think we did this on purpose, but yes, it is weird. We will try to fix it.

  16. Reanimate_L


    Oct 10, 2009
    Any plan to support mesh decal?
  17. pixelshader1


    Apr 1, 2018
    Are smooth distance-based light culling and dynamic shadow caching planned for the HDRP? I think these features could significantly improve the performance of a game. Here's a great example of their implementation.

    Psynaptix, AntonioModer and konsic like this.
  18. Grimreaper358


    Apr 8, 2013
    You can already set it so that after a certain distance the light's intensity fades and the light turns off / is culled.

    The settings exist for both lights and shadows; they are separate, so you can set them individually.

    Enable Additional Settings on the light; then you can see the settings.
    P_Jong, Psynaptix, OCASM and 2 others like this.
  19. montyfi


    Aug 3, 2012
    Will HDRP support VR at some point? Right now, when I turn XR on, it renders nothing in the HMD. In the Editor the camera works and moves with head movements, but not in the headset.
  20. equalsequals


    Sep 27, 2010
    GameDevCouple_I and montyfi like this.
  21. SebLagarde


    Unity Technologies

    Dec 30, 2015
    Everything is on the roadmap, there is just no ETA for it :) (Resources, priority etc...).
  22. fdsagizi2


    Nov 4, 2013
    How will we use shaders if we want to use both HD and Lightweight in the same project?

    Will it be separate shaders? Separate materials?
  23. Remy_Unity


    Unity Technologies

    Oct 3, 2017
    You shouldn't use the HD and LW pipelines in the same project. Either stick with HD if you want high-quality graphics and can accommodate the high cost, or use LW if you want to target a larger span of hardware.
    Or write your own render pipeline (or extend LW?) to fit your needs.
  24. fdsagizi2


    Nov 4, 2013
    Stupid decision! It must be possible!

    99% of materials have identical params - like Albedo, Normal, Spec... And it MUST be possible to use those materials with different shaders - like shader LODs: same inputs, different outputs!

    On mobile you could use LW, on Steam HD, in the same project.
    Last edited: May 4, 2018
  25. hippocoder


    Digital Ape Moderator

    Apr 11, 2010
    The decision has been made. I expect politeness at all times on the forum.
  26. rizu


    Oct 8, 2013
    Did you see my thread about this?

    While I understand your logic behind this, it's not something everyone can or wants to do. The moment you start using your modified render pipeline, you have to maintain it yourself instead of relying on Unity to do it. I'm pretty sure most people using Unity want to use it for building games, not for building tech.

    I personally feel that HD SRP should have scalability options so that you could scale it down to match LW. Having it designed to work properly only with higher-end systems isn't going to work if you release a game on the PC platform, for example. Unity games scaling to low-end systems has been one of Unity's strengths in the past; now having to choose between LW and HD takes that choice away from people who don't want to maintain an SRP themselves (it's still very technical and requires plenty of specific knowledge about renderer tech). To make a point, I'll quote your blog post for this:
    As a workaround, naively stuffing both the LW and HD pipelines into the same build may solve this more easily, especially if you automate the process yourself. I'm still curious whether there are other considerations than just swapping the pipeline assets and shaders (besides the obvious need to rework the post-processing, modify some assets and deal with pipeline-specific features like decals). I'm not a huge fan of being told "don't do it" without proper explanations of why not. Can you give us some valid reasons why we absolutely shouldn't do this?

    I mean, it seems to be technically possible (check the link at the start of this post), and even having to maintain an extra scene duplicate is totally acceptable for many who don't have many scenes in the final game; it's pretty much the same work as having to tweak the scene for a different time of day, really.
    Last edited: May 4, 2018
  27. GameDevCouple_I


    Oct 5, 2013
    They have entirely different models of literally everything, from scattering to lightmapping and beyond. So yes, possible, but mad work.
  28. rizu


    Oct 8, 2013
    But different projects have different demands; what might be mad work for some might work just nicely for others.

    Edit -> I only now noticed he's talking about mobile and PC ports; none of what I've written on this topic so far has been about maintaining two different SRPs for two different platforms. That being said, if he does a PC version he'll likely need to change a lot of other things from the mobile version too; the renderer is just one element then.
  29. hippocoder


    Digital Ape Moderator

    Apr 11, 2010
    It kind of does, or will :) By the time HD is mature, pretty much everything will support it anyway, and you can turn it all off. This is a nonsense discussion of what-ifs (not an attack, only an observation!)

    It's targeting mobiles, for goodness' sake. Give it 2 years and most if not all relevant mobiles will have compute and so on. Have you seen how scalable HD is? I have, and I have been examining the source. The answer is "very".
    GameDevCouple_I likes this.
  30. GameDevCouple_I


    Oct 5, 2013
    Right, hence the SRP, which you can finally edit yourself.

    If you want a pretty specific use case that likely won't be usable for most generalist users and will definitely require insane amounts of upkeep just to keep it running (including tons of upkeep every new Unity release), then you should make it yourself; otherwise you're just hoping that Unity will partially make your product for you.

    They made SRP possible, which is exactly what you need, so create an SRP that does this if you need this specific case? Just don't expect Unity devs to take time away from their very much future-planned roadmap for it, because it likely won't be for most users.
    hippocoder likes this.
  31. SebLagarde


    Unity Technologies

    Dec 30, 2015
    To clarify: HD is for compute-shader-capable platforms. HD is written with DX11 feature level and compute shaders in mind. If a platform doesn't support compute, it can't run HD. There is no scalable option possible in this case. Outside of this, HD works, and it is up to you to scale the game down by using lower LODs or fewer features (no volumetrics, no decals, etc...). In HD we try not to pay for features that we don't use (but for various reasons this is not always possible; for example with decals, if you enable the support but don't use it, you still have a slight cost increase).

    Built-in Unity is still here; nothing forces you to use LW or HD, particularly if you ship this year :) LW and HD are here to fill a gap that couldn't be filled with built-in Unity (i.e. built-in Unity has no high-end graphics and is not optimized for very low-end platforms).

    I really discourage such an approach.

    Here is a summary of what is different between LW and HD:
    - Not the same GI (HD uses correct quadratic attenuation + smooth falloff)
    - Not the same reflection probes (HDReflection probe)
    - Not the same light properties / behavior (HDLight)
    - Not the same materials
    - Not the same sky/fog settings (Volume settings system)
    - Not the same camera (HDCamera)
    - Not the same way of allocating render targets (HD doesn't use GetTemporaryRT at all, but a system named RTHandle)
    - Not the same supported render passes (no motion vectors in LW, no distortion...)
    - Not the same shaders (outside of the shader library)

    Of course you are free to do what you want, and having both LW and HD in the same project is technically possible, but we can't support such a choice and we will not provide help for it :). The only way you can do it is by duplicating all your materials, lights, reflection probes, settings, etc...
    syscrusher, hippocoder and rizu like this.
  32. rizu


    Oct 8, 2013
    This was the answer I was waiting for, big thank you :)
  33. rizu


    Oct 8, 2013
    I'll definitely first try whether I can get HD running on lower-end systems at acceptable perf, but I do like to consider my options. It's definitely the easiest option.

    The list SebLagarde kindly wrote doesn't really contain any huge showstoppers from my point of view, as you can easily sync these things between scenes via script, making sure the right scene uses the right light and probe types etc. You can probably automate at least 90% of the related work and tweak the rest. This kind of work would also benefit cross-platform titles, as it would make porting easier if there were tooling to make that happen.
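    The pipeline-asset swap itself is already scriptable via GraphicsSettings.renderPipelineAsset; the per-pipeline duplication of materials, lights and probes is the real work. A minimal sketch (the component and field names here are made up for illustration):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical switcher: ship both pipeline assets in the build and
// pick one per platform/quality tier. Assigning null falls back to
// the built-in render pipeline.
public class PipelineSwitcher : MonoBehaviour
{
    public RenderPipelineAsset lightweightAsset; // assign in the inspector
    public RenderPipelineAsset hdAsset;          // assign in the inspector

    public void UseLightweight() { GraphicsSettings.renderPipelineAsset = lightweightAsset; }
    public void UseHD()          { GraphicsSettings.renderPipelineAsset = hdAsset; }
    public void UseBuiltin()     { GraphicsSettings.renderPipelineAsset = null; }
}
```

    Note that swapping the asset does nothing about materials or lights authored for the other pipeline; those still need the duplication described above.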
  34. hippocoder


    Digital Ape Moderator

    Apr 11, 2010
    Yeah, it's called waiting for 2018.3, the projected release date for HD; there will be more information then, don't worry.
    GameDevCouple_I likes this.
  35. TwiiK


    Oct 23, 2007
    1. What is the end goal with the built-in HD pipeline, or the various pipelines in general?

    So far I've looked at the included HD pipeline example scene as well as tried to upgrade one of my existing projects to the HD pipeline. What I can see so far is that this feels extremely foreign to me, even as someone who has used Unity for more than a decade. Won't this lead to an insane amount of support for you guys and make it impossible for us to communicate with each other? Every question/answer will have to be prefaced with "what pipeline are you using?" as if we're all using different engines.

    Basically how will this look 1-2-3 years from now? Will it still be different ways of defining materials, lights, graphics settings etc. between the pipelines or will we only have one pipeline with one workflow?

    2. Are there any obvious benefits to using the HD pipeline over the default one at the moment?

    The project I upgraded is a scene where I'm only trying to achieve great-looking visuals in a small space, with no thought given to performance. It feels perfect for the HD pipeline, but so far I'm not even able to replicate the graphics quality I had before, and stuff like SSR is not even supported yet - not to mention all the things that broke and the fact that I have to relearn everything to be able to continue with the project now. :p

    Are there any obvious benefits for me? Looking at the example scene I can't really see anything that isn't already possible in the default pipeline. To be frank it looks rather basic. Are there subtle differences I'm missing here? Or are there big things that are only possible in the HD pipeline that you can't do in the default pipeline?

    Sorry if these questions have been answered already, if they have I'd love a nudge in the right direction to read the answers. I've been away from Unity for what feels like years now, but I finally have some spare time to fiddle with it again and I'm currently trying to wrap my head around the HD pipeline.
  36. mukki014


    Jul 30, 2017
    Hi, I'm Mukul. I love using Unity, and I've been playing with the Unity 2018.1.0f2 free version. I'm using HDRP, trying to get used to it and learn it. It's really awesome; I just wish I could use terrain and SpeedTrees. I tried to enable terrain by creating a new material and setting its shader to the HDRenderPipeline Lit shader, which makes the terrain appear, but it's no use.

    I've also used the HDRP layered lit material with tessellation, and it's really awesome. I created a test demo using other 3D models so I could see the power the Unity 2018 SRP holds, and it's really amazing. I'm more of a designer than a programmer, so I've always found it hard to set up lighting on my own, but with HDRP I can create some amazing results, as you can see in the pics. My laptop is 720p, so the resolution won't be great, but you can still see the high-fidelity graphics.

    I'm really happy with the new SRP; just hoping terrain support will come soon. One more thing I want to ask: in the Book of the Dead demo you have everything from terrain to trees. How have you managed that? Thank you. (Screenshots attached.)
  37. hippocoder


    Digital Ape Moderator

    Apr 11, 2010
    mukki014 likes this.
  38. Ferazel


    Apr 18, 2010
    Apologies if I did something wrong, but now that 2018.1 is out of beta and the feature is in preview, I wanted to start doing some tests with the SRP. Is the SRP supposed to work for standalone Windows builds? I was looking to do some very basic profiling between the standard renderer and a custom one. I can get it to render in the editor fine, but when I build for Windows I don't see anything but the Unity UI rendering. Is there maybe a setting I didn't set correctly?

    Unity: 2018.1.0f2 PackManVer: 1.1.5-Preview
  39. rizu


    Oct 8, 2013
    Yes, it's supposed to work. I have tested 1.1.5 only in 2018.2 betas in packaged builds, but I'm pretty sure earlier SRPs worked in packaged 2018.1 builds too.
  40. equalsequals


    Sep 27, 2010
    SRP has the SetShaderPassEnabled functionality, and the docs say "By default, all Shader passes are enabled". It would be beneficial, in my mind, to be able to specify via a pass tag that a shader pass is disabled by default, and have the omission of that tag use the current default.

    Understandably, you could manage this at the MaterialEditor or ShaderGUI level, but adding this feature could eliminate some boilerplate code.
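    For reference, the per-material entry point that exists today looks like the sketch below; the pass is addressed by its "LightMode" tag value, and the tag used here is only an example, not something from the post:

```csharp
using UnityEngine;

// Sketch of the existing boilerplate: toggle a pass on one material
// by its "LightMode" tag. "MotionVectors" is an example tag name.
public static class PassToggle
{
    public static void SetPass(Material mat, string lightModeTag, bool enabled)
    {
        mat.SetShaderPassEnabled(lightModeTag, enabled);
    }
}
```

    A shader-side "disabled by default" tag would let authors skip calling this for every material instance.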
  41. SebLagarde


    Unity Technologies

    Dec 30, 2015
    Hi, Book of the Dead uses Terrain, but it is a slightly modified/hacked engine compared to raw Unity/HD. We are currently trying to close the gap and see how we could provide compatibility between the terrain and the layered lit shader of HD. No ETA. The demo will be delivered with 2018.2.

    As for other content, there is also the Fontainebleau demo: this uses a raw 2018.1 engine/HD version, so it doesn't use any terrain tech.
    mukki014 and antoripa like this.
  42. SebLagarde


    Unity Technologies

    Dec 30, 2015
    The goals of the various SRPs are:
    - Provide examples of pipelines you can write with SRP (write your own, or modify ours; you can see them as tutorials)
    - Close the gap on the high end (high-quality graphics) and the low end (performance/efficiency on low-end platforms) not covered by built-in Unity

    Goal of HDRP: achieve the quality/performance of a modern engine to produce AA to small AAA games (think Batman, Remember Me, Fortnite, Deus Ex, BioShock, etc...) + have a mode for even higher quality (i.e. not games).

    If built-in Unity fits your needs and gives higher quality for your project, then use it :). Choose the pipeline that fits your project. I can't recommend HD for any project shipping this year.
    The benefit of HDRP really depends on your context/project. And of course, being an experimental technology, there are still plenty of bugs/problems and limited documentation. To help you better understand the differences with Unity, here is the feature list of HDRP in 2018.1.

    2018.1 features / functionality

    • Platform support: DX11, DX12, PS4, XBoxOne, Metal, Vulkan

    • HD doesn’t use the Resources folder mechanism but relies on a RenderPipelineResources asset that references all orphan shaders and compute shaders
      • Pros: Users don’t need to add mandatory shaders to “Always Included Shaders”

      • Cons: Not using the Resources folder is problematic, as assets referenced in RenderPipelineResources suffer from another issue related to the asset pipeline and meta files that causes random link breaks when we upgrade them. The problem appears with the Postprocess stack also.

      • Not extensible by users, even more painful with the package manager
    • Constant buffers at Frame, Camera and RenderPass frequency are set up from C#
      • Done for camera, except for the shadow matrices

      • Constant buffer arrangement compatible with the new batcher from Arnaud Carre ("UnityPerMaterial" and "UnityPerDraw")

      • TAA jittering matrix handled by HDRP; the VR camera is retrieved with the C# API and sent to the shader
    • Camera-relative rendering (for better precision)

    • The RenderPipeline asset controls the supported features of HD (shadow mask, VR, etc…)
      • This allows definitively removing memory allocations or shader variants
    • FrameSettings by camera. FrameSettings define which rendering features are enabled for a given camera (shadows, sky/fog, SSS, opaque, transparent, postprocess…)

    • Scene settings controlled via a generic volume settings hierarchy (similar to postprocess)
      • This handles settings like sky, cascaded shadows, fog
    • Render target management via RTHandle
      • No use of GetTemporaryRT() calls, which cause a lot of trouble with multiple cameras and a lot of reallocation. Targets are allocated via RenderTexture and based on a percentage of screen size (except textures like shadows).
    • Generic debug window that can handle any debug command / control. This window is available both in the editor and at runtime.
      • FrameSettings control per camera (overrides the current FrameSettings of the camera)

      • Material debug mode
        • Display any property of a given master shader, either deferred or forward
      • Lighting debug mode
        • Can display diffuse lighting only, specular lighting only, shadow maps, the sky cubemap, a lux meter

        • Can display the shadow of the selected light in the editor
      • Property override mode
        • Override albedo, diffuse, normal
      • Mipmap and streaming debug mode
        • Only based on the baseColor of Lit.shader
      • Property debug mode
        • Display meshes with a given property like POM or tessellation
      • Intermediate buffers
        • Can display the intermediate AO buffer, motion vectors
      • Color picker debug mode
        • Allows picking a pixel of the current HDR buffer (or any enabled debug mode) and displaying its value
      • NaN checker

      • Light tile / cluster / material classification debug modes
    • New light types / features
      • HD custom light editor

      • Inner angle control for SpotLight

      • SpotLight shapes: Cone, Box (Box is what people call a local directional light) and Pyramid projector
        • Box doesn’t support orthogonal shadows for now

        • Pyramid projector doesn’t use a correct shadow (uses the cone settings one)

        • Pyramid supports perspective shadows

        • Pyramid and Box have no edge attenuation; this is controlled by the cookie
      • Color temperature

      • Colored cookie textures on directional, point and spot lights
        • Repeat and clamp modes for directional

        • Clamp-to-black only for spot

        • Animated cookies supported with CustomTexture
      • No-attenuation option (for indoors), dimmer option (to smoothly turn a light on/off by script), fade distance option (smoothly fade out with distance)

      • Affect Diffuse / Affect Specular options (don’t save any cost)

      • HD Reflection probe editor

      • Reflection probes with improvement
        • Support OBB and sphere shape

        • Separate influence volume and proxy volume (Only API Side, not UI)

        • Various fading option: per face fade distance, per face fade, normal oriented based fade, weight dimmer

        • Intensity multiplier

        • Realtime reflection probes supported

        • Reflection probe convolution done on the GPU at runtime
          • No need for the “Specular (Glossy Reflection)” option for custom cubemaps (saves import time)

          • Allows different texture resolutions on different platforms (the platform just uses the required regular mipmap, then convolves)

          • CPU convolution is still enabled but useless (it needs to be disabled)

          • For realtime cubemaps: the built-in Unity GPU convolution still executes in addition to the correct HD convolution.
        • The GPU convolution we do is fast and uses importance sampling; it is not a good fit for an HDRI with a sun (the sun should be analytic) and generates noise.
      • Realtime Planar reflection probes
        • Handled exactly like reflection probes (except the shape is a plane)

        • Mirror only (no glossy reflection)
      • Rectangular area light (no shadows, no GI) - high cost

      • Line light (no shadows, no GI) - high cost

      • Physical light units (Lumen for punctual and area lights, Lux at ground level for Directional)

      • Built-in area lights are not supported (replaced by the Rectangle light). However, in 2018.1 the Rectangle light doesn't support baking, so there is no baked equivalent
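      As a rough illustration of what physical light units imply (a sketch under standard photometry assumptions, not HDRP's actual conversion code): converting a luminous flux in lumens to a luminous intensity in candela divides by the solid angle the light emits into.

```python
import math

# Standard photometric conversions (illustrative only; HDRP's actual
# unit-conversion code may differ in details such as spot reflector behavior).

def point_lumen_to_candela(lumen):
    # A point light emits over the full sphere: 4*pi steradians.
    return lumen / (4.0 * math.pi)

def spot_lumen_to_candela(lumen, outer_angle_deg):
    # A cone with half-angle theta subtends 2*pi*(1 - cos(theta)) steradians.
    half_angle = math.radians(outer_angle_deg) / 2.0
    return lumen / (2.0 * math.pi * (1.0 - math.cos(half_angle)))
```

      For example, a 600 lm point light is roughly 47.7 cd; the same flux focused into a narrower cone yields a higher intensity.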

    • Shadows:
      • HD additional shadow data editor
        • Shadow dimmer

        • Shadow resolution
      • The ShadowMask and ShadowMask Distance features of built-in Unity are supported

      • Deferred shadows for deferred and forward opaque (for GPU performance)

      • Screen-space contact shadows for directional lights only - controlled in volume settings - high cost (not polished)

      • Cascade shadow map distance controlled in volume settings

      • Supports various filtering algorithms: MSM, VSM, EVSM, PCF, TentPCF
        • Shadow atlas

        • Defaults to TentPCF

        • Controlled via code
      • Various shadow bias controls
        • View bias

        • Normal bias
          • Uses the normal map. For directional lights only in deferred.
        • Edge leak fix

        • Sample bias (disabled by default, expensive)

    • Lighting architecture
      • Prepass option
        • A prepass is always performed for forward opaque / alpha-tested opaque and for deferred alpha-tested opaque. Then, during regular rendering (forward or deferred), alpha testing is not performed and a depth-equal Z test is used (for performance reasons).

        • Optional full prepass in deferred (i.e. also renders non-alpha-tested opaque)
      • Tile rendering for deferred
        • Lighting and emissive are stored together in the emissive buffer (RT3)
      • Tile or cluster rendering for forward opaque; transparent always uses cluster. With MSAA, always cluster
        • The current limit is 24 lights per tile (could be changed to 31 with little effort)
      • Tile/Cluster uses TextureArrays
        • Cookie, CubeCookie, ReflectionProbe and PlanarReflection textures have a fixed size

        • Realtime versions are part of the TextureArray, with the same size restriction

        • None of these textures are compressed (compressing the texture in the editor currently doesn't change that - they are uncompressed in the TextureArray)
      • Light classification for deferred case

      • Reflection and refraction hierarchy
        • Screen-space refraction (does not work against the sky background); no screen-space reflection

        • Falls back on reflection probes (the same probe is reused for refraction)

        • Falls back on the sky
      • Feature parity between the deferred and forward renderers (Decals, SSSSS)
        • Except for the shadow normal bias with deferred directional lights

    • Sky
      • Realtime GPU cubemap convolution (see Reflection Probes)

      • Sky settings controlled by volume settings

      • Sky manager allowing different sky types
        • Optional separate sky settings for baking (ambient probe + lightmaps)

        • Optional lighting override (lighting different from the visuals)

        • Designed with dynamic time of day in mind
          • The sky is rendered into a cubemap when it changes, and a convolution is then done on the GPU. The sky is then rendered normally in the background.

          • Currently the cubemap is set up in a sky material for the built-in Unity system

          • Time of day is not implemented
      • HDRI sky

      • Procedural sky (Same as built-in Unity)

      • Fog/Atmospheric scattering: Linear or Exponential
        • Supports height-based fog

        • Fog can be tinted by the sky color (uses mips from the sky cubemap)

        • Applies to both opaque and transparent

    • SS Ambient Occlusion (from the Postprocess stack V2: MSVO)
      • SSAO is applied during the lighting pass on the lighting buffer (which includes GI * albedo * AO + emissive, meaning double occlusion and AO on emissive, both barely visible)

      • SSAO can be applied to direct lighting with a percentage factor

      • Works in both forward and deferred (depth only)

      • Uses the multi-bounce approximation formula from Jimenez
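      For reference, the multi-bounce approximation from Jimenez et al. (GTAO, 2016) is a cubic polynomial fit that brightens occlusion based on albedo; the sketch below uses the coefficients as published in that paper, which this implementation presumably follows:

```python
def multi_bounce_ao(visibility, albedo):
    # Polynomial fit from Jimenez et al., "Practical Realtime Strategies for
    # Accurate Indirect Occlusion" (2016). Approximates the brightening effect
    # of light bouncing inside occluded corners: brighter surfaces lose less
    # energy per bounce, so their occlusion is lifted more.
    a = 2.0404 * albedo - 0.3324
    b = -4.7951 * albedo + 0.6417
    c = 2.7552 * albedo + 0.6903
    return max(visibility, ((visibility * a + b) * visibility + c) * visibility)
```

      Fully visible pixels (visibility = 1) stay at ~1, fully occluded pixels stay at 0, and higher albedos raise mid-range occlusion values.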

    • SS Specular occlusion
      • Uses the Tri-Ace trick on top of SSAO

      • Min with the specular occlusion from the material (no double occlusion)

    • Global illumination
      • Hack to support alpha maps and transmission for PVR (a custom transmissive map should be supported if the user adds one)

      • Inverse-square attenuation (with smooth attenuation) for the Progressive Lightmapper

      • Enlighten does not support inverse-square attenuation
    • Material architecture
      • Material classification for the deferred case

      • 4 GBuffers (3 for material, one for lighting) in deferred, + 1 for the shadow mask

    • Lit
      • Opaque/Transparent
        • Blend modes for transparent: Alpha, Add, Premultiplied alpha

        • Compatible with fog

        • Specular lighting blend modes: better handling of specular lighting with transparent blend modes
      • Supports receiving decals

      • Motion vectors for vertex animation (with Shader Graph)

      • Transparent queue priority, allowing the sorting order of transparents to be forced

      • Two-sided lighting: None, Flip, Mirror

      • Object-space (OS) and tangent-space (TS) normal maps

      • Parallax occlusion mapping (POM)
        • Can adapt to object scale and tiling
      • Vertex displacement map (can be applied with or without tessellation)
        • Can adapt to object scale and tiling
      • Tessellation (Phong)

      • Disney parametrization (Metal/BaseColor) or Specular/Color parametrization

      • Detail maps (smoothness, albedo, normal)
        • The tiling of the detail map can inherit from the tiling of the base
      • Surface gradient framework (allows a correct tangent basis for UV sets other than UV0). Tangent-space normal mapping relies on a tangent basis calculated from the UV set. In Unity, a tangent basis is only available for UV0, which means that if other UV sets are used for normal maps (as is often the case for detail normal maps), the result is incorrect. The surface gradient framework generates a tangent basis on the fly for UV sets other than UV0, giving correct normal mapping results.
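      The underlying problem, deriving a tangent basis from an arbitrary UV set, can be illustrated with the classic per-triangle derivation (a sketch of the standard math; the actual surface gradient framework by Mikkelsen works differently, building the basis from derivatives in the shader):

```python
def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    # Classic tangent derivation: solve edge = T*du + B*dv for the tangent T
    # and bitangent B of a triangle, given its positions and UVs.
    e1 = [p1[i] - p0[i] for i in range(3)]
    e2 = [p2[i] - p0[i] for i in range(3)]
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    r = 1.0 / (du1 * dv2 - du2 * dv1)  # assumes non-degenerate UVs
    tangent = [(e1[i] * dv2 - e2[i] * dv1) * r for i in range(3)]
    bitangent = [(e2[i] * du1 - e1[i] * du2) * r for i in range(3)]
    return tangent, bitangent
```

      For a triangle with UVs aligned to the XY plane, this returns the X and Y axes, as expected. Doing this per-pixel for any UV set is what removes the UV0-only restriction.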

      • UV0-UV3 mapping, planar and triplanar mapping (compatible with POM and OS or TS normal maps)
        • UV1 is still reserved for the static lightmap (built-in behavior)

        • UV2 is still reserved for the dynamic lightmap (built-in behavior)

        • A precision about the static batching issue w.r.t. lightmap UVs: when static batching happens, the *lightmap* UV transformation is applied directly to the UVs inside the batched mesh, no longer in the shader, which means it is applied to everything that uses those particular UVs
      • Bent normal support, used to fetch the lightmap/light probes (gives better results than using the normal map)

      • Specular occlusion with the bent normal (low quality)

      • Emissive color/mask that can optionally be affected by albedo

      • Standard BRDF
        • Isotropic GGX for specular, with multiple-scattering support

        • Burley (Disney) diffuse
      • Various BRDF features (added on top of, or replacing parts of, the standard BRDF)
        • Anisotropy
          • Anisotropic GGX with multiple-scattering support for specular

          • The parameter goes from -1 to 1 to support anisotropy along the tangent or along the bitangent
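      A common way to implement such a signed anisotropy parameter (a sketch of the usual remapping; the exact clamping is an assumption, not HDRP's verified code) is to stretch the GGX roughness along the tangent or bitangent:

```python
def anisotropy_to_roughness(roughness, anisotropy):
    # anisotropy in [-1, 1]: positive stretches highlights along the tangent,
    # negative along the bitangent; 0 falls back to isotropic GGX.
    roughness_t = max(0.0, roughness * (1.0 + anisotropy))
    roughness_b = max(0.0, roughness * (1.0 - anisotropy))
    return roughness_t, roughness_b
```

      At anisotropy = 0 both axes share the same roughness, recovering the isotropic case.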
        • Subsurface scattering
          • Normalized Burley diffusion (Disney) subsurface scattering

          • Controlled via a diffusion profile (Diffusion Profile asset)

          • SSS mask

          • Optional transmission
            • Thickness map for transmittance

            • Transmittance handles shadows for thin objects

            • Hack to support shadows for thick objects
        • Iridescence
          • Replaces the Fresnel term of the specular

          • Iridescence mask

          • Iridescence thickness map
        • Translucent
          • Transmission only (see Subsurface scattering)
        • Clear coat
          • Clear coat mask

          • Fixed smoothness; reuses the normal map of the standard material
      • Rough refraction option for transparent materials
        • Supports 2 modes (plane and sphere), with thickness

        • Absorption

        • Index of refraction

        • Transparents can be rendered before the rough refraction pass (but this breaks sorting) so they are visible in the rough refraction
      • Distortion (distortion is based on a distortion vector and targets artistic effects, unlike rough refraction, which is physical)

      • GI
        • Two-sided lighting is linked with Double-Sided Global Illumination

        • The meta pass correctly handles vertex colors and UVs, and supports planar and triplanar mapping
      • Transparent-only sorting helpers
        • A material can define a relative priority (either pre-refraction or not)

        • (not exposed) Support for a depth prepass for transparent objects (to help with sorting issues)

        • Support for back-face-then-front-face rendering to improve sorting issues

        • Support for a depth post-pass to solve issues with post-processing (DOF/MB)
      • Supports instancing

    • Layered Lit
      • Two to four layers
        • A layer is a Lit material that only supports the material types Standard, Subsurface scattering and Translucent
          • Transparent may not behave correctly

          • Flipped normals with two-sided lighting may not work correctly with multiple layers and with the surface gradient framework on UV1-3.

          • Parallax occlusion mapping (POM) does not currently work with multiple layers
      • Blending based on
        • Blend mask
          • Separate UV mapping and tiling
        • Vertex color (two modes: additive, multiplicative)

        • Heightmap (optional)
      • Optional influence mode for the Main layer
        • Influence mask layer for fine control

        • Currently the baseColor, normal and heightmap of a layer can be modified (influenced) by the Main layer
      • Optional density mode
        • The alpha of the diffuse map controls a threshold that makes layers appear (giving the illusion of controlling the density of a particular material, like small pines)
      • 2 buttons to synchronize the properties of a given material onto a given layer
        • One synchronizes all properties

        • One synchronizes all properties except UV mapping and tiling

    • Decal
      • Decal Projector
        • DBuffer approach for deferred and forward opaque

        • Clustered decals for transparent (crashes on PS4)

        • Supports BaseColor, Normal and Smoothness (single blend value)

        • Still unstable
    • Supports Postprocess stack V2, except SSR

    • Motion vectors: support for skinned motion vectors, and adds the concept of a MotionVectors pass

      Comparison with vanilla Unity

      Divergent behavior from vanilla Unity
      • HD relies on a metric system: 1 unit == 1 m. This scale needs to be respected for lighting to work correctly

      • Better pre-integration of cubemaps with the GGX BRDF. Vanilla Unity pre-integrates the cubemap with the NDF, then applies a tweaked Fresnel term with roughness. HD pre-integrates the DFG term and applies it to the cubemap pre-integrated with the NDF. This better matches the reference (the reference being brute-force integration with the full BRDF (DFG)).

      • Pre-integration of GI (lightmap/light probe) with Disney diffuse. Vanilla Unity does nothing; HD applies the pre-integrated DFG term in a post step. This better matches the reference (the reference being brute-force integration with the full BRDF (DFG)).

      • HD light attenuation is inverse-square falloff and uses linear intensity. There is a smooth terminator function to force the attenuation to 0 at the range boundary, and there is also an option to not apply the attenuation at all. Vanilla Unity uses a texture to store the attenuation with a special falloff formula, and uses gamma intensity. Going from gamma intensity to linear intensity is done with linear_intensity = gamma_intensity^0.4545. Note: as the attenuation is different, even using this formula gives no guarantee of a match.
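      The falloff described above can be sketched as follows (the specific window polynomial is one common choice, e.g. in glTF/UE4; HDRP's exact smoothing factor is an assumption here, and the gamma conversion just restates the formula quoted in the post):

```python
import math

def punctual_attenuation(distance, light_range):
    # Physical 1/d^2 falloff, multiplied by a window that smoothly reaches
    # exactly 0 at the range boundary (avoids a hard visible cutoff).
    x = min(distance / light_range, 1.0)
    window = (1.0 - x ** 4) ** 2
    return window / max(distance * distance, 1e-4)

def gamma_to_linear_intensity(gamma_intensity):
    # Conversion quoted in the post: linear = gamma ** 0.4545 (~= 1/2.2).
    return gamma_intensity ** 0.4545
```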

      • HD spot light attenuation uses 2 angles (inner angle and outer angle) to control the spot attenuation. Vanilla Unity uses only one.
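      A typical two-angle spot attenuation works like this (a sketch of the standard technique, not HDRP's exact shader code): full intensity inside the inner cone, smoothly fading to zero at the outer cone.

```python
import math

def spot_angle_attenuation(angle_deg, inner_deg, outer_deg):
    # Remap the cosine of the angle to [0, 1] between the outer and inner cones.
    cos_a = math.cos(math.radians(angle_deg))
    cos_inner = math.cos(math.radians(inner_deg))
    cos_outer = math.cos(math.radians(outer_deg))
    t = (cos_a - cos_outer) / max(cos_inner - cos_outer, 1e-4)
    t = min(max(t, 0.0), 1.0)
    return t * t  # squaring gives a softer, smoother falloff
```

      With only one angle (the vanilla case) there is no region of guaranteed full intensity, which is why the two systems cannot be matched exactly.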

      • HD correctly performs a divide by PI on the whole BRDF. Vanilla Unity has an inconsistency in the BRDF between specular and diffuse: specular is divided by PI but diffuse is not. This means that whatever effort is made to match lighting between vanilla and HD, one of the components will differ by a factor of PI.
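      The PI discrepancy can be shown with a Lambertian diffuse term (a minimal illustration, not engine code):

```python
import math

def diffuse_hd(albedo):
    # Energy-conserving Lambert: the diffuse BRDF is albedo / pi.
    return albedo / math.pi

def diffuse_vanilla(albedo):
    # Per the post, vanilla Unity omits the divide on the diffuse term.
    return albedo

# Whatever the albedo, the two diffuse terms differ by exactly a factor of pi.
```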

      • HD interprets the influence parameter for Reflection Probes in a more artist-friendly way. Influence is inner influence (meaning the transition happens inside the volume); vanilla uses outer influence. This was switched because artists prefer to set up their volume and then simply tweak the transition size. With outer influence, they need to update the volume size whenever they tweak the transition size (indoors).
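      The inner-influence interpretation can be sketched for a sphere probe (an illustrative model, not HDRP's actual blend code): the blend weight ramps from 0 at the volume boundary up to 1 once the shaded point is deeper inside than the fade distance.

```python
def inner_influence_weight(dist_to_probe_center, radius, fade_distance):
    # Distance from the shaded point to the volume boundary, measured inward.
    depth_inside = radius - dist_to_probe_center
    if fade_distance <= 0.0:
        return 1.0 if depth_inside >= 0.0 else 0.0
    return min(max(depth_inside / fade_distance, 0.0), 1.0)
```

      The workflow benefit is that tweaking fade_distance never requires resizing the volume itself, unlike the outer-influence convention.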

      • HD uses camera-relative rendering; vanilla Unity doesn't support it. This means that lights/objects sent to shaders in HD have a different translation.
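      Camera-relative rendering just means subtracting the camera position from all world-space positions on the CPU before upload, so shader math happens near the origin where float precision is best (a minimal sketch):

```python
def to_camera_relative(world_pos, camera_pos):
    # Done in full precision on the CPU; the GPU then works with small values
    # near the origin instead of large world coordinates.
    return tuple(w - c for w, c in zip(world_pos, camera_pos))
```

      For example, a light at (10000.25, 5, 3) with the camera at (10000, 0, 0) reaches the shader as (0.25, 5, 3).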

      • Metal/smoothness and Specular/smoothness are handled inside the same Lit material

      • The Additive blend mode applies opacity, unlike the Standard shader

      • DoubleSidedGI is automatically coupled to the two-sided lighting flag and is not exposed in the UI

      • MotionVectors work as in vanilla Unity (skinned meshes and rigid transforms render motion vectors), but in addition, in HDRP, if a shader has an enabled MotionVectors pass (velocity pass), it will render into the buffer independently of moving or skinning (so it handles vertex animation).

  43. mukki014


    Jul 30, 2017
    Hi, I want to know why Unity gets stuck during a build using HDRP - it's stuck on shader variant compilation, so I'm not able to build. I know HDRP isn't ready for production, but I just want to test its performance and other features in a build. Thank you.
  44. SebLagarde


    Unity Technologies

    Dec 30, 2015
    so HD has a lot of shader variants and they can take a few minutes to compile; this is probably what you are experiencing. It is even worse when you build a player. In more recent versions we have added a progress bar for shader compilation when building the player, and we have also reduced the number of shader variants to compile in 2018.2.

    There are also some rare cases where, when you have an error in a shader (meaning you have modified it), you can get stuck in shader variant compilation. Try waiting a long time (1 h) and see if your problem is solved.

    There is currently an effort for 2018.2 to make shader compilation faster. HD shaders are more complicated than regular Unity shaders, with many more variants, and the Unity shader compiler architecture was not ready for this :)
    mukki014 likes this.
  45. mukki014


    Jul 30, 2017
    Thank you for the fast response. I will try to build again for PC.
  46. cfree


    Sep 30, 2014
    UE also has a "Compiling Shaders" loader, so it is expected... I hope we can be as fast as they are, or faster ;)

    I hope as well that this "rare case where you have an error in the shader" gets fixed soon... 1 h is a long time to wait, even more so with the possibility of getting an error at the end :)

    Thanks for your team's hard work... we (the community) have BIG expectations for the HD pipeline!
    mukki014 likes this.
  47. Kustuk


    May 29, 2016
    Do I understand right that SSR isn't implemented yet in HDRP? There are SSR options here and there, but they do nothing
  48. TwiiK


    Oct 23, 2007
    Thanks for the thorough reply, but I feel like it didn't answer my first question. I've experimented quite a bit with the HD pipeline now and read all the articles I could find about it and I can see a number of features it provides that the normal pipeline doesn't, so that point is fair.

    But most of the games you gave as examples are made with Unreal Engine which afaik has one way of defining materials, lights etc. all the while both supporting low end and the highest end rendering. My question or worry is that working with Unity going forward is going to feel like working with multiple different engines depending on what pipeline you're using. That's why I was curious if all the pipelines, at least those provided by default by Unity, will at one point use one standardized system of defining these things or if it will always be this way.

    I may be overreacting here, but this feels like Boo, Unityscript and C# all over again. I remember back when I was learning Unity that it was an awful experience for me to find a tutorial or example and for it to be in the "wrong" language. So far in my experimentation there seems to be no rhyme or reason between what produces a good looking scene in the HD pipeline compared to the default pipeline so a lot of what I know from Unity beforehand feels moot when working in the HD pipeline, it feels just as if I'm learning a new engine, which scares me a bit.

    And I get that these pipelines can be extended or that you can write your own, but I'm not at that level and I never will be, as is the case for most people who use Unity I imagine. I'm questioning the decision of having these "example pipelines" provided as starting templates with the installer when they are seemingly so different to work with and the knowledge I gain in one doesn't feel like it's transferable to the other.
    chingwa, mukki014 and nipoco like this.
  49. mukki014


    Jul 30, 2017
    I've got SSR working in hdrp.
  50. Kustuk


    May 29, 2016
    definitely doesn't work in 2018.2b3