Feedback Wanted: Scriptable Render Pipelines

Discussion in 'Graphics Experimental Previews' started by Tim-C, May 9, 2017.

  1. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,027
    I have several questions that I wasn't able to find documentation for, other than writing an entirely new pipeline:

    1] How do I inject a command buffer at a specific render point without writing a whole new render pipeline? For example, I want to copy the depth buffer right after it is created and make several downsampled versions of it (half, quarter, etc.). This was easy with a camera command buffer using BeforeDepthTexture (forward) or BeforeReflections (deferred).

    2] I want to inject several other additional command buffers: one right after the opaque objects are rendered and one right before transparent objects are rendered, and a final one right after transparent objects are rendered. How do I do this with the new render pipelines? Is this kind of granularity possible?

    3] What is the replacement for light command buffers? For example I want to capture and perhaps modify the cascade and screen space shadow map...?

    Will this be different for lightweight vs hd pipeline? What about custom pipelines?

    It would be great to hook into rendering points without writing a new pipeline, similar to how camera command buffers work, but maybe that's not how these pipelines are supposed to work...? I am quite new to SRP, so please forgive my newness.
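    Just to illustrate what I mean by "hook into rendering points": in a fully custom pipeline, this is only possible by executing your own CommandBuffer inside Render() at the right moment. A rough sketch, assuming a from-scratch pipeline; the depth targets and downsample material are hypothetical placeholders, and this is exactly the kind of source-level change I'm hoping to avoid for HDRP/LWRP:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Rough sketch of a from-scratch pipeline that executes a custom command
// buffer right after the depth texture would be ready. "depthRT",
// "halfDepthRT" and "downsampleMaterial" are hypothetical placeholders,
// and the exact base-class signature varies slightly by Unity version.
public class SketchPipeline : RenderPipeline
{
    protected override void Render(ScriptableRenderContext context, Camera[] cameras)
    {
        foreach (var camera in cameras)
        {
            context.SetupCameraProperties(camera);

            // ...a depth prepass would be issued here...

            var cmd = new CommandBuffer { name = "Copy+Downsample Depth" };
            // cmd.Blit(depthRT, halfDepthRT, downsampleMaterial); // hypothetical
            context.ExecuteCommandBuffer(cmd);
            cmd.Release();

            // ...opaque, transparent and post passes...

            context.Submit();
        }
    }
}
```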
     
    Last edited: Jan 10, 2019
    Nyarlathothep, HolyShovel and Shorely like this.
  2. Danua

    Danua

    Joined:
    Feb 20, 2015
    Posts:
    192
    Hello everyone, I've got some questions regarding this article:
    https://github.com/Unity-Technologies/ScriptableRenderPipeline/wiki/Upgrading-to-HDRP
    You wrote there: "The HDRP template examples were made to look good but were not realistic, whereas this Scene uses physically correct light values: an afternoon direct sun with no clouds in the sky is much brighter than even the best professional construction spotlight. However, the spotlight is still casting light and shadows on the side of the wall."
    So my question is: how will you make it more realistic going forward?
    Also, any news about Post Processing Stack v3, with pre-exposure and the Sunny 16 rule for a physically based lighting setup?
    P.S.
    Can you share evening lux values for the sun and sky? I can't figure them out.
     
    Last edited: Jan 11, 2019
  3. ekakiya

    ekakiya

    Joined:
    Jul 25, 2011
    Posts:
    22
    That document's intensity values look wrong.
    For a clear sky at noon, Sun = 100,000 and Sky = 20,000 are typical numbers for illuminance (lux), not for luminous flux (lumens).
    And the surface's emissive intensity must be a luminance (nits).

    The sun (directional light)'s intensity is fine at 100,000 lux.
    To make the sky contribute 20,000 lux, the sky's emissive intensity must be 20,000/π nits if the sky is flat white.

    The light bulb is 8,500 lumens; that's luminous flux (lumens).
    So the light bulb's emissive intensity must be 8,500 / total surface area nits, if its light distribution characteristic is isotropic.
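    To make the arithmetic concrete (plain C#; the formulas are the ones above, and the 5 m² bulb surface area is a made-up example):

```csharp
using System;

class PhotometrySanityCheck
{
    static void Main()
    {
        // Flat white sky contributing 20,000 lux of illuminance:
        // luminance = illuminance / PI (Lambertian assumption).
        double skyNits = 20000.0 / Math.PI;      // ~6366 nit

        // 8,500 lm bulb spread over its surface, per the formula above.
        double area = 5.0;                       // m^2, made-up example
        double bulbNits = 8500.0 / area;         // 1700 nit

        Console.WriteLine($"sky ~{skyNits:F0} nit, bulb {bulbNits:F0} nit");
    }
}
```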
     
    YJack and Danua like this.
  4. Danua

    Danua

    Joined:
    Feb 20, 2015
    Posts:
    192
    OK, but I want to understand what lux settings apply for the sky and sun in the evening, around 5–6 PM.
     
    Last edited: Jan 11, 2019
  5. Danua

    Danua

    Joined:
    Feb 20, 2015
    Posts:
    192
    #Feature request:
    Allow HDR values for the volumetric fog color (single scattering albedo) here:
    upload_2019-1-11_15-24-3.png
     
  6. XRA

    XRA

    Joined:
    Aug 26, 2010
    Posts:
    189
    I've noticed the cameras array in Render(context, cameras[]) always has a length of 1, even when both the Scene view and Game view are visible and rendering at the same time. Is this intended?

    *EDIT* Yeah, it looks like the Scene view and Game view are rendered in separate calls.

    I was trying to have some per-frame data render globally across all active cameras, and was treating the cameras array as if it contained all currently active and rendering cameras, so that I would know when all of them had rendered, but it doesn't seem to work that way.

    What is the intent of having cameras as an array?

    *edit* Here is the workaround; it should give an idea of what I need to do:
    Code (CSharp):
    for (int i = DebugRenders.Count - 1; i > -1; i--)
    {
    #if UNITY_EDITOR
        int count = Application.isPlaying ? 2 : 1;
        if (DebugRenders[i].RenderCount >= count)
        //if (DebugRenders[i].RenderCount >= cameras.Length) //TODO cameras.Length always 1 ?
    #endif
        {
            DebugRenders[i].Release();
            DebugRenders.RemoveAt(i);
        }
    }
     
    Last edited: Jan 13, 2019
  7. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,093
    Core Unity does need this class, as we call Render on it. We may be able to do something a little better here, though.

    This is being added today :)
     
  8. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,093
    You can have more than one active camera. Think split screen or a 'video' camera in your game world that records to render texture.
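    For anyone wondering how that surfaces in a custom pipeline: the cameras array is simply every camera Unity asks you to draw in that call. A minimal, illustrative loop (not HDRP/LWRP internals):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Minimal sketch: Render() receives every camera Unity wants drawn in
// this call, e.g. both halves of a split screen; the Scene view
// typically arrives in a separate call. The exact base-class signature
// varies by Unity version.
public class CameraLoopPipeline : RenderPipeline
{
    protected override void Render(ScriptableRenderContext context, Camera[] cameras)
    {
        foreach (var camera in cameras)
        {
            context.SetupCameraProperties(camera);
            var cmd = new CommandBuffer { name = "Clear " + camera.name };
            cmd.ClearRenderTarget(true, true, camera.backgroundColor);
            context.ExecuteCommandBuffer(cmd);
            cmd.Release();
        }
        context.Submit();
    }
}
```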
     
    ReadyPlayGames likes this.
  9. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,027
    Very nice!
     
  10. Danua

    Danua

    Joined:
    Feb 20, 2015
    Posts:
    192
    @Tim-C, what's about that?
    #Feature requset:
    Allow for volumetric fog color(single scattering albedo) HDR values here
     
  11. equalsequals

    equalsequals

    Joined:
    Sep 27, 2010
    Posts:
    105
    @Tim-C @phil_lira Is there an API for querying whether or not a given Shader is supported by the active RenderPipeline?

    There exists Shader.globalRenderPipeline for setting what the active SRP is, but I don't see any way to query whether a Shader has the correct RenderPipeline tag. It would be great if ShaderData, ShaderInfo, or ShaderUtil had some way of querying this for Editor tools.

    On a side note, it would be extremely beneficial if the Shader List dropdown in the Material Editor would hide unsupported Shaders by default. Our artists quite frequently complain about there being a lot of "garbage" created when selecting a Shader now, since so many of them are unsupported. It's a Quality of Life feature, for sure, but with an API I could script this type of behavior myself with a custom MaterialEditor.
     
  12. whidzee

    whidzee

    Joined:
    Nov 20, 2012
    Posts:
    104
    Is there an easy way to revert to not using a scriptable render pipeline once you've converted to HDRP? My project is coming along nicely; however, I found out that some tools I have no longer work in HDRP, and reverting to the built-in pipeline would allow my tools to continue working.
     
  13. equalsequals

    equalsequals

    Joined:
    Sep 27, 2010
    Posts:
    105
    Remove the pipeline asset from the Graphics Settings, modify all your Materials to no longer be HDRP and remove any special components.

    Or, if you're using version control, just revert back to a revision before HDRP was used.
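    If it helps, the first step can also be done from script, which is handy for editor tooling (standard GraphicsSettings API; materials still need converting back by hand):

```csharp
using UnityEngine.Rendering;

// Clearing the pipeline asset reverts Unity to the built-in render
// pipeline. Materials using HDRP shaders must still be reassigned
// manually afterwards.
public static class PipelineReset
{
    public static void UseBuiltInPipeline()
    {
        GraphicsSettings.renderPipelineAsset = null;
    }
}
```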
     
  14. phil_lira

    phil_lira

    Unity Technologies

    Joined:
    Dec 17, 2014
    Posts:
    385
    We have this in our backlog, but given the amount of things to wrap up to get out of preview, it didn't get high priority.

    About the shader API to check whether a shader is supported: what's your use case? Is it for filtering shaders in the editor dropdown?
     
    Prodigga likes this.
  15. equalsequals

    equalsequals

    Joined:
    Sep 27, 2010
    Posts:
    105
    I was able to achieve the filtering with some heavy use of reflection and a blacklist ScriptableObject. It's quite elaborate and not exactly as future-proof as an official API, though. ;)

    For this specific case, yes, but I can see some additional use cases as well, e.g. if a ShaderGUI or Material Editor for an asset needs to change its behavior depending on the active SubShader per RenderPipeline.
     
  16. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,332
    Just a bit of well-intentioned criticism about the HD and LW teams not communicating.

    There are semantic differences between hand-written shaders in HD and LW which are really not necessary. This is making shader conversions from legacy to HD/LW harder than it should be. I guess it is too late to change that now, but in the future, can the teams please communicate a bit more in order to keep things standard? That would help us a lot. Thanks.
     
    Rewaken, kelloh, hippocoder and 3 others like this.
  17. JJJohan

    JJJohan

    Joined:
    Mar 18, 2016
    Posts:
    208
    I've got a question about the possibility of drawing both color and depth output in a single pass and whether or not this is feasible with the scriptable render pipeline. Currently my application renders point cloud data, and it is desirable for the user to load quite high density data. This performs well, however there is a caveat.

    In order to improve the visual outcome to the user we also use a screen effect to show the depth differences between each point. This works well, but means we have to draw everything in two passes.

    In a shortened pseudo version:
    Code (csharp):
    Pass 1
    {
        vertex shader
        geometry shader
        pixel shader
    }

    Pass 2
    {
        Tags { "LightMode" = "ShadowCaster" }
        ColorMask 0

        vertex shader
        geometry shader
        pixel shader
    }
    Without the separate ShadowCaster pass the depth simply isn't output and never ends up in the image effect. The problem is this essentially results in drawing e.g. 6 million vertices instead of 3 million. Is this something that can be achieved with the scriptable render pipeline? I've tried to do some research but did not find much on the topic of combining color and depth output into one pass.
     
  18. drcrck

    drcrck

    Joined:
    May 23, 2017
    Posts:
    279
    How to make Light.AddCommandBuffer work in HDRP?
     
  19. antey3064

    antey3064

    Joined:
    Mar 21, 2016
    Posts:
    15
    For the last three weeks, HDRP downloaded from GitHub has not worked.
    I'm using Unity 2019.1.0b3, in a new project.
    I copy the SRP files downloaded from GitHub into the "Packages" folder, all except:
    Tools,
    TestProjects,
    com.unity.testing.srp.lightweight,
    com.unity.testframework.graphics,
    com.unity.render-pipelines.lightweight.

    How to fix this error?

    Packages\com.unity.render-pipelines.high-definition\Runtime\RenderPipeline\HDRenderPipelineAsset.cs(203,32): error CS0115: 'HDRenderPipelineAsset.terrainDetailLitShader': no suitable method found to override
    Packages\com.unity.render-pipelines.high-definition\Runtime\RenderPipeline\HDRenderPipelineAsset.cs(211,32): error CS0115: 'HDRenderPipelineAsset.terrainDetailGrassShader': no suitable method found to override
    Packages\com.unity.render-pipelines.high-definition\Runtime\RenderPipeline\HDRenderPipelineAsset.cs(219,32): error CS0115: 'HDRenderPipelineAsset.terrainDetailGrassBillboardShader': no suitable method found to override
    Packages\com.unity.visualeffectgraph\Editor\PackageInfo.cs(20,80): error CS0117: 'PackageInfo' does not contain a definition for 'GetAll'
     

    Attached Files:

  20. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,191
    2019.1 requires a 5.x.x HDRP; the GitHub master is at 6.x.x and works only in 2019.2. You can grab 5.4.0 from GitHub today and it will work on 2019.1, or if you want to try the WIP branch, get https://github.com/Unity-Technologies/ScriptableRenderPipeline/tree/release/2019.1 (but be aware this isn't always stable).
     
    antey3064 likes this.
  21. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    458
    Hi, you can't. HDRP doesn't use this mechanism; the lighting architecture is different.
     
  22. drcrck

    drcrck

    Joined:
    May 23, 2017
    Posts:
    279
    How to access shadowmaps in HDRP?
     
  23. Ruchir

    Ruchir

    Joined:
    May 26, 2015
    Posts:
    103
    Is HDRP getting support for ray tracing and voxelized shadow maps? :)
    I saw them in the GitHub repositories. What can you tell us about them? :)
     
  24. Tartiflette

    Tartiflette

    Joined:
    Apr 10, 2015
    Posts:
    59
    Looks like it. I suspect they can't say anything about it just yet because that will be one of their big reveals at GDC (which is in 3 weeks).
     
    AlanMattano and hippocoder like this.
  25. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,828
    Wink wink, nudge nudge, say no more.
     
  26. iriguchi

    iriguchi

    Joined:
    Feb 1, 2016
    Posts:
    30
    Unity 2019.1.0b4

    How do I update a Reflection Probe via scripting in HDRP?
    I'd like to update the Reflection Probe on demand for performance reasons; realtime update is too heavy to use.

    Here is the Reflection Probe Inspector screenshot in the standard render pipeline.
    I can see the "Via scripting" option.
    RenderProbe() works fine!!
    rp_std.png

    But after installing HDRP (5.3.1),
    the dropdown item "Via scripting" is gone from the Inspector,
    and RenderProbe() does NOT work......
    rp_hdrp.png
     
  27. ekakiya

    ekakiya

    Joined:
    Jul 25, 2011
    Posts:
    22
    Hi,
    I'm trying to get values from Volume objects at the position of a target object, not at the camera position,
    like "VolumeManager.instance.Update(targetObject, postProcessLayer);".

    Is there a recommended way to get the transform of an object in the current scene from a RenderPipeline script?
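    In case it helps future readers: my reading of the Core RP Volume code is that VolumeManager.instance.Update accepts an arbitrary trigger Transform, so something like the following might work. This is an unverified sketch, and the exact overloads vary between package versions:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Unverified sketch: VolumeManager.Update takes a trigger Transform, so
// passing a target object instead of the camera should evaluate the
// volume blend at that object's position. API details vary by
// Core RP package version.
public class VolumeProbe : MonoBehaviour
{
    public LayerMask volumeLayerMask = ~0;

    void Update()
    {
        VolumeManager.instance.Update(transform, volumeLayerMask);
    }
}
```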
     
  28. KYL3R

    KYL3R

    Joined:
    Nov 16, 2012
    Posts:
    41
    I can't get volumetric lighting to work in HDRP (2018.3.6f1). I have all of this activated:
    https://github.com/Unity-Technologies/ScriptableRenderPipeline/wiki/Volumetric-lighting

    And I set up:
    a global Volume,
    Experimental Fog / Volumetric Fog,
    a Volumetric Lighting Controller,
    Volumetric Fog,
    a Point Light with a "Density Volume" component including a Texture3D, with Volumetrics set to "enable".

    I played with all the sliders, but it won't work.

    Here are some screenshots on imgur
    https://imgur.com/a/CpJ7ERi

    And one picture for the forum :)


    Has anyone got it working? Any tips, a tutorial / documentation link?
     
  29. Onigiri

    Onigiri

    Joined:
    Aug 10, 2014
    Posts:
    88
    @KYL3R Visual Environment > Fog Type > Volumetric Fog, and enable the Volumetric Fog component.
     
    KYL3R likes this.
  30. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    572
    Change Exponential fog to Volumetric Fog under the Visual Environment tab.
    upload_2019-3-2_9-3-56.png
     
    KYL3R likes this.
  31. KYL3R

    KYL3R

    Joined:
    Nov 16, 2012
    Posts:
    41
    Aaah, thanks a lot. Working now. Simple solution, but the whole workflow seems a bit complicated, like everything in SRP. But once you know how, it's pretty clear. Probably just a side effect of the flexibility.

     
    Grimreaper358 likes this.
  32. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,828
    It is not complicated; it's merely not the same as what you have already been doing. In addition, the reason for a lot of this is so you can have moodboxes, or volumes that control every aspect of your game from any part, automatically...

    So you might want exponential fog in one area and volumetric in another, or adjust shadow quality per area; it makes that trivial and simple!

    The problem is that this early on, there aren't many tutorials :)
     
  33. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    827
    Could you add an option for a gradient-color scattering albedo for volumetric fog?
     
  34. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,828
    What would it do?
     
  35. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    827
    It would add a gradient color to the fog.
     
    Lars-Steenhoff likes this.
  36. Danua

    Danua

    Joined:
    Feb 20, 2015
    Posts:
    192
    Yes, like the stylistic fog from the Bitbucket repository.
     
    Lars-Steenhoff likes this.
  37. Danua

    Danua

    Joined:
    Feb 20, 2015
    Posts:
    192
    It's all about making it possible to set the color separately for different distances.
    As we know from real life, fog isn't just a single color; it changes with distance. It's called aerial perspective!
    At very far distances the fog is almost white,
    far away it has a more bluish shade,
    closer it is light blue,
    and so on.
     

    Attached Files:

  38. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    572
    That's what atmospheric scattering does. This seems to be more for stylized rendering than anything else.
     
  39. KYL3R

    KYL3R

    Joined:
    Nov 16, 2012
    Posts:
    41
    Hey. I got my hands dirty with the LayeredLit shader in HDRP. It feels weird.

    This is what I ended up with after a LOT of fiddling.


    It looks okay, you may think. But while painting, it doesn't feel like it fills the crevices first, and when it does, the result is very transparent. When I paint more, it already leaks onto the stones' surface too much.

    And it only works when I use the height map of the stones as the Layer Mask Texture. I don't understand why I need that when I just want to blend the layers using their height maps and vertex color.

    The "Height Transition" slider is all or nothing: 0.001 to 0.007 shows small differences, but the rest is like "nope, no changes here". (Same feeling when using the slider for Subsurface Scattering; Index of Refraction, by the way.)

    Here are more images: https://imgur.com/a/ZAryaDz

    So, can anyone point out a mistake I am making? Or is this not intended to heightblend stones, grass and sand?
     
  40. CarlLee

    CarlLee

    Joined:
    Mar 4, 2015
    Posts:
    8
    Hi, I have 4 problems to report about LWRP, using 4.10.0:
    1. There's no checkbox anywhere to turn on soft particles (it used to be in Quality Settings). I ended up setting the SOFT_PARTICLES_ON macro in the shader manually.
    2. Soft particles don't work with the LWRP particle shader. They worked after I changed these lines (ShaderLibrary/Particles.hlsl, line 66):

    Code (CSharp):
    // Fading vertex function
    #if defined(SOFTPARTICLES_ON) || defined(_FADING_ON)
    #define vertFading(o, positionWS, positionCS) \
        o.projectedPosition = ComputeScreenPos(positionCS);\
        o.projectedPosition.z = -TransformWorldToView(positionWS.xyz).z
    #else
    #define vertFading(o, positionWS, positionCS)
    #endif
    3. Depth textures don't work if I only turn "Depth Texture" on in the LWRP asset and leave "Opaque Texture" off.
    4. All shaders have compile errors when I upgrade LWRP (from 4.9.0 to 4.10.0); I have to delete the whole Library folder and reopen Unity to make them work again.

    I reported 1, 2, and 3 from the bug reporter inside Unity too.

    Here are screenshots of the depth texture problem:

    https://imgur.com/a/fv8YviN

    I tried to create a depth prepass myself to work around this. The best injection point would be IBeforeRenderPass, but that runs before SetupForwardRenderingPass, so variables are not yet set up for rendering. There's no way to fix this without changing the source code of LWRP.

    If I turn both the "Depth Texture" and "Opaque Texture" options on, the frame rate drops from 60 to 40 instantly; there's too much overhead for those two textures. I captured API calls with RenderDoc and found out that Unity was rendering to an FBO (let's call it A) with a renderbuffer set as the depth buffer. If I enable "Depth Texture" and "Opaque Texture", Unity creates another FBO (let's call it B) with a texture2d as the depth buffer. After the opaque pass, the color texture and depth texture are copied to another two textures. After all rendering is done, B is blit to A and A is blit to the backbuffer. That's just too many render-target switches and copies; it costs 7 ms for an empty scene on a Snapdragon 636 device.

    Why can't we just set a texture2d as the depth buffer for A, and copy it to a separate texture when the opaque rendering is done? I didn't find an easy way to do this with LWRP.
     
    Last edited: Mar 6, 2019
  41. CarlLee

    CarlLee

    Joined:
    Mar 4, 2015
    Posts:
    8
    Another problem: why expose the IRenderSetup interface while leaving no way to inherit from DefaultRenderSetup?

    Also, what's the best way to change bits and pieces of LWRP and still get updates when you update the code base on GitHub?

    How can I set up IDE integration when I want to change LWRP's source code? It's very hard to write without code completion.
     
    Last edited: Mar 6, 2019
  42. fherbst

    fherbst

    Joined:
    Jun 24, 2012
    Posts:
    359
    I submitted a couple of bugs regarding cubemap rendering in LWRP. However, some of them, about stereo cubemap rendering being broken, just get closed with the message "this feature is not supported, thus this is not a bug".
    I can't find any information about that: neither the LWRP nor the HDRP documentation nor the GitHub repo contains any information about which features are supposed to work, which are deliberately omitted, and which are known to be currently broken.

    With LWRP out of preview, I find this very disturbing, as it means that people (like us with the cubemap rendering) might stumble across those issues halfway through development.

    Am I missing something? Is there a clear list of supported/unsupported features somewhere?

    Also, @Tim-C the documentation link in your first post is broken.
     
  43. ekakiya

    ekakiya

    Joined:
    Jul 25, 2011
    Posts:
    22
    If the system won't use them, please open up the remaining float3 of unity_RenderingLayer for general-purpose use.

    That UnityPerDraw buffer is valuable for a custom SRP with the SRP Batcher (without ECS?), so if I could set that float3 value on a mesh component from script, it could be used for many purposes:
    variation per object, passing the object's volume to scale detail-map repetition and other actual-size values baked into the asset, and so on.
     
  44. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    827
    How do I get an emissive material in SRP?
     
  45. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,828
    It's on the shader?
     
  46. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    827
    What do you mean, on the shader?

    If I use an emissive material and put it on an object, it still has shadows. But emissive materials shouldn't have shadows.

    Also, I don't know whether I should use the albedo color together with the emissive color, or set it to white or black.
     
  47. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,828
    You need to turn shadow casting off on the mesh. Emissive objects do not receive shadows.
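    If you want to do that from script rather than in the Inspector, it's a one-liner on the renderer (standard Unity API):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Turn off shadow casting on an emissive object's renderer from script;
// same effect as the "Cast Shadows: Off" setting in the Inspector.
public class DisableShadowCasting : MonoBehaviour
{
    void Start()
    {
        GetComponent<MeshRenderer>().shadowCastingMode = ShadowCastingMode.Off;
    }
}
```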
     
  48. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    827
    What does this mean?

    Code (CSharp):
    Recursive rendering is not supported in SRP (are you calling Camera.Render from within a render pipeline?).
    The targets array should not be used inside OnSceneGUI or OnPreviewGUI. Use the single target property instead.
    UnityEditor.Experimental.Rendering.HDPipeline.HDLightEditor:OnEnable()
    The serializedObject should not be used inside OnSceneGUI or OnPreviewGUI. Use the target property directly instead.
    UnityEditor.TransformInspector:OnEnable()
     
  49. vrycue_unity

    vrycue_unity

    Joined:
    Mar 12, 2019
    Posts:
    2
    I am having this issue using LWRP

    Screenshot_20190312-200301.jpg

    It should look like the image below

    2019_03_10_00.18.45.jpg

    The details for the device which the issue occurred is on the image below

    Screenshot_2019-03-13-07-23-27.jpg
     
  50. ReadyPlayGames

    ReadyPlayGames

    Joined:
    Jan 24, 2015
    Posts:
    45
    I'm reading through this: srp-batcher-speed-up-your-rendering, and I see the "UnityPerDraw" cbuffer, but whenever I put any of its items into the cbuffer myself (say, unity_WorldToObject) I get an error telling me that it's already been declared. What am I doing wrong?