Will new HD pipeline be new quality in rendering?

Discussion in '2018.1 Beta' started by kubawich, Jan 13, 2018.

  1. kubawich

    kubawich

    Joined:
    Mar 2, 2014
    Posts:
    11
    I'm curious about the new HD pipeline which, as mentioned, will be announced soon. I haven't had the chance to try the new rendering system yet, but how is it structured? Is it set up so that the Lightweight pipeline targets mobile-level quality and HD matches the quality of the previous renderer, or does Lightweight render much like older versions while HD goes even more high end, like UE? Regards.
     
  2. Peter77

    Peter77

    Joined:
    Jun 12, 2013
    Posts:
    4,190
  3. richardkettlewell

    richardkettlewell

    Unity Technologies

    Joined:
    Sep 9, 2015
    Posts:
    1,207
    There is a demo video at the end of the Unite Europe keynote too:
     
    Last edited: Jan 13, 2018
    Peter77 likes this.
  4. KarolisO

    KarolisO

    Joined:
    Feb 2, 2014
    Posts:
    28
    The demo at the end of the Unite Europe 2017 keynote left me more confused. I couldn't tell which new features or improvements the HD pipeline is supposed to show off.
     
    Last edited: Jan 13, 2018
  5. richardkettlewell

    richardkettlewell

    Unity Technologies

    Joined:
    Sep 9, 2015
    Posts:
    1,207
  6. KarolisO

    KarolisO

    Joined:
    Feb 2, 2014
    Posts:
    28
    richardkettlewell likes this.
  7. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    Hi,

    I am leading the team working on the HD RenderPipeline.
    Please disregard any of the documentation linked above; it is obsolete and no longer reflects the current state of HD.

    Keep in mind that HD is in heavy development and things are still changing fast. There is no ETA yet for this feature, but changes will slow down for the 2018.1 release. We will provide documentation for the release, but until then there will be neither support nor documentation. As we are a small team working on this feature, several features that are "available everywhere in the engine" are still unsupported or incomplete.

    HD targets compute-shader-capable platforms and focuses first on PC/PS4/Xbox One. Metal is supported but still has a few artifacts; Vulkan will be supported in the future.
    HD is different from Lightweight in many ways; Lightweight targets all platforms. If you aim to develop a cross-platform project, use Lightweight. Mixing pipelines is not possible.

    Here is an exhaustive list of the current state of development of HD; I hope it helps answer some of your questions.


    General
    • RenderPipeline resources with reference to all compute/shaders
      • Bypass Resources folder

      • Users don’t need to add mandatory shaders to the Always Included Shaders list

      • Only resources for used pipelines are loaded
    • Set up per-frame, per-camera and per-render-pass constant update frequency in C# (done for camera, except for the shadow matrix)

    • Camera relative rendering (to have better precision)

    • PS4 / PC DX11 support, Xbox One (almost), Metal (almost)

    • Prepass option
      • Optional Prepass in deferred

      • Always present Prepass for forward opaque (in deferred and forward)

      • When doing a prepass with alpha-tested objects, the GBuffer and forward passes skip the alpha testing (saves GPU time)
    • Generic volume hierarchy, similar to the post-process volumes, to handle various scene settings
    Debug
    • Debug window that handles various debug view modes (available in the editor and at runtime)
      • Lighting debug mode: Can display diffuse lighting only, specular lighting only, shadow map, sky cubemap
        • Can display the shadow of the selected light in the editor
      • Material debug mode: Display any property of a given master shader, either deferred or forward

      • Mipmap debug mode

      • Property debug mode: Display meshes that use a given feature, like POM or tessellation

      • Intermediate buffer: Can display intermediate AO buffer, motion vectors

      • NaN checker (currently inverted in the Scene view)

      • Light tile / material classification debug mode

      • Display opaque/transparent only

      • SSS toggle

      • Fog toggle
    Lighting
    • New light type
      • Rectangular area light (no shadow) - high cost

      • Line light (no shadow) - high cost

      • SpotLight shapes: Cone, Box (Box is what people call a local directional light) and Pyramid projector
        • Box doesn't support orthographic shadows for now

        • Pyramid supports perspective shadows

        • Pyramid and Box have no edge attenuation; this is controlled by the cookie
      • Colored cookie textures on directional, point and spot lights
        • Repeat and clamp mode for directional

        • Clamp only otherwise
      • Reflection probes with improvement
        • Support OBB and sphere shape

        • Decoupled influence volume

        • Various fading options: per face, normal-oriented
      • MaxSmoothness property for point/spot light to fake sphere area light

      • No-attenuation option (for indoor lighting)

      • Affect Diffuse / Affect Specular options (don't save any cost)
    • Light architecture
      • Fptl/cluster forward renderer

      • Fptl/cluster deferred renderer

      • Material/light classification for the deferred case

      • Baked lighting and emissive stored together (this may evolve in the future; it has limitations when applying ambient occlusion in deferred, for example)

      • Reflection and refraction hierarchy: blends correctly between all reflection/refraction types

      • Feature parity between the deferred and forward renderers (Decals, SSSSS)
    • Sky
      • Sky manager allowing different sky types (WIP)
        • Designed with dynamic time of day in mind
          • The sky is rendered into a cubemap when it changes and a GPU convolution is done

          • Currently the cubemap is set up in a sky material for the vanilla Unity system

          • Currently we can’t do dynamic time of day, as Enlighten does a synchronized readback of textures which syncs with the GPU (Edit: need to use the new async readback feature)

          • Then the sky is rendered normally in the background
      • Realtime GPU cubemap convolution
        • Reflection probes use GPU convolution at runtime
          • Allows different sizes for different platforms

          • CPU convolution is still enabled but is useless (we need to disable it)

          • For real-time cubemaps: the GPU Unity convolution still executes in addition to the correct HD convolution.
        • Note: the GPU convolution we do is fast and uses importance sampling; it is not a good fit for an HDRI with a sun (the sun should be analytic).
          • Provide a multiple importance sampling option for HDRI with a sun on the GPU (WIP)
      • HDRI sky

      • Procedural sky (same as vanilla Unity)

      • Fog/atmospheric scattering is supported on transparents.
    • Shadows:
      • Shadowmask and Distance Shadowmask features of Unity are supported

      • Deferred shadows for opaque (for GPU performance)

      • Support for various filtering algorithms: MSM, VSM, EVSM, PCF, tent PCF
        • Control is done via code
      • Quality/performance is still a concern (quality due to the various bias controls)
        • Need a scalable way to expose resolution (low, medium, high)

    • Lighting Postprocess
      • Ambient occlusion (from the post-processing stack V2: MSVO)
        • SSAO is applied during the lighting pass on the lighting buffer (which includes GI * albedo * AO + emissive, meaning double occlusion and AO on emissive, which are barely visible)

        • SSAO can be applied to direct lighting with a percentage factor

        • Works in both forward and deferred
    • GI
      • Hack to support alpha maps and transmissive for PVR (a custom transmissive map should be supported if the user adds it)

      • GI is not matching correctly (PI factor discrepancy for direct lighting)
    Material
    • Lit
      • Opaque/Transparent
        • Blend mode for transparent: Alpha, Add, Premultiplied alpha

        • Compatible with Fog

        • Blend-mode specular lighting: better handling of specular lighting with transparent blend modes
      • Two-sided lighting (automatically flips or mirrors the normal + disables backface culling) - To check: flipped normals may not work correctly with multiple layers and the surface gradient framework with UV1-3.
        • DoubleSidedGI is automatically coupled to two-sided lighting
      • Object space (OS) and tangent space (TS) normal map

      • Parallax occlusion mapping (POM)
        • Can adapt to object scale and tiling
      • Vertex displacement map (can be applied with or without tessellation)
        • Can adapt to object scale and tiling
      • Tessellation (Phong)

      • Detail map (Smoothness, Albedo, normal)
        • Tiling of detail map can inherit from tiling of base
      • Surface gradient framework (allows a correct tangent basis for UV sets other than UV0). Normal mapping in tangent space relies on a tangent basis calculated from the UV set. In Unity, a tangent basis is only available for UV0, which means that if other UV sets are used for a normal map (as is often the case for detail normal maps), the result is incorrect. The surface gradient framework generates a tangent basis on the fly for UV sets other than UV0 and gives correct normal mapping results.

      • UV0-UV3 mapping, Planar and Triplanar mapping (compatible with POM and normal map OS or TS)
        • UV1 is still reserved for the static lightmap (vanilla behavior)

        • UV2 is still reserved for the dynamic lightmap (vanilla behavior)
      • Bent normal used to fetch lightmap/light probe (provides better result than using normal map)

      • Emissive color/mask that can be affected by albedo or not

      • Standard
        • GGX for specular with multi scattering support for metallic + Burley Disney for diffuse
      • Anisotropy
        • Anisotropic GGX with multiple-scattering support for metallic

        • The parameter goes from -1 to 1 to support anisotropy along the tangent and along the bitangent
      • Subsurface scattering
        • GGX for specular + Diffusion profile + Disney diffuse approach

        • Control of SSS via a diffusion profile (SubsurfaceScattering Profile asset)

        • Jimenez style diffusion or approximate normalized Disney diffusion
          • Disney contains single scattering and is more accurate; the values can also be taken from measured materials. It is a bit more costly than Jimenez.
        • Thickness map for transmittance
          • Transmittance handles shadows only for thin objects; a hack is used for thick objects
      • Clear coat (will evolve a bit in the future)

      • Rough refraction option for transparent material
        • Supports 2 modes (plane and sphere) with thickness

        • Absorption

        • Index of refraction

        • Transparents can be rendered before the rough refraction pass (but that breaks the sorting) so they are visible in rough refraction
      • Distortion (distortion is based on a distortion vector and targets artistic effects, unlike rough refraction which is physical)

      • GI
        • Two-sided lighting is linked with double-sided global illumination

        • The meta pass correctly handles vertex color and UVs (but does not support planar/triplanar yet)
      • Transparent-only sorting helpers
        • (not exposed) Support for a depth prepass for transparent objects (to help with sorting issues)

        • Support for back-face-then-front-face rendering to improve sorting issues

        • Support for a depth post-pass to solve issues with post-processing (DOF/MB)

    • Layered Lit
      • Two to four layers (a layer being a Lit material without anisotropy, SSS or clear coat)

      • Opaque/Transparent (transparent may not behave correctly yet)

      • Two-sided lighting (automatically flips or mirrors the normal + disables backface culling)

      • Parallax occlusion mapping (POM) (does not work currently; it changed a few times and is very constrained, e.g. all layers must use the same UV mapping)
        • Can adapt to object scale and tiling
      • Vertex displacement map (can be applied with or without tessellation)
        • Can adapt to object scale and tiling
      • Tessellation (Phong)

      • UV0-UV3 mapping, Planar and Triplanar mapping for all layers

      • Blending based on
        • Blend mask
          • Separate UV mapping and tiling
        • Vertex color (two modes: additive, multiplicative)

        • Heightmap (optional)
      • Optional influence mode of the Main layer
        • Influence mask layer for fine control

        • Currently the diffuse/albedo and heightmap of a layer can be modified (influenced) by the Main layer
      • Optional density mode
        • Alpha in diffuse controls a threshold to make layers appear (Giving the illusion that we control the density of a particular material, like small pines)
      • 2 buttons to synchronize the properties of a given material onto a given layer
        • One to synchronize all properties

        • One to synchronize all properties except UV mapping and tiling
      • Tiling can take the object scale into account (optional), i.e. the tiling will scale with the object scale
    UI
    • HD Light editor

    • HD Reflection probe editor

    • HD camera editor

    • SubsurfaceScattering Profile asset
      • Dedicated to subsurface materials (there is a hard constraint here that prevents us from doing what we want: the index of a subsurface material needs to be fixed).
    Camera/Postprocess
    • Postprocess stack V2 support
      • All, including TAA and SSAO, except SSR (which is not a post-process but a lighting feature)
    • Motion vectors: support for skinned motion vectors, and adds the concept of a MotionVectors pass



      Initial setup

      Before running HD, it is necessary to set up the quality settings correctly (a C# sketch of these steps follows the list)
      • Set anti-aliasing to Disabled; this is handled by HD. If MSAA is enabled, the Scene view grid will be overlaid on the scene.

      • Set the Linear color space

      • Be careful: all cameras should have allowHDR disabled (yes, disabled) for better precision. The flag causes two problems: banding, and editor icons being incorrectly depth tested.
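
      A minimal editor sketch of those setup steps; the menu path and class name are illustrative, not an official utility:

      Code (CSharp):
      using UnityEditor;
      using UnityEngine;

      public static class HDInitialSetup
      {
          // Illustrative helper that applies the setup steps listed above.
          [MenuItem("Tools/Apply HD Initial Setup")]
          static void Apply()
          {
              // Disable built-in MSAA; anti-aliasing is handled by HD itself.
              QualitySettings.antiAliasing = 0;

              // Use the Linear color space.
              PlayerSettings.colorSpace = ColorSpace.Linear;

              // Disable allowHDR on every camera in the open scenes for better precision.
              foreach (var cam in Object.FindObjectsOfType<Camera>())
                  cam.allowHDR = false;
          }
      }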
     
    Last edited: Jan 15, 2018
  8. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    We have also fixed various incorrect behaviors of the built-in render pipeline to get correct lighting.
    These divergences are listed here:

    • HD relies on a metric system: 1 unit == 1 m. This scale needs to be respected for lighting to work correctly.

    • Better pre-integration of cubemaps with the GGX BRDF. Vanilla Unity pre-integrates the cubemap with the NDF, then applies a tweaked Fresnel term with roughness. HD pre-integrates the DFG term and applies it to the cubemap pre-integrated with the NDF. This better matches the reference (the reference being brute-force integration with the full BRDF (DFG)).

    • Pre-integration of GI (lightmap/light probe) with Disney diffuse. Vanilla Unity does nothing. HD applies the pre-integrated DFG term in a post step. This better matches the reference (the reference being brute-force integration with the full BRDF (DFG)).
    • HD light attenuation is inverse-square falloff and uses linear intensity. There is a smooth terminator function to force the attenuation to 0 at the range boundary, and there is also an option to not apply the attenuation at all. Vanilla Unity uses a texture to store the attenuation with a special falloff formula and uses gamma intensity. Going from gamma intensity to linear intensity is done with linear_intensity = gamma_intensity^0.4545 (see the sketch after this list). Note: as the attenuation curves differ, even using this formula there is no guarantee of a match.

    • HD spot light attenuation uses 2 angles (inner and outer) to control the spot attenuation. Vanilla Unity uses only one.

    • HD correctly divides the whole BRDF by PI. Vanilla Unity has an inconsistency in the BRDF between specular and diffuse: specular is divided by PI but diffuse is not. This means that whatever effort is made to match lighting between vanilla and HD, one of the components will differ by a factor of PI.

    • HD interprets the influence parameter for reflection probes in a more artist-friendly way. Influence is inner influence (meaning the transition happens inside the volume). Vanilla uses outer influence. This was switched because artists prefer to set up their volume and then simply tweak the transition size. With outer influence, they need to update the volume size when tweaking the transition size (indoors).

    • HD uses camera-relative rendering. Vanilla Unity doesn't support it. This means that lights/objects sent to shaders in HD have a different translation.

    • Metallic/smoothness and specular/smoothness are handled inside the same Lit material

    • The additive blend mode applies opacity, unlike the Standard shader

    • DoubleSidedGI is automatically coupled to the two-sided lighting flag and is not exposed in the UI

    • Motion vectors work as in vanilla Unity (skinned or rigidly transformed objects render motion vectors), but in addition, in HDRP, if a shader has an enabled MotionVectors pass (velocity pass), it renders into the buffer independently of movement or skinning (so it handles vertex animation).

    Important:

    • HD uses camera-relative rendering
    • The builtin Terrain is not supported by HD
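
    A small sketch of the intensity conversion and the inverse-square falloff described above; the windowing term here is illustrative, not the exact HD implementation:

    Code (CSharp):
    using UnityEngine;

    public static class HDLightConversionSketch
    {
        // linear_intensity = gamma_intensity^0.4545, as quoted above.
        public static float GammaToLinearIntensity(float gammaIntensity)
        {
            return Mathf.Pow(gammaIntensity, 0.4545f);
        }

        // Inverse-square falloff with a simple smooth window that reaches 0 at the
        // range boundary. The exact HD windowing function may differ.
        public static float InverseSquareAttenuation(float distance, float range)
        {
            float atten = 1.0f / Mathf.Max(distance * distance, 0.0001f);
            float window = Mathf.Clamp01(1.0f - Mathf.Pow(distance / range, 4.0f));
            return atten * window * window;
        }
    }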
     
    Last edited: Jan 15, 2018
  9. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    Keep in mind that this is the current state and the code base continues to evolve.
     
    Llockham-Industries likes this.
  10. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,186
    This brings up a few questions. So technically, you can't swap the pipeline at runtime (or with a game restart after a config change)? Or do you mean the pipelines require such different setups that content isn't going to look good on both at the same time?

    Curious about this, as I thought swapping the pipeline via a game config setting would have been possible: a Lightweight setup for low end, a VR setup for VR and an HD setup for regular high-end desktop, but apparently you can't do this?

    Especially the VR and HD mix would be beneficial for games that are not strictly VR-only but have an optional VR mode built in.
     
  11. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    >This brings up a few questions. So technically, you can't swap the pipeline at runtime (or with a game restart after a config change)?

    No mixing of render pipelines is allowed by design (technically you can do it, but everything will break). You MUST use the same render pipeline for your whole project.
    Both Lightweight and HD will support VR; no separate render pipeline for VR is planned.

    Note: switching from one render pipeline to another is only supported in the editor, and requires running an upgrade script to convert the data, otherwise it will not work (the only exception is if your render pipeline is a fork of an existing one; then, depending on your modifications, switching may not require any upgrade).
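
    For context, the active pipeline is simply the asset referenced in Graphics Settings. A hedged editor-only sketch of pointing it at another asset follows; the asset path is hypothetical, the SRP namespaces moved out of Experimental in later releases, and upgrading materials/lights remains a separate step via the upgrade scripts:

    Code (CSharp):
    using UnityEditor;
    using UnityEngine;
    using UnityEngine.Rendering;

    public static class SwitchPipelineAssetExample
    {
        [MenuItem("Tools/Use HD Pipeline Asset")]
        static void UseHDAsset()
        {
            // Hypothetical asset path; create the pipeline asset in your project first.
            var asset = AssetDatabase.LoadAssetAtPath<RenderPipelineAsset>(
                "Assets/Settings/HDRenderPipelineAsset.asset");

            // Assign it as the active pipeline. Existing content still needs to be
            // converted with the provided upgrade scripts to render correctly.
            GraphicsSettings.renderPipelineAsset = asset;
        }
    }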
     
  12. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    2,203
    So what about the use case of making a mobile game that targets two classes of devices, low end and high end?

    For example, the low end would use Lightweight, and the high-end devices (Apple A10 and up, with Metal and compute support) would use the HD pipeline.

    Normally I would use a device detection script and switch shader complexity and level detail based on the device.
    I don't see how this can be done if switching is only allowed in the editor.
     
    Last edited: Jan 15, 2018
  13. WilkerLucio

    WilkerLucio

    Joined:
    May 2, 2017
    Posts:
    18
    From what I understand so far, if your project targets only high-end platforms, then you should use the HD pipeline. In this case, which targets multiple platforms with different capabilities at once, you'll have to use the LW pipeline because it's the more multi-purpose type.
    If that's not enough and you want to go fancy, you can write, customise or blend the two pipelines into one super-device-detecting-pipeline Frankenstein monster.
     
    hippocoder likes this.
  14. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,186
    Thanks for the answer; unfortunately that was what I suspected. I'm guessing there's no practical way to "fix" this? I mean, this does take away a lot of potential from the system, as we could target a much wider range of platforms if we could have a "low end" setup as a separate option. I'm guessing if we did that now, it would mean directly doubling the build size, as you'd get every asset twice when you build for both pipelines...
     
  15. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    >So what about the use case of making a mobile game that targets two classes of devices, low end and high end?

    >I mean, this does take away a lot of potential from the system, as we could target a much wider range of platforms if we could have a "low end" setup as a separate option

    @WilkerLucio summed it up correctly. If you go cross-platform, use Lightweight. It supports all platforms, including the Apple A10. Lightweight doesn't mean a mobile render pipeline; Lightweight means efficiency on all platforms (but it currently has a restricted set of features that will grow with time).

    >Normally I would use a device detection script and switch shader complexity and level detail based on the device.

    You can still do that with Lightweight (see the sketch below).

    HD assumes compute-shader-capable devices (and makes heavy use of them) to unleash the power of modern APIs/hardware. It can't go below this minimum configuration.

    >I'm guessing if we did that now, it would mean directly doubling the build size, as you'd get every asset twice when you build for both pipelines

    As stated above, we can't have two render pipelines by design. A project uses only one.
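
    A minimal sketch of that kind of device detection while staying on a single pipeline, expressing the differences through quality levels; the thresholds and level indices are illustrative:

    Code (CSharp):
    using UnityEngine;

    public class DeviceQualitySelector : MonoBehaviour
    {
        void Awake()
        {
            // Illustrative capability test for a "high end" device.
            bool highEnd = SystemInfo.supportsComputeShaders
                           && SystemInfo.graphicsMemorySize >= 2048;

            // Quality levels are defined in Project Settings > Quality;
            // indices 0 and 2 are placeholders for "low" and "high" here.
            QualitySettings.SetQualityLevel(highEnd ? 2 : 0, true);
        }
    }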
     
  16. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    2,203
    If I still want to use compute shaders in Lightweight, is that possible? Or tessellation for mobile on Metal?

    This is what I mean with Apple A10 and up: tessellation and higher-end features on mobile.

    This will become much more common soon.
     
  17. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    >If I still want to use compute shaders in Lightweight, is that possible? Or tessellation for mobile on Metal?

    Yes, it will be possible as long as the platform supports it and you write your own shader for it.
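
    A small sketch of gating a hand-written compute shader on device support, independent of the pipeline; the shader asset and its "CSMain" kernel are hypothetical placeholders:

    Code (CSharp):
    using UnityEngine;

    public class OptionalComputeEffect : MonoBehaviour
    {
        public ComputeShader effectShader;  // assign a compute shader asset in the Inspector
        public RenderTexture target;        // must be created with enableRandomWrite = true

        void Update()
        {
            // Skip (or fall back to a fragment-shader path) on devices without compute support.
            if (!SystemInfo.supportsComputeShaders || effectShader == null || target == null)
                return;

            // Assumes 8x8 thread groups and target dimensions divisible by 8.
            int kernel = effectShader.FindKernel("CSMain");
            effectShader.SetTexture(kernel, "Result", target);
            effectShader.Dispatch(kernel, target.width / 8, target.height / 8, 1);
        }
    }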
     
  18. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    2,044
    In your list I see support for GI and reflection probes, and only one mention of light probes, in the context of bent normals. Are light probes still the only way to apply GI lighting to dynamic objects?
     
  19. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    Yes, we still rely heavily on the builtin Unity GI system (Enlighten, Progressive Lightmapper) for GI. There is no fully realtime GI. Bent normals allow improved sampling of GI at the cost of an additional bent normal map.
     
    Last edited: Jan 15, 2018
  20. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    2,044
    Gotcha.
    How about those reflection probes? It seems you're making quite a few modifications. Could you make them write SH9 values in the shaders, so they'd act like dynamic light probes? (So we wouldn't even have to prebake GI in 50% of situations.)
     
  21. superpig

    superpig

    Quis aedificabit ipsos aedificatores? Unity Technologies

    Joined:
    Jan 16, 2011
    Posts:
    4,200
    The way that I understand all this (and I'm not on the gfx team!) is that switching between pipelines is a change like switching between Gamma and Linear colour spaces. Your content has to be authored specifically for the pipeline you are using. Simply swapping pipeline assets around won't make anything look good without revisiting all of your textures, materials, lights, cameras, post processing, etc, and adjusting all of them to account for the change - at which point you are pretty much making and shipping two entirely separate versions of your game.

    It's not like just changing today's Quality Settings level - HDRP is not 'Lightweight but with more fancy lights enabled,' it's a totally different rendering strategy. @SebLagarde can correct me if I am wrong, but I believe what you can do is have multiple different variations of the same render pipeline, configured with different settings; that's how you would offer a different experience on high-end vs low-end devices, similar to today's Quality Settings system.
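
    To illustrate, a hedged sketch of swapping between two configurations of the same pipeline type at startup; the asset references and capability check are illustrative, and all shader variants used by both assets must be included in the build:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    public class PipelineVariantSelector : MonoBehaviour
    {
        // Two assets of the SAME pipeline type, e.g. two Lightweight configurations
        // with different shadow or MSAA settings, assigned in the Inspector.
        public RenderPipelineAsset lowEndVariant;
        public RenderPipelineAsset highEndVariant;

        void Awake()
        {
            bool highEnd = SystemInfo.graphicsMemorySize >= 2048;
            GraphicsSettings.renderPipelineAsset = highEnd ? highEndVariant : lowEndVariant;
        }
    }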
     
  22. Aurecon_Unity

    Aurecon_Unity

    Joined:
    Jul 6, 2011
    Posts:
    232
    Does this mean IES lighting profile support has been dropped from the HD pipeline?
     
  23. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    > I believe what you can do is have multiple different variations of the same render pipeline, configured with different settings;

    Exactly. Thanks for the clarification.

    >Could you make them write SH9 values in the shaders, so they'd act like dynamic light probes?
    Not planned.

    >Does this mean IES lighting profile support has been dropped from the HD pipeline?
    Every feature is on the roadmap :), but no ETA. Always a matter of time, resources and priorities.
     
    Last edited: Jan 16, 2018
    KarolisO likes this.
  24. Aurecon_Unity

    Aurecon_Unity

    Joined:
    Jul 6, 2011
    Posts:
    232
    Great, I hope Unity can see that prioritising IES support will bring feature parity with their competitors and really help those working in non-gaming applications!
     
    thylaxene and KarolisO like this.
  25. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    2,044
    Why does HD need 1 unit = 1 meter?
    Can't we just boost light intensity arbitrarily anymore, or will it melt the mesh?
     
  26. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,332
    It would be nice if the sky color (procedural sky), height fog color, multiple scattering (blurred fog) and atmospheric scattering (Rayleigh, Mie) were all integrated. There is currently no package on the Asset Store which has all these features, and mixing different packages can't be done due to compatibility issues.

    The main issue is that people forget that the blue color of the sky and the blue of mountains in the distance are caused by the same physics. Using blue fog is a bad hack which looks terrible, and matching the correct blue look of Rayleigh scattering on distant objects with the blue color of the sky is an ongoing issue.
     
    OCASM and laurentlavigne like this.
  27. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    >why does HD need 1 unit = 1 meter?
    >Can't just boost light intensity arbitrarily anymore or it'll melt the mesh?

    (1 unit = 1 meter): this is already the case in Unity; it is just that users don't know or don't care.
    Physics, for example, relies on this if you want more predictable results.
    For HD it is important for physical light units and light attenuation. Lighting is accurate/predictable only if the metric scale is respected. Of course, if you don't care about accuracy, photo-realism etc., there is no issue, just eye-ball everything. But for those who care, life will be way easier with such a metric.

    This is one of the design principles of HD: everything should be physically correct by default, and it should require effort to make it wrong. Compare that to the current Unity design: everything is independent and it requires effort to make it physically correct.

    Keep in mind that the Lightweight render pipeline is there to fit various needs and work on all platforms.
     
  28. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    >It would be nice if the sky color (procedural sky), height fog color, multiple scattering (blurred fog) and atmospheric scattering (Rayleigh, Mie) were all integrated.
    Every feature is on the roadmap :), but no ETA.

    Currently we haven't started work on the sky; we just have minimal features. A physical sky will come one day.
     
  29. GameDevCouple_I

    GameDevCouple_I

    Joined:
    Oct 5, 2013
    Posts:
    2,113
    Amazing work.

    Also, I would like to say that it is refreshing to have such an honest and in-depth update from Unity on the work being done; it really, really helps to hear this!

    I am so excited for the SRP!
     
  30. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    2,044
    You should do it, because then the shader could sample the environment diffuse at a lower cost than reflection probes and handle dynamic environments, unlike static light probes.
     
  31. ZeBraNS

    ZeBraNS

    Joined:
    Feb 21, 2015
    Posts:
    23
    Hi, I am looking at the release notes for the new 2018.1.0b3 and there are supposed to be templates for Lightweight, Lightweight VR, and High Definition.
    Earlier SebLagarde said that both Lightweight and High Definition will support VR. Has something changed?

    Some pros and cons of the new pipelines when using VR would also be nice. How are they better than forward and deferred, again in terms of VR (Vive/Oculus)?

    PS. Really excited about the new stuff you're preparing :)
     
  32. timsoret

    timsoret

    Joined:
    Apr 9, 2015
    Posts:
    13
    Some great features there; it's a huge upgrade for Unity, which was seriously lagging behind technically.

    Bonus points for
    + motion vectors for vertex animation
    + correct light values & falloff (please provide a way to customise the falloff for creative usages)
    + colored cookies
    + semi transparent shadows
    + upcoming volumetric stuff
    + cluster renderer

    The serious setback is that porting an HD pipeline game to Switch or iPad later will require a very serious effort & investment.
     
    OCASM and hippocoder like this.
  33. kite3h

    kite3h

    Joined:
    Aug 27, 2012
    Posts:
    93
    I think the HD SRP is better than the Unreal render pipeline.
    But I also think it is more difficult than the Unreal render pipeline.
     
  34. kite3h

    kite3h

    Joined:
    Aug 27, 2012
    Posts:
    93
    The HD SSS profile now has an 'index of refraction'.
    This makes it easier to match the skin's specular tone.
    But I wonder about the reasoning behind it. I understand the need for refinement of SSS in the BSDF, but what does the index mean?
     
  35. Peter77

    Peter77

    Joined:
    Jun 12, 2013
    Posts:
    4,190
    In optics, the refractive index or index of refraction of a material is a dimensionless number that describes how light propagates through that medium.

    https://en.wikipedia.org/wiki/Refractive_index
     
  36. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    >Earlier SebLagarde said that both Lightweight and High Definition will support VR. Has something changed?

    Yes, they will support it; VR is just not implemented yet on HD.
     
    Llockham-Industries likes this.
  37. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    >Bonus points for
    >+ motion vectors for vertex animation
    Planned, but through the shader graph only.

    >+ correct light values & falloff (please provide a way to customise the falloff for creative usages)
    Customization easily breaks GI, so just remapping in the shader is not enough (but you can do it if you want). We are working on a way to allow GI to take such modifications into account.

    >+ colored cookies
    Already supported.

    >+ semi transparent shadows
    Many technical challenges for this one; it requires C++ changes that we haven't started. Currently shadows render opaque objects only :(. No ETA.

    >+ upcoming volumetric stuff
    Lots of work still to do, no ETA.

    >+ cluster renderer
    Supported.
     
    timsoret likes this.
  38. Lex4art

    Lex4art

    Joined:
    Nov 17, 2012
    Posts:
    171
    For HD render it will be good to:
    1) Have a 32-bits-per-channel G-buffer for smooth reflections on curved meshes and other things (maybe this precision would be handy in some other areas too): https://fogbugz.unity3d.com/default.asp?797969_t9lr43sgif7hpa36 (if not as the default then at least as an option, like the R10G10B10A10 and FP16 switch we have now in the "Tier" settings).
    2) Have a 10+ bits-per-channel light cookie format - currently, even with a smooth gradient image there is a lot of noticeable pixelisation/banding.
    3) Have screen-space anti-aliasing applied while capturing reflection probes. As an option it would also be great to have supersampling there - in many cases it would allow using 2x lower resolution for reflection probes because of the increased sharpness and quality of the reflection texture.
    4) Have 16 bits per channel for normal map and height map textures (at least as an option, but it's a very essential thing - every baked normal map suffers from 8-bit precision -> seams on character necks, banding in reflections, etc).

    If something feels reasonable, please take it into account ).
     
    Last edited: Jan 19, 2018
  39. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    The good thing with SRP (and HD/LW) is that you can script it to fit the specific requirements of your specific project.
    HD is designed so that this kind of render target change is possible in C#.
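
    As an illustration of the kind of change that is scriptable, a hedged sketch of allocating a full-float render target from C#; this only demonstrates selecting a format in script, it is not how the HD G-buffer is actually declared:

    Code (CSharp):
    using UnityEngine;

    public static class HighPrecisionTargetSketch
    {
        // Allocate a 32-bit-per-channel (full float) color target with a 24-bit depth buffer.
        public static RenderTexture CreateFullFloatTarget(int width, int height)
        {
            var rt = new RenderTexture(width, height, 24, RenderTextureFormat.ARGBFloat)
            {
                name = "CustomFullFloatTarget"
            };
            rt.Create();
            return rt;
        }
    }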
     
    elbows likes this.
  40. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    2,203
    Just to check:
    in the coming Unity versions there will be 3 pipelines

    classic
    Lightweight
    High Definition

    correct?
     
  41. JakubSmaga

    JakubSmaga

    Joined:
    Aug 5, 2015
    Posts:
    416
    Correct.

    The current one, Lightweight & High Definition.
     
  42. superpig

    superpig

    Quis aedificabit ipsos aedificatores? Unity Technologies

    Joined:
    Jan 16, 2011
    Posts:
    4,200
    3 pipelines that we ship. You can, of course, add more yourself.
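
    For anyone curious what adding one yourself involves, a minimal hedged sketch of a custom scriptable render pipeline that only clears and draws the skybox; the class names are illustrative and the exact base-class API changed between preview releases (this follows the later public shape):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    // Asset that tells Unity how to create the pipeline instance.
    [CreateAssetMenu(menuName = "Rendering/Minimal Pipeline Asset")]
    public class MinimalPipelineAsset : RenderPipelineAsset
    {
        protected override RenderPipeline CreatePipeline() => new MinimalPipeline();
    }

    // The pipeline itself: clear each camera's target, then draw the skybox.
    public class MinimalPipeline : RenderPipeline
    {
        protected override void Render(ScriptableRenderContext context, Camera[] cameras)
        {
            foreach (var camera in cameras)
            {
                context.SetupCameraProperties(camera);

                var cmd = new CommandBuffer { name = "Clear" };
                cmd.ClearRenderTarget(true, true, camera.backgroundColor);
                context.ExecuteCommandBuffer(cmd);
                cmd.Release();

                if (camera.clearFlags == CameraClearFlags.Skybox)
                    context.DrawSkybox(camera);

                context.Submit();
            }
        }
    }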
     
  43. Lex4art

    Lex4art

    Joined:
    Nov 17, 2012
    Posts:
    171
    I'm just a puny 3D artist; coding graphics pipelines is out of my reach... out of the box is all I've got (or maybe someone will create and sell a proper asset for this...). Good that all of this is possible, though.
     
    Last edited: Jan 20, 2018
  44. DerDicke

    DerDicke

    Joined:
    Jun 30, 2015
    Posts:
    181
    I think it is a great idea to split up the render pipelines and reduce complexity. Good for both you developers and us 'clients'.

    Will easy accessibility be one of your design goals for the HD pipe?
    What I mean is a more artist-friendly workflow like UE: drag your art into the scene, make some lights, start capturing a nice promotion video, sell thousands of copies of your art package.
     
  45. Peter77

    Peter77

    Joined:
    Jun 12, 2013
    Posts:
    4,190
    Do you know if the "current one" has been re-implemented in the context of SRP, or is it the very same code we're using already?
     
  46. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    No change for the current one, no re-implementation for now. Lightweight will be the nearest to the builtin render pipeline.
     
    Peter77 likes this.
  47. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    The artist-friendly workflow of UE comes from artist tools (and appropriate defaults), not from the render pipeline. This is what we are starting to upgrade in Unity with the new shader graph (not HD specific) and the template system; other tools will come in the future.
     
    discofhc and Lars-Steenhoff like this.
  48. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    2,203
    Yeah, it's nice to see the attention to artist tools. Very welcome!
     
  49. discofhc

    discofhc

    Joined:
    Dec 18, 2017
    Posts:
    39
    Are there plans for volumetric fog (I mean adding fog to the scene and allowing it to be lit by directional, point and spot lights)?
    Maybe fog volumes?
     
  50. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    454
    Sorry for the copy/paste answer, but any question related to upcoming features (i.e. not in the list I provided above) will have a similar answer for HD for now:
    "Every feature is on the roadmap :), but no ETA. Always a matter of time, resources and priorities."
     
    XCO and discofhc like this.