
Unity Feedback Wanted: Streaming Virtual Texturing

Discussion in 'Graphics Experimental Previews' started by AljoshaD, Mar 18, 2020.

  1. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    35
    Hi!

    Streaming Virtual Texturing
    is in preview for Unity 2020.1 from beta 14 and up. You can download the sample project here. The sample uses the HDRP 9-preview.33 package that you can find in the package manager when enabling preview packages in the project settings.

    Streaming Virtual Texturing is a texture streaming feature that reduces GPU memory usage and Texture loading times when you have many high resolution Textures in your scene. It works by splitting Textures into tiles, and progressively uploading these tiles to GPU memory when they are needed.
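As a rough back-of-envelope sketch of what that tiling means (the 128x128 tile size here is an assumption for illustration, not necessarily what the runtime uses):

```python
import math

def tile_grid(width, height, tile=128):
    """Tiles per axis for one mip level (tile size is an assumed example)."""
    return (math.ceil(width / tile), math.ceil(height / tile))

def tiles_in_mip_chain(size, tile=128):
    """Total tile count across the full mip chain of a square texture."""
    total = 0
    while size >= 1:
        tx, ty = tile_grid(size, size, tile)
        total += tx * ty
        size //= 2
    return total

# An 8K texture becomes a 64x64 tile grid at mip 0,
# and roughly 5.5k tiles across the whole mip chain.
print(tile_grid(8192, 8192), tiles_in_mip_chain(8192))
```

The point is that each of those tiles can be uploaded independently, so only the ones actually sampled need to reach the GPU.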

    You'll need Shader Graph to add Streaming VT to a shader. The online user guide here goes into the details of how to set up your project for VT and how to author your content.

    StreamingVirtualTexturing-TileDebugView.png

    The sample project shows a basic scene with roughly 1GB of compressed 16K, 8K and 4K streaming textures. Streaming is enabled by adding the textures to a Sample VT Stack Node in ShaderGraph. Memory is optimized by setting the textures to "Virtual Texturing Only" in the importer. The system allocates 384MB of GPU caches to stream in the texture tiles.
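To get a feel for what that 384MB cache budget means in tiles (assuming 128x128 tiles at roughly 1 byte per pixel after block compression; both figures are assumptions for illustration):

```python
TILE = 128          # assumed tile size
BYTES_PER_PX = 1    # assumed ~1 byte/px after block compression

tile_bytes = TILE * TILE * BYTES_PER_PX    # 16 KiB per tile
cache_bytes = 384 * 1024 * 1024            # the sample's GPU cache budget
tiles_in_cache = cache_bytes // tile_bytes

print(tiles_in_cache)   # 24576 tiles can be resident at once
```

So the cache holds tens of thousands of tiles even though the source textures total around 1GB.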

    Try out our sample project and let us know what you think by replying on this post.
    We look forward to hearing your feedback!

    Aljosha
     
    Last edited: Jul 3, 2020
  2. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    4,992
    What are the cons for using virtual texturing? When wouldn't I want to use it?
     
    Lorash likes this.
  3. sergiusz308

    sergiusz308

    Joined:
    Aug 23, 2016
    Posts:
    114

You want to use it for open-world games, or corridor games with highly detailed environments and non-repeating materials; mostly first- and third-person titles.


So if you don't fall into that category, you can skip it.


In general, virtual texturing means a higher workload for the art department in the first place, since you have to create many more unique textures and materials. It's obviously not a first choice for mobile platforms, mostly because of the storage requirements and the latency of loading the required texture tiles from storage, which breaks gameplay.


There's an additional cost in computing which parts of the giant texture atlas the engine should currently load from storage and make available to the shaders. Shaders become a little more complicated, and a sea of storage is required to hold full PBR materials.


On consoles and PCs that cost is negligible, however, which is why the technique has been in use for at least a decade.


It took Unity some time to integrate it from the company they acquired a few years back.



Anyway, it's great that it's coming to Unity. Hopefully together with camera stacking…
     
    razzraziel likes this.
  4. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    1,674
It's good to understand the technical limitations and what you can gain from it, so you can draw your own conclusions about when to use it.

The main gain is:
- you can use higher-resolution texturing with very little GPU RAM, meaning sharper texturing on GPUs that have less physical VRAM.

Potential downsides are:
- since VT streams in texture data based on what's currently rendered on screen, quick scene changes or fast camera rotation can momentarily display lower-resolution data in your game world until the streaming catches up. This also happens with traditional texture streaming. I think the old Granite integration used to have an API call you could use to give the VT system a hint that you'll be switching to a different view soon; I'm not sure if this new system has something like that yet (it's especially useful for things you know will happen in advance, like camera cuts for cinematics etc.)
- since you can use higher-resolution textures, the obvious downside is that they can take a lot more disk space. This will be a much bigger issue in, say, an open-world game than in a game with a more limited environment.
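To put rough numbers on that disk-space point (all figures assumed for illustration: 4 maps per PBR set, ~1 byte per pixel after compression, ~1.33x mip overhead):

```python
def pbr_set_bytes(size, maps=4, bytes_per_px=1, mip_overhead=4 / 3):
    """Rough on-disk size of one PBR texture set (compressed, with mips)."""
    return int(size * size * bytes_per_px * mip_overhead * maps)

MB = 1024 * 1024
print(pbr_set_bytes(2048) // MB)   # ~21 MB per material at 2K
print(pbr_set_bytes(8192) // MB)   # ~341 MB per material at 8K
```

Going from 2K to 8K is a 16x jump in disk footprint per material, which adds up fast across an open world.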
     
  5. Kichang-Kim

    Kichang-Kim

    Joined:
    Oct 19, 2010
    Posts:
    533
I'm struggling with Texture Streaming in Unity 2019.3. For distant objects it works just fine, but a newly spawned object near the camera shows its lowest mip level first and only streams up to the higher mips over the next frames. The user can see the texture pop during this time. I think the same problem can occur with virtual texturing. Will Unity provide APIs for fine-tuning and handling these problems?
     
    CarXdev likes this.
  6. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    1,674
I mentioned this in the post above. For regular texture streaming, they let you preload the textures for a new camera position; see the "Camera cuts" section in this doc: https://docs.unity3d.com/Manual/TextureStreaming-API.html#ControlCameras
     
    Jatin998 and Kichang-Kim like this.
  7. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    35
Streaming Virtual Texturing works for any game type. It makes most sense when you have dense scenes. These could be large worlds that are densely populated, or compact scenes. By dense I mean many objects with high-res textures close to the camera at the same time. Mipmap streaming struggles with this because it tries to load entire texture mipmaps based on the distance from the objects to the camera. Virtual Texturing only loads the areas (tiles) of the texture mipmaps that are actually visible. In most frames, textures are only partially visible (the front of a character, for example), so much less texture data needs to be in video memory than with mipmap streaming.
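A toy comparison of the two strategies for a single 8K texture of which only part is on screen (tile size, compression rate, and the visible fraction are all assumed numbers, just to show the shape of the saving):

```python
import math

TILE = 128  # assumed tile size

def mip_streaming_bytes(size, bytes_per_px=1):
    """Mipmap streaming: the whole selected mip level is resident."""
    return size * size * bytes_per_px

def vt_bytes(size, visible_fraction, bytes_per_px=1, tile=TILE):
    """VT: only the tiles covering the visible part are resident."""
    total_tiles = (size // tile) ** 2
    needed = math.ceil(total_tiles * visible_fraction)
    return needed * tile * tile * bytes_per_px

MB = 1024 * 1024
# 8K texture, ~30% of its surface on screen (assumed numbers):
print(mip_streaming_bytes(8192) // MB)  # 64 MB resident
print(vt_bytes(8192, 0.3) // MB)        # ~19 MB resident
```

The resident set scales with what is visible, not with how big the source texture is.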

    You can see in the image below that only a fraction of the 8k PBR textures (+displacement) are visible in the shot. The white lines represent the individual texture tiles that are loaded by the VT system. You can toggle this debug view in the sample scene I shared in the first post and see how it works exactly.

It is certainly correct that SVT can handle many high-res textures, which obviously take time for artists to create. It's been used for photogrammetry content, because there you automatically have a lot of unique high-res content. However, what I'd like to point out here is that even if you have a limited set of high-res textures, you can use them more freely everywhere in your scene. You can have 20 objects with four 8K textures each right in front of the camera and still use very little video memory without any texture quality loss.

    StreamingVirtualTexturing-tiling.png

The downside of using the current VT system is that it requires CPU and GPU cycles. Roughly speaking, you need to allocate 1.5ms per frame to SVT. You get memory or texture quality at the expense of these 1.5ms (it can be lower; it depends on your hardware). The best thing to do is budget this from the start of your production if your game is designed around high-res textures. It's also pretty easy to convert your project to SVT, so you can quickly experiment and see what the performance impact is versus the memory gains. In Conan Exiles, Funcom discovered that they could offer their ultra-high texture quality on devices with limited video memory. The compact user guide explains in more detail how the VT sampling works, so you can understand where the performance impact comes from.

The automatic requesting works really well, but you can indeed request texture tiles to be streamed from script. This allows you to prefetch texture mips, or even sub-areas of mips, before they become visible. Camera jumps etc. can be anticipated this way. The sample I shared earlier actually has an experimental script (on the camera; it's disabled by default) that prefetches some higher mipmaps (low res) of objects based on camera distance. This way, the VT system always has some texture data to sample. Texture tiles from mip 0 or 1 are then requested exclusively automatically, by analysing the actual visible texture tiles in screen space.
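The distance-based prefetch idea can be sketched as a simple heuristic (this is an illustration of the concept only, not the script from the sample; the mapping from distance to mip is an assumption):

```python
import math

def prefetch_mip(distance, max_mip=13):
    """Pick a low-res mip to prefetch from camera distance (toy heuristic)."""
    if distance <= 1.0:
        return 0                      # close up: full resolution
    mip = int(math.log2(distance))    # halve resolution per doubled distance
    return min(max(mip, 0), max_mip)

print(prefetch_mip(1.0), prefetch_mip(64.0))   # 0 6
```

A script like this guarantees the sampler always has some data, while the screen-space analysis fills in the sharp mips on demand.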
     
  8. TokyoWarfareProject

    TokyoWarfareProject

    Joined:
    Jun 20, 2018
    Posts:
    624
Exciting! I wonder if lightmaps will take advantage of this, even if only in the future, and if things like impostors could take advantage too.
     
    JoNax97 likes this.
  9. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    571
For sure. At the moment I use 8K lightmaps.
So I have around 12 8K custom lightmaps, one per interior PBR material.


The problem is that PLM can never calculate this quality and these sizes.
You have to use external lightmappers.

Check the Unreal implementation. Lightmap streaming done right.

It would be nice if you could provide an
experimental Fontainebleau branch with the Virtual Texturing implementation.
Then we wouldn't have to do all the testing alone.

Please use Addressables from the beginning in your development.
For projects with Virtual Texture Streaming, the reason is quite obvious.

Sorry, I'm still stressed out because of all the Granite projects we had to stop last year.
You also lost some good customers with reference projects to Epic because of this more-than-a-year-long information blackout.

However, please try to get it into preview fast, in a package that fits the latest Unity 2020.1.0bx + HDRP 8.x-preview. You're referencing an HDRP 9.x-preview?
We couldn't wait any longer either, and because of this I have to mirror our complete pipeline in Unreal for a test.
     
    Last edited: Mar 19, 2020
    TokyoWarfareProject likes this.
  10. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    1,674
@AljoshaD How do we enable the debug tiles on this (which are enabled in all your screenshots)?
I saw GRA_DEBUG_TILES in GraniteShaderLib3, but it's actually commented out, so it won't do anything. It's kind of hard for users to tell what virtual texturing is doing without the ability to visualize the tiles.
     
    keeponshading likes this.
  11. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    571
Hi.
How can I enable the old Granite debug view, or the debug view shown in the user guide?

I found this one, but it's hard to interpret.

debugVT.JPG


added:

The old DebugTiles view is here:
VirtualTexturing/Debug Tiles

DebugTiles.JPG
     
    Last edited: Mar 19, 2020
    rz_0lento and AljoshaD like this.
  12. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    35
    keeponshading, sorry to hear that the wait for the new integration caused problems for you. Unfortunately it was impossible to estimate when we were going to be ready. I hope it now works well for you and I look forward to hearing your feedback!

Please also see this as an experimental/preview feature for now. It is built on the Granite SDK runtime, so it is well tested in big shipped game titles. However, we re-did the whole texture import workflow to remove any build steps and redundant files, so the production workflow is much nicer than before.

We don't support AssetBundles/Addressables for now, but it is our highest-priority feature. We really want to get this in for 2020.2. Please take a careful look at the bottom of our user guide, where we mention the current limitations.
     
  13. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    571
So this is a game changer for stereo equirectangular VR applications, because you can now do UV reprojection passes
in R32G32, or two in R32G32B32A32.
They need this precision and a minimum size of 8K, better 16K.



hyped.JPG

edit:
Ooh, I saw under Important Notes that these are not supported.
So please support them. That would be a real game changer.
     
    Last edited: Mar 19, 2020
  14. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    1,674
Are UDIMs going to be supported eventually, or is this a limitation that will hold due to the approach taken in the texture import workflow?
     
  15. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    571

Hi. I did a very fast test in the example scene with Addressables, and it seemed to work in a build.

Adressables.JPG

However, this needs some further testing.

As a use-case example, here are some points on why Addressables are important.

We have large data:
• a car interior is up to 4GB
• a car exterior is up to 3GB
• car tracks are up to 20GB, split across multiple scenes and Addressables.
Everything is loaded/unloaded via Addressables, and the build is now very comfortable (4GB limit).
Now we could use even larger scan textures for cars and tracks, which have been on hold for 1.5 years.
Combined custom lightmaps up to 64K.
Scanned track and car textures up to 16K for primary scene views.

So working, solid Addressables compatibility would be very important.

Also, the custom lightmaps are always in the process of being improved, even after a project ends.
When there is free GPU/CPU time, they are automatically improved towards full-GI ground truth for different lighting situations.
So they are marked as Addressable and often renewed.
     
    Last edited: Mar 19, 2020
  16. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    571
The old WETA demo, which is a really perfect VT example on mobile phones, was using UDIMs:

https://graphinesoftware.com/blog/2...-ARHorse-razor-sharp-glance-into-future-of-AR

We're starting to switch to UDIM now because Blender got a nice implementation.
So I hope UDIMs will be on the list too.
     
  17. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    1,674
I also tried this briefly on DX12 with DXR. Some of the raytracing effects seem to work with VT; some throw errors, and, for example, raytraced reflections just render pitch black.
     
  18. Onat-H

    Onat-H

    Joined:
    Mar 11, 2015
    Posts:
    99
I just imported the packages into our project on 2020.1b2. Virtual texturing is activated in the package manager, but a console message says it's disabled, and I get the option neither in the player settings nor in the pipeline asset.
I already tried disabling and re-enabling the package. (The sample works in 2020.1b2, by the way.)
    Does someone have an idea what could have gone wrong?
     
  19. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    571
Use these packages in your project:
\StreamingVirtualTexturing\StreamingSample_2020_0_0b2\Packages

upload_2020-3-19_22-16-57.png

and it should work.
     
  20. Onat-H

    Onat-H

    Joined:
    Mar 11, 2015
    Posts:
    99
That's exactly what I did. I can also see the Virtual Texturing menu in the toolbar, but the option in the player settings is just not there.
     
  21. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    1,674
Are you sure you're looking in the right place? It's right at the top of the Player settings, in the same place where you set the company and product name, default icon, etc.
     
    keeponshading likes this.
  22. Onat-H

    Onat-H

    Joined:
    Mar 11, 2015
    Posts:
    99
Thanks, that was it :D I was looking in "Other Settings", where these things usually are (and also in the compact guide).
     
  23. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    35
    Yes there is some UI we still like to change. We've documented those in the boxes in the user guide.
    In one of the next betas the VT module will be hidden and you will need to remove it from the package manifest. It will always be on and you'll only need to toggle the project setting.

    On UDIM, it is not supported currently. It's on our list but with a lower priority. That might change but looking at the current roadmap it is unlikely that it will be natively supported in 2020. Same for raytracing.
     
  24. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,342
    Feature request: Universal Render Pipeline support.
     
  25. TheOtherMonarch

    TheOtherMonarch

    Joined:
    Jul 28, 2012
    Posts:
    111
    Without UDIM support this will be basically pointless for our project. If it had UDIM support we would be using this for our mesh terrain.
     
    Last edited: Mar 25, 2020
  26. Onat-H

    Onat-H

    Joined:
    Mar 11, 2015
    Posts:
    99
Building with IL2CPP doesn't work; Unity tells me that the C++ tools and the Windows SDK have to be installed, even though I have everything installed. Is this a known bug, or did I miss something?

Update: never mind, it seems something was broken in my VS installation.
     
    Last edited: Mar 30, 2020
  27. Onat-H

    Onat-H

    Joined:
    Mar 11, 2015
    Posts:
    99
First of all, I wanted to say it's amazing how easy to use this is for such an early state (even more so compared to earlier versions of Granite and Amplify Texture). You guys did a great job of integrating it thoroughly into the known workflows.
I just built a scene with almost 50GB of texture data (most of the textures 8K or 4K), and loading times are almost non-existent; everything looks great and is very performant.

There are some situations where textures just don't appear, though. Here is a screenshot which shows the problem and the Granite error in the console. One thing the incorrectly textured characters have in common is that they are spawned at runtime. Might this be the reason? Do you know any way to fix this? Besides that, it's very usable already. HHVTTest.png
     
  28. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    35
    Thanks for the feedback Onat! That's great to read!

    On your issue, I can't think of a reason you are seeing this. Could you file a bug so we can reproduce it? If you share the link here we can jump on that immediately https://unity3d.com/unity/qa/bug-reporting
     
  29. dreamerflyer

    dreamerflyer

    Joined:
    Jun 11, 2011
    Posts:
    927
vt3.jpg vt.jpg vt2.jpg It doesn't seem to run well on a Mac: very low FPS, and it uses 1GB of texture memory...
     
    Last edited: Apr 6, 2020
  30. dreamerflyer

    dreamerflyer

    Joined:
    Jun 11, 2011
    Posts:
    927
How do I enable the feedback feature in a vertex shader?
    vt4.jpg
     
  31. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    35
Hi dreamerflyer, on the first screenshot, the errors tell you that you need to assign a texture to each texture slot on your Sample VT Stack node, or create a node with fewer slots.

On the memory, you might need to make your caches smaller. Also, if you don't enable "Virtual Texturing Only" on the texture importer, Unity keeps the texture in memory on top of the texture tiles that are streamed with VT.

    We are looking into the last screenshot.
     
  32. dreamerflyer

    dreamerflyer

    Joined:
    Jun 11, 2011
    Posts:
    927
I didn't change anything in this example, and the textures' "Virtual Texturing Only" option is enabled.
     
  33. niflying

    niflying

    Joined:
    Jul 11, 2012
    Posts:
    75
I couldn't find the "support for VT" option in the player settings, even after enabling the built-in "Virtual Texturing" module in the Package Manager. Unity 2020.1b4.
    TIM截图20200407220753.jpg TIM截图20200407220806.jpg
     
  34. niflying

    niflying

    Joined:
    Jul 11, 2012
    Posts:
    75
I finally found it. ^^
It's under the icon settings.
     
  35. vistaero

    vistaero

    Joined:
    Mar 23, 2017
    Posts:
    4
16K is the maximum for a single object? I'd like to use at least a 32K texture, since I'm making solar-system planets and I need as much resolution as I can get for sharp views from low orbit.
     
  36. niflying

    niflying

    Joined:
    Jul 11, 2012
    Posts:
    75
Is there any guide for how to use it with terrain or the HDRP Lit shader?
     
  37. NicoLeyman

    NicoLeyman

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    5
Currently 16K is the limit, as we are dependent on the engine's maximum texture size. Most hardware only supports textures up to 16K, and in order to stay compatible with non-VT rendering we have to abide by this limit. Ideally, what you need is UDIM texture support, so you can use multiple large textures on the same mesh. We had this in the old plugin and intend to bring it to our new solution in the future, but for now it's not concretely planned on our roadmap.
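The "multiple large textures on one mesh" idea can be illustrated with simple UV math: split a virtual 32K surface into a 2x2 grid of 16K textures and remap each UV to a grid cell (a sketch of the concept only, not the UDIM convention's exact tile numbering):

```python
def split_uv(u, v, grid=2):
    """Map a UV on a virtual 32K surface to (texture index, local UV)
    in a 2x2 grid of 16K textures."""
    tx = min(int(u * grid), grid - 1)
    ty = min(int(v * grid), grid - 1)
    return (tx, ty), (u * grid - tx, v * grid - ty)

print(split_uv(0.75, 0.25))   # ((1, 0), (0.5, 0.5))
```

A shader doing this lookup samples one of four 16K textures per fragment, which is effectively what UDIM support would automate.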
     
    futurlab_xbox likes this.
  38. NicoLeyman

    NicoLeyman

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    5
    There is no built-in support for the HDRP bundled shaders. It makes more sense to quickly roll your own in shadergraph based on your personal needs.
     
    futurlab_xbox likes this.
  39. vistaero

    vistaero

    Joined:
    Mar 23, 2017
    Posts:
    4
Then I don't get the point. 16K is not that much. I'm already using a planet shader that combines 4 images to get a 16K texture, and it runs smoothly even on mobile. Today's hardware is very powerful. Efficiency is fine, but breaking the limits is the exciting thing. The old plugin should still be available for those who need that feature.
     
  40. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    4,992
    Apple A9 GPUs (iPhone 6S and 6S Plus) introduced texture support for up to 16384 px. Earlier iOS devices support up to 8192 px only.

    There is currently no iOS GPU that supports textures larger than 16384 px, see https://developer.apple.com/metal/Metal-Feature-Set-Tables.pdf for details.

Unfortunately, Android devices are so diverse that no single feature-set table like iOS's exists. I would assume similar or slightly worse limits on recent Android devices.

Here is a brief overview of how much memory different texture sizes cost at 8 bits per pixel:
• 8192x8192 @ 8bpp = 64 MB
• 16384x16384 @ 8bpp = 256 MB
• 32768x32768 @ 8bpp = 1 GB
• 65536x65536 @ 8bpp = 4 GB
I guess that even just disk-storage-wise, 32K / 64K textures become impractical on a majority of mobile devices.
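Those figures can be checked mechanically:

```python
def texture_mib(size, bits_per_px=8):
    """Memory cost of a square texture at the given bit depth, in MiB."""
    return size * size * bits_per_px // 8 // (1024 * 1024)

for size in (8192, 16384, 32768, 65536):
    print(size, texture_mib(size), "MB")
# 64 MB, 256 MB, 1024 MB (1 GB), 4096 MB (4 GB)
```

Each doubling of edge length quadruples the cost, which is why the jump from 16K to 64K is so steep.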
     
  41. dreamerflyer

    dreamerflyer

    Joined:
    Jun 11, 2011
    Posts:
    927
Can you run this demo on a Mac? Do you get 340MB of memory use?
     
  42. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,727
    Hey all,

I was wondering if the feature will eventually be available for the Universal Pipeline, and if so, does it make sense to use this in a VR context (and, to a far lesser extent, mobile VR)?

    It's not so much the need for full uniqueness but really to reduce material complexity and reduce separate textures and reduce ram requirements, so any information would be great.

    My questions are probably about a year too soon aren't they? :)
     
  43. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    35
    Hi everyone,

Textures for VT are imported using the regular Unity texture importer, which is currently limited to 16K per texture. We are thinking about how to increase this, but it might take a while before we get there. If you need higher res, you'll need to use multiple textures. The VT system will optimize loading and memory so you can use many 16K textures in your scene.

    It makes sense to use VT in VR games and applications. A bunch of VR titles have shipped using Granite SDK, our VT middleware. For example Raw Data (Survios) and Everest VR (Solfar). You do need some frame budget for streaming.

    We are now focusing on getting the first HDRP preview out so we can't say much about URP. VT in URP will target high end devices using the Vulkan, Metal, DX11 or DX12 graphics APIs.
     
    Last edited: Apr 14, 2020
  44. Onat-H

    Onat-H

    Joined:
    Mar 11, 2015
    Posts:
    99
Hi Aljosha, sorry, I totally missed your reply. I'm trying to reconstruct it, but it's hard to recreate in another project, and it's impossible to send you ours with the bug reporter (the project, excluding the Library, is about 90GB...).
What does the error message mean? (Granite Error in Graphine::Granite::Internal::Transcoder::Tick: The transcoded bitstream was invalid, this may indicate a corrupted file or incompatible transcoder version)
That might help me dig deeper and nail down the cause.

I have also noticed that the system seems to have problems with objects behind transparent surfaces. I know that it doesn't work on the transparent surfaces themselves, and that's fine, but it should be possible to render objects behind them correctly. When we were using Amplify Texture some years ago, I could change the layer of transparent objects to "TransparentFX" so they were ignored by the texture streaming. Something like that would be a good solution, or even better: just read out whether transparency is enabled on a material, for example.

One more improvement idea: currently, models display nothing if their textures aren't loaded yet, and in our case they appear black until the streaming system loads the data. It would be great to have at least an option to load, say, the lowest mip (128x128) for each material on start, so it won't be as obvious (and that should be small enough for most people not to steal too much precious memory ;) )
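A back-of-envelope estimate of what keeping such a tiny mip resident would cost (maps per material and the compression rate are assumed figures):

```python
def resident_floor_kib(materials, maps_per_material=4, mip_size=128,
                       bytes_per_px=1):
    """KiB needed to keep a small mip always resident per material."""
    per_map = mip_size * mip_size * bytes_per_px   # 16 KiB per 128x128 map
    return materials * maps_per_material * per_map // 1024

print(resident_floor_kib(100))   # 6400 KiB (~6.25 MB) for 100 materials
```

Even a hundred materials would only pin a few megabytes, so the idea seems cheap relative to a 384MB cache.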
     
  45. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    35
    It's an internal error that in this case doesn't tell us much. We are adding more error reporting in the latest version.

    On the issue behind transparent surfaces, we'll look into this, thanks for the feedback.

On your idea, the RequestRegion API allows you to prefetch some data. The sample project has an experimental script on the camera that demonstrates this. It's not synchronous though, so you'll need to wait a few frames before showing content at level start.
     
  46. Onat-H

    Onat-H

    Joined:
    Mar 11, 2015
    Posts:
    99
    I just tried it and I totally understand why it's "experimental", using it makes everything exactly 10 times slower :D

    Thank you, looking forward for the next iterations :)
     
  47. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    35
    Thanks for the feedback Onat! The script is just an experiment but the RequestRegion call should be fast. Could you elaborate on what performance impact you are seeing, your scene setup and texture sizes?
     
  48. yumeng1022

    yumeng1022

    Joined:
    Dec 4, 2019
    Posts:
    6
Are there any plans for VT to support AR Foundation in the future?
     
  49. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    35
    We are focused on HDRP first (in preview soon) and we'll add support for URP after that. That brings us close to supporting AR foundation. We still need to evaluate the mobile devices that we'll be able to support with VT though.
     
  50. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    2,646
I hope that with URP, iOS devices that support Metal and have 3GB of RAM will be supported.

For example, the iPhone 7 Plus is the lowest device I would expect to work.

Anything below that spec is not reasonable.
     
    Last edited: May 19, 2020