
Thoughts on the engineering of the sample project - feedback on content / design

Discussion in 'FPS.Sample Game' started by hippocoder, Oct 26, 2018.

  1. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Hi Unity staff, please take my feedback with a pinch of salt. I've probably been working with Unity longer than some of you, so this isn't really criticism, just feedback based on poking around the project:

    1. Your decisions about how bright the sun is:
    Users peeling apart the project will see the sun is 1000 lux, which is about right for a bright room. That doesn't make sense for current projects without a physical camera setup, which HDRP doesn't have at the time of writing (beta). To compensate, you abuse the auto exposure to bring the values back down. This is OK but:
    • renders lightprobe preview broken
    • reflection probes are virtually impossible to preview correctly
    • you require many more probes to work with (more notes follow)
    • your scene lighting, specifically the GI, is unusable for dynamic time of day
    • knock-on effect of having to anti-tweak everything else to fit
    • scene lighting range is potentially far from 0-1, so lightmaps compress badly
    2. Lightprobes are difficult to get the best out of. In 2011, Robert made the call to use tetrahedral probes and argued that placement was an artistic job. But FPS Sample automates it, and struggles to follow the advice of using a minimal amount. In fact the probes are so dense that I thought I was looking at a point cloud. It needs to change/improve. Notes:
    • Too many probes take too long to inject GI into or to change, and burn perf and memory.
    • Your placement is bad. At minimum, use the navmesh to generate relevant placement (see the sketch after this list).
    • You went for thousands and thousands of probes to band-aid the limitations of tetrahedrons.
    • If you want a handful of tetrahedron-based probe lookups on a 2010 phone it's OK...
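    A rough editor-only sketch of that navmesh idea (untested; the 1.5-unit "head height" offset is just an assumption, and it belongs in an Editor folder):

    Code (CSharp):
    // Editor-only sketch: place light probes at baked NavMesh vertices,
    // lifted off the floor, instead of hand-scattering thousands of probes.
    using UnityEngine;
    using UnityEngine.AI;
    using UnityEditor;

    public static class NavMeshProbePlacer
    {
        [MenuItem("Tools/Place Probes From NavMesh")]
        static void Place()
        {
            // The baked NavMesh geometry gives "relevant" positions for free.
            NavMeshTriangulation tri = NavMesh.CalculateTriangulation();

            var group = new GameObject("NavMeshProbes").AddComponent<LightProbeGroup>();
            var positions = new Vector3[tri.vertices.Length];
            for (int i = 0; i < tri.vertices.Length; i++)
            {
                // Lift each sample to roughly character head height (assumed 1.5 units).
                positions[i] = group.transform.InverseTransformPoint(
                    tri.vertices[i] + Vector3.up * 1.5f);
            }
            group.probePositions = positions; // LightProbeGroup wants local-space positions
        }
    }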
    3. The general authoring is a mess of changes and fixes, with more mistakes heaped on top to patch the earlier ones, and then everything tweaked over and over till it looked great. There is no elegance here, and it's a bad project for people to learn from. I get it: a small team forced to keep adapting to emerging tech that isn't finished. That's hard! But here's a chance to address these problems within Unity:
    • Improve probes. Make a small building. Thin roof, thin walls. Thin as in an actual meter/unit! I guarantee that even by hand you'll struggle to get optimal placement with few probes. I think we need something that works in more use cases, since the FPS Sample clearly shows we need thousands of probes for a small deathmatch level.
    • Prefab variant setup is OK in this project, but it is often done for the sake of demonstration rather than efficiency.
    • Some objects are layered lit but appear low resolution :/. It's a wild waste of performance and streams badly. They could have been a single PBR material without layered lit.
    • The use of decals is overkill and sub-optimal. Mesh decals would have solved a lot of this. I'm guessing this arting phase occurred before mesh decal support was actually added to HDRP. Also, a lot of those places would have been better served by one texture variant.
    I could go on, but I don't want to beat on what is otherwise a very, very impressive piece of work. I never even got to the code side (which I will, of course) or the networking.

    I just wanted to be a little honest here. I appreciate and love the work and effort, but people are going to learn quite badly from the way the scene is arted together and from some of the decisions made. On top of that, some things in Unity (lightprobes) really need fixing, especially since you've fought years of your own advice to finally accept that they aren't that useful in their current form.

    Thanks for reading, and I'm sorry if I annoyed anyone. I've a fair idea what kind of battle it is to make a game with a small team, but for the sake of improvement I leave these notes.
     
  2. Vestergaard

    Vestergaard

    Unity Technologies

    Joined:
    Oct 20, 2016
    Posts:
    31
    Hi HippoCoder and thanks for the feedback! I will try and go through your points one at a time, but please account for massive jet-lag nonsense as I just landed back in Copenhagen from Unite LA.

    Here are my thoughts and excuses :)

    HDRP advocates physical light units; however, due to specular aliasing issues and limitations in the current auto exposure and light probes, I had to clamp the sun at a fairly low 1,000 lux (a realistic sun is upwards of 120,000 lux), as well as use denser light probe placement than I would have preferred.

    I chose to follow HDRP's lighting patterns because it is forward-looking and I want the content to be as progressive as possible, so that it doesn't become deprecated within the first year of launch. This informs us internally as well: we too notice the areas of discrepancy between HDRP features and the capabilities of systems like light probes, and in this case the strain it puts on our light probe placement, both in terms of density and the lack of a built-in automation process. This is feedback we have already given to the lighting team, so we have awareness of this issue.

    Solving for realistic light intensities, dynamic lighting, and realistic architecture (as you mention, thin walls are a real challenge) is something I will champion and push for internally. This highlights the duality of the sample projects: they serve equally to inform internal development and to serve users. The project pushes on these pain points specifically because I think they will hit our users in the near future.

    HDRP is still in preview, and as such, in a way, FPS Sample very much is too; we hope to show and ship gradual improvements over time. We're looking into changing auto exposure, and this content will be an immediate testing ground for it.

    Reflection probe debug visuals not being drawn before the exposure pass in the scene view is a recent regression that I hope will be fixed soon. In the inspector preview, however, there is now an exposure slider, so probes can easily be previewed at the needed exposure.

    A fix from the lighting team to move light probe debug graphics to before the exposure pass is also imminent.

    Light probe placement in the level is denser still because of a desire to let users, and ourselves, move the sun and rebake without redoing all the light probe placement; this allows us and users to test different lighting configurations. In this case I am well aware we err on the side of experimentation and fun to play with, rather than strict performance best practice.

    Most of the light probe groups are stored inside the modular prefabs, so they can be changed with relative ease, without individual placement in the level. So yes, there are a lot of redundant light probes for the current lighting configuration. That said, HDRP only supports DX11, DX12, Vulkan, Xbox, PS4, and Mac, all fairly high-end platforms, and we aim to optimize the game to run on hardware similar to the PS4 and above; on such hardware, thousands of light probes are not a _major_ concern for runtime cost, nor do they account for a very significant amount of memory. The project is not meant to exemplify good practices for mobile.

    Lightmaps in FPS Sample are compressed as BC6H, a full-HDR-compatible format, and there is no significant compression damage to the lightmaps in the shipped level. Using the render pipeline debugger (Window -> Analysis -> Render Pipeline Debug) you can easily isolate the indirect bake and inspect its quality with the exposure control.

    The scene lighting is indeed unusable for dynamic time of day, as the light bounce from the sun is baked down and we run with baked shadow masks. However, this enables the distance shadowmask feature, with which you can control how far out dynamic shadows are drawn, as well as prevent lights from leaking when their shadow casting is disabled at a distance. You can tweak the distance at which we blend between the baked shadow masks and dynamic lighting on the Level_01_Volume_00 in the HD Shadow settings.
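    As a minimal script sketch of that tweak (assuming a current HDRP namespace; the shipped sample targets an older experimental one, and the volume field is just a placeholder for that object):

    Code (CSharp):
    // Sketch: tweak the distance at which HDRP blends from dynamic shadows
    // over to the baked shadowmask. Namespace assumes a recent HDRP; the
    // preview HDRP used UnityEngine.Experimental.Rendering.HDPipeline.
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.HighDefinition;

    public class ShadowDistanceTweak : MonoBehaviour
    {
        public Volume levelVolume; // e.g. the Level_01_Volume_00 object

        void Start()
        {
            if (levelVolume.profile.TryGet(out HDShadowSettings shadows))
            {
                // Beyond this distance the baked shadowmask takes over.
                shadows.maxShadowDistance.value = 150f;
            }
        }
    }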

    For layered materials there are multiple aspects here. They are more expensive, of course, but they make it easier to toy with the content, to make global changes to the assets without re-authoring every single texture, and to just have fun with the scene by replacing materials and changing settings. I hope this is nice both to highlight the feature and to make the scene a better playground. I hope it is obvious to most of the artists we are targeting with this content that layered materials are more expensive than a plain flat lit material; if nothing else, I've said it here.

    Some day we hope to ship the Substance Painter files which will make it very easy to export flattened textures should you so wish, or convert content to a different render pipeline.

    In terms of the achievable resolution with layered materials, it can be immediately improved by increasing the repetition of the layered materials or introducing blend masks to the layers, but the art style of FPS Sample does not really aim for high visual noise, so they are currently tweaked towards larger, broad-stroke details.

    I have not noticed any issues with streaming, and in theory layered materials should, if anything, improve texel density when streaming. You can set the streaming budget in the quality settings if you have the VRAM and wish to increase fidelity. Feel free to send me screenshots at martink at unity3d.com if you find the streaming suspicious; I did not have a chance to check it when we upgraded to the latest beta, so a regression is certainly possible. In general, however, optimization for quality, speed, streaming, and memory usage is ongoing. Doing quality settings in render pipelines is not really a solved problem yet, so there are likely to be changes, and we will keep updating and improving as we go.
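    For what it's worth, the streaming budget is also scriptable; a tiny hedged sketch (the numbers here are arbitrary):

    Code (CSharp):
    // Sketch: raise the texture streaming budget at runtime when the GPU
    // has VRAM to spare. streamingMipmapsMemoryBudget is in megabytes.
    using UnityEngine;

    public class StreamingBudget : MonoBehaviour
    {
        void Start()
        {
            if (SystemInfo.graphicsMemorySize >= 4096) // MB of VRAM
                QualitySettings.streamingMipmapsMemoryBudget = 1024f; // MB
        }
    }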

    Mesh decals did indeed arrive after the content was authored, so yes, we definitely would have done a lot of things differently. Both Janus Kirkegaard, the environment artist, and I are HUGE proponents of mesh decal workflows, and we used them on everything in Hitman. But alas, they came too late.

    However, in regards to performance, I am not sure why you dislike projected decals, as they are rendered from an atlas and all executed in a single draw call with a very lean shader. You can set the size of the atlas in the HDRP asset, or disable decals there entirely.

    Prefab variants are used everywhere; in fact, until they arrived we used a custom solution. They are in use almost entirely as a way to separate the raw art content completely from the prefabs, so that art contractors could work in complete separation from our prefabs and scene data files, without requiring any changes on our side.

    Again, thanks a ton for all the attention to detail in your post. I think in most cases we are generally in agreement, and most of these things come down to striking a good balance between pushing new features and shaming fixes into existence internally, versus crafting good tutorial content and best practice. In each case I try to weigh the pros and cons and select the most meaningful approach, but I likely won't end up pleasing everyone.

    /Martin Vestergaard Kümmel
     
  3. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Hi Martin,

    Absolutely amazing post; I think we are going to spend time digesting it, and I'm hugely grateful there's some movement regarding probes. Automation is not actually that hard, but there is always one annoying (literal?!) corner case that makes art look bad. Nobody wants to make beautiful game or movie art and then find it's lit poorly in that one place in the scene nobody checked!

    In Robert's talk here: http://twvideo01.ubm-us.net/o1/vault/gdc2012/slides/Programming Track/Cupisz_Robert_Light_Probe_Interpolation.pdf he mentions several strategies, and I tried them all where I could. I couldn't try the Milo one, but the rest were pretty ineffective when tested in the real world (you need hundreds more probes in a small area to influence and fix leaks).

    Robert Cupisz:
    [image: upload_2018-10-27_16-3-0.png]
    Some visibility bake would allow Robert's awesome paper to function and FPS Sample probe placement to be fully automated (and only take a couple of seconds to calculate, without needing to place probes in prefabs).

    The inside/outside situation is very common in real-world games, and only gets more so with something like the FPS Sample, where it's common to be near the outside but just inside a door. It'll be impossible to deal with in the battle-royale-style games to come. I have tried so many things, even down to having my own influence volumes to move the probe anchor, weighted to where I needed it, to avoid leaks! But it is an insufficient hack.
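    For reference, that anchor hack boils down to something like this sketch (the safe-point transform being a hypothetical hand-placed empty):

    Code (CSharp):
    // Sketch of the probe-anchor hack described above: sample probes from a
    // hand-chosen safe point instead of each renderer's own bounds centre,
    // so an object just inside a door doesn't blend in outdoor probes.
    using UnityEngine;

    public class ProbeAnchorOverride : MonoBehaviour
    {
        public Transform safeSamplePoint; // hypothetical hand-placed empty

        void Awake()
        {
            foreach (var r in GetComponentsInChildren<Renderer>())
                r.probeAnchor = safeSamplePoint;
        }
    }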

    Outside probes aren't really a problem: you can loosely spread them in a lattice foam over the world for time of day at quite a bit less granularity than you would expect (a separation of even 6-12 units works well for me). It's the inside/outside boundary, or the 1-unit wall thickness, that'll get you every time, and inside needs greater density.

    I'm pretty good at probes at this point. I can automate reflection probes without problems; in fact I do that well precisely because I control the bounds of each probe's influence, something I struggle with so much for lightprobes. So much so that I'm thinking of just not using lightprobes, because a) our world is too big to do it by hand and b) too diverse to put them in prefabs.

    Regarding outside/inside probes, I think some 99% of problems could be solved simply by having lightprobe groups that don't combine, or some way to segregate them...
    I see they solved it in COD 2017 - probably, as Robert suggests, with visibility info.

    In any case, I love the project. Thanks for bringing AAA quality to Unity!

    (Anyone can call me Rob if Hippo is a mouthful - not fussy)
     
    rigidbuddy and Thomas-Pasieka like this.
  4. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    [image: upload_2018-10-27_21-44-32.png]

    These are the current probes in the FPS Sample.

    What I would like is a small tweak to the probe API that allows us to provide visibility information ourselves. That is something I can very easily compute! I can reason about it with raycasts, for example, so it's not a big change on Unity's part, and not a lot of work to add an early feature like that to 2019.1 or .2.

    Perhaps an allowed indices list or something like that.

    I guess I'm fishing for a way to do it now using the existing API. It seems Call of Duty: Infinite Warfare used the same probe system, but of course added a little visibility info to it.


    But if there's a way I can achieve it now I would love to know!
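    To be concrete about "I can reason with raycasts": computing the visibility data itself is easy, something like the sketch below. What's missing is any API to hand the result back to Unity's probe interpolation.

    Code (CSharp):
    // Sketch: work out which baked probes are actually visible from a
    // renderer via occlusion raycasts. This only produces the visibility
    // data; Unity currently has no way to consume it.
    using System.Collections.Generic;
    using UnityEngine;

    public static class ProbeVisibility
    {
        public static List<int> VisibleProbes(Renderer renderer, LayerMask occluders)
        {
            var visible = new List<int>();
            if (LightmapSettings.lightProbes == null) return visible;

            Vector3 from = renderer.bounds.center;
            Vector3[] positions = LightmapSettings.lightProbes.positions;

            for (int i = 0; i < positions.Length; i++)
            {
                Vector3 dir = positions[i] - from;
                // Visible if nothing on the occluder layers sits between
                // the probe and the renderer's bounds centre.
                if (!Physics.Raycast(from, dir.normalized, dir.magnitude, occluders))
                    visible.Add(i);
            }
            return visible;
        }
    }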
     
  5. robert

    robert

    Moderator

    Joined:
    Dec 21, 2008
    Posts:
    265
    Heya! Just a quick note about light probes. TL;DR: yeap. ;)

    The tetrahedral mesh interpolation worked quite well in Shadowgun (2011, mobile), where we couldn't afford to bake or store dense probe grids, not to mention visibility info. Levels were tiny, so it took no time to place the probes manually and by doing that - optimize the lighting variation they'd capture, avoid leaking through walls, etc.

    Shortly after, it became clear this system wouldn't go much further, for a few reasons:
    1. It doesn't fit well with multi-scene, be it dynamically loaded scenes or even just edit-time multi-scene editing. The reason is that you either need to work with multiple hulls or merge them, and neither option is great. Multiple hulls are tricky to interpolate nicely, especially if they're overlapping, but it's doable. What's worse, though, is that every renderer now needs to store multiple "last tetrahedron I was in" indices. A single hull is bad because re-tetrahedralization causes light pops (and takes time).
    2. It doesn't translate to GPU per-pixel interpolation as well as a 3D texture lookup would.
    3. It's hard to make an automatic placement tool that covers all cases.

    We need a solution covering these cases, for sure. Additionally, imagine a case like this: two scenes, each with its own set of probes; one is a forest, the other a cabin. The cabin scene gets dynamically loaded into the forest as you approach it. Apart from all the concerns you mentioned (leaking through walls, representing the entrance), we also have the issue of the cabin probes somehow having to take priority over the forest probes.

    Thankfully the Lighting Team is working on it and I'll leave it to them to comment further :)

    P.S. Tonemapping being applied to light probes was fixed last month, but I don't remember if it has made it into a beta already; I'm not sure if that is even the issue you're referring to with regard to light probe visualisation.
     
    Reanimate_L, rasmusn, pcg and 3 others like this.
  6. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Thanks @robert, honestly a relief, because I figured I'd have to roll my own solution or something. Hope it comes super-soon (2019.1, anyone?!). I've got 5 years of probe auto-placement experience ready to test it; anyone who's done auto probe placement will know it's quite an absorbing puzzle...
    I'll be checking for replies far too often, then. It's really important to me that things are lit beautifully; it's actually the rendering thing I obsess about the most. Light and shadow.

    *refreshes browser*
     
    robert and AcidArrow like this.
  7. Jesper-Mortensen

    Jesper-Mortensen

    Unity Technologies

    Joined:
    Mar 15, 2013
    Posts:
    232
    Yes, we are aware of the pain points of authoring probes with the current functionality.

    We are doing a push on lightprobes now and it will be an ongoing project. As you know I can't comment on when the various features will land but I can shed light on some of the things we are looking into:
    • Deringing: We've had serious issues with SH ringing artifacts when using intense direct light in probes. Deringing landed in 18.3 and is being improved in 19.1.
    • Probe Visualization: With HDR it has been impossible to visualise the probe lighting. A fix is likely to land in 19.1.
    • Contributing Probe-lit Objects: To make probe sets a symmetrical lighting container similar to lightmaps, it has to be possible to have probe-lit objects that contribute to the GI. We are working on this.
    In addition to these foundational pieces, we are experimenting with exactly how probe sets can be streamed in and out, how to prioritise between multiple probe sets when rendering, and the UX complexity of authoring for that. This work also includes new containers (grids etc.). Further out, we have work on auto placement and visibility-guided interpolation.

    Cheers,
    Jesper
     
  8. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
  9. optimise

    optimise

    Joined:
    Jan 22, 2014
    Posts:
    2,129
    Hi Vestergaard. Can I ask when this issue will be fixed? I don't really like setting lighting to 1000 lux such that I cannot see anything without enabling post-processing. Hopefully I can get a workflow just like a real-world lighting configuration, without doing all kinds of hacks.
     
  10. mgear

    mgear

    Joined:
    Aug 3, 2010
    Posts:
    9,407
    i was wondering: since (i'd guess) this is mainly an fps+networking example,
    why add any graphics, lighting and effects at all?

    could have used those probuilder debug textures and white boxing for the whole level..
    (= much smaller project, works on most computers and small laptops, fast import, no baking, just purely focused on networked fps stuff for tens or hundreds of players.. etc)

    and then could have built a separate high-quality hdrp level project with all the effects, 4k textures and whatnot..
    (and anyone could easily import the level into their fps project if they want)
     
    andreyefimov2010 likes this.
  11. superpig

    superpig

    Drink more water! Unity Technologies

    Joined:
    Jan 16, 2011
    Posts:
    4,655
    You guessed wrong :) The point is to be a vertical slice of a full production.
     
    hippocoder, optimise and mgear like this.
  12. Vestergaard

    Vestergaard

    Unity Technologies

    Joined:
    Oct 20, 2016
    Posts:
    31
    Optimise: well, HDR lighting implies that you cannot see anything without the correct exposure, and since exposure is part of the postfilter, this is the current workflow. As such there is nothing wrong with the scene being white with postfilters disabled; that is expected.

    Personally I would like to see an exposure slider in the scene/game view separate from the postfilters; this is what we did on the previous engine I worked on. But UI bloat like this is not trivial to get through an internal PR, especially when it's specific to certain project configurations.

    As a workaround you could make a script to toggle all non-exposure-related post effects (a sketch follows below), but yeah, it's not at all ideal.
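    A minimal sketch of that workaround, assuming the Post-Processing Stack v2 setup the sample currently relies on for exposure:

    Code (CSharp):
    // Sketch: keep auto exposure alive but toggle every other post effect.
    using UnityEngine;
    using UnityEngine.Rendering.PostProcessing;

    public class ExposureOnlyToggle : MonoBehaviour
    {
        public PostProcessVolume volume;

        public void SetEffectsEnabled(bool enabled)
        {
            foreach (var effect in volume.profile.settings)
            {
                if (effect is AutoExposure) continue; // exposure must keep running
                effect.active = enabled;
            }
        }
    }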
     
    hippocoder and optimise like this.
  13. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    [image: upload_2018-11-2_19-34-46.png]

    Perhaps there could be a repo or package for all the things Unity staff would love to have but can't get past the evil overlords of PR. Above is my nifty scene view speed slider for getting around (I also removed any of that silly acceleration lark).
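    For anyone who wants the same, a rough sketch (it assumes SceneView.cameraSettings, which only exists in newer editors):

    Code (CSharp):
    // Sketch: dockable window with a scene-view fly-speed slider and
    // camera acceleration disabled. Editor folder only.
    using UnityEditor;
    using UnityEngine;

    public class SceneSpeedWindow : EditorWindow
    {
        [MenuItem("Tools/Scene View Speed")]
        static void Open() => GetWindow<SceneSpeedWindow>("Scene Speed");

        void OnGUI()
        {
            var view = SceneView.lastActiveSceneView;
            if (view == null) return;

            view.cameraSettings.speedNormalized = EditorGUILayout.Slider(
                "Fly speed", view.cameraSettings.speedNormalized, 0f, 1f);
            // No silly acceleration lark.
            view.cameraSettings.accelerationEnabled = false;
        }
    }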
     
  14. Lex4art

    Lex4art

    Joined:
    Nov 17, 2012
    Posts:
    445
    GCodergr and Mauri like this.
  15. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,788
    @robert, is there going to be any improvement to lightprobes? Just curious.
     
  16. alexandre-fiset

    alexandre-fiset

    Joined:
    Mar 19, 2012
    Posts:
    715
    Why is the FPS Sample environment made so static then?

    For a vertical slice made to promote the features of a major game engine, something as important as changing lighting settings and loading additive scenes in realtime should be a must, no?

    It scares our team off because we had tons of problems with additive scenes and dynamic lighting when Enlighten was implemented in Unity years ago. It took months for a major issue to be solved, and even two years later we couldn't ship our game with dynamic GI on the Switch because "streaming realistic worlds is not really supported in Unity".

    While this thread and the FPS Sample are technically interesting, they never touch anything outside of their respective, super-limited usage.

    Light probe placement is not an issue for us, and we think it shouldn't be for anyone. Our game (Kona) has 3 square kilometers of open environment, and we never struggled with placing light probes or handling thin walls. You can play it and see for yourself; no problem at all.

    Support for real lux values for our sun would be really nice, but it means nothing if we can't have a decent day-and-night cycle with global illumination working on all major platforms. Anyway, with settings such as light range, it will never be accurate and will always require fine tuning.

    So my wish is for Unity to use a more complex environment setup to allow procedural and streaming worlds to be doable with HDRP and Dynamic GI.

    Otherwise we'll have to keep making games with loading screens even in 2020...
     
  17. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    It's nothing to do with the placement and everything to do with the fact that even with hand-placement there are very many situations where probes break, because they cannot be occluded and you have no control over which probes blend.

    Unity mentioned they're researching this so I'd like to hear about it.