Occlusion Probes

Discussion in 'Graphics Experimental Previews' started by DavidLlewelyn, May 15, 2018.

Thread Status:
Not open for further replies.
  1. DavidLlewelyn

    DavidLlewelyn

    Unity Technologies

    Joined:
    Aug 14, 2012
    Posts:
    34
    Hi all,

    A number of you have been asking about the Occlusion Probe system mentioned in the 2018 release blog. There seems to have been some confusion generated by our messaging here, so I hope I can offer some clarification.

    Presently, Occlusion Probes are not 'a feature' and do not exist officially on our roadmaps. Rather, they are an example of how the Custom Bake API in 2018.1 can be used to extend the Progressive Lightmapper.

    While an experimental implementation of the Occlusion Probe system exists, the 3D texture data generated by the probes is not handled natively in any standard Unity shaders. To use the data, you will need to write your own shaders - and more to the point - be brave enough to modify Unity's lighting functions.

    In the legacy renderer, these lighting functions exist in some CG includes within Unity. We have an example implementation here:
    https://drive.google.com/a/unity3d.com/file/d/1YgoX5Zz7SDMWSjhzG2Plizn8vlmMAiLa/view?usp=sharing

    In HDRP and Lightweight render pipelines, the process is a little more involved and requires modification to the render pipelines themselves. Again, an example implementation is provided here:
    https://github.com/Unity-Technologies/ScriptableRenderPipeline/tree/lw/occlusion-probes
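
To give a rough idea of what "using the data" means in practice, here is a toy sketch in Python rather than actual Unity shader code (the function names, the nearest-neighbour lookup and the grid layout are all illustrative assumptions): the modified lighting function samples a scalar occlusion value from the baked 3D texture at the shaded point's position within the probe volume, and attenuates the sky/ambient term by it.

```python
# Illustrative sketch of the occlusion-probe lookup a modified lighting
# function would perform. The real code is HLSL with hardware-filtered
# 3D texture sampling; every name here is invented for illustration.

def world_to_uvw(pos, vol_min, vol_max):
    """Map a world-space point into normalized [0,1]^3 coords of the probe volume."""
    return tuple(
        min(max((p - lo) / (hi - lo), 0.0), 1.0)
        for p, lo, hi in zip(pos, vol_min, vol_max)
    )

def sample_occlusion(grid, uvw):
    """Nearest-neighbour sample of a scalar occlusion grid.
    grid[x][y][z] holds sky occlusion in [0,1] (1 = sky fully blocked)."""
    nx, ny, nz = len(grid), len(grid[0]), len(grid[0][0])
    ix = min(int(uvw[0] * nx), nx - 1)
    iy = min(int(uvw[1] * ny), ny - 1)
    iz = min(int(uvw[2] * nz), nz - 1)
    return grid[ix][iy][iz]

def shade_ambient(sky_color, pos, grid, vol_min, vol_max):
    """Attenuate the direct sky/ambient contribution by (1 - occlusion)."""
    occ = sample_occlusion(grid, world_to_uvw(pos, vol_min, vol_max))
    return tuple(c * (1.0 - occ) for c in sky_color)

# A 2x1x1 probe volume: left half under open sky, right half fully occluded.
grid = [[[0.0]], [[1.0]]]
vol_min, vol_max = (0.0, 0.0, 0.0), (10.0, 10.0, 10.0)
open_pt = shade_ambient((0.5, 0.6, 0.8), (1.0, 5.0, 5.0), grid, vol_min, vol_max)
dark_pt = shade_ambient((0.5, 0.6, 0.8), (9.0, 5.0, 5.0), grid, vol_min, vol_max)
print(open_pt, dark_pt)  # full sky color on the open side, black on the occluded side
```

The linked examples above show the actual shader modifications; this sketch only captures the shape of the idea.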

    We hope to have a blog post and an example project available for public release soon. This should hopefully shed some more light on how to implement the Occlusion Probe system in your own project. Again though, this is an entirely experimental feature and is not offered with official support.

    There is early discussion underway for how or when we may include the feature in HDRP in an official sense, but it is too early to say more or offer dates. More news as it becomes available.

    Hope this was helpful!
     
  2. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    It was really helpful, thank you! Hopefully I can be cheeky enough to ask for more posts in the future :)
     
  3. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,982
    Thank you!

    EDIT:

    Some form of minimal example SRP, or an example of editing the LWRP to add a minimal feature, would be amazingly helpful here.

    As it stands, saying "modify the SRP" is basically saying "jump into this knowledge black hole".
     
    Deleted User and Shorely like this.
  4. Thall33

    Thall33

    Joined:
    Sep 18, 2013
    Posts:
    134
    Not going to lie, this feels like a big kick in the guts. I'm an artist who wants to demo outdoor environments with this fidelity, but I need a programmer to decipher it all... This is not what we were led to believe from the blogs and keynote.
     
    Marc-Saubion, Pr0x1d, Shorely and 9 others like this.
  5. Kolyasisan

    Kolyasisan

    Joined:
    Feb 2, 2015
    Posts:
    397
    Dang, that's really too bad! This feature really needs to be part of the out-of-the-box HDRP pipeline.
     
    Pr0x1d and Shorely like this.
  6. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Yup. The blogs etc. have also been very clear that this is future stuff, stuff that's WIP and so on, so be careful of assuming - it's unfair on you and unfair on Unity.

    So what I am doing is collecting a list of information that I can share with you when these kinds of posts pop up, which will help fill the documentation void. As a lot of this stuff is subject to change, you can probably sympathise a little with the Unity staff. There are WIP docs on GitHub (various wikis) you can check out meanwhile: https://github.com/Unity-Technologies/ScriptableRenderPipeline/wiki - not perfect, but also not too shabby. HD has until the end of 2018.3 before it can be called anywhere near beta, so I guess we're all just showing our excitement. Can't blame anyone.
     
    MadeFromPolygons and Kolyasisan like this.
  7. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,527
    Thanks, but at the same time please think about artists who work without programmers in Unity. Don't assume we can just code all the things that could be enabled with a checkbox and a shader.
     
    Shorely likes this.
  8. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    I'm pretty sure they will never make that assumption, especially since people are not afraid to speak up when a feature they want is omitted!

    Honestly, I don't think non-graphics-programmers should be afraid for the future just because Unity will be promoting the various systems that are being opened up to programmers. They want to cover all the bases, but it is natural that in some cases Unity can deliver the API for programmers far quicker than they can deliver a full feature in one of their own render pipelines, especially given the early stage the LW and HD pipelines are in. But I'm sure Unity knows that the render pipeline makes up a notable part of a game engine, and that their own pipelines will have to deliver, out of the box, over the years to come. That won't stop them mentioning that programmers could do something themselves in the meantime - this doesn't help everyone, but it does help some people, and it's not something Unity can really use as an excuse never to provide most key features themselves.

    I do think there may have been some marketing/presentation/blog mistakes made by Unity this year though, in that some of their material may have made the pipelines and certain features sound more ready than they really are. It's hard for me to be sure, because I've been following the technical reality via GitHub more than the hype, and some perceptions will end up out of sync with reality even when Unity delivers its messages with complete accuracy.
     
    neoshaman and Lars-Steenhoff like this.
  9. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    328
    Once again Unity showcases a demo with "extra" features they don't plan to support. Color me surprised.
     
  10. KarelA

    KarelA

    Joined:
    Dec 30, 2008
    Posts:
    422
    Here is a good post by Robert Cupisz about occlusion probes. https://twitter.com/robertcupisz/status/954493639442432001?lang=en

    You can see how much difference occlusion probes make - without them you won't come even close to the visuals of the Book of the Dead demo.

    By reading his feed you get the idea that occlusion probes are experimental and may take a while to become an official feature in Unity. But the problem is that no one knows that (apart from those of us who hang out in this forum or follow certain people on Twitter).

    We're starting to get these silly Unreal vs. Unity graphics YouTube videos, and on the Unity side the video is a bit misrepresentative, since the showcase is built on custom stuff that may (or may not) end up in the final engine in the near (or far) future. :)
     
  11. DGordon

    DGordon

    Joined:
    Dec 8, 2013
    Posts:
    649
    I love Unity. It's basically my entire source of income at this point. However, let's be very clear in our understanding... Unity runs on marketing - it's a company. When they release a new high-tech video, its intent is not to show off what the vast majority of their user base will ever be able to accomplish, nor what their engine actually supports without being modified. It's there to show off their shiny new features, and what is possible with Unity, in the most hype-building way possible. Anyone is speaking above their pay grade if they try to pretend otherwise. That doesn't take away from the work of the devs (which is awesome), nor from what we're capable of doing with Unity... but it does mean we have to remember that not everything the company does is geared towards any individual user, but rather towards producing numbers, like any other business. Thus it goes without saying that, for any given announcement, we won't actually know what it ends up being until we're using it.
     
  12. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,982
    It was based on this that I made a post yesterday asking for a tech demo that isn't built on this kind of stuff, but it was immediately locked (and I totally get why - it wasn't really constructive or going to yield anything constructive).

    I really just want a decent idea of what is possible without modifying the engine.
     
    Last edited: Jun 5, 2018
    747423517 and Shorely like this.
  13. DGordon

    DGordon

    Joined:
    Dec 8, 2013
    Posts:
    649
    I happen to agree with you. It would be nice if we also got a demo of how to best solve the issues these tech demos faced without relying on extended scripts we may never get. I would love to know how an accomplished artist would build a scene using the tools we will have available, since that's the only thing I can actually try to learn from. But ... I don't think that's the purpose of those demos.
     
  14. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    Well, at least the occlusion probes are supposed to work (do they really?) on the engine version we got. It's not like the first Adam tech demo, which had custom tech we only partly got years later.

    That being said, it would be cool if the tech demos were runnable on the same engine versions we get; otherwise they're pointless, as they don't represent the engine they're supposed to represent.

    I'm guessing we'll get some example of how to hook the occlusion probes up once Unity releases the example map using the Book of the Dead assets. It was supposed to be out in April, then last month, so I guess we might get it this summer? :D

    I did look at the occlusion probes GitHub code earlier and there wasn't that much to set up; there just weren't any examples or instructions, so having a sample project (even a super simplified one) where everything is configured to use the occlusion probes would help people get started with them.

    The above link only contains shaders, not the whole implementation. You get the additional scripts from the GitHub branch, but I haven't tried combining these. I'd really love to use this feature on the legacy renderer, so it's nice to see that the shaders have been implemented already.
     
  15. KarelA

    KarelA

    Joined:
    Dec 30, 2008
    Posts:
    422
    Exactly. And that is the frustration with the current state of Unity. These demos are just marketing tools for the engine and nothing more. Even the older demos that have been out for years were marketing fluff, since most of the stuff - like the atmospheric scattering, and that undocumented complex shader they used on their mesh terrain - is constantly breaking and not officially part of the engine. I would not be surprised if the same happens with the Book of the Dead demo. It does not matter whether they release the source project or not; chances are that the useful knowledge you can extract from it will be limited.

    If you want to learn and grow as an artist who specializes in real-time stuff, then Unity is a hard choice to make. The amount of education is so limited, and you have to waste tremendous amounts of time (and money, since you are likely to rely on custom shaders and tools from the Asset Store) on trial and error to figure out the workflow.

    Just today Quixel made a five-minute tutorial on how they created a gorgeous snow material and how it was set up in UE. It all seemed so easy, it just worked, and the results were amazing. Everyone could recreate it easily and learn from it. But try to find something similar for Unity and you will not find anything. Quixel had a similar video for Unity, but it was a complete hack.

    Hopefully, when the time is right and HDRP is officially released, there will be "monkey see, monkey do" type tutorials where we all have the same tools and can follow along and get visually amazing results. Something that could serve as a good starting point to learn and grow.

    I am currently learning by watching artists work on their UE scenes and trying to apply their techniques in Unity where I can. I am using Alloy, Uber and ASE, and sometimes I can get pretty close to their results. But as a Unity user I feel kind of silly educating myself this way. A constant uphill battle.
     
    shredingskin likes this.
  16. xrooshka

    xrooshka

    Joined:
    Mar 5, 2014
    Posts:
    59
    Stuff like occlusion probes just isn't for hobbyist designers. It's for companies that can afford a tech artist. For that matter, Unity isn't really for lone hobbyists making their own huge photoreal sandboxes - maybe not-so-big, not-so-photoreal sandboxes...

    Or you just have to be a generalist. Or maybe you collaborate with a tech artist to make your common dreams come true.
     
  17. IgnisIncendio

    IgnisIncendio

    Joined:
    Aug 16, 2017
    Posts:
    223
    What do Occlusion Probes do, exactly? Are they a better type of ambient occlusion or something?
     
  18. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    But even those hobbyists would be able to use it if there were organized samples and documentation (we still have hope for that sample).

    Thing is, Unity itself has promoted occlusion probes in 2018, so people got their expectations higher than they should have. You can't really blame the users for getting excited, or claim people should hire tech artists to be able to pull something off, when Unity has said you can do this with the upcoming version.

    I'd be happy just to see a fully set-up example scene myself. I don't care if it's not fully polished or not implemented as a stock Unity component, as long as it works (I prefer having more of the source code around, so it's even better this way IMO).
     
    Last edited: Jun 7, 2018
    Shorely and IgnisIncendio like this.
  19. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    "Each probe is just a scalar of how much the sky is occluded at a given point. It's stored in a 3D tex spanning the entire forest and used to occlude direct sky contribution. We do some tricks on the lightmapper and the script side to avoid self-occlusion and other artifacts."
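
    That description can be sketched numerically. The toy Python below estimates, for a probe position, the fraction of upper-hemisphere directions blocked by geometry - purely illustrative, under the assumption of simple sphere occluders; the real data is baked by the Progressive Lightmapper, not by hand-rolled ray casts like these.

```python
import math, random

def ray_hits_sphere(origin, direction, center, radius):
    """True if the ray origin + t*direction (t > 0) intersects the sphere."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # unit direction, so a = 1
    if disc < 0:
        return False
    root = math.sqrt(disc)
    return (-b - root) / 2.0 > 1e-6 or (-b + root) / 2.0 > 1e-6

def sky_occlusion(point, spheres, samples=512, rng=random.Random(0)):
    """Fraction of uniformly sampled upper-hemisphere rays from `point`
    that hit geometry: the scalar a probe at `point` would store."""
    blocked = 0
    for _ in range(samples):
        # Uniform direction on the upper hemisphere (y up): uniform in
        # y = cos(theta) and in azimuth phi gives uniform solid angle.
        u, v = rng.random(), rng.random()
        phi = 2.0 * math.pi * u
        y = v
        r = math.sqrt(max(0.0, 1.0 - y * y))
        d = (r * math.cos(phi), y, r * math.sin(phi))
        if any(ray_hits_sphere(point, d, ctr, rad) for ctr, rad in spheres):
            blocked += 1
    return blocked / samples

# One big "canopy" sphere overhead: a probe directly under it sees a
# noticeably occluded sky, a probe off to the side sees mostly open sky.
canopy = [((0.0, 10.0, 0.0), 6.0)]
under = sky_occlusion((0.0, 0.0, 0.0), canopy)   # roughly 0.2 for this setup
aside = sky_occlusion((30.0, 0.0, 0.0), canopy)  # much smaller
print(under, aside)
```

    At render time the shader would then multiply the direct sky contribution by (1 - occlusion), as the quote describes; the self-occlusion tricks mentioned there are beyond this sketch.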
     
    IgorAherne and sqallpl like this.
  20. Thall33

    Thall33

    Joined:
    Sep 18, 2013
    Posts:
    134
    - Exactly on point.
     
  21. Hafazeh

    Hafazeh

    Joined:
    Mar 26, 2017
    Posts:
    18
    Honestly, if you present a demo as representative of the future of graphics in Unity, don't lie about the things used within the demo. It's false advertising.

    Don't use features that won't be available out of the box.
    Unity seems to be the engine that does this the most.

    This happened with the Adam demo.
     
    Shorely, thelebaron, OCASM and 2 others like this.
  22. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    @FrakkleRogg I've spent a lot of time trying to make these probes work. Is the baking feature really expected to work as-is with Unity 2018.2 or 2018.1 and the scripts in https://github.com/Unity-Technologi...Pipeline/Core/CoreRP/Features/OcclusionProbes ? I've tried a lot of things, and I mainly get to preview the already baked data (the legacy shader you linked in the original post shows the occlusion right), but I can't make the sample scene or anything else bake any new data. Also, if I manually wipe the OcclusionProbeData and create a new one, it stays forever blank.

    I spent some time yesterday updating lw/occlusion-probes to 2.0.3-preview as well (I can share the fork if people are interested, but I didn't have time to figure out how to properly place some things in the Lightweight SRP, as there had been major refactoring + I don't even care about LW myself).
     
  23. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    Book of the Dead Environment is now out with the occlusion probes set up: https://www.assetstore.unity3d.com/en/?stay#!/content/121175

    Edit: I tried the demo, and I still can't make the occlusion probes bake any new data :D I tested this on the BOTD environment and additionally on the test level from GitHub. I really struggle to understand how the occlusion baking is supposed to work in the current Unity version (I'm testing on 2018.2.0b8).
     
    Last edited: Jun 19, 2018
    elbows and one_one like this.
  24. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    It has already shipped, dude. It's on the Asset Store. https://blogs.unity3d.com/2018/06/1...-workflow-preview-and-more-from-unite-berlin/

    Also, let me be extremely clear: there will be no backlash. That stuff has been tried by many people over the years. Here, logic is used, and there is no logic to what you're talking about.

    I suggest you read the blog post, download the example project and learn what to do.
     
  25. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    They already tried it, and left a negative review complaining about having to import 2.4 GB of assets and the lack of documentation, with comments about what 'rendering features Unity has to have out of the box'.

    News flash: the term 'out of the box' doesn't always mean that much in the era of the Package Manager and scriptable render pipelines. If you are going to appeal to Unity to make these features a part of Unity, rather than customised stuff for a particular demo project, you should be appealing to them to make some of the custom stuff a standard part of the HD render pipeline, not 'please separate them' or 'please put them in the box'.
     
  26. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Nice, that's more like it.
     
    MadeFromPolygons likes this.
  27. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Not that I would place all of the blame on the user in this case, because this is a new era that would very much benefit from an extra degree of careful communication by Unity staff during presentations. The lines between Unity version, HD pipeline version and customised demo-specific work can still be a little blurry and confusing in places, and more than one thing said at GDC is likely to turn out not quite as described and, as we will probably discover in the roadmap talk in less than 12 hours' time, subject to slippage.
     
    MadeFromPolygons likes this.
  28. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Indeed, it would be nice to polish the separate pieces and have them on GitHub in little bite-sized repos, and I'm sure many people do just that. I like Keijiro's GitHub - clean little snippets.
     
  29. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    I'm not sure how well snippets fit with customising a render pipeline, though. One of the strengths of these pipelines is also a potential weakness: if customisations involve changing main pipeline code, these custom systems are no longer small amounts of code that can be dropped into any project, but require an entire specific version of the render pipeline. By the way, based on my very brief messing around with the Book of the Dead Environment asset this evening, it looks like its HDRP is a customised version of 2.0.0. I won't hold my breath to see if they keep updating the Book of the Dead stuff to keep up with pipeline version evolution - the past suggests they might not put much effort into this, but I should not use the past as my only guide.

    I have to say that at this point, given changes to shaders and possible customisation work in that area, as well as the rapid evolution of the HD pipeline, it seems tricky enough even to transplant assets such as trees from projects like Fontainebleau into our own projects that use a different HDRP version. I expect that to be even more true with the Book of the Dead Environment, although as usual there is much devilish detail (e.g. wind customisation), and I hope to be at least somewhat wrong about this over time.

    Mind you, as I've said in other threads on related subjects, this sort of 'take the pipeline and customise it for a particular project' approach does mirror the way things work for a lot of real-world games. That's one valid reality, and it's not really surprising that it rubs awkwardly against a different one: users' out-of-the-box expectations regarding specific game engine features that have been promoted at them in some fashion. A long-standing bone of contention that I don't think we will see eliminated in 2018, though I always hope the obvious things can be improved upon by Unity over time, even if it's more on the communication front than the feature reality.
     
  30. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    So... did you get the occlusion probes to bake on that project?

    After spending several days trying to figure this out, I'd love to hear from Unity staff whether probe baking is even supposed to work on the Unity versions we get. If not, the real question is: are we going to get such a version? Also, to make this perfectly clear, I'm not talking about being able to visualize the already baked data - that works afaik - you just can't bake any changes / new data atm.
     
    Shorely likes this.
  31. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    I really wouldn't file my view in a simple 'positive' category. I consider it a somewhat balanced view that certainly takes the long history of Unity into account. For example, I am complaining about aspects of Unity's communication, especially some sentences spoken at GDC. However, I also believe that some of that is due to people taking things said out of the context in which they were said. Also, I never claimed that the occlusion probe stuff works 'out of the box'. I'm really not interested in 'making logic' out of anything; I am interested in the reality, and in the gap between message, user expectations, and reality.
     
  32. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Also, there is not much point in dwelling on what someone said in March, when we've since had this thread making it clear that the reality is quite different.

    By all means make it clear to Unity that you want to see this become a first-class, supported feature in the Unity world, or complain about past communication like I often do. But to still expect something generally usable to come with the Book of the Dead demo, 2018.2 or a current version of the HD pipeline - when we've already been told that the present stuff is a customised example rather than a general feature - makes little sense.
     
  33. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Hiya. I hear you, and Unity hears you, but they have to discuss and allocate work to figure out how best to bring these things to you. Meanwhile the forum can help out here - just be patient please!

    1. In software development at the scale Unity operates at, things will slip deadlines, because A is often waiting on work from B.

    2. Natalya isn't "the graphics girl". She is Natalya. I prefer "Graphics Lead" or "Director of Global Gfx" as a title for her. Please don't be sexist - I know you didn't mean to, but I respect her too much not to mention this.

    3. Unity is aware of how troublesome it can be to get to the features you want, and after Berlin they could work on that. It's not a promise, it's an intention - I'll ping Unity's equivalent of Batman; perhaps he can chime in on a new standard snippets library or something like that: @willgoldstone

    In fact, I even argued against standard assets in the form they took back then (for different content). But in this form - cut-down feature implementation examples - it makes so much more sense: people are finding it hard to get to and understand the techniques that might be in play, because they're buried in a contextually heavy project.

    Perhaps the docs will always remain too narrow to be of use, and a whole library of feature samples like what Keijiro does would be really beneficial going forward?

    Thanks for your understanding and patience. Give it a bit of time, then poke me or staff again - no need to get angry :D
     
    AcidArrow and elbows like this.
  34. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    My main issue isn't the contextually heavy project tho - occlusion probe baking doesn't seem to work either on BOTD or on the very simple ten-cube test scene for it found on GitHub. There's not much you can cut away from the latter; it just doesn't work on my end.
     
    Last edited: Jun 20, 2018
  35. xrooshka

    xrooshka

    Joined:
    Mar 5, 2014
    Posts:
    59
    Oh. As I understand it, there is a big number of "project-specific features" in BotD, like occlusion probes, atmospheric scattering and even its own specific Lit shader. So if I want to use some of those features, I should use this specific BotD SRP instead of the out-of-the-box HDRP, or I should get into the code and adapt some features into my own SRP. And with both choices I'll have difficulties with official feature updates.

    This working pipeline is quite useful for commercial teams making their own projects with their own project-specific features, but not for students just trying to slap their art into Unity to make some interactive experience.

    By the way, I'd be happy to see these cool features - Occlusion Probes, Atmospheric Scattering, SSRR, autofocus - as part of the HDRP package.
     
    hippocoder likes this.
  36. xrooshka

    xrooshka

    Joined:
    Mar 5, 2014
    Posts:
    59
    Well, there was a question of whether anyone has made the BotD Environment's occlusion probes work. I've done some tests on a duplicate of the AssetLibrary scene: I just disabled the native OP game object and made the same one by hand, and I saw similar results. So I tried making a quick scene with a denser forest to see a stronger effect. The result didn't seem very strong until I realised that the trees by themselves will not give you as much sky occlusion as needed. So I scaled up the rocks to get a clear effect.

    Now you can see the Occlusion Probes in action:

    upload_2018-6-21_2-54-14.png upload_2018-6-21_2-54-29.png

    I'm using Unity 2018.2b9. Maybe that's important.

    By the way, after these tests I think this tech may be useful when you work with an area that's highly occluded by landscape features, but it doesn't do much on a plain. I'm interested in how it will work with buildings with indoor/outdoor areas.
     
    Last edited: Jun 21, 2018
  37. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    The Progressive Lightmapper is used to bake the occlusion probe data.
     
  38. Heechee

    Heechee

    Joined:
    Jul 3, 2015
    Posts:
    4
    In the project, in the folder Features -> BakeGrassOcclusion, there is a scene with the logic to create the occlusion probes. The occlusion probes script creates a bounding box on the object to occlude, and the SaveOcclusionToTexture script stores it in a texture. Is this a demonstration scene or part of the real demo, i.e. are the occlusions created per object and then combined in the final scene? And what does the strange MatchBounds script do? This project definitely needs documentation.

    OclussionScene.jpg
     
  39. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    Ok, so it's just me :D But it's nice to hear it can actually work on the engine versions we get. I'll have to try to reproduce this myself on that other sample; I've tried the actual BOTD level and the other sample level without success. I'm using the latest 2018.2.0b9 as well.
     
  40. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    Quote from the official BOTD environment thread:
    "It's a feature that will make its way into HDRP one way or another eventually"
     
  41. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Don't take that as gospel, though. Not on the roadmap? Not coming - now that's official! :) Remember, this is the sort of thing the community grabs hold of and pretends is sanctioned and a cast-iron promise. It's not. It's a maybe, a what-if at best, with zero specified time.
    Just a heads up - I have had a lot of people try to manipulate Unity over time, and since Berlin it's ramped up, so I'm just making that clear.
     
  42. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    I've been trying this again, but I really can't make the bake do anything useful. I suspect the issue lies in the OcclusionData asset itself.

    @xrooshka How did you handle it? I can't create a new OcclusionData asset by default (I actually tried coding a utility that lets you create these, but it didn't help). If I assign an existing OcclusionData object to it, it just shows what's already baked there; I can't alter this data. AmbientProbeData is generated once you create a subfolder with the scene name and hit bake, but the issue atm is how we are supposed to create the OcclusionData.
     
  43. Heechee

    Heechee

    Joined:
    Jul 3, 2015
    Posts:
    4
    I think the OcclusionProbeData asset cannot be generated manually. The Progressive Lightmapper creates it automatically when the bake process ends. I tried erasing the existing one in the BakeGrassOcclusion scene of the project, and it was indeed recreated.
     
    rz_0lento likes this.
  44. xrooshka

    xrooshka

    Joined:
    Mar 5, 2014
    Posts:
    59
    You should leave the data fields empty and then just bake the scene lighting. After the light bake, the data assets will be created automatically.
     
    rz_0lento likes this.
  45. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    Oh, I actually got it to bake the data for once on a fresh scene. This is totally weird though - it literally did it only once, and then when I changed the scene a bit and hit bake again, it refused to start from the bake button on the Occlusion Probes component.

    I think this is the same issue I've had when trying to rebake the existing samples. That being said, I can get it to bake the occlusion probe data again by going to Window -> Rendering -> Lighting Settings, turning Auto Generate off, and manually generating lighting there. That updates the occlusion probe data in the process. Somehow, if I leave it on auto, it will not update (it still bakes the lightmap again, but not the occlusion probes, and if I hit bake on the Occlusion Probes component, it just ignores it).
     
  46. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    Oh right - now that I read the button again, the bake button is for the ambient probe only. :D Well, that explains things.
     
  47. willgoldstone

    willgoldstone

    Unity Technologies

    Joined:
    Oct 2, 2006
    Posts:
    794
    Hi all @hippocoder thanks for pinging me on this. Just catching up just landed back from Berlin Unite so apologies for delay.

    This is something that pains me a lot and we know we need to get better at. The UX of forcing artists to go to github for something we call a feature (in any official capacity) isn't what we aspire to, and we need to improve that a lot - we know this but I totally appreciate you raising it, it's valid.
    That said, we have a huge undertaking in both the LW and HD render pipelines. It's a whole new world (don't you dare close your eyes), and we're at a point where some super smart people are racing ahead to bring the tech to somewhere mature enough to no longer be called Preview. I know that in reality a better UX and fewer new features would probably be appreciated, but we're balancing the want for the latest tech with our need to share and get feedback from all of you; it's not easy. Package Manager is one way we're trying to make that easier, and hopefully it'll help.

    In instances like this where we make demos, those of you above who said this are totally right: we do make demos to make Unity look good, because it CAN achieve gorgeous-looking things these days, with more cool stuff in the past two years than ever before (trust me, I've been at this 7 years!). In the past we did ship things with bespoke builds and tech you couldn't wrangle at all, but these days we ship on the same Unity as you all get your hands on, including those folks on Personal Edition; we literally try to give the widest accessibility to the highest-end rendering tech we can. So does that mean we have highly experienced tech artists and art directors like Robert and Vess? Absolutely. They're here to inspire you, and (personal opinion) I think they do a fantastic job; I look at this as a good thing. We couldn't really build the tech and then not take it to its ultimate limits. Moreover, it's not the job of those folks to turn that extended tech into a feature, be it a package or otherwise, so yes, it often ends up on GitHub. I of course acknowledge that if we pitch certain things as a feature, they should become one within a reasonable timeframe, so I'm not discounting the example that spawned this thread.

    So the key takeaways for us are: we need to prioritize our strategy of getting UX onto artist-centric features where it makes sense (we really do discuss this! And I want to be clear that it's a known problem). The key takeaways for you are that we have a ton of cool stuff landing, and we really appreciate your patience, but there are some things that are more important to get right about the HD render pipeline than others. Additionally, we're not going to stop extending our systems (and sharing that with you) for demos just because we can't give it the best UX on version 1.0.

    Hope that makes sense all, happy to be challenged on any and all of it.

    ps. I know the issue here came in the form of standard assets, but I don't think that because this feature is an example now, it should stay that way. I do want to do a new standard assets effects pack, but it's further off, and our team is focused on character controllers at the time of writing.
     
  48. xrooshka

    xrooshka

    Joined:
    Mar 5, 2014
    Posts:
    59
    Sounds like we need a post about how the audience keeps rushing things along, confusing 'feature' and 'preview' ))
     
  49. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,982
    Will, a fantastic post, but I do believe that summing it all up in a blog post might be very beneficial. For every vocal member here, there are likely 10 who don't bother to come here to voice it but still feel the same way.

    For me, most of the issue is our expectations vs. the truth, and the fact that now that we are getting all the stuff we moaned about for ages, not everyone knows how to react. People are acting like Unity is doing what it has always done, despite literally almost every pet peeve the majority of the community has had since I watched Unity 5 drop being touched on (who's going to say 'Unreal this, Unreal that' now?), and I feel like just letting people know that the team discusses things much more frequently and in depth than before, and that XYZ are being thought about, will help people stop running around like headless chickens.

    I like to think that once we get past 2018.3 we will have fixed a lot of long-standing issues, making keeping the engine cutting edge + maintaining good comms over it a much less daunting task.
     
  50. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Fact is, most devs still don't make proper or full use of the Unity 3.5 feature set, let alone all the new bells and whistles... :)
     
    soleron and Dreamora like this.
Thread Status:
Not open for further replies.