Feedback The State of Unity & Packages in 2020

Discussion in 'General Discussion' started by smcclelland, Mar 5, 2020.

Thread Status:
Not open for further replies.
  1. smcclelland

    smcclelland

    Administrator

    Joined:
    Dec 19, 2016
    Posts:
    147
    Thank you for the clarification on the points! I've updated my notes and added a few new ones based on the clarification and details you added.
     
  2. smcclelland

    smcclelland

    Administrator

    Joined:
    Dec 19, 2016
    Posts:
    147
    Any examples or things you feel package manager shouldn't be doing?
     
  3. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,979
    That's great, as they are the main team that has been avoiding the difficult questions, which is wrong since they are the main ones regularly making such massive and breaking changes. So they deserve to be questioned, and we deserve to hear their reasoning behind the current direction and choices.

    It's not that they never answer, of course they do. But as soon as we ask about surface shaders (which would alleviate most of the graphics-related strains), it immediately goes dark with zero communication back. Avoiding the questions is never better than giving an answer people don't want to hear. It's kicking the problem down the road and compounding frustrations for a bigger explosion at a later date.

    If we can just get a single clear answer back about surface shaders, I promise I will shut up about any graphics-related whining for more than 2 months.
     
    Awarisu, a436t4ataf and pm007 like this.
  4. transat

    transat

    Joined:
    May 5, 2018
    Posts:
    779
    +1 for not having breaking changes to a package's API unless there's a new major version. But that doesn't mean a major version should necessarily have breaking API changes; it could just have new features, of course.
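    [Editor's note] The rule transat is asking for is standard semantic versioning. As a minimal, language-agnostic sketch (purely illustrative, not Unity's actual policy or API), a tool could flag when taking a package update may involve breaking API changes:

    ```python
    def may_break(current: str, target: str) -> bool:
        """Under semantic versioning (semver.org), only a bump of the
        MAJOR component is allowed to contain breaking API changes."""
        cur_major = int(current.split(".")[0])
        tgt_major = int(target.split(".")[0])
        return tgt_major > cur_major

    # A minor/patch bump should be safe to take automatically:
    print(may_break("7.1.8", "7.2.0"))   # False: same major, no breaks allowed
    print(may_break("4.10.2", "7.1.8"))  # True: major bump, read the changelog
    ```

    A package manager following this convention could auto-update within a major version and require explicit confirmation across one.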
     
    konsic likes this.
  5. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,609
    I like how specific you are :D
     
    pm007, MadeFromPolygons and transat like this.
  6. transat

    transat

    Joined:
    May 5, 2018
    Posts:
    779
    @smcclelland What a lot of people here would like an answer to (unless it's miraculously been answered recently in another thread) is: what's the plan - if there is any - regarding surface shaders, or some kind of abstraction layer that would make upgrading considerably less painful for everyone? If you're not able to answer this for some reason not within your control, give us a wink emoji at least. :)
     
    MadeFromPolygons likes this.
  7. smcclelland

    smcclelland

    Administrator

    Joined:
    Dec 19, 2016
    Posts:
    147
    I've passed a note onto our graphics lead here in CPH to see if someone can follow up directly on this matter. It's mostly out of my control aside from being able to relay information. My area of expertise and focus is around the internal developer workflow and how we can build tools/services/infrastructure that allows us to develop, test, and ship Unity/packages in a better way going forward. ;)

    Happy to carry on the conversations and gather requirements regarding surface shaders, just want to set the frame that I'd just be a middleman in getting that feedback to the graphics team.
     
    JoNax97, transat and MadeFromPolygons like this.
  8. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    995
    @smcclelland

    Can the API be implemented as a modular system? That way older packages would keep working on newer versions of SRP without breaking.

    Take Blender's .blend format as an example. You can load a 2.6 .blend file in any newer Blender, such as 2.8, which upgrades the older data to the newer format. You can still use it.

    Unity's SRP is more complex than that, but it should be built with a modular, holistic approach.

    In Unreal, upgrading a simple project from 4.19 to 4.25 doesn't break it. The project just needs to recompile shaders and whatnot, but you can run it.
    Upgrading a project from HDRP version 4 on Unity 2018.4 to Unity 2019.3 with HDRP 7.2 breaks the whole project.

    This is why Unity should take a holistic and modular approach.
     
    pm007 likes this.
  9. smcclelland

    smcclelland

    Administrator

    Joined:
    Dec 19, 2016
    Posts:
    147
    Keep in mind, I simply said "I am not aware of", not that we don't. I know when I was at Autodesk we had them for various products and compliance reasons. That said, looking across the software landscape, it really varies what levels of certification certain software products require (most are obligated due to the industries they operate in).
     
    MadeFromPolygons likes this.
  10. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,526
    Package manager feedback:

    • Tags: it would be nice if tags could be added inside Unity, not just from the Asset Store website
    • The "Load more" button breaks the workflow when you have lots of Asset Store packages
    • Select multiple packages to import or update all at once
    • Import packages as a background process
    • Keep track of the current project's packages (My Assets too)
    • Keep track of the current Asset Store version in the local project
     
    transat likes this.
  11. smcclelland

    smcclelland

    Administrator

    Joined:
    Dec 19, 2016
    Posts:
    147
    That would entirely depend on a few things, such as the scope of the API, dependencies, and whether it's relatively isolated or requires other core changes. Backwards compatibility (using an older version of something in a newer version of the software) is fairly straightforward if you have the right upgraders and handlers in place. That's effectively how Blender/Maya and the DCCs work: you can generally move forwards fine. The challenge is forwards compatibility (modifying data in a newer version in a way that renders it incompatible with previous versions, or requires backporting).

    As someone who comes from a DCC background, the challenge isn't so much the scene files themselves but all the dependencies around them. It's rare in DCCs for the file format to change drastically, but occasionally you will have entry points that need upgraders/handlers to move data. In game engines, it's the surrounding bits that bite you: an entire lighting system gets swapped, so the scene is still valid, but it no longer has the same data (Beast -> Enlighten). These are breaking changes that there really isn't a good way to handle, because they're entirely different systems, and moving forward breaks forwards compatibility (a scene opened in that version will no longer function in an older version). If Blender swapped out a major system they'd experience the same thing: even though your file may load into the new version, it will likely have some errors/issues to be fixed.
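    [Editor's note] The "upgraders and handlers" pattern described above can be made concrete with a small sketch. This is a hypothetical illustration in Python (the function names and data fields are invented, not Unity's or Blender's API): each upgrader migrates data exactly one schema version forward, so any older file can be walked up to the current version, which is what gives backwards compatibility.

    ```python
    CURRENT_VERSION = 3

    def upgrade_1_to_2(d):
        d = dict(d)
        d["lights"] = d.pop("lamps", [])  # field renamed in schema v2
        d["version"] = 2
        return d

    def upgrade_2_to_3(d):
        d = dict(d)
        d.setdefault("color_space", "linear")  # new field with a default in v3
        d["version"] = 3
        return d

    # One upgrader per schema version, chained at load time.
    UPGRADERS = {1: upgrade_1_to_2, 2: upgrade_2_to_3}

    def load(data):
        # Walk the data forward one schema version at a time.
        while (v := data.get("version", 1)) < CURRENT_VERSION:
            data = UPGRADERS[v](data)
        return data

    old_scene = {"version": 1, "lamps": ["sun"]}
    print(load(old_scene))
    # {'version': 3, 'lights': ['sun'], 'color_space': 'linear'}
    ```

    Note the asymmetry the post describes: this chain only runs forward. Nothing here makes a v3 file loadable by v1 software, which is exactly why forwards compatibility is the hard direction.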

    Yes, this is because a lot of Unreal doesn't fundamentally change at the core level. They will add a lot of new systems and do migrations to new systems and then deprecate the old path. I've rarely seen them just outright refactor an entire system or make a major core modification since so much of their productions and pipelines become dependent on it for better or worse.

    These are all valid points, with inherent benefits and trade-offs on both ends. I'm not justifying migrations across major versions breaking the whole project; I'm saying that major versions *can* come with some breaking changes (that's why they're major versions), just not in an utterly catastrophic way.
     
    Lars-Steenhoff likes this.
  12. transat

    transat

    Joined:
    May 5, 2018
    Posts:
    779
    Totally agree with all those points (which I've tried to make in other threads). My assumption is that Unity doesn't do any user acceptance testing as such. They really, really should. And one of the user groups they should be testing with is what I call the "valued customer who has over 1,000 Asset Store packages". As it stands, Unity punishes you for having bought too many assets.

    On that note, @smcclelland why oh why am I still not given an option of where I’d like the asset store packages to be downloaded to?? I use a symlink to an external hard drive, otherwise “My Assets” would totally eat up my macbook’s hard drive space. This is the hack recommended to us by UT staff on one of these threads SEVEN(!!!) years ago. I’d love for the editor to have this as a preference in much the way every-single-other-app-that-I-can-think-of-with-such-a-library does. :rolleyes:
     
    Last edited: Mar 12, 2020
    n3xxt, AnvilNight, a436t4ataf and 2 others like this.
  13. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,979
    That's great, thanks! And again, thank you for your continuing engagement; it's really refreshing and heartening to see :)
     
    pm007, transat and Peter77 like this.
  14. transat

    transat

    Joined:
    May 5, 2018
    Posts:
    779
    @smcclelland An example of what we've mentioned... The 'Platforms' package has seen 3 updates since the 25th of Feb, but the changelog mentions none of them. So I have no idea if the version released today is the one that will allow me to upgrade 'Serialization' and 'Properties' without my project breaking. 'Burst' also has a new version out today - also unmentioned in its changelog. That's 2 misses out of 2 for today. Tip: tell ALL your teams that they are forbidden to release a package without updating the changelog first.
     
    StephanieRowlinson likes this.
  15. smcclelland

    smcclelland

    Administrator

    Joined:
    Dec 19, 2016
    Posts:
    147
    @transat - good question on My Assets. Let me ping the Asset Store and Package Manager teams to see if there are any plans to address that. Also, regarding the changelog: yeah, that's something we can pretty easily enforce in CI or through our PR process to make sure the changelog captures the right points. Noted; I'll bring it back to our source team.
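    [Editor's note] The CI gate described above can be as simple as refusing to publish any package whose changelog has no heading for the version being released. A minimal hypothetical sketch (the function and changelog layout are illustrative, assuming the Keep-a-Changelog style of one `##` heading per release; this is not Unity's actual tooling):

    ```python
    def changelog_covers(changelog_text: str, version: str) -> bool:
        """Return True if some changelog heading mentions the version
        being published; CI would fail the release when this is False."""
        return any(version in line
                   for line in changelog_text.splitlines()
                   if line.startswith("#"))

    changelog = (
        "# Changelog\n"
        "## [0.3.0] - 2020-03-12\n"
        "- Added X\n"
        "## [0.2.1]\n"
        "- Fixed Y\n"
    )
    print(changelog_covers(changelog, "0.3.0"))  # True: release may proceed
    print(changelog_covers(changelog, "0.3.1"))  # False: block the release
    ```

    Run as a required step in CI or as a pre-merge check, this makes it impossible to ship the "3 updates, zero changelog entries" situation transat reported.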
     
  16. bugfinders

    bugfinders

    Joined:
    Jul 5, 2018
    Posts:
    1,729
    Thanks for asking :)

    Why does the package manager need to be an asset manager too? I get reworking the asset manager, as the old one has had some major issues, but at the same time combining it all together seems unnecessary, even if they have some commonality.

    Packages on the whole are more like external libraries: you get what you're given. With assets, you are putting them into your project as if you had made them yourself; you can easily change, remove, and move assets, and you can't with packages.

    The new package manager doesn't let you browse through assets you might like and buy them or get the free ones. You also can't easily filter on ones not downloaded, or ones that have been updated. Sure, that's fine if you keep on top of it, but if you only update a couple and time passes, it gets messy. You can't seem to easily download more than one at a time, nor does it list recent changes like the old one did. For me, it fails as an asset manager. It also defaults to showing only a few, and most of those are cheap, nasty ones you can't seem to hide any more.

    So, on top of the fact that it also doesn't work behind corporate firewalls... I sadly am not entirely feeling the love.
     
  17. JoNax97

    JoNax97

    Joined:
    Feb 4, 2016
    Posts:
    611
    First of all, I apologize if I came across as rude.
    Secondly, I read this as "This is such a valuable opportunity, please don't fade into obscurity"
     
    Last edited: Mar 12, 2020
  18. smcclelland

    smcclelland

    Administrator

    Joined:
    Dec 19, 2016
    Posts:
    147
    Nah didn't come across as rude at all! Just had to laugh that someone checked the last time I logged in :)
     
  19. quixotic

    quixotic

    Administrator

    Joined:
    Nov 7, 2013
    Posts:
    122
    Re: Surface Shaders
    I'm on the Graphics team (summoned here via @smcclelland). Right now we're actively discussing surface shaders. We will follow up in the next week, and we will create a separate thread (which I'll link to here) to let this thread stay focused on packages.
     
    n3xxt, jbooth, Lars-Steenhoff and 7 others like this.
  20. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    First, there's one thing that isn't clear at first glance: are Tech releases even intended to be used for shipping games by the Unity team? The FAQ at the bottom of the release page states:

    TECH stream releases are for developers who want to access the latest features and capabilities. This year we’re shipping three major TECH stream releases (2019.1, 2019.2, and 2019.3). We add updates and bug fixes to the current TECH stream release on a weekly basis until the next TECH release is officially launched, then the cycle begins again.

    The last TECH stream release of the year becomes a Long-Term Support (LTS) release and receives continued support for another two years. In terms of versioning, we increment the final TECH stream release of the year by one and add “LTS” (for example, TECH stream release 2018.3 became 2018.4 LTS).

    Unlike the TECH stream, the LTS stream does not have any new features, API changes or enhancements. Instead, any updates address crashes, regressions, and issues that affect the wider community, console SDK/XDKs, or any major issues that would prevent a large number of developers from shipping their games or apps.

    LTS releases receive bi-weekly updates. The LTS stream is for developers who want to continue to develop and ship their games/apps on the most-stable version for an extended period.

    If you are in production or close to release, we recommend the latest LTS release. If you want to use the latest Unity features in your project, or are just getting started with production, the TECH stream is recommended.

    From those answers, there's a major emphasis on using LTS for shipping and Tech for new projects, unless a project absolutely needs a feature that only exists in a Tech release, with the implication that projects using Tech should eventually be updated to LTS. So shipping on a Tech release sounds like it is intended to be the exception, not the norm.

    As for solidifying the releases, you need better QA that more closely reflects how Unity is being used in the wild. I've lost count of Unity staff replying to forum bug reports with bafflement at issues that could easily be reproduced by doing X on Y device. There seems to be a heavy reliance on people sending their entire projects via the bug reporter, which is unfeasible for anything larger than a hobbyist project: bigger teams have contracts and NDAs, and their projects can run into the tens of gigabytes.

    I've said this quite a few times, but here it goes again: Unity needs some actual game projects of their own to better battle test their tools and reduce the disconnect between how the Unity teams and how Unity users think. We need people inside Unity struggling with their games crashing on certain Android devices, fighting to hit 60fps on Switch, having to wait minutes to open a project with too many assets.

    That would help, yes. Following semantic versioning would help a lot.

    Unfortunately, no. There's little transparency on what drives the decision to move a package from preview to verified, and as said before there's a lack of confidence in Unity's internal testing being reflective of real world use cases. Even if a package is verified, I feel I can only trust it after reading about other users' experiences with it, gauging forum bug reports, and browsing the source code for a while.
     
  21. optimise

    optimise

    Joined:
    Jan 22, 2014
    Posts:
    2,129
    Hi @smcclelland. I'm here to follow up on the above 2 matters.
     
  22. JoNax97

    JoNax97

    Joined:
    Feb 4, 2016
    Posts:
    611
    One thing to consider is that, now that you've reduced the TECH releases from 3 to 2 per year, 50% of releases get to receive long-term support. At this point, is it even worth it?

    Why not give all releases 1 year of support, where the first 6 months (roughly until the next release) allow all kinds of fixes and improvements, and the remaining 6 months are treated with LTS-level strictness (because at that point, the bigger changes are going into the latest release)?

    IDK, just a thought. I still vote for having a non-fixed release schedule, though.
     
  23. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    I had all my plans laid out to locate and corner one of them at GDC, but the event got cancelled... oh well.
     
  24. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    This, though they probably have their fair share of the problems highlighted by default.

    However, the reason this is brought up often (and I see my post got overlooked by the Unity rep) is a matter of internal culture, how it impacts organisational structure, and the results it produces. I.e., making a game on a deadline with tight resource constraints and a certain ambition. We saw the impact on Unity when the game ReCore made it plainly apparent that they were not ready for big projects, even though it wasn't an internal project. Small educational projects don't count, and feature demos are generally hacked together outside Unity to get results. Worse, the solutions they used were then never implemented in the core engine: no wrinkle shader nor character-specific shadows since The Blacksmith, no occlusion volumes since Book of the Dead...

    The reason I opened with the Henry Ford quote (faster horses) is simply to highlight the discrepancy between the core of a problem and the solution applied to it, which is something Unity struggles with. In that case, Henry Ford solved the transportation problem, which is what "faster horses" implies. It could be argued that horses are superior to cars: they don't mindlessly jump off cliffs, they can carry things on their own through "self driving", they can handle many terrains and not just specially made infrastructure, and you can leave them to graze grass to refuel, with no need for expensive and sparse fuel dispensers. YET cars won.

    What I'm pointing at is that Unity might be in a position where it heals symptoms and not the problem, so they are bound to be stressed every time a fire breaks out that needs to be urgently put out, and there will be a temptation to ignore other, smaller fires because there is always something urgent to do, up until those grow into the next important thing and everything needs to be dropped to deal with them...

    And I feel like we are kind of seeing that with Unity finally addressing surface shaders, which is a typically innovative solution that no one asked for before Unity did it, and which was so great it's all people want. It's not perfect, but it's what cars were to horses: it solved a core issue. Which is why I picked it up in my first intervention. BUT I don't think addressing surface shaders ALONE will be enough; it's just the tip of the iceberg. There was much more in Jbooth's various rants, and I hope those were picked up too, and not just "surface shaders are what people want", because I fear it will end up like performance by default, which is the reason surface shaders were removed in the first place (avoiding multipass, but without addressing the reason surface shaders existed in the first place).

    I mean, one recurring complaint is that it's "the right idea but the wrong implementation". That's the hint about symptoms vs the problem. @smcclelland If you want to make your life easier, pay attention to this.

    That's why I'm not directly talking about the direct issues I have with packages and the whole LTS thing; those are symptoms, and fixing them won't change the whole matter, Unity will just move on to the next fire. I've already been burned multiple times. I've had to kill projects and refuse work, just because Unity is so goddamn unstable, not in code, but basically in its offering. I have already paid a huge personal price for investing in Unity, because I wasn't lucky enough to be on the right side of the stuff that didn't move too much. And I see the same cycle repeat, just in a different place, every time.
     
    StephanieRowlinson likes this.
  25. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    I admit I totally didn't read this wall of text (all the posts in this thread), so forgive me if I'm repeating points that have already been made here.

    I follow a few Unity communities closely and I see the same pattern surfacing all the time: basically, people hating the package workflow mainly because the various previews don't play ball together. It often has nothing to do with the package workflow itself but with new features being developed separately, with full integration truly only happening on released/verified packages.

    This has always made sense to me (features in early previews won't have full integration with other preview packages), but for things that take years to finish (SRPs, Input, etc.) you'd expect some collaboration between the teams even while a feature is in preview.

    I also feel that people still don't fully understand what "preview" means and publicly bash the state Unity is in at the moment simply because preview packages exist out there. I feel this is partly due to Unity promoting upcoming features on blogs and at events (like it should) but not always adding disclaimers (like it should) that these are still in preview, and that it may not be all smooth sailing if you start using them before they are marked ready for production.

    I'd like to think people would know by now what "preview" means, but since we still see page-long rants even from Unity users who have been around for a decade, I'd try to make it even more obvious.
     
  26. smcclelland

    smcclelland

    Administrator

    Joined:
    Dec 19, 2016
    Posts:
    147
    Apologies, I missed these! Regarding the first query, the team is currently investigating it and will update the community once they have more info to share. Regarding the second one: the cause is a complex code base and a lack of good internal tools that let us reason about the impact a change will have. That's something the group I work with is tackling now, and I outlined some of our ideas on how to tackle it in this post. I think this will solve not just our internal problems but also the issues our customers currently face trying to bring together working versions of packages and editor versions. I need to do more validation on that, though, to ensure we're on the right path :)
     
    Last edited: Mar 13, 2020
    optimise likes this.
  27. transat

    transat

    Joined:
    May 5, 2018
    Posts:
    779
    Actually there are 2 TECH releases and then LTS. Though the 2nd TECH release does itself become the LTS one, it is considered separate, as far as I understand. Seems to me this is pretty much the alpha -> beta -> release (LTS) model, but using confusingly different terminology, and also setting dates as somewhat arbitrary deadlines that create false expectations (cf. 2019.3) and come back to bite the marketing department.
     
  28. transat

    transat

    Joined:
    May 5, 2018
    Posts:
    779
    @smcclelland Here are some findings regarding My Assets in the 2020 Package Manager:

    If I sort by "Update date" (for this I need to remember to click on "Update date" with an upwards-pointing arrow next to it if I want to see the latest updates first), I am then indeed able to see packages that were recently updated in the store. But unfortunately these are actually old versions, not the newer ones from the store; that's what the version numbers and release dates tell me. Initially I thought this was just the PM showing the wrong version numbers (otherwise why would they show up in the sort), but actually I'm pretty sure now that they are just the wrong versions.

    What I usually do to be safe is switch over to 2019.3 and update the packages in the in-editor store there (not the PM, as there is no sort filter, at least not in 2019.3.1f1, and I have 1000+ assets, which load a few at a time). Then I restart 2020, try to remember which ones I have in my project, and reimport them there. But what I've noticed today is that if I update the assets in 2019.3, all I need to do is click refresh in the 2020 PM (not even sure this was needed, actually) and the packages suddenly show the right versions. I can save a whole 5 minutes this way by not having to quit and restart 2020! :)

    Could you please pass this on to the Package Manager team, in the hope that they finally deal with a bug that has been there all through the alpha cycle (and now into beta)? It doesn't appear to be a "known issue" (although you'd assume even the minimum amount of testing would have picked it up).
     
    Lars-Steenhoff likes this.
  29. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,526
    Nice to see another user with over 1000 packages from the store!

    Can we please make the "Load next" button optional? I don't mind the waiting, as long as the results are cached after loading and the package refresh from the server only happens once in a while, not every time.

    Perhaps once per day would be enough.
    @smcclelland

    And I would love to know whether managing tags (adding, removing) will come to the package manager?
    Then at least I'd be able to manage my packages.
     
    bugfinders and transat like this.
  30. unitedone3D

    unitedone3D

    Joined:
    Jul 29, 2017
    Posts:
    160
    Hi there! Just my 2 cents.
    I can't add much further because everything has been said, but I will say Unity Technologies is doing a great job. Still, there is very real uncertainty when we see new stuff appearing. Preview packages are great, but they can have game-dev-breaking bugs. Bugs are truly the bane, because if you are less skilled it's like facing one near-insurmountable obstacle after another. Nobody said game development was easy (except the very experienced; once you know, you know), and part of the job of game dev is problem solving (a quality). But like someone said earlier: we wish to make games now, not in 5 years when things function.

    The many new features are welcome, and there's lots of great stuff, but it can be a bit confusing, and that uncertainty creeps in: do I go LTS, do I stay with the version I have, do I try Preview, will it all crash, why do I get a million console errors? I feel I have a broken tool. I'd be the first to say that developers must use caution and just stick to something at some point (especially if a project is further along). I was on Unity 5.1 for a while, then I upgraded to the 2016 version, then 2017, then new stuff came and I upgraded to 2018... I decided this has to stop. I upgraded because I wanted the new features, and many are really good, but choosing between Preview, alpha, beta, zeta, omega is like playing roulette. Again: bugs. We are constantly worried about them as they pop up in new versions. I fully understand (or at least imagine) that no software is devoid of problems, but if you read the Unity forums it feels like: "OK, I guess I won't become a game developer; there are 2 trillion posts about bugs." It's a bug invasion; many are fixed (it's amazing what you do), but then new stuff breaks later. That's scary, because some people are trying to rely on this tool, and an unreliable tool makes your work a gamble: you just hope it works, and when it doesn't, you look bad.

    Don't take anything I said the wrong way. The Unity engine has improved immensely (I think), but some features have had more time spent on development instead of concentrating on everything working together (well enough, let's say): packages, scripts, everything in unison. I hope to see zero console errors in Unity in my lifetime (a dream; we can keep on dreaming). I know game dev is "roll with it, there will be problems, deal with it, make it work (somehow)". For example, I'm wondering whether I should go for Unity 2019.4 LTS or stick with 2019.1f, which I use right now (I think I'll stay with that one forever). I really want to see those new things, but stability? Bugs? The uncertainty hits every time I load the project. Games must be made, and I know Unity said to stick to a version when far into production, too late to switch. But some new features are worth switching for... then come the bugs and instability and the possibility that nothing works, so you must downgrade back to 2019. That happened to me, and it is time-consuming and stressful. I wanted the raytracing, but it's in Preview and needs an RTX or 1080 Nvidia card, so now I think: forget it. If I upgrade to Unity 2019.3 or 2020, or wait for 2019.4 LTS, who says many other things won't break? Yes, we must try, but these new features tempt us while also putting in that fear that they will bug out the project. For now I'll stick with good old raster fake-it tricks and make do with miracles there. When a solid version of Unity comes out, 2021, 2022... 2025?, I will change then, because it will be "safe enough", and I've already upgraded 10 times in almost the last 4 years. Games can't be made with stability issues; we wish to make games now, not in 10 years when it might work. I'm not trying to sound pessimistic (I apologize, because I admire and thank Unity immensely; without you, the games would never exist).
    Thanks for reading.
    Just my 2 cents.
     
  31. AdamGoodrich

    AdamGoodrich

    Joined:
    Feb 12, 2013
    Posts:
    3,782
    Hey there,

    We are the authors of Gaia, CTS and a bunch more.

    I have not read the wall of text as it is late, but I can share my experience as a publisher of a bunch of popular assets on the store.

    1. Pipelines... there are just 3: Builtin, LW & HD... um, nope, LW is now URP, so that makes 4... but in reality there are a stack more, because there are sub-releases within every major Unity release.

    Our assets have to "just work" across all Unity versions, and all pipeline versions because if they don't people complain, and then 1* us.

    The decision to support the different pipelines for things like terrain water, trees etc, which Unity doesn't have adequate support for has been an expensive nightmare.

    Case in point 2019.1...

    CTS 2019: There have been 5 releases of the various pipeline packages, and CTS has 4 different terrain shaders. So that's 4 (shaders) x 5 (pipeline package releases) x 2 (render pipelines) + 1 (builtin) combinations that we have to support. Crazy.
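    [Editor's note] Spelled out as code, the support-matrix arithmetic in the post above (reading the "+ 1" as one extra combination for the built-in pipeline, as written) is:

    ```python
    shaders = 4            # CTS terrain shader variants
    pipeline_releases = 5  # SRP package releases within 2019.1
    srp_pipelines = 2      # LWRP + HDRP
    builtin = 1            # one extra combination for the built-in pipeline

    combinations = shaders * pipeline_releases * srp_pipelines + builtin
    print(combinations)  # 41
    ```

    And that count is for a single Unity release; each new editor version multiplies it again, which is the core of the complaint.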

    Now we move to 2019.2, same sort of deal, and 2019.3... already 3 pipeline releases, LWRP went away, and URP was released..

    PLEASE. Just 1 version per unity version and keep them in sync!

    2. Cross version pipeline compatibility and shadergraph.

    So we thought we would move from Amplify to Shader Graph, and did a bunch of work on custom nodes in 2019.3, thinking we could convert this back. Nope.

    Let us control Shader Graph across versions via script; then at least we could control what gets injected into the graphs and guarantee some sort of consistency and quality. Being able to automate, instead of manually recoding graphs across all these versions, would be much appreciated. At the moment it is an incompatible mess.

    3. Package manager.

    Can't wait until it is integrated with the asset store and made backwards compatible across all Unity versions. Lack of backwards compatibility kills it for us.

    Changing 1 line of code to release a patch and then forcing our customer base to re-download and install the whole thing is expensive for Unity, and a poor experience for our customers.

    4. Scripted package installation

    We scripted this up at one point, and then ended up removing it. Trying to support people switching between render pipelines and making sure everything just works was fraught with danger.

    5. LWRP, HDRP, URP etc etc

    We get poor reviews when our stuff is behind the latest thing in the Unity Hub. Everyone sees the latest Unity tech demo and expects our stuff to support it. Better messaging about its preview status would be a great help.

    Perhaps not purely package based... this lack of clarity and compatibility within and across Unity versions continues to cost us a bomb to support... and then people 1-star us and call us "greedy" because we now charge cheap annual upgrades to try and offset the cost.

    6. Latest facepalm moment

    At the time Unity 2019.3 was released, the verified version of HDRP was 7.1.8. So we updated CTS and rebuilt our shaders to support it. Unity then released HDRP 7.2.0, and we are now getting reports from our customers that CTS is broken again.
     
    Last edited: Mar 13, 2020
  32. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    It's considered separate in naming only. Upgrading from the last Tech to the LTS is no different than upgrading to a patch release: no API changes, no sudden feature removals, no "everything is pink". Just the traditional "hold on, reimporting all assets from scratch to remind you to not ever dare making large games on Unity".

    Which reminds me: we need to talk about the Library folder. Even with a cache server it takes way too long to rebuild. Since upgrading to even a patch release forces a Library rebuild, it discourages developers from upgrading Unity due to the significant downtime it can cause. On projects that package to under 4GB, a full rebuild takes around two hours in my experience. That means that, for an AA or AAA sized project, it could easily consume an entire working day.

    I also work with Unreal projects, and there when people check out a fresh project from source control it opens up no slower than if they had opened it for the 5th time, because imported asset data is stored in the assets themselves and is forward compatible. A "cache server" is used only for platform-specific cooked data and compiled shaders.

    There needs to be an effort to study new forms of storing imported asset data in Unity that can be reused across engine upgrades, and be stored in VCS, so that only the person who first imported the asset has to go through the waiting.
     
    Last edited: Mar 13, 2020
    n3xxt, JoNax97, pm007 and 2 others like this.
  33. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,526
  34. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,979
    Strong agree on all points related to asset imports and Library rebuilds. I too like the way Unreal handles this by storing the imported data inside the asset.
     
    Lars-Steenhoff likes this.
  35. transat

    transat

    Joined:
    May 5, 2018
    Posts:
    779
    What I expect from My Assets in the Package Manager:

    1 - The list of packages I own is locally cached, and includes a version number and GUID for each package (as well as unique identifiers for all the assets in that package) and whether or not the package exists on my local drive. Let's call this Cache 1. A copy of this cache should also exist on a Unity server to handle copy protection, but I won't go into that here.

    2 - If I import a package into my project, the asset details (inc. version number) get pulled from Cache 1 and copied into the project's own My Imported Assets cache. Let's call this Cache 2.

    3 - When I open the PM and go to the My Assets tab, I can see a list of all my assets instantaneously. No "load next 100" button. That list is essentially Cache 1 but also uses the info from Cache 2 to tell me which assets are in my project. Yay! Never before seen in Unity!

    4 - I click on a "Check for updates" button and this sends a query to your asset store database (Cache 3), with all the package GUIDs from Cache 1. The query determines if an update exists for that package. Instantaneous.

    5 - The response is received by the PM and added to Cache 1. Instantaneous.

    6 - The PM queries Cache 1 to see if any updates are available for Cache 2. If there are updates, they get shown in the same way they would for Unity's own packages. Instantaneous.

    7 - I click on an "upgrade in project" button (not import or update) for a package with an update and a new query is sent to Cache 3 with the data from Cache 2, including the GUIDs of all the assets from the package I want to update (these correspond to the GUIDs from all my .meta files). This time the query determines which assets actually need to be downloaded from the Store and which assets have not changed since the previous version of that package. So I don't need to download 3GB of textures just because a shader has changed. Only the changed shader gets downloaded onto my local drive, updating the older shader (and Cache 1).

    I'm realising now that I should have framed all of this in terms of git commands as that's likely what would be happening for proper versioning but oh well...

    8 - As I clicked on "upgrade in project" instead of "update on local drive", that newly downloaded/updated shader also gets updated in my project (or optionally imported if there is no .meta file with that GUID).

    9 - The version number of the package in Cache 2 gets updated to the new one.

    Voila. 90% of problems solved. All of the above steps (aside from the downloading/importing bit) should really take no more than 1-2 seconds. Cache 3 is essentially a git repo. So basically it's a simple git fetch, and then optionally a git pull.
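    The per-asset diff in step 7 can be sketched in a few lines. This is a hypothetical illustration only: the cache names, the GUID-to-hash maps, and the PackageDelta type are all mine, not anything in Unity's actual Package Manager.

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical delta-update check: compare the project's imported assets
// (Cache 2) against the store's manifest for the new package version
// (Cache 3) and download only what actually changed or is new.
static class PackageDelta
{
    // Each map is asset GUID -> content hash, standing in for the caches.
    public static List<string> AssetsToDownload(
        Dictionary<string, string> projectAssets,   // Cache 2
        Dictionary<string, string> latestManifest)  // Cache 3
    {
        return latestManifest
            .Where(kv => !projectAssets.TryGetValue(kv.Key, out var hash)
                         || hash != kv.Value)
            .Select(kv => kv.Key)
            .ToList();
    }
}
```

    With something like this, a package where only one shader's hash differs yields a single GUID to fetch, so the 3GB of unchanged textures never gets re-downloaded.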
     
    n3xxt, bugfinders and Lars-Steenhoff like this.
  36. Metron

    Metron

    Joined:
    Aug 24, 2009
    Posts:
    1,137
    Why, oh why, did they reinvent the wheel and not implement something based on NuGet...
     
    bugfinders and JoNax97 like this.
  37. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,979
    I do often wonder this myself. Unity seems to want to create everything in-house as much as possible, which has both upsides and downsides. But NuGet is such a widely adopted standard that not using it is mostly detrimental.
     
    Metron likes this.
  38. transat

    transat

    Joined:
    May 5, 2018
    Posts:
    779
    Ideally though, the Package Manager would exist in the Hub, and would also act as an asset browser, allowing me to view prefabs, textures, etc. And it would let me view the individual assets in a package, so that I can pick and choose which ones I want to import. As it stands now, I need to import a whole package before I'm able to properly look at the textures, decide there's only one I want, and delete the unwanted ones.
     
    Metron and MrPaparoz like this.
  39. MrPaparoz

    MrPaparoz

    Joined:
    Apr 14, 2018
    Posts:
    157
    This is so true. And while you're at it, let us open a new project with selected packages from Unity and the Asset Store.
     
  40. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,087
    They didn't. They built it based on npm instead.
     
  41. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,979
    That's great to know :) npm is also pretty solid
     
  42. jerome-lacoste

    jerome-lacoste

    Joined:
    Jan 7, 2012
    Posts:
    206
     
  43. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    I guess for me when it comes to packages, my biggest issue is that my customers often don't understand the more technical aspects of things. They upgrade one package, not realizing it upgrades another due to dependencies, and not realizing that some code they downloaded from the Asset Store has a different dependency which can't be resolved.

    So first, everything needs to go through packages so that cases like these can even be resolved, and that's a few years out from my perspective, since I'll have to support older versions of Unity for a long while.

    Second, there are too many versioning systems in use. Major unity packages with breaking changes should be aligned to unity releases. I know the move to packages was to get away from this, but my users don't ask "Is this compatible with Unity 2019.3.3f0 with HDRP 7.2.1", they ask if this is compatible with 2019.3. When they run into an issue, I often can't even answer their questions without getting answers to five more questions first.

    Unity has a lot of non-technical users, and the current package interface and system scream "for coders" to them. While I personally like it when interfaces are bare bones, if this is going to be the primary place where people install assets, it needs to be a lot friendlier to those who aren't technical. This includes better layouts of information, info graphics, etc.. Even the current drop down to "My Assets" feels weak, it should be more like a browser with tabs for these categories.

    I've also noticed that some of my users have my assets in their My Assets listing and some don't. And they report that the store often has a newer version than the package manager, and then fight with 2020 trying to get the download from the store instead. One user downloaded it in a previous Unity version and copied it over. Now, this might all be confusing with the new workflow, I'm not sure.

    All packages need a define added to the project. Every asset store developer does this manually right now for their assets, but for any package I really need to be able to write:

    Code (CSharp):
    #if UNITY_SRP_HDRP
        #if UNITY_HDRP_VERSION_718_OR_LATER
        #endif
    #endif
    There is no way to make things optionally dependent right now without this, which makes auto-integrating things very difficult. Standardizing this would also save time, since everyone currently rolls their own for all this stuff.
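    For context, "rolling your own" today usually means an editor script that inspects the installed packages and injects scripting define symbols. A rough sketch under stated assumptions: UnityEditor.PackageManager.Client.List and the PlayerSettings define-symbol APIs are real, but the MYASSET_HDRP define name is invented, and real code would unsubscribe the update callback and handle errors.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.PackageManager;

// Sketch of the per-publisher define dance: detect an installed pipeline
// package and add a scripting define so runtime code can #if on it.
[InitializeOnLoad]
static class PipelineDefines
{
    static PipelineDefines()
    {
        var request = Client.List(true); // offline query of installed packages
        EditorApplication.update += () =>
        {
            if (!request.IsCompleted) return; // real code would unsubscribe here
            foreach (var pkg in request.Result)
            {
                if (pkg.name == "com.unity.render-pipelines.high-definition")
                    AddDefine("MYASSET_HDRP"); // invented define name
            }
        };
    }

    static void AddDefine(string define)
    {
        var group = EditorUserBuildSettings.selectedBuildTargetGroup;
        var defines = PlayerSettings.GetScriptingDefineSymbolsForGroup(group);
        if (!defines.Contains(define))
            PlayerSettings.SetScriptingDefineSymbolsForGroup(group, defines + ";" + define);
    }
}
#endif
```

    Every publisher maintaining a variant of this, each with different define names, is exactly the duplication that a standardized set of package defines would remove.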

    Finally, the options around this as an asset publisher are very unclear. What happens when I check "Include dependencies" when uploading my asset? If I'm on URP 7.1.8 and upload my asset, will it downgrade a customer to 7.1.8 with this checked if they are on 7.2.1? When I create assets I usually upload them on the minimum possible Unity version, and have various #if #endif blocks to handle forward compatibility with newer versions if needed. This means I ideally want to do the same for package versions. Right now the store only allows things to be uploaded tied to Unity versions, not package versions, and that way of doing things is a lot harder anyway, so I really don't think it should be encouraged and extended into packages.
     
  44. Amplify_Paulo

    Amplify_Paulo

    Joined:
    Jul 13, 2017
    Posts:
    297
    Hi, Amplify developer here.

    Both Amplify Shader Editor and Amplify Impostors have been heavily affected by the way the new packaging system works. We now spend considerably more time creating tools and solutions for issues that directly and indirectly came from the packaging system, and the way releases are done, than actually delivering new features to our products.

    I share the sentiment that the idea does sound good in theory but it's poorly executed.

    We traditionally get three types of users: the ones that don't use the package manager at all and never update their packages besides the main one; the ones that always update to the very latest one; and the ones that for one reason or another are bound to a specific package version (usually for stability). This last group is actually very small, though.

    We developed an additional tool in ASE that automatically detects the packages installed and imports the necessary shaders and files that were created to be compatible with that version; this was done mainly to support the group that never updates their packages. However, as you can imagine, this was and currently is a monumental drain on our time and effort to support SRP. Like it has been mentioned here before, many small increments in the API require shader changes that can range from supporting new features to simple name changes to the internal functions. In order to save some time we have a mix between support packages that we automatically import and shader changes that use another "tool" that writes into the shaders the API version currently being used. Why? Because a shader can't tell which version of the SRP is currently in use.

    NOTE: Please add versioning to SRP in shaders just like we already have for unity versions.
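    The workaround described above amounts to code generation: have the editor look up the installed SRP package version and write it into an include file that shaders can test, since shaders themselves cannot query the package manager. A hedged sketch, assuming UnityEditor.PackageManager.PackageInfo.FindForAssetPath (a real editor API); the output path and define names are invented:

```csharp
#if UNITY_EDITOR
using System.IO;
using UnityEditor.PackageManager;

// Sketch: generate an .hlsl include carrying the installed URP version,
// because a shader cannot ask which SRP version the project uses.
static class SrpVersionWriter
{
    public static void Write(string outputPath) // e.g. "Assets/MyAsset/SrpVersion.hlsl" (invented)
    {
        // Synchronous lookup of the package that owns this path.
        var info = PackageInfo.FindForAssetPath(
            "Packages/com.unity.render-pipelines.universal/package.json");
        if (info == null) return; // URP not installed

        var parts = info.version.Split('.'); // e.g. "7.2.1" -> ["7", "2", "1"]
        File.WriteAllText(outputPath,
            "#define MYASSET_SRP_VERSION_MAJOR " + parts[0] + "\n" +
            "#define MYASSET_SRP_VERSION_MINOR " + parts[1] + "\n");
    }
}
#endif
```

    Shaders then include the generated file and branch on the defines, which is exactly the kind of plumbing that built-in SRP versioning in shaders would make unnecessary.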

    As an example, the update from 7.1.8 to 7.2.X broke all of our shaders in many different ways. Most of them had to be rewritten and new "support packages" had to be created. This took considerable time, and just like I mentioned before, the group that always uses the latest version kept asking for support. And all of this was for what, exactly? What were the big new features introduced that required such extensive changes?

    I understand that Unity can't be constrained from changing things for the sake of asset developers; however, the message being sent to consumers is not that a new beta package has been released which users shouldn't use in their stable branches. In fact it's the opposite: the "preview" label was dropped in favor of the "verified" name, which further deepens the notion consumers have that new versions are stable, so they start using them right away. New versions should always be a preview/beta unless they signify a repackage of a stable version.

    And new versions don't come free of bugs either. I've found some that, even after being reported and confirmed, are still present to this day, and some have even been denied a fix with the justification that "there's not enough people complaining about it", as if the average user would know how to even detect them. They either see something working or not, and if it's not, they automatically think it's something in the editor or something they are doing wrong, and not an issue of the underlying system.

    The 7.2.X update also brought the continuation of something we still can't understand: why are these packages being closed off from extensibility? In this new version even the diffusion profile asset was closed off. Think about it: you have a type of asset in your project that you can't manage by script; you can't reference it or check its properties. Can you name any other type of asset that you can create in your project by simply right-clicking that you can't use in a script? It's bonkers! We have tried all kinds of solutions: creating our own assembly definitions, hacking into SRP to make it visible to the outside, etc. All of them have caveats. This could all be solved if they weren't closed off to begin with (which they weren't in the beginning, I might add).

    We are now considering creating our own packages to solve some of these dependency issues, but how to distribute them is unclear to us. We should be able to distribute packages in the package manager, even if they had their own separate place there, so that we could push free packages that would solve some of these dependencies and support SRP in a more sensible way that doesn't require hacks. Currently it's a mix of reflection, detecting and patching tools, a bunch of support packages and shader hacks. I doubt this is something most asset developers are willing to do. You may very well see an exodus from the asset store if some of these issues are not addressed.

    Now to the questions:
    What is your understanding of the Unity tech releases (eg 19.1, 19.2…) vs Unity LTS releases?
    Each increment adds new features, while the LTS simply stabilizes the last increment. I.e.: 18.4 LTS is the same as 18.3 with additional bug fixes. I see no point in anyone still using 18.3 when 18.4 is out, besides some SDK compatibility issues.

    When you see a new Unity release, what is your expectation of quality?
    New features with lots of small missing parts and bugs with no clear idea whether or not the new features will keep receiving updates.

    Do you expect different levels of quality or completeness based on the release distinction?
    Hard to say; I don't see much of a difference between each release that grants it a higher level of quality. I would expect, though, that some new features would continue to receive support in significant ways in later versions.

    What is the primary motivation for you to use the LTS vs the non-LTS version of Unity?
    If the project is close to releasing, either stick with the current version until the end or change to an LTS version if available and stick with it.

    When we say something is in Preview, what does that mean to you? Why do you feel that way?
    Does the expectation of quality change when you see something in Preview? What drives you towards using something in Preview? What keeps you away from using something in Preview?

    I already explained what I believe the typical user thinks, based on our experience. Personally, I think previews mean that many things may be broken, which is fine; however, I feel that non-previews have mostly the same level of quality as previews, so for the past 2 years I haven't seen any distinction between them.

    In summary:
    • Make a clear distinction between what's stable and what's beta/alpha/preview/whatever, so that users know what to expect, instead of expecting nothing different and then expecting everything to work.
    • Expose SRP versions to shaders somehow. Otherwise shader creation will continue to be a nightmare, and Shader Graph does not solve everyone's needs.
    • Allow publishers to submit packages to the package manager (even if forced to be free) so that we can declare dependencies on Unity packages. Right now there's no connection between Asset Store packages and Package Manager packages.
    • Focus on stability and improvement of the existing tools and features. Currently a lot of them have great potential that sees little or very situational use.
    • Allow for extensibility of packages; isn't that one of the purposes of using this modular system anyway? Let us take advantage of that.
     
  45. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    995
    Is it possible to create one unified SRP template which can be used for both URP and HDRP?
    This template could have materials, shaders, VFX graphs, and particle system shaders which can be read in both URP and HDRP.
    That way developers could quickly interchange assets, and production wouldn't be disrupted.
     
    Lars-Steenhoff likes this.
  46. RecursiveFrog

    RecursiveFrog

    Joined:
    Mar 7, 2011
    Posts:
    350
    @smcclelland This actually seems to be a point of failure on Unity's part, in that the SRP team didn't mark a breaking change with a major version update. If there are breaking changes to the API and to the extensibility of the existing libraries, that isn't a minor version release. It should explicitly have been an 8.X.Y release if anything was done to break backwards compatibility.

    This seems like a disregard for the entire idea behind semver if I'm not mistaken.

    Could we at least have assurances that if we solidify our efforts on a major version of 1 or greater, the API will not suddenly remove functionality or extensibility in a minor or patch release?
     
  47. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    I agree with everything in your post. However, just so you know, because it's not actually documented anywhere and only exists in one forum thread as far as I know:

    Code (CSharp):
    #define SHADER_LIBRARY_VERSION_MAJOR 7
    #define SHADER_LIBRARY_VERSION_MINOR 2
    VERSION_GREATER_EQUAL(major, minor)
    Still, it's a nightmare to develop shaders with even more compile-time branches in them than normal. It's already hard enough to get all that stuff right, and working across versions with some of these changes would be extremely difficult.
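    For anyone hunting for it later, usage inside a shader include ends up looking roughly like this. Only the VERSION_GREATER_EQUAL macro comes from the SRP ShaderLibrary; the guarded function names here are invented stand-ins for something renamed between releases.

```hlsl
// Compile-time branch on the SRP shader library version.
// SampleThingNew/SampleThingOld are invented names, standing in for a
// function whose name changed between 7.1 and 7.2.
#if VERSION_GREATER_EQUAL(7, 2)
    half3 c = SampleThingNew(uv);   // 7.2+ API
#else
    half3 c = SampleThingOld(uv);   // pre-7.2 API
#endif
```

    Multiply this by every renamed function and every supported pipeline version, and you get the branch explosion described above.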

    Oh, it's way worse than just that, because Unity 2019.3f3 will download URP 7.1.8 when you create a new URP project, and Unity 2019.3.3f0 will download URP 7.2.1. So these releases don't even line up with Unity versions, and in fact have no relationship to them at all. Almost every URP release requires a change to something, and there's no clear pace or timing to their releases, so you tend to just hear from users that something isn't working and scramble to handle it. Combine that with a package system that upgrades things based on dependencies, and it's not even possible to have a fully working version of everything much of the time.
     
    AnvilNight, transat and Amplify_Paulo like this.
  48. Amplify_Paulo

    Amplify_Paulo

    Joined:
    Jul 13, 2017
    Posts:
    297
    Wow... This has been missing from the docs all this time? How do you people even find this?
    Thanks for sharing; this will definitely help. Even with longer compile times this is better than nothing.
     
    bugfinders likes this.
  49. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Human suffering. Whenever you ask "How did they build that?", like at the pyramids or the Great Wall of China, the answer is always "human suffering".

    Oh, and it won't increase compile times, since those are not shader_features; only the valid cases are compiled.
     
    Last edited: Mar 13, 2020
    transat, hippocoder and a436t4ataf like this.
  50. Amplify_Paulo

    Amplify_Paulo

    Joined:
    Jul 13, 2017
    Posts:
    297
    yeah I went to check and found it

    But it seems they only implemented this in 7.X.
    I went to check 6.X, and not only does it not exist there, the only version define available is in LWRP, with the format:
    Code (CSharp):
    #define LWRP_6_9_OR_NEWER
    and there's nothing for HDRP

    No wonder we never saw this before
     
Thread Status:
Not open for further replies.