
Feedback The State of Unity & Packages in 2020

Discussion in 'General Discussion' started by smcclelland, Mar 5, 2020.

Thread Status:
Not open for further replies.
  1. MrPaparoz

    MrPaparoz

    Joined:
    Apr 14, 2018
    Posts:
    82
    I'll go bananas. Please consider:
    1- Rewriting the whole engine (or whatever is required) with Packages/RPs/ShaderGraph/VFXGraph/VisualScripting in mind, to get a modular and stable engine.
    2- Making SRP the built-in renderer and letting people add features from any RP they want via the Package Manager.
     
  2. JoNax97

    JoNax97

    Joined:
    Feb 4, 2016
    Posts:
    372
    Well, that's kinda the point. They're currently doing just that (with varying degrees of success), the issue at hand is how to go through the process with as little pain as possible. And don't wanna be too harsh, but your suggestion sounds like "solve the problem by making it go away".
     
  3. MrPaparoz

    MrPaparoz

    Joined:
    Apr 14, 2018
    Posts:
    82
    But isn't the problem that while doing that so-called rewrite, they are making changes to the core engine, and that's why there have been so many problems lately? It seems to me they are inserting functionality here and there to make this work, planning to refactor it later. It's almost as if they're pushing a game to release with post-release polishing in mind.
     
    Lars-Steenhoff likes this.
  4. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    995
    The built-in renderer should be removed as soon as possible. It probably adds a lot of weight that keeps the rest of the engine from moving forward faster.
     
  5. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    7,163
    This is a terrible, ridiculously bad idea. The vast majority of Unity projects are all running on Built-in and require it for long-term release support. Stop making suggestions without thinking first.
     
  6. playemgames

    playemgames

    Joined:
    Apr 30, 2009
    Posts:
    437
    Wow, this is probably the worst idea I've seen posted here so far. I haven't really touched the new render pipeline because it is not completely production-ready. The built-in renderer just works right now, whereas the new system is a jumbled, incoherent mess.

    Removing parts that are solid and working in favor of unfinished parts is what got us into this mess in the first place.
     
  7. playemgames

    playemgames

    Joined:
    Apr 30, 2009
    Posts:
    437
    My 2 cents are the same I shared in the alpha channel:
    • Focus on stability: push bug fixes to LTS and stable releases first, not second. A lot of the time we're left using Tech releases just to get fixes we need, because fixes don't land in stable releases until, more often than not, much, much later.
    • Label the Tech releases as Beta, because that is what they really are these days.
    • Thoroughly test LTS releases so fixes don't break a multitude of other things in projects. I often find projects breaking between LTS releases because they are too frequent and not tested thoroughly.
    • Make console support release-independent via a package or plugin, so we aren't so tied to a particular release that we can't use the latest SDKs without some sort of waiver; new releases only support the latest SDKs, and older working releases get the shaft.
    • VR performance has regressed past 2019.1.2, and I have projects that are stuck there because they need as much performance as possible. Things should not be regressing, especially upwards of a year after release.
    • Fix the bug reporter: it's a toss-up whether it actually works in the editor from release to release, and making repro projects to flag bugs is getting more and more tedious. Add a place to put a link to Google Drive, Dropbox, or some other cloud storage; the reporter often fails to attach projects in its current state, and this way we could attach our own.
    That's about it for now. It's getting harder and harder to rely on Unity these days because of stability issues, and I have been using Unity for over a decade now. The past year has been especially harsh, with projects breaking on updates and the package-system mess. The reason we chose this engine was its ease of use, flexibility, extensibility, and stability, but these days some of those are falling short, which has me starting to look elsewhere, because productions need a clean, easy workflow they can rely on.
     
    n3xxt, Edy, AnvilNight and 3 others like this.
  8. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    282
    Just go full open source and allow people to backport bug fixes, or even contribute fixes themselves.
     
    n3xxt, deus0, nxrighthere and 3 others like this.
  9. JoNax97

    JoNax97

    Joined:
    Feb 4, 2016
    Posts:
    372
    They're kinda sorta going in that direction with packages. If they keep stripping stuff from the core engine and committed to an open development pipeline that allows for PRs, I think that would alleviate a LOT of issues.
     
  10. KokkuHub

    KokkuHub

    Joined:
    Feb 15, 2018
    Posts:
    445
    They are only doing that for the C# parts of the engine. The parts written in C++ are locked away behind a substantial per-project source-license price tag, which ranges from tens of thousands to a million or so, based on what I've heard.

    There's a stated intent to move more of the engine to C# via packages and the likes of SRP and SBP, but the pieces which interact with platform SDKs, graphics APIs, and the entire UnityEngine.Object core will remain C++. Also, packages are already causing editor performance issues during enter-play, compilation, project loading, and IL2CPP building.

    I think Unity should look into putting the C++ parts of the engine into packages too: distribute precompiled binaries, plus source for the adventurous. That way they could keep the parts they need locked behind a license while giving users at least a chance of fixing/customizing things like rendering, asset loading, physics, etc.

    For example, I'm right now fighting to optimize a game for Switch, and I'm modifying the PPv2 source to high heavens to get there. If that half-baked package was locked behind a source license, I would be SOL.
     
    Last edited: Mar 14, 2020
    deus0 likes this.
  11. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    811
    The three of you (@jbooth, @AdamGoodrich, @Amplify_Paulo) are some of the most prominent Asset Store publishers/developers [that I'm aware of, at least], particularly in the graphics arena.

    It sounds like one of the toughest challenges you all face is a lack of actual API access in the SRPs. Have you considered forking the SRP source code and adding such API access yourselves? This would also mean sharing the forked versions of the SRPs with users and pushing them to use those packages instead of the gimped Unity-official ones.

    I am fully aware that this is not a problem for you to solve, but one for Unity to address. That said, I am also aware that these are not new complaints, and that Unity's response to date, judging by its actions, appears to be one of inaction (see the @Amplify_Paulo excerpt earlier in this comment). I'm not sure how difficult it would be to maintain such a repository, but it may be something to consider... especially if the time spent opening up APIs and versioning turned out to be less than the time you all spend working around the artificial walls that Unity keeps throwing at you...

    ¯\_(ツ)_/¯
     
    neoshaman likes this.
  12. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    4,424
    Then you'd just have way more code to maintain. No thanks. Unity has pretty much killed custom SRPs anyway by tying other systems, like the VFX Graph, directly to their SRPs, such that a custom one would now require forking and modifying those systems too. Going down that route, it's not long until you basically own the whole stack.

    It would be far easier if Unity would just talk to us about this stuff and figure out a path forward.
     
    n3xxt, Lorash, AcidArrow and 5 others like this.
  13. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,726
    Forking SRP this early is a bad move. SRP is still undergoing big change and I would give it another year at minimum before embarking on custom SRPs for asset authors. Seriously.

    I think custom SRP is only for end-developers or suicidal asset developers. There's no way in my mind this is a thing for a long time yet.
     
  14. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    5,206
    I can hardly imagine any (serious) developer choosing a "community SRP" fork over the official one. It would probably end in compatibility hell, too: you'd still have to support the official SRPs for all the people who want to use them, plus the SRP fork o_O
     
  15. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    4,424
  16. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,726
    Ideally we'd have a single SRP Unity supports (HDRP with URP fast mode, combined) and I don't think that's impossible in the future but Unity's said they don't want to do it. This is frustrating for pretty much everyone.

    As a dev I love URP's performance on simple devices, but I can't have raytracing or support big hardware at the same time. That's a horrendous amount of work for me that Unity could have designed out at the gate. With UE4 you have one set of work, and it scales.

    IMHO SRP should only exist for the end-developer who wants to tweak those things for their custom stuff. It should never ever ever be the responsibility of an asset author giving up 30% to do 60% of the work.
     
    neoshaman, Metron, Fer00m and 2 others like this.
  17. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    2,780
    Yes, the SRPs make sense for large studios building their own render pipeline.
    For the rest of us, they seem to make things less unified.

    The fact that you have to choose one already makes it limited. Like @hippocoder said, we need just one scalable render pipeline.

    The goal was:



    I'm not sure we are there yet
     
    konsic likes this.
  18. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    811
    To be clear, when I say "fork the package", I don't mean "create a new package based on the old one with a different name", I mean that it would literally be the same package with the same name and the same functionality albeit with some "public" keywords and a handful of versioning APIs thrown in for accessibility. In a "package" sense, the forked version would be the exact same thing as the Unity-official one, albeit grabbed from a different repository (npm, upon which PacMan is based, allows you to specify GitHub/git branches/versions/commits as the "version" in the package manifest).
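    For what it's worth, Unity's Package Manager already accepts git URLs as package "versions" in `Packages/manifest.json`, much like npm. A hypothetical fork could be referenced like this (the repository URL and tag below are invented purely for illustration):

```json
{
  "dependencies": {
    "com.unity.render-pipelines.universal": "https://github.com/example-fork/Graphics.git#urp-7.2.1-patched"
  }
}
```

    So in principle, swapping in such a fork would be a one-line manifest change for users, not a new package identity.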

    That said, I totally understand everyone's objections here and am not trying to argue that what I've tossed out is actually a good idea. It is constructive to hear why this would be a bad idea as it highlights other issues with the current setup.
     
    neoshaman likes this.
  19. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    4,424
    Instead they have done the exact opposite. They have removed flexibility by making it so you have to pick between features and can't reasonably customize the pipeline without breaking other systems, while pushing the complexity onto us.

     
  20. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    2,780
    [Attached image: olc7i.jpg]

    But at the same time there are limitations on how they work together.
     
  21. JoNax97

    JoNax97

    Joined:
    Feb 4, 2016
    Posts:
    372
    Hey, just out of curiosity: how would you all have felt if, around 2017, Unity had announced a new engine, saying that it would be bare-bones for at least three years but would be built from scratch, contain only new tech (like DOTS, SRPs, the package manager, etc.), and carry no legacy baggage from the current engine?

    Would that have been worth it if it had meant putting the current engine (5.6 at the time) pretty much in maintenance mode? Take into account that something like that probably wouldn't have been usable until about last year, and not production-ready until 2022 or so.
     
  22. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    5,206
    Wasn't the point of SRP to avoid exactly this? I thought the idea was to cater different SRPs to different purposes.

    I believe one argument for SRPs was: built-in supports everything, which makes it difficult to maintain/extend, and performance isn't as good as it could be. So they came up with LWRP and HDRP; that makes sense to me.

    However, what we're seeing now is that they're cramming all that stuff into URP again. I imagine it becomes the next monstrosity, just like built-in, because it starts doing everything again.

    I think rather than having a single RP that supports everything, they should work on SRP compatibility. The actual problem isn't having two RPs; it's that there is no easy way to switch between them.

    Providing mechanisms to swap SRPs without breaking everything would not only make Asset Store developers' lives less miserable, it would actually unlock the full potential/purpose of SRP: creating even more optimized/tailored RPs!
     
  23. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,726
    Unity is making Unity 2.0 right now; in fact, some staff have called it exactly that. DOTS, SRP, packages: these are all parts of the new Unity. The problem is that Unity could not afford to do as you suggest, for exactly the same reason Microsoft chose not to make a clean break with Windows.

    By keeping compatibility, no matter how broken it is, you keep the customer and so it was never ever going to be on the table that they'd make a separate Unity.

    I still want HDRP + URP together though, because my project should scale. I should not have to redo all my scenes and redo all my art. Just make HDRP have a high perf option under the hood.

    Nope. SRP was about flexibility, right? And there is nothing at all preventing HDRP from scaling down to URP-level performance.

    It's a big job now but the longer Unity waits the bigger and more unwieldy that job becomes. In the end you wait 5 years and HDRP runs on everything but hey - it's still not the default pipeline...

    SRP should exist, certainly. But nobody is asking Unity for two pipelines to fragment the users this soon. And merging them will not make it slower since SRP designed out that problem with multiple camera callbacks and all that nasty stuff in builtin. That was one of the main problems.

    what happens now is that URP will get more and more post effects, deferred etc but:
    • still be too slow for devices like switch etc to use this
    • still not look as good as HDRP on devices that can use this
    Frankly, it's not quite working... URP started losing its reason for existing the moment they started going heavier on visuals: you eliminate the low end it specialized in, yet still don't reach HDRP. So it forces people to use URP (I'm on URP) and ignore DXR/the high end. There is no choice for most indies now.

    Don't get me wrong, I'm happy with URP, it's a great pipeline. But it is a pisser that you can't scale it to HDRP or vice versa. Asking customers to port their whole project from URP to HDRP is a nightmare, since for a game on sale, you would be supporting two separate Unity projects. Not something I will ever do for obvious reasons!

    Gonna bow out now, but it's a shame I cannot use HDRP even though I want to, because as an indie I cannot afford to port to it and maintain a cross platform game constantly.

    URP and HDRP are already too big for the limited devices that LWRP began with, and have already eroded the reasons they should be separate, much to end users' chagrin.
     
  24. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    811
    I initially saw these as stable versions of Unity - back when the 2017 TECH stream started. Over time, I've come to see them as a continuation of beta. This is thanks to the fact that many releases land with serious known issues but are "blessed" for what seems to be reasons of date: that "we need to get this out so we can start the beta of the next TECH release".

    The problem with this approach is that unlike the 5.x series, users aren't guaranteed to have the issues resolved within the span of their current point release's lifetime. Here's why that's an issue:
    • Unity 2020.1 has some theoretical feature I want to use.
    • Unity 2020.1 launches with many known bugs that will hopefully get fixed soon. One of those bugs affects me.
    • Unity 2020.2 launches and my bug isn't fixed.
    • Unity 2020.2 has its own list of known bugs that cause crashes in other areas that I'm also affected by.
    • Unity 2020.2.3 has a fix for my bug. I cannot update because now I'm affected by other, newer 2020.2 bugs.
    Do you see how this is an issue? This very quickly leads you to say "Well, it's LTS or bust." You could extend that to the .3 TECH release as well, of course, because that is effectively known to be the "LTS Beta release"...

    I mean, they may as well be. At the most recent GGJ I used 2019.2 and things were okay, but we also were being extremely conservative with the features we used so as not to rock the boat at all. That TECH release had a relatively long life cycle, though, and was the main recommended version when you announced that you were switching to a two-TECH version release cycle instead of three.

    In order for me to feel comfortable using a TECH stream release in production, you'd have to provide bugfix support for each major TECH release until LTS was released. If a new major feature landed in 2020.1, I wouldn't want to have to upgrade to 2020.2 in order to get a bugfix for an issue I reported on 2020.1.

    Nothing standout, no. A major contributing factor here is that you do a terrible job describing what is a feature built into the TECH release or that is simply shipped along with it as a package. That whole situation opens any editor release up to Package Manager confusion and bugs in packages making the release feel unstable.

    For us, the Animator system has been extremely crashy in a small demo project we've been working on. This doesn't feel like anything new, however, just what we've run into most recently.

    I'm also not the best person to ask this question to as I'm extremely fortunate to work in the one area that Unity doesn't actually care about: Audio. The AudioSource API has remained ~100% unchanged since Unity 5.x, receiving only a handful of esoteric, under-the-hood bugfixes over the years. There was a recent conversation in the old Alpha Testing list called "Unity 2019.3F6 is way too buggy, major course correction needed" that highlights more specifics. Track down someone internal with access to that mailing list if you'd like to see the contents.

    I joined the old Beta Testing list back in 2014 and then the Alpha Testing list in 2016. I only recall needing to communicate whether things were ready for launch or not a handful of times (see the Audio API stability note above). One particular circumstance was with respect to how assets were being imported by the Package Importer - as I recall, a change was made that would break the update process for many Asset Store packages and it took many, many words to convince the release managers to delay until a fix was added. Overall I do believe that the alpha/beta cycles used to be better. With the set release cadence, however, the number of bugs that appear to get triaged as "will fix post launch" feels as though it has grown.

    There is great value in this, yes. Getting early access to features is particularly useful for Asset Store publishers and others who would like to use such a package as a building-block, provided there is a clean and clear way for those early users to provide feedback.

    First, you really need to DROP THE "PREVIEW" MONIKER. Just follow standard development practices and call it what it is: "alpha". Integrating this into standard semver usage would be very, very welcome. The package manager itself should have the ability to filter out packages that happen to be in alpha, beta, or RC status. In addition, there should be a clear, official way to report bugs for a package. Do we use the Unity Bug Reporter? Do we report bugs we encounter on the GitHub repo (if one is known)? Do we post on the Unity Forums somewhere? If major features are going to be broken out into packages, then either the Unity Issues site should be filterable by package or there should be some clear direction on where to go for a clear set of release notes.

    I would expect there to be a loud warning or error when I started the version of Unity. It would show me a dialog with the identified issues and any potential ways to address it. Even better would be if the Unity Hub itself were to highlight compatibility issues between a project and the selected target Editor version.

    I mean... that should show a warning dialog that explains the situation and contains links to release notes for the relevant versions of the target packages. This would allow users to check any changes on their own to ensure that there won't be some sort of crazy cascade of breakage as a result of the action.

    Yes. I understood that the "Experimental" tag was for features that were truly "in flux".

    You realize that this is extremely confusing, right? There is no clear progression from Experimental→Preview→Released. Why are you so fixated on these labels? Just call it what it is: Prototype, Alpha, Beta, and Release.

    That said, to answer your question:
    • Experimental: The APIs and features aren't locked in stone yet. They're very likely to change. Basically, this is somewhere around Prototype/Alpha.
    • Preview: The APIs and features are mostly locked. "Bugs are expected and we appreciate your reports - please give it a shot and let us know if you run into problems!" This is somewhere between Alpha/Beta.
    • Release: This is released and we fully support it. Expect bug fixes in this version for quite some time! This is the same as the standard "Release".
    But seriously. I don't see why Unity seems so fixated on ignoring software industry best-practices here. What is the problem with Prototype/Alpha/Beta/Release?
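    To make the mapping concrete, here's a minimal sketch (my own illustration, not anything Unity ships) of classifying a semver version string into the Prototype/Alpha/Beta/Release scheme described above, using standard pre-release tags:

```python
def stage_of(version: str) -> str:
    """Map a semver string's pre-release tag to a lifecycle stage."""
    core, _, _build = version.partition("+")   # drop build metadata
    _, _, prerelease = core.partition("-")     # isolate the pre-release tag
    tag = prerelease.split(".")[0].lower()
    if tag in ("prototype", "exp", "experimental"):
        return "Prototype"
    if tag == "alpha":
        return "Alpha"
    if tag == "beta":
        return "Beta"
    if tag == "rc":
        return "Release Candidate"
    if tag == "":
        return "Release"                       # no pre-release tag at all
    return "Unknown"

print(stage_of("7.2.0-alpha.3"))  # Alpha
print(stage_of("7.2.0-beta.1"))   # Beta
print(stage_of("7.2.0"))          # Release
```

    The point being: semver's pre-release identifiers already encode this progression, so no custom "Preview" vocabulary is needed.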

    Before I started using packages, I had the understanding that this was an officially supported feature that would soon hit Unity: I should expect to see a pretty consistent release of bugfixes. Now, after I started using packages, I have the understanding that you're lucky the package is even maintained anymore, let alone if any of your bug reports would actually be seen by a human...

    Umm, yes? Isn't this obvious? It's not like package maintainers get to somehow skip the LTS lifecycle simply because they're not a part of core, right? I would expect that a released semver-major version of a package have the same lifecycle as the LTS version of when it was released. Bugfixes should continue flowing during that entire time and not require an update that includes features.

    • Support: As mentioned above, you will provide bug fixes for that release in the year-major version(s) of Unity that it was initially released alongside in the same manner as though it were built-in. This means actual LTS support.
    • Active Development: You can continue developing out new features and changes. This triggers a major version bump and the package goes back through the standard [Prototype]/Alpha/Beta/Release cycle before official release. Once released, it falls into the support category above.
    This... is basic software development practice.

    Yes.

    How would it work otherwise? In what world would I possibly consider building a long-term project on Unity and use the URP when I wasn't certain that it would be feature locked with guaranteed bug fixes during the LTS period? New features mean entirely new classes of bugs that may or may not have unintended consequences for me.

    No. I would not expect that to be added to the 2018.4 LTS branch. I would expect that a future version of the URP that did receive that change might happen to be backwards compatible with the 2018.4 LTS version and that I could switch to that if I wanted.

    This assumes that such a feature (e.g. SpotLight Shadows) didn't require a change to the 2018.4 Unity core engine, right? Core Unity 2018.4 aka 2018 LTS should be feature locked.

    Let me clarify that previous paragraph a bit: I am making the assumption that adding a feature like "SpotLight Shadows" would be considered a "breaking change" (as was the change from URP 7.1.8 to 7.2.1 already highlighted in this thread). If there were a way to add such a feature in a way that was 100% optional, had zero performance implications [unless enabled/used], and didn't require any changes from users, then perhaps it would be fine (this would warrant a MINOR version bump). But in that case, I would expect it to be simple enough that I as a project maintainer might simply back-port the changes on my own as the changes should be visible on GitHub, right?

    I don't understand what you mean by "installing packages outside the 'Unity Kernel'". I will continue with the understanding that you mean "outside the corresponding 'Unity Kernel' version".

    Yes. I would 100% expect guidance and structure here. The "Unity Kernel" version should be effectively treated as "just another package". The difference is that it is considered the "Engine" in npm-lingo (reminder: PacMan started life as an npm fork). The npm manifest has this real nifty option called engines. The documentation reads:


    Replace the term "node" above with "Unity" and the term "npm" with "PacMan" and you'll see how this would really help you out here.
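    For reference, the `engines` option looks like this in an npm `package.json` (the version ranges are illustrative):

```json
{
  "engines": {
    "node": ">=10.0.0",
    "npm": ">=6.4.0"
  }
}
```

    As far as I know, a Unity package manifest already has a related, if more limited, `"unity"` minimum-version field, so some of the plumbing for this kind of constraint is already there.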

    I would expect that the "engine compatibility" limitations be mentioned in the package description and I would expect actual warnings/errors/klaxons to go off if I attempted to open a project with a package that was known-incompatible with my current project. Bonus points if the Unity Editor (or "Kernel") ran its own compatibility check online with up-to-date information (in the event that I'm on an old project wherein the "max compatible version" for some package was "uncapped" at the time it was installed - there are other ways to handle this as well, of course, but we're starting to get down a rabbit hole here).

    To be clear, when I mean "operating on my own", I mean that I would be in "unsupported territory". I would have the expectation that "Unity only officially supports version X of some package in my "Unity Kernel" version, but that I can attempt to run version Y of that package. If I do so and things appear to work, it is beyond the scope of support. In that case, I expect that I may ask questions, but that the response may be "sorry, that configuration is unsupported."

    That comment had nothing to do with UI/UX for installation, etc. Simply that I would be outside of the "Green" area in a "supported-by-Unity" version matrix.

    Or, you know, MINOR releases as well - as was the case with the URP 7.1.8 to 7.2.1 change that has been called out by several people in this thread.

    To be clear, the current semver spec defines a MINOR release as:

    When you add functionality in a backwards compatible manner.

    Breaking changes must be relegated to MAJOR releases. For the avoidance of doubt, a "breaking change" is one that is not "backwards compatible". The SRP team seems to treat breaking changes as "okay" in MINOR releases which is decidedly not semver.
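    To spell the rule out, here's a tiny sketch (not official tooling, and it ignores pre-release tags) that reports which semver component changed between two versions; under the spec, a breaking change shipped as anything but "major" is a violation:

```python
def bump_kind(old: str, new: str) -> str:
    """Report which semver component changed between two versions."""
    o = [int(p) for p in old.split(".")]
    n = [int(p) for p in new.split(".")]
    if n[0] != o[0]:
        return "major"  # breaking changes belong here, and only here
    if n[1] != o[1]:
        return "minor"  # backwards-compatible feature additions
    if n[2] != o[2]:
        return "patch"  # backwards-compatible bug fixes
    return "none"

# The URP change discussed in this thread shipped as a MINOR bump,
# despite reportedly containing breaking changes:
print(bump_kind("7.1.8", "7.2.1"))  # minor
```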

    I hope that the followups above are similarly helpful.
     
    Last edited: Mar 14, 2020
  25. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    2,780
    Just now I was importing an asset from the store, and a popup appeared stating that the asset has package dependencies and asking whether I want to install/upgrade them or not.

    There was no info about the package dependencies, and no info about what version they would be upgraded (or perhaps even downgraded?) to.

    It would be helpful to show the dependencies so I can make an informed decision about whether to install/upgrade them.

    [Attached screenshot: Screenshot 2020-03-15 at 13.37.11.png]
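    Here's a sketch of the kind of report such a popup could show, assuming the importer knows the project's installed versions and the versions the asset requests (the package names and versions below are invented for illustration):

```python
def parse(v: str) -> tuple:
    """Turn 'x.y.z' into a comparable tuple of ints."""
    return tuple(int(p) for p in v.split("."))

def dependency_report(installed: dict, requested: dict) -> list:
    """Label each requested dependency as an install, upgrade, downgrade, or keep."""
    report = []
    for name, want in sorted(requested.items()):
        have = installed.get(name)
        if have is None:
            action = "install"
        elif parse(want) > parse(have):
            action = "upgrade"
        elif parse(want) < parse(have):
            action = "downgrade"
        else:
            action = "keep"
        report.append(f"{name}: {have or '(not installed)'} -> {want} [{action}]")
    return report

installed = {"com.unity.render-pipelines.universal": "7.1.8"}
requested = {"com.unity.render-pipelines.universal": "7.2.1",
             "com.unity.postprocessing": "2.3.0"}
for line in dependency_report(installed, requested):
    print(line)
```

    Even just this much information, old version, new version, and the direction of the change, would make the install/upgrade decision an informed one.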
     
    Last edited: Mar 15, 2020
  26. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    7,133
    Yes, please. Is there a way to vote for this? I want it too.

    With the Switch being part of "the consoles" and HDRP not supporting the Switch, I have to decide now between *worse visuals for all platforms* and *supporting the Switch*. And I don't want to make that decision. I don't have to make it with Built-in: there I know what to do to make things scale. The current situation sometimes makes me dream of what would have happened if, instead of abandoning built-in, they had kept iterating on it and cleaning it up.
     
    Last edited: Mar 14, 2020
  27. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    4,424
    I'm sure this was a big internal debate. Really, the only part of the engine that is radically new is the entities framework. The job system and Burst compiler work fine without changing the engine's workflow, and I think in some ways Entities is held back by the current engine, in that a lot of time goes into hybrid solutions for problems that a breaking rewrite would have forced them to address earlier. SRP is pretty orthogonal to DOTS and packages, and the problems it has would exist in either case.

    Yeah, once I heard the announcement of URP, it seemed to me the vision for SRP had completely changed, and that there needed to be a real evaluation of what that meant, because it means "URP will support everything that Standard did, in some manner".

    In truth, the hype about SRPs allowing greater flexibility with less code is mostly incorrect. In practice it's led to massive amounts of code duplication, with both systems solving the same problems in slightly different ways while both feel incomplete and remain incompatible. While the rendering techniques they use differ, there is a lot of overlap and very little that requires them to be separate. An engine can have forward, deferred, tiled-plus, raytracing, etc. co-exist within it. Hell, these all exist together in HDRP, because you have to have deferred, forward, and raytracing passes for your shader. You don't need to write shaders in some fundamentally different way for deferred, tiled, or forward renderers; only the inputs and outputs change, while the actual "what are the inputs to the lighting equation" part doesn't change at all. Further, one pipeline having raytracing and the other not, one having parallax occlusion mapping or tessellation and the other not, etc., is mostly just a matter of what code you choose to put into that pipeline. They run on the same hardware, they can do the same things, they run the exact same code; there's nothing special in most of these cases. I don't rewrite tessellation for URP, Standard, or HDRP; I just have to hack it into different code structures in each, because they are arbitrarily different.
     
  28. JamesArndt

    JamesArndt

    Joined:
    Dec 1, 2009
    Posts:
    2,834
    What is your understanding of the Unity tech releases (eg 19.1, 19.2…) vs Unity LTS releases?
    The LTS releases are what I use, as I normally expect better performance from them in compiled builds. I assume this because they would ideally have been gone over for bugs and performance regressions many times over. The thought is that the Unity team has squeezed every performance fix they possibly can out of both the editor and builds. I don't really know what Tech releases are; I've stuck with Unity 2018.4 this year.

    When you see a new Unity release, what is your expectation of quality?
    I started using Unity in 2007 or 2008, during the 3.x series, and I've used it every single year since then. The last time the engine was a great experience for me was the 4.6 series. The editor was responsive and quick. Lightmap baking was very fast. The workflow was simple and straightforward. Once Unity 5.x came along, I noticed each version of the editor became "worse" - by worse I mean the editor felt slower and more sluggish. Baking lighting ended up taking an extremely long time. It became tedious and tiresome to use the editor for things that were once simple and straightforward. The overall user experience just became poor. Nowadays, out of pure fear, I don't install any of the newer versions of Unity. Almost anything imported into newer versions resulted in red compiler errors, pink shaders, obscure errors about the editor UI, or compiler errors about missing packages. I remember simply trying to import the standard assets into 2018.4 or 2019 and it immediately had red compiler errors. In short, my expectation is very low. I expect a ton of incompatibilities and strange compiler errors regarding packages, and I expect both compiled builds and the editor to be slower... which I see user research on the forums reflecting (I always check ahead of time). @Peter77 is doing great research into performance across a wide range of editor versions.

    Do you expect different levels of quality or completeness based on the release distinction?
    I lean heavily on the LTS distinction to determine which editor versions will be the latest stable with the best performance. If I don't see the LTS label, I assume it's in some experimental or beta stage and has quite a few broken bits. As a related note, it's a shame that the latest LTS is 2018.4. This is 2020, and you'd hope that at a minimum we'd have an LTS 2019.x by now. It's disappointing and doesn't inspire confidence in the team.

    What is the primary motivation for you to use the LTS vs the non-LTS version of Unity?
    For the stability of the editor, compatibility with a wider range of packages from the Asset Store, and the performance improvements in compiled builds. This is based on the assumption that performance regressions were aggressively addressed and improvements were also found.

    When we say something is in Preview, what does that mean to you? Why do you feel that way?
    I see it as an experiment, limited to a few users who designate themselves as a sort of remote QA team. I feel that way because a lot of the preview packages I see require a very specific version of the editor (usually bleeding-edge versions), they are not intuitive to use and usually require the user to follow steps from a cloud document, they usually require some kind of special package or library imported alongside, or they have random dependencies on other versions of frameworks, editors, or APIs. It's all a bit of a mess, to be honest. Preview = Experimental to me.

    Does the expectation of quality change when you see something in Preview? What drives you towards using something in Preview? What keeps you away from using something in Preview?
    I addressed this in the question above. I think this question is pretty much a mirror of the one above it. I don't look at preview packages as robust, built-in parts of the editor. I usually expect them to be incompatible with an older LTS version and only work in bleeding edge versions.

    As an aside, I do understand that packages were supposed to make Unity more modular and slim down what users needed for their specific use cases. Instead, they fragmented the engine into a million mostly-incompatible pieces. Unity, from the user perspective, is no longer the product it used to be. I don't mean this negatively; in my view it's an opportunity.

    A single image defines my user experience:
     
    Last edited: Mar 15, 2020
  29. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,866
    @smcclelland thanks for your reply, here are my answers to your questions:

    are there examples of situations where this has burned you upgrading a Unity version and having the package thrash a project?

    My own experience was opening an HDRP project via Collab using the same exact Unity version that was used by the other party to create it. As it required the HDRP preview packages, I had to go to the Package Manager and download them. Lighting was entirely broken because the HDRP package had received a patch update (the z in x.y.z).

    In the instances where you upgrade editor versions, would you expect on project launch to be informed that the package also needed to be updated?

    I can't see how that could be useful. If I'm upgrading the editor, why should I also upgrade the packages? The project was created using specific package versions. If those packages were on Preview then using a different version involves a significant risk, as in the example above.

    At the very least, when I open a project the first time (e.g. via collab) I'd expect to be informed if the package versions I have installed match the versions that are being used in the project. A project may require a specific package version. Using a different version may require an upgrade process.

    Would you expect the package remained forwards compatible so the old version worked with the new version of Unity?

    Yes. Unless the package really requires something very specific from a recent Unity version, it should be forwards compatible. Packages shouldn't use features from the latest versions "just because they're there", but only because they really need them for their essential operation.

    I understand this is not possible for some "cutting edge" packages that require the latest Unity features. In these cases there should be different versions of the package aligned with the maintained Unity versions. For example, there should be a maintained HDRP version for Unity 2018. It should use a closed set of features, but it should be worked on and receive support and bug fixes. Same for every maintained Unity release. If a project is based on Unity 2019.1, then there should be a maintained HDRP version for Unity 2019.1. Having to upgrade the Unity version just to receive the fixes in a package is not acceptable.

    From the code side, what would you expect API or query wise to reason about what packages exist or are in use?
    • Preprocessor directives that allow code to be compiled selectively based on whether a package is available.
    • Preprocessor directives to determine the package version currently installed.
    Please understand that the above is critical for the sustainability of the Asset Store and for projects where code reusability is important. We need to be able to figure out the precise features and API available in the Unity version and project the code is being used in. This was easy before packages (as with platform-dependent compilation), but became hellish afterwards.
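    [Editor's note: recent Unity versions do offer a partial mechanism for the first point. An assembly definition (.asmdef) can declare "Version Defines" that emit a scripting symbol only when a named package is installed and its version matches an expression (available in Unity 2019.1+, as I understand it). A rough sketch of the asmdef JSON - the assembly name, package name, version expression, and symbol below are purely illustrative, and the expression semantics should be checked against the asmdef documentation:]

    ```json
    {
        "name": "MyCompany.MyAsset",
        "versionDefines": [
            {
                "name": "com.unity.render-pipelines.universal",
                "expression": "7.0",
                "define": "URP_7_OR_NEWER"
            }
        ]
    }
    ```

    [Code in that assembly can then branch with `#if URP_7_OR_NEWER ... #endif`. It doesn't cover everything asked for here - there's still no clean way to query the exact installed version from preprocessor directives alone - but it does address the package-present/absent case.]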

    In your eyes, is our release cadence too fast?

    Yes. Currently there's no way to "settle" and work with a reasonably stable set of features and API unless you adhere to the LTS releases and non-preview packages exclusively.

    You mention tying it to specific events so would you rather see us take a more long-term stance of the product and ship when it's ready instead of fixed target dates?

    Absolutely. I really like the approach described by @Deozaan, where there's a stable branch and features land there only when they're verified to be entirely complete:
    For example: all the new features reside in the beta branch until they're completed. Twice a year (e.g. March and September, aligned with the events on those dates) you take all the completed features and release a new stable version with them.

    If a feature is not fully completed, even if it's only about the documentation, then it gets postponed to the next stable release. Dev teams may then keep working on testing and stabilizing it even more, or start working on the next thing. The stable release itself may be delayed a month or two if deemed necessary. There should be flexibility to release a version when it's really ready to ship.

    Anyone wanting to use such features in advance could download the beta version anytime, as it should already be stable enough. Note that today's situation is very similar, as the Tech stream is pretty much equivalent to a beta branch.

    I know it's cool talking in the keynote at GDC/Unite like "Here's the latest and greatest Unity and you can download it right now!" "<Applause etc>". But that actually makes me REALLY afraid of it, as it means the teams had to rush to have it somewhat ready for that event. What I'd love to hear instead is something like "Here's the latest and greatest Unity, you can download it already from the beta stream, and we expect it to land in the stable branch in ~5 weeks when we finish polishing it. Give it a try!". I certainly would.

    Sounds like the confidence level in Preview for you is quite low?

    Yes. Unfortunately, that has been my experience with Preview packages so far: a minor package upgrade breaking working projects.

    How could we improve the quality of Preview packages for you and make it so you'd have a bit more confidence in our ability to deliver them?

    Adopt terminology that makes it easier to figure out the state of a package. I think "Preview" is too ambiguous. Alpha, Beta, and Stable are better and widely known terms. Again, I think the description from @Deozaan nails it:
    Use "Experimental" for stuff you're just toying with, but don't really know where it will lead yet. These change and break things anytime (nightly), and may be adopted or discarded later. Actually releasing an Experimental feature requires it to enter the Alpha - Beta - Stable cycle.

    Adopt a versioning system aligned with the Unity version lifecycle, so every Unity release has its corresponding maintained package versions.

    In summary:
    • Strict Alpha-Beta-Stable cycle with a clear, unambiguous protocol:
      • Alpha: very raw, lots of instability, lots of potential changes expected in the future (currently this is Preview and Alpha/Beta).
      • Beta: feature complete, mostly stable, and mostly fleshed out, but in need of finishing touches, bug fixes, and/or documentation (currently this is Tech).
      • Stable/Release: all of the above done, and very stable (currently this is LTS).
    • Two Stable releases a year: March (GDC) and September (Unite). These receive only the completed features from the beta branch. Flexibility to postpone the release dates.
    • Keep the latest 4 stable versions maintained. This is a lifecycle of 2 years per stable version.
    Packages:
    • Adopt the same strict Alpha-Beta-Stable cycle.
    • Versioning system aligned with Unity version lifecycle.
    • A branch of the package maintained for each stable version of Unity (max 4 at a time).
    • Forward-compatibility as long as the package doesn't require features from the latest Unity versions.
    • Experimental packages for toying with features without restrictions (currently this is Preview). Releasing an Experimental feature requires it to enter the Alpha-Beta-Stable cycle.
    • Critical for the Asset Store: Preprocessor directives to allow the code to be aware of the packages and APIs available.
     
    Last edited: Mar 17, 2020
  30. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    811
    @smcclelland This question is somewhat surprising, speaking as an Asset Store Publisher. This is the exact thing expected of all Asset Store publishers. If incompatibilities arise, you address them and either release a forwards-compatible version with your oldest-supported Editor or you add a new upload that covers that version of the Editor-and-newer.

    I guess I shouldn't be surprised here, though, as I've certainly run into many situations where Unity Technologies-published assets tend to be some of the worst-supported and least-forward-compatible products on the Asset Store. As an Asset Store publisher myself, I understand the challenges here: it takes a serious commitment to maintain your released projects. Currently it's hard to tell with Unity Technologies' packages (both PacMan and Asset Store) what will continue to receive support and what won't. This has to change.
     
  31. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    4,424
    Well, they treat it like Unity releases - as if there's no forward-compatibility requirement because everything is in the latest version. When they make a change to their shader they don't have to worry about which versions it will work in, because it only ships with that version of Unity - and they are treating packages as if it's the same, but it most definitely is not.

    Any real project is going to run into cases where different providers are on different versions of a package (most commonly the packages that are changing quickly). The simple "change and abandon old versions" approach doesn't work when releases are not synced between everyone involved.
     
    neoshaman, konsic, transat and 5 others like this.
  32. StephanieRowlinson

    StephanieRowlinson

    Joined:
    Jul 23, 2014
    Posts:
    120
    Tagging in my colleague @PatrickHogenboom, who's handling our VR implementation, to answer those questions. I just know it blew up; I'm not familiar with the details.

    Keeping APIs fixed until a major new version is always a good idea imo. The only problem I'd see for packages is the same one of communication: right now there's no clear channel to keep an eye on to see when the next major version will arrive.
     
  33. Lorash

    Lorash

    Joined:
    May 6, 2019
    Posts:
    215
    The survey's questions are stupid. I stopped halfway through because of the twisted framing around me "liking" or "tolerating" something or a lack thereof, when my problem is with the fact that there are packages at all. It's a failed experiment. Packages are just another thing you need to deal with and upgrade, but since package versions are strongly tied to Unity versions and you can't update them separately anyway, I just don't see the point. It does, however, nicely complicate things: your Unity version no longer determines your set of known and fixed bugs, since you also need to deal with the packages (and consequently end up updating your version of Unity anyway).

    Packages are also making the Unity experience fractured. Each package lives in its semi-isolated world and comes with its own way of doing things: Cinemachine uses public m_something fields as its naming convention, the Shader Graph serializes into a horrible double-layered one-liner whitespace-filled JSON kludge while everything else uses YAML (including the VFX Graph), some packages need MonoBehaviours, some packages need DOTS, etc. This means more work for package and asset authors, as they now need to deal with the almost 2^n complexity of which packages are and aren't being used. (For instance, in practice you can't really switch over to the new Input System entirely; you have a choice between the extra work of dealing with both systems in "Both" mode or just sticking with the old one, which defeats the purpose.) On the other hand, packages obviously have dependencies, so you're not really getting the theoretical (and mostly academic, to be honest) benefits of splitting the features apart.

    Additionally, things being moved from the C++ engine side to mostly-C# packages is making it very apparent that Unity's alternative asmdef-based way of building C# just doesn't scale as more and more code gets added. msbuild on the generated csprojs is about an order of magnitude faster doing a full rebuild than Unity is with an "incremental" one, comparing a "modern" project with a bunch of packages to something old with a "monolithic" Unity. This alternative system also means that none of the community msbuild-based tools work, including many great NuGet packages. Speaking of NuGet, Unity packages being Java-style packages managed with node.js instead of the de facto .NET standard way of dealing with packages just reeks of Not Invented Here syndrome.

    ---
    1. What is your understanding of the Unity tech releases (eg 19.1, 19.2…) vs Unity LTS releases?
    Anything but the LTS would be considered beta or preview in most sensible companies (and I'm fully aware that those companies are rare).
    2. When you see a new Unity release, what is your expectation of quality?
    Low. If it's new it's very likely it needs at least five more rounds of patches.
    3. Do you expect different levels of quality or completeness based on the release distinction?
    Yes, but not in a balanced way; more like "there might be some quality at the LTS end of the spectrum".
    4. What is the primary motivation for you to use the LTS vs the non-LTS version of Unity?
    When I want to do actual work, that's LTS.

    And more package specific questions we would like feedback on:
    1. When we say something is in Preview, what does that mean to you? Why do you feel that way?
    It translates to "this is bad even by Unity standards".
    2. Does the expectation of quality change when you see something in Preview? What drives you towards using something in Preview? What keeps you away from using something in Preview?
    Not fundamentally; it might shift from "you can't really use this anyway" to "don't even dream about using this one".
     
    Edy likes this.
  34. Lorash

    Lorash

    Joined:
    May 6, 2019
    Posts:
    215
    Oh, and this, so much this. I've stopped checking out betas and giving feedback in the beta forums because why bother.
     
    n3xxt and neoshaman like this.
  35. StephanieRowlinson

    StephanieRowlinson

    Joined:
    Jul 23, 2014
    Posts:
    120
    Okay, I'm almost back up to date with reading this thread again and I just want to take a moment to re-emphasise the need for better communication with regards to packages.

    Right now I have no idea who's behind most of the packages, what they actually plan to do with them, or when they roughly expect to drop them. I can probably find all that information through a combination of Google searches and forum trawling. The point is that I shouldn't have to. Don't just throw out a blog post and have maybe one talk at an event be the primary source of documentation for a feature for a year! Have an easy-to-find place for learning which packages currently exist, with links to their documentation and the development plans for them.

    Speaking of the event talks, I love them, but your YouTube channel is easily the best source of documentation you have at the moment. For a few years now I've seen people getting directed to the latest talk for ECS documentation. What's almost worse, imo, is that some presenters use the Unity Slideshare page to share their slides, and when I mentioned it to the head of documentation at Unite, he didn't even know you had one.

    I know I keep hammering on documentation, but if we actually knew what features were supposed to do and how to use them, we'd all be a lot less frustrated with them. Half the time now a new feature is talked up in a blog post, and once you've found it (blog posts rarely specify whether it's a package or not), half the functions mentioned are on some nebulous roadmap that we'll never see.
     
    Hypertectonic likes this.
  36. Lorash

    Lorash

    Joined:
    May 6, 2019
    Posts:
    215
    No need to ride the BS train. I worked at a company with shameful quality procedures, but everything was technically compliant and they kept growing their shiny ISO number collection. Those certifications mean almost nothing in practice.
     
    jbooth and Lars-Steenhoff like this.
  37. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    4,424
    There's no mandate or certification which can prevent mediocrity. In fact, the gravity of the world pulls to it, as it's the average after all. It actually gets harder and harder to prevent as you scale, as every social norm, every decision norm, they all pull to the average in order to maintain social cohesion. In most cases, the velocity you created as a small company is what floats the large company for many years. Unity built up a ton of that velocity from enabling novice developers and small teams- which ironically the current environment is moving away from. On the flip side, they've leveraged that velocity to take bold risks and giant leaps as well (HPC#, etc). A ton of management practices are directly about managing this problem without explicitly stating it- scrum, hierarchies, competing sub-companies, etc, are all ways to attempt to have high velocity groups which can escape the gravity of mediocrity within a larger structure. The disruption needed to maintain that level of quality is often a major cause of friction within the social norms, and often crushed in the name of them.
     
  38. Metron

    Metron

    Joined:
    Aug 24, 2009
    Posts:
    1,045
    Unfortunately, this is not correct. In fact, some ISO certifications are needed to work in specific domains (e.g. medical). So, no BS, just plain sense if you're running a professional business outside the gaming industry.
     
  39. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,726
    It has no bearing on quality even if mandatory (unfortunately). It could even make matters worse, since it's trivial to comply - but if appearing to comply is all you do...
     
    Lorash and StephanieRowlinson like this.
  40. transat

    transat

    Joined:
    May 5, 2018
    Posts:
    772
    I would never go to any medical clinic that employs former Unity QA staff, no matter how many ISO certifications that facility has. :)
     
    Lorash and Peter77 like this.
  41. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,726
    The QA staff are the heroes here though... IMHO. They have to go through some (thousands of) really brutal and s***ty reports, lacking examples and so on, then verify that swill before enacting the Shawshank redemption and coming out the other side free and cleansed, holding high a single valid bug report.

    So please don't attack QA :)
     
  42. KokkuHub

    KokkuHub

    Joined:
    Feb 15, 2018
    Posts:
    445
    Seems those heroes need reinforcements:
    https://forum.unity.com/threads/crashes-after-updating-to-unity-2019-3-4-2000-on-android.845293/
     
    Lorash likes this.
  43. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,726
    True, but let's all stay on topic as best we can from here on out.
     
    KokkuHub likes this.
  44. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    4,424
    Everything about what Unity is trying to do is hard; no one is slacking or incompetent, it's just very hard to handle this many moving parts and not have issues. And once you do, it can be hard to change direction. Packages, SRP, and DOTS are all attempts to solve previous issues and deliver a better product, and the foresight required to see the problems they would create ahead of time often requires having made that mistake before, or sitting at a specific vantage point. The best thing for Unity to do right now is pause and evaluate, which is part of the reason they created this thread.
     
  45. Voronoi

    Voronoi

    Joined:
    Jul 2, 2012
    Posts:
    315
    I am a heavy user in my personal projects, but I also teach. Our university has a large medical school, and they are keen to find students to work in Unity for multiple VR/AR projects at the graduate and undergraduate level.

    Teaching Unity to undergraduates last spring was a painful experience. I can't just focus on the aspects of learning the game engine, I have to explain about Packages and deal with outdated learning materials, both from the Unity example projects and things students find on their own.
    Working in AR, I pay no attention to the designation. My assumption is that any new version is a bug fix or an update to use the latest tech from mobile devices, so I always immediately upgrade to the latest package regardless of its designation.
    Honestly, I find the release versions to be no more or less stable than the previews. I tend to push the tech - I want to use the VFX Graph in AR with URP - so I upgrade to the latest versions of all of them. Occasionally, when one package upgrade breaks another package, I get upset and frustrated. But what is more common is that the upgrades fix bugs I encountered before.

    I can tolerate a lot of fiddling for my personal projects, but as a teacher I find that students question my advice when the product seems unstable. I'm hesitant to teach this again, except to graduate students, who are more likely to be willing to fiddle as much as needed for a thesis project. My advice is to simplify things somehow.

    Maybe templates are the answer in some way, but I wouldn't want to have a complex template that people learn, only to find the template changes. That would be like learning two game engines and workflows.

    Edit: Having caught up on the thread somewhat, I agree the SRP concept is flawed, and having URP/HDRP as a fork in the road is a real barrier at the beginning of a project. I do think a single HDRP option, with the ability to 'turn off' things until it becomes URP, would make everyone's life a lot easier.

    When I first heard the concept of SRP, it seemed like it might be cool: people could release and develop SRPs outside of Unity, and we'd have all kinds of options for rendering. The reality is, who would ever do that outside of an internal company pipeline? Right now I have to pick URP or HDRP and can never go back, and I would be in the same boat (or worse) with some other developer's SRP. It's good that SRP is there, but Unity should focus on a single HDRP>URP pipeline and not force us to choose, as others have stated.
     
    Last edited: Mar 16, 2020
  46. Deozaan

    Deozaan

    Joined:
    Oct 27, 2010
    Posts:
    680
    Disagreed.

    I just reported a bug (case 1227998) in 2017.4 LTS. I got a response saying:

    Let that sink in for a moment.

    I'll wait.

    Think about it a little longer.

    Hmm... What does LTS stand for again? It seems Unity QA thinks it stands for Lacking the Tiniest of Support.

    No information was given about plans to backport the fix to 2018.4 LTS or 2017.4 LTS. And even though it seems from QA's response that the bug is a known issue, it does not appear in the list of Known Issues for the 2017.4 LTS release notes (I didn't check whether it appears in 2018.4 LTS release notes).

    The implication is that 2017.4 LTS and 2018.4 LTS are no longer supported, the bug is not going to be fixed in those versions, and that I need to "upgrade" my project to the latest beta-quality TECH release to get any support.

    I stick to LTS versions, and for the last few bug reports I've made, the response has been, in essence, "we have no plans to fix it [in the LTS version]." Why should I put effort into making good bug reports? It's just a waste of my time when the response is always "won't fix." In fact, I'm strongly considering not bothering to make any bug reports anymore. I'm trying to help make Unity a better product, but I don't feel my time and efforts in that regard are truly appreciated.
     
    Last edited: Mar 16, 2020
  47. JoNax97

    JoNax97

    Joined:
    Feb 4, 2016
    Posts:
    372
    That illustrates the core of the problem perfectly. Why come up with this whole tiered structure of support levels and compatibility cobwebs if, in the end, you're going to ignore all that and focus only on the latest version?
     
  48. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    995
    2019.3 is good already; it's only a month or so away from LTS.
    It still needs fixes, though: script compilation takes too long, adding/deleting Shader Graph nodes takes too long, and HDRP grass is not supported yet.

    It is the better way. Unreal has only 2 or 3 hotfixes per version build, then they move on to the next release, which supports projects from the previous release.

    I think Unity should adopt the same project plan to improve faster.
     
    Last edited: Mar 16, 2020
  49. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    5,206
    I've had this experience a few times myself. It's frustrating when you're on LTS and the first response is "Please upgrade to latest". :)

    However... after the "upgrade to latest" response, I often replied explaining how important it is to me to get it fixed in LTS. In several cases, QA then turned the report into a Case number and the bug was eventually fixed.

    Don't be afraid to let them know how important a particular bug-fix is for you!

    If you treat QA nice, they treat you nice too. Just my experience of course. :)
     
  50. JoNax97

    JoNax97

    Joined:
    Feb 4, 2016
    Posts:
    372
    I agree with you and would like the same, but let's not be naive: Unreal can afford that because they've built an extremely solid base.

    And I'm sure that, if the right decisions are made, Unity can be there in 1 or 2 years, when the new tech stack stabilizes and some of the old stuff is finally gone. But in the meantime, Unity has to choose how to proceed: either double down on the current strategy and fix the issues it brings, or take a different approach. Either way is fine with me, but this uncomfortable middle ground is benefiting no one.
     
Thread Status:
Not open for further replies.