Unity C# 8 support

Discussion in 'Experimental Scripting Previews' started by JesOb, Apr 18, 2019.

  1. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    Note that when we ship C# 8 support in Unity, we will not have default interface implementations. That feature requires work in the runtime which we have not completed yet for Mono and IL2CPP. I would expect that work to be shipped in the 2021 release cycle.
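    For context, this is the kind of C# 8 code that needs that runtime work before it can run on Mono or IL2CPP (a minimal sketch; the IDamageable and Crate names are invented for illustration):
    Code (CSharp):
    using UnityEngine;

    // Hypothetical interface using a default interface implementation,
    // the C# 8 feature that requires runtime support.
    public interface IDamageable
    {
        void ApplyDamage(int amount);

        // Default implementation: implementers inherit this body
        // unless they provide their own.
        void Kill() => ApplyDamage(int.MaxValue);
    }

    public class Crate : MonoBehaviour, IDamageable
    {
        [SerializeField] int health = 10;

        public void ApplyDamage(int amount)
        {
            health -= amount;
            if (health <= 0)
                Destroy(gameObject);
        }

        // No Kill() here: the default member is used, but it is only
        // reachable through the interface type, e.g. ((IDamageable)this).Kill().
    }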
     
    Qbit86 likes this.
  2. Awarisu

    Awarisu

    Joined:
    May 6, 2019
    Posts:
    215
    It would be really nice if we could just get the Roslyn update into 2020 (and 2019 - I know it won't happen but one can dream) to get started with the language-only features. I think everyone here agrees that default interface implementations would be great to have but it's a tall order.
     
    GiraffeAndAHalf likes this.
  3. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    This is what we are working on now. We're planning to ship it in 2020.2 though, not any 2019.3 versions. I'll ping here when I know that it will be in.
     
    Qbit86, CaseyHofland and Awarisu like this.
  4. CptBertorelli

    CptBertorelli

    Joined:
    Apr 9, 2020
    Posts:
    20
    Any news on when C# 8 could be available for Unity?
     
    Qbit86 likes this.
  5. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    No, nothing to report yet.
     
  6. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    Hey, I'm happy to announce that C# 8 support has landed into a release branch internally. It will be in the 2020.2.0a12 release of Unity, which should be available in about two weeks.
     
  7. jGate99

    jGate99

    Joined:
    Oct 22, 2013
    Posts:
    1,945
    will it have support for default interfaces?
     
    Ghat-Smith and GiraffeAndAHalf like this.
  8. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    No, we've not added support in the runtimes for default interfaces yet.
     
    hippocoder and Qbit86 like this.
  9. Muuuuo

    Muuuuo

    Joined:
    Apr 16, 2015
    Posts:
    57
    Super excited to keep on waiting for default interfaces!
     
  10. Ramobo

    Ramobo

    Joined:
    Dec 26, 2018
    Posts:
    212
    How complex is it that it can't be backported?
     
    Qbit86 likes this.
  11. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    The complexity is not too high. The decision is more about risk - we'd like to avoid introducing risk into releases that are farther along in the release process.
     
    EZaca and JoNax97 like this.
  12. Ramobo

    Ramobo

    Joined:
    Dec 26, 2018
    Posts:
    212
    Can it really be that bad for QA? It's pretty much just a compiler version update. As far as the users are concerned, at least.
     
  13. CaseyHofland

    CaseyHofland

    Joined:
    Mar 18, 2016
    Posts:
    613
    I'm happy that this has been communicated so well. Even though I too crave default interfaces, backwards compatibility is never easy and I'd rather it be done right.
     
  14. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    Everything is a matter of trade-offs, engineering time spent on this vs. other features, new things vs. stability, etc. At this point we've decided not to back port it. But that decision could change. I guess what I want to say is there is not a technical limitation preventing us from back porting C# 8 support.
     
  15. jGate99

    jGate99

    Joined:
    Oct 22, 2013
    Posts:
    1,945
    Yes, it's better that engineering time is spent on future builds than on older ones; this way we can get features like default interfaces more quickly.
     
    GiraffeAndAHalf likes this.
  16. Ramobo

    Ramobo

    Joined:
    Dec 26, 2018
    Posts:
    212
    Yeah, no. The engineering effort of backporting the compiler update is presumably quite low compared to the one required for the Mono runtime update (required for default interface implementations), coming in 2021 at the earliest. They're better off backporting to at least 2020.1. Happier programmers. Pretty sure there's a feature freeze on 2019.3/4.
     
    Last edited: May 8, 2020
    firstuser, sand_lantern and Awarisu like this.
  17. transat

    transat

    Joined:
    May 5, 2018
    Posts:
    779
    Agree. Thanks @JoshPeterson for comms done right (it's refreshing not to have to wait months for an official reply) and for keeping your cool under duress.
     
    Awarisu likes this.
  18. Little_Turtle

    Little_Turtle

    Joined:
    Jan 29, 2017
    Posts:
    9
    I can understand you. I was quite happy when this was introduced in the Java language spec. But then again, nullable reference types... :)
     
  19. Little_Turtle

    Little_Turtle

    Joined:
    Jan 29, 2017
    Posts:
    9
    I should have been more precise, I guess. I want to have nullable reference types. Nullable value types are usually of no use to me, so I tend to ignore them. Also, I come from the Java/Kotlin bandwagon, so I make no such distinction.

    For everyone interested: https://docs.microsoft.com/en-us/dotnet/csharp/nullable-references
    Also have a look at: https://devblogs.microsoft.com/dotnet/embracing-nullable-reference-types/
     
    Last edited: May 12, 2020
  20. Awarisu

    Awarisu

    Joined:
    May 6, 2019
    Posts:
    215
    Those come for "free" with a compiler upgrade (it's just a bunch of attributes at runtime), however for it to truly be useful Unity APIs would need to be made nullable-ref-aware, and ideally you should be able to make serialized fields non-nullable with the Editor catching if they're ever assigned None. Luckily these can be done independently from each other.
     
  21. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,338
    What would that look like? Errors if you open or load a scene that has non-nullable fields which are null? That's going to be pretty slow; there are a lot of fields to look at!

    Still, it could be very useful to be able to specify which fields have to / don't have to be filled. One of the most common errors I make when developing new features is to add a field, think to myself that I'm going to set it, forget to set it, enter play mode, get an exception, exit play mode, and set the field. An optional check on entering play mode is something I'd definitely try, to see if the cost of the check is worth it.
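    Something like that can already be prototyped with editor scripting; a rough sketch, assuming you only mark the fields you care about with a hypothetical [Required] attribute instead of scanning every serialized field:
    Code (CSharp):
    using System;
    using System.Reflection;
    using UnityEngine;
    #if UNITY_EDITOR
    using UnityEditor;
    #endif

    // Hypothetical marker for Object references that must be assigned in the Inspector.
    // (In a real project this would live in a runtime assembly.)
    [AttributeUsage(AttributeTargets.Field)]
    public class RequiredAttribute : Attribute { }

    #if UNITY_EDITOR
    [InitializeOnLoad]
    static class RequiredFieldCheck
    {
        static RequiredFieldCheck()
        {
            EditorApplication.playModeStateChanged += state =>
            {
                if (state != PlayModeStateChange.ExitingEditMode)
                    return;

                // Scan every MonoBehaviour in the open scenes for [Required] fields left unassigned.
                foreach (var behaviour in UnityEngine.Object.FindObjectsOfType<MonoBehaviour>())
                {
                    var fields = behaviour.GetType().GetFields(
                        BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic);

                    foreach (var field in fields)
                    {
                        if (field.GetCustomAttribute<RequiredAttribute>() == null)
                            continue;
                        if (!typeof(UnityEngine.Object).IsAssignableFrom(field.FieldType))
                            continue;

                        // Unity's overloaded == treats "None" the same as null.
                        if ((UnityEngine.Object)field.GetValue(behaviour) == null)
                            Debug.LogError($"{behaviour.name}.{field.Name} is not assigned", behaviour);
                    }
                }
            };
        }
    }
    #endif
    With something along these lines, entering play mode logs an error that pings the offending object instead of throwing a NullReferenceException later.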
     
  22. Little_Turtle

    Little_Turtle

    Joined:
    Jan 29, 2017
    Posts:
    9
    I do not care that much. C# 8 itself uses a don't-ask-don't-tell policy: you have to enable nullable reference types per library / code fragment. If you use something that does not have nullable reference types enabled, the compiler switches to a mode for those references that is essentially "either way" and issues neither errors nor warnings.

    So there is an evolutionary upgrade path where one can enable it, with all the benefits, without being forced to upgrade the rest.

    Anyway, I just want to get rid of this pesky if (x != null) stuff all over the place. I've disliked it for decades.
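    The opt-in really is that granular; a small illustration (LegacyUtils stands in for any library compiled without nullable annotations):
    Code (CSharp):
    #nullable enable
    using UnityEngine;

    public class Greeter : MonoBehaviour
    {
        string? nickname; // explicitly nullable: the compiler pushes callers to check

        void Start()
        {
            // LegacyUtils has nullable annotations disabled, so its return value is
            // "null-oblivious": the compiler raises neither warnings nor errors here.
            string greeting = LegacyUtils.GetGreeting();
            Debug.Log(greeting.Length);

            if (nickname != null)
                Debug.Log(nickname.Length); // flow analysis: no warning inside the check
        }
    }

    #nullable disable
    // Anything below the directive (or in assemblies built without it) behaves
    // exactly as it did before C# 8: no nullable warnings at all.
    static class LegacyUtils
    {
        public static string GetGreeting() => "hello";
    }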
     
  23. danm36

    danm36

    Joined:
    May 18, 2016
    Posts:
    10
    I'm wondering how exactly the null coalescing (?? and ??=) and null conditional (?. and ?[]) operators will work with Unity classes. For example:
    Code (CSharp):
    Destroy(someComponent);
    if (someComponent != null)
    {
        someComponent.DoSomething();
    }
    The if statement isn't entered as Unity overrides the == and != operators such that they equate to null if the object is destroyed, even though the variable reference itself is still assigned. You couldn't do
    Code (CSharp):
    Destroy(someComponent);
    someComponent?.DoSomething();
    because someComponent isn't actually a null reference, it just points to a destroyed Unity object, and the method will still be called even though it shouldn't. As far as I'm aware, you cannot override these specific null operators either. Has Unity found a way internally to handle this for Unity objects, or is this something to be wary about when we use these features? (Or will Destroy now have a 'ref' form that sets the reference to actual null itself?)
     
  24. Awarisu

    Awarisu

    Joined:
    May 6, 2019
    Posts:
    215
    ??, ??=, ?., and ?[] use the equivalent of an is null check instead of == null.

    This is not ideal, but the overloaded equality operators are legacy that will stay with us for probably as long as there are GameObjects and MonoBehaviours in the engine. Changing them would pretty much break every piece of Unity code there is.

    If you hate checking for != null every time like I do, there's a slight improvement: UnityEngine.Object also overrides operator bool, so you can check if something is valid C++-style: if (mycomponent) { blah }
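    A small demonstration of the difference, in case it helps (a sketch only; Mover is a made-up component, and DestroyImmediate is used so the destruction takes effect within the same frame):
    Code (CSharp):
    using UnityEngine;

    // Hypothetical component used only to show the difference between
    // Unity's overloaded equality operators and the C# null operators.
    public class Mover : MonoBehaviour
    {
        public void DoSomething() => Debug.Log("called");
    }

    public class NullCheckExample : MonoBehaviour
    {
        void Start()
        {
            Mover mover = gameObject.AddComponent<Mover>();
            DestroyImmediate(mover);

            // Unity's overloaded != treats the destroyed object as null,
            // so this branch is skipped.
            if (mover != null)
                mover.DoSomething();

            // ?. compiles down to a real reference check, which Unity cannot
            // overload, so the destroyed wrapper is NOT considered null and
            // DoSomething() still runs.
            mover?.DoSomething();

            // operator bool is overloaded too, so the C++-style check is safe.
            if (mover)
                mover.DoSomething();
        }
    }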
     
  25. danm36

    danm36

    Joined:
    May 18, 2016
    Posts:
    10
    Wow, somehow I completely missed that when reading up on the Unity docs. That definitely makes sense though, and follows other C++ patterns I've used in the past. While I would love it if Unity somehow got ?. and ?? working, I'd understand if it couldn't happen. I figure that when this eventually comes out, a gotcha should probably be indicated on the announcement posts and whatnot for newer developers.
     
  26. Awarisu

    Awarisu

    Joined:
    May 6, 2019
    Posts:
    215
    danm36 likes this.
  27. TheZombieKiller

    TheZombieKiller

    Joined:
    Feb 8, 2013
    Posts:
    266
    Am I right to assume that this won't coincide with an upgrade to .NET Standard 2.1, meaning we'll still need to bundle System.Memory and friends to have access to types such as Span<T> and Memory<T>?
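    For anyone wondering what that unlocks, here's the kind of code that needs Span<T> to be available, whether from .NET Standard 2.1 or from a bundled System.Memory (a minimal sketch):
    Code (CSharp):
    using System;
    using UnityEngine;

    public class SpanExample : MonoBehaviour
    {
        void Start()
        {
            // Stack-allocated scratch buffer: no GC allocation at all.
            Span<byte> buffer = stackalloc byte[32];
            for (int i = 0; i < buffer.Length; i++)
                buffer[i] = (byte)i;

            // Slicing is a view over the same memory, not a copy.
            Span<byte> firstHalf = buffer.Slice(0, 16);
            Debug.Log(firstHalf.Length); // 16
        }
    }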
     
    andywatts, JesOb and Qbit86 like this.
  28. Ramobo

    Ramobo

    Joined:
    Dec 26, 2018
    Posts:
    212
    Sounds like a no to me.

    Of course, they could just get off their asses and update their runtime and libraries right now to Mono's latest and then .NET Core, in preparation for .NET 5, making the transition much easier, but no — DOTS Visual Scripting is more important.
     
    Vincenzo likes this.
  29. Wings1412

    Wings1412

    Joined:
    Feb 9, 2019
    Posts:
    4
    Were any changes made to simplify updating the language version?

    I'm sure you are aware that Microsoft gave a preview of C# 9 during BUILD this year, which is supposed to come as part of .NET 5 in November. I understand that the shift to .NET 5 will be complex, but with the work done to enable C# 8, will C# 9 be easier to migrate to?
     
  30. Ramobo

    Ramobo

    Joined:
    Dec 26, 2018
    Posts:
    212
    The C# 8 update they're doing is just a compiler update, so runtime features like default interface implementation will be unavailable (the compiler throws an error). Same goes for C# 9: A compiler update is easy, but any new features that require a new runtime will be unavailable.

    If they update their class libraries to .NET Standard 2.1 by updating their Mono fork, it will be easier to migrate to .NET Core. If they migrate to .NET Core, migrating to .NET 5 shouldn't require much effort, as Microsoft wants to make .NET Core 3.1 -> .NET 5.0 as seamless as possible.
    Of course, see my rant above. They won't do that as soon as they should.
     
    Vincenzo likes this.
  31. Wings1412

    Wings1412

    Joined:
    Feb 9, 2019
    Posts:
    4
    Yep, my question was only in regard to the language version itself, not the framework or runtime; I understand that this is a complex project. However, it would be ideal if we were able to get the latest language version shortly after it is released.
     
    Vincenzo likes this.
  32. Vincenzo

    Vincenzo

    Joined:
    Feb 29, 2012
    Posts:
    146
    Unity should grab all the engineers from Burst, Jobs, ECS, and DOTS and let them work full time on the move to the .NET Core runtime, RyuJIT, and the latest generational GC. Three months later we would all have what we need, which is a fast game engine, and we could then just forget about the fiasco that Unity got into.
     
    Protagonist and Ramobo like this.
  33. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    792
    That sounds easy, but it is not: developers on other teams have different specialties and would have to be trained first, the teams would have to be reorganized, the original team would be slowed down, and in the end it takes months to make up for the lost time. And the task for which the restructuring was done may be finished before that, so you have lost time in the end and the other projects were stopped unnecessarily.
     
  34. Vincenzo

    Vincenzo

    Joined:
    Feb 29, 2012
    Posts:
    146
    You have absolutely no idea about the teams in Unity. Stop making assumptions. They have top engineers who know a lot about the engine and about C# working on the useless project that is Burst. Some of them even did a first port to .NET Core in a week. They know how to do it. They know what to do. Nothing has happened for three years. The situation is embarrassing.
    https://xoofx.com/blog/2018/04/06/porting-unity-to-coreclr/
     
    Last edited: May 24, 2020
    Protagonist and Ramobo like this.
  35. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    792
    The same goes for you.
    I have heard that short-term redistribution of personnel creates more chaos than it solves. In cases where people work on similar things, or only a few people move, it can still work. But C# developers working on a runtime written in C++ are probably in the wrong place.
    Teams only have a deep understanding of the part of the software they are working on. Here in the forum you can often see Unity employees refer to other employees when questions come up that affect other parts of Unity.
     
  36. Ramobo

    Ramobo

    Joined:
    Dec 26, 2018
    Posts:
    212
    The point here is that a few guys prototyped this in a week over two years ago and it was never picked up officially. Instead, they're wasting unbelievable amounts of time and money on shiny performance-improving technologies that require significant user investment. That time could instead be used to port to .NET Core and get matching performance improvements for free (from the users' perspective) and for all users, in the editor as well, not just in final builds.
     
    Last edited: May 25, 2020
    Awarisu and Vincenzo like this.
  37. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    792
    It has already been discussed here that you cannot simply replace the runtime, even if the swap itself is easy. It's also about the whole ecosystem, both the whole set of Unity packages and third-party providers. As far as I know, Standard 2.1 is not compatible with Standard 2.0: you cannot use Standard 2.0 libraries / packages in Standard 2.1 projects. And it will probably not be possible to offer both versions, since all packages and the C# part of the core engine would have to be available in both versions.
    RyuJIT is fast, but whether RyuJIT is faster or the same in games is uncertain without real comparisons in real games. I think that in games Burst is superior to RyuJIT, and it also builds on proven technologies (Clang / LLVM).
     
    JesOb likes this.
  38. Ramobo

    Ramobo

    Joined:
    Dec 26, 2018
    Posts:
    212
    If we keep this line of thought we'll stay with Mono forever, even when it's inevitably deprecated after being merged into .NET 5. Jesus, was Unity silently acquired by Microsoft and now puts backwards compatibility above all else?

    Well, how about Unity does an official prototype? It can't take more than a few months to make something decent considering that a few of their employees, with 2 years less experience, did a very rudimentary, but functional, prototype in a week. There's a reason they used their hackweek for that: They want it to become official, and xoofx's post proves it.

    RyuJIT has also been proven for over 6 years. .NET Core has also been proven, including Microsoft being confident enough in the stability of their previews that they redirect increasing amounts of traffic to their websites to versions running on the latest .NET preview. Epic Games plans to port Fortnite to UE 5 halfway into its preview cycle. Unity, meanwhile, doesn't even have a dogfooding game and their demos completely break after a few engine releases.
     
  39. Awarisu

    Awarisu

    Joined:
    May 6, 2019
    Posts:
    215
    The comparison is not even RyuJIT vs Mono; it's RyuJIT vs Unity's ancient Boehm-GC Mono setup. You can comfortably expect a 2x speedup of C# code for literally zero effort on your part as a Unity developer if that change is made. And that's the lower end: in the aforementioned article Unity devs refer to 2x-5x, with up to 10x in special cases, which is more in line with my experience of working with Mono and the two CLR flavors outside Unity. Even SGen Mono is an improvement; it's that bad. The only reason it hasn't happened yet, as far as I'm aware, is that Unity's C++ makes some pretty strong assumptions all over the place about, e.g., exactly how badly ancient Mono's GC does things, so you can't just drop in a better runtime or it would break spectacularly. They've been adding the proper barriers on the C++ side to prepare, so hopefully the switchover can happen Soon™, but I wouldn't wait for it.
     
  40. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    792
    .NET Standard 2.1 and C# 8 have already been confirmed; I was only concerned with the fact that an upgrade must be well planned and organized. Otherwise there are only problems and outcry in the community.

    The speed increase of .NET Core is not only due to the runtime, but also due to a more efficient framework.
    It has already been confirmed that the DOTS runtime will run on .NET Core.
     
  41. Ramobo

    Ramobo

    Joined:
    Dec 26, 2018
    Posts:
    212
    DOTS is already doing quite well in the editor on Mono. I wonder how well DOTS will perform on .NET Core versus .NET Core with plain GameObjects. The question right now is whether DOTS is a waste (i.e. whether the management people keeping it alive need to be fired).
    Benchmarks have already confirmed that RyuJIT is faster than Mono — usually by a few times — in all tests, faster than IL2CPP in about half of the tests, and comparable to Burst in half of the tests. Unfortunately, the thread was locked for some bullshit reason when I asked for CoreRT tests; I'm not sure how closely related those two are. So users can't run tests just because the company also does?
     
    Vincenzo likes this.
  42. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    792
    DOTS, especially ECS, pays great attention to the memory layout. This is important regardless of the language / runtime. After all, the developers and not the runtime are responsible for the memory layout.

    Unfortunately, tests don't say much about performance in a real gaming environment. That's why you don't yet know how RyuJIT behaves in certain situations.
     
  43. Ramobo

    Ramobo

    Joined:
    Dec 26, 2018
    Posts:
    212
    And just like that, we've circled back to the official prototype argument.
     
  44. Honestly, I've been watching this argument for days now and I still don't understand what your problem is. They are working on both (at least they have stated that they are). So? Are you jealous that they're also working on things you don't like or need, or what?
     
  45. Ramobo

    Ramobo

    Joined:
    Dec 26, 2018
    Posts:
    212
    The problem is that this damn prototype could and should have been here years ago, benefitting everyone. Meanwhile, they're wasting time on DOTS when they could get comparable performance improvements with .NET Core. I don't have a problem with them working on DOTS for an extra boost — after they get the RyuJIT boosts. They're marketing DOTS as some sort of savior (implied to be the only hope for performance in Unity) as if it's easier to make DOTS (has been in development for what, 4 years?) than to port to .NET Core (could probably be done in a single focused release cycle, considering how long the unofficial prototype took).

    "hur dur but backwards compatibility what about these code changes that some people might have to do"
    One could not (well, should not) give less of a S*** about backwards compatibility when it's a barrier to great progress such as tremendous performance boosts.
     
    Vincenzo likes this.
  46. A single 'yes' would have sufficed.
     
  47. Ramobo

    Ramobo

    Joined:
    Dec 26, 2018
    Posts:
    212
    No, it wouldn't. Just dig a little. There are plenty of people dissatisfied with the current state of Unity — the blog comments seem to require little research. Just look for a post with plenty of comments. That's not to discredit the forum as another source.
     
    Qbit86 likes this.
  48. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    792
    DOTS solves other problems; the DOTS boost is many times larger than the boost from a faster runtime alone. You still get cache misses from random memory access with RyuJIT, because that is determined by the software architecture, not the language or runtime.
     
    iamarugin and JesOb like this.
  49. Ramobo

    Ramobo

    Joined:
    Dec 26, 2018
    Posts:
    212
    How shall I put this... DOTS, at least at the moment, requires a lot of investment and is 100% useless to those sticking with GameObjects, except in the context of Unity systems backed by DOTS. To the best of my knowledge, the only part of DOTS that might back a Unity system that works with GameObjects is the Job System, not ECS or Burst.
    .NET Core, meanwhile, benefits everyone: faster domain reloads without having to adapt your code to the enter-play-mode options, and just sheer performance — all without having to change anything unless you're using a .NET Framework feature that isn't available in .NET Core, a short list whose most notable item is AppDomains. Last time I checked (right about now), "everyone" was a much bigger group of people than "those interested in reorienting towards DOD".
    They should focus on a .NET Core migration simply because it will majorly benefit everyone while taking much less time than DOTS.

    I think that's a better point of view.
     
    Waz, Qbit86, slimshader and 2 others like this.
  50. No, they should do both. But I think you're missing the point. In order to move the entire engine, I believe they need to rewrite a huge amount of the core, since they are running on a highly customized Mono in order to operate on all of the supported devices.
    And as far as I (we) can tell, that's exactly what they're doing.

    I don't understand why it's a pain for you that they're working on other things too. Do you have a problem with the Stadia support too?
     
    OndrejP, Awarisu and JesOb like this.