
Unity Future .NET Development Status

Discussion in 'Experimental Scripting Previews' started by JoshPeterson, Apr 13, 2021.

  1. Thaina

    Thaina

    Joined:
    Jul 13, 2012
    Posts:
    1,168
    What I want to say is, he is one of the sources, and his link is the official source.

    A source may be false, I am not saying it must be true, but if you're asking for a source, then he is one.

    If you're asking whether it's really true, then I won't argue about this.
     
    Eclextic and Spy-Master like this.
  2. tannergooding

    tannergooding

    Joined:
    Jun 29, 2021
    Posts:
    30
    I've only provided context into what we consider in the official .NET API review, which, as said, is live streamed weekly. The videos all exist on the .NET Foundation YouTube channel if people want to watch them :)

    The framework design guidelines are also documented in a book that comes with additional context/annotations from those of us on the .NET Team who are part of the .NET API Review group. The third edition was released a couple of years back and covers a lot of the changes between .NET Framework and .NET [Core], including better documentation around why things may be considered "bad".

    Another example of the potential problems with default parameters vs a new overload from the library side of things is if that default value will ever change. Consider for example the constructors `List<T>()` and `List<T>(int capacity)`. We could of course have exposed this as one constructor with an optional `int capacity`, however the "ideal" capacity might also change over time and the way default parameters work is that the compiler producing the IL embeds the literal constant directly into the IL stream.

    So, if we defined `List<T>(int capacity = 4)` and we determine that `16` is better for modern apps, we can certainly go and change the signature to `List<T>(int capacity = 16)`. However, anyone who has already compiled cannot see this change until they likewise recompile.
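    To make the call-site embedding concrete, here's a minimal sketch (hypothetical `Widget` type, not a real BCL API) of what happens with a default parameter:

```csharp
using System;

class Widget
{
    // Library v1 ships with this default.
    public static int Create(int capacity = 4) => capacity;
}

class Program
{
    static void Main()
    {
        // The compiler lowers this call to Widget.Create(4): the literal 4
        // is copied into THIS assembly's IL. If the library later changes
        // the default to 16, already-compiled callers still pass 4 until
        // they are recompiled against the new signature.
        Console.WriteLine(Widget.Create()); // prints 4 with the v1 default
    }
}
```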

    We could have instead used a sentinel value. That is, a value which is otherwise "illegal" to pass in that is specially handled/recognized. For some parameters, such a value doesn't exist and `Nullable<T>` adds its own overhead which can be undesirable. For `int`, `-1` is frequently fine to use since negatives are frequently disallowed and so we use this in several APIs where we do allow default parameters, but it still comes with considerations around consistency with other APIs where such a sentinel isn't possible/trivial, and that it could hide subtle logic bugs that exist in a caller.
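    A minimal sketch of the sentinel approach (hypothetical `BufferFactory` type; `-1` stands in for "use whatever the library currently considers ideal"):

```csharp
using System;

class BufferFactory
{
    // -1 is otherwise illegal (negative capacities are disallowed), so it
    // can safely mean "pick the current default". The actual default lives
    // in the method body, not in callers' IL, so it can change in a later
    // library version without recompiling callers.
    public static int Create(int capacity = -1)
    {
        if (capacity < -1)
            throw new ArgumentOutOfRangeException(nameof(capacity));
        return capacity == -1 ? 16 : capacity; // 16 = today's "ideal"
    }
}
```

    The trade-off described above shows up here: a caller whose own bug computes `-1` gets the default silently instead of an exception.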

    So, we consider the tradeoffs and sometimes opt to use defaults and sometimes opt to use two overloads instead.
     
  3. CaseyHofland

    CaseyHofland

    Joined:
    Mar 18, 2016
    Posts:
    613
    Well I'll be, I've come out of this on the other side and now I'm also wondering why Unity doesn't do this more often. I always assumed it was because of the guidelines I know and love.

    I don't really see the point of refactoring my old code, but in the future I'll start taking default parameters into consideration myself!
     
    Eclextic, Thaina and Mindstyler like this.
  4. TheZombieKiller

    TheZombieKiller

    Joined:
    Feb 8, 2013
    Posts:
    266
    It's also worth mentioning that using overloads allows you to provide "default" values for parameters that otherwise couldn't have them due to their type not having a "constant" form.
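    For example (hypothetical types): default parameter values must be compile-time constants, so a struct like a color can't be used as one, but an overload can supply it:

```csharp
using System;

readonly struct Color
{
    public readonly byte R, G, B;
    public Color(byte r, byte g, byte b) { R = r; G = g; B = b; }
}

static class Painter
{
    // Invalid: 'Color' has no constant form, so this won't compile:
    //   public static Color Fill(Color c = new Color(0, 0, 0)) { ... }
    // (only `default(Color)` would be allowed, which is limiting.)

    public static Color Fill(Color c) => c;

    // The overload plays the role of the default value, and the "default"
    // can be changed later without breaking already-compiled callers.
    public static Color Fill() => Fill(new Color(255, 255, 255));
}
```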
     
    Eclextic and Deleted User like this.
  5. Mindstyler

    Mindstyler

    Joined:
    Aug 29, 2017
    Posts:
    248
    Well of course that's a given. But my intention was to explicitly refer to all the methods where default parameters apply. That you can't replace overloads taking non-'constant' values is a given, but that doesn't lessen the argument for all the methods that could.
     
  6. MasonWheeler

    MasonWheeler

    Joined:
    Apr 2, 2016
    Posts:
    219
    If you look at the ECMA specification of how boxing works, it's basically a proto-generic type at the CLR level. I think it would have been entirely unnecessary if CLR 1.0 had had generics; there could have been an explicit `Boxed<T>` type. But unfortunately, as with `ArrayList`, `Hashtable`, `IEnumerable`, `IEnumerator`, and a bunch of other legacy garbage, we're stuck with it now.

    I think it would be a lot less bad if we had `Boxed<T>`, and anywhere you try to use a struct in an object context you needed to box it explicitly. That would do away with the worst part of boxing: the insidious, invisible nature of the process.
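    A rough user-land sketch of what such an explicit `Boxed<T>` could look like (hypothetical type, illustrative only; it cannot retrofit the BCL):

```csharp
using System;

// Makes the box visible in source: you cannot get from T to Boxed<T>
// without writing Boxed<T>.Box(...) somewhere, so the heap allocation
// is never invisible.
sealed class Boxed<T> where T : struct
{
    public T Value;
    private Boxed(T value) => Value = value;
    public static Boxed<T> Box(T value) => new Boxed<T>(value);
}

class Program
{
    static void Main()
    {
        Boxed<int> box = Boxed<int>.Box(42); // the allocation is explicit
        Console.WriteLine(box.Value);
    }
}
```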
     
    Last edited: Feb 7, 2023
    Nad_B, Trigve, blackjlc and 2 others like this.
  7. Eclextic

    Eclextic

    Joined:
    Sep 28, 2020
    Posts:
    142
    You mean like in Rust? The best language ever ;) XD
    But yeah, you're right: being explicit about it is how it should have been implemented. Honestly, though, it can't be solved now unless they make the compiler convert ALL code to this new boxed type or something.

    In theory you could even create it yourself :p
    But the BCL won't benefit, so yeah...
     
  8. MasonWheeler

    MasonWheeler

    Joined:
    Apr 2, 2016
    Posts:
    219
    Yawn. Come back to me when it has inheritance and loses all the ugly colon::cancer syntax, then we'll talk about best language candidates.
     
  9. Eclextic

    Eclextic

    Joined:
    Sep 28, 2020
    Posts:
    142
    I know this is going off topic, but INHERITANCE? TF?!
    It literally does composition way better than C#, so it doesn't need it!

    Can't defend the ugly colon syntax :p XD
     
    goncalo-vasconcelos and OBiwer like this.
  10. MasonWheeler

    MasonWheeler

    Joined:
    Apr 2, 2016
    Posts:
    219

    :eek: Please don't tell me you're one of those inheritance/composition cultists. Trying to say composition should be preferred over inheritance is like saying that drills should be preferred over saws: they're different tools with different use cases and it's simply pants-on-head crazy to attempt to use one to do the job of the other. And Rust takes one of the two out of your toolbox, leaving it crippled.
     
    Nad_B, Trigve, Nyarlathothep and 4 others like this.
  11. OBiwer

    OBiwer

    Joined:
    Aug 2, 2022
    Posts:
    61
    I'd love to have traits like in Rust, where you can basically implement "interfaces" for types you don't own.
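    The closest C# gets today is extension methods, which let you bolt behavior onto types you don't own; a rough sketch, with the key limitation noted in the comments:

```csharp
using System;

// We don't own System.Int32, but we can still "extend" it.
// Unlike a Rust trait impl, this adds callable methods only: int does
// not start satisfying any interface, so it can't be passed where an
// implementation of that interface is required.
static class Int32Extensions
{
    public static bool IsEven(this int value) => value % 2 == 0;
}

class Program
{
    static void Main()
    {
        Console.WriteLine(4.IsEven());
        Console.WriteLine(7.IsEven());
    }
}
```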
     
    Deleted User and Eclextic like this.
  12. Mindstyler

    Mindstyler

    Joined:
    Aug 29, 2017
    Posts:
    248
    Deleted User likes this.
  13. Eclextic

    Eclextic

    Joined:
    Sep 28, 2020
    Posts:
    142
  14. mm_hohu

    mm_hohu

    Joined:
    Jun 4, 2021
    Posts:
    41
    @JoshPeterson

    Is there any chance to use IL rewriting via MsBuild in the new Unity version?

    With the current ILPostProcessRunner.exe we cannot choose which assemblies should be processed based on our own dependencies. Furthermore, each assembly is processed in its own separate process or in its own thread.

    This limitation is a problem when implementing an AOT DI container that supports open generics, because it requires rewriting across assemblies.
     
  15. Mindstyler

    Mindstyler

    Joined:
    Aug 29, 2017
    Posts:
    248
    is there even an msbuild way of IL rewriting? i couldn't find anything on that. i've used mono.cecil, but i'm definitely interested in what msbuild offers.
     
  16. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    I'm not familiar with the specific details about IL re-writing options available in MSBuild, but our plan is to make the player build process use the stock MSBuild, so I would expect the extensibility points to all be available.
     
    DrummerB, Huszky, soargon and 5 others like this.
  17. mm_hohu

    mm_hohu

    Joined:
    Jun 4, 2021
    Posts:
    41
    MSBuild allows us to insert our custom tasks into the build process. So if we have access to the .dll and .pdb files generated by Unity, we can do IL rewriting.

    However, I feel that we need a robust csproj generation pipeline because we need to rewrite the csproj files accurately.
     
    Eclextic likes this.
  18. Mindstyler

    Mindstyler

    Joined:
    Aug 29, 2017
    Posts:
    248
    hm, c# falls back to code lowering with many old and new features. i'm actually curious why default parameters don't. wouldn't a lowering approach to default arguments solve a lot of these considerations while also providing additional features? e.g. being able to provide 'complex' objects as defaults.
    or is there an actual performance / optimization / compiler benefit to the current 'hard' implementation of default arguments?
     
  19. Eclextic

    Eclextic

    Joined:
    Sep 28, 2020
    Posts:
    142
    The defaults are inserted by the compiler AFAIK.
    So yeah, a performance benefit is definitely there, though whether it would be worth trading for much more usability, I have no clue...
     
  20. Mindstyler

    Mindstyler

    Joined:
    Aug 29, 2017
    Posts:
    248
    thanks, but that's not what i'm asking / curious about
     
  21. Eclextic

    Eclextic

    Joined:
    Sep 28, 2020
    Posts:
    142
    Oh and this is probably also one of the reasons...
     
    Last edited: Feb 14, 2023
    RunninglVlan likes this.
  22. Qbit86

    Qbit86

    Joined:
    Sep 2, 2013
    Posts:
    487
    Microsoft itself uses default parameters in its public APIs, including the new ones. For example, APIs receiving CancellationToken with the default parameter of `default`.
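    For example (hypothetical method name, but the pattern appears throughout the modern BCL, e.g. `Stream.ReadAsync` overloads):

```csharp
using System.Threading;
using System.Threading.Tasks;

class Repository
{
    // `default` here means CancellationToken.None, a struct default the
    // compiler can embed at call sites, so no overload is needed.
    public Task SaveAsync(CancellationToken cancellationToken = default)
    {
        cancellationToken.ThrowIfCancellationRequested();
        return Task.CompletedTask;
    }
}

// Callers can write either:
//   repo.SaveAsync();
//   repo.SaveAsync(myToken);
```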
     
    Eclextic likes this.
  23. Mindstyler

    Mindstyler

    Joined:
    Aug 29, 2017
    Posts:
    248
    just because these, btw absolutely awesome, attributes use default parameters doesn't mean that those wouldn't work in a lowering approach.
    actually, the parameters are already lowered at compile time to 'constant' default parameter values, which, if default parameters were implemented in a solely lowered way, would get lowered further by the compiler the same as any dev-inserted default parameters. just because these attributes leverage a feature doesn't mean they dictate how that feature is implemented.

    nevertheless, my question was really just directed at tannergooding since they were traversing this thread not long ago, because they are an official .net team member and might have some insight into the design decisions here, and so i'm kindly asking anyone else not to reply to this particular topic if they are not part of the .net team.
     
    Eclextic likes this.
  24. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    Nad_B, oscarAbraham, rdjadu and 26 others like this.
  25. tannergooding

    tannergooding

    Joined:
    Jun 29, 2021
    Posts:
    30
    The way default parameters work behind the scenes today for C# is two part:
    1. The parameter is annotated with `[System.Runtime.InteropServices.OptionalAttribute]`
    2A. The parameter is annotated with `[System.Runtime.InteropServices.DefaultParameterValueAttribute(...)]`
    2B. For `System.Decimal`, C# instead utilizes `[System.Runtime.CompilerServices.DecimalConstantAttribute(...)]`

    `[Optional]` and `[DefaultParameterValue(...)]` are both magic runtime attributes that get a special encoding in IL. `[DecimalConstant(...)]` is a "custom" (user-defined) attribute specific to C#.

    Likewise, while this is how C# emits its own default parameters, it is a bit more flexible in how it consumes methods with such parameters.

    For example, when the default parameter value is the same as `default`, you actually only need `[Optional]`. Likewise while C# requires all defaulted parameters to be "trailing" in its own declarations, consuming methods isn't limited to the same. Knowing this, you can declare `public void N([Optional] int x, int y) { }` and consume as `public void M() => N(y: 5);`, even though `public void N(int x = 0, int y) { }` would result in a compilation error: https://sharplab.io/#v2:EYLgxg9gTgp...dNgNoA8uw0EOQAhgA2ALo41FQ4CAA0sfG2Jio4AL4SmUA

    For `[DefaultParameterValue(...)]` you can likewise take advantage of implicit conversions and C# will respect them, so while `public void N(ReadOnlySpan<char> x = "foo") { }` is invalid C#, declaring `public void N([Optional, DefaultParameterValue("Unity!")] ReadOnlySpan<char> x) { }` and then consuming as `public void M() => N();` works as expected: https://sharplab.io/#v2:EYLgxg9gTgp...VmALqk1DDBACa+9KEVgvzB9AA8YAAWabYIFhqkAL6Ek0A
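    Putting the two sharplab examples above together in one compilable sketch (same-assembly consumption, which works for these two attributes as noted further below):

```csharp
using System;
using System.Runtime.InteropServices;

static class Demo
{
    // A non-trailing "default": invalid as `int x = 0, int y` in C#,
    // but legal when spelled with the attribute directly.
    public static int N([Optional] int x, int y) => x + y;

    // An implicit conversion as a default: `ReadOnlySpan<char> s = "Unity!"`
    // is invalid C#, but the attribute form works because the string
    // default is converted at the call site.
    public static int Length([Optional, DefaultParameterValue("Unity!")] ReadOnlySpan<char> s)
        => s.Length;

    static void Main()
    {
        Console.WriteLine(N(y: 5));  // x falls back to default(int), i.e. 0
        Console.WriteLine(Length()); // "Unity!".Length
    }
}
```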

    Both of these can be quite useful to know when generating interop bindings, particularly where the native bindings are optional in such places, for example.

    `[DefaultParameterValue(...)]` still has the general restrictions that the `...` must be implicitly convertible to the parameter type and the general restriction of all attributes that it can only be metadata (runtime) primitive types. This includes bool, byte, char, double, float, int, long, sbyte, short, string, uint, ulong, ushort, object, type handles, enums, single dimensional arrays of the above. C# has some restrictions around these, particularly for arrays.

    `System.Decimal` works via its `DecimalConstantAttribute` which basically just takes the constructor parameters and has special runtime logic knowing which constructor overload to call. It can do this since the API surface area for `decimal` is well-defined by the language/runtime and can be ensured to never break.

    For user-defined types this gets quite a bit trickier in how it should work and be supported. There are proposals for supporting this (such as my own https://github.com/dotnet/csharplang/discussions/688), but it's overall quite complex with many considerations (just like `constexpr` support would have).

    -- There are other fun "tricks" you can do if you know how Roslyn (the official C# compiler) encodes its metadata and how it interprets IL metadata in return. Sometimes these work in the same assembly (such as `Optional/DefaultParameterValue`) and other times they need a "boundary" between the declaration and consumption side (`SpecialName`, which can bypass some language operator restrictions). Noting that not everything works "smoothly" in all cases and odd quirks can pop up for some of them.
     
    Qbit86, blackjlc, Thaina and 5 others like this.
  26. OndrejP

    OndrejP

    Joined:
    Jul 19, 2017
    Posts:
    304
    Great post, thank you!
    1. What is the cost of pinning in .NET Core?
    2. How does it work under the hood, does it just flag the GC handle somehow?
    3. Does this mean that a new string is allocated every time the tooltip getter is called?
      (the bindings generator must somehow return a C# string)
      `UTF16String Internal_GetTooltip();`

    EDIT: Found a great article about pinning with answers
    https://mattwarren.org/2016/10/26/How-does-the-fixed-keyword-work/
    1. Zero - when GC is not run while inside fixed statement
      - when GC is run while inside, it can cause heap fragmentation, but it's not an issue for short pins
    2. No, it doesn't use handles - Pinned objects are kept in local variable with [pinned] tag
      - when GC is scanning stack for GC roots, it takes this into account

    This is different from `GCHandle.Alloc`, which allocates a handle and stores it in some table (which the GC uses when looking for GC roots).
     
    Last edited: Feb 15, 2023
  27. tannergooding

    tannergooding

    Joined:
    Jun 29, 2021
    Posts:
    30
    The cost of pinning for RyuJIT is cheap. It is essentially spilling the gcref/byref to a specially allocated slot on the stack. This "spill" is approximately 4 cycles on most modern hardware since it's just a write to memory. Once the `fixed` statement ends, the slot is zeroed out (another 4 cycles) so that the data is no longer considered pinned.

    Any type can opt into pinning support by exposing a `public ref T GetPinnableReference()` API and it's the cost of this that can vary from scenario to scenario. So what you're pinning can have some additional impact as getting the gcref/byref is not always the same cost.

    For example, with a `T[]` you must handle `null` and you must handle `Length == 0`. If you use `MemoryMarshal.GetArrayDataReference` you only need to handle `null`.

    `Span<T>` and `ReadOnlySpan<T>`, meanwhile, are just reading a field; there is no need to handle `null` nor any need to handle `Length == 0`.

    -- It's also worth noting that the two writes required for pinning have no dependencies in the linear instruction stream, so they typically get pipelined with other instructions and in practice can show up as "zero cost".
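    A minimal sketch of opting a custom type into `fixed` via the pattern described above (hypothetical `NativeText` type; compile with unsafe code enabled):

```csharp
using System;

struct NativeText
{
    private readonly char[] _chars;
    public NativeText(string s) => _chars = s.ToCharArray();

    // The presence of this method (by pattern, no interface needed) is
    // what lets `fixed` accept the type. Its body is where the per-type
    // cost lives: here we guard null/empty arrays ourselves.
    public ref char GetPinnableReference()
    {
        if (_chars is null || _chars.Length == 0)
            throw new InvalidOperationException("Nothing to pin.");
        return ref _chars[0];
    }
}

unsafe class Program
{
    static void Main()
    {
        var text = new NativeText("pin me");
        fixed (char* p = text) // calls GetPinnableReference()
        {
            Console.WriteLine(*p); // first char: 'p'
        }
    }
}
```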
     
  28. Eclextic

    Eclextic

    Joined:
    Sep 28, 2020
    Posts:
    142
    Also, for reference, a division operation in a best-case scenario is about 30 cycles, so don't worry about performance XD
     
    ThatDan123 and OndrejP like this.
  29. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    I think most of your questions have been answered, but to follow up for completeness:

    Yes, we need a new C# string in this case.
     
    Eclextic and OndrejP like this.
  30. Eclextic

    Eclextic

    Joined:
    Sep 28, 2020
    Posts:
    142
    'nother question!

    So do you just allocate the string on the C# side and get a pointer on the C++ side, which is pinned in the CoreCLR GC?
    Or do you allocate the strings on both sides?
     
    marce155 likes this.
  31. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    There is memory allocation happening on both sides, although with the recent changes we have limited that to the minimum possible allocations necessary to fulfill the requirements for the .NET VMs.
     
  32. Mindstyler

    Mindstyler

    Joined:
    Aug 29, 2017
    Posts:
    248
    I'm sure the Unity team is doing fantastic work to ensure performance. And while i wouldn't care or worry about something in-editor, especially something that only runs so few times as GetTooltip, things like a GameObject's name that game code could access relatively frequently might be good candidates for string interning.
     
  33. OndrejP

    OndrejP

    Joined:
    Jul 19, 2017
    Posts:
    304
    For me personally, I'd prefer a method `ReadOnlySpan<char> GetName()`.

    It would return a span pointing to the internal UTF16String, thus not allocating any memory.
    The classic string-returning property "name" would stay for compatibility.
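    A sketch of what such an API could look like (hypothetical `FakeGameObject` stand-in; Unity has not announced this):

```csharp
using System;

class FakeGameObject
{
    // Stand-in for the engine-side UTF-16 name buffer.
    private readonly char[] _nameBuffer = { 'P', 'l', 'a', 'y', 'e', 'r' };

    // Allocation-free read: a span over the existing buffer.
    public ReadOnlySpan<char> GetName() => _nameBuffer;

    // Compatibility property: allocates a new string on every call.
    public string name => new string(_nameBuffer);
}

class Program
{
    static void Main()
    {
        var go = new FakeGameObject();
        Console.WriteLine(go.GetName().SequenceEqual("Player".AsSpan()));
        Console.WriteLine(go.name);
    }
}
```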
     
    Saniell and PetrisPeper like this.
  34. Eclextic

    Eclextic

    Joined:
    Sep 28, 2020
    Posts:
    142
    Exactly! Firstly for compatibility and secondly for simplicity's sake!
    Not everybody is building the next triple-A title, so such small improvements don't matter in an indie game's context, but having it also be optimizable is literally why everyone decided to choose Unity.

    TL;DR: Don't deprecate methods that use strings, but increase the possibilities to get other values!
     
  35. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    This is a good point! We do interning for really common things like string.Empty at the VM level, but I don't think there is interning of strings at the Unity level. I might be wrong though. Still this is a good suggestion.
     
    marce155 and Mindstyler like this.
  36. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    Agreed! We're looking at ways to improve the Unity API by adding safer, faster methods using the .NET Standard 2.1 API (available for engine code since Unity 2022.2). I expect that newer versions of .NET (e.g. 7 or 8, depending on when we can ship CoreCLR support) will open up even more opportunities like this.

    I will caution that API changes will likely follow support for user code doing this, as we need to get a stable foundation to build from first. But these kinds of changes are definitely on our minds.
     
    marce155, OndrejP, Mindstyler and 2 others like this.
  37. OndrejP

    OndrejP

    Joined:
    Jul 19, 2017
    Posts:
    304
    I think everybody here agrees that switching to .NET Core is top priority and random improvements like this can definitely wait :)
     
  38. bdovaz

    bdovaz

    Joined:
    Dec 10, 2011
    Posts:
    1,053
    @JoshPeterson @xoofx Is all the development being done to support msbuild/sdk-style csproj/nuget happening in parallel, and will you be able to release it before CoreCLR? Or are you going to wait and ship everything at the same time? Is it possible to develop it separately, or does one thing depend on the other?

    Thanks!
     
    Huszky and Eclextic like this.
  39. Mindstyler

    Mindstyler

    Joined:
    Aug 29, 2017
    Posts:
    248
    I've asked this before already; the answer was that sdk-style projects will definitely NOT be available before the .NET Core transition.
     
    Huszky, bdovaz and Eclextic like this.
  40. Sergey_ksubox

    Sergey_ksubox

    Joined:
    Jul 29, 2015
    Posts:
    26
    I wonder what C#<->C++ marshalling we can use now with 2022.2 (or 2023.1). And where can I find examples of marshalling in the latest Unity?
     
  41. Mindstyler

    Mindstyler

    Joined:
    Aug 29, 2017
    Posts:
    248
    there's nothing new.
    besides, marshalling into Unity is not something you'll do
     
  42. Deleted User

    Deleted User

    Guest

    I believe the latest "new" thing is that there are unsafe function pointers you can use. But this is since C# 9.0, and it isn't really anything you couldn't do before, since the IL opcodes have existed. The blog mentions one of the opcodes, `calli`, that is used with these function pointers.

    https://learn.microsoft.com/en-us/d...erence/proposals/csharp-9.0/function-pointers
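    A minimal C# 9 function-pointer sketch (the call through the pointer compiles to `calli` rather than a delegate `Invoke`; compile with unsafe code enabled):

```csharp
using System;

unsafe class Program
{
    static int Add(int a, int b) => a + b;

    static void Main()
    {
        // `&Add` loads the method address; invoking through the pointer
        // emits the `calli` IL opcode, with no delegate allocation.
        delegate*<int, int, int> add = &Add;
        Console.WriteLine(add(2, 3)); // 5
    }
}
```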
     
    Eclextic likes this.
  43. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
    In theory there is not a dependency here - We could release the build system changes to use MSBuild and NuGet without CoreCLR support. But in practice we likely won't release build system changes first. We are working on both in parallel now, and our release plans are not yet final, so things could change still.
     
    ThatDan123, Anthiese, NotaNaN and 4 others like this.
  44. JoshPeterson

    JoshPeterson

    Unity Technologies

    Joined:
    Jul 21, 2014
    Posts:
    6,938
  45. PetrisPeper

    PetrisPeper

    Joined:
    Nov 10, 2017
    Posts:
    66
    Are there any plans to update Roslyn before the move to CoreCLR so that newer C# versions become usable?
     
  46. Kamyker

    Kamyker

    Joined:
    May 14, 2013
    Posts:
    1,091
    In another realm, Godot 4 just released, switching from Mono to .NET 6.

    It's a pretty big breaking change, including a move to double and long from float and int. As opposed to current Unity plans, they did release Godot NuGet packages. Reasons why: https://godotengine.org/article/whats-new-in-csharp-for-godot-4-0/


    I wouldn't mind bigger Unity releases with breaking changes. I feel like complete lack of them ultimately slows down engine development.
     
    WavenGD, Wattosan, Alvarden and 19 others like this.
  47. Thaina

    Thaina

    Joined:
    Jul 13, 2012
    Posts:
    1,168
    Yeah, Unity has just become too sluggish on this, letting Godot, which didn't really start with C#, adopt C# faster than Unity, which has worked with C# from the start for 17+ years.
     
    Nad_B and pm007 like this.
  48. bdovaz

    bdovaz

    Joined:
    Dec 10, 2011
    Posts:
    1,053
  49. Eclextic

    Eclextic

    Joined:
    Sep 28, 2020
    Posts:
    142
    Hearing that they want to change their course of action in your thread is really good to hear!

    I also saw the Godot 4 release, and honestly I had been planning for a long time to switch to Godot for 2D and to Unreal for 3D after finishing a project I'm working on, but I might just consider using Godot for everything XD

    Seriously though, if Unity doesn't learn to make breaking changes in major releases, like every other project out there using semver, then we will end up in a really bad future.
     
  50. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,338
    I mean, the reason we're in this "nothing can ever break backwards compatibility" mess is that everyone was crying about how hard it was to update back in the day, and Unity listened to that.

    Upgrading major Unity versions used to be the worst - 3 to 4 and 4 to 5 were month-long processes. But the reason we had to do those was that console support for versions was utterly random, and you could end up having to update even if your project didn't need it.

    With LTS, the problem is kinda gone - we can expect third parties to support LTS, so a project can stay on the same major version for the entire lifecycle of the project, if you spend three years or less and start on tech.

    So we have this aversion to breaking changes that's due to complaints based on how things used to work in the last decade, and that's not really relevant anymore.