
Question Null check with "is" operator on Unity Object?

Discussion in 'Experimental Scripting Previews' started by Neonlyte, May 14, 2022.

  1. Neonlyte

    Neonlyte

    Joined:
    Oct 17, 2013
    Posts:
    512
    Since Unity overrides the equality operator for the null-check behavior, the null-coalescing operators would not work.

    The "is" operator has the behavior that if the left-hand side is null, it returns false. Would this not work with Unity Objects, for the same reason the null-coalescing operators don't?

    In other words, would the explicit equality check with null be redundant or necessary?
    Code (CSharp):
    if (unityObject != null && unityObject is ISomeInterface someInterface)
    {
        someInterface.SomeMethod();
    }
     
  2. Since the "fake null" object doesn't implement ISomeInterface, you should be safe without an explicit null check. The only thing you really need to pay attention to is avoiding the `is null` and `is not null` checks.
     
  3. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,294
    This is wrong, since the fake null object is still typed as the actual type expected. Try this:

    Code (csharp):
    using System.Collections;
    using UnityEngine;

    public class TestScript : MonoBehaviour {

        IEnumerator Start() {
            Foo foo = gameObject.AddComponent<Foo>();
            foo.Data = 15;
            yield return null;
            Destroy(foo);
            yield return null;
            Test(foo);
        }

        private void Test(MonoBehaviour mb) {
            if (mb is ISomeInterface isf) {
                Debug.Log(isf.GetType());
                Debug.Log(isf.Data);
                Debug.Log(isf.Position);
            }
        }
    }

    public class Foo : MonoBehaviour, ISomeInterface {
        public int Data { get; set; }
        public Vector3 Position => transform.position;
    }

    public interface ISomeInterface {
        int Data { get; }
        Vector3 Position { get; }
    }
    Output will be:

    upload_2022-5-16_13-27-7.png

    So the explicit null check is necessary.
     
    apkdev likes this.
  4. Ohlala, you're absolutely right. I was dumb to answer without proper testing, and I don't know where I got the opposite idea. Anyway, I usually avoid this whole ordeal and try not to rely on checking destroyed objects.
     
  5. print_helloworld

    print_helloworld

    Joined:
    Nov 14, 2016
    Posts:
    231
    Destroyed objects in Unity still exist until the next frame; this is why DestroyImmediate exists, and why there is even a debate about `?.` and `??`. There is a case where you can definitely ignore all of the downsides people bring up about `is null` and `is not null`: those issues only exist if the flow of objects isn't guarded against null objects (as in, not designed with the intent to describe whether something could or couldn't be null inside the method, hint).

    I use this approach myself, and it has helped me stay focused on the code I actually need to write. One downside is that you will get IDE warnings for fields that are assigned in the inspector, but you can "forgive" that by using packages like MyBox, which have components to ensure assignment before play.
     
  6. Gotmachine

    Gotmachine

    Joined:
    Sep 26, 2019
    Posts:
    34
    There is no "fake null" object; this is a myth floating around in the Unity community, stemming from the lack of clarity in the Unity documentation about the whole subject.

    A "destroyed" managed object is still the same unchanged instance, only the associated unmanaged object on the C++ side is destroyed.

    All the implicit bool operator, Equals(), ==, != and ToString() overloads on UnityEngine.Object are doing is behaving as if the reference were null when the associated C++ unmanaged object is destroyed. There is no magical replacement of the object reference with null or a "fake null" object. Said otherwise, what all those methods do is something like:

    Code (CSharp):
    if (obj.isDestroyed)
        return null;
    Which means that all the other methods of checking for nullity don't follow this behavior:
    - The "?." and "??" operators
    - "obj is null" / "obj is not null"
    - ReferenceEquals(obj, null)

    So it's actually quite simple: the overloaded/custom operators check "is the reference null, or is the C++ object destroyed?", while the other ones only check "is the reference null?".
    So saying that the latter methods "don't work" with UnityEngine.Object isn't true.
    Depending on your code intent, checking for reference nullity only is a perfectly fine thing to do and can actually be desirable.

    I don't understand why Unity doesn't expose extra methods or properties to decouple destroyed-state checking from nullity checking, which would provide a viable workaround for that terrible design decision.
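    The divergence described above is easy to demonstrate in a scene. A minimal sketch (hypothetical component name; DestroyImmediate is used so the native object is gone within the same frame, and the expected values reflect the documented operator overloads):

    ```csharp
    using UnityEngine;

    public class NullCheckDivergenceDemo : MonoBehaviour
    {
        void Start()
        {
            Camera cam = gameObject.AddComponent<Camera>();
            DestroyImmediate(cam); // native object destroyed; managed reference unchanged

            Debug.Log(cam == null);                // true:  overloaded == sees the destroyed native object
            Debug.Log((bool)cam);                  // false: implicit bool overload, same reason
            Debug.Log(cam is null);                // false: plain C# reference check, reference still set
            Debug.Log(ReferenceEquals(cam, null)); // false: same, bypasses the overloads
        }
    }
    ```

    The last two lines are exactly the "don't follow this behavior" cases listed above: they only answer "is the reference null?".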
     
  7. Adrian

    Adrian

    Joined:
    Apr 5, 2008
    Posts:
    1,061
    "fake null" is used to describe the behaviour of Unity Objects that have been destroyed. I don't see why you couldn't call it a "fake null object", even if the object wasn't replaced and just changed its behaviour.

    There's also the case in the editor, where empty Unity Object references will be assigned a dummy object that behaves like a destroyed Object, to help debugging. It would be hard to find a better name than "fake null object" for that.

    Intentionally doing a reference equality comparison on a Unity Object is really esoteric. It can be helpful at times, but you require anyone else trying to understand that code to know the intricacies of how Unity fakes null. At the very least, I'd use `object.ReferenceEquals(unityObject, null)` to make the intent explicit.
     
  8. Gotmachine

    Gotmachine

    Joined:
    Sep 26, 2019
    Posts:
    34
    Because using the term "null" for a concept that hasn't anything to do with what "null" means causes a lot of confusion, even for relatively knowledgeable Unity users.

    It absolutely isn't. That's like saying null checking in a non-Unity .NET program is esoteric. There are many valid code patterns involving intentionally having null references around, especially once you start to implement complex pure .NET game logic that isn't derived from Unity base classes but references Unity objects.

    As you say, one can rely on ReferenceEquals(), but it's still silly that we don't have direct access to an "IsDestroyed" property for when you need to decouple "real" null checking from destroyed-state checking.
     
    atcarter714 likes this.
  9. Adrian

    Adrian

    Joined:
    Apr 5, 2008
    Posts:
    1,061
    That's why it's called "fake null". Because Unity tried hard to make an object behave like a null value, even if it isn't. Even many of Unity's properties and methods throw a MissingReferenceException that can easily be confused with an actual NullReferenceException. It's causing a lot of confusion because it's unusual and hard to analyze, not because of its name.

    There really isn't much useful you can do with a destroyed Unity Object. I guess you can continue to use it as a key, or check whether something was never set vs. set and later destroyed, but that's already getting into obscure territory, in which case I think you're better off finding a more obvious solution.

    Yes please. Also knowing in OnDisable whether the object will be destroyed, and a callback before the object is marked for destruction (or a way to clear the mark) when hierarchies are being destroyed.
     
  10. atcarter714

    atcarter714

    Joined:
    Jul 25, 2021
    Posts:
    63
    The real question is why Unity's Object class doesn't implement the IDisposable interface pattern properly, like responsibly-written C# code is supposed to, with public IsDisposed and IsDisposing properties. That would have conveyed clearly that a Unity Object has unmanaged things to dispose of and could be in an invalid state even if the managed Object pointed to by a reference is not null. It almost seems like they thought, way back then, that it would be really "clever" to overload an operator and make the managed Object denounce itself once the pointer to the native C++ object had been relinquished, and that this would somehow be better than IDisposable, the pattern literally made for this kind of thing and universally understood by any decent C#/.NET programmer. All it really ended up doing was creating years of confusion, and only on rare occasions are clarifying statements made about it.

    For anyone having trouble wrapping their minds around this: C# does not have a mechanism to delete an object explicitly and forcefully like C++ does (i.e., its delete operator). Unity is a C++ engine, and all of those objects are native ones, with native resources in unmanaged memory obtained from DirectX, OpenGL, Vulkan, other APIs, and internal native code. The managed C# Unity Object is just a managed "container" or "wrapper" that holds a pointer to one or more of these native engine resources. When you need to interact with it, the managed wrapper Object hides all the details of invoking function pointers and marshaling data across interop boundaries, and all you see are much cleaner method names with ordinary managed parameter types.

    I have a GitHub gist here which contains the code from a simulation of how the null-ness of Unity objects works, to help clear up confusion on the subject. I compiled a native C library (DLL) which simply uses the C standard library memory management functions to allocate/free unmanaged memory. Then I have a C# class called "UnityObjectWay" which uses the native DLL to get some heap memory and store a string there, in unmanaged memory. It has a public "Print" method which simply prints the contents of the unmanaged memory to the console, simulating how Unity's C# classes rely on pointers to unmanaged objects and memory. I could have done the same thing using the Marshal class in System.Runtime.InteropServices, but I wanted it to be "more authentic" and leave no question that it's unmanaged memory and native code, for people unfamiliar with this. At the bottom of the gist, I commented with a screenshot of the program output.

    As you can see, the C# class instance can be non-null while the native memory/resources and/or the pointer to them are gone, leaving the object in a useless and invalid state. Without the pointer to native resources, the C# object can't do what you expect it to do, and if it tries, it will likely stop the whole show. That is why we use the `if (obj)` style check: it returns false if either the C# instance is null or the engine has rid itself of the native resources associated with it, so the object is no longer any good to use. Writing `if (obj != null)` only checks that the managed object is not null, and doesn't tell you whether the native engine objects/resources exist to be used. It's an important distinction to understand, and I hope that stripping the issue of its complexity and showing a working example of a C# class reliant on a pointer to unmanaged memory or C++ objects helps people get this ...

    So, in summary ...
    - `if (obj)` lets you know if you have something valid to use.
    - `if (obj != null)` lets you know if the managed C# instance is non-null, but not whether it's OK to use.
    - `if (obj is null)` is the same managed-only check, just using sugared syntax.

    Your managed C# Unity Object, once created, is never null; a reference to it only becomes null when nothing references it any longer or you explicitly assign null to your reference to that thing. It's also important to understand that a local variable in a method or a field in a class isn't the object itself, just a reference to an object, essentially like having a pointer to a class instance in C++. The actual thing lives in memory, and we talk to it by pointer or reference, and the pointer/reference isn't the same thing as the thing living in memory.
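    The gist's simulation can be sketched in plain C# using the Marshal class mentioned above (hypothetical names; a deliberately simplified stand-in for Unity's native pointer, not Unity's actual implementation):

    ```csharp
    using System;
    using System.Runtime.InteropServices;

    // A managed wrapper around unmanaged memory, imitating how a Unity Object
    // holds a pointer to a native C++ engine object.
    class NativeWrapper
    {
        IntPtr _native = Marshal.AllocHGlobal(sizeof(int));

        public void Write(int value) => Marshal.WriteInt32(_native, value);
        public int Read() => Marshal.ReadInt32(_native);

        // "Destroy" frees the native side; the managed instance survives.
        public void Destroy()
        {
            Marshal.FreeHGlobal(_native);
            _native = IntPtr.Zero;
        }

        // Unity-style check: false when the reference is null OR the native side is gone.
        public static implicit operator bool(NativeWrapper w)
            => w != null && w._native != IntPtr.Zero;
    }

    class Demo
    {
        static void Main()
        {
            var obj = new NativeWrapper();
            obj.Write(42);
            obj.Destroy();

            Console.WriteLine(obj != null); // True:  the managed instance still exists
            Console.WriteLine((bool)obj);   // False: the native resource is gone
        }
    }
    ```

    After Destroy(), calling Read() would touch freed memory, which is exactly the "stop the whole show" failure mode described above.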
     
    melipefello and Protagonist like this.
  11. Adrian

    Adrian

    Joined:
    Apr 5, 2008
    Posts:
    1,061
    You probably won't find anyone working at Unity today that wouldn't agree the implementation was a mistake in hindsight.

    But keep in mind that these decisions were made for Unity 1.0 over 15 years ago, at a time when .NET and C# were still at version 1, Unity only supported Macs, and it was much more intended for beginners who wouldn't use C# but Unity's esoteric JavaScript implementation. Unity embracing C# and the .NET ecosystem, rather than trying to do its own thing, is really pretty recent.
     
    Luxxuor and atcarter714 like this.
  12. atcarter714

    atcarter714

    Joined:
    Jul 25, 2021
    Posts:
    63
    You're absolutely right; I've always been rather keenly aware of the fact that when Unity decided C# was probably a good idea for its future, it was a very different time period than the one we enjoy today. I still remember those days vividly ... Mono and the idea of cross-platform C# and .NET development wasn't something people took very seriously, and Mono was viewed as some crazy experimental thing that, while it sounded cool, Microsoft was surely going to intervene, sue, and litigate out of existence before long, so you shouldn't even get involved and waste your time becoming dependent on it. How wrong that sentiment was! Back then, I thought Unity was junk and didn't want to mess with it, at least not until about 2017, when I realized C# in Unity was becoming pretty legit and that it had changed a lot. But it definitely inherited some very odd and quirky bits from that time period, and you can see the signs of its early heritage if you know what you're looking at. You can see it in lots of small things, like how quite a few parts of the oldest, original public APIs don't really follow the established C# naming and casing conventions (e.g., `something.gameObject`, `something.transform`, etc.); it almost looks like some JavaScript guys named that stuff, weird, haha. But for me, one thing above all else is the most weird and quirky thing about Unity, which a lot of people who just started in the last couple of years probably don't recognize: it's the way Unity treats C# itself!

    In Unity, C# isn't exactly treated like a powerful, compiled, first-class game development language, but more like a second-class "scripting" language ... and that's even part of the Unity terminology: "Add a script", "scripting in Unity", etc. It's not so much like you're writing your own 3D application; it feels more like a "game modding" experience (and I mean that purely from the programming workflow and "style" standpoint, not what you can actually do with Unity). You just sort of add your "scripts" in there and wait for automagic stuff to happen. There's really no proper context outside of a scene in which to execute any application state management code at all.

    It's all really unusual if you come to Unity from a programming background ... people who never did any programming before using Unity don't realize it and won't understand what the hell I'm talking about, lol. When I first started using Unity, I wasn't having any trouble understanding C# or the game development, rendering, and 3D math concepts; I was trying to figure out why the hell things are the way they are and why I can't do these 10,000 totally normal programmer things, lol. For me, that still reminds me, more than anything, of the "pioneer days" of Mono. But there are other things too, like how there's barely any thread-safe API to be found in all the core engine libraries ... and that's not just Unity; many engines that have been around for a long time end up needing a new, segregated "ECS" system that's generally hostile and unfriendly toward the built-in systems and APIs. But weird architectural things happen when software lives a long, long time like this and survives some major shifts in the landscape and ecosystem. And despite being treated in a very "second-class" fashion, C# in Unity has still gotten extremely powerful; you just have to use it in some really awkward ways, lol.

    That has weird side effects on new users who don't have previous programming experience, like spending months or even over a year in Unity's C# environment and still never having heard of a regular old class with a constructor, lol. :D

    I'm really amped up about the migration to .NET Core: being able to use NuGet packages and MSBuild directly, and having across-the-board compatibility with the other 99% of the C#/.NET world. For me, that's a deciding factor in whether or not Unity remains a viable thing for me to use professionally in the future. You get to a point where it's tiring fighting with these things ... you just want to get in bed, cry, eat ice cream, and dream about an engine with a good editor and content workflow like Unity that also works properly with all of your external code like everything else, haha. But those days are finally coming! Soon enough I won't have to find so many weird adaptive architectural patterns and hacks, or have sprawling math-heavy systems with 51 type overloads of the same exact method! :D
     
    Nad_B and Protagonist like this.
  13. xoofx

    xoofx

    Unity Technologies

    Joined:
    Nov 5, 2016
    Posts:
    416
    Yes, it was an old decision made many years ago that feels so annoying today. Eight years ago, this question came up again ("Custom == operator, should we keep it?"), and I believe it was too much of a breaking change to make it happen.

    As part of the migration to .NET (CoreCLR), we would love to revisit this, but we haven't made any plan for this yet, so we don't know if we will be able to make it or not. Priority is definitely the migration to a newer .NET for now.
     
    Thaina, Huszky and atcarter714 like this.
  14. atcarter714

    atcarter714

    Joined:
    Jul 25, 2021
    Posts:
    63
    Ah, it's the man and legend himself! Thanks for stopping by here and replying, xoofx! When I see you around here I am both filled with gladness that you're working on Unity, and with sorrow that you're no longer working on SharpDX, but I realize we can't have both! :D

    Your priorities sound agreeable to me: the funny bool and == operators and the lack of an IDisposable implementation are definitely less of a concern than the lack of modern .NET, which is probably the biggest and most exciting thing on the roadmap right now. I'm extremely hyped-up and excited about it; I wish I could help work on it too and get it ready faster, haha.

    Do you think adding an IDisposable pattern implementation on "Object" in the future would be particularly difficult/problematic to do, though? I would hope that it wouldn't be really tricky and might actually simplify the code base a little bit, but I'm not intimately familiar with that Unity code and don't have the same perspective as the devs actually working on it. And I feel like it would make game code a lot better and more understandable and reduce confusion in the future.

    EDIT: Also, is it true that C# 11 is going to be supported in the near future? And that abstract/virtual members will be possible to support things like "generic math" as it is possible in .NET 7 code? That's literally my favorite C# 11 feature of all, as it has greatly simplified a lot of code for me involving generics and constraints.
     
    Last edited: Nov 28, 2022
  15. Adrian

    Adrian

    Joined:
    Apr 5, 2008
    Posts:
    1,061
    New C# versions will come with the switch to CoreCLR/.NET and will include all language features.

    But that's probably not in the «near future». Unity hasn't announced an ETA yet but the CoreCLR player will be in 2023.2 at the earliest. And to really use it we need the editor on CoreCLR, which will probably be a few releases later. Maybe 2024.1/.2 if we're lucky?
     
  16. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,624
    It's funny how the blog has no comments any more. For that specific blog post, back then, I believe the overwhelming response from the community was "if you think it's going to make things better, do it".

    upload_2022-11-28_15-59-3.png

    And yet it wasn't done.

    I consider this one of the first examples of Unity asking the community for feedback and then not acting on it.

    In fact, I think "this is too big of a breaking change" is always Unity-speak for "we don't want to do it". There are numerous examples of breaking changes happening silently, with the community left on their own to deal with them, while for other things the community is on board with, we get "oooh, it's too big a change".

    (In my personal experience, the funniest incident I can remember is an editor change (I believe by mistake) that broke one of my workflows; after a bug report and multiple forum posts about it, the reply I got 3+ years later was "people rely on the new behaviour now, so reverting to the previous one is too big of a breaking change", or something along those lines.)
     
  17. Gotmachine

    Gotmachine

    Joined:
    Sep 26, 2019
    Posts:
    34
    It could, but that's not really necessary. Plus, Unity objects can't be instantiated with "new", which would clash with the "using" pattern one would expect to work with IDisposable.

    The only issue here is Unity choosing an abstraction that forcefully agglomerates the destroyed/disposed unmanaged resource state with the managed null concept, leading to behavior inconsistencies between the various ways of checking for null.

    Nope, you're wrong here. "An object is faked as null when the underlying unmanaged object is destroyed" applies to:
    - The implicit bool cast: `if (obj)`
    - The "==" and "!=" operators: `if (obj == null)` and `if (obj != null)`
    - The `Equals()` method: `if (obj.Equals(null))`

    It doesn't apply to:
    - The `ReferenceEquals()` method
    - The `is null` / `is not null` C# constructs
    - The `??` and `?.` C# operators

    And note that you can actually write code to detect whether the native object exists, for example:
    Code (CSharp):
    bool isDestroyed = !ReferenceEquals(unityObject, null) && unityObject == null;


    IMO, the "null can also mean destroyed" concept wasn't that bad of an idea. In 99.9% of practical use cases it functionally works and allows writing more compact code. If that concept didn't exist and the destroyed state was only available as a discrete property, you would have to replace every `object != null` or `(bool)object` statement with `object != null && !object.isDestroyed`.

    The main issue is obscure behavior inconsistencies with the newer C# constructs, and that's the main reason I think it should be obsoleted and replaced with an `IsDestroyed` property.

    The other reason is performance. The overloaded operators/methods are awfully slow, although that specific issue is mainly due to their runtime (player) implementation being encumbered by an editor-specific call stack. Those methods/operators could be 3 to 8 times faster with a discrete implementation directly checking the `UnityEngine.Object.m_CachedPtr` field instead of a non-inlined, 3-4 methods deep call stack, as well as being written with an early out for the most common case (see the "LegacyEquality" example at the end of this post).

    IMO, the ideal implementation would be this :
    Code (CSharp):
    public class UnityObject
    {
        internal IntPtr m_CachedPtr;

        public bool IsDestroyed
            => m_CachedPtr == IntPtr.Zero;

        public static implicit operator bool(UnityObject unityObject)
            => unityObject != null && unityObject.m_CachedPtr != IntPtr.Zero;
    }

    public static class UnityObjectExtensions
    {
        public static T DestroyedAsNull<T>(this T unityObject) where T : UnityObject
        {
            if (unityObject == null || unityObject.m_CachedPtr == IntPtr.Zero)
                return null;

            return unityObject;
        }
    }
    This means you get the best of both worlds. You can now do this:
    Code (CSharp):
    // all those checks are functionally equivalent
    if (!unityObject)
    if (unityObject == null || unityObject.IsDestroyed)
    if (unityObject.DestroyedAsNull() == null)
    if (unityObject.DestroyedAsNull() is null)

    // Null coalescing and null conditional support
    float f = unityObject.DestroyedAsNull()?.someFloatField ?? 0f;
    This keeps the ability to coalesce null checking and destroyed state into one compact check, but avoids the inconsistent behavior between the various ways of checking for null, unless the intent and behavior are clearly expressed by using the DestroyedAsNull() method.

    It would also provide a relatively safe upgrade path for existing projects. Unity could analyze usages of the == and != operators against null and insert the DestroyedAsNull() method, making those calls functionally equivalent:
    Code (CSharp):
    if (unityObject == null) // before
    if (unityObject.DestroyedAsNull() == null) // after

    if (unityObject == anotherUnityObject) // before
    if (unityObject.DestroyedAsNull() == anotherUnityObject.DestroyedAsNull()) // after
    The only case where this doesn't result in the same behavior is when comparing two different objects for equality and both are destroyed. In the current implementation they aren't considered equal, which is inconsistent with the behavior of the second example, as `null == null` returns true.

    To have a truly safe migration path, object equality comparisons should be replaced with a "LegacyEquality" method, something like :
    Code (CSharp):
    public static bool LegacyEquality(UnityEngine.Object unityObject, UnityEngine.Object otherUnityObject)
    {
        if (unityObject == otherUnityObject) return true;
        if (unityObject == null && otherUnityObject.m_CachedPtr == IntPtr.Zero) return true;
        if (otherUnityObject == null && unityObject.m_CachedPtr == IntPtr.Zero) return true;

        return false;
    }
     
    Last edited: Dec 2, 2022
  18. Huszky

    Huszky

    Joined:
    Mar 25, 2018
    Posts:
    109
    This, but DestroyedAsNull should have nullability annotations.
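    A sketch of what that might look like, assuming the hypothetical UnityObject/IsDestroyed API from the previous post and a nullable-enabled context:

    ```csharp
    #nullable enable

    public static class UnityObjectExtensions
    {
        // The T? return type tells the compiler the result may be null,
        // so `?.` and `??` chains get full nullability analysis, and using
        // the result without a null check produces a warning.
        public static T? DestroyedAsNull<T>(this T? unityObject) where T : UnityObject
        {
            if (unityObject == null || unityObject.IsDestroyed)
                return null;

            return unityObject;
        }
    }
    ```

    With that, `float f = unityObject.DestroyedAsNull()?.someFloatField ?? 0f;` compiles without any "possible null" false positives.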
     
  19. atcarter714

    atcarter714

    Joined:
    Jul 25, 2021
    Posts:
    63
    That was incorrect; I guess I mixed up/transposed what I intended to type and got it backwards. I meant to say that `is null` and `is not null` (as well as the ?? operator, as you pointed out) make no consideration of the native engine pointer/handle and just tell you whether the managed reference is null or not. Despite the warnings Visual Studio and Rider give me, saying you cannot use ??/??= or `is [not] null` on UnityEngine.Objects, I still do it when I want to check if a thing got assigned/initialized.

    Oddly enough, I've noticed that some UnityEngine Objects will claim they are not null references even without assignment. So I have to be wary and selective about what I'm actually using it on ...
     
  20. SisusCo

    SisusCo

    Joined:
    Jan 29, 2019
    Posts:
    1,295
    This is because Unity automatically assigns a "fake null" value to serialized Object fields in the Editor.

    So the `is` operator only works for your use case if the checked variable hasn't gone through the deserialization process, e.g. if it's static, readonly, non-serialized, or a member of a component that was created at runtime using AddComponent.
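    A sketch illustrating the difference (hypothetical component; the placeholder behavior is editor-only and may vary by Unity version, so treat the expected values as my understanding rather than guaranteed output):

    ```csharp
    using UnityEngine;

    public class FakeNullFieldDemo : MonoBehaviour
    {
        // Serialized: if left empty in the Inspector, the editor backs this
        // with a "fake null" placeholder object typed as Rigidbody.
        public Rigidbody serialized;

        // Non-serialized: never touched by deserialization, stays a real null.
        [System.NonSerialized] public Rigidbody nonSerialized;

        void Start()
        {
            Debug.Log(serialized is null);    // likely false in the editor (placeholder reference)
            Debug.Log(serialized == null);    // true (overloaded operator sees it as null)
            Debug.Log(nonSerialized is null); // true (genuine null reference)
        }
    }
    ```

    This is why `is`-based checks are only reliable on fields that never went through the editor's deserialization, as described above.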
     
    atcarter714 likes this.