I'm confused about what you said. Why would setting it be forbidden? Take this for example:

Code (CSharp):
gun = player.transform.Find("Gun").gameObject;

(source: https://docs.unity3d.com/ScriptReference/Transform.Find.html)
@qklxtlx, you're not confused, @gorbit99 is. "Transform Parent = null" is perfectly fine. Transform references can be null, like any other object reference. The problem is probably related to the way Unity has overridden .Equals in UnityEngine.Object, to report that destroyed objects are equal to null. This affects == and != but maybe doesn't affect ??.

I've tried to find some documentation on this, but so far have found nothing that's clear. Wikipedia claims that ?? and != null are equivalent, but I'm skeptical. The C# docs, which are generally written very carefully, say that ?? returns the left operand if it "is not null" — it doesn't say "is not equal to null." This implies that ?? is only testing whether the reference is actually null, not some other value that merely Equals null.

EDIT: Yeah, compare to the docs on ==, which go on at some length about how it might be overridden. I think it's pretty clear now that ?? does no such thing, and is basically checking ReferenceEquals(lhs, null). What I still don't get, though, is why making your Parent reference a public field rather than a local variable has anything to do with it!
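A minimal sketch of the difference being discussed (untested; assumes a Unity scene with this component attached and Parent left unassigned in the Inspector):

```csharp
using UnityEngine;

public class CoalesceTest : MonoBehaviour
{
    public Transform Parent; // left unassigned in the Inspector

    void Start()
    {
        Debug.Log(Parent == null);                // True: Unity's overridden ==
        Debug.Log(ReferenceEquals(Parent, null)); // False: the field holds a fake-null object

        // ?? behaves like ReferenceEquals here: it returns Parent,
        // not the fallback, because Parent is not *actually* null.
        var result = Parent ?? transform;
        Debug.Log(ReferenceEquals(result, Parent)); // True
    }
}
```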
First of all, he's not trying to set a transform, he's only referencing one. You cannot set a GameObject's transform anyway; that's just the nature of the GameObject. It always has a transform and it stays the same for its whole lifetime.

That specific operator only takes the usual 'null' into account, whereas Unity also offers a pseudo-null object behaviour to present more information to the programmer in the editor (that's what happens in the second example: Unity assigns such a mysterious object to the non-assigned, serialized field). Usual comparisons like ==, != and some boolean conversions will check whether the variable is 'null' or an instance that behaves as if it were null.

You can show the difference by simply treating the reference in the second example as UnityEngine.Object and then as System.Object. Here's a small snippet:

Code (CSharp):
public class NullTest : MonoBehaviour
{
    public Transform Parent;

    void Start()
    {
        // overridden operator compares to null and fake-null, result is true
        Debug.Log(Parent == null);

        // casting to a System.Object reveals that there's actually something assigned to it, result is false
        Debug.Log((System.Object)Parent == null);
    }
}
Thanks, @Suddoha. That was the bit I was missing — I thought an unassigned public field would actually be null, but I see that Unity instead assigns (as you say) fake-null.
I often wonder what the guy at Unity who came up with the idea of the null override thinks of his decision this many years later.
I can relate to that, because it's a little tricky and can be really confusing! GameObjects and all sorts of components (afaik) are treated like that, but not MonoBehaviours. That's probably the thing which I also sometimes forget.

Even more confusing: generics. Take this example:

Code (CSharp):
public class NullTest : MonoBehaviour
{
    public UnityEngine.Object obj;
    public GameObject go;
    public Component component;     // transform, light, camera, etc.
    public MonoBehaviour behaviour; // usually custom types
    public Component Parent;

    void Start()
    {
        LogNullResult<UnityEngine.Object>(obj);
        LogNullResult<GameObject>(go);
        LogNullResult<Component>(component);
        LogNullResult<MonoBehaviour>(behaviour);
    }

    private void LogNullResult<T>(T obj)
    {
        var typeSpecific = obj == null;
        var asSystemObject = (System.Object)obj == null;
        Debug.Log(string.Format("Result for {0}: Type-Specific: {1}, System.Object: {2}", typeof(T), typeSpecific, asSystemObject));
    }
}

Afterwards, add a constraint, i.e. replace

Code (CSharp):
private void LogNullResult<T>(T obj)

(which is, as you probably know, kind of equal to 'where T : System.Object') with

Code (CSharp):
private void LogNullResult<T>(T obj) where T : UnityEngine.Object

Clearly a difference that might have caused headaches and bugs sometimes. Without the constraint it takes System.Object's implementation (which is logical, as operators are defined as statics and bound at compile time), with the constraint it takes the operator of the constraining type. What a mess and pain sometimes. :S
Got it, thank you very much for the detailed discussion and explanations! I'd never actually thought about fake null before.
Just another question: the inspector may show "Missing" or "None" for an object field. Is this also part of the fake null behaviour?
That's just indicating that there's nothing set, with the difference that 'Missing' additionally indicates there was something which no longer exists, while 'None' means it hasn't been set at all or has explicitly been set to None. 'None' may cause an 'UnassignedReferenceException' whereas 'Missing' may cause a 'MissingReferenceException', but only in the context of types that are actually candidates for Unity's fake-null. Or not? See below!

Yet again, missing references seem to only 'really' appear during runtime; if it says Missing in the editor while not in play mode, it will be None as soon as it starts. At least that's what I've observed; you may have observed something else.

Anyway, if you explicitly set the field to null at some point, it'll be just the normal null and will not act as if it were null. Also, local variables will not act as null, even for those types mentioned above. It's only for GameObjects and Components (except MonoBehaviours) that are serializable fields.

But here's a fun fact: destruction with Destroy(...) changes the game. If you destroy a MonoBehaviour, it'll behave like a component and cause the null-check to yield true, while (of course) a cast to System.Object yields false. (You're still referencing something which 'should not' exist anymore.) So that's another inconsistent behaviour...

It all has something to do with the lifetime of objects in C++ and C#. If you destroy the object, the unmanaged C++ object can be destroyed and its memory released, but the C#-side object will remain at least until you lose the last reference to it, so that it can be collected. At that point, they somehow still wanted to be able to tell you that it 'should be null' (but cannot be null - C# stuff).

The best is always to check something by yourself; I don't know all the little details either and I don't wanna spread wrong information.
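The Destroy behaviour described above can be sketched like this (untested; assumes a Unity runtime, and Light is just an example component):

```csharp
using UnityEngine;

public class DestroyNullTest : MonoBehaviour
{
    void Start()
    {
        var light = gameObject.AddComponent<Light>();
        DestroyImmediate(light); // native object is gone, managed wrapper survives

        Debug.Log(light == null);                // True: Unity's overridden ==
        Debug.Log((System.Object)light == null); // False: the C# reference still points at the wrapper
        Debug.Log(ReferenceEquals(light, null)); // False, for the same reason
    }
}
```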
Yes, I would assume so. So basically just remember this: if you're using == or != to compare Unity object references to null, this will work both for references that are actually null, and for references that are equivalent to null (like destroyed objects, and apparently serialized fields that have no value assigned). But if you use ?? or ReferenceEquals, this tells you whether your reference is actually null (only). As @Suddoha showed, you can also do the latter by typecasting to System.Object. So, in most cases the behavior of == and != is quite convenient... it's almost (but not quite) like having magic references that automatically get set to null when an object is destroyed. But yeah, it can be a gotcha sometimes!
Sounds like they regret it. https://blogs.unity3d.com/2014/05/16/custom-operator-should-we-keep-it/

However the functionality is so core to the engine that it's difficult to remove without breaking every single project ever built. At the same time it's likely that we would simply be trading one edge case for another.

Here are my rules for working around it:
- Never check if a UnityEngine.Object is null directly, always use the bool operator
- Never cast a UnityEngine.Object to a System.Object
- Never write a generic method designed to take UnityEngine.Object and System.Object

I suppose I can add "never use ?? on a UnityEngine.Object" to the list too.
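For reference, the "bool operator" in the first rule is the implicit bool conversion UnityEngine.Object defines. A minimal sketch of what following that rule looks like (untested; assumes a Unity scene):

```csharp
using UnityEngine;

public class BoolOperatorTest : MonoBehaviour
{
    public Transform target;

    void Update()
    {
        // Use the implicit bool conversion instead of "!= null".
        // It is false for real null, fake-null and destroyed objects alike.
        if (target)
        {
            target.Rotate(0f, 90f * Time.deltaTime, 0f);
        }
    }
}
```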
Pretty much. Unless the operator can be overridden to work with the Unity fake null. Note that most of the proposed solutions for removing the fake null and equals override would still leave you with the destroyed objects problem. It's not easily possible to set up a memory managed environment that also allows you to destroy arbitrary objects.
I'm really curious as to why the C# team made the ?. and ?? operators use ReferenceEquals semantics instead of making them use the == operator.
The concept of nullability is complicated, but I think MS made the right decision here: for operators like ?. and ??, that are specifically about checking if something is null, it should just directly check that it is null. C# specifically doesn't let you override ReferenceEquals because a developer needs to know if a reference is actually the same reference most of the time, letting people override this would screw up almost any code that assumes the reference you're getting is actually the reference it says it is, and could probably screw up garbage collection as well. The concept of a "null reference" is sort of core to any reference based language and you shouldn't be able to mess with it. Null means the reference is not pointing at anything. Unity's to blame for trying to change what "null" means without having a good reason for it, and ignoring all the documentation about why you shouldn't do exactly what they did.
I agree that ReferenceEquals shouldn't be allowed to be overridden (and it can't be, since it's static). What I meant was that I wonder why they did this:

Code (CSharp):
var x = myObject?.myProperty;
// var x = ReferenceEquals(myObject, null) ? null : myObject.myProperty;

instead of this:

Code (CSharp):
var x = myObject?.myProperty;
// var x = myObject == null ? null : myObject.myProperty;

I agree that one shouldn't override the == operator without good reason, but if they do, they want you to use their operator to decide whether two objects are equal. MS choosing the former makes overriding the == operator less useful, and developers who are still on C# < 6.0 are probably using the == operator to check for null, so when they transition to C# 6.0 and start using ?. they will be surprised that they get different results (if they use ?. on classes that override ==).
I don't think anyone but Unity overrides == to check for null, most of the time people override it to handle the regular Equals(), to see if two objects are equal. Specific examples for why MS didn't tie the null checking operators to the == override... let's say you override it to sometimes return false on == null when it actually is null. Now "thing?.field" can throw a NullReferenceException, avoiding which was the entire reason they invented the "?." operator in the first place. "thing ?? otherthing" now gives you "thing", which is null, which will throw a null reference exception as soon as you use it, which makes it inherently useless. The opposite also causes problems. If you override == null to return true when it actually isn't null, then "thing ?? new Thing()" leaks memory. I feel much better knowing that the null-checking operators will not themselves throw null reference exceptions. You can always write extension methods to let you throw null ref exceptions from null checks if you really want to annoy other users.
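The extension-method idea mentioned at the end could look something like this hypothetical sketch (the names ThrowIfNull and the usage line are made up for illustration):

```csharp
using UnityEngine;

public static class NullCheckExtensions
{
    // Hypothetical helper: re-introduces a thrown exception where the
    // null-conditional operators would silently pass a Unity fake-null through.
    public static T ThrowIfNull<T>(this T obj, string name) where T : Object
    {
        // Unity's overridden ==: catches real null, fake-null and destroyed objects.
        if (obj == null)
            throw new System.NullReferenceException(name + " is null or destroyed");
        return obj;
    }
}

// Usage sketch:
// player.ThrowIfNull("player").transform.position = Vector3.zero;
```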
Not likely to happen. It's so deep in the Unity architecture that it would break every single Unity project in existence. When (if?) it does happen, Unity will telegraph it months or years in advance.
They were saying that back when they didn't have the script upgrade system in place though. Now that they do, an intelligent conversion system could be made to repair people's projects for the change.
There was a discussion about this a while back with input from @JoshPeterson. The latest I've heard is nothing's going to change for now.
I think they should at least add an "isAlive" or "isDestroyed" field to GameObjects and Components so that we can get used to not depending on the null override. It will make it easier to remove the null override if that day ever arrives.
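Until such a property exists, a hypothetical userland approximation (which itself relies on the very override under discussion, and also reports true for unassigned fake-null fields) might look like:

```csharp
using UnityEngine;

public static class DestroyedCheckExtensions
{
    // Hypothetical helper: true only when the C# reference is still alive
    // but Unity reports the object as null (destroyed or fake-null).
    public static bool IsDestroyed(this Object obj)
    {
        return obj == null && !ReferenceEquals(obj, null);
    }
}

// Usage sketch:
// if (myComponent.IsDestroyed()) { /* re-acquire the reference or bail out */ }
```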
Well, if someone's override says an object is not null when it really is null, then they really messed up somewhere. :/