So, we have been stuck on 0.5.3-preview for a long while now, which has been blocking our upgrade to Unity 2019, and this is something I have been trying to tackle. With every release after 0.5.3, our game simply does not work. I have now rebuilt the Addressables setup and catalogue from scratch in 1.1.7, fixing all the asset duplication issues we had at the same time. To my great surprise, the game still does not work.

After much investigation, I have concluded that there is something very wrong with dependency resolution. We have a very complicated dependency graph: assets depend on other assets, which depend on other assets, and so on. I have made sure that wherever an asset depends on an asset in another asset bundle, the latter is addressable, so that it is explicitly referenced by that bundle. There are no duplicated dependencies anywhere at this point, both by design and by analysis rule.

Now, when we load, say, the main character, we get *a lot* of spam that did not occur in our old setup on 0.5.3:

```
The referenced script (Unknown) on this Behaviour is missing!
(Filename: C:\buildslave\unity\build\Runtime/Scripting/ManagedReference/SerializableManagedRef.cpp Line: 197)

The referenced script on this Behaviour (Game Object '<null>') is missing!
(Filename: C:\buildslave\unity\build\Runtime/Mono/ManagedMonoBehaviourRef.cpp Line: 333)
```

According to other threads, this is caused by having dependencies multiple levels deep with duplication. That is not the case here. Moreover, what follows is even weirder.

The system uses a very bespoke version of UMA. UMA serializes character recipes as text, which can live in assets, and references to other assets use a soft-reference system similar to Addressables. We scrapped the latter: when recipes are serialized, any references to unmanaged assets are put in an array of UnityEngine.Object so that Unity can detect the dependencies, and recipes are only stored in asset files. Loading arbitrary text recipes is not supported.
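To illustrate the scheme, a rough sketch (class and field names here are hypothetical, not our actual code): the recipe text refers to assets by index into a UnityEngine.Object array that Unity itself serializes.

```csharp
// Hypothetical sketch of the recipe scheme described above: the recipe text
// stores indices into a UnityEngine.Object array that Unity serializes, so
// the build pipeline can see the dependencies and bundle them correctly.
using System.Collections.Generic;
using UnityEngine;

public class RecipeAsset : ScriptableObject
{
    [SerializeField] string recipeText;                           // serialized recipe
    [SerializeField] List<Object> references = new List<Object>();

    // At deserialization time the recipe resolves each reference by index.
    // This is where the invalid casts now surface: the entry sometimes comes
    // back as a bare UnityEngine.Object instead of its real type.
    public T Resolve<T>(int index) where T : Object
    {
        return (T)references[index];
    }
}
```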
At deserialization time, managed references are restored using the managed reference array. This has worked for a long time without problems... until the missing-script spam started. Now, all of a sudden, I'm getting invalid cast exceptions. What is happening is that the native objects are deserialized correctly, but their managed wrappers are instantiated as UnityEngine.Object! If I debug that code, the name of the object is correct, and by setting Selection.activeObject to it I can see in the inspector that all the data is there, but object.GetType() returns UnityEngine.Object! What is even worse is that this seems to be entirely non-deterministic: which objects fail to load varies.

I have managed to make a repro in a test project using nothing but ScriptableObjects, and I am utterly stumped as to where to go from here. A, B, C and D are separate ScriptableObject-derived classes. A, B and C each have a UnityEngine.Object reference. I created the following asset structure:

All four chains (i.e. objects A1, A2, A3 and A4) are referenced by a prefab. The prefab and all objects in all chains are addressable and all in a Packed Separately group. Chains 3 and 4 always deserialize correctly. Chains 1 and 2 *occasionally* deserialize correctly. Most of the time, the reference to object B is dropped to an instance of UnityEngine.Object. Inspecting the new object with SerializedObject/SerializedProperty, I can see that the native object is deserialized correctly, but I cannot do anything with it, since it is the wrong class in the managed domain. If I change the types of the references to be more specific, the references are outright dropped to null instead.

... ideas?

Edit: I have now submitted this as a bug. @unity_bill @PaulBurslem it would be great to get your input on this; I am utterly stumped after spending an extraordinary amount of time in the guts of the system trying to debug this, even making asset dependency diagrams to see what on earth is going on.
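For completeness, the repro classes look roughly like this; the class names A–D are from my test project, but the field name and the check below are a reconstruction, not the exact code:

```csharp
// Sketch of the repro described above. In Unity, each class lives in its own
// .cs file whose name matches the class, or the serializer cannot bind it.
using UnityEngine;

public class A : ScriptableObject { public Object reference; } // A1..A4 assets
public class B : ScriptableObject { public Object reference; }
public class C : ScriptableObject { public Object reference; }
public class D : ScriptableObject { }                          // chain terminator

public static class ReproCheck
{
    // After loading an A through Addressables, the failure shows up as the
    // wrapper of the referenced B object having the wrong managed type.
    public static bool WrapperTypeIsCorrect(A a)
    {
        Object b = a.reference;
        Debug.Log($"{b.name}: {b.GetType()}"); // name is right; GetType() is
                                               // sometimes UnityEngine.Object
        return b is B;                         // false when the bug hits
    }
}
```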