I am still running on 5.6.4 and have avoided upgrading to 2017, but the new assembly definitions would be a huge productivity boost for me. Wondering if anyone's done the transition from 5.6, and if so, how did it go? Also, just general feedback: I followed some of the beta and had concerns about problems people found in the release candidates. Did they iron most of them out?
Seems alright to me, but I have next to no external code running, just our own stuff. Assembly defs aren't a magic bullet, though; I found a couple of issues with ScriptableObjects, and I'm not sure yet whether they're my fault.
Got any more details on the problems or hiccups you've had? Compile time for me is around 30 seconds off an SSD, which is a major problem.
When moving up to 2017.3, the first thing to do is test all of your build targets. There are a few API changes that will cause build issues on specific platforms. For example, if your code references GUILayer, it'll build on Standalone but not on WebGL.
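For cases like that, conditional compilation lets one target keep a call that another target no longer builds against. A minimal sketch using the GUILayer case mentioned above (the class name here is just illustrative):

    using UnityEngine;

    public class PlatformGuardExample : MonoBehaviour
    {
    #if !UNITY_WEBGL
        void Start()
        {
            // GUILayer builds on Standalone but not WebGL in 2017.3,
            // so this block is compiled out for WebGL targets.
            GUILayer layer = GetComponent<GUILayer>();
            if (layer != null)
                Debug.Log("GUILayer present");
        }
    #endif
    }

It's worth grepping your project for the APIs the upgrade guide flags before you even attempt a build.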
About the same here on a rather large project. I've been running the 2017.3 beta for a while and testing assembly defs. The thing is, most of that 30 seconds is other stuff, so assembly defs won't bring that number down by a large percentage. If you look in your Library folder, you can see it's actually rebuilding things like the asset database as well. The total simply takes longer than compiling that amount of code should, which you can verify by either compiling everything externally (e.g. via VS) or comparing against other codebases of similar size.
Any info on what those other things are? I know the actual C# compilation is quick, but I assumed a lot of the cost involved serialized object data, like remapping MonoBehaviours: "meta file has this id", then "use some reflection tool to resolve the class for that id", etc. I thought that was the point of the whole 'logical assembly' thing, since, if I read correctly, the Unity assemblies aren't actually different assemblies; it just means Unity can avoid rebuilding the internal mappings for some assets. Or is that incorrect? [Looking over the manual page, it seems I did misunderstand, although actual separate DLLs would still allow less internal remapping(?)]
I don't know everything it does; I saw a couple of obvious things in Library, like the asset database, and didn't see an improvement that really mattered much. Plus there were numerous issues with dependencies: to use this with most of the large, well-known Asset Store technical assets, you'd have to move code around to make it work, or some configurations work in the editor but fail in builds. So overall it just wasn't good bang for the buck for me. I do love the 32-bit mesh indices, though; that feature didn't get enough fanfare, really.
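For anyone who hasn't tried the 32-bit mesh indices yet, it's a one-line opt-in per mesh via Mesh.indexFormat, which lifts the old 65,535-vertex ceiling. A minimal sketch (the class name is just illustrative):

    using UnityEngine;
    using UnityEngine.Rendering;

    public class BigMeshExample : MonoBehaviour
    {
        void Start()
        {
            Mesh mesh = new Mesh();
            // Switch to 32-bit indices before assigning geometry,
            // allowing more than 65,535 vertices in a single mesh.
            mesh.indexFormat = IndexFormat.UInt32;
            // vertices / triangles would be assigned here
        }
    }

Note you have to set the format before filling in the index data.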
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;

    public class Test : MonoBehaviour {

        // Use this for initialization
        void Start () {
            Texture foo = null;
            if (foo) {
                Debug.Log(foo.imageContentsHash);
            }
        }

        // Update is called once per frame
        void Update () {
        }
    }

!!!BUILD FAILED!!!

error CS1061: Type `UnityEngine.Texture' does not contain a definition for `imageContentsHash' and no extension method `imageContentsHash' of type `UnityEngine.Texture' could be found. Are you missing an assembly reference?
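That error is consistent with imageContentsHash being editor-only API: it compiles in the editor but isn't present in player builds. Wrapping the call in UNITY_EDITOR should let the build pass; a minimal sketch of the fix:

    using UnityEngine;

    public class Test : MonoBehaviour
    {
        void Start()
        {
            Texture foo = null;
            if (foo)
            {
    #if UNITY_EDITOR
                // imageContentsHash only exists in the editor,
                // so compile this line for editor builds only.
                Debug.Log(foo.imageContentsHash);
    #endif
            }
        }
    }

If you need an equivalent at runtime, you'd have to compute your own hash from the texture data instead.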