
Feedback About the com.unity.serialization Package

Discussion in '2023.2 Beta' started by Canijo, Nov 2, 2023.

  1. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    First of all: incredible package, I love it, though I haven't seen it advertised anywhere and found it by accident. Also, I don't see a section for it in the package forums, so I posted here.

    I don't know if this package is going to remain public, or if I should forget about it because it will become internal, but it's pure gold for me.

    About implementing adapters, I believe it could be beneficial to complement IAdapter&lt;T&gt; and IContravariantAdapter&lt;T&gt; (either Json or Binary) with a generic IGenericAdapter along these lines:

    Code (CSharp):
    public interface IGenericBinaryAdapter
    {
        void Serialize<TValue>(in BinarySerializationContext<TValue> context, TValue value);
        TValue Deserialize<TValue>(in BinaryDeserializationContext<TValue> context);
    }
    I have a use case where I want any class or struct that implements ISerializationCallbackReceiver to get OnAfterDeserialize() and OnBeforeSerialize() called.
    With the current implementation, I cannot target structs "generically" through an adapter, as contravariance doesn't work on value types, so I would have to create an adapter specifically for each struct that implements the interface.

    With a generic adapter I could target any class or struct, and having the generic type parameter in the adapter's methods could benefit other use cases as well, especially targeting structs that implement a given interface.
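
    As an illustration, a sketch of an adapter over that proposed interface (IGenericBinaryAdapter is the suggested interface above, not part of the package; the ContinueVisitation() hook for falling through to default serialization is an assumption):
    Code (CSharp):
    public class SerializationCallbackAdapter : IGenericBinaryAdapter
    {
        public void Serialize<TValue>(in BinarySerializationContext<TValue> context, TValue value)
        {
            // Pattern matching boxes structs, but lets a single adapter target every implementor.
            if (value is ISerializationCallbackReceiver receiver)
                receiver.OnBeforeSerialize();
            context.ContinueVisitation(); // assumed hook to run the default serialization
        }

        public TValue Deserialize<TValue>(in BinaryDeserializationContext<TValue> context)
        {
            var value = context.ContinueVisitation(); // assumed hook for default deserialization
            if (value is ISerializationCallbackReceiver receiver)
            {
                receiver.OnAfterDeserialize();
                value = (TValue)receiver; // copy the (possibly mutated) box back for structs
            }
            return value;
        }
    }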

    Also, I'd like you to consider a similar adapter that also passes down an object representing the property being serialized/deserialized (like Property&lt;TContainer,TValue&gt; from Unity.Properties), as this would allow us to look at the attributes of the property, so we could have System.Attributes that affect serialization.

    Still, I love this package, really big thanks :)
     
    Last edited: Nov 2, 2023
    CodeSmile likes this.
  2. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    So, replying to myself xD: for the ISerializationCallbackReceiver problem, this can be achieved by just running a PropertyVisitor before and after the serialization. It adds the overhead of two full visitations of the object graph, but it is doable; it just seems like a waste considering we are already visiting the graph.
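
    A rough sketch of that workaround, assuming the PropertyVisitor base class and PropertyContainer.Accept entry point from Unity.Properties (the exact override signature is from memory and should be treated as an assumption):
    Code (CSharp):
    public class AfterDeserializeVisitor : PropertyVisitor
    {
        protected override void VisitProperty<TContainer, TValue>(
            Property<TContainer, TValue> property, ref TContainer container, ref TValue value)
        {
            // Recurse into nested containers first, then fire the callback bottom-up.
            base.VisitProperty(property, ref container, ref value);

            if (value is ISerializationCallbackReceiver receiver)
            {
                receiver.OnAfterDeserialize();
                value = (TValue)receiver; // pattern matching boxes structs; copy back
            }
        }
    }

    // After BinarySerialization.FromBinary / JsonSerialization.FromJson:
    // PropertyContainer.Accept(new AfterDeserializeVisitor(), ref deserializedValue);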
     
  3. CodeSmile

    CodeSmile

    Joined:
    Apr 10, 2014
    Posts:
    6,422
    I agree, I love it too - but only binary! :)
    If you dabble with the Json part ... I bet you'll be like: :confused::oops:o_O:(:rolleyes::eek::mad::mad::mad::mad::mad:


    I'll come back to this post tomorrow. Want to double-check whether I've already done something like this. Can't remember.
     
  4. CodeSmile

    CodeSmile

    Joined:
    Apr 10, 2014
    Posts:
    6,422
    Okay, I don't know exactly how this fits the question, so I'll just post what I'm doing in that project where I binary-serialize classes with native collections in them.

    First, this is the To/FromBinary starting point with a byte[] as input/output that works with a generic type:
    Code (CSharp):
    public static class Serialize
    {
        /// <summary>
        /// Serializes the object to binary using the provided adapters.
        /// Adapters provide control over how serialization is processed.
        /// </summary>
        /// <param name="obj"></param>
        /// <param name="adapters"></param>
        /// <typeparam name="T"></typeparam>
        /// <returns></returns>
        public static unsafe Byte[] ToBinary<T>(T obj, IReadOnlyList<IBinaryAdapter> adapters = null)
        {
            var buffer = new UnsafeAppendBuffer(16, 8, Allocator.Temp);
            var parameters = new BinarySerializationParameters { UserDefinedAdapters = adapters?.ToList() };
            BinarySerialization.ToBinary(&buffer, obj, parameters);

            var bytes = buffer.ToBytesNBC();
            buffer.Dispose();

            return bytes;
        }

        /// <summary>
        /// Attempts to deserialize a byte[] to the specified type using the provided adapters.
        /// </summary>
        /// <param name="serializedBytes"></param>
        /// <param name="adapters"></param>
        /// <typeparam name="T"></typeparam>
        /// <returns></returns>
        public static unsafe T FromBinary<T>(Byte[] serializedBytes, IReadOnlyList<IBinaryAdapter> adapters = null)
        {
            fixed (Byte* ptr = serializedBytes)
            {
                var bufferReader = new UnsafeAppendBuffer.Reader(ptr, serializedBytes.Length);
                var parameters = new BinarySerializationParameters { UserDefinedAdapters = adapters?.ToList() };
                return BinarySerialization.FromBinary<T>(&bufferReader, parameters);
            }
        }
    }

    I have an interface for the classes/structs that I wish to serialize from within, rather than putting everything in adapter classes:
    Code (CSharp):
    public interface IBinarySerializable
    {
        unsafe void Serialize(UnsafeAppendBuffer* writer);
        unsafe void Deserialize(UnsafeAppendBuffer.Reader* reader, Byte serializedDataVersion);
    }

    This is a data container class which accepts generic structs (unmanaged) that implement the IBinarySerializable interface:
    Code (CSharp):
    public struct LinearDataMapChunk<TData> : IEquatable<LinearDataMapChunk<TData>>, IDisposable
        where TData : unmanaged, IBinarySerializable
    {
        private const Byte ChunkAdapterVersion = 0;

        private ChunkSize m_Size;
        private UnsafeList<TData> m_Data;

        public static List<IBinaryAdapter> GetBinaryAdapters(Byte dataAdapterVersion) => new()
        {
            new LinearDataMapChunkBinaryAdapter<TData>(ChunkAdapterVersion, dataAdapterVersion, Allocator.Domain),
        };

        public LinearDataMapChunk(ChunkSize size, UnsafeList<TData> data)
        {
            if (data.IsCreated == false)
                throw new ArgumentException("UnsafeList<TData> passed into ctor is not allocated");

            m_Size = math.max(ChunkSize.zero, size);
            m_Data = data;
            ResizeListToIncludeHeightLayer(m_Size.y);
        }

        // ... rest omitted
    }

    Then I have a base class for my adapters, just so that every adapter carries a version - this is crucial if you want to be able to support older versions of already-serialized data (e.g. a user's savegame, or user content!) as you update your serialized classes with more fields, different types, etc.
    Code (CSharp):
    public abstract class VersionedBinaryAdapterBase
    {
        /// This represents the adapter's "latest" version.
        public Byte AdapterVersion { get; set; }

        public VersionedBinaryAdapterBase(Byte adapterVersion) => AdapterVersion = adapterVersion;

        /// Write the current version / read the serialized version
        protected unsafe void WriteAdapterVersion(UnsafeAppendBuffer* writer) => writer->Add(AdapterVersion);
        protected unsafe Byte ReadAdapterVersion(UnsafeAppendBuffer.Reader* reader) => reader->ReadNext<Byte>();
        protected String GetVersionExceptionMessage(Byte version) => $"serialized version {version} no longer supported";
    }

    Next is an actual adapter implementation that calls the interface methods Serialize and Deserialize. This has the advantage that the serialization code now lives within the serialized data type, keeping it closely tied to the data and easy to update without sifting through many more lines of adapter code.
    Code (CSharp):
    public class LinearDataMapChunkBinaryAdapter<TData> : VersionedBinaryAdapterBase,
        IBinaryAdapter<LinearDataMapChunk<TData>> where TData : unmanaged, IBinarySerializable
    {
        private readonly Byte m_DataVersion;
        private readonly Allocator m_Allocator;

        private static unsafe void WriteChunkData(
            in BinarySerializationContext<LinearDataMapChunk<TData>> context, in UnsafeList<TData>.ReadOnly dataList)
        {
            var writer = context.Writer;
            var dataLength = dataList.Length;
            writer->Add(dataLength);

            foreach (var data in dataList)
                data.Serialize(writer);
        }

        private static unsafe UnsafeList<TData> ReadChunkData(
            in BinaryDeserializationContext<LinearDataMapChunk<TData>> context, Byte serializedDataVersion,
            Allocator allocator)
        {
            var reader = context.Reader;
            var dataLength = reader->ReadNext<Int32>();

            var list = UnsafeListExt.NewWithLength<TData>(dataLength, allocator);
            for (var i = 0; i < dataLength; i++)
            {
                var data = new TData();
                // TODO: avoid boxing!
                data.Deserialize(reader, serializedDataVersion);
                list[i] = data;
            }

            return list;
        }

        public LinearDataMapChunkBinaryAdapter(Byte adapterVersion, Byte dataVersion, Allocator allocator)
            : base(adapterVersion)
        {
            m_DataVersion = dataVersion;
            m_Allocator = allocator;
        }

        public unsafe void Serialize(in BinarySerializationContext<LinearDataMapChunk<TData>> context,
            LinearDataMapChunk<TData> chunk)
        {
            var writer = context.Writer;

            WriteAdapterVersion(writer);
            writer->Add(m_DataVersion);
            writer->Add(chunk.Size);
            WriteChunkData(context, chunk.Data);
        }

        public unsafe LinearDataMapChunk<TData> Deserialize(
            in BinaryDeserializationContext<LinearDataMapChunk<TData>> context)
        {
            var reader = context.Reader;

            var serializedAdapterVersion = ReadAdapterVersion(reader);
            if (serializedAdapterVersion == AdapterVersion)
            {
                var serializedDataVersion = reader->ReadNext<Byte>();
                var chunkSize = reader->ReadNext<ChunkSize>();
                var data = ReadChunkData(context, serializedDataVersion, m_Allocator);

                return new LinearDataMapChunk<TData>(chunkSize, data);
            }

            throw new SerializationVersionException(GetVersionExceptionMessage(serializedAdapterVersion));
        }
    }

    So the chunk class and its adapter are written once, while the data they carry is generic - it just needs to implement the IBinarySerializable interface, and even that is optional.

    Finally, here are two versions of a struct that gets serialized. I use these in tests to check that "loading an older version of binary data" works as expected.
    Code (CSharp):
    public struct DataVersionOld : IBinarySerializable
    {
        public Byte RemainsUnchanged0;
        public Int16 WillChangeTypeInVersion1;
        public Byte RemainsUnchanged1;
        public Int64 WillBeRemovedInVersion1;
        public Byte RemainsUnchanged2;

        public unsafe void Serialize(UnsafeAppendBuffer* writer)
        {
            writer->Add(RemainsUnchanged0);
            writer->Add(WillChangeTypeInVersion1);
            writer->Add(RemainsUnchanged1);
            writer->Add(WillBeRemovedInVersion1);
            writer->Add(RemainsUnchanged2);
        }

        public unsafe void Deserialize(UnsafeAppendBuffer.Reader* reader, Byte serializedDataVersion) =>
            throw new NotImplementedException();
    }

    public struct DataVersionCurrent : IBinarySerializable
    {
        public const Double NewFieldInitialValue = 1.2345;

        public Byte RemainsUnchanged0;
        public Int64 WillChangeTypeInVersion1;
        public Byte RemainsUnchanged1;
        public Double NewFieldWithNonDefaultValue;
        public Byte RemainsUnchanged2;

        public unsafe void Serialize(UnsafeAppendBuffer* writer)
        {
            writer->Add(RemainsUnchanged0);
            writer->Add(WillChangeTypeInVersion1);
            writer->Add(RemainsUnchanged1);
            writer->Add(NewFieldWithNonDefaultValue);
            writer->Add(RemainsUnchanged2);
        }

        public unsafe void Deserialize(UnsafeAppendBuffer.Reader* reader, Byte serializedDataVersion)
        {
            switch (serializedDataVersion)
            {
                case 1:
                    RemainsUnchanged0 = reader->ReadNext<Byte>();
                    WillChangeTypeInVersion1 = reader->ReadNext<Int64>();
                    RemainsUnchanged1 = reader->ReadNext<Byte>();
                    NewFieldWithNonDefaultValue = reader->ReadNext<Double>();
                    RemainsUnchanged2 = reader->ReadNext<Byte>();
                    break;

                case 0:
                    RemainsUnchanged0 = reader->ReadNext<Byte>();
                    WillChangeTypeInVersion1 = reader->ReadNext<Int16>();
                    RemainsUnchanged1 = reader->ReadNext<Byte>();
                    reader->ReadNext<Int64>(); // skip bytes for: WillBeRemovedInVersion1
                    RemainsUnchanged2 = reader->ReadNext<Byte>();

                    // could also be a value computed from the other fields
                    NewFieldWithNonDefaultValue = NewFieldInitialValue;
                    break;

                default:
                    throw new SerializationVersionException($"unhandled data version {serializedDataVersion}");
            }
        }
    }

    Note that you needn't create new structs whenever you change the version - that's just for my unit tests. If you do change the version of the binary data, you do so in the existing data class and add another case to the switch statement to handle loading each older serialized-data version.

    There are cases where you may end up no longer supporting very old versions; you'd then remove the code for, say, versions older than the past four generations of serialized data, and trying to load that data will then throw an exception.

    Almost forgot, this is the "can I load serialized binary data of the older version?" unit test:
    Code (CSharp):
    [Test] public void Deserialize_WhenLoadingPreviousVersion_DataCanBeDeserialized()
    {
        var data0 = new DataVersionOld
        {
            RemainsUnchanged0 = 0xff,
            WillChangeTypeInVersion1 = 8,
            RemainsUnchanged1 = 0xff,
            WillBeRemovedInVersion1 = 9,
            RemainsUnchanged2 = 0xff,
        };

        using (var chunk = new LinearDataMapChunk<DataVersionOld>(new ChunkSize(1, 1, 1)))
        {
            chunk.SetData(LocalCoord.zero, data0);

            var adapterVersion0 = new List<IBinaryAdapter> {
                new LinearDataMapChunkBinaryAdapter<DataVersionOld>(TestAdapterVersion, 0, Allocator.Domain),
            };
            var bytes = Serialize.ToBinary(chunk, adapterVersion0);
            Debug.Log($"{bytes.Length} Bytes: {bytes.AsString()}");

            var adapterVersion1 = new List<IBinaryAdapter> {
                new LinearDataMapChunkBinaryAdapter<DataVersionCurrent>(TestAdapterVersion, 1, Allocator.Domain),
            };

            using (var chunk1 = Serialize.FromBinary<LinearDataMapChunk<DataVersionCurrent>>(bytes, adapterVersion1))
            {
                var data1 = chunk1.GetWritableData()[0];

                Assert.That(data1.WillChangeTypeInVersion1, Is.EqualTo((Int64)data0.WillChangeTypeInVersion1));
                Assert.That(data1.NewFieldWithNonDefaultValue, Is.EqualTo(DataVersionCurrent.NewFieldInitialValue));
                Assert.That(data1.RemainsUnchanged0, Is.EqualTo(data0.RemainsUnchanged0));
                Assert.That(data1.RemainsUnchanged1, Is.EqualTo(data0.RemainsUnchanged1));
                Assert.That(data1.RemainsUnchanged2, Is.EqualTo(data0.RemainsUnchanged2));

                // see if we can serialize v1 correctly
                var bytes2 = Serialize.ToBinary(chunk1, adapterVersion1);
                Debug.Log($"{bytes2.Length} Bytes: {bytes2.AsString()}");

                using (var chunk2 = Serialize.FromBinary<LinearDataMapChunk<DataVersionCurrent>>(bytes2, adapterVersion1))
                    Assert.That(chunk2.GetWritableData()[0], Is.EqualTo(data1));
            }
        }
    }

    Final note: the unit test relies on an IEquatable<> implementation that I omitted from the two test structs - just in case someone spots that the unit test isn't actually comparing the fields: it does, it's just not shown in this example. ;)
     
    Last edited: Nov 3, 2023
  5. martinpa_unity

    martinpa_unity

    Unity Technologies

    Joined:
    Oct 18, 2017
    Posts:
    484
    Hi @Canijo,

    Although the original implementer has moved on, I will relay your kind words.

    I agree this would be needed, but it's unlikely that this feature will be added in the short term. For the time being, it's probably fine to make it a local package and make the change directly.

    Same answer as the above, sadly. I believe the serialization context already contains that information in some way, but doesn't expose it. It might only be there for Json adapters, though.

    Hi @CodeSmile,

    Can you elaborate further on this? I've actually heard the opposite feedback many times. :)

    Thank you!
     
  6. CodeSmile

    CodeSmile

    Joined:
    Apr 10, 2014
    Posts:
    6,422
    Well, the issue with the Json serializer is that it seems to require you to know every detail of the Json format. It may be a case of bad documentation, too - perhaps I took a wrong turn somewhere.

    I ended up writing code that said: put a bracket here, then fill in content, then add another bracket, and of course make that a key/value thing, and so on... that's stuff I can pretty much do myself with a StringBuilder, and it'll be more readable and understandable.

    It just felt extremely low-level and confusing, to the point where I could not make the simplest things work. I was fighting heavily with the serializer throwing very confusing exceptions because it expected something different from what I gave it, and no matter how I tweaked it, it would still complain - for example, an object that contains another object, or an object that contains an array with data in it. Basic stuff I just couldn't make work after more than two days, where the resulting json would have been maybe five lines.

    Again, the docs for this are terrible, but in contrast the binary serialization was absolutely straightforward, almost natural, with even LESS documentation than the Json part.

    If there's some secret docs to this, let me know and I may give it another try.
     
  7. martinpa_unity

    martinpa_unity

    Unity Technologies

    Joined:
    Oct 18, 2017
    Posts:
    484
    The documentation of the package is lacking, for sure. In most cases you shouldn't have to write the brackets, indents or whatnot manually. The "primitives" of the JsonWriter follow rules that must be respected - for example, to write a key-value pair, you must be inside an object scope.

    Usually, the thing I've seen trip people up is that they try to open a scope before nesting into adapters - for example, opening an object scope and then calling context.SerializeValue(...), expecting that the adapter for that value will write it as a key-value pair. This tends to break very easily, and again, the error messages here are lacking.
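
    To make that pitfall concrete, here is a small sketch using the Player/Position types from the examples later in this post (the broken variant is left commented out):
    Code (CSharp):
    public void Serialize(in JsonSerializationContext<Player> context, Player value)
    {
        using var objectScope = context.Writer.WriteObjectScope();

        // Breaks easily: no key was written, yet we nest into the value's adapter
        // expecting it to produce a key-value pair - it writes a plain value instead.
        // context.SerializeValue(value.Position);

        // Works: write the key first, then let the adapter produce the value.
        context.Writer.WriteKey("Position");
        context.SerializeValue(value.Position);
    }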

    Looking at the type definitions in the Getting Started section of the package documentation, I'll iteratively implement adapters to make the output smaller. The basic output would be:
    Code (CSharp):
    {
        "Name": "Bob",
        "Health": 100,
        "Position": {
            "x": 10,
            "y": 20
        },
        "Inventory": [
            {
                "Name": "Sword",
                "Type": 0
            },
            {
                "Name": "Shield",
                "Type": 1
            },
            {
                "Name": "Health Potion",
                "Type": 2
            }
        ]
    }
    If you wanted to have the same output, but with different/shorter names, you could define a Player adapter like this:
    Code (CSharp):
    public void Serialize(in JsonSerializationContext<Player> context, Player value)
    {
        // Player will have multiple key-value fields, so we must open an object scope.
        using var objectScope = context.Writer.WriteObjectScope();

        // Most primitives can write key-value pairs directly.
        context.Writer.WriteKeyValue("N", value.Name);
        context.Writer.WriteKeyValue("H", value.Health);

        // Sub-objects can use the `SerializeValue` method.
        context.Writer.WriteKey("P");
        context.SerializeValue(value.Position);

        // Arrays can also use the `SerializeValue` method.
        context.Writer.WriteKey("I");
        context.SerializeValue(value.Inventory);
    }
    This would give:
    Code (CSharp):
    {
        "N": "Bob",
        "H": 100,
        "P": {
            "x": 10,
            "y": 20
        },
        "I": [
            {
                "Name": "Sword",
                "Type": 0
            },
            {
                "Name": "Shield",
                "Type": 1
            },
            {
                "Name": "Health Potion",
                "Type": 2
            }
        ]
    }
    Next, I want to serialize the Position on a single line. I can define an adapter for the int2 type to serialize it as a value rather than an object:
    Code (CSharp):
    public void Serialize(in JsonSerializationContext<int2> context, int2 value)
    {
        // Serializes as a value.
        context.Writer.WriteValue($"{value.x}, {value.y}");

        // Serializes as an object. This would be equivalent to the default behaviour.
        // using var objectScope = context.Writer.WriteObjectScope();
        // context.Writer.WriteKeyValue("x", value.x);
        // context.Writer.WriteKeyValue("y", value.y);
    }
    Similarly, you could do the same thing for the Item:
    Code (CSharp):
    public void Serialize(in JsonSerializationContext<Item> context, Item value)
    {
        context.Writer.WriteValue($"{value.Name} - {value.Type}");
    }
    This would give:
    Code (CSharp):
    {
        "N": "Bob",
        "H": 100,
        "P": "10, 20",
        "I": [
            "Sword - Weapon",
            "Shield - Armor",
            "Health Potion - Consumable"
        ]
    }
    Lastly, and this will make the output bigger: let's say you wanted to manually write the Item array inside the Player adapter and bypass the adapters. You could replace:
    Code (CSharp):
    // Arrays can use the `SerializeValue` method.
    context.Writer.WriteKey("I");
    context.SerializeValue(value.Inventory);
    with this:
    Code (CSharp):
    // Write the same array, but manually, bypassing the Item list and item adapters.
    context.Writer.WriteKey("I");
    using var arrayScope = context.Writer.WriteArrayScope();
    for (var i = 0; i < value.Inventory.Length; ++i)
    {
        using var arrayItemValueScope = context.Writer.WriteObjectScope();
        context.Writer.WriteKeyValue("Name", value.Inventory[i].Name);
        context.Writer.WriteKeyValue("Type", (int)value.Inventory[i].Type);
    }
    Which would give:
    Code (CSharp):
    {
        "N": "Bob",
        "H": 100,
        "P": "10, 20",
        "I": [
            {
                "Name": "Sword",
                "Type": 0
            },
            {
                "Name": "Shield",
                "Type": 1
            },
            {
                "Name": "Health Potion",
                "Type": 2
            }
        ]
    }
    Hope this helps!
     
    yu_yang and CodeSmile like this.
  8. CodeSmile

    CodeSmile

    Joined:
    Apr 10, 2014
    Posts:
    6,422
    Thanks, much appreciated! :)
    If you can, link your post in the docs. That'll be super helpful.
    What is really confusing about this is that there's no explanation of what constitutes an object or a scope and when to open/close them, as you pointed out. I'm pretty sure I tripped over exactly this, because I just could not get the nesting to work.
     
    martinpa_unity likes this.
  9. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    Thank you for your detailed examples! You actually just taught me a few things =D
    Thanks for the response, I understand. I still hope that if development gets picked up again, this post might get revisited =)

    I'm currently "re-designing the wheel" as a fun project where I'm making my own base "Object" class that should mimic many of the features of a Unity Object, but with my custom perks and many editor-only features.

    That was before I noticed that you had already released Unity.Properties (which I didn't know anything about) and the Runtime Binding API. With those + Unity.Serialization and the Source Generators compatibility, I'm just astounded by the power we have available.

    I hope we are getting to an age of "gosh how I LOVE working like this" :):):)

    I might leave some other suggestions / problems I find while using this.

    By the way, classic serialization rules don't really apply to this package, and I believe that's not properly documented? I don't know about the Json part, but with Binary you can actually serialize open generics, and any class is actually serialized by reference - the "inline" behaviour of classes with [SerializeField] is not respected (null can be serialized).

    The open-generic thing actually works wonders for me =D (though it probably breaks in AOT if I don't manually preserve those classes; still, it's so cool)
     
  10. martinpa_unity

    martinpa_unity

    Unity Technologies

    Joined:
    Oct 18, 2017
    Posts:
    484
    Once again, thanks for the kind words!

    Please do!

    Correct - when using Unity.Properties, we will generate properties for both [SerializeField] and [SerializeReference], but both are treated as polymorphic when the field is a reference type.

    With classic serialization, you can still end up with a null value in a [SerializeField] field, but it gets patched once it's "looked at", which is a behaviour we didn't want for Unity.Properties.

    There is a small note in here, perhaps we should make it clearer.

    As long as they are used somewhere, they should get included in the build. You can also force the generation of the property bag for a given type to ensure it is referenced.
     
  11. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    Nonetheless, say I have a class like:
    Code (CSharp):
    [Serializable]
    [GeneratePropertyBag]
    public class Property<T> : IProperty<T>
    {
        [SerializeField]
        private T? _value;
    }
    And I only reference it via generic calls, or generic interfaces that don't even mention Property&lt;T&gt; but instead some IProperty that will end up looking at the deserialized value. That should not get AOT-generated, right?

    This imaginary class would have been instantiated somewhere through reflection via
    Code (CSharp):
    var propertyType = typeof(Property<>).MakeGenericType(someValueType);
    var property = Activator.CreateInstance(propertyType);
    ...
    and then serialized, so concrete implementations are never directly mentioned in code.

    I believe I should have some pipeline that, at some point during the build or asset editing, collects all serialized types that should be preserved and creates a script like:

    Code (CSharp):
    static class PreserveGenerics
    {
        /// never actually called
        [UnityEngine.Scripting.Preserve]
        static void Preserve()
        {
            var p1 = new Property<Vector3>();
            var p2 = new Property<SomeCustomClass>();
            // ...
        }
    }
    Might there be an easier way to achieve this? Source generators won't help me here, because it's not code-dependent generation but "asset-dependent"?

    I'm new to this "dealing with AOT" business, but I think I cannot get away without some explicit "Preserve" mechanism that necessarily depends on what is actually serialized.
     
    Last edited: Nov 7, 2023
  12. martinpa_unity

    martinpa_unity

    Unity Technologies

    Joined:
    Oct 18, 2017
    Posts:
    484
    For the property bag generation, I don't think [GeneratePropertyBag] will do much here, because we don't generate property bags for open generic types. You will need to use [GeneratePropertyBagForType(typeof(...))] and pass it a "closed" type.

    But yeah, generally, if you only create instances through reflection and go through an interface, the types might not get preserved. Having the property bag generated will help, though.
     
  13. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    Ohh perfect, so I could probably get away with replacing the "PreserveGenerics" class with just

    Code (CSharp):
    [assembly: GeneratePropertyBagForType(typeof(Property<Vector3>))]
    [assembly: GeneratePropertyBagForType(typeof(Property<SomeCustomClass>))]
    ...
    as the generated property bag will already cause the type to be preserved =D

    ty!
     
  14. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    Is it by design that JsonSerialization cannot be called from a MonoBehaviour's OnAfterDeserialize()? I'm getting a "System out of memory" exception from the ReadJob.Run() inside JsonSerialization.FromJson(...) in the Editor.

    It only happens on AssemblyReload or EnterPlayMode (with assembly reloads); any other call to OnAfterDeserialize works, and moving the code to Awake() works.

    (I removed all custom code and only call JsonSerialization.FromJson with a non-null, valid JSON string; it crashes before even reaching any IJsonAdapter.)
    Edit: it really happens with anything, for example:
    Code (CSharp):
    1.  
    2.         public void OnAfterDeserialize()
    3.         {
    4.             string json = JsonSerialization.ToJson(5, default);
    5.             int value = JsonSerialization.FromJson<int>(json, default); /// crash
    6.         }
     
    Last edited: Nov 22, 2023
  15. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    So, in case someone runs into the same problem: the crash is happening due to "something" related to Jobs not being ready right after an AssemblyReload (serialization code appears to run before anything else when the reload happens, even before any [InitializeOnLoadAttribute]).

    There is a method overload for JsonSerialization.FromJson that bypasses the call to ReadJob.Run() by accepting a SerializedValueView.

    Using that method fixes it for me, like so:

    Code (CSharp):
    1.  
    2. static unsafe T FixedDeserialize<T>(string json, JsonSerializationParameters parameters = default)
    3. {
    4.    fixed (char* buffer = json)
    5.    {
    6.       using var reader = new SerializedObjectReader(buffer, json.Length, GetDefaultConfigurationForString(json, parameters));
    7.       reader.Read(out var view);
    8.       return JsonSerialization.FromJson<T>(view, parameters);
    9.     }
    10. }
    11.  
    12. /// copied from internal method in JsonSerialization
    13. static SerializedObjectReaderConfiguration GetDefaultConfigurationForString(string json, JsonSerializationParameters parameters = default)
    14. {
    15.      var configuration = SerializedObjectReaderConfiguration.Default;
    16.  
    17.      configuration.UseReadAsync = false;
    18.      configuration.ValidationType = parameters.DisableValidation ? JsonValidationType.None : parameters.Simplified ? JsonValidationType.Simple : JsonValidationType.Standard;
    19.      configuration.BlockBufferSize = math.max(json.Length * sizeof(char), 16);
    20.      configuration.TokenBufferSize = math.max(json.Length / 2, 16);
    21.      configuration.OutputBufferSize = math.max(json.Length * sizeof(char), 16);
    22.      configuration.StripStringEscapeCharacters = parameters.StringEscapeHandling;
    23.  
    24.      return configuration;
    25. }
     
    Last edited: Nov 22, 2023
    martinpa_unity likes this.
  16. martinpa_unity

    martinpa_unity

    Unity Technologies

    Joined:
    Oct 18, 2017
    Posts:
    484
    Hi @Canijo, I'm glad you were able to find a way to resolve the issue. I'm not aware of anything specifically in the serialization package that shouldn't work in that call. It uses a lot of jobs, though.

    I've tried running simple jobs from that method and I've gotten crashes on domain reload. So I think, as you said, that something is not ready in Jobs.
     
    Canijo likes this.
  17. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    Well, as long as we have a temporary workaround, it's no biggie =).

    Edit: Also, running the original method in a Player build without the fix is not a problem, because Jobs is actually ready there; it's only an issue in the Editor because of the order in which things happen during AssemblyReloads. I've actually banged my head against this particular issue a number of times: there is no way to run *anything* before deserialization happens for UnityObjects that were active during the reload. It's annoying because it forces me to lazy-initialize some static classes whenever they are called, so they are "ready" if and when serialization needs them. But they can't always fully initialize, because even though Editor deserialization is effectively single-threaded, the Editor still prevents access to some Unity APIs on the assumption that you might be on the loading thread (deserialization). So I end up needing a "double initialization" where the non-Unity side can lazy-initialize, but the side that touches Unity has to wait until [InitializeOnLoadMethod]. But that's a problem for another day xD
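    The "double initialization" pattern described here could be sketched roughly like this (all names are invented for illustration, not code from the thread):

    ```csharp
    using UnityEditor;

    // Hypothetical sketch: the half that serialization may need is
    // lazy-initialized and touches no Unity APIs, while the
    // Unity-dependent half waits for [InitializeOnLoadMethod].
    static class MyTypeRegistry
    {
        static bool s_DataReady;

        // Safe to call from deserialization callbacks: pure C#, no Unity APIs.
        public static void EnsureDataReady()
        {
            if (s_DataReady)
                return;
            // ... build lookup tables, register adapters, etc.
            s_DataReady = true;
        }

        // The Unity-API-dependent half runs once the Editor is actually ready.
        [InitializeOnLoadMethod]
        static void InitializeEditorSide()
        {
            EnsureDataReady();
            // ... now it is safe to touch AssetDatabase, EditorPrefs, etc.
        }
    }
    ```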
     
    Last edited: Nov 23, 2023
  18. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    Posting some things that I've found while using this, in case it gets revisited.
    • Exceptions thrown during JsonSerialization are wrapped into a DeserializationEvent that is later re-thrown. While I think this can be helpful, the re-throw makes us lose the stack trace and any meaningful information. I believe we could save some stressed programmers time if all exceptions were captured into an ExceptionDispatchInfo so the stack trace is preserved: they could still be grouped into an AggregateException, or, if there is only one, thrown directly through ExceptionDispatchInfo.Throw(), which preserves all that beautiful stack trace.
    • Some validation errors concerning ObjectScope or ArrayScope could be debugged more easily if you dumped the incomplete JSON that failed to be properly produced. These validation errors are normally thrown outside of the code that caused the problem, when scopes are being disposed and the writer sees that it is missing some closures. Being able to see the incomplete JSON would, I think, make it easier to deduce where the errors actually started.
    • IContravariantAdapter could avoid boxing its De/SerializationContext if the methods accepted the context as a generic with a struct & interface constraint, maintaining the benefits of a readonly struct passed through the "in" keyword (not fully true until .NET gains a readonly-struct constraint, soz xD), like:
      Code (CSharp):
      1. public interface IContravariantBinaryAdapter<in TValue> : IBinaryAdapter  
      2. {
      3.     void Serialize<TContext>(in TContext context, TValue value)
      4.             where TContext : struct, IBinarySerializationContext;
      5.  
      6.     object Deserialize<TContext>(in TContext context)
      7.              where TContext : struct, IBinaryDeserializationContext;
      8. }
      9.  
    • Deserializing into an existing instance is actually implemented but not publicly exposed. I'm not referring to JsonSerialization.FromJsonOverride, but rather to the DeserializeValue<T>() methods in both the Binary & Json contexts. This would let us deserialize into readonly reference fields, into fields initialized in the object constructor, or just plainly overwrite some object (imagine an Undo-like system, this would be perfect for it), just like you do inside the visitor. Without it, this is too cumbersome to implement ourselves, and all the code already exists on your side.
      Code (CSharp):
      1. /// existing method
      2. public T DeserializeValue<T>()
      3. {
      4.     var value = default(T);
      5.     m_Visitor.ReadValue(ref value);
      6.     return value;
      7. }
      8.  
      9. /// My desired method overload
      10. public void DeserializeValue<T>(ref T value)
      11. {
      12.     m_Visitor.ReadValue(ref value);
      13. }
    • Adapters for types that we want to serialize as references cannot be stateless, and thus can never be global. What I mean is that there is no way to share data between adapters within a single serialization process without configuring them beforehand. Any adapter that wants to serialize a value as a reference will probably be implemented by embedding the instance data the first time it is written and referencing it by ID in further writes within the same serialization process, just as you do with your SerializedReferences inside your visitor. This is not really a blocking problem, since you can give your adapters Prepare & Finish methods that pass around some state object for shared data and clear it on Finish. But it would be ideal if we could just pass an "object" to the serialization entry point (like you do with Migrations for Json) that is later accessible from within the serialization/deserialization context, or maybe even store "key-value" pairs there. Then adapters wouldn't need any local state, since we could store it in the context, and some custom adapters could be registered as global without any per-serialization configuration. I hope I've managed to explain what I mean.
    • JsonSerializationState can be used to share references between different serialization calls, but the binary equivalent is internal. I'm not personally using it, but I believe that's an oversight?
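    The ExceptionDispatchInfo idea from the first bullet can be illustrated with plain C# (a minimal sketch, not the package's actual code):

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.Runtime.ExceptionServices;

    static class RethrowSketch
    {
        // Re-throws captured exceptions without destroying their stack traces:
        // a single exception is re-thrown via ExceptionDispatchInfo.Throw(),
        // which preserves the original trace; multiple exceptions are grouped
        // into an AggregateException.
        public static void ThrowAll(List<Exception> captured)
        {
            if (captured.Count == 1)
                ExceptionDispatchInfo.Capture(captured[0]).Throw();
            if (captured.Count > 1)
                throw new AggregateException(captured);
        }
    }
    ```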
    Loving this package :):)
     
    Last edited: Dec 13, 2023
  19. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    Also, about validation errors: in the FromJson method, errors are collected into the DeserializationEvents, so we have a way to get notified when multiple errors happen. But in ToJson, validation errors from ObjectScope / ArrayScope are thrown from within the Dispose method, so they steal whatever exception triggered that Dispose in the first place, completely hiding the underlying problem and forcing you to step through the code until you can catch the exception you actually care about. It'd be nice if those scopes did not throw on Dispose, and instead just logged the error and prevented further writing in some other way, so the real exceptions can actually propagate.
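    The masking behaviour described here is simply how C# `using` scopes behave when Dispose throws; a minimal repro outside the package:

    ```csharp
    using System;

    // When Dispose() throws while another exception is already propagating,
    // the Dispose exception replaces it and the original is lost.
    class ThrowingScope : IDisposable
    {
        public void Dispose() => throw new InvalidOperationException("scope missing closures");
    }

    static class MaskingDemo
    {
        static void Main()
        {
            try
            {
                using var scope = new ThrowingScope();
                throw new Exception("the real problem"); // this one disappears
            }
            catch (Exception e)
            {
                Console.WriteLine(e.Message); // prints "scope missing closures"
            }
        }
    }
    ```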
     
  20. yu_yang

    yu_yang

    Joined:
    May 3, 2015
    Posts:
    85
    1. This is really helpful, but I still don't know how to implement deserialization. Could you provide some examples of deserialization?

    2. The documentation doesn't provide details on how generics and collection types are handled/supported. Could you briefly introduce that?

    3. I seem to have found a bug. The following code executes successfully:
    Code (CSharp):
    1. Dictionary<string, float> d = new() { { "Key", 1f } };
    2. Debug.Log(JsonSerialization.ToJson(d));
    But if you change 1f to float.PositiveInfinity, the following error occurs:
    "InvalidOperationException: WriteValue can only be called as a root element, array element, or after WriteKey."

    4. Please make BinarySerialization support System.IO.Stream. A lot of existing code is based on this type and its derived types, not on UnsafeAppendBuffer, so extra copy operations are currently necessary between them. Additionally, it seems that the FromBinaryOverride method is missing.
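    Regarding point 3: JSON has no literal for Infinity or NaN, which is likely why the writer rejects the value. Until that is handled in the package, one workaround could be a global adapter that writes non-finite floats as strings. This is an untested sketch; `ContinueVisitation`, `SerializedValueView.Type`, and `AsFloat()` are assumed from the package's adapter API as used elsewhere in this thread:

    ```csharp
    using System.Globalization;
    using Unity.Serialization.Json;

    // Untested sketch: write non-finite floats as strings, since JSON has
    // no Infinity/NaN literals, and accept both forms when reading back.
    class NonFiniteFloatAdapter : IJsonAdapter<float>
    {
        public void Serialize(in JsonSerializationContext<float> context, float value)
        {
            if (float.IsNaN(value) || float.IsInfinity(value))
                context.Writer.WriteValue(value.ToString("R", CultureInfo.InvariantCulture));
            else
                context.ContinueVisitation(); // default handling for normal floats
        }

        public float Deserialize(in JsonDeserializationContext<float> context)
        {
            var view = context.SerializedValue;
            // Accept both the plain number and the string fallback.
            return view.Type == TokenType.String
                ? float.Parse(view.AsStringView().ToString(), CultureInfo.InvariantCulture)
                : view.AsFloat();
        }
    }
    ```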
     
    Last edited: Dec 22, 2023
  21. yu_yang

    yu_yang

    Joined:
    May 3, 2015
    Posts:
    85
    I just came up with an improvement idea for Unity Properties, not sure if it's feasible.

    Anyway, I checked the code generated by Unity Properties: for non-public fields, it uses Reflection and Emit to implement GetValue and SetValue. I was thinking, perhaps we could place the generated property class inside the container type, like this:

    Code (CSharp):
    1.  
    2. // User Type
    3. [GeneratePropertyBag]
    4. partial class Container
    5. {
    6.     [CreateProperty(insidePartial: true)]
    7.     int _field;
    8. }
    9.  
    10. // Generated
    11. partial class Container
    12. {
    13.     public class _field_XXXX_Property : Property<Container, int>
    14.     {
    15.         public override string Name => "_field";
    16.  
    17.         public override bool IsReadOnly => false;
    18.  
    19.         public override int GetValue(ref Container container)
    20.             => container._field;
    21.  
    22.         public override void SetValue(ref Container container, int value)
    23.             => container._field = value;
    24.     }
    25. }
    26.  
    Firstly, CreateProperty could accept a parameter insidePartial. If set to true, the generated property class would be located inside the container, enabling direct access to non-public members of the container without the need for Reflection and Emit.

    The only issue is that the container may not have the partial modifier. In such cases, if insidePartial is true, CreateProperty would check whether the container type and all the types containing it have the partial modifier. If not, it would set insidePartial to false and output a warning reminding the user to consider adding the partial modifier.

    Finally, regardless of whether this idea is feasible, fields with the internal modifier should be accessible directly, similar to public fields, without the need for Reflection and Emit.
     
    Last edited: Dec 24, 2023
  22. martinpa_unity

    martinpa_unity

    Unity Technologies

    Joined:
    Oct 18, 2017
    Posts:
    484
    Hi @yu_yang, if you add the partial keyword to the type tagged with [GeneratePropertyBag], the property bag will be generated as a nested type. For example, this:

    Code (CSharp):
    1. [assembly:Unity.Properties.GeneratePropertyBagsForAssembly]
    2. [Unity.Properties.GeneratePropertyBag]
    3. public partial class MyClass
    4. {
    5.     public float value;
    6. }
    Should output something like:

    Code (CSharp):
    1. partial class MyClass
    2. {
    3.     internal static void RegisterMyClass_7f11a2605bf9464c8f93f66930bb242a_PropertyBag()
    4.     {
    5.         global::Unity.Properties.PropertyBag.Register(new global::MyClass.MyClass_7f11a2605bf9464c8f93f66930bb242a_PropertyBag());
    6.     }
    7.  
    8.     [global::System.Runtime.CompilerServices.CompilerGenerated]
    9.     sealed class MyClass_7f11a2605bf9464c8f93f66930bb242a_PropertyBag : global::Unity.Properties.ContainerPropertyBag<global::MyClass>
    10.     {
    11.         public MyClass_7f11a2605bf9464c8f93f66930bb242a_PropertyBag()
    12.         {
    13.             AddProperty(new value_Property());
    14.         }
    15.  
    16.         [global::System.Runtime.CompilerServices.CompilerGenerated]
    17.         class value_Property : global::Unity.Properties.Property<global::MyClass, float>
    18.         {
    19.             public override string Name => "value";
    20.             public override bool IsReadOnly => false;
    21.  
    22.             public override float GetValue(ref global::MyClass container) => container.value;
    23.             public override void SetValue(ref global::MyClass container, float value) => container.value = value;
    24.         }
    25.     }
    26. }
    Hope this helps!
     
    yu_yang likes this.
  23. yu_yang

    yu_yang

    Joined:
    May 3, 2015
    Posts:
    85
    Great! Details like these really should be documented!
     
  24. martinpa_unity

    martinpa_unity

    Unity Technologies

    Joined:
    Oct 18, 2017
    Posts:
    484
    It doesn't go into great detail, but this page indicates that making the type partial will allow the property bag to access internal and private fields and properties.
     
    yu_yang likes this.
  25. yu_yang

    yu_yang

    Joined:
    May 3, 2015
    Posts:
    85
    Wait a moment, the situation seems a bit off. My test code is like this:

    Code (CSharp):
    1. [GeneratePropertyBag]
    2. partial class Container
    3. {
    4.     [CreateProperty]
    5.     int _field;
    6.     [GeneratePropertyBag]
    7.     partial class Container2
    8.     {
    9.         [CreateProperty]
    10.         int _field;
    11.     }
    12. }
    But the generated code is like this (Please note that the property type still inherits from
    ReflectedMemberProperty
    ):

    Code (CSharp):
    1.     [GeneratePropertyBag]
    2.     internal class Container
    3.     {
    4.         [CreateProperty]
    5.         private int _field;
    6.  
    7.         internal static void RegisterContainer_f34c865745ed49bb85d97611bb05bfac_PropertyBag()
    8.         {
    9.             PropertyBag.Register<Container>((PropertyBag<Container>) new Container.Container_f34c865745ed49bb85d97611bb05bfac_PropertyBag());
    10.         }
    11.  
    12.         internal static void RegisterContainer2_da2e8613d194467f998744afe5a074b1_PropertyBag()
    13.         {
    14.             Container.Container2.RegisterContainer2_da2e8613d194467f998744afe5a074b1_PropertyBag();
    15.         }
    16.  
    17.         public Container()
    18.         {
    19.             base.\u002Ector();
    20.         }
    21.  
    22.         [GeneratePropertyBag]
    23.         private class Container2
    24.         {
    25.             [CreateProperty]
    26.             private int _field;
    27.  
    28.             internal static void RegisterContainer2_da2e8613d194467f998744afe5a074b1_PropertyBag()
    29.             {
    30.                 PropertyBag.Register<Container.Container2>((PropertyBag<Container.Container2>) new Container.Container2.Container2_da2e8613d194467f998744afe5a074b1_PropertyBag());
    31.             }
    32.  
    33.             public Container2()
    34.             {
    35.                 base.\u002Ector();
    36.             }
    37.  
    38.             [CompilerGenerated]
    39.             private sealed class Container2_da2e8613d194467f998744afe5a074b1_PropertyBag :
    40.                 ContainerPropertyBag<Container.Container2>
    41.             {
    42.                 public Container2_da2e8613d194467f998744afe5a074b1_PropertyBag()
    43.                 {
    44.                     base.\u002Ector();
    45.                     this.AddProperty<int>((Property<Container.Container2, int>) new Container.Container2.Container2_da2e8613d194467f998744afe5a074b1_PropertyBag._field_Property());
    46.                 }
    47.  
    48.                 [CompilerGenerated]
    49.                 private class _field_Property : ReflectedMemberProperty<Container.Container2, int>
    50.                 {
    51.                     public _field_Property()
    52.                     {
    53.                         base.\u002Ector(typeof (Container.Container2).GetField("_field", BindingFlags.Instance | BindingFlags.NonPublic), "_field");
    54.                     }
    55.                 }
    56.             }
    57.         }
    58.  
    59.         [CompilerGenerated]
    60.         private sealed class Container_f34c865745ed49bb85d97611bb05bfac_PropertyBag :
    61.             ContainerPropertyBag<Container>
    62.         {
    63.             public Container_f34c865745ed49bb85d97611bb05bfac_PropertyBag()
    64.             {
    65.                 base.\u002Ector();
    66.                 this.AddProperty<int>((Property<Container, int>) new Container.Container_f34c865745ed49bb85d97611bb05bfac_PropertyBag._field_Property());
    67.             }
    68.  
    69.             [CompilerGenerated]
    70.             private class _field_Property : ReflectedMemberProperty<Container, int>
    71.             {
    72.                 public _field_Property()
    73.                 {
    74.                     base.\u002Ector(typeof (Container).GetField("_field", BindingFlags.Instance | BindingFlags.NonPublic), "_field");
    75.                 }
    76.             }
    77.         }
    78.     }
    I'm not certain if this is a version issue. My Unity version is 2022.3.14f1.
     
  26. martinpa_unity

    martinpa_unity

    Unity Technologies

    Joined:
    Oct 18, 2017
    Posts:
    484
    Not a version thing, this is a bug in the source generator. It should be able to generate code that does not use reflection when it's generated as a nested type.

    I'll get that fixed once I'm back from the holidays!
     
    yu_yang likes this.
  27. Onigiri

    Onigiri

    Joined:
    Aug 10, 2014
    Posts:
    490
    I can't figure out how to change the key with adapters. I created this adapter for my data class:

    Code (CSharp):
    1.     class ItemDataAdapter : IJsonAdapter<ItemData>
    2.     {
    3.         void IJsonAdapter<ItemData>.Serialize(in JsonSerializationContext<ItemData> context, ItemData itemData)
    4.         {
    5.             //context.Writer.WriteKey("Id"); error
    6.             //context.SerializeValue("Id", itemData.Id); error
    7.             //context.Writer.WriteKeyValue("Id", itemData.Id); error
    8.             context.Writer.WriteValue(itemData.Id);
    9.         }
    10.     }
    But I'm getting
    InvalidOperationException: WriteEndArray can only called after WriteBeginArray or WriteValue.
    when trying to change the key.
     
  28. yu_yang

    yu_yang

    Joined:
    May 3, 2015
    Posts:
    85
    After testing, I found that after calling ToBinary, modifying the order of fields in the code, and then calling FromBinary, the object cannot be correctly restored. Is it because serialized properties have no names? If so, it would be best to provide an option to serialize property names, with the default being true, to ensure that data can be correctly restored across version changes.
     
  29. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    From what I've seen, currently only the Json adapter has the default Unity behaviour where "renamed" properties marked with [FormerName] get picked up. For anything more complex, still in Json, you can implement IJsonMigration, which is great.

    I don't think they will implement what you mean for Binary, because the non-versioned behaviour is actually quite logical.

    Json is a structured format; any serialized data makes sense on its own. It's a collection of key-value properties following JavaScript's object notation (JavaScript Object Notation = JSON), so it doesn't matter where you read it, it will always make sense.

    Binary, on the other hand, is not a format, just a way of saying the output is raw bytes. You could write a BinaryJson wrapper that emits json-formatted data as binary, though.

    Part of the beauty of the BinarySerialization in this package is the crazy speed that comes from not needing to "format" anything, just reading and writing class definitions directly thanks to Unity.Properties.

    Also, @CodeSmile showed above a custom implementation of BinaryAdapters that takes versioning into account, which might be helpful!
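    For reference, the versioning idea can be sketched as a binary adapter that prefixes the payload with a version byte and branches on it when reading. This is a hypothetical sketch: SaveData and its fields are invented, and UnsafeAppendBuffer's Add/ReadNext calls are assumed from Unity.Collections:

    ```csharp
    using Unity.Serialization.Binary;

    // Invented example type; Mana is imagined as a field added in "version 2".
    class SaveData
    {
        public int Health;
        public int Mana;
    }

    unsafe class SaveDataAdapter : IBinaryAdapter<SaveData>
    {
        const byte CurrentVersion = 2;

        public void Serialize(in BinarySerializationContext<SaveData> context, SaveData value)
        {
            // Version byte first, then the fields in a fixed order.
            context.Writer->Add(CurrentVersion);
            context.Writer->Add(value.Health);
            context.Writer->Add(value.Mana);
        }

        public SaveData Deserialize(in BinaryDeserializationContext<SaveData> context)
        {
            var version = context.Reader->ReadNext<byte>();
            var result = new SaveData { Health = context.Reader->ReadNext<int>() };
            // Fields added in later versions are skipped/defaulted for old payloads.
            if (version >= 2)
                result.Mana = context.Reader->ReadNext<int>();
            return result;
        }
    }
    ```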
     
  30. yu_yang

    yu_yang

    Joined:
    May 3, 2015
    Posts:
    85
    Version changes complicate binary serialization. Due to the likelihood of version changes, even if they don't ultimately occur, it's necessary to anticipate this possibility. Therefore, binary serialization should provide an appropriate way to handle version changes effectively. Note that it should be simple enough, preferably fully automated (similar to Json serialization), and at least simpler than manually writing a BinaryReader; otherwise, it loses the purpose of using Unity Properties.

    Keep in mind that the primary goal of using Unity Properties should be automation, with performance as a secondary consideration. If automation isn't achievable, custom approaches can be used to trade for performance. Automation, in this context, means being able to work out of the box in most scenarios and consistently in a robust way, which is crucial for real projects.
     
  31. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    I don't agree with the "should"; I don't think that is explicitly stated by Unity. From my POV, Unity.Properties is just an API for object-graph visitation, leveraging source generators to make it extremely performant. The problem you are describing applies only to the BinaryAdapter implementation in Unity.Serialization.

    The package provides two ways to serialize any data. The first outputs a structured, universally known format; the other outputs an extremely compact binary stream.

    Implementing BinarySerialization with a format that is resilient to class-definition changes means a complete overhaul of the system and a huge increase in output size, which in turn can defeat its purpose for many high-performance use cases.

    Consider a simple class with just two Int32 fields. If the target class is known at deserialization time (DisableRootAdapters = true), the output is just 4 bytes per field, 8 bytes in total.

    Now consider a strong binary format that is resilient to changes. For it to be resilient, you need to serialize property names, and then you also need either the property type or the size of the written data, so you know where a property ends and the next starts; writing the data size seems the simpler approach. You are now serializing at least 1 byte for each character in the property name (I don't know the specifics of chars as bytes, but I think they can also be 2 bytes per char), plus the string's total length as another 4 bytes (or 2 bytes using ushort, or 1 byte if you cap names at 255 chars, which is very reasonable), plus 4 bytes for the Int32 holding the total written size of the property value, plus the value itself. You are easily doubling the output size, probably much more. Plus the overhead of preparing a SerializedValueView (like the Json counterpart) for the user to consume.

    At least for myself, I much prefer the current binary approach. If you are somehow constrained to a binary output and want a version-resilient format, you could actually just use the JsonSerialization.ToJson overload that takes in a JsonWriter. Then, instead of converting the JsonWriter to a string, use the GetUnsafeReadOnlyPtr() method plus Length to write into a binary stream. For deserialization, create a SerializedObjectReader from that binary stream, pass it to JsonSerialization.FromJson(), and voilà. Also, since your output is binary and you won't care about it being human-readable, you could disable StringEscapeHandling and enable Simplified and Minified in the JsonSerializationParameters, which is much faster than traditional JSON.
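    The serialization half of those steps could look roughly like this. Untested sketch: the exact JsonWriter constructor and the GetUnsafeReadOnlyPtr()/Length members are assumed from the description above, so treat every API name here as something to verify:

    ```csharp
    using Unity.Collections;
    using Unity.Collections.LowLevel.Unsafe;
    using Unity.Serialization.Json;

    static unsafe class JsonAsBinarySketch
    {
        // Serialize through the JsonWriter overload and copy its raw char
        // buffer out as bytes instead of building a managed string.
        public static byte[] ToVersionResilientBytes<T>(T value)
        {
            using var writer = new JsonWriter(Allocator.Temp);
            JsonSerialization.ToJson(writer, value); // writer overload mentioned above

            var bytes = new byte[writer.Length * sizeof(char)];
            fixed (byte* dst = bytes)
                UnsafeUtility.MemCpy(dst, writer.GetUnsafeReadOnlyPtr(), bytes.Length);
            return bytes;
        }
    }
    ```

    Deserialization would mirror this: wrap the bytes in a SerializedObjectReader and pass the resulting view to JsonSerialization.FromJson, much like the FixedDeserialize workaround posted earlier in the thread.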
     
    Last edited: Dec 27, 2023
  32. yu_yang

    yu_yang

    Joined:
    May 3, 2015
    Posts:
    85
    It seems there is no built-in Clone method? Since the properties have already been collected in advance, I think an efficient cloning method could be implemented without resorting to serialization/deserialization.

    BTW, is it safe to use FromJson/FromBinary on non-main threads?
     
    Last edited: Jan 3, 2024
  33. martinpa_unity

    martinpa_unity

    Unity Technologies

    Joined:
    Oct 18, 2017
    Posts:
    484
    I'm back from the holidays, so I'll try to answer some of the feedback and questions.

    @yu_yang

    Here is a full example for the player class I used for the serialization example:

    Code (CSharp):
    1. using System;
    2. using Unity.Serialization.Json;
    3. using UnityEditor;
    4. using UnityEngine;
    5.  
    6. public struct int2
    7. {
    8.     public float x;
    9.     public float y;
    10.  
    11.     public int2(float x, float y)
    12.     {
    13.         this.x = x;
    14.         this.y = y;
    15.     }
    16. }
    17.  
    18. public enum ItemType
    19. {
    20.     Weapon,
    21.     Armor,
    22.     Consumable
    23. }
    24.  
    25. public class Item
    26. {
    27.     public string Name;
    28.     public ItemType Type;
    29. }
    30.  
    31. public class Player
    32. {
    33.     public string Name;
    34.     public int Health;
    35.     public int2 Position;
    36.     public Item[] Inventory;
    37. }
    38.  
    39. public static class Test
    40. {
    41.     [InitializeOnLoadMethod]
    42.     public static void RunTest()
    43.     {
    44.         JsonSerialization.AddGlobalAdapter(new Adapter());
    45.        
    46.         EditorApplication.delayCall += () =>
    47.         {
    48.             var player = new Player
    49.             {
    50.                 Name = "Bob",
    51.                 Health = 100,
    52.                 Position = new int2(10, 20),
    53.                 Inventory = new[]
    54.                 {
    55.                     new Item {Name = "Sword", Type = ItemType.Weapon},
    56.                     new Item {Name = "Shield", Type = ItemType.Armor},
    57.                     new Item {Name = "Health Potion", Type = ItemType.Consumable}
    58.                 }
    59.             };
    60.  
    61.             var json = JsonSerialization.ToJson(player);
    62.             Debug.Log(json);
    63.  
    64.             var player2 = JsonSerialization.FromJson<Player>(json);
    65.         };
    66.     }
    67.  
    68.     public class Adapter :
    69.         IJsonAdapter<Player>
    70.         , IJsonAdapter<int2>
    71.         , IJsonAdapter<Item>
    72.     {
    73.         public void Serialize(in JsonSerializationContext<int2> context, int2 value)
    74.         {
    75.             // Serializes as a value as a custom string.
    76.             context.Writer.WriteValue($"{value.x}, {value.y}");
    77.         }
    78.  
    79.         public int2 Deserialize(in JsonDeserializationContext<int2> context)
    80.         {
    81.             // Since we serialized a custom string for the value, we can convert the serialized value to a string
    82.             // and extract the 'int2' manually.
    83.             // This could be optimized to avoid allocations on the string operations.
    84.             var stringValue = context.SerializedValue.AsStringView().ToString();
    85.             var values = stringValue.Split(",");
    86.             if (values.Length != 2)
    87.             {
    88.                 throw new InvalidJsonException("Expected two values to deserialize a 'int2' type");
    89.             }
    90.  
    91.             return new int2(float.Parse(values[0]), float.Parse(values[1]));
    92.         }
    93.  
    94.         public void Serialize(in JsonSerializationContext<Item> context, Item value)
    95.         {
    96.             // Serializes as a value as a custom string.
    97.             context.Writer.WriteValue($"{value.Name}-{value.Type}");
    98.         }
    99.  
    100.         public Item Deserialize(in JsonDeserializationContext<Item> context)
    101.         {
    102.             // Since we serialized a custom string for the value, we can convert the serialized value to a string
    103.             // and extract the 'Item' manually.
    104.             // This could be optimized to avoid allocations on the string operations.
    105.             var stringValue = context.SerializedValue.AsStringView().ToString();
    106.             var values = stringValue.Split("-");
    107.             if (values.Length != 2)
    108.             {
    109.                 throw new InvalidJsonException("Expected two values to deserialize a 'Item' type");
    110.             }
    111.  
    112.             return new Item
    113.             {
    114.                 Name = values[0],
    115.                 Type = Enum.Parse<ItemType>(values[1])
    116.             };
    117.         }
    118.  
    119.         public void Serialize(in JsonSerializationContext<Player> context, Player value)
    120.         {
    121.             using var objectScope = context.Writer.WriteObjectScope();
    122.             // Most primitives can write key-value pairs directly.
    123.             context.Writer.WriteKeyValue("N", value.Name);
    124.             context.Writer.WriteKeyValue("H", value.Health);
    125.  
    126.             // Sub-objects can use the `SerializeValue` method.
    127.             context.Writer.WriteKey("P");
    128.             context.SerializeValue(value.Position);
    129.  
    130.             // Arrays can use the `SerializeValue` method.
    131.             context.Writer.WriteKey("I");
    132.             context.SerializeValue(value.Inventory);
    133.         }
    134.  
    135.         public Player Deserialize(in JsonDeserializationContext<Player> context)
    136.         {
    137.             // Here, since we have manually written all the fields/properties using alternative names,
    138.             // we can simply deserialize each field/property using that name and type.
    139.             var player = new Player
    140.             {
    141.                 Name = context.DeserializeValue<string>(context.SerializedValue["N"]),
    142.                 Health = context.DeserializeValue<int>(context.SerializedValue["H"]),
    143.                 Position = context.DeserializeValue<int2>(context.SerializedValue["P"]),
    144.                 Inventory = context.DeserializeValue<Item[]>(context.SerializedValue["I"])
    145.             };
    146.  
    147.             return player;
    148.         }
    149.     }
    150. }
    Simply put, when deserializing, you can access the serialized data view through the
    context.SerializedValue
    property. This serialized data view can either be converted to a known type using the
    As[...]
    methods, or you can navigate through nested fields/properties by using the
    []
    operator or the
    GetValue
    method.

    For a given
    SerializedValueView
    , if you know the type, you can use
    context.DeserializeValue<T>(view)
    to deserialize the value. I believe primitives need to go through the
    As[...]
    methods.
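    To make that concrete, here is a hedged sketch of both access styles inside an adapter's Deserialize method (the Player type and the "N"/"P" keys are taken from the snippet above; check the names against your own types):

```csharp
public Player Deserialize(in JsonDeserializationContext<Player> context)
{
    // The serialized data view for the whole object.
    var view = context.SerializedValue;

    // Primitives go through the As[...] methods.
    var name = view["N"].AsStringView().ToString();

    // Known non-primitive types can go through DeserializeValue<T>.
    var position = context.DeserializeValue<int2>(view["P"]);

    return new Player { Name = name, Position = position };
}
```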

    Unless I misunderstood what you are asking, I believe there is an example in the snippet I provided.

    Seems like a valid bug. I'll take a note. Again, a fix might take a bit of time while we figure out what will happen with the package now that the owner has left Unity.

    The reason we introduced
    UnsafeAppendBuffer
    was because we needed something that was burst compatible. Most of the serialization package was written with Burst in mind.

    A clone operation is something that is often requested. It is however not something we're likely to add to the API. There are several reasons for this, but it mostly boils down to what should the cloning operation even do? Should it do a shallow clone preserving all references or a deep clone?

    Over the years, we've implemented multiple versions of the clone operation, and each version only served specific contexts and would be completely broken in others. I think it's for similar reasons that the ICloneable interface is not recommended for use in public APIs.

    In most cases, it's probably easier to write custom-tailored visitors for what you want to do.

    As far as I'm aware, it should be safe... as long as no calls to Unity are made and no Unity objects are created or assigned.

    Hope this helps!
     
    yu_yang and Onigiri like this.
  34. martinpa_unity

    martinpa_unity

    Unity Technologies

    Joined:
    Oct 18, 2017
    Posts:
    484
    @Canijo

    Using an
    ExceptionDispatchInfo
    is a very good suggestion.

    I'm not very familiar with the validation code in this package, but if we do have access to the context during validation, it would be a good idea to output it.

    I think this should be possible today. If I recall correctly, the reason we used a boxing interface in this case was that, at the time, IL2CPP didn't support full generic sharing and required a lot of AOT helpers to be generated.

    Changing this would be a breaking change however and would require another major version to be released.

    This should already be possible using the current adapters. The adapters are designed so that they can be used in both contexts. To override an instance, you can use
    context.GetInstance()
    , modify it and return it. It's limited, but it's a start.
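    A hedged sketch of that pattern, assuming a hypothetical Player adapter (the type and key names are illustrative; GetInstance and DeserializeValue are the members mentioned in this thread):

```csharp
public Player Deserialize(in JsonDeserializationContext<Player> context)
{
    // Reuse the instance provided by the caller, if any, instead of allocating.
    var player = context.GetInstance() ?? new Player();

    // Overwrite only the members present in the serialized data.
    player.Name = context.DeserializeValue<string>(context.SerializedValue["Name"]);

    return player;
}
```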

    This is true at the moment. The adapters do have access to the visitor, which contains the
    SerializedReferences
    , but it is not exposed. With additional validation added to
    SerializedReferences
    , it could be exposed.

    I think it's the other way around. What probably happened is that some internal feature required the binary one, so it was made public and we forgot to toggle the JSON one.

    Lots of good suggestions in there, thanks!
     
    Canijo likes this.
  35. yu_yang

    yu_yang

    Joined:
    May 3, 2015
    Posts:
    85
    A small suggestion: please provide a ToReadOnlySpan method. It would avoid allocations, and spans are widely supported by the .NET runtime.
     
  36. yu_yang

    yu_yang

    Joined:
    May 3, 2015
    Posts:
    85
    I am interested in
    Clone
    because, when writing custom tools or dealing with data that doesn't inherit from Unity's
    Object
    , a method similar to
    Object.Instantiate
    is needed (usually, shallow cloning can be achieved through MemberwiseClone). A
    Copy
    operation would also be useful, for example to reset state when using object pools.
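    For reference, a minimal sketch of the shallow clone via MemberwiseClone mentioned above (the type is hypothetical, plain C#):

```csharp
public class PoolItem
{
    public string Name;
    public int[] Buffer;

    // MemberwiseClone copies fields one by one: value-type fields are
    // duplicated, while reference-type fields (like 'Buffer') still point
    // to the same underlying object, i.e. a shallow clone.
    public PoolItem ShallowClone() => (PoolItem)MemberwiseClone();
}
```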

    I am also curious about one thing. Will Unity Properties & Serialization be used to improve the serialization and instantiation of GameObjects? Is it faster than the existing solutions? If you have conducted performance tests, developers would be very interested in the details.
     
  37. martinpa_unity

    martinpa_unity

    Unity Technologies

    Joined:
    Oct 18, 2017
    Posts:
    484
    Object.Instantiate
    is a good example of how a
    Clone
    operation is more complicated than it can seem:
    • The parent property of a cloned game object will not be set by default.
    • Prefab connection is not kept (in order to keep it, you must clone the game object by using
      PrefabUtility.InstantiatePrefab
      ).
    • The components of the game object are cloned, which requires a distinct way of creating/adding them (they are stored in a different storage and you can't simply call
      new MyComponent()
      and insert them in a list on the game object).
    • The children of the game object are also cloned.
    • References to other game objects/components will either be copied in a shallow way (copy the reference if it is outside of the cloned objects) or remapped (if the reference is inside of the cloned objects).
    • Instance ids are obviously not copied.
    • If you clone a
      MonoBehaviour
      using
      Object.Instantiate
      , you end up cloning all the game object, all of its components and all of its children.
    This is mostly off the top of my head and I'm probably missing a few things. Now this operation is possible because
    Object.Instantiate
    knows about all of this. I haven't checked the native code too much lately, but it probably goes through some of the serialization paths as well.

    When using the Entities package, trying to clone entities would be done in a completely different way, even if in essence, the relationship between Entities and their components is similar to the relationship of Game Objects and their components.

    While trying to create a generic
    Clone
    operation, we concluded that it required a fully customizable system where you could inject global, local and contextual rules. In the end, every single time, we ended up writing dedicated code.

    This is very unlikely, as they serve two different purposes and their "domain" so to speak is different as well.

    Properties in itself is not tied to serialization at all. It uses the same rules to automagically create properties from fields, but that was done mostly for "backwards compatibility" and to avoid needing to instrument existing types too much. I'm still on the fence on that. I think if I had to start over today, I would probably require explicit tags on fields/properties and completely decouple it from the serialization system.

    Properties also only has access to the managed side of things, whereas Unity has a lot of native components and data with light wrappers in managed.

    Serialization was made for runtime use-cases and to serialize types that Unity can't. In terms of performance, if we were to serialize scene data using Properties & Serialization instead of the classic Unity serialization, I would wager that it would be slower. It's a question of exposed features and trade-off:
    • Properties allows you to fully use C# properties instead of only fields/auto-properties. This means that the serialization/deserialization can go through arbitrary code. This indirection alone would make it slower.
    • Properties must create property bags for every type that it needs to use. Even if we were to use a source generator to codegen all the property bags at compilation time (which would increase the compilation time), there is a
      Mono.Jit
      cost to registering a property bag and visiting a type the first time. Unity can do a lot of the heavy lifting on the native side and avoid a lot of just-in-time compilation.
    • Properties & Serialization were created to be user facing, extensible features, including the ability to manually write your property bags, create properties that are not to be serialized, support read-only fields and properties, support migration, support adapters, etc. Extensibility usually has a cost. Unity serialization is a much more closed system with few extensibility points.
    • Properties & Serialization are managed-only libraries. Porting Unity to CoreCLR will help a ton here, but generally, it will probably still be slower than native code.
    • Properties & Serialization will treat every reference type as a reference, whereas Unity serialization will treat most types as a value type unless tagged with the
      [SerializeReference]
      attribute.
    But, as I said above, the aim of Properties & Serialization is different than Unity serialization. One aims to enable light-weight features (in comparison) that work in all contexts (editor, runtime, build) and the other is a low-level framework construct that powers a LOT of Unity features (inspector framework, undo-redo, prefabs, presets, etc.)

    Hope this helps!
     
    Canijo and yu_yang like this.
  38. yu_yang

    yu_yang

    Joined:
    May 3, 2015
    Posts:
    85
    In working with Properties, I noticed that the
    Accept
    methods of
    Property
    and
    PropertyBag
    accept visitor types as interfaces, which means that value-type visitors will be boxed. However, I need to use value-type visitors because each visitor has independent state.

    I could resolve this issue using an object pool or by using a stack/scope to cache the state of the previous visitor, but I wonder why not make
    Accept
    generic? Like this:

    Code (CSharp):
    1. void Accept<TVisitor>(ref TVisitor visitor) where TVisitor : IVisitor;
     
  39. martinpa_unity

    martinpa_unity

    Unity Technologies

    Joined:
    Oct 18, 2017
    Posts:
    484
    We did experiment with making everything go through generics (including the properties themselves) to be able to have unmanaged visitation. The issue, at the time, was that the support for open generic types in IL2CPP was limited and required writing a lot of ahead-of-time helper boilerplate to generate the proper combinations of types. While a lot of this boilerplate could be done on our side, deviating from the most simple use-case (i.e. calling a visitor from a visitor) would require users to write their own ahead-of-time helpers.

    The support has been greatly enhanced since then, so it might be possible to do today, but it would be a big breaking change.
     
    yu_yang likes this.
  40. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    As you're also answering questions related to Unity.Properties, here is one (maybe we should open a new thread about those)

    Any
    KeyValueCollectionPropertyBag
    , returns its contents as
    KeyValuePairProperty
    's, which makes perfect sense. However, as the ".Key" and ".Value" properties for a
    KeyValuePair
    are ReadOnly (which also makes sense, as that's what they are), any value-type Key and/or Value indexed on an IDictionary will be unwritable, and cannot be edited through the standard runtime bindings, as
    PropertyContainer.SetValue
    (concretely the
    PathVisitor
    implementation) won't be able to write back any modified value.

    I stumbled upon this while writing Editor tools and UI fields with the runtime bindings system and I am currently bypassing it by overriding
    UpdateSource<T>(...) 
    on a custom
    DataBinding 
    class, but I'm hoping you might have a recommended approach for dealing with this, or if
    KeyValueCollectionPropertyBag
    could eventually return a special-cased
    Property 
    that returns a
    KeyValuePairProperty
    which modifies the collection when modifying the Key/Value sub-properties, instead of returning a ReadOnly version for each one.

    The "Key" property might be problematic, as it would require removing/re-adding, and that probably raises some issues. But I believe the "Value" one should be safe to implement this way?
     
    Last edited: Jan 14, 2024
  41. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    Also some random comment about JsonSerialization => The
    SerializedReferencesVisitor 
    pre-pass is accessing ALL properties of the provided object graph, instead of only the ones that will effectively be serialized (it's visiting properties marked with "
    DontSerialize
    " or "
    NonSerialized
    ").

    For some objects that might contain non-serialized UnityObjects, this can potentially visit a lot of references that are not really needed, and also (my real concern) trigger property getters that might have custom behaviour not designed to run during serialization.
     
  42. Onigiri

    Onigiri

    Joined:
    Aug 10, 2014
    Posts:
    490
    Is it possible to parse only part of a JSON file without loading the whole file into memory?
     
  43. yu_yang

    yu_yang

    Joined:
    May 3, 2015
    Posts:
    85
    You can read to a specific location in the file using a custom approach and then pass the read
    string
    to
    FromJson
    . However, I think this is rarely worth it: reading while determining where to stop is slow, and it's hard to define where to stop. If the data is large, it is recommended to use binary (which makes partial reading easier) or to split it across multiple files. For text, reading the entire file at once is the most convenient.
     
    Onigiri likes this.
  44. martinpa_unity

    martinpa_unity

    Unity Technologies

    Joined:
    Oct 18, 2017
    Posts:
    484
    Hey @Canijo,

    The reason it's this way is that we wanted the
    Property
    to be stateless as much as possible, because that way, you can use the same
    Property
    instance to get the value from an object and set it on a different object. For the
    Dictionary
    , it makes things a little bit more complicated to use, however returning a
    Property
    that holds a reference to an object would be very inconsistent with the rest of the API.

    For data binding, sadly, at the moment, my recommendation so far has been to go through the
    UpdateSource
    route. I would personally opt to create a
    DictionaryView
    similar to
    ListView
    (but using a
    MultiColumnListView
    ), but it's not always possible. I've explored doing that and I ended up accessing/editing the
    Dictionary
    through a custom iterator using a type like:

    Code (CSharp):
    1. public struct Container
    2. {
    3.     [CreateProperty]
    4.     public TKey Key { get; init; }
    5.  
    6.     public DictionaryField<TKey, TValue> DictionaryField { get; init; }
    7.  
    8.     [CreateProperty]
    9.     public TValue Value
    10.     {
    11.         get => DictionaryField.value[Key];
    12.         set
    13.         {
    14.             if (EqualityComparer<TValue>.Default.Equals(Value, value))
    15.                 return;
    16.  
    17.             DictionaryField.value[Key] = value;
    18.         }
    19.     }
    20. }
    Which ends up sort of creating a "stateful property".

    I remember there was a reason for it, but I don't remember what it was. But fair enough, it should probably not do that.

    Hope this helps!
     
  45. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    I've lately been doing a lot of Editor work and I've ended up solving this particular problem by creating a custom
    PathVisitor
    , based on the implementation of the default
    PathVisitor
    , but with it also being an
    IDictionaryPropertyBagVisitor
    , so it can detect dictionaries while traversing and get the proper generics. I later derive
    SetValueVisitor
    's for some SetValue calls and a
    ConvertibleSetValueVisitor
    with
    ConverterGroup 
    support for UI Bindings.

    When visiting a Dictionary, if the Path goes through a Key it will check if the property is effectively a
    KeyValuePair
    property, and instead of entering its property bag, it will manually enter either a custom
    KvpKeyProperty
    or a
    KvpValueProperty 
    that are not ReadOnly. Instead of writing to the Dictionary and having to store it, these properties are stateless thanks to the "container" (in this case the
    KeyValuePair
    ) being passed by ref. So when you "SetValue" on the property, they set a new
    KeyValuePair 
    into the "ref" kvp that was passed in.

    This way, when the visitor is not on a "ReadonlyVisit", the new Kvp can be written back to the dictionary (as the default kvp property is not ReadOnly). I also modified that particular write to be a "remove the previous key" followed by an "insert the new KeyValuePair", so you cannot accidentally remove a Value by overwriting a Key. I also added a helper virtual method, "
    IsKeyChangeAllowed
    " to later implement validation in my
    DictionaryField
    using a derived visitor in the Bindings for Keys that can check if there is a Key conflict and notify the user without changing the field.

    These are the methods that are different to the default PathVisitor:

    Code (CSharp):
    1.    
    2. void IDictionaryPropertyBagVisitor.Visit<TDictionary, TKey, TValue>(IDictionaryPropertyBag<TDictionary, TKey, TValue> bag, ref TDictionary container)
    3.         {
    4.             var pathPart = _path[_currentPathIndex++];
    5.             IProperty<TDictionary> property;
    6.             switch (pathPart.Kind, bag)
    7.             {
    8.                 case (PropertyPathPartKind.Name, INamedProperties<TDictionary> named):
    9.                     if (named.TryGetProperty(ref container, pathPart.Name, out property))
    10.                     {
    11.                         property.Accept(this, ref container);
    12.                         return;
    13.                     }
    14.                     break;
    15.                 case (PropertyPathPartKind.Index, IIndexedProperties<TDictionary> indexed):
    16.                     if (indexed.TryGetProperty(ref container, pathPart.Index, out property))
    17.                         using (new AttributesScope(property, _property))
    18.                         {
    19.                             property.Accept(this, ref container);
    20.                             return;
    21.                         }
    22.                     break;
    23.                 case (PropertyPathPartKind.Key, IKeyedProperties<TDictionary, object> keyed):
    24.                     if (keyed.TryGetProperty(ref container, pathPart.Key, out property))
    25.                         using (new AttributesScope(property, _property))
    26.                         {
    27.                             if (property is Property<TDictionary, KeyValuePair<TKey, TValue>> kvpProperty)
    28.                                 VisitWrittableKvp(kvpProperty, ref container);
    29.                             else property.Accept(this, ref container);
    30.  
    31.                             return;
    32.                         }
    33.                     break;
    34.             }
    35.  
    36.             _returnCode = VisitReturnCode.InvalidPath;
    37.         }
    38.  
    39.         void VisitWrittableKvp<TDictionary, TKey, TValue>(Property<TDictionary, KeyValuePair<TKey, TValue>> property, ref TDictionary container)
    40.                     where TDictionary : IDictionary<TKey, TValue>
    41.         {
    42.             var kvp = property.GetValue(ref container);
    43.             if (_currentPathIndex >= _path.Length)
    44.             {
    45.                 VisitPath(property, ref container, ref kvp);
    46.                 return;
    47.             }
    48.  
    49.             var previousKey = kvp.Key;
    50.  
    51.             using (new PropertyScope(this, property))
    52.             {
    53.                 var pathPart = _path[_currentPathIndex++];
    54.                 if (pathPart.Kind is not PropertyPathPartKind.Name)
    55.                 {
    56.                     _returnCode = VisitReturnCode.InvalidPath;
    57.                     return;
    58.                 }
    59.                 bool isKeyVisit = false;
    60.  
    61.                 const string KeyName = "Key";
    62.                 const string ValueName = "Value";
    63.  
    64.                 switch (pathPart.Name)
    65.                 {
    66.                     case KeyName:
    67.                         isKeyVisit = true;
    68.                         _hasEnteredDictionaryKey = true;
    69.                         var keyProperty = KvpKeyProperty<TKey, TValue>.s_Property;
    70.                         using (new AttributesScope(keyProperty, _property))
    71.                             keyProperty.Accept(this, ref kvp);
    72.                         break;
    73.                     case ValueName:
    74.                         var valueProperty = KvpValueProperty<TKey, TValue>.s_Property;
    75.                         using (new AttributesScope(valueProperty, _property))
    76.                             valueProperty.Accept(this, ref kvp);
    77.                         break;
    78.                     default:
    79.                         _returnCode = VisitReturnCode.InvalidPath;
    80.                         return;
    81.                 }
    82.  
    83.                 if (!_readonlyVisit)
    84.                 {
    85.                     if (!isKeyVisit)
    86.                     {
    87.                         /// Values can be modified freely
    88.                         property.SetValue(ref container, kvp);
    89.                     }
    90.                     else if (!AreKeysEqual<TDictionary, TKey, TValue>(previousKey, kvp.Key, ref container)
    91.                           && IsKeyChangeAllowed(container, previousKey, ref kvp))
    92.                     {
    93.                         /// Keys will only be modified when they are different, and change is allowed
    94.                         container.Remove(previousKey);
    95.                         property.SetValue(ref container, kvp);
    96.                     }
    97.                 }
    98.             }
    99.         }
    100.         static bool AreKeysEqual<TDictionary, TKey, TValue>(TKey lhs, TKey rhs, ref TDictionary dictionary)
    101.             where TDictionary : IDictionary<TKey, TValue>
    102.         {
    103.             if (dictionary is Dictionary<TKey, TValue> defaultDict && defaultDict.Comparer is not null)
    104.                 return defaultDict.Comparer.Equals(lhs, rhs);
    105.             return EqualityComparer<TKey>.Default.Equals(lhs, rhs);
    106.         }
    107.  
    108.         protected virtual bool IsKeyChangeAllowed<TDictionary, TKey, TValue>(TDictionary dictionary, TKey? original, ref KeyValuePair<TKey, TValue> candidate)
    109.             where TDictionary : IDictionary<TKey, TValue>
    110.         {
    111.             return true;
    112.         }
    And these are the "fake" Key/Value properties:

    Code (CSharp):
    1.        /// <summary>
    2.         /// Custom implementation for the Key property of a <see cref="KeyValuePair{,}"/>.
    3.         /// It is not "ReadOnly", as it overwrites the <see cref="KeyValuePair{,}"/> when modified
    4.         /// </summary>
    5.         /// <typeparam name="TKey"></typeparam>
    6.         /// <typeparam name="TValue"></typeparam>
    7.         class KvpKeyProperty<TKey, TValue> : Property<KeyValuePair<TKey, TValue>, TKey>
    8.         {
    9.             public static readonly KvpKeyProperty<TKey, TValue> s_Property = new();
    10.             const string k_Name = "Key";
    11.  
    12.             public override string Name => k_Name;
    13.             public override bool IsReadOnly => false;
    14.  
    15.             private KvpKeyProperty() { }
    16.  
    17.             public override TKey GetValue(ref KeyValuePair<TKey, TValue> container)
    18.             {
    19.                 return container.Key;
    20.             }
    21.  
    22.             public override void SetValue(ref KeyValuePair<TKey, TValue> container, TKey key)
    23.             {
    24.                 container = new(key, container.Value);
    25.             }
    26.         }
    27.         /// <summary>
    28.         /// Custom implementation for the Value property of a <see cref="KeyValuePair{,}"/>-
    29.         /// It is not "ReadOnly", as it overwrites the <see cref="KeyValuePair{,}"/> when modified
    30.         /// </summary>
    31.         class KvpValueProperty<TKey, TValue> : Property<KeyValuePair<TKey, TValue>, TValue>
    32.         {
    33.             public static readonly KvpValueProperty<TKey, TValue> s_Property = new();
    34.             const string k_Name = "Value";
    35.  
    36.             public override string Name => k_Name;
    37.             public override bool IsReadOnly => false;
    38.  
    39.             private KvpValueProperty() { }
    40.  
    41.             public override TValue GetValue(ref KeyValuePair<TKey, TValue> container)
    42.             {
    43.                 return container.Value;
    44.             }
    45.  
    46.             public override void SetValue(ref KeyValuePair<TKey, TValue> container, TValue value)
    47.             {
    48.                 container = new(container.Key, value);
    49.             }
    50.         }
    It's currently working perfectly. I FINALLY have a working DictionaryField that does not rely on serialization hacks :p

     
    Last edited: Jan 16, 2024
    yu_yang and martinpa_unity like this.
  46. Ice_106

    Ice_106

    Joined:
    Jun 17, 2019
    Posts:
    5
    I tried to serialize/deserialize closed generics using this package. The C# code is as follows:
    Code (CSharp):
    1. using System.Collections.Generic;
    2. using Unity.Serialization.Json;
    3. using UnityEngine;
    4. public abstract class Variable { }
    5. public class Variable<T> : Variable
    6. {
    7.     public T value;
    8. }
    9. public class JsonTest : MonoBehaviour
    10. {
    11.     public string json;
    12.     public List<Variable> variables = new List<Variable>() { new Variable<float>(), new Variable<Vector2>() };
    13.     void Save()
    14.     {
    15.         json = JsonSerialization.ToJson(variables);
    16.     }
    17.     void Load()
    18.     {
    19.         variables = JsonSerialization.FromJson<List<Variable>>(json);
    20.     }
    21. }
    22.  
    Both Variable<float> and Variable<Vector2> were serialized successfully. However, there was an issue when deserializing Variable<Vector2>.
    The error reported is:
    ArgumentException: Failed to construct type. Could not resolve type from TypeName=[Variable`1[UnityEngine.Vector2], Assembly-CSharp].
    I want to know how to do this correctly.
     
  47. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    I'm currently serializing generics with no problem, so maybe try this (it might be a silly suggestion): ensure the types you are serializing are defined in a specific assembly instead of the global "Assembly-CSharp". Create an AssemblyDefinition (right click -> Create -> AssemblyDefinition) and put your scripts in the same folder (or subfolders) as the AssemblyDefinition file. You will need to manually reference Unity.Properties, Unity.Serialization and maybe Unity.Collections in that assembly.

    I used to have some problems while using "
    Type.GetType(..)
    " some time ago, when assemblies were not defined for those types. But maybe it is not related.
     
  48. yu_yang

    yu_yang

    Joined:
    May 3, 2015
    Posts:
    85
    It seems that when serializing generics, the type is serialized like this:
    Code (CSharp):
    1. $"{type}, {type.Assembly.GetName().Name}"
    2. // Result of List<Vector3>:
    3. // System.Collections.Generic.List`1[UnityEngine.Vector3], mscorlib
    And if you want
    Type.GetType
    to be able to restore the type, you must handle it like this:
    Code (CSharp):
    1. $"{type.FullName}, {type.Assembly.GetName().Name}"
    2. // Result of List<Vector3>:
    3. // System.Collections.Generic.List`1[[UnityEngine.Vector3, UnityEngine.CoreModule, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null]], mscorlib
    For non-generics, there should be no difference between the two. If you don't want the serialization result of generic arguments to be too long, you can simplify each type argument through recursive calls to achieve a result like this:
    Code (CSharp):
    1. // System.Collections.Generic.List`1[[UnityEngine.Vector3, UnityEngine.CoreModule]], mscorlib
    Perhaps it can be implemented using two dictionaries for 'string to Type' and 'Type to string'. This way, the serialization result for each type would only be calculated once.
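    To illustrate the difference outside of Unity (plain .NET; the exact type strings vary per runtime):

```csharp
using System;
using System.Collections.Generic;

class TypeNameDemo
{
    static void Main()
    {
        var t = typeof(List<DateTime>);

        // type.ToString() does not assembly-qualify the generic argument:
        // e.g. "System.Collections.Generic.List`1[System.DateTime], ..."
        Console.WriteLine($"{t}, {t.Assembly.GetName().Name}");

        // type.FullName assembly-qualifies every generic argument, so
        // Type.GetType can resolve it even when an argument lives in an
        // assembly the parser would not probe by default.
        var fullName = $"{t.FullName}, {t.Assembly.GetName().Name}";
        Console.WriteLine(Type.GetType(fullName) == t); // True
    }
}
```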
     
    Last edited: Jan 17, 2024
  49. Ice_106

    Ice_106

    Joined:
    Jun 17, 2019
    Posts:
    5
    I resolved this issue after adding IContravariantJsonAdapter<Variable>, and my code looks like this.
    Code (CSharp):
    using System;
    using System.Collections.Generic;
    using Unity.Serialization.Json;
    using UnityEngine;

    public abstract class Variable
    {
        protected abstract object objectValue { get; set; }

        public class Adapter : IContravariantJsonAdapter<Variable>
        {
            public void Serialize(IJsonSerializationContext context, Variable value)
            {
                using (context.Writer.WriteObjectScope())
                {
                    context.SerializeValue("$type", value.GetType().AssemblyQualifiedName);
                    context.SerializeValue("objectValue", value.objectValue);
                }
            }

            public object Deserialize(IJsonDeserializationContext context)
            {
                Variable v;
                if (context.GetInstance() != null)
                {
                    v = context.GetInstance() as Variable;
                }
                else
                {
                    // Recreate the concrete Variable<T> from the serialized "$type" key.
                    var type = Type.GetType(context.DeserializeValue<string>(context.SerializedValue["$type"]));
                    v = Activator.CreateInstance(type) as Variable;
                }
                v.objectValue = context.DeserializeValue<object>(context.SerializedValue["objectValue"]);
                return v;
            }
        }
    }

    public class Variable<T> : Variable
    {
        protected override object objectValue
        {
            get => value;
            set
            {
                if (typeof(T).IsEnum)
                    this.value = (T)Enum.ToObject(typeof(T), value);
                else
                    this.value = (T)Convert.ChangeType(value, typeof(T));
            }
        }

        public T value;
    }

    public enum MyEnum
    {
        A, B, C
    }

    public class JsonTest : MonoBehaviour
    {
        public string json;
        public List<Variable> variables = new List<Variable>() { new Variable<int>(), new Variable<Vector2>(), new Variable<MyEnum>() };

        JsonSerializationParameters parameters = new JsonSerializationParameters()
        {
            UserDefinedAdapters = new List<IJsonAdapter>
            {
                new Variable.Adapter(),
            }
        };

        void Save()
        {
            json = JsonSerialization.ToJson(variables, parameters);
        }

        void Load()
        {
            variables = JsonSerialization.FromJson<List<Variable>>(json, parameters);
        }
    }
    But this is not elegant; there seem to be many type-conversion edge cases during deserialization. I would prefer native support for this operation: just like the basic types, Variable&lt;int&gt;, Variable&lt;float&gt;, and so on should all support direct deserialization. Why not extend this to any type?
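The enum special case in the Variable&lt;T&gt; setter above exists because `Convert.ChangeType` cannot convert a boxed number into an enum type, while `Enum.ToObject` can. A Unity-free sketch of the two paths (`EnumConversionDemo` and `ConvertTo` are illustrative names, not part of any API):

```csharp
using System;

public enum DemoEnum { A, B, C }

public static class EnumConversionDemo
{
    // Mirrors the setter logic from Variable<T>: Convert.ChangeType handles
    // primitives, but enum targets need Enum.ToObject.
    public static object ConvertTo(Type target, object value)
    {
        if (target.IsEnum)
            return Enum.ToObject(target, value);   // e.g. 2 -> DemoEnum.C
        return Convert.ChangeType(value, target);  // e.g. 2L -> 2 (int)
    }
}
```

On current .NET runtimes `Convert.ChangeType` throws `InvalidCastException` for enum targets, which is why the dedicated branch is needed; this is one example of the conversion edge cases mentioned above.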
     
    Last edited: Jan 17, 2024
  50. Canijo

    Canijo

    Joined:
    Oct 9, 2018
    Posts:
    50
    This seems very odd. In BinarySerialization the package just serializes Type.AssemblyQualifiedName; I don't know why it is done differently for JSON.

    If you are using the package in your own project, and not for developing packages, you can override the Unity.Serialization package locally. In JsonPropertyWriter you will find the nested type below. Make these changes and it should work without needing explicit adapters.
    Code (CSharp):
    class SerializedTypeProperty : Property<SerializedType, string>
    {
        public override string Name => k_SerializedTypeKey;
        public override bool IsReadOnly => true;

        // Remove the old method:
        // public override string GetValue(ref SerializedType container) => $"{container.Type}, {container.Type.Assembly.GetName().Name}";

        // This should work fine:
        public override string GetValue(ref SerializedType container) => container.Type.AssemblyQualifiedName;
        public override void SetValue(ref SerializedType container, string value) => throw new InvalidOperationException("Property is ReadOnly.");
    }
     
    Last edited: Jan 17, 2024
    Ice_106 likes this.