
C# data type conversion between different platforms - using sizeof(type) vs using predefined lengths

Discussion in 'Scripting' started by lestatos86, Apr 24, 2022.

  1. lestatos86

    lestatos86

    Joined:
    Aug 24, 2020
    Posts:
    3
    Please correct me if my question's title is incorrect in some way.

    Here is my issue. Let's say I have a UDP server, written in C# and running inside a Unity build on a Windows machine. Then I have a UDP client running inside a Unity build on a Linux or Java (Android) machine. I am using the System.Net.Sockets library for both server and client.

    I have a function called WriteInt(int data) on both server and client, which writes 4 bytes into the network stream. This function always writes exactly 4 bytes, using an int literal intLength = 4. From this article (spoiler - I am not familiar with C): https://www.tutorialspoint.com/cprogramming/c_data_types.htm I have seen that C handles ints in two different sizes: 4 bytes (Int32 in C#/System) and 2 bytes (Int16 in C#/System). If I have a ReadInt() function in my C# script (in Unity) and give it a length of 4, it will always read 4 bytes. But what if the network stream on the Linux client receives 4 bytes that are supposed to be a (C#) Int32? How would it know whether to read a C# int (4 bytes) or a C# short (2 bytes)? Right here I think I would be fine, because I have hard-coded the default byte size for a C# int. But what if I was using sizeof(int) instead? Would that make any difference on different platforms? So the question is really about how Unity handles these kinds of operations between different platforms; it is not only about int or bool (Java, for example).
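    A minimal sketch of the WriteInt/ReadInt pair described above, assuming the data is framed over a Stream (for example a NetworkStream); the class name, the stream field and the blocking read loop are illustrative assumptions, the fixed 4-byte framing is the point:

    Code (CSharp):

        using System.IO;

        public class IntFramer
        {
            private readonly Stream stream; // e.g. a NetworkStream wrapping the socket

            public IntFramer(Stream stream) { this.stream = stream; }

            public void WriteInt(int data)
            {
                // BitConverter.GetBytes(int) always returns 4 bytes,
                // because a C# int is always System.Int32.
                byte[] bytes = System.BitConverter.GetBytes(data);
                stream.Write(bytes, 0, sizeof(int)); // sizeof(int) is the constant 4
            }

            public int ReadInt()
            {
                byte[] bytes = new byte[sizeof(int)];
                int read = 0;
                while (read < bytes.Length) // keep reading until all 4 bytes have arrived
                {
                    int n = stream.Read(bytes, read, bytes.Length - read);
                    if (n == 0) throw new EndOfStreamException("Connection closed");
                    read += n;
                }
                return System.BitConverter.ToInt32(bytes, 0);
            }
        }

    Whether the length is written as the literal 4 or as sizeof(int) makes no difference: in C#, sizeof(int) is the compile-time constant 4 on every platform.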
     
  2. VolodymyrBS

    VolodymyrBS

    Joined:
    May 15, 2019
    Posts:
    150
    Last edited: Apr 24, 2022
    Bunny83 likes this.
  3. lestatos86

    lestatos86

    Joined:
    Aug 24, 2020
    Posts:
    3
    @VolodymyrBS thanks for your response. As far as I am aware, Linux is written in C. So if it received 4 bytes meant to be a C# int, how would it know how to deal with those 4 bytes: is it two times 2 bytes (2 shorts) or one int (4 bytes)? This is just an example. As I have said, the question is about how Unity distributes different kinds of bytes amongst different platforms.
     
  4. lestatos86

    lestatos86

    Joined:
    Aug 24, 2020
    Posts:
    3
    And of course, we are not only speaking about Linux and C; let's say Java, for example, treats bools as bits.
     
  5. Bunny83

    Bunny83

    Joined:
    Oct 18, 2010
    Posts:
    3,571
    Sorry, but what does this have to do with anything? C# and the CLR are a standard that is the same regardless of the platform, the CPU architecture or anything else. A System.Int32 (alias "int") is always a 32-bit integer, always. Your C# code that reads or writes data to a stream does the exact same thing on all platforms. There are not several versions of the compiler. Also, unlike C or C++, you cannot change what the identifier "int" refers to; it is always a 32-bit integer. I'm not sure what else to say. VolodymyrBS already said this and you ignored it, so I'm not sure what else there is to it.
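    To make that concrete, here is a quick check you can drop into any Unity build (Debug.Log is from UnityEngine; it could just as well be Console.WriteLine). The printed values are fixed by the C# language specification, not by the platform:

    Code (CSharp):

        using UnityEngine;

        public class SizeCheck : MonoBehaviour
        {
            void Start()
            {
                Debug.Log(sizeof(int));    // always 4, on Windows, Linux, Android, ...
                Debug.Log(sizeof(short));  // always 2
                Debug.Log(sizeof(long));   // always 8
                Debug.Log(System.BitConverter.GetBytes(123).Length); // 4 bytes everywhere
            }
        }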
     
    VolodymyrBS likes this.