Recently I noticed that one of my shaders rendered differently between iOS and Android, so I ran some tests with the UNITY_UV_STARTS_AT_TOP define. I use this define in the affected shader so that it works across my target platforms (mobile/Windows). On iOS and Android I expected it to evaluate to false (a value of 0), since I can verify that uv y = 1 is at the top of the screen on those platforms; on PC/DirectX I expect the inverse.

However, my tests show that on iOS the UNITY_UV_STARTS_AT_TOP define evaluates to 1! The results of my test can be seen here (https://imgur.com/a/U7vkdgW) (see image in link). Notice how the top-left quad proves that uv y = 1 is at the top of the screen (lerp(a, b, i.uv.y <= 0.5f)). On the Android screen grab the bottom-left quad is green (returned when UNITY_UV_STARTS_AT_TOP is false), but on iOS the bottom-left quad is red (returned when UNITY_UV_STARTS_AT_TOP is true). This contradicts the top-left quad, which proves that 1 is at the top.

My current workaround is to check SHADER_API_MOBILE first and only test UNITY_UV_STARTS_AT_TOP in the else branch, so that iOS and Android both fall into the first condition. Originally I was just using UNITY_UV_STARTS_AT_TOP.

Is this behaviour expected? If so, am I misunderstanding this define, or is this a Unity bug?
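For reference, here is a minimal sketch of the workaround described above. The function signature and colour values are illustrative, not copied from my actual shader:

```hlsl
// Fragment shader sketch: choose a debug colour per platform branch.
fixed4 frag (v2f i) : SV_Target
{
    fixed4 green = fixed4(0, 1, 0, 1); // expected on mobile (uv y = 1 at top)
    fixed4 red   = fixed4(1, 0, 0, 1); // expected on PC/DirectX

// Workaround: check SHADER_API_MOBILE first so iOS and Android always take
// this branch, regardless of what UNITY_UV_STARTS_AT_TOP reports on iOS.
#if defined(SHADER_API_MOBILE)
    return green;
#elif UNITY_UV_STARTS_AT_TOP
    return red;
#else
    return green;
#endif
}
```

With only the original UNITY_UV_STARTS_AT_TOP check (no SHADER_API_MOBILE branch), iOS takes the red path, which is the mismatch shown in the screen grabs.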