Code (csharp):

var ScreenResolution : int = Screen.width / Screen.height;

function Start () {
    // set screen resolution: width/height = 1.6 for 16:10 widescreen, or 1.33 for standard 4:3
    if (Screen.width < 1024) {
        Application.Quit ();
    }
    else if (ScreenResolution == 1.6) {
        Screen.SetResolution (1280, 800, true);
    }
    else {
        Screen.SetResolution (1024, 768, true);
    }
}

Any clue why this doesn't work? At least it doesn't give me an error -- which is what my coding normally produces -- but test screenshots (when I simply set one or the other resolution manually) show only the 1024 resolution.
The main problem is that you're defining ScreenResolution as an integer, so it's always going to be 1 (unless you have a really wide screen, like 2:1). The other problem is that Screen.width and .height always return integers, so even if you define screenResolution as a float, the division will still come out to 1. At least one of the operands has to be a float in the first place in order to get a float result.

Code (csharp):

var screenWidth : float = Screen.width;
var screenResolution : float = screenWidth / Screen.height;

Is there any way to force Screen.width/height to calculate as floats in Javascript? In C# you could do "float screenResolution = (float)Screen.width / (float)Screen.height", thus saving a line of code... OK, so C# isn't all bad after all...

Another problem is that you don't want to do stuff like "else if (ScreenResolution == 1.6)", because getting precise numbers out of floats is iffy: you might end up with 1.6000000001 or something because of floating-point error. You want to compare against a range instead (like greater than 1.5 and less than 1.7).

--Eric
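To make the range check concrete, here's a rough sketch of what Eric describes; the 1.5/1.7 bounds are just his example numbers for catching a 1.6 (16:10) ratio, not anything Unity mandates:

Code (csharp):

// Range check instead of an exact comparison; floating-point
// division rarely lands on an exact value like 1.6.
if (screenResolution > 1.5 && screenResolution < 1.7) {
    Screen.SetResolution (1280, 800, true);
}
else {
    Screen.SetResolution (1024, 768, true);
}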
Code (csharp):

(0.0 + Screen.width) / Screen.height

Clunky, but it should work...

Btw, the result of the division is stored in a variable called "screenResolution". Shouldn't that be "screenRatio" instead?
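Putting the thread's suggestions together, a minimal (untested) sketch might look like this, using the 0.0+ trick to force a float division, the "screenRatio" name, and the same arbitrary 1.5/1.7 bounds from above:

Code (csharp):

// Aspect ratio as a float; adding 0.0 promotes the integer
// Screen.width to a float before the division happens.
var screenRatio : float = (0.0 + Screen.width) / Screen.height;

function Start () {
    if (Screen.width < 1024) {
        Application.Quit ();
    }
    else if (screenRatio > 1.5 && screenRatio < 1.7) {
        Screen.SetResolution (1280, 800, true);  // roughly 16:10 widescreen
    }
    else {
        Screen.SetResolution (1024, 768, true);  // roughly 4:3
    }
}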