
WebGL / Standalone - Instant FPS drop when using display scaling (Mac with Retina, etc)

Discussion in 'Web' started by OChrisJonesO, Feb 12, 2018.

  1. OChrisJonesO

    OChrisJonesO

    Joined:
    Jun 27, 2015
    Posts:
    13
    Not sure if this is the correct forum to post in, since the issue isn't limited to WebGL, but it does affect WebGL. Mods, feel free to move this if it's not the right place.

    Any time you run a Unity game/app (we're seeing this / have tested it in both standalone OS X builds and in WebGL) on a Mac retina display, or on a Windows machine with display scaling (for example, a 4K monitor at 200% scaling), you will see an instant drop in framerate. This is easily testable with a standalone build running in windowed mode, or with a WebGL build.

    After a lot of investigation, especially with WebGL builds, it appears as if the resolution doesn't actually change on a retina display vs a non-retina display. You can drag the window between a MacBook Pro's built-in display and a 1080p monitor connected via HDMI and see the framerate change instantly, yet there's no real visual difference or improvement on the retina display. Unity reports that the Display.systemWidth/Height and Display.renderingWidth/Height values haven't changed when you drag the window from one display to another. Same with the WebGL canvas: calling JavaScript to print the current viewport resolution shows no change. However, there's always a drop in frames.
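
    For reference, here's roughly the kind of logging I've been using to dump what Unity reports per display (just a sketch; the class name and the L hotkey are arbitrary):

        using UnityEngine;

        // Sketch: logs what Unity reports for each connected display.
        // Attach to any GameObject, drag the window between displays, then press L.
        public class DisplayInfoLogger : MonoBehaviour
        {
            void Update()
            {
                if (Input.GetKeyDown(KeyCode.L))
                {
                    for (int i = 0; i < Display.displays.Length; i++)
                    {
                        Display d = Display.displays[i];
                        Debug.Log("Display " + i + ": system " + d.systemWidth + "x" + d.systemHeight +
                                  ", rendering " + d.renderingWidth + "x" + d.renderingHeight);
                    }
                    Debug.Log("Screen: " + Screen.width + "x" + Screen.height + ", dpi " + Screen.dpi);
                }
            }
        }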

    It's worth noting that if you use a third-party application such as SwitchResX (http://www.madrau.com/) to set a specific resolution on OS X (for instance, 1680x1050, non-scaled / non-HiDPI), you no longer see the drop in frames. But if you set it to the retina version of 1680x1050 (so that HiDPI scaling is enabled), you do see the drop in frames.

    So in conclusion: any time your system is using display scaling, Unity performs slightly worse, even though the reported game resolution is identical and visually it looks no different.

    I understand how retina displays and Windows display scaling work, and tried to combat this by detecting HiDPI displays and setting a lower resolution manually (since typically everything is at 2x scale); see the rough sketch below. However, doing so just makes things look significantly worse, since we're lowering the resolution but Unity doesn't actually appear to be rendering at the higher scaled resolution in the first place, despite what the FPS drop suggests.
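
    Roughly what I tried (sketch only; the 144 dpi threshold and the straight halving are just my assumptions about "typical" 2x scaling):

        using UnityEngine;

        // Sketch of the workaround: if the display looks HiDPI (~2x scale),
        // request half the current resolution. The dpi threshold is a rough guess,
        // and Screen.dpi can return 0 on some platforms.
        public class HiDpiDownscale : MonoBehaviour
        {
            void Start()
            {
                if (Screen.dpi > 144f)
                {
                    Screen.SetResolution(Screen.width / 2, Screen.height / 2, Screen.fullScreen);
                }
            }
        }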

    Anyone else seeing this / have any kind of workaround?
     
  2. OChrisJonesO

    OChrisJonesO

    Joined:
    Jun 27, 2015
    Posts:
    13
    Also worth noting: I've done builds in multiple versions of Unity, spanning from 5.4.6 to the 2018.1 beta, and it happens in every version.
     
  3. justins_unity

    justins_unity

    Unity Technologies

    Joined:
    Jan 30, 2018
    Posts:
    1
    Hi Chris,

    I performed a quick test with a standalone Unity (Metal) app.
    With Xcode Instruments I measured a GPU render time increase from 2 ms to 4.89 ms going from a non-retina to a retina display.

    I can also see that the resolution of the window increases when moving to a Retina display (I have a bunch of labels and UI text which shrink on Retina).

    So I believe the resolution is changing, and the increase in frame time is due to rendering more pixels (fill rate) on Retina.

    Are you sure you're looking at the Display.systemWidth/Height values for the actual display that the window is on? E.g.

    Display.displays[0].systemWidth != Display.displays[1].systemWidth
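
    Something along these lines (assuming a second display is actually connected) will show whether the reported values differ per display:

        using UnityEngine;

        // Sketch: compare the system resolution Unity reports for the first two displays.
        public class DisplayCompare : MonoBehaviour
        {
            void Start()
            {
                if (Display.displays.Length > 1)
                {
                    Debug.Log("Display 0 systemWidth: " + Display.displays[0].systemWidth);
                    Debug.Log("Display 1 systemWidth: " + Display.displays[1].systemWidth);
                }
            }
        }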
     
    Last edited: Feb 16, 2018