
How do you implement "Adjust your screen margins"?

Discussion in 'Editor & General Support' started by Peter77, Oct 14, 2017.

  1. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,620
    (Console) games typically have functionality to adjust screen margins, to make sure the game renders its content inside the safe area.

    [Screenshot: adjust_your_screen_margins_01.png – a console game's "adjust your screen margins" screen]

    Since this functionality is common to pretty much every console game, I wonder if Unity has built-in functionality to adjust the rectangle its content is rendered to. I'm not talking about the UI only, but about the core rendering functionality, i.e. adjusting the "final frame-buffer blit".

    I'm aware that every Camera has a "Viewport Rect" property that can be used to (tediously) simulate this behavior. However, this requires a multi-camera setup to...
    • Render the 3D scene with adjusted Viewport Rect
    • Render UI using "Screen Space - Camera", to adjust the UI Viewport Rect
    • Clear the outer region of the Viewport Rect
    I'm not a fan of using multi-camera setups, due to performance reasons.
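
    For completeness, here is a minimal sketch of what such a multi-camera Viewport Rect setup could look like; the component name and margin value are made up for illustration:

    Code (CSharp):
    using UnityEngine;

    // Hypothetical helper: applies the same margin to every camera's Viewport Rect.
    public class ViewportMargins : MonoBehaviour
    {
        [Range(0f, 0.1f)]
        public float margin = 0.05f; // 5% margin on every edge (example value)

        public Camera[] cameras; // 3D camera, "Screen Space - Camera" UI camera, ...

        void OnEnable()
        {
            // Shrink each camera's normalized viewport rectangle by the margin.
            var rect = new Rect(margin, margin, 1f - 2f * margin, 1f - 2f * margin);
            foreach (var cam in cameras)
                cam.rect = rect;
        }
    }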

    Do you know if Unity has an API to "adjust the screen margins"? If it doesn't, how do you implement this functionality?
     
    Last edited: Oct 14, 2017
  2. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,620
    For anyone interested in how I implemented it, here it is...

    UI Camera Changes

    I use Camera.rect to adjust the rectangle where the camera writes to the screen. Since this doesn't work with "Screen Space - Overlay", I had to move to "Screen Space - Camera".

    In order to use "Screen Space - Camera", here is what I did and why.

    My UI is stored in multiple scenes. For example, I have a dedicated HUD scene, Pause menu scene, and so on.

    Using "Screen Space - Camera" requires assigning a Camera to the Canvas. This means I'd have to add a Camera to each of those dedicated UI scenes, because Unity does not support cross-scene references.
    In other words, Unity does not let you simply drag & drop a Camera from one scene onto the Canvas camera property in another scene.

    To keep editing the UI workable, I decided that everything continues to use "Screen Space - Overlay" at edit time, but switches to "Screen Space - Camera" at runtime.

    To overcome this limitation, I introduced a new Camera that is created when the game initializes and is never destroyed. Its only purpose is to render the "UI" layer; I named it the "UI Camera".
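
    A minimal sketch of how such a persistent "UI Camera" could be created at startup; the clear flags, depth and initialization hook are assumptions on my side:

    Code (CSharp):
    using UnityEngine;

    public static class UICameraBootstrap
    {
        public static Camera UICamera { get; private set; }

        // Create the camera once the game initializes and keep it alive forever.
        [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.AfterSceneLoad)]
        static void Create()
        {
            var go = new GameObject("UI Camera");
            Object.DontDestroyOnLoad(go); // never destroyed

            UICamera = go.AddComponent<Camera>();
            UICamera.cullingMask = LayerMask.GetMask("UI"); // renders only the "UI" layer
            UICamera.clearFlags = CameraClearFlags.Depth;   // keep the 3D scene underneath
            UICamera.orthographic = true;
            UICamera.depth = 100f;                          // render after the scene cameras
        }
    }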

    I also introduced a new Component named "UI Canvas", which must be added to every GameObject where a Canvas/CanvasScaler exists.

    The "UI Canvas" Component is responsible for assigning the "UI Camera" to the Canvas.worldCamera property in Start() and for adjusting the CanvasScaler.referenceResolution property depending on the Camera.rect values.

    Adjusting the Camera.rect does not cause the CanvasScaler to scale the UI like I expected. I reported this behaviour to Unity and it turns out it's a bug.
    At this point, the UI renders using the custom "UI Camera" and the CanvasScaler properly scales the UI, taking the Camera.rect into account as well.
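
    A minimal sketch of what that "UI Canvas" Component could look like, using the UICameraBootstrap helper sketched above and assuming the scaler uses "Scale With Screen Size". The referenceResolution compensation (dividing by the Camera.rect size) is my assumption of how to work around the CanvasScaler bug:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.UI;

    [RequireComponent(typeof(Canvas), typeof(CanvasScaler))]
    public class UICanvas : MonoBehaviour
    {
        Canvas m_Canvas;
        CanvasScaler m_Scaler;
        Vector2 m_OriginalReferenceResolution;

        void Start()
        {
            m_Canvas = GetComponent<Canvas>();
            m_Scaler = GetComponent<CanvasScaler>();
            m_OriginalReferenceResolution = m_Scaler.referenceResolution;

            // Edit time uses "Screen Space - Overlay"; switch to
            // "Screen Space - Camera" at runtime and hook up the shared UI Camera.
            m_Canvas.renderMode = RenderMode.ScreenSpaceCamera;
            m_Canvas.worldCamera = UICameraBootstrap.UICamera;

            ApplyCameraRect(m_Canvas.worldCamera.rect);
        }

        // Workaround for the CanvasScaler not taking Camera.rect into account:
        // growing the reference resolution shrinks the computed scale factor
        // proportionally to the viewport size (my assumption of a fix).
        public void ApplyCameraRect(Rect rect)
        {
            m_Scaler.referenceResolution = new Vector2(
                m_OriginalReferenceResolution.x / rect.width,
                m_OriginalReferenceResolution.y / rect.height);
        }
    }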

    However, changing the Camera.rect leaves what was rendered there previously...


    Clear Screen Changes

    Using a Camera.rect smaller than the entire screen causes the area outside the rectangle not to be cleared; it keeps displaying whatever was rendered there previously. Makes sense.

    To fix this issue, I introduced a new camera named "Clear Camera". The only purpose of this camera is to clear the entire screen.

    However, a full-screen clear isn't free and most of the time not necessary.

    Therefore, I added a feature so that the "Clear Camera" only clears the screen on certain events, such as "screen margins changed" or "application focus changed". This makes sure the additional performance cost of a full-screen clear is paid only when necessary.
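
    A minimal sketch of that "Clear Camera" idea: a lowest-depth camera that renders nothing, stays disabled most of the time, and clears the full screen for a few frames after such an event. The exact event hooks and frame count are assumptions:

    Code (CSharp):
    using UnityEngine;

    public class ClearCamera : MonoBehaviour
    {
        Camera m_Camera;
        int m_FramesToClear;

        void Awake()
        {
            m_Camera = gameObject.AddComponent<Camera>();
            m_Camera.clearFlags = CameraClearFlags.SolidColor;
            m_Camera.backgroundColor = Color.black;
            m_Camera.cullingMask = 0;              // render no objects, just clear
            m_Camera.rect = new Rect(0, 0, 1, 1);  // always the entire screen
            m_Camera.depth = -100f;                // render before every other camera
            m_Camera.enabled = false;              // off until a clear is requested
        }

        // Call this when the screen margins change.
        public void RequestClear(int frames = 3)
        {
            m_FramesToClear = frames;
            m_Camera.enabled = true;
        }

        void OnApplicationFocus(bool hasFocus)
        {
            RequestClear();
        }

        void LateUpdate()
        {
            // Disable again once the requested number of frames has been cleared.
            if (m_Camera.enabled && --m_FramesToClear <= 0)
                m_Camera.enabled = false;
        }
    }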


    The temporary render target

    When I looked at how Unity renders a Camera with a limited viewport rectangle (via the Frame Debugger), I noticed it seems to do it inefficiently. It creates a render texture of the size of the Camera.rect that the content is rendered to. When the content must be presented, it blits the render texture to the frame-buffer.

    I don't understand why the intermediate render texture is necessary; as far as I remember, in (at least) DX9 it's possible to render/present to a specific screen region without an additional blit.

    I'll most likely file a bug report for this issue too, but haven't done so just yet.



    Conclusion
    Adding an "Adjust your screen margins" feature took quite some work.

    Pretty much every console game must implement this feature to pass TRC/TCR, so I wonder why this isn't part of the game engine. It's probably rather simple to add at the engine level.

    Here is what I did at the highest level:
    • Add dedicated "UI Camera" to be able to adjust UICamera.rect
    • Add new "UI Canvas" Component to assign the UI Camera and work around the CanvasScaler.referenceResolution bug
    • Add "Clear Camera" to clear previous Camera.rect content
    • Cry over additional screen blit caused by Camera.rect
    • Cry over cross-scene references not being supported

    Hope it helps in case someone needs to go through the same hassle.
     
    Last edited: Oct 22, 2017
    hippocoder and Prodigga like this.
  3. Prodigga

    Prodigga

    Joined:
    Apr 13, 2011
    Posts:
    1,123
    Isn't the safe area just for UI? Couldn't you just have a RectTransform at the root of your UI that insets everything correctly?
     
    Peter77 likes this.
  4. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,620
    As far as my research went, some games inset the UI only, while others adjust everything. Most games I checked (Xbox One games) adjust everything.

    I thought about only adjusting the UI as well, but it turned out it doesn't work with how the UI is built in my game. Several UI elements are designed to start or end at screen edges.

    Adjusting only the UI would reveal where a texture or UI element ends. I didn't want to work with that limitation and, more importantly, I didn't want to recreate lots of existing UI.

    I also tried using a root element to inset the UI below it, but it always screwed up the layout. I couldn't figure out how to get this to work; I'm not a UI expert.
     
  5. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    You need to think of the UI as having a global RectTransform that is your physical screen (usually this clamps to the screen bounds), but you can adjust this container for TRC safe areas easily - a line of code would be enough.

    The children of this then have their own RectTransform containers for attaching to various points along the margins, and your own UI goes below those.
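
    A minimal sketch of that root-container approach, assuming a full-screen "SafeArea" panel directly under the Canvas that parents all other UI; the margin value is a made-up example:

    Code (CSharp):
    using UnityEngine;

    // Attach to the root RectTransform that contains all other UI.
    public class SafeAreaInset : MonoBehaviour
    {
        [Range(0f, 0.1f)]
        public float margin = 0.05f; // 5% inset on every edge (example value)

        void OnEnable()
        {
            // Inset the container by adjusting its normalized anchors.
            var rt = (RectTransform)transform;
            rt.anchorMin = new Vector2(margin, margin);
            rt.anchorMax = new Vector2(1f - margin, 1f - margin);
            rt.offsetMin = Vector2.zero;
            rt.offsetMax = Vector2.zero;
        }
    }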

    Basically, if you use Unity UI correctly, this isn't a problem to solve to begin with. I do think most of the problem here is that Unity just can't keep up with learning materials at all vs the rate of progress (which took long enough with UI haha).

    I had to learn how to use the UI mostly by trial and error. I think that's the problem - devs just don't have time.