Pixel Perfect 16:9 not always perfect

Discussion in '2D' started by marcgenesis, Jan 29, 2018.

  1. marcgenesis

    marcgenesis

    Joined:
    Jan 18, 2017
    Posts:
    7
    Hello all,

    I know there are a lot of resources online concerning the "pixel perfect camera", but I have a quick question that I can't seem to find an answer to.

    I've created two cameras with a render texture, etc., to display my perfect pixels. The base resolution is 384x216, which is a 16:9 resolution. Both cameras are set to a camera size of 1.08 (100 PPU for the base resolution mentioned earlier). The quad that contains the render texture material is scaled to 3.84 x 2.16.
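
    For reference, a minimal sketch of how those numbers fall out of the base resolution and PPU (the component and field names here are just illustrative, not the actual project code):

    ```csharp
    using UnityEngine;

    // Illustrative sketch: derives the 1.08 ortho size and the 3.84 x 2.16 quad
    // scale from the 384x216 base resolution and 100 PPU described above.
    public class PixelPerfectSetup : MonoBehaviour
    {
        public int baseWidth = 384;      // low-res render texture width
        public int baseHeight = 216;     // low-res render texture height
        public float pixelsPerUnit = 100f;

        public Camera lowResCamera;      // camera that renders into the render texture
        public Transform displayQuad;    // quad carrying the render texture material

        void Awake()
        {
            // Orthographic size is half the vertical view in world units:
            // 216 / (2 * 100) = 1.08
            lowResCamera.orthographicSize = baseHeight / (2f * pixelsPerUnit);

            // The quad must span the texture's size in world units:
            // 384 / 100 x 216 / 100 = 3.84 x 2.16
            displayQuad.localScale = new Vector3(baseWidth / pixelsPerUnit,
                                                 baseHeight / pixelsPerUnit, 1f);
        }
    }
    ```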

    In the Unity game window, when I run the game full screen with a scale of 1, I am having some issues.

    First of all, resolutions of 384x216 and 1920x1080 display correctly.

    384x216



    1080p



    Then you have 1280x720 that looks like crap:



    And just the normal 16:9 selection that looks better but has some squished pixels:



    Is this a Unity game window issue, or am I missing something? Putting a pixel snap material on the sprites doesn't fix the issue...

    Any clues on where to look next would be greatly appreciated. I don't want a baked solution; I want to understand what is happening in the camera.

    Thank you!

    Marc
     
  2. yuanxing_cai

    yuanxing_cai

    Unity Technologies

    Joined:
    Sep 26, 2014
    Posts:
    335
    Your issue in the 16:9 game view could be caused by an odd-numbered resolution (e.g. 1282x721).
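
    A quick way to check whether that is what's happening is to log odd game-view sizes; a rough diagnostic sketch (not an official Unity feature, just an illustration):

    ```csharp
    using UnityEngine;

    // Diagnostic sketch: reports game-view sizes with an odd width or height,
    // which can cause the half-pixel misalignment described above.
    public class ResolutionParityCheck : MonoBehaviour
    {
        int lastW, lastH;

        void Update()
        {
            if (Screen.width == lastW && Screen.height == lastH) return;
            lastW = Screen.width;
            lastH = Screen.height;

            if (lastW % 2 != 0 || lastH % 2 != 0)
                Debug.Log($"Odd game-view size: {lastW}x{lastH}");
        }
    }
    ```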
     
  3. marcgenesis

    marcgenesis

    Joined:
    Jan 18, 2017
    Posts:
    7
    That makes sense, I didn't think of that. Is there a way of forcing Unity to use even numbers only?
     
  4. yuanxing_cai

    yuanxing_cai

    Unity Technologies

    Joined:
    Sep 26, 2014
    Posts:
    335
    None that I know of. But since it's not likely you will ship your game with support for those weird resolutions, this is not a big deal IMO.
     
  5. marcgenesis

    marcgenesis

    Joined:
    Jan 18, 2017
    Posts:
    7
    I agree. I will just use the base resolution with a zoom factor.

    Any ideas why 720p looks like S*** though?
     
  6. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    At 1280, the screen width does not divide exactly into an integer/whole value when your render texture width is 384: 1280/384 is 3.3333. That means you are attempting to scale the render texture up so that one texel spans 3.33 pixels. Every third or fourth pixel is going to show a duplicated pixel due to the point sampling/nearest neighbor.

    Your target res of 1280x720 simply is not properly compatible with your render texture size if you want pixel-perfect graphics. The screen resolution must be an EXACT MULTIPLE of the render texture size; otherwise you are guaranteed some artifacts.
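
    In code terms the rule looks something like this (just a sketch; the class and method names are illustrative, not a Unity API):

    ```csharp
    // Sketch of the "exact multiple" rule above.
    public static class PixelPerfectCheck
    {
        public static bool IsExactMultiple(int screenW, int screenH, int texW, int texH)
        {
            // 1920x1080 over 384x216 -> 5x on both axes: exact.
            // 1280x720  over 384x216 -> 3.33x: not exact, so artifacts appear.
            return screenW % texW == 0 && screenH % texH == 0
                && screenW / texW == screenH / texH; // same whole-number scale on both axes
        }
    }
    ```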

    Other options you can consider are 640x360 or 320x180. If you want this to work out exactly pixel-perfect at the major 16:9 target resolutions, which include 720p, 1080p, 1440p, and 4K, then you have to consider that you are "doubling" the size of the render texture output at each step.

    640x2=1280, 640x3=1920 (HD), 640x4=2560 (half of 5K), 640x5=3200 (?), 640x6=3840 (4K), 640x8=5120 (5K) and 640x12=7680 (8K UHD)

    Or if you use 320x180 that will divide exactly into all of those as well.

    Use pretty much any other render texture size and you will not be able to support all of these resolutions perfectly.

    There are some other options such as 480x270, which fits into 1920x1080 correctly, but not 1280x720.
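
    If you want to sanity-check a candidate base size against a list of targets, a throwaway loop like this sketch does it (purely illustrative, not anything built into Unity):

    ```csharp
    using UnityEngine;

    // Throwaway survey: which base sizes divide common 16:9 targets evenly.
    // 640x360 and 320x180 pass everywhere below; 480x270 fails at 1280x720.
    public static class BaseResolutionSurvey
    {
        [RuntimeInitializeOnLoadMethod]
        static void Run()
        {
            var bases   = new[] { new Vector2Int(640, 360), new Vector2Int(480, 270), new Vector2Int(320, 180) };
            var targets = new[] { new Vector2Int(1280, 720), new Vector2Int(1920, 1080),
                                  new Vector2Int(2560, 1440), new Vector2Int(3840, 2160) };

            foreach (var b in bases)
                foreach (var t in targets)
                {
                    bool exact = t.x % b.x == 0 && t.y % b.y == 0;
                    Debug.Log($"{b.x}x{b.y} -> {t.x}x{t.y}: " +
                              (exact ? $"exact x{t.x / b.x}" : "not exact"));
                }
        }
    }
    ```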

    One other option is to use a render texture the size of your screen resolution, such as 1280x720, scale up your sprites by however many times is needed to get to the right size, and clamp the sprite coordinates to approximately align to the grid. This will still introduce artifacts though.

    Another option is to render at a higher res, e.g. 1920x1080, and then scale down using bilinear filtering, which will look better than the nearest-neighbor ugliness you're seeing at 1280, but will blur the screen slightly and look less crisp. In that case the 3.33-pixel span is 'smoothly' interpolated, so you don't see sudden harsh duplicated pixels, but it makes the whole display less crisp. You can apply a sharpening filter over the screen afterwards, which helps some, but it's still hacky and not as pure.
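
    The "clamp the sprite coordinates to the grid" part can be as simple as rounding positions to 1/PPU steps; a rough sketch assuming the 100 PPU from the original post:

    ```csharp
    using UnityEngine;

    // Rough sketch of snapping a transform to the pixel grid, assuming 100 PPU.
    // This is an illustration of the idea, not the shader-based pixel snap
    // already tried above.
    public class SnapToPixelGrid : MonoBehaviour
    {
        public float pixelsPerUnit = 100f;

        void LateUpdate()
        {
            Vector3 p = transform.position;
            // Round each axis to the nearest 1/PPU world unit so the sprite
            // sits on a whole texel of the low-res render texture.
            p.x = Mathf.Round(p.x * pixelsPerUnit) / pixelsPerUnit;
            p.y = Mathf.Round(p.y * pixelsPerUnit) / pixelsPerUnit;
            transform.position = p;
        }
    }
    ```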

    Fact is, if you want to support 1920 and 1280 exactly, your only option is the 640- or 320-wide render texture. I've recently been playing the "Steredenn" shoot-em-up, and it's at 320 width scaled up exactly to my 2560 display. For my own game I'm using 640.
     
    Last edited: Jan 29, 2018
  7. marcgenesis

    marcgenesis

    Joined:
    Jan 18, 2017
    Posts:
    7
    Ding ding ding!

    It works! Thank you for the explanation. This is the kind of thing that is very obvious once you know it, but easy to miss when you read countless pixel perfection tutorials that DON'T mention it.

    This makes things much, much clearer. I was developing for 1080p and was always enlarging my art ("art"...) but found the process tedious. I knew it could be simplified, hence my delving into using a smaller base resolution.

    Does that get rid of the need for a "pixel snap" shader?

    Sneak edit: Does it matter where the pivot point is on my sprite? I've read somewhere that it should be in a corner and that I should always use even numbers in the transform position.
     
    Last edited: Jan 29, 2018