
Does Unity need a new Rendering pipeline for 4k+ devices?

Discussion in 'General Discussion' started by Arowx, Aug 25, 2016.

  1. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    My understanding is that display resolution is outrunning GPU power, or at least GPU power as used by current rendering technology.

    So does Unity need new rendering pipeline technology for a future of greater-than-4K displays, and if it does, what might that be?

    I know this is more of a Graphics topic, but I think it belongs in General as it is fundamental to the future of games and game engines as device resolutions increase up to and beyond 4K.
     
  2. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    How would that help?
     
    Deon-Cadme, Ryiah and zombiegorilla like this.
  3. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,156
    This is an Arowx thread. I'm 100% willing to bet that they saw a video or an article on 4K rendering tech and desperately want Unity to adopt it without any real thought on the matter.
     
    Ostwind and Acissathar like this.
  4. N1warhead

    N1warhead

    Joined:
    Mar 12, 2014
    Posts:
    3,884
    I don't know about any of you, but I am perfectly content with 1920 x 1080.

    And if you want 4K so badly, just upscale like they do in Star Wars Battlefront with a slider.
    I'm sure Unity can do that. Granted, it's not true 4K, but it does give better pixel resolution.
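
    Something like this works as a minimal render-scale sketch for Unity's built-in pipeline (the component name and slider range are made up for illustration): render the camera into an offscreen target at a scaled resolution, then blit the result to the screen. Values below 1 upscale, values above 1 supersample.

        using UnityEngine;

        // Minimal render-scale sketch: draw the scene into a RenderTexture at
        // (screen resolution * scale), then stretch-blit it to the backbuffer.
        [RequireComponent(typeof(Camera))]
        public class RenderScale : MonoBehaviour
        {
            [Range(0.5f, 2.0f)]
            public float scale = 1.0f; // the "slider"

            Camera cam;
            RenderTexture target;

            void OnEnable() { cam = GetComponent<Camera>(); }

            void OnPreRender()
            {
                int w = Mathf.RoundToInt(Screen.width * scale);
                int h = Mathf.RoundToInt(Screen.height * scale);
                target = RenderTexture.GetTemporary(w, h, 24);
                cam.targetTexture = target; // camera now renders offscreen
            }

            void OnPostRender()
            {
                cam.targetTexture = null;                   // back to the screen
                Graphics.Blit(target, (RenderTexture)null); // bilinear stretch
                RenderTexture.ReleaseTemporary(target);
            }
        }

    A real implementation would also need to keep the UI rendering at native resolution, which is exactly the part many games get wrong.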
     
    Martin_H and Yukichu like this.
  5. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    Higher resolutions are great for some things, and nearly a requirement for others. VR, for example, really needs a higher resolution if you're going for a wide FOV. Then there are the cases where people want to use screens more like paper; added resolution is great there, too.

    For standard gaming on a monitor, though, I agree that 1080p is pretty reasonable.
     
    ZJP, Deon-Cadme, Martin_H and 2 others like this.
  6. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,190
    You just don't want to have to find a way to fit a 40-inch monitor onto your desktop to be able to see the cursor. :p
     
    Deon-Cadme, Martin_H and N1warhead like this.
  7. Lightning-Zordon

    Lightning-Zordon

    Joined:
    May 13, 2014
    Posts:
    47
    but once you do, it's really really nice.
     
  8. Kiwasi

    Kiwasi

    Joined:
    Dec 5, 2013
    Posts:
    16,860
    Just get a bigger GPU. I'm failing to see the problem here.

    Of course, for most applications we are already past the point where the human eye can actually separate the pixels, so going to higher resolutions really doesn't help much.
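
    That point is just viewing-angle arithmetic: compare the angle one pixel subtends at your viewing distance against the roughly 1 arcminute resolving power usually quoted for 20/20 vision. A quick sketch (screen size, resolution and distance are example values):

        using System;

        // Does one pixel subtend more or less than ~1 arcminute at this distance?
        class PixelAcuity
        {
            static void Main()
            {
                double diagonalInches = 40, hRes = 3840, vRes = 2160;
                double viewingDistanceM = 1.0;

                // Screen width from the diagonal and aspect ratio.
                double aspect = hRes / vRes;
                double widthM = diagonalInches * 0.0254
                                / Math.Sqrt(1 + 1 / (aspect * aspect));
                double pixelPitchM = widthM / hRes;

                // Angle subtended by a single pixel, in arcminutes.
                double arcmin = 2 * Math.Atan(pixelPitchM / (2 * viewingDistanceM))
                                * (180 / Math.PI) * 60;

                Console.WriteLine(arcmin < 1.0
                    ? $"{arcmin:F2}' per pixel: pixels blend together"
                    : $"{arcmin:F2}' per pixel: pixels are still separable");
            }
        }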
     
    Ryiah, Martin_H and N1warhead like this.
  9. gian-reto-alig

    gian-reto-alig

    Joined:
    Apr 30, 2013
    Posts:
    756
    Just because it is relevant to this thread: http://forum.unity3d.com/threads/sc...filtered-and-culled-visibility-buffer.424820/

    Seems like some people are already trying to find this "holy grail of new rendering technology" which would make 4k less of a GPU crusher.

    On the allure of 4K for games: thanks to upgrading to a new OCed GTX 1070, and the 4k 40" screen I bought some time ago at a reduced price, I can finally start testing out 4k for myself.
    Some observations:

    - I can get 4k at 60Hz with a single $500 GPU with max settings in 2016! ... well, for some games at least. Obviously older games not using the latest and greatest technology. But the fact that this not-as-fast-as-hoped GPU still has enough power to reach 60Hz in slightly older titles maxed out is cool as hell.
    - 4k IS looking good... just not as good as many have hoped. Me included. Even games that HAVE been developed to also be playable at 4k sometimes do not look as good as you would hope. The additional pixel count easily lets you see textures that are not so well done; a lot of things that look fine in Full HD look kinda meh once you have the additional pixels.
    My guess is that many devs just added the options to the menu, then MAYBE (more on that next) made sure the UI scales, and called it a day without testing it out much.
    - UI scaling. Really, it's the same mess as with Windows all over again. How hard can it be? ...then again, dabbling in game development myself, I know it is kinda hard, and also way more work than anyone would think.
    Point is, if the UI does not scale, games are hardly usable on a 40" 4k screen. Thank god I am not trying to play those games on a 24" screen, or even worse, a tiny 17" laptop screen!
    - AA... yes, I also was sure that 4k would make me finally abandon my worries about crappy ingame AA, or trying to fry my GPU with downsampling.
    But to be honest, AA is still needed. Even on a 24" screen, specular aliasing on small tris would most probably make AA essential. On my 40" screen, I am currently trying to decide if I should drop to 30Hz for proper AA, or keep just the ingame FXAA (which I could probably switch off just as well) and play at 60Hz.


    I would say 4k is the future. But it's still not there yet, and the biggest problem is not GPU power increasing too slowly, or renderer technology not utilizing that power efficiently enough.
    It's plain and simple the game developers developing for Full HD first and foremost, leaving 4k as an afterthought and often not even investing the time to look into proper UI scaling.

    And really, I personally am just a little bit disappointed because games in 4k don't look as spectacularly better as I might have hoped they would. Which might have been me expecting too much.
    Betting on 8k now... when will the first 8k screens finally come out? Where are the 8k-capable single GPUs? ;)
     
  10. MV10

    MV10

    Joined:
    Nov 6, 2015
    Posts:
    1,889
    This is exactly what happened when HDTVs first became available at cheap prices but most broadcasts (and even stuff like DVD content) were still SD. People were simply shocked at how bad SD really was, but they needed a dramatically better display before it was obvious.

    You want 8K? Got $133K? :)

    http://www.theverge.com/2016/1/5/10713490/lg-98-inch-8k-oled-tv-uh9800-ces-2016
     
    gian-reto-alig likes this.
  11. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194

    Well, I noticed that the Ogre 3D graphics engine developers were talking about the bandwidth limits of deferred rendering at >= 4K. They speculated that Forward+ rendering would be needed to provide performant higher-resolution rendering on current-generation hardware.

    I also noticed a technical blog post about different approaches to this issue when I posted.

    But the GDC 2016 lecture notes on using a visibility buffer look fascinating > http://www.conffx.com/Visibility_Buffer_GDCE.pdf

    Summary:
    Forward rendering does too much overdraw.
    Deferred rendering uses up too much bandwidth at higher resolutions.
    Visibility buffering shades with lower bandwidth overhead than deferred rendering.
    Vulkan and DirectX 12 make visibility buffering practical.

    It sounds like Forward+ (forward rendering with light culling) combined with visibility buffering could allow more performance in 4K+ resolution games and VR (on DX12/Vulkan-level GPUs). A rough sketch of the visibility-buffer idea is below.
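
    The core trick from those slides, as I read them: instead of writing fat G-buffer attachments, each pixel stores only an ID identifying which triangle of which draw call covers it, and a later shading pass re-fetches the vertex data from those IDs. A minimal sketch of the packing (the 8/24-bit split here is my own illustration, not from the slides):

        // One 32-bit value per pixel instead of 12-20 bytes of G-buffer data.
        static class VisibilityBuffer
        {
            // Pack a draw-call ID and a triangle ID into a single pixel value.
            public static uint Pack(uint drawId, uint triangleId) =>
                (drawId << 24) | (triangleId & 0x00FFFFFF);

            // The shading pass recovers the IDs and looks up the geometry.
            public static (uint drawId, uint triangleId) Unpack(uint packed) =>
                (packed >> 24, packed & 0x00FFFFFF);
        }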
     
    Last edited: Aug 25, 2016
    gian-reto-alig likes this.
  12. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    It's just a fill-rate issue. 4k has 4x the pixels of 1080p. If the graphics card is not 4x faster at filling pixels, then the framerate will drop. The same thing happened on iPad when they went from 1024x768 up to 2048x1536 but only bumped the processing performance by 2x in that generation... instant slower performance. It wasn't until the next version (iPad 4?) that the speed bump was 2x again - enough to make up for the fill-rate increase.

    There is not really anything Unity can do to make things faster when it's a fill-rate issue. Maybe a little optimizing here and there, but nothing on the order of a 400% boost. It's purely down to graphics card horsepower. Although maybe more modern APIs would help, like the replacement for OpenGL, for example.
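
    The arithmetic behind that, as a trivial sketch (resolutions from the post):

        using System;

        // Resolution ratio = extra fill throughput needed to hold frame rate,
        // all else being equal.
        class FillRate
        {
            static double Pixels(int w, int h) => (double)w * h;

            static void Main()
            {
                Console.WriteLine(Pixels(3840, 2160) / Pixels(1920, 1080)); // 4.0
                Console.WriteLine(Pixels(2048, 1536) / Pixels(1024, 768));  // 4.0, the iPad jump
            }
        }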
     
  13. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194


    If you look at the presentation slides, you can see that the visibility buffer does not boost performance by 400%, but it can boost it enough to keep frame times under a target threshold (note that further optimisations are being worked on), and this approach uses less memory.
     
  14. Deon-Cadme

    Deon-Cadme

    Joined:
    Sep 10, 2013
    Posts:
    288
    That is situation dependent. It depends on the resolution, the size of the screen, your distance from it, and your eyesight.

    @Arowx - SD, HD, UHD, 4K, the story is the same. It is simply a new transition. 4K just raises the bar yet again and we have to squeeze more performance out of software and hardware... the same problem the industry has been working on since its infancy. This can be achieved through more powerful GPUs, more optimization, new APIs and new tricks... usually a mix of them all. There is already 8K ;)

    Will rendering change again in the future? Certainly, but how and what will it be? Who knows... There is a cost to changing stuff, and Unity will do it when they feel the results are worth it :)

    Oh, remember that the majority is still stuck at 1080p... don't worry, be happy :)
     
    Ryiah likes this.
  15. Bradamante

    Bradamante

    Joined:
    Sep 27, 2012
    Posts:
    300
    I agree that Arowx's threads can be quite frustrating. The issue of different rendering approaches/pipelines being more or less fit for higher resolutions has merit, however. To my knowledge, Apple was able to go for "Retina" resolutions on their mobile devices earlier than the competition thanks to their use of PowerVR hardware, whose tile-based rendering approach handles high resolutions better. Then again, this is a hardware discussion more than a software one.

    PowerVR delivering iPhone GPUs is an interesting turn of events, since PowerVR has been around for a while. Due to their exotic approach, they struggled to gain ground in the gaming market during the '90s. Yet 20 years later they were back all of a sudden, supplying a company that until then had been struggling with gaming. Oh, the irony.

    Apple's move to 4K and 5K resolutions on their desktop iMacs baffled me, however. The Intel iGPUs or 2xx/3xx ATI cards they are or were using were visibly struggling under the load, even with just desktop effects.
     
    Last edited: Aug 27, 2016
  16. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,791
    I'd like to see Unity implement forward+ at some point (not for 4K rendering, just in general).

    I don't really care about 4K at this point to be honest. I'd rather see consistent, properly antialiased 1080p @ 60fps before we make the jump to 4K.
     
    Kiwasi and Martin_H like this.
  17. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    What about Unity adopting clustered forward rendering, as used by Doom and explained here -> http://www.adriancourreges.com/blog/2016/09/09/doom-2016-graphics-study/ (a really good article on Doom's rendering pipeline)? A sketch of the clustering idea is below.
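
    The gist, as the article describes it: carve the view frustum into a coarse 3D grid of clusters, bin the lights into the clusters once per frame, then each fragment only shades the lights in its cluster. A minimal sketch of the cluster lookup (grid dimensions are illustrative, not Doom's actual numbers):

        using System;

        // Map a fragment (screen UV + view-space depth) to its cluster index.
        // Depth slices are logarithmic so near clusters stay small.
        static class Clusters
        {
            const int X = 16, Y = 8, Z = 24; // illustrative grid size

            public static int ClusterIndex(float u, float v, float viewDepth,
                                           float near, float far)
            {
                int cx = Math.Min((int)(u * X), X - 1); // u, v in [0, 1]
                int cy = Math.Min((int)(v * Y), Y - 1);
                float t = MathF.Log(viewDepth / near) / MathF.Log(far / near);
                int cz = Math.Clamp((int)(t * Z), 0, Z - 1);
                return (cz * Y + cy) * X + cx;
            }
        }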

     
  18. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,190
    It's an alternative rendering pipeline. That's it.
     
  19. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
  20. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,190
    Naturally hardware will struggle with massive increases in resolution when it is still largely made for much lower resolutions (usually 1080p/1440p). It needs to push four times as many pixels as before.

    By the way, your link is very dated. The GTX 1080 can handle 4K at playable speeds.

    http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-1080-review,16.html

    Yes, the GTX 1080 is a bit pricey, but a 4K display isn't cheap either.
     
    Last edited: Sep 11, 2016
  21. LaneFox

    LaneFox

    Joined:
    Jun 29, 2011
    Posts:
    7,532
    Someone explain to me how a higher resolution demands an alternative rendering pipeline.

    A good rendering pipeline performs the same for any resolution.
     
  22. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Modern deferred rendering uses depth and G-buffers to help work out which lights affect which triangles and pixels. Normal maps, shadow maps, specular, PBR shading, anti-aliasing and image effects all add to the workload of a GPU.

    So going from 2K to 4K is not just a quadrupling of work for the GPU; it's a big jump in the GPU memory and bandwidth needed to render a frame in 16ms or less (8ms for VR). Some back-of-envelope numbers follow below.

    Unless, of course, we have a smarter, lower-memory/bandwidth rendering pipeline.
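
    Rough numbers, assuming a typical four-target G-buffer of about 16 bytes per pixel, written once and read twice during lighting (all figures illustrative):

        using System;

        // Back-of-envelope G-buffer traffic: bytes/pixel * pixels * accesses * fps.
        class GBufferBandwidth
        {
            static void Main()
            {
                const double bytesPerPixel = 16, readsPerFrame = 2, fps = 60;
                foreach (var (name, w, h) in new[] { ("1080p", 1920, 1080),
                                                     ("4K",    3840, 2160) })
                {
                    double gbPerSec = (double)w * h * bytesPerPixel
                                      * (1 + readsPerFrame) * fps / 1e9;
                    Console.WriteLine($"{name}: ~{gbPerSec:F1} GB/s of G-buffer traffic");
                }
            }
        }

    At 4K that estimate is roughly four times the 1080p figure, which is the bandwidth jump being described.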
     
  23. LaneFox

    LaneFox

    Joined:
    Jun 29, 2011
    Posts:
    7,532
    So the real question here is more like "Can we improve the rendering pipeline to meet growing demands?"
     
  24. passerbycmc

    passerbycmc

    Joined:
    Feb 12, 2015
    Posts:
    1,741
    Hehe, this is why I'm really loving 1440p. I've got two 27-inch 1440p displays. That gives me lots of extra screen real estate and is especially nice for working in Visual Studio or Maya, where I can get a large viewport or have multiple documents open side by side. Also, I can actually have a reasonably sized display on a desk and find my cursor at the same time. Not to mention getting good framerates in games on only a GTX 970. 4K is still a little too new and not quite ready yet, and I find it is always good to stay one step behind the bleeding edge, tech-wise.
     
  25. gian-reto-alig

    gian-reto-alig

    Joined:
    Apr 30, 2013
    Posts:
    756
    How about upscaling?

    That seems to be the big buzzword lately for Sony. They are pushing out a console that is finally powerful enough for 1080p/60Hz, but after they sold an underpowered predecessor (the original PS4) as being able to handle that, they had to one-up their marketing to explain to people WHY this improved PS4 was needed.

    They cleverly made it look like the PS4 Pro is capable of 4k, thanks to that clever upscaling tech. Talk about marketing wash.


    Thing is, people who have seen it claim it works better than you would think. Seems like a half-step between Full HD and 4k really, but depending on the art style you might not even notice much of a difference.

    Now, I am not sure if this is a feature that needs to be supported by the hardware (I guess it does), or if current-gen GPUs support it (though if the PS4 Pro can do it, which has basically just a clocked-down RX 480 soldered to its CPU, it's most probably off-the-shelf tech, at least for AMD GPUs).
    That might be something to concentrate on until GPUs really do catch up (which might take another 2 years for such GPUs to reach the mainstream).
    And it might be a good time to look into whether such upscaling tech could be cleverly linked to anti-aliasing, so you could do both in a single step. And maybe finally get some good anti-aliasing working for deferred renderers.


    BTW, you can get 4k going today with performance GPUs. I play some games at 4k/60Hz with my overclocked GTX 1070. That is a $450 or so card. Not exactly cheap, but affordable enough.
    Of course, we are not talking about the most taxing AAA games ever here. World of Tanks and World of Warships are optimized for the average Russian toaster, so they tend to go light on the most high-end effects, view range and stuff like that. The games still look fairly good, especially in 4k.
    If only there was enough power left for good downsampling AA, but of course, I barely reach 60Hz with all settings maxed out, so additional AA is not possible right now.
     
    Last edited: Sep 12, 2016
  26. MV10

    MV10

    Joined:
    Nov 6, 2015
    Posts:
    1,889
    It isn't too surprising. I run a 1080p projector in my living room on a 150" screen, and depending on the format of the show or media I'm watching it does all kinds of stretching and scaling. It's generally not easy for me to detect, and likely impossible to call out for the average viewer who doesn't know anything about that kind of thing. (Well, other than SD material, of course... which is almost like being struck blind, lol.)
     
  27. Lockethane

    Lockethane

    Joined:
    Sep 15, 2013
    Posts:
    114