
Anyone tried Unity on the 5k iMac?

Discussion in 'General Discussion' started by adslitw, Nov 5, 2014.

  1. Moonjump

    Moonjump

    Joined:
    Apr 15, 2010
    Posts:
    2,571
    There are apps on the Mac App Store that allow you to set the resolution to native, so everything will be really tiny, but you can test the preview at full size. I tried a free one (Display Menu) on my rMBP and it worked fine, but it did not return to a proper retina mode until I restarted the computer.

    It is a usable workaround for when you want to do the bigger previews, but it is not a substitute for retina Unity.
     
  2. Ippokratis

    Ippokratis

    Joined:
    Oct 13, 2008
    Posts:
    1,521
    Moonjump likes this.
  3. schmosef

    schmosef

    Joined:
    Mar 6, 2012
    Posts:
    851
    @badweasel, Feel free to ship your discarded iMac kit to me. Or I can pick it up if that's more convenient. :D

    HAHAHA!
     
  4. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,500
    I was wondering about that. I'd like a larger monitor for home and was looking at 4K screens, but they're only an inch larger than my work monitor despite having a much higher pixel density, so things will still be tiny. Plus, even with a GTX 970 I expect that many pixels to hurt rendering performance, and I don't necessarily think they'll make images look noticeably better.
     
  5. badweasel

    badweasel

    Joined:
    Jan 11, 2015
    Posts:
    18
    @schmosef yeah right. I sold my Late 2013 iMac for $500 less than I paid for it and then bought the 5K for about $800 more. In another year hopefully Apple will have a decent gfx card in there and I'll be able to afford to do it again.
     
  6. Mishaps

    Mishaps

    Joined:
    Nov 28, 2011
    Posts:
    181
    @angrypenguin retina support is an OS X feature that stops things looking tiny on screens with a high pixel density. I heard something about Windows 10 having a similar feature.
     
  7. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,500
    That's great for general applications, but for 3D (which is where I spend the majority of my time) it doesn't help. I'm still spending disproportionately more rendering power for a diminishing gain in image quality, especially where good AA is available.

    My 27" 1440p monitor at work seems to be a great all-rounder - plenty of space, pixels are a decent size, standard text is quite readable, and the resolution isn't so high that 3D rendering is a chore (which is important as I often have rendering on multiple monitors).
     
  8. Eric5h5

    Eric5h5

    Volunteer Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,398
    Sounds like it could be a good idea to start putting "retina mode" in games, where you have the 3D graphics computed at 2K and the UI overlaid at 4K (or whatever's appropriate for the actual monitor). That way you get most of the benefit of a high DPI display without needing an ultra-high-end card.

    --Eric
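    Eric's split-resolution idea can be sketched in Unity today. The following is my own rough, untested sketch (the class name and the 0.5 scale factor are assumptions, not anything shipping): render the 3D camera into a reduced-size render texture, blit it up to the native back buffer, and let a Screen Space Overlay canvas draw the UI on top at full resolution. Pushing the scale above 1 gives you supersampling instead.

```csharp
using UnityEngine;

// Rough sketch of the "retina mode" idea above: render the 3D scene at a
// fraction of native resolution, then let a Screen Space Overlay UI draw at
// full native resolution on top. Attach to the main 3D camera. The class
// name and the 0.5f default are my own choices, not anything from Unity.
[RequireComponent(typeof(Camera))]
public class HalfResRenderer : MonoBehaviour
{
    [Range(0.25f, 2f)]
    public float renderScale = 0.5f;   // 0.5 on a 4K screen ~= rendering at 2K

    private Camera cam;
    private RenderTexture lowResTarget;

    void OnEnable()
    {
        cam = GetComponent<Camera>();
    }

    void OnPreRender()
    {
        int w = Mathf.Max(1, (int)(Screen.width * renderScale));
        int h = Mathf.Max(1, (int)(Screen.height * renderScale));
        lowResTarget = RenderTexture.GetTemporary(w, h, 24);
        cam.targetTexture = lowResTarget;   // 3D renders at the reduced size
    }

    void OnPostRender()
    {
        cam.targetTexture = null;
        // Upscale to the full-resolution back buffer; a Screen Space Overlay
        // canvas still draws afterwards at native resolution, so UI stays sharp.
        Graphics.Blit(lowResTarget, (RenderTexture)null);
        RenderTexture.ReleaseTemporary(lowResTarget);
    }
}
```

A scale greater than 1 uses the same mechanism for supersampling, at the cost of fill rate and render-texture memory.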
     
  9. badweasel

    badweasel

    Joined:
    Jan 11, 2015
    Posts:
    18
    Right.. but iOS devices ARE retina, and I have to support retina on iOS devices, at least if you ever want to be featured by Apple in the App Store. All my games (pre-Unity) were retina and 60fps. The one time I shipped a single non-retina graphic (because it was a little expensive to compute), my insider at Apple complained that the element was blurry and unacceptable.

    What I'm hoping for right now is just IDE retina support. Xcode's Simulator has a Scale mode where you can emulate the iPhone screen at full size, 75%, 50%, or 25%. That allows you to preview the placement of elements on a retina device, scaled down. The Game window in Unity is completely literal. For us mobile people it would be really swell if we could see the entire 1080-wide by 1920-tall iPhone 6+ screen in a window and have the engine tell us the screen size is 1080x1920. Couldn't the Unity engine do a post-process scale-down for mobile if needed?

    I purchased the new 5k iMac hoping that with my 5120-by-2880 screen I'd be able to see a pixel for pixel tall iPhone 6+ Game window. I can now in Xcode/Simulator where I couldn't before.
     
    angrypenguin likes this.
  10. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,500
    Yeah, this. Even on my 27" monitor I can't show a whole iPad 3 in portrait mode.
     
  11. Mishaps

    Mishaps

    Joined:
    Nov 28, 2011
    Posts:
    181
    I've said it before in other threads: although Unity is a bit late to the party with retina support, they've been busy with IL2CPP (and Unity 5), so I understand the delay. Without 64-bit builds we wouldn't even be able to ship, and that would have lost them customers. I'm confident they'll bring it out soon enough (fingers crossed!).
     
  12. schmosef

    schmosef

    Joined:
    Mar 6, 2012
    Posts:
    851
    Hopefully retina support will be released in time for the rumored 8K iMac.
     
  13. Mishaps

    Mishaps

    Joined:
    Nov 28, 2011
    Posts:
    181
    Personally, I don't see 8K making sense on the iMac. It's already pushing the best available GPU at 5K, and the pixel density is high enough that you don't see the pixels. Maybe if they squeezed in dual GPUs, but even then it would be a better experience at 5K with the higher framerate. Perhaps an 8K 30-inch display for the Mac Pro or something...
     
  14. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,500
    Yeah, I agree with that. There's a point where adding more pixels doesn't (practically) improve image quality for many use cases. For photographers, graphic designers, or other image professionals I can see it being valuable. But for average desktop use not so much, and for 3D the prohibitive rendering cost is something we don't/shouldn't want to pay while there are other, bigger increases in quality we can get for less computational power.
     
  15. Mishaps

    Mishaps

    Joined:
    Nov 28, 2011
    Posts:
    181
    I think Apple went higher than previous models on the iPhone 6 Plus, but I suppose your eyes get closer to the screen on that than they do to a desktop screen. Samsung went even higher in PPI with the S6, but that just wastes GPU power on pixels you never see, IMHO.
     
    angrypenguin likes this.
  16. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,500
    Yeah, it's a bit of a race for higher numbers without thought for the practicality of the matter.
     
  17. Eric5h5

    Eric5h5

    Volunteer Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,398
    I dunno...without anti-aliasing, I can clearly see the pixels on my iPod touch 5 screen (326 ppi). 500+ ppi seems reasonable. When it gets over 1000, then I'll start questioning it. ;)

    --Eric
     
  18. Mishaps

    Mishaps

    Joined:
    Nov 28, 2011
    Posts:
    181
    The iPhone 6 Plus is about 400 ppi; it's crazy that they actually scale down graphics for it and call it "Retina x3", as opposed to Retina x2 on the iMac and older iPhones. Even though the physical pixels are 1920x1080, it's treated as 2208x1242. Screenshot sizes on iTC are just plain huge now. Back in 2009 it was so much easier; all we had to worry about was 320x480...
     
  19. badweasel

    badweasel

    Joined:
    Jan 11, 2015
    Posts:
    18
    Some great points.. I want to like and discuss them all.

    8K iMac? Please, Apple, get the 5K iMac graphics running smoothly first. Like I said, I went to work in Substance Painter and it lags so badly it's unusable. I can't afford a $10k Mac Pro, so I hope that was just a fluke. And I have maxed-out RAM too. I'd much rather see a much better video card on the iMac and keep it at 5K. Heck, honestly, I'd rather have a non-retina iMac and be able to plug in a 4K external via Thunderbolt 2.

    Anti-aliasing and screen sizes: I had this all figured out when I was working just in OpenGL. I have a lot to re-learn in Unity yet. I need to play with render settings. Is there anti-aliasing now on the iPhone? I guess I see a setting in Quality. Can you set that per iOS device to disable stuff on older devices? Or does it start with Fantastic and then lower it if it has to on older devices? Is there documentation somewhere about that? (Sorry if too far off topic.)
     
    Last edited: Apr 28, 2015
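    On the quality question above: as far as I know, Unity does not lower the quality level automatically per device; each platform just gets the default level you pick in Edit > Project Settings > Quality, and anything smarter is done yourself at startup. A minimal sketch, assuming you branch on VRAM (the 512MB threshold and the quality-level index are arbitrary placeholders, not recommendations):

```csharp
using UnityEngine;

// Sketch of picking MSAA / quality level at startup on mobile.
// Unity applies the per-platform default quality level; switching per
// device is up to you. Thresholds below are placeholders.
public class MobileQualitySelector : MonoBehaviour
{
    void Awake()
    {
        // QualitySettings.antiAliasing is the MSAA sample count (0, 2, 4, 8).
        if (SystemInfo.graphicsMemorySize < 512)
        {
            QualitySettings.antiAliasing = 0;          // older devices: skip MSAA
            QualitySettings.SetQualityLevel(1, true);  // drop to a lower preset
        }
        else
        {
            QualitySettings.antiAliasing = 4;          // 4x MSAA on newer hardware
        }
        Debug.Log("Quality level: " + QualitySettings.GetQualityLevel()
                  + ", MSAA: " + QualitySettings.antiAliasing);
    }
}
```

Attach it to any object in your first scene; the QualitySettings and SystemInfo calls shown are standard Unity API.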
  20. Mishaps

    Mishaps

    Joined:
    Nov 28, 2011
    Posts:
    181
    @badweasel have you tried dropping the res on your iMac to boost the framerate? Or, if Substance is its own program, what about switching it to low-res mode to effectively disable retina? That might give you more GPU grunt.
    I noticed Blizzard have Heroes of the Storm defaulting to half res on it to keep the glorious 60fps flowing. At 5K it chugs a bit...
     
  21. badweasel

    badweasel

    Joined:
    Jan 11, 2015
    Posts:
    18
    @Mishaps Thanks I'll give it a try.
     
  22. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,500
    That's exactly it - without AA. I'm just of the opinion that better AA is a more effective direction to head in than adding even more pixels to the current high-DPI displays we have.
     
  23. Eric5h5

    Eric5h5

    Volunteer Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,398
    But if you have enough pixels you don't have to worry about AA at all, so technical stuff like AA not working with deferred rendering unless you apply iffy image effects becomes irrelevant. (I'm playing devil's advocate a bit considering current technology, but I think that probably is the eventual end game.)

    --Eric
     
  24. Mishaps

    Mishaps

    Joined:
    Nov 28, 2011
    Posts:
    181
    We've never bothered with AA now that everything's retina; it just didn't seem worth the framerate hit.
     
  25. Moonjump

    Moonjump

    Joined:
    Apr 15, 2010
    Posts:
    2,571
    I think an 8K iMac would be a bigger model sitting above the 5K iMac, perhaps 32".
     
  26. Mishaps

    Mishaps

    Joined:
    Nov 28, 2011
    Posts:
    181
    I'd say the 8K rumour was a mistake, personally. It was only ever raised once on the rumour sites, and even then labeled "dubious". There's just not enough grunt in the iMacs to power it, and I'd bet Apple would sooner go for a thinner case than put dual GPUs in. If anything, a Mac Pro display...
     
  27. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,500
    Sure, but increasing the resolution requires a quadratic increase in applied computing power in and of itself (doubling the resolution quadruples the number of pixels - this is significant, especially on fill-rate-limited mobiles), and the increase in pixel count can in turn make other things more expensive. Or, modern AA approaches can get great results, which may be practically equivalent, with a fraction of the expended resources and on hardware that's cheaper to make.

    Remember, we're not just talking about "RAM is cheap" and "GPUs are fast" here. People also have to make the screens to display these extra pixels (manufacturing costs), images have to include the extra pixels for it to be of any use (bandwidth and storage costs, sometimes content creation costs as well), so on and so forth.

    To be clear I'm not saying that current high-DPI displays are a waste. I'm saying I think they've reached the point of diminishing return where even higher DPI would be a waste outside of a few niche areas.
     
    Ryiah likes this.
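    The quadrupling is easy to eyeball with the resolutions discussed in this thread. A quick standalone C# snippet (nothing Unity-specific) that prints each mode's pixel count relative to 1080p:

```csharp
using System;

// Pixel counts for common resolutions, relative to 1080p.
class PixelCounts
{
    static void Main()
    {
        string[] names   = { "1080p", "1440p", "4K", "5K", "8K" };
        int[]    widths  = { 1920, 2560, 3840, 5120, 7680 };
        int[]    heights = { 1080, 1440, 2160, 2880, 4320 };
        double basePx = 1920.0 * 1080.0;

        for (int i = 0; i < names.Length; i++)
        {
            double px = (double)widths[i] * heights[i];
            Console.WriteLine("{0,-6} {1,5:F1} Mpx  ({2:F1}x 1080p)",
                names[i], px / 1e6, px / basePx);
        }
        // 4K works out to 4.0x the pixels of 1080p, 5K to about 7.1x, and
        // 8K to 16.0x - which is why "just add pixels" gets expensive fast.
    }
}
```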
  28. Mishaps

    Mishaps

    Joined:
    Nov 28, 2011
    Posts:
    181
    200-300ppi for a desktop and 300-400ppi for a phone seems to work pretty well. 500+ seems a waste of GPU power to me.
     
    angrypenguin likes this.
  29. Games-Foundry

    Games-Foundry

    Joined:
    May 19, 2011
    Posts:
    632
    We support supersampling in Folk Tale (via a console command), so even before we had a 4K monitor we were able to render the camera at 4K then resize down to 1080p (or native) each frame. On a single 780 Ti we can do this and still maintain 60FPS on ultra settings, so with a 970 you should have no problems. This approach significantly improves clarity, helping detail to stand out. We also support downsampling for low end machines, so the world can be rendered in lower than native resolution, but the UI remains native. Normally you can get away with 80% of native without too much degradation, while gaining a performance boost on GPU or fill rate limited machines.

    Capturing 4K video footage for us will require further hardware investment. My main dev box has an SSD array, and the write speed isn't fast enough. I haven't tried yet with an eSATA drive, and if that doesn't work there is always a separate machine with a hardware capture device, but I doubt we'll go that far.

    Even though most of us on desktop are producing 1080p assets (screenshots and video), there is always a fidelity boost to be had in rendering higher, then scaling down, whether for video or images.
     
    angrypenguin likes this.
  30. Deleted User

    Deleted User

    Guest

    Hi, I have a late 2015 5K iMac in the max configuration. All Unity-based games, and the editor itself, aren't working; I only get a black screen. But here's the interesting thing: from time to time it starts working, until I restart. I hate this behaviour.
     
  31. passerbycmc

    passerbycmc

    Joined:
    Feb 12, 2015
    Posts:
    1,739
    Seems I'm a little late, but for dev on the macOS side of things, I've been on the late 2013 27-inch iMac for a while now. 1440p for its internal display and my extra display is a great compromise, since pretty much everything supports that res while still giving a very noticeable amount of extra screen space over 1080p. The iMac itself is more than beefy enough for Unity work today as well: i7 3.5GHz, 32GB RAM, and a GTX 780M.