
Stupid question by a Mac-Developer

Discussion in 'General Graphics' started by DeepShader, Mar 25, 2018.

  1. DeepShader

    Joined:
    May 29, 2009
    Posts:
    682
    Hi there,
    maybe this is a stupid question, but I have to ask :p

    Since there's the Apple Metal API v2, what is the reason why some shaders still only work on DX11 or higher?

    Why is there no universal shader we could use, one that automatically uses the best the operating system/graphics card can offer at runtime?

    I don't know how they do it, but since 4.18 it seems to be possible in Unreal Engine to get nearly the same results as under Windows with DX.

    So my question is: why isn't this possible in Unity yet?
     
    DaftPuzzler likes this.
  2. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,230
    It is possible. The Standard Shader does most of this for you already.

    However, there are features of D3D11 which are usable by Unity's ShaderLab for which Metal has no direct equivalent (and explicitly, purposefully never will have), whereas for Unreal those features are not exposed to their node-based material system and can only be accessed by modifying the engine source code. Specifically, geometry shaders are not available in Metal; you have to use a compute shader instead.

    If you have a custom shader and use #pragma target 4.5, you have access to only the features that both D3D11 and Metal have. Unity also lets you use displacement tessellation with Metal by automagically converting D3D11-style tessellation to the Metal style, which Unreal also does.
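    As a minimal sketch (the shader name, property, and trivial fragment logic here are just placeholders), a custom shader restricted to that shared feature set looks something like this:

        Shader "Example/Target45Unlit" {
            Properties {
                _Color ("Color", Color) = (1,1,1,1)
            }
            SubShader {
                Pass {
                    CGPROGRAM
                    // Limit compilation to the feature set D3D11 and Metal share
                    // (roughly Shader Model 5.0 minus geometry shaders).
                    #pragma target 4.5
                    #pragma vertex vert
                    #pragma fragment frag
                    #include "UnityCG.cginc"

                    fixed4 _Color;

                    float4 vert (float4 vertex : POSITION) : SV_POSITION {
                        return UnityObjectToClipPos(vertex);
                    }

                    fixed4 frag () : SV_Target {
                        return _Color;
                    }
                    ENDCG
                }
            }
        }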
     
    DaftPuzzler and theANMATOR2b like this.
  3. DeepShader

    Joined:
    May 29, 2009
    Posts:
    682
    Thank you for your answer :)

    OK, help me understand... My iMac Pro has a Vega 64 card, and a lot of new Macs have better graphics cards than old Macs ever had. Windows uses DirectX, and on macOS there is Metal.

    Let's talk, for example, about the Adam demo or the new Book of the Dead demo. Both run on Windows only.

    But why? Why can Unity run these things on Windows only?

    I can't understand why Unity Technologies isn't interested in supporting new Mac technologies like Metal v2 more quickly.

    I talked to some Unreal Engine Mac users, and they told me the difference between Unity and Unreal on the Mac is that Unreal supports Shader Model 5.0 on Macs, which makes it possible to get AAA graphics on a modern Mac.

    So it seems there's a way to get graphics like Adam or Book of the Dead on a Mac.

    So my question again: why isn't this possible in Unity yet (I mean the technical reason)?


    Or will Vulkan be the solution for this Unity/Mac problem?
     
    Last edited: Mar 25, 2018
  4. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,230
    The original Adam demo mostly predates Unity's MacOS Metal support, so it makes sense for it to be DX11 only.

    Book of the Dead isn't out yet, and is going to be using the HD Scriptable Render Pipeline for Unity 2018.1 (or 2018.2) once (if?) released publicly. I haven't seen any mention of it being DX11 only.

    So does Unity, at least as much as Unreal does. MacOS cannot really support "Shader Model 5.0", since OpenGL 4.1 is missing compute shader support and Metal is missing geometry shader support, and both are part of DX11's "Shader Model 5.0". Unity has that #pragma target 4.5 I mentioned earlier, which is effectively Shader Model 5.0 without geometry shaders, which is essentially what Metal offers. For Unity 2018.1 they added support for Metal tessellation, though I don't know what shader target is needed for that, as the documentation hasn't been properly updated.
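    For reference, the D3D11-style displacement tessellation in question is the surface shader tessellate: path; a trimmed sketch (the property names and values are arbitrary, and the target 4.6 here is the one used for D3D11 tessellation, not necessarily what 2018.1's Metal path wants) looks roughly like this:

        Shader "Example/FixedTessellation" {
            Properties {
                _Tess ("Tessellation", Range(1, 32)) = 4
                _MainTex ("Base (RGB)", 2D) = "white" {}
                _DispTex ("Displacement Map", 2D) = "gray" {}
                _Displacement ("Displacement Amount", Range(0, 1.0)) = 0.3
            }
            SubShader {
                Tags { "RenderType"="Opaque" }
                CGPROGRAM
                // Fixed-amount tessellation with vertex displacement; per the posts
                // above, Unity converts this D3D11-style setup for Metal automatically.
                #pragma surface surf Lambert addshadow vertex:disp tessellate:tessFixed
                #pragma target 4.6

                struct appdata {
                    float4 vertex : POSITION;
                    float4 tangent : TANGENT;
                    float3 normal : NORMAL;
                    float2 texcoord : TEXCOORD0;
                };

                float _Tess;
                float4 tessFixed () { return _Tess; }

                sampler2D _DispTex;
                float _Displacement;
                void disp (inout appdata v) {
                    float d = tex2Dlod(_DispTex, float4(v.texcoord.xy, 0, 0)).r * _Displacement;
                    v.vertex.xyz += v.normal * d;
                }

                struct Input { float2 uv_MainTex; };
                sampler2D _MainTex;
                void surf (Input IN, inout SurfaceOutput o) {
                    o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
                }
                ENDCG
            }
            FallBack "Diffuse"
        }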

    You mean with MoltenVK? It only solves so much, as it still doesn't support geometry shaders. Otherwise, Apple has stated they will not be implementing native Vulkan support, even though the GPUs in the iMac Pro could clearly handle it.
     
    DaftPuzzler likes this.
  5. DeepShader

    Joined:
    May 29, 2009
    Posts:
    682
    Thank you again for your answer :)

    So the only difference between DX11 and Metal for Unity is the geometry shader? And this shader technology is the reason why stuff on Windows looks so much better?

    If yes... why doesn't it look as good in OpenGL?

    If no... what is the technical difference between DirectX and the APIs that run on macOS on a modern Mac, such that graphics still look "bad" in comparison to Windows?
     
  6. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,230
    Not exactly. There are actually some features that Metal has which DX11 does not. However, generally speaking, geometry shaders aren’t even used much anymore in AAA game dev, though they do get used by a lot of Unity game devs.

    Nope. It has nothing to do with that.

    Unity’s original Metal support was incomplete, and mainly focused on mobile rather than desktop. Even though Metal-capable phones are essentially able to use most Shader Model 5.0 features, they don’t have the performance to match, so Unity’s Metal support dropped the visual quality down to the same mobile variants used for OpenGL ES 3.0. Using Unity’s built-in features for 5.6 or 2017.3 should look identical between Windows and MacOS (running OpenGL).

    Outside of Unity, any visual differences between DX11 and OpenGL on MacOS either come down to Macs generally being underpowered vs PCs, or the lack of compute shader support prior to Metal. Now that there’s Metal support on MacOS, there’s not really a big push to get games onto the Mac, since it’s not a big market for modern gaming. It’s hard as a dev to want to spend time getting graphically intense games onto the Mac platform when the only Mac computers out there with any graphics chops are the $5k+ iMac Pro (about equivalent to an $800 PC in terms of gaming performance) and the even more expensive top-end Mac Pro, which is actually slower.
     
    Last edited: Mar 26, 2018
    Ony, PeterB and theANMATOR2b like this.
  7. elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    The bit about mobile focus is news to me, probably because I have been focused on compute shaders rather than normal shaders. In this regard I never felt like the Metal support in Unity was particularly mobile-focused, and I've had some great results on the desktop, especially once we were past a certain version of Unity and macOS.

    Given the relatively little traction on this front due to the Mac GPU hardware constraints you mention, I'm rather impressed with how much work has been done in both Unity and UE4 to get higher-end features working on macOS. Cobbling together something with far more GPU grunt than a normal Mac (replace the card in an old tower Mac Pro, build a hackintosh, or use an external GPU in a Thunderbolt 3 enclosure with an iMac or MacBook Pro, use Nvidia web drivers, etc.) tends to indicate that the potential is there and largely realised, although I am not suggesting performance is identical to Windows.
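    For anyone wondering what those compute shaders look like in Unity terms, they're plain HLSL kernels in a .compute asset that you dispatch from script; a minimal sketch (the kernel name and output texture are arbitrary examples) is just:

        #pragma kernel CSMain

        RWTexture2D<float4> Result;

        [numthreads(8, 8, 1)]
        void CSMain (uint3 id : SV_DispatchThreadID)
        {
            // Write a simple procedural pattern into the output texture.
            Result[id.xy] = float4(id.x & id.y, (id.x & 15) / 15.0, (id.y & 15) / 15.0, 1.0);
        }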
     
  8. elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    There is stuff about the pragma target in the 2018.1 beta release notes. It's a two-step approach, based on relaxing the requirements for pragma target 5.0, plus a new granular way of checking for supported features.
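    If I remember the 2018.1 syntax right, the granular part is declared inside the shader itself; something like the fragment below (treat the exact keyword names as approximate, I haven't double-checked them against the final docs):

        CGPROGRAM
        // Relaxed baseline target, plus explicit per-feature requirements; an API
        // that can't provide a listed feature (e.g. geometry shaders on Metal)
        // rejects the shader at compile time instead of silently falling back.
        #pragma target 4.5
        #pragma require geometry compute
        // ... vertex/fragment pragmas and the rest of the shader as usual ...
        ENDCG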

    I think I've seen some further docs regarding the granular checking but don't remember where right now.
     
  9. DeepShader

    Joined:
    May 29, 2009
    Posts:
    682
    OK, this is a lot of information and I'm very thankful :)

    But... it's still unclear to me why it's not possible to build a game with high-end graphics on my iMac Pro with its Vega 64 card; one which maybe won't run on many Macs, but which I could also build for Windows (from the Mac).

    So... what is the technical reason that it's not possible to develop a game with high-end graphics on the latest macOS with the latest Unity (2018.1 beta), one which could run on a high-end Mac or a "normal" Windows computer?

    It seems to be possible in Unreal, so what is the technical difference in this case between Unity and Unreal if both support Metal on macOS?
     
  10. elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    OK, I will be very specific: it's a combination of you being wrong about certain particular things, some of your questions being far too vague, and also the timing as regards the evolution of Unity and where things stand with 2018.1.

    'High-end graphics' is a vague term. Ideally you need to be far more specific, and visual aids such as screenshots of the same Unity scene on Windows and on macOS would help.

    You are wrong about Unity in terms of how much effort they've put into supporting Metal on macOS. Some of this has already been explained in other posts in this thread, and I'm not trying to argue about what Unity's top priority has always been or the amount of time it's taken to get certain things supported in Metal. Rather, my point is again to do with timing, and the foundation that has been built...

    First, the past: I've had some of the systems that made the scenes compelling in the Adam demo working just fine on my compute-capable Metal Mac. Specifically the volumetric fog & area lights stuff, which is also available on GitHub on its own. Sometimes there are things in these demos that haven't worked on macOS or Metal, but often most of it has worked, and it's just a question of Unity promoting it being run on Windows because they can use quite extravagant hardware and they know quite a few devs and users have beefy gaming systems too. But they haven't ignored the Mac because of this...

    The present: the future of Unity high-end graphics is the HD scriptable pipeline and whatever other high-end pipelines other people make. The HD pipeline requires compute capabilities, and all the work Unity did in the past to make Unity compute shaders compile to and work with Metal pays off. macOS machines with Metal are very much a target of the new HD pipeline, and they would even like to extend this to high-end mobile Metal devices one day.

    But, as of 2018.1 the HD pipeline is not finished and is not considered production ready. And the focus will have been on Windows and consoles more than macOS at this point. But these 'it's early days' issues apply to all platforms more or less, e.g. Shader Graph is not totally ready for the HD pipeline yet, and lots of the HD pipeline has worked when I've briefly tried it on my Mac.

    Also note that, in terms of Unity being confident about Metal, it's only in 2018.1 that macOS Metal support inside the editor is turned on by default and no longer considered experimental. So all these pieces are only just coming together, and I expect to see further progress in 2018 for the HD pipeline in general, including on the Mac.

    So, I really don't think the current situation deserves to be explained by a 'why is the Mac-Unity combo so far behind' narrative. It's more a question of the whole of Unity's high-end graphics future being at a delicate moment right now. A moment where various non-Mac-specific pains may be felt, and a few Mac-specific ones at times maybe, but I seriously feel we are much closer to parity now than has been the case for many years. It's not enough for Metal to simply exist; Metal within Unity had to mature and Metal itself has had to mature, and I'm far happier with how the timing has panned out on these fronts so far than I am, for example, with what high-end Mac hardware is actually available to buy right now.

    Additional complication for people playing the games on a Mac in future:

    Those lovely high-res screens that many models have come with their own performance implications, in the same way that some PC gamers won't touch 4K gaming because they favour framerate.

    Additional complication for developers looking to develop 'high-end' scenes on either Mac or Windows:

    How much RAM have you got? Neither UE4 nor Unity works magic when loading very large assets into large scenes. Not an OS-specific issue, but it's one of the barriers to working on this stuff with your dev machine that may affect Mac users a bit more due to the number of Mac models that don't feature upgradable RAM!
     
  11. elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Plus there are all the reasons why particular shiny graphics might not be included in a game on some platforms due to performance considerations other than just the GPU. Some assumptions are bound to be made by devs as to the average power available on different platforms, and the hardware Apple has sold for many years does no favours to those perceptions, even if you happen to have a Mac that has a bit more power on the GPU front.

    That won't stop me using a game engine to develop graphics that I can run on a Mac with a 1080 Ti in it if I want, since the power is/will be available to me in the engine, but I certainly won't expect to see lots of similar eye candy targeting this largely fictitious level of Mac GPU power. Mind you, we do live in an age where this doesn't have to be a fiction, because eGPUs are an officially supported option now, but at this stage not one I think developers are placing many bets on (similar story with macOS VR).

    Also, with all this focus on graphics, I should have taken time to say that there are other reasons why the shiny extras may be switched off, e.g. if other parts of Unity perform slower on macOS than Windows. And here again we have the same story: this is a moment of change for Unity where efforts regarding game performance have been focused on the Job System/ECS/Burst, and many of these systems are still at the experimental stage. It would not surprise me if ECS/Burst are not performing as well on macOS as on Windows right now, but again it's early days, and Burst is not even available in standalone builds yet if I remember correctly. So anyway, more parts of Unity where the Mac is not being ignored; it may not be treated as priority #1, but it certainly isn't languishing far down the priorities list. I look forward to the modular Mac Pro in the hope that all this love will find a most suitable target :) Failing that, the iMac Pro would be quite good for development when paired with a beefy GPU in a Thunderbolt 3 enclosure, although I understand there is quite a lot of thermal throttling. I might live with a top-end machine from the non-Pro iMac lineup, again paired with an eGPU.
     
    mahaloKahuna and PeterB like this.
  12. gecko

    Joined:
    Aug 10, 2006
    Posts:
    2,238
    I appreciate this discussion and especially @elbows' comments. If I may steer it slightly OT: I've got a 2017 MBP with a Radeon 560 GPU, but I'm finding very poor performance with Unity -- only modestly better than my old 2013 MBP with an Nvidia 750M GPU. I've got a test scene that is simply a small terrain with some grass planted with Vegetation Studio -- when I run the build on my 2017 MBP, I get about 40 fps, compared to 140 fps on my old PC with an Nvidia 750 Ti GPU. I know that Windows will perform better than Mac, and I know that's a crude test, but I'm seeing the same poor performance consistently on the Mac. I ran Cinebench on my MBP and got a score of 83, which roughly matches what I've found in published benchmarks online. Any thoughts? Does this seem right to you?
     
  13. orb

    Joined:
    Nov 24, 2010
    Posts:
    3,033
    @gecko:
    The RX 560 isn't easy to get the specs of, because AMD released THREE different versions over time (articles you find will say two, but Wikipedia has three models). They're AMD's equivalent to the Nvidia *50M cards, and each newer variant is clocked lower. Exact benchmarks I dunno about though.

    But there's a pretty big leap in real-world performance between the RX 560 and RX 570, possibly because the 560 isn't intended for mobile or desktop use specifically, but is clocked lower/throttles down when heat-constrained (at least that's been the case before when there's no M model of a GPU). I think Apple typically underclocks mobile parts even more than reference specs. As long as all their apps and the OS itself run well, other apps be damned. Maybe they'll care about whatever Pixar uses (but they're desktop + render farm users, mainly).

    My iMac has the 570, and the performance is roughly on par with my gaming PC, which has a 290. Both play 1080p to 1440p games well, including all the latest. Both are a joy to use with the Unity or Unreal editors. It's just a shame about the lack of shader support etc.
     
  14. gecko

    Joined:
    Aug 10, 2006
    Posts:
    2,238
    @orb My MBP has the 560 Pro, but yeah, I suppose Apple is probably throttling it down. Sigh.
     
  15. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,230
    The (as of right now Apple MacBook Pro and iMac only) Radeon Pro series is not the same as the Radeon RX series found in PC desktops and laptops. It's quite a bit slower than even the stealth "updated" slower version of the RX 560. A rough estimate is that it's about 75% as fast as the RX 560, based on the on-paper specs.

    As mentioned above, the 570s are significantly faster than the 560s. The RX 570 is roughly twice as fast as the RX 560, and the same is roughly true of the Pro 570 and Pro 560.

    Comparing the on-paper specs of the Nvidia GT 750M, the Radeon Pro 560 should be at least twice as fast, and synthetic benchmarks seem to show that too. However, I've seen and heard many reports that the Pro 560 isn't actually performing like that in the real world, with several long-time MBP game devs ditching their 2012~2014 MacBook Pros and buying Windows laptops instead, or buying lightly used top-end 2014 MacBook Pros. This is after many of them bought brand new 2016 or 2017 MacBook Pros. The late 2014 models are often just as fast and sometimes faster!

    So, unfortunately, that might be what you're experiencing too.

    It's also possible you're experiencing the bug which causes the integrated Intel GPU to be used instead of the Radeon Pro. But that should be far more obviously slower on the new MacBook Pro than the GT 750M.
     
    mahaloKahuna likes this.
  16. gecko

    Joined:
    Aug 10, 2006
    Posts:
    2,238
    Wow, that is really depressing. I'm still on 10.12 so I don't think I've got that bug with dedicated/integrated.

    Well, I've been casually following the progress of eGPU support, and it looks like that'll be official in 10.13.4, to be released anytime now, so I think that'll be my next step.

    Anyway, thanks everyone for sharing information on this larger topic -- very, very interesting -- and sorry @DeepShader for hijacking your thread. :/