AMD FSR and Unity

Discussion in 'General Discussion' started by Joe-Censored, Jun 1, 2021.

  1. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    18,921
    I just finished watching the Digital Foundry video covering AMD FSR and I find it thoroughly amusing that Epic's TAAU (Temporal Anti-Aliasing Upsampling) is higher quality while only very slightly more demanding. AMD FSR meanwhile looks about as good as I expected it to. That is to say I would never turn it on for standalone.

     
    Knil and Joe-Censored like this.
  2. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Dunno about DF in this case.. other places are saying different things.
     
    Joe-Censored likes this.
  3. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    9,309
    DF is consistently the only group actually putting the analytical work in, while many, many others are little more than thinly veiled tech hype machines. Linus Tech Tips, for example, will put effort into performance analysis but has always had terrible frame analysis, and they're hardly unique in the space.

    The reality is that this is only slightly more performant than the existing hardware-agnostic competition, at the cost of looking worse than all of the game scaling solutions out there. This really does look like it's barely a hair above DLSS 1.0.
     
    Joe-Censored, Ryiah and PutridEx like this.
  4. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Analytical work doesn't really work so well when the goal is to judge perceptually. Also it's a big win for lower resolutions too, not just 4K.

    The reality is this is all we're (potentially) getting in Unity that runs on AMD and Nvidia cards. I also don't see Unity rushing to add DLSS 2 to URP any time soon.

    "looking worse than all of the game scaling solutions out there" -- I keep reading this from people but people seem to forget the Unity context.

    Unity right now:
    • DLSS2 on HDRP only, on high-end GPUs only
    • HDRP upscaler with lanczos or cubic or just bilinear
    vs
    • Nothing for URP
     
  5. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    9,309
    We've both been here long enough to know that Unity isn't rushing to add anything if they can't add it without half the functionality missing.
     
  6. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    And here I was trying to be nice to Unity.
     
    NotaNaN likes this.
  7. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    18,921
    With the latest beta release I started playing around with DLSS. I'm very happy with the results so far and I'm hoping they do an equally impressive job implementing FSR.

    [Attachment: Full Render Resolution.png]

    [Attachment: 50 Percent Render Resolution.png]
     
    Last edited: Jun 29, 2021
    frbrz likes this.
  8. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    9,995
    Judging from Unity's history here's what's likely to happen.

    Unity will implement FSR, but they will implement only the Performance profile, because of *reasons*.

    People will complain that it looks like ass and comparisons with Unreal's custom solution will start. Certain forum regulars will start huffing and scoffing, blaming the community for being entitled and not trusting Unity's decisions, because big corporations always know better.

    Years will pass, it will become a big issue. Here on the forums we'll be so tired of it that every time someone mentions it, we'll go "ugh, this again, Performance mode is fine, you don't need FSR to make a good game anyway!" complete with examples of other games that look pretty good without upsampling at all.

    Soon, threads about it will start to get locked.

    But after some high profile dev on Twitter slams Unity about it and Unity's stock takes a small dive (although, I guess this doesn't matter that much any more, because the higher echelon of Unity people have already gotten rid of their stock), Unity will make a blog post saying how dedicated they are to quality and stability and how implementing an industry standard upscale algorithm is their highest priority.

    A tidbit about "researching upsampling" will be added to the roadmap. Reddit and the forums will be flooded with surveys asking questions like "If you could upsample your resolution by 2x, 3x, 5x but not by 4x, how would you feel?" and "How would you feel if you could not only see, but also smell the pixels".

    Finally, a few more years later, Unity will release their new upsampling algorithm. It will be pretty buggy and kinda slow, but it's a start. People will say it's okay, it's a first release, surely they will improve it soon. They won't.

    Eventually, graphics cards will be powerful enough to render at 16k at native res and upsampling will not matter and everyone will be happy(?).
     
    Last edited: Jul 6, 2021
    Timboc, Novack, frbrz and 11 others like this.
  9. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    This reads like a grim Ragnarok prediction told by vikings holding horns of mead over fires in long wooden halls at night while the wind howls outside like the beasts of myth.
     
    HyperionSniper, Novack, frbrz and 7 others like this.
  10. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    9,309
    Except in the case of Unity, Jormungandr stops eating its own tail and vomits poison onto the land and sea every few months, which is why the vikings talk like that.
     
  11. Rastapastor

    Rastapastor

    Joined:
    Jan 12, 2013
    Posts:
    387
    AMD partners with Epic Games in Unreal Engine 5 Ea... - AMD Community

    I assume FSR also comes to UE :), together with DLSS (currently available on UE4 and UE5, you need to download the plugin though) and UE5's Temporal Super Resolution solution. Btw, UE has only one render pipeline.

    As I said in another thread, DLSS and FSR are too good a set of features to just pass on for URP. Not everyone has to use HDRP to create a PC/console game; people can also use URP, so why cut them off from the tech?

    Secondly, yes, FSR is rough around the edges atm, but do you remember DLSS 1.0? So if people are going to use FSR (and they will, since they don't need an RTX card to use it), AMD will definitely invest resources to improve it.
     
    laurentlavigne, BonneCW and NotaNaN like this.
  12. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    18,921
    Have you used an AMD GPU? It wasn't that long ago that AMD would release a driver and everyone would meme about it, because their drivers were legitimately that bad, with all sorts of problems being introduced, assuming the update worked at all. AMD will invest resources, but I wouldn't hold my breath for significant improvements.

    [Attachment: Drivers That Work.jpg]
     
  13. Rastapastor

    Rastapastor

    Joined:
    Jan 12, 2013
    Posts:
    387
    I want to be optimistic about it :). True, AMD has to up their game with GPUs, but seeing how Ryzen turned out, there is hope they'll step up eventually.

    We all should hope AMD gets better with GPUs; that would create healthy competition with Nvidia, which should benefit us users.
     
    Joe-Censored likes this.
  14. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,824
    TL;DR: My speculation about AMD next year

    The next series of GPUs from AMD is likely going to be on TSMC 5nm, as are the Ryzen 6000 series CPUs. You're going to see significant efficiency boosts just from the node shrink, which means they can do more work with the same power budget as today. NVidia is probably going with whatever Samsung's best is again for the next generation, which is either 8nm or 7nm. NVidia has somewhat superior engineering, so it will be more competitive than it might otherwise look, but I'm expecting this next generation to be the first time AMD's flagship greatly outclasses NVidia's. It will also be the first time AMD charges more for theirs than NVidia does.

    Since AMD is again going to share the same TSMC supply for both of their major product lines, you'll also probably see a repeat of this generation where AMD simply can't supply enough product to leverage their advantages to really capture significant market share from either Intel or NVidia.
     
  15. cxode

    cxode

    Joined:
    Jun 7, 2017
    Posts:
    265
    I've used AMD GPUs since 2014 and this has not been my experience at all. I have never had any kind of driver issue whatsoever. You're exaggerating.
     
    Andresmonte likes this.
  16. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    9,309
    WORKSFORME WONTFIX
     
    Zarconis and Joe-Censored like this.
  17. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,824
    OMG, you were on the dev team at my last job weren't you? :p
     
  18. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    18,921
    My post is based off of the statements of hardware reviewers (who have dealt with a wide range of hardware as well as driver releases) and the users of said hardware in communities like /r/AMD. If you have evidence proving your statement I'm willing to listen to it but I'm going to disregard one person's anecdotes.

    https://en.wikipedia.org/wiki/Anecdotal_evidence
     
    Last edited: Jul 1, 2021
    Joe-Censored likes this.
  19. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,824
    Their video driver issues are well known, and have a history which predates even the AMD acquisition of ATI. My personal experience goes all the way back to the ATI Rage series, which had unstable garbage drivers. The first Radeon R100 series, again unstable garbage drivers, etc, etc. It seemed like AMD decided not to break such a well established tradition until relatively recently.
     
    Zarconis likes this.
  20. adventurefan

    adventurefan

    Joined:
    Jan 17, 2014
    Posts:
    228
    Well, look at AMD and Nvidia's patch notes. They are BOTH filled with issues all the time. :) Such is life with technology.
     
  21. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    9,309
    AMD/ATI's driver situation was/is so bad that it has been used as a point in nvidia's favour when making purchase comparisons.
     
    Joe-Censored likes this.
  22. adventurefan

    adventurefan

    Joined:
    Jan 17, 2014
    Posts:
    228
    People say that and then you also see people complaining about their $1000 Nvidia card being bugged in the latest game. Bugs happen.
     
  23. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    9,309
    "Bugs happen" at different scales at different manufacturers. This has been a specifically known issue of AMD/ATI for longer than you know.
     
  24. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    18,921
    We're not saying complaints don't exist on the NVIDIA side. We're saying the ones on the AMD/ATI side vastly outnumbered the ones on the NVIDIA side. I think it's often lost on people that NVIDIA's strength is with their software not their hardware. If you stripped out all the optimizations they've done their cards would be slower.
     
    Joe-Censored likes this.
  25. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,289
    Yeah, I swapped from ATI to nVidia cards ages ago, both individually and across the studio I was running, because we kept having issues with the ATI stuff and research showed that it wasn't just us. For general gaming there wasn't really an issue, but we'd regularly get crashes in content creation software. And when you're paying people to make stuff, that's not just an irritation, it's lost productivity.

    Indeed, nVidia isn't perfect either, but there's a big gap between "not perfect" and "a noticeable impact on productivity". I strongly suspect that things have improved since, as negative reputations are hard to shake. I've had no reason or opportunity to find out first-hand, though.
     
    Joe-Censored, NotaNaN and hippocoder like this.
  26. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    I always buy nvidia now because of DLSS2+ and more apps are hardware accelerated for nvidia cards. It's simple: stuff is faster, more stable and more expensive but I'll live with that.

    AMD is hardly hurting: they are in a couple of entire console generations and doing fine.
     
    Joe-Censored and PutridEx like this.
  27. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    18,921
    Some part of which is likely due to the console manufacturers having feedback into the design of the chips.
     
    Joe-Censored likes this.
  28. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,824
    For a console product, I'm sure it is very attractive to be able to get both the CPU and GPU as a single deliverable from one company. AMD is probably the only company that could do that at this time, outside of a handheld product.
     
  29. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    NVIDIA has big plans for ARM, and it'll be interesting to see what comes next.
     
    Joe-Censored likes this.
  30. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    9,309
    Genuinely excited to see if they've got any plans for high-performance consumer/prosumer desktop chips with proper expansion support. RISC devices (as opposed to the RISC-like design we kinda have with x64 if you squint) have shown a lot of potential in the consumer space lately, well beyond the theoretical ideas people had 20 or so years ago.
     
    Joe-Censored likes this.
  31. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,824
    Yep, but they were too slow for the recent console refresh. If they stay on track, maybe AMD will have stiff competition for the next console cycle though. More competition = more better
     
  32. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,373
    If only there were a successful console using both ARM and Nvidia...

    Also, DLSS 2.0 is basically a TSR: the neural network doesn't do the upsampling directly, but manages the policy of pixel rejection.

    That explains why Unreal 5's solution is getting close; they just use human expertise rather than neural networks.

    For those who don't know, TSR jitters the camera position by sub-pixel offsets over time and uses the data from the previous frame, which is a sub-pixel apart, to blend with the current frame and thus recover fine detail and edge data. A pixel rejection policy on the G-buffer tries to discard obsolete pixels, which is why bad implementations had trailing artifacts, something DLSS 2.0 occasionally shows too. In some ways it can be seen as the child of the accumulation buffer and checkerboard rendering. The key is to develop the rejection policy by identifying the correct pixels to blend; the motion buffer helps, but so does any data that identifies relevant information.

    Why is that relevant? FSR doesn't do that: it takes the final frame as input and tries to guess detail from basically nothing, which in the case of arbitrary shapes can't work, because they're... arbitrary! If we were able to inject more data into FSR, like sub-pixel information, it would probably do a much better job.

    Using a vanilla upscaler I have no idea how to do that; it would mean redesigning the process, but maybe some ingenious dev can hack current implementations with some TSR ideas to improve them?
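    To make the blend-and-reject idea concrete, here is a rough, hypothetical CPU-side sketch of the accumulate step. Real implementations run this in a shader and also reproject the history with motion vectors before blending, which is omitted here, and all names and parameters are made up for illustration:

    Code (CSharp):
    using UnityEngine;

    // Illustrative sketch only: one temporal accumulation step with a simple
    // neighbourhood-clamp rejection policy, operating on plain Color arrays.
    public static class TemporalBlendSketch
    {
        // Clamp the history sample to the min/max of the current frame's 3x3
        // neighbourhood. History outside that range is assumed stale
        // (disocclusion, moving object) and gets pulled back, which is what
        // suppresses the trailing artifacts mentioned above.
        static Color ClampToNeighbourhood(Color history, Color[] current, int x, int y, int w, int h)
        {
            Color min = current[y * w + x];
            Color max = min;
            for (int dy = -1; dy <= 1; dy++)
            for (int dx = -1; dx <= 1; dx++)
            {
                int nx = Mathf.Clamp(x + dx, 0, w - 1);
                int ny = Mathf.Clamp(y + dy, 0, h - 1);
                Color c = current[ny * w + nx];
                min = new Color(Mathf.Min(min.r, c.r), Mathf.Min(min.g, c.g), Mathf.Min(min.b, c.b));
                max = new Color(Mathf.Max(max.r, c.r), Mathf.Max(max.g, c.g), Mathf.Max(max.b, c.b));
            }
            return new Color(
                Mathf.Clamp(history.r, min.r, max.r),
                Mathf.Clamp(history.g, min.g, max.g),
                Mathf.Clamp(history.b, min.b, max.b));
        }

        // Blend the clamped history with the current (jittered) frame.
        // A high feedback value keeps most of the accumulated detail.
        public static void Accumulate(Color[] history, Color[] current, int w, int h, float feedback = 0.9f)
        {
            for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
            {
                int i = y * w + x;
                Color clamped = ClampToNeighbourhood(history[i], current, x, y, w, h);
                history[i] = Color.Lerp(current[i], clamped, feedback);
            }
        }
    }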
     
  33. SJAM

    SJAM

    Joined:
    Jan 11, 2009
    Posts:
    729
    Joe-Censored likes this.
  34. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Is it available for URP/VR?
     
    Totalschaden and Joe-Censored like this.
  35. PutridEx

    PutridEx

    Joined:
    Feb 3, 2021
    Posts:
    1,100
    Last edited: Jul 15, 2021
  36. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
     
    adventurefan and Joe-Censored like this.
  37. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Unity right now:
    • DLSS2 on HDRP only, on high-end GPUs only
    • HDRP upscaler with lanczos or cubic or just bilinear
    • AMD FSR for HDRP
    vs
    • Nothing for URP
     
  38. Rastapastor

    Rastapastor

    Joined:
    Jan 12, 2013
    Posts:
    387
    This is actually stupid. I hope Unity realizes URP can also be used to make frickin' PC and console games too -.-.

    But that is what you get with two completely separate teams working on the rendering system -.-
     
    hippocoder likes this.
  39. Onigiri

    Onigiri

    Joined:
    Aug 10, 2014
    Posts:
    388
    Add to this the TAA Upscaler coming soon :)
     
  40. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Indeed, and that is why we smile with mild Stockholm-syndrome-like insanity :)
     
  41. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    12,915
    Unless I'm missing something, the pipelines are supposed to be hackable.

    So it should be theoretically possible to fork URP and wire in the upscaler.
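    As a rough illustration of where such a hook could live (without even forking, just a custom renderer feature), here is a minimal, hypothetical sketch against the 2021-era URP API. The material is a placeholder where AMD's FidelityFX EASU/RCAS shaders would slot in; a real FSR integration would also render at a reduced render scale and write to the full-resolution target, which this skips:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    // Hypothetical sketch: a URP renderer feature that runs an "upscale" blit
    // after post-processing. Swap the placeholder material for real FSR passes.
    public class UpscalePassFeature : ScriptableRendererFeature
    {
        class UpscalePass : ScriptableRenderPass
        {
            readonly Material material;
            RenderTargetIdentifier source;
            RenderTargetHandle temp;

            public UpscalePass(Material material)
            {
                this.material = material;
                renderPassEvent = RenderPassEvent.AfterRenderingPostProcessing;
                temp.Init("_UpscaleTemp");
            }

            public void Setup(RenderTargetIdentifier cameraColor) => source = cameraColor;

            public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
            {
                if (material == null) return;
                CommandBuffer cmd = CommandBufferPool.Get("UpscalePass");
                RenderTextureDescriptor desc = renderingData.cameraData.cameraTargetDescriptor;
                cmd.GetTemporaryRT(temp.id, desc);
                // Blit through the upscaling material, then copy back to the camera target.
                cmd.Blit(source, temp.Identifier(), material);
                cmd.Blit(temp.Identifier(), source);
                cmd.ReleaseTemporaryRT(temp.id);
                context.ExecuteCommandBuffer(cmd);
                CommandBufferPool.Release(cmd);
            }
        }

        public Material upscaleMaterial;   // placeholder for the FSR shader(s)
        UpscalePass pass;

        public override void Create() => pass = new UpscalePass(upscaleMaterial);

        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
        {
            pass.Setup(renderer.cameraColorTarget);
            renderer.EnqueuePass(pass);
        }
    }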
     
  42. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    It's theoretically possible for me to just switch to O3DE, beat URP's performance in under 6 months, and have a reliable engine on top. But I use Unity because I only have time for gameplay programming or control programming rather than engine programming.
     
  43. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    12,915
    It shouldn't take 6 months to hack in an upscaler assuming Nvidia/ATI provide an SDK. And it is not exactly an engine-level task either.

    And possible losses from an engine switch would be higher.

    So I'm genuinely wondering why nobody has attempted that yet.
     
  44. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    6 months does include me becoming familiar with the engine enough to replace URP as a whole. I expect FSR would take a few hours for a first draft test.

    The reason I mention going to another engine rather than implementing it myself in Unity with a custom SRP is that it will break every update, and I'll have to maintain it all the time. That is why we pay Unity to do it instead.

    Creating a custom SRP is an invasive and time-consuming project which, for me, given Unity's four years on it so far, makes porting to another engine the more worthwhile option. I would be much more trusting if there were a track record to support it.
     
    NotaNaN likes this.
  45. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    12,915
    But I'm not talking about a custom SRP.

    Isn't the entire source of HDRP on GitHub? In that case, you can fork it. It is under the Unity Companion License, so that is allowed.

    Then you can make the minimal changes necessary to accommodate your feature and either ride the upstream while maintaining your fix, or attempt to submit the fix upstream.

    That's not even remotely the same thing as maintaining your own SRP made from scratch.
     
  46. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    12,915
    Basically, what I'm describing can be done through version control, like git.

    You fork the stable branch of HDRP (if there is one), base your feature branch off that, and implement the feature.
    When the upstream updates, you update your "cloned upstream" branch to match upstream (git fetch/git pull), and then REBASE whatever you did in the feature branch on top of that. That will cause the version control software to apply your fixes on top of the new state of the source tree.

    Depending on what you did, this can be completely painless, as in you'll never have to fix anything, or painful, meaning there will be a ton of conflicts. A painless change would be adding new files, or a line or two to existing files. A painful one would be making changes in some sort of binary blob.

    In general, this kind of thing should be feasible, although you won't be using the package manager for your pipeline.

    I mean, the point of a scriptable pipeline was to make it hackable. Why not hack it?
     
    NotaNaN and hippocoder like this.
  47. Rastapastor

    Rastapastor

    Joined:
    Jan 12, 2013
    Posts:
    387
    Maybe because it's a pain in the ass and produces more problems than benefits?... I guess.
     
    hippocoder likes this.
  48. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Because:

    a) I didn't ask for SRP or hackability or anything.
    b) I now have to pay dev time costs for it.
    c) I shouldn't have to beg for URP features when HDRP has 4 different upscalers (including this one) without any need for what you're saying.

    If it's replacing the built-in pipeline, why isn't it the same amount of work? So far I end up doing more work throughout Unity whenever I go near SRP, even peripherally, if I care about performance.
     
    NotaNaN likes this.
  49. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    18,921
    Last edited: Jul 18, 2021
  50. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    12,915
    A pain in the ass with problems would be rolling out your own SRP from scratch.
    That's not what I'm describing at all. What I'm describing is making a SMALL adjustment, which is not the same thing at all.

    Well, the point of having the pipeline SCRIPTED would be making it easier to extend, because there are things you cannot shoehorn into the built-in pipeline. Two basic examples would be:
    1. Order-independent transparency (because directional lights use the depth buffer)
    2. Unusual shadow techniques like shadow volumes.
    So the idea would be to make complex tasks take less work.

    Also, people on these forums are often saying that it would be nice if Unity provided source code for the engine.

    Well, here's a subsystem with source code. Why isn't anyone hacking it?