
Any way to hack 16x or 32x MSAA?

Discussion in 'General Graphics' started by mabulous, Jul 6, 2020.

  1. mabulous

    mabulous

    Joined:
    Jan 4, 2013
    Posts:
    198
    I'm doing order-independent transparency using the MSAA buffer (there are hundreds of thousands of independent, textured triangles that must be rendered with correct transparency order without having to sort them every frame). It works very well, but the problem is that at 8x there's still too much aliasing on low-opacity objects (which makes sense, as the sample count per pixel is limited by the opacity).

    I don't quite get why Unity has limited that setting to 8x. I understand that for regular AA the returns are probably somewhat diminishing above 8x, but there are many more applications for multisample buffers than simple antialiasing, and I don't really see a technical reason to limit it.

    Is there ANY way (it can also be some undocumented hack that may break at any moment) I can force Unity to set MSAA to 16x or 32x?
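    For context, this is a minimal sketch of where the sample count gets set in Unity, both for an explicit render target (like the OIT buffer described above; the camera reference is just illustrative) and for the backbuffer. In both cases the documented API tops out at 8 samples:

    Code (CSharp):
    using UnityEngine;

    // Minimal sketch: requesting an MSAA colour target for a custom pass,
    // e.g. an OIT buffer like the one described above. Unity's documented
    // API only accepts 1, 2, 4 or 8 samples; anything higher is unavailable.
    public class OitMsaaTarget : MonoBehaviour
    {
        public Camera oitCamera;   // illustrative: the camera rendering the transparent geometry

        void Start()
        {
            var desc = new RenderTextureDescriptor(Screen.width, Screen.height,
                                                   RenderTextureFormat.ARGB32, 24);
            desc.msaaSamples = 8;  // 16 or 32 is not accepted here

            var rt = new RenderTexture(desc);
            rt.Create();
            oitCamera.targetTexture = rt;

            // The backbuffer setting is capped the same way (0, 2, 4 or 8).
            QualitySettings.antiAliasing = 8;
        }
    }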
     
  2. richardkettlewell

    richardkettlewell

    Unity Technologies

    Joined:
    Sep 9, 2015
    Posts:
    2,285
    I'm not totally sure on this, but from reading: https://docs.microsoft.com/en-us/wi...d11-d3d11_standard_multisample_quality_levels

    It seems like, for DX11 at least, the spec only guarantees up to 8x MSAA, and 16x is optional. So beyond 8, we are into the realm of "the graphics hardware might not support it". We would at least need to expose graphics-caps queries to let you check whether it's supported, and limited support may cause you problems depending on what hardware you are targeting your app to run on.
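    For what it's worth, a caps query along those lines can already be sketched in user code, assuming SystemInfo.GetRenderTextureSupportedMSAASampleCount is available in the Unity version being used (the class and method names below are just illustrative):

    Code (CSharp):
    using UnityEngine;

    // Sketch of a graphics-caps check before relying on a given MSAA level.
    // Assumes SystemInfo.GetRenderTextureSupportedMSAASampleCount exists in
    // the Unity version in use; the helper names are illustrative.
    public static class MsaaCaps
    {
        public static int BestSupportedSampleCount(int requested)
        {
            var desc = new RenderTextureDescriptor(Screen.width, Screen.height,
                                                   RenderTextureFormat.ARGB32, 24);
            desc.msaaSamples = requested;

            // Returns the sample count the platform can actually provide
            // for a render texture with this descriptor.
            return SystemInfo.GetRenderTextureSupportedMSAASampleCount(desc);
        }
    }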

    I also can't see any mention of 32x MSAA being supported on any hardware, from some quick Googling. (Though I may of course be wrong about this.)

    Do you know of hardware that supports 32 (and beyond)?

    (EDIT: you're also looking at consuming a lot of gfx memory at really high MSAA levels - maybe there is some way to use the MSAA k-buffer approach for the first 8, and then use DX12 Raster Order Views with an approximation of the pixels that are beyond the first 8? aka this kind of thing: https://software.intel.com/content/...cles/rasterizer-order-views-101-a-primer.html)
     
    Last edited: Jul 6, 2020
  3. mabulous

    mabulous

    Joined:
    Jan 4, 2013
    Posts:
    198
    That doesn't seem to be a problem, since OpenGL, for example, only guarantees 4 samples and Unity just clamps the value to the graphics card's maximum (if that is less than 8). This behaviour is also documented here: https://docs.unity3d.com/ScriptReference/QualitySettings-antiAliasing.html

    Most decent desktop graphics cards should support 32 samples; both my RTX 2070 and my GTX 1080 do.

    According to this database, https://opengl.gpuinfo.org/displaycapability.php?name=GL_MAX_SAMPLES,
    29% of all recorded GPUs support 32 samples or more (about 38% support 16 or more samples).

    Regarding memory: it requires quite a bit of memory, yes, but it's really not that bad. At 1080p resolution, that'd be "only" about 256 MB of memory (RGBA32 at 32 samples).
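    For reference, that estimate works out as 1920 × 1080 pixels × 32 samples × 4 bytes per sample ≈ 265 MB (about 253 MiB), counting the colour buffer alone.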

    I just don't see any reason why there is a hardcoded limit rather than just supporting whatever the hardware supports. It's not like Unity would have to special-case everything for each sample count.
     
    Last edited: Jul 7, 2020
  4. BattleAngelAlita

    BattleAngelAlita

    Joined:
    Nov 20, 2016
    Posts:
    400
    Because everybody will enable 32x MSAA just because they can, and then complain that "Unity is a slow engine".
     
  5. mabulous

    mabulous

    Joined:
    Jan 4, 2013
    Posts:
    198
    That's a terrible argument for not exposing this capability (again, a multisample buffer can be used for much more than just antialiasing in advanced rendering techniques). There are a million ways to make Unity slow if it's used carelessly; if you feel you have to cater to that, add a warning - but don't hardcode an upper limit on such an elementary hardware capability.
     
  6. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    I suspect there's something of a political / pragmatic reason for this. While the latest GeForce GPUs are capable of 16x (and really even 32x, though only with professional drivers and GPUs), it's no longer a setting that is exposed. Nvidia removed the options for anything above 8x MSAA from their Direct3D drivers and control panel about 6 years ago. Games that used to show 32x CSAA in their options on older GPUs no longer show that option, or will crash if you try to force it. So really, 16x MSAA support appears to be considered deprecated by Nvidia themselves, even though OpenGL applications can still force it on.
     
  7. Microblast-Games

    Microblast-Games

    Joined:
    Aug 20, 2013
    Posts:
    7
    Any news on this? Are they still on this S***ty 8x limit?
     
  8. GoGoGadget

    GoGoGadget

    Joined:
    Sep 23, 2013
    Posts:
    864
    Just about every game released since this post was originally made has shipped without 32x MSAA support, and they managed. There are better ways to deal with aliasing than exponentially increasing the MSAA sample count, whether through post-process AA like FXAA or DLAA, TAA, or supersampling.
     
  9. Microblast-Games

    Microblast-Games

    Joined:
    Aug 20, 2013
    Posts:
    7

    How do you supersample in Unity?
    All I have found is this, but it didn't work:

    https://forum.unity.com/threads/supersample-rendering.330368/
     
  10. arkano22

    arkano22

    Joined:
    Sep 20, 2012
    Posts:
    1,929
    1.- Render your camera to a RenderTexture that's larger (x2, x4...) than the screen.
    2.- Downscale to the size of the screen, using a filter of your choice (see the sketch below).
    3.- Profit.
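    A minimal sketch of that approach for the built-in render pipeline (the class name and scale field are illustrative, SRPs need a different hook, and a dedicated downsampling shader would give better quality than the plain bilinear blit used here):

    Code (CSharp):
    using UnityEngine;

    // Illustrative brute-force supersampling for the built-in render pipeline:
    // render the camera into an oversized temporary target, then downscale it
    // onto the backbuffer with a bilinear blit.
    [RequireComponent(typeof(Camera))]
    public class SuperSample : MonoBehaviour
    {
        [Range(1, 4)] public int scale = 2;   // 2 = 4x the pixels, 3 = 9x, etc.
        Camera cam;
        RenderTexture rt;

        void OnEnable()
        {
            cam = GetComponent<Camera>();
        }

        void OnPreCull()
        {
            // Redirect this camera into a larger target just before it renders.
            rt = RenderTexture.GetTemporary(Screen.width * scale, Screen.height * scale, 24);
            rt.filterMode = FilterMode.Bilinear;
            cam.targetTexture = rt;
        }

        void OnPostRender()
        {
            // Restore the target and downscale the oversized result to the screen.
            cam.targetTexture = null;
            Graphics.Blit(rt, (RenderTexture)null);   // null destination = backbuffer
            RenderTexture.ReleaseTemporary(rt);
        }
    }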
     
    GoGoGadget likes this.