Since having MSAA roughly doubles the RAM usage of render textures, I thought I'd try using memoryless mode. The problem is, I probably don't know how to use it...

Code (CSharp):
using UnityEngine;

public class memoryless : MonoBehaviour
{
    RenderTextureDescriptor mainrtdesc;
    RenderTexture rt;

    void Awake()
    {
        mainrtdesc = new RenderTextureDescriptor(Screen.width, Screen.height, RenderTextureFormat.Default, 24);
        mainrtdesc.memoryless = RenderTextureMemoryless.MSAA;
        mainrtdesc.useMipMap = false;
        mainrtdesc.msaaSamples = 2;
    }

    void OnPreRender()
    {
        rt = RenderTexture.GetTemporary(mainrtdesc);
        GetComponent<Camera>().targetTexture = rt;
    }

    void OnPostRender()
    {
        GetComponent<Camera>().targetTexture = null;
        Graphics.Blit(rt, null as RenderTexture);
        RenderTexture.ReleaseTemporary(rt);
    }
}

But it doesn't seem to work. I was assuming that by setting MSAA as memoryless, the render texture would only keep the resolved texture and save some memory. I build on a Shield Tablet K1 with Vulkan, and memory usage is exactly the same as without the memoryless flag (I checked with Unity's MemoryProfiler from Bitbucket). So I probably have some fundamental misunderstanding of how this works. Are there any examples of this working? I looked, but there isn't a lot of info. (Also, is there no way to set both MSAA and Depth as memoryless?)
Hey, RenderTextureMemoryless.MSAA is not supported on Vulkan (https://docs.unity3d.com/ScriptReference/RenderTextureMemoryless.MSAA.html). I don't remember the specifics, but it's basically a Vulkan limitation. You are correct though, this is how it works on Metal.
Ah, I did notice that Vulkan was missing from the docs page for MSAA, but I assumed it was a docs error, since all the other modes do mention Vulkan. The rest (like .Depth) should work on Vulkan, right? Thank you for the reply.
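For reference, a minimal sketch of only requesting the flag where it should be supported. Gating on Metal is my assumption based on the exchange above (and the docs page), not something Unity spells out, and the class name is made up:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public static class MemorylessMsaaDescriptor
{
    // Build a 4x MSAA descriptor, marking the MSAA buffer memoryless
    // only on Metal, since Vulkan doesn't support that mode.
    public static RenderTextureDescriptor Create(int width, int height)
    {
        var desc = new RenderTextureDescriptor(width, height, RenderTextureFormat.Default, 24);
        desc.msaaSamples = 4;
        if (SystemInfo.graphicsDeviceType == GraphicsDeviceType.Metal)
            desc.memoryless = RenderTextureMemoryless.MSAA;
        return desc;
    }
}
```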
Hey LukasCh, I'm trying to use memoryless RenderTextures for my project but it's not working as I expect. I'm trying to have cameras render to a render texture and then set the render texture as the texture on a material to be rendered by another camera. The render textures appear to not be getting populated when I set them to memoryless. Here's an example of how I'm trying to use them. Can I not use memoryless this way?

Code (CSharp):
using UnityEngine;

public class MemorylessTest : MonoBehaviour
{
    public Camera ViewCamera;
    public Renderer ViewMonitor;

    private RenderTexture renderTexture;

    void Awake()
    {
        renderTexture = new RenderTexture(Screen.width, Screen.height, 24, RenderTextureFormat.Default);
        renderTexture.useMipMap = false;
        renderTexture.antiAliasing = 1;

        // Color and Depth, and Color only, don't work
        renderTexture.memorylessMode = RenderTextureMemoryless.Color | RenderTextureMemoryless.Depth;
        //renderTexture.memorylessMode = RenderTextureMemoryless.Color;

        // Depth only works
        //renderTexture.memorylessMode = RenderTextureMemoryless.Depth;

        renderTexture.Create();

        ViewCamera.targetTexture = renderTexture;
        ViewMonitor.material.mainTexture = renderTexture;
    }

    void OnDestroy()
    {
        renderTexture.Release();
        renderTexture = null;
    }
}
@bryanDDI what are you trying to achieve by setting .Color to memoryless? Setting it to memoryless means you don't really have direct access to it.
The main goal is to save general memory for the rest of the game. It seems there could also be some rendering performance gains from not needing to transfer the render textures from general memory to the GPU each frame.
@bryanDDI First of all, thanks for testing it. What you are trying to achieve in that test is not valid, and I will try to explain. Basically, what your script is doing (without the memoryless flag) is:
- ViewCamera renders scene objects into renderTexture
- The camera does a storeAction to save renderTexture from tile memory to system memory
- Then the system memory copy of renderTexture is used as a shader resource for ViewMonitor
- Lastly, some other camera renders ViewMonitor (which is probably where you check whether it works or not)

Once you enable the memoryless flag, the storeAction cannot be done, as renderTexture doesn't have system memory for it. That means renderTexture will stay the color it was created with (most likely gray - internally we even ignore all memoryless render texture settings on materials, as it is not valid). Yes, this will increase your general memory, however for performance you don't really need memoryless, as that is controlled by load/store actions. (I'm just not sure whether we expose this on the camera; see https://docs.unity3d.com/ScriptReference/RenderTargetSetup.html)
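To illustrate the load/store action route LukasCh mentions: in the built-in pipeline they can be set explicitly via CommandBuffer.SetRenderTarget. This is a sketch of my own (the component name and attach point are assumptions, not from the thread) of rendering into a target whose depth never leaves tile memory while the color is stored for later sampling:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

[RequireComponent(typeof(Camera))]
public class ExplicitLoadStore : MonoBehaviour
{
    RenderTexture rt;

    void Start()
    {
        rt = new RenderTexture(Screen.width, Screen.height, 24);
        rt.Create();

        var cmd = new CommandBuffer { name = "ExplicitLoadStore" };
        cmd.SetRenderTarget(rt,
            // DontCare on load: don't restore old contents into tile memory.
            RenderBufferLoadAction.DontCare,
            // Store color: it is sampled later as a shader resource.
            RenderBufferStoreAction.Store,
            // Depth is cleared below and never read back...
            RenderBufferLoadAction.DontCare,
            // ...so it never needs to be written to system memory.
            RenderBufferStoreAction.DontCare);
        cmd.ClearRenderTarget(true, true, Color.clear, 1f);
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.BeforeForwardOpaque, cmd);
    }

    void OnDestroy()
    {
        if (rt != null) rt.Release();
    }
}
```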
Was there a regression for this recently? (Or maybe it's an iOS 12 bug?) I clearly remember confirming it was working properly a couple of months ago (so before iOS 12 and with an older version of Unity), but I just started memory profiling our game again and it seems something's off. I am doing tests with code pretty similar to the one in my first post, and enabling and disabling AA results in very different RenderTexture sizes (the one with 4x MSAA takes approximately 4 times the memory). I submitted a bug report; it's Case 1095432.
Hi, I ran into the same problem. There is big memory usage when profiling with Xcode and the Unity Profiler with MSAA on. Do you know how to reduce the memory taken by MSAA now? My Unity version is 2018.4 and my device is an iPhone 11 (iOS 13).
If I remember correctly, this specific bug was simply that the memory profiler was not accounting for memoryless and was showing 4x the memory, even though memoryless was working. Not sure if what you're seeing is the same thing.
Thank you. How do you know that memoryless was working? I don't know how to check whether it really works.
I think Xcode was showing the correct RAM usage for me but, to be completely truthful, I don't quite remember.
Hi, I used the code provided by AcidArrow at the top of this thread. The memory for the renderTexture is supposed to be 1x, but it is still 4x in Xcode. It seems like memoryless.MSAA does not work. Do you know how to use memoryless.MSAA correctly? (My Unity version is 2018.4.22)
Hi, I would be very interested in the memoryless feature of render targets, as it seems to improve performance. The issue is that I am experiencing this kind of tiled rendering situation on my iPad Pro and also on an iPhone 12 Mini. This is my code for generating the RT:

Code (CSharp):
RenderTextureDescriptor desc = new RenderTextureDescriptor(finalRenderTargetSizeX, finalRenderTargetSizeY, RenderTextureFormat.Default, 32, 0);
desc.memoryless = RenderTextureMemoryless.Depth | RenderTextureMemoryless.MSAA;
desc.msaaSamples = 2;
desc.useDynamicScale = true;
desc.autoGenerateMips = false;
desc.useMipMap = false;
desc.dimension = TextureDimension.Tex2D;
desc.stencilFormat = GraphicsFormat.None;
_renderTexture = RenderTexture.GetTemporary(desc);

It also happens when using only the MSAA mask as the memoryless RT option. I am just setting this render texture on the camera, and then I pass it to a UI shader as a texture resource in order to render it. I am using Unity 2020.1.4f1 with URP 8.2, but I also tried Unity 2020.3.5f1 with URP 10.4 and the same happens. The URP quality option for AA (MSAA) is set to 2x. Maybe @LukasCh or someone at Unity can help? Thanks!