
Question: use a Texture as screen output

Discussion in 'General Graphics' started by GamerLordMat, Mar 2, 2023.

  1. GamerLordMat

    GamerLordMat

    Joined:
    Oct 10, 2019
    Posts:
    185
    Why is this such a hard problem in Unity?
    I have a Texture of colors and want to set it as my screen output. Should be easy...
    In my mind this would be: mainColorBuffer = myTexture;
    After searching for 2 hours I haven't found a single solution for doing this in a non-hacky way.
    It's just really annoying, and this whole RenderTexture story is one of the most unusable systems in Unity for me.

    I mean, putting a quad in front of the camera and setting its material texture to my texture sounds like a really stupid way of doing such a thing.

    I would appreciate any help very much!
     
    Last edited: Mar 2, 2023
  2. Sluggy

    Sluggy

    Joined:
    Nov 27, 2012
    Posts:
    983
    This isn't really a Unity thing; it's just how computer graphics have worked in general since their inception. If you want to take the contents of an offscreen buffer and display them on the screen, you need to copy that data to a region of memory for the screen. In this case you'd want to look into something like Graphics.Blit.
     
    arkano22 and GamerLordMat like this.
  3. GamerLordMat

    GamerLordMat

    Joined:
    Oct 10, 2019
    Posts:
    185
    hello,
    thanks for answering. I know how Blit works, but where do I copy it to? Camera.main.renderTexture = myTexture did nothing when I tried it.
     
  4. Sluggy

    Sluggy

    Joined:
    Nov 27, 2012
    Posts:
    983
    All of the details you need are in that link I provided. Specifically, in this case you want to set the destination to null to indicate that it should blit directly to the screen. You also need to make sure that your Camera.main in the scene has its target texture set to null, otherwise it will just blit to that camera's render texture.

    As far as I can tell from that link, this method doesn't actually do a true memory copy; it just renders a fullscreen quad over the entire surface of the destination using a given material and the texture supplied by the source. The only real difference between this and doing all of that yourself is, well, you don't have to do all of that yourself. No GameObjects or components need to be set up.
     
    GamerLordMat likes this.
  5. GamerLordMat

    GamerLordMat

    Joined:
    Oct 10, 2019
    Posts:
    185
    Hello,

    again your answer is much appreciated.

    Yet it still doesn't work for me (I don't mean to bother you any further).
    My question is a simple "know-how" one. There isn't much to understand here; Unity must have a certain way of handling these things, and it's different from doing it from scratch in DirectX or doing it in Unreal Engine.
    Code (CSharp):
    using UnityEngine;

    public class Example : MonoBehaviour
    { // Copies aTexture to rTex and displays it in all cameras.

        Texture aTexture;
        RenderTexture rTex;

        void Start()
        {
            aTexture = new Texture2D(2, 2);
            rTex = new RenderTexture(2, 2, 24);
            if (!aTexture || !rTex)
            {
                Debug.LogError("A texture or a render texture are missing, assign them.");
            }
        }

        void Update()
        {
            Graphics.Blit(aTexture, rTex);
        }
    }
    This code doesn't do anything, but the docs state it should copy aTexture to rTex and display it in all cameras.
    Like I said, I haven't found a single working example on the internet for this simple operation.

    I will now make a custom shader and try to make it work with a quad...
     
  6. Sluggy

    Sluggy

    Joined:
    Nov 27, 2012
    Posts:
    983
    So real quick, that code you posted does not do what you describe it's doing. You are essentially just copying the contents of one texture to another, so unless you somehow display that second texture, of course you will still not see anything. In your case you want to pass null as your second parameter to render to the screen. However, there is a caveat...

    If you call this during Update, your screen will just be overwritten by the camera later during the rendering phase anyway. Instead, you want to call this blit method sometime during the built-in function OnRenderImage, which is called automatically on any script attached to a camera just after that camera has finished rendering. This is normally used to apply chained post-processing effects, but you could just as easily blit a texture directly to the screen too.

    So it does appear that if you are trying to handle this during the normal update phase, the easiest way to do what you are attempting is to render to the texture and then assign that texture to a material used by something else, such as a quad or a UI RawImage element. Otherwise, you can use something like this to draw the contents of a render texture directly to the screen (which, again, does the exact same thing with quads and materials and whatnot in the background anyway).

    Code (CSharp):
    using UnityEngine;

    /// <summary>
    /// This script must be attached to a camera. The camera will blit the contents of a RenderTexture
    /// directly to the screen just after the camera has finished rendering.
    /// </summary>
    public class CopyRenderTextureToScreen : MonoBehaviour
    {
        public RenderTexture RenderTarget;

        void OnRenderImage(RenderTexture src, RenderTexture dest)
        {
            //you could also pass 'dest' as the second parameter. in that case it would
            //pass the result down the chain to the next component on this camera that
            //also uses this method. if 'dest' is equal to null, that means we are at
            //the end of the chain and the final result is rendered to the screen
            //directly. in this case we are shortcutting by manually passing null.
            Graphics.Blit(RenderTarget, (RenderTexture)null);
        }
    }

    As a side note, I originally attempted to run this during OnPostRender but apparently it doesn't work there. I'm not sure if that's a bug or expected behaviour, to be honest. ALSO VERY IMPORTANT: this will only work with Unity's built-in render pipeline. If you are using URP or HDRP you instead need to register a handler with the render pipeline manager to perform the post-rendering logic.
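    For the URP/HDRP case, a minimal sketch of that registration (untested; class and field names are mine, and it assumes a Scriptable Render Pipeline is active) could look like this:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    // Sketch only: blits a RenderTexture to the screen after each camera
    // finishes rendering under a Scriptable Render Pipeline (URP/HDRP).
    public class SrpCopyToScreen : MonoBehaviour
    {
        public RenderTexture RenderTarget;

        void OnEnable()
        {
            RenderPipelineManager.endCameraRendering += OnEndCameraRendering;
        }

        void OnDisable()
        {
            RenderPipelineManager.endCameraRendering -= OnEndCameraRendering;
        }

        void OnEndCameraRendering(ScriptableRenderContext context, Camera camera)
        {
            // As in the built-in example, a null destination means "the screen".
            Graphics.Blit(RenderTarget, (RenderTexture)null);
        }
    }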
     
    PowerJake likes this.
  7. gamerseecubic

    gamerseecubic

    Joined:
    Mar 30, 2016
    Posts:
    3
    Hi @Sluggy, sorry to bother you under this post. I have a similar question but it is for multi-display. As you may have noticed, if we set null to dest of Blit, Unity will show the texture on the main camera display. As a result, in my two-camera and two-display application, I always get the rendering of the last camera on the primary display while the second display is black. Do you by any chance know how to tackle that problem? Any suggestions are appreciated.

    I know I could create a canvas and an extra camera for each of the original cameras so I don't need to blit. But that's my last choice if I couldn't do anything with Blit.
     
  8. arkano22

    arkano22

    Joined:
    Sep 20, 2012
    Posts:
    1,924
    Welcome to making games, and to graphics in general! ;)

    RenderTextures are a basic concept and they exist in all engines -under a variety of names: offscreen buffer, render target, framebuffer- not just a Unity thing. They conceptually work the exact same way everywhere, though abstractions vary between graphics APIs and game engines.
     
    GamerLordMat likes this.
  9. GamerLordMat

    GamerLordMat

    Joined:
    Oct 10, 2019
    Posts:
    185
    Hello,
    yes, sure, you are right, but my problem is that the documentation is not detailed enough for me to understand it well.
    Sure, if you have been programming DirectX for years (which, when I last looked about 5 years ago, had no resources to learn DX11 from scratch), you can maybe guess what Unity is doing under the hood. But look at @Sluggy 's answers: his responses are so much more detailed than the docs, and every word of his is important for understanding the concept.

    And even he did not get why OnPostRender doesn't work.

    It is stupid, and I could not find one piece of Unity code on the internet that works out of the box for this simple example... (I haven't tried Sluggy's stuff yet because the Unity forum is weird: it hasn't sent me follow-up notifications)
     
    PowerJake likes this.
  10. GamerLordMat

    GamerLordMat

    Joined:
    Oct 10, 2019
    Posts:
    185
    And btw, thank you OpenAI for making my IT life bearable with ChatGPT, because very simple and stupid know-how topics like this, which are neither well documented nor hard to understand, are finally done in a few minutes instead of hours of bothering people on forums.

    Maybe I'm just incapable of reading documentation, but I also see the same frustration in my college peers on similar topics... Unity sucks at documenting things, and yet they are one of the best at it... very sad how the IT industry saves money.

    Only a handful of nerds (not meant as an insult), who have much more insight either directly through Unity training or through years of experience with the same thing, understand how it works...
     
  11. Sluggy

    Sluggy

    Joined:
    Nov 27, 2012
    Posts:
    983
    @gamerseecubic Sorry, I don't have any experience with this, so I can't say. I don't even have two monitors at the moment to play around with it and test things out. Unfortunately, a Google search brings up only the documentation and your very own post asking this question.
     
  12. GamerLordMat

    GamerLordMat

    Joined:
    Oct 10, 2019
    Posts:
    185
    @gamerseecubic

    maybe it is a stupid and lazy answer, but I asked ChatGPT-4:


    From what I understand, you're having trouble rendering different cameras to different displays in Unity. You're currently using Graphics.Blit() to render to the main camera display, but you're not sure how to render to the other display.

    There are a few ways you could approach this problem, but one straightforward solution would be to use the Camera.targetDisplay property. This property allows you to specify which display a camera should render to.

    Here's a basic example:

    Code (CSharp):
    public Camera camera1;
    public Camera camera2;

    void Start()
    {
        // Set the target display for each camera
        camera1.targetDisplay = 0; // Main display
        camera2.targetDisplay = 1; // Secondary display
    }

    In this example, camera1 will render to the main display and camera2 will render to the secondary display. Note that the display indices are zero-based, so 0 refers to the first display, 1 refers to the second display, and so on.

    If you're working with more than two displays or if your setup is more complex, you may need to adjust this example to suit your needs.

    As for using Graphics.Blit(), this function is typically used for post-processing effects and other similar operations. When you use Graphics.Blit() with a null destination, it will render to the active RenderTexture, which is usually the screen that is currently being rendered to. This is why you're seeing the output on your main display. If you want to use Graphics.Blit() to render to a different display, you would need to set the active RenderTexture to a texture that is being displayed on that display. This could be a bit more complicated, depending on your setup.

    If you want to stick with Graphics.Blit(), you might want to consider using multiple RenderTextures and setting each one as the target for a different camera. You can then use Graphics.Blit() to copy the output of each camera to its corresponding RenderTexture. Here's a very basic example:

    Code (CSharp):
    public Camera camera1;
    public Camera camera2;
    public RenderTexture display1;
    public RenderTexture display2;

    void OnPostRender()
    {
        // Blit the output of camera1 to display1
        RenderTexture.active = display1;
        Graphics.Blit(camera1.targetTexture, null);

        // Blit the output of camera2 to display2
        RenderTexture.active = display2;
        Graphics.Blit(camera2.targetTexture, null);
    }

    In this example, display1 and display2 would be RenderTextures that are being displayed on your two displays. Note that this is a very simplified example and may not work perfectly in your specific setup. You might need to adjust this code to suit your needs, or look into other solutions if this doesn't work for you.
     
    Last edited: May 30, 2023
    crandellbr likes this.
  13. GamerLordMat

    GamerLordMat

    Joined:
    Oct 10, 2019
    Posts:
    185
    sorry for the late response but thanks for helping me!
     
  14. gamerseecubic

    gamerseecubic

    Joined:
    Mar 30, 2016
    Posts:
    3
    Thanks for your reply. I guess your proposed method is similar to my workaround; eventually, it needs something (e.g., a canvas) to show the RenderTexture display1/2. And the display class doesn't expose the underlying render texture.

    I didn't consider camera.colorbuffer etc because it only works for the built-in render pipeline. But I learned something new here that RenderTexture.active and Graphics.Blit can be used together.
     
    GamerLordMat likes this.
  15. Idual

    Idual

    Joined:
    Mar 11, 2015
    Posts:
    14
    Have you looked at GUI.DrawTexture()?
    It's very simplistic, but depending on your needs it might do the job.
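    A minimal sketch of that approach (the script and field names are mine; it assumes the texture is assigned in the Inspector):

    Code (CSharp):
    using UnityEngine;

    public class DrawTextureFullscreen : MonoBehaviour
    {
        public Texture myTexture;

        void OnGUI()
        {
            // IMGUI runs after rendering, so this draws the texture
            // stretched over the whole screen, on top of the camera output.
            GUI.DrawTexture(new Rect(0, 0, Screen.width, Screen.height), myTexture);
        }
    }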
     
  16. kdgalla

    kdgalla

    Joined:
    Mar 15, 2013
    Posts:
    4,635
    Just make a quad with an unlit material and your render texture as the material's texture, then point a camera at it. You can isolate these on their own layer for rendering efficiency.
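    A rough sketch of that setup in code (names are mine; it assumes the built-in Unlit/Texture shader is available, and the quad is scaled to exactly fill a perspective camera's view at the chosen distance):

    Code (CSharp):
    using UnityEngine;

    public class QuadScreen : MonoBehaviour
    {
        public Texture myTexture;
        public Camera cam;

        void Start()
        {
            // Create a quad and give it an unlit material showing the texture.
            var quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
            var mat = new Material(Shader.Find("Unlit/Texture"));
            mat.mainTexture = myTexture;
            quad.GetComponent<MeshRenderer>().material = mat;

            // Parent it to the camera, place it in front, and scale it so the
            // quad's height/width match the camera frustum at that distance.
            float dist = 1f;
            quad.transform.SetParent(cam.transform, false);
            quad.transform.localPosition = new Vector3(0f, 0f, dist);
            float h = 2f * dist * Mathf.Tan(cam.fieldOfView * 0.5f * Mathf.Deg2Rad);
            quad.transform.localScale = new Vector3(h * cam.aspect, h, 1f);
        }
    }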