Get UV warped texture as another texture

Discussion in 'Shaders' started by ivanshv, Aug 14, 2018.

  1. ivanshv

    Joined: Apr 13, 2018
    Posts: 11
    Hi guys,

    I'm quite new to shaders and Unity overall, and desperately need your help.

    So I've got a render texture applied to an object (cylinder, pyramid, etc.), and this render texture is warped according to the UV coordinates of the object defined in Blender. The goal is to get this warped texture projected onto another object (a plane), and then captured by the main camera as the final video output. Of course, a render texture doesn't work here, since I cannot capture the whole object with the camera (or can I?).
    Basically, all my findings so far take me to shaders which use multiple textures and multiple passes, without any possibility of using the warped texture (material) from another object. And I'm a bit stuck.
    Also, it is not necessary to use a model of arbitrary geometry; maybe it is possible to apply some sort of map (a UV map .png exported from Blender) to a plane so that the render texture is projected only onto specific areas of the plane and hence warped accordingly?

    Hope you get my question. If not, please don't hesitate to ask for details.
    Thank you in advance.
     
  2. bgolus

    Joined: Dec 7, 2012
    Posts: 12,245
    A UV-image-based approach would likely be the easiest solution. You'd need to export the UVs as a 16-bit PNG or TIFF and import it with compression and sRGB turned off. Then you would use a Blit() to render the first texture into another render texture using the UV texture.
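
    Something like this on the script side, attached to the render texture camera (a minimal sketch; the class and field names are just placeholders):

    using UnityEngine;

    public class UVWarpBlit : MonoBehaviour
    {
        public Material warpMaterial;  // material using a shader that does the UV lookup
        public RenderTexture output;   // render texture receiving the warped image

        void OnRenderImage(RenderTexture src, RenderTexture dest)
        {
            // Render this camera's image into 'output' through the warp material.
            Graphics.Blit(src, output, warpMaterial);
            // Pass the unmodified image along so the camera chain stays valid.
            Graphics.Blit(src, dest);
        }
    }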
     
  3. ivanshv

    Joined: Apr 13, 2018
    Posts: 11
    Thank you very much for your reply, bgolus!

    I've also been considering Blit and GrabPass, so it's good to know I was looking in the right direction.
    I've seen some examples of how to use Blit, but don't really understand how to execute it in my case. So first I should write a script which executes Blit in OnRenderImage and uses some new material (???), to which the shader is attached, to actually Blit src to dest, and attach this script to the main camera.
    Then I don't really understand what to write in the shader. I mean, how to use the UV image to warp the source texture in the shader. I would really appreciate it if you could find time to guide me through this a bit.

    In addition, as I understood it, you find the approach of getting the texture already applied to geometry very complicated but still possible, am I right? The thing is that this texture is actually a cubemap (well, not really yet; I'm still struggling to get it right). That is why I assume a UV image may not be sufficient to reach my goal.
    I just have too little experience with shaders, and in searching for solutions I have consumed a lot of information which hasn't yet formed into one picture in my head.

    Thank you very much.
     
  4. bgolus

    Joined: Dec 7, 2012
    Posts: 12,245
    First, where is the initial render texture coming from?
    Second, when you say "projected onto another object", what do you mean? Is the cylinder's shape itself important here, or is it just what you had the UVs on?

    For the first question I'm going to assume you have a camera in the scene that has a render texture assigned to its target. I'm going to mostly ignore the second question and just assume you only care about the UVs as a way of distorting the first render texture within the 0.0 to 1.0 range of the UV space.

    With the above assumptions you could indeed use a script that runs OnRenderImage on the render texture camera and apply the UV distortion as an image effect. The short version of how to do this would be to write a shader that has two texture inputs: one called _MainTex, which will be the render texture, and one called something like _UVTex. The _UVTex is an uncompressed 16-bit texture that you rendered out of Blender in some way. I don't know Blender that well, so I can't give you any real advice on this, but it'd have to show the x and y UV values encoded as red and green colors, and it needs to be a linear gradient with no gamma correction, rendered to a linear 16-bit texture. The fragment shader part of the custom shader would have something like this:

    // Sample the baked UVs, then use them to look up the source render texture.
    float2 texUV = tex2D(_UVTex, i.uv).xy;
    float4 col = tex2D(_MainTex, texUV);


    Otherwise it'd look like your basic image effect shader, which isn't much different from a basic unlit shader.
    https://docs.unity3d.com/Manual/PostProcessingWritingEffects.html
    https://www.alanzucconi.com/2015/07/08/screen-shaders-and-postprocessing-effects-in-unity3d/
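
    Pieced together, a minimal sketch of the full image effect shader might look something like this (the shader name and property labels are just placeholders):

    Shader "Hidden/UVWarp"
    {
        Properties
        {
            _MainTex ("Source", 2D) = "white" {}
            _UVTex ("UV Map", 2D) = "black" {}
        }
        SubShader
        {
            Cull Off ZWrite Off ZTest Always
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;
                sampler2D _UVTex;

                struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
                struct v2f { float2 uv : TEXCOORD0; float4 vertex : SV_POSITION; };

                v2f vert (appdata v)
                {
                    v2f o;
                    o.vertex = UnityObjectToClipPos(v.vertex);
                    o.uv = v.uv;
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    // Read the baked UVs, then sample the source render texture there.
                    float2 texUV = tex2D(_UVTex, i.uv).xy;
                    return tex2D(_MainTex, texUV);
                }
                ENDCG
            }
        }
    }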

    You then use the output render texture to map onto any material you want.

    However, there might be a slightly easier way. You could use a mesh's UVs directly rather than trying to render them out to a texture to sample from. To do that you'll want to use command buffers to hook the end of the first camera's rendering, i.e. CameraEvent.AfterEverything. Then you'll set the current target to be a new render texture, and use DrawMesh to render your UV'd mesh to the screen using a shader which uses the UVs to define the output position rather than the vertex positions. Create a default unlit shader, then change the vertex shader to do:

    // Place each vertex at its UV coordinate, remapped from [0,1] to [-1,1] clip space.
    o.uv = v.uv;
    o.vertex = float4(v.uv * 2.0 - 1.0, 0.5, 1.0);


    If you render a mesh with a material using that shader, it'll draw full screen as its unwrapped UVs. The render texture that's output by the command buffer will then be the texture you need.
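
    The C# side of that might look roughly like this (a sketch; the field names are placeholders, and you'd assign the mesh, material, and render texture yourself):

    using UnityEngine;
    using UnityEngine.Rendering;

    public class UnwrapCapture : MonoBehaviour
    {
        public Camera sourceCamera;        // the camera rendering the first texture
        public Mesh uvMesh;                // the mesh whose UVs define the unwrap
        public Material unwrapMaterial;    // material using the UV-as-position shader
        public RenderTexture unwrapTarget; // receives the unwrapped result

        void OnEnable()
        {
            var cb = new CommandBuffer { name = "Unwrap UVs" };
            cb.SetRenderTarget(unwrapTarget);
            cb.ClearRenderTarget(true, true, Color.black);
            cb.DrawMesh(uvMesh, Matrix4x4.identity, unwrapMaterial);
            // Run after the first camera has finished rendering everything.
            sourceCamera.AddCommandBuffer(CameraEvent.AfterEverything, cb);
        }

        void OnDisable()
        {
            sourceCamera.RemoveCommandBuffers(CameraEvent.AfterEverything);
        }
    }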
     
  5. ivanshv

    Joined: Apr 13, 2018
    Posts: 11
    Thank you so much, sir! That's exactly what I needed. Perfect guidelines. I will test everything you suggested and post the results here, for those who encounter the same issue in the future.
     
  6. oliverEllmers

    Joined: Feb 24, 2014
    Posts: 2

    Hi ivanshv - did you ever get this working? Interested in your solution, as I am also trying to achieve something similar...

    I want to do perspective-correct projection mapping within Unity, and then render an unwrapped texture of the geometry that has been projected onto (with the correctly distorted projection) that can be sent off to a media server such as d3.
     
  7. ivanshv

    Joined: Apr 13, 2018
    Posts: 11
    Hi oliverEllmers! Sorry for my late reply.
    Yes, I've got things working at the level we needed them to work. The solution is not perfect at all and requires more work. Unfortunately, not every step is performed within Unity. In order to unwrap the texture according to the geometry of the display, I use a custom UV map of the texture, which is obtained through a custom calibration program based on the OpenCV (and ROS) libraries.

    I'll briefly explain the pipeline I used (a rough sketch of the Unity side follows the list):
    1) With the above-mentioned program, calibrate the display using physical/virtual display coordinates. Get a UV map of the physical display as the result.
    2) From the main camera, render to a texture on a plane whose dimensions correspond to the resolution of your projector(s).
    3) Apply the UV map from step 1 to that plane. The render texture will get unwrapped accordingly.
    4) With another camera, render from this plane to the projector.
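
    A very rough sketch of steps 2-4 in Unity terms (all names here are placeholders, and the UV-map material is whatever came out of the calibration step):

    using UnityEngine;

    public class ProjectorPipeline : MonoBehaviour
    {
        public Camera mainCamera;       // renders the scene
        public Camera projectorCamera;  // looks at the plane and feeds the projector
        public RenderTexture sceneRT;   // main camera output, applied to the plane
        public Renderer plane;          // plane sized to the projector resolution
        public Material uvMapMaterial;  // material that warps sceneRT by the UV map

        void Start()
        {
            mainCamera.targetTexture = sceneRT;  // step 2: render to texture
            uvMapMaterial.mainTexture = sceneRT;
            plane.material = uvMapMaterial;      // step 3: apply the UV map
            // Step 4: projectorCamera has no target texture, so it renders
            // the plane straight to the display/projector output.
        }
    }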

    This may look like a poor solution, but it works. I no longer work on this project, but if I did, I would integrate the OpenCV libraries into Unity and do the first step without additional programs. Then all these steps with the plane and render-to-texture could, I assume, be performed in the background with shaders. Search the Asset Store for OpenCV libraries; there are some.
    Even though, due to formalities, I cannot share the source code and that calibration program yet, don't hesitate to ask additional questions if you decide to follow this approach and need to figure out how to perform the calibration.