Raycast to plane of rendertexture (solved)

Discussion in 'Scripting' started by phensch, Nov 21, 2008.

  1. phensch

    phensch

    Joined:
    Oct 13, 2008
    Posts:
    23
    Hi.

    I have a tricky problem:
    I have a main plane with a rendertexture on it. The rendertexture shows a plane with a button which is far away from the main plane.
    Now if I do a raycast from the render texture camera with the mouse screen coordinates, my button script won't react.

    Code for the raycast:
    Code (csharp):
    Physics.Raycast(m_RenderTextureCam.ScreenPointToRay(Input.mousePosition), out hit, 100);
    Anyone have a hint? Thanks for the help.
     
  2. Dreamora

    Dreamora

    Joined:
    Apr 5, 2008
    Posts:
    26,602
    What does your button code look like?
    What does the routine look like that calculates the virtual coordinates based on the RT?
    How far away is the button? If the raycast is meant to hit it physically, make sure you cast the ray far enough to even be able to reach it.
     
  3. phensch

    phensch

    Joined:
    Oct 13, 2008
    Posts:
    23
    The button script works if I put it on the main plane and make the raycast from the main camera, so the screen coords and the script are correct.

    Well... I don't calculate them. I pass them over to the camera of the render texture, as shown in the code.

    Both the main cam and the render texture cam are at the same height, and the button is at the same distance from each camera.


    I have attached a small picture that explains my problem better.
     

    Attached Files:

  4. phensch

    phensch

    Joined:
    Oct 13, 2008
    Posts:
    23
    Hm... seems to be a more complex problem :)

    The main problem seems to be the calculation of the mouse position:

    The main camera is at world position (0, 0, 0).
    The render camera is at world position (20, 0, 0).

    Now how can I apply this difference to the mouse position? How many pixels correspond to 1.0 world unit?

    Anyone have a hint? :)
     
  5. DGuy

    DGuy

    Joined:
    Feb 7, 2007
    Posts:
    187
    Does the button have a collider of some type associated with it, so that the ray has something to hit?
     
  6. phensch

    phensch

    Joined:
    Oct 13, 2008
    Posts:
    23
    Well yes. As I mentioned before, the button works fine when I use it on the main plane. It is the same button (duplicated). :)

    edit:
    I think the problem is the calculation of the mouse input. If anyone knows how to convert the distance between the two positions into pixel coordinates, I would be very thankful :D
     
  7. DGuy

    DGuy

    Joined:
    Feb 7, 2007
    Posts:
    187
    How are you drawing your button: Unity GUI or your own code?


    Is the MainPlane a flat piece of geometry with a MainPlaneTexture associated with it, and are you copying the RenderTexture into the MainPlaneTexture?

    - OR -

    Is the RenderTexture on a separate piece of geometry that floats above the MainPlane?

    - OR -

    Is the RenderTexture being used to texture the whole MainPlane?


    It seems you want to render your UI to a separate texture so you can manipulate it in cool ways, but you still want the user to be able to interact with the UI as if it were being drawn directly to the display. Is that correct?


    I apologize if the questions overlap or you feel you've answered them already; I'm just trying to get a better idea of the problem before I attempt a solution. Because I'm looking at it and thinking, "Well, depending on what he wants to do, it's either a simple solution or a kinda' tricky one" ... ;)
     
  8. phensch

    phensch

    Joined:
    Oct 13, 2008
    Posts:
    23
    Unity GUI. I used the editor to drag the buttons into the scene. Both buttons work well if they are on the main plane. Of course the button moved to the render texture won't work anymore, as it doesn't get the raycast :(

    The main plane has its own background texture. Above the main plane, there is another plane with the render texture on it. The main plane should be unchanged, as the real game will be shown on the render texture.

    Exactly! :D The user clicks what he can see, but manipulates the scene which is far away.


    No problem, I meant no offense (sorry if it sounded that way)! The problem seems to be a tricky one; if not, you could have spared me a hard workaround :)
     
  9. phensch

    phensch

    Joined:
    Oct 13, 2008
    Posts:
    23
    Ok, I solved the problem myself :)

    For those who run into the same problem:

    The trick was not to change the mouse coords; instead, a second ray was created and its origin was translated by the delta distance between the two cameras.
    With the second ray, I made a second raycast and checked whether it hit something (this time among the objects shown on the render texture).
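
    A minimal sketch of the offset-ray idea described above (the variable names m_MainCam and m_RenderTextureCam are assumptions for illustration, not the poster's actual code):

    Code (csharp):
    // Build the usual ray from the main camera through the mouse position.
    Ray mainRay = m_MainCam.ScreenPointToRay(Input.mousePosition);

    // Shift the ray origin by the offset between the two cameras, so the
    // second ray starts in the part of the scene the RT camera is looking at.
    Vector3 delta = m_RenderTextureCam.transform.position - m_MainCam.transform.position;
    Ray offsetRay = new Ray(mainRay.origin + delta, mainRay.direction);

    RaycastHit hit;
    if (Physics.Raycast(offsetRay, out hit, 100))
    {
        // hit.collider is now an object in the render texture camera's view,
        // e.g. the far-away button.
    }

    Note this only works when both cameras share the same orientation and projection, as in the setup above; otherwise the ray direction would have to be transformed as well.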
     
  10. DGuy

    DGuy

    Joined:
    Feb 7, 2007
    Posts:
    187
    Congrats! :)
     
  11. ianjosephfischer

    ianjosephfischer

    Joined:
    Mar 6, 2012
    Posts:
    13
    I'm facing a similar problem. You say you translate the second ray by the delta distance. What is the delta distance, from what to what?
     
  12. thempus

    thempus

    Joined:
    Jul 3, 2010
    Posts:
    61
    @ianjosephfischer If the first raycast hits the screen object, then you take hit.textureCoord and pass it as a parameter to the render texture camera's ViewportPointToRay method:

    Code (csharp):
    ray = RenderTextureCamera.ViewportPointToRay(hit.textureCoord);
    Physics.Raycast(ray, out hit, Mathf.Infinity);

    hit.textureCoord only returns a proper value when the ray hits a MeshCollider. In my test scene I have the render texture applied to a plane with a MeshCollider using a quad mesh.
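
    Putting the two steps together, a sketch of the whole technique might look like this (ScreenCam and RenderTextureCamera are assumed names; the plane displaying the render texture needs a MeshCollider so textureCoord is valid):

    Code (csharp):
    RaycastHit screenHit;
    // 1. Raycast against the plane that displays the render texture.
    if (Physics.Raycast(ScreenCam.ScreenPointToRay(Input.mousePosition), out screenHit, Mathf.Infinity))
    {
        // 2. textureCoord is the UV (0..1) where the plane was hit; viewport
        //    coordinates use the same 0..1 range, so feed it to the RT camera.
        Ray rtRay = RenderTextureCamera.ViewportPointToRay(screenHit.textureCoord);

        RaycastHit worldHit;
        if (Physics.Raycast(rtRay, out worldHit, Mathf.Infinity))
        {
            // worldHit.collider is the object "inside" the render texture,
            // e.g. the button.
        }
    }

    Unlike the offset-ray workaround earlier in the thread, this works even when the two cameras have different positions and orientations, since the UV-to-viewport mapping does the conversion.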
     
    Last edited: Sep 8, 2013