
Project a 3D Scene onto an LED Wall Stage

Discussion in 'General Discussion' started by hierro__, Aug 1, 2020.

  1. hierro__

    hierro__

    Joined:
    Sep 22, 2014
    Posts:
    59
    Hi all, I have an LED wall stage composed of 3 surfaces (left, right, and floor), basically a cube.

    I need to display a 3D scene on these surfaces properly. At the moment I'm using 2 display outputs to send 2 cameras, one for the left and one for the right (FOV 60, rotated on the Y axis by 30 and -30). How should I manage the floor?

    I was thinking of using a cubemap as well, but it looks very computationally intensive and the quality suffers in post-processing.

    I thank you all in advance for any suggestion.
     
  2. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,569
    From which point should the stage be viewed? From inside or from the outside?
     
  3. hierro__

    hierro__

    Joined:
    Sep 22, 2014
    Posts:
    59
    From the inside actually, so basically the LED floor should show the 3D scene's floor.
     
    Last edited: Aug 2, 2020
  4. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,569
    An orthographic camera pointing down might be sufficient for the floor, if people can walk across the LEDs.

    If the viewing position is fixed, you could try placing the camera at the spot that corresponds to the viewer's position, and fixing the FOV and aspect ratio so they match.
     
  5. hierro__

    hierro__

    Joined:
    Sep 22, 2014
    Posts:
    59
    Hi, thank you for the reply. I tried that, but it's very difficult to get the vertical walls' content to match seamlessly. I guess there are more complex techniques, and it's also a matter of what kind of content it is.
     
    Last edited: Aug 2, 2020
  6. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,569
    If all cameras are perspective, then they all should be located at the same point, but with different FOV/aspect ratio.

    Those can be determined using trigonometry.

    If the bottom camera is orthographic, then its bounds should be determined using trigonometry as well, and it doesn't matter where the side cameras are located.
    [Attached sketch: upload_2020-8-3_2-0-26.png]
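    A minimal Unity C# sketch of that layout (the component and field names, and the wall-centre transforms, are placeholders of mine, not something from this thread): keep every perspective camera at the viewer's position and aim each one at the centre of its own wall.

    Code (CSharp):
    using UnityEngine;

    // Illustrative sketch only: co-locate the two wall cameras at the viewer
    // position and aim each one at the centre of its own LED wall.
    public class WallCameraRig : MonoBehaviour
    {
        public Camera leftWallCam;
        public Camera rightWallCam;
        public Transform leftWallCenter;   // physical centre of the left wall
        public Transform rightWallCenter;  // physical centre of the right wall

        void LateUpdate()
        {
            // Both perspective cameras must sit at exactly the same point (the viewer).
            leftWallCam.transform.position  = transform.position;
            rightWallCam.transform.position = transform.position;

            // Each camera looks straight at the centre of its own wall.
            leftWallCam.transform.LookAt(leftWallCenter);
            rightWallCam.transform.LookAt(rightWallCenter);
        }
    }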
     
  7. hierro__

    hierro__

    Joined:
    Sep 22, 2014
    Posts:
    59
    Thank you a lot for your time and knowledge. I get the theory, now I'll try to make it real. You describe 2 angles in your sketch; forgive my ignorance, but I suppose the 2 angles refer to 2 different cases (perspective camera and orthographic camera). What are the height/width referring to?
     
  8. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,569
    In the example, all cameras use perspective projection.
    Height and width refer to the physical dimensions of your stage.

    For an orthographic projection for the floor, you should adjust the orthographic size (because an orthographic camera has no FOV) to match your floor, and calculate the side cameras (perspective) the same way as in the example.
    https://docs.unity3d.com/Manual/class-Camera.html
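    As a rough sketch of the floor setup (parameter names are assumptions of mine; in Unity, orthographicSize is half the vertical extent of the view in world units):

    Code (CSharp):
    using UnityEngine;

    public static class FloorCameraSetup
    {
        // Match the floor camera's orthographic view volume to the physical
        // floor panel. Dimensions are in the same world units as the scene.
        public static void Configure(Camera floorCam, float floorWidth, float floorDepth)
        {
            floorCam.orthographic = true;

            // Half the floor depth maps to the ortho size one-to-one.
            floorCam.orthographicSize = floorDepth * 0.5f;

            // Force the aspect so the horizontal extent equals the floor width,
            // independent of the output display's native aspect ratio.
            floorCam.aspect = floorWidth / floorDepth;

            // Point straight down at the virtual floor.
            floorCam.transform.rotation = Quaternion.Euler(90f, 0f, 0f);
        }
    }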

    In essence, in the case of an LED stage, you would be making the equivalent of a "perspective painting" or "3D street art", which looks correct from a specific point.


    Using correctly configured perspective cameras should allow you to replicate the effect.
     
  9. hierro__

    hierro__

    Joined:
    Sep 22, 2014
    Posts:
    59
    Hi, I have been working on this, but I still can't find a way out... I have 2 perspective cameras for the vertical walls (let's leave the floor aside for now); they have a FOV of 60 and they are rotated on the Y axis by -30 and 30 respectively, so it's just a symmetric case.
    You pointed to trigonometry to find the angle of the 2 cameras, but then how do I calculate the FOV so the 2 views match seamlessly?

    My real room is like the one in the image, and I rebuilt it in virtual space to check how it works.

    [Attached image: StageSammple.jpg]
     
    Last edited: Aug 9, 2020
  10. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,569
    [Attached sketch: upload_2020-8-10_1-57-3.png]
    Assuming the camera is looking at the center of the wall, its FOV can be calculated with the arctan function, as it effectively forms two right triangles.

    On a unit circle... (can't find the right diagram)

    x is cosine of angle A.
    y is sine of angle A.
    y/x is tan of the angle A
    x/y is cotan of the angle A
    You know y, which is half the wall width, and x, which is the distance to the wall.
    You can calculate the angle using the atan function by giving it y/x.

    By the way, for a seamless effect, all your virtual cameras that are not orthographic should be placed at the same spot in space.
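    Putting that into a small Unity C# helper (names are my own; note that Unity's Camera.fieldOfView is the vertical FOV, so the horizontal angle derived from the wall width has to be converted via the wall's physical aspect ratio):

    Code (CSharp):
    using UnityEngine;

    public static class WallFov
    {
        // For a camera looking at the centre of a wall of width W at distance D,
        // half the horizontal FOV is atan((W/2) / D), so the full angle is twice that.
        public static float HorizontalFovDegrees(float wallWidth, float distanceToWall)
        {
            float halfAngleRad = Mathf.Atan((wallWidth * 0.5f) / distanceToWall);
            return 2f * halfAngleRad * Mathf.Rad2Deg;
        }

        public static void Apply(Camera cam, float wallWidth, float wallHeight, float distanceToWall)
        {
            float hFovDeg = HorizontalFovDegrees(wallWidth, distanceToWall);
            float aspect = wallWidth / wallHeight;

            // Convert the horizontal FOV to the vertical FOV Unity expects.
            float halfVertRad = Mathf.Atan(Mathf.Tan(hFovDeg * 0.5f * Mathf.Deg2Rad) / aspect);

            cam.aspect = aspect;
            cam.fieldOfView = 2f * halfVertRad * Mathf.Rad2Deg;
        }
    }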

    Here's a decent chart:

     
  11. hierro__

    hierro__

    Joined:
    Sep 22, 2014
    Posts:
    59
    Hi, thank you for the clear explanation of your approach. To summarize: you determine the correct FOV for the Unity camera in order to get a correct projection on the real wall.
    What if I need to rotate or move the Unity camera? The virtual camera(s) are needed for an XR studio, so if the real camera moves, the virtual one should follow accordingly. What about atan2 when the camera is not perpendicular to the wall? One idea is to move the scene :)
     
  12. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,569
    Parent all cameras to the same empty GameObject, and move that object. If the object can move up/down, you won't be able to keep the bottom camera orthographic, and it'll need to become perspective. Aside from that, everything is the same.

    For calculating the FOV, just assume the scene moves along with the camera, so the dimensions are always the same and so are the angles. If you set it up correctly once, you can move the whole configuration anywhere.
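    A minimal sketch of that rig idea, assuming the three cameras already exist in the scene (names are placeholders):

    Code (CSharp):
    using UnityEngine;

    public class CameraRigMover : MonoBehaviour
    {
        public Camera leftWallCam;
        public Camera rightWallCam;
        public Camera floorCam;

        void Awake()
        {
            // Parent all cameras to this empty object, keeping their world poses.
            leftWallCam.transform.SetParent(transform, worldPositionStays: true);
            rightWallCam.transform.SetParent(transform, worldPositionStays: true);
            floorCam.transform.SetParent(transform, worldPositionStays: true);
        }

        // Move/rotate only the rig; the relative camera configuration stays intact.
        public void SetRigPose(Vector3 position, Quaternion rotation)
        {
            transform.SetPositionAndRotation(position, rotation);
        }
    }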
     
  13. hierro__

    hierro__

    Joined:
    Sep 22, 2014
    Posts:
    59
    Hola, my rig is already set up like that: 2 cameras for the vertical walls, and one orthographic. The idea is that the rig directly faces the corner where the 2 walls meet, hence each camera is rotated to the left or right by half of the FOV.
    So my approach was to choose a FOV and then move the cameras to look at the 2 walls. Your approach is different, since it implies the cameras change FOV depending on their distance from the virtual wall. I find it very smart, though I can't clearly see the reason for it; surely it has to do with obtaining a realistic point of view.
    Regarding that, I was also testing a modified camera matrix to obtain a more realistic distortion based on the "sight" of the camera rig, what do you think?
     
  14. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,569
    Well, the angles you want are based on the configuration of your stage. The stage acts like a window you look through into the world. So if the stage configuration is different, the angles would change too.

    I wouldn't bother. You could try modifying the camera projection matrix, but with an unusual projection matrix you can end up with glitches here and there, due to how projection works.

    Something I would consider if you need an unusual projection is rendering a high-resolution cubemap, and then writing a shader that samples that cubemap to render it with whatever distortion you need. This allows the creation of fisheye views, panoramic cameras, and the like. But this can be heavy on the GPU, because you may end up rendering the scene multiple times and at a higher resolution than your screen.
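    A rough sketch of the cubemap route, not a full implementation (the distortion shader itself is not shown, and names like _EnvCube are assumptions):

    Code (CSharp):
    using UnityEngine;

    public class CubemapCapture : MonoBehaviour
    {
        public Camera captureCam;           // extra camera used only for the capture
        public int faceSize = 2048;         // per-face resolution; heavy on the GPU
        public Material distortionMaterial; // its shader samples the cubemap with whatever mapping you need

        RenderTexture cubeRT;

        void Start()
        {
            // A cube render texture has six faces of faceSize x faceSize each.
            cubeRT = new RenderTexture(faceSize, faceSize, 24);
            cubeRT.dimension = UnityEngine.Rendering.TextureDimension.Cube;
        }

        void LateUpdate()
        {
            // Renders all six faces: roughly six scene renders per frame.
            captureCam.RenderToCubemap(cubeRT);
            distortionMaterial.SetTexture("_EnvCube", cubeRT);
        }
    }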