
Realtime VCam Tools in Unity for Filmmakers (manual camera operating, mimicking real cameras etc)

Discussion in 'Virtual Production' started by fablemaker, Mar 7, 2021.

  1. fablemaker

    fablemaker

    Joined:
    May 11, 2020
    Posts:
    12
    Sharing some development work and tests on Unity tools for realtime film-makers...

    As a film-maker working in the live-action and visual effects industries, and having been blown away by Unity's near photo-real demos (the Adam short film, the Neon demo, The Blacksmith, The Heretic and Book of the Dead), I find the idea of using Unity to shoot photo-real camerawork virtually, in real-time, very exciting. Especially now, with current graphics hardware (e.g. Nvidia's 2070 and 3080 series cards) a generation ahead of what those tech demos ran on, it's becoming possible to create film visual effects shots that would otherwise cost millions inside a real-time game engine. The issue so far is that very few film-makers have managed to get to grips with Unity's huge, powerful, yet enormously complex and wide-ranging features and workflows - understandably, because they are for the most part aimed at game makers and developers (and because mobile game makers make up much of the Unity customer base, some of the film toolsets haven't been used much in anger yet).

    However, it is only a matter of time before virtual cinematographers will be able to previs AND fully shoot scenes in engine, create mood and lighting instantly, and ultimately shoot an entire film virtually - whether in previs or in post - integrating motion capture or greenscreen virtual production into photoreal real-time environments, with real-time control of lighting and camera. Current virtual production trends are focusing on video walls, but that is only one side of the coin. Creating visual effects in engine is where things are going, especially for independent films that don't have the $200 million budget of a major VFX film. The creative opportunities that realtime rendering opens up are also potentially ground-breaking, and can give huge creative freedom to directors of such VFX projects.

    I wanted to take a workflow like this excellent short film which was done in Unreal, and implement some film-maker friendly real-time tools to leverage Unity's fantastic HDRP, volumetric fog, and some of the hq environment assets that already exist. Specifically I wanted to create a film-maker friendly camera tool, to add cinema style camera movement controls to easily and manually move around a scene and allow film-makers to shoot virtual film style dolly, crane, steadicam moves with the feeling (almost) of a real camera, and manual control (as opposed to cinemachine's automatic camera targeting) plus with real-time lens/focus control.

    I downloaded the Unity Book of the Dead environment and The Hunt (two of the most realistic environments I have seen) and started working by hacking together a camera controller...

    First off: a virtual cinema camera controller, for manual control of camera movement and lens in realtime.

    Here are a few demo/test videos of realtime camera operating in the engine. The camera is controlled using only keyboard and mouse in these tests. An Oculus Quest or an Android phone would also be an option as a virtual camera, but that needs a bit of extra integration work. For some types of shots it's quicker without the added complexity.

    VCam Tests using BOTD forest tech demo:


    This was shot and captured in real-time (basic focus/zoom controls and dolly) in the BOTD forest environment. (This uses the old 2018 depth of field, which has been improved on significantly in more recent HDRP versions, HDRP 10 onwards.)

    Tests using The Hunt environment (by Beffio) + some virtual production greenscreen shots
    Motion control style camera moves (technocrane style etc)

    This was shot and captured in real-time. Volumetric fog was controlled with the mouse wheel, live, to find a suitable mood/look. Using the mouse wheel to control focus and zoom (e.g. hold right-click to zoom) is fairly intuitive. The camera is driven with the keyboard WASD keys (with tuned smoothing/weight and damping to simulate crane/dolly-style movement). A couple of green-screen shots are included here too as examples. (Elephant thanks to Malbers Animation!)
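    As an aside, the "tuned smoothing/weight and damping" above boils down to frame-rate-independent exponential smoothing of the camera's velocity toward the raw WASD input. Here's a minimal, engine-agnostic sketch - note the time constant value is a made-up tuning number for illustration, not the one used in these tests:

    ```python
    import math

    def damp(current, target, tau, dt):
        """Frame-rate-independent exponential smoothing toward a target.
        tau is the time constant in seconds: larger tau = heavier,
        more crane-like motion; smaller tau = snappier, handheld feel."""
        alpha = 1.0 - math.exp(-dt / tau)
        return current + (target - current) * alpha

    # Simulate a camera ramping up to full dolly speed after a 'W' press.
    velocity = 0.0
    target_speed = 1.0   # normalised input from WASD
    tau = 0.5            # hypothetical tuning value
    for _ in range(120):                 # two seconds at 60 fps
        velocity = damp(velocity, target_speed, tau, dt=1 / 60)
    print(round(velocity, 3))  # approaches 1.0 but never overshoots
    ```

    The nice property of the exp() form is that the feel stays the same whether the engine runs at 30 or 120 fps, which matters if you're recording moves for later playback.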

    Another demo/test using the BOTD forest environment

    Realtime camera operating (and some slightly rough real-time lighting control tests!). There's lots of lovely detail in this BOTD environment.

    The idea is to be able to easily move the camera with 'film language', in realtime, around the environment - for cinematographers and directors who are accustomed to how physical cinema cameras and lenses behave. There are so many use cases: a director's viewfinder for framing up and finding shots in an environment, or working with a virtual art department (moving props around in realtime etc.). Most powerful of all for pre-viz, the DP can control the lights and haze in realtime as well, and shoot virtually with a camera that feels and moves a bit like a real one.

    I looked at the most intuitive control devices (phone, VR headset etc.) but settled on mouse and keyboard control with weight and damping, as this was just faster to work with for dev purposes. All of the others are possible with a bit more development, though (Oli has great tools for the Oculus Quest). I also wanted to be able to record those moves, but have found that Unity Recorder does not yet support recording of the DoF params, so this needs more coding work. So far the above videos are captured and rendered in real-time; using an offline renderer like Deckard Render could boost the quality massively.

    Plus I then added control over DoF, exposure, and other real camera/lighting parameters in real-time, so you can effectively shoot your movie in realtime (in play mode), or work like a cinematographer. One of the coolest things is being able to control the amount of haze in a scene and how the light picks it up, or to move lights around in realtime. Some of it is quite a hack (e.g. implementing DoF control in Unity 2018, an older version which was necessary to run these very high quality environments).
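    For what it's worth, the optics behind the DoF control are straightforward to sketch: in the thin-lens model, the circle of confusion (blur spot) for a point grows with its distance from the focus plane, the focal length, and the aperture. A rough illustrative sketch - the lens values below are just example numbers, not what the tool uses:

    ```python
    def circle_of_confusion(f, N, focus_dist, subject_dist):
        """Thin-lens circle-of-confusion diameter (same units as f).
        f: focal length, N: f-number, distances measured from the lens."""
        A = f / N  # aperture diameter
        return (A * (f / (focus_dist - f))
                  * abs(subject_dist - focus_dist) / subject_dist)

    # 50mm lens at f/2, focused at 3 m: blur size for a subject at 6 m.
    f = 0.050
    coc = circle_of_confusion(f, 2.0, 3.0, 6.0)
    print(f"{coc * 1000:.2f} mm")
    ```

    Mapping the mouse wheel onto focus_dist (ideally on a logarithmic scale, since focus marks on real lenses are spaced that way) gives you the pull-focus feel.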

    There are still a few challenges to integrating film-centric workflows into Unity, as the needs are very different from those of game devs. Control and recording of DoF, other camera settings, and post-processing effects are not yet integrated into current versions of Unity in a way that makes sense to a live-action filmmaking and visual effects pipeline. Recording takes, or integrating greenscreen footage shot with professional camera codecs (ProRes), is also not possible yet unless you run Unity on a Mac - and as we all know, Macs are not GPU-friendly, while a high-end real-time PC gets you the best GPU power, so that isn't a good solution for film-makers.

    With a lot more development of a complete virtual production toolset it will be possible to create photo-real live-action style films in Unity, and this could certainly open up the world of virtual production to independent film-makers.

    The main challenges for this type of work are:

    * Achieving film-quality environments and lighting (there are few film-quality environments on the Asset Store currently, and you need a powerful GPU, e.g. a 2070 or better, to run them in realtime).
    * Controlling the camera and focus (DoF) manually, as a real film camera might move. (I got some way towards a solution to this, although it's a cheat. The latest versions of HDRP, in Unity 2019 and later, are much better than the above though!)
    * Tools for integrating greenscreen footage into Unity (there are some very promising chromakey tools in development, but as of 2020 Unity does not support professional video compositing codecs such as ProRes, except on Macs, which don't support the fastest GPU cards).
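    On the chromakey point, the core of a simple keyer is independent of any particular codec or Unity version: compute each pixel's distance from the key colour in chroma (Cb/Cr) space, and ramp alpha between an inner and outer threshold for soft edges. A toy per-pixel sketch - the thresholds are made-up tuning values, and real keyers do a lot more (spill suppression, edge blending etc.):

    ```python
    def rgb_to_cbcr(r, g, b):
        """BT.601 chroma components for 8-bit RGB."""
        cb = -0.1687 * r - 0.3313 * g + 0.5000 * b + 128
        cr =  0.5000 * r - 0.4187 * g - 0.0813 * b + 128
        return cb, cr

    def chroma_key_alpha(pixel, key=(0, 255, 0), inner=40, outer=90):
        """Alpha from chroma distance to the key colour:
        0 inside `inner` (fully keyed out), 1 beyond `outer` (fully
        opaque), with a linear ramp in between for soft edges."""
        cb, cr = rgb_to_cbcr(*pixel)
        kcb, kcr = rgb_to_cbcr(*key)
        dist = ((cb - kcb) ** 2 + (cr - kcr) ** 2) ** 0.5
        if dist <= inner:
            return 0.0
        if dist >= outer:
            return 1.0
        return (dist - inner) / (outer - inner)

    print(chroma_key_alpha((30, 240, 25)))    # near-key green: keyed out
    print(chroma_key_alpha((200, 150, 120)))  # skin tone: kept opaque
    ```

    Keying on chroma rather than raw RGB is why it survives the normal brightness variation across an unevenly lit green screen.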

    Some useful/essential components to get the look above were:

    * HDRP volumetric fog (needed to realistically simulate light and haze, it was a bit of a hack in Unity 2018 but more recent versions have better support)
    * Bloom (just a touch makes it feel like a real lens/glass)
    * Temporal anti-aliasing - takes out any jaggies, avoids a game look
    * Post Processing (ambient occlusion helps a lot with realism)
    * Depth of Field. Unity's standard post processing stack has fairly limited DoF, but the HDRP version is much better in quality. The above tests were done with the standard 2018 depth of field (which is not the latest/best).
    * Recorder (useful for recording output, but using Nvidia's realtime screen capture gives better performance)
    * Running the game view resolution up to 3840x2160 helps give smoother edges and greater realism (but negatively impacts depth of field blur size, and performance is also affected)
    * Tuning ambient occlusion and realtime fog settings is crucial to try to get towards photorealism. There isn't really a preset for this.
    * Using color grading post processing to try to get a more filmic look.
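    On that last grading point: a big part of the "filmic look" is the tone curve, which rolls highlights off smoothly instead of clipping them (HDRP's Tonemapping effect offers an ACES mode for this). Purely as an illustration of the shape, here's Krzysztof Narkowicz's well-known ACES filmic approximation:

    ```python
    def aces_filmic(x):
        """Narkowicz's ACES filmic tonemapping fit. Maps scene-linear
        luminance [0, inf) into display range [0, 1], compressing
        highlights smoothly rather than clipping them."""
        a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
        y = (x * (a * x + b)) / (x * (c * x + d) + e)
        return min(max(y, 0.0), 1.0)

    for stop in (0.18, 1.0, 4.0, 16.0):   # mid-grey up to bright highlights
        print(f"{stop:5.2f} -> {aces_filmic(stop):.3f}")
    ```

    The useful property for us: a blown-out sky or a practical lamp in shot rolls off gently, the way film stock (or a modern cinema camera's log curve) would, instead of the hard video-style clip that screams "game engine".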

    For anyone interested in these workflows, I would also point you to the tools of OliVR, who has created amazing tools such as Deckard Virtual Camera (for Oculus Quest control) and Deckard Render (which makes photoreal non-realtime Unity rendering a reality), plus his upcoming chroma keyer tools - and because he is genuinely an amazing creator who knows film-making workflows and Unity development inside out.

    Also, the professional tools created by DMM, shown at high-level film industry conferences, are looking extremely powerful, and the future looks very bright for the flexibility Unity is able to offer developers creating tools.

    If there is interest in trying an .exe build of this prototype camera controller, there's one here which lets you fly the camera around in a basic way (and control zoom, DoF etc. with the mouse wheel). It's quite limited to the scene in question, and is just a proof of this type of camera control (using keyboard and mouse only).
    Demo Build (.exe, PC only, needs an Nvidia 1080 or higher):
    https://drive.google.com/open?id=11tSYAjgN-8CUlfFu-1G9dNEpBjm0DZFz

    Any feedback would be interesting to hear, especially from other live-action filmmakers, and I will try to get back to you - although bear in mind this is something of a side project at the moment. :)

    Let's see what Unity can make possible next... :)
     
    Last edited: Apr 28, 2021
  2. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    1,281
    These tests are looking lovely!
     
    fablemaker likes this.
  3. imaginationrabbit

    imaginationrabbit

    Joined:
    Sep 23, 2013
    Posts:
    349
    Interesting test! I have a 3090, but it still seemed to run at a low fps.

    I released a little filmmaking/animation toy made in Unity that has its own camera system. You might want to check it out as a reference, as it allows really fast camera setups, DoF point selection etc. - you can get it here on Steam.

    For reference- I'm a professional animation filmmaker/game developer using Unity to make feature films/interactive projects so I designed that tool based on my experience quickly producing animation.
     
  4. fablemaker

    fablemaker

    Joined:
    May 11, 2020
    Posts:
    12
    Thank you kindly @newguy123!

    And imaginationrabbit, thanks for trying that little test build (your camera game on Steam looks really fun - and strange! hah!). Thanks for sharing, will take a look at it - autofocus would be handy for tools like these, I agree. My demo build is probably only optimised for my hardware, I guess. I think it was getting 24fps+ on my PC (Intel i5, 1070), but there are a lot of lights in the sky, so it might lag if you tilt up. There could be many other factors of course, as I'm sure you know - no guarantees, but thanks for trying! :)
     
    Last edited: Mar 13, 2021
  5. Allan33

    Allan33

    Joined:
    Mar 13, 2021
    Posts:
    5
    Hi there - I am very interested in your work and findings, although my perspective is more mechanical in nature, since I work as a motion control telescopic crane operator. What I need is for the VCam to be "connected" to the physics of the real rig, so I know that I can actually make the camera moves in the real world - I want to use Unity as a tech-viz tool. I'm new to Unity but find it very inspiring to work with. I agree that the Cinemachine "aim"-style camera moves can feel somewhat unnatural - we often use backpan compensation, but rarely targeting, when using cranes on a film set.

    I like to use real-life tools for the operators, like a tracked iPad-style VCam, so I can use my (or the DoP's, director's, etc.) actual body movements. I want the camera operating to be as close as possible to a real camera operating setup, so we utilise the many years of operating/framing skills that operators have.

    I myself am in the process of rigging a model of the moco crane I use - trying to get my head around the animation rigging etc.

    Also I got Unity to implement the InertiaWheels - that's very useful.

    Thx for sharing

    Allan /
    Motion Control Specialist - TD operator
     
    Last edited: Apr 10, 2021
    fablemaker likes this.
  6. fablemaker

    fablemaker

    Joined:
    May 11, 2020
    Posts:
    12
    Super interested in your work there Allan33, with the motion control stuff!
    And yes - totally agree that for tech-viz we need tools that experienced operators like you guys can drive, to help with cinematic framing choices and movement styles - with an accurate simulation of the real equipment, like you say, and inertia wheels for controlling the motion control cranes/dollies.

    My attempts so far were aimed at directors designing shots in 3D pre-viz (or post-viz, in the case of the film project I was working on, which was shot on green with multi-cam). The benefit, creatively, is a more introspective/liberating process for directors trying out ideas using 3D storyboarding, complete with mood/colour/light and decent quality rendering, so those ideas can be a lot more refined by the time the tech-viz pass begins. I can see this sort of thing being used quite a lot in post-viz for certain types of virtual production, as it will help the VFX teams lock down camera movement faster if the director can operate cameras virtually in post.

    The next big need for those projects to be practically possible is slick and robust multi-user editing, to allow the principal film crew (art dept, camera, lighting - around 5-6 people) to work together virtually in the engine. Scene Fusion seems to be the only solution for that so far, which I'm keen to try out. Not sure if it yet supports realtime collaboration on large near photo-real scenes like Fontainebleau or Book of the Dead though...