
Live Capture: How to use it with Unity Timeline & Recorder?

Discussion in 'Virtual Production' started by Brainslum, Sep 1, 2021.

  1. Brainslum


    Joined:
    Jan 10, 2017
    Posts:
    57
    Hi

    I had a few takes. But the animation files generated in the /Takes/NewShot folder only work with the Cinemachine Camera Actor, and when I checked in the Animation window it seems the animation keyframes target public variables of the Virtual Camera Actor class only.

    Is there any way for me to get the same animation file in a form that I can use on Cinemachine virtual cameras instead? That way I could create a take and use it to bridge different vcam clips on a Cinemachine track in the Timeline.

    Right now the only way I can use the Recorder along with it is to copy and paste the track from the Take Recorder into my own Timeline and put it alongside a Recorder track. Is this the only way to render out a movie file so far?

    Also, an additional question:
    1. I am using the Cinemachine Camera Actor. In this case, what is the advantage of a Cinemachine virtual camera? Is it just so that you can add noise/collision detection to the vcams?
     
  2. akent99


    Joined:
    Jan 14, 2018
    Posts:
    588
    From reading the docs (or, in other words, "use at your own risk: I have never done this, I'm just trying to understand it myself!")...
    • The Live Capture "Virtual Camera Actor" component is like a single component with all the settings for focus etc. Putting them all together into a single component makes animation clips more consistent (all the settings are in one component).
    • The Live Capture "Virtual Camera Device" component gets signals from an external camera (like a phone or ipad) and sends them to "Virtual Camera Actor". [ So you would normally (I assume) have a Virtual Camera Actor controlled either by a Virtual Camera Device, or an animation clip that you previously recorded. ]
    • The Live Capture "Physical Camera Driver" reads settings from the "Virtual Camera Actor" of the current object and sends them to a Camera on the current object - it connects the Virtual Camera Actor properties to your main camera etc.
    • The Live Capture "Cinemachine Camera Driver" reads settings from the "Virtual Camera Actor" of the current object and sends them to a Cinemachine "Virtual Camera". [ So there are two provided drivers. Both read from "Virtual Camera Actor", then control either a Camera (like the main camera) or a Cinemachine Virtual Camera. ]
    • Cinemachine "Virtual Cameras" you can drop into a timeline to control a Camera (making it easier to animate, follow, track, dolly, etc)
    So I assume you could create a game object, put a Live Capture "Virtual Camera Actor" on it, then put a Live Capture "Cinemachine Camera Driver" on it, and point the "Cinemachine Virtual Camera" property of the "Cinemachine Camera Driver" component at your Cinemachine Virtual Camera (you can probably stick it all in the one game object, I guess). The animation clip then changes "Virtual Camera Actor" properties, which the "Cinemachine Camera Driver" converts into commands that drive your Cinemachine "Virtual Camera".
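    In script form, a very rough sketch of that wiring might look like the following. Completely untested: the component class names and the namespace are my guesses from the docs, so treat every identifier here as an assumption.

```csharp
using UnityEngine;
using Cinemachine;
// Assumed namespace, going by the Live Capture package docs:
using Unity.LiveCapture.VirtualCamera;

public static class LiveCaptureRigSketch
{
    // Builds the rig described above: one object holding the Actor
    // (the single "bag of camera settings") plus the Cinemachine driver
    // that pushes those settings onto an existing vcam.
    public static GameObject CreateRig(CinemachineVirtualCamera vcam)
    {
        var rig = new GameObject("Live Capture Camera Rig");

        // All the recordable settings (lens, focus, transform) live here;
        // the Take .anim clips animate this component's properties.
        rig.AddComponent<VirtualCameraActor>();

        // Reads the Actor each frame and drives the Cinemachine vcam.
        // The driver exposes a reference to the vcam it should control;
        // I am not sure of the exact field name, so point it at your
        // Cinemachine Virtual Camera in the Inspector.
        rig.AddComponent<CinemachineCameraDriver>();

        return rig;
    }
}
```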

    Simple huh?

    https://docs.unity3d.com/Packages/c.../ref-component-cinemachine-camera-driver.html

    That would make sense... (not sure if it's the only way, but it makes sense). You would put the Virtual Camera Actor animation clip in a separate track to control moving your virtual camera around, in parallel to the Recorder track (and all the other animation of objects going on in the scene).

    I assume by "Cinemachine Camera Actor" you mean "Virtual Camera Actor"? If so, my understanding is that it does not really do anything by itself. It is just collecting all the settings in one place so a remote virtual device and animation clips have a single agreed set of properties to control all that stuff. You need to add a "Cinemachine Camera Driver" or a "Physical Camera Driver" to then read from the Actor component and control an actual Cinemachine Virtual Camera or [Main] Camera.

    Why use a Cinemachine Virtual Camera (that controls a Camera) in this case? I suspect it's up to you. I assume it makes it easier to cut between camera angles on a timeline for example. If you don't need that, you can bind the Virtual Camera Actor to a Camera directly instead.

    DISCLAIMER: Never tried any of the above. But that is my understanding reading the docs at the above link.
     
  3. akent99


    Joined:
    Jan 14, 2018
    Posts:
    588
    I will add a footnote - when using the new Sequences support, I had problems using only Main Camera type objects without Virtual Cameras. I thought "in each sequence, I can just skip the virtual cameras when I don't need different camera angles etc." But in practice I ran into problems (I already forget what exactly!). I think things tried to find the main camera by tag, and got more easily confused as I created a new Main Camera per sequence (so I could change Bloom settings etc. per sequence).

    I should go back and try again, but basically I found things worked more reliably when I always had a Cinemachine Virtual Camera and animated that to control the main camera in the sequence.
     
  4. Brainslum


    Joined:
    Jan 10, 2017
    Posts:
    57
    Thanks. I have actually been using it exactly how it was supposed to be set up.

    But the Virtual Camera Actor is a rather limiting component; using it prevents you from mixing Cinemachine clips together, because they animate different properties.

    Again, my question remains: the output .anim file acts on the Virtual Camera Actor. It's a really isolated animation clip, and it won't mix and match with other camera anims (say you rig up a shot in Unity or procedurally).

    My question is how we can get an .anim that acts on the virtual camera properties directly, not through a handler. My only bet right now is to record the virtual camera (not the actor) AGAIN with a Recorder track that outputs the .anim file in the Timeline. But that is really, really redundant. Plus this feature only records the child position, meaning it's not an absolute world position, so you would have to leave the virtual camera either with no offset from world position, or placed at world position with no parent. Very impractical.

     
  5. Brainslum


    Joined:
    Jan 10, 2017
    Posts:
    57
    Right now, going by the docs and tutorials, the end point of the whole thing is "an .anim file that applies to the Virtual Camera Actor ONLY".

    I guess the better way to understand my question is: how can we conveniently get an .anim file that acts on, say, the Camera or a vcam, without having to manually use a Recorder track to record its animation values? (In the past I even had to use a custom script that spits out an .anim file, because the Recorder track for .anim is quirky.)

    I think this virtual production feature is a good starting point for drafting and iterating on a shot. Its best value would be if we could continuously iterate on vcam/camera .anim files, even ones not captured by Live Capture (say, a shot rigged up in Maya).
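    (To illustrate what I mean: in principle a conversion could probably be hacked together in an editor script with AnimationUtility, remapping the clip's curve bindings from the actor component onto a plain Transform. Untested sketch; the property-name handling below is pure guesswork and would need checking against the actual bindings shown in the Animation window.)

```csharp
using UnityEngine;
using UnityEditor;

public static class TakeClipConverterSketch
{
    // Copies curves from a Take .anim (targeting the actor component)
    // onto new bindings that target standard Transform properties instead.
    public static AnimationClip ConvertTransformCurves(AnimationClip source)
    {
        var result = new AnimationClip { frameRate = source.frameRate };

        foreach (var binding in AnimationUtility.GetCurveBindings(source))
        {
            // Position/rotation curves use the standard property names,
            // so they can simply be retargeted at the Transform type.
            // Lens/focus curves would need a hand-made mapping and are
            // skipped here.
            if (!binding.propertyName.StartsWith("m_LocalPosition") &&
                !binding.propertyName.StartsWith("m_LocalRotation"))
                continue;

            var curve = AnimationUtility.GetEditorCurve(source, binding);
            var newBinding = binding;
            newBinding.type = typeof(Transform);
            AnimationUtility.SetEditorCurve(result, newBinding, curve);
        }
        return result;
    }
}
```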
     
  6. akent99


    Joined:
    Jan 14, 2018
    Posts:
    588
    Ahhh, sorry, I misunderstood. That makes more sense now. Sounds like a question for the Unity folks.


    Side note: I have been hitting a similar problem as soon as there are two paths to control something. E.g. do I use locomotion via animation clips with root motion enabled, or via procedural scripts? Sometimes one works better than the other for a specific scene. A pattern I am starting to use is to add components to the prefabs but leave them disabled. Then when I set up a scene, I drop the prefab into the scene and turn on the appropriate components for that scene.

    It is not very satisfying, but I have not been able to map everything down to a single common form. (I have not tried using the Recorder to create an .anim file though.)

    I guess what I was wondering is whether you can live with one approach or the other for a specific scene. Do you really need to mix them? If you can get away with not mixing them (just using one or the other in different sequences), then add the Actor component etc. to prefabs but disable it (and only enable it when replaying Virtual Camera app Take .anim files).

    For my project, I just use Cinemachine virtual cameras directly. The extra complexity of using iPhones etc. for controlling a camera has not been worth it yet for me. My needs are very simple; 95% of the time the camera is not even moving.
     
  7. Sergi_Valls


    Unity Technologies

    Joined:
    Dec 2, 2016
    Posts:
    212
    We can explore making a small tool to convert clips from VirtualCameraActor to CMVirtualCamera/Camera. Would that solve the issue?

    Cheers
     
  8. SuperPoopsy


    Joined:
    Jan 20, 2020
    Posts:
    17
    @Sergi_Valls Sorry for necroing this thread, but has the small tool to convert clips you mentioned ever been made?
    I'm in a similar situation to @Brainslum. More specifically, I love being able to record "natural" camera movements with Live Capture (and to be honest, 99% of the time I really just need camera rotation and position), but the resulting clip is basically an isolated file that I can either use exactly as it is, or try to adapt, which becomes so cumbersome that it's probably less trouble to animate a camera by hand in the first place.

    A conversion tool would really help me out.
     
    Last edited: Jul 21, 2022