
Virtual Camera - First Look - Questions and comments

Discussion in 'Virtual Production' started by Noisecrime, Feb 12, 2022.

  1. Noisecrime

    Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    2,058
    Just happened to have seen this feature mentioned in the 'Here's what's new in Unity 2021.2' video on Unity's YouTube channel, and by chance I'm working on a project that could benefit from such a feature, so I figured I'd give it a spin.

    Firstly, I'm really pleased to see Unity committing resources to these sorts of endeavours, especially the virtual camera and face tracking. Whilst not every developer might use them, I think in time they will become an essential tool for developing 3D software, as they open up so many possibilities. I'm also impressed with what I've seen and experienced so far with the virtual camera and its app; it's all very professional. The 'Getting Started' guide was very useful and I was able to get up and running within minutes.

    Most of what I'll discuss here is in direct relation to my current project, so I figured I'd provide a brief outline first. I've been tasked with providing an extensive amount of video of dancers recorded using an MS Kinect and rendered using Unity VFX. The final video will be projection mapped onto a building behind a live band playing a one-hour set of music to which these virtual dancers are dancing. I figure having a virtual camera might simplify the process of animating the camera in the scene.


    Edit:
    Slight problem: it seems installing the Live Capture package in 2021.2.8f1 causes an instant hard crash when trying to load RenderDoc. I confirmed this happens repeatedly, even after deleting the Library folder and eventually creating a totally new project with no additional code, just the default core HDRP 3D packages. Once I removed the package from my project I was able to load RenderDoc again.

    Uploading a bug report: Case 1402654



    Question - When will this be available for Builds?
    Will the virtual camera and/or face tracking be supported in builds? Ultimately I feel this will be essential, and the product is only half complete without it. While currently I could use the virtual camera to pre-record my camera animation, being able to deliver a build of the project to my client and let them direct the camera would open up a whole new avenue.


    Question - Why are there so many camera type components?
    My only real confusion in Unity comes from the many different components that all share various camera settings that are similar or the same, yet apparently are not actually shared.

    I'm also confused as to why when I created a Virtual Camera Actor it didn't automatically clone my main camera settings or give me the option to do so.

    It seems odd to me that the VirtualCameraDevice has its own camera settings for lens and camera body and isn't using the Virtual Camera Actor's camera settings. More so that the VirtualCameraActor has its own lens, intrinsics and camera body, and then that GameObject also has a Unity Camera with its own lens and camera body, yet none of these settings are synced or shared.

    I guess there must be a good reason for this, but in my tests I simply set everything to be the same and that worked well. Maybe there could be a feature to clone the settings between all these camera types?
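    For what it's worth, the "clone the settings" idea can at least be approximated for the plain Unity Camera side with a small helper. This is only a sketch under assumptions: it copies the built-in physical-camera properties from one `UnityEngine.Camera` to another (e.g. from the scene's main camera to the one on the Virtual Camera Actor's GameObject), and the helper name is hypothetical. The Live Capture package's own lens/body fields on the VirtualCameraDevice are a separate API and would still need matching by hand.

    ```csharp
    using UnityEngine;

    // Hypothetical helper: copies physical-lens settings from one Unity Camera
    // to another, so at least the two plain Camera components agree. Uses only
    // the built-in UnityEngine.Camera API.
    public static class CameraSettingsCloner
    {
        public static void CopyPhysicalSettings(Camera source, Camera target)
        {
            target.usePhysicalProperties = true;          // enable physical camera mode
            target.focalLength   = source.focalLength;    // mm
            target.sensorSize    = source.sensorSize;     // mm (width, height)
            target.lensShift     = source.lensShift;
            target.gateFit       = source.gateFit;
            target.nearClipPlane = source.nearClipPlane;
            target.farClipPlane  = source.farClipPlane;
        }
    }
    ```

    You could call this from a `[ContextMenu]` method or a small editor button, e.g. `CameraSettingsCloner.CopyPhysicalSettings(Camera.main, actorCamera);` after creating the actor.
    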

    There are also a large number of 'professional' camera settings here that I'm just not used to and don't really want or use. For simpler projects it would be nice to have a simple component that automatically deals with these.

    Question - Playback hooks
    For my use case I'm going to be recording the camera motion around animating objects (a VFX point cloud from the Kinect). Assuming I can find some way to trigger the Kinect playback, does the Take Recorder have any hooks or events that I can use to detect when it starts playing a take?
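    In the absence of official hooks, one stop-gap pattern is edge-detection polling: a component that checks an "is a take playing?" query each frame and raises UnityEvents on the rising and falling edges. The sketch below is deliberately API-agnostic, since I don't know what state the Take Recorder exposes; you supply the query yourself (e.g. by inspecting the Take Recorder component or a Timeline PlayableDirector), and the class and member names here are my own invention.

    ```csharp
    using System;
    using UnityEngine;
    using UnityEngine.Events;

    // Hypothetical stop-gap until official Take Recorder events exist:
    // poll a user-supplied "is playing" query each frame and raise
    // UnityEvents when playback starts or stops.
    public class TakePlaybackWatcher : MonoBehaviour
    {
        public Func<bool> IsPlayingQuery;   // assign from your own code
        public UnityEvent OnTakeStarted;    // e.g. start Kinect playback here
        public UnityEvent OnTakeStopped;    // e.g. stop/reset Kinect playback

        bool wasPlaying;

        void Update()
        {
            if (IsPlayingQuery == null) return;
            bool isPlaying = IsPlayingQuery();
            if (isPlaying && !wasPlaying) OnTakeStarted?.Invoke();   // rising edge
            if (!isPlaying && wasPlaying) OnTakeStopped?.Invoke();   // falling edge
            wasPlaying = isPlaying;
        }
    }
    ```

    Polling once per frame is cheap, and wiring the events up in the Inspector keeps the Kinect start/stop logic decoupled from whatever the recording API turns out to be.
    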


    Comments - The App

    Reset Pose Button

    Generally this looks and works great. My only real gripe is that the Reset Pose button needs to be much more prominent and have a better icon. It took me way too long to discover it, and it's pretty much essential in order to start viewing the scene in the right place. I'm not sure I like it being combined with the reset lens option either.

    Had I used the help system I might have found it sooner, but I'd also prefer an auto-popup help system that walks through all the main controls instead of having to click on them (though that's great for later exploration). Ultimately I feel such a button should have been discoverable through use of the app instead of being tucked away to the side.

    Tracking Loss
    The other issue I ran into, which stumped me for a while, was losing tracking, upon which I seemed to be randomly re-positioned in the scene. I eventually worked out that, having cleared out my workspace for recording with the Kinect, I'd removed any trackable features from most of the walls. So when I was focusing on a virtual object in the scene but got too close to a wall, it lost tracking.

    I'd like to see a pop-up in the app for situations where this happens as I can see it being very confusing for some users and it was quite jarring.


    Comments - Unity

    Streaming - Not working
    It appears that if you don't have the Game view active you will not get an active stream on the device, or maybe there is one but it's just black. It took me a while to discover why the app didn't appear to be streaming when I had the Scene view active and the Game view was not. It's a bit weird though, as I thought I started in the Game view and streaming wasn't working, so I switched to the Scene view to check the virtual camera motion, so maybe something else went wrong.


    Streaming - Performance
    Performance wasn't great, but that's likely a combination of elements, such as old hardware. I was also testing in edit mode, as the 'Getting Started' PDF seemed to suggest, but I'm sure I read elsewhere that in edit mode you might get more hitches.

    I tried changing the video settings and reduced my game view to 1280x720, but it still hitched.

    I tried using NVENC H.264, but it just gave an 'InvalidOperationException: Encoder failed during initialization.' error. Having a 3060 Ti, I'm pretty sure this is supported on my hardware, and I'm sure I'm using that setting in other software (not running at the same time) without an issue. The error log didn't provide any other useful details, so I'm not sure what I can do to debug this.

    Server Problem
    Generally I find that as long as I don't 'stop' the server in Unity, the app connects fine. However, as soon as I stop the server and then restart it, no matter what I do on the PC or on the iPad I cannot reconnect, and the app can't even discover the server! The only way to get things working again is to remove the server and create a new one.


    Well, that's it for my first foray into the virtual camera. Like I said, it seems very impressive and generally well designed, though a little confusing for the layperson. I look forward to playing around with it more when I have time, and to seeing if I can actually use it for my current project.
     
    Last edited: Feb 12, 2022
    marc_tanenbaum likes this.
  2. marc_tanenbaum

    marc_tanenbaum

    Unity Technologies

    Joined:
    Oct 22, 2014
    Posts:
    637
    Hi @Noisecrime and thanks for this excellent, detailed report. I've forwarded it to the team for them to consider.

    Some responses to a few key points/questions you raise:

    Question - When will this be available for Builds?
    The short answer is that this isn't a high priority at present. Understand that the Virtual Camera was developed first and foremost by the virtual production team for cinematographers, not games or interactive uses (which would likely be the main case for runtime usage). We may someday choose to go in this direction, but it's not where we are at this time. Can you provide some detail on what the use case you describe would look like? I.e., what would "delivering in a build" look like that would provide value to your clients?

    Question - Why are there so many camera type components?
    Essentially this is a side-effect of rapid growth over many years, with many teams working across many countries and use cases. We're aware that this is suboptimal and are looking at ways to streamline this.

    There are also a large number of 'professional' camera settings here that i'm just not used to and don't really want or use. For simpler projects it would be nice to have a simple component that automatically deals with these.
    Again, this is a sign of who the core target user is for these tools. But we're aware that not everyone is a cinematographer and will be working on an alternate UX to enable easier use. Are there specific tools you'd find most useful in a "less pro" UI?

    Question - Playback hooks
    We don't have these yet, but such hooks are very much on the roadmap.
     
  3. Noisecrime

    Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    2,058
    Thanks for the reply.

    Some of your questions are going to be a little hard to give more details on, since the package causes Unity to hard crash when trying to load RenderDoc, so I've had to uninstall it. At this point, having RenderDoc available is unfortunately far more important than trying out this feature, though I may reinstall the package once the project is feature complete.

    I'd also like to mention, based on your replies, that it feels like Unity is doing itself a great disservice if it restricts the scope of these tools to production teams and cinematographers. I would say something like a virtual camera is the perfect feature to assist indie developers in developing their own games, for say cut-scenes. Sure, I understand that in today's world cinematographers would be the main focus of such tools due to a growing market, but it's important to find some time to make them more indie friendly too. That would probably mean marketing them to indie developers a bit more, as the few devs I mentioned this to didn't even know about it, and I only happened upon it by accident.


    Question - When will this be available for Builds?
    I think there are two aspects to this.

    Firstly, I'd like the option to run it in a build to avoid the performance overhead of the editor, especially when I need to record video. I've taken many steps to minimise editor overhead, but there still seem to be occasional large spikes and general low-level CPU usage that could be removed when running in a build. I'm currently stuck on a seven-year-old CPU and it's showing its age in situations like this, so removing the editor overheads might just give the boost I need to make this practical to use. Perhaps if we could determine why NVENC isn't working, that might help too.

    Secondly, for clients it means that they are free to direct the camera in real time, either for
    - a live show, thus making the direction part of the event itself, much like how a DJ or VJ would respond to the crowd/audience in how they mix the music or visuals, or
    - recording the results (or ideally the generated animation) for use in another build of the project. The point being, I'm not going to provide clients with the full source code to run it in Unity themselves, and they may well not want to use Unity, or have the expertise to.

    I'm a coder, though also trained in graphic design, and I generally turn my hand to anything, but I know I'm not the best person to 'direct' the camera, so being able to let clients do the direction, if they have the capability, would be a huge boon and less work for me ;)

    Question - Why are there so many camera type components?
    My point may have gotten lost here: it's not so much that there are many camera components across Unity, but that just within the virtual camera (VirtualCameraDevice & Virtual Camera Actor) there were at least three camera-type components, many sharing the same parameters though not sharing the actual values. I found it confusing why this was needed, and why it felt like I had to set each one up by hand to match my existing scene's main camera.

    Large number of 'professional' camera settings here
    Hard to say, since I'm unable to look at them now (I had to remove the package), and as I'm not a professional, perhaps I'm not the best person to ask. The best I can offer is that it would help to have a simple mode that only provided the same settings/options that a typical built-in/URP/HDRP camera offers.

    Just to clarify, this is only with respect to the editor; although the app still had a lot of settings, it seemed easy to use and to get good results without having to dive into them.

    Question - Playback hooks
    Good to hear they are on the roadmap. I think they will be essential for being able to control live inputs to the scene, for example starting/stopping playback of Kinect recordings in my case.
     
    marc_tanenbaum likes this.