
Augmented Reality for Broadcast Television

Discussion in 'Virtual Production' started by harveyclayton, Jul 15, 2019.

  1. harveyclayton

    harveyclayton

    Joined:
    Jul 15, 2019
    Posts:
    1
    Hi, I am new to Unity. I am working in television news broadcasting in the Philippines. I need help finding learning resources and tutorials to jump-start augmented reality that can be used at a broadcasting station or in a studio. I already have basic knowledge of Unity; I need to learn more about augmented reality focused on broadcast. Most tutorials I have found focus on using AR on mobile phones; I need to learn to use Unity with studio cameras.
     
  2. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    Unity supports existing AR SDKs, like Apple's ARKit, Google's ARCore, Magic Leap, Microsoft HoloLens, and Vuforia. We do not have our own generic AR solution, so if you want to turn an arbitrary camera into an AR camera, you would need a custom solution. Doing so usually requires calibration for your specific device, such as the camera intrinsics, and often additional sensors (e.g., an IMU).

    I don't know much about solutions available for TV studio cameras. Can the camera report its position and orientation relative to some known reference point?
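
    As a point of reference, the "camera intrinsics" part means the calibrated focal lengths and principal point of the physical camera (for example, from an OpenCV calibration). Here is a rough sketch of how such values could map onto a Unity camera; the helper below is hypothetical, and the sign conventions depend on your image origin, so treat it as a starting point only.

    Code (CSharp):
    using UnityEngine;

    // Hypothetical helper: builds an OpenGL-style projection matrix from
    // pinhole intrinsics (fx, fy, cx, cy in pixels), e.g. values obtained
    // from an OpenCV calibration of the studio camera.
    public static class IntrinsicsToProjection
    {
        public static Matrix4x4 Build(float fx, float fy, float cx, float cy,
                                      float width, float height,
                                      float near, float far)
        {
            var m = Matrix4x4.zero;
            m[0, 0] = 2f * fx / width;               // horizontal focal length
            m[1, 1] = 2f * fy / height;              // vertical focal length
            m[0, 2] = 1f - 2f * cx / width;          // principal point offset x
            m[1, 2] = 2f * cy / height - 1f;         // principal point offset y
            m[2, 2] = -(far + near) / (far - near);  // depth range remapping
            m[2, 3] = -2f * far * near / (far - near);
            m[3, 2] = -1f;                           // perspective divide
            return m;
        }
    }

    // Usage: myCamera.projectionMatrix = IntrinsicsToProjection.Build(
    //     fx, fy, cx, cy, 1920f, 1080f, 0.1f, 1000f);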
     
  3. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    Depending on your needs, it is also possible to record or live stream the screen from an iOS or Android device. iOS provides this out of the box, and there are several third-party solutions for Android.
     
  4. Gregoryl

    Gregoryl

    Unity Technologies

    Joined:
    Dec 22, 2016
    Posts:
    7,730
    @harveyclayton Maybe you could elaborate a little on what you're trying to do?
     
  5. Matjio

    Matjio

    Unity Technologies

    Joined:
    Dec 1, 2014
    Posts:
    108
    Hi @harveyclayton,

    You basically need a precise frame rate in Unity, video I/O, camera tracking, and lens distortion.

    We have not yet embraced these subjects enough to have them built into Unity, but here are a few resources that could be helpful:
    - Unity Japan has worked on a first POC for video I/O with a Blackmagic card (DeckLink); it is available here: https://github.com/keijiro/Klinker (open source, and it could very probably be adapted to other vendors).
    - Our video team has been working with a few studios on a way to get a precise frame rate in Unity for genlock: https://blogs.unity3d.com/2019/06/03/precise-framerates-in-unity/ (a minimal sketch of the idea is below).
    - Most camera tracking systems used in broadcast (NCam, Stype, ...) have a plugin for Unity that also handles the lens distortion part.
    - Solidanim has developed a system for Unity called EasyTrack, using Vive trackers for camera tracking and a custom lens distortion solution: https://twitter.com/Solidanim/status/1136242249732370433 (DM me if you want an introduction).
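
    For the precise frame rate point: the core idea in that blog post is to disable VSync and wait at the end of every frame until the target frame slot has elapsed. A minimal sketch of that idea (class name and rate are my own; note that this only stabilizes Unity's update rate, while true genlock against studio house sync also needs support from video I/O hardware):

    Code (CSharp):
    using System.Collections;
    using System.Diagnostics;
    using UnityEngine;

    // Minimal sketch: pace frames ourselves instead of relying on VSync.
    public class PreciseFrameRate : MonoBehaviour
    {
        // 60000/1001 ~= 59.94 Hz, a common broadcast (NTSC) rate.
        const double FrameDuration = 1001.0 / 60000.0;

        readonly Stopwatch watch = new Stopwatch();

        void OnEnable()
        {
            QualitySettings.vSyncCount = 0;    // we pace frames ourselves
            Application.targetFrameRate = -1;  // disable Unity's own limiter
            watch.Start();
            StartCoroutine(WaitForFrameSlot());
        }

        IEnumerator WaitForFrameSlot()
        {
            var endOfFrame = new WaitForEndOfFrame();
            while (true)
            {
                yield return endOfFrame;
                // Busy-wait: Thread.Sleep would be kinder to the CPU but is
                // far too imprecise for broadcast timing.
                while (watch.Elapsed.TotalSeconds < FrameDuration) { }
                watch.Restart();
            }
        }
    }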
     
    GilbertoBitt likes this.
  6. boon_yifei

    boon_yifei

    Unity Technologies

    Joined:
    Jul 25, 2017
    Posts:
    1
    Hi harveyclayton, I'm a Unity Field Engineer based in Singapore, and I've worked with AR for broadcasting for a while now. I'll drop you a DM.
     
    Aeoth and Matjio like this.
  7. mdaday

    mdaday

    Joined:
    May 16, 2017
    Posts:
    1
    Hi boon_yifei,

    I am interested in hearing some of your insight as well. I want to do something like the link below using Unity.
     
    LostPanda and LooksRealStudio like this.
  8. LooksRealStudio

    LooksRealStudio

    Joined:
    Aug 18, 2017
    Posts:
    6
    Hello everyone, I'm Luca from Italy, and I'm also interested in this topic.
    I'm trying to bring Unity into a broadcast workflow. Our tracking system is based on MoSys technology, which has an Unreal integration but not a Unity one.
    At the moment I'm considering these alternatives:
    - Vuforia with custom camera tracking
    - A rig with a headset or tracker mounted on the camera (Vive, or a combination with Oculus Quest)
    What do you think?

    I would like to run some tests soon
     
  9. vive_creative

    vive_creative

    Joined:
    Mar 22, 2019
    Posts:
    12
    Why not output directly to a PC using Miracast and then plug the PC output into your broadcast system? Any high-quality mobile handset would be able to give you solid frame rates.
     
  10. max_coding13

    max_coding13

    Joined:
    Apr 24, 2016
    Posts:
    34
    @Matjio would you mind sharing here whether there are any updates since your most recent post regarding Unity's efforts to integrate with the live broadcast industry? Specifically, you said that Unity was working with various studios to supply genlocking support; has anything come out of that?
     
    Last edited: Jan 24, 2020
  11. RobertoM

    RobertoM

    Joined:
    Jan 28, 2015
    Posts:
    3
    Very interesting post.
    Any update on integrating Unity with broadcast workflows
    (in the same way Aximmetry works with Unreal)?

    Regards!
     
  12. GilbertoBitt

    GilbertoBitt

    Joined:
    May 27, 2013
    Posts:
    111
    That's too bad. I was wondering if there is a solution for this; it's really needed!
     
  13. Aeoth

    Aeoth

    Joined:
    Sep 18, 2015
    Posts:
    2
    Hi @boon_yifei, I'm interested in the topic. Can I send you a DM so you can guide me through it?
     
  14. DanS98

    DanS98

    Joined:
    Jan 17, 2020
    Posts:
    1
    Nice to see Unity following Unreal's path of targeting different markets, but it's a shame one of those isn't broadcasting! I'd love to switch our team from Unreal to Unity if the support were there. First-party support for Blackmagic device outputs would be a nice start; any chance you are working on anything like that?
     
  15. Ivan_br

    Ivan_br

    Joined:
    Nov 10, 2015
    Posts:
    29
    Back in 2016/2017 I created, in Unity 5, a way to produce content using AR tools like Vuforia and ARToolKit (before ARKit or ARCore were available). I ended up sticking with Vuforia, as its tracking quality was better than ARToolKit's. I also used Metaio at the time and liked it more than Vuforia, but then Apple purchased it and it was no longer available.

    I essentially used Vuforia to detect targets and place virtual models on top of them. In my video I didn't move the camera, but it could be moved and rotated, since the virtual objects are anchored to their tracked targets, so you would be able to capture them while moving the camera around.

    I used a professional film camera whose video was fed into a capture card on the computer. That feed went into Vuforia's virtual camera, which I used to place the virtual objects, and I then applied chroma key on top to remove the green and replace it with a background image/video.
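
    The keying step itself is conceptually just a per-pixel test. Here is a crude CPU-side sketch of the idea; a real pipeline does this in a fragment shader, works in YUV/HSV, and handles spill and soft edges, and the thresholds here are purely illustrative.

    Code (CSharp):
    using UnityEngine;

    // Crude chroma key illustration: replace "green-dominant" pixels of the
    // camera frame with the corresponding background pixels.
    public static class ChromaKey
    {
        public static void KeyOut(Color32[] frame, Color32[] background)
        {
            for (int i = 0; i < frame.Length; i++)
            {
                Color32 p = frame[i];
                // Naive green-dominance test (illustrative thresholds).
                bool isGreen = p.g > 100 && p.g > p.r + 40 && p.g > p.b + 40;
                if (isGreen)
                    frame[i] = background[i];
            }
        }
    }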

    Our camera rig also had a Kinect v2 on it, which could be calibrated with the film camera so you could segment the real objects and place virtual objects around (behind, next to, or in front of) real objects/actors. The problem at the time was that this was pretty heavy on the computer, which is why we still used chroma key and targets for AR. Today it would probably be much better to use an RGBD camera.
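
    The depth part is the same kind of per-pixel decision: with the depth sensor calibrated to the film camera, each output pixel can come from whichever layer is closer. A sketch of the idea (all names are illustrative; a real implementation would write the sensor depth into the GPU depth buffer rather than comparing on the CPU):

    Code (CSharp):
    using UnityEngine;

    // Illustrative depth-based compositing: a virtual pixel wins only where
    // the virtual object is in front of the real scene.
    public static class DepthComposite
    {
        public static Color32[] Compose(
            Color32[] realColor, float[] realDepthMeters,
            Color32[] virtualColor, float[] virtualDepthMeters)
        {
            var output = new Color32[realColor.Length];
            for (int i = 0; i < output.Length; i++)
            {
                output[i] = virtualDepthMeters[i] < realDepthMeters[i]
                    ? virtualColor[i]
                    : realColor[i];
            }
            return output;
        }
    }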

    The good thing about using AR trackers, though, was that actors could grab and manipulate virtual objects, which you wouldn't be able to do, at least not in real time, without tracking the actors' hands. We also tried using Leap Motion at the time to interact with virtual objects, but we didn't adopt it because it would appear on camera during recording.

    I added a phone on top of the camera to show the augmented objects to the camera operator so they could frame shots better. Today I would probably use the phone's rotational and positional tracking, via ARKit/ARCore, to provide further information to the computer and improve tracking quality. This would need to be calibrated with the professional camera and the virtual camera, but as long as the virtual objects are driven by the professional camera's video feed, this shouldn't be difficult.

    This could be further improved by using a setup like Vicon with a motion capture suit; this would allow the actor/artist to interact with the virtual objects using their whole body.

    Concerning broadcasting in real time, I was able to do this using the UnityCam project I found on GitHub, which allows creating a virtual camera in Unity that can be fed into other programs. I tested this with Skype and some other browser meeting software at the time, and it worked well. I imagine it can be used to feed the augmented video from Unity into other programs as well. I believe the limitation was that it carried video only, so any audio generated in Unity would need to be fed into your program by another solution.

    https://github.com/mrayy/UnityCam
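
    The building block such a virtual camera relies on is reading rendered frames back from the GPU. A sketch of that part (the hand-off to a native virtual camera driver is platform-specific and not shown; the resolution and names are my own):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    // Sketch: render the Unity camera into a RenderTexture and read frames
    // back asynchronously so another process or driver can consume them.
    [RequireComponent(typeof(Camera))]
    public class FrameGrabber : MonoBehaviour
    {
        RenderTexture target;

        void OnEnable()
        {
            target = new RenderTexture(1920, 1080, 24);
            GetComponent<Camera>().targetTexture = target;
        }

        void LateUpdate()
        {
            AsyncGPUReadback.Request(target, 0, TextureFormat.RGBA32, OnFrameReady);
        }

        void OnFrameReady(AsyncGPUReadbackRequest request)
        {
            if (request.hasError) return;
            var pixels = request.GetData<byte>(); // raw RGBA frame
            // Hand 'pixels' to the virtual camera / capture pipeline here.
        }
    }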

    Today there are better solutions; for instance, what Digital Monarch Media (which was acquired by Unity) has worked on for their films: https://unity.com/madewith/virtual-cinematography

    The European Dreamspace Project also used Unity, and they have some great ideas and solutions for broadcast: https://www.dreamspaceproject.eu/Productions

    I hope this helps.
     
    Last edited: Oct 29, 2021