
[MeshSync] Rendered animations preview

Discussion in 'Virtual Production Previews' started by GeniusKoala, May 20, 2021.

  1. GeniusKoala

    GeniusKoala

    Joined:
    Oct 13, 2017
    Posts:
    97
    Hey!

    Working in an animation studio producing cartoon series, we want to upgrade our working methods, and we are thrilled to use new tools to make our series in Unity! So far we have used Unity only for lighting and rendering: animations are exported from Maya as alembics and imported into Unity with scripts to automate the process. We tested new Unity tools like Alembic while it was still in preview because we wanted to go further in terms of quality, and we are satisfied with what we've achieved so far. I'm glad you opened this forum for Film because Unity deserves it: it can handle animation productions today. The past shows it was already the case years ago with Monsieur Carton and some other productions. We want to trust Unity's plan to allow animation studios to use it for their productions.
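
    For context, the Maya-to-alembic step is automated with a small script, roughly along these lines (the root node and output path below are placeholders, not our actual pipeline code):

        # Minimal sketch of the Maya-side alembic export we automate.
        # "|CHAR_GRP" and the output path are placeholders for our scene setup.
        from maya import cmds

        cmds.loadPlugin("AbcExport", quiet=True)

        start = cmds.playbackOptions(query=True, minTime=True)
        end = cmds.playbackOptions(query=True, maxTime=True)
        job = ("-frameRange {} {} -uvWrite -worldSpace "
               "-root |CHAR_GRP -file /shots/seq010/char.abc").format(start, end)
        cmds.AbcExport(j=job)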

    Sorry for the long introduction. MeshSync is one of the tools we would like to experiment with further. It would be a blast to preview inside Unity, with lighting and post-processing, how our characters will move; it would make for faster iterations. Is MeshSync still under development? For the moment it is in preview, and we can't really use it because the sync does not follow our animations (25 FPS). I'm working on an RTX3600 with just one character for testing. Do you have hints to improve the results, or is the tool too early for animation? I know it was designed for modeling preview first.

    Thanks for creating all these kinds of tools for animation!
     
    HIBIKI_entertainment likes this.
  2. sindharta_at_unity

    sindharta_at_unity

    Unity Technologies

    Joined:
    Jul 4, 2019
    Posts:
    49
    Hello, thank you for your interest in using Unity for animation.
    Yes, MeshSync is still in development, and we'd like to bring it out of preview.
    One of the things that we need for this to happen is to have more users using it (like you!), and their feedback.
    So keep the feedback coming :).

    When you said this, did you mean that you played the animation in a DCC tool and were hoping that it would sync automatically to Unity at 25 FPS? If that is the case, then unfortunately we think this is not possible.
    The reason is that the amount of data the DCC tool would have to send to Unity would be huge for any real use case, and the network (although local) would always be overloaded.
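
    As a rough, purely illustrative back-of-envelope (the vertex count and attribute sizes are assumptions, not MeshSync measurements):

        # Illustrative only: payload for re-sending a full mesh every frame.
        vertices = 100000                  # assumed character mesh density
        bytes_per_vertex = 12 + 12 + 8     # float3 position + float3 normal + float2 UV
        fps = 25

        per_frame_mb = vertices * bytes_per_vertex / 1e6   # ~3.2 MB per frame
        per_second_mb = per_frame_mb * fps                 # ~80 MB/s, before topology/skinning data
        print(per_frame_mb, per_second_mb)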

    As an alternative, there are two workflows that we provide.
    1. "Sync animation" button on the DCC tool side. This sends the animation and creates a PlayableDirector asset on the Unity side. Then, entering play mode in Unity should play the animation.
    2. [Experimental] Use "Export Cache" to export the animation into a custom cache file (SceneCache), and load it using the SceneCachePlayer component. Please see the SceneCachePlayer documentation for more details.
     
    newguy123 and GeniusKoala like this.
  3. GeniusKoala

    GeniusKoala

    Joined:
    Oct 13, 2017
    Posts:
    97
    Hey! Good to read that it's still in development. The idea on our side is to treat Unity as much as possible like any other Maya render engine. One piece of feedback I had from our directors was: "OK, it's real-time rendered, but only once we export the alembics; what about while we work on the assets/animations?" That's why I thought about Blender (Eevee) and the MeshSync repository for Unity.

    One workflow could be to create a cache on the fly from Maya to preview the animation inside a Unity project. Again, that goes against their expectation of real time if they need to bake the animation to preview it rendered. It would be cool to someday have a plugin inside Maya that could use Unity as a render engine, like V-Ray or Redshift. I know Unreal Engine 4 has a Live Link plugin; I don't know how performant it is, but some motion capture studios use it in their pipeline. We use mocap too, with the XSens Live Link suit: it can stream data from their software Animate Pro to Unity, and it's really performant for streaming. Maybe I expected MeshSync to act like that in Maya...

    I think there is also a streaming tool for Substance Painter? Is it still supported? I have not tried it yet. It could be nice to preview the shading while texturing.

    I showed your Cinematic Studio Sample project and the Virtual Camera tool in our studio meeting this morning, and they all loved it, by the way!

    I won't stop sending feedback if it can help everyone!
     
    HIBIKI_entertainment likes this.
  4. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    595
    It's some time off, and production adoption still needs to happen for things like this to take hold, but I could imagine MeshSync and USD being an amazing combo for interchangeability across engines and DCCs while bridging diverse team skill sets for Unity. Both packages are still in preview, but here's to hoping.
     
    GeniusKoala likes this.
  5. GeniusKoala

    GeniusKoala

    Joined:
    Oct 13, 2017
    Posts:
    97
    Hey @sindharta_at_unity ! I am trying the latest versions of the packages and I notice I can't change the "Override Frame Rate/Frame Rate" attribute on the MeshSyncServer component. It's at 30, and when I change it to 25 it is forced back to 30 again. I just want to make sure it reads the animation at the frame rate of the Maya scene, which is 25.

    Also, it's too bad it is not compatible with the smooth mesh preview in Maya, but I guess there is no way to have that in Unity. I would have to smooth the meshes myself in Unity, but that would make the process veeeery slow, haha.

    EDIT 1 :

    I was wondering about the logic behind the sync. How do you sync the animation in Unity from Maya? Do you rebuild a mesh at runtime to follow the Maya animation? Or do you animate properties instead, like the joints and the blendshapes? With which protocol? What's the bottleneck in MeshSync that makes the update slow?

    I would like to try syncing a Maya character inside Unity by driving each joint of the skeleton with the OSC protocol (maybe plain UDP would be better, I don't really know). The idea would be to do motion capture on the face of our character in Maya and update the joints and blendshapes in Unity. I would like to try it so that we can keep the original rig with its controllers in Maya, for much better animation than with just the skeleton in Unity. Something like the rough sketch below is what I have in mind for the Maya side.
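
    This is only a rough sketch (the port, host, and one-JSON-datagram-per-frame payload are my own assumptions, and the Unity-side receiver is not shown):

        # Hypothetical Maya-side sender: stream world-space joint transforms over UDP
        # on every frame change. Port 9000 and the JSON payload format are assumptions.
        import json
        import socket

        from maya import cmds

        JOINTS = cmds.ls(type="joint", long=True)          # every joint in the scene
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

        def send_pose():
            """Query each joint's world-space translation/rotation and send one packet."""
            pose = {}
            for joint in JOINTS:
                t = cmds.xform(joint, query=True, worldSpace=True, translation=True)
                r = cmds.xform(joint, query=True, worldSpace=True, rotation=True)
                pose[joint] = {"t": t, "r": r}
            sock.sendto(json.dumps(pose).encode("utf-8"), ("127.0.0.1", 9000))

        # Fire on every frame change while scrubbing or playing back in Maya.
        job_id = cmds.scriptJob(event=["timeChanged", send_pose])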

    EDIT 2 : I've just seen an update of the package to version 0.10. It's cool that development is continuing, thank you!

    EDIT 3 : So I think I've understood. I thought MeshSync would act like Unreal's Live Link in Maya, but since it's for modeling previz, it bakes the mesh at runtime and updates it, whereas Live Link binds skeletons and drives the skeleton in Unreal Engine from the rig in Maya. I think a tool like Live Link would be more useful for us in Unity; Substance already gives us all the feedback we need during texturing. I think it may be in your interest to develop MeshSync in that direction; at least for us it would better fit our needs.

    Also, since it targets the modeling stage, not having smooth mesh preview integrated is a real gap for the user, I think. Our artists regularly do a previz smooth in Maya since we work only with low-res models; we only smooth at the rendering stage.

    I think MeshSync is a great prototype for a useful tool so I hope you guys will continue to develop it :)
     
    Last edited: Nov 11, 2021
    newguy123 likes this.
  6. GeniusKoala

    GeniusKoala

    Joined:
    Oct 13, 2017
    Posts:
    97
    @sindharta_at_unity What do you think about streaming only Transform and blendshape info (baked controllers) from Maya to Unity? If I could export a character with its skeleton and blendshapes, do you think UDP streaming would be fast enough to update the Transforms and blendshape weights of my character while I animate it in Maya? I ask because we want to do motion capture in Unity with our custom rigs (mGear or sometimes not), and we want to keep the controllers and constraints without redoing them in Unity; otherwise we would have to rig our characters twice and lose time. Everything is already done in Maya, so it would be a game changer for us! So if we do the mocap in Maya and stream it into Unity, it may work, don't you think? I think that's how Unreal's Live Link works, but I could not find any demo of long animations done live. We need it for previz motion capture with our own facial motion capture solution (we don't want to put ARKit blendshapes on our characters since they are quite limited, but we use ARKit to drive our custom rig). For the blendshape side, the sketch below is roughly what I would add to the earlier joint-streaming idea.
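
    Again, only a rough sketch (node and target names come from whatever is in the scene; the payload format is my own assumption):

        # Hypothetical: collect blendshape weights so they can be streamed alongside
        # the joint transforms from the earlier UDP sketch.
        from maya import cmds

        def blendshape_weights():
            """Return {blendShapeNode: {targetName: weight}} for every blendShape node."""
            weights = {}
            for bs in cmds.ls(type="blendShape"):
                targets = cmds.listAttr(bs + ".w", multi=True) or []
                weights[bs] = {t: cmds.getAttr("{}.{}".format(bs, t)) for t in targets}
            return weights

        # A pose of ~100 joints plus ~50 blendshape weights is only a few kilobytes,
        # so one UDP datagram per frame at 25 FPS stays far below full-mesh sync traffic.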
     
  7. GeniusKoala

    GeniusKoala

    Joined:
    Oct 13, 2017
    Posts:
    97
    @sindharta_at_unity

    Hey! It's been a long time! I saw this feature on YouTube, and it's exactly something our animation studio would need:


    It allows you to have an Unreal Engine viewport inside Maya. It would be awesome to have a Unity viewport inside Maya too :)
     
  8. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    595
    This is cool.

    The Unity Connector for Omniverse unfortunately isn't available yet, but if it were, a USD-based DCC <> Omniverse <> Unity node network could absolutely set this sort of thing up too.

    Essentially, all the information stays streamable via USD packaging, and it also allows a connection between the software packages, meaning you could live-capture and retarget at either end in real time.
    Certainly something I would explore if you're not already familiar with it, even if it's just to spark more fantastic ideas like this.