Hi, I am building a networked asymmetric experience using Unity 2019.4 LTS and MARS. An Agent uses the app on one side and a Customer uses it on the other. They are physically apart but use the same app to meet, and the app behaves differently for the Agent than for the Customer. Because I am using Photon networking, the same scene has to be shared across both users, who hold a virtual meeting through the app while being physically apart.

I have a MARS Session in the scene, but I need it to do something different depending on which side it is running on.

On the Agent side, the MARS Session needs to use the front-facing camera for facial expression tracking. I want to extract the blend shapes as I have done before using AR Foundation.

On the Customer side, the MARS Session needs to use the world-facing camera for 6DoF tracking, creating an AR experience where the Customer can see an avatar of the Agent appear in their living room. So the MARS Session needs to be configured differently than on the Agent side.

Can I dynamically configure the MARS Session to provide different functionality based on whether the Agent or the Customer is using the app? I was able to make this work well before MARS using AR Foundation, ARKit, and ARKit Face Tracking.
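For context, here is the general pattern I have in mind (a minimal sketch only). Since I am not sure how MARS exposes this, the example uses plain AR Foundation components, which is how I did it before: two pre-configured rigs in the shared scene, and each client activates only the rig matching its role. The `MeetingRole` enum, `RoleBasedARSetup` class, and `ConfigureForRole` method are names I made up for illustration; they are not part of the MARS or Photon APIs.

```csharp
using UnityEngine;

// Hypothetical role-switching sketch, not an official MARS/Photon API.
public enum MeetingRole { Agent, Customer }

public class RoleBasedARSetup : MonoBehaviour
{
    // Assumed references, assigned in the Inspector:
    // one rig set up for face tracking (front camera, e.g. ARFaceManager),
    // one rig set up for world tracking (rear camera, 6DoF pose/planes).
    [SerializeField] GameObject faceTrackingRig;
    [SerializeField] GameObject worldTrackingRig;

    // Called once the client knows its role, e.g. read from a Photon
    // custom player property after joining the room.
    public void ConfigureForRole(MeetingRole role)
    {
        // Most devices can only run one tracking configuration at a
        // time, so enable exactly one rig per client.
        bool isAgent = role == MeetingRole.Agent;
        faceTrackingRig.SetActive(isAgent);
        worldTrackingRig.SetActive(!isAgent);
    }
}
```

My question is essentially whether the same toggle-by-role approach works with a MARS Session (for example, by switching which session or proxies are active per client), or whether MARS expects a different mechanism for this.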