Clarification on the Unity DSP timestamp

Discussion in 'Audio & Video' started by Shane-Pangea, May 14, 2019.

  1. Shane-Pangea

    Shane-Pangea

    Joined:
    Dec 12, 2012
    Posts:
    38
     I'm trying to assist with some plugin development for the NatMic plugin. The plugin allows you to record game audio along with microphone audio. We are running into issues with Unity's DSP timestamps that cause the two audio streams to be out of sync.

     Would anyone from Unity be so kind as to answer the following questions, so we can fix this issue?
    1. What (native) clock is this based on? The timestamps that we receive from the native platform are usually based on some general timestamp, like nanoTime on Android and mach_absolute_time on iOS/macOS. I need a way to convert from Unity's AudioSettings.dspTime to a 'native' timestamp in order to sync up audio samples.
     2. Is the clock from (1) affected by engine events? If Unity is paused, for instance, the audio timestamps should NOT be paused; but from what I see online, it seems that they are.
    3. Is there a way to get sample-precise timestamps in OnAudioFilterRead? Or do we assume that dspTime corresponds to the timestamp of the first sample in the sample buffer?
    Thank you! :)
     
  2. wkaj

    wkaj

    Unity Technologies

    Joined:
    May 18, 2012
    Posts:
    50
    Hey,

     1) When the audio thread (the one the OS uses to ask for samples) wakes up and fires off a mix, we increment an integer by the number of samples requested by the OS. This value is initially 0 on application launch and increments forever.
     On the main thread, we read this value and divide it by the sample rate of the audio output to get a double-precision value in seconds. This is the value you read in AudioSettings.dspTime (there's a rough sketch of this arithmetic at the end of this post).
     Note here that this clock and the CPU clock may not stay in sync, depending on the platform.

     2) In Unity, when the application pauses, we take the current DSP clock value as a reference. You are right that the audio clock actually continues to increment; however, we use that pause reference value when the application unpauses, so that to the user of the audio engine it appears as though the world stopped (for scheduling purposes).

     3) The dspTime corresponds to the start of the currently mixing block if read from OnAudioFilterRead. However, it will stop incrementing when the application is paused, even though OnAudioFilterRead continues to be called, I think.
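
     To make (1) and (3) concrete, here is a rough sketch of the arithmetic (illustrative only, not our actual implementation; the class and field names are made up):

     Code (CSharp):
     using System.Threading;
     using UnityEngine;

     // Attach to a GameObject with an AudioSource or AudioListener so OnAudioFilterRead is called.
     public class DspTimeSketch : MonoBehaviour
     {
         long samplesMixed;   // advanced on the audio thread, once per mixed block (point 1)
         int sampleRate;

         void Awake()
         {
             sampleRate = AudioSettings.outputSampleRate;   // cache on the main thread
         }

         void OnAudioFilterRead(float[] data, int channels)
         {
             int frames = data.Length / channels;           // block size in frames
             Interlocked.Add(ref samplesMixed, frames);

             // Point 3: dspTime read here is the time of the first sample of this block,
             // so per-sample times can be derived by stepping one sample period at a time.
             double blockStart = AudioSettings.dspTime;
             double samplePeriod = 1.0 / sampleRate;
             for (int frame = 0; frame < frames; frame++)
             {
                 double sampleTime = blockStart + frame * samplePeriod;
                 // ... use sampleTime to line this frame up against another stream ...
             }
         }

         void Update()
         {
             // Point 1: samples divided by the output sample rate gives a time in seconds,
             // analogous to AudioSettings.dspTime (offset by whenever this component started).
             double modelled = (double)Interlocked.Read(ref samplesMixed) / sampleRate;
             Debug.Log($"modelled: {modelled:F3} s   dspTime: {AudioSettings.dspTime:F3} s");
         }
     }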
     
    aihodge likes this.
  3. Shane-Pangea

    Shane-Pangea

    Joined:
    Dec 12, 2012
    Posts:
    38
    Thank you @wkaj !

    @Lanre, does this give you what you need to tackle the sync issue?
     
  4. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    @wkaj thanks for responding. As Shane mentioned, the reason why we're asking this is that we're having trouble getting our custom mixing DSP working (we're trying to mix mic audio with Unity audio).

    NatMic is designed in such a way that it pushes audio sample buffers to Unity, each with a corresponding timestamp. The challenge is that in order to achieve sample-precise mixing, we need a common time base between the mic audio and Unity's audio. On all platforms that NatMic supports, we use a common clock. On iOS and macOS, it's based on mach_absolute_time (CACurrentMediaTime from CoreAudio's AudioTimestamp::hostTime field); on Android, it's System.nanoTime; unfortunately on Windows, Microsoft doesn't specify the time base of the timestamps but it is likely based on the system time.

     With this timing info, we can offset sample buffers when we perform mixing. This is where the Unity audio timestamp question comes in. First, the fact that dspTime pauses at all breaks this mixing entirely. Even if that weren't a problem, we'd still need a way to get a system-time-based DSP time to perform the offsetting. Any ideas?
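
     For reference, the offsetting itself is straightforward once both streams share a clock; conceptually it's something like this (a sketch only, not NatMic's actual API; the names are illustrative):

     Code (CSharp):
     // Sketch of timestamp-based alignment, assuming both buffers carry timestamps in
     // nanoseconds on the same clock and share a sample rate.
     static int OffsetInFrames(long micTimestampNs, long gameTimestampNs, int sampleRate)
     {
         // Positive result: the mic buffer starts later than the game buffer, so the mixer
         // should skip (or pad) that many frames before mixing sample-by-sample.
         double deltaSeconds = (micTimestampNs - gameTimestampNs) * 1e-9;
         return (int)System.Math.Round(deltaSeconds * sampleRate);
     }

     That computation is only as good as the two timestamps, which is why the time base of dspTime matters so much here.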
     
    Last edited: May 21, 2019
  5. Shane-Pangea

    Shane-Pangea

    Joined:
    Dec 12, 2012
    Posts:
    38
    @wkaj, do you have any ideas on what @Lanre mentioned? Thanks so much!
     
  6. Bagenstose

    Bagenstose

    Joined:
    May 31, 2019
    Posts:
    6
    Hey @wkaj - any thoughts on the timestamp?
     
  7. Shane-Pangea

    Shane-Pangea

    Joined:
    Dec 12, 2012
    Posts:
    38
    Has anyone been able to make any progress on this?
     
  8. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Waiting on Unity.
     
    Bagenstose likes this.
  9. Bagenstose

    Bagenstose

    Joined:
    May 31, 2019
    Posts:
    6
    ;)
     
  10. wkaj

    wkaj

    Unity Technologies

    Joined:
    May 18, 2012
    Posts:
    50
     @Lanre If you would prefer to use the system timestamp, have you tried calling the system time query (via P/Invoke) directly from inside OnAudioFilterRead?
     OnAudioFilterRead is executed in place on the audio thread, so you would be able to query the timestamp for this mix directly.
     I assume your sample sync logic is happening directly in OnAudioFilterRead?
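
     Something along these lines, for example (a sketch only; the class and field names are made up, and you would substitute whichever native call matches the clock NatMic reports on, e.g. QueryPerformanceCounter, mach_absolute_time, or System.nanoTime via JNI):

     Code (CSharp):
     using System.Diagnostics;
     using UnityEngine;

     // Sketch: pairing Unity's DSP clock with a high-resolution system clock by sampling both
     // at the start of each mixed block. Stopwatch.GetTimestamp() is a stand-in here; a direct
     // P/Invoke to the native call the plugin uses would keep everything on one time base.
     public class DspClockBridge : MonoBehaviour
     {
         // Latest (dspTime, system ticks) pair; other code can use it to convert between the
         // two clocks. A sketch only: real code should guard these fields for thread safety.
         public static double LastDspTime;
         public static long LastSystemTicks;

         void OnAudioFilterRead(float[] data, int channels)
         {
             // Runs on the audio thread, so both readings refer to (roughly) the same instant,
             // i.e. the start of the block currently being mixed.
             LastDspTime = AudioSettings.dspTime;
             LastSystemTicks = Stopwatch.GetTimestamp();   // convert with Stopwatch.Frequency
         }
     }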
     
    Bagenstose and aihodge like this.
  11. Shane-Pangea

    Shane-Pangea

    Joined:
    Dec 12, 2012
    Posts:
    38
    @Lanre , does this information help?
     
  12. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
     I've tried this before; it's close but not exact. The issue is likely that if I have to rely on system timestamps, the mic timestamps end up later than they actually are (I'd be reporting the system timestamp instead of the mic hardware timestamp).

    @Shane-Pangea Have you tested the build I sent?
     
  13. Shane-Pangea

    Shane-Pangea

    Joined:
    Dec 12, 2012
    Posts:
    38
     Thanks @Lanre ! I tried your latest build this weekend. I was getting some inconsistencies: sometimes the audio would work but be extremely low in volume, other times it would not be there at all. I would need to do some additional QA to find any pattern, but out of the gate it doesn't seem to be working. What specifically did you address in this build that was different from past attempts?
     
  14. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
     Make sure you are recording from an AudioDevice that does not use echo cancellation; AEC does these weird volume attenuations that are completely opaque to us. The only thing that changed is MixerDevice: going off @wkaj's suggestion, I'm using a custom seeking function based on the timestamps of the audio from sources A and B. Check the MixerDevice docs. The issue I'm facing is that the timing is still off, and at this point I have no clue why.
     
    Bagenstose likes this.
  15. Shane-Pangea

    Shane-Pangea

    Joined:
    Dec 12, 2012
    Posts:
    38
    Thanks @Lanre.

    @wkaj , are there any other avenues to look down for getting a more accurate timestamp for better audio sync?
     
  16. Bagenstose

    Bagenstose

    Joined:
    May 31, 2019
    Posts:
    6
  17. Shane-Pangea

    Shane-Pangea

    Joined:
    Dec 12, 2012
    Posts:
    38
    I hope @wkaj and @Lanre didn't give up on this. How can we get this working?
     
    Bagenstose likes this.
  18. Bagenstose

    Bagenstose

    Joined:
    May 31, 2019
    Posts:
    6
    any updates?
     
  19. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
     I'm not sure that there is anything else I can do. First, I'm not an audio engineer, so I'm not familiar with how audio experts handle issues like this. The problem is that the timestamps used to synchronize the audio are off. Part of this is because the timestamps that the microphone actually reports can't be used: they are usually on a clock that is entirely different from the ones we have access to (the ones we use to synchronize). Beyond this, Unity's audio engine is very opaque, so there isn't much I can experiment with on that end.
     
  20. Bagenstose

    Bagenstose

    Joined:
    May 31, 2019
    Posts:
    6
    any thoughts on this @wkaj ?
     
  21. Bagenstose

    Bagenstose

    Joined:
    May 31, 2019
    Posts:
    6