
Performing Frame Independent Audio Commands...

Discussion in 'Scripting' started by astracat111, Aug 18, 2021.

  1. astracat111

    astracat111

    Joined:
    Sep 21, 2016
    Posts:
    725
    This might be a dumb question because I can't find anything about this online...

    I'm wondering how you go about triggering code that runs independently of the frame rate? I'm asking in the context of audio work, both MIDI and audio in general.

    From what I understand, you use scheduled events with the audio engine, right? So, at a low level, you're scheduling events to occur at particular sample positions in the audio engine/hardware.

    My first guess is that you have an asynchronous thread that uses a while/for loop to trigger code in between frames?

    This is still confusing me a bit. There is one callback called OnAudioFilterRead within MonoBehaviour. Is this what you would use for this?

    How would you do this if you were just working with plain ol' C then? Is this a low-level thing only, where you have to directly access the audio hardware using C?

    My goal is to play MIDI events. I know MIDI files use ticks, but do you have to translate ticks to/from audio samples in order to build a timeline where audio clips can work in tandem with MIDI clips?
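    For reference, here's the conversion I think is involved — just a sketch, with made-up example values for PPQ and tempo:

    ```csharp
    using System;

    // Sketch: converting MIDI ticks to audio sample positions.
    // MIDI files store time in ticks; tempo meta events give microseconds
    // per quarter note; PPQ (pulses per quarter note) comes from the file header.
    public static class MidiTiming
    {
        // seconds per tick = (tempo in microseconds per quarter / 1,000,000) / ppq
        public static double TicksToSamples(double ticks, int ppq,
                                            double tempoMicrosPerQuarter,
                                            double sampleRate)
        {
            double secondsPerTick = (tempoMicrosPerQuarter / 1_000_000.0) / ppq;
            return ticks * secondsPerTick * sampleRate;
        }
    }
    ```

    At 120 BPM (500,000 µs per quarter note), 480 PPQ and 44.1kHz, one quarter note (480 ticks) lands 22,050 samples in.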

    Thanks for any help ahead of time.

    Here's the video I've been studying to try to understand how this might work:



    From what I'm understanding here, you have to play and trigger events using milliseconds rather than running code in frames. So the idea I keep coming back to is an asynchronous thread running a loop that pauses every millisecond.

    The thing is, from what I understand audio hardware runs at a sample rate in Hz, so at 44,100Hz you get 44,100 samples per second, right? So am I supposed to be scheduling events to happen when a particular sample passes through the audio hardware?
     
    Last edited: Aug 18, 2021
    april_4_short likes this.
  2. april_4_short

    april_4_short

    Joined:
    Jul 19, 2021
    Posts:
    489
    I've long wondered how to make a unique, lightweight timer thread able to resolve at rates useful for audio editing and playback of notes via MIDI, in Unity.

    It would be a huge boon to game making of all sorts if Unity provided this kind of thread for input and output, especially considering the new gaming phones working at up to 720Hz input response rates (as of now), and the fast, low-latency mice on competitive gaming rigs.

    It seems strange, to me, that they haven't at least considered these needs in their New Input System and ECS/DOTS "performance by default" initiatives.

    Perhaps @Tautvydas-Zilys knows a way to make a unique timer thread at high rates and varying frequencies suitable for ultra low latency scheduling and input absorption and refreshes etc.

    But also for exactly what you're wanting/needing for MIDI performance, editing and recording.
     
    astracat111 likes this.
  3. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    10,678
    What would you do with input arriving at that rate? The earliest point in time you can make the action caused by that input observable is the next frame. So the best point in time to receive input events is the beginning of a frame. There isn't a good use case for input events arriving at the exact moment they happen.

    You can spawn threads in C# and do whatever processing you want on them. You can even mark them as high priority. You'd process MIDI input there, and then move the data to the OnAudioFilterRead callback for playback.
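    A rough sketch of that pattern in plain C# — class and names are mine, not a Unity API, and the poll loop just enqueues a counter where a real MIDI read would go:

    ```csharp
    using System.Collections.Concurrent;
    using System.Threading;

    // Sketch: a high-priority producer thread hands MIDI-like events to a
    // thread-safe queue that an audio callback could drain.
    public class MidiInputPump
    {
        private readonly ConcurrentQueue<int> _events = new ConcurrentQueue<int>();
        private volatile bool _running = true;
        private readonly Thread _thread;

        public MidiInputPump()
        {
            _thread = new Thread(PollLoop) { IsBackground = true };
            _thread.Priority = ThreadPriority.Highest; // a hint to the OS, not a guarantee
            _thread.Start();
        }

        // Stand-in for reading a real MIDI API; here we just enqueue a counter.
        private void PollLoop()
        {
            int note = 0;
            do
            {
                _events.Enqueue(note++);
                Thread.Sleep(1); // poll roughly every millisecond
            } while (_running);
        }

        // The audio callback (e.g. OnAudioFilterRead) would call this.
        public bool TryDequeue(out int midiEvent) => _events.TryDequeue(out midiEvent);

        public void Stop()
        {
            _running = false;
            _thread.Join();
        }
    }
    ```

    The consumer side would then live inside OnAudioFilterRead, draining the queue each time the callback fires.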
     
  4. april_4_short

    april_4_short

    Joined:
    Jul 19, 2021
    Posts:
    489
    MIDI Controllers are legion, all around the world of music, and even synths are MIDI controllers. Your own wonderful Keijiro uses them to drive animations, but let's ignore that for now, because that's at an entirely different rate and not pertinent to what follows.

    When a MIDI controller emits a signal from the user's keyboard/controller input, the ideal is that it loops out to the recording device to be recorded, and then back to the MIDI device, where it's used to initiate a sound (or a sound property change).

    The faster this loop through the recording device, the better. Anything above 4 or 5 ms is considered a bit of a slouch.

    (Yamaha's current flagship keyboard is at 6ms when processing an external controller's input, and everyone complains about this. Musicians are like pro gamers; they have more time acuity than most.)

    It'd be very nice to be able to do this looping of input responsiveness in Unity, via a thread that was able to receive (be interrupted by) incoming input, as it happens.

    Is this possible, in any way, even if it means creating a separate thread?
     
    astracat111 likes this.
  5. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    10,678
    I don't see why it wouldn't be. You'd just have to read the input yourself. We don't limit what you can do in your own code in threads.
     
    astracat111 and april_4_short like this.
  6. april_4_short

    april_4_short

    Joined:
    Jul 19, 2021
    Posts:
    489
    I'll call a friend. Threads I dread.
     
  7. astracat111

    astracat111

    Joined:
    Sep 21, 2016
    Posts:
    725
    You guys are geniuses I can barely keep up x___x

    So is the answer to run a for or while loop from an asynchronous thread? Or is there some low-level work with the Win32 MIDI API I've got to do, or a separate library like NAudio I should use, to accomplish running code in between frames?

    Or does the code have to be scheduled?

    Sorry for being so dumb.

    I've got the Unity asset Maestro - MIDI Player Tool Kit, and I'm really enjoying it for opening up the MIDI data, but I'm wondering if I have to use its MIDI player, or if I can somehow use its 'MIDI stream' object to just call stream.Play(midiEvent) or what have you.

    I think this asset uses NAudio by the way, which I'm sure uses C++...

    If I clench my brain REAL HARD, I'm getting what you're saying. Does that mean you have to use a low-level programming language in conjunction with the Windows audio engine, which directly communicates with the audio driver.....?? I'm sorry again for being dumb. I'm thinking I probably have to use NAudio, from the sounds of it.
     
    Last edited: Aug 18, 2021
  8. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    10,678
    The gist of it is that Unity doesn't have MIDI reading APIs out of the box (as far as I know). So you'd have to use some kind of library or OS API to get that data. If you want that data to be available more often than once a frame, you need to read it from a different thread. If you spawn a thread, all the code that executes there is in your control: you can choose how often you read the MIDI input API and how often you dispatch it to be output by the audio callback.
     
    april_4_short likes this.
  9. astracat111

    astracat111

    Joined:
    Sep 21, 2016
    Posts:
    725
    So in other words, an audio callback is required, so you need to work with scheduled events? This is the part that I'm not quite understanding.

    Doesn't Unity have an audio engine with DSP time, so you could schedule code to be run in DSP time rather than in frames? Or do you schedule code to run when the MIDI driver runs? Is C++/low-level programming required for this kind of thing?
     
    april_4_short likes this.
  10. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    10,678
    Yes, that's what OnAudioFilterRead is: you get called back to supply data every time the audio engine needs more. What you also need to do is get that data from somewhere: it's probably too late to read it when the callback fires, hence you'd do it on your own thread, save it off somewhere, and then when OnAudioFilterRead gets called, you'd put your data into it. That's at least how I'd try doing it - I'm not really an expert on these things :). I'd try playing with it and see what works.
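    The "save it off somewhere" part is commonly a single-producer/single-consumer ring buffer, so the audio callback never has to take a lock. A sketch (illustrative, not a Unity API):

    ```csharp
    using System.Threading;

    // Sketch: a single-producer/single-consumer ring buffer, a typical way to
    // move samples from a worker thread into an audio callback without locks.
    // One writer thread and one reader thread only.
    public class SampleRingBuffer
    {
        private readonly float[] _buffer;
        private int _write; // only the producer advances this
        private int _read;  // only the consumer advances this

        public SampleRingBuffer(int capacity) { _buffer = new float[capacity]; }

        // Producer thread: returns false if the buffer is full.
        public bool TryWrite(float sample)
        {
            int next = (_write + 1) % _buffer.Length;
            if (next == Volatile.Read(ref _read)) return false; // full
            _buffer[_write] = sample;
            Volatile.Write(ref _write, next); // publish after the data is in place
            return true;
        }

        // Audio callback: returns false (and yields silence) if empty.
        public bool TryRead(out float sample)
        {
            if (Volatile.Read(ref _write) == _read) { sample = 0f; return false; }
            sample = _buffer[_read];
            Volatile.Write(ref _read, (_read + 1) % _buffer.Length);
            return true;
        }
    }
    ```

    Inside OnAudioFilterRead you'd loop over the data array and call TryRead once per sample frame, falling back to silence when the producer hasn't kept up.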
     
    astracat111 likes this.
  11. astracat111

    astracat111

    Joined:
    Sep 21, 2016
    Posts:
    725
    Getting the MIDI data setup is no problem....well it is a problem, but it's easier to solve. x_x

    So what I'll try doing is scheduling events into the OnAudioFilterRead callback? Thanks for the help @Tautvydas-Zilys !

    What I'm trying to do is to simplify the MIDI composition process by creating a bare bones sequencer and piano roll program, like in the old days, wish me luck.
     
    april_4_short likes this.
  12. april_4_short

    april_4_short

    Joined:
    Jul 19, 2021
    Posts:
    489
    Is there any explicit documentation on creating threads in Unity, for Unity?

    Have been trawling docs for several hours, without much luck. But I don't know the terminology.

    Am also reading this, and getting LOST:

    https://www.jacksondunstan.com/articles/5522
     
  13. astracat111

    astracat111

    Joined:
    Sep 21, 2016
    Posts:
    725
    Unity - Manual: C# Job System Overview (unity3d.com)

    Maybe that link will help, but I'm not so sure myself. My thought was just using async and Tasks in C#. I'm pretty simple-minded myself, but it has to be scheduled, I think.

    So, MIDI works with microseconds, milliseconds and ticks.

    Microseconds.....as in, one one-thousandth of a millisecond. Does the computer even operate at that level? On the hardware level I'd assumed milliseconds, since 44,100Hz is 44,100 samples processed per second, right? I guess this is why people use C++ if they want to do things from scratch: you've got to be able to access that audio hardware directly.
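    Out of curiosity I checked what timing resolution the platform actually reports — Stopwatch exposes it. A sketch (the helper name is mine):

    ```csharp
    using System.Diagnostics;

    // Sketch: what clock resolution does this machine offer?
    // Stopwatch counts in ticks of 1/Stopwatch.Frequency seconds, which is
    // typically well under a microsecond on modern hardware.
    public static class TimerResolution
    {
        public static double TickDurationMicroseconds()
            => 1_000_000.0 / Stopwatch.Frequency;
    }
    ```

    So the clock itself resolves far below a millisecond; what limits you is the OS scheduler deciding when your code actually gets to run on it.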
     
  14. april_4_short

    april_4_short

    Joined:
    Jul 19, 2021
    Posts:
    489
    Unfortunately Jobs/Burst/ECS/DOTS can't help us, because they don't have any clock, signalling, notifications, timers or other facilities at anywhere near the rates required for MIDI sequencing. I looked into this and asked a couple of people in the know, and they all said the same thing: DOTS and friends are good for sending bursts of stuff to, not for fine-grained time signalling, despite the fact that they operate in tiny slices of time.
     
  15. april_4_short

    april_4_short

    Joined:
    Jul 19, 2021
    Posts:
    489
    You got me thinking with this... and I've been busily researching it since you wrote the above.

    I think this is a "lazy" way to achieve what we both want.

    We're not limited to having only one class utilising OnAudioFilterRead, so we can think about using one for each MIDI track (not sure about performance).

    In this way, we can send in data that exactly matches the amount of time before we want a signal back from this thread, telling us to do (or queue) the next thing at fine time granularity.

    In other words, use the data packet sizes sent to/through OnAudioFilterRead to determine how long it goes off to "wait" before coming back to suggest we do the next thing.

    I've just done a pretty brutal test of it, with not much more accuracy than my ear, and it can go way up into the audible range, and quite high into it, on an old Mac (if you think in terms of an LFO self-oscillating, this is what OnAudioFilterRead is able to do).

    I need to think a lot more, and read a lot more, about how to do this and test it more accurately, and learn to create my own thread, to see what its benefits might be.
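    The buffer-counting idea might sketch out like this — pure C#, names mine; the audio callback would call Advance with data.Length / channels each time it runs:

    ```csharp
    // Sketch: a sample-accurate clock driven by audio buffer sizes. Each time
    // the audio callback runs, it reports how many sample frames it consumed;
    // an event scheduled at an absolute sample position fires when the clock
    // passes that position.
    public class SampleClock
    {
        private long _position;                    // sample frames elapsed so far
        private long _nextEvent = long.MaxValue;   // nothing scheduled yet

        public long Position => _position;

        public void ScheduleAt(long samplePosition) { _nextEvent = samplePosition; }

        // Called from the audio callback with data.Length / channels.
        // Returns true if the scheduled event fell inside this buffer.
        public bool Advance(int frames)
        {
            _position += frames;
            if (_position >= _nextEvent)
            {
                _nextEvent = long.MaxValue;
                return true;
            }
            return false;
        }
    }
    ```

    At 44.1kHz with 512-frame buffers, events can only be noticed at ~11.6ms boundaries this way, but the *count* itself never drifts, which is what matters for sequencing.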
     
    astracat111 likes this.
  16. astracat111

    astracat111

    Joined:
    Sep 21, 2016
    Posts:
    725


    @april_4_short Well, here's a test using Unity's playback engine and the Maestro MIDI asset, just using the code they've provided online for the metronome, with the audio-filter callback script.

    It freezes at the end, but I think that's because I'm closing the MIDI out ports when the application ends.

    I'll have to run some more tests.

    Here's the code from Unity's API documentation:

    Code (CSharp):
    using UnityEngine;

    // The code example shows how to implement a metronome that procedurally
    // generates the click sounds via the OnAudioFilterRead callback.
    // While the game is paused or suspended, this time will not be updated and sounds
    // playing will be paused. Therefore developers of music scheduling routines do not have
    // to do any rescheduling after the app is unpaused.

    [RequireComponent(typeof(AudioSource))]
    public class AudioTest : MonoBehaviour
    {
        public double bpm = 140.0F;
        public float gain = 0.5F;
        public int signatureHi = 4;
        public int signatureLo = 4;

        private double nextTick = 0.0F;
        private float amp = 0.0F;
        private float phase = 0.0F;
        private double sampleRate = 0.0F;
        private int accent;
        private bool running = false;

        void Start()
        {
            accent = signatureHi;
            double startTick = AudioSettings.dspTime;
            sampleRate = AudioSettings.outputSampleRate;
            nextTick = startTick * sampleRate;
            running = true;
        }

        void OnAudioFilterRead(float[] data, int channels)
        {
            if (!running)
                return;

            double samplesPerTick = sampleRate * 60.0F / bpm * 4.0F / signatureLo;
            double sample = AudioSettings.dspTime * sampleRate;
            int dataLen = data.Length / channels;

            int n = 0;
            while (n < dataLen)
            {
                float x = gain * amp * Mathf.Sin(phase);
                int i = 0;
                while (i < channels)
                {
                    data[n * channels + i] += x;
                    i++;
                }
                while (sample + n >= nextTick)
                {
                    nextTick += samplesPerTick;
                    amp = 1.0F;
                    if (++accent > signatureHi)
                    {
                        accent = 1;
                        amp *= 2.0F;
                    }
                    Debug.Log("Tick: " + accent + "/" + signatureHi);
                }
                phase += amp * 0.3F;
                amp *= 0.993F;
                n++;
            }
        }
    }
     
  17. april_4_short

    april_4_short

    Joined:
    Jul 19, 2021
    Posts:
    489
    Add two zeros to the BPM. It copes!
     
  18. april_4_short

    april_4_short

    Joined:
    Jul 19, 2021
    Posts:
    489

    If you take out the Debug.Log nonsense, you can crank the BPM to truly INSANE numbers.

    1 million, no problem! And the DSP load stays super low.

    Interestingly, if you leave the Debug.Log in when going to super high numbers, you can see that the DSP time usage stat in the Game View is accounting for all the time OnAudioFilterRead spends spamming the console, so it's not actually measuring DSP load, just the load (I presume) on this thread.

    For me, the thread ID is 22, and doesn't change between runs, nor is it different when I add another game object with another AudioSource and another of these Metronomes, it's also on thread 22.

    Which is a bit unfortunate, because it means it's always this thread, rather than a unique thread for each instance using OnAudioFilterRead.

    You can check by using this:

    Debug.Log(Thread.CurrentThread.ManagedThreadId)

    Along with

    using System.Threading;
     
    astracat111 likes this.
  19. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    10,678
    You can just spawn normal C# threads. There's nothing Unity specific about it.
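    For example, the bare minimum looks something like this (standard .NET threading, nothing Unity-specific; the class is just for illustration):

    ```csharp
    using System;
    using System.Threading;

    public static class ThreadDemo
    {
        public static int Counter; // written by the worker, read after Join

        public static void Run()
        {
            // Create a worker thread from a lambda, start it, and wait for it.
            var worker = new Thread(() =>
            {
                for (int i = 0; i < 1000; i++) Counter++;
            });
            worker.IsBackground = true; // don't keep the process alive on exit
            worker.Start();
            worker.Join(); // block the caller until the worker finishes
        }
    }
    ```

    In a real MIDI loop you'd skip the Join and let the worker run for the lifetime of the app, signalling it to stop on shutdown.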
     
  20. april_4_short

    april_4_short

    Joined:
    Jul 19, 2021
    Posts:
    489
    That's a "just" to you.

    Try to think what the concept looks like to someone that's never used C# outside of Unity.

    Daunting, is the word you're looking for.

    Don't bust yourself, but ANY tips would be welcome.
     
  21. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    10,678
    Well, you're asking for documentation and we don't document "standard" C# APIs. Microsoft has extensive .NET API documentation on MSDN and they do a far better job at it than we ever could.
     
  22. april_4_short

    april_4_short

    Joined:
    Jul 19, 2021
    Posts:
    489
    If you were going to pass on a bit of advice about threading, to a rookie, what would it be?

    Give me a head start, if you can.


    Let me give you an example. If you were new to After Effects, I'd say:

    Compositions form the basis of creativity in After Effects (AE) and are somewhat akin to documents in other apps.

    The key feature of Compositions? - the ability to nest them, from any level of granularity on upwards.

    When in doubt, start a new composition, and do as little as you need within it to consider that part of your animation process "done". Then create another composition for the next part, and begin combining these in host compositions as you build up each bit of your production.

    This modularity will provide enormous creative flexibility as your project evolves.

    Learn masking early on; it's the most important thing After Effects provides in terms of where its effects are applied, and how they fall off from your source materials.

    It's Adobe software, so nothing about it is intuitive, and it's gotten stale over the years. Use it on a PC, not a Mac, with a good video card.

    Buy more RAM than you think you need.