
Koreographer - Audio Driven Events, Animation, and Gameplay

Discussion in 'Assets and Asset Store' started by SonicBloomEric, Sep 15, 2015.

  1. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
You will need to make a fair number of adjustments to the demo code, but it shouldn't be crazy. The core logic is all there - you'll just need to abstract what's there into something a bit more generic so that you can trigger the content as a "playlist".

    That's not precisely true, though. The demo code already performs look-ahead for the lead-in time. That is the system that allows you to see the notes falling before the music begins. Take a look at how the leadInTime, leadInTimeLeft, and timeLeftToPlay members are used in the RhythmGameController class.

    You should be able to abstract all of that code out into something like a "RhythmGameTrack" class that the RhythmGameController then uses as part of a list (a "playlist", if you will). Each RhythmGameTrack would then handle its own lead-in-time. The RhythmGameController class would then need to be updated to check whether the "next" RhythmGameTrack should begin or not.

    You will, of course, need to figure out how to get the looping to be "spot on". To do this, you will need to calculate how much time it takes for the gems to travel from the top of the screen to the bottom. That would then be your "next track's lead-in-time".
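    A rough sketch of the shape this abstraction could take (the RhythmGameTrack class and its fields mirror the demo's leadInTime/timeLeftToPlay members but are otherwise hypothetical - adapt to whatever the demo actually contains in your version):

    ```csharp
    using SonicBloom.Koreo;
    using UnityEngine;

    // Hypothetical playlist-entry wrapper adapted from the demo's
    // leadInTime / leadInTimeLeft / timeLeftToPlay bookkeeping.
    [System.Serializable]
    public class RhythmGameTrack
    {
        public Koreography koreography;   // The Koreography asset for this track.
        public float leadInTime;          // Seconds of note fall-time before audio starts.

        float timeLeftToPlay;             // Counts down; audio should start at zero.
        public bool IsStarted { get; private set; }

        public void Begin()
        {
            timeLeftToPlay = leadInTime;
            IsStarted = true;
        }

        // Called each frame by the controller. Returns true on the frame
        // the lead-in elapses, i.e. when audio playback should begin.
        public bool UpdateLeadIn(float deltaTime)
        {
            if (!IsStarted || timeLeftToPlay <= 0f)
            {
                return false;
            }
            timeLeftToPlay -= deltaTime;
            return timeLeftToPlay <= 0f;
        }
    }
    ```

    The RhythmGameController would then hold a List<RhythmGameTrack> and call Begin() on the next entry once the current track's remaining play time drops below the next track's lead-in time.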

    I hope this is helpful!
     
    Guacamolay likes this.
  2. Guacamolay

    Guacamolay

    Joined:
    Jun 24, 2013
    Posts:
    58
    Thanks for the help, I'll look into it :D
     
    SonicBloomEric likes this.
  3. sebastiansgames

    sebastiansgames

    Joined:
    Mar 25, 2014
    Posts:
    114
    Hey there! Got a question for you. Koreographer is working great for my project but I'm interested in having a song loop with some overlap (ie music starts playing again before it completely ends).

Currently I'm using the Simple Music Player. For overlapped looping, my initial instinct was to duplicate the Simple Music Player so I'd have an A and a B iteration: put a Koreography event near the end of the track which fires and starts B playing, then vice versa. While the audio sounds fine, I can't figure out how to stop listening to Koreography events from A when I switch to B, etc. So I'm getting doubled-up Koreographer events, which is leading to all sorts of problems. Is there a solution you might suggest? Thank you!
     
  4. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    There are several ways that you might handle this but the absolute simplest approach would probably be to disable the SimpleMusicPlayer component for A when you start B. Similarly, you would reverse that whenever the looping occurs (e.g. B→A would be "disable B, enable A").

The SimpleMusicPlayer drives updates internally through the MonoBehaviour Update() function. Disabling the component should stop Update from being called which, in turn, should stop the "event pump" from that component.
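    A minimal sketch of that suggestion (the LoopSwapper class and method names are illustrative; SimpleMusicPlayer is the actual Koreographer component, but double-check the namespace against your install):

    ```csharp
    using SonicBloom.Koreo.Players;
    using UnityEngine;

    public class LoopSwapper : MonoBehaviour
    {
        public SimpleMusicPlayer playerA;
        public SimpleMusicPlayer playerB;

        // Wired up as the handler for the "near end of track A" Koreography event.
        // Disabling a SimpleMusicPlayer stops its Update()-driven event pump;
        // per this thread, the underlying audio playback continues.
        public void SwapToB()
        {
            playerB.enabled = true;
            playerA.enabled = false;
        }

        // Handler for the "near end of track B" event.
        public void SwapToA()
        {
            playerA.enabled = true;
            playerB.enabled = false;
        }
    }
    ```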

    Please give that a shot and let us know how it goes!
     
    sebastiansgames likes this.
  5. sebastiansgames

    sebastiansgames

    Joined:
    Mar 25, 2014
    Posts:
    114
    Wow that worked like a charm! I somehow imagined that disabling the SimpleMusicPlayer would also stop the playback -- but the playback continued and the events stopped firing as you suggested. Thank you!!! Really loving Koreographer. Thanks again!
     
    SonicBloomEric likes this.
  6. unity_JWCBJdscImvxrg

    unity_JWCBJdscImvxrg

    Joined:
    Oct 1, 2019
    Posts:
    4
    upload_2019-11-1_22-50-58.png

    Hi! I've tried integrating Koreographer into Master Audio but I got this error after importing the integration.
     
  7. unity_JWCBJdscImvxrg

    unity_JWCBJdscImvxrg

    Joined:
    Oct 1, 2019
    Posts:
    4
    BTW, I have this feature in my game where the game will slow down after a specific action has been done...my question is...is it possible to slow down the speed of Koreographer's playing audio, too?
     
  8. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    Hi! What versions of Koreographer Professional Edition and Master Audio are you using? Koreographer Professional Edition 1.5.1 included an update to address an issue with the Master Audio 4.2.0+ API specified in your console.

We just tested installation of the latest versions of both assets with the integration and found no errors in the console. Is there something custom about your setup, perhaps?

Yes! If you use the Simple Music Player component for playback, you can simply adjust the associated AudioSource component's pitch and Koreographer will follow right along. Likewise, if you're using Master Audio to control the audio, Koreographer will similarly just follow right along!
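    As a sketch, if a gameplay action halves the game speed, adjusting the AudioSource pitch is all that's needed on the audio side (the class and field names here are illustrative):

    ```csharp
    using UnityEngine;

    public class SlowMoAudio : MonoBehaviour
    {
        // The AudioSource that the Simple Music Player is driving.
        public AudioSource musicSource;

        // Halve playback speed; Koreographer's timing follows the
        // AudioSource, so events keep firing at the (now slower)
        // musical positions.
        public void EnterSlowMo()
        {
            musicSource.pitch = 0.5f;
        }

        public void ExitSlowMo()
        {
            musicSource.pitch = 1f;
        }
    }
    ```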
     
    unity_JWCBJdscImvxrg likes this.
  9. Fearless_Garrett

    Fearless_Garrett

    Joined:
    Sep 25, 2015
    Posts:
    9
    I've been tinkering with Koreographer for a while now and we are really enjoying how reliable Koreo is when it comes to firing payload events based on music tracks we provide! I've been working on finalizing the core of our game's framework and I've hit a snag.

    In short, what I'm trying to accomplish is replace Unity's "Time.deltaTime" value with a sort of "Koreographer.deltaTime", tracked either as a double or as a sample position, that is explicitly tied to the music position (and thus pausing the music would report a deltaTime of 0). I have a MonoBehavior that is keeping track of the "last Koreographer timestamp" and reporting the difference between frames as the delta which all other synchronized game logic references as the global timestep.

What I tried doing was setting up a Coroutine or LateUpdate method that simply records the current sample position and the change vs. the previous frame, and uses that as my delta. The problem is, frames occasionally report a lower sample position than the frames before them during playback. This happens inconsistently but averages about two to three times a second, with a sample range between -80 and about -144 (using a 44kHz clip). This is with vsync disabled and running an average of about 250fps. Some sample code of what I'm doing looks like this:

    int lastSamplePoint = 0;

    void LateUpdate()
    {
        int sample = MyKoreographer.GetMusicSampleTime();
        if (sample < lastSamplePoint)
        {
            Debug.Log("oops " + (sample - lastSamplePoint));
        }
        lastSamplePoint = sample;
    }


    This update loop will log the difference in sample position when the current frame's sample position is somehow lower than the previous frame's. This scenario is expected when changing tracks, but it's happening multiple times per second when just playing a song and I can't find a reason or workaround for it.

    I have been tinkering with various methods while writing this post and the problem appears to go away when vsync is enabled. I'm not in love with the idea of forcing vsync so if it's possible to come up with a better solution than dropping negative frames I'd be happy to hear it, otherwise I think what I'm doing now should be acceptable. Sorry about the novella, but I hope it might be of some use in figuring out what's going on!
     
  10. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    Please don't apologize! This was an excellent writeup and really helped us understand exactly what issue you're experiencing! Very helpful!

    You have tripped over an issue that we recently discovered where high frame rates can result in the timing estimation system overestimating the audio position. When the audio system begins reporting its numbers again, those numbers can end up falling behind the estimated time, which is why you see negative numbers. Turning on Vsync sidesteps this issue because the audio subsystem typically has enough time to update its position (or near enough) between frames.

    We have been testing a fix for this with a handful of affected customers. So far everything seems great with the new approach, which even includes more precise timing overall! If you would like to give this newer version a shot, please reach out to us at support@sonicbloomgames.com.
     
    rrahim likes this.
  11. NeatWolf

    NeatWolf

    Joined:
    Sep 27, 2013
    Posts:
    883
    Hi there!

I purchased the asset almost 2 years ago, but I'm only now getting to use it in production :)
    I'm a bit worried seeing that the last update was in 2018, since we're almost crossing the 2019-2020 line.

    I can see you're still supporting it but... is there any chance to have some "seal of approval" of the asset being fully compatible with Unity 2019.2.x?

    I'm also pretty sure it's going to have a good impact on the sales during black friday :)

I'd gladly pay extra (in the form of upgrading to a new version) to help keep the asset alive and officially supported :)

    (I should be doing it myself but work is killing me during these weeks T_T)
     
    Last edited: Nov 12, 2019
  12. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    This is actually very helpful feedback. As mentioned above, we are working on an update - we just want to make sure that it doesn't break anything for current users before releasing it. [Please contact us if you are a current customer and would like to try the upcoming update.]

    How does the fact that Koreographer is fully compatible with Unity 2020.x sound? We test Betas/Alphas as soon as we can to anticipate upcoming breaking changes. Our last test against a 2020 alpha (last week) showed no errors and no Script API Updater warnings - everything continues to work fine!

    This is an excellent point. We will see if we can get that release out prior to the sale!

    *makes note* ;)
     
    NeatWolf likes this.
  13. Fearless_Garrett

    Fearless_Garrett

    Joined:
    Sep 25, 2015
    Posts:
    9
    I see, thanks for letting me know! I will most likely get in touch with the address provided about trying out that update next week!
     
    SonicBloomEric likes this.
  14. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    While we're still happy to get in touch, the relevant bug fixes are included with the just-released v1.6.0! Please update and let us know if you experience any issues! :D
     
  15. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
Hey everyone! As indicated above, we just released version 1.6.0 of Koreographer and Koreographer Professional Edition to the Unity Asset Store! Here's a rundown of changes:
    • [NEW] Completely rewritten core timing estimation system for Unity's AudioSource audio system! With this new system, timing updates are more precise, smoother, and more stable.
    • Fix final End Sample number display in the Koreography Editor's end position LCD.
    • Fix final beat sample location math.
    • Fix analysis window end range processing.
    • Fix double-click-created event snapped beyond end of track.
    • Fix Undo when drawing multiple snapped events during a drag operation.
    • Fix draw mode OneOff event snapped beyond end of track.
    • Fix draw mode Span event ending beyond end of track.
    • Adjust ReadMe language around credits logo usage for clarity.
    If you purchased a previous version of Koreographer and Koreographer Professional Edition, then v1.6.0 is a free upgrade!
     
    rrahim likes this.
  16. Fearless_Garrett

    Fearless_Garrett

    Joined:
    Sep 25, 2015
    Posts:
    9
    Is there any way we can expand the size of text fields in the Koreography Editor? We have some fairly long text payloads and the editor window ends up looking like this:



    The window itself is horizontally much larger than the space occupied here and the entire right half of this bottom portion of the window is blank; having this text field grow into that unused space based on window or input size would help our workflows a bit.
     
  17. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    There is no built-in way to do this at this time, no. Which version of Koreographer do you have? If you have the Professional Edition there may be something that you can do with a Custom Payload, which will allow you to handle the IMGUI call for that space. You should be able to create some sort of pop-over or separate viewer that shows the content of your Payload when an event with your Custom Payload type is selected.

    There are a few other options available as well, including possibly providing you with a custom build or source code (if you have Pro) to edit this yourself. Please reach us at our support address to discuss such options further.
     
  18. Tubbritt

    Tubbritt

    Joined:
    Nov 30, 2015
    Posts:
    16
    Hello. I was considering buying the pro edition in the sale. I'm really only interested in a specific function right now and I was wondering if you could tell me if your asset supports this. I wish to connect a Midi Keyboard and as I play notes, I want the Midi Keyboard to turn off and on Game Objects using Koreographer. Is this possible, and if so, would this also work on Android? Specifically the Input from an external Midi Synth.
     
  19. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    Unfortunately Koreographer does not support this use case. Koreographer Professional Edition's MIDI support is limited to MIDI file import [for conversion purposes] and does not support MIDI's Real Time mode. There are other assets in the Asset Store and on GitHub that should be able to help you with this goal. Best of luck! :D
     
  20. Tubbritt

    Tubbritt

    Joined:
    Nov 30, 2015
    Posts:
    16
    Thank you kindly for your reply.

    Much appreciated.
    James.
     
    SonicBloomEric likes this.
  21. Fearless_Garrett

    Fearless_Garrett

    Joined:
    Sep 25, 2015
    Posts:
    9
    We DO have the Pro version, but as you suggested it may be more elegant and scalable for us to use custom payloads rather than fiddle with the editor UI. Thanks for the suggestions!
     
    SonicBloomEric likes this.
  22. misomero

    misomero

    Joined:
    Nov 14, 2017
    Posts:
    1
    Hi!
I bought Koreographer and I’m quite new to this. I’m playing around with the rhythm game demo and am wondering: can I somehow increase the number of lanes, so that I could have more notes going down? Does Koreographer allow this?

    Happy new year and happy coding.
    Maija
     
  23. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    Yes, this is 100% possible with the Rhythm Game Demo - you just need to build it yourself. You will need to look at the architecture of the objects in the scene, particularly around the Lanes. Duplicating a lane and then adjusting its position in the scene would be a good place to start. This effectively involves:
1. Duplicate an existing Lane (a Target # GameObject).
    2. Adjust the position of the Lane in the UI (adjust the x-position of the new Target GameObject).
      1. If you're using the built-in Button UI, duplicate a button - located in Canvas/Buttons/Button - and then adjust its x-position appropriately.
    3. Click on the Gameplay Controller object and add the newly created Target GameObject to the "Note Lanes" array.
    4. On the "Target" GameObject from step 1, adjust the Keyboard Button to a new button and then adjust the Matched Payloads to whatever KoreographyEvent Payload you want this lane to match (by default, these are note values).
    That should be enough to get you started!

    [The Rhythm Game Demo is built as a foundation from which you build your own gameplay. It is not a full game or template, but, rather, provides you with a working system and example of how to match user input to music events. :) Hope this helps!]
     
  24. Pourya-MDP

    Pourya-MDP

    Joined:
    May 18, 2017
    Posts:
    78
Hey everyone, I have a simple question.
    First, let me tell you my scenario:
    There is a cast splash screen in my game which plays one time, when the game is installed on a mobile device.
    Now I want to sync the music with the names of the creators and cast.
    I already did it using DOTween, with coroutines and such, and everything is working perfectly.
    But now I see Koreographer is made for such scenarios, and I don't know its performance on mobile devices.
    So the question is: which way would you prefer to go?
    The coroutines? Or Koreographer?
    (Mobile devices.)
    I'll appreciate any help
     
  25. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    Koreographer's performance on mobile is killer - it basically doesn't even show up on the profiler. People have been using Koreographer on mobile for years now and some of the first use cases were GearVR targets. Modern mobile devices are way, way stronger than the devices used in those circumstances so things have only improved.

    As the lead developer of Koreographer, I clearly have a bias here, but I would say that Koreographer would be an excellent fit for this scenario. DOTween and Coroutines are great and all, but they can get pretty heavy and difficult to manage if you have multiple going at a time...

    I hope someone else can chime in for you!
     
  26. Pourya-MDP

    Pourya-MDP

    Joined:
    May 18, 2017
    Posts:
    78
    Hi @SonicBloomEric
    I appreciate your help
Now you have one more happy user!! :)
     
    SonicBloomEric likes this.
  27. Pourya-MDP

    Pourya-MDP

    Joined:
    May 18, 2017
    Posts:
    78
    Hi there again @SonicBloomEric
Another question has popped up in my mind.
    Let's say we have a track and an associated event which will be driven by a simple script.
    Now let's assume we have 10 markers on the track to fire the event.
    The question is: how do I make those markers fire 10 individual events, respectively, from 1 to 10?
     
  28. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    Can you expand further on what you mean by this?

My guess is that you're looking for 10 entirely separate event callbacks that you want to trigger in the audio. These 10 "separate events" have no relation to one another. If this is your goal, you could create 10 KoreographyTrack assets, add them to the Koreography, give them unique "Track Event ID"s, and then add event-specific markers to each Track. Then you can register for event callbacks from each "Event ID" separately.
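    Registering a separate callback per Track might look something like this sketch (the "TrackID1"/"TrackID2" Event IDs and handler names are placeholders; please double-check the registration signatures against your version's documentation):

    ```csharp
    using SonicBloom.Koreo;
    using UnityEngine;

    public class MultiTrackListener : MonoBehaviour
    {
        void OnEnable()
        {
            // One callback per Track Event ID; each fires independently.
            Koreographer.Instance.RegisterForEvents("TrackID1", OnEventOne);
            Koreographer.Instance.RegisterForEvents("TrackID2", OnEventTwo);
            // ...and so on for the remaining Tracks.
        }

        void OnDisable()
        {
            // Clean up all registrations made by this component.
            Koreographer.Instance.UnregisterForAllEvents(this);
        }

        void OnEventOne(KoreographyEvent evt) { /* reaction #1 */ }
        void OnEventTwo(KoreographyEvent evt) { /* reaction #2 */ }
    }
    ```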

    Does this answer the question?
     
  29. Pourya-MDP

    Pourya-MDP

    Joined:
    May 18, 2017
    Posts:
    78
    Hi
I think you got it!
    But I wonder: is that the only way to trigger multiple events on the same track?
    Is it efficient?
    Is there any performance impact in doing it (or maybe memory problems)?
    I mean creating multiple assets for multiple events.
    I appreciate your time
     
  30. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    To be clear, I am interpreting this statement to read as follows:

The answer to this is no: there are other ways, though they involve embedding the callback "mapping" into the payload data. There are two easy options to implement:
    1. [Professional Edition Only] Create your own Custom Payload type to attach to events and specify the "mapping" as a field on the payload.
    2. Encode the meaning in your Payload. If you use a TextPayload, you could preface all of your payload values with something like "A:", "B:", "C:", etc. and have each of those "mean" something specific.
    In both scenarios above you would create a single "callback handler" that would receive the payloads above, inspect them for "mapping", and then dispatch the callbacks to listeners/systems appropriately.
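    Option 2 could be sketched as a single handler like the following (the "A:"/"B:"/"C:" prefixes and dispatch targets are purely illustrative, and GetTextValue() assumes the payload helper extensions from the Koreographer API - verify against your version's docs):

    ```csharp
    using SonicBloom.Koreo;
    using UnityEngine;

    public class PrefixDispatcher : MonoBehaviour
    {
        // Single callback registered for the shared Track's Event ID.
        void OnMusicEvent(KoreographyEvent evt)
        {
            string payload = evt.GetTextValue();  // TextPayload contents.
            if (string.IsNullOrEmpty(payload) || payload.Length < 2)
            {
                return;
            }

            // Route based on the encoded "mapping" prefix.
            switch (payload.Substring(0, 2))
            {
                case "A:": HandleA(payload.Substring(2)); break;
                case "B:": HandleB(payload.Substring(2)); break;
                case "C:": HandleC(payload.Substring(2)); break;
            }
        }

        void HandleA(string data) { /* dispatch to listener/system A */ }
        void HandleB(string data) { /* dispatch to listener/system B */ }
        void HandleC(string data) { /* dispatch to listener/system C */ }
    }
    ```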

    There's a lot of manual work to get such a system to work but it should be relatively quick and efficient. We are aware of many users who have taken this path.


    Using multiple KoreographyTrack instances is indeed efficient. The extra weight that each KoreographyTrack instance adds to the system is effectively negligible and it certainly simplifies your code (no need for an intermediary above).


    The runtime memory impact of a KoreographyTrack instance is measured in bytes. Your other game data (textures, audio, meshes, etc.) will all drastically outweigh any pressure that a KoreographyTrack (or even 10) will add to your application's memory. As for performance, you should see little-to-no runtime impact by adding KoreographyTrack instances. It isn't free, of course, as each instance needs to be checked for an event each frame, but that check basically boils down to a simple if-statement in the most common case.

    If you already have Koreographer, we cannot recommend enough that you try a simple example with such a setup and see if Koreographer starts to appear in your profile. Our guess is that it will remain extremely quiet in the Profiler.

    Hope this helps!
     
  31. gegagome

    gegagome

    Joined:
    Oct 11, 2012
    Posts:
    340
    hi there

    Is it possible to load the Koreographer waveform of an audio track in the game window?

    Thanks
     
  32. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    Unfortunately this is not possible. The waveform rendering logic consists of highly optimized IMGUI code that is deeply integrated with and reliant upon Editor APIs. :(
     
  33. gegagome

    gegagome

    Joined:
    Oct 11, 2012
    Posts:
    340
    Shoot, thank you so much for your quick reply
     
    SonicBloomEric likes this.
  34. Muskie

    Muskie

    Joined:
    Jan 21, 2017
    Posts:
    17
    Dumb question, but I can't *quite* seem to get my mind around it at the moment:
I'm expanding the RhythmGameDemo code a bit, and I'm trying to work out how to get the time when the player interacts with the note (which I believe is
    GameController.DelayedSampleTime
    ), and compare it to the actual timing window so I can then score based on how close they are to the event trigger/beat within the timing window?
     
  35. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    That is correct! The delayed sample time includes support for any "lead in time" left. During actual gameplay, this is expected to be 0, so it simply defaults back to the last sample time position reported by the "playing" Koreography.

    The
    NoteObject.IsNoteHittable()
    method has exactly this type of calculation. The math in that method checks that the distance between the "note time" and the "current time" is less than the size of the hit window. The absolute value is used because the hit window in the demo is defined as "the time (both early and late) within which input will be detected as a Hit."

    You can create your own distance measures to compare the result of that absolute value to for scoring.
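    A scoring-tier check built on that same absolute-value distance might look like this sketch (the window sizes and tier names are made up for illustration; derive real thresholds from your hit window):

    ```csharp
    using UnityEngine;

    public static class HitScoring
    {
        // Sample-distance thresholds (illustrative values).
        const int PerfectWindow = 2205;   // ~50 ms at 44.1 kHz.
        const int GoodWindow    = 6615;   // ~150 ms at 44.1 kHz.

        // noteTime and curTime are sample positions, as compared
        // in the demo's NoteObject.IsNoteHittable() method.
        public static string Score(int noteTime, int curTime)
        {
            int distance = Mathf.Abs(noteTime - curTime);

            if (distance <= PerfectWindow) return "Perfect";
            if (distance <= GoodWindow)    return "Good";
            return "Miss";
        }
    }
    ```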

    Please keep in mind that the standard Unity input system does not have input timestamps so the best "timing resolution" you can expect is 100% tied to your game's framerate (specifically
    Time.unscaledDeltaTime
    ). You might see improvements using the new Input System as it contains support for input timestamps.

    Hope this helps!
     
  36. Muskie

    Muskie

    Joined:
    Jan 21, 2017
    Posts:
    17
    Thanks, Eric!

I'm actually tearing this thing apart to work in VR with object collisions (yes, the whole Beat Saber thing, I know), but the biggest thing I needed to realise to get this going was that Koreographer measures time in samples, rather than seconds (for obvious reasons now I see it, but it was a big perception hump to get over!)

    EDIT: Nailed it. and in the interests of helping everyone else out:

    Code (CSharp):
    float SamplesToMilliseconds(int samples)
    {
        float ms;

        ms = (samples / (gameController.SampleRate * 0.001f));

        return ms;
    }
    get the hitTime from
    Mathf.Abs(noteTime - curTime)
    as noted by Eric above, then
    SamplesToMilliseconds(hitTime)
    to get a float time in ms that you hit the object.
     
    Last edited: Mar 24, 2020
    rrahim and SonicBloomEric like this.
  37. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    Oooh, sounds awesome! Would love to see it someday!

    Ahh, yup! Glad to hear you've made the mental shift! Hopefully that will help clear up a whole lot of what you see in the code! ;D

    Thanks very much for sharing your utility function! I'm sure others will find it helpful! :D
     
  38. Fearless_Garrett

    Fearless_Garrett

    Joined:
    Sep 25, 2015
    Posts:
    9
    Another UI question for you:

    The [eventID] attribute tag brings up a dropdown of all eventIDs in the current project. How/where do I get the list/array of strings of all user-created eventIDs? Our current project has several hundred so the dropdown is effectively useless and I wish to replace it with an autocomplete attribute, but to do that I need the list of eventID strings. Thanks!
     
  39. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    The EventID Attribute's Property Drawer implementation handles that logic. This is the function that works the magic:
    Code (CSharp):
    static List<string> GetAllEventIDsInProject()
    {
        // Find all Koreography Tracks in the Asset Database.
        string[] guids = AssetDatabase.FindAssets("t:KoreographyTrackBase");

        List<string> ids = new List<string>();

        for (int i = 0; i < guids.Length; ++i)
        {
            KoreographyTrackBase track = AssetDatabase.LoadAssetAtPath(AssetDatabase.GUIDToAssetPath(guids[i]), typeof(KoreographyTrackBase)) as KoreographyTrackBase;

            // Verify the track loaded before touching its EventID.
            if (track != null)
            {
                string id = track.EventID;

                if (!string.IsNullOrEmpty(id) && !ids.Contains(id))
                {
                    ids.Add(id);
                }
            }
        }

        return ids;
    }
    In large projects it can take a while to generate the list (it has to load all the KoreographyTrack assets in the project), which is why we put that functionality behind a "[R]efresh" button.

    Hope that helps!
     
  40. Fearless_Garrett

    Fearless_Garrett

    Joined:
    Sep 25, 2015
    Posts:
    9
    This is super helpful. I don't know much about attributes and property drawers so I'm not sure how to cache the values the way you do with the Refresh button, but I was able to adapt the Autocomplete drawer to pull up the eventIDs using your code sample there, and now it's working just how I expected it to. Gonna be a big workflow improvement. Thanks again!
     
    SonicBloomEric likes this.
  41. yotingo

    yotingo

    Joined:
    Jul 10, 2013
    Posts:
    40
    This is probably a dumb question but... does Koreographer work with music file types besides MIDI? I want to use purchased .ogg or .mp3 music packs to generate events. Can Koreographer analyze these songs and create events for different instruments?

    I really love the idea of Koreographer but I'm worried that it may be useless for a one-man team.
     
  42. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    This question may be interpreted in two ways and I will respond to both:
    1. Does Koreographer play back audio file types besides MIDI? Koreographer does not actually "play" any audio files whatsoever. That is the job of the audio system that Koreographer integrates with. By default, this is the Unity AudioSource system. As such, Koreographer works with any file that the AudioSource system can play back.
    2. Does Koreographer support importing events from file types besides MIDI? To be clear, Koreographer Professional Edition supports MIDI event conversion. A MIDI file is roughly equivalent to "digital sheet music" in that it consists of information about instrumentation and timing for when notes should be played. The MIDI Converter utility included with Koreographer Professional Edition is capable of opening up a MIDI file version of a sample-based audio file you also have access to. A typical workflow is to have your custom music composed by an audio professional using a DAW (in some manner or another) and that person would export a MIDI version of the project alongside the WAV/OGG file for import into Unity. The MIDI version would be consumed by the MIDI Converter to generate extremely accurate event timing for any instrument used to create the music.

      OGG and MP3 files are sample based and do not contain such musical information. Analyzing an audio file's stream of samples to suss out those details is an entire area of research known as Music Information Retrieval. While some algorithms do exist to approximate some of the details, the results typically vary wildly depending upon the source music. Currently, neither version of Koreographer supports "event extraction" from sample based audio files though we are constantly evaluating functionality for inclusion into the system. The analysis functions that Koreographer Professional Edition currently supports are designed to automate a very specific type of event. They are also described in detail in this post (as well as in the documentation).
No. You will be hard pressed to find a tool that is capable of doing this at all, let alone in a generally meaningful manner. Researchers have made huge advancements in the last few years with respect to source separation (the first step required in taking down-mixed music and generating per-instrument event information), but they still have a long way to go. Most systems (algorithms, etc.) will struggle to get a basic reliable beat detection algorithm down, let alone something that can generate note information for all instruments used in a mix.

    I should point out that there is at least one composer on the Asset Store (@pdkmusic) who has published music packs that contain pre-generated Koreography tracks and include MIDI files for deeper customization (see: Build It Up and Build It Up 2).

    This entirely depends upon the requirements of your project and the quality and abundance of source data that you have access to. If you're looking at picking up music packs, then you might reach out to the publisher and see if they'd be willing to provide you with a MIDI file representation of the tracks in the pack you're looking to purchase. If you explain the use case they may very well be happy to provide you with that information!

    Hope this helps!
     
  43. yotingo

    yotingo

    Joined:
    Jul 10, 2013
    Posts:
    40
    Thank you for the detailed response!

    Given that information, can I place markers (events) in Koreographer manually? For example, going through a song and dropping a marker at each impactful piano note or drum beat. Would this be an unreasonable approach for music with unavailable MIDI tracks?

    [As a side note, the user guide linked from that post does not exist. "Sorry, the file you have requested does not exist."]
     
  44. Muskie

    Muskie

    Joined:
    Jan 21, 2017
    Posts:
    17

    Indeed you can, and that's the intended method.

    The Koreographer documentation is here: https://drive.google.com/drive/fold...VyMU5qaHdqS2VEbG83dUNpN3AxVWk4RkVQbTEyMGVld1U

    Take a look at the Koreography Quick Start Guide; it shows you how to set things up! :)
     
    SonicBloomEric likes this.
  45. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    @Muskie hit it on the head! :D

    Also, if you find that your event mapping gets too complex, you might consider using a DAW (e.g. REAPER) to map out the events as MIDI, export it, and then use the MIDI Converter. Some users have used this approach to great effect.
    Oooh, thanks for the heads up! For a while we were linking directly to the resource files but we found that those links broke when we upgraded the documentation with new releases. I've replaced the documentation link in that post with one that should not suffer from this issue.
     
  46. Fearless_Garrett

    Fearless_Garrett

    Joined:
    Sep 25, 2015
    Posts:
    9
    Is there a way to reduce the delay/inconsistency between the start sample time of an event and when the event actually plays?

    Our audio engineer has set up a track of koreo events telling enemies when they should fire at the player (this gunfire acts as a percussive instrument to the soundtrack as well), but reported the effects being as late as 30ms off. Using the RegisterForEventsWithTime method I was able to log the discrepancy between when we wanted an event to start and when it did start, and the range was anywhere from 7 samples to 1300 samples late. I'm hoping it's as simple as setting a value somewhere in unity, but it's important that we reduce the delay between a sample's specified start time and a sample's actual start time as much as possible, since there is an aural component to this particular feature that makes it so noticeable.

    We're using the SimpleMusicPlayer for everything as we only need one BGM track to play at a time, but I don't see anything in the controls or documentation about adjusting for audio latency so I'm hoping you could shed some light or offer us a suggestion for shrinking that gap! Thank you once again for your time and for continually working to improve Koreographer!
     
  47. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    Unfortunately there is not. Depending upon your point of view, this is a limitation of:
    1. The core Unity Update Loop - We can't precisely nail the exact timing of the event because we can't interrupt the core engine's processing. Koreographer's update happens when Unity decides it will happen. The "...WithTime" callbacks are designed to provide you with the timing information necessary to offset visuals such that they align as though the callback had actually nailed the timing. In a bullet-hell shooter you would spawn the bullet at the spawn location and then offset it by "speed * offsetTime" to make it appear as though it really had spawned mid-frame.
    2. Time Itself is One-Way - Koreographer was designed to respond correctly if you were to suddenly jump the audio to a new position. A "lookahead" event trigger is incompatible with such a system in a general sense: you might trigger a "future event" that never actually happens because someone/something jumped the timeline (seeked the audio) in such a manner that the audio in question was never actually played. There are ways to solve this in specific scenarios, of course, as specific scenarios can impose constraints like "the audio timeline will never seek/jump".
    We have not provided APIs for the specific cases outlined in #2 above because we don't really have a good sense of what that would look like. Do we need to provide an API to cancel events that never happened? Do we trigger all events from the "seek target time" to the "seek target time + future offset" when the seek is detected/triggered? This ambiguity has left us without a good direction on what we should do to support such a feature.
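As an illustrative sketch of the offset trick described in point 1: the callback shape follows the pattern of Koreographer's "...WithTime" registrations, but treat the exact parameter list, and the bulletPrefab, spawnPoint, bulletSpeed, and sampleRate fields, as assumptions made for this example rather than part of the official API.

```csharp
// Hypothetical handler for a "...WithTime" event registration. By the time this
// runs, the event's true start sample may already be in the past; we measure the
// overshoot and nudge the spawned bullet forward as though it had spawned on time.
void OnEnemyFire(KoreographyEvent evt, int sampleTime, int sampleDelta, DeltaSlice deltaSlice)
{
    // Seconds elapsed since the event's actual start
    // (sampleRate is assumed to be the music clip's sample frequency).
    double offsetTime = (double)(sampleTime - evt.StartSample) / sampleRate;

    GameObject bullet = Instantiate(bulletPrefab, spawnPoint.position, spawnPoint.rotation);

    // "speed * offsetTime": slide the bullet along its travel direction to hide
    // the fact that the callback landed mid-frame rather than exactly on the beat.
    bullet.transform.position += bullet.transform.forward * (float)(bulletSpeed * offsetTime);
}
```

This hides the visual latency, but the audible latency remains; the scheduling approach below is what closes that gap.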

    That all said, we do have a specific suggestion for your example:
    The way to resolve the timing inconsistency between the music and the "percussive enemy" is to schedule the gunfire sound effects with the audio system so that it handles the timing. Basically, you tell Unity's audio system exactly when each effect should start to play. In this world, you would take a look at the KoreographyTrack in question yourself and compare its list of events against the current audio time sample and the "next event's StartSample". As for the size of the window you want to use to ensure your audio is properly scheduled, we recommend at least the length of the Audio subsystem's buffer. Specifically:
    1. Grab the list of KoreographyEvents in the KoreographyTrack that drives your "percussive enemy shoot" logic using the KoreographyTrack.GetAllEvents method.
      • Alternatively use the KoreographyTrack.GetEventsInRange method to avoid allocating a new List instance.
    2. Determine the minimum amount of time required to "schedule" a sound effect. As above, we suggest using the length of the Audio subsystem's buffer. To calculate this, use the AudioSettings.GetDSPBufferSize API and multiply the resulting numBuffers parameter by the bufferLength parameter. This will give you the "length in samples" of the audio thread's ring buffer. Multiply this number by 2 or 3, or add some standard number of samples to it (we suggest at least one or two frames' worth of samples at your target framerate). This last addition will "open up" your scheduling window such that you will catch events that need scheduling before they enter the "it might be too late to schedule them" window.
    3. Retrieve the current time of the music using either the AudioSource.timeSamples property on the AudioSource driving the music or a method in the Koreographer API (e.g. Koreographer.Instance.GetMusicSampleTime).
    4. Add the value you retrieved in step #3 to the value you calculated in step #2. This is the timing that you want to check events against.
    5. Compare the value you calculated in #4 with the current event in the list of events you retrieved in #1.
      • The current event is easy to manage: you simply keep an index pointing at the "next event to trigger". That event's StartSample is checked against the time you calculate in #4 above.
    6. For each KoreographyEvent with a StartSample less than the sample position you calculated in #4, schedule it using the following [pseudo]code:
      double playTime = AudioSettings.dspTime + (double)(scheduleEvent.StartSample - currentMusicSampleTime) / musicSampleRate;
      enemyAudioCom.PlayScheduled(playTime);
      • Note that the sample offset must be converted to seconds (here by dividing by the music clip's sample rate, musicSampleRate) because AudioSettings.dspTime is measured in seconds.
      • Note that step #6 says "for each". You may want to run the check from step #5 in a while-loop (or equivalent): keep scheduling while the current "event to check" has a StartSample less than the target time, and stop once it does not (or, you know, the list is empty).
    A system like this should allow you to properly schedule the sound effects such that they land precisely every single time.
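For reference, the steps above might be pulled together into a sketch like the following. This is a non-authoritative example: the class and field names are illustrative, the Koreographer types and methods are used as named in this thread (KoreographyTrack.GetAllEvents, KoreographyEvent.StartSample), and it assumes the music plays from a single AudioSource.

```csharp
using System.Collections.Generic;
using UnityEngine;
using SonicBloom.Koreo; // assumed Koreographer runtime namespace

public class ScheduledGunfire : MonoBehaviour
{
    public KoreographyTrack gunfireTrack; // track driving the "percussive enemy shoot" events
    public AudioSource musicSource;       // AudioSource playing the BGM
    public AudioSource enemyAudioCom;     // AudioSource with the gunfire clip assigned

    List<KoreographyEvent> events;
    int nextEventIdx = 0;
    int scheduleWindowSamples;

    void Start()
    {
        // Step 1: grab all events once up front.
        events = gunfireTrack.GetAllEvents();

        // Step 2: scheduling window = DSP ring buffer length, doubled, plus
        // roughly two frames of padding at a 60fps target.
        AudioSettings.GetDSPBufferSize(out int bufferLength, out int numBuffers);
        int framePadding = (int)(AudioSettings.outputSampleRate * (2f / 60f));
        scheduleWindowSamples = (bufferLength * numBuffers * 2) + framePadding;
    }

    void Update()
    {
        // Step 3: current music time, in samples.
        int nowSamples = musicSource.timeSamples;

        // Step 4: the horizon against which events are checked.
        int horizon = nowSamples + scheduleWindowSamples;

        // Steps 5 & 6: schedule every event that falls inside the window.
        while (nextEventIdx < events.Count &&
               events[nextEventIdx].StartSample < horizon)
        {
            KoreographyEvent evt = events[nextEventIdx];

            // Convert the sample offset into seconds on the DSP clock.
            double secondsUntilEvent =
                (double)(evt.StartSample - nowSamples) / musicSource.clip.frequency;
            enemyAudioCom.PlayScheduled(AudioSettings.dspTime + secondsUntilEvent);

            ++nextEventIdx;
        }
    }
}
```

One design caveat: a single AudioSource can only have one pending scheduled play at a time, so if multiple events can land inside one scheduling window you would want a small pool of gunfire AudioSources and round-robin the PlayScheduled calls across them.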
    Hopefully the above will get you what you need! Please let us know if you run into any issues or have any questions about this approach!
    Happy to help! :D
     
    Last edited: May 4, 2020
    rrahim likes this.
  48. Fearless_Garrett

    Fearless_Garrett

    Joined:
    Sep 25, 2015
    Posts:
    9
    Thanks for the detailed response! That's actually kind of similar to what we've been doing with our track-advancing system, but will need to be a little bit more sophisticated due to the extra logic. Luckily the bulk of the logic should be okay to be a few ms off and we can just shift the audio-related code into a predictive/scheduling system like you suggested.
     
    SonicBloomEric likes this.
  49. Fearless_Garrett

    Fearless_Garrett

    Joined:
    Sep 25, 2015
    Posts:
    9
    Following up on what we previously discussed: it's working beautifully. Thanks again!
     
    SonicBloomEric likes this.
  50. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    735
    Happy to help! Thanks very much for reporting the results! :D
     