Koreographer - Audio Driven Events, Animation, and Gameplay

Discussion in 'Assets and Asset Store' started by SonicBloomEric, Sep 15, 2015.

  1. wechat_os_Qy04YAOMsJLHrYZL1EzXnRpek

    wechat_os_Qy04YAOMsJLHrYZL1EzXnRpek

    Joined:
    May 26, 2020
    Posts:
    2
    Are there any other reasons for Unity to tell me that "The type or namespace name 'KoreographyTrackBase' could not be found" besides missing a using directive? I am clearly not missing any using directives, but for some reason it's just impossible to use KoreographyTrackBase in my code.
    This might seem dumb but I just can't get over it...
     
  2. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    Can you share the using directives that you have in the script that is producing that error?
     
  3. wechat_os_Qy04YAOMsJLHrYZL1EzXnRpek

    wechat_os_Qy04YAOMsJLHrYZL1EzXnRpek

    Joined:
    May 26, 2020
    Posts:
    2
    Just plainly:
    Code (CSharp):
    1. using System.Collections;
    2. using System.Collections.Generic;
    3. using UnityEngine;
    4. using SonicBloom.Koreo;
     
  4. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    Hmm... well that definitely looks correct!

    A few more followup questions:
    1. When you double click the error in the console, what line of script does it take you to? Can you post that?
    2. Where is the script that you are writing located? Is it within the scope of an Assembly Definition file?
    This is strange... but we should be able to track down the issue for sure!
     
  5. NemesisWarlock

    NemesisWarlock

    Joined:
    Jan 21, 2017
    Posts:
    141
    So, I'm trying to figure out how to have notes spawn at an arbitrary Vector3 location, then move towards the player/target position.

    Now, I'm pretty sure I understand how UpdatePosition() works:

    1: samplesPerUnit is set to the sample rate of the audio divided by the note's "speed" (which, by default, is one unit per second).
    2: A Vector3 pos is declared and set to laneController.TargetPosition.
    3: The z component of pos is reduced by (gameController.DelayedSampleTime - trackedEvent.StartSample) / samplesPerUnit.
    4: The note's transform.position is finally set to the now-modified pos.

    Okay, so, assuming I have set up GameObjects in the world and passed references so that I have added Vector3 spawnPos = laneController.SpawnPosition, how would one have the object start at that position, then end at targetPosition at the appropriate event trigger? Is a Vector3.Lerp in order here, or something different?
     
  6. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    Your understanding is spot on. Very nice!

    Your intuition on how to implement the system with a specific SpawnPosition is also generally spot-on. What you would need to do is be sure to adjust the spawning logic in the LaneController. It currently assumes a 100% vertical approach where the viewport top is the spawn position. It therefore simplifies the logic to only bother with the Vector's y component.

    The key is in the GetSpawnSampleOffset() function. It uses the current speed to determine when in the audio stream a note will have to be spawned. It presently calculates the distance between the spawn position and target position using only the y-location. To address this you'd need to use the distance between the Spawn position Vector3 and the Target position Vector3. Adjusting this will ensure that the NoteObjects are only spawned once they're at-or-beyond the spawn position.
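    To make that concrete, here's a rough sketch of what a distance-based GetSpawnSampleOffset() might look like (the field names are placeholders for however the demo exposes those values, not verbatim demo code):
    Code (CSharp):
    // Rough sketch of a distance-based spawn offset. SpawnPosition, TargetPosition, sampleRate,
    // and noteSpeed are placeholders for however the LaneController/game controller exposes them.
    public int GetSpawnSampleOffset()
    {
        // Full 3D distance between the Spawn and Target positions (instead of y-only).
        float spawnDistToTarget = Vector3.Distance(SpawnPosition, TargetPosition);

        // Units-to-travel converted into samples of lead time (samplesPerUnit = sampleRate / noteSpeed).
        return (int)(spawnDistToTarget * (sampleRate / noteSpeed));
    }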

    At that point you should be able to simply use the Vector3.Lerp function to change the position of the NoteObject instances themselves in the UpdatePosition function. Please note that if you want the objects to move beyond the target location at all you should use the Vector3.LerpUnclamped function as the standard Lerp version will clamp the t-parameter to a range of [0,1] which would halt the position precisely at the Target position.
     
  7. NemesisWarlock

    NemesisWarlock

    Joined:
    Jan 21, 2017
    Posts:
    141
    Okay, There's some progress:

    I've set float spawnDistToTarget = Vector3.Distance(SpawnPosition, TargetPosition); in the LaneController.

    UpdatePosition() has been changed to:


    Code (CSharp):
    1.             Vector3 pos;
    2.             float distanceToTarget = (gameController.DelayedSampleTime - trackedEvent.StartSample) / samplesPerUnit;
    3.          
    4.             pos = Vector3.Lerp(laneController.SpawnPosition, laneController.TargetPosition, distanceToTarget);
    5.             transform.position = pos;
    Now, the note objects are spawning at their spawn points, but they are stopping short of the Targets:


    (The location of the "One Shot Audio" GameObject is where the notes are stopping. This should be aligned with the target.)

    Changing from Lerp to LerpUnclamped results in this:



    This is even stranger behaviour, as it's overshooting on both ends!

    I have a feeling this is because I'm using distanceToTarget wrong, or I'm calculating it incorrectly. A Lerp needs a 0-1 value for its time parameter, and mine is reporting anywhere from -0.2xxx up to values around the 3-5 range.

    I feel I'm tantalizingly close... What am I missing here? :)
     
  8. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    The problem is that you're sending the exact distance into the Lerp functions rather than a percentage. To address this, you would need to divide your distanceToTarget value by the total distance between Spawn and Target. The resulting value is what you send into the Lerp functions.
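    For example, here's a sketch using the names from your snippet (note that it uses the corrected distanceToTarget calculation shown in the block below, i.e. StartSample minus DelayedSampleTime, so the value is the positive distance still left to travel):
    Code (CSharp):
    // How far (in units) the note still has to travel before it reaches the target.
    float distanceToTarget = (trackedEvent.StartSample - gameController.DelayedSampleTime) / samplesPerUnit;
    // Convert that into a percentage of the full Spawn-to-Target journey.
    float t = distanceToTarget / Vector3.Distance(laneController.SpawnPosition, laneController.TargetPosition);
    // t is 1 at the spawn point and 0 at the hit, so Lerp from Target back toward Spawn.
    transform.position = Vector3.LerpUnclamped(laneController.TargetPosition, laneController.SpawnPosition, t);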

    That said, another, perhaps more clean approach would be to do the following:
    Code (CSharp):
    1. // Calculate the "magnitude" of the distance from the Target (hit) position in units.
    2. float distanceToTarget = (trackedEvent.StartSample - gameController.DelayedSampleTime) / samplesPerUnit;
    3. // Calculate the _normalized_ Vector describing the direction from the Target to the Spawn position.
    4. //  NOTE: This is an excellent candidate for caching IF your Target and Spawn positions are static!
    5. Vector3 dir = Vector3.Normalize(laneController.SpawnPosition - laneController.TargetPosition);
    6. // Set the position as "Target position offset by the amount left to travel in the direction of the
    7. //  Spawn position".
    8. transform.position = laneController.TargetPosition + (dir * distanceToTarget);
    That is effectively what the Lerp approach does... it's just more explicit and (possibly?) easier to follow.
     
    Last edited: Jun 5, 2020
  9. NemesisWarlock

    NemesisWarlock

    Joined:
    Jan 21, 2017
    Posts:
    141
    The Vector3.Normalize argument needed to be swapped to laneController.TargetPosition - laneController.SpawnPosition, but it now works perfectly. I can arbitrarily place both the Target *and* Spawn position at runtime, which should lead to some fun possibility space in VR! :)

    Thanks so much for your help, Vector math has always been my biggest knowledge gap!
     
    SonicBloomEric likes this.
  10. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    Hah! Actually, that logic was fine. What actually needed fixing was the calculation of the distanceToTarget value.

    The original logic likely made use of the fact that subtracting the future event time from the current play time would result in a negative value. The actual "time left" logic should be reversed. I'll update my previous post with this adjustment as it certainly clarifies the logic better!
     
  11. NemesisWarlock

    NemesisWarlock

    Joined:
    Jan 21, 2017
    Posts:
    141
    I see! Thanks again!

    Now the next step is to work out a pitch/note speed lerp when the player fails a stage...
     
    SonicBloomEric likes this.
  12. SuperMiro

    SuperMiro

    Joined:
    Nov 23, 2018
    Posts:
    54
    Is it compatible with Unity 2019.4?
    Also, is there any plan for a future sale?
     
  13. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    Yes!

    We do not currently have any plans to put Koreographer on sale, but Koreographer (/Pro) is a frequent participant in Unity's seasonal/mega/etc. sales.
     
    SuperMiro likes this.
  14. NemesisWarlock

    NemesisWarlock

    Joined:
    Jan 21, 2017
    Posts:
    141
    We've run into a pretty messy error and we frankly have no clue how we fell into it.

    Beginning to Process Koreography with a list of Koreography that should have been processed. Please check that you are not calling ProcessKoreography during another ProcessKoreography pass.  Alternatively, please verify that an Exception did not occur during the previous attempt to Process Koreography.



    We're not sure what's going on with this one... little help? :)
    We also seem to be getting NullReferenceExceptions in IsNoteHittable() when it checks trackedEvent.StartSample, but only on *some* notes? It's very strange.
     
  15. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    Happy to help! A few questions to help us track down what's going on:
    1. Have you created your own Music Player or Visor class?
    2. Are you manually calling the Koreographer.ProcessKoreography() method anywhere in your scripts?
    3. Do you see any Errors in your console when this happens?
    4. Can you provide us with a screenshot of your console when you see this warning?
    In the vast majority of cases, this warning appears when an Uncaught Exception is thrown in a Koreography callback handler (e.g. a NullReferenceException). The exception will cause processing of Koreography to end prematurely, leaving the internal list of "Koreography to process" with entries. Koreographer warns you about this to help you understand why certain things you expect to occur may not be.

    This likely indicates that the NoteObject instance is still referenced by something after the NoteObject.Reset() method has been called. If you're seeing this in a [mostly?] vanilla install of Koreographer's Rhythm Game Demo then the likely culprit would be somewhere in the LaneController.CheckNoteHit method. If the note is hit, it should be dequeued from the trackedNotes Queue and then have the NoteObject's OnHit method called. By default, that method simply Resets the object.

    One thing that could be happening is that the NoteObject is despawning itself before the LaneController has a chance to dequeue it. This could happen if you customized the demo to change how despawning works or the logic surrounding "hittability" of a NoteObject. I might suggest that you modify the system as follows to see if it helps:
    1. Add the following method to the NoteObject class:
      Code (CSharp):
      1. public bool IsNoteValid()
      2. {
      3.     return trackedEvent != null;
      4. }
    2. Add the following block to the top of the LaneController.CheckNoteHit method:
      Code (CSharp):
      1. while (trackedNotes.Count > 0 && !trackedNotes.Peek().IsNoteValid())
      2. {
      3.     trackedNotes.Dequeue();
      4. }
    3. Adjust the while-loop condition in LaneController.Update to the following:
      Code (CSharp):
      1. while (trackedNotes.Count > 0 && (!trackedNotes.Peek().IsNoteValid() || trackedNotes.Peek().IsNoteMissed()))
    Note the presence of the logical not (!) before the call to IsNoteValid in steps 2 and 3 above. That is very important.

    In theory, this should help alleviate the null reference errors you're seeing with the system. That said, the above is a band-aid. It may be better long-term to look over how despawning occurs in your game and possibly make adjustments there.

    Hope this helps!
     
  16. NemesisWarlock

    NemesisWarlock

    Joined:
    Jan 21, 2017
    Posts:
    141
    Slight progress:

    Going over the XR rig, we found a duplicate of our weapon collision prefab that was causing the IsNoteHittable() errors. Somehow a copy of the prefab got attached as a child of the same prefab. That was... weird, but easily fixable. :)

    As for the other issue, yes, it's a NullReferenceException occurring in Update. Here's the full stack trace:


    Code (Boo):
    1. NullReferenceException: Object reference not set to an instance of an object
    2. SonicBloom.Koreo.Koreography.UpdateTrackTime (System.Int32 startTime, System.Int32 endTime, SonicBloom.Koreo.DeltaSlice deltaSlice) (at <691629bd1df1456c83fab6d345f23d76>:0)
    3. SonicBloom.Koreo.Koreographer.ProcessKoreography (System.String clipName, System.Int32 startTime, System.Int32 endTime, SonicBloom.Koreo.DeltaSlice deltaSlice) (at <691629bd1df1456c83fab6d345f23d76>:0)
    4. SonicBloom.Koreo.Players.VisorBase.Update () (at <691629bd1df1456c83fab6d345f23d76>:0)
    5. SonicBloom.Koreo.Players.AudioVisor.Update () (at <691629bd1df1456c83fab6d345f23d76>:0)
    6. SonicBloom.Koreo.Players.SimpleMusicPlayer.Update () (at <691629bd1df1456c83fab6d345f23d76>:0)
    For some reason it seems to only be occurring on the first level, so I'm doing another pass over the references to make sure I didn't miss anything.
     
  17. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    Hurray! Great find! Weird... but good work identifying it!

    So the only thing that could happen in the location indicated by the callstack is that you have a null KoreographyTrack registered on that Koreography. Can you select the Koreography asset in question and take a look at the Inspector? If you see an empty item in the "M Tracks" array, right-click on it and choose "Delete Array Element". If that's truly the cause, the error should go away once the empty array entry is removed.

    This might happen if, say, a KoreographyTrack asset was deleted from the project before being removed from the list.
     
    Last edited: Jul 31, 2020
  18. NemesisWarlock

    NemesisWarlock

    Joined:
    Jan 21, 2017
    Posts:
    141


    Bingo!

    I don't think we'd ever have figured that out without that tip, thanks a bunch, Eric! :)
     
    SonicBloomEric likes this.
  19. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    Fantastic! Very glad to hear that the issue has been resolved!!
     
  20. tieum67

    tieum67

    Joined:
    Mar 7, 2017
    Posts:
    61
    Hello Eric, I've been diving into your asset for a few days now, and it looks very promising. Thanks for this work! I do have a question about the manual, page 34: how can I check whether a Span event has reached its end? Is there an event that is fired at that moment? Otherwise, what would be the path to follow to get this notification?
     
  21. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    A good question! Please see the "How do I tell when a Span event starts or stops" FAQ entry for an explanation on how to do exactly this!

    In short, there is no specific event that fires when this occurs. Koreographer's callbacks provide you with all the context you need to determine when a Span starts, continues, and stops. Code to do this specifically can be found in the link above!
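    For reference, the gist of the pattern looks something like this (a rough sketch; the time-aware callback signature here is approximate, so defer to the FAQ entry and documentation for the exact form):
    Code (CSharp):
    // Sketch: a callback registered with RegisterForEventsWithTime fires on every frame
    // during which the Span overlaps the processed sample range.
    void OnSpanEvent(KoreographyEvent evt, int sampleTime, int sampleDelta, DeltaSlice deltaSlice)
    {
        if (evt.StartSample >= sampleTime - sampleDelta)
        {
            // First frame the Span is seen: the Span just started.
        }

        if (evt.EndSample <= sampleTime)
        {
            // Final frame the Span is seen: the Span just ended.
        }
    }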

    Hope this helps!
     
  22. tieum67

    tieum67

    Joined:
    Mar 7, 2017
    Posts:
    61
    Thanks for your help! It's very useful, and as I didn't know about this website, I'll dig into it.
    Some questions about the workflow. My goal is to make variations of the level with events linked to the music. It's an endless tunnel.
    - Is there some kind of debugging during Play Mode, to watch the audio scroll in real time and check the events?

    - Let's say I have two methods that use a float as a payload. I guess I have to put them on different tracks to avoid them being fired when they shouldn't be? (For example: SetSpeed and SetHealth both use a float but shouldn't be fired at the same time.) So, when I tweak the events, I have to switch from one track to the other. I guess there is no possibility to watch all the tracks together?

    Have a nice day!
     
    SonicBloomEric likes this.
  23. tieum67

    tieum67

    Joined:
    Mar 7, 2017
    Posts:
    61
    Thinking twice about it, I was wondering if a Custom Payload is the easiest way to go, so that I can fill the same track with different events?
     
  24. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    We have toyed with a way of rendering the waveform and Koreography data as an overlay in Play Mode, but doing so in a general way would be very tricky.

    That said, your question got us thinking. Could we perhaps have the Koreography Editor window monitor playback of the Koreography when in Play Mode and use its positional updates to drive the Koreography Editor window's position?

    It turns out that Yes! We can do that! We were able to quickly add that feature to our development version of Koreographer. We'd love to have you play with it to provide us with feedback (e.g. "It works!" or "I ran into an issue!"). If you're interested in giving this a shot, please reach out to us at our support email and include a link to this post and your purchase Invoice Number. Once we've verified your purchase, we'll send you a prerelease version that includes the feature!

    That is, unfortunately, correct. This is due to the underlying architecture of the Koreography Editor GUI. Supporting this is definitely on our feature request backlog, though we do not have an ETA for when it might arrive.

    That is correct. The idea here would be that you have a "Speed" track and a "Health" track. Alternatively you could use the "Text" payload type and create your own encoding that you parse at runtime, but that's a bit of a hacky workaround for Custom Payloads. And on that note...

    Yes indeed! If you find that you would like to set multiple things at once, a Custom Payload type would be just the thing. (This of course requires that you have the Professional Edition...) Please bear in mind that you will also need to provide your own GUI for this. An example of a Custom Payload GUI that supports multiple data fields can be found with the MIDIPayload type included with the demo.

    Speaking of MIDI, if you're comfortable working in a Digital Audio Workstation (DAW), you could use the DAW as a "multi-track editor" against your audio file. The idea would be that you export the MIDI file from the DAW and then import the data using the [Professional Edition's] MIDI Converter to create your Koreography Tracks. If you're not already familiar with DAWs, though, this might include a substantial bit of "learning"...

    Hope this helps!
     
    tieum67 likes this.
  25. tieum67

    tieum67

    Joined:
    Mar 7, 2017
    Posts:
    61
    Thanks for your feedback and the help! Actually, I own the Pro version of the asset, so I'll have a deeper look at the Custom Payload possibilities. I would be happy to try the "debugging mode", so I'll send an email.
    And about the MIDI... wow, you speak to my heart! Actually, I'm more a storyteller/musician than a dev, so if I can use Ableton from time to time... I would enjoy it.
    I'll let you know how things go. Thanks for this asset!
     
    SonicBloomEric likes this.
  26. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    Glad to hear it! This video shows an overview of using the MIDI Converter. We'll keep an eye out for that email and will get you a build ASAP!
     
    tieum67 likes this.
  27. tieum67

    tieum67

    Joined:
    Mar 7, 2017
    Posts:
    61
    Hello, I tried to build my own Custom Payload, which should contain two floats, using the scripts in the demo as a guide. I don't get an error in the console, but it's not working: I can create a custom track, but my custom type doesn't appear in the list (it shows the "normal" list of payloads).
    Could you please have a look at my scripts?
     

    Attached Files:

  28. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    The spelling and capitalization of the custom Koreography Track type's fields is extremely important. Where you currently have written:
    Code (CSharp):
    1. protected List<TwoFloatsCustomPayload> _twoFloatCustomPayloads;
    2.  
    3. protected List<int> _twoFloatsPayloadIdxs;
    Please rename them to:
    Code (CSharp):
    1. protected List<TwoFloatsCustomPayload> _TwoFloatsCustomPayloads;
    2.  
    3. protected List<int> _TwoFloatsCustomPayloadIdxs;
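    For context, here's a rough sketch of how those fields typically sit in the track class (assuming your payload type is TwoFloatsCustomPayload and your track class is TwoFloatsCustomTrack; attributes and other members may differ from the demo scripts):
    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using SonicBloom.Koreo;

    public class TwoFloatsCustomTrack : KoreographyTrackBase
    {
        // The serialization system looks these lists up by name, so they must match the payload type name.
        [SerializeField]
        protected List<TwoFloatsCustomPayload> _TwoFloatsCustomPayloads;

        [SerializeField]
        protected List<int> _TwoFloatsCustomPayloadIdxs;
    }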
    Let us know if that helps!
     
  29. tieum67

    tieum67

    Joined:
    Mar 7, 2017
    Posts:
    61
    Thanks! It works... almost. I can select my custom payload and fill the boxes with values.
    But when the game launches, I get this error, the boxes lose their values, and the payload becomes unusable:

    Serialization Error: No 'List<System.Int32> _Int32s' defined in TwoFloatsCustomTrack class!
    UnityEngine.Debug:LogError(Object)
    SonicBloom.Koreo.KoreographyTrackBase:GetFieldInfoOfListWithType(Type, Type, String)
    SonicBloom.Koreo.KoreographyTrackBase:OnBeforeSerialize()
     

    Attached Files:

  30. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    It appears that the name of the List<int> version isn't quite correct. Try changing the field name from this:
    Code (CSharp):
    1. protected List<int> _TwoFloatsPayloadIdxs;
    to this:
    Code (CSharp):
    1. protected List<int> _TwoFloatsCustomPayloadIdxs;
    Note that you're missing the word "Custom" in there, which is part of the type name :)
     
  31. tieum67

    tieum67

    Joined:
    Mar 7, 2017
    Posts:
    61
    Indeed, it looks like the IDE and autocompletion made me lazy about checking the typos and the names... Works fine. Thanks again!
     
    SonicBloomEric likes this.
  32. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    A message from the Koreographer team:

    Hey folks! We're hard at work building new features for Koreographer. In support of that, we're looking for a few things from you if you have the time:
    1. Do you have any suggestions or features you would like us to consider for Koreographer?
    2. Have you made a game that you would like us to promote? (We would like to showcase the many cool projects we've seen you build over the years!)
    3. Are there any tutorials you would like us to create to help you better understand how to use Koreographer?
    Keep your eyes peeled for a new Koreographer build coming soon. In addition, we'll be creating more tutorials to help you build the music and rhythm games of your dreams.

    Thanks!
    The Koreographer Team
     
  33. Eggpunk

    Eggpunk

    Joined:
    Nov 2, 2014
    Posts:
    39
    Hi @SonicBloomEric and Koreographer team!

    3. I would say tutorial(s) would help me, and maybe others, better get to point 2. It has been a little while since I tried jumping into Koreographer, so I don't have more detailed suggestions for tutorials right now, but I can try to get back into it and see what trouble items/obstacles come up.
     
    SonicBloomEric likes this.
  34. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    That would be amazing! Please do keep us posted! :D
     
  35. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    Last edited: Nov 24, 2020
  36. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    Hey Everyone! We just released version 1.6.1 of Koreographer and Koreographer Professional Edition to the Unity Asset Store! Here's a rundown of changes:
    • [NEW] Play Mode Monitoring! When this mode is enabled, the Koreography Editor's playhead will track the playback of the Koreography in the Unity Editor's Play Mode!
    • [NEW] Holding the Shift key while creating Koreography Events in the Koreography Editor will ensure that only a single Koreography Event is created for the interaction.
    • [NEW] [ADVANCED] Support customizing some aspects of the core timing estimation system. This should only be necessary in extreme circumstances (e.g. extreme frame rates).
    • TextPayload Peek UI (shown when a OneOff is selected/hovered) is now somewhat flexible and can show more inline than it did previously. [PRO] Custom Payloads can use this feature and will need to implement the IPayload.GetDisplayWidth() API.
    • [PRO] Improve converting spec-non-conformant NoteOn events in MIDI files.
    • [PRO] Fix Koreography not properly saving newly added KoreographyTrack assets in the MIDI Converter.
    • [PRO] Fix integration compatibility with Wwise 2019.2.0+.
    • [PRO] Fix the Master Audio integration's MasterAudioSuperVisor to locate AudioSource components on inactive GameObjects.
    • Fix core timing estimation in situations with high frame rates.
    • Fix time estimation in scenarios with a Time.timeScale of 0.
    • Fix several C# compiler warnings in integration/demo scripts in recent versions of Unity.
    • Support email addresses updated in Koreographer's Help window.
    • Adjust some spacing in UI elements to support recent Unity Editor font changes.
    If you purchased a previous version of Koreographer and Koreographer Professional Edition, then v1.6.1 is a free upgrade! :D
     
    Last edited: Nov 15, 2022
  37. TrashCanHero

    TrashCanHero

    Joined:
    Jul 8, 2018
    Posts:
    4
    I was thinking about getting Koreographer and I was wondering if it was possible to create an in-game editor so users can create their own maps for a rhythm game?
     
  38. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    This is definitely possible - some Koreographer users have done this very thing. Creating and manipulating Koreography data at runtime is fully supported. The work of actually building a runtime Editor Interface (the UI) is up to you so you can make it look/operate in whatever way suits your game best!
     
  39. TrashCanHero

    TrashCanHero

    Joined:
    Jul 8, 2018
    Posts:
    4
    Okay great! That's all I needed to know, thanks!
     
    SonicBloomEric likes this.
  40. lorewap3

    lorewap3

    Joined:
    Jun 24, 2020
    Posts:
    58
    Hey guys! I'm new to using Koreographer so I apologize if this has been asked before. Is there a built-in mechanism in Koreographer to trigger events on the beat increments, given only a bpm?

    I'm planning to use MIDI heavily for particle triggering, but I also have looping animations that I would like to time to chosen beat increments. For example, I have a 128 bpm song. I want the animation to loop on every 1/4 note. I can easily do the math to figure out how long those animations need to be, but without an event trigger from Koreographer it won't be in sync. Or it will start in sync and diverge over time. I want a 1/4 event trigger to force the animation to restart so it's always in time.

    So I could create a MIDI file and just add the beats manually and use that, but that seems tedious to do for every different bpm. And perhaps I loop on 1/8 notes or 1/2 notes or every bar. Can Koreographer do this automatically? In essence, a way to create a Koreography Track using just the audio file and setting the bpm. From there I should be able to click "Create Beat Track" to create a track that would trigger an event on every beat. That payload would include whether it was beat 1/2/3/4 of 4.

    Thanks!
     
  41. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    There isn't a built-in mechanism for triggering events on the beat by default, but you can easily create this yourself (more on this below!). In addition, Koreographer's Music Time API (available if you use a Music Player like the Simple or Multi Music Player) provides information about the current time in beats, and there are multiple ways to use that to determine when a beat has occurred.
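    For example, here's a quick sketch of detecting beat boundaries with the Music Time API (the component name is made up for illustration; this assumes a Music Player is driving Koreographer):
    Code (CSharp):
    using UnityEngine;
    using SonicBloom.Koreo;

    // Hypothetical helper: reacts whenever a new beat begins.
    public class BeatTicker : MonoBehaviour
    {
        double lastBeatTime;

        void Update()
        {
            // Current music time in (fractional) beats.
            double beatTime = Koreographer.Instance.GetMusicBeatTime(null, 1);

            // A new beat began whenever the whole-beat count changes.
            if ((int)beatTime != (int)lastBeatTime)
            {
                // React to the beat here (restart an animation, emit particles, etc.).
            }

            lastBeatTime = beatTime;
        }
    }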

    You actually have several options with Koreographer here. See:
    1. Trigger animations to start based on a beat event.
    2. Drive the looping animations directly by replacing the standard music clock with a beat clock.
    The first option above is what you've described here:
    While you can absolutely do this with MIDI as you've suggested, a simpler way would be to use the Koreography Editor directly. Here's what you would do (assuming your basic Koreography is set up with the right audio and tempo settings):
    1. Create a Koreography Track for each beat division you want. You might label them something like "beat_4" and "beat_8" just to be extra clear.
    2. For a "beat_4" track, you would set the Koreography Editor to "Draw" mode (click the "Draw" button above the waveform) and then click at one end of the waveform and drag to the right. With default settings, this will leave a OneOff event at every quarter note with no payload. (This assumes you've left the "Snap to Beat" option checked.)
    3. For a "beat_8" track, you would do the same as in step 2 except right before drawing, you would set the "Divide beat by" field (below the waveform) to 2 instead of the default 1. That will subdivide the beats giving you 8th notes instead of quarter notes.
    You can then register for those tracks by listening to the "beat_4" events or the "beat_8" events depending on your target/setup/whatever!


    The second option, however, is a bit more slick as it will guarantee a perfect loop each time. This approach requires that you do the following:
    1. Ensure that your looping animation is built to last exactly one (1) second of time! We will effectively be reinterpreting that "second" as a "beat" below. In other words, your one-second looping animation is the same as saying that the animation loops at a precise 60bpm.
    2. We will write a script that stops Unity from updating the Animator component with solar time (seconds) and instead updates it with music time (beats). The following MusicalAnimator.cs script was adapted from this post over on Koreographer's forums:
      Code (CSharp):
      1. using UnityEngine;
      2. using SonicBloom.Koreo;
      3.  
      4. public class MusicalAnimator : MonoBehaviour
      5. {
      6.     public Animator animCom;
      7.     public int subdivisions = 1;
      8.  
      9.     float lastBeatTime = 0f;
      10.  
      11.     void Start()
      12.     {
      13.         // Self-initialize.
      14.         if (animCom == null)
      15.         {
      16.             animCom = GetComponent<Animator>();
      17.         }
      18.  
      19.         if (animCom == null)
      20.         {
      21.             enabled = false;
      22.         }
      23.         else
      24.         {
      25.             animCom.enabled = false;
      26.         }
      27.  
      28.         lastBeatTime = (float)Koreographer.Instance.GetMusicBeatTime(null, subdivisions);
      29.     }
      30.  
      31.     void Update()
      32.     {
      33.         double currentBeatTime = Koreographer.Instance.GetMusicBeatTime(null, subdivisions);
      34.         float delta = (float)currentBeatTime - lastBeatTime;
      35.  
      36.         animCom.Update(delta);
      37.  
      38.         // Accumulate the error. Our subtraction next frame from the current known
      39.         //  position will help correct for this.
      40.         lastBeatTime += delta;
      41.     }
      42. }
    3. Add that component (called Musical Animator) to the GameObject with the Animator you want to control with music time.
    4. Play the music; play the animation.
    Please note that you probably also need to call Animator.Play and specify the normalized time using the normalized beat time (e.g.: Koreographer.Instance.GetMusicBeatTimeNormalized(null, subdivisions)). This will ensure that the animation starts in sync with the music.
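    Something along these lines, for example (a sketch; "Idle" stands in for whatever your looping state is called):
    Code (CSharp):
    // Sketch: start the looping state at the position matching the music's place in the beat.
    float startOffset = (float)Koreographer.Instance.GetMusicBeatTimeNormalized(null, subdivisions);
    animCom.Play("Idle", 0, startOffset);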


    Note: The above doesn't handle half-notes or full bars. There are other ways to handle that type of time-step, but the basics are outlined above. One example would be to draw Span events where needed and then call the KoreographyEvent.GetEventDeltaAtSampleTime API and pass in the current sample time. That will give you a normalized time (effectively a percentage) of the current position through the Span event in question. In order to get the current Sample Time in your event callbacks, remember to register for them using the RegisterForEventsWithTime API. There is an example of how to do this in the CubeScaler.cs script included with Koreographer.
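    A condensed sketch of that Span-progress pattern (again, treat the callback signature as approximate and see CubeScaler.cs for the canonical version):
    Code (CSharp):
    void OnSpanEventWithTime(KoreographyEvent evt, int sampleTime, int sampleDelta, DeltaSlice deltaSlice)
    {
        // 0 at the start of the Span, 1 at its end - a normalized position you can
        // feed into animation, scaling, easing, etc.
        float progress = evt.GetEventDeltaAtSampleTime(sampleTime);
    }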

    Hope this helps!
     
    lorewap3 likes this.
  42. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
  43. lorewap3

    lorewap3

    Joined:
    Jun 24, 2020
    Posts:
    58
    Thank you Eric for the quick reply! This is wonderful I will give it a shot!
     
    SonicBloomEric likes this.
  44. NemesisWarlock

    NemesisWarlock

    Joined:
    Jan 21, 2017
    Posts:
    141
    Okay, so we're having some issues updating to 1.6.1. It seems as though the folder the editor DLLs are installed to has changed, since we're now getting duplicate DLL errors.

    Can you send me the file list so I can manually remove everything for a fresh install of 1.6.1? Thanks :)
     
  45. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    There were some minor changes, I believe, yes. The Package Importer should have been able to overcome folder changes, however, as the DLLs' meta files remained unchanged. The Package Importer replaces assets by GUID first and folder location second. This is surprising as our test updates prior to releasing v1.6.1 showed no issues here.

    May I ask what process you went through for upgrading?

    Check out the "Package Content" tab (between "Description" and "Releases") over on the Asset Store page (Koreographer; Professional Edition). You can see the full list of files included over there!
     
  46. NemesisWarlock

    NemesisWarlock

    Joined:
    Jan 21, 2017
    Posts:
    141
    Thanks, referenced the file list and deleted the offending duplicate. Everything's good, thanks :)
     
    SonicBloomEric likes this.
  47. lorewap3

    lorewap3

    Joined:
    Jun 24, 2020
    Posts:
    58
    Hey again Eric! I've had some time to expand upon the MusicalAnimator code you gave me and I would appreciate a little more guidance on a few things.

    I'm enclosing the code below. In essence I did this:
    1) Added BeatLength enum for lengths from 1/32 notes to 8 bars
    2) BeatLength can be updated in real time
    3) I changed lastBeatTime to be set exactly to the currentBeatTime at the end of the update loop. The live switching of beat lengths broke the delta incrementing.
    4) Added currentContextualBeatTime to handle the larger note increments. It was my way of normalizing (if that's even the right word to use) the beat time based on the beat length desired. So for 1/2 note each Koreographer beat lasts twice as long. A 2 bar length makes each beat last 8 times as long, etc.

    I just feel like this is more complicated than it needs to be. I implemented this in GameMaker and just used the animation's frame index to animate it. I didn't have to worry about making the animations a certain number of frames or a certain length because it just normalized everything. That's how I'd like to animate these guys, but I can't figure out how to do it in Unity.

    But the biggest advantage of doing it completely normalized is that I could use easing functions on them to get more specific with the animations. I could have an animation looping every beat, but using easing functions I could get a huge array of movement within those beats. I could even do elastic or bounce easing. I could have an animation that lasts a bar but bounces into its final frames. I can't do that by just doing animCom.Update(delta) because it's linear in nature.

    So that's why I have that animCom.Play("IDLE", 0, delta) in there, but it's being called every update and I feel pretty strongly that's not how it was intended to be used, lol. But maybe I'm wrong. At any rate, this code works at any beat length, even switching it live, but it can't do easing and still needs animations of length 1 sec. It's also just doing the IDLE state, but I'll add the logic for more states later once I solidify the easing.

    My biggest challenge is figuring out how to play the animations using normalized ranges of 0-1. If I could do that, I could implement the easing functionality. Being new to Unity, I just can't figure out how to do that mixed in with all the deprecated and 3D-specific animation elements.

    Code (CSharp):
    1. using UnityEngine;
    2. using SonicBloom.Koreo;
    3. using Sirenix.OdinInspector;
    4. using DG.Tweening.Core.Easing;
    5. using DG.Tweening;
    6.  
    7. namespace LorewaPlay
    8. {
    9.     public enum BeatLengths
    10.     {
    11.         L1_32,
    12.         L1_16,
    13.         L1_8,
    14.         L1_4,
    15.         L1_2,
    16.         L1,
    17.         L2,
    18.         L4,
    19.         L8
    20.     }
    21.  
    22.  
    23.     public class MusicalAnimator : MonoBehaviour
    24.     {
    25.         // Beat Length Selector
    26.         [LabelWidth(80)]
    27.         [EnumToggleButtons]
    28.         [OnValueChanged("ChangeBeatLength")]
    29.         public BeatLengths beatLength;
    30.  
    31.         // Easing type selector
    32.         [LabelWidth(80)]
    33.         public Ease easing = Ease.Linear;
    34.  
    35.         [LabelWidth(80)]
    36.         public double currentBeatTime;
    37.  
    38.         [LabelWidth(80)]
    39.         public double currentContextualBeatTime;
    40.  
    41.         [LabelWidth(80)]
    42.         public int beatDivider = 1;
    43.  
    44.         [Space]
    45.         [LabelWidth(80)]
    46.         public double lastBeatTime = 0f;
    47.  
    48.         [LabelWidth(80)]
    49.         public float delta;
    50.  
    51.         // Internal
    52.         private Animator animCom;
    53.         private int subdivisions;
    54.  
    55.         void Start()
    56.         {
    57.             // Self-initialize.
    58.             if (animCom == null)
    59.             {
    60.                 animCom = GetComponent<Animator>();
    61.             }
    62.  
    63.             if (animCom == null)
    64.             {
    65.                 enabled = false;
    66.             }
    67.             else
    68.             {
    69.                 animCom.enabled = false;
    70.             }
    71.  
    72.             ChangeBeatLength();
    73.             lastBeatTime = (float)Koreographer.Instance.GetMusicBeatTime(null, subdivisions);
    74.         }
    75.  
    76.         void Update()
    77.         {
    78.             currentBeatTime = Koreographer.Instance.GetMusicBeatTime(null, subdivisions);
    79.  
    80.             // "Contextual" beat time normalizes the beat time based on beat length
    81.             // (eg. Divide beats by 2 for half notes, making animation twice as long
    82.             currentContextualBeatTime = currentBeatTime / beatDivider;
    83.  
    84.             //delta = (float)(currentContextualBeatTime - lastBeatTime);
    85.             delta = (float)(currentContextualBeatTime % 1);
    86.  
    87.             //delta = (EaseManager.Evaluate(easing, null, delta, 1, 1.70158f, 0)).Wrap(0,1);
    88.             animCom.Play("IDLE", 0, delta);
    89.             animCom.Update(delta);
    90.  
    91.             // Set last beat time to current contextual beat time
    92.             lastBeatTime = currentContextualBeatTime;
    93.         }
    94.  
    95.         private void ChangeBeatLength()
    96.         {
    97.             // Set Subdivisions for beat and faster than beat (1/32, 1/16, 1/8) animations
    98.             switch (beatLength)
    99.             {
    100.                 case BeatLengths.L1_32:
    101.                     subdivisions = 8;
    102.                     break;
    103.                 case BeatLengths.L1_16:
    104.                     subdivisions = 4;
    105.                     break;
    106.                 case BeatLengths.L1_8:
    107.                     subdivisions = 2;
    108.                     break;
    109.                 default:
    110.                     subdivisions = 1;
    111.                     break;
    112.             }
    113.  
    114.             // Set beat divider to achieve multi bar animations
    115.             switch (beatLength)
    116.             {
    117.                 case BeatLengths.L1_2:
    118.                     beatDivider = 2;
    119.                     break;
    120.                 case BeatLengths.L1:
    121.                     beatDivider = 4;
    122.                     break;
    123.                 case BeatLengths.L2:
    124.                     beatDivider = 8;
    125.                     break;
    126.                 case BeatLengths.L4:
    127.                     beatDivider = 16;
    128.                     break;
    129.                 case BeatLengths.L8:
    130.                     beatDivider = 32;
    131.                     break;
    132.                 default:
    133.                     beatDivider = 1;
    134.                     break;
    135.             }
    136.  
    137.             // Reset values based on beat length selection
    138.             currentBeatTime = Koreographer.Instance.GetMusicBeatTime(null, subdivisions);
    139.             currentContextualBeatTime = currentBeatTime / beatDivider;
    140.             lastBeatTime = currentContextualBeatTime;
    141.         }
    142.  
    143.     }
    144. }
    Thanks Eric! I really appreciate any insight you may have!
    Will
     
  48. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    I'm honestly not sure. You should profile your code and see if calling Play the way you do here shows up as any kind of "heavy". It might simply be a non-issue.

    Another option you have is to use an AnimatorControllerParameter to control the IDLE state's Motion Time parameter. This would replace the delta time handling, allowing you to leave the Animator running on whatever standard clock it does normally. Note that this also assumes that your IDLE state loops indefinitely until something triggers it to leave that state. (And that you don't want/need music-time processing on other states as well...)

    I'm just putting this option out there as the Motion Time is a very poorly communicated "feature" of the Animator system. (It does not control the actual playhead, but forces the animator to use a "pose" from a specific time. The actual timer driving the state remains untouched.) This approach also sidesteps any extra State Machine Behaviour "OnStateEnter/Exit"-style processing that may be configured when compared with the "repeatedly call Play() approach".
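    In practice, that approach boils down to something like this each frame (a sketch; "MotionTime" is whatever you name the float parameter bound to the state's Motion Time):
    Code (CSharp):
    // Sketch: with the IDLE state's Motion Time bound to a "MotionTime" parameter, the Animator
    // keeps running on its own clock but samples the pose at this normalized position.
    // (Inside Update(), after computing currentContextualBeatTime as in your script.)
    animCom.SetFloat("MotionTime", (float)(currentContextualBeatTime % 1d));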

    This is pretty straightforward to do with the system you've designed. Specifically, your "delta" value is normalized - it represents the "percentage of the way through the current interval, whether that be a beat, an 8th note, or 4 bars". That is what "% 1" does. You can pass that normalized value into your easing function and it will return an eased normalized value, which you can then pass to Play.

    [As a side note, you do not need "lastBeatTime" when calling Play the way that you do.]

    If you wish to maintain the Delta-time version, then you would store your post-easing normalized value instead of the "lastBeatTime" value. When passing the delta to the Update call, you would subtract your previous frame's post-easing normalized value from your current frame's post-easing normalized value. The only trick to watch out for here is what happens when your last frame's position is larger than your current frame's. This indicates that the normalized time has looped. In this case, the value is "currentFramePostEasingNormalizedTime + (1 - lastFramePostEasingNormalizedTime)". A unified way to do this without an if-statement would be something like:
    Code (CSharp):
    1. delta = (float)(((currentEasedTime + 1d) - lastEasedTime) % 1d);
    You would then store your "currentEasedTime" into lastEasedTime in preparation for the next frame.

    Basically, an easing function transforms your 0-1 linear time into a 0-1 non-linear time. This is a Normalized-space to Normalized-space mapping. What you want is the "delta between normalized-and-eased times", which the above gets you.

    I should also point out that if you do use the delta version outlined above, you should move your Play("Idle"...) call into your ChangeBeatLength method. That way, whenever you change the beat length, the playhead position is properly repositioned for playback with the new timing interval.

    Make sense?
     
  49. lorewap3

    lorewap3

    Joined:
    Jun 24, 2020
    Posts:
    58
    Ah! There might be some promise to this. I will look into the Motion Time parameter. But your last sentence halted my excitement a bit. Does doing this method prevent me from beat syncing other states as well? All of the states I'd like beat synced are looped. I don't want to box myself into a corner of only being able to do 1 beat synced state per character. For reference I have a few characters like that:
    1) Crawler - Ground crawler that has idle/walk/jetpack states. The walk and idle are 1/4 beats while jetpack loops at 1/8.
    2) Bird - Small bird with idle/walk/flap wings/glide animations. Idle/Walk/Glide are 1/4 beat loops, but the flap is a non-looping one-shot animation. That would play normally. This does mean that after the flap animation, the glide animation that follows won't be synced right, but I'm OK with that. The glide animation would simply start where it should be if it had been playing the whole time.
    3) Flower - Has an idle/bloom state. Idle plays on 1/2 beats, while the bloom is 2 measures long as it's a much more involved detailed loop.


    You're right. I just didn't think to remove it yet.

    There is a lot to chew on here, lol. I see what you're getting at, and you're right, I believe that will work for all easings that don't double back on themselves. But it wouldn't work for elastic/bounce, where the animation actually reverses during playback, possibly a couple of times before landing on the final frame, or even overshoots the final frame a little.

    What I'm looking for is a way to simply take that normalized beat time, pass it to an easing function, and then pass that value back to the playhead.

    Something like this? (somewhat pseudocode)

    Code (CSharp):
    1. // Gets easing for currentContextualBeatTime (0-1 or percentage complete of beat)
    2. // Applies elasticInOut easing to where the playhead should be based on
    3. //    animationStartTime (0)
    4. //    animationEndTime (currently 1 sec but would love for this to be able to be anything at least up to 10 sec)
    5. currentPlayheadTime = elasticInOutEase(animationStartTime, animationEndTime, currentContextualBeatTime);
    6.  
    7. // Because elastic has overshoot, The currentPlayheadTime could be < 0 or > 1. This wraps around the value so it
    8. //    stays within the valid range of playhead values.
    9. currentPlayheadTime = currentPlayheadTime.Wrap(animationStartTime, animationEndTime);
    10.  
    11. // Set playhead directly to where it should be
    12. animCom.Play("IDLE", 0, currentPlayheadTime );
    13. animCom.Update(delta);
    And up until the last two lines the code is valid. It's that animCom.Update where it falls apart. I tested and without the Update it never actually moves. Even though Play is being called every frame and the time being passed in is correct, without the Update call it never... well updates lol, so it never changes frame. But I don't see how Update can work like this. I don't want to update it by a certain time value, I simply want it to update to show the frame the playhead has been set to.


    Edit: I JUST got it to work! I'm sorry I posted such a long post and seemed to get it working immediately after. I'm still calling Play every frame, but from what I can tell in the profiler it doesn't seem to be having any adverse effects. But I guess time will tell once I have dozens or a hundred characters in the scene.

    Once I re-enabled the animator and removed the Update call it plays as expected. I even tested it with elastic easing, and it works in time!

    Code (CSharp):
    1.            
    2. delta = (EaseManager.Evaluate(easing, null, delta, 1, 1.70158f, 0)).Wrap(0,1);
    3. animCom.Play("IDLE", 0, delta);
    4. //animCom.Update(delta);
    5.  
    When I switch to non-beat sync'd states it should play the normal animation speed. I will probably have a few issues getting the state changes to work right, but I'm much closer! I will still read up on the Motion Time though it might come in handy later.

    Thank you so much, Eric! I can't tell you how appreciative I am of you taking the time to help me with this. You were not only fast to respond but very thorough and patient, and it really means a lot to me. One of the biggest reasons I switched to Unity: the support community :)
     
    Last edited: Dec 1, 2020
    SonicBloomEric likes this.
  50. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    1,090
    No, this would not prevent you from beat syncing other states as well. You would simply need to ensure that all synced states use the same Animator Parameter for their Motion Time. In theory that should be enough (I have not tried this). I have no idea what overlapping animation state transitions would look like with such a setup.


    That is very true and an excellent point. I do not think there is a good* way to do this with the Animator system* unless you slam the playhead every frame with Play or use the Motion Time parameter. I have no idea what happens if you supply a negative delta to the Update method. I would not expect it to work.


    Right. Would be nice, wouldn't it. If only Unity supplied us with a way to Force a State to a specific Normalized Time, right?

    (*) Welcome to Unity Dark Arts. I didn't mention this before because Unity deprecated it, but there is a method on the Animator called Animator.ForceStateNormalizedTime where you simply pass the normalized time that you would like the current base layer state to use. I mentioned that this was Dark Arts because it was deprecated back in Unity 4.3. I just checked and it appears to still exist in Unity 2019.4.15 (autocompletes, at least), but there are no guarantees that it will continue to exist going forward (or that it won't have quirks in modern Unity versions). It appears that there are others who would like to see this API resurrected with full support (perhaps you could chime in?).

    (*) Another option would be, of course, to use the "legacy" Animation system instead of the Animator system. Driving the Animation system by setting time directly is incredibly trivial. You simply grab the current state and set either its normalizedTime or time property.
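    For example (a sketch; "Idle" and normalizedBeatTime are placeholders):
    Code (CSharp):
    // Sketch: drive a legacy Animation clip straight from a normalized (0-1) beat position.
    Animation anim = GetComponent<Animation>();
    AnimationState state = anim["Idle"];
    state.normalizedTime = normalizedBeatTime;
    anim.Sample();    // apply the pose immediately rather than waiting for normal playback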


    The trick to this is to simply call:
    Code (CSharp):
    1. animCom.Update(0f);
    Kinda dumb, but that should be enough to have the animation Play at the time you indicated with no further time bump (I have used this trick before).


    Hurray! Congratulations! My guess is that it is processing the delta time and the offset is so small that you don't notice the difference. Regardless, it's good to know that you got something working!


    This should actually work without issue. The normalization is simply a percentage so it doesn't matter what the actual animation length is. The reason I suggested a 1s animation is because Animator.Update uses delta time adjustments, not delta normalized time. It is far easier to treat the clock as "time in beats" that way.

    That said, if you're slamming the playhead to a normalized time (as you do with Animator.Play), then the story is different. If you have an animation timeline of any length that you simply want to loop with every beat (or whatever timebase), you can find the percentage (normalized time) of the way through that timebase (beat, eight note, bar, etc.) that the current time is through and then pass that as the percentage (normalized time) to the position controlling logic. This is one of the nice things about specifying time in "normalizedTime".

    It is possible, however, to adjust the Update values to also work with non-1s-length animations. This simply involves getting the current AnimatorStateInfo from the Animator and then multiplying its duration (AnimatorStateInfo.length) by the normalized delta time. Easy-peasy!
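    Something like this, for instance (a sketch; normalizedDelta here means the eased, wrapped delta discussed above):
    Code (CSharp):
    // Sketch: scale the normalized delta by the current state's length to get a seconds delta.
    AnimatorStateInfo stateInfo = animCom.GetCurrentAnimatorStateInfo(0);
    animCom.Update(normalizedDelta * stateInfo.length);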


    That's a fair point. You might be able to encapsulate the logic you have here in an Animator State Machine Behaviour which would make setting which states are beat-matched as trivial as adding the behaviour to the states that need it. This would also be trivial if the Motion Time option works for you (which I expect it will...).
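    A skeleton of that idea might look something like this (a sketch; it assumes the Motion Time approach with a "MotionTime" float parameter, and the class name is made up):
    Code (CSharp):
    using UnityEngine;
    using SonicBloom.Koreo;

    // Sketch: add this behaviour to any Animator state that should loop in time with the music.
    public class BeatMatchedState : StateMachineBehaviour
    {
        public int subdivisions = 1;   // beat subdivisions (2 = 8th notes, etc.)
        public int beatDivider = 1;    // stretch one loop over multiple beats/bars

        public override void OnStateUpdate(Animator animator, AnimatorStateInfo stateInfo, int layerIndex)
        {
            double beatTime = Koreographer.Instance.GetMusicBeatTime(null, subdivisions) / beatDivider;
            animator.SetFloat("MotionTime", (float)(beatTime % 1d));
        }
    }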


    You are very welcome! Thank you for being a Koreographer user and asking your questions here where others may also benefit from the discussion! This has been a good exercise for us as well!

    Best of luck with the project! Please let us know how it goes - would love to see what you're putting together!