RhythmTool - Music Analysis for Unity

Discussion in 'Assets and Asset Store' started by HelloMeow, Sep 26, 2014.

  1. jggiles

    jggiles

    Joined:
    Oct 14, 2012
    Posts:
    8
    This is very helpful.
    How can I slow the movement with this equation?
     
  2. jggiles

    jggiles

    Joined:
    Oct 14, 2012
    Posts:
    8
    Got it - just divide that equation by the offset and use that to drive a Lerp between two positions that are closer together.
     
  3. daniFMdev

    daniFMdev

    Joined:
    Jun 21, 2017
    Posts:
    7
    Hi @HelloMeow

    It's my first time posting here, I hope I'm not messing it up...
    I have a question about your RhythmTool. How can I play a song that is in a "RhythmTool" object from a specific frame/beat index? I want to be able to play a song starting from the middle, for example.

    Your plugin is awesome, btw!
     
  4. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    You can do that with RhythmTool.time or RhythmTool.timeSamples. If you need to convert a frame index to the time in seconds, you can use RhythmTool.frameLength.
     
  5. daniFMdev

    daniFMdev

    Joined:
    Jun 21, 2017
    Posts:
    7
    There is no "RhythmTool.time" or "RhythmTool.timeSamples" in your script, unless I'm running an older version, which I doubt. Those properties belong to the AudioSource, so for now I've added this method to RhythmTool.cs:
    Code (CSharp):
    public void PlayAt(int frameIndex)
    {
        audioSource.time = frameIndex * frameLength;
    }
    It seems to be working. I hope this doesn't break anything; I'm pre-analyzing the song, so I think it shouldn't. Does it look good to you?
     
  6. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Oh, I'm sorry. That must be something that didn't make it into the last version. That code looks good, but you might want to set lastDataFrame to frameIndex as well (if you're using the events).
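
    For reference, a minimal sketch of the adjusted method with that change applied (assuming lastDataFrame is the internal frame counter the event system compares against):

    Code (CSharp):
    // Sketch only: lastDataFrame is assumed to be the frame counter used by the event system.
    public void PlayAt(int frameIndex)
    {
        audioSource.time = frameIndex * frameLength;
        lastDataFrame = frameIndex;
    }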
     
  7. daniFMdev

    daniFMdev

    Joined:
    Jun 21, 2017
    Posts:
    7
    I see, that was what I was missing. Thank you very much!
     
  8. XochiInteractive

    XochiInteractive

    Joined:
    May 21, 2018
    Posts:
    19
    Hello @HelloMeow

    I'm currently using RhythmTool for a rhythm game that a friend and I are developing. We have almost everything set up correctly, but there are two problems we're having trouble solving.

    1.) When the song first begins to play, the notes start at the bottom of the screen and all the following notes scroll down. In your provided example, the lines start already mapped out and then begin moving when the song starts playing. Instead, I'm trying to give a 3-second pause to allow RhythmTool to analyze the song, and have all notes start from the top and come down. Sorry if this is worded awkwardly; if you don't understand what I'm saying, I can send a screenshot to show what I mean.

    2.) We are also running into an issue where, when the player hits a key at the note's corresponding time, the game glitches and freezes up. In your visual example the notes disappear when the beat hits; our goal is to place silhouettes at that position when the beat hits, and if the player fails to hit the button and destroy the note, it scrolls down off the screen. However, if I remove the DestroyGameObject call in the UpdateLines function, the game won't work. I want to give the lines/notes life after their time has passed instead of destroying them right away; once they go off screen they will be destroyed by a fail-safe.

    I don't know if you can help with any of these questions, but if you can that would be greatly appreciated! Great program by the way, we absolutely love it! We are currently students and trying to learn all that we can.

    -David
     
  9. XiaodongLi

    XiaodongLi

    Joined:
    May 22, 2018
    Posts:
    2
    Hi!
    I have a requirement: I want to get the song's BPM and offset before playing it. Does your project meet my needs?
    Can you tell me directly whether your product can meet them?
    Thank you!
     
  10. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    RhythmTool 3 is now available on the asset store.

    For the last year or so, I’ve been working on version 3 of RhythmTool. I wasn’t happy with how it performed and with the way it was designed, so I threw everything away and started from scratch.

    Previously, the different algorithms and analysis results were kind of tucked away and difficult to configure. Both playback and analysis were done in one place, which made it tricky to use.

    This has all been separated. Each algorithm is its own separate component which is easy to configure in the inspector. Playback and synchronization can be done simply by using an AudioSource or by using the included event system.

    Analysis results are provided through a custom asset type. This way it’s easy to reference and save analysis results when needed. Even though there is no real way to edit and inspect the asset yet, this makes it possible to eventually add this functionality.

    I want to make it possible to create custom beatmaps. I am planning on adding a beatmap editor for the Unity editor and in-game.

    I have rewritten beat tracking and onset detection to make them more reliable. All the heavy lifting now happens on a separate thread, which reduces the impact that analyzing a song has on the frame rate.

    The last thing I’ve been working on before getting RhythmTool 3 ready for release was pitch detection in the form of a chromagram. The chromagram represents the most prominent notes at any given time in the song. This isn’t perfect yet for more complex music, but it provides enough information to create interesting patterns that match the music.
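
    For anyone trying the new version, here is a rough usage sketch pieced together from API calls that appear later in this thread (RhythmAnalyzer.Analyze, RhythmAnalyzer.isDone, RhythmData.GetTrack<Beat>); the exact component setup and signatures should be treated as assumptions rather than official documentation:

    Code (CSharp):
    // Rough sketch: analyze a clip with a RhythmAnalyzer (plus e.g. a BeatTracker component),
    // wait for the background analysis to finish, then read back the resulting beats.
    using System.Collections;
    using UnityEngine;

    public class AnalyzeExample : MonoBehaviour
    {
        public RhythmAnalyzer analyzer;
        public AudioClip clip;

        IEnumerator Start()
        {
            analyzer.Analyze(clip);

            // Analysis runs on a separate thread; wait until it's done.
            while (!analyzer.isDone)
                yield return null;

            RhythmData rhythmData = analyzer.rhythmData;
            Track<Beat> beats = rhythmData.GetTrack<Beat>();

            Debug.Log("Beats found: " + beats.count);
        }
    }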
     
    skullthug and jashan like this.
  11. Orca91888

    Orca91888

    Joined:
    Jul 17, 2018
    Posts:
    1
    Oh, so far I've found that your dll file only supports 32-bit Windows. Is my version wrong?
     
  12. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    RhythmTool has no dll files.
     
  13. han1108th

    han1108th

    Joined:
    Feb 24, 2019
    Posts:
    4
    Sorry for my bad English.. :(
    Before buying the asset, I want to ask a question.

    First of all, I am kind of new to rhythm games and my goal is to make a game like BeatSaber,
    and I tried to follow this work on YouTube ( ).

    Now my problem and goal is to detect the music and make cubes (nodes) as the music goes on.

    After several days of digging in, I found your video ( ), and
    your asset presents BEATS, ONSETS and SEGMENTS, which are shown in the video.

    Is there any way I can use the ONSETS data to make BeatSaber's cubes or another rhythm game's nodes?
    Or, if I buy this great asset (really, when I saw it on YouTube I was shocked), is there another shortcut to make nodes or cubes?

    Thank you for your help!!!
     
  14. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Hi han1108th,

    It should be fairly straightforward to use RhythmTool for a game like BeatSaber.

    The visualizer you see in the video is included as an example. You can download the demo here, so you can try out different songs.

    You can find the documentation here. Maybe it can help you decide whether RhythmTool can do what you want.
     
  15. han1108th

    han1108th

    Joined:
    Feb 24, 2019
    Posts:
    4
    Thank you for your reply!! I will buy this asset soon!!!
     
  16. tim44

    tim44

    Joined:
    May 15, 2019
    Posts:
    58
    A suggestion: make it clearer in the documentation how to register the RhythmEventProvider. I was stuck on this for a couple of hours. I eventually tracked it down in the RhythmPlayer Targets section of the example after cussing a bunch :)
     
  17. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Thanks for the suggestion. I agree this was a little vague. I've added a screenshot that should clarify it.
     
  18. majnsejo

    majnsejo

    Joined:
    Jul 28, 2017
    Posts:
    3
    Hello @HelloMeow

    I'm currently using RhythmTool for a rhythm game that a friend and I are developing.
    We are very interested in "Onsets" in RhythmTool.
    My question: when I added a new song and RhythmTool started detecting onsets, it played fine in the Unity editor, but it didn't work on mobile. Is there any way to fix this?
    Sorry, my English is not good.
    Hope to receive your answer soon.
     
  19. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Hi,

    Can you please give me a few more details? In what way isn't it working? Are there any error messages?

    One possible issue could be the DebugDrawer, which can lower the frame rate significantly on mobile platforms.
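
    If the DebugDrawer turns out to be the cause, one option is to switch it off in device builds. A minimal sketch, assuming DebugDrawer is a MonoBehaviour in the scene that can simply be disabled:

    Code (CSharp):
    // Sketch only: assumes DebugDrawer is a MonoBehaviour that can be disabled at runtime.
    using UnityEngine;

    public class DisableDebugDrawerInBuilds : MonoBehaviour
    {
        void Awake()
        {
    #if !UNITY_EDITOR
            var drawer = FindObjectOfType<DebugDrawer>();

            if (drawer != null)
                drawer.enabled = false;
    #endif
        }
    }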
     
  20. majnsejo

    majnsejo

    Joined:
    Jul 28, 2017
    Posts:
    3
    Hi,
    I am very happy to receive your answer!
    I'm using the "AudioImporter Visualizer" scene.
    Everything still works fine in Unity and in the .exe build, but it doesn't work on mobile (Android).
    There are no error messages; simply nothing works on mobile. Do I need anything else to use it on mobile?
    I have attached some pictures to illustrate the problem, so they can help you understand it better.
    Hope to receive your answer soon.
     

  21. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    It looks like you might be using NAudioImporter, which does not work on mobile platforms. If you're using NAudioImporter, you should change it to MobileImporter on the Visualizer GameObject.
     
  22. majnsejo

    majnsejo

    Joined:
    Jul 28, 2017
    Posts:
    3
    Hi,
    Thank u so much!
    It solved my problem XD
     
  23. unity_Q2dkZtL7w8H5HA

    unity_Q2dkZtL7w8H5HA

    Joined:
    Nov 11, 2019
    Posts:
    2
    Hello

    I bought it, but when checking the beat and the value in RhythmData, their length is always equal to 0. Is this a bug?

    Thanks
     
  24. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Hi,

    This is not a bug. All Beats currently have a length of 0, just like Onsets.

    It might be useful to give Beats a length. If you add this at line 115 in BeatTracker.cs, it should give beats a length.

    Code (CSharp):
    length = FrameIndexToSeconds((float)beatLength / resolution),
     
    unity_Q2dkZtL7w8H5HA likes this.
  25. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
    Hi,

    we've bought the asset, and have some questions. It would be great if you could clear this up for us!
    1. There doesn't seem to be a way to get the overall BPM of a song using the API (right?). What would you suggest we do if we want to calculate that - calculate the average of all Beat.bpm values? It seems to me that this might yield wrong results in the presence of outliers though, so perhaps this should be smoothed (or just outliers removed) - any thoughts on this?
    2. What is the relationship between Beat.bpm and Beat.timestamp? Beat.bpm does not (always) seem to be merely 60/(timestamp diff with next/previous beat) - can you explain more what Beat.bpm actually represents at the local position of the beat?
    3. It appears that the algorithm to detect BPM is capped to a low and high value; we can't seem to get the 170 bpm of a certain song, for example (instead, the average is about 86). Is this documented somewhere? I've read 80-160 in this thread, but this was before v3 of the tool.
    4. Is there a way to get a "confidence level" for a beat, or the overall bpm? I.e. I would like to know how confident your tool is that it has found a beat at a certain place, or a certain bpm, so that I can throw away a beat and/or the entire calculation if the value is below a certain threshold.
    5. It is not clear to me how the Segmenter is supposed to be used. What is a single "segment"? A continuous area in the song with the same sound characteristics? Something else? What does the "value" of a segment actually represent? Again, is there a confidence level for these things?
    Thanks!

    Philip
     
  26. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Hi Philip,

    The best way to find the overall bpm of a song, assuming the song has only one stable tempo, is to count the most common bpm.

    To track the beat, RhythmTool does two things. First it finds the most likely beat length. Then it tries to find the most likely point in time at which beats should occur, also called the offset. Both of these can vary a little. Beat.bpm is based only on the most likely beat length, while Beat.timestamp is based on both the beat length and offset.

    Beat tracking is still limited between 80 and 160 BPM. This makes the beat tracking more reliable because it can be hard to determine a BPM outside of this range. For example, it would be hard to tell apart a BPM of 170 or 85, because 170 is a multiple of 85.

    I have thought about adding a confidence level to Beats. This would be useful, since beat tracking can definitely be wrong sometimes. I'll look into it.

    The Segmenter looks for changes in the song based on volume. The segment represents a single point in time where a change is detected. The value is based on the average volume at the beginning of a new segment. There is no confidence for this. False positives occur most often when there is a relatively loud sound that lasts relatively long.
     
    plmx likes this.
  27. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
    Hi,

    thanks very much for your fast and thorough answer, this already helps a lot!

    I'm not sure I quite understand your explanation of beat length and offset, though. Perhaps I should explain what I am trying to do.

    My issue is this: I am able to find a BPM which matches a manual "tap" as well as the results of other tools (thanks to your insights, I'm now using bucketing to find the commonly occurring BPM rounded to 1 decimal place, which is usually shared by >95% of beats in my case). However, I am still not able to properly find the song offset (in seconds), by which I mean the seconds between the start of the song and the occurrence of the first beat which falls into the BPM pattern identified as outlined above (and from which point on the beats then occur regularly). The idea is to be able to perfectly match the >95% of beats with the given BPM from the start of the song.

    My current strategy is simply using the timestamp of the first returned beat with the most commonly occurring BPM. This works sometimes, and doesn't in other cases. While trying to understand the timestamps, I looked at a dump of all beats returned by RhythmAnalyzer and their BPMs and timestamps, and found that there are sequences of hundreds of beats which share the identical (local) bpm to four decimal places, but are differently spaced from each other (e.g. 0.48 vs. 0.46 apart) when calculating the distances from the timestamps. Why is that?

    Do you have any thoughts on how to find the offset I am looking for?

    Thanks again,

    Philip
     
    Last edited: Dec 6, 2019
  28. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Beats are placed based on the most likely beat length and offset. Both of these can be a little (or a lot) off for various reasons. Different components of the rhythm can be more prominent at different times. There is also a limited resolution, so the true beat length and offset can fall in between two possible values. This can cause the beats to shift around a little.

    The first beat often is not aligned well, because a lot of songs have an intro without a clear beat. Even if a song does start with a beat, it might not be prominent enough to be picked up right away.

    You can find the most common offset by looking at the remainder when you divide a beat's timestamp by the beat's length.

    Here is an example that looks for the most common BPM and offset:
    Code (CSharp):
    // Note: requires using System, System.Linq and System.Collections.Generic.
    Track<Beat> beatTrack = rhythmData.GetTrack<Beat>();

    //Group beats by rounded BPM
    Dictionary<int, List<Beat>> beatsByBPM = new Dictionary<int, List<Beat>>();

    for (int i = 0; i < beatTrack.count; i++)
    {
        Beat beat = beatTrack[i];

        int bpm = Mathf.RoundToInt(beat.bpm);

        if (!beatsByBPM.ContainsKey(bpm))
            beatsByBPM.Add(bpm, new List<Beat>());

        beatsByBPM[bpm].Add(beat);
    }

    //Find group with most beats
    List<Beat> beats = beatsByBPM.Values.First();
    int count = beats.Count;

    foreach (var value in beatsByBPM.Values)
    {
        if (value.Count > count)
        {
            count = value.Count;
            beats = value;
        }
    }

    //Find the average BPM
    float averageBPM = beats.Sum(b => b.bpm) / count;
    float averageBeatLength = 60 / averageBPM;

    //Find the most common offset rounded to x decimals
    Dictionary<float, int> offsetCount = new Dictionary<float, int>();
    int decimals = 3;

    foreach (var beat in beats)
    {
        float offset = (float)Math.Round(beat.timestamp % averageBeatLength, decimals);

        if (offsetCount.ContainsKey(offset))
            offsetCount[offset]++;
        else
            offsetCount.Add(offset, 1);
    }

    int n = 0;
    float bestOffset = 0;

    foreach (var item in offsetCount)
    {
        if (item.Value > n)
        {
            n = item.Value;
            bestOffset = item.Key;
        }
    }

    Debug.Log("Average BPM: " + averageBPM);
    Debug.Log("Best offset: " + bestOffset);
     
    plmx likes this.
  29. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
    Hi Tim,

    Thanks. So the idea would be to use the most common offset, like the most common BPM. I see.

    The issue I have here is that the results seem pretty unpredictable, both in "normal" energetic Dance songs and in more primitive, tailored test songs.

    Consider this 120 BPM test song, which consists only of a drum. The BPM is 120, as correctly identified by your tool (all beats except 4 have a BPM of 119.63). The list of offsets given by your code above is 22 entries long, with the most common one (0.33) having a count of 16 beats. Now I can't really find a reason for the offset to be 0.33, and by extension for the timestamps of the beats to be where they are, which is 3.34, 3.9, 4.41, 4.92, 5.43, 5.94, etc., although in the song each drum hit appears at 3.0, 3.5, 4.0, 4.5, 5.0, 5.5, etc. and is 0.044 seconds long.

    I've tried more complex songs. It seems that the (rounded) BPMs are correct in almost all cases, but the timestamps and, by extension, offsets are wrong in at least 50% of cases. Is there something I can do about this using your tool or is this just the way it is?

    Philip
     
    Last edited: Dec 12, 2019
  30. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Hi Philip,

    The beat tracker isn't perfect and there are cases where it isn't accurate. Unfortunately I haven't found any methods that are much better. However, 50% of timestamps being wrong doesn't sound good. I would be interested to see which songs are causing issues, so I can see what is going wrong.

    I can't replicate the issue with the 120 bpm test song. It's mono and I had some issues getting it to do anything, which is a different issue that needs some attention. After resampling it, it works and it appears fairly accurate, with an offset of 0.01 seconds.
     
  31. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
    Hi,

    yeah, I realize that beat detection is a difficult business :) What surprises me is that the BPM detection is spot-on, while grabbing the BPM shift (our offsets) seems to be really problematic. You are using a variant of the spectral flux algorithm by Simon Dixon, right? Maybe it's an artifact of that.

    However, since you can't replicate the issue with the test song, maybe there is indeed some other issue as well. My code to kick off detection is fairly simple. I have a game object in Unity with 3 scripts on it: RhythmAnalyzer, BeatTracker, and a custom script which takes the RhythmAnalyzer and an AudioClip as properties, and whose Start method looks like this:

    Code (CSharp):
    IEnumerator Start()
    {
        analyzer.Analyze(clip);

        while (!analyzer.isDone)
        {
            yield return new WaitForEndOfFrame();
        }

        RhythmData rhythmData = analyzer.rhythmData;

        // your code from post #78 above
    }
    The result is a BPM of 119.6289 and a best offset of 0.325 - both for the wave file as downloaded, and for a stereo'd version I created in Audacity. Can you replicate it like this? I can send you a Unity project as well if you'd like.

    As for further song analysis and wrong timestamps: In our tests there is almost always one bpm instance with the overwhelming majority of beats behind it (these are beat-heavy dance songs). However, we just can't reliably find the offset for this beat, although we know it exists. The problem is that the list of different offsets returned by the code in post #78 is long and there are 3-10 "top places" with nearly identical counts. This makes it very hard to pick the right offset. Songs we've tried are this, this, and this (in all, BPM is correct, but the offset is not).

    Some additional thoughts:
    • I still don't understand what a per-beat BPM is supposed to mean if not the distance to its neighbors, as we see many beats with identical BPM but non-identical distances in timestamp. I know you said above that bpm and timestamp are calculated differently, but it still strikes me as an inconsistency that the BPM is identical for those beats while their timestamp differences to neighbors are not.
    • In our tests, almost all songs had whole-number BPM values, yet your tool returns decimals, which, if used directly in a metronome, lead to drifting in the beat. I'm not sure whether the values are actually wrong or if there is another issue at play here; using the rounded value works fine.
    We can also take this to email if you'd like. We are really interested in a solution to the offset detection, which, it seems to me, is indeed different from BPM detection. Since we already have a BPM, maybe there is some other way of overlaying that onto the song to find the offset (as a second step).

    Philip
     
  32. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Many beat trackers use spectral flux or something similar, including Simon Dixon's Beatroot and RhythmTool, but the method of tracking the beat is different.

    For me, the mono .wav file gives an offset of 0.03 seconds, while the stereo version gives 0.01. Something weird might be going on here. It would be great if you could send me a Unity project. I think it would be better to continue this via email (tim@hellomeow.net).

    The songs you mentioned appear to work fine, except for Alex Beroza - Art Now. This song has a strong back beat, which is what the beat tracker picks up instead of the actual beat for most of the song. This is probably the most common thing that can go wrong.

    The beat tracker first finds the beat length, or bpm. Then it finds the most likely beat locations, or offset. The per-beat bpm is the most likely bpm at that time in the song. It does not relate to the timestamp directly. The timestamp is determined by the offset. Between two neighboring beats, the offset can fluctuate a little, which moves them closer together or further apart.

    The beat tracker smooths things a bit and has a limited resolution, so the bpm and offset it finds fluctuate a little around the actual bpm and offset. The per-beat bpm value isn't rounded, because a lot of non-electronic music has a variable bpm.
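
    To make that relationship concrete, here is a small illustration (not code from the asset): with a stable tempo, beat k is expected near offset + k * beatLength, and the reported Beat.timestamp drifts around that value as the tracker's estimates fluctuate.

    Code (CSharp):
    // Illustration only, with made-up example values for bpm and offset.
    float bpm = 120f;                // most likely tempo found by the beat tracker
    float beatLength = 60f / bpm;    // 0.5 seconds per beat
    float offset = 0.2f;             // most likely offset found by the beat tracker

    for (int k = 0; k < 4; k++)
    {
        // Expected timestamps: 0.2, 0.7, 1.2, 1.7 seconds
        float expectedTimestamp = offset + k * beatLength;
        Debug.Log("Expected beat " + k + " at " + expectedTimestamp + " s");
    }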
     
  33. amaltheia

    amaltheia

    Joined:
    Feb 4, 2015
    Posts:
    5
    Hi,

    I have tried analyzing around 30 songs and samples, and the first onset listed in RhythmData always has a timestamp between 0.2 and 0.3, even if the sample starts with some sound, and no matter how I set the sensitivity parameters on the OnsetDetector component. The files I tried are taken from multiple sources, mostly electronic music production construction kits; they are all in .wav format and the compression format is PCM.
    Actually the only case I could find where the first onset is properly tracked is at the beginning of the song "Coexistenz", where the first onset's timestamp is around 0.04.
    Is this related to how the asset works or to the particular samples I used?

    Thank you!
     
  34. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Hi,

    When detecting an Onset, the OnsetDetector looks for peaks within a certain length of time. I just found a bug that would contaminate this with a NaN, which could push the first possible onset to around 0.24 seconds or even later.

    Add this on line 104 of OnsetDetector.cs to fix this bug. This should help with the first detected onset.

    Code (CSharp):
    if (standardDeviation == 0)
        return 0;
    Another reason why the first onset isn't detected could be that it's not prominent enough compared to the surrounding signal. This can be tricky near the start, because there isn't a lot of signal to use yet.
     
  35. amaltheia

    amaltheia

    Joined:
    Feb 4, 2015
    Posts:
    5
    Hi,
    Detection of the first onset now works after the bug fix in OnsetDetector.cs.
    Thanks
     
  36. spelafort

    spelafort

    Joined:
    May 8, 2017
    Posts:
    37
    Hi,

    In older versions of rhythmtool (~2 years ago), there was an option to track the sub beat as well as the beat. Does such a feature still exist? I can't seem to find anything.
     
  37. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    Hi,

    Nope, currently not. The old version would only be correct for some time signatures, so I didn't want to add this. Here is a script that basically does the same: https://pastebin.com/qem0N0rP

    Just add it as an analysis, somewhere below the BeatTracker. It adds a Value Track with 4 features for every beat.
     
  38. spelafort

    spelafort

    Joined:
    May 8, 2017
    Posts:
    37
    Thank you! I was hoping to ask you one more question about this. I'm trying to add an event for subbeats. I have the following:
    Code (CSharp):
    myEventProvider.Register<Value>(OnSubBeat);

    private void OnSubBeat(Value value)
    {
        Debug.Log("SUB BEAT");
    }
    But it does nothing. I'm not sure about the parameter type here (do I need to create a subbeat type that inherits from Value?).

    Apologies if this is a stupid question!
     
  39. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    I'm sorry, I didn't explain how to use it very well. The SubBeatTracker adds a Value track named "SubBeats". To use this track with the RhythmEventProvider, do this:

    Code (CSharp):
    myEventProvider.Register<Value>(OnSubBeat, "SubBeats");

    Also make sure that the event provider is added to the RhythmPlayer that's playing the AudioClip and RhythmData.
     
  40. spelafort

    spelafort

    Joined:
    May 8, 2017
    Posts:
    37
    Hi, thanks again for the reply. Your event provider code is what I had tried originally.

    I just looked closer and realized that the subbeat value track is added when I analyze a track (from the project menu), but it adds a size zero array with no features. This happens regardless of what audio clip I use.

    Any ideas? Thanks again for your continued support!
     
  41. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    This appears to be a bug. In the editor, Update isn't called while the progress bar is shown, so the track isn't being updated with the newest features until the end of the analysis process.

    If you add this near the bottom of Analysis.cs it should fix it.

    Code (CSharp):
    #if UNITY_EDITOR
            void OnEnable()
            {
                UnityEditor.EditorApplication.update += Update;
            }

            void OnDisable()
            {
                UnityEditor.EditorApplication.update -= Update;
            }
    #endif
     
  42. spelafort

    spelafort

    Joined:
    May 8, 2017
    Posts:
    37
    That did it. Thank you!

    I was also wondering if you had any tips on segmenting songs. I've tried a few different AudioClips, but no matter what settings I tweak, it can't seem to subdivide these tracks correctly. This is vague, I realize -- I was hoping you might have some general advice :)
     
  43. RavenTravelStudios

    RavenTravelStudios

    Joined:
    Oct 15, 2015
    Posts:
    100
    Hi HelloMeow,

    I was playing around with your code because my aim is to measure beats in 1/8 fractions instead of 1/4 fractions, so I have double the number of lines on screen to catch 1/8 taps from the user. My first attempt is something like this:

    Code (CSharp):
    maxBeatLength = Mathf.RoundToInt(framesPerMinute / 160);
    minBeatLength = Mathf.RoundToInt(framesPerMinute / 320);
    That seems to do the trick, but I guess there's a better way...?

    Also, the lines are well synced, but with the on lift accents, not the beats. Can I tweak this behavior somehow?

    Thank you!
     
    Last edited: Jan 17, 2020
  44. skullthug

    skullthug

    Joined:
    Oct 16, 2011
    Posts:
    202
    Hello, long-time user since 1.0.
    I'm currently taking a look at updating my project to 3.0, but I've noticed that the ability to save/load RhythmData to disk is no longer present. Is that correct?
    Is the analysis now completely live? Is it still possible to analyze the entire song before beginning playback? (This is critical for my project, unfortunately.)
     
  45. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    It's not included at the moment, but I do have a JSON serializer that can save and load RhythmData. It doesn't automatically save and load data for each song, so you would have to do that yourself. Please let me know if you're interested, so I can clean it up a little and send it your way.

    You can still analyze the full song before playback. You can use RhythmAnalyzer.isDone and RhythmAnalyzer.progress to monitor the analyzer's progress.

    @RavenTravelStudios That would be a good way to do it. You could also edit the track or add a new track that adds an extra beat in between existing beats.
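
    As an illustration of that second suggestion, here is a sketch (not code from the asset) that derives the in-between 1/8 times from the analyzed beats, rather than adding them to a track, using only the Beat timestamps:

    Code (CSharp):
    // Sketch: collect each beat plus the midpoint to the next beat (1/8 subdivisions).
    List<float> eighthTimes = new List<float>();
    Track<Beat> beatTrack = rhythmData.GetTrack<Beat>();

    for (int i = 0; i < beatTrack.count - 1; i++)
    {
        float current = beatTrack[i].timestamp;
        float next = beatTrack[i + 1].timestamp;

        eighthTimes.Add(current);                  // the beat itself
        eighthTimes.Add((current + next) * 0.5f);  // halfway to the next beat
    }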

    I'm not sure what you mean by on lift accents. I'm not very familiar with music terminology. In some songs the beat can be synced with other prominent repetitive components.

    At line 126 of BeatTracker.cs it uses all frequencies to calculate spectralFlux. It uses the entire length of the magnitude array. If you use only the first part of magnitude, it might help sync with the beat.

    For example:

    Code (csharp):
    for (int i = 0; i < 20; i++)
        spectralFlux += Mathf.Max(magnitude[i] - prevMagnitude[i], 0);
    This means that it will only use lower frequencies. I've found that this can help, but it can also negatively affect the outcome for other songs.

    The segmenter only uses the volume of the song to find segments, so it's a bit limited. Currently there isn't much you can do except for playing around with the parameters, but I'm looking into making it more sophisticated.
     
    Last edited: Jan 18, 2020
  46. skullthug

    skullthug

    Joined:
    Oct 16, 2011
    Posts:
    202
    Hmm, ok. I want to say yes. But I'm also theoretically a couple of months away from launching this thing (into Early Access tho). Do you think it's a good idea to try and adapt this in the state you have it?
     
  47. HelloMeow

    HelloMeow

    Joined:
    May 11, 2014
    Posts:
    280
    It uses Unity's JsonUtility, so it should be pretty robust. Usage is also exactly the same as JsonUtility.

    Here is the serializer:
    https://gist.github.com/Hello-Meow/306520e10632109a54e64d1e23c7b355

    And here is a modified version of FileSelector from the example that uses the serializer:
    https://pastebin.com/ysvHgri0
     
    Last edited: Jan 23, 2020
    skullthug likes this.
  48. skullthug

    skullthug

    Joined:
    Oct 16, 2011
    Posts:
    202
    Thanks! Much appreciated.
     
  49. spelafort

    spelafort

    Joined:
    May 8, 2017
    Posts:
    37
    I had the best results by manually, subtly tweaking the gross volume of my tracks in Logic Pro, and then just tweaking the parameters on the segmenter. It's working pretty well now! Thanks again
     
  50. khos

    khos

    Joined:
    May 10, 2016
    Posts:
    1,476
    Hi,
    I am liking your asset very much and would like to ask for your guidance, if possible.
    My game relies on the BPM value quite heavily, but when there is no music (audio silence) I want the BPM value to decrease gradually to 0. Currently I am looking at RhythmTool.Examples > Visualizer > OnBeat > the bpm float variable, and I notice that for audio with silence or no beat it does not match: the value seems to stay at whatever the previously detected beat was.

    I have tried to implement my own code:

    Code (CSharp):
    public float PublicBPM;
    public float PublicBPMNOld;
    public float counter;

    private void OnBeat(Beat beat)
    {
        float bpm = Mathf.Round(beat.bpm * 10) / 10;
        PublicBPM = bpm;
    }

    void Update()
    {
        //if bpm idle then slow down..
        counter++;
        if (counter > 5 && PublicBPM != PublicBPMNOld)
        {
            PublicBPMNOld = PublicBPM;
            counter = 0;
        }
        //bpm slow down
        else if (counter > 5 && PublicBPM == PublicBPMNOld)
        {
            PublicBPM = PublicBPM - 5;
            if (PublicBPM < 5) { PublicBPM = 0; }
            counter = 0;
        }
    }

    But the PublicBPM value fluctuates and isn't stable. Can you suggest an approach for this?