I was playing with audio playables and clips, and came to these conclusions:

- To draw a waveform on the clip, one of two conditions must hold: either the asset is a subclass of AudioPlayableAsset (so I must subclass from it), or the asset must be castable to AudioClip, as I see in the package code. But how could the latter ever be possible? (If it were an explicit `(AudioClip)` cast instead of `as`, I could write an explicit conversion operator on my playable asset.)
- To get a live preview, the track must be a subclass of AudioTrack (?). And AudioTrack creates its mixer in the internal OnCreateClipPlayableGraph, so even with a subclass I couldn't change that behaviour; it will create a mixer that way regardless. The only AudioTrack subclass I can have is an empty one (so it's effectively just a rename, with a different clip type or binding type check). I can't write an entirely original class programmed to work like AudioTrack.
- It follows that I must use AudioTrack. Its behaviour is to look for AudioClipProperties, which is set and read with internal methods (so my class cannot emulate this). So in the end, a subclass of AudioPlayableAsset must somehow hack into the inherited AudioClipProperties to change its clip or volume, rather than "providing" audio data, before it is passed on to the track, which then multiplies in its own volume, panning, etc. I am able to use AudioClipPlayable.SetClip properly in my own CreatePlayable override, which does the base call and then swaps in a new clip; the volume, however, must be hacked in via reflection, since m_ClipProperties is not even protected.
This is how it looks in my subclass:

Code (CSharp):

static FieldInfo hack = typeof(AudioPlayableAsset).GetField("m_ClipProperties", BindingFlags.Instance | BindingFlags.NonPublic);
static Type acp = typeof(AudioPlayableAsset).Assembly.GetType("UnityEngine.Timeline.AudioClipProperties");
static FieldInfo volumeField = acp.GetField("volume", BindingFlags.Instance | BindingFlags.Public);

private void HackVolume()
{
    var got = hack.GetValue(this);
    volumeField.SetValue(got, SummarizedVolume);
}

public override Playable CreatePlayable(PlayableGraph graph, GameObject owner)
{
    // Somehow we have to hack in the clip property before the playable is created,
    // since it uses the internal SetScriptInstance to pass on the value and we have no chance after this.
    HackVolume();
    AudioClipPlayable p = (AudioClipPlayable)base.CreatePlayable(graph, owner);
    if (AudioObject != null)
    {
        p.SetClip(AudioObject.AudioClip);
        // Can't hack the internal SetVolume here, since the track wants to use it
        // to blend the clip properties with the track volume.
    }
    return p;
}

- There is no documentation for AudioMixerPlayable and AudioPlayableBinding. I have been guessing that the output looks for a binding of type AudioSource to use its properties and mixer channel, or uses no mixer if no AudioSource is bound. However, it seems I didn't guess all the criteria correctly: when I made a binding-less audio track I couldn't hear anything, while the official AudioTrack with a null AudioSource binding does produce sound at runtime.

What I want to do:

- Make a custom, binding-less track that outputs audio like an AudioTrack with a null AudioSource binding, so I could see the track's name instead of an empty AudioTrack slot.
- Use the Timeline tab to sequence a combination of sound effects by reusing little pieces of AudioClip, to create a new kind of serialized PlayableAsset that is subclassed from TimelineAsset.
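For reference, here is a minimal sketch of how I currently believe the runtime pieces fit together outside of Timeline, based on the public playables API (the class name and clip fields are mine, and whether a null AudioSource target actually produces audible sound this way is exactly the part I'm unsure about):

```csharp
using UnityEngine;
using UnityEngine.Audio;
using UnityEngine.Playables;

// Sketch: an AudioPlayableOutput drives an AudioMixerPlayable,
// which in turn mixes individual AudioClipPlayables.
public class ManualAudioGraph : MonoBehaviour
{
    public AudioClip clipA;
    public AudioClip clipB;
    PlayableGraph graph;

    void Start()
    {
        graph = PlayableGraph.Create("ManualAudio");

        // Two clip playables feeding one mixer, both at full weight.
        var a = AudioClipPlayable.Create(graph, clipA, looping: false);
        var b = AudioClipPlayable.Create(graph, clipB, looping: false);
        var mixer = AudioMixerPlayable.Create(graph, 2);
        graph.Connect(a, 0, mixer, 0);
        graph.Connect(b, 0, mixer, 1);
        mixer.SetInputWeight(0, 1f);
        mixer.SetInputWeight(1, 1f);

        // Passing null as the AudioSource target: my guess is that this is
        // what AudioTrack with an empty binding amounts to at runtime,
        // but this is the unverified assumption.
        var output = AudioPlayableOutput.Create(graph, "Audio", null);
        output.SetSourcePlayable(mixer);

        graph.Play();
    }

    void OnDestroy()
    {
        if (graph.IsValid())
            graph.Destroy();
    }
}
```

If someone knows which extra step AudioTrack performs on this graph to make the null-binding case audible, that is the missing criterion I'm after.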
This may contain multiple unbound tracks, which use the property of the 1st point, so I could preview and design the sequence after putting it in a dummy PlayableDirector somewhere.

- Then, I want this asset to be able to live on the "main" TimelineAsset as one unit, as a clip, and then be played as if all the little audio clips had been arranged on the timeline manually. I am fine with not seeing the waveform of this one, as it is an aggregation of audio, and also fine with not being able to preview it at edit time. However, it is impossible to design the runtime behaviour without knowing the exact criteria by which AudioMixerPlayable and AudioClipPlayable work together to output sound.

Also, I found that subclassing from TimelineAsset with [CreateAssetMenu]:

- lets me create a new kind of timeline asset in my project;
- allows me to click on it in the Project panel and arrange my audio around, without preview;
- allows me to put it in a dummy PlayableDirector and preview it while designing;
- but it cannot become a clip. Even if the track's clip type is my subclass of TimelineAsset, and it has ITimelineClipAsset by inheritance, the mouse does not turn into a green + sign when dragging the asset from the Project panel onto the track.

Do I need something like Control Track, but instead a "Timeline" track that accepts the timeline asset and can make a playable out of it? If there is a solution that needs the 2019.1 new audio backend, that's fine. Thank you.
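Something like the following is what I have in mind for the "Timeline clip" idea: a clip asset that just defers to the nested timeline's own CreatePlayable, since TimelineAsset is itself a PlayableAsset. All names here are hypothetical, and I don't know whether the Timeline editor would accept it as a drag target or whether the nested outputs would actually be created:

```csharp
using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.Timeline;

// Hypothetical clip asset wrapping a nested TimelineAsset.
public class SubTimelineClip : PlayableAsset, ITimelineClipAsset
{
    public TimelineAsset subTimeline;

    public ClipCaps clipCaps => ClipCaps.None;

    // Report the nested timeline's duration so the clip gets a sensible length.
    public override double duration =>
        subTimeline != null ? subTimeline.duration : base.duration;

    public override Playable CreatePlayable(PlayableGraph graph, GameObject owner)
    {
        if (subTimeline == null)
            return Playable.Null;
        // Build the nested timeline's graph under one playable,
        // similar in spirit to how ControlPlayableAsset drives a sub-director.
        return subTimeline.CreatePlayable(graph, owner);
    }
}
```

I suspect the catch is that the nested timeline's PlayableOutputs would not be created this way, since outputs are normally set up by a PlayableDirector; that would explain why ControlTrack drives a second director instead of nesting the graph directly.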