
Official Audio system

Discussion in 'Open Projects' started by cirocontinisio, Oct 13, 2020.

  1. cirocontinisio

    cirocontinisio

    Joined:
    Jun 20, 2016
    Posts:
    884
    Thread for discussing the Audio System in the game. From the card:

    A system that is able to play music and SFX on demand.
    Characters, effects, and events can ask the audio system to play a sound once or on repeat, and specify the requested volume, pitch, location, etc.
    The audio system could also have a few extras to do things like picking a random sound from a list and/or randomising the pitch (for footsteps).
    It acts as a centralised system to control volume. The settings modify the volume on the audio system, and the audio system outputs music and sfx with the correct volume each time.

    Extra: start thinking of how this also integrates with the Timeline. Maybe just through a common AudioMixer?

    Card on the roadmap
     
    Last edited: Oct 13, 2020
  2. Fuzzeh

    Fuzzeh

    Joined:
    Feb 7, 2016
    Posts:
    3
    Hi There!

    A while back, I opened a simple PR related to Audio. This was before this card became available.

    It's quite simple right now, but I think it at least provides a foundation we can expand on. Right now, it offers the following functionality:
    • Play background music
    • Play sound effects on demand
    • Does not destroy the object on scene load. This allows the background music to continue to play as we transition scenes.
    • Static access to make it easier to use
    • Basic support for using AudioMixerGroup
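For readers following along, the features listed above could be sketched roughly like this (illustrative names, not the actual code from the PR): a static, scene-persistent manager that plays music and one-off SFX through an optional AudioMixerGroup.

```csharp
using UnityEngine;
using UnityEngine.Audio;

// Minimal sketch of the manager described above (hypothetical names):
// static access, survives scene loads, plays background music and
// one-off SFX, with basic AudioMixerGroup routing.
public class SimpleAudioManager : MonoBehaviour
{
    public static SimpleAudioManager Instance { get; private set; }

    [SerializeField] private AudioSource musicSource;   // looping background music
    [SerializeField] private AudioSource sfxSource;     // one-shot effects
    [SerializeField] private AudioMixerGroup sfxGroup;  // optional mixer routing

    private void Awake()
    {
        if (Instance != null) { Destroy(gameObject); return; }
        Instance = this;
        DontDestroyOnLoad(gameObject);  // music keeps playing across scene loads
        if (sfxGroup != null)
            sfxSource.outputAudioMixerGroup = sfxGroup;
    }

    public void PlayMusic(AudioClip clip)
    {
        musicSource.clip = clip;
        musicSource.loop = true;
        musicSource.Play();
    }

    public void PlaySFX(AudioClip clip, float volume = 1f)
    {
        // PlayOneShot lets several SFX overlap on the same source.
        sfxSource.PlayOneShot(clip, volume);
    }
}
```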
     
    Last edited: Feb 12, 2023
  3. Neonage

    Neonage

    Joined:
    May 22, 2020
    Posts:
    271
    One question - Adaptive music?
     
    davejrodriguez likes this.
  4. RKS13D

    RKS13D

    Joined:
    May 20, 2019
    Posts:
    6
  5. RKS13D

    RKS13D

    Joined:
    May 20, 2019
    Posts:
    6
    What we can do is have two structs: one for the Theme and one for SFX.
    The Theme would be decided according to which area or scene we are in.
    And the SFX can be controlled according to the events or functions called by the characters.
     
  6. RKS13D

    RKS13D

    Joined:
    May 20, 2019
    Posts:
    6
    Music design is done by the Unity team as per the Codeck roadmap.
    But if they decide to use it, then it may be great.
     
  7. davejrodriguez

    davejrodriguez

    Joined:
    Feb 5, 2013
    Posts:
    69
    Do you mean like having music in two or more parts, solo and full band (for example) and fading up/down full band/other parts based on current game intensity or some other parameter? Because that would be awesome.
     
    Neonage likes this.
  8. Neonage

    Neonage

    Joined:
    May 22, 2020
    Posts:
    271
    Yeah, we can have unique arrangements for each location or section of the game, and add new layers as the player makes progress. We can also create seamless transitions between beach, town, forest or cave, all just by changing the instruments of the main theme!
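The layering idea could be sketched like this (hypothetical names, just an illustration): each layer is an AudioSource playing a stem of the same arrangement, all started in sync, and an "intensity" parameter fades layers in and out as the player progresses.

```csharp
using UnityEngine;

// Sketch of layered adaptive music (hypothetical names): all stems are
// started at the same DSP time so they stay in sync; gameplay drives
// the "intensity" value, which fades individual layers in and out.
public class LayeredMusicPlayer : MonoBehaviour
{
    [SerializeField] private AudioSource[] layers;  // e.g. pads, drums, melody
    [SerializeField] private float fadeSpeed = 1f;
    [Range(0f, 1f)] public float intensity;         // driven by gameplay

    private void Start()
    {
        // Schedule all stems on the same DSP timestamp so they stay in sync.
        double startTime = AudioSettings.dspTime + 0.1;
        foreach (var layer in layers)
        {
            layer.loop = true;
            layer.PlayScheduled(startTime);
        }
    }

    private void Update()
    {
        // Layer i becomes audible once intensity passes (i + 1) / layerCount.
        for (int i = 0; i < layers.Length; i++)
        {
            float target = intensity * layers.Length - i >= 1f ? 1f : 0f;
            layers[i].volume = Mathf.MoveTowards(
                layers[i].volume, target, fadeSpeed * Time.deltaTime);
        }
    }
}
```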

    For direct examples, check out this beautiful video by GMTK that covers this topic:


    And let me know if we're going for it! I'd love to help with making arrangement tools, and maybe even with some chunks of music (I'm an amateur composer, but it would be a hell of a lot of experience) :D
     
  9. Zantis

    Zantis

    Joined:
    Jun 20, 2016
    Posts:
    6
    Well, the idea is obviously pretty nice. I also don't see anything hindering us there. If I'm not mistaken, we should be able to easily achieve this with some triggers (invisible colliders), which would then change the music once the player enters a specific area (this would also work on any other condition, not just on collisions, of course).

    I also don't see too much work on the programming side, as this should be solved by a simple fade-in/fade-out of two soundtracks, right?

    So as far as programming goes, the AudioManager only needs one method like transitionMusic(newSong), which would fade out the old song and fade in the new one. Though this might need a bit of tweaking on where to start the new song.
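The transitionMusic(newSong) idea above could look something like this (a sketch with hypothetical names, using two AudioSources and a crossfade coroutine):

```csharp
using System.Collections;
using UnityEngine;

// Sketch of transitionMusic(newSong): fade out the current track while
// fading in the new one, then swap the two sources for the next call.
public class MusicTransitioner : MonoBehaviour
{
    [SerializeField] private AudioSource current;
    [SerializeField] private AudioSource next;
    [SerializeField] private float fadeDuration = 2f;

    public void TransitionMusic(AudioClip newSong)
    {
        next.clip = newSong;
        next.loop = true;
        next.volume = 0f;
        next.Play();
        StartCoroutine(Crossfade());
    }

    private IEnumerator Crossfade()
    {
        float startVolume = current.volume;
        for (float t = 0f; t < fadeDuration; t += Time.deltaTime)
        {
            float k = t / fadeDuration;
            current.volume = Mathf.Lerp(startVolume, 0f, k);
            next.volume = Mathf.Lerp(0f, startVolume, k);
            yield return null;
        }
        current.Stop();
        // Swap roles so the next transition reuses the now-idle source.
        (current, next) = (next, current);
    }
}
```

Where to start the new song (from the beginning, or matched to the old song's playback position) is the tweaking mentioned above.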
     
  10. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,068
    First of all, create a ScriptableObject that will hold a reference to the audio, and never use a bare AudioClip on your prefabs/components. Only drag and drop an AudioClipScriptableObject.
    Naturally, fields on components should be of type IAudioClipScriptableObject.

    In my project I have something like this.

    Code (csharp):

    public abstract class IAudioClipScriptableObject : ScriptableObject {

        public AudioClip GetAudioClip() {
            return OnGetAudioClip();
        }

        protected abstract AudioClip OnGetAudioClip();
    }

    Code (csharp):

    using UnityEngine;

    [CreateAssetMenu(menuName = "Game/AudioClipScriptableObject")]
    public class AudioClipScriptableObject : IAudioClipScriptableObject {
        public AudioClip audioClip;

        protected override AudioClip OnGetAudioClip() {
            return audioClip;
        }
    }
     
  11. Zantis

    Zantis

    Joined:
    Jun 20, 2016
    Posts:
    6
    Hey koirat,

    I would be interested in knowing:

    a) what you are getting at, as I can't connect this to anything within this discussion
    b) why it is important to wrap the AudioClip in a ScriptableObject
     
  12. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,068
    a) The topic is about the audio system, why is it not connected?

    b) Flexibility.
    You can always swap the content (even at runtime).
    You can extend the ScriptableObject whenever needed; you could, for example, choose a random AudioClip from a list.
    You can create multiple AudioClipScriptableObjects that refer to the same AudioClip, then just change some of them in the future.
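The "random AudioClip from a list" extension could be built on the abstract base class from the earlier post like this (a sketch; the class name is illustrative):

```csharp
using UnityEngine;

// Sketch of the "random clip" variant: extends koirat's abstract
// IAudioClipScriptableObject, so components calling GetAudioClip()
// get a different variation each time without knowing about the
// randomisation. (Base class repeated here for completeness.)
public abstract class IAudioClipScriptableObject : ScriptableObject
{
    public AudioClip GetAudioClip() { return OnGetAudioClip(); }
    protected abstract AudioClip OnGetAudioClip();
}

[CreateAssetMenu(menuName = "Game/RandomAudioClipScriptableObject")]
public class RandomAudioClipScriptableObject : IAudioClipScriptableObject
{
    public AudioClip[] audioClips;  // e.g. footstep variations

    protected override AudioClip OnGetAudioClip()
    {
        // Pick a random variation; Range(int, int) excludes the max.
        return audioClips[Random.Range(0, audioClips.Length)];
    }
}
```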


    In my project I put this class on a component that is going to spawn the sound. (some things are specific to my project).

    Code (csharp):

    [Serializable]
    public class SoundInfoData {

        public IAudioClipScriptableObject audioClipObject;
        public float baseVolume = 1;
        public float basePitch = 1;
        public float SoundGeneratorRadius = -1;
        public SoundPriority soundPriority;
        public bool isMusic;
        public bool ignorePause;
        public bool ignoreTimeScale;
        public AudioSourceType audioSourceType;
        public GameObject customAudioSourcePrefab;
    }

    // 40 is arbitrary.
    public enum SoundPriority {

        Mandatory = 0,             // music || ambient
        VeryHigh = Mandatory + 40, // GUI || InfoSounds
        High = VeryHigh + 40,      // Events
        Medium = High + 40,        // Characters
        Low = Medium + 40,         // Background
        VeryLow = Low + 40,        // World
    }

    I definitely will change, if possible,
    Code (csharp):

    public AudioSourceType audioSourceType; // <-- my enum
    public GameObject customAudioSourcePrefab;

    for a ScriptableObject in the future.

    And damn it, I just realized I've got SoundGeneratorRadius in PascalCase. I'm so angry right now.
     
  13. Yilos

    Yilos

    Joined:
    Oct 4, 2014
    Posts:
    1
    I opened a PR with a sound system implementing a sound manager that can modify, save, and load the volume of mixer groups, and manage a pool of audio sources wrapped in a custom class, so it's easily extendable while staying relatively simple.

    Link to the PR:
    https://github.com/UnityTechnologies/open-project-1/pull/90
     
    davejrodriguez likes this.
  14. davejrodriguez

    davejrodriguez

    Joined:
    Feb 5, 2013
    Posts:
    69
    Hah I was just working on almost that exact implementation last night. First impression is that it looks solid! I will try to leave a detailed review over my lunch break.
     
  15. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,589
    In the video below I present my idea for the audio system. I'm going over how to integrate sounds and how to trigger those sounds through scripting, using the audio system prototype.

    It's a rather lengthy video, but I wanted to explain it in more detail to make sure it's fully understood. I hope my explanation communicates the idea well enough, but you can let me know.

    What I completely forgot to mention in the video is that the AudioCue picks a random sound from the list, which is then played. That's why the list exists in the first place.

    Please post constructive feedback on what you think.



    Tagging Unity staff who seem active in the open projects forum @cirocontinisio @superpig @MileyUnity , hope you don't mind.

    PS: Sorry for bad English/accent :oops:
     
    davejrodriguez and Neonage like this.
  16. Neonage

    Neonage

    Joined:
    May 22, 2020
    Posts:
    271
    Hmm, the idea is solid, but I'm not sure what the benefit of using a prefab for that is.
    May we have a link to your branch or PR?
     
  17. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,589
    I recorded a video where I try to explain it:


    There is no branch or PR yet. I wanted to get the overall workflow design approved first. Then I'll go and make the code "compliant" and push it.
     
  18. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,068
    Generally I'm using an abstracted ScriptableObject, as you can see in my example (IAudioClipScriptableObject).
    The reason for this is that it gives me flexibility. For example, I can create a ScriptableObject that will use a prefab.

    Additionally, when you use a ScriptableObject you will be able to choose from a list when filling the field on a component. With a pure prefab you don't have this functionality.

    Now, this IAudioClipScriptableObject only holds an AudioClip.
    I've got different components to play them; there you can have more settings, such as AudioSourceType etc.
     
  19. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,589
    It's a great idea to use an interface.

    To use the interface to its actual potential, so that it doesn't matter whether the implementation is a ScriptableObject or a Component, requires exposing the IAudioClipScriptableObject interface to the Inspector, rather than the type that implements that interface.

    How do you make Unity 2019.4 serialize an interface? Using the SerializeReference attribute does expose it, but there is no UI:
    Code (CSharp):

    public interface IAudioClipScriptableObject { }

    public class NewBehaviourScript : MonoBehaviour
    {
        [SerializeReference]
        public IAudioClipScriptableObject audioThingy;
    }
    Unity does show the field "audioThingy" in the Inspector, but it doesn't allow me to pick anything. Do I need to implement a custom property drawer for it?

    I would like Unity to display a "Selection window" for ScriptableObjects that implement IAudioClipScriptableObject, as well as Prefabs that have components that implement the IAudioClipScriptableObject. How would I do this?


    Exposing the ScriptableObject that actually implements the interface, rather than the interface itself, unfortunately defeats the purpose of using an interface in the first place, in my opinion.

    Because then it's again bound to a specific type, in this case the abstract ScriptableObject. That means if I implement IAudioClipScriptableObject in a Component instead of a ScriptableObject, I would need to expose the component to the Inspector. So we would end up with different types, and the user needs to understand when and where to use what. That's not ideal in my opinion.
    Code (CSharp):

    public class NewBehaviourScript : MonoBehaviour
    {
        public AudioClipScriptableObject audioSO;
    }


    That's true. It reveals another weakness of the Unity editor that Unity Technologies should solve. The Open Projects initiative is a great opportunity for us to show Unity staff where the tech lacks. This is another prime example where functionality is missing.


    Can you provide examples of those different components that implement the IAudioClipScriptableObject interface? What is the AudioSourceType for?
     
    Last edited: Oct 18, 2020
  20. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,068
    For a little more context, take a look at a few previous posts.
    Since we cannot use an interface, we have to cheat it out with an abstract class.

    Code (csharp):

    public abstract class IAudioClipScriptableObject : ScriptableObject {

        public AudioClip GetAudioClip() {
            return OnGetAudioClip();
        }

        protected abstract AudioClip OnGetAudioClip();
    }
    The system is more complicated: I've got my SoundManager that plays SoundInfo objects. The programmer almost never uses AudioClip or AudioSource directly; all of this is hidden from them.

    I use for example this component to play sound.


    Sound Source - the position of the sound.
    Audio Clip Object - the audio clip that will be played (abstracted).
    Base Volume - the base volume of the sound. This lets you simulate the volume of your sound: it works as if your AudioClip had a lower volume when set below 1.
    Base Pitch - same as base volume, but for pitch.
    Sound Generator Radius - as I said, I'm using my own layer around Unity's audio system. In my system only sounds inside the camera's FOV are played, but each sound has a radius, so if its sphere intersects the FOV it will be played. -1 = infinite radius.
    Sound Priority - basically AudioSource priority, but turned into ranges and "enumed" (see my previous post).
    Is Music - whether the sound is music. Mostly for volume control/muting of music in the game.
    Ignore Pause - the sound keeps playing while the game is paused.
    Ignore Time Scale - the sound ignores time scaling (no sound distortion).
    Audio Source Type - these are predefined AudioSource configurations (like long-range/short-range sounds etc.). But I will change this to a ScriptableObject in the near future.
    Custom Audio Source Prefab - when Audio Source Type is set to Custom, this prefab is used to create the AudioSource; but as I said, this will be changed to a ScriptableObject in the future.
    Loop - loop.
    Start At Random Position - the sound starts at a random playback position; useful for some looping sounds, especially when several are started simultaneously.
    Enable Disable Aware - whether the sound should stop/start when the component is disabled/enabled.
    Smooth Enable Disable - on enable/disable, the sound gradually increases/decreases its volume (no hard cuts, no starting out of the blue). Useful for sound transitions.
    Smooth Diminish/Amplify Speed - the speed of this volume change.
     
    Last edited: Oct 18, 2020
    Peter77 likes this.
  21. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,589
    Thanks again for your feedback. I've now changed it to use ScriptableObjects instead of prefabs. If that's what the community wants, then so be it. :)

    I actually wish Unity staff had shared their thoughts on this too, but you can't have everything.

    Please see video below for the update.

     
  22. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,068
    Don't scare me like that!
    I just wanted to show how I do it, and that it has worked quite well for me.
    I'm definitely not the will of the community.
     
    cirocontinisio and Peter77 like this.
  23. cirocontinisio

    cirocontinisio

    Joined:
    Jun 20, 2016
    Posts:
    884
    That's generally true in life, but never despair :)

    I watched the video, it's great work!! I also agree with the others that SOs are the way to go to encapsulate AudioCues.
    In your last video, at 5:26, you say you can't define a template for the sound settings anymore because of SOs. That's not true, actually: you could have another type of SO just for that! You could then link it inside your AudioCue, so that several AudioCue SOs could refer to the same AudioSettingsTemplate SO. That would be a really powerful workflow, allowing us to control all settings from one centralised location (in fact, I think we would only need 2-3 AudioSettingsTemplate SOs in the whole game).
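The shared-template idea could be sketched like this (hypothetical names and fields, just an illustration of the relationship; in a real project each ScriptableObject class would live in its own file):

```csharp
using UnityEngine;

// Sketch: many AudioCueSO assets reference the same shared
// AudioSettingsTemplateSO, so a handful of template assets control
// the playback settings for the whole game from one place.
[CreateAssetMenu(menuName = "Audio/AudioSettingsTemplateSO")]
public class AudioSettingsTemplateSO : ScriptableObject
{
    [Range(0f, 1f)] public float volume = 1f;
    [Range(-3f, 3f)] public float pitch = 1f;
    [Range(0f, 1f)] public float spatialBlend = 1f;  // 0 = 2D, 1 = 3D
}

[CreateAssetMenu(menuName = "Audio/AudioCueSO")]
public class AudioCueSO : ScriptableObject
{
    public AudioClip[] clips;                 // variations to pick from
    public AudioSettingsTemplateSO settings;  // shared, centralised settings
}
```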

    As the others suggested, I'd also recommend opening a PR and starting to expose what you're working on. People could comment faster, and potentially make contributions to your PR before it's merged.
     
    Peter77 likes this.
  24. cirocontinisio

    cirocontinisio

    Joined:
    Jun 20, 2016
    Posts:
    884
    sally98 and Amel-Unity like this.
  25. fran_m

    fran_m

    Joined:
    Jul 22, 2020
    Posts:
    40
    One question, from my ignorance of Unity.
    I have seen (not only in the audio thread) that when someone creates a manager (AudioManager, for example), the manager is always attached to a GameObject. Wouldn't it be possible to create the managers as plain singletons, and save the resources spent creating new GameObjects just to contain a manager?
    Thanks
     
  26. davejrodriguez

    davejrodriguez

    Joined:
    Feb 5, 2013
    Posts:
    69
    Ideally, we'd put the managers into a ScriptableObject and not use singletons at all. Then the managers would be available across scenes by asset reference rather than by singleton lookup.
     
  27. fran_m

    fran_m

    Joined:
    Jul 22, 2020
    Posts:
    40
    Ok I understand now for big projects.
     
    Last edited: Oct 29, 2020
  28. MansyrevAY

    MansyrevAY

    Joined:
    Nov 10, 2016
    Posts:
    14
    Hi guys! I'd like to jump on the train and contribute to the project. I think I'll read the comments in the PR and start from there. Hoping to get some work done in a few days.
     
  29. Neonage

    Neonage

    Joined:
    May 22, 2020
    Posts:
    271
    I don't know which one I should contribute to, as they all don't follow consistent design guidelines (of using ScriptableObjects) and feel kinda clumsy.
    I like @Peter77's system much more, but he hasn't made a PR.

    Should I make my own? I'm kinda lost :confused:
     
  30. cirocontinisio

    cirocontinisio

    Joined:
    Jun 20, 2016
    Posts:
    884
    Yes, I was also hoping he'd make a PR before I pull together all the ones we have, but I guess he's busy with other stuff (understandable).

    I'd say making yet another new one is not the best idea :D
    I will pull something in as soon as possible, maybe today (but I can't make any promises, it's Sunday after all :D)
     
  31. cirocontinisio

    cirocontinisio

    Joined:
    Jun 20, 2016
    Posts:
    884
    I'm doing a big refactoring and getting ready to pull a bit of code from all 3 PRs from elocutura, fqureshi and RKS13D regarding the AudioSystem, so please don't push new code or close them :)
    Or, worse, make a new AudioSystem!! :D


    @Peter77 I'm also integrating some of the stuff you've shown in the video!
     
  32. cirocontinisio

    cirocontinisio

    Joined:
    Jun 20, 2016
    Posts:
    884
    Hey all, I finally merged stuff on the AudioSystem. I took something from all PRs, a lot from elocutura's one, and a bit even from the solution that @Peter77 showed above. There's a bit from everyone! Great job.
    You'll find it merged on main if you want to explore it. We'll show it in subsequent livestreams too.

    But as much as I like the current solution, there's still a long way to go!

    We had a big discussion with the team this morning, and @ChemaDmk convinced me that just having pooled AudioSources is not enough. They are good for one-off effects, like in a game with physics where collisions make noise (which we obviously don't have), but there is also a point in having AudioSources already placed in the scene - for instance for ambient sounds. There's no point in requesting those from a pool: we know where they are, we know how many there are in advance, and we know we need them.

    The other point is that we will also have Timelines. How do we deal with those? They will contain AudioTracks, which only tie to AudioSources (track binding). And frankly, I don't want to write a custom track just to be able to interface the Timelines with the AudioManager; we'd lose a lot of functionality.
    So I'm inclined to think that for those, we'll just route the Timeline sound through a series of prepared AudioSources, which will route through an appropriate AudioMixer for volume and other effects.

    And for the volume(s), our AudioManager will control that but also set it on the AudioMixer, so dynamic sounds (i.e. the pooled ones), ambient ones (the ones preplaced in the scene) and Timeline sounds will all have the same volume.
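Setting the volume on the AudioMixer so that all routed sounds (pooled, preplaced, and Timeline) share it could look like this. Note the exposed parameter name "MusicVolume" is an assumption: parameters have to be exposed manually in the AudioMixer window before SetFloat can address them.

```csharp
using UnityEngine;
using UnityEngine.Audio;

// Sketch: the settings UI hands the manager a 0..1 volume; the manager
// converts it to decibels and sets it on an exposed mixer parameter,
// so everything routed through that mixer group gets the same volume.
public class MixerVolumeController : MonoBehaviour
{
    [SerializeField] private AudioMixer mixer;

    public void SetMusicVolume(float normalizedVolume)
    {
        // Mixer attenuation is in dB; convert from the linear 0..1 range.
        float dB = normalizedVolume > 0.0001f
            ? Mathf.Log10(normalizedVolume) * 20f
            : -80f;  // effectively silent
        mixer.SetFloat("MusicVolume", dB);  // "MusicVolume" must be exposed
    }
}
```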

    Makes sense?
     
  33. Neonage

    Neonage

    Joined:
    May 22, 2020
    Posts:
    271
    I'm super confused by AudioCueEventSO - what is this thing, how do we use it and why?
     
  34. cirocontinisio

    cirocontinisio

    Joined:
    Jun 20, 2016
    Posts:
    884
    Sorry for the late reply.

    This has been renamed to AudioCueEventChannelSO. The idea is that, like all the other event SOs (which I renamed to "event channels"), this particular SO class acts as a channel on which AudioCue* MonoBehaviours can request an SFX or music to be played.

    The AudioManager in the other scene would be listening to this channel SO, picking up the request, and would play the sound (or multiple sounds, since AudioCues can be made of many).

    * I'm thinking of renaming this to AudioCuePlayer or AudioCueRequest for more clarity, since it's not the data but the script that fires the request to the manager.
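The channel pattern described above could be sketched like this (the signature and names are assumptions based on this post, not the exact project code): the requester raises an event on the SO asset, and the AudioManager, living in a different scene, subscribes to that same asset.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Minimal stand-in for the project's AudioCue data asset.
public class AudioCueSO : ScriptableObject
{
    public AudioClip[] clips;
}

// Sketch of an event channel: requester and listener only share a
// reference to this asset, so they stay fully decoupled.
[CreateAssetMenu(menuName = "Events/AudioCueEventChannelSO")]
public class AudioCueEventChannelSO : ScriptableObject
{
    public event UnityAction<AudioCueSO, Vector3> OnAudioCueRequested;

    public void RaisePlayEvent(AudioCueSO audioCue, Vector3 position)
    {
        // Null-conditional invoke: nothing breaks if no AudioManager
        // happens to be listening yet.
        OnAudioCueRequested?.Invoke(audioCue, position);
    }
}
```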
     
  35. cirocontinisio

    cirocontinisio

    Joined:
    Jun 20, 2016
    Posts:
    884
    Continuing the conversation with @davejrodriguez in the Object Pooling thread....

    To clarify why I'm thinking that all audio requests should go through the AudioManager class (through the event-based mechanism explained above), here is a list of all the things the AudioManager should be able to do (and I might be forgetting something):
    • Play pooled one-off sounds "on request"
      • Jump SFX, Land SFX. The request comes from an AudioCue on the player GameObject.
      • UI sounds (2D)
    • Play pooled looped sounds "on request"
      • Ambient loops, like a fire crackling
    • Play, fade out and cross-fade music
      • Ambient music (2D)
      • Entering a new scene: new music
    • Be able to ignore "play sound" requests if the sound is too far away
      • Mostly for one-off requests, like an NPC or enemy making a sound far away
    • Completely turn off looping sounds that are too far away (?)
      • A fire burning in the distance, once the camera exits the falloff range
    • Be able to stop all sounds, to resume them later
      • When player pauses the game
      • When a cutscene starts
      • When game goes back to main menu
      • When player enters a different scene
    • Be able to apply volume settings to all SFX and music
      • To one-offs
      • To ambient loops
      • To Timeline-produced sounds
    • Be able to apply a filter to all SFX and music (like Ducking). Most probably done directly on the AudioMixer (so all sounds need to be passing through it).
      • When player opens inventory or is in a dialogue
    So to answer your question in the other thread: yes, we could think of a scenario where sounds have no connection to the manager, and they just play on their own. This way they would have access to a pool of SoundEmitters and request to play a sound independently.
    The problem I see with this is, for instance, sounds that need to be ignored because they are too far away. The AudioCue shouldn't know where the camera is, but the AudioManager might? (I need to figure out how)

    On the other hand, as @ChemaDmk pointed out in a conversation we had, there is also a point in configuring a looping ambient sound in the scene, "the old way", by just tweaking an AudioSource. I'm still figuring out how to bridge that too. My current thinking is that when you do that, you also put a SoundEmitter script on it which acts as the controller. The AudioManager would find that script and put it in a list of "currently active sounds" (not in the pool, eh! another list), to be able to stop/resume/fade/etc.
     
  36. davejrodriguez

    davejrodriguez

    Joined:
    Feb 5, 2013
    Posts:
    69
    At first glance, I see nothing in that list that couldn't be done local to the requester. Whether it should is another question. I'll try to put together some examples to see if they work well.

    Doesn't the engine already do this via 3D sound settings + priority on the audio source? I know a virtual voice is different than not playing the sound at all, but do we really need that? Is the performance difference that much? We could tighten up the max distance and reduce the max virtual voices allowed in the audio settings if we're trying to save processing.

    Trying to recreate this functionality by hand seems like a trap. What if the player moves to within range of an audio source in the middle of when it was supposed to be playing? Now it's not at the correct playback time. Whereas the engine just keeps the playback going and switches between real and virtual as needed.

    Yeah, I think there are many ways to go about this. What you propose sounds fine. If it were me, I would have a MusicPlayer component that just responded to events to do all the playback functions itself. But at this point, I'm just a decentralization evangelist lol.
     
    cirocontinisio likes this.
  37. cirocontinisio

    cirocontinisio

    Joined:
    Jun 20, 2016
    Posts:
    884
    Yeah, totally. Also, this is not an open-world game where we'll have dozens of AudioSources playing in the distance... I think scenes shouldn't be bigger than something like 200x200 meters (at least the playable area). So yeah, I agree with you. Let's not reinvent the wheel :)

    By the way, I've done a lot of work and refactoring on the AudioManager. Check it out on main; there's an example scene under /Scenes/Examples/.
    The fact that it should keep track of what's playing and what's not is still not there. I was thinking about some more implications, like: should we track 3D sounds (i.e. action in the scene) in a different list than 2D sounds from the UI? This way, when you pause, the AudioManager could pause all gameplay sound, but you could still hear UI sounds.

    In reality, this poses yet another question: when you pause, does the world freeze? With enemies stopping in their track, the player stopping mid-jump, particles freezing in the air, and sounds pausing, ready to be resumed? If that is the case, and I think it is, we should start thinking about this and incorporate it in all systems we make.
     
    davejrodriguez likes this.
  38. davejrodriguez

    davejrodriguez

    Joined:
    Feb 5, 2013
    Posts:
    69
    I agree. In a purely singleplayer game such as this, I'd expect everything (except perhaps music) to pause. Did you have any ideas on how to integrate pausing into systems? Time.timeScale works really well to pause just about everything other than audio, but I think it would interfere with UI animations and probably other things in menus. Could be event based or an IPausable interface...Sounds like a new codeck card coming on Tuesday? :D
     
    cirocontinisio likes this.
  39. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,589
    Unity supports this already.

    AudioSources that should continue playing when audio is paused must set ignoreListenerPause = true. If you want to pause audio, you set AudioListener.pause = true. This pauses all sounds except those that are set to ignoreListenerPause = true.

    You don't need to keep a list of sounds and call Pause() on each of them if you just need the functionality described above.

    Please implement it this way, because it covers another case that real-world games must handle, and it's a great opportunity for Unity Technologies to learn that Time.timeScale = 0 might not be a great way to pause a game.
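The approach above uses only built-in Unity APIs, roughly like this (a sketch; the component and field names are illustrative):

```csharp
using UnityEngine;

// Sketch of the built-in pause approach: AudioListener.pause pauses
// every AudioSource globally, except sources flagged with
// ignoreListenerPause (e.g. UI and menu sounds).
public class AudioPauseExample : MonoBehaviour
{
    [SerializeField] private AudioSource uiSource;  // should keep playing

    private void Awake()
    {
        // This source ignores the global pause toggled below.
        uiSource.ignoreListenerPause = true;
    }

    public void SetPaused(bool paused)
    {
        // Pauses/resumes all other sounds in one call; no need to keep
        // a list of playing sounds and pause them individually.
        AudioListener.pause = paused;
    }
}
```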
     
    davejrodriguez likes this.
  40. davejrodriguez

    davejrodriguez

    Joined:
    Feb 5, 2013
    Posts:
    69
  41. cirocontinisio

    cirocontinisio

    Joined:
    Jun 20, 2016
    Posts:
    884
    I don't think we should use Animators for the UI anyway, it's not really good practice (for performance, and for other reasons). We're going to use the tweening library DOTween. So we shouldn't have a problem in that sense.

    So given that... maybe
    Time.timeScale = 0
    should... work?

    It's a great suggestion, Peter. I wonder if we need more functionality around it, but maybe I'm overthinking.
     
    davejrodriguez likes this.
  42. Botaemic

    Botaemic

    Joined:
    May 6, 2019
    Posts:
    3
    What about playing a sound with variable length? The best way I can explain it is, for instance, a missile (or different lengths of cooking). A missile could hit in 1 second or 10; same with a recipe. A missile would start with a launching sound, then continue with a looped engine sound until it explodes or runs out of fuel.

    Shouldn't there be some way to control the SoundEmitter from the missile object then (through the AudioManager)? A way to say: stop looping the engine sound clip.
     
  43. cirocontinisio

    cirocontinisio

    Joined:
    Jun 20, 2016
    Posts:
    884
    Ideally yes, and I think that could be done easily. But I wouldn't do it until we really need it.
    What would be the use case? We don't have missiles :) and for the cooking, I'd stick to a simple model where it's a short, unified loop (i.e. 1-2 seconds) + animation. Also because once you've cooked many times it will become boring, so you want it to be as short as possible.
     
  44. Smurjo

    Smurjo

    Joined:
    Dec 25, 2019
    Posts:
    296
    I think the background music would also be looping (possibly over several clips) as long as the player doesn't do anything special. But then the background music needs to be stopped, or even faded out, e.g. for a cutscene, since the cutscene would have its own audio. After the cutscene, the same or a different background music needs to be started.
     
  45. Neonage

    Neonage

    Joined:
    May 22, 2020
    Posts:
    271
    I still don't get the idea behind this "Event Channel"; it looks like an extra complexity layer for every event system.
    What problem is it trying to solve? Why does it use C# events? Why can't we directly call an abstract SO's event methods, like "Play"?
    Was there a thread discussing this events workflow before?
     
  46. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,589
    I'm interested in this as well, @cirocontinisio can you enlighten us?
     
  47. cirocontinisio

    cirocontinisio

    Joined:
    Jun 20, 2016
    Posts:
    884
    We first mentioned these SO-based events in the second livestream, here. There is also a wiki page about them.
    At that time we were just calling them Events; we have since renamed the class to "EventChannel", but the concept is the same. The wiki is updated.

    As for what problem it solves: mostly decoupling the AudioManager (or any event listener) from the object that fires the event.
     
    Peter77 likes this.
  48. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,589
    Hey @cirocontinisio , I just watched your Audio System segment of Episode 4. Thank you for the shout-out :)

    This is pretty much what I had in mind, but there are a few things that you could improve in my opinion.

    For example, move the AudioConfigurationSO from the AudioCue to the AudioCueSO.

    The reason why you want this in the AudioCueSO is that the entire sound setup is then done in a single place: the AudioCueSO asset. The advantage of having all sound settings for a particular "sound definition" in one place is that it's much easier for a sound designer to tweak those sounds.

    Otherwise you have three places where the configuration of a single sound takes place:
    • AudioCueSO to define what sounds should be played
    • AudioConfigurationSO to define how something is played
    • AudioCue where the AudioCueSO and AudioConfigurationSO are glued together
    ... and the AudioCue then sits on prefabs in the project and on GameObjects in various scenes. That doesn't scale when the project gets bigger.

    Having the AudioCueSO and AudioConfigurationSO separation also leaves more room for errors. It means the same AudioCueSO could be used with different configurations. While this might seem powerful, it's something you usually don't want/need.

    The AudioConfigurationSO alone is most likely not sufficient to tweak sounds. You want to allow the sound designer to override settings from the AudioConfigurationSO per AudioCueSO. For example, if you have a "3D Audio" AudioConfigurationSO, you still want to allow overriding the volume per AudioCueSO, or even more likely per AudioClip.

    The other option would be that the mix is already perfect in the .wav files, but you lose a lot of iteration speed if you need to change the playback volume in the .wav, then export and re-import it into Unity. It's faster to be able to make tweaks in Unity while the game is playing.

    I would also move the AudioCueEventChannelSO into the AudioCueSO, because if you design an AudioCueSO as a sound effect (PlaySFX_Channel), you want to use it on this channel everywhere. You don't want to suddenly play it on a different channel, e.g. as music; that wouldn't make much sense. This makes using sounds a lot easier to handle, because the AudioCueSO then contains all the relevant information that is needed.

    PS: The whole "being able to override some settings" template feature you'd basically get for free when using prefabs rather than SOs, as explained in the video I posted earlier.
     
    Last edited: Dec 13, 2020
    Neonage and davejrodriguez like this.
  49. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,589
    @cirocontinisio Another question that comes to mind: how do you plan to manipulate playing AudioCues in that design?

    I can't think of a game where I didn't need to change the volume or spatialBlend of a playing sound, or its position.

    AudioManager.PlayAudioCue seems to exist to play sounds, but it doesn't return a "handle" that could be used to manipulate the sounds it started.
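The "handle" idea could be sketched like this (all type and member names here are hypothetical illustrations of what PlayAudioCue could return, not the project's actual API): the caller keeps the key and uses it later to manipulate or stop the sound.

```csharp
using UnityEngine;

// Sketch of a handle type: an opaque key identifying one playing cue.
public struct AudioCueKey
{
    public readonly int Value;
    public AudioCueKey(int value) { Value = value; }
}

// Sketch of a manager whose play method returns a handle, so playing
// sounds can still be manipulated after they start.
public class HandleBasedAudioManager : MonoBehaviour
{
    private int _nextKey;

    public AudioCueKey PlayAudioCue(AudioClip clip, Vector3 position)
    {
        // ...start the sound on a pooled emitter, remember it by key...
        return new AudioCueKey(_nextKey++);
    }

    public void SetVolume(AudioCueKey key, float volume)
    {
        // ...look up the emitter for this key and change its volume...
    }

    public void Stop(AudioCueKey key)
    {
        // ...stop the sound and return the emitter to the pool...
    }
}
```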
     
    davejrodriguez likes this.
  50. davejrodriguez

    davejrodriguez

    Joined:
    Feb 5, 2013
    Posts:
    69
    This is along the lines of a question I posed in the Generic Object Pool thread and never got around to asking here. At some point, we'll be fighting the fact that we're not giving the requester access to the requested object.