
AudioStream - {local|remote media} → AudioSource|AudioMixer → {outputs}

Discussion in 'Assets and Asset Store' started by r618, Jun 19, 2016.

  1. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    sorry about that hah / yeah, it's wrong in the .txt in a couple of places!
     
  2. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    a few fixes && updates submitted

    3.2.2 072023 200k+
    Updates/fixes:
    - AudioStreamBase: more resilient on low network bandwidth and partial network outages
    - will now wait in Read, in a blocking fashion, until the read request from the network is satisfied
    - it will retry blocking reads up to 'starvingRetryCount' times with a 100 ms timeout each when audio data is missing (see the sketch after these notes)

    : all locally played media files are now accessed/read via UnityWebRequest instead of direct .NET file I/O
    - this means that local files should now be playable from Android app archives and asset bundles
    - note that the content of the media is read into memory
    - AudioStream: Unity AudioClip channels should now be fixed / follow the network stream format
    - AudioStreamInput*: clarified usage of 'loopback' interfaces, applicable only on Windows

    : fixed ASIO related bug in the Editor

    - tested with FMOD 2.02.15

    * this update touched only scripts in the 'Scripts' and 'Demo' folders, but since new scripts were added it is probably safer to delete these two asset folders before updating
    / if that fails, please delete the 'AudioStream' folder manually via the filesystem first
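    For illustration, a minimal sketch of what such a starvation/retry loop can look like (hypothetical names and class, not the asset's actual code):

    // hypothetical sketch of a blocking read with 'starvingRetryCount' retries of 100 ms each
    using System.Threading;
    using System.Collections.Generic;

    public class NetworkAudioBuffer
    {
        readonly object bufferLock = new object();
        readonly Queue<byte> buffer = new Queue<byte>();

        public int starvingRetryCount = 10;     // how many times to retry when no data has arrived
        const int retryTimeoutMs = 100;         // wait per retry

        // blocks until 'count' bytes are available or the retries are exhausted
        public int Read(byte[] destination, int count)
        {
            var retries = 0;
            while (retries < this.starvingRetryCount)
            {
                lock (this.bufferLock)
                {
                    if (this.buffer.Count >= count)
                    {
                        for (var i = 0; i < count; ++i)
                            destination[i] = this.buffer.Dequeue();
                        return count;
                    }
                }

                Thread.Sleep(retryTimeoutMs);   // not enough data yet - wait and try again
                retries++;
            }

            return 0;                           // starved - the caller can treat this as temporary silence
        }
    }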
     
  3. sctiendat

    sctiendat

    Joined:
    Mar 14, 2020
    Posts:
    22
    Hello @r618 ,
    Before buying your asset, I want to make sure it can handle my requirements. I need to handle ASR and TTS over the network, and I have two tasks:
    1. I receive multiple byte arrays continuously from the network over a period of time. I want to start streaming audio as soon as the first byte array arrives, and play all the data I receive smoothly.
    2. I need to use the Microphone to capture audio data and send it continuously over the network with as little latency as possible.
    Thanks in advance, and if the asset can help, please tell me where I can find a solution, because I saw too many features in your asset. Thank you!
     
  4. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    Hello, so for the microphone part (2.):
    you stack AudioStreamInput / AudioStreamInput2D and AudioStreamNetMQSource on a GameObject and it will expose the mic/input audio for clients to connect to
    I intend to add a separate demo scene for this in the next update, but it's basically the above
    * note that NetMQ doesn't scale to the internet

    audio is encoded/decoded using the OPUS codec when transported over the network - you need the AudioStreamNetMQSource/AudioStreamNetMQClient components for this, arbitrary buffers won't work
    if you're talking about raw PCM data, that's not something present in the asset, since you'd basically just convert it to a float[] audio buffer and play it directly - if that's what you mean
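    (for reference, converting e.g. interleaved signed 16-bit little-endian PCM bytes to Unity's float samples is just a few lines - a generic sketch, not asset code:)

    // generic sketch: interleaved 16-bit signed little-endian PCM bytes -> floats in [-1, 1]
    public static float[] PCM16ToFloats(byte[] pcm)
    {
        var samples = new float[pcm.Length / 2];
        for (var i = 0; i < samples.Length; ++i)
        {
            var s = (short)(pcm[i * 2] | (pcm[i * 2 + 1] << 8));
            samples[i] = s / 32768f;
        }
        return samples;
    }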

    / as a sidenote, 'AudioStreamNetMQSource' currently has scaling issues (it's basically limited to a single component/GameObject in the scene); it should be fixed in the next update
     
    sctiendat likes this.
  5. sctiendat

    sctiendat

    Joined:
    Mar 14, 2020
    Posts:
    22
    Thanks for your response. Yeah, I'm talking about raw PCM data. I tried converting it to float[] and creating an AudioClip; it worked with the first byte array, but when I needed to add a new byte array, it didn't work. I think Unity's default audio system is not good at streaming audio in real time, so I'm looking for a solution that lets me stream raw PCM data in real time.
     
  6. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    the components won't work with arbitrary data, even if I add support for something like a 'PCM codec'
    with your own source of PCM audio you'd have to modify the NetworkSource component so it uses your audio data instead of audio from Unity
    (you would have to convert the data array and add/write it to the internal queue)
    / AudioStreamNetMQClient/s would then connect normally
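    roughly something like this - a minimal sketch of the queue idea in plain Unity scripting (a hypothetical component, not the asset's NetworkSource):

    using System.Collections.Generic;
    using UnityEngine;

    // minimal sketch: converted PCM floats are queued from the network callback
    // and drained on the audio thread, so playback continues across byte arrays
    [RequireComponent(typeof(AudioSource))]
    public class PCMQueuePlayer : MonoBehaviour
    {
        readonly Queue<float> samples = new Queue<float>();
        readonly object queueLock = new object();

        void Start()
        {
            // a playing AudioSource (even without a clip) keeps the filter callback running
            this.GetComponent<AudioSource>().Play();
        }

        // call this with already converted float samples whenever new network data arrives
        public void Enqueue(float[] converted)
        {
            lock (this.queueLock)
                foreach (var s in converted)
                    this.samples.Enqueue(s);
        }

        // Unity's audio thread pulls whatever is available; missing data becomes silence
        void OnAudioFilterRead(float[] data, int channels)
        {
            lock (this.queueLock)
                for (var i = 0; i < data.Length; ++i)
                    data[i] = this.samples.Count > 0 ? this.samples.Dequeue() : 0f;
        }
    }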
     
    sctiendat likes this.
  7. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    . that said, adding support for a ~'PCM codec' to AudioStreamNetMQSource/Client would probably make sense, since it would at least allow overcoming OPUS's current 2-channel limitation
    - multichannel data could be transported this way directly from Unity if the bandwidth is sufficient
     
  8. heavisideresearch

    heavisideresearch

    Joined:
    Jul 21, 2023
    Posts:
    1
    I'm planning on having 8 virtual listeners in a scene and would like to route their audio to 8 separate channel pairs (i.e. listener 1 goes to channels 1+2, listener 2 goes to channels 3+4, etc.). This is for an audio installation where there will be 8 pairs of headphones with head tracking that will be controlling the listener location in the virtual scene.

    Does AudioStream support this kind of setup?
     
  9. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    people ask this sometimes, but no, the asset doesn't support multiple listeners

    a more plausible approach would be a mix matrix updated based on a listener's position in the scene, but without further development this would work only for a single listener, so yeah, a long shot unfortunately
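    conceptually it would be something along these lines - a per-source gain ("mix") matrix applied via Unity scripting; how many output channels are actually available depends on the project's speaker mode / output device (illustrative sketch only, not part of the asset):

    using UnityEngine;

    // conceptual sketch: scale this AudioSource's interleaved output per channel,
    // e.g. to favour one output channel pair per tracked 'listener'
    [RequireComponent(typeof(AudioSource))]
    public class MixMatrixFilter : MonoBehaviour
    {
        // gains[outputChannel] - update these from the tracked listener position
        public float[] gains = { 1f, 1f };

        void OnAudioFilterRead(float[] data, int channels)
        {
            for (var i = 0; i < data.Length; i += channels)
                for (var ch = 0; ch < channels && ch < this.gains.Length; ++ch)
                    data[i + ch] *= this.gains[ch];
        }
    }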
     
  10. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    3.2.3 update should be submitted

    3.2.3 082023 Bridge#2

    Updates/fixes:

    - AudioStreamNetworkSource && AudioStreamNetworkClient :
    - cleaned up and updated
    - removed Unity Events for now; they weren't used and properly implementing them is left for later
    - added PCM 'codec' option which transports the audio buffer without any compression (see docs for more)

    - AudioStreamNetMQSource : fixed outstanding scaling issues - multiple sources in a scene should behave properly
    - NetMQ.Unity surfaced an older Unity bug in IL2CPP builds - the minimum supported Unity version is 2019.4.5, which fixed it

    - AudioStreamNetMQSourceDemo : added default microphone to the network source
    - the scene's AudioListener is captured

    - AudioStreamInput2D and AudioStreamInput_iOS_ExternalDevices:
    : the Speex resampler was removed and replaced by simple linear interpolation (see the sketch after these notes)
    - the custom resampler is suitable for all audio and is recommended, although Unity resampling is still the default

    - removed Box/Sphere colliders (which got there, again) from demo scenes
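    For reference, linear-interpolation resampling of a single channel boils down to something like this (a generic sketch, not the asset's actual resampler):

    // generic sketch: resample one channel of audio using linear interpolation
    public static float[] ResampleLinear(float[] input, int inputRate, int outputRate)
    {
        var outputLength = (int)((long)input.Length * outputRate / inputRate);
        var output = new float[outputLength];
        var ratio = (double)inputRate / outputRate;

        for (var i = 0; i < outputLength; ++i)
        {
            var srcPos = i * ratio;                                     // position in the input signal
            var i0 = (int)srcPos;
            var i1 = System.Math.Min(i0 + 1, input.Length - 1);
            var frac = (float)(srcPos - i0);
            output[i] = input[i0] + (input[i1] - input[i0]) * frac;     // interpolate between neighbours
        }
        return output;
    }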
     
  11. Careprod_DevTeam

    Careprod_DevTeam

    Joined:
    Dec 29, 2020
    Posts:
    7
    Hello, I have a question about iOS. Can I use 2 AudioStreamInput_iOS_ExternalDevices to get sound from 2 different input devices simultaneously?
    I tried with 2 AudioStreamInput2D in a Windows project and it works, but it doesn't seem to work on iOS - the first input device doesn't get sound when I select the second one (I fear it's limited by the OS)
    Thanks
     
  12. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    Hi, thanks for the question, but unfortunately you're right, at least with the current implementation
    (on the OS level it's configured via an AVAudioSession instance which, for all intents and purposes, is a singleton, so only the last route/preferred input is activated)
    I'm not sure how demanding it would be to implement multiple active input routes, but I suspect the answer is 'very', if it's possible at all right now
     
  13. Adrien_CareprodTech

    Adrien_CareprodTech

    Joined:
    Aug 22, 2023
    Posts:
    8
    Thanks for your reply! I will use only one input for iOS then.
    I'm curious about using different devices for input and output. I ran a few tests and my current conclusions are:
    - changing the input also changes the output (is this the default?)
    - when I check availableOutputs, I only get "Core Audio output"
    - if I select the iPad microphone, the output is still my headphones (linked to the "Prepare iOS to record" option?)
    - even if I mute the input AudioSource I still hear some echo

    Please correct me if I'm making mistakes :)
     
  14. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    input and output are linked by the OS automatically; it can be influenced to a degree - in Unity via 'Force iOS Speakers when Recording' in the iOS player settings, which might be working now, too
    see also https://github.com/r618/AudioStream...691e292271ed/Documentation060_mobiles.txt#L25

    for more atypical inputs such as Bluetooth the plugin needs to activate a recording session as well - see https://developer.apple.com/documen...ession/categoryoptions/1616518-allowbluetooth
    (resp. https://developer.apple.com/documen...on/categoryoptions/1771735-allowbluetootha2dp for playback)

    you should also be able to see how this behaves here - https://github.com/r618/AVAudioSessionPodTest
    (I'm not sure how out of date the UI dependencies might be there, though)

    I'm not sure about the echo circumstances; I suspect it might be either some misconfiguration, or possibly recording from an input with more than 1 component at the same time (?)
    in any case let me know if it persists and I'll have a closer look
     
  15. Adrien_CareprodTech

    Adrien_CareprodTech

    Joined:
    Aug 22, 2023
    Posts:
    8

    "Force iOS Speakers when Recording" is on but I still don't get the output device list on my iPad. The list only contains "Core Audio output".
    I tested with bluetooth headphones as an input without changing anything and it works the same way as wired headphones.

    Unfortunately, I can't provide you with more details, I can only run my builds on an iPad after a CICD pipeline auto build and I'm not familiar with iOS environments.
     
  16. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    yes, that's the output on the device; you have to enable external inputs support - and be running the 'AudioStreamInput_iOS_ExternalDevicesDemo' scene - for external device/s to be included in the list:

    https://github.com/r618/AudioStream...ed/Documentation060_mobiles.txt#L32C27-L32C27
     
  17. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    eh ok, I partly misremembered -
    the #define AUDIOSTREAM_IOS_DEVICES is applicable only when playing back a stream/AudioSource to e.g. a connected external output
    - it's an outlier case for iOS outputs added later and it should probably be renamed to something more descriptive....

    running 'AudioStreamInput_iOS_ExternalDevices' alone is enough to get all connected iOS inputs
     
  18. denischernitsyn

    denischernitsyn

    Joined:
    Apr 24, 2013
    Posts:
    15
    Hello. We are using AudioStream version 3.2.3, Unity version 2022.3.6.
    Target platforms are iOS and Android.
    We've run into an exception at the end of mp3 streaming:


    ArgumentException: Reading would overrun buffer
    System.IO.FileStream.Read (System.Byte[] array, System.Int32 offset, System.Int32 count) (at <f78378fc25024256ba9c49776f5e810c>:0)
    AudioStream.DownloadFileSystemCachedFile.Read (System.UInt32 offset, System.UInt32 toread, System.UInt32 mediaLength) (at Assets/AudioStream/Scripts/AudioStream/DownloadFileSystemCachedFile.cs:90)
    AudioStream.AudioStreamBase.Media_AsyncRead (System.IntPtr infoptr, System.IntPtr userdata) (at Assets/AudioStream/Scripts/AudioStream/AudioStreamBase_FS.cs:137)
    (wrapper native-to-managed) AudioStream.AudioStreamBase.Media_AsyncRead(intptr,intptr)
    UnityEngine.<>c:<RegisterUECatcher>b__0_0(Object, UnhandledExceptionEventArgs) (at /Users/bokken/build/output/unity/unity/Runtime/Export/Scripting/UnhandledExceptionHandler.bindings.cs:46)


    The settings that we've set are the following:
    • downloadToCache
    • StreamType : auto || mpeg
    Everything else is set as default. Below you can see some screenshots
    [screenshots attached: image (7).png, image (8).png]
    Could you tell us what we are doing wrong?
     
  19. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    Hi! Is it possible to send/PM the link/media? Thanks!
     
  20. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    just for the record; UT added
    [screenshot of the new iOS player setting]
    to the iOS player settings (not sure when exactly)
    use it if you want only the default in/output on the device (and if it works in general)
    / I'll have to test everything more properly though
     
  21. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    @denischernitsyn Please change line 90 in DownloadFileSystemCachedFile.cs
    this.fileStream.Read(result, 0, (int)toread);

    to
    this.fileStream.Read(result, 0, (int)result_size);


    Sorry! hah, thank you for reporting and I hope this helps
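    (For context: FileStream.Read throws 'Reading would overrun buffer' when asked for more bytes than the destination array holds, which happens near the end of the file - an assumed sketch of the surrounding code, based only on the method signature in the stack trace above:)

    // assumed shape, based on DownloadFileSystemCachedFile.Read(uint offset, uint toread, uint mediaLength)
    var result_size = System.Math.Min(toread, mediaLength - offset);    // clamp to the remaining bytes
    var result = new byte[result_size];
    // reading 'toread' bytes here would overrun 'result' near the end of the file
    this.fileStream.Read(result, 0, (int)result_size);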
     
  22. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    an update with the above bug fixed has been submitted
    also added ASIO support to the mixer plugin

    3.2.4 092023 1Kilo
    Updates/fixes:
    - tested with latest FMOD 2.02.17
    - fixed end-of-file bug when streaming/playing remote files
    - added ASIO support to the mixer effect output plugin - needs manual setup & care, please refer to the documentation for more
     
  23. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    / it should now also be possible to add recording from an/any input to the mixer effect/s directly (incl. the ASIO input) - i.e. without the need to go through a scene AudioSource/AudioStreamInput*
    we'll see how it goes
     
  24. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    basic Audio Mixer input effect from a device is working
    [screenshots of the mixer setup attached: m1.png, m2.png]

    but there's much more to do to get it right, e.g.:
    - the mixer has to be made active - it's enough to output silenced audio into the group (see the sketch below) - in order for Unity to invoke the processing callback, and it e.g. remains active as long as there's _any_ audio on the group (from the mic) - even after exiting play mode;
    - things like bypass will stop effect processing, so the input lags behind when enabled again
    I'm thinking about adding either an automatic or a manual toggle to restart recording in these cases
    - ASIO is exclusive and needs to share resources with e.g. the 'OutputDevice' effect, too

    but in general it works, incl. stuff like automatic samplerate + input channel conversion, so it always correctly follows the Unity mixer format/output..
    e.g. in the screenshots above the input is a 16 kHz, 4-channel Kinect 2 mic array
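    The 'silenced audio into the group' part can be as simple as looping a clip of silence through it (a sketch using plain Unity API, component name is mine):

    using UnityEngine;
    using UnityEngine.Audio;

    // sketch: keep a mixer group processing (so its effects keep running)
    // by looping a clip of silence through it
    public class KeepMixerGroupActive : MonoBehaviour
    {
        public AudioMixerGroup targetGroup;     // the group hosting the recording effect

        void Start()
        {
            var silence = new float[AudioSettings.outputSampleRate];    // one second of zeros
            var clip = AudioClip.Create("silence", silence.Length, 1, AudioSettings.outputSampleRate, false);
            clip.SetData(silence, 0);

            var source = this.gameObject.AddComponent<AudioSource>();
            source.outputAudioMixerGroup = this.targetGroup;
            source.clip = clip;
            source.loop = true;
            source.Play();
        }
    }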
     
    Last edited: Sep 26, 2023
  25. yangriyi98

    yangriyi98

    Joined:
    Sep 7, 2023
    Posts:
    1
    Hello, I have a question about iOS. After importing the AudioStream Asset (even if I don't use it in a component), the iOS app freezes when it is switched to the background and then back to the foreground. e.g. all buttons on the interface become unresponsive. I'd like to ask if you have any suggestions for this issue?
     
  26. vice39

    vice39

    Joined:
    Nov 11, 2016
    Posts:
    108
    Is it possible to use this asset to output audio to different channels?

    I'm developing a PC app and I need to output 4 different audio streams to 4 directional speakers. Let's say I have hardware connected that supports that - can I output different audio to each channel?

    maybe with something like this?
    https://focusrite.com/products/scarlett-4i4
     
    Last edited: Oct 9, 2023
  27. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    if the streams are supposed to be independent, then

    - the easiest way is to just prepare multi-channel (4-channel) audio clips and play them normally via an AudioSource
    - or use the asset's MediaSourceOutputDevice component (see Output devices / the MediaSourceOutputDeviceDemo scene in the demo app); if the HW interface properly registers its output channels in the system, you can then see and play on them individually or in combination
    note that MediaSourceOutputDevice is usable/accessible only via scripting and doesn't use Unity audio

    [ AudioSourceOutputChannelsDemo takes over the output / it is the same for all AudioSources played on it
    , multiple AudioSources each with its own output channel (combination) are not currently supported.

    it is possible to process an AudioClip/Source (just) via Unity scripting and e.g. add the necessary zeroed-out channels at runtime (see the sketch below),
    this should be doable by using pieces from AudioSourceChannelsSeparationDemo/AudioClipChannelsSeparationDemo
    this is basically just Unity scripting, so it was never implemented since the focus was elsewhere, so to speak
    I might reconsider in the future, but all this will take is the above scripting and another demo scene ]

    if the streams are supposed to play spatialized on multiple speakers, then just use the Unity built-in spatializer or the asset's Resonance component, which should properly distribute the audio using defaults

    for completeness I'll also mention that there's an internal FMOD limit of 32 channels per output, so HW channels above that (IIRC) are not accessible by the asset
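    The 'zeroed-out channels' idea from above in plain Unity scripting could look roughly like this (an illustrative sketch; whether all 4 channels actually reach the hardware still depends on the configured speaker mode / output):

    using UnityEngine;

    // sketch: place a mono clip's samples on one channel of a new 4-channel clip,
    // the remaining channels stay silent (zeroed)
    public static class ChannelRouting
    {
        public static AudioClip MonoToFourChannels(AudioClip mono, int targetChannel)
        {
            var monoSamples = new float[mono.samples];
            mono.GetData(monoSamples, 0);

            const int channels = 4;
            var interleaved = new float[mono.samples * channels];       // all zeros by default
            for (var i = 0; i < mono.samples; ++i)
                interleaved[i * channels + targetChannel] = monoSamples[i];

            var clip = AudioClip.Create(mono.name + "_4ch", mono.samples, channels, mono.frequency, false);
            clip.SetData(interleaved, 0);
            return clip;
        }
    }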
     
  28. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    replied via email, for posterity:

    It looks like Unity background processing broke on iOS in one of the recent versions (I confirmed this on 2022.3.10),
    it is enough to comment out
    UnityBatchPlayerLoop();
    in AudioStreamAppController.mm (or remove the `iceTimer` completely)
    it’s not used for anything else.

    I guess this should be fixed by Unity so I will leave the call in the asset for now - unless they would want to remove batch updates completely, but I don’t think this is the case
     
  29. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    * apart from background processing, of course (…)

    if you rely on updating the player loop from the background on iOS this is currently simply broken (and it is the only way) - so I suggest filing a bug report if possible/needed
     
  30. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    AudioMixer recording + Resonance fixes + URL resolving submitted

    3.2.5 102023 -300kX-
    Updates/fixes:
    - tested with latest FMOD 2.02.18
    AudioStreamBase: added 'attemptToResolveUrlRedirection' option to resolve URL redirects if needed, based on System.Net.HttpWebRequest (see the sketch after these notes)
    : fixed a rarer network playback shutdown case

    Audio Mixer: new 'AudioStream InputDevice' (and 'AudioStream ASIO InputDevice') effect/s which allow recording directly into an AudioMixer group
    : note: currently Windows only
    : for details please see Documentation010_recording_and_inputs.txt

    Resonance Source & Soundfield : fixed marshaling/performance for some run cases

    Demo scenes:
    - moved all Unity AudioMixer related demos to Demo\UnityMixer\
    - new InputDeviceUnityMixerDemo : AudioStreamInputDevice -> AudioMixer group
    - and InOutDeviceUnityMixerDemo : AudioStreamInputDevice -> AudioMixer group -> AudioStreamOutputDevice effect
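    (The redirect resolution presumably boils down to following the response URI - an illustrative System.Net sketch, not the asset's exact code:)

    using System.Net;

    // illustrative sketch: resolve a redirected URL via System.Net.HttpWebRequest
    public static class UrlRedirects
    {
        public static string Resolve(string url)
        {
            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Method = "HEAD";            // only the final location is needed (some servers may require GET instead)
            request.AllowAutoRedirect = true;   // let .NET follow the redirect chain

            using (var response = (HttpWebResponse)request.GetResponse())
                return response.ResponseUri.ToString();                 // the URL after all redirects
        }
    }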
     
  31. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,271
    video with the new AudioStreamOscSource/Client to demo low latency audio over a (local) network



    - similar latency can be achieved with the existing NetMQ based components, but I will probably remove NetMQ in the next update and leave only the OSC based components in the asset