
AudioStream - {local|remote media} → AudioSource|AudioMixer → {outputs}

Discussion in 'Assets and Asset Store' started by r618, Jun 19, 2016.

  1. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    it looks like the response contains a Content-Length header which doesn't actually correspond to the media -
    either they've changed the response, or UnityWebRequest changed how the request is handled, or something

    in any case it should be enough to just ignore it for now - you can e.g. comment out/delete the assignment to the contentLength member var in AudioStreamBase in the 'ReceiveContentLengthHeader' methods (it will then keep its default INFINITE_LENGTH at all times)

    will have to think about this, but it's most likely OK to just always ignore it, i.e. as in the above -
    lmk if you're able to play it afterwards, sorry for the inconvenience!
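    roughly, for illustration only - the method/member names below are the ones mentioned above, but the exact signature and parsing line are assumptions, not the asset's actual code:

    // AudioStreamBase.cs - sketch of the 'ReceiveContentLengthHeader' change
    void ReceiveContentLengthHeader(string headerValue)
    {
        // this.contentLength = long.Parse(headerValue);  // <- comment out / delete this assignment
        // contentLength then keeps its default INFINITE_LENGTH and the bogus header is ignored
    }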
     
  2. OscarCybernetic

    OscarCybernetic

    Joined:
    Sep 12, 2018
    Posts:
    23
    That seems to work perfectly! Thanks so much for your help and for the explanation; I was scratching my head for a while
     
    r618 likes this.
  3. unity_YuwBmWq8k8aHXQ

    unity_YuwBmWq8k8aHXQ

    Joined:
    Mar 26, 2018
    Posts:
    3
    I've been trying to build for Android but none of the demos seem to work on my Galaxy S10. They either crash instantly or just play back glitched audio. Any ideas?
     
  4. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    .. not sure! have you tried to install & run the demo apk linked on the asset's store page ?
    it might also be a permission issue - i think the Android build requires the microphone access permission to be present in the manifest -
    a crash log would be helpful though !
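    (not asset code - just the standard UnityEngine.Android API, should you want to check/request the mic permission at runtime as a sanity test:)

    using UnityEngine;
    using UnityEngine.Android;

    public class MicPermissionCheck : MonoBehaviour
    {
        void Start()
        {
            #if UNITY_ANDROID && !UNITY_EDITOR
            // RECORD_AUDIO still has to end up in the built manifest; this only prompts the user at runtime
            if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
                Permission.RequestUserPermission(Permission.Microphone);
            #endif
        }
    }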
     
  5. sinemayacreative

    sinemayacreative

    Joined:
    Oct 7, 2018
    Posts:
    4
    Hi Martin,

    I published a mobile meditation app built in Unity. There are many mp3 files, around 40-60 MB each, and I play them from a dedicated server.
    There is definitely a delay and I want to get rid of it. Is this plugin of yours the cure in this situation?

    2nd Q:
    I would like to offer a live stream feature where my users can click and listen to my podcast if I am live at that moment. Is this possible with your plugin embedded in my Unity project?

    This is an iOS and Android App scenario.

    Best Regards.
     
  6. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    Hi, depending on what you need exactly there are a few scenes in the demo app (you can test your existing link/s in each of them) - please download it from the links on the asset's store page -
    AudioStreamDemo will just play the content
    AudioStreamDownloadDemo will just download it w/o playback
    AudioStreamDownloadRealtimeDemo will save the content from the url while playing it
    you can set the components to retrieve file/s saved by the above from the local cache (based on url) on subsequent playbacks
    - files are stored uncompressed currently only though (so they're bigger than original mp3 files)

    those are all the relevant scenes/components i think

    For the 2nd question you're looking for an Icecast/Shoutcast server to distribute the content; once you have it running you can connect and play from it via e.g. the AudioStream component to listen online
    (note i'm not familiar with the whole Icecast/Shoutcast production setup, but i think there are existing ready-made solutions)
    - the radio link used in the demo is an example of an Icecast audio server

    ask if you have questions, hope this helps!
     
  7. sinemayacreative

    sinemayacreative

    Joined:
    Oct 7, 2018
    Posts:
    4

    Thank you. Regarding this:

    "- files are stored uncompressed currently only though (so they're bigger than original mp3 files)"
    how can we store files uncompressed?
     
  8. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    suppose you meant storing files compressed (e.g. in the original format) - this is unfortunately not currently possible since the result is played via a normal Unity AudioClip, which needs uncompressed audio only
    i could skip the AudioClip creation/playback and (most likely) play the streamed/cached data directly without Unity, but it's a bigger change and i will have to think about it
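    (for context - not asset code, just plain Unity API showing why the cache ends up uncompressed: runtime AudioClips only accept raw PCM floats; the sizes here are purely illustrative)

    using UnityEngine;

    public class UncompressedClipSizeExample : MonoBehaviour
    {
        void Start()
        {
            // a 60 s stereo 44.1 kHz clip needs ~21 MB of float samples once decoded
            int frequency = 44100, channels = 2, lengthSamples = frequency * 60;
            var pcm = new float[lengthSamples * channels];   // decompressed interleaved samples in -1..1
            var clip = AudioClip.Create("cached", lengthSamples, channels, frequency, false);
            clip.SetData(pcm, 0);                            // this uncompressed form is all AudioClip takes
        }
    }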
     
  9. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    you're right though in that uncompressed files are several times larger than compressed originals which might pose an issue on mobiles
     
  10. unity_YuwBmWq8k8aHXQ

    unity_YuwBmWq8k8aHXQ

    Joined:
    Mar 26, 2018
    Posts:
    3
    I managed to get the networking demo working, which is what I need. I'm swapping the AudioStream for AudioStreamInput. I'm able to stream my microphone on Windows to my phone and have it play back relatively normally. However, when trying to stream the microphone from Android to Windows, the playback is heavily distorted (deep voice, slow playback). I'm sure it has something to do with the frequency, but I'm not sure why it would function normally one way but not the other, and also not sure what values need to be adjusted, if any.
     
  11. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    do both AudioStreamInput and AudioStreamInput2D behave the same ?
    note AudioStreamInput2D has an additional option to turn Unity resampling on/off
    let me know your Unity version, I'll do some testing / though it won't be immediate /
     
  12. unity_YuwBmWq8k8aHXQ

    unity_YuwBmWq8k8aHXQ

    Joined:
    Mar 26, 2018
    Posts:
    3
    AudioStreamInput2D doesn't work at all from my phone, just cracks and pops, even if I play back directly on the phone. I'm on version 2020.3.19f1. Thanks!
     
  13. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    Hi! (sorry for the late reply, but a f***ing war broke out, who would have thought ! - I'm not directly affected, but being in a neighbouring country it took (and is still taking) its toll nevertheless - so I would like anybody reading this to not stress over it in any way for now;... anyway)
    there is one missing feature and one crappy piece of usage/code (this didn't surface much so far because the samplerate on desktop is usually fine; on mobiles it's ~24000 and this now causes issues)
    I have no generic solution for now - as in one where you can deploy an application for all users - only this hardcoded workaround - so I don't know how helpful this will be, I will have to fix it properly in the next update;
    to at least test / use this in a local manner you can:
    - set the server sample rate in AudioStreamNetMQClient.cs, line 99, to: this.serverSampleRate = 24000; ( the sample rate of the mobile phone )
    - set the Unity *project* samplerate in Project Settings -> Audio -> System Sample Rate to 24000, too.
    - I also built with Best latency to help the microphone on the phone; it seemed to help a bit

    I understand this is not entirely feasible, but at least it should get things going somehow right now
    Let me know if this makes sense, I hope to fix this next; Thanks !
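    i.e. the whole hardcoded part is just this (serverSampleRate is the existing member in AudioStreamNetMQClient.cs, 24000 being the phone's samplerate):

    // AudioStreamNetMQClient.cs, around line 99:
    this.serverSampleRate = 24000;   // hardcoded sample rate of the mobile phone
    // plus the Project Settings -> Audio -> System Sample Rate = 24000 change from the step above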
     
  14. griffinlarson

    griffinlarson

    Joined:
    Jan 31, 2022
    Posts:
    2
    Hey @r618 thanks for the awesome plugin. Any update on WebGL support? It looks like FMOD supports it, but I'm getting build errors from AudioStreamSupport (`Options` not available in this context) when attempting a build.
     

  15. griffinlarson

    griffinlarson

    Joined:
    Jan 31, 2022
    Posts:
    2
    And no rush on a response. Wishing peace to your side of the world friend :)
     
  16. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    Hi, drop me a PM and I'll send you a link to the 2.6.1 update
    Thanks !
     
  17. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    Hi, thank you! also for the supportive words
    It's possible to build it (at least the last time I tried), and FMOD Studio supports WebGL authoring, but the asset uses parts of the FMOD API which are not compatible (namely threading) - so this will probably never happen.
    That said - I'm hoping to finish another asset some time soon™ (limited codec support just for MPEG/OGG/VORBIS/OPUS and just streamed playback initially, with decoding done entirely in .NET) which will run in WebGL
    Hope to post some demo soon too
    Thanks again !
     
    Last edited: Mar 4, 2022
  18. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    submitted an update with fixes for the two previous issues, should be online on the store shortly:

    2.6.1 032022 St.Javelin

    Updates/fixes:
    AudioStream:
    - ignore 'too small' Content-Length in some request responses to more likely handle media length correctly
    AudioStreamNetworkSource / AudioStreamNetworkClient
    - fixed running under default/system samplerate when samplerates of source and client differ significantly (ex. mobile<->PC)
    - AudioStreamNetworkClient is now using Unity AudioClip for playback, which means it has larger latency than previously, but received stream can be e.g. placed in 3D
    - removed Best latency setting as requirement
     
  19. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    just submitted an update with the several-times-requested feature to split (mainly input) channels into separate AudioSources:

    2.6.2 032022 St.Nilwa

    Updates/fixes:
    AudioStreamInputBase:
    - the system is initialized with FMOD.CONSTANTS.MAX_CHANNEL_WIDTH allowed channels/speakers
    this was the max./default before as well, only now it's explicit - the constant is currently 32 output channels; this is also mentioned in the Docs for audio input.
    IL2CPP:
    Marked remaining OnAudioFilterRead and PCMReadCallback callbacks for IL2CPP optimization

    New components:
    Simple (hard) separation of channels of playing input device, AudioSource, and AudioClip:
    (these were originally intended as demo scenes only, but I added them as separate components, too)

    AudioStreamInputChannelsSeparation:
    - splits the recording channels of audio being recorded from an AudioStreamInput* into separate single-channel AudioSource prefabs and instantiates them in the scene
    please see AudioStreamInputChannelsSeparationDemo for example usage

    AudioSourceChannelsSeparation:
    - splits the audio of an AudioSource/AudioClip being played on the current Unity output into separate single-channel AudioSource prefabs and instantiates them in the scene
    please see AudioSourceChannelsSeparationDemo for example usage

    AudioClipChannelsSeparation:
    - splits the original channels of a (multichannel) AudioSource/AudioClip being played into separate single-channel AudioSource prefabs and instantiates them in the scene
    please see AudioClipChannelsSeparationDemo for example usage

    - only basic usage is currently implemented, e.g. all components play automatically and they don't have a complete user-facing API yet, but they're usable even in this state.

    For all of the above please see 'Separating AudioSource, AudioClip and AudioStreamInput* channels' in Documentation
     
  20. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    ok so I mentioned WebGL support in a new asset in my previous reply
    I thought I'd better mention this now in order not to keep people's hopes up in the meantime: this, unfortunately, won't happen, at least for the time being :
    after more thorough testing the audio stuff should be doable, but due to missing threading support in WASM/WebGL, UnityWebRequest can't be used in the way required for this to work
    [there are threads on the forums which go into more detail, but in short the System.Threading namespace would need to be supported, or at a minimum UnityWebRequest's ReceiveData callback would have to be able to interrupt the main thread - until then this won't happen]; sorry ^!^
    (just for clarification - the new asset as roughly outlined previously should still happen, with more proper mobile support, just not with WASM/WebGL)
     
  21. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    should be live
    // forgot to update the text in the store page description w/ the download links - it says the demos are based on 2.6.1, but they're built with the latest (2.6.2)
     
  22. colinclevr

    colinclevr

    Joined:
    May 12, 2021
    Posts:
    5
    I'm using AudioStream solely for redirection of mixer groups on Windows, which I'm doing with the AudioStreamOutputDevice mixer group effect, and I'm having some erratic results which I hope you can help me with.

    I've been using FMODSystemsManager.AvailableOutputs() to get a list of OUTPUT_DEVICEs; then I search this list for the name of the device I want to redirect to, and assign the associated id as the OutputDevice ID in the AudioStreamOutputDevice effect. This seemed to be working at first, but I've noticed that it's situational: the ids in the returned list vary depending on several factors (the current Windows system default output is definitely one of them, as this always has id=0, but the ids seem to change even just after a reboot). And sometimes - maybe 20% of the time - when assigned as OutputDevice ID, the ids just don't redirect to the intended output.

    Is FMODSystemsManager.AvailableOutputs() the intended way of obtaining the required OutputDevice ID? Can I just use these ids raw, or is some processing required?
     
  23. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    hey, AvailableOutputs is just a wrapper call for FMOD's getNumDrivers/getDriverInfo, but a couple of things:
    - make sure you use the component only after it's ready and use its SetOutput method to set the id, don't assign it directly
    - see AudioSourceOutputDeviceDemo - it waits in Start until .ready and uses the user-picked id in SetOutput(outputDriverId)
    - the demo also uses runtimeOutputDriverID (which might differ from outputDriverID when/if device/s are un/plugged while running - which is probably not the case here)
    that said, the order and ids are not guaranteed and come from the OS/Windows (though they should stay more or less unchanged most of the time ..) -
    you could also search by GUID - but the name should be fine as long as you can identify the device reliably enough
    lmk if you make it work
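    in code the pattern is roughly the following (a sketch only - the component/field names are the ones referenced above, and the exact AvailableOutputs parameters/return type may differ; please follow AudioSourceOutputDeviceDemo for the authoritative version):

    using System.Collections;
    using UnityEngine;

    public class PickOutputSketch : MonoBehaviour
    {
        IEnumerator Start()
        {
            var output = GetComponent<AudioSourceOutputDevice>();    // the GameObject/MonoBehaviour component case
            while (!output.ready)                                    // wait until the component is ready
                yield return null;

            // wrapper over FMOD's getNumDrivers/getDriverInfo - query only once the above is set
            var outputs = FMODSystemsManager.AvailableOutputs();

            int pickedOutputDriverId = 0;                            // find this by name (or GUID) in 'outputs'
            output.SetOutput(pickedOutputDriverId);                  // use SetOutput, don't assign the id directly
        }
    }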
     
  24. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    / I should/will update this to prevent direct assignment... /
     
  25. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    btw I missed this, I'll do a test w/ the mixer effect to see if anything changed and will let you know
    (there, changing just the output id is OK; the rest of the above for the MonoBehaviour component still applies)
     
  26. colinclevr

    colinclevr

    Joined:
    May 12, 2021
    Posts:
    5
    Thanks for the reply.

    I'm using the AudioStreamOutputDevice mixer effect, not the AudioStream component. And I don't think there's anything wrong with the process of setting the id on the mixer - it always correctly changes to the value I set, whether I use AudioMixer.SetFloat() or set it directly in the Inspector - it's just that the id itself doesn't correspond to the correct device (although as I say, about 80% of the time it is correct).

    I've tried calling AvailableOutputs at several points during runtime to determine if I was doing it too early or something, but the returned ids don't change.
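    (for reference, setting it from script goes through the exposed parameter; AudioMixer.SetFloat is standard Unity API, and "OutputDeviceID" below is just a placeholder for whatever name the effect's OutputDevice ID parameter was exposed under:)

    using UnityEngine;
    using UnityEngine.Audio;

    public class SetMixerOutputDevice : MonoBehaviour
    {
        public AudioMixer mixer;            // the mixer hosting the AudioStreamOutputDevice effect
        public int outputDeviceId = 0;      // an id obtained from AvailableOutputs()

        void Start()
        {
            // drives the exposed 'OutputDevice ID' parameter of the mixer effect
            mixer.SetFloat("OutputDeviceID", outputDeviceId);
        }
    }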
     
  27. Jon_Olive

    Jon_Olive

    Joined:
    Sep 26, 2017
    Posts:
    23
    Alternatively - set your mobile device samplerate to 44.1/48 kHz via AudioSettings.outputSampleRate - or, actually, now via the slightly more roundabout AudioConfiguration struct. iOS and Android (I think) default to a 24000 samplerate - but are quite happy to be set manually. On a side note - this isn't true of macOS, where the samplerate cannot be set programmatically it seems, but must be set by the user in Audio MIDI Setup.
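    (a minimal example of that AudioConfiguration route - 48000 here is just an example value; note AudioSettings.Reset reinitializes the audio system and stops playing AudioSources:)

    using UnityEngine;

    public class ForceSampleRate : MonoBehaviour
    {
        void Awake()
        {
            // override the mobile default (~24000) with an explicit output samplerate
            var config = AudioSettings.GetConfiguration();
            config.sampleRate = 48000;          // or 44100
            AudioSettings.Reset(config);        // restarts the audio engine with the new rate
        }
    }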
     
    r618 likes this.
  28. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    yea i realized afterwards you're using the effect, see #825 ; as you mentioned, switching the output itself looks to be working without issues (just plugged an AudioSource into a mixer group with the effect and modified AudioSourceOutputDeviceDemo to use only the mixer instead of the MonoBehaviours in the scene)
    AvailableOutputs has to be called only once after the .ready flag on the component is set - the list stays the same afterwards unless a device is un/plugged
    can you verify that switching in this demo [https://www.dropbox.com/s/4fp24wwoz4q3scx/OutpuDeviceUnityMixerEffectDemo.zip?dl=1] behaves correctly ?
    - you'll maybe have to un-expose and re-expose the mixer parameter, otherwise it should pick up everything else from the asset
     
  29. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    ah, that's good to know / never tried it even on iOS since mobile samplerates are hw limited, thought an AudioConfiguration reset wouldn't even work there ^^/
    i added a samplerate payload to each compressed frame in 2.6.1 as a fix for the issue
    the processing overhead is nonexistent/minimal, so hopefully it's a reasonably good tradeoff
    (previously even resampling from e.g. 44.1 to 48 kHz wasn't precise enough with the speex resampler; Unity doing it should be better, although with a latency penalty...)
     
  30. colinclevr

    colinclevr

    Joined:
    May 12, 2021
    Posts:
    5
    Ok, after playing around with this a lot, it seems that the demo has a similar issue, but the circumstances which break it differ from those which break our own project.

    At first, the redirection appeared to work correctly in the demo, but not in our project. However, after a lot of playing around and unplugging/replugging devices, I find that there are certain circumstances in which the opposite is the case: the audio in our project is fine, but not in the demo.

    Some more specifics. Windows lists four available audio outputs, two of which I want to redirect to: "Realtek Digital Output (Realtek(R) Audio)", which refers to my headphones, plugged into a standard audio jack, and "Speakers (Realtek USB 2.0 Audio)", which refers to an HP Reverb VR HMD, plugged into a USBC port.

    In both our project and the demo, AvailableOutputs() returns the same ids: the headphones are assigned id 0 (as they're selected as the default system output), and the HMD has id 3. However, in the demo, selecting device 0 causes the audio to be heard through the HMD, while selecting device 3 (or indeed 1 or 2) causes no audio to be heard on either device. The mixer group's OutputDevice ID is correctly changing to the selected id. The only way I can hear the audio on the headphones is to manually assign an OutputDevice ID of 4, 5, 6 or 7 (all of which work, though 8 and above do not).

    As mentioned, in these circumstances, redirection in our own project works correctly. But there are other id configurations which break our project and not the demo. And in other situations, audio is heard on the headphones but not the HMD.

    In order for the HMD even to be detected as an audio device, we have to have the Microsoft Mixed Reality Portal running. Is it possible that that's interfering somehow with the redirection?
     
  31. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    @colinclevr are you changing (un/plugging) device/s when this happens ?
    - can you please verify that changing the output via the effect works for you reliably when no device changes occur at runtime ? (the Mixed Reality app shouldn't be a problem, it might manipulate output/s for its own needs, but that should happen only once at startup - afterwards, queries to AvailableOutputs() from the app should always return the same list)
    - i forgot to mention that the effect plugin doesn't respond to device changes at runtime correctly - to be more precise, not at all. that's why you're probably seeing a difference between the GameObject component and the plugin....
    if you can, please run your setup w/o devices being changed after startup for now
    I will try to update the mixer effect meanwhile, but I don't have an ETA. Thank you, and let me know if this works for you for the time being.
     
  32. colinclevr

    colinclevr

    Joined:
    May 12, 2021
    Posts:
    5
    When I mentioned unplugging devices, I meant between runs (in order to get different values returned by AvailableOutputs() so that I could find a configuration which breaks). We don't plug or unplug anything at runtime - the required devices are connected at program start and stay that way.

    So I'm still at a loss as to why we're getting these unpredictable results.
     
  33. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    ok, but changing devices while Unity is open - i.e. while the mixer effect is still loaded - has the same effect I think: the effect still has old/invalid entries since it doesn't respond to device changes... ( AvailableOutputs() at script level has new/current entries since it 1] creates a new system when entering Play mode, and 2] can respond to changes at runtime (though this is not relevant at this point) )
    so if you're testing various device configurations, can you please make sure to close Unity before making the changes (and don't change devices after re/opening the project..)
    I'll still have to implement the notification in the effect to make this more robust, but meanwhile, if the environment is stable before running and at runtime (in this case this also applies to the Editor), the effect and AvailableOutputs() should provide consistent results
    Thanks !
     
  34. nthomps

    nthomps

    Joined:
    Nov 4, 2021
    Posts:
    2
    Hi @r618, I'm a Unity noob but have bought your asset as I am creating a prototype for my final project at university. I only need to do one thing, which is to stream audio from my synths, via the line input on my interface, into Unity, so that I can do a live performance within a Unity VR environment. I have created an empty cube object and added the Audio Stream Input component to it. What are my next steps? Thanks in advance. Apologies if it's a basic question.
     
  35. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    For VR, AudioStreamInput is OK since it automatically adds an AudioSource to the game object which can be spatialized - see Spatial Blend on the AudioSource; (if you don't want/need spatialization you can use AudioStreamInput2D, which has better latency, but you'd also have to add an AudioSource component manually after adding it since it works differently), otherwise the setup is the same, the simplest/easiest:
    [attached screenshot: upload_2022-4-21_20-26-29.png]

    this will record from your 0 (default) audio input after starting the scene. Notice the AudioSourceMute script at the bottom - this will prevent feedback should you need it
    You also need to know the device index - the record device id - of your line-in: run the AudioStreamInputDemo scene, your input should be listed there (if not, let me know) together with the id assigned by the OS
    - the demo scene also reacts to devices being un/plugged and switched, but I suppose you shouldn't need that for now; I recommend examining the demo scene regardless though
    Best!~~
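    (the Spatial Blend part in script form - plain Unity API on the AudioSource that AudioStreamInput adds:)

    using UnityEngine;

    public class SpatializeInput : MonoBehaviour
    {
        void Start()
        {
            // position the recorded input in 3D like any other AudioSource
            GetComponent<AudioSource>().spatialBlend = 1f;   // 0 = 2D, 1 = fully 3D
        }
    }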
     
  36. OscarCybernetic

    OscarCybernetic

    Joined:
    Sep 12, 2018
    Posts:
    23
    Hello again,
    When the ReadTags() method of AudioStreamBase runs, sound.getTag always returns ERR_NOTAGFOUND. I even tried to use sound.getNumTags to verify, and that also returns 0 for every radio station I try.

    I'm not sure at what point it stopped working, but it's the same behavior in a fresh project with only the AudioStream demo and the latest FMOD for Unity.
     
  37. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    Hi @OscarCybernetic, thanks for the heads up!
    You're correct in that it doesn't seem to be working at all; hard to tell where the change happened :-|
    - I would probably have to switch to and test a whole different FMOD filesystem strategy for this - I'm not sure it would work and I'm not sure it's worth it at this point
    It's another reason to transition away from FMOD for networked audio streams tbh
    If you want, post / send me a PM with a link to your media and I'll have a look at how it behaves in the player I'm trying to develop
    / FWIW tags should work for local files, if that's of any help //
     
  38. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    I just submitted an update for review which should fix the mixer effect (on Windows, at least) -
    @colinclevr if you want, send me a PM and I'll send you the package before it goes live on the store. Thanks!

    ===============================================================================
    2.6.3 042022 Not a special operation

    Updates/fixes for AudioMixer effect:
    - built using 2.02.04 - latest Unity verified version
    - updates from Unity official Native Audio Plugins SDK
    - solo/mute/bypass on the effect should now work
    - updated devices/output switching
    - the plugin listens for devices changes now and updates its internal outputs as needed when in the mix
    ^ see 'AudioStreamOutputDevice mixer effect usage instructions' in the main documentation (macOS is limited in this regard)

    New demo scene for the above
    OutputDeviceUnityMixerHotpluggingDemo:
    - displays current output device list and plays audio on user chosen device via mixer effect
    - listens for devices changes using AudioStreamDevicesChangedNotify's event in the scene, updates device list accordingly and switches playback to user selected output
     
  39. nthomps

    nthomps

    Joined:
    Nov 4, 2021
    Posts:
    2
    Hi there, thanks - I have got audio working live through Unity now, but I was wondering how I could use AudioStreamInput with FMOD rather than a Unity AudioSource.

    Appreciate the help!
     
  40. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    Hey! I'm not sure I follow tbh: the asset itself uses certain parts of FMOD to function (such as, in this case, access to your line-in input), but since it's a Unity asset it makes sense to make that audio available in the standard Unity way, as an AudioSource.
    It doesn't use or interface with FMOD Studio, so no Studio banks/projects, FMOD Events and so on... if that's what you're asking - sorry !
    / i'm not sure if it's possible to access different mic/devices from FMOD Studio (it might be!), but if not you'd probably have to mix your FMOD project audio output with the mic/line-in audio [via scripting], or you could do it via the Unity mixer in the Editor too, I guess.
    If that makes sense let me know~!
     
  41. SimpleAssets

    SimpleAssets

    Joined:
    Jul 14, 2021
    Posts:
    24
    Hello. I'm having some problems using the plugin. When I use the "AudioSourceOutputDevice" component, the application crashes on an Android device; it works fine in the editor. Judging by the logs, Android cannot call the "FMOD5_System_SetOutput" method.
    I will be grateful for any help

    Logs



    Unity 2021.2.7



    Best regards,
    Simple Assets
     
  42. SimpleAssets

    SimpleAssets

    Joined:
    Jul 14, 2021
    Posts:
    24
    Tried with FMOD version 2.01 and it worked. Error is gone
     
  43. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    Hi, ok, thanks for the update -
    does the apk demo from the store page behave the same ? - running the AudioSourceOutputDeviceDemo scene should be enough - it uses the same codepath, and
    I couldn't replicate it w/ the latest official FMOD 2.02.04 - but I was using the IL2CPP scripting backend
     
  44. SimpleAssets

    SimpleAssets

    Joined:
    Jul 14, 2021
    Posts:
    24
    Hello, I tried to run the AudioSourceOutputDeviceDemo scene from your demo, but the application crashed.
    I also had another problem with FMOD version 2.01. I can't hear my voice through the microphone; instead I hear incomprehensible noise on Android, although everything works fine in the editor. I tried the same with your "AudioStreamInput2DDemo" demo scene - I pressed "Record" and then "Mute output" and the result is the same, I hear noise.
    I decided to try the AudioStreamInput2DDemo scene from the apk provided on the Asset Store, but I get a crash when I go to this scene.
    Can you help me?
     
  45. SimpleAssets

    SimpleAssets

    Joined:
    Jul 14, 2021
    Posts:
    24
    I tried a fresh project with FMOD 2.02.04 and IL2CPP; the result is the same - a crash when trying to call the "FMOD5_System_SetOutput" method
     
  46. SimpleAssets

    SimpleAssets

    Joined:
    Jul 14, 2021
    Posts:
    24
    I tried it on different Android devices - the result is the same
     
  47. SimpleAssets

    SimpleAssets

    Joined:
    Jul 14, 2021
    Posts:
    24
    It works with FMOD 2.01.05 and Mono. With IL2CPP I get a build error..
     
  48. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    thanks for the update, can you post the exact build error you're getting ? it should be possible to fix it
    also, am I correct to assume that the mic + output redirect work when you use 2.01.05 with Mono ?

    this is worrying though; I wanted to ask about the phone you're using (and possibly to update its OS if possible)
     
  49. SimpleAssets

    SimpleAssets

    Joined:
    Jul 14, 2021
    Posts:
    24
    Hello, yes, all features work correctly with Mono
    Build error:

    Building Library\Bee\artifacts\Android\d8kzr\1a3n_ic_plugins.o failed with output:
    In file included from E:/Work/UnityProject/Dobbler/Doppler/Library/Bee/artifacts/Android/il2cppOutput/cpp/fmod_register_static_plugins.cpp:8:
    E:\Work\UnityProject\Dobbler\Doppler\Library\Bee\artifacts\Android\il2cppOutput\cpp\fmod_static_plugin_support.h(48,115): error: too few arguments to function call, expected 2, have 1
    void *library = il2cpp::os::LibraryLoader::LoadDynamicLibrary(StringViewUtils::StringToStringView(libraryName));
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ ^
    C:\Program Files\Unity\Hub\Editor\2021.2.7f1\Editor\Data\il2cpp\libil2cpp\os\LibraryLoader.h(42,9): note: 'LoadDynamicLibrary' declared here
    static Baselib_DynamicLibrary_Handle LoadDynamicLibrary(const utils::StringView<Il2CppNativeChar> nativeDynamicLibrary, std::string& detailedError);
    ^
    In file included from E:/Work/UnityProject/Dobbler/Doppler/Library/Bee/artifacts/Android/il2cppOutput/cpp/fmod_register_static_plugins.cpp:8:
    E:\Work\UnityProject\Dobbler\Doppler\Library\Bee\artifacts\Android\il2cppOutput\cpp\fmod_static_plugin_support.h(57,41): error: no matching function for call to 'GetFunctionPointer'
    sRegisterDSP = (RegisterDSPFunction)il2cpp::os::LibraryLoader::GetFunctionPointer(library, "FMOD_System_RegisterDSP");
    ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    C:\Program Files\Unity\Hub\Editor\2021.2.7f1\Editor\Data\il2cpp\libil2cpp\os\LibraryLoader.h(43,36): note: candidate function not viable: requires 3 arguments, but 2 were provided
    static Il2CppMethodPointer GetFunctionPointer(Baselib_DynamicLibrary_Handle handle, const PInvokeArguments& pinvokeArgs, std::string& detailedError);
    ^
    C:\Program Files\Unity\Hub\Editor\2021.2.7f1\Editor\Data\il2cpp\libil2cpp\os\LibraryLoader.h(44,36): note: candidate function not viable: requires 3 arguments, but 2 were provided
    static Il2CppMethodPointer GetFunctionPointer(Baselib_DynamicLibrary_Handle handle, const char* functionName, std::string& detailedError);
    ^
    2 errors generated.
    UnityEngine.GUIUtility:ProcessEvent (int,intptr,bool&)



    Device - Realme GT Master Edition
    OS - Android 12

    Hope you can help me
     
  50. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    1,305
    Hi, please use 2.01.11 (the latest for Unity in the 2.01 series), i hope they didn't diverge from 2.01.05 too much.... / 2.01.05 is probably unsalvageable for me at this point ( see e.g. https://qa.fmod.com/t/unity-2021-1-0f1-fmod-2-01-07-il2cpp-build-failed/16933 )

    i used this package import:
    [attached screenshot: upload_2022-5-18_14-53-23.png]

    especially the removal of the obsolete folder is important
    player settings:
    [attached screenshot: upload_2022-5-18_14-54-53.png]

    with these it looks to be running OK
    / note i'm not an Android expert by any means, but your phone looks to be more than sufficient (sans any Android OS incompatibilities), especially compared to my old testing phone, so hopefully this should help !

    if not lmk, but going further back would be rather problematic at this point