
Native Audio - Lower audio latency via OS's native audio library (iOS/Android)

Discussion in 'Assets and Asset Store' started by 5argon, Apr 15, 2018.

  1. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    I have produced a test scene that plays looping audio on track 0 and other sounds on tracks 1, 2, and 3, and on my Xiaomi Mi A2 I could not make spamming tracks 1, 2, and 3 ever affect the looping track 0.

    I can't see anything wrong in the Redmi log.
     
  2. Quatum1000

    Quatum1000

    Joined:
    Oct 5, 2014
    Posts:
    889
    Hi 5argon,

    a side question: I also work with MIDI and Unity on PC, and I'm searching for a way to get a high-resolution timer in Unity.

    It should be able to tick at anywhere from 1 to 6144 times per second, smoothly, without blocking (polling) the CPU while calculating the time between ticks. Windows has a high-resolution timer, but I cannot access it from Unity.
    Any idea how this can be done in Unity generally?

    Thanks a lot.
     
  3. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    "Generally", I think that is the intention of the `_Scheduled` methods in Unity: they try to begin the action at a future time according to the platform's native audio timer (which differs across platforms). Does using those result in an accurate enough timer on PC?

    I have no experience on PC, but I noticed that `AudioSettings.dspTime`, which advances in discrete steps, updates at the same frequency and with the same values as iOS's native audio timer, though slightly differently on Android. I would guess that the resolution at which `AudioSettings.dspTime` updates indicates how accurately the `_Scheduled` methods can execute.
     
  4. eSmus1c

    eSmus1c

    Joined:
    Apr 17, 2018
    Posts:
    10
    I just wanted to make sure these logs are correct. Okay, so I will send you the same logs from the Huawei P20 Pro soon.

    ------
    Okay, here you go. Two logs from Huawei P20 Pro. Check it please.

    1. Opened command prompt ("cmd"), executed "adb logcat > logcat.txt", pressed 3 times on the screen to play sound with Native Audio (using your demo app), closed command prompt (I don't know how to stop log recording).

    2. Opened command prompt ("cmd"), pressed 1 time on the screen to play a sound with Native Audio (using your demo app), and executed the "adb shell dumpsys media.audio_flinger" command immediately after that.

    I hope everything was done right. Thanks!
     

    Attached Files:

    Last edited: Apr 5, 2019
  5. eSmus1c

    eSmus1c

    Joined:
    Apr 17, 2018
    Posts:
    10
    Hi @5argon, could you check these P20 Pro logs please? We are still seeing poor Native Audio latency on this phone.
     
  6. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    OK, I didn't notice you had edited the message. After I get home I will look for abnormalities.

    *By the way, I also know of one of my users who is facing an audio problem with Unity's built-in audio on the P20/P20 Pro/Mate 20/Mate 20X, but not with Native Audio. The problem is that the buffer size Unity selects for those phones on Best Latency is too low to be usable.
     
    Last edited: Apr 16, 2019
  7. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    What's wrong with the logs:

    1. Despite this being a high-end phone, the audio track thread is not FAST.

    Screenshot 2019-04-16 22.28.12.png

    This image shows an audio output thread. PID 10790 is the game. The 24000 Hz track is Unity's; the other two 48000 Hz tracks are Native Audio's. It turns out Native Audio couldn't get a fast flag for some reason; if it had, the system would have spun up a new thread, since this thread is for primary audio without the fast mixer option. Because all three tracks are on exactly the same thread, the only advantage Native Audio gets is bypassing Unity's mixer, hence you see little improvement or none at all. (Though theoretically it must still be faster than Unity, which passes through mixers and must wait for the end of the frame.)

    The HAL frame count is 960. It is related to buffer size: the larger it is, the bigger the latency. For a high-end phone like this, a native buffer size of 960 is abnormally large; even my Xiaomi Mi A2 sits at only around 240.

    To compare with your earlier Redmi phone, this is what the success case looks like:

    Screenshot 2019-04-16 22.26.21.png

    This thread is for fast audio: there is a letter "F" in front of the two Native Audio sources, and Unity's 24000 Hz source is nowhere in sight because it lives in another thread with just the PRIMARY flag.

    2. The reason why a fast track couldn't be created

    Screenshot 2019-04-16 22.38.19.png

    The requested flag shows Native Audio definitely trying to get 2 fast sources, but it was denied because of some mismatch. We will have to look at Android's C++ source for what could disqualify a fast track here, and whether it is possible to get a fast track on the P20 Pro at all.

    The line in question is right here: https://android.googlesource.com/pl...master/services/audioflinger/Threads.cpp#1896 and one of the checks is that if the device simply doesn't support fast tracks, the request will be denied even if the sample rate and buffer size match perfectly. It is likely that the P20 Pro just doesn't have a fast mixer at all. Usually phones like that are low-cost ones, so I am surprised; but it could be that they cut costs here and put the effort into the camera department instead, since apparently the general public doesn't care about audio latency and it doesn't help sell the product.

    Also, according to your original Superpowered test that returned 200 ms, that's likely the lower bound. No amount of Unity hacks will get us faster than that.
     
    Last edited: Apr 16, 2019
  8. Radik-Salakhov

    Radik-Salakhov

    Joined:
    Nov 16, 2014
    Posts:
    16
    Hello,

    I've got the following crash in production (I'm using the latest version of Native Audio):
    upload_2019-4-19_18-15-17.png

    Any suggestions how to fix?

    Thanks.

    best regards,
    Radik
     
  9. eSmus1c

    eSmus1c

    Joined:
    Apr 17, 2018
    Posts:
    10
    Thank you @5argon for such a detailed answer! As far as I understand, we can do nothing at all with phones like the Huawei P20 Pro; only Huawei can fix this with new firmware. Am I right?

    The only solution I see here (for rhythm-based music games in general) is to make some kind of "safe mode" for such problem phones: disable tap sounds and enable phone vibration, which has no latency issues, for the best user experience.
     
  10. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    As discussed in the private message: while we still don't know why some phones return a strange number that can't be parsed, in the next version I will add a catch and then use buffer size 256 + sampling rate 44100 regardless of device if that happens. (Likely wrong, but I believe better than crashing.)


    I suspected at first that we could do something to "unlock the fast track", but seeing that even the phone itself could not get a fast track, I think it may simply not be available at all. If it were usable, the phone's own software should be the first to utilize it.

    One more suspect is some audio effect or enhancement baked into the phone that prevents fast tracks for everyone. I don't know how much Huawei does to their phones, but it is possible for a higher-end phone to contain more exotic technologies. I would try going through the options/developer menu to disable things, then check again whether the thread changes. (Having "F" or FAST somewhere.)

    I believe players nowadays have learned to turn off tap sounds when their phone has bad latency, so I think it is fine as long as an option to turn off the tap sound is there. Actually, experienced players even intentionally turn off the tap sound regardless of the phone's latency, because their fingernail sound is the most accurate feedback, traveling straight to their ear. These players will calibrate the game so that the fingernail sound traveling through air lines up with the audio coming through hardware latency and then through air/headphones; then they get a perfect. (So it is also very important to include a calibration option alongside the tap sound on/off.)
     
  11. ixikos

    ixikos

    Joined:
    Jun 21, 2013
    Posts:
    26
    Does this plugin support audio from the StreamingAssets folder? I am getting crackling when attempting to do so via Unity. We have lots of large sound files for a meditation app, and the quality is imperative.
     
  12. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    Hello, originally this plugin supported only the StreamingAssets folder, before moving to piggyback on AudioClip returning raw audio data, so it should still work. However, the format is not as flexible: it must be uncompressed 16-bit WAV. Whereas with AudioClip, I can ask Unity to return raw data from whatever compression and quality.
     
  13. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    Native Audio 4.5.0 is online now with small improvements. I have improved some native Android code. That part of the code is a very, very hot path, so it might improve latency. (I didn't compare.)

    [4.5.0] - 2019-08-15
    Added
    • Better performance of the OpenSL ES double-buffering callback function by removing all ifs, replacing them with inline conditionals. The compiled assembly could be better with inline conditionals, since they may avoid costly branch misprediction. This callback is a very hot function, called for every little bit of audio buffer sent out of your speaker, so it could potentially improve latency. (Theoretically.)
    • Added some explanation in the code documentation of why nativeAudioPointer.Unload() is unsafe and could cause a segmentation fault. You have to ensure no tracks are playing before you unload. This is by design.
    Fixed
    • The multi-track demo scene now waits 0.5 s after disposing before re-initializing 4 native sources, to fix a throttling-time problem which caused you not to get back the fast native tracks you had just released.
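    The "inline conditionals instead of if" idea in the changelog above can be sketched in plain C. The function names here are illustrative, not Native Audio's actual callback code:

    ```c
    #include <assert.h>

    /* Branchy version: an if statement the compiler may lower to a jump,
       which can be mispredicted in a hot per-buffer callback. */
    static int next_buffer_if(int current, int bufferCount)
    {
        int next = current + 1;
        if (next == bufferCount)
            next = 0;
        return next;
    }

    /* Inline-conditional version: compilers often lower ?: to a
       conditional move, avoiding the branch entirely. */
    static int next_buffer_ternary(int current, int bufferCount)
    {
        int next = current + 1;
        return next == bufferCount ? 0 : next;
    }

    int main(void)
    {
        /* Both variants agree for every index in a 2-buffer scheme. */
        for (int i = 0; i < 2; i++)
            assert(next_buffer_if(i, 2) == next_buffer_ternary(i, 2));
        return 0;
    }
    ```

    Whether the ternary actually compiles to a conditional move depends on the compiler and target; inspecting the generated assembly is the only way to be sure.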
     
  14. dhalper

    dhalper

    Joined:
    Jul 24, 2019
    Posts:
    3
    Hey, do you know if this works for Oculus Mobile VR (Quest)? My game involves looping audio that the user triggers by colliding with game objects -- basically a live looper in which you can play multiple instruments. Currently I do not have it set up with AudioSettings.dspTime, but I have heard this is a better way to do it than relying on the Update function. Does your plugin use dspTime? If not, why?

    If this works out, I'd love to have one on one sessions with you via Unity Connect and figure out how to integrate Native Audio into my project! Please let me know :)
     
    Last edited: Nov 15, 2019
  15. dhalper

    dhalper

    Joined:
    Jul 24, 2019
    Posts:
    3
    @5argon Just tagging you to make sure you get a ping
     
  16. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    NA currently contains no timing or scheduling functions at all. NA is basically: you call Load, and the audio memory goes to wait on the native side; then, when you call Play with a pointer referring to that memory location, the native side plays the audio as fast as possible.

    This "when" means whenever you call it, be it in Update, as part of a Playable running on dspTime, etc. It will be as fast as possible after that moment.

    So the answer is no. It plays when you ask it to and cannot wait for any amount of dspTime.

    It can be made somewhat dependent on dspTime if you ask it to play when dspTime reaches a certain point. This is possible with Playables, where the PlayableGraph uses the DSP-clock time update mode (https://docs.unity3d.com/ScriptReference/Playables.DirectorUpdateMode.html) and a custom script playable calls NA; or you simply use a coroutine and wait for a dspTime. But these approaches defeat the purpose of dspTime: in the Playables solution, time merely increments along with dspTime, so by the time dspTime reaches your desired point it will surely be a bit past it. A coroutine is already frame-based, so the play will not occur on the exact dspTime you want.

    Unity has a more accurate tool, which is PlayScheduled on the AudioSource. It plays when dspTime arrives at a certain point in the future, and it is exact, without caring about frames. However, that point must be in the future.
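    As a rough illustration of the PlayScheduled pattern with plain Unity APIs (the field names and loop length below are placeholders, and this is not part of Native Audio), the usual trick is to alternate two AudioSources so the next iteration can be scheduled while the current one still plays:

    ```csharp
    using UnityEngine;

    // Hypothetical looper sketch: schedule each loop iteration on the DSP
    // clock, alternating two AudioSources so scheduling never interrupts
    // the iteration currently playing.
    public class ScheduledLoop : MonoBehaviour
    {
        public AudioSource[] sources = new AudioSource[2]; // assigned in the Inspector
        public double loopLength = 2.0;                    // seconds, placeholder value

        double nextStart;
        int flip;

        void Start()
        {
            nextStart = AudioSettings.dspTime + 0.1; // always "play the future"
            sources[flip].PlayScheduled(nextStart);
        }

        void Update()
        {
            // Shortly before the current iteration ends, queue the next one
            // on the other source, sample-accurately on the DSP clock.
            if (AudioSettings.dspTime > nextStart + loopLength - 0.25)
            {
                nextStart += loopLength;
                flip = 1 - flip;
                sources[flip].PlayScheduled(nextStart);
            }
        }
    }
    ```

    The 0.1 s and 0.25 s leads are arbitrary safety margins; the only hard rule is that the time passed to PlayScheduled must still be in the future when the call is made.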

    For accuracy, NA will lose to PlayScheduled. NA is designed to make unpredictable real-time feedback audio, like a button press or live instruments, as low-latency as possible. If your application is a looper, then you know the future, and you are better off basing your logic entirely on PlayScheduled, or on a Playable graph that ends up calling PlayScheduled as time goes on. Remember that you must always "play the future" with PlayScheduled.

    It may be possible to create a PlayScheduled for NA so that it is both precise in the future and fast, but I have not yet investigated how the native side could hook into Unity's dspTime. Even if it could, it would not be that useful, and it is out of concept: if you need future accuracy, latency is no longer a problem, because you can delay the whole thing. (The sequence will be slower as a whole, but it sounds in time.) That defeats the purpose, and the trouble, of loading audio to the native side in NA, when you can use PlayScheduled instead to get a precise future with latency that doesn't matter.
     
    Last edited: Nov 17, 2019
  17. dhalper

    dhalper

    Joined:
    Jul 24, 2019
    Posts:
    3
    @5argon It may make sense for me to still rely on NA to play when the box collider is triggered, as I assume this uses frames, and then use PlayScheduled/dspTime to play the audio in the following loops. Do you do tutoring work on Unity Connect? You seem very knowledgeable on the general subject and I'd love to get help from/learn from you.
     
  18. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    Hmm, I have no idea how Unity Connect works. If you want real-time chat you can also join the Discord channel.

    When the box collider is triggered, should it sound instantly, or add something to the looper?

    - If instantly, then NA is the solution: as soon as the box collider is triggered, you call Play on audio loaded to the native side. But this is not related to dspTime, as it is instant at the colliding moment and colliders use in-frame time. Your best latency is bounded by the collider detection.
    - If not instantly but some time in the future, then you should use PlayScheduled in Unity instead of NA, because with NA you would have the problem of deciding "when to call Play" after colliding, and you can never hit that exact moment because NA cannot schedule. You say "following loops", so I guess this is what you want. If the play is not instant, then I think there is no reason to use NA over what Unity provides.
     
  19. Chris-Rowsell

    Chris-Rowsell

    Joined:
    Feb 3, 2017
    Posts:
    9
    Hi 5argon,

    I know this question was asked a while back, but will this asset help reduce real-time microphone latency on the Android platform? I just wondered if your latest updates can help resolve this issue.

    To give you a bit of background I am currently developing an app for the Oculus Quest. As the user speaks into the virtual mic in the scene their voice is then amplified through the virtual speakers in the scene.

    I have it working through the Unity Mic to AudioSource components but as you know the latency is horrendous.
    So hoping your Asset could solve this issue some way, if not, in your experience do you have any suggestions?

    I'm still buying your asset tho because I intend to do a drumkit easter egg :) shh don't tell anyone lol
     
  20. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    Hi, Native Audio does not deal with anything input-related. It is output only, unfortunately.
     
  21. sharimken

    sharimken

    Joined:
    Jul 30, 2015
    Posts:
    14
    I am also looking for a solution for native audio playback of input data, like PCM.
    In my case, PCM data is streamed over the network, but Unity's playback adds a minimum 0.5 s delay after my data arrives. I noticed this because I tried streaming the same PCM data to HTML5, and there it plays in almost real time over the network.

    So I've spent many weeks looking for a native PCM playback solution. I understand that audio input is not supported by your asset, but is there any good resource or keyword to search for a solution? I hope you wouldn't mind sharing some advice from your expertise. Thanks.
     
  22. Brast_Grand

    Brast_Grand

    Joined:
    Nov 8, 2018
    Posts:
    1
    Hello, @5argon. I would like to know whether this plugin can keep working while the application is minimized, and, when minimized, hand control to the device's audio player, as if you were using a regular music player to listen to music.
     
  23. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    I answered in the mail too, but here goes: no, it returns all allocated native audio sources to the phone on minimizing and gets them back on maximizing. Sorry!
     
  24. BachmannT

    BachmannT

    Joined:
    Nov 20, 2016
    Posts:
    386
    Hello 5argon. Thanks for your great work!
    I have a simple question: does NA provide access to the audio callback, as OnAudioFilterRead does with an AudioSource?
    Thanks.
     
  25. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    Hello! Unfortunately no, it can only play exactly the bytes you load into NA from an AudioClip. The native sources are so simple that they just keep running over a static portion of memory where your exported AudioClip bytes are placed. That helps with latency, of course, so it is intentional that NA lacks flexibility.
     
  26. Starbox

    Starbox

    Joined:
    Sep 17, 2014
    Posts:
    467
    Hello,
    I'm struggling against the terrible latency my reflex-based game is experiencing on Android. I hear Unity on iOS handles sound better, but on Android it's downright terrible even on some good smartphones (Samsung, Huawei, Sony, etc.).

    I have several FX sounds, most of them so small that they can be loaded into RAM as "raw" PCM (sort of) in the background (I do this to have the most immediate experience possible, besides all the other necessary loading and initialization), although these sounds are compressed as Vorbis in the build, where most of them are below 8 KB, a few sit around 16 KB, and just three or four are around 25 KB.
    I also use Unity's mixer for a few effects, some fading, ducking, etc.
    Finally, my game is coded using Playmaker and runs smoothly, but the latency kills the game and makes it feel like it's lagging when it's not. I have no input latency, but the audio delay makes everything feel half a second late as far as the gaming experience is concerned, and that's just plain horrible.

    Would it be a sensible choice for me to acquire your extension, and would it be easy for me to integrate it into my game?
    On this last part, I could code very simple Playmaker actions, assuming the methods are very easy to call: nothing too fancy, no references to external scripts I need to understand, no subscriptions/delegates I should manage; just simple stuff like Play().

    Thanks!
     
  27. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    Hello. I have no experience with Playmaker, but from your description it sounds like something Native Audio could solve. Sound you load into NA bypasses all Unity audio pipelines, so the mixer and effects will not apply. The uncompressed audio memory is absolutely static, and there is just a playhead running over it. It also supports loading from Unity's editor-encoded Vorbis. (It will be uncompressed in memory.)

    You can download the demo APK and check on your problematic devices whether the improvement is good enough for you. Another way to find out is to purchase and then refund later if it doesn't solve your problem.
     
  28. Starbox

    Starbox

    Joined:
    Sep 17, 2014
    Posts:
    467
    Right, thanks. It's a pity the mixer seems to add so much latency; I had neat effects, and now I'm going to have to multiply the number of audio files to mimic them. It will have to wait for an update though, and I'll cross my fingers that not too many users encounter this issue on their Android (crap)phone.
    It's easy to underestimate how Android phones that look advanced and powerful are still lagging in very simple areas.
    Then again, if these hardware limitations could be detected and somehow listed on Google Play, maybe it would be possible to build a list of devices to exclude for the time being, which is preferable to having bad reviews on the app's page.
     
  29. danishgoel

    danishgoel

    Joined:
    Apr 24, 2016
    Posts:
    3
    Hi @5argon, firstly thanks for this great asset. I have used it in my game to reduce UI & Game SFX latency which makes the game feel crisp & snappy :)

    Recently I got a few crash reports in Unity Analytics for an Android 4.1.2 device. Below are the device details and stack trace. I am using the latest 5.0.0 version of Native Audio. Can you look into it?

    Device : HUAWEI/MediaPad 7 Youth/hws7701u
    OS : Android OS 4.1.2 / API-16 (HuaweiMediaPad/C232B005)
    CPU : ARMv7 VFPv3 NEON;RK30board

    Stack Trace:
    Thank you.
     
  30. mylastggeast

    mylastggeast

    Joined:
    Jun 14, 2021
    Posts:
    41
    Hello @5argon!

    I have recently purchased the plugin and everything seems to be working as intended. I am now trying to play around with the volume but I can't seem to make it work. This is the code flow:

    Code (CSharp):
    private PlayOptions playOptions = PlayOptions.defaultOptions;
    ...
    playOptions.volume = 0.0f;
    nativeAudioPointer = NativeAudio.Load(awesomeHitAudioClip);
    ...
    nativeSource = NativeAudio.GetNativeSourceAuto();
    nativeSource.Play(nativeAudioPointer, playOptions);
    but regardless of which volume I input, I can always hear the sound effects. Am I doing something wrong? It seems to be working on iOS by the way. Only Android seems to be affected.

    Thanks for your help!
     
  31. danishgoel

    danishgoel

    Joined:
    Apr 24, 2016
    Posts:
    3
    I also faced this issue on Android and worked around it by only playing the sound when volume is more than zero. Like this:
    Code (CSharp):
    // only play the sound if volume is more than zero
    if (playOptions.volume > 0.000001f) {
        nativeSource = NativeAudio.GetNativeSourceAuto();
        nativeSource.Play(nativeAudioPointer, playOptions);
    }
     
    Last edited: Oct 4, 2021
    mylastggeast likes this.
  32. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    This seems to be an error on my part. I used a struct for the options to avoid garbage collection (audio playing is called frequently, so I was being extra careful about this), but a struct cannot have a custom default constructor, and the default volume of 0 is not a good default. So I coded it such that volume 0 means 1 (the real default I wanted), thinking that no one would ever intentionally play at volume 0... causing this bug.

    A lot of people didn't know PlayOptions.defaultOptions exists to get around the default-struct problem; they would `new` the options from scratch and change only the other fields, not knowing that volume starts at 0. That's why I coded in the "0 = 1" mapping. Sorry about that!
     
    mylastggeast likes this.
  33. danishgoel

    danishgoel

    Joined:
    Apr 24, 2016
    Posts:
    3
    Hi @5argon

    The volume-0-but-still-playing issue on Android might be due to log10(0) being -Infinity.
    The function setVolume() in nativeaudioe7.c computes the millibel value like this:
    (SLmillibel) (20 * log10(volume) * 100)

    And this might be causing the issue.
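    For illustration, a guarded version of that conversion might look like the following C sketch. `VolumeToMillibel` is a hypothetical name, and `SL_MILLIBEL_MIN` mirrors OpenSL ES's constant for silence:

    ```c
    #include <math.h>
    #include <stdio.h>

    /* OpenSL ES defines SL_MILLIBEL_MIN as the minimum (silent) gain. */
    #define SL_MILLIBEL_MIN ((short)(-32768))

    short VolumeToMillibel(float volume)
    {
        /* log10(0) is -infinity; casting that to a 16-bit millibel value
           is undefined, so clamp non-positive volumes to silence. */
        if (volume <= 0.0f)
            return SL_MILLIBEL_MIN;

        double mb = 20.0 * log10((double)volume) * 100.0;
        if (mb < (double)SL_MILLIBEL_MIN)
            mb = (double)SL_MILLIBEL_MIN;
        return (short)mb;
    }

    int main(void)
    {
        printf("%d\n", VolumeToMillibel(1.0f)); /* 0: full volume */
        printf("%d\n", VolumeToMillibel(0.5f)); /* -602: about -6 dB */
        printf("%d\n", VolumeToMillibel(0.0f)); /* -32768: clamped silence */
        return 0;
    }
    ```

    The clamp preserves the formula for all audible volumes and only changes the behavior at 0, where the original expression overflows.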

    Since I did not want to recompile the native library, I used the workaround posted above.

    Also, can you take a look at the crash log posted above in comment #79?
    It would be great if you could send the symbol files for the included prebuilt binaries.
     
    Last edited: Oct 4, 2021
    mylastggeast and 5argon like this.
  34. Starbox

    Starbox

    Joined:
    Sep 17, 2014
    Posts:
    467
    Thinking of it, would you have any suggestions on how to mimic the mixer's audio effects? I have a few audio assets that need varying degrees of reverb and echo, and without the mixer it looks like I'd have to create a variant for each reverb or echo value in some audio tool, which would produce a large number of assets. Does Android manage some effects natively in a way that is accessible through Unity?
     
  35. Starbox

    Starbox

    Joined:
    Sep 17, 2014
    Posts:
    467
    Specifically, would there be a way to reach any of these effects through your product, similar to the way you call native audio functions?
     
  36. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    There is no way that I have programmed in the plugin.
     
  37. Starbox

    Starbox

    Joined:
    Sep 17, 2014
    Posts:
    467
    Ok thanks.
    However, I suppose that in theory, it could be done?
     
  38. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    Native Audio 7.0.0 is now live with several small fixes.

    It's a minor update, but a big semver bump, because the minimum supported version is now 2019.4 LTS and folks on 2017 cannot use it anymore.

    https://exceed7.com/native-audio/CHANGELOG.html

    Oh no, I forgot to come back to this, but yes, I think in theory it could be done. I'm not sure how hard it would be, having to interface with OpenSL ES, which Native Audio uses; nowadays people have shifted to AAudio or Oboe, I think?
     
  39. Starbox

    Starbox

    Joined:
    Sep 17, 2014
    Posts:
    467
    A few questions.

    1. How many sounds can be played at once, especially if there's a long music track playing in the background?
    2. Does it use Unity's compression settings as defined for each audio file in Assets?
    3. Does it handle pitch?
    4. Does it allow a basic form of fade in / out?
    5. Is the code complex? I work in Playmaker; if it's simple enough I can call the methods or wrap them in actions.
    6. Since it's using native code, will this work at runtime in the Editor? In other words, can I test it on a computer first?
    7. Can the documentation be read without buying and downloading the Native Audio extension?
     
  40. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    1. When using NA, you are asked to take ownership of native sources at initialization. Since we don't mix audio, that number equals the number of simultaneous sounds, but it varies. Please see: https://exceed7.com/native-audio/theories/ways-around-latency.html

    Long music should be played through Unity's regular audio system; everything from Unity mixes down into a single native source (on Android there are multiple), so it does not affect overlapping plays through NA.

    2. Yes. It exports audio data from the imported AudioClip to the native side using Unity's https://docs.unity3d.com/ScriptReference/AudioClip.GetData.html . That data changes according to the compression settings.

    3. No. You can see which features are present or missing in this table: https://exceed7.com/native-audio/index.html#feature-list

    4. There is no pre-made fade function, but you can control the volume of the native source that was selected to play your audio. With some effort you could perhaps implement one, but remember that one source can only play one audio at a time; there is no mixing.

    5. Sorry, I have no experience with Playmaker and don't know how it works at all.

    6. No, it will throw UnsupportedException, since the editor is considered PC/macOS and NA doesn't know how to interface with native PC/macOS audio. Just for debugging, you may wire up e.g. audioSource.PlayOneShot with a preprocessor directive wrapping the call. I tried doing that for you, but it turned out to cause more trouble than having you do it on your own. See the yellow box at the top of this page: https://exceed7.com/native-audio/how-to-use/getting-started.html

    7. Yes. What's bundled in the package is all Markdown (with images), which is used to generate this website: https://exceed7.com/native-audio . Reading the site is 100% equal to reading the .md files.
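    The editor-fallback idea from answer 6 could be sketched like this. The `Sfx` wrapper and its parameters are hypothetical; `NativeAudio.Load` and `GetNativeSourceAuto` are the calls shown earlier in this thread:

    ```csharp
    using UnityEngine;

    // Hypothetical wrapper: route to Unity's AudioSource in the Editor,
    // Native Audio on device, so the Editor never hits the unsupported path.
    public static class Sfx
    {
        public static void Play(AudioClip clip, AudioSource editorFallback)
        {
    #if UNITY_EDITOR
            // Native Audio throws on PC/macOS, so fall back in the Editor.
            editorFallback.PlayOneShot(clip);
    #else
            var pointer = NativeAudio.Load(clip);
            NativeAudio.GetNativeSourceAuto().Play(pointer);
    #endif
        }
    }
    ```

    In real use you would load once and cache the pointer rather than loading on every play; this sketch only shows the preprocessor split.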
     
  41. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Does this work with Unity Timeline and Video Player components?
     
  42. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    What do you mean by "work"? There is no relationship with Unity Timeline.

    If you mean Unity Timeline's Audio Track, that will always use Unity's built-in audio engine. Native Audio cannot override anything; everything must be used explicitly.

    There is no "Native Audio Timeline Track" included, but if you create a Timeline Signal and call NativeAudio methods from it, then that "works".
     
  43. lwn

    lwn

    Joined:
    May 28, 2021
    Posts:
    5

    • Does this work for Unity mobile background playback of multi-end audio?
     
  44. Harsh-K

    Harsh-K

    Joined:
    Oct 29, 2021
    Posts:
    5
    @5argon, do you have Native Touch? Please share the asset.
     
  45. Harsh-K

    Harsh-K

    Joined:
    Oct 29, 2021
    Posts:
    5
    Please, if you have Native Touch, share it.
     
  46. Starbox

    Starbox

    Joined:
    Sep 17, 2014
    Posts:
    467
    Is it possible to load multiple files in parallel?
    I have tried two methods of loading them. The first goes through each file, picks its path, loads it, and only then moves on to the next one.
    The second, which I thought would be much faster (although it would probably tax the phone's CPU much more all at once), creates instances of a template script that each take a 'file path' value, thereby launching several loaders at once. Each instance runs with its own 'file path' value, loads the file, and then closes once it's done.
    But there is no difference in time when letting each instantiated script load its own dedicated file, so it seems the loads are queued no matter what and no time is gained. On some devices it can take perhaps 5 or 7 seconds to complete, and I'm looking for a way to bring that down.
     
  47. wildgoose789

    wildgoose789

    Joined:
    Feb 7, 2023
    Posts:
    5
    Hi, I am triggering audio via MIDI. It triggers with no latency in the editor on desktop, but when built for the Quest the audio has a slight lag. Is this something NA can help with?
     
  48. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    NA only supports Android and iOS. I don't know if it works on the Quest, or, even if it does, how much better it would be... sorry!
     
  49. Starbox

    Starbox

    Joined:
    Sep 17, 2014
    Posts:
    467
    Hey, you!
    Do you have any opinion on my remark above about the multi-loading of files? :)
    Thanks.