
Question Audio memory in WebGL

Discussion in 'Audio & Video' started by zengjunjie59, Apr 12, 2023.

  1. zengjunjie59

    Joined:
    Dec 10, 2019
    Posts:
    19
    I have many WAV audio files, totaling 2 MB in size, but after playing them in Safari they occupy nearly 200 MB of memory that cannot be released. How can I release this memory or reduce the memory consumption of these audio files?
    Unity 2021.2.13f1
    Here is some simple code:
    Code (CSharp):
    void DownloadAudio()
    {
        var www = UnityWebRequest.Get(url);
        www.downloadHandler = new DownloadHandlerAudioClip(url, AudioType.WAV);
        StartCoroutine(CoroutineDownload(www));
    }

    IEnumerator CoroutineDownload(UnityWebRequest request)
    {
        yield return request.SendWebRequest();
        var hand = request.downloadHandler as DownloadHandlerAudioClip;
        this.audio = hand.audioClip;
    }

    void Play()
    {
        this.audioSource.clip = this.audio;
        this.audioSource.Play();
    }
     
  2. SeventhString

    Unity Technologies

    Joined:
    Jan 12, 2023
    Posts:
    410
    A few things you could do...

    1. Use MP3 or OGG if possible in your context; this would be a great improvement in file sizes.
    2. Consider using the built-in audio asset management instead of loading manually. You might want to look into Addressables!
    3. You could pool your AudioClip objects, so when one clip's data is unloaded you can fill it with another clip's data.
    4. You can manually unload with AudioClip.UnloadAudioData() once the file is done playing. You could schedule this with a coroutine:
    Code (CSharp):
    void Play()
    {
        audioSource.clip = audioClip;
        audioSource.Play();
        StartCoroutine(UnloadAudioData(audioClip));
    }

    private IEnumerator UnloadAudioData(AudioClip audioClip)
    {
        // Wait for the audio to finish playing
        yield return new WaitForSeconds(audioClip.length);

        // Unload audio data
        audioClip.UnloadAudioData();
    }
     
  3. zengjunjie59

    Joined:
    Dec 10, 2019
    Posts:
    19
    I have tested audio files in both MP3 and AAC formats, and the memory usage is similar. The memory usage of the audioclip itself is not significant. However, after decoding through the Web Audio API, the memory usage is exceptionally huge and cannot be released. I will test reusing the audioclip to see if there is any improvement.
     
  4. SeventhString

    Unity Technologies

    Joined:
    Jan 12, 2023
    Posts:
    410
    From the other thread:
    Are you trying to release the AudioContext of WebAudio? I think it's possible, but I'm really not sure it wouldn't backfire. I've seen this somewhere else; I have not tested it personally, but maybe it can be a source of inspiration:

    Create a WebAudioUnload.jslib file in Assets/Plugins:

    Code (JavaScript):
    mergeInto(LibraryManager.library, {
      CloseWebAudioContext: function () {
        if (typeof AudioContext !== 'undefined' || typeof webkitAudioContext !== 'undefined') {
          var audioContext = Module.s_Instance;
          if (audioContext) {
            audioContext.close();
          }
        }
      },
    });
    Invoke it in Unity with
    Code (CSharp):
    using System.Runtime.InteropServices;
    using UnityEngine;

    public class AudioManager : MonoBehaviour
    {
        // Other audio-related code...

    #if UNITY_WEBGL && !UNITY_EDITOR
        [DllImport("__Internal")]
        private static extern void CloseWebAudioContext();
    #endif

        public void CloseAudioContext()
        {
    #if UNITY_WEBGL && !UNITY_EDITOR
            CloseWebAudioContext();
    #endif
        }
    }
     
  5. zengjunjie59

    Joined:
    Dec 10, 2019
    Posts:
    19
    Thank you, I will try this method to see if it can release audio memory. Releasing audio memory is crucial because the memory limit for iOS web is below 1.4GB. For MMORPG games, this amount of memory is not enough, and I cannot accept that audio alone occupies 200MB of that memory.
     
  6. SeventhString

    Unity Technologies

    Joined:
    Jan 12, 2023
    Posts:
    410
    I hope you find a way to reach your goal!

    I just want to mention that audio often takes a surprising amount of memory in game projects, especially when it comes to dialogue! I've seen a few game projects that kept separate repositories for code, game data, and sound. This is really to highlight that it is somewhat expected to have a big chunk of your game's weight coming from audio.

    I want to come back on something you mentioned earlier...
    Do you mean that you have a certain number of sounds and that their total combined size is around 2MB altogether, but that they occupy 200MB once used in the game?
     
  7. zengjunjie59

    Joined:
    Dec 10, 2019
    Posts:
    19
    Yes, there are about 9 BGM tracks with a duration of one minute each, sampled at 2000 Hz and 8-bit unsigned. There are also 60 sound effects sampled at 4000 Hz and 8-bit unsigned. The total size of the WAV files is around 2MB. After playing all of them, the memory usage is around 200MB.
     
  8. SeventhString

    Unity Technologies

    Joined:
    Jan 12, 2023
    Posts:
    410
    When building for a WebGL target, all audio files are converted to AAC/MP4 and resampled to 44100 Hz. This decision was made a while ago to improve browser compatibility. Basically, no matter how tightly you compress or downsample for this target, you will most likely see bloating of your audio files occur.
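As a sanity check on the 2 MB to 200 MB jump: decoded Web Audio buffers are float32 PCM, so in-memory size depends only on duration, channel count, and the forced 44100 Hz rate, never on the compressed file size. A rough sketch with the clip counts and durations from this thread (stereo BGM and a ~2 s average SFX length are assumptions for illustration, not figures stated here):

```javascript
// Decoded size of one clip in bytes: float32 PCM is 4 bytes per sample,
// and WebGL builds resample everything to 44100 Hz regardless of source rate.
function decodedBytes(seconds, channels, sampleRate) {
  return seconds * sampleRate * channels * 4;
}

// Figures from this thread: 9 one-minute BGM tracks and 60 SFX.
// Stereo BGM and ~2 s per SFX are assumptions for illustration.
var bgm = 9 * decodedBytes(60, 2, 44100);
var sfx = 60 * decodedBytes(2, 1, 44100);
console.log(((bgm + sfx) / (1024 * 1024)).toFixed(1) + " MB"); // roughly 200 MB once decoded
```

So the reported ~200 MB is about what fully decoded 44100 Hz PCM of that material would occupy, even though the source WAVs were tiny at 2000-4000 Hz and 8 bits.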
     
  9. zengjunjie59

    Joined:
    Dec 10, 2019
    Posts:
    19
    Can I modify the local Unity file to change this sampling rate?
     
  10. SeventhString

    Unity Technologies

    Joined:
    Jan 12, 2023
    Posts:
    410
    I don't think so; it's the internal serialization sampling rate. I was thinking that maybe you could take your original audio files, remove the file extension, and load them with UnityWebRequestMultimedia; maybe this way you could bypass the serialization process. I have honestly not tested that, though, so I am not sure it would work.
     
  11. zengjunjie59

    Joined:
    Dec 10, 2019
    Posts:
    19
    I tested in the Safari mobile browser and found that Module.s_Instance is nil.
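When Module.s_Instance comes back nil, one option might be to fall back to the WEBAudio object that Unity's Audio.js keeps its AudioContext in. A minimal sketch of that lookup; both property names are assumptions about this particular Unity version, not a documented API:

```javascript
// Hypothetical helper for a .jslib: prefer Module.s_Instance, then fall back
// to WEBAudio.audioContext (where Unity's WebGL Audio.js stores the context).
// Both property names are assumptions about this Unity version.
function findUnityAudioContext(Module, WEBAudio) {
  if (Module && Module.s_Instance) return Module.s_Instance;
  if (WEBAudio && WEBAudio.audioContext) return WEBAudio.audioContext;
  return null; // no context found; nothing to close
}
```

Inside the .jslib you could then call it as findUnityAudioContext(Module, typeof WEBAudio !== 'undefined' ? WEBAudio : null) and close() or suspend() the result only if it is non-null.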
     
  12. zengjunjie59

    Joined:
    Dec 10, 2019
    Posts:
    19
    Unity 2021.2.13f1 Audio.js code.
    Can I modify the following code to solve my current problem?
    Code (JavaScript):
    var LibraryAudioWebGL = {
    $WEBAudio: {
        audioInstanceIdCounter: 0,
        audioInstances: {},
        audioContext: null,
        audioWebEnabled: 0,
        audioCache: []
    },

    JS_Sound_Init__proxy: 'sync',
    JS_Sound_Init__sig: 'v',
    JS_Sound_Init: function()
    {
        try {
            window.AudioContext = window.AudioContext || window.webkitAudioContext;
            WEBAudio.audioContext = new AudioContext();

            var tryToResumeAudioContext = function() {
                if (WEBAudio.audioContext.state === 'suspended')
                    WEBAudio.audioContext.resume();
                else
                    Module.clearInterval(resumeInterval);
            };
            var resumeInterval = Module.setInterval(tryToResumeAudioContext, 400);

            WEBAudio.audioWebEnabled = 1;

            // Safari has the restriction where Audio elements need to be created from a direct user event,
            // even if the rest of the audio playback requirement is that a user event has happened
            // at some point previously. The AudioContext also needs to be resumed, if paused, from a
            // direct user event. Catch user events here and use them to fill a cache of Audio
            // elements to be used by the rest of the system.
            var _userEventCallback = function () {
                try {
                    // On Safari, resuming the audio context needs to happen from a user event.
                    // The AudioContext is suspended by default, and on iOS if the user switches tabs
                    // and comes back, it will be interrupted. Touching the page will resume audio
                    // playback.
                    if (WEBAudio.audioContext.state !== "running") {
                        WEBAudio.audioContext.resume();
                    }
                    // How many audio elements should we cache? How many compressed audio channels might
                    // be played at a single time?
                    var audioCacheSize = 20;
                    while (WEBAudio.audioCache.length < audioCacheSize) {
                        var audio = new Audio();
                        audio.autoplay = false;
                        WEBAudio.audioCache.push(audio);
                    }
                } catch (e) {
                    // Audio error, but don't need to notify here, they would have already been
                    // informed of audio errors.
                }
            };
            window.addEventListener("mousedown", _userEventCallback);
            window.addEventListener("touchstart", _userEventCallback);

            // Make sure we release the event listeners when the app quits to avoid leaking memory.
            Module.deinitializers.push(function() {
                window.removeEventListener("mousedown", _userEventCallback);
                window.removeEventListener("touchstart", _userEventCallback);
            });
        }
        catch(e) {
            alert('Web Audio API is not supported in this browser');
        }
    },

    JS_Sound_ReleaseInstance__proxy: 'async',
    JS_Sound_ReleaseInstance__sig: 'vi',
    JS_Sound_ReleaseInstance: function(instance)
    {
        // Explicitly disconnect audio nodes related to this audio channel when the channel should be
        // GCd to work around Safari audio performance bug that resulted in crackling audio; as suggested
        // in https://bugs.webkit.org/show_bug.cgi?id=222098#c23
        var channel = WEBAudio.audioInstances[instance];
        if (channel) {
            if (channel.disconnectSource) channel.disconnectSource();
            if (channel.gain) channel.gain.disconnect();
            if (channel.panner) channel.panner.disconnect();
        }

        // Let the GC free up the channel.
        delete WEBAudio.audioInstances[instance];
    },

    JS_Sound_Load_PCM__proxy: 'sync',
    JS_Sound_Load_PCM__sig: 'iiiii',
    JS_Sound_Load_PCM: function(channels, length, sampleRate, ptr)
    {
        if (WEBAudio.audioWebEnabled == 0)
            return 0;

        var sound = {
            buffer: WEBAudio.audioContext.createBuffer(channels, length, sampleRate),
            error: false
        };
        for (var i = 0; i < channels; i++)
        {
            var offs = (ptr >> 2) + length * i;
            var buffer = sound.buffer;
            var copyToChannel = buffer['copyToChannel'] || function (source, channelNumber, startInChannel)
            {
                // Shim for copyToChannel on browsers which don't support it, like Safari.
                var clipped = source.subarray(0, Math.min(source.length, this.length - (startInChannel | 0)));
                this.getChannelData(channelNumber | 0).set(clipped, startInChannel | 0);
            };
            copyToChannel.apply(buffer, [HEAPF32.subarray(offs, offs + length), i, 0]);
        }
        WEBAudio.audioInstances[++WEBAudio.audioInstanceIdCounter] = sound;
        return WEBAudio.audioInstanceIdCounter;
    },

    JS_Sound_Load__proxy: 'sync',
    JS_Sound_Load__sig: 'iiii',
    JS_Sound_Load: function(ptr, length, decompress)
    {
        if (WEBAudio.audioWebEnabled == 0)
            return 0;

        var sound = {
            buffer: null,
            error: false
        };
        WEBAudio.audioInstances[++WEBAudio.audioInstanceIdCounter] = sound;
    #if USE_PTHREADS
        // AudioContext.decodeAudioData() does not currently allow taking in a view to a
        // SharedArrayBuffer, so make a copy of the data over to a regular ArrayBuffer instead.
        // See https://github.com/WebAudio/web-audio-api/issues/1850
        var audioData = new ArrayBuffer(length);
        new Uint8Array(audioData).set(HEAPU8.subarray(ptr, ptr + length));
    #else
        var audioData = HEAPU8.buffer.slice(ptr, ptr + length);
    #endif

        // We don't ever want to play back really small audio clips as compressed: the compressor has a startup CPU cost,
        // and replaying the same audio clip multiple times (either individually or when looping) has an unwanted CPU
        // overhead if the same data will be decompressed on demand again and again. Hence we want to play back small
        // audio files always as fully uncompressed in memory.

        // However this will be a memory usage tradeoff.

        // Tests with aac audio sizes in a .m4a container show:
        // 2.11MB stereo 44.1kHz .m4a file containing 90 seconds of 196kbps aac audio decompresses to 30.3MB of float32 PCM data. (~14.3x size increase)
        // 721KB stereo 44.1kHz .m4a file containing 29 seconds of 196kbps aac audio decompresses to 10.0MB of float32 PCM data. (~14x size increase)
        // 6.07KB mono 44.1kHz .m4a file containing 1 second of 101kbps aac audio decompresses to 72kB of float32 PCM data. (~11x size increase)
        // -> overall AAC compression factor is ~10x-15x.

        // Based on the above, take 128KB as a cutoff size: if we have a .m4a clip that is smaller than this,
        // we always uncompress it up front, receiving at most ~1.8MB of raw audio data, which can hold about ~10 seconds of mono audio.
        // In other words, heuristically all audio clips <= ~10 seconds mono (5 seconds if stereo) in duration will always be fully uncompressed in memory.
        if (length < 131072) decompress = 1;

        if (decompress) {
            WEBAudio.audioContext.decodeAudioData(
                audioData,
                function(buffer) {
                    sound.buffer = buffer;
                },
                function(error) {
                    sound.error = true;
                    console.log("Decode error: " + error);
                }
            );
        } else {
            var blob = new Blob([audioData], { type: "audio/mp4" });
            sound.url = URL.createObjectURL(blob);

            // An Audio element is created for the buffer so that we can access properties like duration
            // in JS_Sound_GetLength, which knows about the buffer object, but not the channel object.
            // This Audio element is used for metadata properties only, not for playback. Trying to play
            // back this Audio element would cause an error on Safari because it's not created in a
            // direct user event handler.
            sound.mediaElement = new Audio();
            sound.mediaElement.preload = "metadata";
            sound.mediaElement.src = sound.url;
        }

        return WEBAudio.audioInstanceIdCounter;
    },

    JS_Sound_Create_Channel__proxy: 'sync',
    JS_Sound_Create_Channel__sig: 'vii',
    JS_Sound_Create_Channel: function (callback, userData)
    {
        if (WEBAudio.audioWebEnabled == 0)
            return;

        var channel = {
            gain: WEBAudio.audioContext.createGain(),
            panner: WEBAudio.audioContext.createPanner(),
            threeD: false,
            playUrl: function(startTime, url, startOffset) {
                try {
                    this.setup(url);
                    var chan = this;
                    this.source.onended = function() {
                        chan.disconnectSource();
                        if (callback) dynCall("vi", callback, [userData]);
                    };
                    this.source.start(startTime, startOffset);
                    this.source.playbackStartTime = startTime - startOffset / this.source.playbackRate.value;
                } catch (e) {
                    // Need to catch exception, otherwise execution will stop on Safari if audio output is missing/broken
                    console.error("playUrl error. Exception: " + e);
                }
            },
            playBuffer: function(startTime, buffer, startOffset) {
                try {
                    this.setup(); // Ensure we have a fresh AudioBufferSourceNode lined up.
                    this.source.buffer = buffer;
                    var chan = this;
                    this.source.onended = function() {
                        // Immediately disconnect the AudioBufferSourceNode
                        // that played back in this channel so that the
                        // JS memory related to the audio can be GCd.
                        chan.disconnectSource();
                        if (callback)
                            dynCall('vi', callback, [userData]);
                    };
                    this.source.start(startTime, startOffset);
                    this.source.playbackStartTime = startTime - startOffset / this.source.playbackRate.value;
                } catch (e) {
                    // Need to catch exception, otherwise execution will stop on Safari if audio output is missing/broken
                    console.error("playBuffer error. Exception: " + e);
                }
            },
            disconnectSource: function() {
                if (this.source && !this.source.isPausedMockNode) {
                    this.source.onended = null;
                    this.source.disconnect();
                    if (this.source.mediaElement) {
                        var url = this.source.mediaElement.src;
                        this.source.mediaElement.pause();
                        this.source.mediaElement.src = "";
                        // Here we delete the Audio element instead of recycling it to the audio pool
                        // because WebAudio does not let you use an Audio element for another WebAudio
                        // node once it's been associated with one.
                        delete this.source.mediaElement;
                        URL.revokeObjectURL(url);
                    }
                    delete this.source;
                }
            },
            stop: function(delay) {
                // Stop the sound currently playing.
                if (channel.source && channel.source.buffer)
                {
                    try {
                        channel.source.stop(WEBAudio.audioContext.currentTime + delay);
                    } catch (e) {
                        // When stop() is used more than once for the same source in Safari it causes the following exception:
                        // InvalidStateError: DOM Exception 11: An attempt was made to use an object that is not, or is no longer, usable.
                        // Ignore that exception.
                    }

                    if (delay == 0)
                    {
                        // Disconnect the AudioBufferSource node in this channel so that its memory can be reclaimed.
                        channel.disconnectSource();
                    }
                }
            },
            pause: function() {
                var s = this.source;
                if (!s) return;

                // If the source is a compressed audio MediaElement, it can be paused directly.
                if (s.mediaElement) {
                    this.pauseMediaElement();
                    return;
                }
                // WebAudio does not have support for pausing and resuming AudioBufferSourceNodes (they are a fire-once abstraction).
                // When we want to pause a node, create a mocked object in its place that represents the needed state that is required
                // for resuming the clip.
                var pausedSource = {
                    isPausedMockNode: true,
                    loop: s.loop,
                    loopStart: s.loopStart,
                    loopEnd: s.loopEnd,
                    buffer: s.buffer,
                    url: s.mediaElement ? s.mediaElement.src : null,
                    playbackRate: s.playbackRate.value,
                    // Specifies in seconds the time at the clip where the playback was paused at.
                    // Can be negative if the audio clip has not started yet.
                    playbackPausedAtPosition: s.estimatePlaybackPosition(),
                    setPitch: function(v) { this.playbackRate = v; }
                };
                // Stop and clear the real audio source...
                this.stop(0);
                this.disconnectSource();
                // ...and replace the source with a paused mock version.
                this.source = pausedSource;
            },
            resume: function() {
                var pausedSource = this.source;
                // If the source is a compressed audio MediaElement, it was directly paused so we can
                // directly play it again.
                if (pausedSource && pausedSource.mediaElement) {
                    pausedSource.start();
                    return;
                }
                // N.B. We only resume a source that has been previously paused. That is, resume() cannot be used to start playback if
                // the channel was not playing an audio clip before; playBuffer() is to be used instead.
                if (!pausedSource || !pausedSource.isPausedMockNode) return;
                delete this.source;
                if (pausedSource.url) {
                    this.playUrl(WEBAudio.audioContext.currentTime - Math.min(0, pausedSource.playbackPausedAtPosition), pausedSource.url, Math.max(0, pausedSource.playbackPausedAtPosition));
                } else {
                    this.playBuffer(WEBAudio.audioContext.currentTime - Math.min(0, pausedSource.playbackPausedAtPosition), pausedSource.buffer, Math.max(0, pausedSource.playbackPausedAtPosition));
                }
                // Restore the remembered attributes from the paused mock object.
                this.source.loop = pausedSource.loop;
                this.source.loopStart = pausedSource.loopStart;
                this.source.loopEnd = pausedSource.loopEnd;
                this.source.setPitch(pausedSource.playbackRate);
            },
            setup: function(url) {
                // Sets up a new AudioBufferSourceNode to play in this channel,
                // if one did not exist already.
                if (this.source && !this.source.isPausedMockNode) return;

                // If a URL wasn't passed as an argument to setup, then it's decompressed audio.
                if (!url) {
                    this.source = WEBAudio.audioContext.createBufferSource();
                } else {
                    // Compressed audio goes through an Audio DOM element. Safari has a restriction that
                    // prevents Audio elements from being playable if they are not created during a
                    // direct user event, so a cache of audio elements is
                    // populated during a direct user event. If the cache runs out of elements,
                    // this will create a new Audio element, which will not be playable on Safari
                    // but will be playable on other browsers. Interacting with the page will
                    // repopulate the Audio cache.
                    this.mediaElement = WEBAudio.audioCache.length ? WEBAudio.audioCache.pop() : new Audio();
                    this.mediaElement.preload = "metadata";
                    this.mediaElement.src = url;
                    this.source = WEBAudio.audioContext.createMediaElementSource(this.mediaElement);

                    this.source.playbackRate = {};
                    var source = this.source;

                    Object.defineProperty(this.source, "loop", {
                        get: function () {
                            return source.mediaElement.loop;
                        },
                        set: function (v) {
                            if (source.mediaElement.loop !== v) source.mediaElement.loop = v;
                        }
                    });

                    Object.defineProperty(this.source.playbackRate, "value", {
                        get: function () {
                            return source.mediaElement.playbackRate;
                        },
                        set: function (v) {
                            if (source.mediaElement.playbackRate !== v) source.mediaElement.playbackRate = v;
                        }
                    });

                    Object.defineProperty(this.source, "currentTime", {
                        get: function () {
                            return source.mediaElement.currentTime;
                        },
                        set: function (v) {
                            if (source.mediaElement.currentTime !== v) source.mediaElement.currentTime = v;
                        }
                    });

                    Object.defineProperty(this.source, "mute", {
                        get: function () {
                            return source.mediaElement.mute;
                        },
                        set: function (v) {
                            if (source.mediaElement.mute !== v) source.mediaElement.mute = v;
                        }
                    });

                    var self = this;
                    // Playing MediaElements is asynchronous and pausing the element before the play
                    // has started causes an error. Play returns a Promise that resolves when the play
                    // has finished, after which we can pause if necessary. So this watches the promise,
                    // and if a pause is called before it has finished, it sets a flag to have the
                    // element pause when the previous play promise is resolved. If there is no
                    // play promise currently pending, it can be paused immediately.
                    //
                    // This asynchronous delay between calling play and the audio actually playing
                    // makes compressed audio not suitable for precise audio timing, but is fine for
                    // things that don't require precise timing like background audio.
                    this.playPromise = null;
                    this.pauseRequested = false;

                    this.pauseMediaElement = function () {
                        // If there is a play request still pending, then pausing now would cause an
                        // error. Instead, mark that we want the audio paused as soon as it can be,
                        // which will be when the play promise resolves.
                        if (self.playPromise) {
                            self.pauseRequested = true;
                        } else {
                            // If there is no play request pending, we can pause immediately.
                            source.mediaElement.pause();
                        }
                    };

                    var _startPlayback = function(offset) {
                        if (self.playPromise) {
                            self.pauseRequested = false;
                            return;
                        }
                        source.mediaElement.currentTime = offset;
                        self.playPromise = source.mediaElement.play();
                        if (self.playPromise) {
                            self.playPromise.then(function () {
                                // If a pause was requested between play() and the MediaElement actually
                                // starting, then pause it now.
                                if (self.pauseRequested) {
                                    source.mediaElement.pause();
                                    self.pauseRequested = false;
                                }
                                self.playPromise = null;
                            });
                        }
                    };

                    this.source.start = function (startTime, offset) {
                        // Compare startTime to the WEBAudio context currentTime, and if
                        // startTime is more than about 4 msecs in the future, do a setTimeout() wait
                        // for the remaining duration, and only then play. 4 msecs boundary because
                        // setTimeout() is specced to throttle <= 4 msec waits if repeatedly called.
                        var startDelayThresholdMS = 4;
                        // Convert startTime and currentTime to milliseconds
                        var startDelayMS = (startTime - WEBAudio.audioContext.currentTime) * 1000;
                        if (startDelayMS > startDelayThresholdMS) {
                            setTimeout(function () { _startPlayback(offset); }, startDelayMS);
                        } else {
                            _startPlayback(offset);
                        }
                    };

                    this.source.stop = function () {
                        self.pauseMediaElement();
                    };
                }

                // Add a helper to AudioBufferSourceNode which gives the current playback position of the clip in seconds.
                this.source.estimatePlaybackPosition = function() {
                    var t = (WEBAudio.audioContext.currentTime - this.playbackStartTime) * this.playbackRate.value;
                    // Collapse extra times that the audio clip has looped through.
                    if (this.loop && t >= this.loopStart) {
                        t = (t - this.loopStart) % (this.loopEnd - this.loopStart) + this.loopStart;
                    }
                    return t;
                }
                // Add a helper to AudioBufferSourceNode to allow adjusting pitch in a way that keeps playback position estimation functioning.
                this.source.setPitch = function(newPitch) {
                    var curPosition = this.estimatePlaybackPosition();
                    if (curPosition >= 0) { // If negative, the clip has not begun to play yet (that delay is not scaled by pitch)
                        this.playbackStartTime = WEBAudio.audioContext.currentTime - curPosition / newPitch;
                    }
                    if (this.playbackRate.value !== newPitch) this.playbackRate.value = newPitch;
                }

                // Finally, connect the node to panner/gain and destination
                this.setupPanning();
            },
            // Changes this audio channel to either 3D panning or 2D mode (no panning)
            setupPanning: function() {
                // We have a mocked paused object in effect?
                if (this.source.isPausedMockNode) return;

                this.source.disconnect();
                // Configure audio panning options either for 3D or 2D.
                if (this.threeD) {
                    // In 3D: AudioBufferSourceNode/MediaElementSourceNode -> PannerNode -> GainNode -> AudioContext.destination
                    this.source.connect(this.panner);
    475.                 this.panner.connect(this.gain);
    476.             }
    477.             else {
    478.                 // In 2D: AudioBufferSourceNode/MediaElementSourceNode -> GainNode -> AudioContext.destination
    479.                 this.panner.disconnect();
    480.                 this.source.connect(this.gain);
    481.             }
    482.         }
    483.     };
    484.     channel.panner.rolloffFactor = 0; // We calculate rolloff ourselves.
    485.     channel.gain.connect(WEBAudio.audioContext.destination);
    486.     WEBAudio.audioInstances[++WEBAudio.audioInstanceIdCounter] = channel;
    487.     return WEBAudio.audioInstanceIdCounter;
    488. },
    489.  
    490. JS_Sound_Play__proxy: 'sync',
    491. JS_Sound_Play__sig: 'viiii',
    492. JS_Sound_Play: function (bufferInstance, channelInstance, offset, delay)
    493. {
    494.     // stop sound which is playing in the channel currently.
    495.     _JS_Sound_Stop(channelInstance, 0);
    496.  
    497.     if (WEBAudio.audioWebEnabled == 0)
    498.         return;
    499.  
    500.     var sound = WEBAudio.audioInstances[bufferInstance];
    501.     var channel = WEBAudio.audioInstances[channelInstance];
    502.  
    503.     if (sound.url) {
    504.         try {
    505.             channel.playUrl(WEBAudio.audioContext.currentTime + delay, sound.url, offset);
    506.         }
    507.         catch(e) {
    508.             // Need to catch exception, otherwise execution will stop on Safari if audio output is missing/broken
    509.             console.error("playUrl error. Exception: " + e);
    510.         }
    511.     } else if (sound.buffer) {
    512.         try {
    513.             channel.playBuffer(WEBAudio.audioContext.currentTime + delay, sound.buffer, offset);
    514.         }
    515.         catch(e) {
    516.             // Need to catch exception, otherwise execution will stop on Safari if audio output is missing/broken
    517.             console.error("playBuffer error. Exception: " + e);
    518.         }
    519.     }
    520.     else
    521.                 console.log("Trying to play a sound which is not loaded.");
    522. },
    523.  
    524. JS_Sound_SetLoop__proxy: 'sync',
    525. JS_Sound_SetLoop__sig: 'vii',
    526. JS_Sound_SetLoop: function (channelInstance, loop)
    527. {
    528.     if (WEBAudio.audioWebEnabled == 0)
    529.         return;
    530.  
    531.     var channel = WEBAudio.audioInstances[channelInstance];
    532.     if (!channel.source) {
    533.         channel.setup(); // Set up a new AudioBufferSourceNode if one did not yet exist.
    534.     }
    535.     if (channel.source.loop !== loop) channel.source.loop = loop;
    536. },
    537.  
    538. JS_Sound_SetLoopPoints__proxy: 'sync',
    539. JS_Sound_SetLoopPoints__sig: 'vidd',
    540. JS_Sound_SetLoopPoints: function (channelInstance, loopStart, loopEnd)
    541. {
    542.     if (WEBAudio.audioWebEnabled == 0)
    543.         return;
    544.     var channel = WEBAudio.audioInstances[channelInstance];
    545.     if (!channel.source) {
    546.         channel.setup(); // Set up a new AudioBufferSourceNode if one did not yet exist.
    547.     }
    548.     var s = channel.source;
    549.     if (s.loopStart !== loopStart) s.loopStart = loopStart;
    550.     if (s.loopEnd !== loopEnd) s.loopEnd = loopEnd;
    551. },
    552.  
    553. JS_Sound_Set3D__proxy: 'sync',
    554. JS_Sound_Set3D__sig: 'vii',
    555. JS_Sound_Set3D: function (channelInstance, threeD)
    556. {
    557.     var channel = WEBAudio.audioInstances[channelInstance];
    558.     if (channel.threeD != threeD)
    559.     {
    560.         channel.threeD = threeD;
    561.         // Set up a new AudioBufferSourceNode if one did not yet exist, and
    562.         // also setup new panning options.
    563.         if (!channel.source) {
    564.             channel.setup(); // Set up a new AudioBufferSourceNode if one did not yet exist.
    565.         }
    566.         channel.setupPanning();
    567.     }
    568. },
    569.  
    570. JS_Sound_Stop__proxy: 'sync',
    571. JS_Sound_Stop__sig: 'vid',
    572. JS_Sound_Stop: function (channelInstance, delay)
    573. {
    574.     if (WEBAudio.audioWebEnabled == 0)
    575.         return;
    576.  
    577.     var channel = WEBAudio.audioInstances[channelInstance];
    578.     channel.stop(delay);
    579. },
    580.  
    581. JS_Sound_SetPosition__proxy: 'sync',
    582. JS_Sound_SetPosition__sig: 'viddd',
    583. JS_Sound_SetPosition: function (channelInstance, x, y, z)
    584. {
    585.     if (WEBAudio.audioWebEnabled == 0)
    586.         return;
    587.  
    588.     var p = WEBAudio.audioInstances[channelInstance].panner;
    589.     // Work around Chrome performance bug https://bugs.chromium.org/p/chromium/issues/detail?id=1133233
    590.     // by only updating the PannerNode position if it has changed.
    591.     // See case 1270768.
    592.     if (p.positionX) {
    593.         // Use new properties if they exist ...
    594.         if (p.positionX.value !== x) p.positionX.value = x;
    595.         if (p.positionY.value !== y) p.positionY.value = y;
    596.         if (p.positionZ.value !== z) p.positionZ.value = z;
    597.     } else if (p._x !== x || p._y !== y || p._z !== z) {
    598.         // ... or the deprecated set function if they don't (and shadow cache the set values to avoid re-setting later)
    599.         p.setPosition(x, y, z);
    600.         p._x = x;
    601.         p._y = y;
    602.         p._z = z;
    603.     }
    604. },
    605.  
    606. JS_Sound_SetVolume__proxy: 'sync',
    607. JS_Sound_SetVolume__sig: 'vid',
    608. JS_Sound_SetVolume: function (channelInstance, v)
    609. {
    610.     if (WEBAudio.audioWebEnabled == 0)
    611.         return;
    612.  
    613.     try {
    614.         var g = WEBAudio.audioInstances[channelInstance].gain.gain;
    615.         // Work around WebKit bug https://bugs.webkit.org/show_bug.cgi?id=222098
    616.         // Updating volume only if it changes reduces sound distortion over time.
    617.         // See case 1350204, 1348348 and 1352665
    618.         if (g.value !== v) g.value = v;
    619.     } catch(e) {
    620.         console.error('JS_Sound_SetVolume(channel=' + channelInstance + ', volume=' + v + ') threw an exception: ' + e);
    621.     }
    622. },
    623.  
    624. JS_Sound_SetPaused__proxy: 'sync',
    625. JS_Sound_SetPaused__sig: 'vii',
    626. JS_Sound_SetPaused: function(channelInstance, paused) {
    627.     if (WEBAudio.audioWebEnabled == 0)
    628.         return;
    629.     var channel = WEBAudio.audioInstances[channelInstance];
    630.     var channelCurrentlyPaused = !channel.source || channel.source.isPausedMockNode;
    631.     if (paused != channelCurrentlyPaused) {
    632.         if (paused) channel.pause();
    633.         else channel.resume();
    634.     }
    635. },
    636.  
    637. JS_Sound_SetPitch__proxy: 'sync',
    638. JS_Sound_SetPitch__sig: 'vid',
    639. JS_Sound_SetPitch: function (channelInstance, v)
    640. {
    641.     if (WEBAudio.audioWebEnabled == 0)
    642.         return;
    643.  
    644.     try {
    645.         WEBAudio.audioInstances[channelInstance].source.setPitch(v);
    646.     } catch(e) {
    647.         console.error('JS_Sound_SetPitch(channel=' + channelInstance + ', pitch=' + v + ') threw an exception: ' + e);
    648.     }
    649. },
    650.  
    651. JS_Sound_SetListenerPosition__proxy: 'sync',
    652. JS_Sound_SetListenerPosition__sig: 'vddd',
    653. JS_Sound_SetListenerPosition: function (x, y, z)
    654. {
    655.     if (WEBAudio.audioWebEnabled == 0)
    656.         return;
    657.  
    658.     var l = WEBAudio.audioContext.listener;
    659.  
    660.     // Do not re-set the same values here if the position has not changed. This avoids unpredictable performance issues in Chrome
    661.     // and Safari Web Audio implementations.
    662.     if (l.positionX) {
    663.         // Use new properties if they exist ...
    664.         if (l.positionX.value !== x) l.positionX.value = x;
    665.         if (l.positionY.value !== y) l.positionY.value = y;
    666.         if (l.positionZ.value !== z) l.positionZ.value = z;
    667.     } else if (l._positionX !== x || l._positionY !== y || l._positionZ !== z) {
    668.         // ... and old deprecated setPosition if new properties are not supported.
    669.         l.setPosition(x, y, z);
    670.         l._positionX = x;
    671.         l._positionY = y;
    672.         l._positionZ = z;
    673.     }
    674. },
    675.  
    676. JS_Sound_SetListenerOrientation__proxy: 'sync',
    677. JS_Sound_SetListenerOrientation__sig: 'vdddddd',
    678. JS_Sound_SetListenerOrientation: function (x, y, z, xUp, yUp, zUp)
    679. {
    680.     if (WEBAudio.audioWebEnabled == 0)
    681.         return;
    682.  
    683.     // Web Audio uses a RHS coordinate system, Unity uses LHS, causing orientations to be flipped.
    684.     // So we pass a negative direction here to compensate, otherwise channels will be flipped.
    685.     x = -x;
    686.     y = -y;
    687.     z = -z;
    688.  
    689.     var l = WEBAudio.audioContext.listener;
    690.  
    691.     // Do not re-set the same values here if the orientation has not changed. This avoids unpredictable performance issues in Chrome
    692.     // and Safari Web Audio implementations.
    693.     if (l.forwardX) {
    694.         // Use new properties if they exist ...
    695.         if (l.forwardX.value !== x) l.forwardX.value = x;
    696.         if (l.forwardY.value !== y) l.forwardY.value = y;
    697.         if (l.forwardZ.value !== z) l.forwardZ.value = z;
    698.  
    699.         if (l.upX.value !== xUp) l.upX.value = xUp;
    700.         if (l.upY.value !== yUp) l.upY.value = yUp;
    701.         if (l.upZ.value !== zUp) l.upZ.value = zUp;
    702.     } else if (l._forwardX !== x || l._forwardY !== y || l._forwardZ !== z || l._upX !== xUp || l._upY !== yUp || l._upZ !== zUp) {
    703.         // ... and old deprecated setOrientation if new properties are not supported.
    704.         l.setOrientation(x, y, z, xUp, yUp, zUp);
    705.         l._forwardX = x;
    706.         l._forwardY = y;
    707.         l._forwardZ = z;
    708.         l._upX = xUp;
    709.         l._upY = yUp;
    710.         l._upZ = zUp;
    711.     }
    712. },
    713.  
    714. JS_Sound_GetLoadState__proxy: 'sync',
    715. JS_Sound_GetLoadState__sig: 'ii',
    716. JS_Sound_GetLoadState: function (bufferInstance)
    717. {
    718.     if (WEBAudio.audioWebEnabled == 0)
    719.         return 2;
    720.  
    721.     var sound = WEBAudio.audioInstances[bufferInstance];
    722.     if (sound.error)
    723.         return 2;
    724.     if (sound.buffer || sound.url)
    725.         return 0;
    726.     return 1;
    727. },
    728.  
    729. JS_Sound_ResumeIfNeeded__proxy: 'sync',
    730. JS_Sound_ResumeIfNeeded__sig: 'v',
    731. JS_Sound_ResumeIfNeeded: function ()
    732. {
    733.     if (WEBAudio.audioWebEnabled == 0)
    734.         return;
    735.  
    736.     if (WEBAudio.audioContext.state === 'suspended')
    737.         WEBAudio.audioContext.resume();
    738.  
    739. },
    740.  
    741. JS_Sound_GetLength__proxy: 'sync',
    742. JS_Sound_GetLength__sig: 'ii',
    743. JS_Sound_GetLength: function (bufferInstance)
    744. {
    745.     if (WEBAudio.audioWebEnabled == 0)
    746.         return 0;
    747.  
    748.     var sound = WEBAudio.audioInstances[bufferInstance];
    749.  
    750.     // Fakemod assumes sample rate is 44100, though that's not necessarily the case,
    751.     // depending on OS, if the audio file was not imported by our pipeline.
    752.     // Therefore we need to recalculate the length based on the actual samplerate.
    753.     if (sound.buffer) {
    754.         var sampleRateRatio = 44100 / sound.buffer.sampleRate;
    755.         return sound.buffer.length * sampleRateRatio;
    756.     }
    757.  
    758.     // Convert duration (seconds) to number of samples.
    759.     return sound.mediaElement.duration * 44100;
    760. }
    761.  
    762. };
    763.  
    764. autoAddDeps(LibraryAudioWebGL, '$WEBAudio');
    765. mergeInto(LibraryManager.library, LibraryAudioWebGL);
     
  13. DerDicke

    DerDicke

    Joined:
    Jun 30, 2015
    Posts:
    292
    Maybe finding a JS audio lib with the ability to stream would be a better alternative. Usually this is done with two buffers (a so-called double buffer): one buffer plays while the other is filled with decompressed data from the file on the fly. When buffer 1 finishes playing, buffer 2 continues playing and buffer 1 is refilled.

    This technique has been used to play music in every sound lib I know since the '90s or so. It could probably be used in parallel with Unity's sound system, for music only. Just an idea. Setting the sampling rate is also common in any sound lib.
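    The double-buffer idea described above can be sketched independently of any audio API. This is a hypothetical illustration, not part of Unity or any browser API: the `decodeChunk` callback and the buffer size are assumptions. One buffer "plays" while the other is refilled, and they swap when the playing one is exhausted.

```javascript
// Minimal double-buffer streaming sketch (illustrative only).
// decodeChunk(offset, length) is a hypothetical callback returning up to
// `length` decoded samples starting at `offset`, or null at end of stream.
function createDoubleBuffer(decodeChunk, bufferSize) {
    var buffers = [new Float32Array(bufferSize), new Float32Array(bufferSize)];
    var lengths = [0, 0];   // number of valid samples in each buffer
    var active = 0;         // index of the buffer currently "playing"
    var readPos = 0;        // read position inside the active buffer
    var fileOffset = 0;     // where the next refill reads from

    function refill(index) {
        var chunk = decodeChunk(fileOffset, bufferSize);
        lengths[index] = chunk ? chunk.length : 0;
        if (chunk) {
            buffers[index].set(chunk);
            fileOffset += chunk.length;
        }
    }

    // Prime both buffers up front.
    refill(0);
    refill(1);

    return {
        // Pull one sample; swaps buffers transparently when one runs dry.
        nextSample: function () {
            if (readPos >= lengths[active]) {
                refill(active);          // refill the exhausted buffer...
                active = 1 - active;     // ...and switch to the other one
                readPos = 0;
                if (lengths[active] === 0) return null; // end of stream
            }
            return buffers[active][readPos++];
        }
    };
}
```

    In a real player, `refill` would run asynchronously (e.g. ahead of time, while the active buffer is still playing) instead of synchronously at the swap point.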
     
  14. zengjunjie59

    zengjunjie59

    Joined:
    Dec 10, 2019
    Posts:
    19
    This workload feels a bit heavy.:eek:
     
  15. SeventhString

    SeventhString

    Unity Technologies

    Joined:
    Jan 12, 2023
    Posts:
    410
    The problem is not coming from there. When you build your game for the WebGL target, Unity internally converts all your audio assets to 44100Hz AAC (in an MP4 container) and serializes that as your game data. It is this build/serialization step that we would have to bypass to fix your issue, I think.

    This is why I suggested adding your files in an unrecognizable state to your resources and then loading them as AudioClips with a web request. I do have a similar issue I'm working on these days; if I find something interesting I'll let you know!
     
  16. zengjunjie59

    zengjunjie59

    Joined:
    Dec 10, 2019
    Posts:
    19
    I tried your suggestion of removing the audio file extension and downloading it using UnityWebRequestMultimedia.GetAudioClip(url, AudioType.WAV), but it still occupies a lot of memory.
     
  17. SeventhString

    SeventhString

    Unity Technologies

    Joined:
    Jan 12, 2023
    Posts:
    410
    Just to double-check: was there any size difference with this method (and how did you measure the memory occupied)?
    Thanks!
     
  18. zengjunjie59

    zengjunjie59

    Joined:
    Dec 10, 2019
    Posts:
    19
    I used my MacBook to connect to my iPhone and used the Xcode -> Product -> Profile tool to check the memory usage of the mobile application. While playing the same audio clip, I observed that memory usage increased by the same amount, and after releasing the AudioClip the memory was not released.

    Another thing: I added some logging in the JS_Sound_Load method of Audio.js and found that the 'length' variable indeed matches that of my WAV file, and both are very small.
     
    SeventhString likes this.
  19. SeventhString

    SeventhString

    Unity Technologies

    Joined:
    Jan 12, 2023
    Posts:
    410
    Hi @zengjunjie59
    I confirmed that if you put your files in the StreamingAssets folder (
    Application.streamingAssetsPath
    ) they are excluded from the serialization process, so if you put them there and load them with a web request you may be able to avoid the AAC conversion and the size bloat. Here's the loading code I used; let me know if this helps.

    Code (CSharp):
    1. private IEnumerator LoadFromWeb()
    2. {
    3.     string filePath = Path.Combine(Application.streamingAssetsPath, audioClipFileName);
    4. #if UNITY_WEBGL && !UNITY_EDITOR
    5.     filePath = System.Uri.EscapeUriString(filePath);
    6. #endif
    7.     webRequest = UnityWebRequestMultimedia.GetAudioClip(filePath, audioType);
    8.     yield return webRequest.SendWebRequest();
    9.  
    10.     if (webRequest.result == UnityWebRequest.Result.Success)
    11.     {
    12.         audioClip = DownloadHandlerAudioClip.GetContent(webRequest);
    13.         Log("AudioClip loaded");
    14.     }
    15.     else
    16.     {
    17.         Log($"Error loading AudioClip: {webRequest.error}.");
    18.     }
    19. }
     
  20. zengjunjie59

    zengjunjie59

    Joined:
    Dec 10, 2019
    Posts:
    19
    Why do you think this method can bypass the serialization process? Aren't these audio files still downloaded from the CDN server, just with a different path? Or is it that Unity does not serialize audio files downloaded from the streaming assets path?
     
  21. zengjunjie59

    zengjunjie59

    Joined:
    Dec 10, 2019
    Posts:
    19
    I tested it according to the method you provided, and the result is the same as before.
     
  22. SeventhString

    SeventhString

    Unity Technologies

    Joined:
    Jan 12, 2023
    Posts:
    410
    Yes, files in this folder are excluded from serialization and are kept as they are in the game data.
     
  23. zengjunjie59

    zengjunjie59

    Joined:
    Dec 10, 2019
    Posts:
    19
    Indeed, audio files are not serialized in the editor, but is it the same for audio files downloaded through a CDN?
    From my tests, it seems that the memory usage issue might not be related to serialization.
     
  24. SeventhString

    SeventhString

    Unity Technologies

    Joined:
    Jan 12, 2023
    Posts:
    410
    I think your CDN data would be treated as-is, and as you say, serialization might not be the issue. Kind of back to square one...

    So if serialization is not the issue, and
    AudioClip.UnloadAudioData()
    has no effect, the only way I can see is to keep digging into the custom js plugin and try to find and release the JavaScript audio object. Its ID should be the same one you're getting from
    audioClip.GetInstanceID()
    .
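    A manual release along those lines could look something like the sketch below. This is an assumption built on the Audio.js source posted earlier in the thread, not an official API: the function name `JS_Sound_ReleaseBuffer` is made up, and in a real .jslib plugin the `WEBAudio` table would already be in scope (it is passed explicitly here only so the sketch is self-contained). It also ignores the question of channels that may still reference the buffer.

```javascript
// Hypothetical .jslib-style helper that drops a decoded sound buffer so
// the browser can garbage-collect it. WEBAudio.audioInstances is the same
// instance table used by the Audio.js source above.
function JS_Sound_ReleaseBuffer(WEBAudio, bufferInstance) {
    var sound = WEBAudio.audioInstances[bufferInstance];
    if (!sound) return false;            // unknown or already released
    sound.buffer = null;                 // drop the decoded PCM data
    delete WEBAudio.audioInstances[bufferInstance]; // free the table slot
    return true;
}
```

    Any C# call into such a helper would also need to make sure no AudioSource is still playing the clip, otherwise the channel-side code would try to read a buffer that no longer exists.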
     
  25. washming

    washming

    Joined:
    Aug 5, 2018
    Posts:
    2
    Any updates on this? I have the same crazy problem.
     
  26. Unifikation

    Unifikation

    Joined:
    Jan 4, 2023
    Posts:
    1,086
    Firstly, this is on Unity: it should not forcibly resample audio files to 44100Hz and transcode them into a codec not of the user's choice, because size is always the first and foremost consideration of web-targeting projects.

    Secondly, and probably much more importantly, Unity could and should provide access to OnAudioFilterRead in WebGL/WASM so that we can make our own sounds through the lightest possible means, which is sound synthesis.
     
  27. SeventhString

    SeventhString

    Unity Technologies

    Joined:
    Jan 12, 2023
    Posts:
    410
    I totally agree with you. In my opinion, the web is the lesser audio platform for Unity, essentially because it cannot leverage the FMOD codebase in that context. We are currently planning to review this platform's implementation.
    The main blocker is synchronization in WebGL. While wasm workers are a thing, a considerable amount of effort would be needed to properly shim FMOD onto the web in its current state.
     
    Unifikation likes this.
  28. Unifikation

    Unifikation

    Joined:
    Jan 4, 2023
    Posts:
    1,086
    It might be easier, faster, and more pragmatic to simply make your own "thread" for audio creation and processing that can inject into/over the FMOD stream after it has done its Unity component work, a kind of "post-processing" pass tacked on especially for WebGL. Otherwise I don't think it'll ever be possible, and there's no telling how long you'll have to wait for compute shaders with WebGPU to provide similar functionality, despite that probably being the superior option years down the road.

    Given that access/control/injection, I could create a simple FM/AM/PM synthesiser for most classical (and many modern) game sounds, plus simple reverb, delay, low-pass and high-pass filters and effects. I'd need someone else to make a granular synth, as I'm currently struggling with that, despite it (theoretically) being simpler to make.
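    The core of such an FM voice is only a few lines of math. Here is a pure-JS sketch that renders samples into a Float32Array; the function name and all parameter values are arbitrary examples, and the 44100Hz sample rate just matches the rest of the thread.

```javascript
// Minimal FM synthesis sketch: a sine carrier whose phase is modulated
// by a second sine oscillator (classic 2-operator FM).
function renderFM(seconds, carrierHz, modHz, modIndex, sampleRate) {
    var n = Math.floor(seconds * sampleRate);
    var out = new Float32Array(n);
    for (var i = 0; i < n; i++) {
        var t = i / sampleRate;
        var modulator = Math.sin(2 * Math.PI * modHz * t);
        // The modulator deviates the carrier's phase by up to modIndex radians.
        out[i] = Math.sin(2 * Math.PI * carrierHz * t + modIndex * modulator);
    }
    return out;
}
```

    To actually hear this in a browser you could copy the result into channel 0 of a Web Audio AudioBuffer and play it through an AudioBufferSourceNode, which sidesteps the missing OnAudioFilterRead entirely for synthesized one-shots.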
     
  29. washming

    washming

    Joined:
    Aug 5, 2018
    Posts:
    2
    I fully agree with this point. The current problem is that I just want to play some simple mono WAV audio, and even that cannot be reliably supported.
     
  30. SeventhString

    SeventhString

    Unity Technologies

    Joined:
    Jan 12, 2023
    Posts:
    410
    I agree with all of this. The reason I believe it has not already happened is that we wouldn't want to just Frankenstein this new feature on top of what we already have, and we would need to update/create that feature equally for all platforms. As you can certainly imagine, this is more than a localized surgical strike and would likely cause deprecations/API breaks.
    I really, honestly resonate with what you're suggesting, and while I can't make it happen solely by being internally vocal, there are places where you can be vocal, such as pinned posts asking for your feedback and public roadmaps.
     
  31. Unifikation

    Unifikation

    Joined:
    Jan 4, 2023
    Posts:
    1,086
    This has been a problem for the lifetime of WebGL. The only option is to Frankenstein it, I think.

    And that's what I'm suggesting: to get capability parity, there needs to be some consideration given to the fact that WebGL is unlike all other platforms and targets, and that some customisations are going to be necessary... and we all deal with it. That means Unity deals with it, and all of us (the users) too, so that we can make better games for this most unique and far-reaching of all platforms.

    PLEASE!!!
     
    SeventhString likes this.