How to receive Audio? How to play received Audio?

Discussion in 'Unity Render Streaming' started by mrSaig, May 19, 2021.

  1. mrSaig

    mrSaig

    Joined:
    Jan 14, 2010
    Posts:
    68
    Hey
    I'm currently working with the WebRTC 2.4.0-exp.1 package ... and managed to send camera video from one Unity client to another Unity client.
    I also looked at the DataChannel example and managed to send an audio stream to the other Unity client ... but I don't know how to play the received audio stream!

    Currently I have something like this.
    Code (CSharp):
    receiveStream = new MediaStream();
    receiveStream.OnAddTrack = e =>
    {
        Debug.Log("On Add Track");
        if (e.Track is VideoStreamTrack track)
        {
            receiveImage.texture = track.InitializeReceiver(1280, 720);
        }
        if (e.Track is AudioStreamTrack atrack)
        {
            Debug.Log("RECEIVED AUDIOSTREAM");
        }
    };
    but the AudioStreamTrack type has nothing like InitializeReceiver, the way VideoStreamTrack does?

    How can I assign the stream to an AudioSource?

    Thanks for any help!
     
  2. WayneVenter

    WayneVenter

    Joined:
    May 8, 2019
    Posts:
    47
    @mrSaig Can you confirm for me whether you get SDP offers/answers each time you add a new track? I'm getting the session connected between two Unity PCs and the add track works, and I also see the UDP packets, but the raw texture does not update. To answer your question on audio, do the following:

    private void OnAudioFilterRead(float[] data, int channels)
    {
        Audio.Update(data, data.Length);
    }

    OR

    private void Start()
    {
        AudioRenderer.Start();
    }

    private void Update()
    {
        var sampleCountFrame = AudioRenderer.GetSampleCountForCaptureFrame();
        var channelCount = 2; // AudioSettings.speakerMode == Stereo
        var length = sampleCountFrame * channelCount;
        var buffer = new NativeArray<float>(length, Allocator.Temp);
        AudioRenderer.Render(buffer);
        Audio.Update(buffer.ToArray(), buffer.Length);
        buffer.Dispose();
    }
     
  3. mrSaig

    mrSaig

    Joined:
    Jan 14, 2010
    Posts:
    68
    @WayneVenter Thanks, I will try your audio solution!
    If you get the track and your texture simply does not update,
    you probably forgot the WebRTC update call on one or both sides:
    Code (CSharp):
    StartCoroutine(WebRTC.Update());
     
    Last edited: May 20, 2021
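    For reference, a minimal sketch of where that call usually lives, assuming the com.unity.webrtc 2.x lifecycle API (WebRTC.Initialize, the WebRTC.Update() coroutine, WebRTC.Dispose); the component name is just illustrative:
    Code (CSharp):
    using Unity.WebRTC;
    using UnityEngine;

    // Illustrative helper: keeps the WebRTC plugin updated every frame.
    public class WebRtcRunner : MonoBehaviour
    {
        private void Awake()
        {
            WebRTC.Initialize();
        }

        private void Start()
        {
            // Without this coroutine, received video textures never update.
            StartCoroutine(WebRTC.Update());
        }

        private void OnDestroy()
        {
            WebRTC.Dispose();
        }
    }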
  4. mrSaig

    mrSaig

    Joined:
    Jan 14, 2010
    Posts:
    68
    @WayneVenter Hmm, your audio solutions are for recording only, right? I already have Audio.Update in my code ... but my problem is the other side ... the audio output.
     
  5. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    245
    mrSaig likes this.
  6. WayneVenter

    WayneVenter

    Joined:
    May 8, 2019
    Posts:
    47
    Hi @mrSaig Thank you for this, it solved my problem. I now have video streaming working between two PCs.
     
    mrSaig likes this.
  7. WayneVenter

    WayneVenter

    Joined:
    May 8, 2019
    Posts:
    47
    @mrSaig Yes, that was for recording. Currently, to solve this problem, I am using another package from FROZEN MIST (https://assetstore.unity.com/packages/templates/packs/fmetp-stream-143080). It works really well, but the video bandwidth usage is insane, so I need WebRTC VP8 encoding to reduce my bandwidth needs. I see from the release notes that we will only get audio out in July/August, so my suggestion would be to try the audio encoder from Frozen Mist and stream the data over the dataChannel in WebRTC, or just send it directly via the WebSocket and keep the video on WebRTC.
     
    mrSaig likes this.
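    A rough sketch of that data-channel workaround, assuming an RTCDataChannel that has already been created and opened elsewhere; the component name and the raw float-to-byte packing are illustrative (an encoder such as the one mentioned above would cut the bandwidth considerably):
    Code (CSharp):
    using System.Collections.Concurrent;
    using Unity.WebRTC;
    using UnityEngine;

    // Illustrative component: captures the audio passing through this GameObject
    // and ships the raw samples over a data channel (uncompressed, so bandwidth-heavy).
    public class AudioOverDataChannel : MonoBehaviour
    {
        public RTCDataChannel dataChannel; // assumed created and opened elsewhere
        private readonly ConcurrentQueue<float[]> pending = new ConcurrentQueue<float[]>();

        // Runs on the audio thread: copy the samples and return quickly.
        private void OnAudioFilterRead(float[] data, int channels)
        {
            pending.Enqueue((float[])data.Clone());
        }

        // Runs on the main thread: pack and send whatever the audio thread queued.
        private void Update()
        {
            if (dataChannel == null || dataChannel.ReadyState != RTCDataChannelState.Open)
                return;

            while (pending.TryDequeue(out var samples))
            {
                var bytes = new byte[samples.Length * sizeof(float)];
                System.Buffer.BlockCopy(samples, 0, bytes, 0, bytes.Length);
                dataChannel.Send(bytes);
            }
        }
    }
    On the receiving side you would reverse the BlockCopy into a float[] and feed it to an AudioClip or to an OnAudioFilterRead buffer on a playing AudioSource.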
  8. mrSaig

    mrSaig

    Joined:
    Jan 14, 2010
    Posts:
    68
  9. hyeongwooman

    hyeongwooman

    Joined:
    May 31, 2021
    Posts:
    6
  10. hyeongwooman

    hyeongwooman

    Joined:
    May 31, 2021
    Posts:
    6
    I would like to see an example of sending and receiving audio via AudioStream, specifically the sending part and the receiving part.
     
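    A rough sketch of the sending side in reply to the question above, assuming a package version where AudioStreamTrack has an AudioSource constructor (2.4.0-exp.4 or later); peerConnection and audioSource are placeholders for objects set up elsewhere:
    Code (CSharp):
    using Unity.WebRTC;
    using UnityEngine;

    public class AudioSender : MonoBehaviour
    {
        public AudioSource audioSource;           // the clip or microphone audio to send
        public RTCPeerConnection peerConnection;  // assumed created and negotiated elsewhere
        private AudioStreamTrack audioTrack;

        public void StartSendingAudio()
        {
            // Wrap the playing AudioSource in a WebRTC audio track and add it
            // to the peer connection; this triggers renegotiation.
            audioTrack = new AudioStreamTrack(audioSource);
            peerConnection.AddTrack(audioTrack);
            audioSource.loop = true;
            audioSource.Play();
        }
    }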
  11. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    245
    Audio renderer support was added in 2.4.0-exp.4.
     
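    And on the receive side, later versions of the package expose a SetTrack extension on AudioSource; a minimal sketch under that assumption (the exact call may differ between the experimental releases):
    Code (CSharp):
    using Unity.WebRTC;
    using UnityEngine;

    public class AudioReceiver : MonoBehaviour
    {
        public AudioSource outputSource; // AudioSource that will play the remote audio
        private MediaStream receiveStream;

        private void Start()
        {
            receiveStream = new MediaStream();
            receiveStream.OnAddTrack = e =>
            {
                if (e.Track is AudioStreamTrack audioTrack)
                {
                    // Route the decoded remote audio into the local AudioSource.
                    outputSource.SetTrack(audioTrack);
                    outputSource.loop = true;
                    outputSource.Play();
                }
            };
            // Elsewhere: peerConnection.OnTrack = e => receiveStream.AddTrack(e.Track);
        }
    }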
  12. hyeongwooman

    hyeongwooman

    Joined:
    May 31, 2021
    Posts:
    6
    Thank you very much for your reply, gtk2k.
     