
How to get WebSockets work with WebGL

Discussion in 'Scripting' started by Zita, Jan 8, 2017.

  1. Zita

    Zita

    Joined:
    Oct 24, 2014
    Posts:
    17
    I have made a Unity WebGL app based on the WebSocket asset, plus a web socket service using AspWebSockets (ashx) which I host in IIS on port 80 with scheme "ws". The app works fine when I run it in the editor: it communicates with my ws-server as it should. But when I build the WebGL version and put it up on the server, it stops working. I'm running the app in Chrome and everything else works fine. It loads and runs without problems, except that it never seems to be able to contact the server. Do I have to enable WebGL to communicate on ws:// with port 80 somehow, or should it work right away? It's strange, since it works fine in the editor on the same computer. Any ideas?
     
  2. Zita

    Zita

    Joined:
    Oct 24, 2014
    Posts:
    17
    I got it working now. I changed one thing and then suddenly it worked. In my socket.SendAsync method on the server I had specified WebSocketMessageType = Text, and when I changed that to Binary it began to work. If any of you know why, please let me know. The strangest thing is that it worked fine with Text when running in the editor, but only with Binary when running in a browser. Also, what is this parameter really for? Is sending in Binary more efficient somehow?
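    For context on what that parameter controls: in the WebSocket protocol (RFC 6455), every data frame carries a 4-bit opcode marking it as Text (0x1, payload must be valid UTF-8, delivered to a browser as a string) or Binary (0x2, arbitrary bytes, delivered as an ArrayBuffer/Blob). WebSocketMessageType selects that opcode. A plausible explanation for the behavior above is that the WebGL-side plugin only handles binary frames, while the editor's .NET client accepts both; neither type is inherently more efficient for the same payload. The sketch below (Python, not Unity code, and simplified to short unmasked server-to-client frames) shows that the two message types produce identical payload bytes on the wire and differ only in the opcode:

    ```python
    # Minimal RFC 6455 frame builder, as a sketch of what the
    # Text/Binary message type controls: the 4-bit frame opcode.
    OPCODE_TEXT = 0x1    # payload must be valid UTF-8; browsers deliver a string
    OPCODE_BINARY = 0x2  # arbitrary bytes; browsers deliver an ArrayBuffer/Blob

    def build_frame(payload: bytes, opcode: int) -> bytes:
        """Build one unmasked WebSocket frame (payloads shorter than 126 bytes)."""
        if len(payload) >= 126:
            raise ValueError("extended payload lengths omitted in this sketch")
        header = bytes([0x80 | opcode,   # FIN=1 plus the 4-bit opcode
                        len(payload)])   # MASK=0, 7-bit payload length
        return header + payload

    text_frame = build_frame("hello".encode("utf-8"), OPCODE_TEXT)
    binary_frame = build_frame(b"hello", OPCODE_BINARY)

    # Same payload bytes either way; only the opcode nibble differs,
    # which changes how the receiving side hands the data to the app.
    print(hex(text_frame[0] & 0x0F))    # 0x1
    print(hex(binary_frame[0] & 0x0F))  # 0x2
    ```
    
    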