
Correct way to host game on Amazon S3 with GZIP compression?

Discussion in 'WebGL' started by Deleted User, Jul 21, 2016.

  1. Deleted User

    Guest

    Hi,
    We released Piranh.io, a web game similar to Agar.io and Slither.io, a few weeks ago. The game is hosted on Amazon S3 with CloudFront, and we're using Unity 5.3.5f1.

    We are getting the gzip notifications ("Decompressed Release/Piranhio.memgz in 125ms. You can remove this delay if you configure your web server to host files using gzip compression").

    What is the correct way to set up gzip support for Amazon S3 and CloudFront? I followed these instructions, but I still got the "SyntaxError: illegal character" error.
     
  2. Deleted User

    Guest

    OK, I got this working. If someone else is struggling with the same issue, the steps are below (with a rough upload-script sketch after the list). This applies to Unity 5.3.5 + WebGL + Amazon S3 + Amazon CloudFront.

    1. Build the WebGL version from Unity
    2. Go to "Release" folder and delete .htaccess file
    3. Rename YOURGAME.datagz to YOURGAME.data
    4. Rename YOURGAME.jsgz to YOURGAME.js
    5. Rename YOURGAME.memgz to YOURGAME.mem
    6. Go to your Amazon S3 bucket and upload all files there
    7. In the bucket, go to the Release folder, select YOURGAME.data and click "Properties" at the top right. The right-side panel should open.
    8. Open the "Metadata" accordion panel
    9. Click "Add more metadata". For "Key" select "Content-Encoding" and enter gzip as the Value (see attached image). Then click "Save".
    10. Do steps 7-9 for files YOURGAME.js and YOURGAME.mem
    11. Done.
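
    If you prefer to script steps 7-10 instead of clicking through the console, something along these lines should also work. This is only a rough sketch using Python and boto3; the bucket name and paths are placeholders, and it assumes the files have already been renamed as in steps 3-5.

    Code (Python):
        # Rough sketch: upload one of the pre-compressed build files and set the
        # Content-Encoding header, equivalent to steps 7-9 above.
        # "your-bucket" and the paths are placeholders for your own setup.
        import boto3

        s3 = boto3.client("s3")
        s3.upload_file(
            "Release/YOURGAME.data",   # local file, already gzip-compressed by Unity
            "your-bucket",             # your S3 bucket name
            "Release/YOURGAME.data",   # object key in the bucket
            ExtraArgs={"ContentEncoding": "gzip"},
        )

    Repeat (or loop) for YOURGAME.js and YOURGAME.mem.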
     

    Attached Files:

    AFrisby likes this.
  3. liortal

    Joined: Oct 17, 2012
    Posts: 3,559
    We are also hosting our WebGL content on Amazon S3, and we run through the same steps you mention.

    There's an Amazon AWS plugin for Unity that allows you to automate pretty much every aspect of working with S3. I believe most of the steps you listed above could be automated.

    We plan to automate this and run it on every successful build that completes in Unity Cloud Build.
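
    For reference, here is roughly what that automation could look like with plain Python and boto3 rather than the AWS plugin. It is a sketch only; the bucket name, local folder and key prefix are placeholders.

    Code (Python):
        # Sketch: strip the "gz" suffix from Unity's compressed WebGL build files
        # and upload them with Content-Encoding: gzip set on the S3 objects.
        # Bucket name, folder and key prefix are placeholders.
        import os
        import boto3

        BUCKET = "your-bucket"
        RELEASE_DIR = "Build/Release"

        s3 = boto3.client("s3")
        for name in os.listdir(RELEASE_DIR):
            if name.endswith(("datagz", "jsgz", "memgz")):
                key = "Release/" + name[:-2]   # e.g. YOURGAME.datagz -> YOURGAME.data
                s3.upload_file(
                    os.path.join(RELEASE_DIR, name),
                    BUCKET,
                    key,
                    ExtraArgs={"ContentEncoding": "gzip"},
                )
                print("uploaded", key)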
     
    ramand likes this.
  4. alexsuvorov

    Unity Technologies

    Joined: Nov 15, 2015
    Posts: 327
    This should normally work; however, be aware that some clients and intermediate proxy servers might not support or expect Content-Encoding: gzip. The original .htaccess overrides this header only if the Accept-Encoding: gzip request header was provided by the client; otherwise the load might fail. The original build configuration might add about 1 second to the loading time when hosting on services that do not support URL rewrite, but it also guarantees correct loading for all clients.
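
    A quick way to see what different clients would receive from your bucket or CloudFront distribution is to probe the URL with and without gzip in Accept-Encoding. A small sketch follows; the URL is a placeholder.

    Code (Python):
        # Sketch: compare the response for a client that advertises gzip support
        # and one that does not. With fixed Content-Encoding metadata on S3,
        # both typically get the same gzip-encoded body. URL is a placeholder.
        import requests

        URL = "https://example.cloudfront.net/Release/YOURGAME.data"
        for accept in ("gzip", "identity"):
            r = requests.head(URL, headers={"Accept-Encoding": accept})
            print(accept, "->", r.status_code, r.headers.get("Content-Encoding"))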
     
  5. Deleted User

    Guest

    Thank you for the insight, Alex! We'll consider whether to use this solution in the future. Too bad Amazon S3 does not support .htaccess, so it will always default to the non-gzip version even if the client supports gzip.
     
  6. DimensionU

    Joined: Aug 1, 2013
    Posts: 43
    I'm running into an issue that appears to resemble what @alexsuvorov was talking about. We have been hosting our .datagz, .jsgz and .memgz files on S3 and CloudFront, adding the encoding metadata that @JyriKilpelainen mentioned. It has been working for us, but we now have a client who gets 403 errors, and the files show up as .datagzgz, .jsgzgz, etc. We're building with Unity 5.4, so there is no .htaccess file. If I made my own (modeled after the old pre-5.4 one), would that help in our situation?

    Our dev site uses S3 in exactly the same way, except that we're not using CloudFront on dev. The client can access those files. Is this a CloudFront/CDN issue? Is there a solution? What should we do to remedy this?
     
  7. Deleted User

    Guest

    @DimensionU Did you solve the problem? I'd like to use the gzip version on Amazon S3 to reduce the game's load time, but I really don't want to cause any problems for players.
     
  8. DimensionU

    Joined: Aug 1, 2013
    Posts: 43
    @JyriKilpelainen sorry for the late response. I just saw your post...
    I did, in fact, solve it. I had made the mistake of accessing those gzip-compressed resources on S3 over HTTP instead of HTTPS. I am not sure why this was causing the problem, but when I corrected it, the problem went away.