How to set up local multiplayer for AR?

Discussion in 'AR' started by Artificial_Illusions, Jun 7, 2019.

  1. Artificial_Illusions

    Artificial_Illusions

    Joined:
    Jul 24, 2018
    Posts:
    20
    Hello,
    I want to get into developing multiplayer games now that ARKit 3 has collaboration features. I'm interested in local multiplayer, where users share the same network and play together in the same room. I have never coded a multiplayer game before and I'm very interested in learning more about it.
    What would be the best workflow and guides for implementing this feature in my AR game?

    Thank you!
     
  2. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    Check out the ARCollaborationDataExample sample scene in the ARFoundation Samples repo. To reduce dependencies for the example, I just used the TcpClient and TcpListener classes built into C#. The exact networking solution that you use will depend on your app. Other options are HLAPI and Photon.
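    At its simplest, the host/client setup with those two classes looks something like the sketch below (the port number and the blocking accept/connect calls are just for illustration, not copied from the sample):

    Code (CSharp):
    using System.Net;
    using System.Net.Sockets;

    // Minimal sketch of the TcpListener/TcpClient pattern. The port number and the
    // blocking Accept/Connect calls are illustrative only; a real app would handle
    // this asynchronously and cope with errors and disconnects.
    public static class TcpConnectionSketch
    {
        const int kPort = 8502; // arbitrary port chosen for this sketch

        // Host: listen on all interfaces and wait for a single client.
        public static TcpClient WaitForClient()
        {
            var listener = new TcpListener(IPAddress.Any, kPort);
            listener.Start();
            return listener.AcceptTcpClient(); // blocks until a client connects
        }

        // Client: connect to the host's IPv4 address entered in the UI.
        public static TcpClient ConnectToHost(string hostIp)
        {
            var client = new TcpClient();
            client.Connect(IPAddress.Parse(hostIp), kPort);
            return client;
        }
    }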
     
  3. Artificial_Illusions

    Artificial_Illusions

    Joined:
    Jul 24, 2018
    Posts:
    20
    Thanks so much for the simple answer! I've looked into Photon for multiplayer, but I wanted to start with something basic for development purposes first.
    Now, I tried to sideload the ARCollaborationDataExample sample scene onto my iOS devices, but for some reason the screen is blank on my phone and Xcode throws an error. I am using Xcode 11 beta and Unity 2019.1.5f1, and the phone is on iOS 12.4. What am I doing wrong? I've never had this happen before.
     

    Attached Files:

  5. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    Collaborative sessions are an iOS 13 feature, so you'd need to update your phone to iOS 13 to use them.
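    If you want to guard against that in code, you can check whether the device supports collaboration before enabling it, along these lines (a sketch using the ARKit XR Plugin API as I recall it; double-check the names against the package version you have installed):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARKit;

    // Sketch: only turn on ARKit collaboration when the device/OS supports it.
    // Property names are from the ARKit XR Plugin as I remember them; verify them
    // against the package version you have installed.
    [RequireComponent(typeof(ARSession))]
    public class EnableCollaboration : MonoBehaviour
    {
        void Start()
        {
            var session = GetComponent<ARSession>();
            if (session.subsystem is ARKitSessionSubsystem arKitSession &&
                ARKitSessionSubsystem.supportsCollaboration)
            {
                arKitSession.collaborationEnabled = true;
            }
            else
            {
                Debug.LogWarning("Collaborative sessions require iOS 13 and an ARKit 3 capable device.");
            }
        }
    }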
     
  6. Artificial_Illusions

    Artificial_Illusions

    Joined:
    Jul 24, 2018
    Posts:
    20
    Sweet, I updated both my iPhone and iPad to the latest OS and was able to test the ARCollaborationDataExample scene. I got both devices to communicate, and the log shows whenever either device sends or receives data. But whenever I tap a plane to add a prefab or reference point, the device shows the hit as local only, and it does not show up on the other device, even though the log reports having sent/received data. Is this normal?
     
  7. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    Both devices have to agree they are in the same place before a reference point from one device can be resolved on the other device. This can take a bit of time and is not a precise science. There is no additional debugging information, unfortunately.

    Both devices should be showing a log on screen. One thing you can check: when one device logs that it sent data of a certain size, see whether the other device has a corresponding "received data of XXX bytes" message.
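    Those log lines are just the payload size as seen on each end of the socket. A length-prefixed framing along the lines below (a simplified sketch, not the sample's exact code) is one way those "sent"/"received" sizes end up matching:

    Code (CSharp):
    using System;
    using System.IO;
    using System.Net.Sockets;
    using UnityEngine;

    // Simplified sketch of length-prefixed framing over a NetworkStream: each payload
    // is preceded by a 4-byte length, so the byte counts logged on the sending and
    // receiving devices should correspond. Not the sample's exact code.
    public static class FramedStream
    {
        public static void Send(NetworkStream stream, byte[] payload)
        {
            var length = BitConverter.GetBytes(payload.Length);
            stream.Write(length, 0, length.Length);
            stream.Write(payload, 0, payload.Length);
            Debug.Log($"Sent data of {payload.Length} bytes");
        }

        public static byte[] Receive(NetworkStream stream)
        {
            var length = BitConverter.ToInt32(ReadExactly(stream, 4), 0);
            var payload = ReadExactly(stream, length);
            Debug.Log($"Received data of {payload.Length} bytes");
            return payload;
        }

        // Reads exactly 'count' bytes or throws if the connection closes early.
        static byte[] ReadExactly(NetworkStream stream, int count)
        {
            var buffer = new byte[count];
            int offset = 0;
            while (offset < count)
            {
                int read = stream.Read(buffer, offset, count - offset);
                if (read == 0)
                    throw new EndOfStreamException("Connection closed");
                offset += read;
            }
            return buffer;
        }
    }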
     
  8. Artificial_Illusions

    Artificial_Illusions

    Joined:
    Jul 24, 2018
    Posts:
    20
    Hey, yes, absolutely. My process was:

    - Get the IPv4 address of the host Apple device (on the shared Wi-Fi network) and enter it in the app on the host device
    - Enter the same IP on the client device and connect to the host
    - Connection is established, and I get green lights flashing for the send/receive signals
    - Every time I do a hit, the log shows that x bytes were sent from whichever device I tapped on, and the receiving device gets the "received data of XXX bytes" message
    But the hit only spawns a local coordinate model on the device the hit was made on. The other device does not show that model, even though it logged that it received the data.
    I also tried attaching a component to ARSessionOrigin that spawns prefabs on hit to test this (roughly like the sketch below), and I still can't get the devices to "see" models spawned by the other device and vice versa.
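    For reference, the test component I attached is roughly along these lines (AR Foundation 2.x names like ARReferencePointManager, so treat it as a sketch rather than my exact code):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    // Rough sketch of the tap-to-place test component: raycast against detected planes
    // and add a reference point (anchor) at the hit pose. Content attached to reference
    // points is what the other device can eventually resolve once the sessions have
    // localized to each other. Names follow the AR Foundation 2.x packages I'm using.
    [RequireComponent(typeof(ARRaycastManager))]
    [RequireComponent(typeof(ARReferencePointManager))]
    public class PlaceOnTap : MonoBehaviour
    {
        static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

        ARRaycastManager m_RaycastManager;
        ARReferencePointManager m_ReferencePointManager;

        void Awake()
        {
            m_RaycastManager = GetComponent<ARRaycastManager>();
            m_ReferencePointManager = GetComponent<ARReferencePointManager>();
        }

        void Update()
        {
            if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
                return;

            if (m_RaycastManager.Raycast(Input.GetTouch(0).position, s_Hits, TrackableType.PlaneWithinPolygon))
            {
                // The closest hit comes first; create a reference point at that pose.
                // A prefab assigned to the manager (or instantiated here) visualizes
                // the point on this device.
                m_ReferencePointManager.AddReferencePoint(s_Hits[0].pose);
            }
        }
    }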

    If you have it working, would you mind sharing your process?
    Also, do you know if Apple's SwiftShot AR demo uses the same networking method? I am currently developing an AR game and I'm interested in implementing local multiplayer like SwiftShot has. What would be the best approach with Unity: would the HLAPI be enough, or would ARCollaborationDataExample be a good starting point?

    Thanks much!
     
  9. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    Sounds like you are doing it right; that is the correct process. In testing, we found that you need lots of recognizable features (e.g., a row of smooth white desktops is terrible for localization). Try placing a graphic T-shirt or some other distinctive object for the devices to key off of. A wood-grain tabletop works pretty well too.

    SwiftShot uses networking methods specific to Apple. As for the best networking approach, there's a recent blog post about this that may help clarify.