Can ARKit be used on PC?

Discussion in 'AR' started by dukerustfield, Apr 24, 2020.

  1. dukerustfield

    dukerustfield

    Joined:
    Dec 5, 2019
    Posts:
    33
    I want to use AR from my Apple devices to modify Unity assets, but I work on a PC. If there isn't going to be a build or deployable game/product/app, do I still need to have iOS? I've been able to use Face Cap and other tools to link directly to Unity from my iOS devices, but I'd rather be using ARKit at the source.

    Thanks.
     
  2. sam598

    sam598

    Joined:
    Sep 21, 2014
    Posts:
    60
    ARKit is developed by Apple and the source code is proprietary. Apple's goal in this is to get more people to buy iOS devices, so they have no incentive to open source ARKit for PC.

    Also ARKit relies on carefully calibrated cameras and sensors all running in sync. There are few PC peripherals that combine a calibrated camera with the necessary sensors. The only two that come close are the RealSense D435i and the Azure Kinect.

    If you are just looking for a 6dof device for modifying assets, I would recommend looking into the RealSense T265 or a Vive Tracker puck.
     
  3. dukerustfield

    dukerustfield

    Joined:
    Dec 5, 2019
    Posts:
    33
    I've already said I want to use ARKit on my APPLE DEVICES. Which is where all those cameras and sensors are. They aren't on the Mac. I want to use Apple->Unity, and Unity is on my PC. Unity shouldn't give a crap what OS it is running on.

    iClone can do LiveFace using an iPhone in real time on a PC. And with a lot of damn work I can get other apps to use iPhone->Unity. While I appreciate your response, you didn't seem to read what I wrote.

    * I got an iPhone capable of scanning faces and processing blendshapes and meshes. It is made by Apple.
    * I have Unity on a PC running Windows. Or running any goddamn thing that can run Unity. Why should it matter?
    * I do not want to change operating systems just because of one upstream application. That would be incredibly wasteful.
    * I am not building it for a target platform. So I need no libraries, methods, hardware or anything other than what the iPhone is recording.
     
    SmithySFC likes this.
  4. sam598

    sam598

    Joined:
    Sep 21, 2014
    Posts:
    60
    Hi Duke,

    I tried my best to understand your post, but it is not exactly clear what you are trying to accomplish. Your question appears to be "Can ARKit be used on PC?" and the answer is no for the following reasons:

    - ARKit is a proprietary API made by Apple that only runs on iOS and iPadOS.
    - In order to build an app for iOS or iPadOS you need to use Xcode.
    - In order to use Xcode you need a device that runs macOS.

    When deploying to iOS or iPadOS, Unity generates an Xcode project. Unity tells Xcode to use ARKit, but Unity does not contain the ARKit source or have any direct access to ARKit.

    In your second post you mention LiveFace by iClone, and that ARKit is capable of capturing face blendshapes. Are you trying to use something like Facial AR Remote (https://github.com/Unity-Technologies/facial-ar-remote), where an app on an iOS device streams the data to a PC?

    If so, the answer is yes; it is easy to accomplish. But you will still need a macOS device to deploy an app to iOS, and you will have to pay the yearly developer fee. The only alternative is to have someone build the iOS app for you and send it to you through TestFlight.

    Hopefully this helps clear up the confusion.
     
  5. dukerustfield

    dukerustfield

    Joined:
    Dec 5, 2019
    Posts:
    33
    I'm not building an app. I've stated that several times. Facial AR Remote is two years old and tied to ARKit 2 and Xcode.

    I have an app now where I can stream directly to Unity using my iPhone. So that's done. But I'm looking for Unity technologies so I don't have to C# this left and right and reinvent the wheel.

    I want to get animation into the Unity editor from my iPhone. Period. That's it. Unity. iPhone.
     
  6. dukerustfield

    dukerustfield

    Joined:
    Dec 5, 2019
    Posts:
    33
    Here is the solution I have so far:

    Software on iPhone to capture the data
    * https://www.bannaflak.com/face-cap/
    Transfer to Unity in play mode via OSC (a minimal receiver sketch follows this list)
    * https://www.bannaflak.com/face-cap/livemode.html
    Record humanoid blendshapes with Unity's GameObjectRecorder, via this tool specifically, which you can simply drag onto the object (such as a head) that has your SkinnedMeshRenderer (a recorder sketch follows further below)
    * https://github.com/newyellow/Unity-Runtime-Animation-Recorder
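
    For anyone who wants to wire up the OSC step without a third-party package, here is a minimal sketch of the receiving side. It assumes the phone sends one OSC message per blendshape at a hypothetical address "/W" carrying an int index and a float weight in 0..1; that address, the argument layout, and the port are placeholders, so check the Face Cap live-mode documentation for the real message format.

    using System;
    using System.Net;
    using System.Net.Sockets;
    using System.Text;
    using UnityEngine;

    // Sketch: receive OSC blendshape messages over UDP and drive a
    // SkinnedMeshRenderer. Address "/W", argument layout, and port are
    // assumptions; match them to the sending app's documented format.
    public class OscBlendshapeReceiver : MonoBehaviour
    {
        public SkinnedMeshRenderer target; // head mesh whose blendshapes get driven
        public int port = 9000;            // assumed port; set to match the phone app

        UdpClient _udp;

        void Start() => _udp = new UdpClient(port);

        void Update()
        {
            // Drain packets on the main thread, since SetBlendShapeWeight
            // cannot be called from a worker thread.
            while (_udp.Available > 0)
            {
                IPEndPoint remote = null;
                Parse(_udp.Receive(ref remote));
            }
        }

        void Parse(byte[] b)
        {
            int pos = 0;
            string address = ReadString(b, ref pos); // e.g. "/W" (assumed)
            string tags = ReadString(b, ref pos);    // ",if" = one int, one float
            if (address != "/W" || tags != ",if") return;

            int index = ReadInt(b, ref pos);
            float weight = ReadFloat(b, ref pos);
            if (target != null && index >= 0 && index < target.sharedMesh.blendShapeCount)
                target.SetBlendShapeWeight(index, weight * 100f); // 0..1 in, 0..100 in Unity
        }

        // OSC strings are ASCII, null-terminated, padded to a 4-byte boundary.
        static string ReadString(byte[] b, ref int pos)
        {
            int start = pos;
            while (b[pos] != 0) pos++;
            string s = Encoding.ASCII.GetString(b, start, pos - start);
            pos = (pos + 4) & ~3; // step over the null padding
            return s;
        }

        // OSC ints and floats are big-endian 32-bit values.
        static int ReadInt(byte[] b, ref int pos)
        {
            int v = (b[pos] << 24) | (b[pos + 1] << 16) | (b[pos + 2] << 8) | b[pos + 3];
            pos += 4;
            return v;
        }

        static float ReadFloat(byte[] b, ref int pos)
        {
            var le = new[] { b[pos + 3], b[pos + 2], b[pos + 1], b[pos] }; // swap to little-endian
            pos += 4;
            return BitConverter.ToSingle(le, 0);
        }

        void OnDestroy() => _udp?.Close();
    }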

    Then you can record your iPhone animating your blendshapes alongside whatever other animation and scenes you have, and put that animation in a timeline (or wherever). The recorder is very heavyweight, and so are the other components, so this is not a great method for very intense or complex operations, IMHO. But it's a wireless transfer from phone->Unity that can be done all in the editor. So it's excellent for spot corrections or syncing up scenes.
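
    For reference, the editor API underneath recorders like the one linked above is Unity's built-in GameObjectRecorder (UnityEditor.Animations). A minimal sketch, assuming you attach this to the GameObject with the SkinnedMeshRenderer and assign a .anim clip asset in the inspector; the component here is illustrative, not the linked tool's actual code.

    #if UNITY_EDITOR
    using UnityEngine;
    using UnityEditor.Animations;

    // Sketch: record the blendshape weights (and everything else animatable on
    // the SkinnedMeshRenderer) while the OSC stream drives them in play mode,
    // then bake the curves into the assigned clip when play mode ends.
    public class BlendshapeRecorder : MonoBehaviour
    {
        public AnimationClip clip; // assign a .anim asset in the inspector
        GameObjectRecorder _recorder;

        void Start()
        {
            _recorder = new GameObjectRecorder(gameObject);
            // Bind all animatable properties of the renderer, including weights.
            _recorder.BindComponentsOfType<SkinnedMeshRenderer>(gameObject, false);
        }

        void LateUpdate()
        {
            if (clip != null)
                _recorder.TakeSnapshot(Time.deltaTime); // sample one frame of curves
        }

        void OnDisable()
        {
            // Fires when exiting play mode; write the recorded curves to the asset.
            if (clip != null && _recorder != null && _recorder.isRecording)
            {
                _recorder.SaveToClip(clip);
                _recorder.ResetRecording();
            }
        }
    }
    #endif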
     
  7. sam598

    sam598

    Joined:
    Sep 21, 2014
    Posts:
    60
    Hi Duke, it sounds like you have a working solution, so I am not sure what you are trying to accomplish.

    If you are having performance issues or dropped frames when sending face data over WiFi, or when recording animation to disk using the Unity Runtime Recorder, it looks like the Face Cap app lets you record data as an FBX. This should let you capture all of your performance data, and that file can be easily imported into Unity or your DCC tool of choice.

    It’s not as immediate as going straight into Unity, but you might get a better end result. Those are options you will have to weigh in your decision.

    As I have explained several times, the only way to use ARKit natively is through an iOS or iPadOS app. That is why solutions like Face Cap require a native app to capture the data initially, and then provide workflow tools to get the data onto a PC.
     
  8. dukerustfield

    dukerustfield

    Joined:
    Dec 5, 2019
    Posts:
    33
    I think it's just a terminology issue. ARKit is both the software that runs on the iOS device and the SDK on a Mac you can use to develop, and the name is used interchangeably, as well as to mean, loosely, "AR." But you can plug into whatever is on the device, and it is irrelevant where you are or what you're using on the back end. It's only when you want to communicate directly or make apps that they throw things at you.

    ARKit is just an API. And you can interface with APIs all sorts of ways. No matter how much they want to lock it down and make it proprietary. We do this all the time in Unity.

    As for Face Cap, I use the FBX methods for longer portions and when I want to sync with audio. The OSC is for spot modifications during edit mode. And, again, it makes use of the ARKit capabilities of the phone, which I utilize on my PC.
     
    yellowgypsy likes this.
  9. ArtificeDesigned

    ArtificeDesigned

    Joined:
    May 22, 2015
    Posts:
    1
    Hey, sorry to bump this old thing. I wanted to direct any other readers to this Unity article:
    ARKit Remote: Now with face tracking!

    I'm also wondering what workflow the OP ended up going with? The one he mentioned earlier sounds like a lot of DIY might be involved...
     
  10. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    The framework being referenced in the article has since been deprecated and is no longer supported by Unity.