Accessing CameraCapture.mm functions in Xcode

Discussion in 'AR/VR (XR) Discussion' started by Kevin__h, Apr 3, 2018.

  1. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    So I want to implement CoreML (Apple's machine learning framework, here for a computer vision model) in a Unity ARKit application. I managed to implement CoreML in Unity without a problem using a native Swift plugin. The problem is combining it with ARKit for Unity: my native Swift code uses an AVCaptureSession to get the camera feed, but so does ARKit, and you can't run two AVCaptureSessions at the same time. Now I'm trying to make my CoreML script use the AVCaptureSession from Unity, but the only reference to it is in CameraCapture.mm. Adding this class's header to my Swift bridging header gave me access to the captureSession, but when I try to build, Xcode throws the following error: "symbols not found for architecture arm64". This is a very common error and I tried every solution I could find online; nothing worked. Am I calling this Objective-C function wrong, or does Unity simply not allow us to access it? Or maybe there is another way to achieve my goal, by looking for an active AVCaptureSession and adding my CoreML output to it?

    (posting this here because I can't post on Unity Answers; I get an error every time I try)
     
  2. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Do you need the AVCaptureSession, or just the CVPixelBufferRef that ARKit uses? I believe CoreML just needs a pointer to an image to carry out its processing. If so, you can just use https://developer.apple.com/documentation/arkit/arframe/2867984-capturedimage?language=objc
    This hasn't been exposed to C# directly in the plugin, but since you seem familiar with writing native plugins, it should be straightforward for you?
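    For illustration, here is a minimal Objective-C sketch of that idea: feeding a CVPixelBufferRef straight to a CoreML model through the Vision framework, with no AVCaptureSession involved. The model and pixelBuffer variables are assumed to come from elsewhere; only the Vision/CoreML calls themselves are the real API.

    Code (ObjectiveC):

    #import <Vision/Vision.h>
    #import <CoreML/CoreML.h>

    // pixelBuffer: e.g. ARFrame.capturedImage; model: an MLModel loaded elsewhere.
    static void ClassifyPixelBuffer(CVPixelBufferRef pixelBuffer, MLModel *model) {
        NSError *error = nil;
        VNCoreMLModel *vnModel = [VNCoreMLModel modelForMLModel:model error:&error];
        if (vnModel == nil) { NSLog(@"Vision model error: %@", error); return; }

        VNCoreMLRequest *request = [[VNCoreMLRequest alloc] initWithModel:vnModel
            completionHandler:^(VNRequest *req, NSError *err) {
                // For a classifier, the results are VNClassificationObservations.
                VNClassificationObservation *top =
                    (VNClassificationObservation *)req.results.firstObject;
                NSLog(@"Top label: %@ (%.2f)", top.identifier, top.confidence);
            }];

        VNImageRequestHandler *handler =
            [[VNImageRequestHandler alloc] initWithCVPixelBuffer:pixelBuffer options:@{}];
        [handler performRequests:@[request] error:&error];
    }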
     
  3. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    This is the way I'm trying to approach it right now, but getting the pixelbuffer is a pain in the ass. I somehow need to access the pixelbuffer from ARSessionNative.mm, and that seems impossible.
     
  4. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Line 750 of ARSessionNative.mm is:
    CVPixelBufferRef pixelBuffer = frame.capturedImage;

    You can save that into a static global and access it from your native plugin as a quick and dirty solution.
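    As a rough sketch of that quick-and-dirty route (the variable and function names here are invented, and the retain/release is an extra precaution so the buffer can't be freed out from under the plugin):

    Code (ObjectiveC):

    // In ARSessionNative.mm, next to the existing line 750:
    static CVPixelBufferRef s_latestPixelBuffer = NULL;

    // Inside the frame update callback:
    CVPixelBufferRef pixelBuffer = frame.capturedImage;
    if (s_latestPixelBuffer != NULL) {
        CVPixelBufferRelease(s_latestPixelBuffer);   // let go of the previous frame
    }
    s_latestPixelBuffer = CVPixelBufferRetain(pixelBuffer);   // keep this one alive

    // Accessor the CoreML plugin can call:
    extern "C" CVPixelBufferRef GetLatestPixelBuffer() {
        return s_latestPixelBuffer;
    }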
     
  5. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    That's one way to go about it, I guess. At the moment I'm trying to access the pixelbuffer from Unity and pass it through that way; I just need to find a way to convert an IntPtr to a CVPixelBuffer and we should be there :'). Oh, and FYI, I'm not that familiar with writing native plugins; for the past 2 weeks I just followed a LOT of different tutorials trying to figure out how to use native Swift in Unity so I could get CoreML in there. All tutorials were missing one important part though: what to do in Xcode once you build from Unity, since a lot of settings have to be changed there as well. Anyway, your solution might turn out to be not such a quick (but still dirty) one for me :') (2 weeks ago I knew nothing about Swift or Xcode), but if it works it's better than having nothing at all.
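    For reference, the IntPtr-to-CVPixelBuffer conversion should be just a cast on the native side. A minimal sketch of the bridge I have in mind (the function name is hypothetical; the matching C# declaration is in the comment):

    Code (ObjectiveC):

    #import <CoreVideo/CoreVideo.h>

    // C# side, for reference:
    //   [DllImport("__Internal")] private static extern void ClassifyARKitFrame(IntPtr pixelBufferPtr);
    extern "C" void ClassifyARKitFrame(void *pixelBufferPtr) {
        CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)pixelBufferPtr;
        if (pixelBuffer == NULL) { return; }
        CVPixelBufferRetain(pixelBuffer);    // guard against the frame being released mid-analysis
        // ... hand pixelBuffer to the CoreML/Vision code here ...
        CVPixelBufferRelease(pixelBuffer);
    }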

    Link to my tutorial on how to implement CoreML in Unity: https://medium.com/@kevinhuyskens/implementing-coreml-in-unity-e91bcf80a3c5
    (it's a quick draft; I'll improve it once I have some free time)
     
  6. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Ok, so you were talking about a dirty solution; I did a super dirty one: creating a Swift CVPixelBuffer variable that starts out pointing to an empty pixelbuffer, and then setting that Swift variable to the correct pixelbuffer in ARSessionNative. Accessing Swift from Objective-C was no problem, but trying to access that stupid Objective-C pixelbuffer in Swift never seemed to work: either it threw an "unrecognized selector sent to instance" error once I ran my function on the device, or "Property with 'retain (or strong)' attribute must be of object type" in my header file. So yeah ... dirty solution it is :')
     
  7. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,084
    Were you able to access CoreML from within Unity? Do you have a working sample on GitHub?
     
  8. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    I just realized that we pass the pointer to the cvPixelBuffer up to the C# layer, so you should be able to pass that down into your CoreML plugin if you wanted.

    To get the pointer, take the UnityARCamera reference that you get from every frame update and read the following field from it:

    unityARCamera.videoParams.cvPixelBufferPtr
     
  9. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    CoreML from within Unity is no problem; it's combining ARKit and CoreML in Unity that causes the problem, but eventually I got that one too. All I have online is the tutorial mentioned above. I do plan on releasing a sample on GitHub and expanding the tutorial, but for the moment I'm kinda busy working on the project itself, and it's not optimized whatsoever, so before I put a sample online I will clean up the code and optimize it a bit more.
     
  10. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Ok, so I managed to get the pixelbuffer from Unity and everything runs ok. A MobileNet model gets 11fps on my iPhone 6S and a ResNet50 model gets 3fps (this is when running CoreML and ARKit simultaneously). So, to have a smoother experience, I'm now looking into multithreading. Do you have any sources I could use (it's difficult to find good, up-to-date sources on Google)?

    Anyway, I'd like to thank you for your time and input. It helped me achieve something not many people have done before, and that without having any knowledge about the subjects 2 weeks prior :')

    Link to my still-unfinished tutorial (includes a Unity package with the plugins and code needed to achieve the ARKit + CoreML combo): https://medium.com/@kevinhuyskens/implementing-coreml-in-unity-e91bcf80a3c5

    Edit: I did try Swift's DispatchQueue (which is meant for multithreading) in my Swift code around the pixelbuffer analysis (the analysis is what causes the low framerate), but I think Unity somehow overrules this, because it doesn't change anything.
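    For reference, this is the pattern I was going for, written here as an Objective-C/GCD sketch of the same idea (the queue and function names are made up, and the busy flag is a simplification; a real version would need it to be thread-safe): retain the buffer, hop to a background queue, and drop frames while a request is still running. As noted above, my Swift version of this didn't change anything, so take it as the intent rather than a proven fix.

    Code (ObjectiveC):

    #import <CoreVideo/CoreVideo.h>
    #import <dispatch/dispatch.h>

    static dispatch_queue_t s_visionQueue = NULL;
    static BOOL s_busy = NO;

    extern "C" void ClassifyARKitFrameAsync(void *pixelBufferPtr) {
        if (s_visionQueue == NULL) {
            s_visionQueue = dispatch_queue_create("com.example.vision", DISPATCH_QUEUE_SERIAL);
        }
        if (s_busy) { return; }   // drop this frame; the previous request is still running
        CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)pixelBufferPtr;
        if (pixelBuffer == NULL) { return; }
        CVPixelBufferRetain(pixelBuffer);   // keep the frame alive across the async hop
        s_busy = YES;
        dispatch_async(s_visionQueue, ^{
            // ... run the CoreML/Vision request against pixelBuffer here ...
            CVPixelBufferRelease(pixelBuffer);
            s_busy = NO;
        });
    }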
     
  11. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Ok, so here we are again, completely desperate this time. I was using the pixelbuffer pointer provided by Unity and passing it through to my native code. The weird thing is that, depending on the CoreML model I'm using, the application crashes; if I pass both a string (object tag) and the pixelbuffer pointer through, it crashes. The crash is apparently caused by bad memory management on the C side: the error it gives is EXC_BAD_ACCESS, which according to the internet means the object I'm pointing to has already been released. What surprises me is that it depends on the data I'm sending through ... also, if I limit the number of times I send the pixelbuffer pointer through, it doesn't give this error either. Do you by any chance know a solution or explanation? Also, about the pixelbuffer pointer: does it ever change, or is it always the same? In other words, is there any use in passing it through every frame?
     
  12. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Update: It seems to depend on the framerate. If, say, I push the pixelbuffer through 30 times per second (a frame rate the application can steadily manage) it doesn't crash, but if I tell it to push it through 50 times per second (a frame rate the application can barely manage) it crashes. So I'm guessing that for a split second the framerate drops below 50, causing Unity to pass through an empty pixelbuffer? Do you by any chance know if the Unity frame rate (the rate at which Update() is executed) is separate from the incoming pixelbuffer framerate from ARKit? To me it looks that way, but I'm not sure.
     
  13. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    ARKit always provides frames at 60fps; Unity's Update happens at whatever rate it finishes its other work. So they are probably out of sync. You could try using FixedUpdate.
     
  14. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Tried that one, didn't work. I also tried LateUpdate (perhaps Update was being called at the moment the object was being created, so LateUpdate being called a split second later could have made the difference ... it did not). If, however, I set the frame rate in the ARCameraManager to 30 (this sets Application.targetFrameRate), it does work. But by doing this, ARKit will still provide 60fps? (Not visible, of course, since Unity will only output 30.)
    I'm struggling to grasp what is essentially causing this problem. If ARKit provides frames at 60fps and Unity "struggles" to keep up, there shouldn't be any problem: the pointer is passed through as many times as Unity can keep up with, which is less than what ARKit can provide, so it should never pass through an empty buffer/pointer?
    Don't know if I'm making any sense; it's late right now, I'll review this in the morning.
     
  15. dormouse

    dormouse

    Joined:
    Mar 1, 2011
    Posts:
    82
    Interesting topic. I know that you can integrate a TensorFlow-trained model with Unity's ML-Agents.
     
  16. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    But can you use it for image classification or object detection?
     
  17. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
    @Kevin__h @jimmya I've bumped into the same `EXC_BAD_ACCESS`.

    * I've tried calling my native function from Unity from within `ARFrameUpdatedEvent += ARFrameUpdated`; same problem.
    * Tried reducing the frame rate as Kevin mentioned. That does seem to delay the problem, but it still hits.

    It seems to me that we're always trying to use a pointer to the last frame, which is being released from memory, but I haven't dug very deep into the problem.

    Has anyone else made progress on this?
     
  18. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Setting Application.targetFrameRate low enough, to a frame rate you can maintain stably, does seem to "solve" the problem for me. It's not really a solution, but it'll do for now, since I'm temporarily placed on another project :'( but next week I'll continue with this one and try to find a proper solution.
     
  19. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
    @Kevin__h

    I'm pretty sure that what's happening is that ARKit is sending frames faster than I can process them. I've logged the CVPixelBuffer refs from within ARSessionNative.mm and from within the Objective-C function called from Unity, and here's what I get.

    "On Frame" refers to an ARKit frame from `ARSessionNative.mm`.
    "From Unity" is the pointer passed in from Unity.

    In this run, I get one good, in-sync frame, then ARKit sends a bunch of frames before Unity responds, and it crashes. My guess is that by the time my iOS plugin is called, the reference it is using has already been released from memory. With that said, I don't really understand why your fix, lowering the frame rate, would work.

    @jimmya any clever thoughts on how to synchronize these things?

    Pixel Ref: On Frame <CVPixelBuffer 0x1d0126f40 width=1920 height=1080 pixelFormat=420f iosurface=0x1cc014da0 planes=2>
    Pixel Ref: From Unity <CVPixelBuffer 0x1d0126f40 width=1920 height=1080 pixelFormat=420f iosurface=0x1cc014da0 planes=2>
    Pixel Ref: On Frame <CVPixelBuffer 0x1d01271c0 width=1920 height=1080 pixelFormat=420f iosurface=0x1d0017b10 planes=2>
    Pixel Ref: On Frame <CVPixelBuffer 0x1d0127260 width=1920 height=1080 pixelFormat=420f iosurface=0x1d4201cd0 planes=2>
    Pixel Ref: On Frame <CVPixelBuffer 0x1d412aa00 width=1920 heigh2018-04-26 20:29:53.793959-0700 facekit[21249:6334824]
    Pixel Ref: On Frame <CVPixelBuffer 0x1d412dc00 width=1920 height=1080 pixelFormat=420f iosurf2018-04-26 20:29:53.794785-0700
    Pixel Ref: On Frame <CVPixelBuffer 0x1d412d480 width=1920 height=1080 pixelFormat=420f iosurface=0x1d4203960 planes=2>
     
  20. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
    Alright @Kevin__h here's my "solution":

    ARSessionNative.mm

    Code (ObjectiveC):

    static NSMutableArray *capturedImageBuffer = [[NSMutableArray alloc] init];
    ...
    CVPixelBufferRef pixelBuffer = frame.capturedImage;
    if ([capturedImageBuffer count] > 10) {
        [capturedImageBuffer removeObjectAtIndex:0];
    }
    [capturedImageBuffer addObject:frame];
    Clearly this isn't the most performant fix, but it seems pretty reliable. The array simply prevents the reference from being released.

    This seems like a difficult problem to architect nicely. Ideally there would be some way to have both plugins access the pixelBuffer at the time it's returned from ARKit, without tightly coupling the two code bases.
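    One possible shape for that, as a rough sketch (all names here are invented): a tiny "frame tap" interface that ARSessionNative.mm calls into and that any other plugin can register a callback with, so the only thing the two code bases share is one function pointer.

    Code (ObjectiveC):

    #import <CoreVideo/CoreVideo.h>

    // The one shared declaration (e.g. in a small header both plugins include):
    typedef void (*ARFrameTap)(CVPixelBufferRef pixelBuffer);

    static ARFrameTap s_frameTap = NULL;

    // The CoreML plugin registers its handler once at startup:
    extern "C" void RegisterARFrameTap(ARFrameTap tap) {
        s_frameTap = tap;
    }

    // ARSessionNative.mm's frame callback then needs only one added line:
    //     if (s_frameTap != NULL) { s_frameTap(frame.capturedImage); }
    // The tap runs while ARKit still holds the frame, so the buffer is valid for
    // the duration of the call; the handler must CVPixelBufferRetain it to keep it longer.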
     
  21. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Changing anything in ARSessionNative.mm is something I would avoid at all costs. Either find a way within Unity to prevent ARKit from sending frames to a released object, or change the CoreML plugin to use the pixelbuffer from ARKit directly instead of passing it through Unity.
     
  22. Vander-Does

    Vander-Does

    Joined:
    Dec 22, 2015
    Posts:
    19
    What's your reason for avoiding it? For maintenance reasons? If so, agreed, it's not ideal. Perhaps we can find something that would be an acceptable addition and create a pull request.

    I don't see how Unity could prevent the release.

    Did you find a way to access the pixel buffer from the plugin directly without altering the ARKit plugin? I couldn’t think of a way to do that without also changing the ARKit plugin.
     
  23. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Reason for not doing this: if an updated version is released, you'd have to redo this, so it's definitely not optimal.
     
  24. Vander-Does

    Vander-Does

    Joined:
    Dec 22, 2015
    Posts:
    19
    Great, then we’re on the same page. I’ll let you know if I find a better solution.
     
  25. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    One thing you can do is check that the pixelbufferptr != System.IntPtr.Zero; that works for me, at least.
     
  26. Vander-Does

    Vander-Does

    Joined:
    Dec 22, 2015
    Posts:
    19
    That didn’t work for me at all. It was never null in Unity.
     
  27. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38

    I ended up using this implementation, as it seems to be the only 100% working solution. The only problem I have is that it completely freezes my screen for as long as I'm calling the image classification function. Does anyone have a solution for this?
     
  28. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
    @Kevin__h didn't you say earlier that you were planning on looking into multithreading? I was using face detection, not CoreML, but it made Unity run like molasses. I assumed that was simply the expense of running face detection and haven't dug much deeper.

    If I were to dig deeper, I think I would do two things:

    * Look into multithreading
    * Only call face detection every few seconds and interpolate the values
     
  29. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Multithreading is easier said than done with Unity. I tried running the image analysis on another thread from within Swift (quite easy to do), but for some reason Unity just overrides that and doesn't allow the multithreading. The only way I know to really do multithreading in Unity is with C# Jobs, but that's something new that's still in preview and also doesn't work for everything. In my case interpolating the values isn't an option; to prevent the app from being completely stuck, just analysing every few frames might do the trick. I tried that before and it does work, but it's still a very low framerate, and a low framerate is bad for tracking with ARKit.
     
  30. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
    Ah, I didn't know that about multithreading; that's too bad. Agreed on running only every few frames, though even if it only happens every few seconds, it's still going to cause undesirable jank.
     
  31. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Indeed, but in optimal conditions all it needs to do is analyse one frame. The problem with that is that it needs an incredibly accurate model, and those are very slow and still not accurate enough at the moment.
     
  32. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    UPDATE: I think I fixed the issue where the pixelbuffer accesses bad memory and makes the app crash. The solution is in the Swift code: I followed Apple's sample project that combines regular ARKit with CoreML and adjusted it to work with my Unity ARKit setup. The updated Swift script can be found in my tutorial on how to implement CoreML in Unity here: https://medium.com/@kevinhuyskens/implementing-coreml-in-unity-e91bcf80a3c5
     