
Video recording on Android

Discussion in 'AR' started by coder89, Dec 15, 2018.

  1. coder89

    Joined: Oct 26, 2018
    Posts: 29
    Hi,

    I am trying to implement video recording (5-10 second clips) of the camera textures (not the screen textures) provided by AR Foundation, similar to how it can be done with pure ARCore in Java (https://stackoverflow.com/questions/47869061/providing-video-recording-functionality-with-arcore). I come from the Windows/DX11/MF world but I am a bit lost in this Android universe. :) So, a question for the Android/Unity/AR experts: is this even feasible?

    So far I have tried dumping the raw camera frames, but the GPU/CPU syncing kills performance. Ideally, I would like to use HW-accelerated video encoding and pass the external camera textures directly to the encoder, so I wouldn't need to copy them between the GPU and CPU. It seems I am already getting NV12 frames from the camera (via the CameraImage struct).
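
    To make this concrete, here is roughly what my current CPU path looks like. It is only a sketch (not tested as-is), and the exact names differ between AR Foundation versions - this one uses ARCameraManager.TryAcquireLatestCpuImage / XRCpuImage, which is the newer equivalent of the CameraImage struct I mentioned:

    Code (CSharp):
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class CameraFrameReader : MonoBehaviour
    {
        [SerializeField] ARCameraManager cameraManager;

        void OnEnable()  { cameraManager.frameReceived += OnFrameReceived; }
        void OnDisable() { cameraManager.frameReceived -= OnFrameReceived; }

        void OnFrameReceived(ARCameraFrameEventArgs args)
        {
            // Grab the latest camera image on the CPU (YUV 4:2:0 on ARCore).
            if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
                return;

            using (image)
            {
                // Plane 0 is Y; the chroma follows in one or two more planes
                // depending on the device (NV12/NV21-style interleaved UV is common).
                for (int i = 0; i < image.planeCount; ++i)
                {
                    XRCpuImage.Plane plane = image.GetPlane(i);
                    NativeArray<byte> bytes = plane.data;

                    // This is where each frame would be handed to an encoder.
                    // Copying 'bytes' out every frame is the per-frame cost
                    // I am trying to avoid.
                    Debug.Log($"plane {i}: {bytes.Length} bytes, " +
                              $"rowStride={plane.rowStride}, pixelStride={plane.pixelStride}");
                }
            }
        }
    }

    What I would like instead is to skip this CPU read-back entirely and feed the camera texture straight into a hardware encoder - as far as I understand, that is what the linked Java/ARCore approach does by rendering into MediaCodec's input Surface.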

    This would be fairly simple on Windows with DX11 and MF, but since I am not an Android expert, I would be super grateful for some hints on where to start.

    Thanks!
    Lukasz