Has anyone successfully used the `displayChanged()` method in the `UnityPlayer` Java class to register a `Surface` as an additional display that Unity can render to? I'm trying to use it with the input `Surface` of a `MediaCodec` encoder. The only reference I can find is a single forum post pointing at one bullet in the Unity 5.x release notes, but this looks like exactly the feature I need to get a rendered image from Unity into the encoder.

Here's where I've got to. When the app starts, `Display.displays` returns one display (the phone screen). I call `UnityPlayer.displayChanged(1, encoderSurface)`, and a short time later `Display.displays` returns two displays. The `Display.onDisplaysUpdated` event never fires for me, however, so instead I poll each frame for a change in the length of the `Display.displays` array. Once the new display appears, I enable a camera and set its `targetDisplay` to the index of the new display. Despite all that, nothing seems to reach the encoder surface.

I'm still working on it, but if anyone has used this successfully it would be great to know I'm not running up a dead-end alleyway. And Unity, please document this stuff. Please.
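For reference, this is roughly what I'm doing, collapsed into one Unity C# script. Heavy caveats: `displayChanged()` is undocumented, so the signature (and whether it's static) may vary between Unity versions; `com.example.EncoderPlugin` and its `getEncoderSurface()` method are stand-ins for however your own plugin exposes the `MediaCodec` input `Surface` — they don't exist in any real library.

```csharp
using System.Collections;
using UnityEngine;

public class EncoderDisplayBridge : MonoBehaviour
{
    [SerializeField] Camera encoderCamera; // camera whose output should reach the encoder

    IEnumerator Start()
    {
        int initialCount = Display.displays.Length; // usually 1: the phone screen

        // Hypothetical plugin class that wraps MediaCodec.createInputSurface().
        using (var plugin = new AndroidJavaClass("com.example.EncoderPlugin"))
        using (var surface = plugin.CallStatic<AndroidJavaObject>("getEncoderSurface"))
        using (var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        {
            // Undocumented call: ask Unity to treat the Surface as display index 1.
            // Assumed static here; on some versions it may be an instance method,
            // in which case you'd need the UnityPlayer instance from currentActivity.
            unityPlayer.CallStatic("displayChanged", 1, surface);
        }

        // Display.onDisplaysUpdated never fired for me, so poll for the new display.
        while (Display.displays.Length == initialCount)
            yield return null;

        int newIndex = Display.displays.Length - 1;
        Display.displays[newIndex].Activate();   // secondary displays must be activated
        encoderCamera.targetDisplay = newIndex;  // route this camera to the new display
        encoderCamera.enabled = true;
    }
}
```

The `Activate()` call is the documented requirement for Unity's ordinary multi-display support; I'm assuming (perhaps wrongly) that a `displayChanged()`-injected surface behaves the same way.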