I've been trying a few different approaches to get the texture images from an Android SurfaceTexture to show up in a Unity texture. I'm building for an Oculus Go (armv7). Here's my list of approaches:

GPU:

1. Display the external texture from the SurfaceTexture directly via `Texture2D.CreateExternalTexture`.
   1. The SurfaceTexture is created via `GL.IssuePluginEvent`, but the RawImage with this texture attached just looks like a weird button sprite.
2. Draw the SurfaceTexture into a framebuffer object (FBO), then copy into a Unity Texture2D.
   1. If everything is generated and `updateTexImage()` is called from `GL.IssuePluginEvent()`, the SurfaceTexture never updates and shows a black texture.
   2. The SurfaceTexture is generated via a helper GLSurfaceView, and the FBO and Unity texture via `GL.IssuePluginEvent()`. I get a GL error when trying to access the FBO from the GLSurfaceView's render thread, i.e. when rendering from the external texture into the FBO.
   3. If everything is created on the GLSurfaceView's render thread except the Unity texture, I have to call `glTexImage2D()` on the Unity texture for OpenGL not to crash, and from then on I can't even draw a solid blue color into the Unity texture. It remains black even though the SurfaceTexture is updating properly (verified via `glReadPixels` and an Android ImageView).

CPU:

1. Surface -> ImageReader -> RawImage
   1. The ImageReader gives `error: SkAndroidCodec::NewFromStream returned null when trying to decode the bitmap`.
2. SurfaceTexture -> pixel buffer object (PBO) -> Unity Texture2D
   1. I'm using a GLSurfaceView to update the SurfaceTexture for me and can confirm it is properly producing images. However, I get the OpenGL error `GL_INVALID_OPERATION` when I try to `GLES30.glReadPixels` into the PBO.

Does anyone have any idea why I can't get this to work? I've disabled multithreaded rendering, but it seems like something is going wrong with the OpenGL contexts anyway.
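
To show what I mean on the Android side, here's a simplified sketch of the SurfaceTexture creation for the GPU approaches (class and method names are mine). For approach 1 this runs inside the `GL.IssuePluginEvent` callback, so the OES texture belongs to Unity's EGL context; on the Unity side I then wrap the returned texture id with `Texture2D.CreateExternalTexture`:

```java
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;

// Hypothetical helper; must run on the thread that owns the EGL context
// the texture will be consumed from.
public class ExternalTextureHelper {
    private int oesTextureId;
    private SurfaceTexture surfaceTexture;

    public void createOnGlThread(int width, int height) {
        int[] ids = new int[1];
        GLES20.glGenTextures(1, ids, 0);
        oesTextureId = ids[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, oesTextureId);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

        // updateTexImage() later only succeeds on a thread whose current
        // EGL context is the one this texture was created in.
        surfaceTexture = new SurfaceTexture(oesTextureId);
        surfaceTexture.setDefaultBufferSize(width, height);
    }

    public int getTextureId() { return oesTextureId; }
    public SurfaceTexture getSurfaceTexture() { return surfaceTexture; }
}
```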
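
To make the GPU approach 2 setup concrete, here's roughly what the FBO wiring looks like (simplified sketch, names are mine). A texture needs storage allocated before it can be a complete color attachment, which matches why variant 3 forces me to call `glTexImage2D()` on the Unity texture first:

```java
import android.opengl.GLES20;

// Hypothetical sketch of attaching the Unity texture to an FBO.
public class FboBridge {
    private int fboId;

    public void create(int unityTexId, int width, int height) {
        // Allocate storage for the Unity texture (RGBA8); without this the
        // attachment is incomplete.
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, unityTexId);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
                width, height, 0, GLES20.GL_RGBA,
                GLES20.GL_UNSIGNED_BYTE, null);

        int[] ids = new int[1];
        GLES20.glGenFramebuffers(1, ids, 0);
        fboId = ids[0];
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fboId);
        GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER,
                GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D,
                unityTexId, 0);

        int status = GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER);
        if (status != GLES20.GL_FRAMEBUFFER_COMPLETE) {
            throw new RuntimeException("FBO incomplete: 0x"
                    + Integer.toHexString(status));
        }
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
    }
}
```

Note that both the FBO and the Unity texture only exist in the context they were created in (unless the contexts share resources), which may be why accessing them from the GLSurfaceView's render thread errors out in variant 2.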
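
One possibly relevant detail for the "SurfaceTexture never updates" symptom: as I understand it, `updateTexImage()` must be called with the EGL context that the SurfaceTexture is currently attached to, and `SurfaceTexture` provides `attachToGLContext()`/`detachFromGLContext()` for handing it between contexts. A sketch of what I understand that to look like (helper names are mine):

```java
import android.graphics.SurfaceTexture;

// Hypothetical sketch of moving a SurfaceTexture between EGL contexts.
public class SurfaceTextureMover {
    // Call on the thread that currently owns the texture
    // (e.g. the GLSurfaceView's render thread).
    public static void release(SurfaceTexture st) {
        st.detachFromGLContext();
    }

    // Call on the consuming thread with its EGL context current
    // (e.g. Unity's render thread, via GL.IssuePluginEvent).
    public static void acquire(SurfaceTexture st, int oesTextureId) {
        st.attachToGLContext(oesTextureId);
        st.updateTexImage(); // now legal on this context
    }
}
```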
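
For CPU approach 2, the readback path I'm attempting looks roughly like this (simplified sketch, names are mine). As far as I can tell from the spec, `GL_INVALID_OPERATION` from `glReadPixels` with a pack PBO bound can mean the buffer is too small for the requested rectangle, or that the PBO was created in a different, non-shared context:

```java
import android.opengl.GLES30;
import java.nio.ByteBuffer;

// Hypothetical sketch; runs on the GLSurfaceView's render thread after the
// SurfaceTexture frame has been drawn into the currently bound read framebuffer.
public class PboReader {
    private int pboId;
    private int size;

    public void create(int width, int height) {
        size = width * height * 4; // tightly packed RGBA8
        int[] ids = new int[1];
        GLES30.glGenBuffers(1, ids, 0);
        pboId = ids[0];
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pboId);
        GLES30.glBufferData(GLES30.GL_PIXEL_PACK_BUFFER, size, null,
                GLES30.GL_STREAM_READ);
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);
    }

    public ByteBuffer read(int width, int height) {
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pboId);
        // With a pack PBO bound, the last argument is a byte offset into
        // the buffer, not a client-side pointer.
        GLES30.glReadPixels(0, 0, width, height,
                GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, 0);
        ByteBuffer mapped = (ByteBuffer) GLES30.glMapBufferRange(
                GLES30.GL_PIXEL_PACK_BUFFER, 0, size, GLES30.GL_MAP_READ_BIT);
        ByteBuffer copy = ByteBuffer.allocateDirect(size);
        copy.put(mapped).rewind(); // copy out before unmapping
        GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER);
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);
        return copy;
    }
}
```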