
Mixing Unity with native OpenGL drawing on Android

Discussion in 'Android' started by A_M, May 3, 2012.

  1. A_M


    May 3, 2012
    Hi! I'm fairly new to Unity, and find myself trying to do some fairly basic OpenGL drawing on top of the Unity viewport (using same GLSurfaceView) for an Android Only project. I have tried a couple methods I thought might work but am getting stuck, and would hugely appreciate any help anyone might have to offer.

    I'm using Unity inside my Android Eclipse project, in basic accordance with the instructions in this thread:
    Integrating Unity and Eclipse

    My first attempt was to simply extend UnityPlayer and, in my subclass, override onSurfaceCreated, onSurfaceChanged and onDrawFrame; in each I first call the respective super...() method and then call through to a GLSurfaceView.Renderer where I use the static GLES20 methods. glClear(...) works fine, and with this approach I can load shaders and set up my glVertexAttribPointer calls, but whenever I attempt to do any actual drawing I end up with the screen flickering between the Unity splash screen and what appears to be the last rendered Unity frame.

    I tried a couple of things, like delaying the calls to my GLSurfaceView.Renderer until after Start() in my scene (thinking maybe Unity was destroying my GL setup when it was doing its loading), and also calling my Renderer's onDrawFrame from Unity's OnPostRender instead, via an AndroidJavaObject, like so:
    Code (csharp):
    GL.PushMatrix();
    AndroidJavaObject activity = mActivity;
    if (activity != null) {
        activity.Call("onPostRender");
    }
    GL.PopMatrix();
    GL.InvalidateState();
    Neither of which appeared to make any difference.

    Then I happened upon this page and thought "hey, maybe I'll try drawing from native code with the NDK instead.":
    Low-Level Native Plugin Interface

    I basically modeled my code on that from the example linked to at the bottom of that page.

    This is my code in Unity/C#:
    Code (csharp):
    // Use this for initialization
    void Start () {
        using (AndroidJavaClass player = new AndroidJavaClass("com.unity3d.player.UnityPlayer")) {
            mActivity = player.GetStatic<AndroidJavaObject>("currentActivity");
            mActivity.Call("onMainCameraStart", new object[]{ Application.loadedLevel });
        }
        StartCoroutine("CallPluginAtEndOfFrames");
    }

    [DllImport ("gl2jni")]
    private static extern void SetTimeFromUnity (float t);

    private IEnumerator CallPluginAtEndOfFrames () {
        while (true) {
            // Wait until all frame rendering is done
            yield return new WaitForEndOfFrame();

            // Set time for the plugin
            SetTimeFromUnity (Time.timeSinceLevelLoad);

            // Issue a plugin event with an arbitrary integer identifier.
            // The plugin can distinguish between different things it
            // needs to do based on this ID. For our simple plugin it
            // does not matter which ID we pass here.
            GL.IssuePluginEvent(0);
            GL.IssuePluginEvent(1);
        }
    }
    And this is my C++ code compiled with ndk-build (note that I don't do any GL stuff yet):
    Code (cpp):
    #include <android/log.h>
    #define LOGI(...) __android_log_print(ANDROID_LOG_INFO, "gl2jni", __VA_ARGS__)

    extern "C" {
        void UnitySetGraphicsDevice (void* device, int deviceType, int eventType);
        void UnityRenderEvent (int eventID);
        void SetTimeFromUnity (float t);
    }

    void UnitySetGraphicsDevice (void* device, int deviceType, int eventType) {
        LOGI("native: UnitySetGraphicsDevice");
    }

    static float g_Time;

    void SetTimeFromUnity (float t) {
        LOGI("native: SetTimeFromUnity");
        g_Time = t;
    }

    void onPostRender () {
        LOGI("native: onPostRender");
    }

    void UnityRenderEvent (int eventID) {
        LOGI("native: UnityRenderEvent");
        switch (eventID) {
        case 0:
            onPostRender();
            break;
        }
    }
    However, based on my LogCat output, it appears that although I can call SetTimeFromUnity explicitly without any problem, my UnityRenderEvent never gets invoked (it should be triggered by my GL.IssuePluginEvent calls), and neither does UnitySetGraphicsDevice. Why is this? Can anyone help?

    I copied my compiled .so file from the armeabi-v7a folder of my Android project to Assets\Plugins\Android of my Unity project. Is there anything else I need to do to get Unity to detect my plugin and call the UnitySetGraphicsDevice and UnityRenderEvent methods?

    Again, any help will be greatly appreciated!
    / Andreas
  2. A_M


    May 3, 2012
    I've done some more testing. I still haven't been able to get UnityRenderEvent or UnitySetGraphicsDevice called; however, I tried doing some rendering in C (calling my DllImported method directly from OnPostRender) and I get the exact same results as when I try to draw from Java code: I can do everything I want without getting any glErrors, except calling glDrawArrays. (glClear works, so apparently I can draw to the screen alright, and am presumably not calling my GL code from the wrong thread.)

    I did some more searching and it seems others have had similar issues:

    Unfortunately, no replies to either of those. :(
  3. eriQue


    Unity Technologies

    May 25, 2010
    UnitySetGraphicsDevice / UnityRenderEvent are not supported on Android (or mobiles in general).
    It's generally not needed either, as we (currently) don't do multi-threaded rendering, and there is no 'global' graphics device to keep track of.

    When it comes to rendering from a plugin you need to make sure you have a clean state setup before starting your rendering commands - you cannot assume any state to be set/cleared. One way is to set a material with a known state before you call your native rendering code.
    Same thing applies when returning to the Unity rendering path - you need to call GL.InvalidateState() to indicate that Unity must reset the GL state for any additional rendering commands.
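    As a sketch, that "clean state" reset on the native side might look something like this (GLES2, C; the function name and program handle are hypothetical, and this assumes the call happens on Unity's rendering thread with a current GL context):

    ```c
    #include <GLES2/gl2.h>

    /* Hypothetical entry point called from Unity's OnPostRender.
     * Unity may leave arbitrary GL state behind, so explicitly set
     * every piece of state this code depends on before drawing. */
    void MyPluginRender(GLuint myProgram)
    {
        /* Unbind whatever VBOs Unity left bound. */
        glBindBuffer(GL_ARRAY_BUFFER, 0);
        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

        /* Select our own shader program; don't assume one is current. */
        glUseProgram(myProgram);

        /* Reset the fixed-function state we care about. */
        glDisable(GL_DEPTH_TEST);
        glDisable(GL_CULL_FACE);
        glDisable(GL_SCISSOR_TEST);
        glDisable(GL_BLEND);

        /* ... issue draw calls here ... */
    }
    ```

    After the native call returns, the Unity side then calls GL.InvalidateState() so Unity knows to rebuild its own state.
    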
  4. thorbrian


    Aug 26, 2010
    hey A_M,
    you didn't post your rendering code, but the problem is probably there. On Android, you can call Java/native code that does GL rendering from a MonoBehaviour.OnPostRender event attached to the camera, and the GL code you call will render there.

    My guess as to your problem is that you are calling glVertexAttribPointer while Unity's VBOs are bound. When a VBO is active, the pointer argument to glVertexAttribPointer is interpreted as an offset into the bound buffer. You probably just need to call "glBindBuffer(GL_ARRAY_BUFFER, 0);" to unbind Unity's vertex VBO before calling glVertexAttribPointer, and "glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);" to unbind the index VBO before calling glDrawElements.

    I think the reason you are getting random flickering is that you are essentially just getting random tris rendered in your plugin, since your vertex data is coming from random places in Unity's VBOs.
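    A minimal sketch of that unbinding before your own draw call (GLES2, C; the vertex data, program handle, and attribute location are hypothetical):

    ```c
    #include <GLES2/gl2.h>

    /* A hypothetical triangle supplied as a client-side vertex array. */
    static const GLfloat kVerts[] = {
         0.0f,  0.5f,
        -0.5f, -0.5f,
         0.5f, -0.5f,
    };

    void DrawOverlayTriangle(GLuint program, GLint posAttrib)
    {
        /* Unbind Unity's VBOs so the pointer below is treated as a
         * client-side pointer, not an offset into a bound buffer. */
        glBindBuffer(GL_ARRAY_BUFFER, 0);
        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

        glUseProgram(program);
        glVertexAttribPointer(posAttrib, 2, GL_FLOAT, GL_FALSE, 0, kVerts);
        glEnableVertexAttribArray(posAttrib);
        glDrawArrays(GL_TRIANGLES, 0, 3);
    }
    ```
    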

    Actually, this part is not working for me - in particular, "material.SetPass(0)" does not appear to be calling "glUseProgram" with the program handle for the shader in that material, as "glGetIntegerv(GL_CURRENT_PROGRAM, &nCurrentProg);" just returns the last program set by the code in the Java/native plugin - it's like Unity is setting and restoring the program handle before and after all its own rendering ops, and material.SetPass doesn't set it :(

    If anybody knows a way to get the actual shader program handle for a Unity material in an Android plugin, I would *love* to know how to do that. I know I could always create shaders in the plugin, but being able to use Unity materials with direct GL calls using vertex arrays would be awesome.
    zf_jon likes this.
  5. eriQue


    Unity Technologies

    May 25, 2010
    You can't mix and match with the Unity material system and native rendering. If you do native rendering you need to take care of setting the necessary vertex/fragment programs yourself.
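    A minimal sketch of doing that in native code - compiling and linking your own GLES2 program rather than borrowing one from a Unity material (the shader sources are placeholders, and error handling is trimmed for brevity):

    ```c
    #include <GLES2/gl2.h>
    #include <stddef.h>

    static GLuint CompileShader(GLenum type, const char *src)
    {
        GLuint shader = glCreateShader(type);
        glShaderSource(shader, 1, &src, NULL);
        glCompileShader(shader);
        /* Real code should check GL_COMPILE_STATUS and the info log. */
        return shader;
    }

    /* Build a trivial program; call once after the GL context exists. */
    GLuint CreateOverlayProgram(void)
    {
        static const char *vs =
            "attribute vec2 aPos;\n"
            "void main() { gl_Position = vec4(aPos, 0.0, 1.0); }\n";
        static const char *fs =
            "precision mediump float;\n"
            "void main() { gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); }\n";

        GLuint prog = glCreateProgram();
        glAttachShader(prog, CompileShader(GL_VERTEX_SHADER, vs));
        glAttachShader(prog, CompileShader(GL_FRAGMENT_SHADER, fs));
        glLinkProgram(prog);
        /* Real code should check GL_LINK_STATUS here. */
        return prog;
    }
    ```
    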
  6. s0phist


    Mar 30, 2011
    This causes an immediate SIGABRT on this line in iOS 6 for me. Any ideas on how to clear/push the VBOs on iOS without a crash?
  7. rishi-ranjan


    Oct 7, 2015
    I am trying Unity native rendering using Unity 5.3. I am using code from the AWS Android SDK in native rendering, as it sets up all the vertex/fragment programs.

    Is there a sample which shows how to set up a clean slate in native code for OpenGL ES?

    I have also posted the question at