Help Wanted Different output in Barracuda compared to Keras/ONNX

Discussion in 'Barracuda' started by JK-MITC, Jul 1, 2021.

  1. JK-MITC

    JK-MITC

    Joined:
    Dec 14, 2020
    Posts:
    3
    I have been struggling with my Keras2ONNX converted model for a few days. The output is different from what I get running the model from Keras and ONNXruntime. Maybe someone can spot my mistakes or tell me this is not yet supported.

    The model classifies an image into 4 categories. Image input size is 180x180x3.

    The same image (prescaled to 180x180) is loaded in both Keras/ONNXruntime and Unity.

    Keras/ONNX predicts as expected:

    ********** KERAS predict *************
    [ 2.179296 6.270509 -0.9740573 -3.7748256]
    This image most likely belongs to ds-pass with a 98.28 percent confidence.
    ********* ONNX predict ***************
    [ 2.1792946 6.2705064 -0.97405905 -3.774826 ]
    This image most likely belongs to ds-pass with a 98.28 percent confidence.
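    (Side note: the 98.28 percent figure is simply a softmax over those logits. A quick standalone check, independent of Unity; the class names here are illustrative:)

    Code (CSharp):
    using System;

    class SoftmaxCheck
    {
        static void Main()
        {
            // Logits printed by Keras above
            double[] logits = { 2.179296, 6.270509, -0.9740573, -3.7748256 };

            double sum = 0;
            foreach (var l in logits) sum += Math.Exp(l);

            for (int i = 0; i < logits.Length; i++)
                Console.WriteLine($"class {i}: {Math.Exp(logits[i]) / sum}");
            // class 1 ("ds-pass") comes out at ~0.9828, matching the reported confidence
        }
    }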

    But Barracuda predicts:

    [ 6.128766 -0.5258042 -0.7985633 -1.13862 ]

    I think the input values are correct in the 0-255 range, but I might be missing something.
    The Unity code:

    Code (CSharp):
    void Classify()
    {
        var scaled = Resources.Load<Texture2D>("ds_pass_prescaled");

        Color32[] pix = scaled.GetPixels32();
        //Color[] pix = scaled.GetPixels();
        int ModelInputSize = 180;
        float[] floats = new float[ModelInputSize * ModelInputSize * 3];

        for (int i = 0; i < pix.Length; ++i)
        {
            var color = pix[i];

            // Values 0-255
            floats[i * 3 + 0] = color.r;
            floats[i * 3 + 1] = color.g;
            floats[i * 3 + 2] = color.b;

            // Values 0-1
            //floats[i * 3 + 0] = color.r / 255f;
            //floats[i * 3 + 1] = color.g / 255f;
            //floats[i * 3 + 2] = color.b / 255f;

            // Values -1 to 1
            //floats[i * 3 + 0] = (color.r - 127) / 127.5f;
            //floats[i * 3 + 1] = (color.g - 127) / 127.5f;
            //floats[i * 3 + 2] = (color.b - 127) / 127.5f;
        }

        // Create the input tensor from the pixel values
        Tensor in_tensor = new Tensor(1, ModelInputSize, ModelInputSize, 3, floats);

        worker.Execute(in_tensor);

        Tensor out_tensor = worker.PeekOutput("dense_1");
        var arr_comp = out_tensor.ToReadOnlyArray();
        // Join the values so Debug.Log prints them instead of "System.Single[]"
        Debug.Log(string.Join(" ", arr_comp));

        in_tensor.Dispose();
        out_tensor.Dispose();
        worker.Dispose();
    }
    The model:

    The rescaling layer maps the 0-255 input into the 0-1 range. I have also tried a version of the model without the rescaling layer, with no luck.
    [attached image: ds_ss_model.png]


    I am running Unity 2020.3.2f1 and currently Barracuda 2.1.0 (tested with 1.3.3 as well).
    Any kind of help is greatly appreciated.
     
    Last edited: Jul 2, 2021
  2. fguinier

    fguinier

    Unity Technologies

    Joined:
    Sep 14, 2015
    Posts:
    138
    Hey @JK-MITC,

    The inference code looks fine, and the network seems to be something Barracuda should run just fine.

    Two potential trouble spots / investigation directions here:

    - Channel ordering when converting the model from Keras->ONNX->Barracuda.
    Keras input is Nx180x180x3.
    ONNX being channel first, the input should then be Nx3x180x180?
    Barracuda being channel last, the input should be Nx180x180x3 again.
    Is that the case? Does the intermediate ONNX file/architecture seem fine? If you post the model I can take a look.

    - It would be good to validate that the input to the network in Barracuda is the same as the input to the network in Python. An easy way to do that is to print a bunch of selected values in Python and look at the same ones in C#.
    Something like
    print(inputTensor[0,0,0,0])
    print(inputTensor[0,0,1,0])
    print(inputTensor[0,0,2,0])
    print(inputTensor[0,50,0,0])
    print(inputTensor[0,50,1,0])
    print(inputTensor[0,50,2,0])
    And Debug.Log in C#
    This would verify that the normalization and color space are the same (in Python vs the texture from the game engine).
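    For illustration, the C# side of that check might look like this (assuming the `in_tensor` built in the code above; Barracuda tensors index as [batch, height, width, channel]):

    Code (CSharp):
    // Print the same sample positions that were printed in Python
    Debug.Log(in_tensor[0, 0, 0, 0]);
    Debug.Log(in_tensor[0, 0, 1, 0]);
    Debug.Log(in_tensor[0, 0, 2, 0]);
    Debug.Log(in_tensor[0, 50, 0, 0]);
    Debug.Log(in_tensor[0, 50, 1, 0]);
    Debug.Log(in_tensor[0, 50, 2, 0]);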

    Let us know how it goes :)
    Florent
     
    Last edited: Jul 2, 2021
  3. fguinier

    fguinier

    Unity Technologies

    Joined:
    Sep 14, 2015
    Posts:
    138
    BTW, what Barracuda version are you using? I would also recommend trying the latest Barracuda, aka 2.1.0.
     
  4. JK-MITC

    JK-MITC

    Joined:
    Dec 14, 2020
    Posts:
    3
    Thank you for your answer.
    I am currently using Barracuda 2.1.0 (tested with 1.3.3 as well). I will do some more testing on the channel order and come back.
     
    fguinier likes this.
  5. JK-MITC

    JK-MITC

    Joined:
    Dec 14, 2020
    Posts:
    3
    Seems like I just needed to flip the input Texture2D vertically before feeding it to the Tensor.
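    For anyone hitting the same thing, a minimal sketch of the fix (names match the code earlier in the thread; `GetPixels32` returns rows starting at the bottom-left, so the source row index is flipped while filling the float array):

    Code (CSharp):
    int size = 180;
    Color32[] pix = scaled.GetPixels32();        // rows are ordered bottom-up
    float[] floats = new float[size * size * 3];

    for (int y = 0; y < size; ++y)
    {
        int srcRow = size - 1 - y;               // flip vertically
        for (int x = 0; x < size; ++x)
        {
            Color32 c = pix[srcRow * size + x];
            int dst = (y * size + x) * 3;
            floats[dst + 0] = c.r;
            floats[dst + 1] = c.g;
            floats[dst + 2] = c.b;
        }
    }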

    Your idea of printing the pixel values pointed me in the right direction.

    Thanks
     
    fguinier likes this.