

Using a custom SSD model in tf-lite-unity-sample

Discussion in 'Editor & General Support' started by Neil09, Jul 19, 2021.

  1. Neil09

    Neil09

    Joined:
    Jul 8, 2021
    Posts:
    1
    I've been trying to use my own custom SSD model with the SSD sample in this repo: https://github.com/asus4/tf-lite-unity-sample, but I've been running into issues getting it to work.

    Under Assets/Samples/SSD/SsdSample.cs
    Code (CSharp):
    public class SsdSample : MonoBehaviour
    {
        [SerializeField, FilePopup("*.tflite")] string fileName = "coco_ssd_mobilenet_quant.tflite";
        [SerializeField] RawImage cameraView = null;
        [SerializeField] Text framePrefab = null;
        [SerializeField, Range(0f, 1f)] float scoreThreshold = 0.5f;
        [SerializeField] TextAsset labelMap = null;
    Simply changing fileName to my own model's name (my model is in the same folder as the default model) does not work, and removing the default model results in an error.

    Under Issue #121, the owner mentioned changing the BaseImagePredictor in Assets/Samples/SSD/SSD.cs from sbyte to float if your model is quantized differently. I checked my model using Netron, and it uses float instead of sbyte, so I changed it:
    Code (CSharp):
        public class SSD : BaseImagePredictor<float>
        {
            public struct Result
            {
                public int classID;
                public float score;
                public Rect rect;
            }
    However, this still does not work; I get the error "Failed to create TensorFlowLite Model" from Packages/com.github.asus4.tflite/Runtime/Interpreter:

    Code (CSharp):
            public Interpreter(byte[] modelData, InterpreterOptions options)
            {
                GCHandle modelDataHandle = GCHandle.Alloc(modelData, GCHandleType.Pinned);
                IntPtr modelDataPtr = modelDataHandle.AddrOfPinnedObject();
                model = TfLiteModelCreate(modelDataPtr, modelData.Length);
                if (model == IntPtr.Zero) throw new Exception("Failed to create TensorFlowLite Model");

                this.options = options ?? new InterpreterOptions();

                interpreter = TfLiteInterpreterCreate(model, options.nativePtr);
                if (interpreter == IntPtr.Zero) throw new Exception("Failed to create TensorFlowLite Interpreter");
            }
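    For context: that exception means TfLiteModelCreate rejected the raw bytes before the interpreter was even built, which usually indicates the file is not a valid serialized TFLite model (e.g. a corrupted export, or a SavedModel/frozen graph renamed to .tflite). One quick sanity check you can run outside Unity: a serialized TFLite model is a FlatBuffer whose 4-byte file identifier "TFL3" sits at byte offset 4. A minimal Python sketch (the file path in the usage comment is a placeholder):

    ```python
    def looks_like_tflite(data: bytes) -> bool:
        """Return True if data carries the TFLite FlatBuffer file identifier.

        A serialized TFLite model begins with a 4-byte root-table offset,
        followed by the 4-byte file identifier b"TFL3" at offset 4.
        """
        return len(data) >= 8 and data[4:8] == b"TFL3"

    # Usage (replace the path with your exported model):
    #   with open("custom_ssd.tflite", "rb") as f:
    #       print(looks_like_tflite(f.read()))
    ```

    If this check fails on your file, the problem is in the export step, not in the Unity sample.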
    In Issue #89, the owner mentioned changing the input shape, but did not respond as to where this might be located (the default model's input shape is 1x300x300x3, but mine is 1x640x640x3).
    The author of that issue also trained a model with the same input shape as the default; he says it still didn't work, but he did not get a response either.
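    For what it's worth, a different input shape mostly affects the preprocessing step: the camera frame has to be resized (typically aspect-fill cropped) to 640x640 instead of 300x300 before it is written into the input tensor. A language-agnostic sketch of that aspect-fill math in Python (the function name and the example sizes are illustrative, not taken from the repo):

    ```python
    def aspect_fill_scale(src_w: int, src_h: int, dst: int):
        """Scale factor and crop offsets to fill a square dst x dst input.

        The shorter source side is mapped onto dst; the longer side is
        center-cropped. Returns (scale, crop_x, crop_y), with the crop
        offsets expressed in source pixels.
        """
        scale = dst / min(src_w, src_h)
        crop_w = dst / scale  # region of the source that survives the crop
        crop_h = dst / scale
        crop_x = (src_w - crop_w) / 2
        crop_y = (src_h - crop_h) / 2
        return scale, crop_x, crop_y

    # e.g. fitting a 1280x720 camera frame into a 640x640 model input:
    scale, cx, cy = aspect_fill_scale(1280, 720, 640)
    ```

    In this example the 720-pixel side maps onto 640, so a centered 720x720 region of the frame is used and 280 pixels are cropped from each horizontal edge.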

    Any help is appreciated!
     

  2. Romaissa_Guetoutou

    Romaissa_Guetoutou

    Joined:
    Feb 11, 2023
    Posts:
    3
    Hello, did you solve this? I am in the same situation.
     
  3. bilalFrag

    bilalFrag

    Joined:
    Feb 7, 2022
    Posts:
    1
    Does anyone have a solution for importing custom TFLite models into the SSD sample?