Thread group count is above the maximum allowed limit

Discussion in 'ML-Agents' started by SaschaWaal, Jun 25, 2020.

  1. SaschaWaal (Joined: Sep 17, 2019, Posts: 2)
    Hi, I am trying out a pre-trained model (https://github.com/onnx/models/tree/master/vision/object_detection_segmentation/duc), but I keep getting an error that the thread group count is above the maximum allowed limit. Any tips or direction on how I could fix it?

    Error
    Code (CSharp):
    Thread group count is above the maximum allowed limit. Maximum allowed thread group count is 65535.
    UnityEngine.ComputeShader:Dispatch(Int32, Int32, Int32, Int32)
    Barracuda.ComputeFunc:Dispatch(Int32, Int32, Int32) (at Library/PackageCache/com.unity.barracuda@0.4.0-preview/Barracuda/Core/Backends/BarracudaReferenceCompute.cs:1422)
    Barracuda.ComputeFunc:Dispatch(Int32[]) (at Library/PackageCache/com.unity.barracuda@0.4.0-preview/Barracuda/Core/Backends/BarracudaReferenceCompute.cs:1408)
    Barracuda.ComputeKernel:Dispatch() (at Library/PackageCache/com.unity.barracuda@0.4.0-preview/Barracuda/Core/Backends/BarracudaCompute.cs:435)
    Barracuda.ComputeOps:Softmax(Tensor) (at Library/PackageCache/com.unity.barracuda@0.4.0-preview/Barracuda/Core/Backends/BarracudaCompute.cs:847)
    Barracuda.StatsOps:Barracuda.IOps.Softmax(Tensor) (at Library/PackageCache/com.unity.barracuda@0.4.0-preview/Barracuda/Core/Backends/StatsOps.cs:209)
    Barracuda.<ExecuteAsync>d__27:MoveNext() (at Library/PackageCache/com.unity.barracuda@0.4.0-preview/Barracuda/Core/Backends/GenericWorker.cs:587)
    Barracuda.GenericWorker:Execute() (at Library/PackageCache/com.unity.barracuda@0.4.0-preview/Barracuda/Core/Backends/GenericWorker.cs:117)
    Barracuda.GenericWorker:Execute(IDictionary`2) (at Library/PackageCache/com.unity.barracuda@0.4.0-preview/Barracuda/Core/Backends/GenericWorker.cs:105)
    <Working>d__10:MoveNext() (at Assets/Scripts/ModelTester.cs:59)
    UnityEngine.SetupCoroutine:InvokeMoveNext(IEnumerator, IntPtr)
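    For context: 65535 is the per-dimension thread-group cap that graphics APIs such as Direct3D 11 impose on ComputeShader.Dispatch, so any single op whose flattened work size needs more groups than that in one dimension will fail at dispatch time. A rough sketch of the arithmetic (the helper name and the 64-threads-per-group figure are illustrative assumptions, not Barracuda internals):

    Code (CSharp):
    // Back-of-envelope check (illustrative only; this is not Barracuda API).
    static bool ExceedsDispatchLimit(long elementCount, int threadsPerGroup)
    {
        const long MaxGroupsPerDimension = 65535; // per-dimension cap guaranteed by D3D11

        // Ceiling division: how many groups a flat 1D dispatch would need.
        long groups = (elementCount + threadsPerGroup - 1) / threadsPerGroup;
        return groups > MaxGroupsPerDimension;
    }

    For example, a softmax over a 1 x 800 x 800 x 19 tensor holds 12,160,000 values; at 64 threads per group that would be 190,000 groups in a single dimension, well past the cap.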
    Script
    Code (CSharp):
    using Barracuda;
    using System;
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;

    public class ModelTester : MonoBehaviour
    {
        private const string INPUT_NAME = "data";

        [SerializeField]
        private NNModel modelFile;

        [SerializeField]
        private Texture2D texture;

        private IWorker worker;

        private void Start() {
            Model model = ModelLoader.Load(this.modelFile, false);

            this.worker = WorkerFactory.CreateWorker(WorkerFactory.Type.ComputePrecompiled, model);
            StartCoroutine(Working());
        }

        private IEnumerator Working() {
            Debug.Log("Starting");
            yield return new WaitForSeconds(1);

            using (var tensor = new Tensor(texture)) {
                Dictionary<string, Tensor> inputs = new Dictionary<string, Tensor>();

                try {
                    inputs.Add(INPUT_NAME, tensor);

                    this.worker.Execute(inputs);
                } catch (Exception ex) {
                    Debug.LogError(ex);
                }

                // Do stuff with the result
                Debug.Log("Done");
            }
        }
    }
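    One possible workaround to try (an assumption, not advice given in this thread): shrink the input texture before building the Tensor, so each op dispatches fewer thread groups. A minimal sketch using standard Unity APIs; the target size would need to be chosen for the model at hand:

    Code (CSharp):
    using UnityEngine;

    public static class TextureUtil
    {
        // Downscale via a temporary RenderTexture so the GPU resamples the image.
        public static Texture2D Downscale(Texture2D source, int width, int height)
        {
            RenderTexture rt = RenderTexture.GetTemporary(width, height);
            Graphics.Blit(source, rt);

            RenderTexture previous = RenderTexture.active;
            RenderTexture.active = rt;

            Texture2D result = new Texture2D(width, height, TextureFormat.RGBA32, false);
            result.ReadPixels(new Rect(0, 0, width, height), 0, 0);
            result.Apply();

            RenderTexture.active = previous;
            RenderTexture.ReleaseTemporary(rt);
            return result;
        }
    }

    Re-exporting the ONNX model with a smaller input shape may have the same effect, since the oversized dispatch here comes from the softmax over the model's output tensor.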
     
  2. ervteng_unity (Unity Technologies, Joined: Dec 6, 2018, Posts: 150)
  3. SaschaWaal (Joined: Sep 17, 2019, Posts: 2)
    Thanks! Will do that.