
Exporting model from ML-Agents to run inference in python with tensorflow

Discussion in 'ML-Agents' started by ross_allen, Jul 21, 2020.

  1. ross_allen


    Jul 21, 2020
    As the title implies, I am working on a pipeline for taking a model trained in ML-Agents and exporting it so that I can run inference with the model from inside a python script using TensorFlow. Informed by this and this forum post, my planned pipeline is:

    1. export the model trained in ML-Agents to ONNX using tf2onnx
    2. import the ONNX model in a python script
    3. create a TensorFlow representation in python with onnx-tf
    4. run inference using TensorFlow, similar to the tutorial here:
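    Steps 2-4 can be sketched as below. To keep the snippet self-contained it builds a trivial one-node ONNX graph where the real pipeline would call onnx.load('model.onnx'); the graph and tensor names here are illustrative only, but the prepare/run calls are the same:

    ```python
    import numpy as np
    import onnx
    from onnx import helper, TensorProto
    from onnx_tf.backend import prepare

    # Stand-in for onnx.load('model.onnx'): a minimal graph computing y = x + x.
    node = helper.make_node("Add", inputs=["x", "x"], outputs=["y"])
    graph = helper.make_graph(
        [node],
        "demo",
        inputs=[helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, 2])],
        outputs=[helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, 2])],
    )
    onnx_model = helper.make_model(
        graph, opset_imports=[helper.make_operatorsetid("", 13)]
    )
    onnx.checker.check_model(onnx_model)

    # Step 3: create the TensorFlow representation of the ONNX model.
    tf_rep = prepare(onnx_model)

    # Step 4: run inference through the TensorFlow backend.
    result = tf_rep.run({"x": np.array([[1.0, 2.0]], dtype=np.float32)})
    print(result)
    ```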

    Steps 1 and 2 seem to work but I am stuck on an error at step 3 when trying to create a tensorflow representation of the onnx model created in ML-Agents. Given a trained onnx model "model.onnx", here is a simple python script that reproduces the error:

    Code (python):

        import onnx
        from onnx_tf.backend import prepare

        onnx_model = onnx.load('model.onnx')
        tf_model = prepare(onnx_model)
    The error I receive:

    Traceback (most recent call last):
    File "/home/ross/miniconda3/envs/autofly_py3/lib/python3.7/site-packages/tensorflow/python/framework/", line 1654, in _create_c_op
    c_op = pywrap_tf_session.TF_FinishOperation(op_desc)
    tensorflow.python.framework.errors_impl.InvalidArgumentError: Input must be scalar but has rank 1 for '{{node one_hot}} = OneHot[T=DT_FLOAT, TI=DT_INT32, axis=-1](strided_slice__20, const_fold_opt__56, strided_slice_3, strided_slice_2)' with input shapes: [2147483647], [1], [], [] and with computed input tensors: input[1] = <2>.
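    The message points at a OneHot node: its depth input (input[1], the computed tensor <2>) arrives with shape [1], i.e. rank 1, where TensorFlow's one_hot op requires a scalar, which looks like an artifact of the TensorFlow-to-ONNX-to-TensorFlow round trip. The same complaint can be reproduced directly in TensorFlow, independent of the ML-Agents model (a minimal sketch):

    ```python
    import tensorflow as tf

    # A scalar depth is what tf.one_hot expects:
    ok = tf.one_hot(indices=[0, 1], depth=2)

    # A rank-1 depth (shape [1]) raises the same "must be scalar but has
    # rank 1" error seen in the traceback above.
    raised = False
    try:
        tf.one_hot(indices=[0, 1], depth=tf.constant([2]))
    except (tf.errors.InvalidArgumentError, ValueError):
        raised = True
    print(raised)
    ```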

    I realize that my problem may well be specific to my model.onnx file, but I am posting in the hopes that this is a more general problem others have seen and can help fix. For reference, I am using:

    onnx-tf @ git+
  2. vincentpierre


    May 5, 2017
    We do not support running the generated models outside of Unity as of today, but I think you will have more success using the TensorFlow model directly. In the results folder, look for the frozen_graph_def.pb file; it contains the frozen TensorFlow graph.
    I do not know what causes the error you are getting, but I suspect that converting from TensorFlow to ONNX and then back to TensorFlow using different libraries might lose some data.
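    If the frozen_graph_def.pb route works, loading it in TF 2.x goes through the v1 compatibility API. A minimal sketch; the tensor names "obs"/"action" and the tiny demo graph are illustrative only, so inspect your own .pb for the real input and output names:

    ```python
    import os
    import tempfile

    import numpy as np
    import tensorflow as tf

    def load_frozen_graph(path):
        """Parse a frozen GraphDef (.pb) and import it into a fresh tf.Graph."""
        graph_def = tf.compat.v1.GraphDef()
        with tf.io.gfile.GFile(path, "rb") as f:
            graph_def.ParseFromString(f.read())
        graph = tf.Graph()
        with graph.as_default():
            tf.import_graph_def(graph_def, name="")
        return graph

    # Demo: write a tiny graph to disk, then load and run it the same way
    # you would run the frozen_graph_def.pb from the results folder.
    tmpdir = tempfile.mkdtemp()
    with tf.Graph().as_default() as g:
        x = tf.compat.v1.placeholder(tf.float32, [None, 2], name="obs")
        tf.identity(x * 2.0, name="action")
        tf.io.write_graph(g.as_graph_def(), tmpdir, "demo.pb", as_text=False)

    graph = load_frozen_graph(os.path.join(tmpdir, "demo.pb"))
    with tf.compat.v1.Session(graph=graph) as sess:
        out = sess.run("action:0", feed_dict={"obs:0": np.ones((1, 2), np.float32)})
    print(out)
    ```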
  3. ross_allen


    Jul 21, 2020
    Thank you for the response. A previous forum post ( had discouraged me from directly using the frozen graph .pb file, but I will give it a shot. I will also reach out to the developer who supported the ML-Agents ONNX integration ( as well as the onnx-tf developers to see if they have further insights.