
Exporting a model from ML-Agents to run inference in Python with TensorFlow

Discussion in 'ML-Agents' started by ross_allen, Jul 21, 2020.

  1. ross_allen

    Joined:
    Jul 21, 2020
    Posts:
    2
    As the title implies, I am working on a pipeline for taking a model trained in ML-Agents and exporting it so that I can run inference with the model from inside a Python script using TensorFlow. Informed by this and this forum post, my planned pipeline is:

    1. export the model trained in ML-Agents to ONNX using tf2onnx
    2. import the ONNX model in a Python script
    3. create a TensorFlow representation in Python with onnx-tf
    4. run inference using TensorFlow, similar to the tutorial here: https://github.com/onnx/tutorials/blob/master/tutorials/OnnxTensorflowImport.ipynb

    Steps 1 and 2 seem to work, but I am stuck on an error at step 3 when trying to create a TensorFlow representation of the ONNX model created by ML-Agents. Given a trained ONNX model "model.onnx", here is a simple Python script that reproduces the error:

    Code (python):
    import onnx
    from onnx_tf.backend import prepare

    onnx_model = onnx.load('model.onnx')
    tf_model = prepare(onnx_model)
    The error I receive:

    Traceback (most recent call last):
      File "/home/ross/miniconda3/envs/autofly_py3/lib/python3.7/site-packages/tensorflow/python/framework/ops.py", line 1654, in _create_c_op
        c_op = pywrap_tf_session.TF_FinishOperation(op_desc)
    tensorflow.python.framework.errors_impl.InvalidArgumentError: Input must be scalar but has rank 1 for '{{node one_hot}} = OneHot[T=DT_FLOAT, TI=DT_INT32, axis=-1](strided_slice__20, const_fold_opt__56, strided_slice_3, strided_slice_2)' with input shapes: [2147483647], [1], [], [] and with computed input tensors: input[1] = <2>.
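
    The 2147483647 in the input shapes is INT32_MAX, which I believe is how a dynamic (batch) dimension gets represented here, so the OneHot node seems to be handed a whole batch axis where it expects a scalar. To see which graph inputs are declared with dynamic dimensions, I am inspecting the model like this (a small diagnostic sketch using only the onnx package):

    Code (python):
    import onnx

    model = onnx.load('model.onnx')
    # Print each graph input with its declared shape; a dimension with
    # dim_param set instead of dim_value is a dynamic axis.
    for inp in model.graph.input:
        dims = [d.dim_value if d.HasField('dim_value') else d.dim_param
                for d in inp.type.tensor_type.shape.dim]
        print(inp.name, dims)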

    I realize that my problem may well be specific to my model.onnx file, but I am posting in the hopes that this is a more general problem others have seen and can help fix. For reference, I am using:

    python==3.7.6
    onnx==1.7.0
    onnx-tf @ git+https://github.com/onnx/onnx-tensorflow@44c09275a803e04eeeb4e0d24c372adf1f9ff1f5
    tensorboard==2.2.2
    tensorflow==2.2.0
    tensorflow-addons==0.10.0
    tensorflow-estimator==2.2.0
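
    For completeness, once prepare() succeeds, my plan for step 4 looks roughly like this (a sketch only: the observation shape (1, 8) is a made-up placeholder, and the real input names have to be read from tf_model.inputs):

    Code (python):
    import numpy as np
    import onnx
    from onnx_tf.backend import prepare

    onnx_model = onnx.load('model.onnx')
    tf_model = prepare(onnx_model)

    # List the graph's actual input and output tensor names.
    print(tf_model.inputs, tf_model.outputs)

    # Feed one dummy observation; shape (1, 8) is a placeholder.
    obs = np.zeros((1, 8), dtype=np.float32)
    outputs = tf_model.run({tf_model.inputs[0]: obs})
    print(outputs)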
     
  2. vincentpierre

    Joined:
    May 5, 2017
    Posts:
    160
    Hi,
    We do not support running the generated models outside of Unity as of today, but I think you will have more success using the TensorFlow model directly. In the results folder, look for the frozen_graph_def.pb file; it contains the TensorFlow model itself.
    I do not know what is causing the error you are getting, but I suspect that converting from TensorFlow to ONNX and then back to TensorFlow with different libraries might lose some data.
     
  3. ross_allen

    Joined:
    Jul 21, 2020
    Posts:
    2
    Thank you for the response. A previous forum post (https://forum.unity.com/threads/loa...ze-graph-into-tensorflow.932076/#post-6093753) had discouraged me from trying to use the frozen graph .pb file directly, but I will give it a shot. I will also reach out to the developer who supported the ML-Agents ONNX integration (https://github.com/Unity-Technologies/ml-agents/pull/3101) as well as the onnx-tf developers to see if they have further insights.
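
    In case it helps anyone following along, here is roughly how I plan to load the frozen graph in TF 2.x (a sketch: the path and the tensor names 'vector_observation:0' and 'action:0' are guesses, and the real names have to be read out of the graph):

    Code (python):
    import tensorflow as tf

    def load_frozen_graph(pb_path, input_name, output_name):
        # Read the serialized GraphDef written by ML-Agents.
        with tf.io.gfile.GFile(pb_path, 'rb') as f:
            graph_def = tf.compat.v1.GraphDef()
            graph_def.ParseFromString(f.read())
        # Wrap the TF1-style graph so it is callable from TF2 code,
        # then prune it down to a single input/output pair.
        wrapped = tf.compat.v1.wrap_function(
            lambda: tf.compat.v1.import_graph_def(graph_def, name=''), [])
        return wrapped.prune(input_name, output_name)

    infer = load_frozen_graph('results/run_id/frozen_graph_def.pb',
                              'vector_observation:0', 'action:0')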
     