Error when loading in Python an .onnx neural net exported via MATLAB

Hello,
I can't use in Python an .onnx neural net exported with MATLAB. Let's say I want to use the googlenet model; the code for exporting it is the following:
net = googlenet;
filename = 'googleNet.onnx';
exportONNXNetwork(net,filename);
In Python, the commands for loading the .onnx file are the following (according to https://microsoft.github.io/onnxruntime/):
import onnxruntime
sess = onnxruntime.InferenceSession('googlenet.onnx')
But an error message occurs at this stage:
RuntimeError: [ONNXRuntimeError] : 1 : GENERAL ERROR : Load model from googlenet.onnx failed:
Node:prob Output:prob [ShapeInferenceError] Mismatch between number of source and target dimensions. Source=2 Target=4
I tried different nets (alexnet, squeezenet, personal nets...) and the same error always appears.
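For reference, I also inspected the exported file with the onnx Python package (a diagnostic sketch on my side, assuming the onnx package from pip is installed); it runs the model checker and prints the opset version and the declared output shapes:
import onnx

# Load the exported model and run the ONNX consistency checker
# (file name 'googleNet.onnx' as in the export code above)
model = onnx.load('googleNet.onnx')
onnx.checker.check_model(model)

# Print the opset version used by the exporter and the declared output shapes
print('Opset versions:', [opset.version for opset in model.opset_import])
for output in model.graph.output:
    dims = [d.dim_value for d in output.type.tensor_type.shape.dim]
    print('Output', output.name, 'shape:', dims)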
Here is my config:
-----------------------------------------------------------------------------------------------------
MATLAB Version: 9.6.0.1072779 (R2019a)
Operating System: Microsoft Windows 10 Pro Version 10.0 (Build 17763)
Java Version: Java 1.8.0_181-b13 with Oracle Corporation Java HotSpot(TM) 64-Bit Server VM mixed mode
-----------------------------------------------------------------------------------------------------
Deep Learning Toolbox Version 12.1 (R2019a)
Any help is welcome!

Answers (1)

Don Mathis on 20 May 2019
I could not reproduce your error. The following works for me:
In MATLAB:
>> net = googlenet;
>> filename = 'googleNet.onnx';
>> exportONNXNetwork(net,filename,'OpsetVersion',8)
In python:
import numpy
import onnxruntime as rt
sess = rt.InferenceSession("googleNet.onnx")
input_name = sess.get_inputs()[0].name
n = 1
c = 3
h = 224
w = 224
X = numpy.random.random((n,c,h,w)).astype(numpy.float32)
pred_onnx = sess.run(None, {input_name: X})
print(pred_onnx)
It outputs:
[array([[3.29882569e-05, 3.58083460e-04, 3.37624690e-04, 1.43901940e-04, 5.39901492e-04, 4.93929256e-04, 1.84278106e-04, 1.47032852e-05, 3.41630061e-06, 7.50037043e-06, 2.41960952e-05, 4.77660433e-06, 8.67359086e-06, 8.24564086e-06, 2.09670925e-05, 2.51299825e-05, 2.65392214e-06, 3.01301202e-06, 1.45755412e-05, 6.66411279e-06, 2.57993106e-05, 1.68685292e-05, 4.03514641e-05, 3.40506740e-05, 6.18301056e-05, 1.30592525e-05, 7.45224024e-05, 5.93718396e-05, 2.10106184e-04, 2.63419988e-05, 5.05311709e-06, 1.60537282e-04, 6.04824818e-05, 1.52395834e-04, 9.41899605e-04, 1.93663309e-05, 1.47942395e-04, 1.34101238e-05, 4.75002344e-05, 1.01176765e-05, 8.80616863e-05, 1.62361575e-05, 2.06871373e-05, 1.32702444e-05,
...
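Since the input above is random noise, those numbers are just the 1,000 class probabilities (GoogLeNet's ImageNet classes). If you want the top prediction from that output, a minimal follow-up sketch (assuming the pred_onnx variable from the code above) is:
probs = pred_onnx[0]                              # array of shape (1, 1000) with class probabilities
top_class = int(numpy.argmax(probs, axis=1)[0])   # index of the most likely class
print('Top class index:', top_class, 'probability:', probs[0, top_class])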
My MATLAB config:
>> ver
-----------------------------------------------------------------------------------------------------
MATLAB Version: 9.6.0.1092380 (R2019a) Update 1
MATLAB License Number: unknown
Operating System: Microsoft Windows 10 Enterprise Version 10.0 (Build 17134)
Java Version: Java 1.8.0_181-b13 with Oracle Corporation Java HotSpot(TM) 64-Bit Server VM mixed mode
-----------------------------------------------------------------------------------------------------
Deep Learning Toolbox Version 12.1 (R2019a)
  4 Comments
Martijn on 28 May 2019
Thanks Don, I spoke to Patrick and after helping him update to the latest version, things are working correctly.
Patrick Marmaroli on 3 Jun 2019
Indeed, problem solved with "Deep Learning Toolbox Converter for ONNX Model Format" version 19.1.2 (I was using version 19.1.0). Thank you guys.
