Execution of sample.py from the samples directory /usr/src/tensorrt/samples/python/end_to_end_tensorflow_mnist/ throws an error

Running the script sample.py from the directory
/usr/src/tensorrt/samples/python/end_to_end_tensorflow_mnist/
fails with the error
[TensorRT] ERROR: UffParser: Could not open models/lenet5.uff
I found that the models directory is not present; the file actually lives at
/usr/src/tensorrt/data/mnist/lenet5.uff
I then changed the path in sample.py:
model_path = os.environ.get("MODEL_PATH") or os.path.join(os.path.dirname(__file__), "models")

Here I changed "models" to "mnist" and ran the script with the command
python3.6 sample.py -d /usr/src/tensorrt/data/
but it still gave the same error.
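Since that line already checks a MODEL_PATH environment variable before falling back to the hard-coded directory, another workaround is to point MODEL_PATH at the directory that actually contains lenet5.uff. A minimal sketch (it assumes the rest of sample.py resolves the model file from model_path exactly as shown above):

import os

# Assumption: set this before sample.py resolves its model path, e.g. at the top
# of the script, or export MODEL_PATH=... in the shell before running it.
os.environ["MODEL_PATH"] = "/usr/src/tensorrt/data/mnist"

model_path = os.environ.get("MODEL_PATH") or os.path.join(os.path.dirname(__file__), "models")
model_file = os.path.join(model_path, "lenet5.uff")
print(model_file)  # -> /usr/src/tensorrt/data/mnist/lenet5.uff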
Below are the error logs: one from running the script as is, and the other after modifying "models" to "mnist" in the model path.

  1. Error logs when I ran the script sample.py as is
    python3.6 sample.py -d /usr/src/tensorrt/data
    [TensorRT] ERROR: UffParser: Could not open models/lenet5.uff
    [TensorRT] ERROR: 4: [network.cpp::validate::2411] Error Code 4: Internal Error (Network must have at least one output)
    [TensorRT] ERROR: 2: [builder.cpp::buildSerializedNetwork::417] Error Code 2: Internal Error (Assertion enginePtr != nullptr failed.)
    Traceback (most recent call last):
    File "sample.py", line 81, in <module>
    main()
    File "sample.py", line 67, in main
    with build_engine(model_file) as engine:
    File "sample.py", line 51, in build_engine
    return runtime.deserialize_cuda_engine(plan)
    TypeError: deserialize_cuda_engine(): incompatible function arguments. The following argument types are supported:
    1. (self: tensorrt.tensorrt.Runtime, serialized_engine: buffer) -> tensorrt.tensorrt.ICudaEngine
    Invoked with: <tensorrt.tensorrt.Runtime object at 0x7f88f25fb8>, None

  2. Error logs when "models" is modified to "mnist" in sample.py
    model_path = os.environ.get("MODEL_PATH") or os.path.join(os.path.dirname(__file__), "mnist")
    python3.6 sample.py -d /usr/src/tensorrt/data/
    [TensorRT] ERROR: UffParser: Could not open mnist/lenet5.uff
    [TensorRT] ERROR: 4: [network.cpp::validate::2411] Error Code 4: Internal Error (Network must have at least one output)
    [TensorRT] ERROR: 2: [builder.cpp::buildSerializedNetwork::417] Error Code 2: Internal Error (Assertion enginePtr != nullptr failed.)
    Traceback (most recent call last):
    File "sample.py", line 81, in <module>
    main()
    File "sample.py", line 67, in main
    with build_engine(model_file) as engine:
    File "sample.py", line 51, in build_engine
    return runtime.deserialize_cuda_engine(plan)
    TypeError: deserialize_cuda_engine(): incompatible function arguments. The following argument types are supported:
    1. (self: tensorrt.tensorrt.Runtime, serialized_engine: buffer) -> tensorrt.tensorrt.ICudaEngine
    Invoked with: <tensorrt.tensorrt.Runtime object at 0x7f88100fb8>, None
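For context, the later errors all follow from the first one: because the .uff file cannot be opened, no output tensor is ever marked on the network, build_serialized_network() returns None, and deserialize_cuda_engine(None) raises the TypeError. Below is a minimal sketch of a build_engine() with explicit guards; it is not the sample verbatim, and the input/output tensor names are placeholders (the real sample takes them from its ModelData class):

import os
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(model_file):
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network()
    parser = trt.UffParser()

    # Placeholder tensor names and shape; the actual sample reads these from ModelData.
    parser.register_input("input_1", (1, 28, 28))
    parser.register_output("output")

    # Fail early with a clear message instead of letting a bad path
    # fall through to a None engine plan.
    if not os.path.isfile(model_file):
        raise FileNotFoundError("UFF model not found: " + model_file)
    if not parser.parse(model_file, network):
        raise RuntimeError("UffParser failed to parse " + model_file)

    config = builder.create_builder_config()
    plan = builder.build_serialized_network(network, config)
    if plan is None:
        raise RuntimeError("Engine build failed (no serialized network)")

    runtime = trt.Runtime(TRT_LOGGER)
    return runtime.deserialize_cuda_engine(plan)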

Hi,

That sample targets a TensorFlow 1.x model and is out of date.
Please check our new samples, which support TensorFlow 2.x models via the ONNX format.

EfficientDet: https://github.com/NVIDIA/TensorRT/tree/release/8.2/samples/python/efficientdet
EfficientNet: https://github.com/NVIDIA/TensorRT/tree/release/8.2/samples/python/efficientnet
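For reference, a generic sketch of the ONNX workflow those samples rely on (not copied from them; the model path is an assumption, and the TF2-to-ONNX export, e.g. with tf2onnx, is assumed to have happened already):

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine_from_onnx(onnx_path):
    builder = trt.Builder(TRT_LOGGER)
    # ONNX models require an explicit-batch network.
    flags = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    network = builder.create_network(flags)
    parser = trt.OnnxParser(network, TRT_LOGGER)

    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("Failed to parse " + onnx_path)

    config = builder.create_builder_config()
    plan = builder.build_serialized_network(network, config)
    if plan is None:
        raise RuntimeError("Engine build failed")
    return trt.Runtime(TRT_LOGGER).deserialize_cuda_engine(plan)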

Thanks.

OK. Let me try it.

Thanks
Nagaraj Trivedi
