NVIDIA Developer Forums
Unable to parse custom pytorch UNET onnx model with python deepstream-segmentation-app
spolisetty · August 12, 2022, 12:42pm
Hi,
Please refer to the following similar posts, which may help you.
Thank you.
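For reference, those threads mostly come down to exporting the model in a form TensorRT's ONNX parser accepts. Below is a minimal sketch of that export step: a PyTorch segmentation model exported to ONNX with an explicit opset and a static NCHW input. The TinyUNet stand-in, input shape, opset version, and the unet.onnx file name are illustrative assumptions, not details taken from this thread.

```python
# Minimal sketch: export a PyTorch segmentation model to ONNX for TensorRT.
# TinyUNet is a stand-in for the custom UNet; the real model is not shown here.
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.down = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
        )
        self.up = nn.Sequential(
            nn.ConvTranspose2d(16, 16, kernel_size=2, stride=2),
            nn.ReLU(),
            nn.Conv2d(16, num_classes, kernel_size=1),
        )

    def forward(self, x):
        return self.up(self.down(x))

model = TinyUNet().eval()
dummy = torch.randn(1, 3, 512, 512)   # assumed NCHW shape; must match the DeepStream config

torch.onnx.export(
    model,
    dummy,
    "unet.onnx",
    opset_version=13,                 # choose an opset the target TensorRT release supports
    input_names=["input"],
    output_names=["output"],
)
```

The exported unet.onnx can be sanity-checked outside DeepStream with trtexec --onnx=unet.onnx to confirm the parser accepts it before pointing the deepstream-segmentation-app config at it.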
Related topics

Topic | Category | Tags | Replies | Views | Activity
I am trying to convert the ONNX SSD mobilnet v3 model into TensorRT Engine. I am getting the below error | Jetson TX2 | tensorrt, tensorflow | 24 | 3661 | February 17, 2022
How to use onnx file with deepstream-test1-usbcam + Custom models | DeepStream SDK | - | 30 | 4615 | October 12, 2021
Some PyTorch model with slicing operation fails on inference | TensorRT | tensorrt, pytorch, onnx, deepstream | 2 | 1411 | January 7, 2022
Unable to parse custom pytorch UNET onnx model with python deepstream-segmentation-app | DeepStream SDK | onnx, segmentation, deepstream61 | 9 | 1190 | August 16, 2022
I do not get any performance improvement after using TensorRT provider for object detection model | Jetson Nano | tensorrt, onnx | 7 | 1381 | July 12, 2022
ONNX to TensorRT Python module doesn't generate dynamic batch size engine | TensorRT | tensorrt, cudnn, onnx | 3 | 1053 | October 20, 2023
Torchvision Faster RCNN failed to convert to TensorRT engine | TensorRT | tensorrt, ubuntu, python | 3 | 1414 | October 5, 2023
Failed to used TensorRT Engine file in deepstream | DeepStream SDK | - | 16 | 2717 | October 12, 2021
Issues with torch.nn.ReflectionPad2d(padding) conversion to TRT engine | TensorRT | tensorrt, pytorch, onnx | 21 | 4130 | February 8, 2022
Unable to generate tensorrt engine using ds-tao-detection app for yolov4_tiny for QAT trained etlt model | DeepStream SDK | - | 16 | 536 | June 14, 2023