Does TensorRT support NHWC models?

I use TensorRT to create a network with the C++ API. The weights are extracted from another model file and are in NHWC format.
How can I make TensorRT support NHWC?

Hi,

Could you share which model format you use? Caffe, TensorFlow, or another?

For TensorFlow, the UFF parser automatically applies the required format handling.
However, whether the model is NCHW or NHWC, please remember to register your input blob in NCHW format.
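Because the input binding is registered as NCHW, host data that arrives in NHWC has to be transposed before inference. A minimal sketch of that conversion in plain C++ (`nhwcToNchw` is a hypothetical helper for illustration, not part of the TensorRT API):

```cpp
#include <cstddef>
#include <vector>

// Convert a single image buffer from NHWC (TensorFlow's default layout)
// to NCHW (the layout TensorRT expects for its input bindings).
std::vector<float> nhwcToNchw(const std::vector<float>& src,
                              std::size_t h, std::size_t w, std::size_t c) {
    std::vector<float> dst(src.size());
    for (std::size_t ch = 0; ch < c; ++ch)
        for (std::size_t y = 0; y < h; ++y)
            for (std::size_t x = 0; x < w; ++x)
                // source index walks channels fastest; destination walks
                // width fastest within each channel plane
                dst[ch * h * w + y * w + x] = src[(y * w + x) * c + ch];
    return dst;
}
```

The resulting buffer can be copied directly into the NCHW input binding before `enqueue`/`execute`.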

You can find detailed information in our documentation:
https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#mnist_uff_keyconcepts

Thanks.

The model file is a TensorFlow one, but I don't use the model file directly.
I extract the useful information from it and use the API to create the network. As you know, TensorFlow tensors are in NHWC. If I transpose them to NCHW, the network runs fine, but I find that troublesome. Is there a way to configure TensorRT to run with NHWC tensors?

Hi,

Our implementation uses NCHW.
The necessary format conversions are inserted automatically when a TensorFlow model is parsed into TensorRT.
https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/python_api/pkg_ref/lite.html?highlight=nchw#tensorrt.lite.Engine.convert_LCHW_to_LNCHW

To manually create a TensorRT engine, this tutorial may give you some hints:
https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/python_api/workflows/manually_construct_tensorrt_engine.html
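When building the network manually, convolution weights must be handed to TensorRT in its own KCRS order, while TensorFlow's `tf.nn.conv2d` kernels are stored as [filter_height, filter_width, in_channels, out_channels] (RSCK). A minimal sketch of that reordering in plain C++ (`rsckToKcrs` is a hypothetical helper for illustration, assuming the standard TensorFlow kernel layout):

```cpp
#include <cstddef>
#include <vector>

// Reorder TensorFlow conv2d kernels from RSCK
// (filter_height, filter_width, in_channels, out_channels)
// to the KCRS layout expected by TensorRT convolution weights.
std::vector<float> rsckToKcrs(const std::vector<float>& src,
                              std::size_t r,   // filter height
                              std::size_t s,   // filter width
                              std::size_t c,   // input channels
                              std::size_t k) { // output channels
    std::vector<float> dst(src.size());
    for (std::size_t ko = 0; ko < k; ++ko)
        for (std::size_t ci = 0; ci < c; ++ci)
            for (std::size_t y = 0; y < r; ++y)
                for (std::size_t x = 0; x < s; ++x)
                    dst[((ko * c + ci) * r + y) * s + x] =
                        src[((y * s + x) * c + ci) * k + ko];
    return dst;
}
```

The reordered buffer can then be wrapped in a `Weights` struct and passed to `addConvolution` on the network definition.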

Thanks.