Deepstream config option to load channel-last format neural network?

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Jetson Xavier AGX
• DeepStream Version 5.0
• JetPack Version (valid for Jetson only) 4.4
• TensorRT Version 7.1.3
• Issue Type( questions, new requirements, bugs) question

My custom segmentation model has input dimensions of 1024x1024x3, but when I run it in DeepStream I get the following error:

nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::preparePreprocess() <nvdsinfer_context_impl.cpp:874> [UID = 1]: RGB/BGR input format specified but network input channels is not 3

After some debugging, and after trying a YOLO model with input dimensions of 3x704x704 that ran fine, I found that DeepStream seems to expect channel-first input. I would like to adapt my pipeline to a channel-last input, but I didn't find any option to set in the config file except the uff-input-dims property, which didn't work, as it is probably only for UFF models, like the name suggests.

Is the channel-last format supported in DeepStream, and if not, could you provide some advice on which changes to make to support it?

You can use the uff-input-order property to specify the input format for a UFF model; refer to the nvinfer documentation.

Yes, I already tried uff-input-order, as mentioned in my post.
In the end I just added an additional layer to my neural network that swaps from channel-first to channel-last right after the input is read, so this issue can be closed.
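For anyone looking for the same workaround, here is a minimal sketch (illustrative only, not my exact model code): declare the network input as [C, H, W] so it matches the channel-first buffer DeepStream provides, then make the first operation of the network a transpose back to [H, W, C]. In Keras this would be a Permute((2, 3, 1)) layer; the transpose itself is equivalent to:

```python
import numpy as np

# DeepStream hands the network a channel-first buffer: [C, H, W].
chw = np.random.rand(3, 1024, 1024).astype(np.float32)

# The extra layer simply transposes back to channel-last [H, W, C],
# so the rest of the original channel-last network stays unchanged.
hwc = np.transpose(chw, (1, 2, 0))

print(chw.shape)  # (3, 1024, 1024)
print(hwc.shape)  # (1024, 1024, 3)
```

Since the transpose happens inside the network, no DeepStream config change is needed beyond the standard channel-first input.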

Does this mean that DeepStream works by default with models that have channel-first input?

DeepStream works with channel-last.

@ayanasser: Yes, at least for my use case. DeepStream expected a channel-first input format but my model used channel-last, so I changed the input layer to [C,H,W] and then added a transpose operation to my model to switch to [H,W,C].
@PhongNT: From my experience, it expects channel-first.