Can DeepStream set the maximum workspace for nvinfer?

Hi there,
I cannot find how to set the maximum workspace size when DeepStream builds the TensorRT engine.
There is no relevant information in the DeepStream Development Guide or the DeepStream Plugin Manual.

It’s 450 MB now.

You could search for “setMaxWorkspaceSize” under /opt/nvidia/deepstream/deepstream/sources/libs/nvdsinfer/, e.g.

./nvdsinfer_model_builder.cpp: m_Builder->setMaxWorkspaceSize(kWorkSpaceSize);

kWorkSpaceSize is the workspace size.

Hi mchi,
Thank you for your quick response.

Can we just modify the configuration files (for example, config_infer_primary_ssd.txt or deepstream_app_config_ssd.txt) to change the MaxWorkspaceSize value for DeepStream inference?

Or do we need to change the MaxWorkspaceSize value in nvdsinfer_model_builder.cpp and then recompile DeepStream?

By the way, I cannot find the nvdsinfer_model_builder.cpp file on my PC (DeepStream docker) or TX2 (JetPack 4.3).
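As a sketch of the search workflow suggested above, the commands below simulate it on a local copy so they are self-contained; on a real install you would point grep at the standard DeepStream source path instead (adjust for your release).

```shell
# Create a local stand-in for the DeepStream nvdsinfer sources
# (on a real system, grep under
#  /opt/nvidia/deepstream/deepstream/sources/libs/nvdsinfer/ instead).
mkdir -p nvdsinfer_demo
cat > nvdsinfer_demo/nvdsinfer_model_builder.cpp <<'EOF'
m_Builder->setMaxWorkspaceSize(kWorkSpaceSize);
EOF

# Locate where the workspace size is set
grep -rn "setMaxWorkspaceSize" nvdsinfer_demo/

# On a real install you would then edit kWorkSpaceSize and rebuild the
# library from that directory, typically with `make && sudo make install`.
```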

Please fill in the info like below so that we can answer your question accurately and save time for both of us.

• Hardware Platform (Jetson / GPU) Jetson NX
• DeepStream Version 5.0
• JetPack Version (valid for Jetson only) 4.4 DP
• TensorRT Version 7.1
• NVIDIA GPU Driver Version (valid for GPU only)

On TX2:
• Hardware Platform (Jetson / GPU) Jetson TX2
• DeepStream Version 4.0
• JetPack Version (valid for Jetson only) 4.3
• TensorRT Version 6.0.1

On PC:
• Hardware Platform (Jetson / GPU) Titan Xp
• DeepStream Version 4.0
• TensorRT Version 6.0
• NVIDIA GPU Driver Version (valid for GPU only) 440.82
• DeepStream docker version 4.0.2-19.12-devel