Questions about trtexec

Hi, while using trtexec, I have some questions I would like to clarify.

  1. Can I add new parameters when loading an engine file?
    For example:
    build command: /usr/src/tensorrt/bin/trtexec --onnx=[onnxModel] --saveEngine=[engineModel] --fp16
    run command: /usr/src/tensorrt/bin/trtexec --loadEngine=[engineModel] --streams=2 --avgRuns=1 --iterations=4 --fp16
    Will the measured throughput still be accurate, or do I have to rebuild the engine with the new parameters (streams, avgRuns, and iterations)?

  2. In this case, will the engine still be loaded with FP16 and the remaining parameters applied? (Both workflows are sketched side by side after this list.)
    build command: /usr/src/tensorrt/bin/trtexec --onnx=[onnxModel] --saveEngine=[engineModel] --fp16 --streams=2 --avgRuns=1 --iterations=4
    run command: /usr/src/tensorrt/bin/trtexec --loadEngine=[engineModel]
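
  To summarize, these are the two workflows I am comparing (same placeholders and flags as above):

    # Workflow from question 1: build with --fp16 only, then pass the measurement
    # flags (and --fp16 again) when loading the saved engine
    /usr/src/tensorrt/bin/trtexec --onnx=[onnxModel] --saveEngine=[engineModel] --fp16
    /usr/src/tensorrt/bin/trtexec --loadEngine=[engineModel] --streams=2 --avgRuns=1 --iterations=4 --fp16

    # Workflow from question 2: pass every flag at build time, then load the
    # engine with no additional flags
    /usr/src/tensorrt/bin/trtexec --onnx=[onnxModel] --saveEngine=[engineModel] --fp16 --streams=2 --avgRuns=1 --iterations=4
    /usr/src/tensorrt/bin/trtexec --loadEngine=[engineModel]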

Thanks.

Hi,
Please refer to the link below for the sample guide.

Refer to the installation steps in the link in case you are missing anything.

However, the suggested approach is to use TRT NGC containers to avoid any system-dependency-related issues.

To run the Python samples while using the NGC container, make sure the TRT Python packages are installed:
/opt/tensorrt/python/python_setup.sh
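
For example, one way to do this (assuming Docker with the NVIDIA Container Toolkit is installed; the image tag below is only a placeholder, pick the release you need from NGC):

  # Start a TensorRT NGC container with GPU access (tag is a placeholder)
  docker run --gpus all -it --rm nvcr.io/nvidia/tensorrt:<xx.yy>-py3

  # Inside the container, install the TensorRT Python packages used by the samples
  /opt/tensorrt/python/python_setup.sh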

If you are trying to run a custom model, please share your model and script with us so that we can assist you better.
Thanks!