OTA update on deepstream-test5

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): GPU
• DeepStream Version: 5.0.0
• JetPack Version (valid for Jetson only): None
• TensorRT Version: 7.0
• NVIDIA GPU Driver Version (valid for GPU only): 440.33.01
• Issue Type( questions, new requirements, bugs): Questions
Hi sirs,

I am trying to test the OTA feature with the following command:

./deepstream-test5-app -c configs/test5_config_file_src_infer.txt -o configs/test5_ota_override_config.txt

I did not modify configs/test5_config_file_src_infer.txt or configs/test5_ota_override_config.txt at all, but the following log still appears: "Model Update Status: Updated model : /opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test5/configs/…/…/…/…/…/samples/configs/deepstream-app/config_infer_primary.txt, OTATime = 1605694386635.706055 ms, result: ok"

So I am confused: was any model actually updated?

If I want to test OTA, can I just copy the files referenced in configs/test5_ota_override_config.txt to another folder and change their file names? Will that work, or do I have to rebuild an engine?



OTA works when the model gets changed. For example, you can modify test5_ota_override_config.txt to point to a new engine file that reflects your model changes.
As a precondition, the -o option must be passed to the test5 sample when running it, which you already did.
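For illustration, a change like the following in the override config is what the app watches for. This is a hypothetical fragment: the [primary-gie] section name and the engine path are assumptions based on the standard deepstream-app config layout; only the model-engine-file key is confirmed in this thread.

```ini
# Illustrative fragment of test5_ota_override_config.txt (not verbatim).
# Section name and path are assumptions; edit this key while the app is
# running to point at your newly built engine and trigger an OTA update.
[primary-gie]
model-engine-file=/path/to/resnet10.caffemodel_b4_gpu0_fp32.engine
```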

Hi Amycao,

Thanks for your prompt reply. I built an FP32 engine named resnet10.caffemodel_b4_gpu0_fp32.engine, but the OTA process still does not work.

I have uploaded the log and both config files.


test5_ota_override_config.txt (2.4 KB) test5_config_file_src_infer.txt (6.3 KB) log.txt (6.4 KB)

You can comment out the line starting with model-engine-file in the ota* config file before you start running the sample. After the sample is running, uncomment the line and save the file to see whether OTA works.
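The comment/uncomment steps above can be sketched as a small shell script. This is only a sketch: it creates a throwaway stand-in config file via mktemp instead of touching the real test5_ota_override_config.txt, and the sed patterns assume the key sits at the start of its line.

```shell
#!/bin/sh
# Stand-in config file so the sketch is self-contained; in practice you
# would edit configs/test5_ota_override_config.txt instead.
CFG=$(mktemp)
cat > "$CFG" <<'EOF'
model-engine-file=resnet10.caffemodel_b4_gpu0_fp32.engine
EOF

# Step 1: comment out the model-engine-file line before launching the app.
sed -i 's/^model-engine-file/#model-engine-file/' "$CFG"
grep '^#model-engine-file' "$CFG"

# Step 2 (in another terminal, not run here):
# ./deepstream-test5-app -c configs/test5_config_file_src_infer.txt -o "$CFG"

# Step 3: while the app is running, uncomment the line and save the file;
# the app should detect the change and trigger the OTA model update.
sed -i 's/^#model-engine-file/model-engine-file/' "$CFG"
grep -c '^model-engine-file' "$CFG"
rm -f "$CFG"
```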

Hi Amycao,

It works, but is it reasonable that I have to perform the steps you provided to do a model OTA update?

That is just one case demonstrating how to do an on-the-fly model update; it is not a must.

Hi Amycao,

Got it, and thanks for your help.