Fast start with large models?

Hi, after I boot up my NVIDIA Jetson Xavier, I load very large TensorFlow and other models, which takes several minutes.

When I deploy this in my autonomous edge-computing application, the device has no peripherals, just an on-off switch.

When I hit the on-switch, I want the Xavier to boot up and start doing inference as quickly as possible.

Is there a way to avoid my model-loading step and instead load a memory image that captures the state of the machine after the models were loaded … or some other solution?

I have 8 GB of memory and a 512 GB SD card.

Thanks in advance!

Hi,

TensorFlow is a heavy library, so a slow initialization time is expected.

Is converting the model to TensorRT an option for you?
If so, please find an example below:
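
Here is a minimal TF-TRT conversion sketch. It assumes a recent TensorFlow 2.x build with TensorRT support; the directories "my_saved_model" and "my_trt_model" and the 1x224x224x3 input shape are placeholders you would replace with your own model's details.

```python
import tensorflow as tf
# TF-TRT converter shipped inside TensorFlow
from tensorflow.python.compiler.tensorrt import trt_convert as trt

# Convert the SavedModel once, offline, on the Xavier itself.
converter = trt.TrtGraphConverterV2(
    input_saved_model_dir="my_saved_model",      # placeholder path
    precision_mode=trt.TrtPrecisionMode.FP16,    # FP16 suits Xavier well
)
converter.convert()

# Pre-build the TensorRT engines so they are serialized together with the
# model instead of being rebuilt on every boot.
def input_fn():
    # One representative batch; the shape here is an assumed placeholder.
    yield (tf.zeros([1, 224, 224, 3], dtype=tf.float32),)

converter.build(input_fn=input_fn)
converter.save("my_trt_model")                   # placeholder path

# At deployment time, load it like any SavedModel; the serialized engines
# are deserialized rather than re-optimized, which shortens startup.
model = tf.saved_model.load("my_trt_model")
```

Because the engines are built and saved ahead of time with `converter.build()`, the deployed device only has to deserialize them at startup, which should cut your boot-to-inference time considerably compared to loading and optimizing the full TensorFlow model on every boot.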

Thanks.
