Hi NVIDIA Engineers,

First, I’d like to know when the Model Store feature will finally be usable in DIGITS 5.

Some state-of-the-art methods also haven’t been included in DIGITS 5.

We recently started a new company in Taiwan.

I think I really need a tool like DIGITS because it saves me a lot of time in development.

Do you have a schedule for future updates, or can we discuss which new models will be included next?

Hi TimCook,

The DIGITS training system is not supported on ARM/Jetson; it is meant to run on a PC for training. This is partly because of the nvcaffe that DIGITS uses for training: on the TX1, nvcaffe is optimized for FP16 inference, not training. So to train a network, run DIGITS in the cloud (e.g., AWS or Azure) or on a local x86 machine. With each training epoch, DIGITS saves a network model checkpoint, which you can copy over to your Jetson for deploying inference. You can do this with DetectNet as well: once you have it trained in DIGITS to your liking, copy it over to your Jetson and load it with TensorRT using example code like this.
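Since DIGITS writes one checkpoint per epoch, the job directory accumulates several snapshot files and you typically only want to copy the most recent one to the Jetson. Here is a minimal sketch of a helper that picks it out, assuming Caffe's default `snapshot_iter_<N>.caffemodel` snapshot naming (the exact naming in your job directory may differ):

```python
import re

# Assumption: DIGITS/Caffe names snapshots "snapshot_iter_<N>.caffemodel",
# where <N> is the training iteration at which the snapshot was taken.
_SNAPSHOT_RE = re.compile(r"snapshot_iter_(\d+)\.caffemodel$")

def latest_snapshot(filenames):
    """Return the snapshot filename with the highest iteration number,
    or None if the listing contains no snapshot files."""
    best_name, best_iter = None, -1
    for name in filenames:
        m = _SNAPSHOT_RE.search(name)
        if m and int(m.group(1)) > best_iter:
            best_name, best_iter = name, int(m.group(1))
    return best_name
```

You would then copy that `.caffemodel` (together with the job's `deploy.prototxt`) over to the Jetson, e.g. with `scp`, before loading it with TensorRT.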

Hi Kayccc,

I haven’t gotten through the TensorRT registration yet.
Can you help me get through it?

I really want to install TensorRT on my TX1 and speed up inference.

Many Thanks,

Hi TimCook, TensorRT has been available through JetPack on Jetson since JetPack 2.3. If you follow the default JetPack install, TensorRT will automatically be installed on your Jetson TX1.

FYI, you can monitor this issue on the DIGITS GitHub to follow when the Model Store goes live: https://github.com/NVIDIA/DIGITS/issues/1214