The DIGITS training system is not supported on ARM/Jetson and is meant to run on a PC for training. This is partly because of nvcaffe, which DIGITS uses for training: on TX1, nvcaffe is optimized for FP16 inference, not training. So to train a network, run DIGITS in the cloud (e.g. AWS or Azure) or on a local x86 machine. With each training epoch, DIGITS saves a network model checkpoint, which you can copy over to your Jetson for deploying the inference. You can do this with DetectNet as well: after you have trained it in DIGITS to your liking, copy it over to your Jetson. There you can load it with TensorRT using example code like this.
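A minimal sketch of that loading step, assuming the jetson-inference library (which wraps TensorRT) is installed on the Jetson. The file names (`deploy.prototxt`, `snapshot.caffemodel`, `test.jpg`) are placeholders for your own DIGITS snapshot and test image, and the exact `detectNet` signatures may differ between jetson-inference versions:

```cpp
// Sketch: load a DIGITS-trained DetectNet snapshot on Jetson with
// the jetson-inference library, which builds a TensorRT engine from it.
#include <cstdio>
#include <jetson-inference/detectNet.h>
#include <jetson-utils/loadImage.h>

int main(int argc, char** argv)
{
    // Paths to the snapshot files copied from the DIGITS host (placeholders)
    detectNet* net = detectNet::Create("deploy.prototxt", "snapshot.caffemodel");
    if (!net)
        return 1;

    // Load a test image into CPU/GPU shared memory
    uchar3* img = NULL;
    int width = 0, height = 0;
    if (!loadImage("test.jpg", &img, &width, &height))
        return 1;

    // Run inference through TensorRT and print the detected bounding boxes
    detectNet::Detection* detections = NULL;
    const int numDetections = net->Detect(img, width, height, &detections);

    for (int n = 0; n < numDetections; n++)
        printf("box (%.0f, %.0f, %.0f, %.0f) confidence %.2f\n",
               detections[n].Left, detections[n].Top,
               detections[n].Right, detections[n].Bottom,
               detections[n].Confidence);

    delete net;
    return 0;
}
```

The first `detectNet::Create()` call is the slow part, since TensorRT optimizes the network for the device; the resulting engine is cached, so subsequent runs start much faster.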
Hi TimCook, TensorRT has been available through JetPack on Jetson since JetPack 2.3. If you follow the default JetPack install, TensorRT will automatically be installed on your Jetson TX1.