DetectNet on TK1

Hi everyone,

I would like to know if anyone has used DetectNet on a TK1 board.
I have seen some posts about running DetectNet inference on a TX1 (also using TensorRT), but nothing about the TK1.

Best regards,
Gonçalo

Hi Goncalo, unfortunately the 64-bit TensorRT is not supported on the 32-bit TK1; however, you can still deploy DetectNet using Caffe. Here’s an example of getting Caffe up & running on TK1.
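Once Caffe is built, a forward pass with a DIGITS-trained DetectNet snapshot through the Caffe C++ API looks roughly like the sketch below. The file names are placeholders for your own deploy prototxt and caffemodel, and note that the DIGITS deploy file usually ends in a Python clustering layer that you may need to strip or reimplement when driving the net from C++:

#include <caffe/caffe.hpp>
#include <cstdio>

int main()
{
    caffe::Caffe::set_mode(caffe::Caffe::GPU);

    // Placeholder paths -- substitute your own DetectNet snapshot files.
    caffe::Net<float> net("deploy.prototxt", caffe::TEST);
    net.CopyTrainedLayersFrom("snapshot.caffemodel");

    // DetectNet deploy nets take a single NCHW image blob; fill it with
    // your preprocessed pixel data before running the forward pass.
    caffe::Blob<float>* input = net.input_blobs()[0];
    float* in_data = input->mutable_cpu_data();
    for (int i = 0; i < input->count(); ++i)
        in_data[i] = 0.0f;  // replace with real image data

    net.Forward();

    // The output blob holds the coverage/bounding-box maps that the
    // clustering step turns into final detections.
    const caffe::Blob<float>* out = net.output_blobs()[0];
    printf("output blob: %d x %d x %d x %d\n",
           out->num(), out->channels(), out->height(), out->width());
    return 0;
}

Timing the net.Forward() call is also an easy way to see what frame rate the TK1 can sustain before adding any pre/post-processing.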

Hello Dusty,

Thanks for the answer. I have used Caffe on a Jetson for a while, but with simpler network architectures.
I have now successfully run DetectNet on the TK1. It takes around 2.166 seconds per forward pass, and I have to be careful not to use much memory for anything else, otherwise the script gets killed with SIGKILL.

Hello!
Does DetectNet run OK on the TX1/TX2 without being killed, and are the TX1/TX2 resources enough to run it alongside other workloads?
I have successfully installed NVCaffe on the TK1 and use it for simpler networks, but DetectNet does indeed crash when I use the Caffe API from my C++ application.

Hi,

For the TX1/TX2, we run DetectNet inference via TensorRT and haven't found any out-of-resource issues.

For the TK1, could you add some swap space and try it again?

Connect some extra storage and then:

fallocate -l 8G swapfile      # create an 8 GB file on the external storage
ls -lh swapfile
chmod 600 swapfile            # restrict permissions as required for swap
ls -lh swapfile
mkswap swapfile               # format the file as swap space
sudo swapon swapfile          # enable it (needs root)
swapon -s                     # confirm the swap is active

Thank you! Swap did the job indeed!

Should I free the allocated memory after calling:

cudaAllocMapped((void**) cpu, (void**) gpu, (qImg.width() * qImg.height() * sizeof(float) * 4));
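
For reference, here is a rough sketch of the zero-copy pattern behind that call, written against the plain CUDA runtime API. The assumption (worth verifying against your jetson-inference version) is that cudaAllocMapped wraps cudaHostAlloc with the mapped flag, so one physical buffer sits behind both the CPU and GPU pointers and is released with a single cudaFreeHost on the CPU pointer once you are done with it:

#include <cuda_runtime.h>
#include <cstdio>

int main()
{
    // Allow mapped (zero-copy) host allocations; call before other CUDA calls.
    cudaSetDeviceFlags(cudaDeviceMapHost);

    // Size of a width x height RGBA float image, matching the expression
    // in the call above (the dimensions here are just example values).
    const size_t width  = 1280;
    const size_t height = 720;
    const size_t size   = width * height * sizeof(float) * 4;

    void* cpuPtr = NULL;
    void* gpuPtr = NULL;

    // Allocate mapped host memory...
    if (cudaHostAlloc(&cpuPtr, size, cudaHostAllocMapped) != cudaSuccess)
    {
        printf("cudaHostAlloc failed\n");
        return 1;
    }

    // ...and fetch the device-side pointer that aliases the same buffer.
    if (cudaHostGetDevicePointer(&gpuPtr, cpuPtr, 0) != cudaSuccess)
    {
        printf("cudaHostGetDevicePointer failed\n");
        cudaFreeHost(cpuPtr);
        return 1;
    }

    // ... use cpuPtr on the host and gpuPtr in kernels ...

    // One buffer, one release: freeing the host pointer tears down the
    // mapping for both sides (do this only after the GPU has finished).
    cudaFreeHost(cpuPtr);
    return 0;
}

On Tegra boards like the TK1 the CPU and GPU share the same physical memory, so this mapped allocation does not create a second copy of the image.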

Hi wiany11,

Not sure about the problem you are facing.
Could you share more details about it?

Which sample do you want to use, and which framework (Caffe or TensorRT)?