SSD on NVCaffe

Has anyone run SSD on NVCaffe? If so, how did you achieve it?

Thank you very much.

Hi,

SSD is a custom Caffe branch.
If you want to run SSD with NvCaffe, you will need to port all of the custom layer implementations to NvCaffe.

Rather than using NvCaffe, we recommend running the SSD model with our TensorRT engine for better performance.
There are several topics discussing this:
https://devtalk.nvidia.com/default/topic/1007313/how-to-build-the-objection-detection-framework-ssd-with-tensorrt-on-tx2-/

Thanks.

Hi AasTaLLL,

Thank you for the information. I will give TensorRT a try.

Many thanks.

Hi,

I’m trying to run TensorRT on my Pascal GPU, but I can’t find how to run an example of face recognition with SSD, for instance.

Do you know?

Thanks

Hi,

You can write your pipeline based on our MNIST sample, which is located at ‘/usr/src/tensorrt/samples/sampleMNIST’.

Thanks.

Thanks!

When I do: g++ sampleMNIST.cpp -o exit

I have this problem:

In file included from /usr/include/c++/5/memory:64:0,
                 from samplePlugin.cpp:12:
/usr/include/c++/5/bits/stl_construct.h:83:7: error: invalid conversion from ‘char’ to ‘const char*’ [-fpermissive]
::new(static_cast<void*>(__p)) _T1(__val
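This kind of error usually appears because the sample is being compiled without the include paths, helper sources, and linker flags that its Makefile normally supplies. If you do want to invoke g++ by hand, the flags might look roughly like this — a sketch only: the paths assume a typical JetPack layout, and the exact library names (e.g. nvcaffe_parser vs. nvparsers) vary by TensorRT version:

    # Hypothetical manual build; the sample Makefile is the authoritative source of flags.
    g++ sampleMNIST.cpp -o sample_mnist \
        -I/usr/src/tensorrt/samples/common \
        -I/usr/local/cuda/include \
        -L/usr/local/cuda/lib64 \
        -lnvinfer -lnvcaffe_parser -lcudart

In practice the shipped Makefile already encodes the correct flags for your platform, which is why building with make is the reliable route.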

Hi,

A Makefile is also included with our TensorRT samples.
Please compile with the Makefile directly:

cp -r /usr/src/tensorrt/ .
cd tensorrt/samples/sampleMNIST
make
cd ../../bin/
./sample_mnist

Thanks.

Thanks!!