I’m trying to use imagenet-console from jetson-inference on a custom model. It works fine with FP16, but for FP32, I get the following error:
[GIE] network profiling complete, writing code to …caffemodel.1.tensorcache
[GIE] completed writing cache to …caffemodel.1.tensorcache
[GIE] …caffemodel loaded
[GIE] failed to allocate -1 bytes to deserialize model
failed to load …caffemodel
imageNet – failed to initialize
imagenet-console: failed to initialize imageNet
When I test the model without using sudo, it gives me the following error:
Input “data”: 3x224x224
Output “prob”: 1000x1x1
cudnnEngine.cpp (55) - CUDA Error in initializeCommonContext: 4
could not build engine
Engine could not be created
Engine could not be created
However, when I ran the same command with sudo, it worked fine.
“strace” produces a lot of output, but you can filter it down to a useful subset of messages. If your program is “program arg1”, you would run it under strace as your regular user, without sudo.
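A minimal sketch of such a run (the syscall filter keeps the log readable; `program arg1` is a placeholder for your actual command):

```shell
# Log only file-access syscalls to trace.log, following child
# processes (-f), while running as your regular user (no sudo):
strace -f -e trace=open,openat,access,stat -o trace.log program arg1

# Permission failures show up as EACCES/EPERM in the log:
grep -E 'EACCES|EPERM' trace.log
```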
The point of strace is to see where the permission check goes wrong. I’m guessing you shouldn’t need to run as root, and if that is the case, the log might show the location where a permission failure occurs and provide further clues. Alternatively, you can simply keep running as root.
Could you check whether the suggestion in the last comment also works for you?
[i]------------------------------------------------------------
I had the same problem with inference running in Docker, and I found the cause.
You need to set the right CUDA_ARCH for your GPU.
For example:
export CUDA_ARCH="50 52"
------------------------------------------------------------[/i]
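For reference, the value is the GPU's CUDA compute capability with the dot removed. Below is a sketch for a few common GPUs, based on NVIDIA's published compute capability tables; the `CUDA_ARCH` variable name comes from the quoted post, so double-check which variable your particular build actually reads:

```shell
# CUDA_ARCH = compute capability without the dot.
# Pick the one line matching your GPU:
export CUDA_ARCH="50 52"   # desktop Maxwell (GTX 750 / GTX 900 series)
export CUDA_ARCH="53"      # Jetson Nano / TX1
export CUDA_ARCH="61"      # desktop Pascal (GTX 10 series)
export CUDA_ARCH="62"      # Jetson TX2
export CUDA_ARCH="72"      # Jetson AGX Xavier
export CUDA_ARCH="75"      # Turing (RTX 20 series)
```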