GazeNet & Triton Error

• Hardware (NVIDIA GeForce 3090)
• Network Type (GazeNet)

Hey👋 I'm having a really weird error when trying to load a GazeNet .engine model in Triton.
Here are the steps:

  1. I loaded model from:
    Gaze Estimation | NVIDIA NGC

  2. Then converted it with:
    tao-converter -k nvidia_tlt -p input_right_images:0,1x1x224x224,4x1x224x224,8x1x224x224 -p input_left_images:0,1x1x224x224,4x1x224x224,8x1x224x224 -p input_face_images:0,1x1x224x224,4x1x224x224,8x1x224x224 -p input_facegrid:0,1x1x625x1,4x1x625x1,8x1x625x1 model.etlt

  3. When I try to load it with Triton, I get this really weird error:
    E0801 10:58:35.713590 753 model_repository_manager.cc:1348] failed to load 'gazenet' version 1: Internal: unable to allocate memory for output 'fc_joint/concat:0_before_shuffle' for gazenet

When Triton tries to load it, memory use peaks at 950 MiB and then it crashes.
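For reference, here is roughly how my model repository is laid out; this follows the layout Triton expects (the `/tmp` path below is just a placeholder so the commands are safe to copy; my real repository is `/models`):

```shell
# Sketch of the model-repository layout Triton expects: <repo>/<model-name>/<version>/
REPO=/tmp/models            # placeholder; the real repository is /models
mkdir -p "$REPO/gazenet/1"
# tao-converter wrote model.engine; Triton's TensorRT backend looks for
# model.plan by default, so the engine goes under that name:
# cp model.engine "$REPO/gazenet/1/model.plan"
ls "$REPO/gazenet"
```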


Before the run, the 3090 is freed of any other process, to ensure there is no interference.

I tried to run it with the auto-generated config using model.plan, and I also added a manual config file; I get the same error either way.
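The manual config I tried looks roughly like this. This is a sketch, not a verified config: the input names and per-sample shapes come from the tao-converter command above, while the output dims and the TYPE_FP32 for the facegrid input are my assumptions (the real output shape can be checked with trtexec or Triton's verbose log):

```shell
# Write a hedged config.pbtxt sketch into a placeholder repository.
REPO=/tmp/models            # placeholder; the real repository is /models
mkdir -p "$REPO/gazenet"
cat > "$REPO/gazenet/config.pbtxt" <<'EOF'
name: "gazenet"
platform: "tensorrt_plan"
max_batch_size: 8
input [
  { name: "input_left_images:0"  data_type: TYPE_FP32 dims: [ 1, 224, 224 ] },
  { name: "input_right_images:0" data_type: TYPE_FP32 dims: [ 1, 224, 224 ] },
  { name: "input_face_images:0"  data_type: TYPE_FP32 dims: [ 1, 224, 224 ] },
  { name: "input_facegrid:0"     data_type: TYPE_FP32 dims: [ 1, 625, 1 ] }
]
output [
  # dims below are a placeholder, not the engine's verified output shape
  { name: "fc_joint/concat:0" data_type: TYPE_FP32 dims: [ -1 ] }
]
EOF
```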

What is interesting, though, is that trtexec --loadEngine=model.engine runs the model with no problem.

Here are the logs from Triton and trtexec:
Triton_full (25.1 KB)
trtexec.log (42.0 KB)

Can you share more details about how you are loading it in Triton?
Currently, the Triton app (GitHub - NVIDIA-AI-IOT/tao-toolkit-triton-apps: Sample app code for deploying TAO Toolkit trained models to Triton) does not support loading the gazenet model.

Sorry, I meant that I "tried to load" it in Triton, by running the command:

tritonserver --strict-model-config=false --log-verbose=4 --model-repository=/models

So you are saying that GazeNet is simply not supported in Triton?

See tao-toolkit-triton-apps (GitHub - NVIDIA-AI-IOT/tao-toolkit-triton-apps: Sample app code for deploying TAO Toolkit trained models to Triton). We have not implemented gazenet yet.
