Hi
I have a problem: the inference results from the TAO UNet Jupyter notebook and from
tao-toolkit-triton-apps are different, even though I am using the same model.
To check whether the problem is related to the inference platform or to the model itself, I tried the PeopleSemSegNet model from your website.
However, I got an error when I tried to run the PeopleSemSegNet model on tao-toolkit-triton-apps.
Below are my command & error log:
python tao_client.py ./test3 -m test3 --mode Test -i https -u localhost:8000 --async --output_path ~/tao-toolkit-triton-apps/tao_triton/python/entrypoints/results3
Traceback (most recent call last):
  File "tao_client.py", line 447, in <module>
    main()
  File "tao_client.py", line 428, in main
    responses.append(async_request.get_result())
  File "/home/justin927/anaconda3/envs/deepstream/lib/python3.6/site-packages/tritonclient/http/__init__.py", line 1590, in get_result
    _raise_if_error(response)
  File "/home/justin927/anaconda3/envs/deepstream/lib/python3.6/site-packages/tritonclient/http/__init__.py", line 64, in _raise_if_error
    raise error
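
The traceback ends at raise error without printing the server's message, so I can only guess at the cause. For reference, here is a minimal sketch of how I could capture the underlying error text around the get_result() call, assuming the HTTP client raises tritonclient.utils.InferenceServerException (async_request here is the request object from tao_client.py's own loop):

from tritonclient.utils import InferenceServerException

try:
    # get_result() blocks until the async request completes and re-raises
    # any server-side failure as an InferenceServerException
    result = async_request.get_result()
except InferenceServerException as e:
    # print the server's error text before re-raising
    print("Triton returned an error:", e.message())
    raise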
The attachment is my config file. Since I didn't know how to set it up, I just reused the settings from my own UNet model. Is there a reference for the correct config settings for this model?
config.pbtxt (246 Bytes)
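
For comparison, this is a minimal config.pbtxt sketch of what I think a UNet-style segmentation model on Triton needs. The model name, tensor names, data types, and dimensions below are placeholders I am guessing at; they would have to match what the exported PeopleSemSegNet engine actually reports, so please point me to the correct values if these are wrong:

name: "peoplesemsegnet_tao"
platform: "tensorrt_plan"
max_batch_size: 1
input [
  {
    name: "input_1:0"        # placeholder; must match the model's input tensor name
    data_type: TYPE_FP32
    format: FORMAT_NCHW
    dims: [ 3, 544, 960 ]    # placeholder; use the export height/width
  }
]
output [
  {
    name: "argmax_1"         # placeholder; must match the model's output tensor name
    data_type: TYPE_INT32
    dims: [ 544, 960, 1 ]
  }
]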