Hi,
Description
Facing an error while connecting to the Triton Inference Server (the server startup itself seems to be failing).
Environment
• Hardware Platform (GPU): NVIDIA GeForce RTX 2080 Ti
• DeepStream Version: 5.0
• NVIDIA GPU Driver Version: 450.102.04
• Issue Type: question
Steps To Reproduce
I am trying to use Triton Inference Server for image classification. The Docker images that I have tried are:
I have run the server using the following command:
sudo docker run --gpus all -it -d -p8000:8000 -p8001:8001 -p8002:8002 -v:/models tritonserver --model-repository=/models
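Since a connection reset usually means the server process died during startup, the container's status and startup log can be checked first (a sketch; the actual container ID comes from the `docker ps -a` output):

```shell
# List all containers, including exited ones, to see whether tritonserver is still up
sudo docker ps -a
# Inspect the server's startup output for errors (replace <container-id> with the real ID)
sudo docker logs <container-id>
```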
Now, when I try to check the health status of the running Triton server using:
curl -v localhost:8000/v2/health/ready
It connects for a moment and then gives a "connection reset" error:
*   Trying 127.0.0.1...
* TCP_NODELAY set
* Connected to localhost (127.0.0.1) port 8000 (#0)
> GET /v2/health/ready HTTP/1.1
> Host: localhost:8000
> User-Agent: curl/7.58.0
> Accept: */*
>
* Recv failure: Connection reset by peer
* stopped the pause stream!
* Closing connection 0
curl: (56) Recv failure: Connection reset by peer
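To rule out curl itself, the same endpoint can be probed with nothing but the Python standard library (a hypothetical helper I wrote for testing, not part of the Triton client API; it assumes the default HTTP port 8000):

```python
import http.client

def triton_ready(host="localhost", port=8000, timeout=5.0):
    """Return True if Triton's /v2/health/ready endpoint answers HTTP 200."""
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    try:
        conn.request("GET", "/v2/health/ready")
        return conn.getresponse().status == 200
    except OSError:  # connection refused, reset, timeout, etc.
        return False
    finally:
        conn.close()

if __name__ == "__main__":
    print("ready:", triton_ready())
```

In my setup this also reports not-ready, which points at the server rather than the client tooling.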
I have also tried to run classification on an image against the Triton Inference Server, using the Triton client set up outside the Triton container:
python3 python/image_client.py -m inception_graphdef -s INCEPTION qa/images/mug.jpg
But it gives the following traceback:
Traceback (most recent call last):
  File "/home/ubuntu/.local/lib/python3.6/site-packages/geventhttpclient/response.py", line 190, in _read_headers
    data = self._sock.recv(self.block_size)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/gevent/_socketcommon.py", line 657, in recv
    return self._sock.recv(*args)
ConnectionResetError: [Errno 104] Connection reset by peer

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "./python/image_client.py", line 401, in <module>
    model_name=FLAGS.model_name, model_version=FLAGS.model_version)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/tritonclient/http/__init__.py", line 494, in get_model_metadata
    query_params=query_params)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/tritonclient/http/__init__.py", line 258, in _get
    response = self._client_stub.get(request_uri)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/geventhttpclient/client.py", line 266, in get
    return self.request(METHOD_GET, request_uri, headers=headers)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/geventhttpclient/client.py", line 260, in request
    raise e
  File "/home/ubuntu/.local/lib/python3.6/site-packages/geventhttpclient/client.py", line 254, in request
    block_size=self.block_size, method=method.upper(), headers_type=self.headers_type)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/geventhttpclient/response.py", line 298, in __init__
    super(HTTPSocketPoolResponse, self).__init__(sock, **kw)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/geventhttpclient/response.py", line 170, in __init__
    self._read_headers()
  File "/home/ubuntu/.local/lib/python3.6/site-packages/geventhttpclient/response.py", line 204, in _read_headers
    'connection closed.')
geventhttpclient.response.HTTPConnectionClosed: connection closed.
I have tried both the latest and an older version of triton-inference-server; both give the same error.
Looking for a hint on what is going wrong with the setup. Let me know if any other information is needed.