Loading Triton Server library: unable to find required entrypoint 'TRITONSERVER_InferenceRequestSetCorrelationIdString'

A30

Running the Triton server from within the Triton Docker container, I am running into a problem with the library … would appreciate any suggestions. Thanks!

sudo docker run --gpus all -it --restart always  -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY=$DISPLAY -v /home/dell:/home/dell -w /opt/nvidia/deepstream/deepstream-6.0 nvcr.io/nvidia/deepstream:6.0-triton
tritonserver --model-repository=/opt/nvidia/deepstream/deepstream-6.0/samples/trtis_model_repo/retinanet_resnet18_mod/1 &
   
curl -v localhost:8000/v2/health/ready
*   Trying 127.0.0.1:8000...
* TCP_NODELAY set
* Connected to localhost (127.0.0.1) port 8000 (#0)
> GET /v2/health/ready HTTP/1.1
> Host: localhost:8000
> User-Agent: curl/7.68.0
> Accept: */*
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Content-Length: 0
< Content-Type: text/plain
<
* Connection #0 to host localhost left intact

perf_analyzer -m trt.intnt8.engine --service-kind=triton_c_api --triton-server-directory=/opt/tritonserver  --model-repository=.

USING C API: only default functionalities supported
OpenLibraryHandle: /opt/tritonserver/lib/libtritonserver.so
error: Loading Triton Server library: unable to find required entrypoint 'TRITONSERVER_InferenceRequestSetCorrelationIdString' in backend library: /opt/tritonserver/lib/libtritonserver.so: undefined symbol: TRITONSERVER_InferenceRequestSetCorrelationIdString
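An "undefined symbol" error like this usually indicates a version mismatch: the perf_analyzer binary was built against a newer libtritonserver.so that exports TRITONSERVER_InferenceRequestSetCorrelationIdString, while the copy of the library shipped inside the DeepStream 6.0 container likely predates that entrypoint. One quick way to confirm which entrypoints a shared library actually exports is a small ctypes check. This is only a sketch: it uses libc and printf as stand-ins, since the Triton library path only exists inside the container — substitute /opt/tritonserver/lib/libtritonserver.so and the Triton symbol name there.

```python
import ctypes
import ctypes.util

def exports_symbol(lib_path: str, symbol: str) -> bool:
    """Return True if the shared library at lib_path exports `symbol`."""
    lib = ctypes.CDLL(lib_path)
    # CDLL attribute lookup raises AttributeError for a missing symbol,
    # so hasattr() doubles as an export check.
    return hasattr(lib, symbol)

# Stand-in demo with libc; inside the container you would pass
# "/opt/tritonserver/lib/libtritonserver.so" and
# "TRITONSERVER_InferenceRequestSetCorrelationIdString" instead.
libc = ctypes.util.find_library("c") or "libc.so.6"
print(exports_symbol(libc, "printf"))             # True on glibc systems
print(exports_symbol(libc, "not_a_real_symbol"))  # False
```

If the symbol is missing from the container's library, matching the perf_analyzer version to the Triton release bundled with DeepStream 6.0 (or running perf_analyzer from the corresponding tritonserver container) is the usual fix.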