We’re having an issue while running the VLM workflow
We’re able to add a camera and generate alerts on the mobile app, but when adding a second camera we get an error saying ‘Max WebRTC connections reached’.
The ROI feature also fails, with ‘Failed to set ROI’.
Can you update "max_webrtc_out_connections": 1 in /opt/nvidia/jetson-1.1.0/services/vst/config/vst_config.json to your preferred number?
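If you only need to change that one value, something like the following should work (a sketch — back up the file first, the value 4 is only an example, and how you restart VST depends on how it was launched on your setup):

```shell
CFG=/opt/nvidia/jetson-1.1.0/services/vst/config/vst_config.json

# Keep a backup before touching the config
sudo cp "$CFG" "$CFG.bak"

# Raise the outbound WebRTC connection limit (4 is just an example value)
sudo sed -i 's/"max_webrtc_out_connections": 1/"max_webrtc_out_connections": 4/' "$CFG"

# Restart VST afterwards so it picks up the new limit
```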
Here is the guide on JPS VLM. Why do you need to set an ROI for the VLM workflow?
One more issue we are facing: after adding 4 cameras in VST, the mobile app shows ‘Invalid VLM stream ID’ whenever we use the chat option to interact with the VLM.
And in the docker logs for VLM, we see an error saying ‘max retries exceeded for URL - /api/v1/chat-completion’.
Error on the jps_vlm container: HTTPConnectionPool(host='0.0.0.0', port=5015): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffff8a5c7700>: Failed to establish a new connection: [Errno 111] Connection refused'))
It works when I stop the AI-NVR containers with sudo docker compose -f compose_nano.yaml down --remove-orphans.
But with the AI-NVR containers running, the VLM (and even a direct curl) fails with: HTTPConnectionPool(host='0.0.0.0', port=5015): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffff8a5c7700>: Failed to establish a new connection: [Errno 111] Connection refused'))
Do you think there is a port forwarding / port blocking issue?
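One thing worth noting: ‘[Errno 111] Connection refused’ usually means nothing is listening on port 5015 at all, rather than a firewall or forwarding block. A quick way to narrow it down (a sketch — the port and endpoint come from the error above, everything else here is an assumption) is to check what is bound to 5015 while AI-NVR is running:

```shell
# Is anything listening on 5015? If the VLM backend never came up
# (or was killed by resource pressure from AI-NVR), every request
# to it gets ECONNREFUSED.
sudo ss -ltnp | grep ':5015' || echo 'nothing listening on 5015'

# Which container publishes the port, if any
sudo docker ps --format '{{.Names}}  {{.Ports}}' | grep 5015 || true

# Probe the endpoint directly (path taken from the error message)
curl -sS -o /dev/null -w 'HTTP %{http_code}\n' \
  http://0.0.0.0:5015/v1/chat/completions
```

If nothing is listening, the next step would be checking the VLM container's own logs for why its backend on 5015 failed to start, rather than looking at port forwarding.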