JPS v2 VLM Interface Service chat completions issue

The chat completions endpoint gives random and incorrect responses to the questions it is asked.
This was not a problem with the previous version of JPS (v1), where the language model was interactive and gave reliable answers.
Is this a bug in the updated version?
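
For context, I am querying the service's chat completions REST endpoint, roughly along the lines of the sketch below; the host, port, and exact path here are assumptions modeled on an OpenAI-style chat completions API, not details confirmed in this thread:

```python
import requests

# Hypothetical sketch: the host, port (5010), and endpoint path are
# assumptions modeled on an OpenAI-style chat completions API; check the
# JPS VLM documentation for the actual values.
url = "http://localhost:5010/v1/chat/completions"
payload = {
    "messages": [
        {"role": "user", "content": "What is the robot car doing?"}
    ]
}
resp = requests.post(url, json=payload, timeout=30)
print(resp.json())
```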

Can you share the whole video? I want to check whether the VLM output describes a scene that appears in a previous frame.

Sorry, I don’t have the full video at the moment, but the whole video was just a tiny robot car circling in place, which I streamed to the VLM service. So basically the scene in the screenshot is all that was shown in the video.

Hello again,
I am still facing the same issue. I tried streaming various videos, loading various AI models, and running on two different Orin boards, but the issue persists. Do you have any idea what is causing it? If it is a bug on the VLM side, could you ask for it to be fixed?
Here is a screenshot of the last output I got from the VLM chat. I have also attached the video input that I streamed to the VLM using nvstreamer:


I will give it a try on my side and report back later.


Please try adjusting “multi_frame_input” in main_config.json from the default value of 1 to 2. I verified it on my side; it works fine after changing the value to 2. For more details on the settings, please refer to: Visual Language Models (VLM) with Jetson Platform Services — Jetson Platform Services documentation
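
For reference, a minimal sketch of the edit in main_config.json; any other keys in the file are omitted here, since only the “multi_frame_input” field is confirmed in this thread:

```json
{
    "multi_frame_input": 2
}
```

The key name suggests it controls how many video frames are included per model input; per the reply above, the default of 1 produced the unreliable responses, and 2 resolved them.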

Thank you for the update. I adjusted the value, and it now seems to work okay.
