Having issues on VLM workflow chat output

Please provide complete information as applicable to your setup.

**• Hardware Platform:** Jetson Orin
**• JetPack Version:** 7

I have deployed the VLM workflow on my Jetson. I was able to add a stream and generate output, but the problem is that the output is always "…", whatever the message or the alert is. Even with chat messages it gives the same output. Here's a snapshot of the output:

I found that when I shorten the default alert prompt, I do get an answer for the alerts, but it becomes less accurate: it keeps returning {"r0": X} or similar, because the shortened prompt is less understandable. Chat messages, on the other hand, work normally.
My question is: is max_tokens set somewhere, and can I change it?
Also: is there a way to change the input message in the alerts? I want to change it from "…{rule …} answer with True or False" to just the question itself, so I can get continuous scene descriptions, for example.
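In case it helps frame the question: what I am effectively trying to do is send a free-form query with a larger output budget. A minimal sketch of that, assuming the VLM service exposes an OpenAI-style chat-completions endpoint (the URL, port, and parameter names here are my assumptions, not confirmed from the JPS docs):

```python
import json
import urllib.request

# Hypothetical endpoint; adjust to your actual deployment (assumption, not from JPS docs).
VLM_URL = "http://localhost:5010/v1/chat/completions"

def build_query(prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat payload; max_tokens caps the reply length."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,  # raise this if replies come back truncated
    }

def ask(prompt: str, max_tokens: int = 256) -> str:
    """POST a free-form question (e.g. 'Describe the scene') to the VLM service."""
    req = urllib.request.Request(
        VLM_URL,
        data=json.dumps(build_query(prompt, max_tokens)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

If the workflow exposes something like this, bypassing the alert template entirely and calling the chat endpoint in a loop would give the continuous scene description I'm after.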

Thanks in advance

Moving this to the JPS forum.

Can you check if this helps: Having issues on VLM workflow - #6 by sochoa?