Wrong inference results while testing YOLOv4 on DeepStream 5.0

Hi driver05,

Please open a new topic for your issue.
Thanks

You should check the [sink] section of your deepstream-app config.
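
In case it helps, here is a minimal [sink0] sketch along the lines of a deepstream-app config. The keys are standard DeepStream config keys, but the values below are assumptions you would adapt to your setup (e.g. type=2 for an on-screen EglSink, sync=0 so the sink does not throttle the pipeline):

[sink0]
enable=1
# 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=2
sync=0
source-id=0
gpu-id=0
nvbuf-memory-type=0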

Well, I use the 608_608_fp16 model but I can't reproduce your problem.
The problem I have is with explicit_batch_size > 1, as long as I don't convert the engine with the right batch size.
I only see this with the Python test 3 file, not with the deepstream-app test file, where it works…
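
For anyone hitting the explicit_batch_size > 1 case: the engine has to be built for the batch size you actually run with. A hedged sketch using trtexec (the ONNX filename, the input tensor name "input", and the shapes are assumptions; substitute the ones from your own YOLOv4 export):

trtexec --onnx=yolov4_-1_3_608_608_dynamic.onnx \
        --explicitBatch \
        --minShapes=input:1x3x608x608 \
        --optShapes=input:4x3x608x608 \
        --maxShapes=input:4x3x608x608 \
        --fp16 \
        --workspace=4096 \
        --saveEngine=yolov4_b4_fp16.engine

Then make sure the batch-size in the nvinfer config and the streammux batch-size match the batch size the engine was built for.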


Have you solved this problem?

Hi @ersheng, has the explicit_batch_size > 1 issue been solved? I'm seeing the same problem here.

I have the same problem. Have you found anything new that resolves it? Thanks.