Kafka does not receive any messages

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): GPU
• DeepStream Version: 7.1 (Docker container)
When I run python3 deepstream_test5.py -c test5_b16_dynamic_source.yaml -s source_list_dynamic.yaml in /opt/nvidia/deepstream/deepstream/service-maker/sources/apps/python/pipeline_api/deepstream_test5_app, Kafka receives the messages.

When I use my own code, nothing is sent to Kafka and there are no errors. The Kafka settings are the same as in the example above.

Does anyone know why? These are my settings:

payload-type: 1
msgconv_donfig_file: "/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test5/configs/dstest5_msgconv_sample_config.txt"
config: "/opt/nvidia/deepstream/deepstream/sources/libs/kafka_protocol_adaptor/cfg_kafka.txt"
proto_lib: "/opt/nvidia/deepstream/deepstream/lib/libnvds_kafka_proto.so"
conn_str: "localhost;9092"
topic: "test5app"
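For reference, a minimal sketch of how these values typically map onto nvmsgconv/nvmsgbroker element properties when the pipeline is built by hand with the GStreamer Python bindings. This is not the poster's actual main.py; the element variable names are placeholders.

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Converter: turns event metadata into schema payloads.
msgconv = Gst.ElementFactory.make("nvmsgconv", "msgconv")
msgconv.set_property(
    "config",
    "/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test5/configs/dstest5_msgconv_sample_config.txt")
msgconv.set_property("payload-type", 1)  # 1 = minimal schema

# Broker: pushes payloads to Kafka through the protocol adaptor library.
msgbroker = Gst.ElementFactory.make("nvmsgbroker", "msgbroker")
msgbroker.set_property(
    "proto-lib",
    "/opt/nvidia/deepstream/deepstream/lib/libnvds_kafka_proto.so")
msgbroker.set_property("conn-str", "localhost;9092")
msgbroker.set_property("topic", "test5app")
msgbroker.set_property(
    "config",
    "/opt/nvidia/deepstream/deepstream/sources/libs/kafka_protocol_adaptor/cfg_kafka.txt")
msgbroker.set_property("sync", False)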

Could you share the DeepStream log? I am wondering if there is any error information. Did the client succeed in connecting?

There are no logs. I start my code with GST_DEBUG=3 python3 main.py, and there are no error logs:

root@ps:/tmp/data/api# GST_DEBUG=3 python3 main.py 
2024-11-20 06:37:00.265 | INFO     | component.onnx_to_trt:process_onnx:77 - The engine file '/models/engine/yolov8x-trt-fp16-netsize-640-batch-16.engine' already exists and will be reused. Use --force to rebuild.
2024-11-20 06:37:00.272 | INFO     | component.onnx_to_trt:update_config_file:127 - PGIE Configuration file 'config/config_pgie_yolo_det.yaml' updated.
2024-11-20 06:37:00.275 | INFO     | component.pipeline:create_rtsp_server:36 - ***DeepStream: Launched RTSP Streaming at rtsp://localhost:8554//live***
2024-11-20 06:37:00.286 | INFO     | component.pipeline:create_pipeline_multiuri:161 - Creating Pipeline 
 
2024-11-20 06:37:00.286 | INFO     | component.pipeline:create_pipeline_multiuri:167 - Creating Source 
 
2024-11-20 06:37:00.343 | INFO     | component.pipeline:create_pipeline_multiuri:313 - Adding elements to Pipeline 

2024-11-20 06:37:00.343 | INFO     | component.pipeline:create_pipeline_multiuri:337 - Linking elements in the Pipeline 

2024-11-20 06:37:00.350 | INFO     | __main__:main:61 - Starting pipeline 

Failed to query video capabilities: Invalid argument
Civetweb version: v1.16
Server running at port: 9999
0:00:00.414270888  6208 0x560d0daea9d0 INFO                 nvinfer gstnvinfer.cpp:684:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:2092> [UID = 1]: deserialized trt engine from :/models/engine/yolov8x-trt-fp16-netsize-640-batch-16.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:327 [FullDims Engine Info]: layers num: 5
0   INPUT  kFLOAT images          3x640x640       min: 1x3x640x640     opt: 16x3x640x640    Max: 16x3x640x640    
1   OUTPUT kINT32 num_dets        1               min: 0               opt: 0               Max: 0               
2   OUTPUT kFLOAT det_boxes       100x4           min: 0               opt: 0               Max: 0               
3   OUTPUT kFLOAT det_scores      100             min: 0               opt: 0               Max: 0               
4   OUTPUT kINT32 det_classes     100             min: 0               opt: 0               Max: 0               

0:00:00.414337971  6208 0x560d0daea9d0 INFO                 nvinfer gstnvinfer.cpp:684:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2195> [UID = 1]: Use deserialized engine model: /models/engine/yolov8x-trt-fp16-netsize-640-batch-16.engine
0:00:00.418414483  6208 0x560d0daea9d0 INFO                 nvinfer gstnvinfer_impl.cpp:343:notifyLoadModelStatus:<primary-inference> [UID 1]: Load new model:config/config_pgie_yolo_det.yaml sucessfully
2024-11-20 06:37:00.697 | INFO     | __main__:main:66 - Starting loop 


**PERF:  {'stream0': 0.0, 'stream1': 0.0, 'stream2': 0.0, 'stream3': 0.0, 'stream4': 0.0, 'stream5': 0.0, 'stream6': 0.0, 'stream7': 0.0, 'stream8': 0.0, 'stream9': 0.0, 'stream10': 0.0, 'stream11': 0.0, 'stream12': 0.0, 'stream13': 0.0, 'stream14': 0.0, 'stream15': 0.0, 'stream16': 0.0, 'stream17': 0.0, 'stream18': 0.0, 'stream19': 0.0, 'stream20': 0.0, 'stream21': 0.0, 'stream22': 0.0, 'stream23': 0.0, 'stream24': 0.0, 'stream25': 0.0, 'stream26': 0.0, 'stream27': 0.0, 'stream28': 0.0, 'stream29': 0.0} 

uri:/api/v1/stream/add
method:POST
Warning: gst-stream-error-quark: No decoder available for type 'audio/mpeg, mpegversion=(int)4, framed=(boolean)true, stream-format=(string)raw, level=(string)2, base-profile=(string)lc, profile=(string)lc, codec_data=(buffer)119056e500, rate=(int)48000, channels=(int)2'. (6): ../gst/playback/gsturidecodebin.c(960): unknown_type_cb (): /GstPipeline:pipeline0/GstDsNvMultiUriBin:multi-uri/GstBin:multi-uri_creator/GstDsNvUriSrcBin:dsnvurisrcbin0/GstURIDecodeBin:nvurisrc_bin_src_elem
Failed to query video capabilities: Invalid argument
mimetype is video/x-raw

**PERF:  {'stream0': 33.35, 'stream1': 0.0, 'stream2': 0.0, 'stream3': 0.0, 'stream4': 0.0, 'stream5': 0.0, 'stream6': 0.0, 'stream7': 0.0, 'stream8': 0.0, 'stream9': 0.0, 'stream10': 0.0, 'stream11': 0.0, 'stream12': 0.0, 'stream13': 0.0, 'stream14': 0.0, 'stream15': 0.0, 'stream16': 0.0, 'stream17': 0.0, 'stream18': 0.0, 'stream19': 0.0, 'stream20': 0.0, 'stream21': 0.0, 'stream22': 0.0, 'stream23': 0.0, 'stream24': 0.0, 'stream25': 0.0, 'stream26': 0.0, 'stream27': 0.0, 'stream28': 0.0, 'stream29': 0.0} 
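Since GST_DEBUG=3 shows nothing from the messaging elements here, one option is to raise the debug threshold only for the message plugins. Below is a minimal sketch for the top of main.py; the category names "nvmsgconv" and "nvmsgbroker" are assumptions and may need adjusting.

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Per-category thresholds override the global GST_DEBUG level, so broker and
# converter messages become visible without flooding the log.
# If these category names do not match, a wildcard such as GST_DEBUG="*msg*:6"
# in the environment is an alternative.
Gst.debug_set_threshold_for_name("nvmsgconv", Gst.DebugLevel.DEBUG)
Gst.debug_set_threshold_for_name("nvmsgbroker", Gst.DebugLevel.DEBUG)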

Which configuration are you testing? I didn't find msgconv_donfig_file in deepstream_test5_app. Did you only modify the configurations?
Can you add a log in handle_metadata? Please check whether frame_meta.append(event_msg) is called.

In test5_b16_dynamic_source.yaml:

  - type: nvmsgconv
    name: msgconv
    properties:
      config: "/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test5/configs/dstest5_msgconv_sample_config.txt"
  

How do I add a log in handle_metadata with Python?

You can add a print before that code.

There is no handle_metadata in my code. Do you mean I should add an osd_sink_pad_buffer_probe function?

There is a handle_metadata in /opt/nvidia/deepstream/deepstream/service-maker/sources/apps/python/pipeline_api/deepstream_test5_app/deepstream_test5.py.

That example code is correct and runs without errors.

I mean I use the same config for msgconv and msgbroker,

but my code can't send to Kafka.

The default msg2p-newapi of nvmsgconv is 0, so nvmsgconv will generate the JSON string from the event meta. Please refer to my last three comments. Could you add a log to check whether the event meta is added?
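If the pipeline is built with the classic Python bindings (pyds) rather than the service-maker API, one way to add such a log is a buffer probe that counts NVDS_EVENT_MSG_META user meta per frame before nvmsgconv. Below is a minimal sketch; the element and pad names are placeholders for whatever your pipeline uses.

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import pyds

def event_meta_probe(pad, info, u_data):
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK

    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    if not batch_meta:
        return Gst.PadProbeReturn.OK

    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        count = 0
        # Walk the frame-level user meta and count event message entries.
        l_user = frame_meta.frame_user_meta_list
        while l_user is not None:
            user_meta = pyds.NvDsUserMeta.cast(l_user.data)
            if user_meta.base_meta.meta_type == pyds.NvDsMetaType.NVDS_EVENT_MSG_META:
                count += 1
            try:
                l_user = l_user.next
            except StopIteration:
                break
        print(f"frame {frame_meta.frame_num}: {count} event msg meta attached")
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK

# Attach it, for example, to the nvmsgconv sink pad:
# msgconv.get_static_pad("sink").add_probe(Gst.PadProbeType.BUFFER, event_meta_probe, 0)

If the count stays at 0, the event meta is never attached and nvmsgconv has nothing to convert, which would explain why nothing reaches Kafka.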

Very sorry, this may be a network problem!

Testing on another computer works fine.

Could you help me with my other post?

Sure. Glad to know you fixed it, thanks for the update! If you need further support, please open a new topic. Thanks.

