How to use nvds_logger to add custom logs into DeepstreamApp?

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
Jetson Orin NX 16GB

• DeepStream Version
6.3

• JetPack Version (valid for Jetson only)
5.1.3

• TensorRT Version
8.5.2-1+cuda11.4

• Issue Type( questions, new requirements, bugs)
I’m developing some custom applications based on DeepStreamApp and need to insert logs into the logic. I noticed that there’s an nvds_logger plugin, which is documented in DeepStream 4.0:
https://docs.nvidia.com/metropolis/deepstream/4.0/dev-guide/DeepStream_Development_Guide/baggage/group__ee__logging__group.html#gab89180e74f2ffe5d07367b3036020e64

It’s still present in DeepStream 6.3 under /opt/nvidia/deepstream/deepstream/tools/, but I haven’t found any code examples on how to use it.

I’ve looked at the deepstream_test3 sample, but it creates a pipeline from scratch. I’m unable to understand how to use this example to integrate logging with DeepStreamApp.

Could you please provide some guidance or code examples on how to use the nvds_logger plugin for logging in DeepStreamApp?

1. The nvds_logger plugin processes metadata attached by upstream elements, or pipeline events, to output the corresponding logs.

2. NVIDIA DeepStream SDK API Reference: Logging API.
The nvds_log function is a thin wrapper around syslog and can output logs to a specified location.

You can see the sample code in
/opt/nvidia/deepstream/deepstream/sources/libs/kafka_protocol_adaptor/nvds_kafka_proto.cpp

That said, I suspect the above methods may not match your requirements. What kind of logs do you want to output from the DeepStream app?


Thanks for showing me the example with the Kafka broker.

For now, I’m looking for a logger similar to spdlog in C++ or logging in Python.

I’ve added nvds_logger to my makefile, and now I can include nvds_logger.h in DeepStreamApp. My application is a customization of the DeepStreamApp logic, implemented in the process_meta function.

Here’s how it works, basically:

void process_meta (GstBuffer * buf, AppCtx * appCtx, NvDsBatchMeta * batch_meta) {

  // For single source, always display text either with demuxer or with tiler
  if (!appCtx->config.tiled_display_config.enable || appCtx->config.num_source_sub_bins == 1) {
    appCtx->show_bbox_text = 1;
  }

  // __________ For each frame in the batch __________
  static int log_i = 0;  // frame counter (declaration added here for the sketch)
  for (NvDsMetaList * l_frame = batch_meta->frame_meta_list; l_frame != NULL; l_frame = l_frame->next) {

    nvds_log_open();
    g_print("Processing frame %i\n", log_i);
    nvds_log("LOGIC", LOG_ERR, "Processing frame %i\n", log_i);
    log_i++;
    nvds_log_close();
    ...

I’ve run the ./setup_nvds_logger.sh script before running my code, specifying level 7 (debug) mode and a custom path. The log folder is created, but during my app’s inference, no log file is generated.

Do you have any advice on how to properly use the plugin?

The log message must contain a TAG, such as DSLOG. If you want to modify the TAG, refer to the following command line in the setup_nvds_logger.sh script.

echo ":msg, contains, \"DSLOG\" ~"  >> 11-nvds.conf

kafka_client.h also includes this TAG:

#define NVDS_KAFKA_LOG_CAT "DSLOG:NVDS_KAFKA_PROTO"

Thank you so much for your help!

I was able to integrate the nvds_logger plugin into my DeepStreamApp as you suggested, and it’s working perfectly. The guidance you provided made the implementation process smooth, and now the logs are being captured exactly as needed.

I appreciate the support. Thanks again!

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.