Create a new Docker image with modified nvmsgconv

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): dGPU
• DeepStream Version: 6.2 Docker image

Hi Everyone,
So I am trying to build a new Docker image using the 6.2-devel base image. I have modified nvmsgconv, and in the Dockerfile I replace the files and build a new ‘libnvds_msgconv.so’. The Docker image builds successfully, but when I try to use the plugin in the test5 app I get the following error:

/libnvds_msgconv.so: undefined symbol: generate_dsmeta_message_protobuf
** ERROR: <create_msg_conv_broker_bin:251>: Failed to create ‘sink_sub_bin_transform1’
** ERROR: <create_msg_conv_broker_bin:291>: create_msg_conv_broker_bin failed
** ERROR: <create_sink_bin:828>: create_sink_bin failed
** ERROR: <create_processing_instance:884>: create_processing_instance failed
** ERROR: <create_pipeline:1485>: create_pipeline failed
** ERROR: main:1472: Failed to create pipeline
Quitting
Attaching the Dockerfile here.
Dockerfile.txt (565 Bytes)
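
In short, the build stage of my Dockerfile does roughly the equivalent of the commands below (a sketch only; the exact steps are in the attached Dockerfile, and the locations shown follow the standard DeepStream container layout):

# Copy the modified sources over the stock ones (locations are illustrative)
cp dsmeta_payload.cpp /opt/nvidia/deepstream/deepstream/sources/libs/nvmsgconv/deepstream_schema/
cp nvdsmeta.h /opt/nvidia/deepstream/deepstream/sources/includes/

# Rebuild and install the message converter library
cd /opt/nvidia/deepstream/deepstream/sources/libs/nvmsgconv
make
make install   # copies the rebuilt libnvds_msgconv.so into the DeepStream lib directory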

Did you delete the generate_dsmeta_message_protobuf function in dsmeta_payload.cpp?

If you modify libnvds_msgconv.so, you must rebuild the deepstream-test5-app application.
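
For example, something like the following, assuming the default sample-app location and CUDA 11.8 for DeepStream 6.2 on dGPU (adjust CUDA_VER to your setup):

cd /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test5
make CUDA_VER=11.8 clean
make CUDA_VER=11.8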

Hi There,
Thanks for the reply. No, I didn’t remove any function; I just added some code to the generate_dsmeta_message_minimal function, and after that I rebuilt the test5 application.
But when I run the deepstream-6.2-devel Docker image, mount the two files above, and run make inside the container, it works fine.
I am not sure why this issue comes up when I build it from the Dockerfile.

1. When you run the following command, do you get any output?

readelf -s -W  /opt/nvidia/deepstream/deepstream-6.3/lib/libnvds_msgconv.so |grep generate_dsmeta_message_protobuf

What you mean is that it is OK to compile nvmsgconv in the container's bash and then run test5, but an error is reported when you build with docker build -f Dockerfile, right?

Yes. When I do it through the container's bash it works fine, but when using docker build it causes the errors.

For readelf I got the following output:
139: 0000000000000000 0 NOTYPE GLOBAL DEFAULT UND generate_dsmeta_message_protobuf
2483: 0000000000000000 0 NOTYPE GLOBAL DEFAULT UND generate_dsmeta_message_protobuf

  1. How did you run test-5 during docker build? Can you share the command line?

  2. Can you share dsmeta_payload.cpp and nvdsmeta.h?
    I want to try to reproduce your problem.

I think this problem may be related to some quirks of Docker.

For the test5 app, I run the Docker image in interactive mode and build the test5 app inside it. Sure, I have attached all three files: nvdsmeta.h, dsmeta_payload.cpp, and the test5 app main file (renamed to app.c).
dsmeta_payload.txt (16.5 KB)
nvdsmeta.txt (40.6 KB)
app.txt (53.4 KB)

My flow of work (roughly the commands sketched below):

  1. Build the Docker image from the Dockerfile
  2. Run the image in interactive mode
  3. Build the test5 app with make (Kafka is already up and running with the topic created)
  4. Run the app
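
Concretely, something like this (image name, CUDA version, and config file are illustrative; adjust to your setup):

# 1. Build the image from the Dockerfile
docker build -f Dockerfile -t my-ds62-msgconv .

# 2. Run it in interactive mode
docker run --gpus all -it --rm --net=host my-ds62-msgconv bash

# 3. Inside the container: build the test5 app (Kafka broker and topic already running)
cd /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test5
make CUDA_VER=11.8

# 4. Run the app with a config that enables the msgconv/msgbroker sink
./deepstream-test5-app -c configs/test5_config_file_src_infer.txt
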
  1. You did delete the generate_dsmeta_message_protobuf function in dsmeta_payload.cpp

  2. Because -Wl,-no-undefined is missing from CFLAGS, there is no error during compilation.

  3. In addition, when you run it in the container's bash, libnvdsgst_msgconv.so may already be on the GStreamer blacklist. You can check with
    gst-inspect-1.0 -b
    or run the install.sh in /opt/nvidia/deepstream/deepstream to clean the cache; the issue will then reappear.
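
For example (the registry cache path below is GStreamer's default; adjust if yours differs):

# List any blacklisted plugins
gst-inspect-1.0 -b

# Clear the GStreamer registry cache so plugins are re-scanned on the next run
rm -rf ~/.cache/gstreamer-1.0/

# Or re-run DeepStream's install script to refresh the setup
cd /opt/nvidia/deepstream/deepstream && ./install.sh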

You can modify the Makefile in /opt/nvidia/deepstream/deepstream/sources/libs/nvmsgconv as follows:

CFLAGS:= -Wall -std=c++11 -shared -fPIC -Wl,-no-undefined
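
Then rebuild the library. With -Wl,-no-undefined, a reference to a function that no longer exists fails at link time instead of only showing up when the app loads the .so (a sketch, assuming the default source location):

cd /opt/nvidia/deepstream/deepstream/sources/libs/nvmsgconv
make clean
make
# A missing generate_dsmeta_message_protobuf now fails the build with an
# "undefined reference" link error, rather than surfacing later as an
# "undefined symbol" error when deepstream-test5-app loads libnvds_msgconv.so.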

There has been no update from you for a while, so we are assuming this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one. Thanks.

Please restore the generate_dsmeta_message_protobuf function in the code, and then implement your own changes in generate_dsmeta_message_minimal.
