Thanks for the help! I double-checked the existence of that file and the path of PROTOCOL_ADAPTOR_LIB; both seem to be correct. Here is the output of the macro in the C file and the output of the command:
Have you tried to implement this on a Tesla analytics server and a Jetson perception server? I am using JetPack 4.2.1 with DeepStream 4.0 on Jetson.
Would it be possible for you to implement this with the same setup and let me know which software version is able to run test4? I am thinking it could also be a version compatibility problem. Thanks again for your support!
Hmm, so you can find the Kafka library:
nvidia@Nvidia:~$ ll /usr/lib/aarch64-linux-gnu/tegra/libnvds_kafka_proto.so
-rwxr-xr-x 1 root root 35224 Jul 10 15:55 /usr/lib/aarch64-linux-gnu/tegra/libnvds_kafka_proto.so*
I have no idea why it cannot open it.
Can you follow the README provided with the test4 sample app step by step and try again?
You are using the DeepStream 4.0 EA release, right? Also, please run uname -a and paste the output here.
If you are using the latest JetPack 4.2.1 GA release, then the DS 4.0 EA release package is likely not to work.
Yes, I am using DS 4.0 EA with JetPack 4.2.1. I was instructed that JetPack 4.2.1 was the version to run DS 4.0… so which JetPack version should I use?
Thank you! For some reason, running ZooKeeper and Kafka from the docker-compose file in the analytics server directory is necessary. I used my own versions of Kafka and ZooKeeper before, and that was the problem. Now I have another question: how can I see the messages that sample test4 sends to the analytics server? I ran kafka-console-consumer.sh in the Kafka Docker container, but I can't see any received messages there.
In the smart parking application, the information is formed based on nv-schema.json. However, test4 sends a different set of messages (person/vehicle). Thus, if you look at Kibana, all of the messages are about vehicles. I tried to modify nv-schema.json to include person and bicycle properties, but that doesn't fix anything. Is there any way to see the messages directly in the Kafka container?
By default, test4 sends a message for the first object in every 30th frame, so most likely the first object in the frame will be a vehicle. You can use a different stream with no (or fewer) vehicles, if possible.
The parking application and test4 both use the same schema to form messages, but how the metadata is generated differs. In the parking app, the spot and aisle components provide the metadata and the schema payload is generated from it; in test4, the app itself generates the metadata to demonstrate the use case.
Thank you for clarifying this! I did find the schema structure for the test4 app: internally it is the NvDsEventMsgMeta struct. I found that all of the variables in the struct were set, but they are not all sent as part of the message (at least in Kibana, some variables are not shown). So, which part of the code in test4.c determines what information is included?
Yes, I noticed this function, but some variables set there are not shown in Kibana. For example, the variable "objClassId" indicates whether the object is a vehicle or a person. This is set in test4.c, but it does not appear as a field in Kibana. Is this something I can set in the analytics server?