I am learning Deepstream360d_Smart_Parking. In the calibration settings of the perception server there are two config files, nvaisle_2M.csv and nvspot_2M.csv, and both .csv files contain a lot of settings I do not understand.
In the DeepStream 3.0 Analytics Application PDF, section 12.5 AISLE GROUP, it mentions calibration-file=./csv_files/nvaisle_2M.csv. I have some questions regarding the nvaisle_2M.csv file.
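For reference, this is how the line sits in my analytics config (just a minimal sketch; the [aisle-group] section name is my reading of section 12.5, and only the calibration-file line is taken from the PDF):

    [aisle-group]
    calibration-file=./csv_files/nvaisle_2M.csv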
Q1: In config_dewarper.txt we already set the dewarped surface parameters such as width, height, top-angle, bottom-angle, pitch, yaw=0, roll=278, and focal-length. Why do we need to repeat these settings in nvaisle_2M.csv?
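To make Q1 concrete, this is the kind of per-surface block I mean in config_dewarper.txt (the keys are the ones I listed above; the [surface0] section name and the "..." placeholder values are only for illustration):

    [surface0]
    width=...
    height=...
    top-angle=...
    bottom-angle=...
    pitch=...
    yaw=0
    roll=278
    focal-length=...

My question is why the same calibration numbers have to appear again per camera row in nvaisle_2M.csv.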
Q2: What is the meaning of the entry_ROI property? What is the difference between ROI_x0, ROI_y0 and cx0, cy0?
Q3: If I would like to draw an ROI on the dewarped video, which properties are mandatory?
Q4: Must we use deepstream-360d-app? Can we run the video with deepstream-app using the ROI configuration in the .csv file?
Q5: Is there any video example with ROI configuration, similar to deepstream-test4-app?
Q6: What is the relationship between camera-id in [source0] of source2_file.txt and serial in nvaisle_2M.csv? I changed both numbers from 6 to 11 and the video analytics still work. But when I changed both numbers to 26, a dark window appeared saying "no camera entry". How can I solve this?
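To illustrate Q6, this is the pair of changes I made (only the keys I mentioned are shown; the rest of each file is omitted, and the CSV column layout here is just a sketch):

    # source2_file.txt
    [source0]
    camera-id=11

    # nvaisle_2M.csv (serial column only, other columns omitted)
    serial,...
    11,...

With 11 in both places the pipeline runs; with 26 in both places I get the dark window and the "no camera entry" message.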
Q7: On page 42 of the DeepStream 3.0 Analytics Application PDF, there are two speeds in the object element. What is the difference between these two speeds? How do we change the unit from mph to km/h? Is the vehicle speed calculation done in the perception server, since it appears in the metadata, or should it be done in the analytics server?
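For the unit part of Q7, I only mean the arithmetic below; what I am unsure about is where in the pipeline it should be applied (perception or analytics side):

    /* Plain C sketch, assuming the metadata speed really is in mph. */
    #include <stdio.h>

    int main(void)
    {
        double speed_mph  = 30.0;                    /* example value */
        double speed_kmph = speed_mph * 1.609344;    /* 1 mph = 1.609344 km/h */
        printf("%.1f mph = %.1f km/h\n", speed_mph, speed_kmph);
        return 0;
    }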
Q8: How can we save the payload in JSON format on our computer? Can we save the payload JSON before sending it to Kafka?
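For Q8, what I have in mind is something like the sketch below: dumping the JSON payload buffer to a local file right before it is handed to the Kafka producer. The function name and the call site are hypothetical, not DeepStream APIs:

    #include <stdio.h>

    /* Write a JSON payload buffer to a file; returns 0 on success. */
    static int save_payload_json(const char *path, const void *payload, size_t size)
    {
        FILE *fp = fopen(path, "wb");
        if (!fp)
            return -1;
        size_t written = fwrite(payload, 1, size, fp);
        fclose(fp);
        return (written == size) ? 0 : -1;
    }

    /* Hypothetical call, e.g. just before the message is published:
     *   save_payload_json("/tmp/payload_0001.json", payload_buf, payload_len);
     */

Is something like this the recommended way, or is there a built-in option to write the payload to disk?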