Please provide the following info (check/uncheck the boxes after creating this topic):
Software Version
DRIVE OS Linux 5.2.6
DRIVE OS Linux 5.2.6 and DriveWorks 4.0
DRIVE OS Linux 5.2.0
DRIVE OS Linux 5.2.0 and DriveWorks 3.5
NVIDIA DRIVE™ Software 10.0 (Linux)
NVIDIA DRIVE™ Software 9.0 (Linux)
other DRIVE OS version
other
Target Operating System
Linux
QNX
other
Hardware Platform
NVIDIA DRIVE™ AGX Xavier DevKit (E3550)
NVIDIA DRIVE™ AGX Pegasus DevKit (E3550)
other
SDK Manager Version
1.7.0.8846
other
Host Machine Version
native Ubuntu 18.04
other
Hi Team,
I am trying to perform preprocessing using the dwDataConditioner APIs. These are the steps being executed:
- Read image frames using OpenCV
- Loaded the image data into a dwImageCUDA instance in device memory
- Set the dataConditionerParams with the mean and standard-deviation values
- Invoked the dwDataConditioner_prepareDataRaw method accordingly
- Logged the input and output pixel data for the reference frame
It seems that the outputs are not accurate per the DW API calculation, i.e. the results should be computed using the following formula:
- meanImage is optional and is 0 if it is not set
- perPlaneMeanX is the per-plane mean for that channel; it is also optional and 0 if not set
- Normalization formula for RGB:
R’ = ((R - meanValue[0] - meanImage[pixelIndex] - perPlaneMeanR) / stdev[0]) * scaleCoefficient
G’ = ((G - meanValue[1] - meanImage[pixelIndex] - perPlaneMeanG) / stdev[1]) * scaleCoefficient
B’ = ((B - meanValue[2] - meanImage[pixelIndex] - perPlaneMeanB) / stdev[2]) * scaleCoefficient
Ref. URL - DataConditioner Interface
Please find the code snippet from my experiment attached below. Also, please let me know whether my approach to handling the image parameters for the DataConditioner is correct. The normalized output obtained from the DataConditioner seems to be 0 for the targeted pixels.
code_snip_dataconditioner.txt (2.6 KB)