Please provide the following info (check/uncheck the boxes after creating this topic):
Software Version
DRIVE OS Linux 5.2.6
DRIVE OS Linux 5.2.6 and DriveWorks 4.0
DRIVE OS Linux 5.2.0
DRIVE OS Linux 5.2.0 and DriveWorks 3.5
NVIDIA DRIVE™ Software 10.0 (Linux)
NVIDIA DRIVE™ Software 9.0 (Linux)
other DRIVE OS version
other
Target Operating System
Linux
QNX
other
Hardware Platform
NVIDIA DRIVE™ AGX Xavier DevKit (E3550)
NVIDIA DRIVE™ AGX Pegasus DevKit (E3550)
other
SDK Manager Version
1.9.1.10844
other
Host Machine Version
native Ubuntu 18.04
other
Hi,
recently we implemented a driver for our sensor based on the nvmimg_cc sample app. We are successfully receiving data, but it is not what we expect.
Our sensor outputs the RAW14 data type. This type is not supported by default in the sample app, but after patching it we managed to get data from the sensor.
Problem:
We expected to receive the data as follows:
Bit 15 and Bit 14: don't care, should be set to zero
Bit 13 to Bit 0: sensor data
What we actually receive is:
our data occupies Bit 15 down to Bit 2, and Bit 1 and Bit 0 are copies of Bit 15 and Bit 14.
Example:
expected data (Bit 15 … Bit 0):
0001 0010 0110 1111
received data (Bit 15 … Bit 0):
0100 1001 1011 1101
The received data appears to be our expected data shifted left by 2, with Bit 15 and Bit 14 additionally copied into Bit 1 and Bit 0; the sketch below illustrates this.
We have searched the documentation for hints about this, but we couldn't find anything. Has anyone faced such an issue?
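For clarity, here is a minimal sketch in plain C of the transformation we believe is being applied, using the values from the example above. The layout (14-bit sample MSB-aligned into a 16-bit word, with the two MSBs of the sample replicated into the padding LSBs) and the recovery step are our assumption, not something confirmed by the documentation:

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* 14-bit sample from the example above (expected layout, Bits 13..0) */
    uint16_t expected = 0x126F;   /* 0001 0010 0110 1111 */

    /* Assumed transform: shift left by 2 and replicate the top two
     * bits of the 14-bit sample into Bit 1 and Bit 0 */
    uint16_t received = (uint16_t)((expected << 2) | (expected >> 12));

    printf("received  = 0x%04X\n", received);   /* 0x49BD = 0100 1001 1011 1101 */

    /* If that assumption holds, recovering the sample is a plain shift */
    uint16_t recovered = received >> 2;

    printf("recovered = 0x%04X\n", recovered);  /* 0x126F again */
    return 0;
}
```

Shifting the received word right by 2 does give us back our expected values in this example, but we would like to understand whether this packing is intended behavior or a misconfiguration on our side.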
Thank you in advance for your support,