Sensor fusion


There is no doubt that the PX2 can capture and record data from different sensors synchronously, and that many functions can be built on that data with DriveWorks. A developer can combine different sensors to implement such functions.
Is this capability of the PX2 what is called sensor fusion? What is the definition of sensor fusion for the PX2?
Is there any document that covers sensor fusion in detail?

The PX2 supports sensor data capture and recording, but that cannot be called sensor fusion. DriveWorks is just middleware; you can implement sensor fusion on top of it, so there is still a lot of work for you to do.

Besides, there is no real synchronization of the acquisition. The cameras cannot be triggered by a common signal, and the timestamp reference appears to be shared only among cameras connected to the same group.
There are no algorithms to match or synchronize data captured from different sensors running at different frequencies; that task is left to the user of the PX2.
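Since matching streams at different rates is left to the user, one common approach is nearest-timestamp association: for each frame of the faster sensor, pick the sample of the slower sensor closest in time. Below is a minimal, self-contained sketch of that idea (the function name and the use of plain microsecond timestamps are my own assumptions, not a DriveWorks API):

```cpp
#include <cassert>
#include <cstdint>
#include <cstdlib>
#include <vector>

// Hypothetical sketch: for each camera frame timestamp, find the index of the
// lidar (or other sensor) sample closest in time. Both vectors are assumed to
// be sorted ascending, as recorded data normally is, so a single forward pass
// over both streams suffices (O(n + m)).
std::vector<size_t> matchNearest(const std::vector<int64_t>& frameTsUs,
                                 const std::vector<int64_t>& sampleTsUs)
{
    std::vector<size_t> match;
    match.reserve(frameTsUs.size());
    size_t j = 0;
    for (int64_t t : frameTsUs) {
        // Advance while the next sample is at least as close to t as the
        // current one; stop at the local minimum of |sample - t|.
        while (j + 1 < sampleTsUs.size() &&
               std::llabs(sampleTsUs[j + 1] - t) <= std::llabs(sampleTsUs[j] - t)) {
            ++j;
        }
        match.push_back(j);
    }
    return match;
}
```

For example, a 30 Hz camera matched against a 10 Hz lidar would associate roughly three consecutive frames with each lidar sweep; interpolation between neighboring samples is the natural refinement if that granularity is too coarse.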

Right, there is too much work to do. Which company do you work for?

Hi, zhuhaijun and jyebes, thanks very much for your reply.

I referred to these documents:
./NVIDIA_DriveWorks_References_0.3/nvdwx_html/dwx_gui_recording_tool.html and ./NVIDIA_DriveWorks_References_0.3/nvdwx_html/dwx_recording_library.html.

It seems that the recording tool can record data from different sensors synchronously.
At least the recorded data contain timestamp information, which makes it easier for users to develop synchronization algorithms.

@jyebes, do you face the same difficulties with synchronization?
@zhuhaijun, after communicating with the NVIDIA engineers, I agree that much of the sensor fusion work is left to the user.
Have you already implemented some kind of fusion?


If you connect 4 GMSL cameras to one port block, they share the same timeline and appear to be in sync. It is not clear whether they are actually triggered at the same time.
With more cameras on the system, connected to the other GMSL blocks, the timestamps differ: cameras are in sync if they belong to the same group, but between blocks/groups there is a difference of a few tens of milliseconds (20-70 ms). Can this be considered in sync? Are they even triggered at the same time? There is not much control over this; I guess it is simply imposed by the system design.
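If the 20-70 ms gap between groups is roughly constant for a given run, it can at least be estimated offline and subtracted before matching frames across groups. A minimal sketch, assuming microsecond timestamps and roughly frame-aligned sequences from the two groups (the function name is mine, not a DriveWorks API):

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Hypothetical sketch: estimate a constant timestamp offset between two camera
// groups from the per-index differences of their frame timestamps. The median
// is used instead of the mean so that occasional dropped frames, which shift a
// few differences by a whole frame period, do not skew the estimate.
int64_t estimateGroupOffsetUs(const std::vector<int64_t>& groupA,
                              const std::vector<int64_t>& groupB)
{
    const size_t n = std::min(groupA.size(), groupB.size());
    if (n == 0)
        return 0; // no overlap, nothing to estimate
    std::vector<int64_t> diffs(n);
    for (size_t i = 0; i < n; ++i)
        diffs[i] = groupB[i] - groupA[i];
    // Partial sort is enough to place the median element at position n/2.
    std::nth_element(diffs.begin(), diffs.begin() + n / 2, diffs.end());
    return diffs[n / 2];
}
```

The returned offset can then be subtracted from one group's timestamps before cross-group association; it does not fix trigger skew, but it makes the two timelines comparable.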

Besides, when playing back the data there are also differences of many milliseconds between lidar packets and camera frames. Is this due to storage bandwidth, or inherent in how the Drive PX2 receives the data?

This is my experience so far.