DriveWorks provides a tool for sensor logging, including a Camera Server Daemon process that can broadcast camera frames to a slave app over TCP.
Does anyone know exactly how it uses the TCP protocol to transfer camera frames?
Thanks for the reply. I checked the ImageStreamer API and the cross-process image streaming sample in the documentation. If I want to implement my own application that uses image streaming to broadcast camera frames, how can I specify the port, etc.?
For example, if the receiving side is not an NVIDIA slave process but just another socket application listening for the broadcast images, how can I use ImageStreamer to set up a server and establish the connection?
Dear zijian.han,
EGLStream supports cross-process communication, and our ImageStreamer APIs are wrappers over EGLStream. If you are asking about socket programming, you can modify our inter-process communication sample to read camera frames using the DW APIs and send the data array to the slave process over a socket. The ImageStreamer APIs do not use a port number.
Regarding the socket programming: did you mean the socket send function can send the array directly, or should I use a loop to send each element of the image array? And if I am recording multiple cameras, I also need to handle time synchronization when sending the images, right?
Did you review the "Recording Sensor Data" section in the DriveWorks documentation?
On your host PC or DRIVE AGX: file:///usr/local/driveworks-2.0/doc/nvsdk_html/dwx_recording_devguide_group.html
I believe the DW recording tool already supports data acquisition from multiple sensors, including timestamps. Please refer to the doc. Thanks.