Getting network images in DriveWorks

Hi developers,

I am sending images in JPEG format over a TCP socket from a server written in a Python script (currently webcam frames as a starting point, later simulation frames); my current code is attached as main.cpp (11.8 KB) and tcpServer.py (788 Bytes). I want to receive the images in DriveWorks using the camera sensor with the “data.socket” protocol. I am referring to the files in “Connecting camera.nvidia-ip - #6 by shayNV”, but I do not know how to handle the received data and properly assign the image values to the image pointer in main.cpp (lines 151-208).
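For context, the sender side is roughly along the lines of the sketch below (a simplified sketch, not the exact tcpServer.py; the port number, the 4-byte length prefix framing, and the use of OpenCV for the webcam are assumptions for illustration). Each JPEG frame is sent with a length prefix so the receiver knows where one image ends and the next begins:

```python
# Minimal sketch of a JPEG-over-TCP sender (not the exact tcpServer.py).
# Assumes OpenCV is available and a webcam on device 0; port 49252 is arbitrary.
import socket
import struct

import cv2

HOST, PORT = "0.0.0.0", 49252

def main():
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((HOST, PORT))
    server.listen(1)
    conn, addr = server.accept()
    print("client connected:", addr)

    cap = cv2.VideoCapture(0)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            ok, jpeg = cv2.imencode(".jpg", frame)
            if not ok:
                continue
            payload = jpeg.tobytes()
            # 4-byte big-endian length prefix, then the JPEG bytes.
            conn.sendall(struct.pack("!I", len(payload)) + payload)
    finally:
        cap.release()
        conn.close()
        server.close()

if __name__ == "__main__":
    main()
```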

I would appreciate any assistance with receiving and displaying the image correctly in my code. I am not an experienced programmer and am currently stuck with my project.

Dear @aa.padmashali,
Most of the code is already provided in the template; the remaining parts are expected to be filled in according to your use case. I recommend looking at the API calls to understand the code flow and usage. Please let us know if you face issues with your code after incorporating your changes.

Thanks for your reply @SivaRamaKrishnaNV. I was able to get the image to render on screen, but the colours are mismatched (screenshot attached).

The left view is the NVIDIA sample and the right view is from the simulator.
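One possible cause I am looking into is a channel-order mismatch, e.g. the simulator delivering BGRA while the receiver interprets the data as RGB. The sketch below is only a guess at a fix on the sender side; the assumption that the frame arrives as raw BGRA bytes and the use of OpenCV for the conversion are mine, not confirmed:

```python
# Sketch: reorder channels before sending, assuming the simulator frame is raw
# BGRA bytes (as CARLA camera raw_data is) and the receiver expects RGB.
import numpy as np
import cv2

def to_rgb(bgra_bytes: bytes, height: int, width: int) -> np.ndarray:
    """Reinterpret raw BGRA bytes as an image and reorder channels to RGB."""
    frame = np.frombuffer(bgra_bytes, dtype=np.uint8).reshape(height, width, 4)
    return cv2.cvtColor(frame, cv2.COLOR_BGRA2RGB)
```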

Also, is there any similar sample code for receiving raw XYZ points of a point cloud (lidar) over a socket from the simulator? Please point me in the right direction :)
Thank you

Dear @aa.padmashali ,
Does that mean you want the AGX to read simulated point cloud data via Ethernet?

Dear @SivaRamaKrishnaNV ,
I want to build a testing platform using the CARLA simulator, where I receive sensor data such as camera and lidar as byte arrays over sockets. I have the camera working smoothly, but I am not able to apply the same principle to the lidar. Could you share any sample template code (for lidar) that I could use as a starting point?
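To make the question concrete, the sender on the simulator side would look roughly like the sketch below (a simplified illustration, not my actual CARLA callback code; the point-count prefix, the float32 x/y/z/intensity packing, and the port are assumptions). What I am missing is the receiver-side template in DriveWorks for this kind of stream:

```python
# Sketch of a lidar-over-TCP sender from the simulator side (illustration only).
# Each sweep is sent as a 4-byte point count followed by N * 4 float32 values
# (x, y, z, intensity). Port and framing are arbitrary choices for this sketch.
import socket
import struct

import numpy as np

HOST, PORT = "0.0.0.0", 49253

def send_sweep(conn: socket.socket, points: np.ndarray) -> None:
    """points: (N, 4) float32 array of x, y, z, intensity."""
    data = points.astype(np.float32).tobytes()
    conn.sendall(struct.pack("!I", points.shape[0]) + data)

def main():
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((HOST, PORT))
    server.listen(1)
    conn, _ = server.accept()
    # Placeholder sweep: in practice this would come from the lidar callback.
    dummy = np.random.rand(1000, 4).astype(np.float32)
    send_sweep(conn, dummy)
    conn.close()
    server.close()

if __name__ == "__main__":
    main()
```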