Sample recognition result and CAN

We have confirmed that recognition works using the camera and the sample, but where are the recognition results and values displayed?

Also, when using the recognition results to control a vehicle, how should CAN input and output be handled? If there is an example system configuration I could use as a reference, I would appreciate it.

Dear s18,
We have provided various samples in /usr/local/driveworks/samples/… For your topic, could you check the CAN bus samples (/usr/local/driveworks/samples/src/sensors/canbus/)?

I’m sorry.

  1. I do not know what to look at in /usr/local/driveworks/samples/. For example, where are the values (results) displayed when running the camera sample with sudo ./sample_drivenetNcameras --input-type=camera?

  2. Excuse me. As with 1, I do not know what to look at in /usr/local/driveworks/samples/src/sensors/canbus/.

As this is my first time working with such a system, it would be helpful if you could explain in a little more detail.

Dear s18,
For the sample_drivenetNcameras sample, you will see the detections in the window itself.
Could you please check the DW documentation (/usr/local/driveworks-2.0/doc/nvsdk_html/index.html) for details about the samples and the DW APIs.
/usr/local/driveworks-2.0/samples/src/sensors/canbus contains samples that show how to read and interpret CAN bus data.

Is the result of sample_drivenetNcameras output when it is stopped?

Dear s18,
It is not clear what you are asking for. If you want to test the sample_drivenetNcameras sample with a live camera, you are expected to test it in on-road conditions. Otherwise, you can feed a recorded video as input to it. Could you please take a look at the DW documentation and the sample source code and let us know which part is not clear.

I will investigate a little more.

I was not able to express my question well in English.

I want to configure the system as shown in the attached document.
For example, if recognition processing is performed using sample_drivenetNcameras on Xavier, and a Simulink control model is used to run experiments with a car or a simulator, can the recognition results be used to move the car or the simulator?

First of all, if Xavier is used, is MicroAutoBox still needed even when using a Simulink control model?

Can someone give me an answer?

Dear s18,
It looks like you are asking for something similar to DRIVE Constellation. Could you please check https://www.nvidia.com/en-us/self-driving-cars/drive-constellation/

It is difficult to purchase a new product …
I would like to build a hardware-in-the-loop and vehicle-in-the-loop setup using Xavier: Xavier handles the recognition processing, the control model created with Matlab/Simulink runs on MicroAutoBox, and the simulation is done with IPG CarMaker. At the moment I can manage to run the Xavier camera samples, but Xavier is not being fully utilized because I do not yet understand the basic usage.
I am a beginner who has never written a C program or built such a system, so I do not know where to start. How can I make good use of Xavier?