Streaming video outside Isaac

I have set up a robot that moves to a specified location, just like in the provided webinar. Next, I added a camera by editing the app.json file. Everything works, and I can see what the camera sees in Isaac Sight. The simulation also runs fine. Next, I would like to stream the video from the robot's camera and do some image processing and machine learning outside Isaac. Essentially, I would like to record the video frame by frame somewhere on disk and then open it with another program. I have tried looking into the tutorials/opencv_edge_detection example, copying and pasting its code into my project to view the camera feed with OpenCV, but with no luck. Any help is greatly appreciated. I can also provide the project files.

https://docs.nvidia.com/isaac/isaac/packages/sight/record_replay.html

Have you looked at the record & replay tool?

https://docs.nvidia.com/isaac/isaac/packages/sight/doc/recordReplaySetup.html#recorder-component

I have added the recorder component to the app.json file just like in the link you provided. I'm trying to log the color information from the camera by defining the following edge:

{
  "source": "sim_camera/isaac.alice.TcpSubscriber/ColorSensor",
  "target": "recorder/isaac.alice.Recorder/ColorSensor"
}

But now when I run the app, I get a Segmentation fault (core dumped). If I remove this edge, the app runs normally.

This seems like a bug. Noted; we will get back to you with a workaround soon.

While we are looking into the remote recording issue, may I suggest the workaround of recording locally: place the recorder bridge and record node in the Isaac app connected to the camera, and afterward move the recorded log over.

Could you please elaborate? Where is this log located, in /tmp/isaac/? I've tried recording via Isaac Sight by right-clicking and starting a recording, but I do not know where to find the recorded file.

By default the recorder component will store the cask in /tmp/isaac. This is configurable via the "base_directory" parameter of the recorder node/component, which you can set in your app.json or in Sight. The carter.config.json in apps/carter/carter_sim has an example of this.
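As a sketch, the config section of an app.json might set this parameter like so (the node name "recorder" is an assumption; match it to the recorder node defined in your own app):

```json
"config": {
  "recorder": {
    "isaac.alice.Recorder": {
      "base_directory": "/tmp/isaac"
    }
  }
}
```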

I have run the carter_sim example and set record to true in the config file. Unfortunately, this is not what I'm looking for, since in order to play those recordings I need to load the logs in Isaac Sight, which is also not working for me. However, I found that the ball_segmentation sample project is what I'm looking for: I have managed to access the "live" images via OpenCV inside the training.py file. The problem is that I don't want a ball randomly spawning around.
How can I replace the random ball with a simple robot, like the one in the webinar example, in this ball_segmentation sample?
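For reference, here is the kind of channel-order conversion needed once a frame arrives as a numpy RGB array; the frame shape here is a stand-in, and only the channel reversal is the point. It can be done with plain numpy slicing, equivalent to cv2.cvtColor with COLOR_RGB2BGR:

```python
import numpy as np

def rgb_to_bgr(frame_rgb):
    """Reverse the channel order so OpenCV functions, which assume
    BGR, display and process the frame with correct colors.
    Equivalent to cv2.cvtColor(frame_rgb, cv2.COLOR_RGB2BGR)."""
    return frame_rgb[..., ::-1].copy()

# Stand-in frame: a pure-red image in RGB order
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[..., 0] = 255
bgr = rgb_to_bgr(frame)
# In BGR order, the red value now sits in the last channel
```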

Thank you for your help.

Hello, I have the same error. Did you figure out how to resolve it?

Thanks,

Hi Leon, Shengchen,

Are you still seeing the issue with the record app? Can you please attach the failure log message to the thread?

Hi Leon and Shengchen, can you please respond if you are still seeing the issue with the record app?