Use an internet .H264 video stream for detection (on Linux desktop)

Hello there,

My goal is to take an H.264/MP4 video stream received from the internet and run detection on it (using DriveNet, for example) in real time.

In this process, the hardest/unknown part for me is this: once the H.264 stream is received on the Linux desktop (using libVLC or GStreamer), how do I convert its frames in real time into a format the DriveNet detector accepts, and then feed those frames to DriveNet for processing and display?

I noticed that a lot of people have attempted similar tasks on the Jetson platforms, but comparable threads about the DriveWorks API and/or the Drive PX 2 are rare.

===================================< SPECIFICATIONS >======================================

  • Streamer: VLC media player recording on another computer and transmitting the stream over the network.
  • Network streaming protocol: HTTP / RTP / RTSP / UDP
  • Streamed frame format: H.264 video codec + MP4 encapsulation

+> Real-time conversion needed

I would be glad if some experienced developers could advise me on this task, i.e.:

  1. how to convert an H.264 + MP4 encapsulation into a raw .h264 elementary stream (is that even necessary?)
  2. how to interface it with, say, the DriveNet detector?
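
Regarding question 1, my understanding (hedged — not taken from any NVIDIA documentation): inside an MP4 container the H.264 NAL units are stored length-prefixed (the "AVCC" layout), whereas a raw .h264 elementary stream delimits them with Annex B start codes, so de-encapsulation is largely a matter of rewriting those prefixes (plus re-emitting the SPS/PPS stored in the container's avcC box). A minimal Python sketch of the prefix rewrite — the function name, the 4-byte length assumption, and the sample bytes are mine, purely for illustration:

```python
def avcc_to_annexb(sample: bytes, length_size: int = 4) -> bytes:
    """Rewrite one MP4/AVCC video sample (length-prefixed NAL units)
    into Annex B form (0x00000001 start codes), the layout a raw
    .h264 elementary stream uses."""
    out = bytearray()
    i = 0
    while i + length_size <= len(sample):
        # Each NAL unit is preceded by a big-endian length field.
        nal_len = int.from_bytes(sample[i:i + length_size], "big")
        i += length_size
        # Replace the length prefix with an Annex B start code.
        out += b"\x00\x00\x00\x01" + sample[i:i + nal_len]
        i += nal_len
    return bytes(out)


# Two fabricated NAL units, each with a 4-byte length prefix,
# as they might appear inside an mdat box.
sample = b"\x00\x00\x00\x02\x65\xaa" + b"\x00\x00\x00\x03\x41\xbb\xcc"
print(avcc_to_annexb(sample).hex())
# → 0000000165aa0000000141bbcc
```

In practice you would not hand-roll this: a remux such as `ffmpeg -i input.mp4 -c:v copy -bsf:v h264_mp4toannexb -an output.h264` (or GStreamer's `h264parse` element) performs the same lossless conversion, including on a live stream.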

(PS: I am not afraid to dive into the DriveWorks C++ source code.)

Thank you.

Dear bcollado-bougeard,

Could you please refer to the link below for your topic? Thanks.

https://devtalk.nvidia.com/default/topic/1021745/general/running-samples-with-a-video-data-stream-as-input/