Connecting a FLIR IP camera to a Jetson TX2 for video streaming using GStreamer

Hi,

My application requires connecting an IP camera to a Jetson TX2 for video streaming. I have been going through online material and couldn’t find any documentation explaining the process. I also found a page ([url]https://www.flir.com/support-center/iis/machine-vision/application-note/getting-started-with-flycapture-2-and-arm/[/url]) stating that FlyCapture2 doesn’t support GigE cameras on ARM devices.

However, my goal is to get a FLIR IP camera working with GStreamer for capture and streaming. Any support on this is highly appreciated.

Thank you,
Thanuja

Hi,
An IP camera can be launched as an rtsp source. There are examples of using rtspsrc:
default video playback for the "deepstream-test3-app" in deepstream-4.0 not working - DeepStream SDK - NVIDIA Developer Forums
[url]https://devtalk.nvidia.com/default/topic/1043770/jetson-tx2/problems-minimizing-latency-and-maximizing-quality-for-rtsp-and-mpeg-ts-/post/5295828/#5295828[/url]
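Just as an illustration, a minimal TX2 pipeline following those examples could look like the line below. The address, port and stream path are placeholders, and rtph264depay/omxh264dec assume the camera sends H.264 over RTSP; adjust them to your camera’s actual stream.

# Placeholder address and path; assumes an H.264 RTSP stream from the camera
gst-launch-1.0 rtspsrc location=rtsp://<camera-ip>:554/<stream-path> latency=200 ! rtph264depay ! h264parse ! omxh264dec ! nvoverlaysink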

Hi DaneLLL,

Thank you for your response. One of the main problems I’m facing is setting up an IP for, or finding the IP address of, the FLIR GigE camera. I’m using a BFLY-PGE-31S4C camera ([url]https://www.flir.com/products/blackfly-gige/?model=BFLY-PGE-31S4C-C[/url]). I installed the FlyCapture2 SDK and launched the ‘FlyCap2_arm’ application. However, the application doesn’t show or recognise the connected camera.

The camera is connected through a PoE Netgear switch: the switch is connected to the GigE port on the TX2, and the camera is connected to the PoE switch.

I’m guessing this has something to do with assigning the proper IP to the camera or to the TX2 network interface. I just couldn’t find a proper way to do this (or to identify the IP assigned to the camera).

Any help on this is highly appreciated.

Thank you.

Not sure, but I’d think that the camera’s IP may first need to be set from dedicated software on a host PC (Windows or Ubuntu, depending on your IP camera; you would also want to upgrade the firmware if any is available).
Depending on your LAN, you may assign a static IP (so you would know which address you’ve set) or configure it as a DHCP client; in that case you would have to check your IP camera’s MAC address and look in the DHCP server logs for the IP address it was given.
Depending on your IP camera model/options, you might be able to ping it.
Also be sure that a firewall, if any, isn’t filtering what you’re trying to read.
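As a rough sketch, assuming the camera and the TX2 GigE port end up on a private 192.168.1.x subnet, assigning a static address on the TX2 and pinging the camera would look something like this (interface name and addresses are placeholders for whatever your setup uses):

# Placeholder interface name and addresses; adjust to your LAN
sudo ip addr add 192.168.1.10/24 dev eth0
sudo ip link set eth0 up
ping 192.168.1.20   # the address assigned to the camera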

Hi all,

Thank you for your responses. I managed to get the IP matter resolved. You need to set the connection to link-local and select ‘auto force IP’ in the FlyCapture SDK. This will reveal the camera’s IP and set an IP for the host network interface.
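For anyone else trying this, you can confirm the link-local pairing from the TX2 side with something along these lines (eth0 is a placeholder for your GigE interface name):

ip addr show eth0        # should now list a 169.254.x.x link-local address on the host
ip neigh show dev eth0   # should list the camera's 169.254.x.x address and its MAC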

However, I’m now facing another problem when trying to capture video frames from the camera.

Upon running a command similar to this,

gst-launch-1.0 rtspsrc location=rtsp://169.254.0.1:554/video.pro1 latency=0 ! rtph264depay ! fakesink

I get the following output.

Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://169.254.0.1:554/video.pro1
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not open resource for reading and writing.
Additional debug info:
gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Failed to connect. (Generic error)
ERROR: pipeline doesn’t want to preroll.
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Can someone please help me to get this resolved?

Thank you.

I don’t have such a device and cannot tell much more, but I see two possible causes:

  • Maybe your camera is configured with a login/password, so you would have to provide these as in this post (see the example pipeline below).
  • I have limited knowledge of link-local addressing (the 169.254.x.x range), so someone else may better advise on that side.
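If it turns out to be credentials, a pipeline passing them through the rtspsrc user-id/user-pw properties would look something like this (username and password are placeholders):

gst-launch-1.0 rtspsrc location=rtsp://169.254.0.1:554/video.pro1 user-id=<username> user-pw=<password> latency=0 ! rtph264depay ! fakesink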

Hi All,

Thank you for your responses. I managed to solve this issue by referring to the following link: [url]https://www.flir.asia/support-center/iis/machine-vision/knowledge-base/lost-ethernet-data-packets-on-linux-systems/[/url]

Basically you need to increase the network receive buffers, enable jumbo packets and raise the MTU.
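For reference, those changes boil down to commands along these lines; eth0 and the sizes are just examples, and the FLIR article gives their recommended values:

# Example values only; see the linked FLIR article for recommended settings
sudo sysctl -w net.core.rmem_max=10485760
sudo sysctl -w net.core.rmem_default=10485760
sudo ip link set eth0 mtu 9000   # jumbo frames; the PoE switch must support this too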

However, it seems the FLIR camera model that I’m using (BFLY-PGE-31S4C) doesn’t support RTSP streaming. Therefore, I will have to work out another solution to grab frames, compress them through a GStreamer pipeline and stream them. I will work through the sample code available in the FlyCapture SDK and update the forum when there is any progress.
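As a placeholder for now, the encode-and-stream half I have in mind on the TX2 looks roughly like the line below, with videotestsrc standing in for wherever the FlyCapture frames will eventually be fed in (receiver address, port, resolution and encoder settings are assumptions):

gst-launch-1.0 videotestsrc ! 'video/x-raw,width=1280,height=720,framerate=30/1' ! nvvidconv ! omxh264enc insert-sps-pps=true ! h264parse ! rtph264pay ! udpsink host=<receiver-ip> port=5000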

Does FLIR support UDP streaming instead? In that case you may check this (it also changes the kernel socket buffer max size; sorry I missed this for your case…but I like your link about jumbo packets as well).
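If you do end up sending RTP/H.264 over UDP from the TX2 as sketched above, the receiving side could be something like this (the port and payload details must match whatever the sender uses):

gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,media=video,encoding-name=H264,payload=96' ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink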

Hi,

Were you able to do this? Could you please share your findings here?

@nk.gulia

Not sure if you still need this, but check out the Aravis project - GitHub - AravisProject/aravis: A vision library for genicam based cameras

Also, if you reach out to FLIR support, they could help you get Aravis GStreamer set up! Hope this is what you were looking for.
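Once the Aravis GStreamer plugin is installed, grabbing from the camera can be as simple as something like the line below. The camera-name value is a placeholder (Aravis’ arv-tool utility lists the exact ID, and aravissrc picks the first camera found if it is omitted), and depending on the pixel format you may need bayer2rgb before videoconvert.

gst-launch-1.0 aravissrc camera-name="<vendor>-<serial>" ! videoconvert ! autovideosink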

Hi thanuja0z4uf!

Did you manage to solve this problem? I’m in the same situation: it seems my Blackfly camera does not support RTSP :-(. I managed to set up a static IP and all, so all I need is a way to grab the frames.