I have a Point Grey FL3-U3-20E4C-C USB camera and a Jetson TX1. I just want to capture images from the Point Grey camera with the Jetson TX1, but I don't know where to find the specification of FlyCapture's library, how to use the camera's library, or how to write the project's CMakeLists.
There weren't any official Point Grey drivers for Linux on ARM last time I checked, but you can use the Aravis API. I've been using Ethernet GigE Vision cameras with it and they work great. Aravis now has USB camera support, so it might just work for you; take a look at the Aravis project on GitHub.
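In case it helps, here is a rough, untested sketch of grabbing one frame through the Aravis 0.8 C API (the exact signatures changed between Aravis versions, so check the headers for your release; this needs libaravis installed and a camera attached, and error handling is trimmed):

```cpp
// Sketch against the Aravis 0.8 C API. Passing NULL to arv_camera_new()
// picks the first camera Aravis can enumerate (GigE Vision or USB3 Vision).
#include <arv.h>
#include <cstdio>

int main() {
    GError *error = nullptr;

    ArvCamera *camera = arv_camera_new(nullptr, &error);
    if (!camera) { fprintf(stderr, "no camera found\n"); return 1; }

    ArvStream *stream = arv_camera_create_stream(camera, nullptr, nullptr, &error);
    size_t payload = arv_camera_get_payload(camera, &error);
    for (int i = 0; i < 4; ++i)   // queue a few buffers for the driver to fill
        arv_stream_push_buffer(stream, arv_buffer_new(payload, nullptr));

    arv_camera_start_acquisition(camera, &error);
    ArvBuffer *buffer = arv_stream_timeout_pop_buffer(stream, 2000000); // 2 s
    if (buffer && arv_buffer_get_status(buffer) == ARV_BUFFER_STATUS_SUCCESS) {
        size_t size;
        arv_buffer_get_data(buffer, &size);      // raw frame bytes
        printf("captured %zu bytes\n", size);
        arv_stream_push_buffer(stream, buffer);  // recycle the buffer
    }
    arv_camera_stop_acquisition(camera, &error);

    g_object_unref(stream);
    g_object_unref(camera);
    return 0;
}
```

The frame arrives in whatever pixel format the camera is set to, so you'd still convert the colour space afterwards.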
I've implemented the driver API in my GitHub project here, https://github.com/Abaco-Systems/jetson-inference-gv, to stream Point Grey Blackfly cameras.
Take camera.h/cpp (the base class, which includes colour-space conversion) and gvStream.h/cpp; you will also need the CUDA functions to map your USB camera's colour space to RGB. So far I've written functions for RGB, YUV422, and BAYER_GR8. Hopefully your camera supports one of these modes.
NOTE: if you don't offload the colour-space conversion to the GPU, your Tegra will struggle (it will peg the CPU). What resolution are you running? Do you know which colour spaces your camera supports?
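To illustrate the kind of conversion being offloaded, here is a minimal CPU reference for UYVY-packed YUV422 to RGB in C++ (not the CUDA code from the repo; the fixed-point BT.601 coefficients are my own assumption). On the Tegra, the same per-macropixel math would run as a CUDA kernel, one thread per macropixel:

```cpp
#include <algorithm>
#include <cstdint>

static uint8_t clamp8(int v) { return static_cast<uint8_t>(std::min(255, std::max(0, v))); }

// UYVY packs two pixels into 4 bytes: U, Y0, V, Y1 (chroma shared per pair).
// Output is interleaved RGB, 3 bytes per pixel. BT.601-style fixed-point
// coefficients, scaled by 2^16.
void uyvy_to_rgb(const uint8_t* uyvy, uint8_t* rgb, int width, int height) {
    for (int i = 0; i < width * height / 2; ++i) {
        int u  = uyvy[4 * i + 0] - 128;
        int y0 = uyvy[4 * i + 1];
        int v  = uyvy[4 * i + 2] - 128;
        int y1 = uyvy[4 * i + 3];
        for (int p = 0; p < 2; ++p) {
            int y = (p == 0) ? y0 : y1;
            uint8_t* out = rgb + 6 * i + 3 * p;
            out[0] = clamp8(y + ((91881 * v) >> 16));               // R
            out[1] = clamp8(y - ((22554 * u + 46802 * v) >> 16));   // G
            out[2] = clamp8(y + ((116130 * u) >> 16));              // B
        }
    }
}
```

Doing this per-pixel loop on the CPU for a 2 MP stream at full frame rate is exactly what kills the Tegra's cores; as a massively parallel per-pixel job it is a natural fit for the GPU.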
My camera is the FL3-U3-20E4C-C, and I've found flycapture.18.104.22.1686_arm64, which supports my camera, at https://www.ptgrey.com/support/downloads. But I don't know how to use it because I couldn't find its specification. I will try your advice. Sorry, I don't know what a colour space is; I just know my camera outputs RGB. If I run into trouble again, can I ask you?
Your camera works in the RGB colour space, so it should be nice and easy to put that out to the display, using OpenGL to render the video.
It's interesting to see they now have drivers for ARM64 and ARMHF; I need to download those and give them a try at some point. GigE Vision and USB3 Vision are standards designed so that a specific vendor's driver doesn't need to be installed, so Aravis could still be a good option if you want to run other vendors' cameras at some point in the future.