Hello, everyone. I have an urgent problem to solve. The question is:
The TX2 has CSI camera interfaces. I want to know: if I use these interfaces for camera capture, does it cost a lot of CPU? Ideally they would use very little CPU, so that I can use the remaining CPU resources for other work.
I would appreciate it very much if someone could help me solve this problem. It is urgent!
Hi 1028554060,
We have an internal ISP for processing camera images, which usually does not use many CPU resources. We are not sure how your application will use the CPU, so we can't give a precise answer, but in general the Jetson TX2 can handle multichannel streaming and real-time object detection, and system performance should not be an issue in most use cases.
Thanks
Hi kayccc. What is the ISP? My situation is this: I use Basler USB 3.0 cameras, so I capture through the USB interfaces on the TX2. However, I found that this uses a lot of CPU and leaves little CPU for anything else. (By the way, is the high CPU load an unavoidable consequence of the USB interface, or is it caused by the Basler SDK?) While looking into this I found that the TX2 has something called CSI. So I want to know whether these CSI interfaces use DMA (direct memory access), or something similar, for data transfer, so that they need much less CPU.
Hi 1028554060,
ISP (Image Signal Processor) is a dedicated hardware block on the Jetson boards which performs all major image-processing operations with minimal CPU utilization. In order to use the internal ISP, you may have to buy CSI cameras from one of the Jetson preferred camera partners.
[url]https://developer.nvidia.com/embedded/community/ecosystem[/url].
There are many features that the NVIDIA ISP provides, including multichannel streaming and easy interoperability with other frameworks such as CUDA, OpenGL, TensorRT, etc.
To access the cameras you can use the nvarguscamerasrc GStreamer plugin or the libArgus APIs.
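For example, here is a minimal sketch of driving an nvarguscamerasrc pipeline from C++ with the standard GStreamer API; the resolution and framerate caps are placeholder values, and the fakesink is only there so the sketch runs without a display (replace it with your own sink or an appsink):

```cpp
// Minimal sketch: capture from the CSI camera via nvarguscamerasrc and
// discard the frames in fakesink. Caps values below are examples only.
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GError *error = nullptr;
    GstElement *pipeline = gst_parse_launch(
        "nvarguscamerasrc ! "
        "video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1 ! "
        "fakesink",
        &error);
    if (!pipeline) {
        g_printerr("Failed to build pipeline: %s\n", error->message);
        g_error_free(error);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Block until an error or end-of-stream message appears on the bus.
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(
        bus, GST_CLOCK_TIME_NONE,
        (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
    if (msg)
        gst_message_unref(msg);
    gst_object_unref(bus);

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```

The same pipeline string can be tried from the command line with gst-launch-1.0 before wiring it into an application.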
The framework is built so that Argus delivers images in one of two ways:
a) EGLStreams, which are directly supported by other system components such as OpenGL, CUDA, and GStreamer. EGLStreams manage the allocation and lifespan of all buffers, and they are passed between Argus and the consumer directly, so that no buffer copies are required during delivery to the consumer.
b) Buffer OutputStreams, which are created and managed by the client in order to wrap native buffer resources as Buffer objects that are used as the destination for capture requests.
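To make the EGLStream path (a) concrete, below is a rough producer-side sketch modelled on the Argus samples that ship with tegra_multimedia_api. The stream type, pixel format, and resolution are placeholder choices, and the exact class and method signatures should be checked against the Argus headers of your L4T release, since the API has changed between versions:

```cpp
// Rough sketch of the EGLStream path: create one EGL OutputStream and submit
// a repeating capture request. Modelled on the Argus samples; verify the
// signatures against <Argus/Argus.h> for your L4T release.
#include <Argus/Argus.h>
#include <cstdio>
#include <vector>

using namespace Argus;

int main()
{
    // Create the camera provider and pick the first detected CSI camera.
    UniqueObj<CameraProvider> provider(CameraProvider::create());
    ICameraProvider *iProvider = interface_cast<ICameraProvider>(provider);
    if (!iProvider) { std::fprintf(stderr, "Failed to create CameraProvider\n"); return 1; }

    std::vector<CameraDevice*> devices;
    iProvider->getCameraDevices(&devices);
    if (devices.empty()) { std::fprintf(stderr, "No CSI camera detected\n"); return 1; }

    UniqueObj<CaptureSession> session(iProvider->createCaptureSession(devices[0]));
    ICaptureSession *iSession = interface_cast<ICaptureSession>(session);

    // Create an EGL output stream; a consumer (CUDA, OpenGL, GStreamer, ...)
    // connects to it and Argus delivers frames into it without extra copies.
    UniqueObj<OutputStreamSettings> settings(
        iSession->createOutputStreamSettings(STREAM_TYPE_EGL));
    IEGLOutputStreamSettings *iSettings =
        interface_cast<IEGLOutputStreamSettings>(settings);
    iSettings->setPixelFormat(PIXEL_FMT_YCbCr_420_888);
    iSettings->setResolution(Size2D<uint32_t>(1920, 1080)); // example size

    UniqueObj<OutputStream> stream(iSession->createOutputStream(settings.get()));

    // Build a capture request that targets the stream and start streaming.
    UniqueObj<Request> request(iSession->createRequest());
    IRequest *iRequest = interface_cast<IRequest>(request);
    iRequest->enableOutputStream(stream.get());
    iSession->repeat(request.get());

    // ... a consumer (e.g. EGLStream::FrameConsumer or a CUDA EGL consumer)
    // would acquire and process frames from the stream here ...

    iSession->stopRepeat();
    iSession->waitForIdle();
    return 0;
}
```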
To learn more about the features, samples, and framework provided, kindly check the Multimedia documentation.
[url]https://docs.nvidia.com/jetson/l4t-multimedia/index.html[/url].
Hi waisskharni.sm. You suggest that I buy CSI cameras from one of the Jetson preferred partners so that I can capture images with minimum CPU utilization. Do these cameras have a common API covering everything I need for image capture, without any other third-party SDK, or does each have its own SDK for these operations? And another question: how and where do I install these cameras on the Jetson TX2? Thank you for the kind help.
Hi 1028554060,
You will need to use the ‘tegra_multimedia_api’ package ([url]https://developer.nvidia.com/embedded/downloads[/url]) provided by NVIDIA. You should not require any other third-party SDKs in order to access the camera.
You do not have to worry about the installation process or the steps for accessing the cameras.
When you purchase the cameras you will receive a kernel package with preinstalled application binaries and source.
Kernel patches are also provided in case you need to customize the camera driver. An application user manual will also be provided so you can understand the various features the camera offers.
Thank you very much, Waiss Kharni SM.