Take a random picture from a CSI camera stream

Dear all,

I have the following hardware setup:
Jetson Nano
Sony IMX477 CSI camera

I want to take random pictures from an open camera stream for further analysis (some image processing).

I can already do this with the following piece of C++ code (based on this example: CSI-Camera/simple_camera.cpp at master · JetsonHacksNano/CSI-Camera · GitHub):

std::string pipeline = gstreamer_pipeline(capture_width,
                                          capture_height,
                                          display_width,
                                          display_height,
                                          framerate,
                                          flip_method);

cv::VideoCapture capture(pipeline, cv::CAP_GSTREAMER);

if (!capture.isOpened())
{
    std::cout << "Failed to open camera." << std::endl;
    return -1;
}

cv::Mat img;

std::cout << "Hit ESC to exit" << "\n";
while (true)
{
    if (!capture.read(img)) // Read operation takes a lot of processing load...
    {
        std::cout << "Capture read error" << std::endl;
        break;
    }

    if (imageProcessingFlag)
    {
        // Do my "magic" with the img Mat variable
    }
    cv::imshow("CSI Camera", img);
    int keycode = cv::waitKey(10) & 0xff;
    if (keycode == 27) break;
}

//-------------------------

This solution has too much processing load because of the “cap.read(img)” call, and in my case what I really need is to read a frame only when an image-processing command is issued (imageProcessingFlag).
Is there a solution where, instead of using “cap.read(img)” on every iteration, I could use something like a “cap.dump()” to avoid unnecessary CPU load and only use “cap.read(img)” when needed?

I’ve tried several things but with no luck.
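To make the question more concrete, the kind of change I am imagining looks roughly like the sketch below. The “drop=true max-buffers=1” appsink properties are just my guess at how the frame dropping could be configured, and it assumes the string returned by gstreamer_pipeline() ends with “appsink” as in the linked JetsonHacks example; I have not verified how much CPU this actually saves:

// Rough sketch only: ask the appsink at the end of the pipeline to keep
// just the newest buffer and discard older ones while read() is not called.
std::string pipeline = gstreamer_pipeline(capture_width, capture_height,
                                          display_width, display_height,
                                          framerate, flip_method);
pipeline += " drop=true max-buffers=1"; // appended to the trailing "... ! appsink"

cv::VideoCapture capture(pipeline, cv::CAP_GSTREAMER);

cv::Mat img;
// ...and then call capture.read(img) only when imageProcessingFlag is set,
// hoping the returned frame is the most recent one instead of a stale one.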

Thank you in advance for your help, and sorry if this post is not in the right place!

Regards

hello smPt,

This simple_camera.cpp launches the stream via a GStreamer pipeline, and it captures every frame.

May I know what the actual use-case is, and what you meant by taking random pictures?
You could try modifying the code flow to only enable the stream for capturing frames when your imageProcessingFlag command is set.
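For example, something along these lines is just a rough sketch of that idea (it re-uses the pipeline string and imageProcessingFlag from your snippet; note that opening the nvarguscamerasrc pipeline is not instantaneous, so the captured frame will lag the trigger a little):

// Sketch: open the capture only when the trigger fires, read one frame,
// process it, then release the pipeline again. This trades trigger latency
// for lower idle CPU load.
if (imageProcessingFlag)
{
    cv::VideoCapture capture(pipeline, cv::CAP_GSTREAMER);
    cv::Mat img;
    if (capture.isOpened() && capture.read(img))
    {
        // do the image processing on img here
    }
    capture.release();
    imageProcessingFlag = false;
}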

Jerry Chang, thank you for your answer,

I was not clear enough: I do not want to take a random picture but an “event-driven” one. Imagine a scenario where an object is placed in front of the camera, and a user pushes a button or a sensor triggers the object analysis.

If I perform “capture.read(img)” only when the trigger happens, it does not work well because the captured image does not correspond to the moment the trigger was fired.
If I perform “capture.read(img)” on every iteration and only process the frame that corresponds to the trigger, it works well, but there is too much processing load due to all the unnecessary “capture.read(img)” calls.
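One compromise I have been considering, sketched below, is to call capture.grab() on every iteration to keep the stream current and only do the (presumably more expensive) capture.retrieve() when the trigger fires. I do not know yet how much CPU grab() alone really saves with the GStreamer backend, so please treat this as an illustration rather than a working solution:

// Sketch: grab() pulls the newest sample from the stream without converting
// it to a cv::Mat; retrieve() does the conversion only for the triggered
// frame. Display and exit handling are omitted for brevity.
cv::Mat img;
while (true)
{
    if (!capture.grab()) // keep the stream "current"
    {
        std::cout << "Capture grab error" << std::endl;
        break;
    }

    if (imageProcessingFlag)
    {
        if (capture.retrieve(img)) // decode only the frame we care about
        {
            // do my "magic" with img here
        }
        imageProcessingFlag = false;
    }
}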

Maybe I have to use another method rather than GStreamer. I appreciate your help.

Regards

hello smPt,

You'll need to process every captured frame for your use-case.
What is the processing you are going to do within imageProcessingFlag, and what is your expectation for CPU usage?
