Accelerated GStreamer components, gstreamermm, and closing the pipeline

So I have a simple program saving video to file using gstreamermm, the C++ bindings for GStreamer. You can see it below. It works, but the problem is that when I need to shut down the pipeline, telling the mainloop to quit doesn’t seem to work like it does with other pipelines.

To test, I added a --test flag which just runs “fakesrc num-buffers=50000000 ! fakesink”. When I run that pipeline, mainloop->quit() stops it immediately. When I try the same with NVIDIA’s components, it doesn’t (the callback just prints “Got SIGINT” when Ctrl+C is pressed) and I have to kill it, resulting in an unplayable file.

Does anybody more familiar with GStreamer than I am know what the recommended way would be to cleanly shut down a pipeline like the one I have below? (nvarguscamerasrc ! video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1 ! nvv4l2h265enc bitrate=4000000 ! h265parse ! matroskamux ! filesink location=…)

Suggestions in any language would be most welcome. A way to get the camera Element and turn it off, maybe?
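
For context (the full program follows below), one direction I’ve been wondering about but haven’t gotten working yet is to send an EOS event from the SIGINT handler so matroskamux can finalize the file, and only quit once the EOS message shows up on the bus. A rough, untested sketch of what I mean — it drops to the C API via gobj(), and assumes the pipeline is reachable from the handler through another global like mainloop:

// Sketch only: send EOS on SIGINT and let the existing MESSAGE_EOS branch in
// on_bus_message() call mainloop->quit() once the muxer has finished the file.
Glib::RefPtr<Gst::Element> g_pipeline;  // hypothetical global, set in main()

void on_SIGINT(int signum) {
	std::cout << " Got SIGINT (" << signum << "), sending EOS" << std::endl;
	// C API via gobj(); I'm not sure of the exact gstreamermm wrapper for this.
	gst_element_send_event(GST_ELEMENT(g_pipeline->gobj()), gst_event_new_eos());
}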

#include <iostream>
#include <csignal>
#include <gstreamermm.h>
#include <gstreamermm/message.h>
#include <glibmm.h>
#include <thread>

// main event loop for the program:

Glib::RefPtr<Glib::MainLoop> mainloop;

// a callback to process messages from the Bus, (we need this so the program exits on error or quit)
bool on_bus_message(const Glib::RefPtr<Gst::Bus> &bus, const Glib::RefPtr<Gst::Message> &message) {
	switch (message->get_message_type()) {
		case Gst::MESSAGE_EOS:
			std::cout << "End of stream reached.";
			mainloop->quit();
			break;
		case Gst::MESSAGE_ERROR: {
			// parse message (to see how this works, close the window during playback)
			auto errmsg = Glib::RefPtr<Gst::MessageError>::cast_static(message);
			std::cerr << "Error received from element " << errmsg->get_source()->get_name() << ": "
			          << errmsg->parse_error().what() << std::endl;
			std::cerr << "Debugging information: " << errmsg->parse_debug() << std::endl;
			mainloop->quit();
			break;
		}
		default:
			std::cout << "BUS_DEBUG: " << message->get_source()->get_name() << ":"
			          << gst_message_type_get_name(static_cast<GstMessageType>(message->get_message_type()))
			          << std::endl;
			break;
	}
	return true;
}

void on_SIGINT(int signum) {
	std::cout << " Got SIGINT (" << signum << ")" << std::endl;
	mainloop->quit();
}

Glib::RefPtr<Gst::Pipeline> create_pipeline(const Glib::ustring& outfile) {
	Glib::RefPtr<Gst::Pipeline> pipeline = Gst::Pipeline::create();

	// create elements of the pipeline
//	pipeline_string = "nvarguscamerasrc ! video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080,"
//	                  " format=(string)NV12, framerate=(fraction)30/1 ! nvv4l2h265enc bitrate=4000000 "
//	                  "! h265parse ! matroskamux ! filesink location=";

	// source
	Glib::RefPtr<Gst::Element> camera = Gst::ElementFactory::create_element("nvarguscamerasrc", "camera");
//	camera->set_property("timeout", 30); // when enabled, mainloop->quit() stops working
	camera->set_property("tnr-mode", 2); // high quality temporal noise reduction TODO: test cost
	camera->set_property("ee-mode", 0); // edge enhancement off
	camera->set_property("aeantibanding", 0); // ae antibanding off
	camera->set_property("maxperf", true);

	// filter capabilities (set camera settings)
	Glib::RefPtr<Gst::Caps> camera_caps = Gst::Caps::create_from_string((Glib::ustring)
			"video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1");
	Glib::RefPtr<Gst::CapsFilter> camera_caps_filter = Gst::CapsFilter::create("camera_caps_filter");
	camera_caps_filter->set_property("caps", camera_caps);

	/*
	 * gst-launch-1.0 videotestsrc ! capsfilter caps=video/x-raw,format=GRAY8 ! videoconvert ! autovideosink
	 *
	 * and this line
	 *
	 * gst-launch-1.0 videotestsrc ! video/x-raw,format=GRAY8 ! videoconvert ! autovideosink
	 *
	 * are equivalent... that shorthand had me confused for hours
	 *
	 * https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-plugins/html/gstreamer-plugins-capsfilter.html
	 */

	// h265 encoding
	Glib::RefPtr<Gst::Element> encoder = Gst::ElementFactory::create_element("nvv4l2h265enc", "encoder");
	encoder->set_property("bitrate", 4000000);

	// packaging stream
	Glib::RefPtr<Gst::Element> parser = Gst::ElementFactory::create_element("h265parse", "parser");
	Glib::RefPtr<Gst::Element> muxer = Gst::ElementFactory::create_element("matroskamux", "muxer");
	muxer->set_property("writing-app", (Glib::ustring) "birbcam");
	muxer->set_property("min-index-interval", 6e+10); // write an index every minute in case of crash/power loss
	// so as not to have to run a video file fixer
	// TODO: experiment with values

	// sink
	Glib::RefPtr<Gst::FileSink> sink = Gst::FileSink::create("sink");
	sink->set_property("location", outfile);

	// add all elements to the pipeline
	try {
		pipeline->add(camera)->add(camera_caps_filter)->add(encoder)->add(parser)->add(muxer)->add(sink);
	} catch (std::runtime_error &err) {
		std::cerr << "Error adding Element:" << err.what() << std::endl;
		throw err;
	}

	// connect all elements in the pipeline
	try {
		camera->link(camera_caps_filter)->link(encoder)->link(parser)->link(muxer)->link(sink);
	} catch (std::runtime_error &err) {
		std::cerr << "Error linking Element:" << err.what() << std::endl;
		throw err;
	}

	return pipeline;
}

Glib::RefPtr<Gst::Pipeline> create_fake_pipeline() {
	Glib::RefPtr<Gst::Pipeline> pipeline = Gst::Pipeline::create();
	Glib::RefPtr<Gst::FakeSrc> fakesrc = Gst::FakeSrc::create();
	Glib::RefPtr<Gst::FakeSink> fakesink = Gst::FakeSink::create();

	pipeline->add(fakesrc)->add(fakesink);
	fakesrc->link(fakesink);
	return pipeline;
}

Glib::RefPtr<Gst::Pipeline> create_camera_to_fakesink_pipeline() {
	Glib::RefPtr<Gst::Pipeline> pipeline = Gst::Pipeline::create();
	Glib::RefPtr<Gst::Element> camera = Gst::ElementFactory::create_element("nvarguscamerasrc", "camera");
//	camera->set_property("timeout", 30);
//	camera->set_property("tnr-mode", 2);
//	camera->set_property("ee-mode", 0);
//	camera->set_property("maxperf", true);
	Glib::RefPtr<Gst::Caps> camera_caps = Gst::Caps::create_from_string((Glib::ustring)
			"video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1");
	Glib::RefPtr<Gst::CapsFilter> camera_caps_filter = Gst::CapsFilter::create("camera_caps_filter");
	camera_caps_filter->set_property("caps", camera_caps);
	Glib::RefPtr<Gst::FakeSink> fakesink = Gst::FakeSink::create();

	pipeline->add(camera)->add(camera_caps_filter)->add(fakesink);
	camera->link(camera_caps_filter)->link(fakesink);

	return pipeline;
}

int main(int argc, char **argv) {
	Glib::RefPtr<Gst::Element> pipeline;  // element because Gst::Parse::launch returns Element

	if (argc != 2) {
		std::cerr << "Length of the arguments needs to be one (the filename)";
		return 1;
	}

	// initialize gstreamer
	Gst::init(argc, argv);

	// parse arguments and configure pipeline
	if (!strncmp(argv[1], "--test-manual", 16)) {
		pipeline = create_fake_pipeline();
	} else if (!strncmp(argv[1], "--test-camera-to-fakesink", 32)) {
		pipeline = create_camera_to_fakesink_pipeline();
	} else {
		pipeline = create_pipeline((Glib::ustring)argv[1]);
	}

	// get the Gst::Bus from the pipeline and configure the on_bus_message callback
	Glib::RefPtr<Gst::Bus> bus = pipeline->get_bus();
	bus->add_watch(sigc::ptr_fun(&on_bus_message));

	// set the pipeline to the playing state
	pipeline->set_state(Gst::STATE_PLAYING);

	// create the main event loop
	mainloop = Glib::MainLoop::create();
	if (!mainloop) {
		std::cerr << "Failed to create main event loop!" << std::endl;
		return 1;
	}

	// connect shutdown signal callback...
	std::signal(SIGINT, on_SIGINT);

	// run until stopped (blocks here);
	mainloop->run();

	// shut down cleanly (this will shift the pipeline through all states to NULL)
	// https://gstreamer.freedesktop.org/documentation/plugin-development/basics/states.html?gi-language=c
	pipeline->set_state(Gst::STATE_NULL);

	return 0;
}
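
One more thing I’m unsure about in the code above: calling mainloop->quit() from a raw std::signal handler may not be the safest approach, since it calls into GLib from signal context. GLib has g_unix_signal_add() (plain C API from <glib-unix.h>), which dispatches the signal as a normal main-loop source instead. If it matters, the change would look roughly like this (untested sketch):

#include <glib-unix.h>

static gboolean on_sigint_source(gpointer /*user_data*/) {
	std::cout << "Got SIGINT via GLib source" << std::endl;
	mainloop->quit();
	return G_SOURCE_REMOVE; // one-shot source
}

// in main(), instead of std::signal(SIGINT, on_SIGINT):
// g_unix_signal_add(SIGINT, on_sigint_source, nullptr);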

Hi,
Could you try the pipelines below?

nvarguscamerasrc ! video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1 ! fakesink
nvarguscamerasrc ! video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1 ! omxh265enc ! h265parse ! matroskamux ! filesink

It should give more information on the issue.

Result of the first pipeline (it works):

$ ./birbcam --nv-test0
BUS_DEBUG: fakesink0:state-changed
BUS_DEBUG: capsfilter0:state-changed
BUS_DEBUG: nvarguscamerasrc0:state-changed
BUS_DEBUG: pipeline0:state-changed
BUS_DEBUG: capsfilter0:state-changed
BUS_DEBUG: src:stream-status
BUS_DEBUG: nvarguscamerasrc0:state-changed
BUS_DEBUG: pipeline0:state-changed
BUS_DEBUG: pipeline0:new-clock
BUS_DEBUG: src:stream-status
BUS_DEBUG: capsfilter0:state-changed
BUS_DEBUG: nvarguscamerasrc0:state-changed
BUS_DEBUG: pipeline0:stream-start
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3280 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3280 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 2 
   Output Stream W = 1920 H = 1080 
   seconds to Run    = 0 
   Frame Rate = 29.999999 
GST_ARGUS: PowerService: requested_clock_Hz=13608000
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
BUS_DEBUG: fakesink0:state-changed
BUS_DEBUG: pipeline0:async-done
BUS_DEBUG: fakesink0:state-changed
BUS_DEBUG: pipeline0:state-changed
^CGot SIGINT (2)
GST_ARGUS: Cleaning up
GST_ARGUS: 
PowerServiceHwVic::cleanupResources
CONSUMER: Done Success
GST_ARGUS: Done Success

The second one also works:

$ ./birbcam --nv-test1
BUS_DEBUG: fakesink0:state-changed
BUS_DEBUG: capsfilter0:state-changed
BUS_DEBUG: nvarguscamerasrc0:state-changed
BUS_DEBUG: pipeline0:state-changed
BUS_DEBUG: capsfilter0:state-changed
BUS_DEBUG: src:stream-status
BUS_DEBUG: nvarguscamerasrc0:state-changed
BUS_DEBUG: pipeline0:state-changed
BUS_DEBUG: pipeline0:new-clock
BUS_DEBUG: src:stream-status
BUS_DEBUG: capsfilter0:state-changed
BUS_DEBUG: nvarguscamerasrc0:state-changed
BUS_DEBUG: pipeline0:stream-start
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3280 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3280 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 2 
   Output Stream W = 1920 H = 1080 
   seconds to Run    = 0 
   Frame Rate = 29.999999 
GST_ARGUS: PowerService: requested_clock_Hz=13608000
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
BUS_DEBUG: fakesink0:state-changed
BUS_DEBUG: pipeline0:async-done
BUS_DEBUG: fakesink0:state-changed
BUS_DEBUG: pipeline0:state-changed
^CGot SIGINT (2)
GST_ARGUS: Cleaning up
GST_ARGUS: 
PowerServiceHwVic::cleanupResources
CONSUMER: Done Success
GST_ARGUS: Done Success

But the pipeline in the original code created with create_pipeline() does not work. I swapped the encoder Element in the manual pipeline for omxh265enc, but SIGINT is still ignored. I think the problem is the manual pipeline creation. I could create the pipeline from a string, because mainloop->quit() seems to work with a pipeline created that way, but I was hoping to avoid that for readability’s sake and to avoid complex string parsing.
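
For reference, the string-based version I’m trying to avoid would look roughly like this (untested sketch; it assumes Gst::Parse::launch just wraps gst_parse_launch and may throw on a bad description):

Glib::RefPtr<Gst::Element> create_parsed_pipeline(const Glib::ustring& outfile) {
	// same elements as create_pipeline(), but described as a launch string
	Glib::ustring description =
		"nvarguscamerasrc tnr-mode=2 ee-mode=0 aeantibanding=0 maxperf=true "
		"! video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, "
		"format=(string)NV12, framerate=(fraction)30/1 "
		"! nvv4l2h265enc bitrate=4000000 ! h265parse ! matroskamux "
		"! filesink location=" + outfile;
	return Gst::Parse::launch(description);
}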

I could just use Python, but I want this to be as optimized as possible. I am going to make a create_test_manual_pipeline function with a very simple fakesrc ! fakesink pipeline and update here again. Thanks for your help!

So the manual pipeline created here works (just a fakesrc to fakesink). I’ve updated the code above as well. Something in the pipeline is preventing mainloop->quit(); from working.

Glib::RefPtr<Gst::Pipeline> create_fake_pipeline() {
	Glib::RefPtr<Gst::Pipeline> pipeline = Gst::Pipeline::create();
	Glib::RefPtr<Gst::FakeSrc> fakesrc = Gst::FakeSrc::create();
	Glib::RefPtr<Gst::FakeSink> fakesink = Gst::FakeSink::create();

	pipeline->add(fakesrc)->add(fakesink);
	fakesrc->link(fakesink);
	return pipeline;
}

The above pipeline shuts down immediately in response to SIGINT. A more complex one does not, regardless of the encoder. Could it be the camera source? Next, I’m going to try swapping out elements in my manual pipeline until I find the one that’s causing the issue.

Updating in case this is useful to anybody in the future.

The camera directly to fakesink here:

Glib::RefPtr<Gst::Pipeline> create_camera_to_fakesink_pipeline() {
	Glib::RefPtr<Gst::Pipeline> pipeline = Gst::Pipeline::create();
	Glib::RefPtr<Gst::Element> camera = Gst::ElementFactory::create_element("nvarguscamerasrc", "camera");
	Glib::RefPtr<Gst::Caps> camera_caps = Gst::Caps::create_from_string((Glib::ustring)
			"video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1");
	Glib::RefPtr<Gst::CapsFilter> camera_caps_filter = Gst::CapsFilter::create("camera_caps_filter");
	camera_caps_filter->set_property("caps", camera_caps);
	Glib::RefPtr<Gst::FakeSink> fakesink = Gst::FakeSink::create();

	pipeline->add(camera)->add(fakesink);
	camera->link(fakesink);

	return pipeline;
}

works properly:

$ ./birbcam --test-camera-to-fakesink
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
BUS_DEBUG: src:stream-status
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:new-clock
BUS_DEBUG: src:stream-status
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:stream-start
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3280 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3280 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 2 
   Output Stream W = 1920 H = 1080 
   seconds to Run    = 0 
   Frame Rate = 29.999999 
GST_ARGUS: PowerService: requested_clock_Hz=13608000
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:async-done
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
^CGot SIGINT (2)
GST_ARGUS: Cleaning up
GST_ARGUS: 
PowerServiceHwVic::cleanupResources
CONSUMER: Done Success
GST_ARGUS: Done Success

So it’s not the camera or the CapsFilter. On to the encoder.

I figured it out.

It is the camera.

It’s one of these options:

camera->set_property("timeout", 30); // TODO: remove this line after tests
camera->set_property("tnr-mode", 2); // high quality temporal noise reduction TODO: test cost
camera->set_property("ee-mode", 0); // edge enhancement off
camera->set_property("maxperf", true);

My guess would be the temporal noise reduction, so I’m going to try that first…

Edit: nope. The timeout might be it, I guess.

Edit: the timeout is it. If the timeout is enabled, mainloop->quit() doesn’t work.

Thanks for your help, DaneLLL!

So, more strangeness. I’ve updated my above post. It appears more than the timeout is the problem. I can reliably break clean pipeline shutdown by commenting and uncommenting the timeout property in the pipeline created by this function…

With this one, the signal handler catches SIGINT, the callback is called, the mainloop quits, and shutdown is clean.

Glib::RefPtr<Gst::Pipeline> create_camera_to_fakesink_pipeline() {
	Glib::RefPtr<Gst::Pipeline> pipeline = Gst::Pipeline::create();
	Glib::RefPtr<Gst::Element> camera = Gst::ElementFactory::create_element("nvarguscamerasrc", "camera");
//	camera->set_property("timeout", 30);
//	camera->set_property("tnr-mode", 2);
//	camera->set_property("ee-mode", 0);
//	camera->set_property("maxperf", true);
	Glib::RefPtr<Gst::Caps> camera_caps = Gst::Caps::create_from_string((Glib::ustring)
			"video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1");
	Glib::RefPtr<Gst::CapsFilter> camera_caps_filter = Gst::CapsFilter::create("camera_caps_filter");
	camera_caps_filter->set_property("caps", camera_caps);
	Glib::RefPtr<Gst::FakeSink> fakesink = Gst::FakeSink::create();

	pipeline->add(camera)->add(camera_caps_filter)->add(fakesink);
	camera->link(camera_caps_filter)->link(fakesink);

	return pipeline;
}

But when I tried to fix my real pipeline by commenting out that same line in my real function below, it doesn’t fix it :/

With this pipeline I had to terminate the process… so it’s the timeout property and something else?

Glib::RefPtr<Gst::Pipeline> create_pipeline(const Glib::ustring& outfile) {
	Glib::RefPtr<Gst::Pipeline> pipeline = Gst::Pipeline::create();

	// create elements of the pipeline
//	pipeline_string = "nvarguscamerasrc ! video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080,"
//	                  " format=(string)NV12, framerate=(fraction)30/1 ! nvv4l2h265enc bitrate=4000000 "
//	                  "! h265parse ! matroskamux ! filesink location=";

	// source
	Glib::RefPtr<Gst::Element> camera = Gst::ElementFactory::create_element("nvarguscamerasrc", "camera");
//	camera->set_property("timeout", 30); // when enabled, mainloop->quit() stops working
	camera->set_property("tnr-mode", 2); // high quality temporal noise reduction TODO: test cost
	camera->set_property("ee-mode", 0); // edge enhancement off
	camera->set_property("aeantibanding", 0); // ae antibanding off
	camera->set_property("maxperf", true);

	// filter capabilities (set camera settings)
	Glib::RefPtr<Gst::Caps> camera_caps = Gst::Caps::create_from_string((Glib::ustring)
			"video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1");
	Glib::RefPtr<Gst::CapsFilter> camera_caps_filter = Gst::CapsFilter::create("camera_caps_filter");
	camera_caps_filter->set_property("caps", camera_caps);

	/*
	 * gst-launch-1.0 videotestsrc ! capsfilter caps=video/x-raw,format=GRAY8 ! videoconvert ! autovideosink
	 *
	 * and this line
	 *
	 * gst-launch-1.0 videotestsrc ! video/x-raw,format=GRAY8 ! videoconvert ! autovideosink
	 *
	 * are equivalent... that shorthand had me confused for hours
	 *
	 * https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-plugins/html/gstreamer-plugins-capsfilter.html
	 */

	// h265 encoding
	Glib::RefPtr<Gst::Element> encoder = Gst::ElementFactory::create_element("nvv4l2h265enc", "encoder");
	encoder->set_property("bitrate", 4000000);

	// packaging stream
	Glib::RefPtr<Gst::Element> parser = Gst::ElementFactory::create_element("h265parse", "parser");
	Glib::RefPtr<Gst::Element> muxer = Gst::ElementFactory::create_element("matroskamux", "muxer");
	muxer->set_property("writing-app", (Glib::ustring) "birbcam");
	muxer->set_property("min-index-interval", 6e+10); // write an index every minute in case of crash/power loss
	// so as not to have to run a video file fixer
	// TODO: experiment with values

	// sink
	Glib::RefPtr<Gst::FileSink> sink = Gst::FileSink::create("sink");
	sink->set_property("location", outfile);

	// add all elements to the pipeline
	try {
		pipeline->add(camera)->add(camera_caps_filter)->add(encoder)->add(parser)->add(muxer)->add(sink);
	} catch (std::runtime_error &err) {
		std::cerr << "Error adding Element:" << err.what() << std::endl;
		throw err;
	}

	// connect all elements in the pipeline
	try {
		camera->link(camera_caps_filter)->link(encoder)->link(parser)->link(muxer)->link(sink);
	} catch (std::runtime_error &err) {
		std::cerr << "Error linking Element:" << err.what() << std::endl;
		throw err;
	}

	return pipeline;
}

So I’m not quite sure what’s going on here. Back to more swapping Elements in and out of the pipeline I guess.

So I added the encoder and, while it still shuts down properly (so I haven’t found the problem yet), I get a GStreamer-CRITICAL from a failed assertion when the pipeline is freed.

Glib::RefPtr<Gst::Pipeline> create_camera_to_fakesink_pipeline() {
	Glib::RefPtr<Gst::Pipeline> pipeline = Gst::Pipeline::create();

	// camera and caps
	Glib::RefPtr<Gst::Element> camera = Gst::ElementFactory::create_element("nvarguscamerasrc", "camera");
//	camera->set_property("timeout", 30);
	camera->set_property("tnr-mode", 2);
	camera->set_property("ee-mode", 0);
	camera->set_property("maxperf", true);
	Glib::RefPtr<Gst::Caps> camera_caps = Gst::Caps::create_from_string((Glib::ustring)
			"video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1");
	Glib::RefPtr<Gst::CapsFilter> camera_caps_filter = Gst::CapsFilter::create("camera_caps_filter");
	camera_caps_filter->set_property("caps", camera_caps);
	Glib::RefPtr<Gst::FakeSink> fakesink = Gst::FakeSink::create();

	// h265 encoding
	Glib::RefPtr<Gst::Element> encoder = Gst::ElementFactory::create_element("nvv4l2h265enc", "encoder");
	encoder->set_property("bitrate", 4000000);

	pipeline->add(camera)->add(camera_caps_filter)->add(encoder)->add(fakesink);
	camera->link(camera_caps_filter)->link(encoder)->link(fakesink);

	return pipeline;
}

It responds to shutdown and I get a playable mkv but there is a problem unreferencing the encoder element (not a big deal right this second, but now I know what is causing that)

...
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
^C Got SIGINT (2)
GST_ARGUS: Cleaning up
GST_ARGUS: 
PowerServiceHwVic::cleanupResources
CONSUMER: Done Success
GST_ARGUS: Done Success

(birbcam:21257): GStreamer-CRITICAL **: 16:53:37.351: gst_mini_object_unref: assertion 'GST_MINI_OBJECT_REFCOUNT_VALUE (mini_object) > 0' failed

Edit: never mind, still looking.

… So originally in this post I thought it was the parser, because that’s what broke next. Nope.

Here are two repeated runs of this pipeline:

Glib::RefPtr<Gst::Pipeline> create_camera_to_fakesink_pipeline() {
	Glib::RefPtr<Gst::Pipeline> pipeline = Gst::Pipeline::create();

	// camera and caps
	Glib::RefPtr<Gst::Element> camera = Gst::ElementFactory::create_element("nvarguscamerasrc", "camera");
//	camera->set_property("timeout", 30);
	camera->set_property("tnr-mode", 2);
	camera->set_property("ee-mode", 0);
	camera->set_property("maxperf", true);
	Glib::RefPtr<Gst::Caps> camera_caps = Gst::Caps::create_from_string((Glib::ustring)
			"video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1");
	Glib::RefPtr<Gst::CapsFilter> camera_caps_filter = Gst::CapsFilter::create("camera_caps_filter");
	camera_caps_filter->set_property("caps", camera_caps);
	Glib::RefPtr<Gst::FakeSink> fakesink = Gst::FakeSink::create();

	// h265 encoding
	Glib::RefPtr<Gst::Element> encoder = Gst::ElementFactory::create_element("nvv4l2h265enc", "encoder");
	encoder->set_property("bitrate", 4000000);

	pipeline->add(camera)->add(camera_caps_filter)->add(encoder)->add(fakesink);
	camera->link(camera_caps_filter)->link(encoder)->link(fakesink);

	return pipeline;
}
[user@host] -- [~/Dev/birbcam/build] 
 $ ./birbcam --test-camera-to-fakesink
Failed to query video capabilities: Inappropriate ioctl for device
Opening in BLOCKING MODE 
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: encoder:state-changed
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
BUS_DEBUG: encoder:state-changed
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: src:stream-status
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:new-clock
BUS_DEBUG: src:stream-status
BUS_DEBUG: encoder:state-changed
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:stream-start
BUS_DEBUG: encoder:latency
NvMMLiteOpen : Block : BlockType = 8 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 8 
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3280 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3280 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 2 
   Output Stream W = 1920 H = 1080 
   seconds to Run    = 0 
   Frame Rate = 29.999999 
GST_ARGUS: PowerService: requested_clock_Hz=627200000
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
BUS_DEBUG: src:stream-status
BUS_DEBUG: src:stream-status
NVMEDIA: H265 : Profile : 1 
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:async-done
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
^C Got SIGINT (2)
GST_ARGUS: Cleaning up
GST_ARGUS: 
PowerServiceHwVic::cleanupResources
CONSUMER: Done Success
GST_ARGUS: Done Success

(birbcam:22518): GStreamer-CRITICAL **: 17:24:44.157: gst_mini_object_unref: assertion 'GST_MINI_OBJECT_REFCOUNT_VALUE (mini_object) > 0' failed
[user@host] -- [~/Dev/birbcam/build] 
 $ ./birbcam --test-camera-to-fakesink
Failed to query video capabilities: Inappropriate ioctl for device
Opening in BLOCKING MODE 
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: encoder:state-changed
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
BUS_DEBUG: encoder:state-changed
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: src:stream-status
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
BUS_DEBUG: src:stream-status
BUS_DEBUG: gtkmm__gstpipeline0:new-clock
BUS_DEBUG: encoder:state-changed
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:stream-start
BUS_DEBUG: encoder:latency
NvMMLiteOpen : Block : BlockType = 8 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 8 
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3280 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3280 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 2 
   Output Stream W = 1920 H = 1080 
   seconds to Run    = 0 
   Frame Rate = 29.999999 
GST_ARGUS: PowerService: requested_clock_Hz=627200000
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
BUS_DEBUG: src:stream-status
BUS_DEBUG: src:stream-status
NVMEDIA: H265 : Profile : 1 
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:async-done
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
^C Got SIGINT (2)
^C Got SIGINT (2)
^C Got SIGINT (2)
^C Got SIGINT (2)
^C Got SIGINT (2)
^C Got SIGINT (2)
^C Got SIGINT (2)
^C Got SIGINT (2)
^C Got SIGINT (2)
^C Got SIGINT (2)
(the callback is called and tells the mainloop to quit, but nothing happens)

So we’re getting closer, I think

So this works (without the encoder).

So in addition to the timeout breaking it for sure, the encoder will also do it, but only if run a second time… or something like that… I think :/

I think I’m just going to write it in C. I’m guessing this is related to the way gstreamermm frees objects.

Glib::RefPtr<Gst::Pipeline> create_camera_to_fakesink_pipeline() {
	Glib::RefPtr<Gst::Pipeline> pipeline = Gst::Pipeline::create();

	// camera and caps
	Glib::RefPtr<Gst::Element> camera = Gst::ElementFactory::create_element("nvarguscamerasrc", "camera");
//	camera->set_property("timeout", 30);
	camera->set_property("tnr-mode", 2);
	camera->set_property("ee-mode", 0);
	camera->set_property("maxperf", true);
	Glib::RefPtr<Gst::Caps> camera_caps = Gst::Caps::create_from_string((Glib::ustring)
			"video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1");
	Glib::RefPtr<Gst::CapsFilter> camera_caps_filter = Gst::CapsFilter::create("camera_caps_filter");
	camera_caps_filter->set_property("caps", camera_caps);
	Glib::RefPtr<Gst::FakeSink> fakesink = Gst::FakeSink::create();

	// h265 encoding
//	Glib::RefPtr<Gst::Element> encoder = Gst::ElementFactory::create_element("nvv4l2h265enc", "encoder");
//	encoder->set_property("bitrate", 4000000);

	pipeline->add(camera)->add(camera_caps_filter)->add(fakesink);
	camera->link(camera_caps_filter)->link(fakesink);

	return pipeline;
}
[user@host] -- [~/Dev/birbcam/build] 
 $ ./birbcam --test-camera-to-fakesink
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: src:stream-status
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:new-clock
BUS_DEBUG: src:stream-status
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:stream-start
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3280 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3280 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 2 
   Output Stream W = 1920 H = 1080 
   seconds to Run    = 0 
   Frame Rate = 29.999999 
GST_ARGUS: PowerService: requested_clock_Hz=627200000
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:async-done
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
^C Got SIGINT (2)
GST_ARGUS: Cleaning up
GST_ARGUS: 
PowerServiceHwVic::cleanupResources
CONSUMER: Done Success
GST_ARGUS: Done Success
[user@host] -- [~/Dev/birbcam/build] 
 $ ./birbcam --test-camera-to-fakesink
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: src:stream-status
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:new-clock
BUS_DEBUG: src:stream-status
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:stream-start
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3280 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3280 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 2 
   Output Stream W = 1920 H = 1080 
   seconds to Run    = 0 
   Frame Rate = 29.999999 
GST_ARGUS: PowerService: requested_clock_Hz=627200000
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:async-done
BUS_DEBUG: gtkmm__gstfakesink0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
^C Got SIGINT (2)
GST_ARGUS: Cleaning up
GST_ARGUS: 
PowerServiceHwVic::cleanupResources
CONSUMER: Done Success
GST_ARGUS: Done Success

Test post.

Edit: OK, so this works, but if I try to make a post with some code I get “access denied”. It appears to be a forum bug.

So I rewrote the program in pure C and it has the same behavior but I am no longer so sure the problem is the pipeline elements themselves.

I did some testing and the first time I run the program after a reboot, it works every time, but repeated runs are hit or miss.

A successful run:

An unsuccessful run, video file not cleanly closed:

No changes in the code.

So I’m really not sure what’s going on now, other than that my suspicion is now on the Argus daemon somehow not being ready… That happened to me in another thread, when I tried to close the camera and reopen it too quickly and the daemon wasn’t ready.

I found an Argus sample in the MMAPI that does it differently, using a sig_atomic_t to signal shutdown and multiple threads, rather than a GLib MainLoop as in some GStreamer examples. Hopefully somebody reading this thread finds it useful as well.

…/tegra_multimedia_api/argus/samples/gstVideoEncode/main.cpp

I would paste it here but I tried a code block again and it refuses to paste. Oh well. Pastebin still works.
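
That said, the general shape of the approach (my own paraphrase, not the sample code) is roughly: a sig_atomic_t flag set by the signal handler, and a loop that blocks on the bus with a timeout instead of running a GLib MainLoop, sending EOS once the flag is set. Something like this untested sketch, which drops down to the C bus API via gobj() (needs <csignal>):

static volatile std::sig_atomic_t shutdown_requested = 0;
static void handle_sigint(int) { shutdown_requested = 1; }

void run_until_done(const Glib::RefPtr<Gst::Pipeline>& pipeline) {
	std::signal(SIGINT, handle_sigint);
	GstBus* bus = gst_element_get_bus(GST_ELEMENT(pipeline->gobj()));
	bool eos_sent = false;
	while (true) {
		// wake up once a second to check the flag; otherwise wait for EOS/ERROR
		GstMessage* msg = gst_bus_timed_pop_filtered(
				bus, GST_SECOND,
				(GstMessageType)(GST_MESSAGE_EOS | GST_MESSAGE_ERROR));
		if (msg) {
			gst_message_unref(msg); // EOS or ERROR: stop either way
			break;
		}
		if (shutdown_requested && !eos_sent) {
			// ask the pipeline to finish the file instead of quitting a loop
			gst_element_send_event(GST_ELEMENT(pipeline->gobj()), gst_event_new_eos());
			eos_sent = true;
		}
	}
	gst_object_unref(bus);
	pipeline->set_state(Gst::STATE_NULL);
}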

Hi,
If you still hit the issue on r32.2/Nano, please share with us how to reproduce it. Thanks.

Thanks. I will. I just finished writing my real app and am now starting to debug. When it runs, I will see if it has problems shutting down.

I am no longer using a global, however. Instead I am attaching the main loop to a data struct I pass around. I am not sure if it will make a difference. If I don’t update here, the problem is fixed.
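
Roughly how the data-struct version looks, as a sketch (the struct and names are mine, simplified from what I’m actually running):

struct AppData {
	Glib::RefPtr<Glib::MainLoop> loop;
	Glib::RefPtr<Gst::Pipeline> pipeline;
};

bool on_bus_message(const Glib::RefPtr<Gst::Bus>& /*bus*/,
                    const Glib::RefPtr<Gst::Message>& message, AppData* data) {
	const auto type = message->get_message_type();
	if (type == Gst::MESSAGE_EOS || type == Gst::MESSAGE_ERROR)
		data->loop->quit(); // no global needed
	return true;
}

// in main():
// AppData data{Glib::MainLoop::create(), create_pipeline(outfile)};
// data.pipeline->get_bus()->add_watch(sigc::bind(sigc::ptr_fun(&on_bus_message), &data));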

I tested my gstreamermm version again; it’s doing the same thing. Sometimes it works, sometimes it doesn’t, and I can’t consistently replicate it. It seems to nearly always work the first time after a reboot.

First run (ok):

BUS_DEBUG: src:stream-status
NVMEDIA: H265 : Profile : 1 
BUS_DEBUG: src:stream-status
BUS_DEBUG: gtkmm__gstpipeline0:stream-start
BUS_DEBUG: sink:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:async-done
BUS_DEBUG: sink:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
^C Got SIGINT (2)
GST_ARGUS: Cleaning up
CONSUMER: Done Success
GST_ARGUS: Done Success
GST_ARGUS: 
PowerServiceHwVic::cleanupResources

Second run (ok):

BUS_DEBUG: gtkmm__gstpipeline0:stream-start
BUS_DEBUG: sink:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:async-done
BUS_DEBUG: sink:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
^C Got SIGINT (2)
GST_ARGUS: Cleaning up
CONSUMER: Done Success
GST_ARGUS: Done Success
GST_ARGUS: 
PowerServiceHwVic::cleanupResources

Third run (fail):

NVMEDIA: H265 : Profile : 1 
BUS_DEBUG: src:stream-status
BUS_DEBUG: gtkmm__gstpipeline0:stream-start
BUS_DEBUG: sink:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:async-done
BUS_DEBUG: sink:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
^C Got SIGINT (2)

Process finished with exit code 129

The fourth run, which I ran immediately after, got something more interesting:

Opening in BLOCKING MODE 
BUS_DEBUG: sink:state-changed
BUS_DEBUG: muxer:state-changed
BUS_DEBUG: parser:state-changed
BUS_DEBUG: encoder:state-changed
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
BUS_DEBUG: muxer:state-changed
BUS_DEBUG: parser:state-changed
BUS_DEBUG: encoder:state-changed
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: src:stream-status
BUS_DEBUG: camera:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:new-clock
BUS_DEBUG: src:stream-status
BUS_DEBUG: muxer:state-changed
BUS_DEBUG: parser:state-changed
BUS_DEBUG: encoder:state-changed
BUS_DEBUG: camera_caps_filter:state-changed
BUS_DEBUG: camera:state-changed
BUS_DEBUG: encoder:latency
NvMMLiteOpen : Block : BlockType = 8 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 8 
(Argus) Error FileOperationFailed: Failed socket read: Connection reset by peer (in src/rpc/socket/common/SocketUtils.cpp, function readSocket(), line 79)
(Argus) Error FileOperationFailed: Unexpected error in reading socket (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 266)
(Argus) Error FileOperationFailed: Receive worker failure, notifying 1 waiting threads (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 340)
(Argus) Error InvalidState: Argus client is exiting with 1 outstanding client threads (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 357)
(Argus) Error FileOperationFailed: Receiving thread terminated with error (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadWrapper(), line 368)
(Argus) Error FileOperationFailed: Client thread received an error from socket (in src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 145)
(Argus) Error FileOperationFailed:  (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 87)
(Argus) Error InvalidState: Receive thread is not running cannot send. (in src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 96)
(Argus) Error InvalidState:  (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 87)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:515 Failed to create CameraProvider
BUS_DEBUG: gtkmm__gstpipeline0:stream-start
BUS_DEBUG: sink:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:async-done
BUS_DEBUG: sink:state-changed
BUS_DEBUG: gtkmm__gstpipeline0:state-changed
(Argus) Error InvalidState: Receive thread is not running cannot send. (in src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 96)
(Argus) Error InvalidState:  (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 87)
End of stream reached.
Process finished with exit code 0

My best guess is I just have to wait longer between repeated runs of the program because Argus isn’t ready.

Edit: Same results with the pure C version.

BUS_DEBUG: encoder: state-changed
BUS_DEBUG: capsfilter: state-changed
BUS_DEBUG: camera: state-changed
BUS_DEBUG: encoder: latency
NvMMLiteOpen : Block : BlockType = 8 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 8 
(Argus) Error FileOperationFailed: Failed socket read: Connection reset by peer (in src/rpc/socket/common/SocketUtils.cpp, function readSocket(), line 79)
(Argus) Error FileOperationFailed: Unexpected error in reading socket (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 266)
(Argus) Error FileOperationFailed: Receive worker failure, notifying 1 waiting threads (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 340)
(Argus) Error InvalidState: Argus client is exiting with 1 outstanding client threads (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 357)
(Argus) Error FileOperationFailed: Receiving thread terminated with error (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadWrapper(), line 368)
(Argus) Error FileOperationFailed: Client thread received an error from socket (in src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 145)
(Argus) Error FileOperationFailed:  (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 87)
(Argus) Error InvalidState: Receive thread is not running cannot send. (in src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 96)
(Argus) Error InvalidState:  (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 87)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:515 Failed to create CameraProvider
BUS_DEBUG: pipeline: stream-start
BUS_DEBUG: sink: state-changed
BUS_DEBUG: pipeline: async-done
BUS_DEBUG: sink: state-changed
BUS_DEBUG: pipeline: state-changed
(Argus) Error InvalidState: Receive thread is not running cannot send. (in src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 96)
(Argus) Error InvalidState:  (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 87)
End of stream reached.
Process finished with exit code 0

Hi,
From the log, it looks like instability of the camera source. Could you share which camera module you are using?

I am using a Waveshare IMX219-based module. Swapping it out is a good troubleshooting suggestion. I will try that next, since I have the stock (non-wide-angle) module, board, and cable somewhere around here. Failing that, I’ll try a PS Eye USB camera.

Hi mdegans,

Is the issue still reproducible with the other camera module?
Any results you can share?

Thanks

I upgraded to the 200-degree-FOV Waveshare module the other day and it’s still the same issue. I won’t be able to troubleshoot for the next month since I’ll be on vacation, but I wrote something in Python that works for my temporary purposes. I’ll get back to this later, and when I do I’m going to explore other ways of triggering shutdown, like something in the main loop that listens for a global shutdown signal (see the sketch below).
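
The kind of thing I have in mind (untested sketch; the names are mine): a periodic GLib timeout in the main loop that watches an atomic flag set by the signal handler, sends EOS once, and then lets the existing bus watch quit the loop when the EOS message arrives.

#include <atomic>

std::atomic<bool> shutdown_requested{false};

void on_SIGINT(int) { shutdown_requested = true; } // only touch the flag here

bool poll_shutdown(Glib::RefPtr<Gst::Pipeline> pipeline) {
	if (shutdown_requested) {
		// C API via gobj(); unsure of the exact gstreamermm wrapper for this
		gst_element_send_event(GST_ELEMENT(pipeline->gobj()), gst_event_new_eos());
		return false; // remove this timeout source
	}
	return true; // keep polling
}

// in main(), after creating the pipeline:
// Glib::signal_timeout().connect(sigc::bind(sigc::ptr_fun(&poll_shutdown), pipeline), 250);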