Getting Error for nvv4l2h264enc in application compilation

Hi,

I have developed a C application for recording using GStreamer with the NVIDIA accelerated encoder nvv4l2h264enc.

The following pipeline works correctly:

gst-launch-1.0 -e -v v4l2src do-timestamp=true ! nvvidconv ! "video/x-raw(memory:NVMM),width=640,height=480,format=I420,framerate=30/1" ! nvv4l2h264enc maxperf-enable=1 insert-sps-pps=1 ! rtph264pay ! shmsink socket-path=/tmp/test shm-size=20000000 wait-for-connection=false pulsesrc device=alsa_input.usb-C-Media_Electronics_Inc._USB_PnP_Sound_Device-00.analog-mono ! queue ! audioconvert ! lamemp3enc ! rtpmpapay ! shmsink socket-path=/tmp/audio wait-for-connection=false shm-size=1000000 sync=true

Now I have written a C application for the same pipeline, and I am getting the following errors while compiling. Please let me know whether any header file needs to be included for the NVIDIA accelerated plugins, or any library needs to be linked during compilation.

Below is the log:

(main:14298): GLib-GObject-CRITICAL **: 16:52:12.248: g_object_set: assertion 'G_IS_OBJECT (object)' failed
(main:14298): GLib-GObject-CRITICAL **: 16:52:12.248: g_object_set: assertion 'G_IS_OBJECT (object)' failed
width = 640
Height = 480
(main:14298): GStreamer-CRITICAL **: 16:52:12.248: gst_structure_new_empty: assertion 'gst_structure_validate_name (name)' failed
(main:14298): GStreamer-CRITICAL **: 16:52:12.256: gst_mini_object_unref: assertion 'mini_object != NULL' failed
Starting Audio loop
Opening in BLOCKING MODE
libv4l2: error getting capabilities: Inappropriate ioctl for device
ERROR: from element /GstPipeline:test-pipeline/nvv4l2h264enc:video_encoder: Error getting capabilities for device '/dev/nvhost-msenc': It isn't a v4l2 driver. Check if it is a v4l1 driver.
Additional debug info:
/dvs/git/dirty/git-master_linux/3rdparty/gst/gst-v4l2/gst-v4l2/v4l2_calls.c(98): gst_v4l2_get_capabilities (): /GstPipeline:test-pipeline/nvv4l2h264enc:video_encoder:
system error: Inappropriate ioctl for device
ERROR: from element /GstPipeline:test-pipeline/nvv4l2h264enc:video_encoder: Could not initialize supporting library.
Additional debug info:
gstvideoencoder.c(1627): gst_video_encoder_change_state (): /GstPipeline:test-pipeline/nvv4l2h264enc:video_encoder:
Failed to open encoder
ERROR: from element /GstPipeline:test-pipeline/GstShmSink:video_sink: Failed waiting on fd activity
Additional debug info:
gstshmsink.c(836): pollthread_func (): /GstPipeline:test-pipeline/GstShmSink:video_sink:
gst_poll_wait returned -1, errno: 16
Starting video loop

regards,
Vikas

Hi,
Please clean the GStreamer cache and try again:

$ rm .cache/gstreamer-1.0/registry.aarch64.bin

For information, please share your release version ($ head -1 /etc/nv_tegra_release).

Hi,

The release version is as below:

#R32 (release), REVISION: 2.1, GCID: 16294929, BOARD: t210ref, EABI: aarch64, DATE: Tue Aug 13 04:28:29 UTC 2019

Cleaning the GStreamer cache didn't help; I am getting the same errors.

Regards,
Vikas

Hi,
Please check if the posts help:

This can happen when the application runs in Docker and certain device nodes are missing. Not sure if this is your case.

Hi,

I am not using DeepStream for my application development; I am using gedit. Is the DeepStream SDK required in order to use the hardware-accelerated plugins?

regards,
Vikas

Hi,
You should not need the DeepStream SDK in this use case. Please try the following sample:

#include <cstdlib>
#include <cstring>
#include <sstream>
#include <gst/gst.h>

using namespace std;

#define USE(x) ((void)(x))

static GstPipeline *gst_pipeline = nullptr;
static string launch_string;

GstClockTime usec = 1000000;
static int w = 640;
static int h = 480;

int main(int argc, char** argv) {
    USE(argc);
    USE(argv);

    gst_init (&argc, &argv);

    GMainLoop *main_loop;
    main_loop = g_main_loop_new (NULL, FALSE);
    ostringstream launch_stream;

    launch_stream
    << "v4l2src name=mysource ! "
    << "video/x-raw,width="<< w <<",height="<< h <<",framerate=30/1,format=YUY2 ! "
    << "nvvidconv ! nvv4l2h264enc ! h264parse ! qtmux ! "
    << "filesink location=a.mp4 ";

    launch_string = launch_stream.str();

    g_print("Using launch string: %s\n", launch_string.c_str());

    GError *error = nullptr;
    gst_pipeline  = (GstPipeline*) gst_parse_launch(launch_string.c_str(), &error);

    if (gst_pipeline == nullptr) {
        g_print( "Failed to parse launch: %s\n", error->message);
        return -1;
    }
    if(error) g_error_free(error);

    gst_element_set_state((GstElement*)gst_pipeline, GST_STATE_PLAYING); 

    g_usleep(15*usec);

    GstElement* src = gst_bin_get_by_name(GST_BIN(gst_pipeline), "mysource");
    gst_element_send_event (src, gst_event_new_eos ());
    // Wait for EOS message
    GstBus *bus = gst_pipeline_get_bus(GST_PIPELINE(gst_pipeline));
    gst_bus_poll(bus, GST_MESSAGE_EOS, GST_CLOCK_TIME_NONE);

    gst_element_set_state((GstElement*)gst_pipeline, GST_STATE_NULL);
    gst_object_unref(GST_OBJECT(gst_pipeline));
    g_main_loop_unref(main_loop);

    g_print("going to exit \n");
    return 0;
}

Build command:

$ g++ -Wall -std=c++11  test.cpp -o test $(pkg-config --cflags --libs gstreamer-app-1.0)

We run it with Logitech C615 which supports 640x480p30 YUYV.

Please refer to the FAQ
Q: I have a USB camera. How can I launch it on Jetson Nano?
and configure the supported mode as video/x-raw,width,height,framerate,format.

Hi,

I have tried the above test code and it works, but in my own code I am getting errors. I am attaching the related files; please let me know where I am making a mistake.

#include "main.h"
#include "Video_src.h"
#include "Read_config.h"
/******************************************************************************
 * Global variables
 ******************************************************************************/
static GMainLoop *loop;
static GstElement *pipeline;
static GstElement *video_source, *video_convert, *caps_filter, *video_encoder, *video_pay, *video_sink;
static GstBus *bus;
static GstCaps *caps = NULL;
gboolean Vid_cap = TRUE;
gint output_width;
gint output_height;
my_struct_t my_structs[1];
void get_output_resolution()
{
    /* Supported output resolutions, matched against the configured string */
    static const struct { const char *name; gint w; gint h; } resolutions[] = {
        {"1600x1200", 1600, 1200}, {"640x480",   640,  480},
        {"800x600",    800,  600}, {"1024x768", 1024,  768},
        {"1152x864",  1152,  864}, {"1280x960", 1280,  960},
        {"640x360",    640,  360}, {"1280x720", 1280,  720},
        {"1600x960",  1600,  960}, {"1920x1080", 1920, 1080},
        {"960x600",    960,  600}, {"1280x800", 1280,  800},
        {"1440x920",  1440,  920}, {"1680x1050", 1680, 1050},
        {"1920x1200", 1920, 1200},
    };
    for (gsize i = 0; i < G_N_ELEMENTS(resolutions); i++)
    {
        if (strcmp(resolutions[i].name, my_structs[0].output_fram_size) == 0)
        {
            output_width = resolutions[i].w;
            output_height = resolutions[i].h;
            return;
        }
    }
}
void set_encoder_bitrate()
{
    g_object_set (G_OBJECT(video_encoder), "bitrate", 4000000, NULL);
}
void set_encoder_rate_control_mode ()
{
    guint mode_id = 0;
    const gchar *mode_name = "VBR";  /* default matches mode_id = 0 */
    if(strcmp("Strong", my_structs[0].Rate_control_mode) == 0)
    {
        mode_id = 1;
        mode_name = "CBR";
    }
    else if(strcmp("Week", my_structs[0].Rate_control_mode) == 0)
    {
        mode_id = 0;
        mode_name = "VBR";
    }
    else
    {
        printf("Invalid value for Rate Control Mode\n\r");
    }
    g_object_set(G_OBJECT(video_encoder), "control-rate", mode_id, NULL);
    g_print("Encoder Rate Control Mode set to = %s\n", mode_name);
}
void set_encoder_preset_level ()
{
    guint preset_id = 0;
    const gchar *preset_name = "";
    if(strcmp("High", my_structs[0].video_quality) == 0)
    {
        preset_id = 4;
        preset_name = "Slowpreset";
    }
    else if(strcmp("Medium", my_structs[0].video_quality) == 0)
    {
        preset_id = 3;
        preset_name = "Mediumpreset";
    }
    else if(strcmp("Low", my_structs[0].video_quality) == 0)
    {
        preset_id = 2;
        preset_name = "Fastpreset";
    }
    else if(strcmp("verylow", my_structs[0].video_quality) == 0)
    {
        preset_id = 1;
        preset_name = "Ultrafastpreset";
    }
    else
    {
        printf("Invalid value for Preset-level\n\r");
    }
    g_object_set(G_OBJECT(video_encoder), "preset-level", preset_id, NULL);
    g_print("Encoder Preset-level set to = %s\n", preset_name);
}
void set_H265_encoder_profile ()
{
    guint profile_id = 0;
    if(strcmp("Main", my_structs[0].profile) == 0)
    {
        profile_id = 0;
    }
    else if(strcmp("Main10", my_structs[0].profile) == 0)
    {
        profile_id = 1;
    }
    else
    {
        printf("Invalid value for Profile\n\r");
    }
    g_object_set(G_OBJECT(video_encoder), "profile", profile_id, NULL);
}
void set_H264_encoder_profile ()
{
    guint profile_id = 0;
    const gchar *profile_name = "";
    if(strcmp("BP", my_structs[0].profile) == 0)
    {
        profile_id = 0;
        profile_name = "Baseline";
    }
    else if(strcmp("MP", my_structs[0].profile) == 0)
    {
        profile_id = 2;
        profile_name = "Main";
    }
    else if(strcmp("HP", my_structs[0].profile) == 0)
    {
        profile_id = 4;
        profile_name = "High";
    }
    else
    {
        printf("Invalid value for Profile\n\r");
    }
    g_object_set(G_OBJECT(video_encoder), "profile", profile_id, NULL);
    g_print("Encoder Profile set to = %s\n", profile_name);
}
gboolean get_video_encoder_end_payloader ()
{
    if(strcmp("H264", my_structs[0].Video_codec) == 0)
    {
        video_encoder = gst_element_factory_make (NVGST_PRIMARY_H264_VENC, "video_encoder");
        /* Set encoder properties */
#if 0
        set_H264_encoder_profile();
        set_encoder_preset_level();
        g_object_set(G_OBJECT(video_encoder), "maxperf-enable", TRUE, NULL);
        g_object_set(G_OBJECT(video_encoder), "insert-sps-pps", TRUE, NULL);
#endif
        video_pay = gst_element_factory_make (NVGST_PRIMARY_H264_RTP_PAYLOADER, "video_pay");
    }
    else if(strcmp("H265", my_structs[0].Video_codec) == 0)
    {
        video_encoder = gst_element_factory_make (NVGST_PRIMARY_H265_VENC, "video_encoder");
        set_H265_encoder_profile();
        set_encoder_preset_level();
        g_object_set(G_OBJECT(video_encoder), "maxperf-enable", TRUE, NULL);
        g_object_set(G_OBJECT(video_encoder), "insert-sps-pps", TRUE, NULL);
        video_pay = gst_element_factory_make (NVGST_PRIMARY_H265_RTP_PAYLOADER, "video_pay");
    }
    if(!video_encoder)
    {
        return FALSE;
    }
    return TRUE;
}
gboolean message_cb_video (GstBus * bus, GstMessage * message, gpointer user_data)
{
    switch (GST_MESSAGE_TYPE (message)) {
    case GST_MESSAGE_ERROR:
    {
        GError *err = NULL;
        gchar *name, *debug = NULL;
        name = gst_object_get_path_string (message->src);
        gst_message_parse_error (message, &err, &debug);
        g_printerr ("ERROR: from element %s: %s\n", name, err->message);
        if (debug != NULL)
            g_printerr ("Additional debug info:\n%s\n", debug);
        g_error_free (err);
        g_free (debug);
        g_free (name);
        g_main_loop_quit (loop);
        break;
    }
    case GST_MESSAGE_WARNING:
    {
        GError *err = NULL;
        gchar *name, *debug = NULL;
        name = gst_object_get_path_string (message->src);
        gst_message_parse_warning (message, &err, &debug);
        g_printerr ("WARNING: from element %s: %s\n", name, err->message);
        if (debug != NULL)
            g_printerr ("Additional debug info:\n%s\n", debug);
        g_error_free (err);
        g_free (debug);
        g_free (name);
        break;
    }
    case GST_MESSAGE_EOS:
    {
        g_print ("Got EOS\n");
        g_main_loop_quit (loop);
        gst_element_set_state (pipeline, GST_STATE_NULL);
        g_main_loop_unref (loop);
        gst_object_unref (GST_OBJECT(pipeline));
        break;
    }
    default:
        break;
    }
    return TRUE;
}
void stopVideo_capture()
{
    g_print("stop video capture\n");
    gst_element_send_event(pipeline, gst_event_new_eos());
    Vid_cap = FALSE;
}
void *Video_shmsink(void *vargp)
{
    /* Create the elements */
    video_source = gst_element_factory_make (NVGST_VIDEO_CAPTURE_SRC_V4L2, "video_source");
    video_convert = gst_element_factory_make (NVGST_DEFAULT_VIDEO_CONVERTER, "video_convert");
    caps_filter = gst_element_factory_make (NVGST_DEFAULT_CAPTURE_FILTER, "caps_filter");
    if (!get_video_encoder_end_payloader ())
    {
        printf("Video encoder element could not be created.\n");
    }
    video_sink = gst_element_factory_make (NVGST_DEFAULT_SHM_SINK, "video_sink");
    // Create the empty pipeline
    pipeline = gst_pipeline_new ("test-pipeline");
    if (!pipeline || !video_source || !video_convert || !caps_filter || !video_encoder || !video_pay || !video_sink)
    {
        g_printerr ("Not all elements could be created.\n");
        return NULL;
    }
    get_output_resolution();
    printf("width = %d\n\r", output_width);
    printf("Height = %d\n\r", output_height);
    /*
    caps = gst_caps_new_simple ("video/x-raw",
            "width", G_TYPE_INT, output_width,
            "height", G_TYPE_INT, output_height,
            "format", G_TYPE_STRING, NVGST_DEFAULT_CAPTURE_FORMAT,
            "framerate", GST_TYPE_FRACTION, 30, 1,
            NULL);
    */
    caps = gst_caps_from_string("video/x-raw(memory:NVMM), format=I420, width=640, height=480, framerate=30/1");
    // Configure elements
    g_object_set(GST_BIN(pipeline), "message-forward", TRUE, NULL);
    g_object_set (G_OBJECT(video_source), "device", NVGST_DEFAULT_VIDCAP_DEVICE, NULL);
    g_object_set (G_OBJECT(video_source), "do-timestamp", TRUE, NULL);
    g_object_set (G_OBJECT(video_sink), "socket-path", NVGST_DEFAULT_SHMSINK_SOCKET_PATH, NULL);
    g_object_set (G_OBJECT(video_sink), "shm-size", NVGST_DEFAULT_SHMSINK_SHM_SIZE, NULL);
    g_object_set (G_OBJECT(video_sink), "wait-for-connection", FALSE, NULL);
    g_object_set (G_OBJECT(caps_filter), "caps", caps, NULL);
    gst_caps_unref (caps);
    // Link all elements that can be automatically linked because they have "Always" pads
    gst_bin_add_many(GST_BIN (pipeline), video_source, video_convert, caps_filter, video_encoder, video_pay, video_sink, NULL);
    if (gst_element_link_many (video_source, video_convert, caps_filter, video_encoder, video_pay, video_sink, NULL) != TRUE)
    {
        g_printerr ("Elements could not be linked.\n");
        gst_object_unref (pipeline);
        return NULL;
    }
    loop = g_main_loop_new(NULL, FALSE);
    // Wait until error or EOS
    bus = gst_element_get_bus (pipeline);
    gst_bus_add_signal_watch(bus);
    g_signal_connect(G_OBJECT(bus), "message", G_CALLBACK(message_cb_video), NULL);
    gst_object_unref(GST_OBJECT(bus));
    // Start playing the pipeline
    gst_element_set_state (pipeline, GST_STATE_PLAYING);
    g_print("Starting video loop\n\r");
    g_main_loop_run(loop);
    return NULL;
}

Header file:

#ifndef VIDEOSRC_H
#define VIDEOSRC_H
#include <gst/gst.h>
/* CAPTURE GENERIC */
#define NVGST_DEFAULT_VIDCAP_DEVICE "/dev/video0"
#define NVGST_DEFAULT_SHMSINK_SOCKET_PATH "/tmp/test3"
#define NVGST_DEFAULT_SHMSINK_SHM_SIZE 20000000
#define NVGST_VIDEO_CAPTURE_SRC_TEST "videotestsrc"
#define NVGST_VIDEO_CAPTURE_SRC_V4L2 "v4l2src"
#define NVGST_VIDEO_CAPTURE_SRC_ARGUS "nvarguscamerasrc"
#define NVGST_DEFAULT_CAPTURE_FORMAT "I420"
#define NVGST_DEFAULT_VIDEO_CONVERTER "nvvideoconvert"
#define NVGST_DEFAULT_VIDEO_SCALE "videoscale"
#define NVGST_DEFAULT_VIDEO_RATE "videorate"
#define NVGST_DEFAULT_CAPTURE_FILTER "capsfilter"
#define NVGST_PRIMARY_H264_VENC "nvv4l2h264enc"
#define NVGST_PRIMARY_H265_VENC "nvv4l2h265enc"
#define NVGST_PRIMARY_H264_RTP_PAYLOADER "rtph264pay"
#define NVGST_PRIMARY_H265_RTP_PAYLOADER "rtph265pay"
#define NVGST_DEFAULT_SHM_SINK "shmsink"
#define NVGST_PRIMARY_QUEUE "queue"
/* functions */
gboolean message_cb_video (GstBus * bus, GstMessage * message, gpointer user_data);
void stopVideo_capture();
void *Video_shmsink(void *vargp);
void set_encoder_bitrate();
void set_encoder_rate_control_mode ();
void set_encoder_preset_level ();
void set_H265_encoder_profile ();
void set_H264_encoder_profile ();
gboolean get_video_encoder_end_payloader ();
/******************************************************************************/
#endif /* VIDEOSRC_H */

Regards,
Vikas

Hi,
The sample is based on gst_parse_launch(), which can launch a GStreamer pipeline from a description string. We suggest you check whether you can apply this approach to your case.