I am trying to simultaneously compress four GigE Vision camera streams coming into the TX2 over Ethernet.
Each camera runs at around 5 Hz, and each stream is roughly 4 megapixels at 10 bits per pixel (monochrome).
I need the compression to be lossless or very near lossless.
Each compressed video stream will need to be written to disk as new frames are processed.
Is this possible?
I have seen the NVENC library, but I think it is only for x86 desktop PCs.
I have also looked at the Jetson MultiMedia API. Another possibility is GStreamer.
Which is the correct API to do what I want?
Does any example code exist? So far I have not seen any examples of lossless or monochrome video compression.
We’d like to compress 10-bit grayscale to lossless 10-bit HEVC (H.265) on a TX2.
I’d prefer to use FFmpeg rather than GStreamer, since the documentation seems much easier to understand. Do I understand correctly that only GStreamer is supported with NVENC on a TX2 (as compared with desktop/laptop GPUs, where there are more options)?
Could you suggest how we might set up a GStreamer pipeline to accomplish 10-bit lossless HEVC encoding? We can’t find documentation on using 10-bit channels with GStreamer, and gst-inspect seems to suggest that only 8-bit YUV color planes are supported as input. I hope we’re misunderstanding!
Thanks,
Seth Nickell
VP of Engineering @ Ceres Imaging
which results in close-to-lossless video compression. I am seeing maximum errors of around 1%
(i.e., the gray values are off by 1 to 2 counts) after compression is performed on the TX2.
Are there any other parameters I need to set in the context_t struct to enable fully lossless compression using the TX2 compression hardware block?
typedef struct
{
    NvVideoEncoder *enc;
    uint32_t encoder_pixfmt;
    char *in_file_path;
    std::ifstream *in_file;
    uint32_t width;
    uint32_t height;
    char *out_file_path;
    std::ofstream *out_file;
    char *ROI_Param_file_path;
    char *Recon_Ref_file_path;
    char *RPS_Param_file_path;
    char *hints_Param_file_path;
    char *GDR_Param_file_path;
    char *GDR_out_file_path;
    std::ifstream *roi_Param_file;
    std::ifstream *recon_Ref_file;
    std::ifstream *rps_Param_file;
    std::ifstream *hints_Param_file;
    std::ifstream *gdr_Param_file;
    std::ofstream *gdr_out_file;
    uint32_t bitrate;
    uint32_t profile;
    enum v4l2_mpeg_video_bitrate_mode ratecontrol;
    uint32_t iframe_interval;
    uint32_t idr_interval;
    enum v4l2_mpeg_video_h264_level level;
    uint32_t fps_n;
    uint32_t fps_d;
    uint32_t gdr_start_frame_number; /* Frame number at which GDR is to be started */
    uint32_t gdr_num_frames;         /* Number of frames over which GDR is applied */
    uint32_t gdr_out_frame_number;   /* Frame number from which encoded buffers are to be dumped */
    enum v4l2_enc_temporal_tradeoff_level_type temporal_tradeoff_level;
    enum v4l2_enc_hw_preset_type hw_preset_type;
    v4l2_enc_slice_length_type slice_length_type;
    uint32_t slice_length;
    uint32_t virtual_buffer_size;
    uint32_t num_reference_frames;
    uint32_t slice_intrarefresh_interval;
    uint32_t num_b_frames;
    uint32_t nMinQpI;                /* Minimum QP value to use for I frames */
    uint32_t nMaxQpI;                /* Maximum QP value to use for I frames */
    uint32_t nMinQpP;                /* Minimum QP value to use for P frames */
    uint32_t nMaxQpP;                /* Maximum QP value to use for P frames */
    uint32_t nMinQpB;                /* Minimum QP value to use for B frames */
    uint32_t nMaxQpB;                /* Maximum QP value to use for B frames */
    uint32_t sMaxQp;                 /* Session maximum QP value */
    bool insert_sps_pps_at_idr;
    bool report_metadata;
    bool input_metadata;
    bool dump_mv;
    bool externalRPS;
    bool enableGDR;
    bool bGapsInFrameNumAllowed;
    bool bnoIframe;
    uint32_t nH264FrameNumBits;
    uint32_t nH265PocLsbBits;
    bool externalRCHints;
    bool enableROI;
    bool use_gold_crc;
    char gold_crc[20];
    Crc *pBitStreamCrc;
    bool bReconCrc;
    uint32_t rl;                     /* Reconstructed surface left coordinate */
    uint32_t rt;                     /* Reconstructed surface top coordinate */
    uint32_t rw;                     /* Reconstructed surface width */
    uint32_t rh;                     /* Reconstructed surface height */
    std::stringstream *runtime_params_str;
    uint32_t next_param_change_frame;
    bool got_error;
} context_t;
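A common first step toward lossless operation with this sample is pinning every QP bound in the struct above to zero, so the rate controller can never quantize more coarsely. The sketch below uses a trimmed stand-in for the struct (only the QP-related fields, so it compiles without the Jetson headers); in the real sample these values reach the hardware through the encoder's QP-range setup. Note this is an assumption-laden sketch: QP=0 is near-lossless, but bit-exact lossless HEVC requires transquant bypass, and whether the TX2 encoder block exposes that is precisely the open question in the post.

```cpp
#include <cstdint>

/* Trimmed stand-in for the context_t above, holding only the fields this
 * sketch touches. In the real sample these come from the Multimedia API. */
struct enc_context {
    std::uint32_t nMinQpI, nMaxQpI; /* QP bounds for I frames */
    std::uint32_t nMinQpP, nMaxQpP; /* QP bounds for P frames */
    std::uint32_t nMinQpB, nMaxQpB; /* QP bounds for B frames */
    std::uint32_t num_b_frames;     /* Number of B frames per GOP */
};

/* Clamp every QP range to 0 so rate control cannot raise quantization,
 * and drop B frames (they are unneeded at 5 Hz and only add reordering
 * latency). */
void pin_qp_for_near_lossless(enc_context &ctx) {
    ctx.nMinQpI = ctx.nMaxQpI = 0;
    ctx.nMinQpP = ctx.nMaxQpP = 0;
    ctx.nMinQpB = ctx.nMaxQpB = 0;
    ctx.num_b_frames = 0;
}
```

With the real struct, the same six values would be handed to the encoder's QP-range setting before streaming starts. The 1-to-2-count residual error reported above is consistent with QP=0 encoding that still lacks transquant bypass.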