JETSON XAVIER NX
I have run into a problem; here is the situation:
Step 1: cd /usr/src/nvidia/tegra_multimedia_api/samples/backend
Step 2: ./backend 1 ../../data/Video/sample_outdoor_car_1080p_10fps.h264 H264 --trt-deployfile ../../data/Model/GoogleNet_one_class/GoogleNet_modified_oneClass_halfHD.prototxt --trt-modelfile ../../data/Model/GoogleNet_one_class/GoogleNet_modified_oneClass_halfHD.caffemodel --trt-mode fo32 --trt-proc-interval 1 -fps 10
then
Layer(Reformat): upsample-bbox output reformatter 0, Tactic: 0, upsample-bbox output to be reformatted 0[Half(4,132,240)] → bboxes[Float(4,132,240)]
Layer(h884cudnn): cvg/classifier, Tactic: 7158029511300006471, pool5/drop_s1[Half(1024,33,60)] → cvg/classifier[Half(16,33,60)]
Layer(Reformat): upsample-cvg input reformatter 0, Tactic: 0, cvg/classifier[Half(16,33,60)] → upsample-cvg reformatted input 0[Float(16,33,60)]
Layer(gemmDeconvolution): upsample-cvg, Tactic: 0, upsample-cvg reformatted input 0[Float(16,33,60)] → cvg/tile[Float(1,132,240)]
Layer(Activation): coverage/sig, Tactic: 0, cvg/tile[Float(1,132,240)] → coverage[Float(1,132,240)]
Create TRT model cache
Deserialize required 102348 microseconds.
outputDim c 1 w 240 h 132
outputDimsBBOX.c() 4 w 240 h 132
get resolution failed, program will exit
The program terminated without displaying any results. The failure seems to come from line 1645 of the sample's .cpp file, at the call get_disp_resolution(&disp_info); — does this mean it could not determine the width and height?
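For context, this is roughly what that call looks like to me (paraphrased from the sample sources, not copied verbatim from my file; the NvEglRenderer::getDisplayResolution call and the struct layout are my best understanding). If I read it right, the check depends on the attached display, not on the input video file:

```cpp
#include <cstdint>
#include <iostream>

#include "NvEglRenderer.h"   // Jetson Multimedia API EGL renderer

// Paraphrased from the backend sample; the exact struct/field names in my
// copy may differ slightly.
typedef struct
{
    uint32_t window_width;
    uint32_t window_height;
} display_resolution_t;

static int
get_disp_resolution(display_resolution_t *res)
{
    // Asks the EGL/X11 renderer for the resolution of the attached display.
    // My assumption: if no display is reachable (e.g. a headless or SSH
    // session without X), this call fails and the sample prints
    // "get resolution failed, program will exit".
    if (NvEglRenderer::getDisplayResolution(res->window_width,
                                            res->window_height) < 0)
    {
        std::cerr << "get resolution failed, program will exit" << std::endl;
        return -1;
    }
    return 0;
}
```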
In addition, when I run ./backend -h, the output is:
nvbuf_utils: Could not get EGL display connection
backend … [options]

Channel-num:
    1-4, Number of file arguments should exactly match the number of channels specified

Supported formats:
    H264
    H265

OPTIONS:
    -h,--help            Prints this text
    -fps                 Display rate in frames per second [Default = 30]
    --input-nalu         Input to the decoder will be nal units [Default]
    --input-chunks       Input to the decoder will be a chunk of bytes
    --trt-deployfile     set deploy file name
    --trt-modelfile      set model file name
    --trt-proc-interval  set process interval, 1 frame will be processed every trt-proc-interval
    --trt-mode           0 fp16 (if supported), 1 fp32, 2 int8
    --trt-dumpresult     1 to dump result, 0[default] otherwise
    --trt-enable-perf    1[default] to enable perf measurement, 0 otherwise
This is still unsolved; does anyone know how to fix it? Thank you!