"Failed to create CaptureSession" Iot Edge

Please provide complete information as applicable to your setup.

• Jetson Nano 4GB
• DeepStream SDK 5.1 IoT Edge Marketplace module (marketplace.azurecr.io/nvidia/deepstream51-l4t)
• JetPack 4.5.1

I have been following this guide: Azure-IoT-Edge-on-a-NVIDIA-Jetson-Nano. I did manage to get it working with my Custom Vision model and a CSI camera (Raspberry Pi Camera Module v2.1), for a while at least.

After 4-5 restarts of the NVIDIADeepStreamSDK module (I was trying out different config parameters), the module stopped working and gave the following error:

(deepstream-test5-app:1): GLib-GObject-CRITICAL **: 18:33:28.456: g_object_get: assertion 'G_IS_OBJECT (object)' failed

I also get this error:

Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:656 Failed to create CaptureSession

I guess this means the module is not able to access the camera.

I have tried removing the modules, redeploying them, rebooting, and so on, but I cannot get it to work. I have also tried restarting Docker and the nvargus daemon:

sudo systemctl restart nvargus-daemon.service

I feel there should be an easy solution, i.e. kill whichever processes are occupying the camera, but I can't figure out exactly which ones.
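For reference, this is the kind of thing I have in mind (only a sketch; I am assuming the CSI sensor shows up as /dev/video0 and that the module container is named NVIDIADeepStreamSDK, so adjust as needed):

# list any process that still has the camera device open (device node assumed)
sudo fuser -v /dev/video0
# look for a lingering argus capture process
ps aux | grep -i argus
# restart the capture daemon and the DeepStream container (container name assumed)
sudo systemctl restart nvargus-daemon.service
sudo docker restart NVIDIADeepStreamSDK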

Happy for any advice!

  1. Could you please check basic camera functionality first?
    You can run a gst pipeline with the nvarguscamerasrc plugin,
    for example: $ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1,format=NV12' ! nvoverlaysink -ev
    (a headless variant is sketched just after this list)
  2. Which configuration file are you testing? Did you modify it? If yes, please share the whole file.
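If no display is attached to the Nano when you run this (for example over SSH), a rough variant that simply discards the frames can still verify that capture itself works; this is just a sketch, not from the original guide:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 num-buffers=300 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1,format=NV12' ! fakesink -ev

If this runs to completion without the "Failed to create CaptureSession" error, the sensor and nvargus-daemon are likely fine and the problem is more likely on the container side.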

Regarding your first point: I will have to get back to you, since the SD card seems to have become corrupted and the Nano will no longer boot. I will try to reproduce the problem.

Regarding your second point:

I mostly used the same config files as those in the GitHub repository. However, to add a CSI camera and use my own Custom Vision model, I made the following changes:

  • Added my own model files in the custom_models folder:

    model.onnx
    labels.txt

  • I then modified these two files to point to my new model

    test5_config_file_src_infer_azure_iotedge_edited.txt
    msgconv_config_soda_cans.txt

Here is the new test5_config_file_src_infer_azure_iotedge_edited.txt file, where I added the CSI camera source and disabled the other sources:


[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=0
rows=2
columns=2
width=1280
height=720
gpu-id=0
#(0): nvbuf-mem-default - Default memory allocated, specific to particular platform
#(1): nvbuf-mem-cuda-pinned - Allocate Pinned/Host cuda memory, applicable for Tesla
#(2): nvbuf-mem-cuda-device - Allocate Device cuda memory, applicable for Tesla
#(3): nvbuf-mem-cuda-unified - Allocate Unified cuda memory, applicable for Tesla
#(4): nvbuf-mem-surface-array - Allocate Surface Array memory, applicable for Jetson
nvbuf-memory-type=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP 5=Camera(CSI) (Jetson only)
type=5
intra-decode-enable=1
gpu-id=0
camera-id=0
camera-width=1920
camera-height=1080
camera-fps-n=30
camera-fps-d=1
camera-csi-sensor-id=0
drop-frame-interval=0
num-sources=1
nvbuf-memory-type=0

[source1]
enable=0
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=2
camera-id=1
uri=file://../custom_streams/cam-cans-01.mp4
num-sources=1
gpu-id=0
nvbuf-memory-type=0

[source2]
enable=0
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=2
camera-id=2
uri=file://../custom_streams/cam-cans-02.mp4
num-sources=1
gpu-id=0
nvbuf-memory-type=0

[sink0]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File
type=1
sync=1
source-id=0
gpu-id=0
nvbuf-memory-type=0

[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM   - Custom schema payload
msg-conv-payload-type=1
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_azure_edge_proto.so
topic=mytopic
#Optional:
#msg-broker-config=../../../../libs/azure_protocol_adaptor/module_client/cfg_azure.txt

[sink2]
enable=0
type=3
#1=mp4 2=mkv
container=1
#1=h264 2=h265 3=mpeg4
## only SW mpeg4 is supported right now.
codec=3
sync=1
bitrate=2000000
output-file=out.mp4
source-id=0

[sink3]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
#1=h264 2=h265
codec=1
sync=0
bitrate=4000000
# set below properties in case of RTSPStreaming
rtsp-port=8554
udp-port=5400

# sink type = 6 by default creates msg converter + broker.
# To use multiple brokers use this group for converter and use
# sink type = 6 with disable-msgconv = 1
[message-converter]
enable=0
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM   - Custom schema payload
msg-conv-payload-type=0
# Name of library having custom implementation.
#msg-conv-msg2p-lib=<val>
# Id of component in case only selected message to parse.
#msg-conv-comp-id=<val>

[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Arial
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
##Boolean property to inform muxer that sources are live
live-source=1
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
## Set muxer output width and height
width=1920
height=1080
##Enable to maintain aspect ratio wrt source, and allow black borders, works
##along with width, height properties
enable-padding=0
nvbuf-memory-type=0
## If set to TRUE, system timestamp will be attached as ntp timestamp
## If set to FALSE, ntp timestamp from rtspsrc, if available, will be attached
# attach-sys-ts-as-ntp=1

[primary-gie]
enable=1
gpu-id=0
model-engine-file=../custom_models/model.onnx_b3_fp32.engine
config-file=config_infer_custom_vision.txt
batch-size=1
## 0=FP32, 1=INT8, 2=FP16 mode
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;1;1;1
bbox-border-color3=0;1;0;1
nvbuf-memory-type=0
interval=2
gie-unique-id=1

[tracker]
enable=1
tracker-width=680
tracker-height=272
ll-lib-file=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_mot_klt.so
#ll-config-file required for DCF/IOU only
#ll-config-file=tracker_config.yml
#ll-config-file=iou_config.txt
gpu-id=0
#enable-batch-process applicable to DCF only
enable-batch-process=0

[tests]
file-loop=1

As noted above, I also modified test5_config_file_src_infer_azure_iotedge_edited.txt to point to my new model.

As I mentioned, this worked fine in the beginning: I had a camera RTSP feed going and the model was working properly. But after a few restarts it stopped working.

  1. Could you share the whole log after deepstream-app fails? Thanks!
  2. Regarding "Failed to create CaptureSession": it should be a camera issue. Please use the command above to test the camera, and please refer to this topic for help: topic1.

Running the command

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1,format=NV12' ! nvoverlaysink -ev

yields the following:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstNvOverlaySink-nvoverlaysink:nvoverlaysink-nvoverlaysink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3264 x 2464 FR = 21,000000 fps Duration = 47619048 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3264 x 1848 FR = 28,000001 fps Duration = 35714284 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29,999999 fps Duration = 33333334 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1640 x 1232 FR = 29,999999 fps Duration = 33333334 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59,999999 fps Duration = 16666667 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120,000005 fps Duration = 8333333 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0
   Camera mode  = 2
   Output Stream W = 1920 H = 1080
   seconds to Run    = 0
   Frame Rate = 29,999999
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.

Here are the logs from the module. Note that the module never fails; it just keeps spamming this error message:

(deepstream-test5-app:1): GLib-GObject-WARNING **: 08:45:30.123: g_object_set_is_valid_property: object class 'GstNvArgusCameraSrc' has no property named 'maxperf'

 *** DeepStream: Launched RTSP Streaming at rtsp://localhost:8554/ds-test ***


(deepstream-test5-app:1): GLib-CRITICAL **: 08:45:30.272: g_strrstr: assertion 'haystack != NULL' failed
ERROR: Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream-test5/custom_configs/../custom_models/cans-model.onnx_b3_fp32.engine open error
0:00:07.847589048     1       0xdfc600 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1691> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream-test5/custom_configs/../custom_models/cans-model.onnx_b3_fp32.engine failed
0:00:07.847676602     1       0xdfc600 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1798> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream-test5/custom_configs/../custom_models/cans-model.onnx_b3_fp32.engine failed, try rebuild
0:00:07.847700925     1       0xdfc600 INFO                 nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1716> [UID = 1]: Trying to create engine from model files
nvds_msgapi_connect : connect success
Opening in BLOCKING MODE
Opening in BLOCKING MODE 
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_mot_klt.so
gstnvtracker: Optional NvMOT_RemoveStreams not implemented
gstnvtracker: Batch processing is OFF
gstnvtracker: Past frame output is OFF
----------------------------------------------------------------
Input filename:   /opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream-test5/custom_models/model.onnx
ONNX IR version:  0.0.3
Opset version:    7
Producer name:    
Producer version: 
Domain:           onnxml
Model version:    0
Doc string:       
----------------------------------------------------------------
0:01:05.353716387     1       0xdfc600 INFO                 nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1749> [UID = 1]: serialize cuda engine to file: /opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream-test5/custom_models/model.onnx_b1_gpu0_fp32.engine successfully
0:01:05.967733715     1       0xdfc600 INFO              nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream-test5/custom_configs/config_infer_custom_vision.txt sucessfully
INFO: [TRT]: Some tactics do not have sufficient workspace memory to run. Increasing workspace size may increase performance, please check verbose output.
INFO: [TRT]: Detected 1 inputs and 1 output network tensors.
INFO: [Implicit Engine Info]: layers num: 2
0   INPUT  kFLOAT data            3x416x416       
1   OUTPUT kFLOAT model_outputs0  40x13x13        


Runtime commands:
	h: Print this help
	q: Quit

	p: Pause
	r: Resume


**PERF:  FPS 0 (Avg)	
Thu May 11 08:46:33 2023
**PERF:  0.00 (0.00)	
** INFO: <bus_callback:181>: Pipeline ready

** INFO: <bus_callback:167>: Pipeline running



(deepstream-test5-app:1): GLib-GObject-CRITICAL **: 08:46:33.524: g_object_get: assertion 'G_IS_OBJECT (object)' failed

(deepstream-test5-app:1): GLib-GObject-CRITICAL **: 08:46:33.564: g_object_get: assertion 'G_IS_OBJECT (object)' failed
NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4

[... the same GLib-GObject-CRITICAL message repeats roughly every 40 ms ...]

GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3264 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;
GST_ARGUS: 3264 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;
GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;
GST_ARGUS: 1640 x 1232 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;
GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;
GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;
GST_ARGUS: Running with following settings:
   Camera index = 0
   Camera mode  = 2
   Output Stream W = 1920 H = 1080
   seconds to Run    = 0
   Frame Rate = 29.999999
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.

[... the GLib-GObject-CRITICAL message keeps repeating indefinitely ...]

I should also mention that my system is extremely unstable. I had to flash the SD card 2-3 times before I got it to boot properly, and I have tried two different SD cards; both are unstable.

From the two logs, there is no error like "Failed to create CaptureSession". Can you see the output video after running gst-launch-1.0 and deepstream-test5-app? Please share the whole log if you hit the error again. Thanks!

Is this still a DeepStream issue to support? Could you provide more information?

  1. Can you try DeepStream 6.0.1? 5.1 is an old version.
  2. Could you elaborate on "after 4-5 restarts"? What did you do exactly?
  3. Can you monitor CPU/GPU memory usage while testing (see the sketch below)? Please refer to DeepStream SDK FAQ - #14 by mchi
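For the monitoring in point 3, something like this works on the Nano (tegrastats ships with JetPack; the container name NVIDIADeepStreamSDK is only an assumption, use whatever name your deployment gives the module):

# board-level CPU/GPU load and memory, sampled every second
sudo tegrastats --interval 1000
# per-container CPU and memory usage of the DeepStream module
sudo docker stats NVIDIADeepStreamSDK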

After trying a bunch of things, I think I finally found a solution.

Setting enable=1 under [tiled-display] made it stable. I have no idea why. I had initially disabled it since I only have one source, but I can achieve the same result by setting rows=1 and columns=1:

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720
gpu-id=0
#(0): nvbuf-mem-default - Default memory allocated, specific to particular platform
#(1): nvbuf-mem-cuda-pinned - Allocate Pinned/Host cuda memory, applicable for Tesla
#(2): nvbuf-mem-cuda-device - Allocate Device cuda memory, applicable for Tesla
#(3): nvbuf-mem-cuda-unified - Allocate Unified cuda memory, applicable for Tesla
#(4): nvbuf-mem-surface-array - Allocate Surface Array memory, applicable for Jetson
nvbuf-memory-type=0
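As a sanity check that the pipeline now stays up, I watch the RTSP output from another machine on the network; this is just how I test it (ffplay here, VLC works as well), with the ds-test path taken from the module log above and <jetson-ip> to be replaced with the Nano's address:

ffplay rtsp://<jetson-ip>:8554/ds-test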

Thanks fanzh for your input! Hopefully this can be of help to someone else.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.