Hi All,
RidgeRun has developed a GStreamer element that can easily be added to a GStreamer pipeline to detect movement in a desired region of the frames coming from the camera. The algorithm is camera-agnostic and can be used with any sensor in general. If you want to take a look at it, please check this video:
GStreamer Based Motion Detection | Motion Detection | RidgeRun
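For illustration, an element like this would typically be dropped into a pipeline like any other GStreamer element. The sketch below is hypothetical: the element name `motiondetect` and its region properties are placeholders, not the actual names of the RidgeRun element.

```shell
# Hypothetical sketch only: "motiondetect" and the roi-* properties are
# placeholder names, not taken from the RidgeRun product.
gst-launch-1.0 v4l2src device=/dev/video0 \
    ! videoconvert \
    ! motiondetect roi-x=0 roi-y=0 roi-width=320 roi-height=240 \
    ! videoconvert \
    ! autovideosink
```

The idea is simply that motion detection sits inline in the pipeline, so downstream elements (display, encoder, recorder) see the same frames.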
Regards,
-David
Hi,
I am referring to the Python example in the gtc-2020-demo source code.
When I change the stream source to anything other than the camera, I get an error.
Below is the code I am using:
import time

from pygstc.gstc import GstdClient  # GstD Python client (pygstc)
# PipelineEntity is the helper class defined in the gtc-2020-demo example

pipelines_base = []
pipelines_video_enc = []
pipelines_video_rec = []

# Create GstD Python client
client = GstdClient()

# Create camera pipelines
camera0 = PipelineEntity(client, 'camera0', 'videotestsrc pattern=snow ! video/x-raw,width=1280,height=720 ! interpipesink name=camera0 forward-events=true forward-eos=true sync=false')
pipelines_base.append(camera0)

camera0_rgba_nvmm = PipelineEntity(client, 'camera0_rgba_nvmm', 'interpipesrc listen-to=camera0 ! video/x-raw,format=YUY2,width=1280,height=720 ! videoconvert ! video/x-raw,format=NV12,width=1280,height=720 ! nvvideoconvert ! video/x-raw(memory:NVMM),format=RGBA,width=1280,height=720 ! queue ! interpipesink name=camera0_rgba_nvmm forward-events=true forward-eos=true sync=false caps=video/x-raw(memory:NVMM),format=RGBA,width=1280,height=720,pixel-aspect-ratio=1/1,interlace-mode=progressive,framerate=30/1')
pipelines_base.append(camera0_rgba_nvmm)

# Create encoding pipelines
h264 = PipelineEntity(client, 'h264', 'interpipesrc name=h264_src format=time listen-to=camera0_rgba_nvmm ! video/x-raw(memory:NVMM),format=RGBA,width=1280,height=720 ! nvvideoconvert ! nvv4l2h264enc ! interpipesink name=h264_sink forward-events=true forward-eos=true sync=false async=false enable-last-sample=false drop=true')
pipelines_video_enc.append(h264)

# Play base pipelines
for pipeline in pipelines_base:
    pipeline.play()

time.sleep(10)

# Set locations for video recordings
for pipeline in pipelines_video_rec:
    pipeline.set_file_location('test_' + pipeline._name + '_0.mkv')

# Play video encoding pipelines
for pipeline in pipelines_video_enc:
    pipeline.play()

time.sleep(20)
If I remove the encoding pipeline, it works, so creating an interpipesink is working fine. However, the interpipesrc created in the encoding pipeline does not work.
I get the error below:
0:00:33.451096787 22217 0x55b89729e0 ERROR gstdnoreader gstd_no_reader.c:77:gstd_no_reader_read:<GstdNoReader@0x7f7832b0c0> Unable to read from this resource
I am using a Jetson Nano for this, if that makes any difference.
I see you are already talking with one of our engineers on GitHub. Let’s continue the conversation there.