When you build the Kinesis Video Streams Producer SDK there is a CMake option to additionally build the kvssink GStreamer plugin. Details are on the GitHub repo. I have used it and it works well; you just add it to your pipeline like any other sink.
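For reference, a minimal build sketch (the exact flag and library paths may differ between SDK versions, so treat the paths below as placeholders and check the repo README):

```shell
# Build the producer SDK with the GStreamer plugin enabled
git clone https://github.com/awslabs/amazon-kinesis-video-streams-producer-sdk-cpp.git
cd amazon-kinesis-video-streams-producer-sdk-cpp
mkdir -p build && cd build
cmake .. -DBUILD_GSTREAMER_PLUGIN=TRUE
make

# Make the plugin and its dependencies visible (adjust to your build tree)
export GST_PLUGIN_PATH="$(pwd)"
export LD_LIBRARY_PATH="$(pwd):$LD_LIBRARY_PATH"

# Verify that GStreamer can load the element
gst-inspect-1.0 kvssink
```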
Note that full-time streaming to the cloud can be costly though. ;-)
Thanks for the reply. I was able to build the C++ SDK with CMake. Now that I have the .so file, I want to integrate it into the deepstream-app sample application. Could you elaborate, at a low level, how to add the .so file and wire it in as a sink? Please explain in more detail.
If you want to code against the SDK instead of using kvssink, check out the Amazon code examples in the repository; they provide sample code for the producer SDK.
Hi, I installed the kvssink library and exported GST_PLUGIN_PATH and LD_LIBRARY_PATH. I created sink3 (kvssink) with a queue - transform - caps - encoder - parse - sink chain. In my create_pipeline() function, the line below is failing:
NVGSTDS_ELEM_ADD_PROBE(latency_probe_id,
    pipeline->instance_bins[i].sink_bin.sub_bins[0].sink, "sink",
    latency_measurement_buf_prob, GST_PAD_PROBE_TYPE_BUFFER,
    appCtx);
I am getting this error: ** ERROR: <create_pipeline:1185>: Could not find 'sink' in 'sink_sub_bin_sink1'
** ERROR: <create_pipeline:1273>: create_pipeline failed
** ERROR: main:1406: Failed to create pipeline
Quitting
I added the following to the config file:
[sink3]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=7
#1=h264 2=h265
codec=1
#encoder type 0=Hardware 1=Software
enc-type=0
sync=1
bitrate=4000000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=0
Hi @muvva, I’m not sure of the problem. It looks like you are editing deepstream-app which is a very complex application. I have only used kvssink in my own applications (based on deepstream-test3) and it has worked fine.
Thanks. Could you explain in detail how you achieved this? Did you add a new sink? What did you specify in the config file? Please walk through the low-level details; I am completely stuck on how to implement this.
kvssink does not use a config file, just standard GStreamer properties. The best thing to do, I think, would be to start with the GStreamer website: read up on all the types of elements and how to string them together, and take a look at the tutorials.
Next step would be to read the source code for the deepstream sample applications, starting with deepstream-test1.
Once you have done that I think you will be able to answer all your questions.
To initially get kvssink to work you can just test on the command line with gst-launch-1.0.
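To illustrate, a minimal test pipeline in the style of the AWS docs (the device, stream name, and region below are placeholders):

```shell
# Set credentials in the environment first:
#   export AWS_ACCESS_KEY_ID=...
#   export AWS_SECRET_ACCESS_KEY=...
gst-launch-1.0 v4l2src device=/dev/video0 \
    ! videoconvert \
    ! video/x-raw,format=I420,width=640,height=480,framerate=30/1 \
    ! x264enc bframes=0 key-int-max=45 \
    ! video/x-h264,stream-format=avc,alignment=au \
    ! kvssink stream-name="MyKVStream" aws-region="us-west-2"
```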
That is just an example; please check the AWS documentation for all the options. Note that the AWS access credentials must first be set as environment variables (or passed as kvssink properties).
Thank you. I went through the GStreamer documentation and ran the command below, which works fine:

gst-launch-1.0 -v filesrc location="/home/sensable/Downloads/test8.mp4" ! qtdemux name=demux ! queue ! h264parse ! kvssink name=sink stream-name="MyKVStream" access-key="ccccccc" secret-key="xxxx"

I want to control the frame rate, so I ran the command below, but it does not work:

gst-launch-1.0 -v filesrc location="/home/sensable/Downloads/test8.mp4" ! videoconvert ! video/x-raw,format=I420,width=640,height=480,framerate=25/1 ! x264enc key-int-max=45 ! video/x-h264,stream-format=avc,alignment=au ! kvssink stream-name="MyKVStream" storage-size=128 access-key="ccccccc" secret-key="xxxxx"

For frame-rate control, do we need to add a converter-caps-encoder stage to the pipeline? If so, can you please give an example of how to convert a 30 fps .mp4 file to 25 fps for upload to kvssink?
I don't think a framerate cap works unless your source is an actual camera. I don't think it does anything if your source is a file, and I know it doesn't if your source is RTSP (where you control the frame rate at the source itself).
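If you do need to resample a 30 fps file to 25 fps, one approach (a sketch, untested here; the file path, stream name, and credentials are placeholders) is to decode, pass the raw video through a videorate element with a framerate cap, and re-encode before kvssink:

```shell
gst-launch-1.0 -v filesrc location="/path/to/input.mp4" \
    ! qtdemux ! h264parse ! avdec_h264 \
    ! videorate ! video/x-raw,framerate=25/1 \
    ! videoconvert \
    ! x264enc key-int-max=45 bframes=0 \
    ! video/x-h264,stream-format=avc,alignment=au \
    ! kvssink stream-name="MyKVStream" access-key="ccccccc" secret-key="xxxx"
```

videorate duplicates or drops frames to match the downstream caps, which is why the cap has to sit on raw video after decoding rather than on the original H.264 stream.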