I am trying to stream a VideoCapture feed over the network. I used FastAPI and Uvicorn for this and it worked well, but now I am moving to a wireless network and it can't handle the stream: I'm getting 2-3 fps with about 5 seconds of lag. I read that GStreamer is the best way to stream the frames, although I will need a decoder on the receiving end of the stream.
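For the sender side, here is a minimal sketch of the kind of pipeline that gets H.264 over RTP/UDP out of OpenCV. The host, port, and x264enc settings are placeholders I picked for illustration, and it assumes an OpenCV build with GStreamer support:

```python
# Build a GStreamer sender pipeline string for cv2.VideoWriter.
# HOST/PORT and the encoder settings below are assumptions; tune the
# bitrate for your wireless link.
# gst-launch equivalent (handy for testing without OpenCV):
#   gst-launch-1.0 videotestsrc ! videoconvert ! \
#       x264enc tune=zerolatency bitrate=1000 speed-preset=ultrafast ! \
#       rtph264pay config-interval=1 pt=96 ! \
#       udpsink host=192.168.1.10 port=5000

def make_sender_pipeline(host: str, port: int, bitrate_kbps: int = 1000) -> str:
    """Pipeline string for cv2.VideoWriter(..., cv2.CAP_GSTREAMER, 0, fps, (w, h))."""
    return (
        "appsrc ! videoconvert ! "
        f"x264enc tune=zerolatency bitrate={bitrate_kbps} speed-preset=ultrafast ! "
        "rtph264pay config-interval=1 pt=96 ! "
        f"udpsink host={host} port={port}"
    )

# Usage with OpenCV (needs a GStreamer-enabled build):
#   import cv2
#   out = cv2.VideoWriter(make_sender_pipeline("192.168.1.10", 5000),
#                         cv2.CAP_GSTREAMER, 0, 30.0, (640, 480))
#   out.write(frame)  # frame: 640x480 BGR numpy array
```

`tune=zerolatency` plus a modest bitrate is what usually makes H.264 usable on a weak wireless link compared to sending raw JPEG frames over HTTP.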
I am receiving in Python to see if my stream even works, and it doesn't seem to; or perhaps it's my receiving pipeline that doesn't work.
In the end I'll need a Node server that will receive the stream, decode it, and show it with HTML.
I would really appreciate any help, as I am new to GStreamer and I can't seem to find anything about receiving and decoding the stream.
But when using the pipeline in OpenCV, the program gets stuck on VideoCapture.
Maybe the pipeline isn't right; I tried a few pipelines in OpenCV and none of them worked.
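For reference, the receive side would look roughly like this. It's a sketch that assumes the sender pays out H.264 as RTP payload type 96; the port is a placeholder, and the caps string has to match what the sender actually produces:

```python
# Build a GStreamer receive pipeline string for cv2.VideoCapture.
# Assumes RTP/H.264 on UDP; the caps must match the sender's payloader.
# gst-launch equivalent (good for checking the stream arrives at all):
#   gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp, media=video, \
#       clock-rate=90000, encoding-name=H264, payload=96" ! \
#       rtph264depay ! avdec_h264 ! videoconvert ! autovideosink

def make_receiver_pipeline(port: int) -> str:
    """Pipeline string for cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)."""
    caps = ("application/x-rtp, media=video, clock-rate=90000, "
            "encoding-name=H264, payload=96")
    return (
        f"udpsrc port={port} caps=\"{caps}\" ! "
        "rtph264depay ! avdec_h264 ! videoconvert ! "
        "video/x-raw, format=BGR ! appsink drop=1"
    )

# Usage:
#   import cv2
#   cap = cv2.VideoCapture(make_receiver_pipeline(5000), cv2.CAP_GSTREAMER)
#   ok, frame = cap.read()
```

Note the explicit `video/x-raw, format=BGR` caps before `appsink`: OpenCV consumes BGR frames, and without that filter the capture can open but never deliver a frame.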
In the end I am trying to restream the received frames through an HTML server. I want to use OpenCV to do so, but I could also stream directly from gst-launch; I'm just not sure how. The HTML page takes the frames from the web.
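For the restream-to-HTML part, one common option is multipart MJPEG: each decoded frame is JPEG-encoded and written as one part of a `multipart/x-mixed-replace` response that a plain `<img>` tag can display. A sketch of just the framing (the boundary name is arbitrary, and the HTTP server itself is left out):

```python
BOUNDARY = b"frame"  # arbitrary boundary token, must match the Content-Type header

def mjpeg_part(jpeg_bytes: bytes) -> bytes:
    """Wrap one JPEG-encoded frame as a multipart/x-mixed-replace part.

    A browser pointed at an endpoint whose response header is
    "Content-Type: multipart/x-mixed-replace; boundary=frame" replaces
    the displayed image with each new part as it arrives.
    """
    return (
        b"--" + BOUNDARY + b"\r\n"
        b"Content-Type: image/jpeg\r\n"
        b"Content-Length: " + str(len(jpeg_bytes)).encode() + b"\r\n\r\n"
        + jpeg_bytes + b"\r\n"
    )

# With OpenCV you would produce jpeg_bytes per frame like:
#   ok, buf = cv2.imencode(".jpg", frame)
#   chunk = mjpeg_part(buf.tobytes())  # write chunk to the HTTP response
```

This is the same mechanism a FastAPI `StreamingResponse` would carry, so it fits either a Python server or a Node one that just forwards the parts.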
On a Linux system, to put a GStreamer pipeline in cv2.VideoCapture() you would need to build OpenCV with -D WITH_GSTREAMER=ON. Not sure, but it may be the same on a Windows system; other users would need to share their experience there.
I thought nvvidconv was for any NVIDIA hardware.
Regardless, I removed nvvidconv and it still got stuck on VideoCapture.
I removed videoconvert as well, and it didn't get stuck on VideoCapture but didn't manage to capture anything. I need to mess with the pipelines more, I guess.
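One thing worth checking when VideoCapture opens but read() never returns a frame: OpenCV's appsink generally needs raw BGR, so the pipeline should keep videoconvert and end with an explicit BGR caps filter before appsink rather than dropping videoconvert. A sketch of that tail (nvvidconv is part of the Jetson multimedia stack, so on other hardware plain videoconvert does the colorspace conversion):

```python
# Tail for any OpenCV receive pipeline: force conversion to BGR before
# appsink. Without it, appsink may negotiate a format cv2 cannot consume,
# and cap.read() blocks or returns nothing.
OPENCV_SINK_TAIL = "videoconvert ! video/x-raw, format=BGR ! appsink drop=1 sync=false"

def with_opencv_sink(decode_part: str) -> str:
    """Append the BGR appsink tail to a decode pipeline fragment."""
    return f"{decode_part} ! {OPENCV_SINK_TAIL}"

# e.g. (port/caps are placeholders):
#   pipe = with_opencv_sink(
#       "udpsrc port=5000 caps=\"application/x-rtp, media=video, "
#       "encoding-name=H264, payload=96\" ! rtph264depay ! avdec_h264")
#   cap = cv2.VideoCapture(pipe, cv2.CAP_GSTREAMER)
```

`drop=1 sync=false` keeps appsink from buffering stale frames, which also helps with the lag you were seeing over HTTP.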