Performance problems with DeepStream in Python: should we switch to C++?

We are currently using Python for our DeepStream apps: we ingest RTSP feeds from several cameras (4-6), process them, and send meaningful video clips to a cloud server. We use DeepStream inference models to decide the timing and content of the clips. Would it be better to use C++ instead of Python?
Right now it's almost unworkable because of the latency and Python's performance, and on top of that the Python GIL is a huge blocker to making the processing efficient.
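One common way to limit the GIL's impact without leaving Python is to keep the per-frame callback tiny and hand the heavy work (analytics, clip assembly, uploads) to a separate process, which has its own interpreter and its own GIL. Below is a minimal, framework-free sketch of that pattern; `on_frame`, `worker`, and the metadata dicts are hypothetical stand-ins, not DeepStream APIs:

```python
import multiprocessing as mp

def worker(in_q, out_q):
    # Runs in its own process with its own GIL, so heavy work here
    # does not stall the streaming thread in the main process.
    count = 0
    while True:
        meta = in_q.get()
        if meta is None:          # sentinel: shut down
            break
        # heavy analytics / clip assembly / cloud upload would go here
        count += 1
    out_q.put(count)

def run_demo(n_frames=6):
    in_q, out_q = mp.Queue(), mp.Queue()
    proc = mp.Process(target=worker, args=(in_q, out_q))
    proc.start()

    def on_frame(meta):
        # In a real app this plays the role of the pad-probe callback:
        # keep it this small so the streaming thread is released at once.
        in_q.put(meta)

    for cam in range(n_frames):   # pretend each camera delivered one frame
        on_frame({"camera": cam})
    in_q.put(None)
    processed = out_q.get()
    proc.join()
    return processed

if __name__ == "__main__":
    print(run_demo())             # prints 6
```

Whether this helps depends on where the time actually goes; it only buys anything if the Python-side post-processing, not decode or inference, is eating the frame budget.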

C++ itself should be more efficient than Python.
You could try C++.


I'm just afraid it will take too much time, since we are not a team of C++ engineers.
And the deadline is pretty close. I'm just trying to weigh how much effort it would take against how much efficiency it would gain.
For example, would C++ be enough to handle 6 cameras, analyze the streams, and send the results to the cloud?
Also, it would be much harder to find libraries in C++ than in Python.
It would be helpful if someone with experience could share it.

I don't think this is simply a C++-versus-Python question.
The performance depends on many factors: the camera type/interface, the inference models, and so on.
Have you profiled your pipeline to find the performance bottleneck? If you have, did you conclude that the performance is limited by Python itself?
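As a first, crude check of whether Python itself is the limit, you can time how long your per-frame callback spends in Python and compare it against the frame budget. This sketch is not DeepStream-specific; `fake_probe` is a hypothetical stand-in for your real probe, and the 30 fps rate and 6-stream count are assumptions:

```python
import time

FRAME_BUDGET_MS = 1000.0 / 30   # ~33 ms per frame at an assumed 30 fps

def timed(cb, stats):
    """Wrap a callback and record how long Python spends per buffer."""
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter()
        result = cb(*args, **kwargs)
        ms = (time.perf_counter() - t0) * 1000.0
        stats["calls"] += 1
        stats["total_ms"] += ms
        stats["worst_ms"] = max(stats["worst_ms"], ms)
        return result
    return wrapper

def fake_probe(buf):
    # Stand-in for a real per-frame probe doing Python-side work.
    return sum(range(1000))

stats = {"calls": 0, "total_ms": 0.0, "worst_ms": 0.0}
probe = timed(fake_probe, stats)
for _ in range(100):
    probe(None)

avg_ms = stats["total_ms"] / stats["calls"]
# If the average callback time, multiplied by the number of streams
# sharing the thread, exceeds the frame budget, the Python callback
# itself is a real bottleneck; otherwise look at decode/inference/IO.
python_bound = avg_ms * 6 > FRAME_BUDGET_MS
```

If the callbacks come in well under budget, the slowdown is more likely in decode, inference, or the hardware itself, and rewriting in C++ would not fix it.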

That is the part we would like to know.
We are fairly sure the bottleneck is the DeepStream program itself, since we are running one big Python script that handles several RTSP feeds and processes them as a whole. But as you said, this may not be a Python problem; it could also be a machine-spec problem, in which case switching to C++ would be pointless. We are using a TX2, and it seems somewhat overburdened, but we are curious whether changing to C++ would improve this by a small or a large margin.

Sorry! There is not enough information for us to conclude whether changing to C++ would improve things.