Announcing general availability for DeepStream 5.0

Key feature highlights

• New inference capability with Triton Inference Server enables developers to deploy models in their native training framework formats.

• Choice of developing DeepStream pipelines in either C/C++ or Python, with comparable performance.

• Build and deploy DeepStream apps natively on Red Hat Enterprise Linux (RHEL).

• Enhanced IoT capabilities, including bidirectional edge/cloud messaging and over-the-air (OTA) updates.
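To make the pipeline bullets concrete: a DeepStream application is a standard GStreamer pipeline. The sketch below only assembles a typical gst-launch-1.0-style pipeline string to show where the DeepStream elements sit; the input file name and inference config path are hypothetical placeholders, and running it requires the DeepStream plugins to be installed. Per the Triton bullet above, replacing `nvinfer` with `nvinferserver` selects the Triton Inference Server backend.

```python
# Sketch of a typical DeepStream pipeline, expressed as the string you
# would pass to gst-launch-1.0 or Gst.parse_launch(). The media file and
# the inference config path below are hypothetical placeholders.
elements = [
    "filesrc location=sample_720p.h264",   # hypothetical input clip
    "h264parse",
    "nvv4l2decoder",                       # hardware-accelerated decode
    "m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720",
    # Swap nvinfer for nvinferserver to run inference through Triton:
    "nvinfer config-file-path=config_infer_primary.txt",
    "nvvideoconvert",
    "nvdsosd",                             # draws bounding boxes and labels
    "fakesink",
]
pipeline = " ! ".join(elements)
print(pipeline)
```

The same string-of-elements structure is what the C/C++ and Python APIs build programmatically, which is why the two bindings deliver comparable performance: the heavy lifting happens inside the same GStreamer plugins either way.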

Read more about the announcement and the full list of new features in our developer news article.

Product page

Getting Started

Pull from NGC container

Watch the new video tutorial “DeepStream best practices for performance optimization”

Read the updated feature explainer blog “Building Intelligent Video Analytics Apps Using NVIDIA DeepStream 5.0” (updated for GA)

New webinar: Create Intelligent Places Using NVIDIA Pre-trained Vision Models and DeepStream SDK

TLT 2.0 general availability – for more details, see Announcing general availability for Transfer Learning Toolkit 2.0


The redaction and reference apps are now live on GitHub for DeepStream 5.0 GA.
Please refer to the links below:

Reference apps: