On-Screen Display in Real Time on Nvidia's Inferencing GPU

The Nvidia GPUs currently in our organisation support inferencing only. For development and the end-to-end pipeline we use a Kubeflow environment (interconnected remote servers), and our scripting tool is Jupyter Notebook. We would like to know whether, on a remote server, a Jupyter notebook can integrate with an on-screen display (OSD) in real time, so that developers can watch the video outputs as they are produced. (Note: "development" here means training different deep learning models, or using pretrained Nvidia deep learning models and upgrading them, and developers need the OSD to view the model's performance on the fly.) A minimal sketch of the kind of inline viewing we have in mind is shown below.
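For reference, this is a rough sketch of viewing frames inline in a remote notebook cell rather than on a physical display attached to the GPU server. It assumes opencv-python, Pillow, and IPython are available in the notebook image; the video path is only a placeholder for whatever the model pipeline writes out.

```python
# Sketch: render video frames inline in a Jupyter notebook cell instead of
# relying on a physical display attached to the GPU server.
import cv2
import PIL.Image
from IPython.display import display, clear_output

cap = cv2.VideoCapture("model_output.mp4")  # placeholder: any file or stream the pipeline produces

try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # OpenCV returns BGR; convert to RGB before handing the frame to PIL
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        clear_output(wait=True)            # replace the previous frame in the same cell
        display(PIL.Image.fromarray(rgb))
finally:
    cap.release()
```

This is not a true OSD, but it gives a frame-by-frame preview in the browser without needing any display server on the GPU node.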

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one.
Thanks

Sorry, what can the Jupyter notebook receive? Can it receive an RTSP stream as an RTSP client?
And is this topic related to DeepStream?
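If the notebook can act as an RTSP client, something along these lines should work as a quick check. This is only a sketch: it assumes the DeepStream (or other) pipeline re-streams its annotated output over RTSP, the URL is a placeholder, and opencv-python must be built with FFmpeg or GStreamer support to open rtsp:// sources.

```python
# Sketch: open an RTSP stream from the notebook and pull a few frames.
import cv2

RTSP_URL = "rtsp://<gpu-node>:8554/ds-out"  # placeholder address of the re-streamed output

cap = cv2.VideoCapture(RTSP_URL)
if not cap.isOpened():
    raise RuntimeError(f"Could not open RTSP stream: {RTSP_URL}")

try:
    for i in range(30):                      # grab ~30 frames as a connectivity check
        ok, frame = cap.read()
        if not ok:
            break
        print(f"frame {i}: {frame.shape}")   # frames can be shown inline as in the earlier sketch
finally:
    cap.release()
```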