I have an MMAPI app. I receive H.265 and decode it to an RGB32 picture. I want to share the picture with a Python AI app, and I don’t want to do a memory copy because the picture is too large (maybe 1080p or 4K). How can I do it? I see that NvBuffer has an fd. Can I send the fd to the Python app and let it get the picture? Or any other suggestion?
In this case you would need to copy the data from NvBuffer to a CPU buffer, since the Python AI app presumably only supports CPU buffers. There is no method to avoid the copy.
For running deep learning inference on Jetson platforms, the optimal solution is the DeepStream SDK. If you can convert the model to run on TensorRT, we would suggest using the DeepStream SDK.
But my MMAPI app is already finished. I just want to add the AI function, and the AI function is also finished, written in Python. Must I copy the memory?
If you can pass CPU pointers to the AI functions, you can call NvBufferMemMap() to get the pointer and pass it to the functions for a try. If not, the buffer copy is required.
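(For the Python side, the closest analogue to "passing a CPU pointer" is handing the AI function a zero-copy `memoryview` over an `mmap`-ed region. The sketch below uses an ordinary temp file as a hypothetical stand-in for the frame; a real NvBuffer fd is a DMA-buf fd, which is not guaranteed to be mappable this way from plain Python.)

```python
import mmap
import os
import tempfile

# Hypothetical stand-in: a plain temp file plays the role of the
# CPU-mapped frame. NvBufferMemMap() itself is a C API and has no
# Python binding; this only shows the zero-copy pattern on the
# Python side.
size = 4096
fd, path = tempfile.mkstemp()
os.ftruncate(fd, size)

with mmap.mmap(fd, size) as m:
    m[:4] = b"RGBA"              # pretend this is pixel data
    view = memoryview(m)         # zero-copy view over the mapping
    data = bytes(view[:4])       # what an AI function could read
    view.release()               # release before the mmap closes

os.close(fd)
os.remove(path)
```

The key point is that `memoryview` exposes the mapped bytes without duplicating them, so a Python function that accepts any buffer-protocol object can read the pixels in place.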
There are backend and frontend samples that demonstrate MMAPI + TensorRT. We would suggest taking a look and checking whether the AI functions can be switched to TensorRT. The default model is ResNet.
I can get the NvBuffer fd in C++ and pass the fd to the AI function. But how do I use NvBufferMemMap to get the data in Python?
Please check the option ‘-c Enable demonstration of CPU processing’ in 10_camera_recording. It demonstrates modifying some pixels and then sending the buffer to the encoder. Please take a look and see if it can be applied to your use case.
But how do I get it in the Python app? I have two apps: one is finished in C++ and the other is finished in Python. Now I just want to send a 4K picture from the C++ app to the Python app without copying.
jetson_multimedia_api is C code and there are no official Python bindings, so it is not possible to work like this. Is it possible to implement the AI functions in C? For using jetson_multimedia_api, it is better to do C programming.
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.