Please provide complete information as applicable to your setup.
• Hardware Platform (GPU)
• DeepStream Version 6.2
I have 2 questions:
- Can I run a PyTorch model with nvinferserver on the CPU? I ask because I see a gpu_ids option in the config.
- Can I run a full DeepStream pipeline on the CPU?
Please see gpu_ids in the nvinferserver documentation; it represents "Device IDs of the GPU to use for pre-processing/inference (single GPU support only)".
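For reference, gpu_ids sits in the infer_config section of the nvinferserver configuration file. A minimal sketch, assuming the standard nvinferserver config.pbtxt layout (the model name and repo path are illustrative, not from this thread):

```
infer_config {
  unique_id: 1
  gpu_ids: [0]          # GPU used for pre-processing and inference
  max_batch_size: 1
  backend {
    triton {
      model_name: "my_model"          # hypothetical model name
      version: -1
      model_repo {
        root: "./triton_model_repo"   # hypothetical repo path
        strict_model_config: true
      }
    }
  }
}
```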
What is your whole media pipeline or use case?
I run the Triton model using KIND_CPU and expected nvinferserver to run on the CPU, but DeepStream doesn't support that, right?
It seems nvinferserver still needs the GPU to normalize and scale the input.
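For context, this is roughly how a Triton model is pinned to the CPU — a minimal config.pbtxt sketch using Triton's instance_group setting (the model name, platform, and tensor shapes here are illustrative assumptions, not taken from this thread):

```
name: "my_pytorch_model"        # hypothetical model name
platform: "pytorch_libtorch"
max_batch_size: 1
input [
  { name: "INPUT__0", data_type: TYPE_FP32, dims: [3, 224, 224] }
]
output [
  { name: "OUTPUT__0", data_type: TYPE_FP32, dims: [1000] }
]
instance_group [
  { kind: KIND_CPU, count: 1 }  # run model instances on the CPU
]
```

Note this only moves the Triton inference step to the CPU; the nvinferserver pre-processing discussed above is a separate stage.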
There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Yes, nvinferserver will use the GPU for preprocessing, and nvstreammux needs the GPU to output GPU buffers.
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.