Distributed Data Parallel (DDP) leaked semaphore warning

I am facing an issue with Distributed Data Parallel: training runs, but at shutdown this warning is thrown:
UserWarning: resource_tracker: There appear to be 30 leaked semaphore objects to clean up at shutdown.
Please help with this issue. I'm using 2 GPUs.
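This warning usually means worker processes exited without releasing their multiprocessing resources, e.g. when the process group is never torn down or a `DataLoader` with workers is killed mid-iteration. Below is a minimal sketch (not your actual training script, which is not shown) of a two-process DDP run that shuts down cleanly; it uses the `gloo` backend and a tiny `torch.nn.Linear` model so it also runs on CPU, and the `MASTER_ADDR`/`MASTER_PORT` values are illustrative:

```python
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP

def worker(rank: int, world_size: int) -> None:
    # Rendezvous settings for a single-node run (illustrative values).
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    # "gloo" so the sketch runs on CPU; on GPUs you would use "nccl"
    # and pass device_ids=[rank] to DDP.
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    model = DDP(torch.nn.Linear(4, 4))
    opt = torch.optim.SGD(model.parameters(), lr=0.1)

    x = torch.randn(8, 4)
    loss = model(x).sum()
    loss.backward()  # DDP all-reduces gradients across the two workers
    opt.step()

    # Explicit teardown: skipping this (or killing workers mid-run) is a
    # common cause of the "leaked semaphore objects" resource_tracker warning.
    dist.destroy_process_group()

if __name__ == "__main__":
    # Guarding the spawn with __main__ matters: without it, child
    # processes re-execute the module and can leak resources.
    mp.spawn(worker, args=(2,), nprocs=2, join=True)
```

If you use a `DataLoader` with `num_workers > 0`, also make sure iteration completes (or the loader is deleted) before the process exits, since abandoned worker processes produce the same warning.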

Hello,
Could you please share additional details so we can understand the issue better?

I applied 2:4 sparsity to a PyTorch model using ModelOpt, did post-training fine-tuning, and converted the model to ONNX. While building the TensorRT engine with trtexec, the logs report that 43 layers are eligible for sparse math, but then that 0 layers were chosen.
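Two things worth checking (stated as likely causes, not a confirmed diagnosis): first, trtexec only uses sparse tactics when sparsity is explicitly enabled via its `--sparsity` flag; second, even for eligible layers, TensorRT times sparse kernels against dense ones during tactic selection and picks sparse only where it is faster, so "chose 0 layers" can simply mean the dense kernels won the timing on your model and GPU. A sketch of the invocation, with a placeholder model path:

```shell
# model.onnx is a placeholder for your exported model.
# --sparsity=enable uses sparse tactics only for weights that already
# follow the 2:4 pattern; --sparsity=force prunes weights inside TensorRT
# (intended for benchmarking, not deployment accuracy).
trtexec --onnx=model.onnx \
        --sparsity=enable \
        --fp16 \
        --verbose
```

If sparse kernels still lose the tactic timing with `--sparsity=enable`, comparing against a `--sparsity=force` build can help separate "weights not in 2:4 pattern after export" from "sparse kernels not faster for these layer shapes".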