Model Parallelism Inference Across Multiple Devices

Hi,

What tools are recommended for doing Model Parallelism for Inference across multiple Jetson devices?

For which Jetson platform?


Hi,

Could you share more information about your use case?

Do you want to run inference on the same input across different devices for acceleration?
Or do you want to run inference on multiple inputs, with each input processed in parallel on a separate device?

Thanks.


For the Orin.

A single input for a model whose weights are split across devices.

The purpose of using multiple devices is acceleration.

Hi,

We don’t have a library that splits a model across different devices.
This is also likely a model-dependent task.

Maybe others can share their experience here.
Thanks.
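Since there is no off-the-shelf library for this, one manual approach is pipeline-style model parallelism: partition the model's layers between devices and forward the intermediate activations from one stage to the next. The sketch below is illustrative only, not a Jetson-specific recipe. It simulates two devices with threads and queues; on real boards the queue hand-off would be replaced by a network transport (e.g. TCP sockets or gRPC), and the NumPy matmuls by your inference framework of choice. All names (`device0`, `pipelined_inference`, etc.) are made up for the example.

```python
import queue
import threading

import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer MLP; the weights are partitioned between the two stages:
# W1 lives on "device 0", W2 on "device 1".
W1 = rng.standard_normal((4, 8))
W2 = rng.standard_normal((8, 3))

def device0(in_q, out_q, w1):
    # First pipeline stage: layer 1 + ReLU, then ship the activation onward.
    x = in_q.get()
    out_q.put(np.maximum(x @ w1, 0.0))

def device1(in_q, out_q, w2):
    # Second pipeline stage: layer 2 produces the final output.
    h = in_q.get()
    out_q.put(h @ w2)

def pipelined_inference(x):
    q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
    t0 = threading.Thread(target=device0, args=(q_in, q_mid, W1))
    t1 = threading.Thread(target=device1, args=(q_mid, q_out, W2))
    t0.start()
    t1.start()
    q_in.put(x)        # feed the single input into the first stage
    y = q_out.get()    # collect the final output from the last stage
    t0.join()
    t1.join()
    return y

x = rng.standard_normal((1, 4))
y = pipelined_inference(x)
# Sanity check against running the whole model in one place.
print(np.allclose(y, np.maximum(x @ W1, 0.0) @ W2))  # True
```

Note that with a single input this only moves work between devices and adds a communication hop per stage boundary; the speedup comes when many inputs stream through the pipeline so the stages run concurrently. The split point also matters, since the activation tensor crossing the boundary has to fit through the network link between the boards.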

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.