"torch.float" to "torch.double"


I want to control industrial robots with reinforcement learning.
Normally, it seems that double precision (float64) is used for controlling industrial robots.
For example, is there an easy way to get the sample tasks to run in "torch.float64" instead of "torch.float32"?
I hope it can be set in "config.yaml".

Isaac Gym doesn't seem to have been developed or tested with "torch.float64" in mind. Am I wrong?

Hi @DDPG7,

For high-performance simulation, fp32 is much faster than fp64. For industrial robotics use cases, I suspect you can simply cast 64-bit tensors down to 32-bit before feeding them to a network trained in simulation.
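As a minimal sketch of that cast, assuming a hypothetical fp64 sensor reading from the robot controller (the tensor name and values are illustrative, not from Isaac Gym):

```python
import torch

# Hypothetical 64-bit joint readings from an industrial robot controller.
joint_positions_fp64 = torch.tensor(
    [0.1234567890123, -1.5707963267948966], dtype=torch.float64
)

# Cast down to fp32 before feeding a policy network trained in simulation.
joint_positions_fp32 = joint_positions_fp64.to(torch.float32)

print(joint_positions_fp32.dtype)  # torch.float32
```

The network itself stays in fp32; only the interface between the controller and the policy does the conversion.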

fp32 has 23 bits of mantissa, so with meters as the base unit you would have sub-micrometer accuracy for positions on the order of 1 m.
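To make that concrete, here is the arithmetic behind the claim (pure Python, no assumptions beyond the IEEE 754 single-precision format):

```python
# fp32 has 23 explicit mantissa bits, so the relative spacing between
# adjacent representable values (machine epsilon) is 2**-23.
eps = 2.0 ** -23

# With meters as the base unit, positions around 1 m are representable
# to within eps meters, i.e. roughly 0.12 micrometers.
resolution_um = eps * 1e6
print(f"~{resolution_um:.3f} micrometers")  # ~0.119 micrometers
```

Note that this is *relative* precision: at a 10 m workspace scale the spacing grows to roughly 1.2 micrometers, which is still well below typical industrial repeatability specs.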

Take care,