Failure to Deserialize a TensorRT Engine with the Python API

Device: Jetson Nano
TensorRT Version: 5.0.2

I have a TensorRT engine for my network, built with a custom plugin layer, which I can deserialize without problems from my C++ code. I now need to deserialize the same engine with the Python API, but the call fails with the error below.
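
For context, here is roughly how things are set up before the failing call (a minimal sketch: the logger setup, the engine file name model.engine, and the line that creates the plugin factory are simplified stand-ins, not my exact code):

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Stand-in for however the ONNX plugin factory is actually obtained;
# its type at runtime is tensorrt.tensorrt.OnnxPluginFactory (see below).
onnxparser = trt.OnnxPluginFactory(TRT_LOGGER)

runtime = trt.Runtime(TRT_LOGGER)
f = open("model.engine", "rb")

The failing call and the resulting error are: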

engine = runtime.deserialize_cuda_engine(f.read(), onnxparser)
TypeError: deserialize_cuda_engine(): incompatible function arguments. The following argument types are supported:
    1. (self: tensorrt.tensorrt.Runtime, serialized_engine: buffer, plugin_factory: tensorrt.tensorrt.IPluginFactory = None) -> tensorrt.tensorrt.ICudaEngine

The type of onnxparser in the call above is: <class 'tensorrt.tensorrt.OnnxPluginFactory'>
How can I correct this issue?