Implementing a plugin layer with FP16 mode support

I implemented custom plugin layers in FP32 mode, and they work well when the engine runs in FP16 mode on TensorRT 4.0. However, the same code reports an error when running on TensorRT 5.0.2. Do I need to implement an FP16-capable version of the plugin layer, or will TensorRT take care of the conversion for me, since there is supposed to be an implicit data conversion between plugin layers and built-in layers?