TensorRT 5.1.6 Custom plugin with fp16 issue

Hi,
Yes, FP16 is supported for custom plugins in TRT 5.x.
Implementation:
With IPluginV2/IPluginV2Ext, implement the supportsFormat method and return true when the requested type is DataType::kHALF, as in the fcPlugin sample for TRT 5.x:
https://github.com/NVIDIA/TensorRT/blob/master/samples/opensource/samplePlugin/fcPlugin.h
Something like:
bool supportsFormat(DataType type, PluginFormat format) const override
{
    // Accept FP32 and FP16 tensors in NCHW layout.
    return (type == DataType::kFLOAT || type == DataType::kHALF) &&
           format == PluginFormat::kNCHW;
}
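Note that returning true for kHALF only advertises the capability; FP16 mode must also be enabled on the builder for the kHALF path to actually be selected at build time. A minimal sketch with the TRT 5.x builder API, assuming you already have an ILogger instance named logger:

// Enable FP16 kernels when building the engine (TRT 5.x API).
// Without this, the builder will not offer kHALF to the plugin.
// "logger" is assumed to be your existing nvinfer1::ILogger.
nvinfer1::IBuilder* builder = nvinfer1::createInferBuilder(logger);
builder->setFp16Mode(true);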
Or, with IPluginV2DynamicExt/IPluginV2IOExt, implement the supportsFormatCombination API, as shown here for TRT 6.0:
https://github.com/NVIDIA/TensorRT/blob/master/plugin/instanceNormalizationPlugin/instanceNormalizationPlugin.cpp
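For reference, here is a minimal sketch of supportsFormatCombination for an IPluginV2DynamicExt plugin. It assumes one input and one output that must share the same data type and use linear layout; this is an illustration, not the exact code from the linked plugin:

// Sketch only: assumes a plugin with one input and one output.
bool supportsFormatCombination(int pos, const nvinfer1::PluginTensorDesc* inOut,
                               int nbInputs, int nbOutputs) override
{
    assert(pos < nbInputs + nbOutputs);
    const nvinfer1::PluginTensorDesc& desc = inOut[pos];
    // Accept FP32 or FP16 in linear (NCHW-like) layout, and require all
    // inputs/outputs to match the data type of the first input.
    return (desc.type == nvinfer1::DataType::kFLOAT ||
            desc.type == nvinfer1::DataType::kHALF) &&
           desc.format == nvinfer1::TensorFormat::kLINEAR &&
           desc.type == inOut[0].type;
}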

Hardware support matrix for TRT 5.x: TensorRT Support Matrix :: Deep Learning SDK Documentation

That said, I'd recommend using the latest TRT version, which gives you more flexibility for custom plugin creation.

Thanks