Does TensorRT really support Leaky ReLU?

Hi all,
Leaky ReLU is officially claimed to be supported in the release notes of 5.1.3, which also announce support for Leaky ReLU in the UFF format. I checked the latest scripts of the uff Python package and found that no conversion function for Leaky ReLU is registered in the TensorFlow converter. At first I thought the converter was simply out of date and that the UFF parser itself supported Leaky ReLU. Since the UFF parser is not open source, I ran the “strings” command on the library (.so file) to check whether it contains any constant strings mentioning Leaky ReLU, looking for evidence of support. Unfortunately, all the other operator names are exported, but Leaky ReLU is absent.
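
For reference, here is a minimal sketch of how the parser can be tested empirically, assuming a hypothetical model.uff converted from a TensorFlow graph that uses tf.nn.leaky_relu (the input/output names and shape are placeholders, not from any real model); if the parser rejects the LeakyRelu node, parse() returns false and the logger prints the unsupported-op message:

```cpp
// Sketch (TensorRT 5.x C++ API): test whether the UFF parser accepts a graph
// containing a LeakyRelu node. "model.uff" and the tensor names are assumptions.
#include <iostream>
#include "NvInfer.h"
#include "NvUffParser.h"

class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        // Parser/builder diagnostics land here; an unsupported op is reported as an error.
        std::cout << msg << std::endl;
    }
} gLogger;

int main()
{
    auto* builder = nvinfer1::createInferBuilder(gLogger);
    auto* network = builder->createNetwork();
    auto* parser  = nvuffparser::createUffParser();

    // Placeholder input/output registration for this sketch.
    parser->registerInput("input", nvinfer1::Dims3(1, 28, 28),
                          nvuffparser::UffInputOrder::kNCHW);
    parser->registerOutput("output");

    // parse() returns false if any node (e.g. LeakyRelu) is unsupported.
    bool ok = parser->parse("model.uff", *network, nvinfer1::DataType::kFLOAT);
    std::cout << (ok ? "parsed OK" : "parse failed") << std::endl;

    parser->destroy();
    network->destroy();
    builder->destroy();
    return ok ? 0 : 1;
}
```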

I searched this forum and tried the old approach of building a LeakyReLU plugin. According to the C++ API documentation for 5.1.5:
https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/c_api/_nv_infer_plugin_8h.html
Leaky ReLU is an officially released plugin. However, when I downloaded the repository from GitHub, I could find all of the announced plugins except Leaky ReLU.
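
As a workaround I also tried going through the plugin registry instead of the header-declared creator function. This is only a sketch: the creator name "LReLU_TRT" and the "negSlope" field are assumptions based on the 5.1 plugin documentation, and a nullptr from getPluginCreator() would mean the plugin is not actually shipped in the library:

```cpp
// Sketch only: fetch the Leaky ReLU plugin through the plugin registry.
// "LReLU_TRT" and "negSlope" are assumptions taken from the 5.1 plugin docs.
#include <iostream>
#include "NvInfer.h"
#include "NvInferPlugin.h"

class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        std::cout << msg << std::endl;
    }
} gLogger;

int main()
{
    // Registers the built-in plugins with the global registry.
    initLibNvInferPlugins(&gLogger, "");

    auto* creator = getPluginRegistry()->getPluginCreator("LReLU_TRT", "1");
    if (!creator)
    {
        std::cout << "LReLU_TRT not found in the plugin registry" << std::endl;
        return 1;
    }

    // Single assumed parameter: the negative slope.
    float negSlope = 0.1f;
    nvinfer1::PluginField field{"negSlope", &negSlope,
                                nvinfer1::PluginFieldType::kFLOAT32, 1};
    nvinfer1::PluginFieldCollection fc{1, &field};

    auto* plugin = creator->createPlugin("lrelu", &fc);
    std::cout << "created plugin: " << plugin->getPluginType() << std::endl;
    plugin->destroy();
    return 0;
}
```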

I tried to persuade myself that TensorRT simply does not support this yet. But then I found that in the native network construction API, inside the ActivationType enum, Leaky ReLU is listed as a supported activation function with ID 3 (kLEAKY_RELU).
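
For what it's worth, here is a minimal sketch of what I would expect to work based on that enum, with an assumed input name and shape: kLEAKY_RELU is enumerator 3 in TensorRT 5.1, and the negative slope is set through IActivationLayer::setAlpha().

```cpp
// Sketch (TensorRT 5.1 C++ API): Leaky ReLU via the native
// ActivationType::kLEAKY_RELU enumerator (value 3). Input name/shape are assumptions.
#include <iostream>
#include "NvInfer.h"

class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        std::cout << msg << std::endl;
    }
} gLogger;

int main()
{
    auto* builder = nvinfer1::createInferBuilder(gLogger);
    auto* network = builder->createNetwork();

    auto* input = network->addInput("input", nvinfer1::DataType::kFLOAT,
                                    nvinfer1::Dims3(3, 224, 224));

    // kLEAKY_RELU computes f(x) = x for x >= 0 and alpha * x for x < 0.
    auto* lrelu = network->addActivation(*input,
                                         nvinfer1::ActivationType::kLEAKY_RELU);
    lrelu->setAlpha(0.1f);  // negative slope

    lrelu->getOutput(0)->setName("output");
    network->markOutput(*lrelu->getOutput(0));

    network->destroy();
    builder->destroy();
    return 0;
}
```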

Can anyone explain this?

Hi charlie.yangt5aty,
I have the same confusion. Did you find the reason?