Op type not registered 'ROIPool' in binary running on dockerimage

Hello there,

I’m trying to use

trt.create_inference_graph()

on a frozen graph that contains a custom operation written in C++; it is registered with

REGISTER_KERNEL_BUILDER(Name("ROIPool").Device(DEVICE_GPU).TypeConstraint<float>("T"), PSROIPoolOp<Eigen::GpuDevice, float>);

but when I run my script to optimize the graph, it says that my operation is not registered. Any idea what is happening? What should I provide to make my problem clearer? …
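
For context, a minimal sketch of the kind of optimization script I mean (not my exact code; the .so path, .pb path, and output node name are placeholders):

import tensorflow as tf
from tensorflow.contrib import tensorrt as trt

# The shared library built from the custom op sources has to be loaded into this
# process, otherwise TensorFlow has no OpDef for 'ROIPool' when it parses the graph.
tf.load_op_library('/path/to/roi_pool_op.so')

# Read the frozen GraphDef from disk.
with tf.gfile.GFile('/path/to/frozen_graph.pb', 'rb') as f:
    frozen_graph_def = tf.GraphDef()
    frozen_graph_def.ParseFromString(f.read())

# Ask TF-TRT to replace supported subgraphs with TensorRT engines.
trt_graph_def = trt.create_inference_graph(
    input_graph_def=frozen_graph_def,
    outputs=['output_node'],          # placeholder output node name
    max_batch_size=1,
    max_workspace_size_bytes=1 << 30,
    precision_mode='FP32')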

Hello,

It looks like you registered a custom operation in TensorFlow via the REGISTER_KERNEL_BUILDER macro? TensorRT wouldn’t have visibility into that.

It looks like the “op is not registered” error is coming from TensorFlow. Is it possible that the frozen graph was generated with a TensorFlow version that’s different from the one in the container you’re running TensorRT in?
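
One quick way to check that (just a sketch; the .so and .pb paths are placeholders): try importing the frozen graph in plain TensorFlow inside the same container, with the custom-op library loaded. If this already fails with “Op type not registered”, the problem is on the TensorFlow side rather than in TensorRT.

import tensorflow as tf

# Load the custom-op library built against the TF version inside the container.
tf.load_op_library('/path/to/roi_pool_op.so')

with tf.gfile.GFile('/path/to/frozen_graph.pb', 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

# If 'ROIPool' is not registered in this TF build (or the .so was not loaded),
# this import raises the same "Op type not registered" error, with no TensorRT involved.
tf.import_graph_def(graph_def, name='')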

Hey there, thanks for your reply,

My model runs fine both from the .ckpt and from the frozen model; I can train it and perform inference too.
I’m using TF 1.8 with TensorRT 4 and my frozen graph was generated with TF 1.8 too.

And yes, I’m using REGISTER_KERNEL_BUILDER, so TensorRT cannot handle this? In that case, is it even possible to optimize/quantize this part of my model?