AttributeError: 'tensorrt.tensorrt.Builder' object has no attribute 'create_builder_config'

I want to run a TensorFlow .pb model with TensorRT, and I copied the code from the TensorRT guide, as shown below.
My TensorRT version is 5.0.6 and my UFF version is 0.5.5.

import tensorflow as tf
import uff
import pycuda.driver as cuda
import pycuda.autoinit
import numpy as np
import time #import system tools
import os
import tensorrt as trt
TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
model_file = 'out.uff'
builder = trt.Builder(TRT_LOGGER)
with builder.create_network() as network, trt.UffParser() as parser:

    parser.register_input("Input", (1, 513, 513,3))
    parser.register_output("Output")
    parser.parse(model_file, network)

max_batch_size = 1
builder.max_batch_size = max_batch_size
builder.max_workspace_size = 1 << 20
with builder.create_builder_config() as config, builder.build_cuda_engine(network, config) as engine:

    h_input = cuda.pagelocked_empty(trt.volume(engine.get_binding_shape(0)), dtype=np.float32)
    h_output = cuda.pagelocked_empty(trt.volume(engine.get_binding_shape(1)), dtype=np.float32)
    # Allocate device memory for inputs and output
    d_input = cuda.mem_alloc(h_input.nbytes)
    d_output = cuda.mem_alloc(h_output.nbytes)
    # Create a stream in which to copy inputs/outputs and run inference.
    stream = cuda.Stream()

    with engine.create_execution_context() as context:

        # Transfer input data to the GPU.
        cuda.memcpy_htod_async(d_input, h_input, stream)
        # Run inference.
        context.execute_async(bindings=[int(d_input), int(d_output)], stream_handle=stream.handle)
        # Transfer predictions back from the GPU.
        cuda.memcpy_dtoh_async(h_output, d_output, stream)
        # Synchronize the stream.
        stream.synchronize()
        # Return the host output.

Then it showed AttributeError: ‘tensorrt.tensorrt.Builder’ object has no attribute ‘create_builder_config’
Thank you!

Hi,

create_builder_config was added in TensorRT 6.0.
For TensorRT 5.0, please use this call instead:

with builder.build_cuda_engine(network) as engine:

For more information, please check the TensorRT 5.0 Python API documentation:
https://docs.nvidia.com/deeplearning/sdk/tensorrt-archived/tensorrt-506/tensorrt-api/python_api/infer/Core/Builder.html
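
For reference, a minimal sketch of the whole TensorRT 5.0 flow without a builder config could look like the code below. It reuses the 'out.uff' path and the "Input"/"Output" names and shape from your post (adjust these to your model), sets max_batch_size and max_workspace_size directly on the builder, and keeps the network and parser alive until the engine has been built:

import numpy as np
import pycuda.driver as cuda
import pycuda.autoinit
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
model_file = 'out.uff'  # path taken from your post

with trt.Builder(TRT_LOGGER) as builder, builder.create_network() as network, trt.UffParser() as parser:

    # Input/output names and shape taken from your post -- adjust to your model.
    parser.register_input("Input", (1, 513, 513, 3))
    parser.register_output("Output")
    parser.parse(model_file, network)

    # In TensorRT 5.x these settings live on the builder itself,
    # not on a separate builder config object.
    builder.max_batch_size = 1
    builder.max_workspace_size = 1 << 20

    with builder.build_cuda_engine(network) as engine, engine.create_execution_context() as context:

        # Allocate pagelocked host buffers and device memory for input and output.
        h_input = cuda.pagelocked_empty(trt.volume(engine.get_binding_shape(0)), dtype=np.float32)
        h_output = cuda.pagelocked_empty(trt.volume(engine.get_binding_shape(1)), dtype=np.float32)
        d_input = cuda.mem_alloc(h_input.nbytes)
        d_output = cuda.mem_alloc(h_output.nbytes)
        stream = cuda.Stream()

        # Copy the input to the GPU, run inference, and copy the result back.
        cuda.memcpy_htod_async(d_input, h_input, stream)
        context.execute_async(bindings=[int(d_input), int(d_output)], stream_handle=stream.handle)
        cuda.memcpy_dtoh_async(h_output, d_output, stream)
        stream.synchronize()

If you later move to TensorRT 6.0 or newer, the create_builder_config path from the guide should work as written.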

Thanks.