Problem with TensorRT MobileNet on Jetson Nano

I trained a model on my PC, which produced a mobilenet.h5 file.

I copied this mobilenet.h5 to my Jetson Nano (JetPack 4.2, TensorRT 6.0, Keras 2.1.3) and converted the .h5 to .uff as follows:

python3  /usr/lib/python3.6/dist-packages/uff/bin/convert_to_uff.py ./mobilenet.h5

This produced mobilenet.uff on the Nano.
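(For context: as far as I understand, convert_to_uff.py normally operates on a frozen TensorFlow GraphDef (.pb) rather than a Keras .h5 directly. A rough sketch of how I believe the freezing step looks with the TF 1.x session API — the function name `freeze_keras_model` is just mine, not from any tool:)

```python
def freeze_keras_model(h5_path, pb_dir, pb_name):
    """Freeze a Keras .h5 model into a TensorFlow .pb GraphDef.

    Sketch only: assumes a TF 1.x-style session (via tf.compat.v1),
    as shipped with JetPack-era TensorFlow. The resulting .pb is what
    convert_to_uff.py is normally pointed at.
    """
    import tensorflow as tf  # deferred: only needed on the conversion machine
    from tensorflow.compat.v1 import graph_util
    from tensorflow.compat.v1.keras import backend as K

    K.set_learning_phase(0)  # inference mode, set before loading the model
    model = tf.keras.models.load_model(h5_path)
    sess = K.get_session()
    output_names = [out.op.name for out in model.outputs]
    frozen = graph_util.convert_variables_to_constants(
        sess, sess.graph_def, output_names)
    tf.compat.v1.train.write_graph(frozen, pb_dir, pb_name, as_text=False)
    return output_names  # useful for register_output later
```
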

Then I tried to build the .engine as follows:

import tensorrt as trt

model_file = './mobilenet.uff'
TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(model_file):
    with trt.Builder(TRT_LOGGER) as builder, builder.create_network() as network, trt.UffParser() as parser:
        builder.max_workspace_size = 1 << 20
        parser.register_input('input_1_1', (3, 224, 224))
        parser.register_output('output_1')
        parser.parse(model_file, network)
        return builder.build_cuda_engine(network)

with build_engine(model_file) as engine:
    print('1')  # just a test

This fails with the following error:

[TensorRT] ERROR: UffParser: Validator error: batch_normalization_2/cond/Switch: Unsupported operation _Switch
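(Side note: `parser.parse` returns a bool that my snippet above ignores, and `build_cuda_engine` returns None on failure. A variant that fails loudly at the exact step that breaks — assuming the same TensorRT 6 Python API — might look like this; the import is deferred so the file can be read anywhere:)

```python
def build_engine_checked(model_file):
    """Like build_engine above, but surfaces parser/builder failures
    instead of silently returning a broken engine."""
    import tensorrt as trt  # deferred: only available on the Jetson
    TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
    with trt.Builder(TRT_LOGGER) as builder, \
         builder.create_network() as network, \
         trt.UffParser() as parser:
        builder.max_workspace_size = 1 << 20
        parser.register_input('input_1_1', (3, 224, 224))
        parser.register_output('output_1')
        if not parser.parse(model_file, network):
            raise RuntimeError('UFF parsing failed (details in TRT_LOGGER output)')
        engine = builder.build_cuda_engine(network)
        if engine is None:
            raise RuntimeError('engine build failed')
        return engine
```
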

How can I solve this?