Workspace size for Jetson Nano

Hi Nvidia Support Team,

I am trying to convert our custom model from ONNX to a TensorRT model on the Jetson Nano. The conversion completes without errors, but the size and type of the TRT model generated on the Jetson Nano are completely different from what I get when converting on my local system.

→ Size of the ONNX model = 251 MB
→ Size of the TRT model generated on the Jetson Nano = 605 MB [type: STL 3D Model (binary)]
→ Size of the TRT model generated on my local system = 250 MB [type: binary]

Code used for the ONNX to TRT conversion:

    import tensorrt as trt
    from onnx import ModelProto

    TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
    trt_runtime = trt.Runtime(TRT_LOGGER)

    def build_engine(onnx_path, shape=[1, 1, 224, 224]):
        with trt.Builder(TRT_LOGGER) as builder, \
             builder.create_network(1) as network, \
             trt.OnnxParser(network, TRT_LOGGER) as parser:
            builder.max_workspace_size = (1 << 30)
            with open(onnx_path, 'rb') as model:
                # Report parser errors instead of silently ignoring them
                if not parser.parse(model.read()):
                    for i in range(parser.num_errors):
                        print(parser.get_error(i))
            network.get_input(0).shape = shape
            engine = builder.build_cuda_engine(network)
            return engine

    def save_engine(engine, file_name):
        buf = engine.serialize()
        with open(file_name, 'wb') as f:
            f.write(buf)

    def load_engine(trt_runtime, plan_path):
        with open(plan_path, 'rb') as f:
            engine_data = f.read()
        engine = trt_runtime.deserialize_cuda_engine(engine_data)
        return engine

    engine_name = "xyz.plan"
    onnx_path = "xyz.onnx"

    # Read the input shape from the ONNX model itself
    model = ModelProto()
    with open(onnx_path, "rb") as f:
        model.ParseFromString(f.read())

    batch_size = 1
    d0 = model.graph.input[0].type.tensor_type.shape.dim[1].dim_value
    d1 = model.graph.input[0].type.tensor_type.shape.dim[2].dim_value
    d2 = model.graph.input[0].type.tensor_type.shape.dim[3].dim_value
    shape = [batch_size, d0, d1, d2]

    engine = build_engine(onnx_path, shape)  # pass the shape read from the model
    save_engine(engine, engine_name)

Is there any way to reduce the size of the TensorRT model?
What workspace size should I use for the Jetson Nano?

Thanks

Hi,

Please note that TensorRT chooses the optimal algorithms based on the platform.
This may result in a different serialized file size.

A possible workaround is to increase the workspace size so TensorRT can consider more algorithms.

Thanks.
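For reference, the workspace size is a byte count, so the bit-shift expressions used in this thread map to power-of-two sizes. A plain arithmetic sketch (the constant names are illustrative, not part of the TensorRT API):

```python
# Workspace sizes are byte counts; left shifts give powers of two.
ONE_GIB = 1 << 30    # 1,073,741,824 bytes = 1 GiB (the value used in the thread)
TWO_GIB = 2 << 30    # 2 GiB
THREE_GIB = 3 << 30  # 3 GiB

print(ONE_GIB, TWO_GIB, THREE_GIB)
```

Any of these values can be assigned to `builder.max_workspace_size` before building the engine.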

Hi @AastaLLL
Thanks a lot for your response.

I tried changing the max workspace size (for example, 1<<30), but it does not seem to help; the TRT model size is still 605 MB.
Can you please guide me on the correct workspace size to use for the Jetson Nano?

Thanks in Advance

Hi,

The physical memory of the Nano is 4 GB, so please set the workspace to less than 4 GB.

The engine file size is model and platform dependent,
so the engine file generated on the Nano may be larger than on other platforms.

Thanks.
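One way to stay under that limit is to derive the workspace from the device's physical memory. The helper below is a heuristic sketch of my own, not an official NVIDIA recommendation; `safe_workspace_bytes` and its default fraction and cap are assumptions:

```python
def safe_workspace_bytes(total_mem_bytes, fraction=0.5, cap_bytes=2 << 30):
    """Heuristic: use at most `fraction` of physical memory, capped at
    `cap_bytes`, leaving headroom for the OS, the CUDA context, and the
    model weights themselves."""
    return min(int(total_mem_bytes * fraction), cap_bytes)

# Jetson Nano: 4 GB of memory shared between the CPU and GPU.
nano_mem_bytes = 4 * (1 << 30)
workspace = safe_workspace_bytes(nano_mem_bytes)
print(workspace)  # value to assign to builder.max_workspace_size
```

On the Nano this picks 2 GiB, which satisfies the "workspace < 4 GB" constraint while leaving memory free for the rest of the build.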