[TensorRT Error] DLA validation failed

Hello,
I have a network layer that can run on DLA (validated with config->canRunOnDLA(layer)) and that has also been INT8-quantized. However, when I assign it to DLA with config->setDeviceType(layer, DeviceType::kDLA), building the engine still fails with a "DLA validation failed" error.
Here is my code for this part:

        // Allow layers that cannot run on DLA to fall back to the GPU
        config->setFlag(BuilderFlag::kGPU_FALLBACK);
        // Use DLA core 0
        config->setDLACore(0);

        ILayer* layer = network->getLayer(10);
        if (config->canRunOnDLA(layer)) {
            config->setDeviceType(layer, DeviceType::kDLA);
        }

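For context, here is a minimal sketch of the builder setup the snippet above would sit in. It is an assumption, not the poster's actual code: the function name configureDLA and the loop over all layers are illustrative. One detail worth checking is that DLA only supports FP16 and INT8 precision, so the matching builder flag (BuilderFlag::kINT8 here, since the layer is INT8-quantized) must also be set on the config; marking a layer for DLA without it is a common cause of build-time DLA validation failures.

        // Hypothetical surrounding setup (sketch, not the poster's code).
        #include "NvInfer.h"
        using namespace nvinfer1;

        void configureDLA(IBuilderConfig* config, INetworkDefinition* network)
        {
            // DLA runs only in FP16/INT8; without one of these precision
            // flags, setDeviceType(..., kDLA) can fail engine-build validation.
            config->setFlag(BuilderFlag::kINT8);

            // Allow layers that DLA cannot run to fall back to the GPU.
            config->setFlag(BuilderFlag::kGPU_FALLBACK);
            config->setDLACore(0);

            // The original snippet assigns only layer 10; looping over all
            // DLA-capable layers is one alternative.
            for (int i = 0; i < network->getNbLayers(); ++i)
            {
                ILayer* layer = network->getLayer(i);
                if (config->canRunOnDLA(layer))
                {
                    config->setDeviceType(layer, DeviceType::kDLA);
                }
            }
        }

Comparing the flags trtexec passes in the working case (e.g. --int8, --fp16) against the config set in code may help narrow down the difference.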
and the error:

ERROR: 4: [network.cpp::validate::2789] Error Code 4: Internal Error (DLA validation failed)
ERROR: 2: [builder.cpp::buildSerializedNetwork::751] Error Code 2: Internal Error (Assertion engine != nullptr failed. )

Additionally, this does not appear to be a problem with DLA or TensorRT itself, since I can successfully build an engine that runs on DLA using trtexec.

Hi,
Here are some suggestions for common issues:

1. Performance

Please run the commands below before benchmarking a deep learning use case:

$ sudo nvpmodel -m 0
$ sudo jetson_clocks

2. Installation

Installation guide of deep learning frameworks on Jetson:

3. Tutorial

Startup deep learning tutorial:

4. Report issue

If these suggestions don't help and you want to report an issue to us, please share the model, the command/steps, and the customized app (if any) so we can reproduce the issue locally.

Thanks!

Hi,

This topic is a duplicate of the one below.
Let's continue the discussion there instead:

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.