Incorrect results when running on DLA

We have a segmentation network that we originally ran on the GPU. Over the past few months we have adapted it to run on the DLA (see this other thread for that saga).

Everything looks like it is working now: the network builds and runs inference without any errors. However, the results are incorrect. The inferred segmentation is essentially a single value across the entire image, apart from a few pixels in the upper-left corner.

If we disable these two lines:

builder->setDefaultDeviceType(DeviceType::kDLA);
builder->allowGPUFallback(false);

and run on the GPU instead, everything works. The only difference between the two runs is which device the network executes on.
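For reference, here is a minimal sketch of the full DLA build path using the same IBuilder-era API as the two lines above (JetPack 4.3 ships TensorRT 6, where these calls still exist but are deprecated). The network is assumed to be an already-populated INetworkDefinition, and the canRunOnDLA loop is a diagnostic suggestion rather than part of our original build:

#include "NvInfer.h"
#include <iostream>

using namespace nvinfer1;

// Sketch: build an engine targeting the DLA via the TensorRT 5/6-era
// IBuilder device APIs. `network` is assumed to be fully constructed.
ICudaEngine* buildDlaEngine(IBuilder* builder, INetworkDefinition* network)
{
    builder->setFp16Mode(true);                      // DLA requires FP16 (or INT8)
    builder->setDefaultDeviceType(DeviceType::kDLA); // place layers on DLA by default
    builder->setDLACore(0);                          // Xavier has two DLA cores: 0 and 1
    builder->allowGPUFallback(false);                // fail the build instead of silently using GPU

    // Diagnostic: report any layer the builder cannot place on the DLA.
    // With fallback disabled such layers abort the build; with fallback
    // enabled they run on the GPU, which can mask placement issues.
    for (int i = 0; i < network->getNbLayers(); ++i)
    {
        ILayer* layer = network->getLayer(i);
        if (!builder->canRunOnDLA(layer))
            std::cout << "not DLA-capable: " << layer->getName() << "\n";
    }

    return builder->buildCudaEngine(*network);
}

One detail worth noting while debugging: targeting the DLA forces FP16 (or INT8) precision, so a default FP32 GPU run is not a like-for-like comparison. Building the GPU engine with setFp16Mode(true) as well would help narrow down whether the bad output comes from reduced precision or from the DLA mapping itself.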

This happens with both the JetPack 4.3 developer preview and the 4.3 release.

Moving to Jetson forum so the Jetson team can take a look.

Duplicate of topic 1071004:
https://devtalk.nvidia.com/default/topic/1071004/jetson-agx-xavier/wrong-results-when-running-network-on-dla-instead-of-gpu/