I originally posted this in another area of the forum. A moderator moved it here, but it isn't showing up in the list of recent posts and no one has responded, so I'm reposting.
We have a segmentation network that we originally ran on the GPU. Over the past few months we've adapted it to run on the DLA (see this other thread for that saga).
Everything looks like it's working now: we get no errors when building the network or running inference. However, the results are incorrect. The inferred segmentation is essentially a single value across the entire output, except for a few pixels in the upper-left corner.
However, if we disable these lines (the ones that select the DLA) and run on the GPU, everything works correctly. The only difference is whether inference runs on the DLA or not.
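For context, the lines we toggle are the ones that move the network onto the DLA at build time. I won't paste our exact wrapper code, but it is roughly equivalent to the standard TensorRT 6 builder-config calls below (a sketch, not our literal code; `config` is an `IBuilderConfig*`):

```cpp
// Sketch of the DLA selection we comment out to get the working GPU run.
// With these three lines disabled, the whole network builds for the GPU.
config->setDefaultDeviceType(nvinfer1::DeviceType::kDLA);
config->setDLACore(0);  // we target DLA core 0; core choice is an assumption here
config->setFlag(nvinfer1::BuilderFlag::kGPU_FALLBACK);  // unsupported layers fall back to GPU
```

So the build path is identical in both cases apart from this device selection.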
This happens in both the 4.3 developer preview and the 4.3 release.