How to get the Yocto CUDA recipes to package an x86 nvcc in the SDK?

I am using the Yocto layers in the DriveOS SDK to produce an image for the Orin dev kit, and I am having trouble getting CUDA support built into the SDK.

Basically, I am following NVIDIA's documented instructions for building the tegra-drive-os-av-image.

After building the target image, I want to cross-compile programs for it, so I have BitBake produce an SDK with:
bitbake -c populate_sdk tegra-drive-os-av-image
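
In case it matters, here is the workflow I expect to use with the resulting SDK; the installer filename and install prefix below are illustrative, since they depend on the DISTRO and MACHINE configuration:

# BitBake drops a self-extracting SDK installer under tmp/deploy/sdk/
./tmp/deploy/sdk/*-glibc-x86_64-tegra-drive-os-av-image-*-toolchain-*.sh -d /opt/drive-sdk
# Cross-compiling then means sourcing the SDK environment on the x86_64 host
. /opt/drive-sdk/environment-setup-aarch64-*-linux
$CXX hello.cpp -o hello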

There are various features I need to enable, which I do in my build/conf/local.conf. In the Yocto branch that the Drive SDK ships, the canonical way to enable CUDA support in the SDK is to add it to the list of host tools packaged into the SDK, with a line like this in local.conf:
TOOLCHAIN_HOST_TASK_append = " nativesdk-cuda-toolkit"
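
For completeness, the relevant fragment of my local.conf currently looks roughly like this. The TOOLCHAIN_TARGET_TASK line is my attempt to also get the target-side CUDA libraries into the SDK sysroot, and the cuda-toolkit recipe name in it is a guess on my part, so correct it if the DriveOS layers expect something else:

# build/conf/local.conf (fragment)
# Host side: tools that must run on the x86_64 build machine
TOOLCHAIN_HOST_TASK_append = " nativesdk-cuda-toolkit"
# Target side: CUDA headers/libraries for the aarch64 sysroot ("cuda-toolkit" is a guess)
TOOLCHAIN_TARGET_TASK_append = " cuda-toolkit"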

However, this does not actually package the x86_64 nvcc executable in the SDK; it packages only the aarch64 nvcc. That is useless to me, because the whole point of the SDK is to cross-compile from x86_64 to aarch64, so nvcc itself must be able to run on the x86_64 host.
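
To make the requirement concrete, this is the kind of invocation I need to work on the x86_64 host from inside the SDK environment. The paths are illustrative, compute_87/sm_87 is the Orin GPU architecture, and CROSS_COMPILE is what the SDK's environment-setup script should export:

# Check which nvcc the SDK actually shipped (currently it is an aarch64 binary)
file /opt/drive-sdk/sysroots/*/usr/local/cuda*/bin/nvcc
# What I need: an x86_64 nvcc driving the SDK's aarch64 cross g++
nvcc -ccbin ${CROSS_COMPILE}g++ -gencode arch=compute_87,code=sm_87 kernel.cu -o kernel
file kernel   # should report an aarch64 ELF executable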

In newer versions of the NVIDIA Yocto layers, such as those provided for Jetson in the OE4T/meta-tegra repo on GitHub (a BSP layer for NVIDIA Jetson platforms, based on L4T), enabling CUDA is slightly different, and I can get OE4T/meta-tegra to package an x86_64 nvcc just fine. But the resulting SDK targets a newer CUDA version, and the CUDA executables it cross-compiles are not compatible with the DriveOS image running on the target.
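
To be concrete about the mismatch, it already shows up when comparing versions:

# nvcc from the OE4T-built SDK, on the x86_64 host
nvcc --version
# This reports a newer CUDA release than the CUDA runtime shipped in the DriveOS
# image on the Orin, so binaries built with it do not work against the target's libraries.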

I want to understand what I need to put in build/conf/local.conf so that the DriveOS Yocto layers package an x86_64 nvcc into the SDK produced by bitbake -c populate_sdk tegra-drive-os-av-image. Is this documented anywhere?