Seeking kit-kernel@110.0.0+feature.linux-aarch64.release for E2CC on DGX Spark

**Work Flow:** Earth-2 Weather Analytics Blueprint on DGX Spark (aarch64)
**Main Issue:** kit-kernel 110.0.0 linux-aarch64 package not accessible via urm.nvidia.com or NGC
**Reproduction Steps:** Run bash repo.sh build on DGX Spark after updating kit-sdk.packman.xml to 110.0.0
**Error Code:** Connection failure. Retries exhausted. Package not found on specified remote servers (name: kit-kernel, version: 110.0.0+feature.linux-aarch64.release)

Hey team,

This is my very first post, and I'm relatively new to the developer world, but I own a DGX Spark and I've been working my way through the official playbooks and blueprints: building, testing, and tearing down each one to really understand the stack.

I’m currently attempting the Earth-2 Weather Analytics blueprint and I’ve made significant progress getting it ready for aarch64. Steps so far:

  • Identified that the blueprint’s E2CC component was pinned to Kit SDK 106.5.0, which predates ARM support
  • Updated deps/kit-sdk.packman.xml to target kit-kernel 110.0.0 using the format from the current kit-app-template
  • Successfully compiled onnxruntime-gpu 1.25.0 from source against CUDA 13 and compute capability 12.1 / sm_121 (Blackwell), since no aarch64 build is available on PyPI
  • Confirmed the DFM containers, Redis, and FourCastNet NIM all have aarch64 support
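For reference, the change in the second step above amounts to something like the following in deps/kit-sdk.packman.xml. This is a sketch from the kit-app-template layout; the exact attributes and link paths in your copy of the template may differ:

```xml
<project toolsVersion="5.6">
  <!-- Sketch: version template updated to 110.0.0 so packman resolves
       kit-kernel@110.0.0+feature.${platform}.${config}, which on a
       DGX Spark release build becomes
       kit-kernel@110.0.0+feature.linux-aarch64.release -->
  <dependency name="kit_sdk_${config}" linkPath="../_build/${platform}/${config}/kit">
    <package name="kit-kernel"
             version="110.0.0+feature.${platform}.${config}"
             platforms="windows-x86_64 linux-x86_64 linux-aarch64"/>
  </dependency>
</project>
```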

The one missing piece is kit-kernel 110.0.0 for linux-aarch64. Packman correctly resolves the package name as:

kit-kernel@110.0.0+feature.linux-aarch64.release

And packman attempts to fetch it from:

urm.nvidia.com/artifactory/ct-omniverse-generic/pkgs/kit-kernel/

However, urm.nvidia.com does not resolve in public DNS — it appears to be an internal NVIDIA registry not accessible outside NVIDIA's network. I also tried the NGC CLI and downloaded kit-sdk-linux 110.0.0 via:

ngc registry resource download-version "nvidia/omniverse/kit-sdk-linux"

The downloaded package is kit-sdk@110.0.0+feature.manylinux_2_35_x86_64.zip — x86_64 only. No aarch64 variant is listed on NGC.
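As a sanity check, the architecture of any binary inside a downloaded Kit package can be confirmed directly from its ELF header rather than trusting the filename: e_machine is 62 for x86-64 and 183 for AArch64. A minimal sketch (the fabricated header below is just for illustration; in practice you would read the first 20 bytes of the kit executable from the unzipped package):

```python
import struct

# e_machine values from the ELF specification
EM_X86_64 = 62
EM_AARCH64 = 183

def elf_machine(header: bytes) -> str:
    """Return the target architecture of an ELF file, given its first 20 bytes."""
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    # e_ident occupies bytes 0-15; e_type is at offset 16, e_machine at 18.
    # Assumes little-endian encoding (EI_DATA == 1), which holds for both
    # x86_64 and aarch64 Linux binaries.
    (machine,) = struct.unpack_from("<H", header, 18)
    return {EM_X86_64: "x86_64", EM_AARCH64: "aarch64"}.get(machine, f"unknown({machine})")

# Fabricated aarch64 ELF header: magic, 64-bit class, little-endian, v1, padding,
# then e_type=2 (EXEC) and e_machine=183 (AArch64)
fake = b"\x7fELF" + b"\x02\x01\x01" + b"\x00" * 9 + struct.pack("<HH", 2, EM_AARCH64)
print(elf_machine(fake))  # → aarch64
```

The same check with `open("kit/kit", "rb").read(20)` on the extracted NGC package would confirm the x86_64-only finding above.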

Kit SDK 109.0.1 release notes explicitly state ARM platform support was added targeting DGX Spark. Kit 110.0.0 added aarch64 shader compiler support. The code clearly supports ARM — but I can’t find the aarch64 kit-kernel binary through any public channel.

My question: has kit-kernel 110.0.0 for linux-aarch64 / manylinux_2_35_aarch64 been published to urm.nvidia.com or NGC? If not, is there a timeline or an alternate access path for DGX Spark owners?

I’m happy to be pointed to another forum if this is the wrong place. Any help for a Spark user would be greatly appreciated.

Hardware: DGX Spark, GB10 Grace Blackwell, 128GB unified RAM, aarch64, headless
Kit version targeted: 110.0.0
OS: DGX OS (Ubuntu 24.04)

@Richard3D Hello and thank you for your reply via email. I apologize for being such a noob and asking for "the kernel"; I will leave the title unchanged as a badge of noob shame, but please hear me out: while the wording was wrong, the request is still valid. I have checked both NGC and the NVIDIA-Omniverse GitHub organization and did not find an aarch64 variant of the Kit SDK usable on the DGX Spark.

Just to clarify, the reason I named kit-kernel specifically is that it was referenced by name in NVIDIA's own packman dependency manager, embedded in the kit-app-template repo, which attempted to fetch it from urm.nvidia.com/artifactory/ct-omniverse-generic/pkgs/kit-kernel/. Packman resolved that package name from the dependency file and told me exactly where it was trying to go.
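For anyone following along: packman builds the full package identifier by substituting the platform and config tokens into the version template from the dependency file. A toy illustration of how kit-kernel@110.0.0+feature.linux-aarch64.release falls out (my own sketch of the naming convention, not packman's actual code):

```python
def resolve_package(name: str, version_template: str, platform: str, config: str) -> str:
    """Substitute packman-style ${platform}/${config} tokens and build name@version."""
    version = (version_template
               .replace("${platform}", platform)
               .replace("${config}", config))
    return f"{name}@{version}"

# On a DGX Spark release build the tokens become linux-aarch64 / release:
pkg = resolve_package("kit-kernel", "110.0.0+feature.${platform}.${config}",
                      "linux-aarch64", "release")
print(pkg)  # → kit-kernel@110.0.0+feature.linux-aarch64.release
```

That identifier is exactly what appears in the error output above, which is why the report names kit-kernel rather than the kit-sdk bundle.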

My short-term goal is to run the Earth-2 Weather Analytics blueprint on the Spark, and I believe I'm very close — this appears to be the last piece I need. In my search I also came across an earlier thread (link below) from October 2025, when ARM support wasn't yet available. I was hoping things may have changed with 109.0.1 and 110.0.0, given that the reply mentioned the possibility of ARM support.

Ultimately I’m just trying to challenge my GB10 Spark and get more value from it. If this is a known limitation acknowledged by its creators then so be it — I have exhausted what’s possible on the platform circa Q2 2026. No harm no foul. I just want to completely exhaust my search options.

Thanks in advance!

Ok, I have done some research and I think this is what you need: the official Kit 110 kit for ARM64. Try this:
GPU-optimized AI, Machine Learning, & HPC Software | NVIDIA NGC

Rather than using “kit-sdk@110.0.0+feature.manylinux_2_35_x86_64.zip”, you need this: “kit-kernel@110.0.0+feature.manylinux_2_35_aarch64.release”


@Richard3D Hello again and thank you so much — your research made a real difference. The kit-kernel-linux-arm download was the key that unlocked the build system and got the aarch64 Kit binary actually running on the Spark.

That said, after getting past the kit-kernel wall I ran into additional licensing roadblocks I hadn’t fully anticipated. hpcvis.dynamictexture 0.4.1 — a proprietary NVIDIA extension required by E2CC — is not publicly distributed, and without it the dependency solver fails at precache. At that point I realized that asking for components piecemeal for a blueprint that isn’t fully supported on the Spark is not a viable path forward, regardless of what the hardware is physically capable of.

I did know going in that Earth-2 wasn’t officially supported on DGX Spark. But given what this machine is capable of, it felt like it might just be right there on the edge and I just wanted to give it a push.

I’ve pivoted to an open source approach using Earth2Studio directly with FourCastNet and a Cesium globe for visualization. It doesn’t fully flex the Spark the way the RTX rendering layer would have, but it produces a demo with visuals. I’ll keep my eyes peeled for when full Spark support lands!

Thanks again for digging into this — your help genuinely mattered.

Well, it sounds like there is no issue with the Spark hardware running Earth-2; it is purely a licensing issue, as you stated. If you were to acquire the correct license, I am sure you could run Earth-2 on your Spark. But if you are just doing it for fun, then it may not be worth it.