Orin Nano Developer Kit takes a long time to boot

No. My NX runs with 4.6.1.

Could you help to verify the boot time for Xavier NX with JP5.1.1?

There are many differences between JP4 and JP5; the most significant one is the bootloader: JP5 uses UEFI instead of the CBoot used in JP4.

UEFI waits 5 seconds for the UEFI menu, takes another 6 seconds to reach L4TLauncher, and then L4TLauncher takes 5 seconds before kernel boot. I will check UEFI first, but the kernel also takes a very long time (00:19 - 00:50).

I built a UEFI image to remove the 5-second delay and changed L4TConfiguration.dts to move SD to the top of the boot priority. I want to flash those changes to my board, but I don't want to flash everything. How can I flash just a few partitions to cover my changes?

To optimize the kernel boot time, you could disable unused modules in the kernel defconfig to reduce driver probe time.
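As a rough illustration of that kind of defconfig edit (the config symbol and file path below are arbitrary examples, not taken from this thread; the real file would be something like arch/arm64/configs/tegra_defconfig in the kernel sources):

```shell
# Example only: /tmp/defconfig stands in for the real kernel defconfig
cat > /tmp/defconfig <<'EOF'
CONFIG_CAN=y
CONFIG_WIRELESS=y
EOF
# Disable a driver you don't need (CAN bus chosen as an arbitrary example);
# "# CONFIG_FOO is not set" is the kconfig convention for a disabled symbol
sed -i 's/^CONFIG_CAN=y$/# CONFIG_CAN is not set/' /tmp/defconfig
cat /tmp/defconfig
```

After editing the real defconfig, rebuild the kernel as usual so the disabled drivers are no longer probed at boot.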

To change the boot order, you could just replace the L4TConfiguration.dtbo on your board.
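For reference, the boot order in L4TConfiguration.dts is expressed as a UEFI variable; in recent edk2-nvidia trees it looks roughly like the fragment below. Treat this as an assumption and check your branch: the exact node names, GUID label, and the priority string are examples, not copied from this thread.

```dts
firmware {
	uefi {
		variables {
			gNVIDIAPublicVariableGuid {
				DefaultBootPriority {
					/* comma-separated device classes, tried in order */
					data = "sd,usb,nvme,emmc,net";
					non-volatile;
				};
			};
		};
	};
};
```

After editing, recompile the source to a .dtbo with dtc and replace the existing L4TConfiguration.dtbo.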

I want to flash just the UEFI partition, which should be in the Orin Nano module's QSPI. I don't want to flash the other partitions.

I tried the command below with "-k A_cpu-bootloader", but it did not work.

sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device mmcblk1p1 -c tools/kernel_flash/flash_l4t_external.xml -p "-c bootloader/t186ref/cfg/flash_t234_qspi.xml" -k A_cpu-bootloader --showlogs --network usb0 jetson-orin-nano-devkit internal

Do you mean that updating uefi_jetson.bin did not remove the 5 s delay in UEFI?
If so, that is expected. You could refer to the following thread for details.
Optimizing UEFI boot time - #11 by KevinFFF

If you just want to update UEFI for Orin Nano, you could just run the following command.

$sudo ./flash.sh -c bootloader/t186ref/cfg/flash_t234_qspi.xml -k A_cpu-bootloader jetson-orin-nano-devkit mmcblk0p1

One more thing: I did not see the UEFI boot logo. UEFI exited in about 20 seconds, but no logo appeared on screen.

I enabled UEFI debug output and saw the error logs below, which may be related to display.

add-symbol-file /build/nvidia-uefi/Build/Jetson/DEBUG_GCC5/AARCH64/Silicon/NVIDIA/Drivers/NvDisplayControllerDxe/NvDisplayControllerDxe/DEBUG/NvDisplayControllerDxe.dll 0x240368000
Loading driver at 0x00240367000 EntryPoint=0x0024036EFE4 NvDisplayControllerDxe.efi

CreateFramebufferResource: no framebuffer region present
DeviceDiscoveryBindingStart, driver returned Not Found to start notification
CreateFramebufferResource: no framebuffer region present
DeviceDiscoveryBindingStart, driver returned Not Found to start notification
add-symbol-file /build/nvidia-uefi/Build/Jetson/DEBUG_GCC5/AARCH64/Silicon/NVIDIA/Drivers/FwImageDxe/FwImageDxe/DEBUG/FwImageDxe.dll 0x240080000
Loading driver at 0x00240070000 EntryPoint=0x00240084580 FwImageDxe.efi

I checked the UEFI code and the debug log. The framebuffer base/size information comes from TEGRA_CPUBL_PARAMS. The code below is from the function T234GetPlatformResourceInformation.

// Populate FrameBufferInfo
PlatformResourceInfo->FrameBufferInfo.Base = CPUBL_PARAMS (CpuBootloaderParams, CarveoutInfo[CARVEOUT_DISP_EARLY_BOOT_FB].Base);
PlatformResourceInfo->FrameBufferInfo.Size = CPUBL_PARAMS (CpuBootloaderParams, CarveoutInfo[CARVEOUT_DISP_EARLY_BOOT_FB].Size);

The error log “CreateFramebufferResource: no framebuffer region present” is printed only when either the framebuffer base or size is 0. This is also confirmed by the log below: there is no Carveout 39 entry, and 39 is CARVEOUT_DISP_EARLY_BOOT_FB.

So it appears the framebuffer base/size are not set in TEGRA_CPUBL_PARAMS. How are the framebuffer base/size supposed to be set in TEGRA_CPUBL_PARAMS?

DRAM Encryption Enabled
Carveout 1 Region: Base: 0x0000000268F00000, Size: 0x0000000000100000
Carveout 2 Region: Base: 0x000000026B800000, Size: 0x0000000000800000
Carveout 3 Region: Base: 0x000000026B000000, Size: 0x0000000000800000
Carveout 4 Region: Base: 0x0000000268E00000, Size: 0x0000000000100000
Carveout 5 Region: Base: 0x0000000268D00000, Size: 0x0000000000100000
Carveout 6 Region: Base: 0x000000026A800000, Size: 0x0000000000800000
Carveout 7 Region: Base: 0x0000000269400000, Size: 0x0000000000400000
Carveout 8 Region: Base: 0x0000000268C00000, Size: 0x0000000000100000
Carveout 9 Region: Base: 0x0000000268B00000, Size: 0x0000000000100000
Carveout 10 Region: Base: 0x000000026A000000, Size: 0x0000000000800000
Carveout 11 Region: Base: 0x000000004007A000, Size: 0x0000000000002000
Carveout 13 Region: Base: 0x000000004007C000, Size: 0x0000000000002000
Carveout 14 Region: Base: 0x000000004007E000, Size: 0x0000000000002000
Carveout 15 Region: Base: 0x0000000268A00000, Size: 0x0000000000100000
Carveout 16 Region: Base: 0x0000000040078000, Size: 0x0000000000002000
Carveout 17 Region: Base: 0x0000000268900000, Size: 0x0000000000100000
Carveout 18 Region: Base: 0x0000000040076000, Size: 0x0000000000002000
Carveout 20 Region: Base: 0x0000000272000000, Size: 0x0000000002000000
Carveout 21 Region: Base: 0x0000000040074000, Size: 0x0000000000002000
Carveout 22 Region: Base: 0x000000026D000000, Size: 0x0000000001000000
Carveout 23 Region: Base: 0x000000026C000000, Size: 0x0000000000200000
Carveout 24 Region: Base: 0x0000000270000000, Size: 0x0000000002000000
Carveout 25 Region: Base: 0x0000000040072000, Size: 0x0000000000002000
Carveout 27 Region: Base: 0x0000000268800000, Size: 0x0000000000100000
Carveout 28 Region: Base: 0x000000026E000000, Size: 0x0000000002000000
Carveout 30 Region: Base: 0x0000000040000000, Size: 0x0000000000040000
Carveout 31 Region: Base: 0x0000000278000000, Size: 0x0000000008000000
Carveout 33 Region: Base: 0x0000000269000000, Size: 0x0000000000400000
Carveout 34 Region: Base: 0x0000000268670000, Size: 0x0000000000010000
Carveout 35 Region: Base: 0x000000026C200000, Size: 0x0000000000E00000
Carveout 38 Region: Base: 0x0000000080000000, Size: 0x00000001E8670000
Carveout 40 Region: Base: 0x0000000040070000, Size: 0x0000000000002000
Carveout 42 Region: Base: 0x0000000268700000, Size: 0x0000000000100000
Carveout 43 Region: Base: 0x0000000274000000, Size: 0x0000000004000000
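The missing carveout noted above can also be checked mechanically with a quick grep. This sketch just recreates two lines of the log for a self-contained demonstration; in practice you would grep the saved UART/debug log (the /tmp path here is hypothetical):

```shell
# Recreate a fragment of the debug log above, then look for carveout 39
cat > /tmp/uefi-debug.log <<'EOF'
Carveout 38 Region: Base: 0x0000000080000000, Size: 0x00000001E8670000
Carveout 40 Region: Base: 0x0000000040070000, Size: 0x0000000000002000
EOF
if ! grep -q '^Carveout 39 ' /tmp/uefi-debug.log; then
  echo "CARVEOUT_DISP_EARLY_BOOT_FB (39) missing"
fi
```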

Which UEFI branch from GitHub are you using?

I followed the steps in Build with docker · NVIDIA/edk2-nvidia Wiki · GitHub; please check the two steps below.

edk2_docker edkrepo manifest-repos add nvidia https://github.com/NVIDIA/edk2-edkrepo-manifest.git main nvidia
edk2_docker edkrepo clone nvidia-uefi NVIDIA-Jetson main

Any update for this case?

Please use the r35.3.1-updates branch for JP5.1.1:

$ edkrepo clone nvidia-uefi-r35.3.1-updates NVIDIA-Jetson r35.3.1-updates

@KevinFFF ,

I built UEFI with r35.3.1-updates and flashed it onto my board. I saw a logo at about 20 seconds, which should be UEFI displaying its logo. But it still took about 59 seconds to reach the kernel boot logo and the desktop. So the r35.3.1-updates branch fixes the issue of UEFI not displaying its logo, but the boot time is still as long as before.

Please refer to the following thread.
Optimizing UEFI boot time - #8 by ts01399984

If you don't want the network stack, you could remove it from the .fdf file.
The same applies to other features.
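As a hedged sketch of what that edit looks like (the file name Jetson.fdf and the INF paths below are assumptions; check the actual platform .fdf in your edk2-nvidia tree), the network driver entries can simply be commented out:

```shell
# Stand-in for the platform flash description file (name is an assumption)
cat > /tmp/Jetson.fdf <<'EOF'
INF NetworkPkg/SnpDxe/SnpDxe.inf
INF MdeModulePkg/Universal/HiiDatabaseDxe/HiiDatabaseDxe.inf
EOF
# Comment out every NetworkPkg driver entry; "&" in sed re-inserts the match
sed -i 's|^INF NetworkPkg/|# &|' /tmp/Jetson.fdf
cat /tmp/Jetson.fdf
```

Then rebuild UEFI so the commented-out drivers are no longer dispatched at boot.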

So the r35.3.1-updates branch does not include any change to reduce UEFI boot time?

As I replied in your other topic, we are still working on it.
For now, please try removing the features you don't use to optimize the boot time.

Hi Harry,

There should be a 4 s improvement in boot-up with Setup Early Mmu by ashishsingha · Pull Request #60 · NVIDIA/edk2-nvidia · GitHub.
You could use the latest r35.3.1-updates branch from GitHub for UEFI to verify.

OK. I will try it.

I tried the latest r35.3.1-updates, and it did give a 4 s improvement in UEFI boot. Will you continue to reduce the boot time? The whole boot still takes 50+ seconds to reach the desktop.

BTW, I also tried removing some components from UEFI, such as the network stack, but I couldn't see much improvement from those changes, and I'm hesitant to remove too many components because I don't know what side effects removing them may have.