I am trying to add “KernelCommandLine” to my L4TConfiguration.dts to include some new default kernel bootargs. The problem is that KernelCommandLine is defined as uint16 since it is Unicode. If I set data = "string value"; then it appears that the individual characters are somehow transformed to Unicode characters and I get a kernel command line with Asian characters in place of “string value”. If I use data = [0072 006f]; to try to force 2 Unicode characters, for example, I get the same result: the characters are similarly misinterpreted into 4 Unicode characters of unknown provenance. What is the syntax in the device tree to pass a Unicode string, or at least somehow get the characters I want transferred to the kernel command line?
This is what I am using now (that isn’t working):
data = “root=/dev/mmcblk0p1 rauc.slot=A”;
The device tree has node “chosen->bootloader” (on a running system, check “cat /proc/device-tree/chosen/bootloader”). This is appended to the kernel boot command line. Can you add the “rauc.slot=A” there? Also, when this fails, what do you see from “cat /proc/cmdline”?
On the computer you work from, what do you see from “echo $LANG”?
Thanks for your response. It is a Yocto-generated kernel and file system (core-image-minimal) running on a standard NVIDIA Orin dev kit. I am logged in as root, so I don’t need sudo, and there is nothing at all in the /boot directory.
This question has nothing to do with Yocto. The question is: “What is the syntax in the device tree L4TConfiguration.dts for specifying a value for ‘KernelCommandLine’, given that it is defined as Unicode characters?”
Unfortunately, that yielded the same result: Asian characters appended to the boot arg list. The magic here is most definitely knowing the device tree syntax that allows specification of Unicode characters.
My money is currently on the device tree compiler turning the string into a standard ASCII-encoded string rather than a Unicode string, because every other string in the device tree is ASCII-encoded and NOT Unicode. I don’t see a way to flag a Unicode string to the device tree compiler. If there is some magic way to define a 16-bit word array to the device tree compiler, that could be a work-around too, I suppose. The few attempts I have made at that have seen the compiler break every array into 8-bit bytes that are once again mangled by the code that reads this particular device tree entry (it reads each byte and expands it to 16 bits).
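Editor’s note: dtc does provide a directive for fixed-width cell arrays, /bits/, which may be the 16-bit work-around described above. A minimal sketch, assuming the property is the same “data” shown earlier and that the consuming code reads one 16-bit code unit per cell; the node layout here is illustrative, not the actual L4TConfiguration.dts structure:

```dts
/dts-v1/;

/ {
    chosen {
        bootloader {
            /* /bits/ 16 stores each value in a 16-bit cell rather
             * than the default 32-bit cell, so each ASCII code point
             * occupies exactly one uint16. Spell out the command line
             * one character per cell; append <0x00> at the end if the
             * consuming code expects a NUL terminator. */
            data = /bits/ 16 <0x72 0x6f 0x6f 0x74>; /* "root" */
        };
    };
};
```

One caveat: dtc emits cells big-endian in the flattened blob, so if the reader of this property expects little-endian UTF-16 the bytes may still need swapping; verifying with “hexdump -C /proc/device-tree/chosen/bootloader/…” on the running system would settle which end is mangling the data.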