Hi Team,
I was following the playbook (LLaMA Factory | DGX Spark) to fine-tune an LLM with LLaMA Factory on my DGX Spark. The Docker environment seems to conflict with the LLaMA Factory installation (in step 4), and here is the error message:
It appears that CUDA is not being activated.
Should I install all the dependent packages manually? Or is it an internal bug?
