Unable to run megatron-20b-gpt model as a NeMo Q&A model

Dear NVIDIA NeMo team,

We are planning to develop a question-answering chatbot based on NeMo. I noticed that NVIDIA provides pre-trained LLM weights for Megatron 1.3B, Megatron 5B, and Megatron 20B (https://huggingface.co/nvidia/nemo-megatron-gpt-1.3B, https://huggingface.co/nvidia/nemo-megatron-gpt-5B, https://huggingface.co/nvidia/nemo-megatron-gpt-20B). I would like to train a QA model on our own custom dataset, starting from one of these Megatron checkpoints, using NeMo. Could you please advise me on how to proceed?

Thank you very much for your time and assistance.

Best regards,
Raymond

Hi @yl.liu

Sorry for the delay in responding, I just found your post.
You might have better luck getting a response in the NeMo GitHub discussions area: https://github.com/NVIDIA/NeMo/discussions

Tom