OpenAI Presents GPT-3, a 175-Billion-Parameter Language Model

Originally published on the NVIDIA Technical Blog.

OpenAI researchers recently released a paper describing the development of GPT-3, a state-of-the-art language model with 175 billion parameters. For comparison, its predecessor, GPT-2, had 1.5 billion parameters, and the largest Transformer-based language model before GPT-3, released by Microsoft earlier this year, had 17 billion parameters. “GPT-3…
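To put these figures in perspective, here is a minimal back-of-the-envelope sketch in Python. The parameter counts come from the article itself; the half-precision (fp16) storage assumption is ours, used only to illustrate the raw memory needed just to hold each model's weights.

```python
# Illustrative arithmetic only: compare the parameter counts cited above
# and estimate the memory required to store the weights in fp16.

models = {
    "GPT-2": 1.5e9,            # parameters, per the article
    "Microsoft (17B)": 17e9,   # largest Transformer LM before GPT-3
    "GPT-3": 175e9,
}

BYTES_PER_PARAM_FP16 = 2  # half-precision float, a common inference format

for name, params in models.items():
    ratio = params / models["GPT-2"]          # scale relative to GPT-2
    mem_gb = params * BYTES_PER_PARAM_FP16 / 1e9
    print(f"{name:>16}: {params / 1e9:6.1f}B params, "
          f"{ratio:6.1f}x GPT-2, ~{mem_gb:6.1f} GB at fp16")
```

Under these assumptions, GPT-3 is roughly 117x the size of GPT-2 and about 10x the size of the 17-billion-parameter model, and its weights alone would occupy on the order of 350 GB in fp16, far beyond the memory of any single GPU of the time.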