Learn How to Build Transformer-Based Natural Language Processing Applications

Originally published at: https://developer.nvidia.com/blog/learn-how-to-build-transformer-based-natural-language-processing-applications/

Deep learning models have gained widespread popularity for natural language processing (NLP) because of their ability to generalize accurately over a range of contexts and languages. Transformer-based models, such as Bidirectional Encoder Representations from Transformers (BERT), have revolutionized NLP by offering accuracy comparable to human baselines on benchmarks like SQuAD for question answering, entity recognition, intent…