In the previous edition of the Radar we featured BERT, a key milestone in the NLP landscape. Last year, Baidu released ERNIE 2.0 (Enhanced Representation through kNowledge IntEgration), which outperformed BERT on seven GLUE language-understanding tasks and on all nine Chinese NLP tasks. Like BERT, ERNIE provides unsupervised pretrained language models that can be fine-tuned by adding output layers to create state-of-the-art models for a variety of NLP tasks. ERNIE differs from traditional pretraining methods in that it is a continual pretraining framework: rather than training on a small, fixed set of pretraining objectives, it incrementally introduces a large variety of pretraining tasks to help the model learn language representations more effectively. We're excited about these advancements in NLP and are looking forward to experimenting with ERNIE on our projects.
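As a rough illustration of the fine-tuning pattern described above (a pretrained encoder plus a newly added output layer), here is a minimal sketch using the Hugging Face transformers API. The checkpoint name nghuyong/ernie-2.0-base-en, the two-label sentiment setup, and the toy example sentence are assumptions for illustration only, not something prescribed by ERNIE itself.

```python
# Minimal fine-tuning sketch: a pretrained ERNIE-style encoder with a fresh
# classification head, in the same spirit as BERT fine-tuning.
# Assumptions: the Hugging Face `transformers` library (with ERNIE support) and
# the community checkpoint "nghuyong/ernie-2.0-base-en" are available.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "nghuyong/ernie-2.0-base-en"  # assumed ERNIE 2.0 English checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
# num_labels=2 attaches a freshly initialised output layer on top of the encoder.
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# A single labelled example, tokenised and pushed through the model.
inputs = tokenizer("The movie was surprisingly good.", return_tensors="pt")
labels = torch.tensor([1])  # 1 = positive sentiment in this toy label scheme

outputs = model(**inputs, labels=labels)
loss = outputs.loss    # cross-entropy loss computed against the new head
loss.backward()        # gradients flow into both the head and the encoder
print(loss.item())
```

In a real project this step would sit inside a training loop over a task-specific dataset; the point here is only that the pretrained representations are reused and only a small task head is added before fine-tuning.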