Published: Nov 20, 2019
NOT ON THE CURRENT EDITION
This blip is not on the current edition of the Radar. If it was on one of the last few editions, it is likely that it is still relevant. If the blip is older, it might no longer be relevant and our assessment might be different today. Unfortunately, we simply don't have the bandwidth to continuously review blips from previous editions of the Radar.
Nov 2019
Assess

Fairseq is a sequence-to-sequence modeling toolkit by Facebook AI Research that allows researchers and developers to train custom models for translation, summarization, language modeling and other NLP tasks. For PyTorch users, it is a good choice. It provides reference implementations of various sequence-to-sequence models; supports distributed training across multiple GPUs and machines; is very extensible; and ships with a number of pretrained models, including RoBERTa, which builds on and optimizes BERT's pretraining approach.
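As a minimal sketch of what using one of those pretrained models looks like, the snippet below loads RoBERTa through PyTorch Hub and extracts features for a sentence, following the usage described in the fairseq documentation (it assumes PyTorch and fairseq are installed and downloads the weights on first use).

```python
import torch

# Load the pretrained RoBERTa base model published with fairseq
# (fetched via PyTorch Hub; weights are downloaded on first use).
roberta = torch.hub.load('pytorch/fairseq', 'roberta.base')
roberta.eval()  # disable dropout for inference

# Encode a sentence into RoBERTa's byte-pair token IDs,
# then extract the final-layer features for each token.
tokens = roberta.encode('Fairseq provides reference sequence-to-sequence models.')
features = roberta.extract_features(tokens)

print(features.shape)  # (1, number_of_tokens, 768) for the base model
```

The same object also exposes task-specific heads (for example, fine-tuned classification models), and custom models can be trained from the command line with fairseq's own training tools.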
