Published : Apr 03, 2024
Apr 2024
Assess
Baichuan 2 is part of a new generation of open-source large language models. It was trained on a high-quality corpus of 2.6 trillion tokens and achieves strong performance for its size on Chinese, English and multilingual benchmarks. Baichuan has also been trained on several domain-specific corpora, including healthcare and law data sets, which is why we prefer it in these and related fields.