Published: Sep 27, 2023
Sep 2023
Assess

There are many emerging large language models (LLMs) in the English-speaking world. Although these models are usually pretrained on multiple languages, their performance in languages other than English often falls short of their performance in English. ChatGLM, developed by Tsinghua University, is an open bilingual language model based on the General Language Model (GLM) architecture and optimized for Chinese conversation. Because Chinese differs from English in word segmentation and grammar, an LLM optimized for Chinese is important. Our team found that ChatGLM beat other LLMs in accuracy and robustness when we built a Chinese emotion detection application for a call center. Since many LLMs aren't available in China due to licensing or regional restrictions, ChatGLM became one of the few open-source options.
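As a rough illustration of how such a model can be tried out, the sketch below loads a ChatGLM checkpoint through the Hugging Face transformers loading pattern published for ChatGLM and asks it to classify the emotion of a Chinese utterance. The model ID, the prompt wording and the emotion labels are illustrative assumptions, not the setup our team used.

    # Minimal sketch, assuming the THUDM/chatglm2-6b checkpoint on Hugging Face
    # and a CUDA-capable GPU. The trust_remote_code flag and chat() helper follow
    # the loading pattern published with ChatGLM; adjust for the variant you use.
    from transformers import AutoModel, AutoTokenizer

    MODEL_ID = "THUDM/chatglm2-6b"  # assumed checkpoint name

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModel.from_pretrained(MODEL_ID, trust_remote_code=True).half().cuda()
    model = model.eval()

    # Illustrative prompt: ask for one of three emotion labels for a
    # call-center style sentence ("the issue was reported three times and
    # is still unresolved").
    prompt = (
        "请判断下面这句话的情绪，只回答：正面、负面或中性。\n"
        "句子：这个问题已经反馈三次了，还是没有解决。"
    )
    response, history = model.chat(tokenizer, prompt, history=[])
    print(response)  # expected to be one of the three labels, e.g. 负面

In practice an emotion detection application would constrain the output further (for example by validating the response against the allowed labels) and batch requests, but the above shows the basic request/response shape.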
