Yi-34B by 01.AI
Yi-34B is a bilingual large language model developed by 01.AI with 34 billion parameters. Trained on a 3-trillion-token multilingual corpus, it performs strongly on tasks such as reading comprehension, commonsense reasoning, and coding, and it is designed for both research and commercial applications.
Key Features:
Bilingual Proficiency: Supports English and Chinese, making it suitable for cross-lingual tasks.
High Performance: Achieves competitive results on benchmarks such as MMLU and C-Eval, with scores comparable to models like GPT-3.5.
Extended Context Window: Trained with a 4K sequence length, expandable to 32K during inference.
Open-Source Access: Available on platforms like Hugging Face for community use and development.
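The 4K-to-32K context extension mentioned above is commonly achieved by interpolating rotary position embeddings (RoPE) at inference time, so that longer positions are compressed back into the range the model saw during training. The sketch below is a minimal numerical illustration of that general technique, not necessarily 01.AI's exact method; the function name and parameters are illustrative.

```python
import math

def rope_angles(pos, dim, base=10000.0, scale=1.0):
    # Rotary position embedding angles for a single position.
    # scale > 1 implements position interpolation: effective position
    # pos/scale stays within the range seen during training.
    return [(pos / scale) / base ** (2 * i / dim) for i in range(dim // 2)]

# A model trained at 4K context can be run at 32K by scaling
# positions down by 32768 / 4096 = 8.
trained_ctx, target_ctx = 4096, 32768
scale = target_ctx / trained_ctx  # 8.0

# With interpolation, position 32768 produces the same angles the model
# saw for position 4096 during training.
assert rope_angles(target_ctx, 128, scale=scale) == rope_angles(trained_ctx, 128)
```

The trade-off is that interpolation compresses positional resolution, which is why methods of this kind are usually paired with a small amount of long-context fine-tuning.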


