this post was submitted on 18 Jun 2023

LocalLLaMa


Magazine to talk about LLaMA (large language model created by Meta AI) and any related Open Source LLMs. Inspired by Reddit's /r/LocalLLaMA/ subreddit.


baichuan-7B is an open-source large language model developed by Baichuan Intelligent Technology. Based on the Transformer architecture, it has 7 billion parameters and was trained on approximately 1.2 trillion tokens. It supports both Chinese and English, with a context window of 4,096 tokens, and achieves the best performance among models of its size on authoritative Chinese and English benchmarks (C-Eval/MMLU).

GitHub: https://github.com/baichuan-inc/baichuan-7B

Hugging Face: https://huggingface.co/baichuan-inc/baichuan-7B
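
For anyone who wants to try it, here is a minimal sketch of loading the model through the standard Hugging Face transformers API. This assumes the repo ships custom modeling code (so `trust_remote_code=True` is needed) and that you have roughly 15 GB of GPU or CPU memory free for the fp16 weights; the exact generation settings are illustrative, not taken from the repo's README.

```python
# Minimal sketch: running baichuan-7B via Hugging Face transformers.
# Assumptions: the model repo requires trust_remote_code=True, and
# enough memory is available to hold the 7B weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "baichuan-inc/baichuan-7B"

def main():
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        device_map="auto",       # spread layers across available devices
        trust_remote_code=True,
    )
    # The base model is a plain causal LM (no chat template),
    # so prompt it with text to be continued.
    inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=64, repetition_penalty=1.1)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

if __name__ == "__main__":
    main()  # downloads ~14 GB of weights on first run
```

Note this is the pre-trained base model, not an instruction-tuned chat model, so it completes text rather than following instructions.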

1 comment
[email protected] 1 point 1 year ago

Seems like a solid model!