MiniCPM
open-source-llm-projects | model-compression
Overview
MiniCPM is a series of small, efficient LLMs developed by OpenBMB together with Tsinghua University's NLP lab (THUNLP). Despite small parameter counts (roughly 1B-3B), the models reach performance competitive with far larger models on standard benchmarks through carefully tuned training recipes and quantization for efficient deployment.
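A minimal sketch of running a MiniCPM model with Hugging Face transformers follows. The checkpoint name "openbmb/MiniCPM-2B-sft-bf16" and the generation settings are assumptions; substitute the variant you actually use.

```python
# Minimal sketch: chat inference with MiniCPM via transformers.
# Assumption: the "openbmb/MiniCPM-2B-sft-bf16" checkpoint name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openbmb/MiniCPM-2B-sft-bf16"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~5 GB of weights in bf16, fits one consumer GPU
    trust_remote_code=True,      # MiniCPM ships custom modeling code
    device_map="auto",
)

messages = [{"role": "user", "content": "Summarize MiniCPM in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```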
Key Variants
- MiniCPM-1B: 1.2B non-embedding parameters, the smallest dense variant
- MiniCPM-2B: 2.4B non-embedding parameters (the "2B" counts non-embedding weights), competitive with much larger 7B-class models on several benchmarks
- MiniCPM-V: Vision-language variant
- MiniCPM-MoE: Mixture-of-experts variant (MiniCPM-MoE-8x2B)
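Since the series targets resource-constrained deployment, these variants are often run quantized. Below is a hedged sketch of 4-bit loading via bitsandbytes; the checkpoint name is the same assumption as above, and prequantized releases, where available, may be a better fit in practice.

```python
# Minimal sketch: 4-bit (NF4) loading with bitsandbytes.
# Assumption: the "openbmb/MiniCPM-2B-sft-bf16" checkpoint name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "openbmb/MiniCPM-2B-sft-bf16"
quant_cfg = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",              # normal-float 4-bit weights
    bnb_4bit_compute_dtype=torch.bfloat16,  # matmuls still run in bf16
)
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_cfg,
    trust_remote_code=True,
    device_map="auto",
)
# Weight memory drops from ~5 GB (bf16) to roughly 1.5-2 GB in 4-bit.
```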
Relationship to Other Projects
- Competes with Qwen (Alibaba) and Phi (Microsoft) in the small-model space
- Open-source alternative to proprietary models like GPT-3.5