DeepSeek Unveils V3.1-Base: One of the Largest Open-Source AI Models at 685 Billion Parameters
Hangzhou-based AI company DeepSeek has unveiled its latest model, V3.1-Base, with a staggering 685 billion parameters. The model, released on August 20, boasts an impressive 128K context length.
The V3.1 model is now available for testing on the official website, app, and mini-program, with the API call method remaining unchanged. DeepSeek has open-sourced the model on Hugging Face, allowing developers worldwide to explore its capabilities.
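Since the article notes that the API call method is unchanged, a minimal sketch of what a request might look like follows, assuming DeepSeek's familiar OpenAI-compatible chat-completions interface. The endpoint URL and model name here are assumptions for illustration, not details confirmed in the announcement.

```python
import json
import os
import urllib.request

# Assumed OpenAI-compatible endpoint; check DeepSeek's API docs for the current URL.
API_URL = "https://api.deepseek.com/chat/completions"


def build_chat_request(prompt: str, model: str = "deepseek-chat") -> dict:
    """Build the JSON payload for an OpenAI-style chat-completion call."""
    return {
        "model": model,  # hypothetical model identifier
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def call_deepseek(prompt: str, api_key: str) -> dict:
    """Send a single chat request and return the parsed JSON response."""
    data = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        API_URL,
        data=data,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    key = os.environ.get("DEEPSEEK_API_KEY")
    if key:
        print(call_deepseek("Hello, V3.1!", key))
```

Because the interface mirrors the OpenAI schema, existing client code should, per the announcement, continue to work without modification.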
Founded in 2023 by Liang Wenfeng, DeepSeek is also the company expected to deliver the much-anticipated R2 model. However, it has not yet officially announced R2, and details about that model remain scarce.
The V3.1-Base model's large parameter count and long context window promise enhanced performance. While DeepSeek has not confirmed a release date for R2, the open-sourcing of V3.1 offers a glimpse into the company's progress in AI.