Open-Source Large Models (5 articles)

DeepSeek V4 Open-Source Model Released: 1.6 Trillion Parameters, Million-Token Context – Can It Overthrow Closed-Source Dominance?

On April 25, 2026, Chinese AI company DeepSeek officially open-sourced its V4 series of large models. The Pro version has 1.6 trillion parameters and supports a 1-million-token context window; the release also includes a low-compute Flash variant and a 75% API discount running through May 5, 2026. Winzheng.com's evaluation, based on the YZ Index v6 methodology, finds it to be the first open-source model to match closed-source leaders on key dimensions such as code execution and grounding, while offering superior cost-effectiveness.

DeepSeek V4 · Open-Source Large Models · AI Product Review

DeepSeek-V2 Open Source Release: 236B Parameters Run on Just 16GB VRAM, Math Capabilities Surpass Llama3, Igniting Developer Community

The DeepSeek team has released the open-source LLM DeepSeek-V2, a 236B-parameter model that requires only 16GB of VRAM for inference and outperforms Meta's Llama3 on mathematical benchmarks. The release has garnered over 150,000 reposts in Chinese developer communities, marking a major breakthrough for domestic AI in efficient large-model development.