DeepSeek-V2 (5 articles)

DeepSeek-V2 Open-Source Release: 236B Parameters Run on Just 16GB VRAM, Math Capabilities Surpass Llama 3, Igniting Developer Community

The DeepSeek team has released DeepSeek-V2, an open-source Mixture-of-Experts LLM with 236B total parameters (21B activated per token), which the team says requires only 16GB of VRAM for inference and outperforms Meta's Llama 3 on mathematical benchmarks. The announcement has garnered over 150,000 reposts across Chinese developer communities, marking a major milestone for Chinese AI in efficient large-model development.