DeepSeek-V2 Open-Source Model Released: 236B-Parameter MoE Architecture Rivals GPT-4o at 1/30 the Inference Cost
Chinese AI startup DeepSeek has released DeepSeek-V2, a mixture-of-experts (MoE) model with 236B total parameters, of which roughly 21B are activated per token. The company reports performance rivaling GPT-4o at roughly 1/30 of its inference cost. The release has drawn widespread attention, with the repository earning over 10,000 GitHub stars and the model racking up more than 150,000 mentions on X.
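The cost advantage follows from the sparse-activation principle behind MoE: a router sends each token to only a few experts, so compute per token scales with the activated parameters, not the full 236B. The sketch below is a minimal, generic top-k routing layer in PyTorch for illustration only; it is not DeepSeek's implementation, and the names (TopKMoE, expert sizes, k=2) are assumptions chosen for clarity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Illustrative sparse MoE layer (not DeepSeek-V2's actual code).

    Each token is routed to its top-k experts, so only a small fraction
    of the layer's total parameters does work per token.
    """
    def __init__(self, dim: int, num_experts: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(dim, num_experts)  # router: token -> expert scores
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim)
        scores = F.softmax(self.gate(x), dim=-1)           # (tokens, num_experts)
        weights, idx = scores.topk(self.k, dim=-1)         # keep the k best experts
        weights = weights / weights.sum(-1, keepdim=True)  # renormalize gate weights
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            routed = (idx == e)                 # (tokens, k): which slots picked e
            token_mask = routed.any(-1)         # tokens that use expert e at all
            if token_mask.any():
                w = (weights * routed).sum(-1, keepdim=True)[token_mask]
                out[token_mask] += w * expert(x[token_mask])
        return out

# With 8 experts and k=2, only ~1/4 of expert parameters run per token;
# scaling expert count up (as large MoE models do) widens that gap further.
layer = TopKMoE(dim=64)
print(layer(torch.randn(16, 64)).shape)  # torch.Size([16, 64])
```

Under this routing scheme, a model can grow total capacity by adding experts while keeping per-token compute nearly flat, which is the mechanism behind the headline cost gap between a 236B-parameter MoE and a comparably capable dense model.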