DeepSeek-V3 Open Source Release: 671B Parameters with Only 37B Activated, Performance Rivals GPT-4o
Chinese AI startup DeepSeek has released its latest open-source large language model, DeepSeek-V3. The mixture-of-experts model has 671 billion total parameters but activates only 37 billion per token, enabling efficient inference, with benchmark performance approaching OpenAI's GPT-4o.