National Cyber Warfare Foundation (NCWF)

DeepSeek releases DeepSeek-V3, an open-source MoE model of 671B parameters, with 37B activated per token, claims it outperforms top models like Llama 3.1-405B


2024-12-27 02:03:10
milo
Developers


Shubham Sharma / VentureBeat:

DeepSeek releases DeepSeek-V3, an open-source MoE model of 671B parameters, with 37B activated per token, claims it outperforms top models like Llama 3.1-405B  —  Chinese AI startup DeepSeek, known for challenging leading AI vendors with its innovative open-source technologies, today released a new ultra-large model: DeepSeek-V3.
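To make the "37B activated per token" figure concrete: in a mixture-of-experts (MoE) model, a small router network picks a few expert sub-networks for each token, so only a fraction of the model's total parameters does work on any one token. Below is a minimal, hypothetical sketch of top-k expert routing in PyTorch; the class name, layer sizes, expert count, and k are illustrative toy values, not DeepSeek-V3's actual architecture or configuration.

import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKMoELayer(nn.Module):
    """Toy sparse MoE layer: a router sends each token to its top-k experts,
    so only a small fraction of the layer's parameters runs per token.
    All sizes here are illustrative, not DeepSeek-V3's configuration."""

    def __init__(self, d_model=64, d_ff=256, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model)
            )
            for _ in range(n_experts)
        )

    def forward(self, x):
        # x: (n_tokens, d_model)
        scores = self.router(x)                     # (n_tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)  # each token picks its k best experts
        weights = F.softmax(weights, dim=-1)        # normalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e            # tokens whose slot-th pick is expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out


layer = TopKMoELayer()
tokens = torch.randn(4, 64)
print(layer(tokens).shape)  # torch.Size([4, 64]); each token used 2 of 8 experts

With 8 experts and k=2 as above, each token exercises only a quarter of the expert parameters. DeepSeek-V3's 37B-active-of-671B ratio comes from the same kind of sparse routing, at far larger scale and with its own routing scheme.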




Source: TechMeme
Source Link: http://www.techmeme.com/241226/p15#a241226p15




