DeepSeek-Coder-V2-Lite-Base

We present DeepSeek-Coder-V2, an open-source Mixture-of-Experts (MoE) code language model that achieves performance comparable to GPT4-Turbo in code-specific tasks. Specifically, DeepSeek-Coder-V2 is further pre-trained from an intermediate checkpoint of DeepSeek-V2 with an additional 6 trillion tokens.
We release DeepSeek-Coder-V2 with 16B and 236B total parameters, built on the DeepSeekMoE framework with only 2.4B and 21B active parameters respectively, in both base and instruct variants:

| Model | #Total Params | #Active Params | Context Length | Download |
| :--- | :---: | :---: | :---: | :---: |
| DeepSeek-Coder-V2-Lite-Base | 16B | 2.4B | 128k | HuggingFace |
| DeepSeek-Coder-V2-Lite-Instruct | 16B | 2.4B | 128k | HuggingFace |
| DeepSeek-Coder-V2-Base | 236B | 21B | 128k | HuggingFace |
| DeepSeek-Coder-V2-Instruct | 236B | 21B | 128k | HuggingFace |
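For reference, below is a minimal code-completion sketch with the Lite base model, assuming the standard Hugging Face transformers `AutoModelForCausalLM` API; the prompt and generation settings are illustrative, not prescribed by this release.

```python
# Minimal sketch: load DeepSeek-Coder-V2-Lite-Base and complete a code prompt.
# Assumes a CUDA GPU with enough memory for the 16B (2.4B active) model in bf16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-Coder-V2-Lite-Base"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
).cuda()

# The base model is a plain completion model, so prompt it with code to continue.
prompt = "#write a quick sort algorithm\ndef quick_sort(arr):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For instruction-following use (chat-style prompts), the corresponding Instruct variants are the intended models; the base models are meant for completion and further fine-tuning.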