Chinese AI lab DeepSeek has unveiled preview versions of its latest large language model, DeepSeek V4, a major update to its previous V3.2 release and the accompanying R1 reasoning model. The two models, DeepSeek V4 Flash and V4 Pro, use a mixture-of-experts architecture and feature context windows of one million tokens. Because only a small subset of parameters is activated for each token, the models can process long documents or codebases while keeping inference costs down.
The V4 Pro model is particularly noteworthy at 1.6 trillion parameters (49 billion active), making it the largest open-weight model currently available. It surpasses competitors such as Moonshot AI’s Kimi K 2.6 and MiniMax’s M1, and more than doubles the parameter count of DeepSeek V3.2. The smaller V4 Flash has 284 billion parameters, 13 billion of them active.
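To put those mixture-of-experts figures in perspective, a quick back-of-envelope calculation shows what fraction of each model's weights is actually active per token (parameter counts are the figures reported above; the helper function is illustrative, not part of any DeepSeek tooling):

```python
def active_fraction(total_params: float, active_params: float) -> float:
    """Share of an MoE model's parameters activated for each token."""
    return active_params / total_params

# Counts as reported: V4 Pro 1.6T total / 49B active, V4 Flash 284B / 13B.
v4_pro = active_fraction(1_600e9, 49e9)
v4_flash = active_fraction(284e9, 13e9)
print(f"V4 Pro:   {v4_pro:.1%}")    # roughly 3.1% of weights active per token
print(f"V4 Flash: {v4_flash:.1%}")  # roughly 4.6%
```

Only a few percent of the network runs for any given token, which is why inference cost scales with the active count rather than the headline total.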
DeepSeek asserts that the new models are more efficient and more capable than V3.2, nearly closing the gap with leading AI models on reasoning benchmarks. V4 Pro reportedly outperforms most of its open-source counterparts and beats OpenAI’s GPT-5.2 and Gemini 3.0 Pro on specific tasks, while coding-competition benchmarks indicate that both V4 models rival GPT-5.4.
However, the new models still lag slightly on knowledge assessments against frontier models such as OpenAI’s GPT-5.4 and Google’s Gemini 3.1 Pro, suggesting they trail the latest innovations in the field by roughly three to six months. And unlike some proprietary models that handle audio, video, and images, both V4 models support text only.
In terms of pricing, DeepSeek V4 significantly undercuts existing frontier models. V4 Flash costs $0.14 per million input tokens and $0.28 per million output tokens, well below alternatives such as GPT-5.4 Nano and Gemini 3.1 Flash. The larger V4 Pro is priced at $0.145 per million input tokens and $3.48 per million output tokens, again a more affordable option than its peers.
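As a rough sketch of what those rates mean in practice, the snippet below estimates the cost of a single long-context request at the prices quoted above (prices are the article's figures and may change; the workload sizes are hypothetical examples):

```python
def request_cost(input_tokens: int, output_tokens: int,
                 in_price: float, out_price: float) -> float:
    """Cost in USD, given prices quoted per million tokens."""
    return input_tokens / 1e6 * in_price + output_tokens / 1e6 * out_price

# Hypothetical job: summarize a 200k-token codebase into a 2k-token report.
flash = request_cost(200_000, 2_000, in_price=0.14, out_price=0.28)
pro = request_cost(200_000, 2_000, in_price=0.145, out_price=3.48)
print(f"V4 Flash: ${flash:.4f}")  # about $0.0286
print(f"V4 Pro:   ${pro:.4f}")    # about $0.0360
```

Even the larger Pro model keeps a million-token-scale workflow in the cents range, which is the budget story the pricing is built around.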
The launch comes amid allegations that Chinese firms have engaged in widespread intellectual-property theft from American AI labs, with DeepSeek itself accused by companies including Anthropic and OpenAI of mimicking their models.