How is Sarvam AI Leading India’s Sovereign AI Initiative with New 30B and 105B Models?


Synopsis

Bengaluru-based Sarvam AI has launched two groundbreaking large language models, Sarvam 30B and 105B, as part of India's push for sovereign artificial intelligence. These models are designed to enhance efficiency and reasoning capabilities, marking a significant step towards reducing reliance on foreign AI systems. Discover how they compare to global competitors and their implications for India's AI landscape.

Key Takeaways

Sarvam 30B and 105B models launched to enhance India's AI capabilities.
Models utilize mixture-of-experts architecture for efficiency.
30B model activates only 1 billion parameters per token output.
105B model supports a 128,000-token context window.
105B model reportedly outperforms DeepSeek R1 and Gemini Flash on several benchmarks.

New Delhi, Feb 18 (NationPress) The AI startup Sarvam AI, based in Bengaluru, announced the launch of two new large language models as part of India's initiative to establish its own sovereign artificial intelligence capabilities.

This revelation occurred during the India AI Impact Summit, where the company detailed that both models were developed from the ground up utilizing a mixture-of-experts (MoE) architecture to enhance efficiency and performance.

The first model, named Sarvam 30B, comprises 30 billion parameters, but only 1 billion parameters are activated for each output token it generates.

Co-founder Pratyush Kumar emphasized that this MoE design minimizes inference costs while boosting efficiency, particularly for reasoning and complex tasks.
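The sparse-activation idea behind a mixture-of-experts layer can be sketched in a few lines. The sizes below (16 toy experts, top-2 routing, 8-dimensional vectors) are illustrative assumptions for the sketch, not Sarvam's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy MoE layer: many expert weight matrices, but a router picks only
# the top-k experts per token, so most parameters stay idle each step.
d_model, n_experts, top_k = 8, 16, 2  # illustrative, not Sarvam's sizes

experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts))

def moe_forward(x):
    """Route one token vector to its top-k experts and mix their outputs."""
    logits = x @ router
    top = np.argsort(logits)[-top_k:]     # indices of the chosen experts
    weights = np.exp(logits[top])
    weights /= weights.sum()              # softmax over the chosen experts only
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(d_model)
out = moe_forward(token)
```

Here only 2 of 16 expert matrices are multiplied per token, which is the same principle by which a 30-billion-parameter MoE model can activate roughly 1 billion parameters per output token.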

He noted that the 30B model excels in thinking and reasoning benchmarks at both 8K and 16K scales when benchmarked against other models of comparable size.

The Sarvam 30B model features a 32,000-token context window and has been trained on a staggering 16 trillion tokens.

Kumar reiterated that efficiency is at the forefront of the company's vision, aiming to make AI accessible on a large scale throughout India.

Additionally, the company launched a more advanced model, Sarvam 105B, which comprises 105 billion parameters and is tailored for intricate reasoning and agent-based tasks.

This model activates 9 billion parameters per token and accommodates a 128,000-token context window, enabling it to process more complex instructions and sustain longer conversations.

Kumar compared the new 105B model with leading global systems, asserting that it surpasses DeepSeek R1, which had 600 billion parameters at its launch last year, across various benchmarks.

He also mentioned that this model is more cost-effective than Google’s Gemini Flash, while outperforming it in numerous benchmarks.

According to him, when compared to Gemini 2.5 Flash, Sarvam's model displays superior performance on tasks involving Indian languages.

This launch coincides with India's intensified efforts to create foundational AI models that cater to multilingual and large-scale public applications.

The government-backed IndiaAI Mission is supported by a Rs 10,000 crore fund and aims to decrease reliance on foreign AI systems while fostering domestic innovation.

To date, the mission has allocated Rs 111 crore in GPU subsidies, with Sarvam AI emerging as the primary beneficiary, acquiring 4,096 NVIDIA H100 SXM GPUs through Yotta Data Services and receiving nearly Rs 99 crore in subsidies.

The startup was previously selected as the inaugural company to develop India’s foundational AI model under this mission.

Sarvam AI was established in July 2023 by Vivek Raghavan and Pratyush Kumar, both of whom were involved with AI4Bharat, an initiative supported by Infosys co-founder Nandan Nilekani.

Point of View

Sarvam AI's launch of its new models signifies a pivotal moment for India's technological autonomy. This initiative aligns with the government's vision to bolster domestic innovation and reduce dependency on foreign technologies, which is crucial for national progress.
NationPress
1 May 2026

Frequently Asked Questions

What is the significance of Sarvam AI's new models?
Sarvam AI's new models are crucial for establishing India’s sovereign AI capabilities, allowing for greater efficiency and reduced reliance on foreign systems.
How do the new models compare to global counterparts?
Sarvam's models reportedly outperform several global systems, including DeepSeek's R1 and Google's Gemini Flash, particularly in tasks involving Indian languages.
What is the IndiaAI Mission?
The IndiaAI Mission is a government-backed initiative aimed at promoting domestic AI innovation, supported by a substantial fund to reduce foreign dependency.
Who are the founders of Sarvam AI?
Sarvam AI was founded by Vivek Raghavan and Pratyush Kumar, who previously worked on the AI4Bharat initiative.
What is the training data for the Sarvam models?
The Sarvam 30B model has been trained on 16 trillion tokens, providing it with a robust foundation for various applications.