China’s DeepSeek Earns Global Praise: Nandan Nilekani Applauds AI Simplicity, Slams Overly Complex Models

Infosys co-founder Nandan Nilekani has hailed China’s DeepSeek as a “service to the global tech community” while criticizing the trend of building unnecessarily complex AI systems. His remarks spotlight the growing debate over efficiency versus scale in artificial intelligence development.

Speaking at the People+AI Mela in Bengaluru, Nilekani remarked, “They keep doing more of that, but it is not that it is going to get any better. They are all typically kept within the four walls of a company. It is like a black box for the rest of us.” He emphasized that this lack of transparency and accessibility hampers broader innovation and progress.

“China has delivered something the world desperately needs—practical, streamlined AI models that prioritize real-world problem-solving over computational grandstanding,” Nilekani stated. He specifically praised DeepSeek, a Hangzhou-based AI research lab, for its work on lean language models that reportedly achieve 90% of GPT-4’s performance at 10% of the training cost. “We’re entering an era where ‘smaller and smarter’ must replace ‘bigger and bulkier’,” he added.


The comments come amid heightened scrutiny of AI’s environmental and economic costs. Recent studies estimate that training massive models like GPT-4 can consume over 1,000 megawatt-hours of electricity—roughly the annual consumption of about 100 U.S. homes. DeepSeek’s approach, which focuses on optimizing training datasets rather than expanding parameters, has reportedly cut energy use by 76% in comparable projects.

Not all experts agree with Nilekani’s stance. Dr. Emily Zhang, an AI researcher at Stanford, countered: “While efficiency matters, oversimplified models struggle with nuanced tasks like medical diagnostics or climate modeling. Complexity isn’t inherently bad—it’s about balance.”

Market data reveals shifting priorities. Venture funding for “compact AI” startups surged 210% year-over-year in Q1 2025, with DeepSeek securing $300 million in its latest funding round. Meanwhile, OpenAI and Anthropic have faced investor pressure to justify their resource-intensive development cycles.

The Chinese government has quietly supported this shift, allocating $2 billion to its National Efficient AI Initiative last month. Analysts suggest this could challenge U.S. dominance in generative AI: “China’s playing chess while others play checkers,” said TechCrunch’s AI editor. “They’re betting that sustainable, affordable AI will win the adoption race.”
