“Almost every aspect of the organization is influenced by AI,” stated Vishal Sharma, Amazon’s Vice President of Artificial General Intelligence, during a presentation at the Mobile World Congress in Barcelona on Monday. He also challenged the notion that open-source models could lessen computational demands and refrained from commenting on whether European firms will alter their GenAI strategies due to geopolitical strains with the United States.
Speaking at the 4YFN startup conference, Sharma said Amazon is currently deploying AI built on its own foundation models across Amazon Web Services (AWS), the company’s cloud division, in warehouse robotics, and in the Alexa product line, among other applications.
“We currently have approximately 750,000 robots performing tasks ranging from sorting to self-navigation within our warehouses. The Alexa device is likely the most widely utilized home AI product available… No segment of Amazon remains untouched by generative AI,” he asserted.
In December, AWS introduced Nova, a suite of foundation models that includes text-generating and multimodal generative AI models.
Sharma emphasized that these models undergo rigorous testing against public benchmarks: “It has become evident that there’s a vast array of use cases. There isn’t a universal solution. Certain scenarios demand video generation, while others, like Alexa, require specific task execution with rapid, predictable responses. You can’t have it incorrectly respond to ‘unlock the back door’.”
Despite this, he expressed skepticism regarding the idea that smaller, open-source models could lead to reduced computational resource requirements: “As you begin to deploy it in various contexts, the demand for greater intelligence continues to grow,” he explained.
Amazon also offers Bedrock, an AWS service aimed at companies and startups that want to combine different foundation models, including China’s DeepSeek, and switch between them easily, according to Sharma.
Amazon is also developing a substantial AI compute cluster utilizing its Trainium 2 chips in collaboration with Anthropic, a firm in which it has invested $8 billion. Simultaneously, Elon Musk’s xAI has recently introduced its flagship AI model, Grok 3, leveraging a vast data center in Memphis equipped with around 200,000 GPUs for training.
When asked about the level of computational resources required, Sharma remarked: “Personally, I believe computation will remain a key topic of discussion for the foreseeable future.”

He did not see Amazon as being pressured by the recent influx of open-source models developed in China: “I wouldn’t characterize it that way,” he stated. Accordingly, Amazon is comfortable offering DeepSeek and other models on AWS: “We are a company that values choice… We are willing to embrace whatever trends and technologies serve our customers’ needs,” Sharma said.
Reflecting on OpenAI’s emergence with ChatGPT in late 2022, did he believe Amazon was caught off guard?
“I would disagree with that perspective,” he responded. “Amazon has been engaged in AI development for approximately 25 years. Take Alexa, for instance; it operates with around 20 distinct AI models… We had amassed billions of parameters dedicated to language processing long before this.”
Regarding the recent controversy involving Trump and Zelensky, along with the subsequent strain on relations between the current U.S. administration and several European countries, did he think companies in Europe might seek alternative sources for GenAI?
Sharma conceded that the topic falls “beyond” his “expertise” and that the implications are “difficult for me to foresee…” He did, however, hint that some firms may revise their strategies: “What I can say is that technological advancements tend to respond to prevailing incentives,” he remarked.
Compiled by Techarena.au.

