Amazon, Databricks strike five-year deal around AI chips


Summary

Databricks will tap Amazon’s Trainium chips to power services for building AI systems, a move that could cut costs for businesses.

Amazon.com and startup Databricks struck a five-year deal that could cut costs for businesses seeking to build their own artificial-intelligence capabilities.

Databricks will use Amazon’s Trainium AI chips to power a service that helps companies customize an AI model or build their own. Amazon says customers pay less to use its homegrown chips compared with the competition, such as Nvidia’s graphics processing units, or GPUs, which dominate the AI chip market.

Amazon and Databricks declined to share financial terms of the pact.

The deal comes as Databricks, Amazon and other enterprise technology companies like Microsoft, Salesforce and Snowflake, a rival of Databricks, aggressively court businesses for their AI dollars. Meanwhile, corporate technology executives say it is time to show AI investment is generating returns.

Databricks bought AI startup MosaicML last year for roughly $1.3 billion and is expanding the acquired company’s services to get a piece of corporate AI deals. Its partnership with Amazon ultimately makes AI faster and cheaper for businesses because it can pass on the savings it gets from using Amazon’s AI chips, said Naveen Rao, Databricks’s vice president of generative AI.

Early AI successes have relied on using a company’s private data to customize AI. For instance, building a bespoke customer service chatbot can help lower staffing costs.

For Amazon, that means continuing to position itself as a neutral provider of AI technology, offering businesses the capabilities to use and customize a variety of AI models from many vendors on its platform.

Databricks makes money by renting out analytics, AI and other cloud-based software that taps AI-ready data for companies to build their enterprise technology tools. The San Francisco-based firm said it was valued at $43 billion last September.

The two companies have an existing partnership where customers can run Databricks data services on Amazon’s cloud-computing platform, Amazon Web Services. Databricks also rents Nvidia GPUs through AWS, and will be using more of them as part of the deal. Customers using AWS have generated over $1 billion in revenue for Databricks, and AWS is the data company’s fastest-growing cloud partner, Rao said.

Jonny LeRoy, chief technology officer of W.W. Grainger, said the industrial supplier is using AI to help customers navigate its product offerings. The Illinois-based company is using a combination of AI models and a retrieval-augmented generation system from Databricks to build its customer-service tool, and is planning to use Amazon’s chips under the hood, LeRoy said.

Amazon isn’t widely considered a leader in AI innovation, some technology analysts and business leaders say, and needs to show that it can compete against Microsoft and Google. Part of Amazon’s AI reboot involves its AI chips, Trainium and Inferentia, which are designed specifically for building and using AI models. Compared with Nvidia’s more general-purpose GPUs, such custom chips can be more efficient because they were designed for just one thing.

Amazon’s pitch for its custom AI chips: lower cost. Customers can expect to pay about 40% less than they would using other hardware, said Dave Brown, vice president of AWS compute and networking services.

“No customer is going to move if they’re not going to save any money, and if their existing solution is working well for them,” Brown said. “So it’s important to deliver those cost savings.”

Brown declined to say how many Amazon customers use its custom chips rather than Nvidia’s GPUs.

The car-shopping site Edmunds.com is using Databricks to build an AI tool that helps customers figure out which incentives they are eligible for when purchasing electric vehicles, said Greg Rokita, its vice president of technology. Any decrease in the cost of building AI systems is a benefit, especially because the company prefers to build and own its AI rather than rent a vendor’s private models, Rokita said.

NinjaTech AI, a startup building AI agents to perform tasks, has said that by using Trainium chips it spends about $250,000 a month on computing rather than $750,000 to $1.2 million on Nvidia’s GPUs. Other customers that use Amazon’s custom AI chips include Anthropic, Airbnb, Pinterest and Snap.

But Amazon isn’t the only alternative to Nvidia. Longtime Nvidia rival AMD has its own line of GPUs, Google makes in-house chips called Tensor Processing Units, and startups like Groq and Cerebras have developed their own single-purpose AI chips.

“For enterprises, it is less about the underlying technology and more about the value the technology delivers,” said Chirag Dekate, an analyst at market research and information-technology consulting firm Gartner. “If it’s Trainium, fine. If it’s CPUs, fine. If it’s GPUs, fine. It doesn’t really matter for end users.”

Write to Belle Lin at belle.lin@wsj.com
