Key Takeaways

  • Amazon is developing Trainium 2 AI chips to reduce reliance on Nvidia.
  • Amazon’s capital spending is expected to rise from $48.4 billion in 2023 to around $75 billion in 2024.

Amazon is developing new artificial intelligence chips to boost returns on its semiconductor investments and reduce dependency on Nvidia, as reported by the Financial Times.

Unlike Nvidia’s general-purpose hardware, Amazon’s chips are optimized for specific tasks, an approach the company hopes will improve data center efficiency and give cloud customers more tailored options in the AI market.

The company plans to widely release its ‘Trainium 2’ AI training chip next month.

The chip development is led by Annapurna Labs, which Amazon acquired in 2015 for $350 million.

Trainium 2 is currently being tested by Anthropic, which has received $4 billion in Amazon backing, along with Databricks, Deutsche Telekom, Ricoh, and Stockmark.

“We want to be absolutely the best place to run Nvidia,” said Dave Brown, vice-president of compute and networking services at AWS. “But at the same time we think it’s healthy to have an alternative.”

Amazon says its Inferentia AI chips are 40% cheaper to run when generating responses from AI models.

Brown underscored the cost implications: saving 40% on $1,000 makes little difference, but saving 40% on tens of millions of dollars is significant.

The company expects capital spending of around $75 billion in 2024, primarily on technology infrastructure, up from $48.4 billion in 2023. CEO Andy Jassy indicated higher spending is likely in 2025.

Rami Sinno, Annapurna’s director of engineering, explained that it’s not just about the chip but the entire system.

Amazon’s approach involves building everything from silicon wafers to server racks, all supported by proprietary software and architecture.

Sinno added that scaling this process is extremely challenging and that very few companies can achieve it.

Despite these efforts, Amazon’s impact on Nvidia’s AI chip dominance remains limited.

Nvidia reported $26.3 billion in AI data center chip revenue in its second fiscal quarter of 2024, matching Amazon’s entire AWS division revenue for the same period.