Amazon's AI Revolution: Inside the Austin Chip Lab

Amazon engineers in Austin, Texas, are developing in-house AI chips to reduce the company's reliance on Nvidia and lower costs for customers. With a focus on cloud services through AWS, Amazon aims to offer cheaper alternatives to Nvidia's processors. The lab showcases custom chips including Graviton, Trainium, and Inferentia.


Devdiscourse News Desk | Updated: 25-07-2024 21:04 IST | Created: 25-07-2024 21:04 IST

Inside Amazon.com's chip lab in Austin, Texas, engineers are working on a new server design. Their aim is to develop in-house AI chips as a cost-effective alternative to Nvidia's processors.

Amazon's director of engineering, Rami Sinno, said these AI chips will power part of Amazon Web Services (AWS), reducing dependence on Nvidia's expensive processors and offering customers cheaper options for complex computations.

The custom chips, including Graviton, Trainium, and Inferentia, promise up to 50% better price-performance. AWS, a significant revenue driver for Amazon, continues to lead the cloud market alongside Microsoft Azure and recently handled record sales during Prime Day using these custom chips.

(With inputs from agencies.)
