
Microsoft’s latest AI chip goes head-to-head with Amazon and Google

The Maia 200 chip is starting to roll out to Microsoft’s data centers today.

By Tom Warren

Microsoft is announcing a successor to its first in-house AI chip today, the Maia 200. Built on TSMC’s 3nm process, Microsoft says its Maia 200 AI accelerator “delivers 3 times the FP4 performance of the third generation Amazon Trainium, and FP8 performance above Google’s seventh generation TPU.”

Each Maia 200 chip packs more than 100 billion transistors and is designed to handle large-scale AI workloads. “Maia 200 can effortlessly run today’s largest models, with plenty of headroom for even bigger models in the future,” says Scott Guthrie, executive vice president of Microsoft’s Cloud and AI division.

Microsoft will use Maia 200 to host OpenAI’s GPT-5.2 model and others for Microsoft Foundry and Microsoft 365 Copilot. “Maia 200 is also the most efficient inference system Microsoft has ever deployed, with 30 percent better performance per dollar than the latest generation hardware in our fleet today,” says Guthrie.


Microsoft’s performance flex over its close Big Tech competitors marks a shift from 2023, when it first launched the Maia 100 and avoided direct comparisons with Amazon’s and Google’s AI cloud capabilities. Both Google and Amazon are working on next-generation AI chips, though. Amazon is even working with Nvidia to integrate its upcoming Trainium4 chip with NVLink 6 and Nvidia’s MGX rack architecture.

Microsoft’s Superintelligence team will be the first to use its Maia 200 chips, and the company is also inviting academics, developers, AI labs, and open-source model project contributors to an early preview of the Maia 200 software development kit. Microsoft is starting to deploy these new chips today in its Azure US Central data center region, with additional regions to follow.

