September 24, 2023

In April, a report claimed that Microsoft is working on artificial intelligence (AI) chips to use them to train large language models (LLMs). The report, however, did not mention any partners with which the company was working. According to a new report, Microsoft is working with Advanced Micro Devices (AMD) on the development of AI processors.

Citing people with knowledge of the situation, a report by Bloomberg said that Microsoft will help boost AMD's supply of in-demand components. It also noted that this partnership is part of a multi-pronged strategy to secure more of the highly coveted components.


An alternative to Nvidia
According to the report, the people also mentioned that the companies will offer an alternative to Nvidia, which dominates the market for AI-capable chips (GPUs).

It is to be noted that Microsoft built a supercomputer with OpenAI, and its architecture strings together tens of thousands of Nvidia A100 graphics chips to provide the power needed to train AI models.

“We built a system architecture that could operate and be reliable at a very large scale. This is what resulted in ChatGPT being possible. That's one model that came out of it. There's going to be many, many others,” Nidhi Chappell, Microsoft head of product for Azure high-performance computing and AI, said at the time.


How Microsoft will help AMD
The Bloomberg report said that Microsoft will provide support, including engineering resources, and work with the chipmaker on a homegrown Microsoft processor for AI workloads, code-named Athena, to bolster AMD's efforts.

Google’s AI chip
Google announced last year that it developed an AI chip called the Tensor Processing Unit (TPU). The processor has been specifically designed for machine learning tasks and is said to handle trillions of operations per second while, at the same time, consuming comparatively little power.

The TPU is meant to be used with TensorFlow, Google's open-source software library for machine learning.
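The report does not include any code, but as a rough, illustrative sketch of what "used with TensorFlow" looks like in practice: in TensorFlow 2.x, a program connects to a TPU through the TPU cluster resolver and builds its model under a TPUStrategy scope so that computation runs on the TPU cores. The empty TPU address and the tiny Keras model below are assumptions for demonstration only, not details from the report.

    import tensorflow as tf

    # Connect to an available TPU; an empty address works on managed
    # environments such as Cloud TPU VMs or Colab (assumed here).
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)

    # TPUStrategy replicates the training computation across TPU cores.
    strategy = tf.distribute.TPUStrategy(resolver)

    with strategy.scope():
        # A minimal example model; variables created inside the scope
        # are placed on the TPU.
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
            tf.keras.layers.Dense(10),
        ])
        model.compile(
            optimizer="adam",
            loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
            metrics=["accuracy"],
        )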
