- Intel and Google signed a multi-year deal to maintain Xeon in cloud infrastructure
- Google Cloud instances C4 and N4 already run on Xeon 6 processors
- Intel and Google are co-developing customized IPUs for networking and storage
Intel and Google have announced a multi-year collaboration that will keep Intel Xeon processors at the heart of Google Cloud infrastructure for the foreseeable future.
The agreement spans multiple generations of Xeon chips and covers systems used for AI workloads, inference tasks, and general-purpose computing across Google's global data centers.
Google Cloud instances such as C4 and N4 already rely on Xeon 6 processors, and this deal ensures that pattern continues.
Why CPUs still matter in an era of specialized AI hardware
"AI is reshaping how infrastructure is built and scaled," said Lip-Bu Tan, CEO of Intel.
"Scaling AI requires more than accelerators — it requires balanced systems. CPUs and IPUs are central to delivering the performance, efficiency, and flexibility modern AI workloads demand."
The announcement comes at a time when many hyperscalers are accelerating adoption of custom Arm-based processors for AI tasks.
Counterpoint Research recently claimed that 90% of AI servers running custom silicon will rely on the Arm instruction set architecture, leaving x86 with only a small share of new deployments.
To ensure Xeon stays relevant, Intel and Google are also jointly developing custom infrastructure processing units designed to handle networking, storage, and security workloads.
These IPUs operate as ASIC-based accelerators that offload infrastructure tasks from host CPUs, freeing Xeon processors to focus on application execution.
This separation improves system efficiency and resource allocation across large cloud deployments running AI tools, AI agents, and large language models.
"CPUs and infrastructure acceleration remain a cornerstone of AI systems — from training orchestration to inference and deployment," said Amin Vahdat, SVP and Chief Technologist for AI Infrastructure at Google.
Google currently uses both Xeon 5 and Xeon 6 processors across multiple service layers, and these deployments run alongside the company's own custom Arm-based Axion processors in other parts of its infrastructure stack.
Intel and Google say the collaboration across CPUs and IPUs will continue through future system generations, covering ongoing integration efforts across cloud infrastructure layers.
They maintain that CPUs and infrastructure accelerators remain part of current cloud design patterns across distributed systems.
Many workloads running in Google's data centers require backward compatibility with the x86 architecture, while others need the maximum single-thread performance that Xeon CPUs deliver.
These requirements are expected to persist for years, which explains why Intel and Google signed this multi-year agreement.

