

When AI moves from chips to racks

March 2026


PCQuest

AI performance is no longer just about faster chips. It is about how racks, power, networking, and orchestration work together. As agentic AI grows, infrastructure must become predictable, open, and built for scale from day one

- By Ashok Pandey


At the AI Summit, one theme stood out: performance is no longer decided at the chip level. It is defined at the rack level.

Archana Vemulapalli, Corporate Vice President of Global Commercial Sales at AMD, and Mahesh Balasubramanian, Senior Director of Data Center GPU Product Marketing at AMD, outlined how this shift is reshaping infrastructure thinking. The message was clear: AI scaling is no longer about stacking individual servers. The rack has become the unit of design.

For large frontier models, tightly integrated infrastructure drives efficiency. Multi-hundred-billion and even multi-trillion parameter models demand more than powerful accelerators. They require compute, memory, networking, power, and cooling engineered as one system.

The chip is still core. But it is no longer enough.

Why rack scale matters now

Rack-scale infrastructure is becoming critical in two major scenarios.

First, pre-training large models. At that scale, dividing workloads across loosely connected systems introduces communication overhead. Efficiency drops. Integrated racks reduce those inefficiencies.

Second, inference at scale. When inference serves a large number of users or AI agents, efficiency becomes a function of integration. Large-context inference and massive concurrency benefit from fully integrated rack-scale systems.
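The communication-overhead argument can be made concrete with a back-of-envelope model. The sketch below is purely illustrative, not based on any AMD figures: it assumes a ring all-reduce (a standard gradient-synchronization pattern, moving roughly 2 × (n−1)/n of the gradient payload per worker) and compares a hypothetical loosely connected cluster against a much faster rack-scale fabric. All the numbers (100 GB of gradients, 64 workers, link speeds) are invented for the example.

```python
# Toy scaling-efficiency model: per training step, each worker computes for
# t_compute_s seconds, then synchronizes gradients over the interconnect.
# A ring all-reduce moves ~2 * (n - 1) / n * model_gb per worker.

def step_efficiency(model_gb: float, workers: int,
                    t_compute_s: float, link_gbps: float) -> float:
    """Fraction of a training step spent on useful compute."""
    payload_gb = 2 * (workers - 1) / workers * model_gb  # ring all-reduce traffic
    t_comm_s = payload_gb * 8 / link_gbps                # GB -> gigabits, / Gb/s
    return t_compute_s / (t_compute_s + t_comm_s)

# Same gradient payload and compute time, two very different fabrics:
loose = step_efficiency(model_gb=100, workers=64, t_compute_s=1.0, link_gbps=100)
rack = step_efficiency(model_gb=100, workers=64, t_compute_s=1.0, link_gbps=3200)
print(f"loosely connected: {loose:.0%}, rack-scale fabric: {rack:.0%}")
# e.g. loosely connected: 6%, rack-scale fabric: 67%
```

Under these assumptions the slower fabric leaves workers idle most of each step, which is the efficiency drop the speakers describe; the same arithmetic applies to large-context inference, where key-value and activation traffic takes the place of gradients.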

AI is also expanding beyond centralized training clusters. It is proliferating across:

  • CPUs for traditional enterprise applications.

  • GPUs for model training and high-performance inference.

  • PCs for local and private AI use cases.

  • Edge systems in telecom base stations and distributed environments.

  • Field-Programmable Gate Arrays (FPGAs) for accelerating AI at the edge.

Yet, at the front end of AI development, especially large model training, rack-scale integration is becoming decisive.
