THE MACHINE LEARNING REVOLUTION
Maximum PC | September 2021
According to the hype, artificial intelligence will soon be capable of anything. Jeremy Laird examines the true nature of machine learning
Jeremy Laird

IT’S THE NEXT BIG WAVE of computing. So says Intel of artificial intelligence (AI). If anything, that’s underselling it. According to some evangelists, AI is the key to unlocking almost every major problem humanity faces, from a cure for cancer to limitless clean energy. Less optimistic observers, including philosopher and neuroscientist Sam Harris, see AI as one of the most pressing existential threats to the survival of mankind. Either way, as Dr. Emmett Brown would put it, it’s pretty heavy.

Even a brief analysis of the implications of AI quickly takes on epic proportions. Back at the more practical end of the epistemological scale, getting a grasp on current commercial implementations of AI can be equally baffling. Machine learning, deep learning, neural networks, Tensor cores—keeping track of the processes and hardware, not to mention the jargon, associated with AI is a full-time job.

So, the long-term impact of AI may be anyone’s guess. But, in the here and now, there are plenty of questions that can at least begin to be addressed. What is AI in practical computing terms? What is it used for today? What kind of hardware is involved and how are AI workflows processed? And does it add up to anything that you as a computing enthusiast should care about? Or is it just a tool for Big Tech to bolster their balance sheets?

BUZZWORDS AND BULLCRAP or the greatest paradigm shift in the history of computing? What, exactly, is artificial intelligence, or AI? According to the hype, AI won’t just radically revolutionize computing. Eventually it will alter almost every aspect of human life. Right now, however, defining AI and determining how relevant it really is in day-to-day computing, that’s not so easy.

Put another way, we can all agree that when, for instance, the self-driving car is cracked, it’ll have a huge impact on the way we live. But more immediately, when a chip maker bigs up the “AI” abilities of its new CPU or GPU, does that actually mean much beyond the marketing? Whether it’s a graphics card or a smartphone chip, is the addition of “AI” fundamentally different from the usual generational improvements in computing performance?

Taken in its broadest sense, AI is any form of intelligence exhibited by machines. The meaning of “intelligence” obviously poses philosophical problems, but that aside, it’s a pretty straightforward concept. Drill down into the specifics, however, and it all gets a lot more complicated. How do you determine that any given computational process or algorithm qualifies as artificial intelligence?

WHAT DEFINES AI?

One way to define AI is the ability to adapt and improvise. If a given process or algorithm can’t do that to some degree, it’s not AI. Another common theme is the combination of large amounts of data with the absence of explicit programming. In simple terms, AI entails a system with an assigned task or a desired output, and a large set of data to sort through, but the precise parameters under which the data is processed aren’t defined. Instead, the algorithms are designed to spot patterns and statistical relationships, and to learn in a trial-and-error fashion. This is what’s otherwise known as machine learning, and it’s usually what’s meant when the term AI is used in a commercial computing context.
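
To make that concrete, here’s a minimal sketch of the learn-from-data idea in Python. Nothing in it comes from a real AI product: the task (fitting a straight line to noisy points by trial and error) and every number in it are our own illustrative choices.

```python
# A minimal sketch of "learn the parameters from data" rather than
# hard-coding them. The task and numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Training data: inputs x and noisy outputs that roughly follow y = 3x + 2.
x = rng.uniform(-1, 1, size=200)
y = 3 * x + 2 + rng.normal(scale=0.1, size=200)

# The "program" is just two numbers we don't know in advance.
w, b = 0.0, 0.0

# Trial and error: predict, measure the error, nudge the guess, repeat.
for _ in range(500):
    pred = w * x + b
    error = pred - y
    w -= 0.1 * np.mean(error * x)  # nudge the weight downhill
    b -= 0.1 * np.mean(error)      # nudge the bias downhill

print(f"learned w={w:.2f}, b={b:.2f}")  # lands close to 3 and 2
```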

A good example of how this works in practice is natural language processing. A non-AI approach would involve meticulously coding the specific rules, syntax, grammar, and vocabulary of a given language. With machine learning, the algorithmic rules are much less specific and all about pattern spotting, while the system is fed huge amounts of sample data from which patterns eventually emerge.
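
Here’s what that pattern-spotting approach looks like at toy scale, sketched with Python’s scikit-learn library. The sentences, labels, and choice of model are assumptions made purely for illustration; real natural language systems learn from vastly more data with far richer models.

```python
# A toy version of the pattern-spotting approach: no grammar rules, just
# example sentences and labels. The data and model are illustrative only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = [
    "what a wonderful, delightful film",
    "an absolute joy from start to finish",
    "a dull, tedious waste of two hours",
    "terrible acting and a worse script",
]
labels = ["positive", "positive", "negative", "negative"]

# Count word frequencies, then learn which words co-occur with which label.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)

print(model.predict(["a wonderful, delightful film"]))  # expect 'positive'
print(model.predict(["a dull, terrible script"]))       # expect 'negative'
```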

GPT-3

Generative Pre-trained Transformer 3 (GPT-3), developed by San Francisco-based OpenAI and released in 2020, is just such a machine-learning natural language system. It was trained using billions of English language articles harvested from the web. GPT-3 arrived to much acclaim, with The New York Times declaring it “terrifyingly good” at writing and reading. In fact, GPT-3 was so impressive that Microsoft acquired an exclusive license in order to develop and deliver advanced AI-powered natural language solutions with the technology. The first of these is a tool that converts text into Microsoft Power Fx code, a programming language used for database queries and derived from Microsoft Excel formulas.

GPT-3 goes beyond the usual question-and-answer Turing Test tricks. It can do things such as build web layouts using JSX code in response to natural language requests. In other words, you type something like “build me a page with a table showing the top 20 countries listed by GDP and put a relevant title at the top,” and GPT-3 can do just that. It showcases both the ability to adapt and a system that’s based on processing data rather than intricate hand-coded rules.
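
For a flavor of how a developer might have put that kind of request to GPT-3, here’s a rough sketch using OpenAI’s Python client as it looked around the model’s launch (the library’s interface has since changed). The engine name, prompt wording, and parameters are illustrative assumptions, not anything documented by OpenAI or Microsoft.

```python
# A rough sketch of prompting GPT-3 for JSX via OpenAI's original Python
# client. Engine name, prompt, and parameters are illustrative assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # issued by OpenAI

prompt = (
    "Build me a page with a table showing the top 20 countries listed "
    "by GDP and put a relevant title at the top.\n\nJSX:\n"
)

response = openai.Completion.create(
    engine="davinci",   # the original GPT-3 engine family
    prompt=prompt,
    max_tokens=300,
    temperature=0,      # favor the most statistically likely continuation
)

print(response["choices"][0]["text"])  # the generated JSX markup
```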

However, GPT-3 is also a case study in the limitations of machine learning. Indeed, it’s debatable whether GPT-3 is actually intelligent at all. Arguably, it could all be considered something of a digital conjuring trick. That’s because GPT-3 and its machine-learning natural language brethren do not understand language—or, ultimately, anything else—at all. Instead, everything they do is simply based on statistics and patterns.

By analyzing enormous quantities of written text, such systems can build statistical rules that output plausible responses to natural language queries, without any need for what, on a human level, would be classed as understanding. And that, essentially, is the basis for most—if not all—machine learning, whether it’s applied to the problem of natural language processing and voice assistants, self-driving cars, facial recognition, or recommending products and content to customers and consumers. It’s just pattern spotting on an epic scale. From Amazon’s Alexa to Tesla’s Autopilot, the fundamental approach is the same. You can find out more about the limitations of existing AI systems in the boxout on page 43, but if we’ve established the rough parameters of AI, the next question is how it’s implemented and what kind of hardware is required.
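
To see how far plain statistics can go, consider this deliberately crude Python sketch: a bigram model that strings words together based purely on which word has followed which in its training text. The tiny corpus is invented for the example, and nothing resembling understanding is involved.

```python
# Plausible-looking text from statistics alone: a bigram Markov chain that
# only knows which word tends to follow which. The corpus is made up.
import random
from collections import defaultdict

corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the dog . the dog chased the cat ."
).split()

# Pure pattern spotting: record which words follow which.
followers = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    followers[current].append(nxt)

random.seed(1)
word, output = "the", ["the"]
for _ in range(12):
    word = random.choice(followers[word])  # pick a previously seen successor
    output.append(word)

print(" ".join(output))  # reads plausibly, yet nothing was "understood"
```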

A CHIP-DESIGN GURU AND THE UNFATHOMABLE FUTURE OF AI

CPU ARCHITECT AND TECH LEGEND JIM KELLER ON MACHINE LEARNING

Is there anybody involved in chip design with a better résumé than Jim Keller? He’s the brains behind not only AMD’s first CPU architecture to really sock it to Intel (that’ll be Hammer back in the 2000s), but also the driving force that led to Zen and AMD’s renaissance in the late 2010s. He sired Tesla’s Autopilot chip, set Apple on the path to building arguably the most efficient client CPUs available today, and more recently had a stint at Intel.

Now Keller is heading a start-up, Tenstorrent, specializing in custom chips for accelerating AI workloads. That fact alone is enough to lend serious credibility to the field of AI and machine learning. According to Keller, there are now three kinds of computers. First there were CPUs. Then came GPUs. Now, the AI computer has arrived.

“In the beginning,” Keller explains of the development of computers, “there was scalar math, like A equals B plus C times D. With a small number of transistors, that’s all the math you could do.” As the transistor count of computer chips grew exponentially, so did the complexity of the math that was possible. First vector math became possible, then matrix multiplication. Today, that complexity is pushing chip design in a new direction.
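
Keller’s progression is easy to picture in code. The NumPy sketch below stands in for the hardware purely as an illustration; the array sizes and values are arbitrary.

```python
# Scalar, vector, and matrix math in miniature. NumPy stands in for the
# hardware; sizes and values are arbitrary illustrative choices.
import numpy as np

# Scalar math: one result per operation, A = B + C * D.
b, c, d = 2.0, 3.0, 4.0
a = b + c * d

# Vector math: the same expression applied across whole arrays at once.
B, C, D = np.full(8, 2.0), np.full(8, 3.0), np.full(8, 4.0)
A = B + C * D

# Matrix multiply: the workhorse operation of today's machine learning.
M1 = np.random.rand(64, 64)
M2 = np.random.rand(64, 64)
M3 = M1 @ M2

print(a, A[0], M3.shape)  # 14.0, 14.0, (64, 64)
```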

“As we get even more transistors, you want to take big operations and break them up. If you make your matrix multiplier too big, you begin to waste energy. So you build this optimal size block that’s not too small, like a thread in a GPU, but not too big, like covering the whole chip with one matrix multiplier.” The result, according to Keller, is an array of medium-sized processors, where “medium” means a processor capable of four tera operations per second. What Keller is describing is the next step in AI computing: from specialized blocks that accelerate matrix math, like Nvidia’s Tensor cores, to a new type of chip that accelerates what he calls “graph” computing.
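
The tiling idea Keller describes can be sketched in a few lines, too: break one big matrix multiply into medium-sized blocks and accumulate the partial results. The block and matrix sizes below are arbitrary choices for illustration, not Tenstorrent’s design parameters.

```python
# Tiling a big matrix multiply into medium-sized blocks, roughly the job an
# array of "medium-sized" matrix engines would share. Sizes are arbitrary.
import numpy as np

N, BLOCK = 256, 64
A = np.random.rand(N, N)
B = np.random.rand(N, N)
C = np.zeros((N, N))

# Each (i, j) tile of the result is accumulated from small block products.
for i in range(0, N, BLOCK):
    for j in range(0, N, BLOCK):
        for k in range(0, N, BLOCK):
            C[i:i+BLOCK, j:j+BLOCK] += (
                A[i:i+BLOCK, k:k+BLOCK] @ B[k:k+BLOCK, j:j+BLOCK]
            )

assert np.allclose(C, A @ B)  # same answer as one monolithic multiply
```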
