We asked John Knight to relive the entirety of computing history through open source. He almost died trying…
Never heard of emulation? It’s when a piece of software mimics another machine in order to run its programs. For instance, you may have seen someone playing old Nintendo games on an Android phone. Emulation is best known for its use in playing old videogames on modern systems, but it’s used in many areas of computing, with many different purposes.
The first emulator was developed by IBM in 1965, for the System/360 line. It could run programs written for the older 7070 system and was a hit with customers. Although emulation would continue to develop in the computing industry, it remained relatively niche until the ’90s, when game console emulation on PCs resulted in landmark court cases. Nowadays, emulation is mainstream and is used for everything from virtual machines to nostalgic consoles, such as the Nintendo Classic Mini.
Emulators can be described as having anything from low-level to high-level emulation. The lower the level, the closer it is to the hardware, and the more system functions it tries to replicate. The higher the level of emulation, the more it simply mimics the required output behavior (for instance, “open a file” or “draw a rectangle on screen”).
The more it emulates the original machine, the more accurate the program’s behavior, but at a cost of speed, as your computer has more to process. The higher the level of emulation, the better the performance, as your computer can use its own hardware more, but at a cost of accuracy. The program is more prone to errors and feels less like the machine being imitated, and more like the system it’s actually running on.
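The low-level end of this spectrum boils down to an interpreter loop: fetch an instruction, decode it, execute it, repeat, for every single instruction the original machine would have run. The toy sketch below illustrates the idea for an imaginary accumulator CPU; the machine and its opcodes are invented purely for demonstration and don’t correspond to any real system.

```python
# A toy illustration of low-level emulation: an interpreter loop that
# fetches, decodes, and executes instructions for an imaginary CPU.
# The machine and opcodes here are invented for demonstration only.

def run(program):
    """Execute a list of (opcode, operand) pairs on a tiny virtual machine."""
    acc = 0   # accumulator register
    pc = 0    # program counter
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":      # put a constant in the accumulator
            acc = arg
        elif op == "ADD":     # add a constant to the accumulator
            acc += arg
        elif op == "JNZ":     # jump if the accumulator is non-zero
            if acc != 0:
                pc = arg
                continue
        pc += 1
    return acc

# Count down from 3: each pass subtracts 1 until the accumulator hits 0.
result = run([("LOAD", 3), ("ADD", -1), ("JNZ", 1)])
print(result)  # 0
```

Every emulated instruction costs many host instructions, which is exactly the accuracy-for-speed trade-off described above; a high-level emulator would skip this loop entirely and just perform the end result natively.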
Emulators are particularly useful in getting around digital obsolescence. A business may rely on a niche piece of software, or a videogamer may want to play something from older systems on their current machine. In previous decades, major game companies have tried to stomp out attempts at emulation, but now that it’s become so pervasive, some companies have started embracing the technology and integrating their old software libraries into online stores—often applying visual upgrades in the process, such as higher resolutions.
What’s inside the box? Another box…
OTHER THAN THE EMULATOR, you usually need two things: some ROMs and a BIOS. “ROM” is a blanket term for programs you want to emulate. Although it should refer to an image of a system’s Read Only Memory chip, it is often used in the context of normal disk images that can be altered. Either way, you search for ROMs on specialist ROM sites. The BIOS is another blanket term for the program responsible for controlling the machine.
Distributing ROMs is generally illegal unless there’s a license that allows it—the exact legality varies between territories. Dumped copies of a system BIOS are legal under US law as long as the user owns the original machine. As you will see later, some emulators have their own substitute BIOS, which anyone can use legally and for free, developed through a process of reverse engineering, though usually at a cost of emulation accuracy.
SOME WINE? If you’ve used Linux for any decent length of time, you’ll have come across Wine. Wine lets you run Windows programs on Linux, but as any smarty-pants will tell you, Wine stands for “Wine Is Not an Emulator.” So what is it, then? Wine is what’s known as a compatibility layer. Compatibility layers take system calls from the foreign application and translate them for the native system.
For instance, if you’re running Microsoft Paint and click the maximize button, it sends a signal to the OS (a system call) to maximize the window. If you were running Paint in Linux with Wine, when you click the maximize button, Wine simply takes that Windows system call and substitutes it with a Linux system call.
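The mechanism can be sketched as a lookup table mapping each foreign call onto a native handler. This is a deliberately minimal illustration, not how Wine is actually structured: the handler and window names are invented, though `ShowWindow` with `SW_MAXIMIZE` is the real Win32 call for maximizing a window.

```python
# A minimal sketch of a compatibility layer, assuming a simple
# one-to-one mapping of foreign calls to native handlers. The handler
# below is a hypothetical stand-in for a real windowing-system call.

def linux_maximize_window(window_id):
    """Stand-in for the native window-maximize call on the host system."""
    return f"maximized {window_id} via native call"

# Translation table: foreign (Windows-style) call -> native handler.
TRANSLATION = {
    "ShowWindow/SW_MAXIMIZE": linux_maximize_window,
}

def foreign_call(name, *args):
    """Dispatch a foreign system call to its native equivalent."""
    handler = TRANSLATION[name]
    return handler(*args)

print(foreign_call("ShowWindow/SW_MAXIMIZE", "paint-window"))
# maximized paint-window via native call
```

Because every call ends up as a native one, there is no simulated CPU or operating system in the middle, which is why this approach is so much faster than full emulation.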
The result is that rather than running a program under a Windows emulator, you are running the program natively as a Linux application—the main benefit being speed. Compatibility layers don’t stop there. Another variant is what’s known as a wrapper, which translates one kind of driver API into another.
For instance, in the late 1990s, 3dfx Voodoo cards were extremely popular for 3D acceleration. Although these cards supported OpenGL and Microsoft’s Direct3D, 3dfx had its own proprietary Glide API that guaranteed the best performance with its hardware. Translating those old Glide calls into a modern API, so games written for it run on today’s graphics cards, is the job of a wrapper.
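A wrapper works call by call: each legacy entry point is reimplemented so that it issues the equivalent modern call. The sketch below is loosely modeled on Glide and OpenGL naming (`grBufferClear` is a real Glide function), but both functions here are simplified stand-ins, not the real APIs.

```python
# A hedged sketch of a graphics wrapper: a legacy Glide-style call is
# re-expressed as a modern OpenGL-style call. Both functions are
# simplified stand-ins loosely modeled on the real API names.

def gl_clear(r, g, b):
    """Stand-in for the modern driver's clear-screen entry point."""
    return f"glClearColor({r}, {g}, {b}) + glClear()"

def grBufferClear_wrapped(color):
    """Wrapper for a Glide-era buffer clear, re-routed to OpenGL."""
    # Assume the color arrives packed as a 0xRRGGBB integer;
    # unpack it into 0-255 channels for the modern call.
    r = (color >> 16) & 0xFF
    g = (color >> 8) & 0xFF
    b = color & 0xFF
    return gl_clear(r, g, b)

print(grBufferClear_wrapped(0x336699))
# glClearColor(51, 102, 153) + glClear()
```

The old game keeps calling the Glide names it always did; the wrapper quietly reroutes them, which is the same translate-don’t-emulate principle Wine applies to system calls.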
With its deep pockets, Valve was able to build upon the existing Wine codebase with its own functionality, released as Proton. By translating to the new Vulkan API instead of OpenGL, Valve achieved huge performance gains, making conversion between Microsoft DirectX 12 and the Linux desktop genuinely viable. Although it’s still relatively early days, around 53 percent of Windows games work so far, giving Linux gamers an enormously increased library.
Defining the earliest “computer” is tricky, but they’ve certainly been around for longer than you might think
ONE OF THE EARLIEST MACHINES to be accepted as an analog computer is the Antikythera Mechanism. Dated to around 100 BC, it was used in astronomy to determine the positions of celestial bodies decades in advance. Only fragments of it remain, but detailed imaging suggests it could even model the irregular orbit of the moon. You can find a simulation of the Antikythera Mechanism at www.etl.uom.gr/mr/index.php?mypage=antikythera_sim. It only has a Windows executable, but it runs under Wine.
In the 13th century, we find Turkish inventor Ismail al-Jazari and his automata, laid out in his Book of Knowledge of Ingenious Mechanical Devices. Al-Jazari made complex mechanical musicians that ran via clockwork and even had a programmable drum machine, coded by altering the movement of cams. However, documentaries usually start just before the Victorian era. In those days the word “computer” would have evoked an image of a guy with a pencil behind his ear doing the math. But these squishy creatures were unreliable at best, and a machine with unerring accuracy was needed for a genuine technical revolution.
SPLIT THE DIFFERENCE
Enter Charles Babbage (1791-1871) and his Difference Engine. This was a mechanical number-cruncher, powered by a hand crank, with digits shown on rotating dials. It could not only perform arithmetic, but could also be programmed to follow numeric sequences, and even extract the root of a quadratic equation.
Babbage also designed the Analytical Engine, though it was never built. This would have been so far ahead of its time that it’s almost scary, with features much like a mainframe computer of the mid 20th century: punch-card programming, 16.2K of decimal storage, a central processing unit of sorts dubbed “the Mill,” and something akin to assembly language.
Unfortunately, withdrawn funding meant the Difference Engine was never completed (Babbage only made a stripped-down prototype), and there was no interest in producing the even more elaborate Analytical Engine. Nevertheless, enough design existed for Babbage’s friend Ada Lovelace to write an algorithm for it, thus Lovelace is generally credited as having written the first computer program.