For most people, the story of graphics cards starts with Nvidia and AMD. The reality is quite different.
Long before either of those two companies came to dominate the market, and long before GPUs powered modern games, computer graphics were taking their first, tentative steps. To understand how we ended up with modern GPUs, we'll have to go back to a time before the term "graphics card" even existed.
What came before graphics cards existed?
For most people, computer graphics weren’t a thing yet.
Before we could sit around discussing all the new GPUs coming out in 2026, someone had to invent computer graphics in the first place. And computer graphics are not the same thing as a GPU, not even close.
Early computers didn’t display images, windows, or even pixel-addressable graphics. Most outputs were text-based. Electromechanical teleprinters like the 1963 Teletype Model 33, which adopted the new ASCII standard, were essentially glorified typewriters, spitting results onto paper one line at a time.
They were painfully slow, loud, and very literal. There were no visuals to speak of other than whatever text you ordered the computer to print out.
Next came video terminals, often referred to as "dumb terminals." They were essentially keyboards connected to screens, but they weren't computers in their own right: they hung off a network connected to a host computer, and the terminal displayed whatever the host sent back. The screen was divided into a fixed grid (usually 80 columns wide), and each cell could display one character from a predefined set. That let people get creative with some very basic ASCII art, but everything on screen was built from characters that were pre-programmed and assigned to specific keys.
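That character-grid approach also kept the amount of data tiny, which mattered when every byte traveled over a slow serial link. As a rough, illustrative sketch (the 80 x 24 grid and the 640 x 480 monochrome bitmap below are typical example figures, not the specs of any particular terminal), here's the difference in raw data between a text screen and a pixel screen:

```python
# Back-of-the-envelope comparison: why character grids were so cheap.
# Assumes an 80x24 text grid (1 byte per character) and, for contrast,
# a 640x480 monochrome bitmap (1 bit per pixel). Figures are illustrative.

text_cells = 80 * 24              # 1,920 character codes to store and transmit
bitmap_bits = 640 * 480           # 307,200 pixels
bitmap_bytes = bitmap_bits // 8   # 38,400 bytes at 1 bit per pixel

print(f"Character grid:    {text_cells:,} bytes")
print(f"Monochrome bitmap: {bitmap_bytes:,} bytes")
# The bitmap needs roughly 20 times more data than an entire screen of text.
```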
The ’60s saw the advent of computer graphics in some form
Even while most people were still stuck dealing with text, some researchers were already experimenting with interactive graphics. In 1963, Ivan Sutherland created Sketchpad, a system that let users draw and manipulate line drawings directly on a screen. Sounds pretty similar to the touchscreens of today, doesn’t it? Sketchpad used a light pen to achieve this.
Interactive computer graphics were also coming to life in big corporate systems: IBM started shipping graphics terminals like the 2250 in 1965.
The introduction of the PC sped up the evolution of graphics
But it still had little to do with graphics cards.
The late 1970s brought the emergence of personal computers. Not quite as we know them today, but computers meant for individual users. Before that, computers were massive machines that took up entire rooms and were mostly used by businesses; the rise of the PC made them far more widely available. That also brings us within reach of computer graphics.
The early days of computer graphics gave us low-resolution, often monochrome displays. PCs had limited memory, which meant programmers had to get clever to achieve something remotely engaging in terms of graphics output.
Machines like the Radio Shack TRS-80 introduced bitmap-style graphics, but at extremely low resolutions (128x48).
Before graphics cards and accelerators, the CPU and system memory did most of the work of putting anything on screen. Since those early PCs had kilobytes, not megabytes, of RAM, storing full-screen images was expensive and impractical. Graphics were kept minimal and reused aggressively wherever possible.
This was also a time before standard image formats, before JPEGs, BMPs, and PNGs. Software had to store images as raw bitmaps or custom data structures, and compress them as if there were no tomorrow.
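To get a feel for the constraint, here's a rough, hypothetical bit of arithmetic (the 16 KB machine and the 320 x 200, 16-color frame are illustrative examples, not any specific computer's specs):

```python
# Illustrative memory math for early home computers.
# Numbers are rough examples, not the specs of any one machine.

ram_bytes = 16 * 1024                # a typical 16 KB home computer

# A coarse 128x48 block-graphics screen at 1 bit per block:
blocky_screen = (128 * 48) // 8      # 768 bytes, easily affordable

# A "full" bitmap, say 320x200 at 4 bits per pixel (16 colors):
full_bitmap = (320 * 200 * 4) // 8   # 32,000 bytes

print(f"RAM available:          {ram_bytes:,} bytes")
print(f"Block-graphics frame:   {blocky_screen:,} bytes")
print(f"320x200 16-color frame: {full_bitmap:,} bytes")
# The full-color frame alone would need roughly twice the machine's entire RAM,
# which is why software leaned on tiny, heavily reused graphics instead.
```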
IBM’s early display standards shaped PC graphics
Apple was up there, too, with the Macintosh.
IBM was a big player in the early days of personal computing. In 1981, IBM introduced the "PC," followed by the PC-XT in 1983. The original IBM PC effectively had zero graphics power, though. Most workloads were still text-based, so text clarity and reliable software were the two main concerns.
The IBM PC could be configured with the Monochrome Display Adapter (MDA), which was sharp for the time and delivered high-contrast text, but no bitmap graphics at all. It was used for word processing, databases, and spreadsheets.
The Color Graphics Adapter (CGA), also introduced with the IBM PC, finally added basic color graphics, but the palette was tiny and the resolutions were low; its best-known graphics mode managed 320 x 200 pixels with only four colors on screen at once. But hey, at least we had some graphics.
Apple took a different approach with the early Macintoshes, or Macs, as we know them today. The Mac treated the screen as a bitmap instead of a pre-programmed character grid, giving software far more freedom over what it could draw. Those choices helped Apple break ground in industries like graphic design and publishing, and the Mac remains a go-to for those workloads to this day.
Although IBM wasn’t too generous with its graphics interfaces in those early days, it did something far more important: it helped standardize PC display modes in the IBM-compatible ecosystem. With devs and manufacturers optimizing their products to be compatible with IBM as an industry standard, the door was wide open for computer graphics to finally flourish.
The rise of 2D graphics was a pivotal moment in computing
We’re still nowhere near the GPUs of today, though.
By the late 1980s, bitmap graphics modes had become mainstream on IBM-compatible PCs, and the rise of Windows meant more software was written for a pixel-addressable screen. Text modes didn't vanish overnight, but basic graphics were becoming the norm. In 1987, IBM's PS/2 line introduced VGA, which became the PC baseline; the VGA port may be hopelessly outdated today, but back then, the standard was revolutionary.
VGA finally let personal computers display semi-realistic images, games, and video. It also vastly expanded the practical resolution and color options on PCs, introducing the popular 256-color mode, which put 256 colors on screen at 320 x 200.
Most video was still tiny, heavily compressed, and unimpressive by today's standards, but it was groundbreaking for the time. Just as importantly, VGA became a baseline that developers could target, which sped up the graphics revolution. SVGA arrived in the early 1990s, and PCs could finally run higher resolutions, as high as 1,024 x 768.
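Those jumps in resolution and color depth are exactly what drove video memory requirements upward. As a rough sketch (using standard mode sizes; actual cards and their memory configurations varied), the raw framebuffer math looks like this:

```python
# Why resolution and color depth drove video memory requirements.
# Standard mode sizes; exact card capabilities varied.

def framebuffer_bytes(width, height, bits_per_pixel):
    """Raw framebuffer size for a given display mode."""
    return width * height * bits_per_pixel // 8

modes = [
    ("VGA 320x200, 256 colors",   framebuffer_bytes(320, 200, 8)),    # 64,000 bytes
    ("VGA 640x480, 16 colors",    framebuffer_bytes(640, 480, 4)),    # 153,600 bytes
    ("SVGA 1024x768, 256 colors", framebuffer_bytes(1024, 768, 8)),   # 786,432 bytes
]

for name, size in modes:
    print(f"{name}: {size:,} bytes")
# Going from VGA's 256-color mode to 1,024x768 at 256 colors multiplies the
# framebuffer by more than 12 times, which is a big part of why SVGA cards
# shipped with much more dedicated video memory.
```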
However, graphics were still largely handled by the CPU. Display adapters like CGA and VGA could put pixels on screen, but dedicated "graphics cards" as we call them today, with their own processing hardware, weren't common yet. That started to change in the late 1980s and into the 1990s, as 2D acceleration became more widespread.
Early 2D acceleration arrived as dedicated hardware on add-in boards (again, nobody called them graphics cards back then). Through the 1990s, those accelerators increasingly became integrated 2D engines inside mainstream VGA and SVGA cards.
Those 2D accelerators did a lot of heavy lifting, taking over repetitive jobs like moving blocks of pixels, filling areas, and scrolling, and they made PCs feel noticeably more responsive. More importantly, they made it clear that computer graphics were a workload that deserved its own dedicated hardware, eventually leading us to the GPUs we know today.