Science

CES 2019: AMD Debuted the World's First 7nm GPU, but What Does That Mean?

New shots fired in the never-ending chip wars. 

CES 2019 is basically Wrestlemania for the tech industry, and few rivalries are as bitter or as quietly exciting as the one between chipmakers Advanced Micro Devices (AMD) and Nvidia. Every year, both companies try to outdo one another by hyping ever smaller, faster, and more powerful chips. This year, AMD came out of the gate strong by introducing the world’s first 7-nanometer graphics processing unit, or GPU.

The graphics card has been dubbed the Radeon VII, and it will ship for $699 starting February 7. The chipmaker says it’s here to revolutionize gaming, claiming a 25 percent performance boost over its previous flagship GPU. AMD showed this off with a demo of Devil May Cry 5 running at 4K resolution, with maximum graphical settings, at well over 60 frames per second.

In its subsequent press release, the company rattled off a list of titles it said would see a performance spike with the Radeon VII, including Fortnite. But will gamers actually notice the difference after spending $699?

Dr. Michael Liehr (left) of SUNY Polytechnic Institute's Colleges of Nanoscale Science and Engineering and Bala Haran (right) of IBM Research inspect a wafer made up of 7nm (nanometer) node test chips in a clean room in Albany, NY. IBM Research, working with alliance partners at SUNY Poly CNSE, has produced the semiconductor industry’s first 7nm node test chips with functional transistors. (Darryl Bautista/Feature Photo Service for IBM)

Flickr / IBM Research

Did AMD Make the First 7nm GPU?

7-nanometer chips are all the rage; you might remember Apple flexing its 7nm A12 Bionic chip at its 2018 iPhone unveiling. However, the A12 is a central processing unit, or CPU, not a GPU.

The difference is a little confusing, though a helpful analogy can be found on the Nvidia blog, which explains that while the CPU is the brain of a computer, the GPU is its soul.

“Architecturally, the CPU is composed of just a few cores with lots of cache memory that can handle a few software threads at a time,” it states. “In contrast, a GPU is composed of hundreds of cores that can handle thousands of threads simultaneously.”
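To make that concrete, here’s a minimal CUDA sketch of the difference Nvidia describes (this is purely illustrative; names like addArrays are hypothetical, not anything from AMD or Nvidia). The CPU version walks an array one element at a time on a single core, while the GPU version launches thousands of threads that each handle one element at the same time.

```cuda
#include <cstdio>

// GPU kernel: each of the thousands of launched threads handles one element.
__global__ void addArrays(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // unique index per thread
    if (i < n) out[i] = a[i] + b[i];
}

// CPU version: a single core walks the array one element at a time.
void addArraysCpu(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; ++i) out[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;               // about a million elements
    const size_t bytes = n * sizeof(float);

    float *a, *b, *out;
    cudaMallocManaged(&a, bytes);        // unified memory visible to CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&out, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover every element at once.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    addArrays<<<blocks, threads>>>(a, b, out, n);
    cudaDeviceSynchronize();             // wait for the GPU to finish

    printf("out[0] = %f\n", out[0]);     // prints 3.000000
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}
```

The arithmetic itself is trivial; the point is that the GPU throws a thread at every element at once, which is exactly the kind of massively parallel work that rendering a frame of a game involves.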

But wait, didn't Apple have a 7nm chip first? That was a CPU, not a GPU.

Apple

Why You Should Pay Attention to the Chip Wars

AMD is locked in a never-ending nanometer competition with its rivals to cram ever more computational muscle into the same tiny space. With GPUs in particular, this is almost entirely a battle of space efficiency, as the physical size of graphics cards themselves isn’t really changing.

Somewhat confusingly, 7nm doesn’t actually refer to a specific measure of distance; it’s more of a shorthand for the generation of chips that began rolling out last year (5nm chips are expected sometime in 2020-2021). But regardless of its literal size, it’s this cracker-shaped piece of tech that does all the heavy lifting. The rest of the space in your graphics card is taken up by lesser components like fans and other cooling systems.

Sure, you can splash out on an elaborate liquid-cooling setup that will keep your GPU flying. But until the likes of AMD figure out a better way to package and cool their GPUs, the bulky fan boxes will remain.

Here it is, the Radeon VII. Looks a lot like your last graphics card, right? Well, that’s because it kind of is.

AMD

Will You Be Able to Tell the Difference?

That all depends on the GPU you’re currently using. If your computer is already equipped with last year’s top-of-the-line GPU, chances are you’ll barely see a difference.

It’s like copping the iPhone XS when you already have the X. You might notice slightly better frame rates if you’re constantly tracking them, but that’s pretty much it.

The Radeon VII is a worthwhile investment if you haven’t upgraded your computer in a few years; that’s when you’ll notice the biggest difference. You’ll be able to run games on “Ultra” settings instead of “High,” and they’ll run just as smoothly.

So don’t think you need to scoop up these pricey components every time AMD or Nvidia hypes them up. The card you bought last year might be marginally worse, but the gap doesn’t warrant a $700 upgrade.