AMD Radeon RX Vega 56 review: even if you could buy one, you probably shouldn’t | PCGamesN

AMD Radeon RX Vega 56 review

The AMD RX Vega 56 follows team Radeon's Sith-inspired (probably) rule-of-two, meaning that every GPU they design eventually gets released as a pair of gaming graphics cards. That means, following on from the AMD RX Vega 64, comes their lower-spec RX Vega 56. 


Bless the ol’ AMD Vega architecture, it’s had a bit of a hard time post-launch. Since it was first announced, in hushed whispers, as the first high-end Radeon GPU in an age, it’s been hailed both as the saviour of AMD’s graphics division and the architecture that might damn it. In all honesty, it’s neither.

Much of Vega's struggles can be traced back to the choice of HBM2 as the memory technology to be used across the professional and consumer variants of the Vega design. If AMD had simply opted for traditional GDDR5/X memory, Vega would have been here a lot sooner and, importantly, a lot cheaper.

The lack of stock and confusion over pricing has left a bit of a nasty taste in the mouth, with accusations flying around that the original RX Vega 56’s $399 price given out to reviewers was only a short-term deal.

That's a moot point now, with the nightmare of cryptocurrency miners grabbing all available GPUs. Both Vega cards are now almost as rare as a reasoned Twitter debate, leaving prices artificially inflated, especially in the US. Pricing aside, though, how does the second-tier Vega stack up?


AMD Radeon RX Vega 56 specs


There’s not a huge difference between the GPUs at the heart of the top-end RX Vega 64 and this lower-caste RX Vega 56. As the name suggests, the second-tier card has 56 compute units (CUs) compared with the 64 CU count of the higher-spec card. That means AMD have jammed 3,584 GCN cores into the RX Vega 56 instead of the previous card’s 4,096 GCN cores.
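As a quick sanity check on those numbers (standard GCN arithmetic: every compute unit contains 64 stream processors), a couple of lines of Python reproduces both core counts:

```python
# Each GCN compute unit (CU) contains 64 stream processors (cores).
CORES_PER_CU = 64

def gcn_core_count(compute_units: int) -> int:
    """Total stream processors for a GCN GPU with the given CU count."""
    return compute_units * CORES_PER_CU

print(gcn_core_count(56))  # RX Vega 56 -> 3584
print(gcn_core_count(64))  # RX Vega 64 -> 4096
```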

Other than that it’s purely a case of some tweaked power settings and lower clockspeeds - everything else is identical, down to the cooling design. That means we’re still talking about 8GB of HBM2 video memory and a 2,048-bit bus. It’s the same on-die configuration that makes the Vega 10 GPU a 486mm² mammoth (the GP104 used in the GTX 1080 is just 314mm² for comparison) with 12.5bn transistors inside it.

There’s no liquid-chilled version of the RX Vega 56, nor are there any super-sexeh Silver Shroud variants. From the looks of things, AMD only made maybe five of either anyway, and I think those just went out to friends and family…

In terms of the actual Vega GPU architecture, we’ve gone into detail about what AMD’s engineers have been trying to create in our RX Vega 64 review. Essentially, they’re calling it their broadest architectural update in five years - since they unleashed the original Graphics Core Next (GCN) design - and it brings with it some interesting new features.

One of those is the Infinity Fabric interconnect, binding the GPU to all the other components in the chip and making it a more modular architecture than previous generations. It’s also the key enabler for multi-GPU packages in the future. Then there’s the new compute unit and AMD’s high-bandwidth cache controller (HBCC), offering the promise of the tech improving as more developers start taking advantage of it.

AMD Radeon RX Vega 56 benchmarks


AMD Radeon RX Vega 56 performance

The big challenge for the RX Vega 56 - outside of trying to magic up affordable stock for people to actually buy - is to push past the similarly priced Nvidia competition. Just as the RX Vega 64 is priced to go head-to-head with the GTX 1080, the RX Vega 56 has been released into the wild with the GTX 1070 in its sights. Unfortunately Nvidia have hit back with the similarly unavailable GTX 1070 Ti...

Like the RX Vega 64’s attempts to overthrow the GTX 1080, it’s kinda difficult to call an outright winner. As we said about the flagship Vega GPU, it’s an architecture that seems to have been primarily designed for a gaming future that hasn’t arrived yet. As such, it’s rather lacklustre in its legacy gaming performance.

With games built using the last-gen DirectX 11 API, the second-class Vega is left trailing in the wake of the GTX 1070, let alone the GTX 1070 Ti. But when you start to bring in tests based around the newer DirectX 12 or Vulkan APIs, Vega’s modern architectural design allows it to take the lead.

Unfortunately, despite DX11 now being very much a legacy API, with DX12 over two years old, the majority of PC games are still being released using the older system. And that means for the majority of PC games that come out over the next six months, at least, the GTX 1070 is likely to retain that performance advantage.

Granted, Vega’s performance improvements in DX12 and Vulkan are encouraging for how it’ll fare down the line, as they become the dominant APIs. But right now anyone lucky enough to find the RX Vega 56 for a decent price is still paying the same money for a card that struggles to beat a smaller, more efficient, year-old GPU in pretty much every game in their Steam library.

That’s not the only competition for the RX Vega 56, however, as there’s also the small matter of fratricide. As is their wont, despite the $100 difference in their respective SEPs, AMD haven’t made sweeping changes to the core configuration of the two Vega variants. Essentially, the RX Vega 56 is simply operating with 12.5% fewer cores, yet in performance terms it is only ever around 7-10% slower.
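For illustration only (real scaling depends on clockspeeds and memory bandwidth, not just core counts), the gap between that core deficit and the observed performance deficit works out like so:

```python
# Core counts for the two Vega variants (56 and 64 CUs, 64 cores each).
vega64_cores = 64 * 64  # 4096
vega56_cores = 56 * 64  # 3584

# Core-count deficit of the RX Vega 56 versus the RX Vega 64.
core_deficit = 1 - vega56_cores / vega64_cores
print(f"core deficit: {core_deficit:.1%}")  # core deficit: 12.5%

# Observed performance deficit from the benchmarks is roughly 7-10%:
# the card loses less performance than it loses cores, because the
# clockspeeds and memory subsystem are near-identical.
observed_deficit = (0.07, 0.10)
print(core_deficit > max(observed_deficit))  # True
```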

That means there’s a good chance gamers will end up going for this mildly chopped-down Vega instead of the much more expensive card. Or there would be, if the cards were consistently available for reasonable prices. We’ve seen etailers with RX Vega 64 SKUs available for around the same price as, or sometimes less than, that shop’s cheapest available RX Vega 56.

Yeah, Vega’s weird.

AMD Radeon RX Vega 56 verdict

The overall design of the Vega architecture seems to have been about laying down a marker, defining a branching point for future generations of their GPU technology. They’ve almost sacrificed legacy gaming performance for the promise of future applications of its feature set. The little extras the Vega architecture has baked into it look like they could be genuinely game-changing… but only if developers actually end up taking advantage of them. And that's a pretty big if.

If it were guaranteed that the HBCC and Rapid Packed Math shenanigans were going to be employed across the board, and not just by AMD best-buds Bethesda, then the new Radeon tech would definitely be the one to go for over the old-school Nvidia design. But it’s not a sure thing, and we don’t really know what performance improvements these Vega features might offer either.

Wolfenstein II: The New Colossus has shown some impressive Vega performance in the face of the GTX 1080 competition, continuing AMD's traditional speed wins in the low-level Vulkan API. But Vega is showing greater gains than the Polaris architecture, which would indicate there is something to the Rapid Packed Math stuff. We're going to be doing some more investigating on that in the future.

Arguably the potential performance improvements of Far Cry 5 on AMD cards could be the most important metric for seeing how the ‘fine wine’ gamble of Vega will manifest itself. But that’s a way off and of no comfort to anyone looking at AMD’s latest GPUs right now.

If you were spending around $400 on a graphics card today - even if there were RX Vega 56s available at their suggested price - it would still be difficult to make the Radeon recommendation. It’s the more advanced architecture, but in raw performance terms the smaller, slightly cheaper, more efficient Nvidia GPU is likely to get you higher frame rates in more of the games you’re playing at the moment.

If they could have priced Vega more aggressively against the still-strong year-old Nvidia Pascal architecture, then it would have a better chance of taking the market by storm, but unfortunately the price of HBM2 is a big sticking point. And with AMD reportedly losing $100 on each card sold at the suggested price, it looks like they’ve done all they can on that front.

I do like what AMD are trying to do with Vega, but with all things being equal, sacrificing current performance for the chance of higher frame rates in a few future games is likely to be too much of a stretch for most PC gamers. 

DuoBlaze
5 Months ago

Those total power draw numbers are not at all like what I've observed. I've had a Vega 64 liquid card in my PC with all default WattMan settings for weeks. During peak (spikes - meaning above average) utilization I still have not exceeded 475W total power draw from my entire PC. With my RX 480 that never exceeded 400W. Either that 345W number is wrong or I've got some major headroom for overclocking.

As it goes right now using balanced profile the radiator fan is almost inaudible while gaming, barely beyond the noise level of my chassis fans and runs much cooler than my air cooled RX480 did.

I feel like reviewers are not being accurate with power draw, noise and heat numbers.

Dave James
5 Months ago

I may be wrong here, but I think you might be getting confused between total platform power draw and the rated thermal design point (TDP) of the cards.

The TDP of your liquid-chilled Vega 64 is 345W, essentially meaning its cooling has been designed to dissipate that level of energy use under load.

The total platform power draw shown in the benchmarks above represents the amount of energy being drawn from the wall into our test rig's PSU. That was taken using a discrete energy meter while running Battlefield 4 at max settings and at 1440p; the same method I've used for all of my GPU tests for a good few generations of graphics card.

The temperature readings are taken from Afterburner, which we leave running throughout the entirety of our testing suite to catch the maximum temperature, as well as long term GPU frequencies too.

Anakhoresis
5 Months ago

Well Tom's hardware did report that they recorded peaks from the standard air-cooled card (just the card, not including the rest of the system components) of 385w. Though their monitoring equipment is probably more sensitive to peaks.

mikeweatherford7
5 Months ago

These cards respond extremely well to undervolting. The difference can be dramatic: higher stable clocks, at much lower temps, and lower fan levels. My reference Vega 56 can do 1,550MHz (actual full 3D load clock) at 1,040mV, and a 950MHz HBM clock. At these settings it's significantly faster than a 1070, on the heels of a 1080, at reasonable temps and noise levels. This result seems to be typical among Vega 56 owners.

project17
5 Months ago

I must say the RX Vega 56 is an amazing card. As of yesterday they fixed a driver problem with web browsing. The gaming side is very good: I have a 1440p monitor with FreeSync at 75Hz, so I cap the card at that and have had all games running at high settings. Only Wildlands drops to about 50fps, but FreeSync takes care of that - I wouldn't have noticed if I didn't have Fraps running. A very good comeback from AMD.