AMD Radeon RX Vega 56 review: even if you could buy one, you probably shouldn’t… yet


AMD’s rule-of-two means that every new GPU they design gets released as a pair of graphics cards. That means, following on from the Radeon RX Vega 64, we’re now prodding the compute units of their lower-spec RX Vega 56. 


Bless the ol’ Vega architecture, it’s had a bit of a hard time of late. Since it was first announced, in hushed whispers, as the first high-end Radeon GPU in an age, it’s been hailed both as the saviour of AMD’s graphics division and the architecture which might damn it. Honestly, it’s neither.

Many of its struggles can be traced back to the choice of HBM2 as the memory technology used across both the professional and consumer variants of the Vega design. If AMD had simply opted for traditional GDDR5/X memory, Vega would have been here a lot sooner and, importantly, a lot cheaper.

The lack of stock and confusion over pricing have left a bit of a nasty taste in the mouth, with accusations flying around that the original $399 price given to reviewers for the RX Vega 56 was only ever a short-term deal. Both Vega cards are still as rare as a reasoned Twitter debate, keeping prices artificially inflated, especially in the US. But stock is set to start rolling in soon, hopefully driving down current prices, so how does the second-tier Vega stack up?


AMD Radeon RX Vega 56 specs

AMD Vega 10 GPU

There’s not a huge difference between the GPUs at the heart of the top-end RX Vega 64 and this lower-caste RX Vega 56. As the name suggests, the second-tier card has 56 compute units (CUs) compared with the 64 CU count of the higher-spec card. That means AMD have jammed 3,584 GCN cores into the RX Vega 56 instead of the previous card’s 4,096 GCN cores.
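
Those figures are easier to sanity-check than they look: each GCN compute unit contains 64 stream processors, so the core counts - and the gap between the two cards - fall straight out of the CU numbers. Here’s a quick back-of-the-envelope sketch in Python (the 64-cores-per-CU figure is the standard GCN layout; nothing below is taken from AMD beyond the CU counts above):

# Each GCN compute unit (CU) contains 64 stream processors,
# so the core totals fall straight out of the CU counts.
STREAM_PROCESSORS_PER_CU = 64

vega_56_cores = 56 * STREAM_PROCESSORS_PER_CU   # 3,584
vega_64_cores = 64 * STREAM_PROCESSORS_PER_CU   # 4,096

core_deficit = 1 - vega_56_cores / vega_64_cores
print(f"{vega_56_cores:,} vs {vega_64_cores:,} cores "
      f"({core_deficit:.1%} fewer on the RX Vega 56)")
# prints: 3,584 vs 4,096 cores (12.5% fewer on the RX Vega 56)

That 12.5% core deficit is worth keeping in mind for the performance section later on.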

Other than that it’s purely a case of some tweaked power settings and lower clockspeeds - everything else is identical, down to the cooling design. That means we’re still talking about 8GB of HBM2 video memory and a 2,048-bit bus. It’s the same on-die configuration that makes the Vega 10 GPU a 486mm² mammoth (the GP104 used in the GTX 1080 is just 314mm², for comparison) with 12.5bn transistors inside it.

There’s no liquid-chilled version of the RX Vega 56, nor are there any super-sexeh Silver Shroud variants. From the looks of things, AMD only made maybe five of either anyway, and I think those just went out to friends and family…

In terms of the actual Vega GPU architecture, we’ve gone into detail about what AMD’s engineers have been trying to create in our RX Vega 64 review. Essentially, they’re calling it their broadest architectural update in five years - since they unleashed the original Graphics Core Next (GCN) design - and it brings with it some interesting new features.

One of those is the Infinity Fabric interconnect, which binds the GPU to all the other components in the chip and makes for a more modular architecture than previous generations; it’s also the key to building multi-GPU packages in the future. Then there’s the new compute unit design and AMD’s high-bandwidth cache controller (HBCC), both offering the promise of the tech improving as more developers start taking advantage of it.

AMD Radeon RX Vega 56 benchmarks

DirectX 12 benchmarking

AMD Radeon RX Vega 56 performance


The big challenge for the RX Vega 56 - outside of trying to magic up affordable stock for people to actually buy - is to push past the similarly priced Nvidia competition. Just as the RX Vega 64 is priced to go head-to-head with the GTX 1080, the RX Vega 56 has been released into the wild with the GTX 1070 in its sights.

And, like the RX Vega 64’s attempts to overthrow the GTX 1080, it’s kinda difficult to call an outright winner. As we said about the flagship Vega GPU, it’s an architecture that seems to have been designed primarily for a gaming future that hasn’t arrived yet. As such, it’s rather lacklustre in its legacy gaming performance.

With games built using the last-gen DirectX 11 API, the second-class Vega is left trailing in the wake of the GTX 1070. But when you start to bring in tests based around the newer DirectX 12 and Vulkan APIs, Vega’s more modern architectural design allows it to take the lead.

Unfortunately, despite DX11 now being very much a legacy API, and DX12 being over two years old, the majority of PC games are still being released using the older system. That means, for the majority of PC games that come out over the next six months at least, the GTX 1070 is likely to retain its performance advantage.

Granted, Vega’s performance improvements in DX12 and Vulkan are encouraging for how it’ll fare down the line, as those become the dominant APIs. But right now anyone lucky enough to find an RX Vega 56 at a decent price is still paying the same money for a card that struggles to beat a smaller, more efficient, year-old GPU in pretty much every game in their Steam library.

That’s not the only competition for the RX Vega 56, however; there’s also the small matter of fratricide. As is their wont, and despite the $100 difference in their respective SEPs, AMD haven’t made sweeping changes to the core configuration of the two Vega variants. Essentially, the RX Vega 56 is operating with 12.5% fewer cores, yet in performance terms it’s only ever around 7-10% slower.

That means there’s a good chance gamers will end up going for this mildly chopped-down Vega instead of the much more expensive card. Or there would be, if the cards were consistently available at reasonable prices. We’ve seen etailers with RX Vega 64 SKUs on sale for around the same price as - sometimes less than - that shop’s cheapest available RX Vega 56.

Yeah, Vega’s weird.

AMD Radeon RX Vega 56 verdict


The overall design of the Vega architecture seems to have been about laying down a marker, defining a branching point for future generations of their GPU technology. They’ve almost sacrificed legacy gaming performance for the promise of future applications of the new feature set. The little extras baked into the Vega architecture look like they could be genuinely game-changing… but only if developers actually end up taking advantage of them.

If it were guaranteed that the HBCC and Rapid Packed Math shenanigans were going to be employed across the board, and not just by AMD best-buds Bethesda, then the new Radeon tech would definitely be the one to go for over the old-school Nvidia design. But it’s not a sure thing, and we don’t really know what performance improvements these Vega features might offer either.
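
To put a rough number on what Rapid Packed Math could mean, here’s a back-of-the-envelope sketch in Python, assuming the RX Vega 56’s quoted 1,471MHz boost clock. The doubling only applies to shader work that can get away with 16-bit precision, and these are theoretical peaks rather than measured results:

# Vega's Rapid Packed Math packs two 16-bit operations into each 32-bit
# ALU lane per clock, doubling theoretical throughput for FP16-friendly work.
def peak_tflops(cores: int, boost_clock_mhz: float, packed_fp16: bool = False) -> float:
    flops_per_clock = 2           # one fused multiply-add counts as two FLOPs
    if packed_fp16:
        flops_per_clock *= 2      # two FP16 values per 32-bit lane
    return cores * flops_per_clock * boost_clock_mhz * 1e6 / 1e12

# RX Vega 56: 3,584 cores at its 1,471MHz reference boost clock
print(round(peak_tflops(3584, 1471), 1))                     # ~10.5 TFLOPS FP32
print(round(peak_tflops(3584, 1471, packed_fp16=True), 1))   # ~21.1 TFLOPS FP16

On paper, then, the FP16 rate is double the FP32 rate - whether developers beyond Bethesda actually lean on it is the open question.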

Wolfenstein II: The New Colossus is going to be an interesting barometer on that front, and the performance improvements Far Cry 5 sees on AMD cards will arguably be the most important metric for seeing how Vega’s ‘fine wine’ gamble manifests itself. But both are a way off, and of no comfort to anyone looking at AMD’s latest GPUs right now.

If you were spending around $400 on a graphics card today - even if there were RX Vega 56s available at their suggested price - it would still be difficult to make the Radeon recommendation. It’s the more advanced architecture, but in raw performance terms the smaller, slightly cheaper, more efficient Nvidia GPU is likely to get you higher frame rates in more of the games you’re playing at the moment.

If AMD could have priced Vega more aggressively against the still-strong, year-old Nvidia Pascal architecture it would have had a better chance of taking the market by storm, but unfortunately the price of HBM2 is a big sticking point. And with AMD reportedly losing $100 on each card sold at the suggested price, it looks like they’ve already done all they can on that front.

I do like what AMD are trying to do with Vega but, all things being equal, sacrificing current performance for the chance of higher frame rates in a few future games is likely to be too much of a stretch for most PC gamers.

DuoBlaze · 1 week ago

Those total power draw numbers are not at all like what I've observed. I've had a Vega 64 liquid card in my PC with all default WattMan settings for weeks. During peak (spikes - meaning above average) utilization I still have not exceeded 475W TPD from my entire PC. With my RX 480 it never exceeded 400W. Either that 345W number is wrong or I've got some major headroom for overclocking.

As it goes right now, using the balanced profile, the radiator fan is almost inaudible while gaming - barely beyond the noise level of my chassis fans - and the card runs much cooler than my air-cooled RX 480 did.

I feel like reviewers are not being accurate with power draw, noise and heat numbers.

Dave James · 1 week ago

I may be wrong here, but I think you might be getting confused between total platform power draw and the rated thermal design point (TDP) of the cards.

The TDP of your liquid-chilled Vega 64 is 345W, essentially meaning its cooling has been designed to dissipate that level of energy use under load.

The total platform power draw shown in the benchmarks above represents the amount of energy being drawn from the wall by our test rig's PSU. It was taken using a discrete energy meter while running Battlefield 4 at max settings at 1440p - the same method I've used for all of my GPU tests for a good few generations of graphics card.

The temperature readings are taken from Afterburner, which we leave running throughout the entirety of our testing suite to catch the maximum temperature, as well as long-term GPU frequencies.

Anakhoresis · 1 week ago

Well, Tom's Hardware did report that they recorded peaks of 385W from the standard air-cooled card (just the card, not including the rest of the system components). Though their monitoring equipment is probably more sensitive to peaks.

mikeweatherford7 · 1 week ago

These cards respond extremely well to undervolting. The difference can be dramatic: higher stable clocks at much lower temps and lower fan levels. My reference Vega 56 can do 1,550MHz (actual full 3D load clock) at 1,040mV, with a 950MHz HBM clock. At these settings it's significantly faster than a 1070, on the heels of a 1080, at reasonable temps and noise levels. This result seems to be typical among Vega 56 owners.

project17 · 1 week ago

I must say the RX Vega 56 is an amazing card. As of yesterday they had a driver problem with web browsing, which has been fixed. The gaming side is very good: I have a 1440p monitor with FreeSync at 75Hz, so I cap the card at that and have had all games running at high settings. Only Wildlands drops to about 50fps, but FreeSync takes care of that - I wouldn't have noticed if I didn't have Fraps running. Very good comeback from AMD.
