AMD RX Vega - Frontier Edition's bizarre pre-order pricing makes for some expensive cooling

AMD Vega GPU release date

Update June 19, 2017: AMD's Radeon Vega Frontier Edition is only eight days away from its June 27 launch date, and over the weekend pre-order pages went up on a couple of online retailers. And, oh boy, it ain't going to be cheap... especially as you might be paying $600 extra just for water-cooling.

Want to know what the best GPU is right now? Check out our guide to the top graphics cards available today.

The standard air-cooled version of the 16GB professional AMD Vega card was posted on SaberPC for $1,200, with the water-cooled edition available for pre-order at $1,800. Now, it's worth remembering that these are high-end workstation cards, designed for content creators and developers rather than GPUs for you to jam into your rig in preparation for Far Cry 5.

The listings have since been switched around so they no longer display prices and are now just inquiry pages, so it's entirely possible the prices aren't 100% accurate and were just best guesses. My scepticism mainly stems from the $600 price delta between the air- and water-cooled versions of the card, despite both having the same 16GB of HBM2 and what look like identical GPUs.

But damn, that's a lot of money if those prices are true. And, with AMD's Raja Koduri saying there will be gamer-focused RX Vega cards that will be quicker than these, it does call into question the rumoured pricing we heard about previously.

I can't see the top-end RX Vega card retailing at $600 if the professional version is at least twice that price. Still, we won't have long to wait to sort the truthful wheat from the rumoured chaff, as the RX Vega cards will be officially unveiled at SIGGRAPH, which kicks off on July 30.

Original story, May 23, 2017: The AMD Vega GPU architecture is the next-generation graphics silicon the Radeon-red team are readying for release this year, with the flagship Radeon RX Vega promising to deliver AMD a graphics card that can finally compete with the very top end of rival Nvidia's GPU stack.

AMD have announced the naming scheme for what will probably be the flagship Vega card - the imaginatively-titled Radeon RX Vega. It's likely to be the direct replacement for the R9 Fury cards from the last generation of high-end AMD graphics cards.

In 2016, AMD promised their Polaris graphics cards would bring their Radeon line back into the game. But while the RX 480 and RX 470 have shown impressive DirectX 12 performance against the mid-range GeForce-shaped competition, AMD have yet to release a high-end card capable of genuine 4K gaming.

This is where the AMD Vega GPU architecture comes in, aiming to jump in at the high end and provide the Radeon faithful with a serious GTX 1080 Ti contender.


AMD Vega release date

AMD Radeon Vega Frontier Edition release date

AMD have just announced the Radeon Vega Frontier Edition will be launched on June 27 this year, with the gaming RX Vega line launching at SIGGRAPH 2017 at the end of July.

AMD previously confirmed they would be releasing Vega graphics cards in the first half of 2017 and, seeing as they’ve already shown a working Vega 10 GPU at last year's New Horizon event for their Ryzen CPUs, it doesn’t look like it'll be long before the full stack gets released.

AMD also announced the Vega GPU in their professional-class Radeon Instinct deep learning accelerators at the start of December 2016. The top-end Instinct card is the MI25, which is a Vega Frontier Edition by any other name, and likely represents the fastest Vega can go right now.

AMD Vega architecture

AMD Radeon Vega Frontier Edition specs

AMD lifted the lid on their new Vega GPU architecture in a series of short videos on their YouTube channel. If you don't want to listen to a load of AMD marketing folk, we've distilled the tech essence down for you.

AMD's Scott Wasson is calling Vega "the biggest improvement in our graphics IP in the past five years." And given the details they've revealed so far, that doesn't look like too much marketing hyperbole.

The redesigned geometry engine in the Vega GPUs is promising much higher performance. "It now has the capability to process more than twice as many polygons per clock cycle than we could in our previous generation," Wasson explains.

But it's the High Bandwidth Cache and High Bandwidth Cache Controller silicon which look the most exciting, and they're all about moving beyond the limits of the graphics card's video memory. In traditional GPUs, developers have to fit all the data they need to render into the frame buffer, meaning all the polygons, shaders and textures have to squeeze into your card's VRAM.

That can be restrictive, and devs have to find clever workarounds for large, open-world games. The revolution with AMD's Vega design is to break free of those limits: the High Bandwidth Cache and its controller mean the GPU can stream rendering data in from your PC's system memory, or even an SSD, rather than having to hold it all in the card's frame buffer.
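As a rough software analogy - and only an analogy; this is not AMD's implementation, and every name below is invented for illustration - the HBCC effectively treats the card's local memory as a demand-paged, least-recently-used cache sitting over a much larger backing store:

```cpp
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <list>
#include <unordered_map>
#include <vector>

// Illustrative only: a tiny LRU page cache standing in for the idea behind
// AMD's High Bandwidth Cache -- on-card memory acts as a cache over a much
// larger backing store (system RAM or SSD), with pages streamed in on demand.
class TexturePageCache {
public:
    explicit TexturePageCache(std::size_t capacityPages) : capacity(capacityPages) {}

    // Fetch a page; on a miss, "stream" it from the backing store,
    // evicting the least-recently-used page if the cache is full.
    const std::vector<uint8_t>& fetch(uint64_t pageId) {
        auto it = index.find(pageId);
        if (it != index.end()) {
            // Hit: move this page to the front of the recency list.
            lru.splice(lru.begin(), lru, it->second.lruPos);
            return it->second.data;
        }
        if (index.size() == capacity) {
            // Miss with a full cache: evict the coldest page.
            uint64_t victim = lru.back();
            lru.pop_back();
            index.erase(victim);
        }
        lru.push_front(pageId);
        Entry e{loadFromBackingStore(pageId), lru.begin()};
        return index.emplace(pageId, std::move(e)).first->second.data;
    }

private:
    struct Entry {
        std::vector<uint8_t> data;
        std::list<uint64_t>::iterator lruPos;
    };

    // Stand-in for a DMA transfer from system memory or an SSD.
    static std::vector<uint8_t> loadFromBackingStore(uint64_t pageId) {
        return std::vector<uint8_t>(4096, static_cast<uint8_t>(pageId));
    }

    std::size_t capacity;
    std::list<uint64_t> lru;                    // most recent at the front
    std::unordered_map<uint64_t, Entry> index;  // pageId -> cached page
};

int main() {
    TexturePageCache cache(2);  // pretend the card only holds two pages
    cache.fetch(1);
    cache.fetch(2);
    cache.fetch(3);             // evicts page 1
    std::cout << "page 3 byte: " << static_cast<int>(cache.fetch(3)[0]) << '\n';
}
```

Vega's trick is doing the equivalent in hardware, transparently to the game, with system RAM or SSD playing the part of the backing store.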

AMD Vega High Bandwidth Cache and Controller

"You are no longer limited by the amount of graphics memory you have on the chip," Wasson explains. "It's only limited by the amount of memory or storage you attach to your sytem."

The Vega architecture can scale right up to a maximum virtual address space of 512TB for the graphics silicon. AMD are calling Vega 'the world's most scalable GPU memory architecture' and, at first glance, they look on the money with that claim. But it will still depend on just how many developers jump on board with the new programming techniques when Vega launches.
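A quick bit of binary arithmetic shows why that 512TB figure is such a jump - it amounts to a 49-bit virtual address space:

512TB = 512 × 2^40 bytes = 2^49 bytes, i.e. 49 bits of address space

For comparison, a card limited to an 8GB frame buffer is working within just 2^33 bytes.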

AMD Vega architecture features

This could mean the recently announced 4GB versions of AMD's upcoming RX Vega graphics cards won't suffer for having such a relatively small pool of video memory.

It might well get complicated recommending minimum GPU memory capacities in the system specs of new games once the RX Vega cards launch...

The new chip design also allows for more concurrency when processing non-uniform graphics workloads. Previous designs could leave large chunks of silicon idle while the GPU processed smaller operations, bottlenecking the graphics system. The new NCU design, though, is meant to let parts of the GPU work on smaller operations when there is spare capacity, meaning there shouldn't be as many wasted, idle parts of the chip.

This should mean more work gets done in the same amount of time as previous GPU designs. How much impact this will have on gaming workloads is difficult to say, but it could end up being important for the lower level APIs like DirectX 12 and Vulkan.

AMD Vega specs

The AMD RX Vega cards will appear as standalone named cards sitting outside the rest of the Polaris-based RX 500 series. Those essentially rebadged cards are slightly tweaked Polaris GPUs with a very modest boost to their clockspeeds; the RX Vega cards, though, are completely different beasts, from GPU to memory architecture.

The first Vega card, the Radeon Vega Frontier Edition, will use Vega 10 silicon and sees AMD coming out with a high-end halo graphics card first, with the RX Vega-branded versions following later, in the same way the GTX 1080 Ti followed Nvidia's Titan X.

The Radeon Vega Frontier Edition, then, is going to be AMD's answer to Nvidia's Titan cards, offering pro-level specs in a card that's not really designed for the consumer, but for the creators. That said, I bet there are going to be a fair few well-off AMD fans dropping cash on a Frontier Edition, in the same way the Titan GPUs have always found their way into gaming rigs.

The Frontier Edition is a 16GB card sporting second-generation high bandwidth memory (HBM2), which will give it an insane level of video memory performance. We're expecting a 2,048-bit memory bus with bandwidth around the 512GB/s mark.
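That bandwidth figure checks out against the bus width, assuming HBM2 runs at its rated 2.0Gbps per pin:

2,048 bits × 2.0Gbps per pin ÷ 8 bits per byte = 512GB/s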

AMD Vega die shot

The consumer-facing versions of the Vega architecture, though, seem to have been announced in more detail by a rather unlikely source. Apple unveiled the Radeon Pro Vega at a recent event, introducing the GPU which is set to make their new iMac Pro the fastest all-in-one machine that's ever spat out 1s and 0s. And if these aren't the specs for July's first two RX Vega cards, I'll eat a Frontier Edition...

For their part, the fruity Mac gang are getting two flavours of Vega, the Radeon Pro Vega 56 and Radeon Pro Vega 64.

AMD Radeon Pro Vega

The numbers refer to the number of next-gen compute units (CUs) the different Vega GPUs will contain, which means the top-end chip will house 4,096 cores and the lower-end chip 3,584 cores. They will also come with different amounts of HBM2 memory, in 16GB and 8GB flavours.
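The arithmetic is straightforward, because GCN compute units have always contained 64 stream processors apiece:

64 CUs × 64 stream processors = 4,096 cores
56 CUs × 64 stream processors = 3,584 cores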

It's probably not much of a stretch to think these will be the essential specs of the RX Vega chips due to be launched at SIGGRAPH next month. AMD habitually release their GPU cores in pairs, so having 64 CU and 56 CU versions for us PC gamers at launch wouldn't be much of a surprise.

The latest rumours show three RX Vega cards designed for the consumer sitting underneath the semi-professional Vega Frontier Edition. At the top is the RX Vega Nova, followed by the RX Vega Eclipse and finally the RX Vega Core.

AMD Polaris 10

AMD have said their GCN architecture can be configured to work with both HBM2 and GDDR5, so it's possible AMD will reserve the top memory tech for just the top two cards in their RX Vega 10 stack, leaving the cheaper GDDR5 to cover the potential RX Vega Core edition.

There have also been reports of a Vega 11 GPU - though the rumour mill has been surprisingly quiet on that front recently. It's possible AMD have decided the Polaris 12 GPUs will shore up the bottom end of the new 500-series range and we won't see a Vega 11 chip until later on.

There have, though, been earlier rumours of a Vega 20 GPU that AMD are working on with GlobalFoundries. The Vega 20 is reportedly going to be a 7nm GPU releasing in 2018 with up to 32GB of HBM2 and a ludicrous 1TB/s of memory bandwidth. There are also rumours of it sporting 64 NCUs, support for PCIe 4.0 and a teeny-tiny 150W TDP.
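For what it's worth, the rumoured bandwidth is at least internally consistent: if Vega 20 doubled the Frontier Edition's memory setup to a 4,096-bit bus at the same assumed 2.0Gbps per pin, you'd get 4,096 × 2.0 ÷ 8 = 1,024GB/s - the quoted 1TB/s.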

With GlobalFoundries looking to hit 7nm production in 2018, that part at least looks plausible. The rest? I'm not so sure. It looks like very wishful thinking to me.

Given AMD have shown roadmaps with Vega sticking to 14nm with a 14nm+ refresh to follow, it looks unlikely we'll see 7nm Vega. That's more likely to come with the Navi GPU architecture which is to follow it.

AMD Vega price

As well as outlining the different tiers of AMD RX Vega GPU, the recent leaks have defined pricing for them too. The top-end RX Vega Nova is rumoured to launch at $599, around $100 less than the competing Nvidia GTX 1080 Ti. Whether that means it's $100 slower we don't yet know...

The rumours also put a $499 price tag on the middle-order RX Vega Eclipse and $399 on the RX Vega Core. That will leave the RX 500-series cards looking after the mainstream, sub-$250 level, with Vega shoring up the high end for the first time since the Fury X in 2015.

With the recent pre-order pages for the Radeon Vega Frontier Edition going live in mid-June we've had our first glimpse of the potential pricing of AMD's top-end workstation GPUs. And it's rather wallet-frightening.

The standard air-cooled version was posted at $1,200, with the all-in-one water-cooled edition priced at $1,800. It's worth pointing out these seem to have been pre-order pages published without AMD's consent, as they were subsequently switched into inquiry pages without pricing. So I'm not 100% convinced of the veracity of those two prices.

The $600 delta between the air- and water-cooled versions seems incredibly steep given they are both 16GB versions with identical GPUs. That's some expensive coolant if the prices are true.

AMD Vega performance

AMD's Raja Koduri took part in a recent Reddit AMA where he confirmed there will be RX Vega cards which outperform the recently announced Radeon Vega Frontier Edition.

He was asked directly if the consumer RX version of the Vega GPU would be as fast as the Frontier Edition and responded with: "Consumer RX will be much better optimized for all the top gaming titles and flavors of RX Vega will actually be faster than Frontier version!"

Koduri also answered an earlier question about a 16GB variant of the RX Vega and gave a tantalising "we will definitely look at that..."

Considering they haven't officially announced any final specs for the gaming-focused versions of the new cards, that would seem to suggest there will indeed be a 16GB RX Vega card. And if it's going to be quicker than the Frontier Edition, we might be looking at a 1,600MHz GPU too.

He also confirmed Vega will be their first GPU to use the Infinity Fabric interconnect, which is likely how the two components of the upcoming Raven Ridge APU will talk to each other. With Vega utilising the same Infinity Fabric that lets the two Zen modules in the Ryzen processors communicate at high speed with minimal latency, it's not beyond the realms of possibility that we'll see multiple Vega GPUs connected via Infinity Fabric on a single board.

"Infinity Fabric allows us to join different engines together on a die much easier than before," Koduri explains. "As well it enables some really low latency and high-bandwidth interconnects. This is important to tie together our different IPs (and partner IPs) together efficiently and quickly. It forms the basis of all of our future ASIC designs. We haven't mentioned any multi GPU designs on a single ASIC like Epyc, but the capability is possible with Infinity Fabric."

AMD also showed four Frontier Editions running with a 16-core Threadripper making mincemeat of some seriously high-end graphics workloads at this year's Computex show in Taiwan. 

Four Radeon Vega cards with Ryzen Threadripper

We've also seen quite a lot of AMD's new Vega GPU in benchmark form so far. The 8GB HBM2 version of the GPU was shown in public at the recent New Horizon event, running in a Ryzen-powered gaming rig with the new Star Wars Battlefront Rogue One DLC. It was playing the game at 4K and consistently held above the 60fps mark.

At the recent AMD Tech Summit in China, AMD showed a side-by-side demo of Deus Ex: Mankind Divided running with the high-bandwidth cache controller off and on. The GPU-intensive demo showed a 50% improvement in average frame rate and 100% higher minimum frame rates.

AMD have also shown Doom running at 4K using the flagship Vega graphics card. At the game's Ultra graphics settings the frame rate sat at around 70fps, with a few dips below 40fps here and there. That's not far off GTX Titan X performance - no wonder Nvidia was waiting for Vega before launching the GTX 1080 Ti.

The demo of the unreleased Vega card was running the Vulkan version of Doom live at the Ryzen event, where it was shown outperforming the GTX 1080 by 10%. That demo also confirmed the 687F:C1 device ID for the Vega GPU. If that sounds familiar, it's because that designation was recently spotted in the Ashes of the Singularity benchmarking database - along with a C3 revision - offering performance around the GTX 1080 mark too.

That device ID has appeared again in a recent 3DMark Time Spy benchmark result found online. This appears to be the 8GB version of the RX Vega but, instead of the GTX 1080 performance we've previously seen, the DX12 benchmark seems to show it performing at GTX 1070 speeds.

That's maybe a little disappointing at first glance, but these are all benchmarks running on unreleased, unoptimised drivers. The leaked benchmarks have the RX Vega GPU running with a boost clock of just 1,200MHz, which puts it a far cry from the 12.5 TFLOPs the peak Vega GPU is capable of.

It's been reported that the Doom benchmarks were run on a slightly modified driver for the old Fiji GPUs, so there is potentially more headroom to come from the first Vega GPUs when they do finally launch.

The big-boy Vega 10 chip, the one that's meant to be the basis of the professional Radeon Instinct MI25 card, could potentially hit 12.5 teraflops of single-precision processing. The GTX Titan X runs to a little under 11 teraflops for its part, so even if AMD release the card with a slightly cut-down GPU compared with the one in the expensive MI25, they may still have a version able to compete with both the Titan X and the GTX 1080 Ti.
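Working backwards from that 12.5 teraflop figure, with GCN's two FLOPs (one fused multiply-add) per core per clock:

12.5 TFLOPs ÷ (2 × 4,096 cores) ≈ 1,526MHz

So the full-fat chip would need to run at around 1.5GHz - and at the 1,600MHz mooted above it would stretch to roughly 13.1 TFLOPs.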

AMD Radeon Vega Frontier Edition reveal

The most recent performance indicators, though, came from the recent financial analyst day at AMD HQ. There they showed Vega's 4K capabilities, with Sniper Elite 4 running on a single GPU bouncing around the 60 frames per second mark. That's key, because Raja Koduri made it a target for Vega from day one.

"One of the goals for Vega is that it needs to cross that 4K/60Hz barrier, a single GPU that can cross the 4K/60Hz barrier," he explains. "That seemed quite daunting two years ago."

Now, we've already seen Vega running Star Wars Battlefront at 4K/60Hz so the fact it can do the same with Sniper Elite 4 shouldn't really come as much of a surprise. What was more revealing from last night's demos, however, was the showcasing of what Vega's high-bandwidth cache controller might be able to offer the games of tomorrow. We have seen HBCC in action with Deus Ex: Mankind Divided at a previous event in China, but the Rise of the Tomb Raider test shows an even greater performance improvement.

The HBCC tech baked into Vega allows the card to stream in larger pools of data from the PC's system memory or even from an attached storage device. Right now that's incredibly useful for professional GPUs using massive data sets, but in-game, not so much. 

"Today's games are written for today's memory constraints," Koduri explains, "so we don't see many games cross the requirements for 4GB... We are building this architecture not just for one generation."

AMD Vega High Bandwidth Cache Controller demo

To show what HBCC can offer games, AMD demoed two versions of Vega limited to just 2GB of video memory: one card with HBCC enabled and one with the memory caching tech turned off. This was to simulate how the new technology can help out when the frame buffer is maxed out.

The Rise of the Tomb Raider demo showed a massive difference in both the average and minimum frame rates output by the same spec of GPU. It's the minimums that are the most interesting part, though, with the 2GB card without HBCC bottoming out at 13.7fps while the equivalent GPU running the new HBCC tech scores 46.5fps. That's well over three times the minimum frame rate, which will have a huge impact on just how smooth a game feels to play.
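That ratio is easy to check: 46.5fps ÷ 13.7fps ≈ 3.4, so the HBCC-enabled card is delivering nearly three-and-a-half times the minimum frame rate of the card without it.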

"This is going to be a big deal," Koduri explains, "when we put it in the hands of both game developers and gamers."

And he could be right... if those game developers are given enough incentive to code specifically for this AMD-only technology. As Koduri says, current games are not designed to take account of this advance, so AMD are going to have to give developers a solid reason to do so.

Comments

SkankwOn · 6 months ago

I've pretty much tied myself to Radeon GPUs after buying a Freesync monitor, so I've been awaiting Vega with hope and interest.

Could even be time to upgrade CPU/mobo/RAM, as I'm guessing my trusty old i5 2500K (at 4GHz) will likely bottleneck the GPU.

Ryzen perhaps. We'll see on both fronts.

Dave James · 6 months ago

Genuinely really excited about what AMD could be like in 2017. A serious Vega-based RX Fury and 16-thread Ryzen CPU could make for a stunning rig - and will make your Freesync monitor shine :)

b¤cek · 2 months ago

I'm also tied due to uncertainty. I'm still using an i5 3570K with a GTX 660 on a 1080p display. I want to change them all, and the performance and prices of the Vega cards will be the decision maker for the trio.

NandorHUN · 2 months ago

I see no reason to upgrade from the 2500K. If you are a 1080p player, just OC it to 4.5GHz and you are good to go, or change it to a 2600K; today's games will benefit from the 2600K, and it is still a much better buy than any other new CPU.

If you are a 1440p or 4K player, then there is no reason to change the CPU, because a bottleneck only occurs if you play on a high-end GPU at 1080p, where the CPU can't keep up with the fps. At 1440p and 4K there is more stress on the GPU and fewer frames per second, so there is no bottleneck.

That's why I bought a new 1440p monitor instead of an upgrade from my 2600K.

You should only consider an upgrade if you are playing on a 1080p AND 144Hz FreeSync monitor.

Other than that, a new CPU is just a waste of money.

Salty Mac · 2 months ago

You explain it very oddly, but you are right for the most part - all except the very beginning.

If he is playing at 1080p and upgrades to a high-end Vega, the 2500K will 100% bottleneck the GPU. I would guess even a 2600K would bottleneck if he went high enough.

Also, a new CPU is not a waste of money if you are on the 2500K. A new CPU/motherboard/RAM would bring many new features, plus speed, reliability, cooler running and energy efficiency.

SkankwOn · 3 weeks ago

Interesting replies from both 'Salty Mac' & 'NandorHUN'.

For the record, I'm gaming on a 1440p/144Hz monitor. Right now, the only game that I've actually struggled to run is PUBG (which is still in Early Access anyway).

Concerning today's news (or lack of it) about Vega's two-month wait: well, it's a bit disappointing that AMD is still lagging behind NVIDIA when it comes to the higher-end cards. VOLTA could well trump VEGA straight off the bat ... :(

vajjala1986 · 4 months ago

FreeSync monitors are much cheaper than Nvidia G-Sync ones. Unnecessarily, we have to pay $200 on average to get G-Sync over a FreeSync monitor. If AMD puts out a Vega card with performance between the GTX 1070 and 1080, but with the price tag of a GTX 1070, then I am buying it. Otherwise, I will wait for a GTX 1070/80 price drop.

Salty Mac · 2 months ago

Hey, you got your 1080 price drop - you getting one??

Keyvan · 5 months ago

If AMD can pull this off, then it'll be a nightmare for NVIDIA on the gaming market. They have more FreeSync screens, a more open standard, and better value for the money. I've said this before though and have been disappointed. That's why I'm saying "IF" they can pull it off... they still have to get the devs onboard to program for the new offerings.

fuchs · 4 months ago

Deep learning is more important than rendering some random game.

RanC · 2 months ago

And documentaries are more important than porn, but I think both of us know which one is the money maker as of right now. If you're short on cash and it makes money, you simply don't say no to it. They aren't mutually exclusive either. Neither do you say no to superb profit margins if you want to do expensive R&D. So what was your point again?

Besides, rendering some random games is what paved the road to HPC, deep learning, AI etc for Nvidia.

gamertaboo · 5 months ago

Can't believe they are going to wait until May to release Vega. They are out of their minds waiting that long. They should just release it alongside Ryzen in a month.

dwearle1 · 4 months ago

As this article indicates, it may just be a strategic marketing move by BOTH GPU manufacturers to wait for the other one to drop their next-gen GPU - if they drop too soon, they could be shooting themselves in the foot, so to speak.

RanC · 2 months ago

Yeah, I don't really see any reason to hurry. From what I can tell, AMD is promising a product that is essentially a 1080 Ti but with some features which Nvidia decided just not to bother consumers with, instead pushing them to Volta (Pascal workstation cards have them, though - looking at you, Unified Memory vs. High-Bandwidth Cache). I honestly think they did it out of greed and because they didn't perceive AMD as a big enough threat... until suddenly you start seeing GP102 chips flooding the market without being neutered, at a price you supposedly just can't refuse. AMD has only one chance to do this right; better safe than sorry anyway. It might be the comeback of the decade or the biggest disappointment of the century. If there's any room for something between those two options, I think it's going to be pretty narrow.

I'm not particularly on either side of the fence; I admire both companies currently for very different reasons. I have to say though, it's been way too long since I last owned an AM.. ATI card. It's been way too long since I updated my GPU too, but hey, it's not my fault Nvidia can't (won't, really) offer a reasonable upgrade for the 780 Ti, which has gotten me this far reasonably well.

Windows 10 · 1 month ago

If there is a card going against the GTX 1070 it cannot be over $400. There are a lot of 1070s dipping below $400 on Amazon right now.

0V3RKILL · 1 month ago

All I'm going to say is that I am glad I waited. Time to replace this Twin Frozr 290X. It's been very good to me, I have to say.

SkankwOn · 3 weeks ago

We still gotta wait my friend ... 2 months!

I have the MSI Gaming Twin Frozr 290X also.

Death of Chaos · 1 month ago

Seeing as how there's a 500MHz difference in clock speed between the chip tested and the 1080, I'd say that's not too bad. If this is a competing card for the 1070, that's still impressive in its own right, because it's hitting roughly the same scores as a 1070 at a clock speed 400MHz lower. Would that mean it would run cooler and be more power efficient than a 1070? Obviously that isn't taking into account the build of the card; it's possible it could run hotter at lower clocks. I'm still looking forward to what the actual, final card will present us with.

Shriven · 5 months ago

I just worry about Nvidia locking off driver development access to certain titles before release. Shady shit, but not having a working driver on launch day is putting me off AMD. That, and past experience.

Dave James · 5 months ago

The recent AMD drivers have been really solid and they've also been a lot better at getting launch day driver fixes out too.

I reckon if they can get the performance at a good price they deserve to do well.

Recko · 4 months ago

AMD's drivers and their ReLive software have actually surpassed Nvidia's now. AMD have also been releasing drivers on launch days, but more often they release them before a game's launch day.

joki8688 · 4 months ago

A GPU as good as a 1070, maybe a 1080 (in real applications), released a year later, when Pascal will be even cheaper and NVIDIA will release a faster version of Pascal...

And if you think those cards will be cheaper than year-old Pascal, you are CRAZY (new type of memory, for example). AMD was always cheaper for a reason: because they had older and slower products. In the best scenario we will have the same products, for the same price, just in different colours...

I'm sorry, but I don't care about Vega.

jr2 · 3 months ago

Cared enough to troll...

meLAW · 4 months ago

Mhm, I wonder if it's really that smart to bring out the top-tier model first, then fill up with the lower models afterwards. Yes, it promises more hype and more profit right away, but that strategy also demands fully operational chips prior to launch. Whereas the other way round, optimisation of the fabrication process can run in parallel with the launch sequence.

Bitdestroyer · 4 months ago

That is specifically why they released the 4xx series as budget cards... they already have offerings to compete at those levels.

Ramboy · 3 months ago

My ancient GTX 560 died recently (not totally). I can still play some games, but on low settings. I'm looking for a replacement card. Should I buy an RX 480 8GB now, or should I wait for the 500 series? And also, when is the release date of the new Radeon series?

7UKECREAT0R · 2 months ago

I would say choose whether you want a budget card right now (RX 480) or, if you want a more expensive, more powerful card, wait for the 500 series. You probably already have a new one, but just helping :)

Salty Mac · 2 months ago

Yeah, even if you got a good 480, the 580 is not a big jump in performance. Hope whatever you got is making you happy!

FC_Nightingale · 3 months ago

Well, the 1080 Ti was announced and drops in a couple of days. What's your answer, AMD??

dmoody19 · 3 months ago

The answer is "YAWN... HO HUMMMM....." SO? lol

ŊU | Xxx · 3 months ago

Waiting in patience. I promised myself never to go AMD again, but eight years later I now own a FreeSync screen thanks to Nvidia's GREED, and I'm waiting for the new GPU release :) Not going back on that unless AMD really disappoint me again, which I hope they don't... Looks like AMD have done a great job now and can compete on CPUs too.

Give us a release date :)))

ju-uh77 · 1 month ago

I agree with you on Nvidia's greed. It'll be a cold day in hell before I ever spend the cash they want for a G-Sync monitor. On the other side, I also think AMD was dumb as shit for giving away FreeSync. They should have charged $45 for it, so it didn't adjust the price of the monitor much, and they could have made some extra cash for R&D rather than giving the chip away, with all the monitor makers getting it free and still jacking up the damn prices. I like the free mindset, but I've seen it over and over again either not get adopted or some other company get fat off the freebies.

wiesner8 · 2 months ago

Does this mean I need to sell my dual R9 Fury X cards? :(

Salty Mac · 2 months ago

No way, dual R9 Fury Xs still do some work in games.

[HFA]Dragonstongue · 2 months ago

Did you seriously state "WCCFTECH as from the always trustworthy"? LMAO. Even a blind squirrel can manage to get a few nuts now and then; I would not trust one to feed me, however.

Dave James · 1 month ago

Hehe, no it really wasn't serious. Was hoping folk might sense the sarcasm there ;)

daroule1982 · 1 month ago

Couldn't have happened at a better time. I just went shopping for a new GPU after I found out my GTX 750 Ti wasn't so great for game modding. This is going to arrive just in the nick of time!

Teemu · 3 weeks ago

It is an overheater - more watts than the Ti, with only the basic cooler of the Ti. Why wouldn't the overheating issues of the Ti limit Vega?

UbajaraMalok · 3 weeks ago

Those guys at AMD are fucking kidding! No Vega until the end of July!
