Intel’s new GPUs can’t afford to just be a knee-jerk reaction to Nvidia… nor a Larrabee 2.0 fail

Intel Phi, née Larrabee

I go on holiday for a few weeks and the PC tech world loses its collective head. I knew this was already a pretty bizarre year for hardware in general, but coming back into the office to find Intel and AMD holding hands in future laptops, Nvidia making $1,200 Star Wars GPUs, and Raja Koduri jumping ship to help the big blue chip giant create discrete graphics cards is almost too much to bear. 

At least nothing weird’s happened in the panel market, so check out our pick of the best gaming monitors around today.

Before I skipped merrily out of the office in October, things were pretty settled: Nvidia were about to launch a GTX 1070 Ti-shaped graphics card to soak up the GPU slack created by a lack of GDDR5X memory to marry with their GTX 1080 cards, and the red team were gearing up to wax lyrical about the AMD Raven Ridge Zen/Vega APU combo. But that fairly standard fare was about all I had to look forward to upon my return.

Now it’s November, there are Christmas ads plastered on every available screen and/or surface, and Intel have seemingly forgotten the pain of their stillborn Larrabee graphics card, and are embarking on another journey down We Can Make Discrete GPUs Boulevard. What gives? How did this happen?

In a somewhat familiar refrain, I’m going to largely blame AMD for all this. They’re the ones who shook things up by releasing a surprisingly (to Intel at least) competitive range of mainstream CPUs, a bunch of mega-core, high-end server chips, and are now promising some rather tasty mobile APUs combining Zen CPUs and Vega GPUs. They rattled Intel’s cage and the big blue had to respond.

I’m not going to pretend AMD have already stolen a whole heap of market share from Intel, in either the desktop or server markets, but they’ve got them more than a little concerned.

AMD Ryzen Ridge

You can see that in the rapidly accelerated releases (and core-count upgrades) of both the Skylake-X and Coffee Lake CPU ranges. And now in the announcement of Raja Koduri joining Intel to head up their new GPU division and “expand Intel’s leading position in integrated graphics for the PC market with high-end discrete graphics solutions for a broad range of computing segments.”

But it’s not all AMD’s fault, Jen-Hsun’s got to shoulder some of the blame, too. After all, his company is ruling the roost in terms of the burgeoning sphere of machine learning and artificial intelligence with the Nvidia Volta chips’ serious parallel computing chops. That growth market is somewhere Intel’s existing silicon can’t get a foothold, even with their high core-count Knights Landing processors. And if they want to follow the money they need to get in on the AI gravy train.

Nvidia Tesla Volta V100 GPU

Nvidia have been at it for a long while now, however, and they’ve got a hell of a lead over the competition. Intel have got a lot of catching up to do, and if they’re only really starting now with a proper discrete GPU architecture, it’s going to be years before anything graphics-y from team blue sees the light of day.

But Intel could do with a little gamer love too.

They’ve been trying to court gamers all year: first they touted their tedious Kaby Lake chips as the last word in gaming CPUs, then they assumed all gamers are streamers and tried to jam needlessly expensive Skylake-X chips down our throats, and now they’re claiming the new 900p SSDs are going to bring the Star Citizen universe to life like no other. No other what? No other high-priced storage drive which does practically nothing outside the data centre?

Gaming is a huge market and, although not in the same league as the money tree that is machine learning, PC gaming is another growth area – one of the few in PC hardware. While Intel CPUs are still the best for gamers they’re not the absolute must-have components they once were. AMD’s CPUs are almost as good for gaming and offer much greater all-round processing power for the money too.

Intel need a high performance graphics card to really capture gamers’ hearts again. Something they can point to and say ‘look, all your games look awesome because of Intel’. Shouldn’t be too hard, right? They’ve got the R&D money, in Raja they’ve got the expertise, and they’ve got the manufacturing facilities too. But then this isn’t the first time Intel have looked to make discrete GPUs… remember Larrabee?

Intel Phi... Larrabee by any other name

Those mega-core Knights Landing Xeon Phi co-processors – the ones that look like PCIe-based graphics cards – are what the abortive Larrabee GPU project devolved into. And that was only after years of failure, hidden behind lots of marketing bluster about the parallel graphics power Intel were going to unleash when Larrabee finally launched. Their discrete GPU never saw the light of day, with early samples languishing behind the ever-advancing Nvidia and ATI/AMD graphics chips of the time.

Intel can’t afford for this latest GPU venture to end up the same way. They can’t get people excited again about a third way in the graphics world only to provide chips that struggle to keep up with AMD silicon. If the writing’s on the fab wall for x86’s data centre lifespan then Intel have to properly diversify into parallel computing. This whole new venture cannot just be a sudden reaction to Nvidia; I pray it’s been long considered and has already been inked on some secretive Santa Clara roadmap for years now.

So, have Intel learned from their Larrabee mistakes? Will this time be different with Raja taking the helm? Or is this just another ill-conceived knee-jerk reaction to the increased competition from AMD in the CPU, and Nvidia in the machine learning/AI spaces? Hopefully not, as it would be great to see a third player ante up in the GPU world. But that’s why we were all so excited about Larrabee when Intel were talking that up back in the day, so it’s tough not to feel a little wary about getting our hopes up again.

Raja Groks

But then this isn’t just about making a gaming GPU. Hopefully, that’s still one of the computing segments of the ‘high-end discrete graphics solutions’ Raja’s going to be working on, but he’s almost talking about this being a project whose merest operational parameters we are not worthy to even calculate.

Maybe Raja’s going to help Intel create the first quantum GPU, maybe he’s nailing down painting images directly onto our retina, one pixel at a time. Only time will tell, but it already looks like next year is going to be just as hectic a tech year as this. Fun times ahead.