The big picture
You could argue that you're not going to want to play Counter-Strike or Battlefield on a portable console with its dodgy joysticks, but local streaming also allows you to pipe games to your HDTV. And if you've got a nice HDTV, why wouldn't you want to game on it?
Intriguingly, Nvidia's game streaming will be compatible with Valve's Big Picture Mode feature in Steam. So, you'll be able to access your entire Steam library via an interface that's been tweaked and optimised for big-screen gaming. That includes Valve's nifty daisywheel text input that makes life a whole lot easier if you're exclusively a gamepad rather than a keyboard kind of person.
However, we've not yet seen in detail how Nvidia plans to handle big-screen gaming for non-Steam titles, and whether it plans to build a custom interface or simply do the Project Shield PC Connection app writ large across a huge display.
When you start thinking along those lines, there's a sudden realisation: Build a single powerful PC and it's like you have awesome gaming rigs all over the house. You don't need a dedicated gaming PC in the living room. You just stream games to the TV. That's definitely an interesting point, because it's hard to imagine Nvidia doing anything that might encourage people to buy fewer of its graphics cards - but that's exactly what its streaming technology seems to do.
As for how AMD fits into this equation, at the moment it looks like it doesn't. AMD certainly doesn't have a device anything like Project Shield up its sleeve. And it's having a hard enough time keeping its nose above water in its core graphics and CPU competencies, without taking on the challenge of developing local streaming games tech as well.
If anything, the notion of games streaming just underlines how tough the technology industry is if you start to fall behind. If the green one's game streaming via the PC takes off in a big way, it's difficult to see how AMD is going to respond.
That said, there could be one big barrier to success for Nvidia. We've a feeling that it may somehow limit this local streaming tech to devices powered by Nvidia Tegra chips. If so, the appeal and utility will be massively reduced.
Nvidia has a track record of attempting to run that kind of walled garden. Here's hoping it realises how much better a technology this will be if it's made as open as possible. After all, you'll still need an Nvidia GPU in your PC, so it's a market share-boosting play at the very least.
Up in the clouds
If that's local game streaming and Project Shield, what about Nvidia's plans for cloud gaming? Let's be honest, OnLive and Gaikai have hardly set our gaming pants on fire. They're interesting as a glimpse of one possible future, and it's kind of impressive what's been achieved. But so far, it's been nowhere near good enough to tempt us from our PCs.
But if anyone can do that, surely it's Nvidia? To that end, Nvidia has designed a server specifically for cloud gaming, known as GRID. It's a self-contained, ready-to-roll unit in a standard 2U form factor. Each server packs 12 Nvidia GPUs, with each GPU capable of supporting two users in parallel - that's 24 users per server. Stack 20 servers into a rack and you're looking at 480 users per rack.
Oh, and each server soaks up 800 to 900 watts. So power consumption is what you might call non-trivial - it's as much as 18,000 watts per rack. Nvidia reckons the system works best with a management server for each rack of 20. The OS is Windows Server, so we're talking standard PC games taken right off the shelf.
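The rack-level figures above follow directly from the per-server numbers Nvidia quotes. A quick back-of-the-envelope sketch (using the upper end of the quoted power range):

```python
# GRID rack maths from the figures Nvidia quotes:
# 12 GPUs per 2U server, two concurrent users per GPU, 20 servers per rack.
gpus_per_server = 12
users_per_gpu = 2
servers_per_rack = 20
watts_per_server = 900  # upper end of the quoted 800-900W range

users_per_server = gpus_per_server * users_per_gpu    # 24 users per server
users_per_rack = users_per_server * servers_per_rack  # 480 users per rack
watts_per_rack = watts_per_server * servers_per_rack  # 18,000W per rack, worst case
watts_per_user = watts_per_rack / users_per_rack      # 37.5W per concurrent player

print(users_per_rack, watts_per_rack, watts_per_user)
```

Divided out per player, 18,000 watts across 480 users works out at a surprisingly modest 37.5W per head - far less than a gaming desktop draws.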
To combat lag, Nvidia is attacking the problem from multiple angles. For starters, it encodes the output video locally on the server. OnLive, by contrast, shunts that job onto an external server. Nvidia reckons doing the job locally saves 20ms.
Nvidia also claims that graphics driver optimisations enable its video capture to shave 10ms off the competition - so we're already talking a 30ms advantage. If you have an Nvidia GPU of some kind on the client machine displaying the game, take off yet another 10ms. It's beginning to add up.
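The claimed savings in the two paragraphs above stack up like this (all figures are Nvidia's claims, measured against an OnLive-style pipeline):

```python
# Nvidia's claimed latency savings versus an OnLive-style rival, in milliseconds.
savings_ms = {
    "on-server video encode (vs shunting to an external server)": 20,
    "driver-optimised video capture": 10,
    "Nvidia GPU in the client device": 10,  # only applies if the client has one
}

# The first two are server-side and apply to everyone.
server_side_ms = sum(v for k, v in savings_ms.items() if "client" not in k)  # 30ms
total_ms = sum(savings_ms.values())  # 40ms if the client also has an Nvidia GPU

print(server_side_ms, total_ms)
```

So every GRID user allegedly gains 30ms over the competition, rising to 40ms if there's Nvidia silicon at the receiving end too.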
But wait, there's more clever streamlining going on. Nvidia has designed GRID to give the player as consistent a gaming experience as possible. That means no change in the resources available to each user, no matter how many are hooked up.
There's no dynamic allocation of processor cores or memory space based on the server load. That's inefficient if you're playing the normal cloud-based numbers game and looking for as many users as possible, but it becomes critical when the intention is to pump out games in real time.
As for bandwidth, Nvidia reckons a 6Mbps connection is good enough for 720p gaming. There's no word yet on how big a pipe you'll need for 1080p, but it packs about 2.25 times the pixels of 720p, so something north of 10Mbps seems a reasonable guess.
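A crude way to estimate the 1080p requirement is to scale the 720p bitrate linearly with pixel count - real codecs don't scale perfectly linearly, so treat this as a rough sketch rather than Nvidia's own figure:

```python
# Rough 1080p bitrate estimate, scaling the quoted 6Mbps 720p figure by pixel count.
# Assumption: bitrate scales linearly with resolution (a simplification).
mbps_720p = 6
pixels_720p = 1280 * 720    # 921,600 pixels
pixels_1080p = 1920 * 1080  # 2,073,600 pixels

scale = pixels_1080p / pixels_720p        # 2.25x the pixels
mbps_1080p_estimate = mbps_720p * scale   # 13.5Mbps

print(scale, mbps_1080p_estimate)
```

That lands around 13.5Mbps - comfortably north of 10Mbps, and still within reach of a decent home broadband connection.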
However, one thing GRID isn't is a cheap ticket to super high-end visuals. Pinning down an exact equivalent in terms of GPU performance - especially when comparing a desktop PC to a GRID user - isn't straightforward due to the sharing technology. But if you take Nvidia's claims of 200 TFLOPS per rack and extrapolate the numbers, you're looking at a low-end experience - perhaps as shonky as a GeForce GT 640.
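The extrapolation is simple enough to do yourself. Note the GT 640 figure in the comment below is an approximate rating, not an official Nvidia comparison:

```python
# Each user's raw-compute share of Nvidia's claimed 200 TFLOPS per rack,
# assuming the 480 users per rack are all online at once.
rack_tflops = 200
users_per_rack = 480

tflops_per_user = rack_tflops / users_per_rack  # roughly 0.42 TFLOPS per user

print(round(tflops_per_user, 2))
# For scale: a desktop GeForce GT 640 is rated at roughly 0.7 TFLOPS (approximate
# figure), so each GRID user's raw-compute share sits below even that low-end card.
```

Of course, raw TFLOPS is only one axis of performance, as the next paragraph explains.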
The reality will probably be a bit better than that, since the raw TFLOPS measure doesn't take into account things like memory bandwidth, which is a real killer on low-end desktop GPUs. But the overall implication is that even with online streaming delivered by GRID servers, having your own desktop will still be by far the best way to get great graphics.
And that's why of the two models being proposed, it's the local streaming using your own PC that's by far the most exciting prospect, at least in the short to medium term. One day in the distant future, no doubt it'll all be in the cloud - the graphics will look pre-rendered and the PC as a gaming device will be dead.
But for now, having a single PC delivering high-end visuals to devices all over your house seems like one hell of an idea. Local game streaming - it makes all of your devices look awesome.