Consoles held The Witcher 3's graphics back, but PC gaming is far from cursed

Even if flawed, The Witcher 3 is still spell-binding

The Witcher 3: Wild Hunt has finally seen its long-awaited release, and by all accounts, happiness should reign. Instead, its launch has been soured by one accursed word - downgrade.

In a nutshell, the final game doesn't look as good as its early footage suggested, and to add insult to injury, the reason is plain: while the first two games were developed for PC - the second a genuine graphical showpiece - The Witcher 3 came to PC, PS4 and Xbox One simultaneously.

Console-ing words

Even CD Projekt has agreed that, had this not been the case, the game could have looked a lot prettier and taken advantage of more technology. That said, the studio has also commented that without the console versions, there wouldn't have been the budget for the game at all.

It's an unfortunate situation all round, not least because The Witcher 3 still looks amazing. Stunning, actually. It's a game of beautiful animation and loving detail, of blood-red sunsets and glimpses of far-off hills. Pixel for pixel it looks similar to the previous game, released in 2011, but that was a far more restricted experience set in much smaller zones.

Resources can only be spread so far, whether financial or technical, and the last game still looks stunning. Not many games hold onto their looks for four years, especially when up against the might of companies like EA and Dragon Age: Inquisition. To hear some people talk, you'd think The Witcher 3 was using low-resolution pixel art. No, no, no. It's top-tier graphics, all the way down.

Just hearing that things could have been better, though, really spoils the experience. Suddenly every texture seems blocky, every clipped hair becomes a missed opportunity. Flat grass effects that would have gone unnoticed, or even earned a polite "Oooh", suddenly feel retrograde. Insulting, even. What is this feeble offering placed before our eyes? We were promised a literal portal into another world, goshdarnit!

Frame hate

The Witcher, of course, isn't the only victim of this - a more deserving example is Dark Souls 2, where lighting and graphical fidelity were sacrificed late in development to the need for a better framerate. The problem, though, tends to be less a question of developers trying to deceive the public, and more that development is an inherently messy business.

Generally, when we see games years before release - the first shots, the first trailers, the E3 demos - we're actually seeing a construct. A fake, if you will. Such games are typically running on a ludicrously powerful PC carefully hidden behind the scenes, with content designed as a snapshot of what the developer intends to make rather than an actual slice of a game that, by definition, isn't finished. That often means changes, for good and for bad.

BioShock Infinite, for instance, bears almost no relation to its original demo, a high-octane chase full of reality-warping magic. The mission in the Watch Dogs reveal trailer simply isn't the one players encountered in the final game, which looked nowhere near as good and proved far less interesting in its details.

Load of bullshot

It's rare, though, for this to be deliberate, unlike practices such as rendering out super high-resolution screenshots and then polishing them further in Photoshop - images dubbed 'bullshots' by Penny Arcade. Live demos are intended to show what the game will look like, or at least to set a level of aspiration on both sides of the screen. Otherwise, they'd just be renders.

After the show, though, two key factors often step in: optimisation, so that the final game actually runs well - assuming that's even possible - and the question of whether the tricks used actually work across an entire game rather than just an individual section. A particular rain shader, for instance, might look fantastic at night but godawful on a sunny day, and need a total rewrite.

In the case of The Witcher 3, CD Projekt has discussed experimenting with two rendering systems - one that looked better in key areas, yes, but another that looked better across the entire world. It's not surprising that the studio ultimately went for the latter. Decisions like this are quietly made all the time, and we rarely even notice.

In most cases there's little reason to care, save that modern gaming culture's push for endless screenshots and videos, with the same images cropping up again and again, exposes us to these games so constantly that by the time they arrive, it's as if we've played them already.

On PC, it's particularly irritating when a game fails to use our often mighty machines to the full. Still, we should look at the positives. Aside from the occasional coughed-up console port, we're never asked to settle for crap framerates or 900p graphics, and tools and mods offer incredible scope for picking up where the developers left off - just look at what modders are doing with the creaky old GTA IV, never mind the projects already under way for GTA V.

The Witcher 3 is already getting the upgrade treatment, with its first PC patch offering some improvements, and later versions promising to let players tweak ini files and crank up the detail. It's also, of course, possible to play it in 4K right now, provided you have a suitably nuclear-powered PC at your disposal.

As E3 rolls around for another year, though, it's worth remembering exactly what we're seeing - hopes, dreams, and yes, marketing. Things change, but rarely because someone in a suit is cackling at the thought of murdering gamers' hopes and dreams two years down the line.

As with all technology purchases, what really matters is the end result. Have the compromises and cutbacks resulted in something not worth the money? Then that is indeed a problem. When the result is still great, though, perhaps it's time to cut everyone a little slack.