The Intel Developer Forum is a key event in the technology calendar. As the name suggests, it's primarily focused on helping developers learn to use Intel's hardware, but it's also an opportunity for Intel to drum up interest in its latest hardware and technological advances.

Importantly, the event gives everyone who attends a glimpse into the future – what's about to be released, and what's on the horizon for the coming months and years.

There's always a theme at IDF, and sometimes it even makes sense ("Visual Computing" was quite catchy). This year's was "Building a Continuum of Computing", whatever that means.

We were there and we're still not sure what it means beyond a trendy marketing term for letting Intel create everything you use. That's becoming increasingly likely, too, as the chip giant branches out from being a straight CPU manufacturer into more consumer-focused territory: mobile phones, graphics cards, set-top boxes and even software.

One of the highlights of the show as far as the "continuum" is concerned was a mobile phone running Moblin v2.1 that would make any iPhone fanboy take a second look.

Capable of running several applications at once, and boasting a user interface far friendlier than anything that's come out with a Microsoft badge attached, the phone showed the real potential of a Linux-based operating system backed by Intel's cash.

Larrabee in the flesh

This year's IDF saw the processor giant finally showing working silicon of its hotly anticipated graphics core, codenamed Larrabee. Details of the actual GPU ranged from sparse to non-existent, but the fact that Intel has working silicon is a sign that it's still on course (albeit a revised course) to have cards on shelves some time in 2010.

The demo was run on Gulftown too, Intel's six-core (12-thread) 32nm Nehalem processor, which offers a high-performance upgrade path for owners of X58 motherboards. The brief demonstration used data from the ageing Enemy Territory: Quake Wars to render a scene, in 'real time', of ships flying above a rusting boat barely floating on ray-traced water.

Bill Mark, who was brought on stage for the demonstration, said that the reflections required only ten lines of code, which is certainly impressive. Larrabee exists, then, and for that it can be proud.
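Intel didn't release the demo's source, but the claim is plausible: in a ray tracer, a mirror reflection really is just a line or two of vector maths, after which the new ray is simply cast back into the scene. Here's a minimal, hypothetical C++ sketch of that core step (to be clear, none of this is Intel's actual Larrabee code):

```cpp
// Hypothetical sketch of why reflections are cheap in a ray tracer.
// This is the textbook idea in plain C++, not Intel's demo code.
#include <cmath>
#include <cstdio>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
};

double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// The heart of it: a mirror reflection is one line of maths,
// r = d - 2(d.n)n, where d is the incoming ray direction and
// n is the (unit) surface normal.
Vec3 reflect(const Vec3& d, const Vec3& n) {
    return d - n * (2.0 * dot(d, n));
}

int main() {
    Vec3 incoming{0.0, -1.0, 1.0};  // ray heading down towards the water
    Vec3 normal{0.0, 1.0, 0.0};     // flat water surface, normal pointing up
    Vec3 r = reflect(incoming, normal);
    std::printf("reflected ray: (%g, %g, %g)\n", r.x, r.y, r.z);
    // A ray tracer then re-casts r into the scene and shades whatever it
    // hits - which is why "ten lines" for reflections is believable.
}
```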

It's a shame, then, that the demo was so slow... and, well, unimpressive. Apparently that's partly because the code wasn't developed for Larrabee at all, but was ported across simply to show that the chip works. It's not so much what was shown, more what wasn't.

Real-time ray tracing may be a funky demonstration of the power available in multi-core processors, but as far as a graphics card is concerned it's about as far removed from real gaming capability as... well, PhysX.

Importantly, we still don't know whether Larrabee is going to be able to deliver anything playable. With the launch of DirectX 11 already upon us, it would have been good to see how Larrabee handles the swathe of DirectX 10 games that are out there. Even DirectX 9 would have been better than some proprietary engine, however slowly it ran.

There was no frame counter, but at a guess we'd say it was barely into double figures. For Larrabee to be taken seriously, Intel needs to show that it can play current games at a reasonable frame rate.

Ideally it would be the fastest graphics solution out there, but realistically that may be a little hard for Intel to manage: AMD's latest graphics cards have just hit the shelves, and by the time Larrabee arrives Nvidia will have its DX11 part out too.

Larrabee isn't dead in the (ray traced) water, but it's hard to tell at this point whether it's waving hello, or drowning on its own ambition.

Talking up the Digital Home – again

Intel has had a digital home strand at IDF for years now, but this year's was probably the most impressive yet as far as real hardware is concerned.

Forget the social science experiments, and the idea that we'll all carry cumbersome MIDs around so that content can be aimed at whatever's currently on our playlists; the real need is for a set-top box that's capable of playing back HD television.

One with a good interface, that can handle multiple HD video streams and has a few aces up its sleeve. Intel's first platform to tackle the problem, the CE3100, effectively got the ball rolling for how an Intel-based machine could serve up television, and it's this platform that was on display playing games.

Even so, an update is already on the way. The new Atom-based CE4100 is a much more powerful platform, both for handling the raw video streams and for combining system-generated graphics with that content – for high-resolution program guides and for far more interesting interfaces.

Intel demonstrated the capabilities of the new CE4100 platform playing back movies and TV shows using an interface you'd actually want to use – streaming video wrapped around smooth 3D surfaces, and a voice-recognition navigation system that worked without you having to speak at a word an hour. It was good, really good – and it was demo'd with the help of LeVar Burton (Geordi La Forge from Star Trek).

You could question the need for a graphics core on a glorified PVR, but it does let Intel and system builders create some funky effects and interfaces. A large drum wrapping videos onto curved surfaces produces some appealing visuals, while TransGaming used the graphics capabilities of the CE3100 to run older PC games. World of Goo was shown off as an example of what's possible, with PopCap titles on show too.

Admittedly such games aren't that demanding, but they're great examples of titles you'd actually want to play in front of your TV.

And the future?

Intel ended the show with Justin Rattner continuing the television theme, but this time with an added dimension. Just about every tech event seems to be wheeling out 3D screens at the moment, and IDF is clearly no different.

It's a shame, then, that it still feels so utterly clunky: not only do you have to wear silly glasses to appreciate the effect, but the resolutions are woeful and riddled with aliasing, the actors look like animated cardboard cut-outs, and the levels of depth just aren't subtle enough. Do people really want to watch American football in 3D? U2 live in concert? It's intriguing for a few seconds, but it quickly feels like a gimmick.

As Sean Maloney stated in the second keynote, this IDF was more about integration than innovation – tweaking and refining the underlying technologies that will form the basis of future platforms and tech rather than introducing lots of new chips and systems.

Not content with showing off several of its next-generation chips built on its 32nm process (Clarkdale, Arrandale and Gulftown), Intel is well on the way to its 22nm production process too, holding up a wafer of SRAM chips boasting 2.9 billion transistors apiece. It's already tackling the challenges that 15nm throws up as well.

Even so, this IDF drummed up more questions than it answered. Can Intel trump the phenomenal architecture that is Nehalem? Will manufacturers start using more of its technologies in phones and set-top boxes? And seriously, does Intel think Larrabee will offer a real challenge to Nvidia and AMD? We'll see…

-------------------------------------------------------------------------------------------------------

First published in PC Format Issue 233
