The first service pack for Microsoft Surface, announced this week at Microsoft's TechEd conference, includes interface improvements and ways to make building Surface applications easier. It should also bring Surface-style apps and common gestures to Windows 7.
Surface can run multiple applications and services at once, but until now there hasn't been an easy way to tell if a background app wants your attention; SP1 adds a notification on the side that doesn't get in the way of what you're currently doing.
It also makes Surface easier to set up (you can calibrate without plugging in an external display) and to manage (it uses Windows Update and Microsoft's standard deployment and management tools), which may mean we see more Surfaces in more places. SP1 also makes Surface available in another 12 countries across Europe and the Middle East.
One of the interface changes is what General Manager Brad Carpenter has nicknamed "no touch left behind": visual feedback that shows you where on the Surface you're touching, so you can see if you're missing the object you're trying to select. And if you try to zoom something that can't be zoomed, Surface shows lines leading away from the object so you know it recognised the gesture but can't apply it.
GET FEEDBACK: Visual feedback of how you're touching Surface should make it easier to get the hang of gestures.
Tags on objects can now trigger specific apps and new visual and sound cues make the access points in the corners easier to find (many users play with the first Surface app they see and can't work out how to move to the others). "Our testing shows these help people figure out Surface manipulations faster than they otherwise would," says Carpenter.
SP1 includes new and improved multi-touch controls, including a library view and a circular menu, that Surface developers can use to build applications more quickly. Those same controls will also become available for .NET 4 "at a later time", making it easier to create apps that run on both Surface and Windows 7. The code for the 'manipulation processor' on Surface that determines how objects move as you manipulate them will be going into .NET 4 as well.
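The article doesn't show the manipulation processor's API, but the idea it describes – turning raw touch contacts into translation, scale and rotation deltas for an object – can be sketched generically. The function below is a hypothetical illustration in Python, not the actual Surface or .NET 4 code, and the function name and two-contact simplification are this sketch's assumptions:

```python
import math

def manipulation_delta(p1_old, p2_old, p1_new, p2_new):
    """Hypothetical sketch of what a manipulation processor derives from
    two touch contacts moving between frames; the real Surface/.NET 4
    API exposes richer events and handles any number of contacts."""
    # Translation: how far the midpoint between the two contacts moved.
    mid_old = ((p1_old[0] + p2_old[0]) / 2, (p1_old[1] + p2_old[1]) / 2)
    mid_new = ((p1_new[0] + p2_new[0]) / 2, (p1_new[1] + p2_new[1]) / 2)
    translation = (mid_new[0] - mid_old[0], mid_new[1] - mid_old[1])

    # Scale: ratio of the distances between the contacts (pinch/stretch).
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])
    scale = dist(p1_new, p2_new) / dist(p1_old, p2_old)

    # Rotation: change in the angle of the line joining the contacts.
    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])
    rotation = angle(p1_new, p2_new) - angle(p1_old, p2_old)

    return translation, scale, rotation
```

For example, if one finger stays at (0, 0) while the other moves from (2, 0) to (4, 0), the object should slide right and double in size with no rotation – which is what the deltas report.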
SURFACE: A circular element menu developers can use in any Surface application.
Carpenter also talks about creating consistent gestures for Surface and Windows, and even potentially for Xbox through the XNA developer tools or on Windows Mobile. "There's lots of interest around multi-touch and NUI [Natural User Interface] inside Microsoft and that's great, but making sure we have consistency is important. We are at the stage where we are internally driving to an agreement about what is the right set of gestures, but we have to think about this across devices."
Although hardware that supports ten fingers and more is in development, Windows 7 will initially have only two-finger gestures until screens approach the massively multi-touch abilities of Surface – and until we see a Windows Mobile phone with a capacitive screen, you only get single-finger gestures. "But a flick needs to be a flick needs to be a flick; the gesture - and the underlying response - should be the same. On one device it shouldn't take forever to get across the screen."
SURFACE DEVELOPER TOOLS: This stress test simulates the equivalent of a bunch of school kids rushing up to Surface and all using it at once – an app that can cope with that should have no problem with normal users.
The next commercial version of Surface might incorporate some of the SecondLight technology Microsoft Research demonstrated last year; Carpenter is particularly interested in the idea of 3D gestures that you could make above the screen. He also confirms that Microsoft is still considering a home version of Surface. "Right now we're in the middle of planning for Surface v.next [Microsoft-speak for the next version of products] and we're continuing to look hard at the consumer space to see what opportunities are there.
"We have a little bit of a luxury by being out in front of the competition to define what we go do next. We see this as a paradigm shift from GUI, the graphical user interface, to NUI, the natural user interface, and we think we are out in the forefront of that."