Sound and image search
Searching in the future won't simply be about typing text into a box. As Google's Marissa Mayer wrote on her blog: "Why can't I enter my query as a picture of the birds overhead and have the search engine identify what kind of bird it is? Why can't I capture a snippet of audio and have the search engine identify and analyse it?"
In fact, you already can. Well, sort of. If you're listening to music, you can call Shazam on 2580 from your mobile phone, hold your phone to the speaker and have the track identified by text message. And Shazam apps for the iPhone and Android enable you to go a step further and buy the track, too.
Snaptell Explorer for iPhone lets you send a photo of a book, CD, DVD or game to the service where it will be identified, and you'll then get links to buy it and to Wikipedia and search engine results to learn more about the product. TinEye Mobile lets you buy music with your iPhone when you photograph and upload an album cover.
TinEye's Image Search Engine lets you upload an image to find out whether it appears on the web anywhere. It's a great tool if you want to check your photos haven't been ripped off by copyright thieves, and it even finds partial matches that have been altered in an image-editing program.
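To give a flavour of how a service can match images that have been resized or lightly edited, here's a minimal sketch of a "perceptual hash" in plain Python. This is an illustration of the general technique, not TinEye's actual algorithm, and the tiny 4x4 "image" is invented for the example; real engines use far more robust features.

```python
def average_hash(pixels):
    """Turn a grid of grayscale values (0-255) into bits:
    1 if a pixel is brighter than the image's mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Count differing bits; a small distance suggests a near-duplicate."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x4 "image" and a slightly brightened copy of it.
original = [[ 10,  20, 200, 210],
            [ 15,  25, 205, 215],
            [190, 195,  30,  40],
            [200, 210,  35,  45]]
edited = [[value + 10 for value in row] for row in original]

h1, h2 = average_hash(original), average_hash(edited)
print(hamming(h1, h2))  # 0 - brightening shifts the mean too, so the bits match
```

Because the hash compares each pixel with the image's own average, a uniform edit such as brightening leaves the fingerprint unchanged, which is why altered copies can still be found.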
Voice search is already possible via mobile phone or with speech recognition software on your PC, but how many people currently use it? "A small proportion of people are using voice and an even smaller proportion are using cameras to say 'Where am I?' or tagging photos, and it's hard to tell if that is the future," says Stoddart.
"I've always been excited by things like music search – humming a song to my computer and it coming back to say 'This is the song you're thinking of and this is where you can buy it'. There are lots of different companies trying these things and I wouldn't rule any of them out for search engines in the future, but a lot of this gets dictated by the user. So if you were to build a search engine tomorrow featuring all the niche, bleeding-edge technologies, there's a high chance that no one would use it because they're just not used to it. So there's the whole transition from 'What do we have today?' to 'Where do we want to go in the future?' There will be things that the user pushes us to do and things that we push the user to do, and it's got to be a balance in between."
Another area we'll see increase in intelligence is image and video search. Galler says both have a lot of potential to improve. "You could say 'I'd like to see all the images where Paris Hilton is in them' or 'where there is a white horse', or a specific musical instrument, so you can define more object-specific searches, and that's going to improve over time."
Galler predicts that video search will improve in the same way. "Video search right now has a lot of potential to improve in what can be found in what part of what video, so the user doesn't really need to view the full set of videos to find what they're looking for," he explains.
"Day one of video search was going to a website, entering 'Friends' and getting all the different clips of Friends," says Stoddart. "The next level was 'How do you make relevant results even more relevant?' If you look at Live Search you have an inline preview that allows you to quickly identify whether it's the episode you're looking for because it picks out the key points throughout it – so now I don't have to watch the first four minutes of every episode of Lost."
The next stage, says Stoddart, is where you can have the whole script indexed, so you enter your search query and it will bring back the relevant episode. "That's not going to be an easy thing to do, but if that's the way the user wants to search or we think that's the way we can give the best experience to the user, it's very likely that researchers will look into it."
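The idea of indexing a script can be sketched as a small inverted index: each word of dialogue points back to the episode and moment it was spoken, so a query jumps straight to the right clip. The episode data and timestamps below are invented purely for illustration; production systems would work from full subtitle files.

```python
from collections import defaultdict

# Hypothetical dialogue lines: (seconds into the episode, text).
scripts = {
    ("Lost", "1x04"): [(120, "we have to go back"),
                       (305, "do not tell me what I cannot do")],
    ("Lost", "1x05"): [(80, "live together die alone")],
}

# Inverted index: word -> set of (show, episode, seconds).
index = defaultdict(set)
for (show, episode), lines in scripts.items():
    for seconds, text in lines:
        for word in text.lower().split():
            index[word].add((show, episode, seconds))

def search(query):
    """Return every location whose line contains all the query words."""
    hits = [index.get(word, set()) for word in query.lower().split()]
    return set.intersection(*hits) if hits else set()

print(search("go back"))  # {('Lost', '1x04', 120)}
```

Here the intersection of the per-word location sets acts as a simple AND query, which is the same basic machinery text search engines have always used, applied to dialogue instead of web pages.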