Machine learning: Why Evernote has moved to Google's cloud

On the surface it looks like a simple public cloud infrastructure deal. But stand back from productivity app Evernote's recently announced migration of its entire infrastructure onto Google's Cloud Platform and there's a much bigger story to tell. This one's about the changing world of customer-facing technology – and the rise of the AI API.

What is Evernote doing?

Productivity apps are plentiful, but with more than 200 million customers Evernote is one of the most popular. It lets users store private notes in the cloud, add multimedia and access everything from multiple devices (recently restricted to two unless you pay a fee), and its back end now holds about five billion notes.

Until now, all of that was held on Evernote's own private cloud infrastructure, but from early October it is being migrated to Google's Cloud Platform. Evernote's notes were already easy to integrate with Google Drive, but this move goes much deeper than simply adopting the public cloud as a place to store data.

Evernote claims over 200 million users globally

So why is Evernote doing this?

We've all become accustomed to using Siri, Google Now and perhaps even Amazon Alexa over the past few years, but so far these digital assistants have been tied to specific devices. Now get ready for 'AI everywhere' user experiences on any smart device – from apps and websites to messages and social media feeds – as artificial intelligence APIs and machine learning go mainstream.

"Moving to the cloud is being done for all the usual reasons – reduced costs, easy scaling up and down, steady state support, security, etc., and Google knows as much about this as anyone," says Matt Jones, Analytics Strategist at Tessella. "But I believe AI and machine learning is the real driver, not the cloud … this move allows Evernote to take advantage of the specialists Google has in AI."

Why build your own in-house AI capability on your own private cloud when someone else can do it for you?

Spotify's data sits on the Google Cloud Platform

What kind of new features will Evernote offer?

Since Evernote allows its users to take notes as text, dictation, audio and photos – and then search the lot using keywords – there's a great deal that AI can do with its massive database to create new features. For instance, pretty soon Evernote users will be able to ask the app to 'show me all notes that I took in Wales last week' or 'read out those notes to me'.

"Voice interaction – dictation, queries, commands – with notes is likely to be a big driver of this move," says Jones, who thinks that Google's AI and Natural Language Processing has a proven capability in areas like contextualised voice searching and automated voice-to-text. "This would be a big draw of Google as a platform," he adds.

Connecting ideas

In a blog post in September, Evernote's VP Operations, Ben McCormack, outlined exactly what Google Cloud Platform brought to the party. "In addition to scale, speed, and stability, Google will also give Evernote access to some of the same deep learning technologies that power services like translation, photo management, and voice search," he wrote, adding that it was about users being more easily able to 'connect ideas', search for information in Evernote, and find the right note at the moment it was needed.

There will also be some new security features enabled by Google, including 'encryption at rest', which means data is encrypted even while it sits in persistent storage, not only while it is being actively used or transmitted.

Evernote is already integrated with Google Drive

What do these APIs do?

Initially developed by the likes of Google, Facebook, Apple and Amazon for their own purposes, these APIs sell AI as a service: they let developers process unstructured data using machine learning models running in the public cloud.

One of Google's latest APIs is Google Cloud Natural Language, which extracts and analyses meaningful information from text about specific people, places and events. It understands sentiment, too, so it could be used to find negative messages left on a Twitter feed, on news websites or in call centre conversations. The latter would also require Google Cloud Speech, another AI API, which uses deep learning and neural network models to convert audio to text in more than 80 languages, becoming more accurate over time.
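
To make the sentiment side of that concrete, here's a minimal sketch using the Cloud Natural Language Python client to score a single, invented customer message; scores below zero are the kind of negative signal the API could flag in tweets, news comments or call centre transcripts. The threshold used here is an arbitrary illustration, not a Google recommendation.

```python
# Minimal sketch: scoring sentiment with Google Cloud Natural Language.
# Assumes the google-cloud-language library is installed and the
# GOOGLE_APPLICATION_CREDENTIALS environment variable points at a valid
# service account key.
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()

# An invented example of the kind of message a call centre or Twitter
# feed might surface.
text = "I've been on hold for an hour and still nobody can help me."
document = language_v1.Document(
    content=text,
    type_=language_v1.Document.Type.PLAIN_TEXT,
)

sentiment = client.analyze_sentiment(
    request={"document": document}
).document_sentiment

# Score runs from -1.0 (very negative) to 1.0 (very positive); magnitude
# reflects how strongly that sentiment is expressed.
print(f"score={sentiment.score:.2f}, magnitude={sentiment.magnitude:.2f}")
if sentiment.score < -0.25:
    print("Flag for follow-up: this message reads as negative.")
```

The same client also exposes an analyze_entities call, which pulls out the people, places and events referred to above.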

Google Cloud Speech is used by HyperConnect, a video chat app designed to let people from all over the world communicate without having to speak each other's language. Facebook already does something similar on its feeds. The potential for commerce websites is huge: think branded, Siri-like virtual assistant bots, cloud-based intelligent assistants inside third-party apps, and much more besides.

Jamie Carter

Jamie is a freelance tech, travel and space journalist based in the UK. He's been writing regularly for TechRadar since it launched in 2008 and also writes regularly for Forbes, The Telegraph, the South China Morning Post, Sky & Telescope and the Sky At Night magazine, as well as other Future titles T3, Digital Camera World, All About Space and Space.com. He also edits two of his own websites, TravGear.com and WhenIsTheNextEclipse.com, which reflect his obsession with travel gear and solar eclipse travel. He is the author of A Stargazing Program For Beginners (Springer, 2015).