How Talend is taking the pain out of big data

Architecture. What architecture and tools should I choose that will allow me to have a flexible, scalable, manageable and cost-effective solution?

Governance. How do I solve the same governance issues that I have in my existing data infrastructure, including things like data quality, data mastering, privacy and security?

Budget and cost. How do I find the budget for the solution as the pilot succeeds and I scale it up from a few servers to dozens or hundreds?

Shifting landscape

TRP: How do you see the big data landscape changing in the near future?

I think we'll see four things happening in parallel. First, we'll see increasing numbers of early adopter success examples leading to a widespread understanding of mainstream use cases where big data provides unique benefits. This is another way of saying that we'll see big data successfully "cross the chasm". We're on the verge of that now.

Second, we'll see the vendor solutions mature to simplify planning, deployment, management, and ongoing maintenance. Today, the solutions have a number of moving parts and some rough edges as the customer pain points above demonstrate.

Third, we'll see the big data "stack" getting more fully fleshed out, including solutions for the many areas outside of data storage and querying supplied by the core Hadoop distros. This includes the areas that Talend offers (data integration, data quality, master data management), as well as things like analytics, visualisation, and reporting.

Finally, we'll see a number of new use cases emerge as the technology continues to evolve in areas such as real-time querying and streaming data. These are a step behind the early, more batch-oriented scenarios, since the supporting technology is just coming together now.

Yellow elephants

TRP: How does Talend's solution exemplify these trends?

We made a bet on delivering "native Hadoop" solutions, which means that our customers don't need to install extra software either in front of the Hadoop cluster or, worse, on every node within it.

This "zero footprint" architecture dramatically simplifies deployment and manageability, and maximises scalability, since customers can take full advantage of Hadoop's innate scalability and of all the vendor solutions targeted at deployment, management, monitoring, and system maintenance.

In addition to data integration, we've adapted our data quality and master data management solutions to work natively with Hadoop, so that customers can solve the governance issues that emerge as they deploy and scale their solutions. Finally, with the new investment we're announcing today, we're investing in real-time support to give customers a path forward to take full advantage of the emerging scenarios – natively.

With all of this going on, 2014 will be an exciting year!

Désiré Athow
Managing Editor, TechRadar Pro

Désiré has been musing and writing about technology during a career spanning four decades. He dabbled in website builders and web hosting when DHTML and frames were in vogue, and began covering the impact of technology on society just before the start of the Y2K hysteria at the turn of the last millennium.