According to Gartner's recent Big Data Industry Insights report, it's clear that while organizations are increasing their investments in big data, they struggle to gain significant business value from those investments. Furthermore, according to a CSC survey, five of every nine big data projects aren't completed and many others fall short of their objectives. Often, business and IT groups are not aligned on the business problem they need to solve. Also, employees frequently lack skills required to analyze data.
To overcome these hurdles and maximize the business value of data, organizations can take the following steps to significantly shorten time-to-value and make their big data initiatives contribute to business success.
1. Know What You Are Trying to Solve
Identifying a key element or business problem for a successful big data initiative may seem trivial at first, but it's absolutely crucial. Many organizations fall into the trap of starting a big data initiative before digging deeper into what business questions they want to solve. This often leads to frustrations for both business and technology teams since they already started investing in technology and resources. Having clearly defined analytics requirements is essential for a partnership between business stakeholders and the technology team.
Success for big data initiatives hinges on business stakeholders deciding which requirements will move the needle for the business. As technology has evolved, business stakeholders should be encouraged to think of analytics questions that once seemed to be a distant dream with existing technologies.
Coming up with a succinct set of analytics requirements also helps technology teams define a big data architecture that leads to success for the organization.
2. Replace Data Silos with Auditable Role-Based Access Controls
Another hurdle organizations face is that IT departments spend too much of their time providing access to data rather than analyzing it. Most data analysts spend 80 percent or more of their time accessing data, leaving a mere 20 percent or less for actually analyzing it for actionable business insights.
In order to move past data silos and take full advantage of low-cost batch storage technologies like Hadoop, many organizations are looking favorably at a "data lake" or "data reservoir" model. In this model, data is stored once and shared by multiple business and IT stakeholders. This architecture permits role-based access controls that protect sensitive data and customer privacy. For example, someone on the fraud team may be authorized to see PII, such as home addresses and credit card numbers, while a marketing analyst sees only masked and aggregated data. Creating this "data fabric" not only breaks down organizational silos, but also empowers employees to take advantage of their data with a single version of the truth.
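The role-based view described above can be sketched in a few lines of code. This is a minimal illustration, not any particular product's API: the role names, field names and masking rules are all assumptions made for the example.

```python
# Hypothetical sketch of role-based PII masking over a shared "data lake"
# record. Roles, fields and masking rules are illustrative assumptions.

def mask_card(number: str) -> str:
    """Keep only the last four digits of a card number."""
    return "*" * (len(number) - 4) + number[-4:]

# Fields each role may see in the clear; everything else is masked.
CLEAR_FIELDS = {
    "fraud_analyst": {"home_address", "card_number"},  # full PII access
    "marketing_analyst": set(),                        # masked view only
}

def apply_view(record: dict, role: str) -> dict:
    """Return a copy of the record with PII masked for the given role."""
    clear = CLEAR_FIELDS.get(role, set())
    out = dict(record)
    if "card_number" not in clear:
        out["card_number"] = mask_card(out["card_number"])
    if "home_address" not in clear:
        out["home_address"] = "[REDACTED]"
    return out

record = {"customer_id": 42,
          "home_address": "1 Main St",
          "card_number": "4111111111111111"}

print(apply_view(record, "fraud_analyst"))      # raw PII for the fraud team
print(apply_view(record, "marketing_analyst"))  # masked values for marketing
```

In practice this policy would live in the data platform itself (for example, as column-masking rules enforced at query time), so every consumer of the lake gets the view appropriate to their role from the same stored copy of the data.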
3. Make Analytics Actionable
A hidden issue with many big data projects is that analyses are often based on data subsets that are assumed to be representative samples. While sampling provides directional analysis, actionable insights cannot be derived from sampled and aggregated data alone. For example, whether a retailer wants to understand customer interactions across offline and online channels or an investment banker wants to measure risk in a portfolio, the company must be able to search, analyze and visualize raw granular data, not just a sample. Actionable insights from granular data can drive a targeted marketing campaign at the individual customer level, or reduce risk by identifying a micro-segment of customers.
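The risk-measurement point above can be made concrete with a toy illustration: an aggregate statistic computed over all records can look benign while record-level data exposes the micro-segment that actually drives the risk. The data and threshold below are invented purely for the example.

```python
# Illustrative sketch: an aggregate can hide a risky micro-segment that
# only granular, record-level analysis reveals. All figures are invented.

transactions = [
    {"customer": "A", "segment": "retail", "loss": 10},
    {"customer": "B", "segment": "retail", "loss": 12},
    {"customer": "C", "segment": "retail", "loss": 11},
    {"customer": "D", "segment": "margin", "loss": 95},  # risky outlier
]

# Aggregate view: the portfolio-wide average looks moderate.
avg_loss = sum(t["loss"] for t in transactions) / len(transactions)

# Granular view: drill into the individual records driving the risk.
RISK_THRESHOLD = 50  # assumed cutoff for this example
risky = [t["customer"] for t in transactions if t["loss"] > RISK_THRESHOLD]

print(f"average loss: {avg_loss}")      # 32.0 -- looks tolerable
print(f"risky customers: {risky}")      # ['D'] -- the actionable insight
```

The average suggests a healthy book; only the granular view identifies which customer to act on, which is exactly the difference between directional and actionable analysis.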
4. Empower Self-Service Analytics by Everyone, Not Just Data Scientists
Finally, organizations are struggling to hire and retain data scientists who understand statistics, computer science and open-source technologies such as Hadoop or NoSQL data stores. According to research from the McKinsey Global Institute, the United States will face a shortage of 140,000 to 190,000 skilled data scientists, as well as 1.5 million managers and analysts capable of deriving actionable insights from big data.
One way to address the skills gap is to adopt technologies that bridge the divide between data scientists and knowledge workers. This approach enables product managers, web analysts, risk managers, security analysts and other knowledge workers to simply point at Hadoop or another data store and start exploring, visualizing and analyzing. It also relieves database and business analysts from having to learn a programming language to access the data, or from spending valuable time writing complex code, saving both time and money.
The Bottom Line
At the end of the day, businesses will thrive with wide data access and data-driven decision making. By simplifying access to data and democratizing the information, organizations will be able to leverage the power of big data, which will ultimately bring them a step closer to gaining a competitive advantage in 2015.
- Rahul Deshmukh is Director of Solutions Marketing at Splunk