While this example shows how a person in a business role will see the use of analytics, those in IT know there is much more to it.
The IT side needs to optimize IT costs in support of the business while ensuring a successful customer experience, measured by performance and availability. IT staff are still the people who have to do it: the people who decide how to set up configurations, how much resource is needed and when, what rules to put in place around configuration automation and decommissioning, and all the other technically important details. Important questions must be asked, and the answers understood, about performance delivery: how to optimize throughput and response time, and how to do so cost-effectively. At the end of the day, all of that is still somebody in IT making a decision, and that decision must be informed by analyzing the right data, in a fashion aligned to the business.
Bridging the Gap between IT and Business
An understanding of application performance must become deeply integrated with data center management tools and data so that automatic provisioning of resources can be simultaneously cost-effective and service-risk minimizing. Automated provisioning of storage, bandwidth, and computing power is a primary benefit of virtualization and a powerful capability of software-defined data centers (SDDCs).
But without integrated business intelligence, the likely outcome is that sub-optimal decisions will be implemented automatically, and more quickly than ever. Recurring or compounding inefficiencies quickly drain resources and can inflict lasting damage. The bungled launch of Healthcare.gov is a cautionary tale here.
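The difference between automation that merely reacts to infrastructure signals and automation informed by business context can be sketched in a few lines. This is a minimal illustration, not any vendor's algorithm; all function names, thresholds, and dollar figures are hypothetical assumptions:

```python
def naive_scale_decision(cpu_utilization: float) -> int:
    """Infrastructure-only rule: add a server whenever CPU looks hot.
    (The 0.80 threshold is an arbitrary illustrative value.)"""
    return 1 if cpu_utilization > 0.80 else 0


def informed_scale_decision(cpu_utilization: float,
                            revenue_per_added_server: float,
                            cost_per_added_server: float) -> int:
    """Business-aware rule: also ask whether the extra capacity
    is worth its cost before acting on the same CPU signal."""
    if cpu_utilization <= 0.80:
        return 0
    return 1 if revenue_per_added_server > cost_per_added_server else 0


# Same hot CPU reading; the business-aware rule declines an
# unprofitable scale-up that the naive rule would execute instantly.
naive = naive_scale_decision(0.95)
informed = informed_scale_decision(0.95,
                                   revenue_per_added_server=40.0,
                                   cost_per_added_server=120.0)
```

The point is not the specific numbers but the shape of the decision: without a business signal, the naive rule happily automates a loss, faster than any human could.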
If advanced analytics had been intelligently applied in the planning and functional testing phases, the disastrous under-provisioning of resources for this nationally deployed service might have been avoided. It doesn't take a data scientist to understand the money, time, and political capital wasted as a result of what was, at its essence, a profound yet preventable disconnect between IT and business.
When teams and tools can work across silos, the resulting synergy becomes the basis for competitive advantage. Gathering good data streams (metrics that matter to both business and IT) and correlating them through powerful analytics will amplify bottom-line results.
By measuring and analyzing more than just power usage effectiveness (PUE), the focus of continuous optimization shifts to risk reduction, revenue growth, decreased capital and operating expenditures, and enhanced customer experience. What does it mean for a data center to be maximally efficient according to the industry-standard PUE metric?
What are you getting for that efficiently delivered power? How much work are you accomplishing? If all that power is going to servers that are not cost-effectively accomplishing useful work in service to the business, is that truly efficiency?
At the end of the day, nobody buys, builds, or invests in a data center to move electricity around efficiently. They do it to get work done. Instead of measuring only PUE and the like, correlate those efficiencies with work accomplished and end-user experience. All of these components need to be optimized in a continuous, automated, and integrated manner. That is what true optimization across silos looks like.
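The PUE argument above can be made concrete. PUE is defined as total facility power divided by IT equipment power (an ideal facility scores 1.0). The "work accomplished" metric alongside it is an illustrative assumption here, transactions per kilowatt, standing in for whatever unit of business work applies:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT power.
    Lower is better; 1.0 means every watt reaches IT equipment."""
    return total_facility_kw / it_equipment_kw


def work_per_kw(transactions_completed: float,
                total_facility_kw: float) -> float:
    """Hypothetical business-aligned metric: useful work per kilowatt.
    'Transactions' is a placeholder for any unit of business work."""
    return transactions_completed / total_facility_kw


# Two facilities with an identical, respectable PUE of 1.2...
pue_a = pue(total_facility_kw=1200, it_equipment_kw=1000)
pue_b = pue(total_facility_kw=1200, it_equipment_kw=1000)

# ...can accomplish wildly different amounts of business work
# for the same power draw (illustrative numbers).
work_a = work_per_kw(transactions_completed=5_000_000, total_facility_kw=1200)
work_b = work_per_kw(transactions_completed=500_000, total_facility_kw=1200)
```

PUE alone would rate both facilities equally; only when efficiency is correlated with work accomplished does the tenfold gap between them become visible.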
Businesses succeed when they can look at the right data in combination with powerful analytics. It all comes back to this: good data + powerful analytics = better business results.
- Dave Wagner is director of market development at TeamQuest.