Digital core: How to create the data center of tomorrow

Richard Harris, Executive Editor of App Developer Magazine, has observed that “More data was created [between 2014 and 2016] than the previous 5,000 years of humanity.”

He forecast that the types of data being created are “expanding rapidly across a wide range of industries: biotech, energy, IoT, healthcare, automotive, space and deep sea explorations, cyber-security, social media, telecom, consumer electronics, manufacturing, gaming and entertainment.”

The list of potential data sources is constantly growing, so you’ll need a data center that can cope with that growth. The data center of the future also needs to be able to manage the growing variety of data.

With growing data volumes, data centers need to evolve. Data centers are known in some circles as the “digital core”, and as such they are having to adapt to an increasing array of technologies.

Large technology vendors also tend to say they care about their customers, and perhaps they do, but they’re also very keen to sell their wares – from servers to network infrastructure. Add the discussions about being at “the edge”, and you will find yourself in quite a jungle that’s arguably becoming much harder to navigate.

In fact, talk about edge data centers and the digital core is likely to confound most customers. The IT industry is not the only culprit: most industries like to create terms that seem meaningless to everyone but those working with them.

Yet, ironically, such terms are often created for marketing reasons to describe something that has existed for many years in another guise. So where does this leave you? It requires you either to hire the right expertise to help you cut through this jungle, or to learn how to work out what you really need.

What you will find, though, is that data volumes are going to increase, and industry research suggests that digital transformation and the Internet of Things (IoT) are driving this trend.

Harris went so far as to suggest in December 2016 that “In 2017, we will create more data than ever before, creating new challenges around consuming that data to make strategic and tactical decisions.”

Digital core banking

Chris Skinner, Co-Founder of the Financial Services Club, writes in The Finanser that ‘Banks without a digital core will fail.’ But what does he mean? He says that data is the key to disruption, and most people would agree with that.

However, when he was asked how he defined the term, one reply was that there wasn’t a single definition; it seems, though, to refer to the idea of a central point of systems – such as a mainframe.

He writes that the markets don’t operate in this way anymore, and so he thinks that systems should be spread across server farms in the cloud to avoid there being a single point of failure. 

However, he finds that many people are misinterpreting what the ‘digital core’ actually means. That’s no surprise to me because new terms often mean different things to different people. 

He therefore describes the digital core as being, “the removal of all bank data into a single structured system in the cloud [where] the data is cleansed, integrated and provides a single, consistent view of the customer as a result.”

Empowering companies

In 2015, technology giant SAP described the digital core as follows: “A digital core empowers companies with real-time visibility into all mission critical business processes and processes around customers, suppliers, workforce, Big Data and the Internet of Things. 

This integrated system enables business leaders to predict, simulate, plan and even anticipate future business outcomes in the digital economy.”

Amongst other examples of where a digital core can be used, the company adds: “The same digital core can be used to optimize manufacturing, moving from batch orders to real-time manufacturing resource planning to always meet demand.

Further, using information collected by assets and the Internet of Things the assembly line and ERP can be synchronized for increased cost efficiency and asset utilization.”

SAP believes that “Companies that lose the complexity that is weighing them down will be able to face the market disruption happening everywhere.” Being successful requires your firm to have real-time visibility and integration across all business processes.

Visibility is about allowing your business to understand how information flows in and out of your company, or perhaps even in and out of your data center.

The digital core also digitalizes mission-critical processes based on a single source of information that SAP says “interconnects all aspects of the value chain in real-time.”

This means bringing workforce engagement, assets, IoT, supplier collaboration, business networks and customer engagement together into an omni-channel experience, empowering better decision-making to gain an advantage in today’s digital economy.

Edge computing

In contrast, edge computing is defined by TechTarget in the following way:

“Edge computing is a distributed information technology (IT) architecture in which client data is processed at the periphery of the network, as close to the originating source as possible. The move toward edge computing is driven by mobile computing, the decreasing cost of computer components and the sheer number of networked devices in the internet of things (IoT).

Depending on the implementation, time-sensitive data in an edge computing architecture may be processed at the point of origin by an intelligent device or sent to an intermediary server located in close geographical proximity to the client. Data that is less time sensitive is sent to the cloud for historical analysis, big data analytics and long-term storage.”
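To make that definition concrete, here is a minimal sketch in Python of the routing decision it describes: readings that must be acted on within a tight latency budget are handled at the edge, while everything else is shipped to the cloud for analytics and long-term storage. The threshold, the Reading class and the device names are invented for illustration, not part of any real product.

```python
from dataclasses import dataclass

# Illustrative latency budget: anything that must be acted on faster
# than this is processed at the edge rather than sent to the cloud.
EDGE_LATENCY_BUDGET_MS = 50

@dataclass
class Reading:
    device_id: str
    value: float
    max_latency_ms: int  # how quickly this reading must be acted upon

def route(reading: Reading) -> str:
    """Decide where a reading should be processed, per the definition above."""
    if reading.max_latency_ms <= EDGE_LATENCY_BUDGET_MS:
        # Time-sensitive: process at the point of origin or a nearby edge server.
        return "edge"
    # Less time-sensitive: ship to the cloud for historical and big data analytics.
    return "cloud"

if __name__ == "__main__":
    print(route(Reading("sensor-1", 98.6, max_latency_ms=10)))     # -> edge
    print(route(Reading("sensor-2", 42.0, max_latency_ms=60000)))  # -> cloud
```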

One of the arguments for edge data centers, and edge computing generally, is that they can help to reduce the impact of network latency. However, this approach has its problems: what happens if the data is kept too close to its source and a problem then arises at that location?

Ideally, data should be stored and backed up in at least three different data centers or disaster recovery sites. Storing data too close to a circle of disruption can lead to disaster, and there is no need to place all of your eggs in one basket in order to mitigate the effects of latency.
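As a rough sketch of the “at least three sites” rule, the hypothetical Python below fans a backup out to three geographically separated data centers and only reports success once every site confirms its copy, so no single circle of disruption can take out all replicas. The site names and the copy_to stub are placeholders standing in for a real replication mechanism.

```python
# Hypothetical fan-out backup: succeed only when all three sites hold a copy.
SITES = ["dc-london", "dc-frankfurt", "dc-dublin"]  # placeholder site names

def copy_to(site: str, payload: bytes) -> bool:
    """Stand-in for a real transfer; here every copy simply succeeds."""
    print(f"replicated {len(payload)} bytes to {site}")
    return True

def backup(payload: bytes) -> bool:
    results = [copy_to(site, payload) for site in SITES]
    # No single point of failure: every site must confirm its copy.
    return all(results)

if __name__ == "__main__":
    assert backup(b"critical business data")
```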

With data acceleration solutions such as PORTrockIT, it becomes possible to mitigate the effects of data and network latency, speed up data flows and reduce packet loss over distance.

Question vendors

So edge computing may not provide all of the answers you’re looking for when working out how to create the data center of the future.

Vendors will also be happy to sell you WAN optimization as the answer, but techniques such as compression and deduplication work poorly on encrypted data, which a data acceleration tool can handle without compromising network efficiency. How? Data acceleration solutions use machine learning to increase throughput.

The alternative of increasing your bandwidth won’t necessarily improve your network’s speed, but it could cost you the earth without addressing the limitations created by physics and the speed of light. It’s therefore important to consider what is really motivating a vendor: your interests, or theirs?
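That physics point is easy to sanity-check with back-of-the-envelope numbers: a single TCP stream can move at most one window of data per round trip, and the round-trip time (RTT) over fiber is bounded by the speed of light, so beyond a point extra bandwidth buys nothing. Running transfers in parallel is one common latency-mitigation technique that does raise the aggregate; the sketch below illustrates that general idea, not PORTrockIT’s internals, and the window size, distance and stream count are purely illustrative.

```python
# Why bandwidth alone can't beat latency: a TCP stream moves at most one
# window of data per round trip, and the round trip is bounded by physics.

SPEED_OF_LIGHT_FIBER_KM_S = 200_000  # light travels at roughly 2/3 c in fiber

def rtt_seconds(distance_km: float) -> float:
    """Minimum round-trip time over fiber for a given one-way distance."""
    return 2 * distance_km / SPEED_OF_LIGHT_FIBER_KM_S

def max_throughput_mbps(window_bytes: int, rtt_s: float, streams: int = 1) -> float:
    """Upper bound on throughput: one window per RTT, per stream."""
    return streams * window_bytes * 8 / rtt_s / 1_000_000

# London to New York is roughly 5,600 km of great-circle distance.
rtt = rtt_seconds(5_600)  # ~0.056 s before any queuing or processing delay
print(f"best-case RTT: {rtt * 1000:.0f} ms")

# With a 64 KiB TCP window, one stream tops out below 10 Mbit/s,
# whether the link itself is 1 Gbit/s or 100 Gbit/s.
print(f"1 stream:   {max_throughput_mbps(64 * 1024, rtt):.1f} Mbit/s")

# Parallel streams raise the aggregate despite the unchanged RTT.
print(f"16 streams: {max_throughput_mbps(64 * 1024, rtt, streams=16):.1f} Mbit/s")
```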

A vendor with your interests at heart will be able to explain in plain English what will really work effectively for you, and demonstrate the veracity of their claims.

While the jargon might be sexy, it’s no good putting customers like yourself on edge. Vendors should explain and demonstrate the benefits of a given technology in language their customers understand. That is crucial.

If we can’t communicate properly with each other, then it’s not going to be possible to create the data center of the future – whether a digital core, edge computing or another approach is used as part of it.

So the key to creating the future data center is for vendors to offer you, as a customer, what you really need today for tomorrow. This doesn’t require you to replace all of your legacy infrastructure with the latest technology. Much can be achieved with what you already have, and with data acceleration.

David Trossell is CEO and CTO of award-winning data acceleration company Bridgeworks, which has developed products such as PORTrockIT. PORTrockIT won the Data Centre ICT Networking Product of the Year category at both the DCS Awards 2017 and the DCS Awards 2018. David has over 25 years of experience designing, optimising and implementing Oracle database backup and replication protocols for international industries, including the banking, aerospace, retail, utilities and government sectors.