Five qualities to look for in a cloud service provider


If you want anything done properly, you should do it yourself, so the saying goes. However, you don't hammer nails in with your fist or apply paint with your fingers. These painstaking, detailed jobs are entrusted to tools, which you are happy to use because you are confident in your complete mastery of them. Because tools are so firmly integrated into our working processes, we forget that we 'outsource' many jobs to them.

This is how jobs become so smoothly orchestrated that their efficiency makes us forget how much engineering and integration has taken place. The same is happening with hyperconverged infrastructure (HCI), which we shall illustrate by reference to some of Mellanox's valued technology partners.

The same 'DIY logic' applies when comparing managing your storage in-house with using a service provider. Yes, you could do it yourself, but the cloud gives you many more options for cost control, flexibility, speed, security, access and disaster recovery.

These options are only available because the technology has reached a level of maturity that makes the infrastructure faster, smarter and cheaper. It is built on Ethernet storage switches, data management and security, blended and fortified by the magic of hyperconvergence, which integrates and concentrates the power of computing, storage and networking.

Bandwidth and backup

You can only use cloud backup if you have mastered your bandwidth challenges. There are two massive transfers that must be catered for: the initial backup and a full restoration.
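To see why those two transfers dominate the bandwidth question, it helps to put rough numbers on them. The sketch below estimates the wall-clock time for a bulk transfer; the 70% link-efficiency factor and the 50 TB / 1 Gbps figures are illustrative assumptions, not measured values.

```python
def transfer_hours(data_tb: float, link_gbps: float, efficiency: float = 0.7) -> float:
    """Estimate hours to move data_tb terabytes over a link_gbps link.

    efficiency accounts for protocol overhead and link contention
    (0.7 is an illustrative assumption, not a measured figure).
    """
    bits = data_tb * 1e12 * 8                 # terabytes -> bits
    effective_bps = link_gbps * 1e9 * efficiency
    return bits / effective_bps / 3600

# e.g. a 50 TB initial backup over a 1 Gbps line at 70% efficiency
print(f"{transfer_hours(50, 1):.0f} hours")   # roughly a week of continuous transfer
```

Numbers like these are why providers and customers plan the initial seed and the full-restore path explicitly rather than assuming the everyday incremental-backup link will cope.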

Collaborating vendors in this space have created a concentration of throughput. For example, Nutanix teamed up with Rubrik so that the latter offers thunderously powerful data management on what is now a lightning-fast backplane.

This means you never have to overbuy storage, even though analyst firm IDC warns us that it grows by 60 percent each year. Despite the mounting challenge of data management and consumption, there is no need to panic-buy resources. The technology puts you in control, and you can adopt a pay-per-use plan that keeps up with the pace of change and leaves your options open. This means you can enjoy the economies of scale that create cheaper prices for each unit of storage.
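A quick compound-growth calculation shows why overbuying up front is so wasteful at that growth rate. The 100 TB starting point below is a hypothetical figure for illustration; only the 60 percent growth rate comes from the IDC estimate cited above.

```python
def projected_capacity(current_tb: float, annual_growth: float, years: int) -> float:
    """Project storage needs assuming compound annual growth."""
    return current_tb * (1 + annual_growth) ** years

# At the cited 60% annual data growth, 100 TB today becomes:
for year in range(1, 4):
    print(f"year {year}: {projected_capacity(100, 0.60, year):.0f} TB")
```

Capacity roughly quadruples in three years, so buying three years of storage on day one means paying for hardware that sits idle, which is exactly the cost a pay-per-use plan avoids.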


Quality of service

You want flexibility in everything, codified in your service level agreements (SLAs).

But those SLAs can only promise instant access, security and response times if the technology is available to deal with the security challenges. Snapshots for crash-consistent local or offline backup, off-site backup and instant disaster recovery need a powerful engine and lightning fast transport.

The technical specs that deliver these detailed SLAs have to be every bit as finely conceived. When NetApp HCI was forged with SolidFire's technology, the two jointly created hybrids of their inventions that were more powerful than the sum of their parts. Their complementary engines perform at even greater levels of data discovery and resource provisioning. This contributes to tighter security and isolation, and to levels of monitoring, management and visualisation that create a storage-oriented quality of service.

Cost control

There are hidden costs everywhere in IT - and cloud storage is no different. You must know what is covered by your regular monthly service and what is regarded as an ‘extra’. If you are paying a fee every single time your data moves to and from the cloud you will want to work around that to minimise the expense. Or change the way you are charged.  

The most important quality your service provider can offer is the reassurance that you have complete mastery at all times.

The convergence of HPE's powerful M-Series Ethernet switches with Cohesity's consolidated secondary storage and workflows exemplifies how the supporting infrastructure can be created for faultless online services. Everything they jointly achieve, from high bandwidth to low latency, from storage offloads to the acceleration of native non-volatile memory, combines to bring the service to its ultimate performance levels.



Security and compliance

Privacy and security standards, such as HIPAA and PCI, demand Herculean processing efforts from the computing engines. So too does encryption, which is a vital prerequisite for a cloud storage service in these days of watertight regulation and compliance.

The need for ever greater levels of vigilance means that storage service providers could find it limiting to source the ever larger amounts of computing power and bandwidth needed to ensure your data is strongly encrypted before it is transferred into the cloud. Some users might even try to encrypt it themselves, but they will run into the same resource problems. Hyperconvergence, however, keeps service providers ahead of the game because it keeps them on top of the battle for resources.

How? HCI enables greater levels of network orchestration. With this policy-driven discipline, the hardware and software components of the infrastructure are fine-tuned to support the optimum conditions that any application or service requires to run.

The rationale is to automate the way network requests are handled, which in turn can obviate the need for human intervention and keep things moving quickly and efficiently.

If, say, a cloud storage provider gets an order for 2TB of storage from a customer on its website, its systems can translate that straight into configuration tasks for network devices to execute.  
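The 2TB scenario above can be sketched in code. This is a hypothetical illustration of policy-driven orchestration, not any vendor's actual API: the device names, actions and the `plan_tasks` helper are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class StorageOrder:
    """A customer order captured by the provider's website."""
    customer_id: str
    size_tb: int

def plan_tasks(order: StorageOrder) -> list[dict]:
    """Translate a customer order straight into configuration
    tasks for devices to execute, with no human in the loop.
    Device names and actions here are illustrative only."""
    volume = f"vol-{order.customer_id}-{order.size_tb}tb"
    return [
        {"device": "storage-array", "action": "create_volume",
         "name": volume, "size_tb": order.size_tb},
        {"device": "ethernet-switch", "action": "assign_vlan",
         "tenant": order.customer_id},
        {"device": "firewall", "action": "allow_tenant",
         "tenant": order.customer_id},
    ]

# A 2 TB order from a hypothetical customer becomes three device tasks
for task in plan_tasks(StorageOrder("acme", 2)):
    print(task["device"], "->", task["action"])
```

The point of the sketch is the shape of the pipeline: one business event fans out into a deterministic list of device-level tasks, which is what lets the provider fulfil the order without waiting on an engineer.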

This is a prime example of how hyperconvergence brings instant gratification.


Scalability

Hyperconvergence creates small, potent building blocks of computing, networking and storage that are tightly integrated. Their compatibility means they can be rapidly aggregated and are highly scalable.

This makes them ideal for the cloud and hyperscale environments. Their power, granularity and penchant for aggregation mean they can be adapted to suit the needs of organisations in very different vertical market sectors. Hyperconvergence creates the qualities needed for hyperscale in data centres for sectors as diverse as the media and the government, the health service and global banking corporations. 

Hyperconvergence, as delivered by partnerships such as Nutanix and Rubrik, NetApp and SolidFire, and HPE and Cohesity, has forged a powerful engine of productivity. It ramps up output and makes this a buyer's market for resources such as computing, networking and storage. That empowers the service provider, which in turn passes the economic benefits on to the customer.

Kevin Deierling, VP of Marketing at Mellanox  

Kevin Deierling
Kevin Deierling has served as Mellanox's VP of marketing since March 2013. Previously he served as VP of technology at Genia Technologies, chief architect at Silver Spring Networks and ran marketing and business development at Spans Logic. Kevin has contributed to multiple technology standards and has over 25 patents in areas including wireless communications, error correction, security, video compression, and DNA sequencing.