Since the start of the 21st century, security has emerged from scare-story status to become one of IT users' biggest issues – as survey after survey confirms. Along the way a number of uncomfortable lessons have been learned, and are still being learned.
The first lesson is that security technology must always be considered in a human context. No one still believes in a technological fix that will put an end to all security problems, because time and again we hear of new types of cyber-attack that bypass sophisticated and secure technology by targeting human nature – from alarming emails ostensibly from official sources, to friendly social invitations to share a funny download. Or indeed a harmless-looking USB stick 'accidentally' dropped by the office entrance, or a fake policeman demanding a few personal details to verify that you are not criminally liable.
And that explains this article's headline: a balance must be struck between achieving the desired level of protection, and keeping all protection procedures quick and simple. Every minute spent making things secure is a minute lost to productivity – so the heading could equally have read "balancing security with efficiency".
The second lesson still being learned is never to place full trust in instinct in security matters. It is instinctive to obey instructions that appear to come from an authoritative source, or to respond in an open, friendly manner to a friendly approach – and those are just the sort of instincts that are exploited by IT scams. Instincts can open us to attack, and they can also evoke inappropriate caution.
In the first years of major cloud uptake we heard the oft-repeated advice to businesses that the sensible course would be to use public cloud services to simplify mundane operations, but that critical data should not be trusted to a public cloud service; rather, it should be kept under control in a private cloud. Instinctively this made sense: you should not allow your secrets to float about in a cloud where you have no idea where they are stored or who is in charge of them.
The irony is that the cloud – being so obviously vulnerable and inviting to attackers – is constantly being reinforced with the most sophisticated security measures, so data in the cloud is probably far better protected than any SME could afford to secure its own data internally.
It is like air travel: because flying is instinctively scary, so much has been spent to make it safe that you are less likely to die on a flight than you are driving the same journey in the "safety" of your own car. The biggest risk in air travel is in the journey to the airport, just as the biggest risk in cloud computing lies in the data's passage to the cloud – hence the importance of a secure line to a cloud service.
So let us look at encryption in the light of those two lessons. Instinctively it makes sense to keep full control of your own encryption and keys, rather than let them get into any stranger's hands – so how far do we trust that instinct, bearing in mind the need also to balance security against efficiency?
Hot on the heels of BYOD – or "Bring Your Own Device" to the workplace – comes the acronym for Bring Your Own Key (BYOK).
The idea of encryption is as old as the concept of written language: if a message might fall into enemy hands, then it is important to ensure that they will not be able to read it. We have recently been told that US forces used Native American communicators in WW2 because the chances of anyone in Japan understanding their language were near zero.
More typically, encryption relies on some sort of "key" to unlock the message and make sense of it, and that transfers the problem of security to a new level – now that the message itself is secure, the focus shifts to protecting the key.
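That shift is easy to see in code. The sketch below uses a deliberately simple XOR stream (not a real cipher – a production system would use something like AES) purely to illustrate the point: once the message is encrypted, everything depends on who holds the key.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR stream cipher: applying it twice with the same key
    # recovers the original. Illustrative only – NOT secure.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)                 # the "key" the article refers to
message = b"meet at dawn"
ciphertext = xor_cipher(message, key)

assert ciphertext != message                    # unreadable without the key
assert xor_cipher(ciphertext, key) == message   # the key unlocks the message
```

Note that the ciphertext can now travel anywhere; the security problem has moved wholesale onto the sixteen random bytes of `key`.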
In the case of access to cloud services, if we are encrypting data because we are worried about its security in an unknown cloud, then why should we trust the same cloud to hold the encryption keys?
Microsoft recently announced a new solution to this dilemma using HSMs (Hardware Security Modules) within its Windows Azure cloud – so that an enterprise customer can use its own internal HSM to produce a master key that is then transmitted to the HSM within the Windows Azure cloud. This provides secure encryption while the data is in the cloud, but it also means that not even Microsoft itself can read the data, because it does not hold the master key, which stays hidden in the enterprise HSM.
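The principle behind such HSM-based BYOK schemes is key wrapping, sometimes called envelope encryption: data is encrypted under a data key, and the data key is itself encrypted ("wrapped") under a master key that never leaves the customer's premises. The sketch below is a hedged, toy illustration of that flow – the XOR function stands in for the real HSM ciphers (typically AES or RSA), and the variable names are illustrative, not Microsoft's API.

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for real HSM encryption. Illustrative only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Customer side: the master key is generated in the on-premises HSM
# and, in a real BYOK scheme, never leaves it in cleartext.
master_key = os.urandom(16)

# Cloud side: a per-object data key encrypts the actual data.
data_key = os.urandom(16)
ciphertext = xor_bytes(b"quarterly figures", data_key)

# The data key is wrapped under the master key before storage, so the
# provider holds only ciphertext + wrapped_key and can read neither.
wrapped_key = xor_bytes(data_key, master_key)

# Decryption needs the customer's master key to unwrap the data key first.
unwrapped = xor_bytes(wrapped_key, master_key)
assert xor_bytes(ciphertext, unwrapped) == b"quarterly figures"
```

The design point the article makes falls straight out of this structure: a subpoena served on the provider yields only `ciphertext` and `wrapped_key`, both useless without the master key held by the data's owner.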
It is not so much that the enterprise cannot trust Microsoft to protect its data from attack; it is more to do with growing legal complexities. In the wake of the Snowden revelations, it has become clear that even the most well protected data might be at risk from a government or legal subpoena demanding that its content be revealed. Under this BYOK system, however, Microsoft cannot be forced to reveal the enterprise's secrets because it cannot access them itself, and the responsibility lies only with the owner.
This is increasingly important because of other legal pressures that insist on restricting access to certain types of data. A government can, for example, forbid anyone from allowing data of national importance to leave the country – not a simple matter in a globally connected IP network. There are also increasing legal pressures on holders of personal data to guarantee levels of privacy.
Instinctively it feels a lot more secure to manage your own key and use BYOK rather than leaving it to the cloud provider. As long as it is backed by a suitable and strict in-house HSM-based security policy, this is one instinct that can be trusted.