Companies now collect more data than ever before. They process it and turn it into actionable insight, and as a result data is highly valuable; some say it is becoming more valuable than money itself. It’s increasingly the case that breaches impact a business’s ability to function, and can lead to punitive penalties for any loss of personal data. Data breaches can do real harm to human lives and livelihoods, and the average cost of a breach now stands at $4.24 million, higher than ever before.
Graham Hunter is VP of Skills at CompTIA.
Today, more people have more capable devices than ever, transmitting more data than ever. Using an employer-issued device to stream or download can open the door to a security breach across an entire organization. What’s more, millions of roles, including key tech jobs, are now sitting vacant. Critically, this means user accounts are also sitting dormant, affording hackers more leeway to experiment and learn from each hack; at small organizations, an intrusion can go undetected for considerable periods of time. One can easily see, then, how today’s high-churn, low-retention digital business environment lends itself to increasing risk. The question is this: beyond cursory investments in cybersecurity, how are organizations keeping up with the pace of change and the changing face of threats? Are they keeping up? Can they keep up?
Investment in highly skilled people
Companies are facing an increased burden of risk when it comes to security. That burden raises an important point: we should all be thinking about cybersecurity less as an investment in the technology itself, and more as an investment in the knowledge and competency of the people in tech roles. Having highly skilled people isn’t just part of a strategy; it is the strategy, and without those people everything falls apart. Admittedly, finding and keeping capable and driven tech workers today may seem easier said than done.
Especially since the move towards remote working, the roles and responsibilities of workers are constantly shifting. As such, the breadth and depth of skills required of the workforce, and of key tech roles in particular, is an enormous concern and a source of confusion for businesses. Many HR departments struggle to understand how to hire for the requisite competencies and staff cybersecurity roles. Tech team managers similarly struggle to get everyone working from the same ever-changing playbook, which can lead to serious quality and performance issues. Beyond the challenge of staffing cybersecurity roles, risk-averse businesses have been slow to adopt emerging technologies like blockchain and AI-enabled tools, fearing that a poorly managed adoption will hinder the business or hurt their reputation. It’s easy to understand their conundrum: if you’re struggling to secure your house, is it wise to fill it with more valuables, or to build entire rooms you can’t see into?
For cybersecurity, going it alone no longer works
Too many organizations are still trying to go it alone when it comes to cybersecurity, with teams working in isolation and relying on best guesses to keep them safe. Today’s cyber teams must fully understand digital best practices and maintain a continually evolving understanding of cyber hygiene (things like zero-trust policies and two-factor authentication), because the norms themselves are changing rapidly. Industry-recognized training and certifications, as well as on-ramps like apprenticeships, can remove the guesswork from cybersecurity upskilling and ensure access to the most up-to-date tools and techniques.
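To make one of those hygiene basics concrete: most authenticator apps used for two-factor authentication generate time-based one-time passwords (TOTP, standardized in RFC 6238) by applying HMAC to the current 30-second time interval with a shared secret. The sketch below, in Python using only the standard library, is purely illustrative; the function name and parameters are our own, not taken from any particular product or library.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Compute an RFC 6238 time-based one-time password.

    secret_b32: the shared secret, base32-encoded (as shown in QR setup keys).
    for_time:   Unix timestamp to compute the code for (defaults to now).
    """
    key = base64.b32decode(secret_b32, casefold=True)
    # Counter = number of whole time-steps since the Unix epoch.
    counter = int((time.time() if for_time is None else for_time) // step)
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation: low 4 bits of last byte choose a 4-byte window.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because both sides derive the code from the same secret and clock, the server can verify a user's code without any password crossing the network, which is what makes this a useful second factor.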
Getting everyone on the same page and reading from the same playbook is a first step, and gives organizations a fighting chance to weather inevitable cyber storms. When organizations make early and sound investments in the training and upskilling of their tech workers, they free up resources to use tech not merely as a defense tactic, but as a strategy for growth and evolution. When they don’t, they run the very real risk not only of becoming a target, but of losing relevance in an increasingly digital, connected, and data-driven business world.
The tech industry is constantly evolving, and the landscape can feel overwhelming for decision-makers in IT. However, the need for upskilling as the driving force behind any organization's cyber strategy has never been clearer, and economies of scale offer a way to do it right. No tech team should have to reinvent the wheel every time it specs out a new tech job ad or designs a training program for newly onboarded IT or security managers. The competencies, training tools, and certifications already exist, and the legwork of ensuring those standards are bulletproof has been done. Pathways like apprenticeships are an excellent way to ensure that learning happens consistently and leads to skills that can get the job done.
Once workers are on the job, training can and should happen multiple times a year, if not continually, rather than every two to three years, an unfortunately common but outdated approach to tech upskilling. Nurturing internal talent in this way, or by offering apprentices a job at the end of their apprenticeship, is not only smart for cybersecurity; it’s a sound strategy for talent development and retention. It also ensures the consistent presence of skilled teams and reduces the need to hire new staff. In short, keeping the focus on upskilling existing people is smarter and more efficient than focusing on new hires.
If you're interested in IT courses that include knowledge of cybersecurity practices, we feature the best IT development training courses.