Is 'Shadow AI' a threat to your business? Report claims workers are increasingly willing to cut corners and take risks


  • 58-59% of workers admit to using shadow AI at work
  • Datasets, employee names and finances are all shared with unapproved tools
  • Could IT teams meet workers where they are to ensure better compliance?

With AI tools now a common sight in many businesses, new research from BlackFog has flagged that while most (86%) employees say they now use AI for work tasks at least weekly, three-fifths (58%) admit they're using unapproved AI or free, publicly available tools instead of company-provisioned tools, putting their business at risk.

Company-provisioned tools are important for providing enterprise-grade security, governance and privacy protections, but many workers complain that the AI they're given isn't suitable for what they need.

More importantly, 63% believe it's acceptable to use AI without IT approval, and 60% agree unapproved AI is worth the security risk if it helps them meet deadlines, suggesting a clear disconnect between company goals and how they are communicated to staff.

Shadow AI is rife in employee workflows

Shadow AI "should raise red flags for security teams and highlights the need for greater oversight and visibility into these security blind spots," BlackFog CEO Dr Darren Williams wrote.

This comes as 33% of workers admit to having shared research or datasets with unapproved AI, 27% have shared employee data like names, payroll or performance, and 23% have shared financial or sales data.

But while it might be on IT teams to double down on rules and expectations when it comes to AI, they face an uphill battle: C-suite executives and senior leaders are more likely than junior and administrative staff to believe speed outweighs privacy and security.

And BlackFog isn't the only company to have revealed widespread shadow AI usage – Cybernews also found that 59% of workers use unapproved AI at work, and that an even higher 75% of those users have shared sensitive information with these unapproved tools.

Similarly, the report found that 57% of workers' direct managers support the use of unapproved AI. "That creates a gray zone where employees feel encouraged to use AI, but companies lose oversight of how and where sensitive information is being shared," Security Researcher Mantas Sabeckis warned.

Looking ahead, there are two clear ways to stamp out shadow AI. Firstly, IT teams must reiterate the risks involved and guide users to approved tools. Secondly, because current approved tools evidently aren't suitable for many workers, IT teams should meet them where they are and offer enterprise-grade versions of the apps they already favor.


With several years’ experience freelancing in tech and automotive circles, Craig’s specific interests lie in technology that is designed to better our lives, including AI and ML, productivity aids, and smart fitness. He is also passionate about cars and the decarbonisation of personal transportation. As an avid bargain-hunter, you can be sure that any deal Craig finds is top value!
