


The headline is a bit dramatic, but it was hard to pass up the “Shadow” reference. Shadow AI refers to employees covertly using generative AI tools at work without IT, HR, and other departments knowing about it. A recent report found that 27.4% of the content employees fed into AI tools like ChatGPT, Claude, Copilot, and Gemini was sensitive, a 10.7% increase from a year ago. The most common sensitive data types shared with AI tools are customer support material (16.3%), source code (12.7%), research and development material (10.8%), and unreleased marketing material (6.6%). HR and employee records represent 3.9% of sensitive information uploaded to AI chatbots, including confidential HR details, employee compensation, and medical issues. Neville and Shel examine the problem and the role internal communicators can play in this short midweek episode.
Links from this episode:
The next monthly, long-form episode of FIR will drop on Monday, July 29.
We host a Communicators Zoom Chat most Thursdays at 1 p.m. ET. To obtain the credentials needed to participate, contact Shel or Neville directly, request them in our Facebook group, or email [email protected].
Special thanks to Jay Moonah for the opening and closing music.
You can find the stories from which Shel’s FIR content is selected at Shel’s Link Blog. Shel has started a metaverse-focused Flipboard magazine. You can catch up with both co-hosts on Neville’s blog and Shel’s blog.
Disclaimer: The opinions expressed in this podcast are Shel’s and Neville’s and do not reflect the views of their employers and/or clients.
The post FIR #419: Is Shadow AI an Evil Lurking in the Heart of Your Company? appeared first on FIR Podcast Network.
By Neville Hobson and Shel Holtz
