Microsoft says Office bug exposed customers’ confidential emails to Copilot AI

Why it matters: The incident underscores the need for robust security controls and transparency in AI development; it erodes trust in AI-powered tools and raises questions about how sensitive data is handled.
- Microsoft confirmed a bug (CW1226324) that allowed Copilot to summarize confidential emails without permission, affecting Microsoft 365 customers using Copilot Chat.
- Bleeping Computer first reported the bug, revealing that Copilot Chat had been able to read and summarize email contents since January, despite data loss prevention (DLP) policies being in place.
- The European Parliament's IT department blocked built-in AI features on lawmakers' devices due to concerns about confidential correspondence being uploaded to the cloud, reflecting broader anxieties about AI data security.
A Microsoft Copilot bug exposed confidential customer emails to the assistant's AI summarization feature for weeks, even when data loss prevention policies were in place. The incident, confirmed by Microsoft and first reported by Bleeping Computer, raises serious concerns about data privacy and the security of AI-powered tools, and has prompted the European Parliament's IT department to block built-in AI features on lawmakers' work devices.


