A Microsoft bug allows Copilot Chat to summarize customers’ personal emails.
Microsoft says an Office bug has allowed its Copilot AI to summarize customers’ confidential emails without authorization, as first reported by BleepingComputer. The bug has let Copilot Chat read email content since late January. It bypassed Data Loss Prevention (DLP) policies, which are meant to keep sensitive information out of large language models such as Microsoft’s. The tech giant is working on a fix.
Copilot Chat
Copilot Chat is available to paying Microsoft 365 customers. The AI-powered chatbot assists users in Office applications such as Word, Excel, and PowerPoint. For example, the chat function can retrieve content from a user’s sent items and drafts folders to answer questions.
Microsoft stated that the bug, tracked as CW1226324, causes email messages “with a sensitivity label to be incorrectly processed by Microsoft 365 Copilot chat,” according to TechCrunch. The company reportedly began rolling out a fix in early February, and now says it is monitoring the rollout and checking with users to confirm the solution is working.
It is not yet clear how many users or organizations are affected, or when the fix will be fully deployed.
