How We Discovered an Attack in Copilot's File Permissions

Knostic researchers discovered how file permissions could be bypassed using Microsoft 365 Copilot.

Critical Gap in Microsoft 365 Copilot 

We discovered a critical gap in Microsoft 365 Copilot: likely due to a delay between file permission updates and Copilot's sync process, users could access sensitive file details they no longer had permission to see.

Microsoft limits what Copilot knows by restricting it to the documents the requesting user can access. However, when a user's access to a file is revoked, there is a delay before Copilot registers that change.

The end result: while you can no longer see the file, Copilot still can, and with the right prompts it will tell you what's inside it. Through prompting, you could access data that should have been restricted.
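To make the failure mode concrete, here is a minimal, self-contained sketch in plain Python (no Microsoft APIs; the file name, contents, and function names are all illustrative). It models an assistant that answers from an index whose permissions were snapshotted at sync time rather than checked at query time.

```python
# Sketch of a permission-sync race (illustrative names; no real Microsoft APIs).

# Live source of truth for who may read each file (think SharePoint ACLs).
live_acl = {"salary_review.xlsx": {"alice", "bob"}}

# The assistant's search index, rebuilt by a periodic sync job.
# It stores content plus a *snapshot* of permissions taken at sync time.
assistant_index = {}

def sync_index():
    """Copy content and a snapshot of the current ACL into the index."""
    for doc, readers in live_acl.items():
        assistant_index[doc] = {
            "content": "Q3 raises: Alice +8%, Bob +3%",
            "readers_at_sync": set(readers),  # stale as soon as the ACL changes
        }

def ask_assistant(user, doc):
    """Answer from the index using the snapshotted permissions, not the live ACL."""
    entry = assistant_index.get(doc)
    if entry and user in entry["readers_at_sync"]:
        return entry["content"]
    return "Sorry, you don't have access to that file."

sync_index()

# An admin revokes Bob's access in the live system...
live_acl["salary_review.xlsx"].discard("bob")

# ...but until the next sync, the index still believes Bob can read the file.
print(ask_assistant("bob", "salary_review.xlsx"))  # leaks the content

sync_index()  # the sync eventually catches up
print(ask_assistant("bob", "salary_review.xlsx"))  # now correctly denied
```

Run as written, the first query hands the file's contents to the revoked user; only after the next sync does the denial kick in, which mirrors the window described above.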

Now that Microsoft has fixed the issue, we're sharing our proof of concept.

Why does this matter?

1. Oversharing and data leakage are major blockers for enterprise adoption of AI search tools like Copilot & Glean.

2. Even without direct vulnerabilities, LLMs must be built to respect dynamic permissions in real time (see the sketch after this list).

3. Organizations need solutions that prevent data from leaking through AI assistants.
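One way to close that window, sketched below under the same illustrative assumptions (this is not a description of Microsoft's actual fix), is to consult the live ACL on every query instead of trusting permissions captured at index time.

```python
# Sketch of query-time permission enforcement (illustrative names only).

live_acl = {"salary_review.xlsx": {"alice"}}  # Bob's access already revoked

assistant_index = {
    "salary_review.xlsx": {"content": "Q3 raises: Alice +8%, Bob +3%"},
}

def ask_assistant(user, doc):
    """Check the live ACL at answer time; the index holds content only."""
    if user not in live_acl.get(doc, set()):
        return "Sorry, you don't have access to that file."
    return assistant_index[doc]["content"]

print(ask_assistant("bob", "salary_review.xlsx"))    # denied, even with a stale index
print(ask_assistant("alice", "salary_review.xlsx"))  # allowed
```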

Protect Your Organization's Data with Knostic

At Knostic, we're tackling the problem of LLM oversharing—ensuring AI tools like Microsoft 365 Copilot and Glean don’t reveal more than they should.