

No. With commercial Microsoft 365 Copilot, your prompts, responses, and the data Copilot retrieves from your tenant stay inside your tenant's trust boundary, and none of it is used to train public LLMs.
We take a surgical approach. Rather than breaking workflows by locking content down wholesale, we remove broad "global" permissions (such as "Domain Users") and replace them with specific Microsoft 365 Groups. This preserves workflow efficiency while ensuring Copilot respects the intended security boundaries.
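The remediation logic can be sketched as a simple transformation over a permission list. This is an illustrative model only, not the Microsoft Graph API: the ACL shape, principal names, and the `remediate_acl` helper are all hypothetical.

```python
# Hypothetical sketch: replace broad "everyone"-style permission entries
# with a single scoped Microsoft 365 Group entry. Data shapes are illustrative.

BROAD_PRINCIPALS = {"Domain Users", "Everyone", "Everyone Except External Users"}

def remediate_acl(acl: list[dict], replacement_group: str) -> list[dict]:
    """Drop broad grants; re-grant the same access level to one named group.

    Each ACL entry is a dict like {"principal": str, "role": str}.
    """
    scoped = [e for e in acl if e["principal"] not in BROAD_PRINCIPALS]
    removed_roles = {e["role"] for e in acl if e["principal"] in BROAD_PRINCIPALS}
    if removed_roles:
        # Preserve the highest access level that was granted broadly,
        # but grant it only to the specific Microsoft 365 Group.
        role = "edit" if "edit" in removed_roles else "read"
        scoped.append({"principal": replacement_group, "role": role})
    return scoped

site_acl = [
    {"principal": "Domain Users", "role": "read"},
    {"principal": "Finance Owners", "role": "edit"},
]
print(remediate_acl(site_acl, "Finance Team (M365 Group)"))
```

The point of the sketch is that access is narrowed, not removed: users keep working through their group membership, while Copilot can no longer surface content through an organization-wide grant.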
Beyond the commercial Microsoft 365 licenses, you need your apps on the "Current" update channel, Entra ID accounts for your users, and a healthy Semantic Index. Our audit validates each of these elements.
No. You do not need an E5 license to run the assessment. However, we will advise on whether upgrading to E5 is cost-effective for your organization, since it unlocks automated security features such as auto-labeling.
Leaving ROT (redundant, obsolete, and trivial) data in place creates a "garbage in, garbage out" scenario: Copilot may cite outdated documents (e.g., a superseded policy from 2019) when answering current questions, leading to misleading answers and poor decision-making.
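The stale-content problem above can be made concrete with a small sketch. The cutoff date, document shape, and `flag_stale` helper are illustrative assumptions, not part of any Copilot tooling:

```python
from datetime import date

# Hypothetical ROT screening: flag documents last modified before a chosen
# cutoff so they can be archived or deleted before Copilot indexes them.
ROT_CUTOFF = date(2021, 1, 1)  # illustrative threshold

def flag_stale(docs: list[dict]) -> list[dict]:
    """Return documents last modified before the cutoff (e.g., a 2019 policy)."""
    return [d for d in docs if d["modified"] < ROT_CUTOFF]

library = [
    {"name": "Travel Policy 2019.docx", "modified": date(2019, 3, 4)},
    {"name": "Travel Policy 2024.docx", "modified": date(2024, 6, 1)},
]
print([d["name"] for d in flag_stale(library)])
```

Running a screen like this before rollout means Copilot's answers draw only on the current policy, not its superseded predecessor.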
Yes. As part of the AI Governance Strategy deliverable, we provide a draft "Acceptable Use Policy" to help you manage how employees interact with both Copilot and public AI tools.