A practical, human-friendly guide for helpdesk and support teams
Microsoft Copilot is becoming a daily companion for knowledge workers—drafting documents, summarizing chats, automating tasks, and surfacing insights across Microsoft 365. But as helpful as Copilot can be, it is equally capable of confusing and frustrating users when something goes wrong.
If you work in a helpdesk, service desk, or support role, you’ve likely already seen questions like:
- “Why does Copilot say I’m not grounded?”
- “Why is Copilot telling me I don’t have permissions when I do?”
- “Copilot can’t see files that I can see—what’s going on?”
These questions are becoming as common as “I forgot my password,” and support teams need solid troubleshooting steps to avoid digging through vague documentation or escalating unnecessarily.
This guide walks you through the two most common Copilot failures—Not Grounded and Permission Denied—and explains what they mean, why they happen, and how to trace the root cause. I’ll keep this guide practical, focused, and friendly for real-world support teams.
Why Copilot Errors Are Different from Regular Microsoft 365 Errors
Traditional Microsoft 365 errors usually trace back to something tangible: a missing license, a misconfigured SharePoint library, a mailbox issue, and so on.
Copilot errors are trickier because:
- Copilot doesn’t store user data—everything flows through existing services (Graph, SharePoint, OneDrive, Exchange, Purview).
- Copilot depends heavily on Microsoft Graph to confirm what the user can actually access.
- Data access is governed by the user’s permissions in Microsoft 365 and organization-wide compliance settings, including sensitivity labels, DLP policies, retention rules, and Purview controls.
This means any Copilot error is often a symptom of an issue elsewhere.
1. Understanding “Not Grounded” Errors
What users see
Depending on the Copilot experience (M365, Teams, Outlook, Word), users may see variations like:
- “The response could not be grounded in your organization’s data.”
- “Copilot couldn’t verify access to the requested content.”
- “Your request didn’t meet the grounding requirements.”
Users interpret this as: “Copilot doesn’t understand me” or “Copilot is broken”.
What “Not Grounded” really means
“Grounding” is Copilot’s process of validating that an answer is based on real, accessible data. If Copilot can’t confirm the data source—or if it can’t retrieve the data—it will refuse to answer.
In short: Not Grounded = Copilot couldn’t access or verify the data needed to answer.
Typical causes
Below are the most common root causes seen in helpdesk scenarios:
1. The user asked Copilot something too vague or too broad
Examples:
- “Summarize everything said in all Teams chats about Project Phoenix.”
- “List every sensitive file in the company.”
When no specific source or context exists, Copilot cannot ground the answer.
Fix: Encourage users to reference a specific file, chat, email, or location.
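To make this concrete, here is a tiny heuristic a support team could run over a user's prompt before re-trying. This is purely illustrative: Copilot's real grounding logic is internal to Microsoft and far more sophisticated, and the marker lists below are assumptions chosen to match the examples above.

```python
# Illustrative heuristic only -- not Copilot's actual grounding check.
# Flags prompts that name no concrete source, like the examples above.

VAGUE_MARKERS = ("everything", "every", "all files", "the company")
SPECIFIC_MARKERS = ("attached", "this file", "this chat", "this email", ".docx", ".xlsx", "http")

def looks_groundable(prompt: str) -> bool:
    """Return True if the prompt appears to reference a concrete source."""
    p = prompt.lower()
    if any(marker in p for marker in SPECIFIC_MARKERS):
        return True
    # No concrete source named: fail if the request is also sweeping.
    return not any(marker in p for marker in VAGUE_MARKERS)
```

A helper like this could pre-screen prompts collected in tickets and suggest a rephrase before anyone files an escalation.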
2. Data is in a location Copilot can’t access
Common culprits:
- Files stored locally instead of OneDrive/SharePoint
- Old SharePoint classic sites
- Personal email folders not indexed by Graph
- Files stored in a network drive not synced to the cloud
Fix: Ask the user where the data is stored. Verify cloud indexing.
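The location check above can be reduced to a simple lookup that triage scripts could reuse. The supported/unsupported split mirrors the lists in this guide; exact support varies by tenant and Copilot surface, so treat this as a triage aid rather than an authoritative matrix.

```python
# Triage aid only -- reflects the locations discussed in this guide,
# not an official Microsoft support matrix.

SUPPORTED = {"onedrive", "sharepoint", "teams", "exchange"}
NOT_SUPPORTED = {"local disk", "network drive", "pst file", "classic sharepoint"}

def grounding_supported(location: str) -> bool:
    """True if the named storage location is one Copilot can ground against."""
    return location.strip().lower() in SUPPORTED
```

Unknown answers deliberately return False here (fail closed), which matches the instinct in Step 2 later in this guide: if the data isn't clearly in a supported cloud location, that's the first thing to rule out.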
3. The user’s permissions don’t match what they think they have
Sometimes a user “can see a file” only because:
- A link was shared with them temporarily
- They had access but lost it recently
- They opened the file in the past and are seeing a cached copy
Graph sees the current permissions, not historical ones.
Fix: Check actual SharePoint/OneDrive/Teams permissions for the item.
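The "current, not historical" point can be modeled directly. The sketch below filters a list of permission grants down to those valid right now, the way an expired sharing link drops out of a user's effective access. The grant dictionary shape is a hypothetical simplification for illustration, not the actual Microsoft Graph permissions schema.

```python
from datetime import datetime, timezone

def effective_access(grants, now=None):
    """Filter permission grants down to those valid at this moment.

    Each grant is a simplified dict like {"user": ..., "expires": datetime | None}
    (hypothetical shape, not the Graph API schema). An expired sharing
    link no longer counts, even if the user opened the file while it
    was still valid -- which is exactly why "but I opened it last week"
    doesn't mean Copilot can access it today.
    """
    now = now or datetime.now(timezone.utc)
    return [g for g in grants if g.get("expires") is None or g["expires"] > now]
```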
4. Sensitivity labels or DLP rules block Copilot access
Purview can restrict:
- External access
- Copy/paste behavior
- AI and third-party interaction
- Indexing and search visibility
If a file is labeled “Highly Confidential – Finance Only,” Copilot may not be able to use it even if the user can open it.
Fix: Review Purview sensitivity labels or ask your compliance admin to verify rules.
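As a mental model, label gating behaves like a lookup that fails closed. Real label behavior is configured in Microsoft Purview by a compliance admin, not in code, and the label names below are hypothetical; this sketch only captures the key intuition that "user can open it" and "Copilot can use it" are separate decisions.

```python
# Hypothetical label policy for illustration -- real rules live in
# Microsoft Purview, and these label names are invented examples.
LABEL_ALLOWS_COPILOT = {
    "General": True,
    "Confidential": True,
    "Highly Confidential - Finance Only": False,
}

def copilot_can_use(label: str) -> bool:
    """Unknown labels default to blocked (fail closed)."""
    return LABEL_ALLOWS_COPILOT.get(label, False)
```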
2. Understanding “Permission Denied” Errors
These look more familiar to users but are still confusing:
- “You don’t have permission to access this content.”
- “Copilot cannot retrieve the requested information due to insufficient permissions.”
- “We couldn’t access this document.”
Users interpret this as: “But I can open the file—why can’t Copilot?”
What it really means
Copilot uses Microsoft Graph to validate access, and that check is stricter than simply being able to open a file.
A Permission Denied error happens when Graph concludes one of two things:
- The user does not have sufficient permissions to the requested data.
- The service storing the data blocked Copilot from retrieving it.
Even if the user can open the file, Graph may still block access if:
- Permissions were recently changed (Graph index lag).
- The file is in a location not fully indexed.
- The storage location is not supported for Copilot grounding.
3. How Support Staff Should Troubleshoot These Errors
Here is a practical workflow that support teams can use without needing developer-level knowledge.
Step 1: Start with the user’s phrasing
Ask:
“What exactly did you ask Copilot?”
Look for:
- Broad requests (“summarize everything…”)
- Missing context
- References to data Copilot cannot see
If needed, help them rephrase:
“Summarize the attached file” → grounded
“Summarize all files about Project Phoenix” → not grounded
Step 2: Ask where the source data lives
The fastest filter:
“Where is the information stored?”
Check:
- OneDrive
- SharePoint
- Teams storage
- Exchange mailbox
- Personal PST files (not supported)
- Network drives (not supported)
If it’s not in the cloud → that’s the issue.
Step 3: Verify user permissions
Even if the user thinks they have access, always verify:
- SharePoint/Teams membership
- File/folder permissions
- Whether the file was shared via a link that expired
- Whether the file recently changed location
Step 4: Check Purview policies (often the real culprit)
Look for:
- Sensitivity labels preventing indexing
- DLP rules blocking AI interaction
- Restrictions on specific departments
- Conditional Access policies limiting Graph access from certain devices
If the user recently got a new device or is using a personal device, CA policies may deny Copilot access even when the user can open the file.
Step 5: Consider Microsoft Graph indexing delays
If data or permissions were changed in the last few minutes, Copilot may not see updates yet.
Indexing can take:
- Minutes for small items
- Up to an hour for large changes
- Longer for massive structural moves in SharePoint
A simple “wait 20–30 minutes and try again” can genuinely fix the issue.
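The "wait and retry" advice can be automated with a simple polling helper. In real support work the waits would be 10–30 minutes, not seconds; the short defaults here just keep the sketch runnable. The `check` callable is whatever re-test makes sense for the ticket (for example, re-running the user's prompt).

```python
import time

def retry_after_indexing(check, attempts=3, wait_seconds=2):
    """Retry a check a few times, sleeping between tries.

    `check` is any callable returning True once Graph has caught up.
    Real-world waits for index propagation are minutes to an hour,
    so tune `wait_seconds` accordingly.
    """
    for attempt in range(attempts):
        if check():
            return True
        if attempt < attempts - 1:
            time.sleep(wait_seconds)
    return False
```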
4. When to Escalate
Escalation is appropriate when:
- User permissions look correct
- Data is in a supported cloud location
- No sensitivity labels or Purview rules apply
- Indexing time has passed
- Other users can perform the same request successfully
In this scenario, gather:
- The exact prompt
- Timestamp
- URL of the file or location (no file content)
- User’s UPN
- Screenshot of the error
This is usually enough for a Microsoft 365 admin or security engineer to pinpoint the underlying issue.
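If your team scripts its ticket handoffs, the checklist above maps to a small record builder. This is a sketch of one possible ticket shape, not a required format; note that it deliberately carries only the item's URL, never file content.

```python
def build_escalation(prompt, timestamp, item_url, upn, screenshot_path):
    """Bundle the escalation checklist into one record.

    Carries the item URL only -- no file content is collected,
    matching the guidance above.
    """
    return {
        "prompt": prompt,
        "timestamp": timestamp,
        "item_url": item_url,
        "upn": upn,
        "screenshot": screenshot_path,
    }
```

A usage example with invented placeholder values:

```python
ticket = build_escalation(
    "Summarize the Q3 budget file",
    "2025-01-15T10:42:00Z",
    "https://contoso.sharepoint.com/sites/finance/q3-budget.xlsx",  # hypothetical URL
    "user@contoso.com",
    "copilot-error.png",
)
```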
As Copilot becomes more deeply embedded in everyday workflows, helpdesk teams need reliable ways to make sense of these new AI-related errors. The good news is that most “Not Grounded” and “Permission Denied” messages are not AI failures—they’re simply symptoms of permission issues, unsupported data locations, or compliance rules doing their job.
By following a consistent troubleshooting approach—examining phrasing, verifying data location, checking permissions, reviewing Purview policies, and allowing time for Graph indexing—support staff can resolve the vast majority of issues without escalation.