Data Loss Prevention (DLP) for Copilot

Video Tutorial

How to configure Purview DLP to prevent data leakage in Copilot interactions. A step-by-step guide for government agencies to set guardrails without blocking productivity.

8:00 · January 14, 2026 · Security, Compliance, IT

Overview

If you have a policy that says “Don’t share PII with external users,” enabling Copilot doesn’t change that policy. But it does change how easily users can access, summarize, and move that data.

This video explains how to use Microsoft Purview Data Loss Prevention (DLP) to enforce your data handling rules within Copilot. We’ll cover the difference between permissions and DLP, how to configure a policy for government data, and how to test it effectively.

What You’ll Learn

  • How DLP interacts with Copilot (blocking, warning, and auditing)
  • The step-by-step process to extend existing policies to Copilot
  • Critical design choices for government environments (blocking vs. user education)
  • How to validate your controls with real-world test cases

Script

Hook: DLP is how you turn policy intent into enforcement

If your agency has a rule that says “Don’t email Social Security Numbers to the public,” that rule applies whether you type the email yourself or ask AI to write it for you.

The question is: does your technology enforce that rule when the content is being generated by AI?

That is where Microsoft Purview DLP comes in. It’s the difference between hoping users follow the rules and technically ensuring they do.

What DLP does in Copilot scenarios

DLP is policy-based detection and control. It looks for specific sensitive information types—like SSNs, credit card numbers, or government-specific patterns—and applies a rule.

In a Copilot context, you can use DLP to:

  • Warn users when they are about to access or generate sensitive content.
  • Block Copilot from using certain sensitive documents to generate a summary.
  • Prevent Copilot from generating a response that includes restricted data.

It’s important to set expectations: DLP doesn’t replace permissions. Permissions control access. DLP controls what you do with the data once you have access.
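That layering can be sketched in a few lines. This is a conceptual model only, not a real Purview API: all of the function names, group names, and label pairs below are made up to show that permissions gate *access* while DLP gates what an action may *do* with the content.

```python
# Conceptual sketch: permissions decide access, DLP decides handling.
# Nothing here is a real Purview API; names and rules are illustrative.

def has_permission(user_groups: set, doc_acl: set) -> bool:
    """Permission layer: can this user open the document at all?"""
    return bool(user_groups & doc_acl)

def dlp_allows(action: str, doc_labels: set) -> bool:
    """DLP layer: is this action allowed on content with these labels?"""
    blocked = {("summarize", "CUI"), ("external_share", "PII")}
    return not any((action, label) in blocked for label in doc_labels)

def copilot_can(action, user_groups, doc_acl, doc_labels) -> bool:
    # Both layers must pass: access first, then handling rules.
    return has_permission(user_groups, doc_acl) and dlp_allows(action, doc_labels)

# A user with full read access can still be blocked from a risky action:
print(copilot_can("summarize", {"analysts"}, {"analysts"}, {"CUI"}))  # False
print(copilot_can("read", {"analysts"}, {"analysts"}, {"CUI"}))       # True
```

The point of the model: removing DLP would not grant anyone new access, and tightening permissions would not stop an authorized user from mishandling data they can already see.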

Prereqs and design choices

Before you open the console, you need to make two design choices.

First: What are you protecting? Are you looking for PII? Financial data? CUI (Controlled Unclassified Information)? You need to define your Sensitive Information Types clearly.

Second: What is your control posture? Do you want to Block immediately? Or do you want to Warn and educate?

For our government customers, my recommendation is: Start with a pilot group and high-confidence detections. False positives kill user adoption. If Copilot gets blocked every time someone types a harmless acronym, they will stop using it.
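What "high-confidence detection" means can be shown with a toy detector. This sketch is not how Purview's built-in U.S. SSN type is implemented, but it mirrors the idea behind it: a bare pattern match is low confidence, while a pattern match near a corroborating keyword is high confidence, and the high-confidence tier is what you pilot with.

```python
import re

# Toy sketch of confidence tiers, not Purview's actual SSN definition.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
KEYWORDS = ("ssn", "social security", "soc sec")

def detect_ssn(text: str) -> str:
    """Return 'none', 'low', or 'high' confidence for an SSN-like hit."""
    if not SSN_PATTERN.search(text):
        return "none"
    if any(k in text.lower() for k in KEYWORDS):
        return "high"   # pattern plus supporting evidence nearby
    return "low"        # pattern alone: the main source of false positives

print(detect_ssn("Part number 123-45-6789 is backordered"))  # low
print(detect_ssn("Applicant SSN: 123-45-6789"))              # high
```

A nine-digit part number trips the bare pattern but not the high-confidence tier, which is exactly the kind of false positive that erodes adoption if you block on it.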

Build the policy (walkthrough)

Let’s walk through the policy build.

  1. Select your Sensitive Information Types. Use the built-in definitions for U.S. PII or create custom ones for your agency’s project codes.
  2. Define the Scope. Apply the policy to the locations where Copilot interacts—Exchange, SharePoint, OneDrive, and Teams.
  3. Configure the Action.
    • Start with Audit only to see what would happen.
    • Then move to Policy Tips to educate users (“This document contains sensitive data…”).
    • Finally, enable Restrict Access to block the behavior for high-risk data.
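The three-stage rollout above can be modeled as a small decision function. This is a hypothetical sketch of the decision, not how Purview stores policy settings (those live in the portal or compliance PowerShell); the stage and confidence names are assumptions for illustration.

```python
# Hypothetical model of the staged rollout: audit -> policy tips -> restrict.
ROLLOUT_STAGES = ["audit", "policy_tip", "restrict"]

def apply_rule(stage: str, confidence: str) -> str:
    """Return the enforcement outcome for a detection at a given stage."""
    if stage == "audit":
        return "log_only"      # observe what *would* happen, block nothing
    if stage == "policy_tip":
        return "warn_user"     # educate with a tip, still don't block
    if stage == "restrict":
        # Block only high-confidence hits; warn on the rest to limit
        # false-positive blocking.
        return "block" if confidence == "high" else "warn_user"
    raise ValueError(f"unknown stage: {stage}")

print(apply_rule("audit", "high"))     # log_only
print(apply_rule("restrict", "low"))   # warn_user
print(apply_rule("restrict", "high"))  # block
```

Note that even in the final stage, low-confidence detections fall back to a warning rather than a block; that is the guardrail-without-blocking-productivity posture.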

Test cases you should run

You can’t just turn it on and walk away. You have to test it. Here are the tests I recommend:

  1. The Summary Test: Ask Copilot to summarize a document that you know contains sensitive data. Does the policy trigger?
  2. The Draft Test: Ask Copilot to write an email that includes a fake Social Security Number. Does it stop you?
  3. The Export Test: Try to copy a sensitive Copilot response into an email to an external address.
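It helps to track those three runs in a simple checklist. The sketch below assumes a policy already in blocking mode, so each test's expected outcome is "block"; the observed outcomes are whatever you see when you run the prompts by hand and check the DLP reports.

```python
# Assumed test matrix for a policy in blocking mode; names and the
# "expected" values are illustrative, not Purview output.
TEST_CASES = [
    {"name": "summary", "prompt": "Summarize a doc with known sensitive data",
     "expected": "block"},
    {"name": "draft",   "prompt": "Draft an email with a fake SSN",
     "expected": "block"},
    {"name": "export",  "prompt": "Copy a sensitive response to external email",
     "expected": "block"},
]

def failing_tests(observed: dict) -> list:
    """Compare observed outcomes to expected ones; return failing test names."""
    return [c["name"] for c in TEST_CASES
            if observed.get(c["name"]) != c["expected"]]

# Example run: the export path was not yet covered by the policy scope.
observed = {"summary": "block", "draft": "block", "export": "allow"}
print(failing_tests(observed))  # ['export']
```

A gap like the export case usually means a location (for example, Exchange) is missing from the policy scope, which is step 2 of the build.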

Monitor, tune, and operationalize

Once you are live, you enter the operational phase.

Review your DLP Alerts and Incident Reports. Look for patterns. Are valid business processes getting blocked? If so, tune the policy or add exceptions.
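One way to find those patterns is to triage exported alert records and count which rules block activity that reviewers judged legitimate. The field names below are assumptions for illustration; real alert data comes from the Purview portal or its exports.

```python
from collections import Counter

# Illustrative alert records; "justified" is a reviewer's judgment that
# the block was correct. Field names are assumed, not Purview's schema.
alerts = [
    {"rule": "us-ssn-high", "outcome": "blocked", "justified": True},
    {"rule": "project-code", "outcome": "blocked", "justified": False},
    {"rule": "project-code", "outcome": "blocked", "justified": False},
    {"rule": "us-ssn-high", "outcome": "warned",  "justified": True},
]

# Rules that repeatedly block valid business activity are tuning candidates.
false_positives = Counter(a["rule"] for a in alerts
                          if a["outcome"] == "blocked" and not a["justified"])
print(false_positives.most_common(1))  # [('project-code', 2)]
```

Here the custom project-code rule is the tuning candidate: tighten its pattern, raise its confidence threshold, or add an exception for the business process it keeps hitting.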

You should coordinate this with your Records Management and eDiscovery teams, because a DLP event often triggers other compliance requirements.

Close: success criteria

Your goal isn’t to block Copilot—it’s to make it safe to use.

Success looks like this:

  • Your high-risk data is protected by high-confidence blocking rules.
  • Your users get clear guidance when they hit a guardrail.
  • And you have the reporting to prove to your leadership that the system is working.

Next up, we’ll talk about Sensitivity Labels—the other side of the data protection coin.
