If you’ve used Microsoft Copilot in the web browser, you’ve seen the potential of AI, but that’s just the beginning. The paid license for Copilot in Microsoft 365 unlocks the full potential: seamless integration with your internal data, personalization, and smart automation right inside the tools you already use every day.

Built Directly into Office Apps

The primary difference between the free version of Copilot and the paid license is integration into the everyday Microsoft 365 apps you already use:

  • Word: Generate, summarize, or rewrite entire documents

  • Excel: Analyze data, create formulas, and find trends

  • PowerPoint: Build decks from scratch using just a prompt

  • Outlook: Write polished replies and suggest scheduling options

  • Teams: Summarize meetings (live or after the fact), track tasks, and surface key points

Copilot Free vs. Paid: What’s the Difference?

Feature | Free (Web Version) | Paid (Copilot for M365)
Commercial Data Protection | Yes | Yes
Access to your M365 data | No | Yes
Embedded in Word, Excel, etc. | No | Yes
Personalized replies based on your work | No | Yes
Summary of meetings, emails, and documents | No | Yes

The biggest difference between the free and paid licenses is what Copilot can do with your data once it is built into your Microsoft 365 apps.

With the paid version:

  • Your data—emails, files, Teams chats, calendars—never leaves the Microsoft 365 environment.

  • Copilot runs within your secure tenant, using Microsoft’s Zero Trust architecture.

  • It does not send data to public servers or use it to train language models.

This means your interactions with Copilot stay subject to the same security and compliance rules that already govern tools like Outlook, OneDrive, and SharePoint.

The paid Copilot solution aligns with:

  • FERPA, HIPAA (where applicable), and other regulatory standards

  • UW’s internal data classification and acceptable use policies

  • Microsoft’s own Customer Data Protection Policy, with clear data residency and ownership agreements

This makes it fundamentally different from using free tools like ChatGPT, Bing AI, or Gemini, which do not reside within UW’s managed infrastructure. It’s a secure, integrated solution that respects your files, your permissions, and your data boundaries. That’s what makes it appropriate for handling work with moderate sensitivity, as long as users still follow internal data policies.

“What can I actually use AI for—and how do I avoid getting myself in trouble?”


That’s one of the first (and smartest) questions people ask when trying out generative AI tools. There has been a lot of excitement around these tools, and the marketplace is crowded with options.

At the Universities of Wisconsin, we currently do not have an enterprise or educational license for tools like ChatGPT, Claude, Gemini, or DeepSeek. (The exception is Microsoft Copilot, which has a paid license in some systems.)

That means: while these tools can be powerful for brainstorming, writing, and automating routine tasks, it’s critical to use them responsibly—especially when it comes to handling data.

Here’s what you need to know about using public AI tools safely and wisely. (This does not pertain to the paid Copilot license.)

OK to Use ChatGPT (or Any Public LLM) For:

  • Drafting non-sensitive communications, like email templates or project summaries

  • Brainstorming ideas (e.g., presentation outlines, workshop formats, survey questions)

  • Creating or refining generic text (e.g., help articles, documentation)

  • Generating code snippets without user or student data (see the example after this list)

  • Exploring concepts or summarizing public information

  • Clarifying technical terms or academic topics

  • Getting grammar and tone feedback on professional writing

  • Support for creative ideation and storytelling in student engagement materials
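To make the coding bullet above concrete, here is the kind of generic, self-contained snippet it is fine to request from or share with a public LLM. Everything in it (the function name, the sample values) is invented for illustration; nothing comes from a real system.

```python
import csv
from io import StringIO

# A generic helper of the kind that is safe to ask a public LLM for:
# it touches no real user or student data, only made-up sample input.
def count_rows_by_column(csv_text: str, column: str) -> dict[str, int]:
    """Count how many rows share each value in the given column."""
    counts: dict[str, int] = {}
    for row in csv.DictReader(StringIO(csv_text)):
        value = row.get(column, "")
        counts[value] = counts.get(value, 0) + 1
    return counts

# Synthetic sample input, invented purely for illustration.
sample = "status,owner\nopen,a\nclosed,b\nopen,c\n"
print(count_rows_by_column(sample, "status"))  # {'open': 2, 'closed': 1}
```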

NOT OK to Use ChatGPT (or Any Public LLM) For:

  • Sharing personally identifiable information (PII), such as names, ID numbers, or student records

  • Discussing confidential or restricted information, including contracts, budget documents, HR cases, or sensitive governance topics

  • Uploading or pasting internal documents marked confidential or not meant for public disclosure

  • Using it to make final decisions about student admission, financial aid, or disciplinary action

  • Treating ChatGPT outputs as authoritative without verification—it’s a tool, not a source of record

This includes technical communication as well. It is OK to use ChatGPT for general-purpose coding and debugging. It is NOT acceptable to use ChatGPT if the code contains student data, internal APIs, secure tokens, or authentication keys. That means anything tied to PeopleSoft, SIS, Workday, etc. Be especially cautious if you are using the model to generate decisions that affect users without human oversight (e.g., scripts that automatically move money, update grades, or modify access rights).

If you wouldn’t email the code to a stranger without redacting it, don’t paste it into an LLM.
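As a rough sketch of what redacting before pasting can look like, the helper below masks hardcoded secrets and ID-like numbers. The patterns are illustrative assumptions (including the 10-digit student ID format), not an approved or complete scrubbing policy.

```python
import re

# A minimal pre-paste redaction sketch: mask obvious secrets and IDs
# before sharing code or logs with a public LLM. These patterns are
# illustrative assumptions, not a complete or approved scrubbing policy.
PATTERNS = [
    # Hardcoded keys, tokens, and secrets assigned as string literals.
    (re.compile(r"(api[_-]?key|token|secret)\s*=\s*['\"][^'\"]+['\"]", re.I),
     r"\1 = '<REDACTED>'"),
    # Assumed 10-digit campus ID numbers.
    (re.compile(r"\b\d{10}\b"), "<STUDENT_ID>"),
]

def redact(text: str) -> str:
    """Apply every masking pattern before the text leaves your machine."""
    for pattern, replacement in PATTERNS:
        text = pattern.sub(replacement, text)
    return text

snippet = 'api_key = "sk-live-1234"  # lookup for 9012345678'
print(redact(snippet))  # api_key = '<REDACTED>'  # lookup for <STUDENT_ID>
```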

Tips for Responsible Use

  • Treat AI as a thought partner, not a decision-maker.

  • If in doubt, strip out sensitive details and ask generalized questions (see the sketch after this list).

  • Assume anything you input is visible externally, unless you’re using a licensed instance with data protection agreements.

  • Attribute any final content or policies to approved institutional sources, not ChatGPT.
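As noted in the second tip, here is a before-and-after sketch of generalizing a question. Every name, ID, and URL below is fabricated for illustration.

```python
# Too specific to share: leaks a person, an ID, and an internal endpoint.
# (All details here are fabricated for illustration.)
too_specific = (
    "Why does our Workday integration return 401 errors for student "
    "Jane Doe (ID 9012345678) when the nightly job posts to "
    "https://internal.example.edu/api/grades?"
)

# Generalized: the same technical question with nothing sensitive in it.
generalized = (
    "Why might a nightly HTTPS POST to a vendor REST API suddenly start "
    "returning 401 errors?"
)
```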