How to disclose Microsoft Copilot use in academic writing
A practical guide for researchers who use Microsoft Copilot and need to disclose that use clearly in papers, theses, and journal submissions.
Microsoft Copilot use needs a clear paper trail
A lot of researchers use Microsoft Copilot because it sits inside tools they already know. Word, PowerPoint, Outlook, Teams, and Excel feel familiar. That makes Copilot easy to adopt and easy to forget when it comes time to disclose AI use.
That is the trap.
If Copilot helped you draft text, rewrite prose, summarize notes, extract points from meeting transcripts, or generate slides from research material, you should record that use. Journal and publisher policies now expect authors to disclose AI assistance when it affects manuscript preparation or the research process. The ICMJE says authors should disclose AI use at submission and describe how they used it. Elsevier says authors should include a separate AI declaration for manuscript preparation, and describe research-process use in the methods section. AI tools also cannot be listed as authors. (icmje.org)
This article shows you how to disclose Microsoft Copilot use in a way that editors can follow and coauthors can approve.
If you want a structured record instead of a loose note in your acknowledgments, generate an AI Usage Card and keep it with your manuscript files.
What counts as Microsoft Copilot use
Researchers often think of Copilot as "just editing help." Sometimes that is true. Sometimes it is doing much more than that.
You should treat Copilot use as worth documenting when it helped with any of the following:
- drafting or rewriting manuscript text
- summarizing articles, notes, or transcripts
- generating tables, outlines, or presentation content
- suggesting code, formulas, or analysis steps
- extracting themes from qualitative material
- preparing figures or image prompts
- assisting with grant, thesis, or cover letter text tied to the paper
Elsevier draws a line between basic grammar, spelling, and punctuation checks, which need no declaration, and broader manuscript preparation or research use, which should be disclosed. (elsevier.com)
That distinction helps. If you used Copilot like a spellchecker, you may not need a formal statement. If you used it to generate, transform, summarize, or interpret content, you should disclose it.
Why Copilot deserves tool-specific disclosure
"AI was used" is too vague.
Editors, reviewers, and readers need enough detail to understand what tool you used, what you asked it to do, and how much human review happened after that. The ICMJE says authors should describe how they used AI-assisted tools and should carefully review and edit the output because it can be wrong, incomplete, or biased. It also says humans remain responsible for the final work. (icmje.org)
Tool-specific disclosure matters for another reason. Microsoft Copilot comes in different forms. Some versions work only with web content. Others can access organizational files, emails, meetings, or documents through Microsoft 365, depending on your account and permissions. Microsoft says Microsoft 365 Copilot Chat with work or school accounts provides enterprise data protection, and prompts and responses are not used to train foundation models. Microsoft also notes that some experiences can draw on work content when the product has access to it. (support.microsoft.com)
That means your disclosure should name the version you used when you know it. "Microsoft Copilot" is better than nothing. "Microsoft 365 Copilot in Word using an institutional account" is better still.
The four questions every disclosure should answer
Keep your note simple. Answer four questions.
1. What tool did you use?
Name Microsoft Copilot as specifically as you can.
Examples:
- Microsoft 365 Copilot in Word
- Microsoft 365 Copilot Chat
- GitHub Copilot for code assistance
- Microsoft Copilot in PowerPoint
Do not guess if you are unsure. Check your account, product page, or institutional license.
2. What did you use it for?
State the task, not a slogan.
Good examples:
- rewriting paragraphs for clarity
- summarizing interview notes into draft memos
- suggesting PowerPoint slide outlines for a conference talk
- generating starter code for data cleaning
- extracting action items from Teams meeting transcripts
Bad example:
- "used for productivity"
That says nothing.
3. What material did you give it?
This part matters more than many authors realize.
If you pasted unpublished manuscript text, participant data, peer review material, or confidential notes into Copilot, say so internally in your records even if your published statement stays brief. Nature journals state that reviewers should not upload manuscripts into generative AI tools during peer review because that can breach confidentiality. Elsevier says the same for submitted manuscripts and notes possible confidentiality and privacy risks. (nature.com)
For your own paper, you should record whether you shared:
- published sources only
- your own draft text
- de-identified research material
- sensitive or restricted data
- institutional documents or meeting transcripts
4. What human review did you do?
This is the part that protects you.
State that you reviewed, edited, and verified all AI-assisted output. ICMJE says authors must take responsibility for accuracy, originality, attribution, and the absence of plagiarism, including in AI-generated text and images. (icmje.org)
A simple disclosure template for Microsoft Copilot
Use this when Copilot helped with writing support but did not perform the research itself:
\section*{AI use disclosure}
The authors used Microsoft 365 Copilot in Word during manuscript preparation to
suggest edits for clarity, shorten selected paragraphs, and generate outline options
for the introduction. The authors reviewed and revised all output and take full
responsibility for the accuracy, originality, and final wording of the manuscript.

Use this when Copilot also touched research materials:
\section*{AI use disclosure}
The authors used Microsoft 365 Copilot Chat with an institutional account to
summarize de-identified project notes and draft a preliminary outline for the methods
section. The tool was not used to make final analytic decisions. The authors verified
all summaries against the original records, rewrote the final text, and take full
responsibility for the manuscript.

Use this when you need a methods-style statement:
\subsection*{Use of AI tools}
During data processing and manuscript preparation, the research team used Microsoft
Copilot for limited assistance with code suggestions, note summarization, and language
editing. The team reviewed all AI-assisted outputs, validated code before use, and did
not treat AI-generated content as evidence or as a source of interpretation.

If you want more examples, see our guide to AI Usage Cards examples and templates and then generate a full record at ai-cards.org.
Where to put the disclosure
The right location depends on what Copilot did.
If Copilot helped with writing, many journals accept a disclosure in the acknowledgments, a dedicated AI declaration, or a cover letter. ICMJE says journals should require disclosure at submission, in the cover letter and in the submitted work where appropriate. Elsevier asks for a separate AI declaration statement for manuscript preparation. (icmje.org)
If Copilot affected the research process itself, put the disclosure in the methods section too.
That includes cases where Copilot helped summarize notes, classify text, suggest code used in analysis, or produce outputs that shaped your research workflow. Readers need to see that in the body of the paper, not buried in the back matter.
What not to do
Do not list Microsoft Copilot as an author. Publisher and journal guidance rejects that. (icmje.org)
Do not write a vague boilerplate disclosure like "AI tools were used for editing assistance." That sounds evasive because it is evasive.
Do not claim that AI "verified" facts, "validated" findings, or "ensured" accuracy. You did that work, or you did not.
Do not hide tool use because it feels minor. Small uses add up. If Copilot shaped wording, structure, summaries, code, or slides tied to the paper, write it down.
A practical standard for research groups
If you lead a lab or write with students, set one rule now: log AI use while the work is happening.
A lightweight log can include:
- date
- user
- tool and version
- task
- input type
- output used or discarded
- human review
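If your group prefers a shared machine-readable log over a notes document, the fields above can live in a simple CSV that anyone appends to as they work. A minimal sketch (the field names and file name are illustrative, not a required format):

```python
import csv
from datetime import date
from pathlib import Path

# Illustrative column names mirroring the log fields above; adapt to your group.
FIELDS = ["date", "user", "tool_version", "task",
          "input_type", "output_status", "human_review"]

def log_ai_use(path, entry):
    """Append one AI-use entry to a shared CSV log, writing a header on first use."""
    log_file = Path(path)
    is_new = not log_file.exists()
    with log_file.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow(entry)

log_ai_use("ai_use_log.csv", {
    "date": date.today().isoformat(),
    "user": "jdoe",
    "tool_version": "Microsoft 365 Copilot in Word",
    "task": "rewrote methods paragraph for clarity",
    "input_type": "own draft text",
    "output_status": "used after editing",
    "human_review": "verified against original draft",
})
```

Because each entry records the tool version and input type at the moment of use, the log answers the four disclosure questions for you at submission time.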
That record saves time when you submit the paper. It also helps when coauthors ask, "Wait, did we use Copilot for that section?"
I would not leave this to memory. By the end of a long project, nobody remembers what started as a prompt and what started as a human draft.
Copilot, privacy, and institutional accounts
Researchers also ask a separate question: if I used a school or university account, does that change disclosure?
It changes your risk analysis more than your disclosure duty.
Microsoft says that prompts and responses in Microsoft 365 Copilot Chat for work or school accounts are protected by enterprise data protection and are not used to train foundation models. OpenAI makes similar statements for ChatGPT Enterprise, ChatGPT Edu, and its API business offerings, where business data is not used for training by default. (support.microsoft.com)
That may reduce some privacy concerns. It does not remove the need to disclose the use itself. Transparency is about who helped produce the work and how.
A stronger way to disclose than a one-line acknowledgment
A single sentence in the acknowledgments can satisfy a journal form. It rarely gives your coauthors, department, or readers much context.
An AI Usage Card does better. It lets you record:
- the tool
- the task
- the inputs
- the outputs
- the human checks
- the limits you placed on AI use
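One advantage of keeping these fields as a structured record is that you can render the same record into whatever wording a given venue asks for. A minimal sketch of that idea (the field names are illustrative, not the ai-cards.org schema):

```python
# Hypothetical structured record mirroring the fields listed above.
card = {
    "tool": "Microsoft 365 Copilot in Word",
    "task": "suggest edits for clarity and outline options",
    "inputs": "the authors' own draft text only",
    "human_checks": "the authors reviewed and rewrote all output",
    "limits": "not used for analysis or interpretation",
}

def to_acknowledgment(c):
    """Render the structured record as a one-sentence disclosure for a journal form."""
    return (
        f"The authors used {c['tool']} to {c['task']}; "
        f"inputs were {c['inputs']}, and {c['human_checks']}."
    )

print(to_acknowledgment(card))
```

A methods-style statement or thesis appendix can be generated from the same record, so the underlying facts stay identical even when the wording differs between venues.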
That makes the disclosure easier to reuse across a paper, thesis chapter, grant report, or conference submission. It also helps when different venues ask for different wording.
If you are writing a thesis, pair this article with our guide on how to disclose AI usage in your thesis. If you need examples for acknowledgments and methods statements, read how to disclose ChatGPT usage in academic papers and adapt the wording for Copilot.
The safest rule
If Microsoft Copilot changed your words, your structure, your summaries, your code, or your presentation of the research, disclose it.
You do not need a dramatic confession. You need a clean record.
Create that record now with an AI Usage Card generator. Fill in the tool, the task, and your human review steps. Then copy the result into your manuscript, thesis, or submission materials.
Generate Your AI Usage Report
Create a standardized AI Usage Card for your research paper in minutes. Free and open source.
Create Your AI Usage Card