Generative AI tools are reshaping how students study, write, and learn.
Clearly stating your expectations around AI use in your syllabus fosters academic integrity, reduces confusion, and sets your learning community up for success. But it shouldn't stop there: AI expectations should also be discussed in class and added to individual assignment descriptions to help students understand what's appropriate task by task.
Why Include AI in the Syllabus
Your syllabus is more than a course outline—it’s your first and best opportunity to communicate the values, goals, and expectations that shape your classroom. As generative AI becomes part of the academic landscape, it’s important to clearly define your course policies on its use.
Be clear with students about when, how, and why AI tools may or may not be used. Connect your course policies to your course's learning outcomes and explain how these guidelines support both academic integrity and skill development. If you choose to set limits around student AI use, include specific language about academic misconduct and note that unauthorized use of generative AI may be considered a violation of the University of Iowa Code of Student Life.
Revisit your expectations throughout the semester.
Assignment-specific guidelines
A syllabus statement is a good first step, but students also need to be reminded about expectations in individual assignments. Students are likely encountering very different AI expectations across multiple instructors and disciplines. Each task may invite different kinds of engagement with AI tools—and without guidance, students may misjudge what is appropriate.
Add brief explanations to assignment descriptions outlining when and how AI may (or may not) be used. This helps students make informed decisions, meet your expectations, and avoid unintentional academic misconduct.
1. Break down the task
"Using AI" can mean many things: generating ideas, rephrasing text, checking grammar, and more. For your assignments, consider providing step-by-step guidance that explicitly addresses whether AI may be used at each stage and explains why. For example:
Planning: AI can be used to brainstorm, explore topics of interest, and help refine the research question.
Research: Students are expected to read and summarize source materials themselves. AI should not be used for this step.
Drafting: AI can assist in outlining or organizing ideas, but core arguments and analysis must be the student's own.
Reviewing: Peer feedback must reflect the student's own thinking. AI should not be used during peer review.
Revising: AI may be used to correct grammatical and spelling errors. Submit both the before and after drafts for this stage of the assignment.
2. Use a shared framework
The AI Assessment Scale (Perkins & Furze) is a helpful tool for communicating expectations around AI. This flexible framework supports instructor-student discussions about when AI use enhances learning and when it might not. Consider using it to spark classroom conversations or to label your assignments accordingly.
Once students know when AI use is appropriate, the next step is helping them understand how to acknowledge its use, through citations, disclosures, or both.
Citing AI
Traditional citation practices allow readers to find the same sources and verify claims made by authors. Generative AI complicates that process. Its outputs can’t be retrieved or reproduced in the same way as a book or article, which makes proper crediting less straightforward. That’s why instructors should help students understand two options in documenting AI use: AI citations and AI disclosures.
AI citations acknowledge the use of AI and indicate that certain ideas, phrasing, analysis, or approaches in a work originated from an AI interaction. As with citing a conversation with a colleague, the source can't be retrieved, but the influence should be acknowledged.
Many fields are developing standards for citing AI in professional contexts:
- MLA: Citing generative AI
- APA: How to cite ChatGPT
- Chicago Manual of Style: Citing content developed or generated by artificial intelligence
- AMA/Vancouver: Reporting Use of AI in Research and Scholarly Publication
AI disclosure statements differ from citations or references: like disclosures of funding or conflicts of interest, an AI disclosure statement acknowledges how you used generative AI in the process of creating a work for submission.
For example, “I acknowledge the use of [insert AI system(s) and link] to [specific use of generative artificial intelligence]. The prompts used include [list of prompts]. The output from these prompts was used to [explain use].”
The Association of College and Research Libraries has created a framework to “provide transparency to the use of AI tools throughout the writing process, ensuring clarity at a level that is both detailed enough to be informative and short enough to avoid being onerous.”
Students face varying AI expectations across their courses. Providing specific guidelines in every assignment helps eliminate confusion and prevent unintentional academic misconduct.
Resources to construct your syllabus statement:
- CLAS Syllabus template
- University of Iowa Center for Teaching and the Office of Teaching, Learning, and Technology
- Sentient Syllabus Project
- Syllabus Policies for Generative AI Tools
If you would like to learn even more about generative AI and teaching practices, you are welcome to join the Faculty Interest Group.
AI Disclosure Statement: ChatGPT (OpenAI) and Claude (Anthropic) were used to brainstorm, revise, and refine the language in this article. All final content was reviewed and edited before publishing for intended message, tone, and institutional context.