The newest entries in a long line of technologies that promise to disrupt higher education are artificial intelligence (AI) tools like ChatGPT. In seconds, these tools can use machine learning to generate text, video, or images in response to users’ prompts.

The technology continues to advance, with features such as internet browser plugins that can generate answers to assessment and homework questions. Computer-generated text does have its limitations, however. Users cannot rely on these tools to make qualitative judgments, such as determining whether language is appropriate for a given context, nor can they rely on the accuracy of generated citations.

These tools are widely available, and instructors will need to consider carefully how to adapt to these developments. Amid ongoing discussions and academic integrity concerns about these tools, the Office of Teaching, Learning, and Technology and the Center for Teaching offer this brief guide to address some of the most frequently asked questions about ChatGPT and other AI tools.

These technologies are new, and their future is uncertain as the availability and reliability of these tools continue to develop.

You may need to discuss using AI tools in a variety of contexts, including learning materials and campus, collegiate, and course policies related to academic integrity. Consult with your collegiate leadership about specific policies. In any case, providing transparent information about expectations for student use of AI tools and how these expectations align with course goals and scholarly values is crucial.

Remember that with any policy in your syllabus, it’s important to have ongoing conversations throughout the semester. Some example language:

  • When AI is prohibited. [This course] assumes that work submitted by students—all process work, drafts, low-stakes writing, final versions, and all other submissions—will be generated by the students themselves, working individually or in groups. This means that the following would be considered violations of academic integrity: a student has another person/entity do the writing of any substantive portion of an assignment for them, which includes hiring a person or a company to write essays and drafts and/or other assignments, research-based or otherwise, and using artificial intelligence affordances like ChatGPT. (Excerpted from ChatGPT by University of California: Irvine Division of Teaching Excellence and Innovation)
  • When AI is allowed with attribution. In all academic work, the ideas and contributions of others must be appropriately acknowledged and work that is presented as original must be, in fact, original. Using an AI-content generator (such as ChatGPT) to complete coursework without proper attribution or authorization is a form of academic dishonesty. If you are unsure about whether something may be plagiarism or academic dishonesty, please contact your instructor to discuss the issue. Faculty, students, and administrative staff all share the responsibility of ensuring the honesty and fairness of the intellectual environment. (Excerpted from Constructing a Syllabus: A Checklist by Washington University in St. Louis Center for Teaching and Learning)
  • When AI is allowed with attribution. Use of AI tools, including ChatGPT, is permitted in this course for students who wish to use them. To be consistent with our scholarly values, students must cite any AI-generated material that informed their work and use quotation marks or other appropriate indicators of quoted material when appropriate. Students should indicate how AI tools informed their process and the final product, including how you validated any AI-generated citations, which may be invented by the AI. Assignment guidelines will provide additional guidance as to how these tools might be part of your process for each assessment this semester and how to provide transparency about their use in your work.
  • When AI use is encouraged with certain tasks. Students are invited to use AI platforms to help prepare for assignments and projects (e.g., to help with brainstorming or to see what a completed essay might look like). I also welcome you to use AI tools to help revise and edit your work (e.g., to help identify flaws in reasoning, spot confusing or underdeveloped paragraphs, or to simply fix citations). When submitting work, students must clearly identify any writing, text, or media generated by AI. This can be done in a variety of ways. In this course, parts of essays generated by AI should appear in a different colored font, and the relationship between those sections and student contributions should be discussed in cover letters that accompany the essay submission. (Based on Course Policies related to ChatGPT and other AI Tools by Joel Gladd)

For more, Ryan Watkins, professor of Educational Technology Leadership and Human-Technology Collaboration at George Washington University, offers suggestions to update your course syllabus and assignments. For more ideas about communicating with students about generative AI in your syllabus and beyond, contact the Center for Teaching.

Instructors should refrain from using AI detectors on student work due to the inherent inaccuracies and biases in these tools.

AI detectors can produce false positives, unfairly penalizing students who have not engaged in any academic misconduct. Moreover, using AI detectors shifts the pedagogical focus to policing, undermines the trust between students and educators, and offers only a short-term solution as AI technology rapidly evolves.

Instead, educators should focus on fostering open dialogue, providing clear guidelines, and using diverse assessment methods to ensure academic integrity and support authentic student learning. As alternatives to AI detectors:

  • Create assignments that motivate students and help them understand how their work supports their learning. Incorporate course materials and require direct quotations from these materials to be included in students’ written answers.
  • Use multi-stage assignments with drafts and revisions.
  • Consider non-written formats like presentations or multimedia projects.
  • Design assignments with components that ask students to use their own voice and personal experiences.
  • Ask students metacognitive follow-up questions about their assignments to encourage reflection on their thought processes and confirm the authenticity of their work.

The Center for Teaching has resources available for instructors wishing to promote academic integrity or redesign assignments for future students.

Learn more about the limitations of AI detection tools:

Abdali, S., Anarfi, R., Barberan, C. J., & He, J. (2024). Decoding the AI Pen: Techniques and Challenges in Detecting AI-Generated Text. arXiv. https://doi.org/10.48550/arXiv.2403.05750

Sadasivan, V. S., Kumar, A., Balasubramanian, S., Wang, W., & Feizi, S. (2024). Can AI-Generated Text be Reliably Detected? arXiv. https://doi.org/10.48550/arXiv.2303.11156

Weber-Wulff, D., Anohina-Naumeca, A., Bjelobaba, S., et al. (2023). Testing of detection tools for AI-generated text. International Journal for Educational Integrity, 19, 1-39. https://doi.org/10.1007/s40979-023-00146-z

Students may have access to a variety of resources like calculators, Wikipedia, peers, hired tutors, and increasingly, AI tools to accomplish their course tasks and assignments. 

Instructors are encouraged to engage in discussions with their students about Iowa’s policy on Academic Misconduct. Students benefit from transparent instructions about which tools and resources they can appropriately leverage in their work (e.g., Winkelmes et al., 2016). They will also need guidance on how to cite or acknowledge the tools and individuals who contributed to their work. Although a policy in a course syllabus is a good way to start these conversations, the syllabus shouldn’t be the only time that the policy is discussed.

The Center for Teaching's Handbook for Teaching Excellence provides additional information on promoting academic integrity.

Some instructors may want to incorporate AI tools into their assignments so that students develop the skills necessary to interact with these tools in the future. Depending on course goals, potential assignments could include:

  • Developing a question that will effectively elicit an AI-generated response that meets certain specifications.
  • Comparing and contrasting AI-generated text with other scholarly work.
  • Evaluating AI-generated text for missing or biased information.
  • Using AI to generate examples or proofs of concept (Warner, 2022).

As with any technology tool, it is important to consider questions about student privacy, availability of technical troubleshooting, and accessibility before requiring students to use it. Instructors can request consultations on the availability of alternative tools or alternative assignments for individual students who may not wish to use AI tools.

If you would like assistance with designing assessments for your own teaching context, contact the Center for Teaching.

Some instructors may be interested in preventing student use of AI tools in their assessments. This may be a difficult task; as Beth McMurtrie points out in a Chronicle of Higher Education Teaching Newsletter, “If you want to make your assignments AI proof, that’s impossible” (McMurtrie, 2023). Effective assessments are closely aligned with course goals and allow students to demonstrate their learning in authentic ways. Assignments that focus on the specific learning of the course and on students’ own learning are also less suited to being written by AI or by another person who is not participating in the course. Strategies include:

  • Developing a specific prompt that requires students to leverage information learned in the course, including materials on ICON and in-class discussions.
  • Scaffolding the assignment to include several stages, such as a proposal, an outline, a rough draft, and a final draft. Depending on other factors, such as course size, an instructor might be able to include opportunities for students to receive and comment on peer/instructor feedback at these stages.
  • Including a metacognitive component in which students describe their research and writing process, what they learned from it, and how they would approach a similar task in the future.
  • Focusing assessments on current events and recent scholarship not yet included in the AI training data set.
  • Exploring how an AI responds to your assignment prompt.

It is not recommended to simply replace existing high-stakes assignments with graded, handwritten, in-class assignments. This may have unintended consequences for students who rely on technology for accommodations and other legitimate assistance in completing coursework (CRLT, 2022).

The Center for Teaching's Handbook for Teaching Excellence provides additional information on how to develop authentic assessments. If you would like assistance with designing assessments for your own teaching context, contact the Center for Teaching.

These programs bring generative AI into the same browser window a student is using to complete an assignment. They can read text displayed in a browser (this includes quiz questions from ICON, Top Hat, and other online platforms), and then suggest or automatically select an answer based on what the generative AI believes is the correct answer.

Here are some strategies you can use to adapt your teaching:

  1. Include a statement about the use of these tools in your syllabi and assignments. See samples of these statements.
  2. Consider alternative ways to assess students' learning that require them to complete tasks outside of or in addition to tasks in ICON, such as oral presentations, visual or multimodal projects, case studies, and journals.
  3. Consider incorporating the use of alternative assessments, higher order questions, and flipped classroom approaches in your teaching.
  4. Use a locked browser with any online quizzes or exams to prevent the use of other applications while testing. All University of Iowa courses have access to Respondus LockDown Browser, and select courses have access to Honorlock.
  5. Use Gradescope to assess paper-based quizzes and tests in a way that saves time and gives students consistent feedback at scale.

ITS is monitoring developments in this space, escalating the AI browser plugin topic with the instructional technology vendors we work with, adding a message to instructors in ICON’s quiz page, and blocking the installation of known plugins on university-managed machines that are located in Instructional Technology Centers and testing centers.