The newest entry in a long line of technologies that promise to disrupt higher education is artificial intelligence (AI) tools like ChatGPT. In seconds, these tools can use machine learning to generate text, video, or images in response to users’ prompts.

The technology continues to advance, with features such as internet browser plugins that can generate answers to assessment and homework questions. Computer-generated text does have limitations, however: users can’t rely on these tools to make qualitative judgments, such as determining whether language is appropriate for a given context, or to produce accurate citations.

These tools are widely available, and instructors will need to consider carefully how to adapt to these developments. Amid discussions and academic integrity concerns about these tools, the Office of Teaching, Learning, and Technology and the Center for Teaching offer this brief guide to address some of the most frequently asked questions about ChatGPT and other AI tools.

These technologies are new, and their future is uncertain as their availability and reliability continue to develop.

You may need to discuss using AI tools in a variety of contexts, including learning materials and campus, collegiate, and course policies related to academic integrity. Consult with your collegiate leadership about specific policies. In any case, providing transparent information about expectations for student use of AI tools and how these expectations align with course goals and scholarly values is crucial.

Remember that with any policy in your syllabus, it’s important to have ongoing conversations throughout the semester. Some example language:   

  • When AI is prohibited. [This course] assumes that work submitted by students—all process work, drafts, low-stakes writing, final versions, and all other submissions—will be generated by the students themselves, working individually or in groups. This means that the following would be considered violations of academic integrity: a student has another person/entity do the writing of any substantive portion of an assignment for them, which includes hiring a person or a company to write essays and drafts and/or other assignments, research-based or otherwise, and using artificial intelligence affordances like ChatGPT. (Excerpted from ChatGPT by University of California: Irvine Division of Teaching Excellence and Innovation) 

  • When AI is prohibited. Since writing, analytical, and critical thinking skills are part of the learning outcomes of this course, all writing assignments should be prepared by the student. Developing strong competencies in this area will prepare you for a competitive workplace. Therefore, AI-generated submissions are not permitted and will be treated as plagiarism. (Sample statement shared by Chrissann Sparks Ruehle, with permission for others to use, on Higher Ed Discussions of AI Writing Facebook Group on 1/6/2023, cited in ChatGPT Resources by Texas Tech University Teaching, Learning & Professional Development Center) 

  • When AI is allowed with attribution. In all academic work, the ideas and contributions of others must be appropriately acknowledged and work that is presented as original must be, in fact, original. Using an AI-content generator (such as ChatGPT) to complete coursework without proper attribution or authorization is a form of academic dishonesty. If you are unsure about whether something may be plagiarism or academic dishonesty, please contact your instructor to discuss the issue. Faculty, students, and administrative staff all share the responsibility of ensuring the honesty and fairness of the intellectual environment. (Excerpted from Constructing a Syllabus: A Checklist by Washington University in St. Louis Center for Teaching and Learning) 

  • When AI is allowed with attribution. Use of AI tools, including ChatGPT, is permitted in this course for students who wish to use them. To be consistent with our scholarly values, students must cite any AI-generated material that informed their work and use quotation marks or other appropriate indicators of quoted material when appropriate. Students should indicate how AI tools informed their process and the final product, including how they validated any AI-generated citations, which may be invented by the AI. Assignment guidelines will provide additional guidance as to how these tools might be part of your process for each assessment this semester and how to provide transparency about their use in your work. 

  •  When AI use is encouraged with certain tasks. Students are invited to use AI platforms to help prepare for assignments and projects (e.g., to help with brainstorming or to see what a completed essay might look like). I also welcome you to use AI tools to help revise and edit your work (e.g., to help identify flaws in reasoning, spot confusing or underdeveloped paragraphs, or to simply fix citations). When submitting work, students must clearly identify any writing, text, or media generated by AI. This can be done in a variety of ways. In this course, parts of essays generated by AI should appear in a different colored font, and the relationship between those sections and student contributions should be discussed in cover letters that accompany the essay submission. (Based on Course Policies related to ChatGPT and other AI Tools by Joel Gladd)  

For more, Ryan Watkins, professor of Educational Technology Leadership and Human-Technology Collaboration at George Washington University, offers suggestions to update your course syllabus and assignments. For more ideas about communicating with students about generative AI in your syllabus and beyond, contact the Center for Teaching. 

Students may have access to a variety of resources like calculators, Wikipedia, peers, hired tutors, and increasingly, AI tools to accomplish their course tasks and assignments. 

Instructors are encouraged to engage in discussions with their students about Iowa’s policy on Academic Misconduct. Students benefit from transparent instructions about which tools and resources they can appropriately leverage in their work (e.g., Winkelmes et al., 2016). They will also need guidance on how to cite or acknowledge the tools and individuals who contributed to their work. Although a policy in a course syllabus is a good way to start these conversations, the syllabus shouldn’t be the only time that the policy is discussed.  

The Center for Teaching's Handbook for Teaching Excellence provides additional information on promoting academic integrity.  

Some instructors may want to incorporate AI tools into their assignments so that students develop the skills necessary to interact with them in the future. Depending on course goals, potential assignments could include:   

  • Developing a question that will effectively elicit an AI-generated response that meets certain specifications.   

  • Comparing/contrasting AI-generated text with other scholarly work.  

  • Evaluating AI-generated text for missing or biased information.   

  • Using AI to generate examples or proofs of concept (Warner, 2022).   

As with any technology tool, it is important to consider questions about student privacy, the availability of technical troubleshooting, and accessibility before requiring students to use AI tools. Instructors can request consultations on the availability of alternative tools or alternative assignments for individual students who may not wish to use AI tools.  

If you would like assistance with designing assessments for your own teaching context, contact the Center for Teaching. 

Some instructors may be interested in preventing student use of AI tools in their assessments. This is a difficult task; as Beth McMurtrie points out in a recent Chronicle of Higher Education Teaching Newsletter, “If you want to make your assignments AI proof, that’s impossible” (McMurtrie, 2023). Effective assessments are closely aligned to course goals and allow students to demonstrate their learning in authentic ways. Assignments focused on the specific learning of the course and on students’ own learning are also less suited to being written by AI or by another person who is not participating in the course.

Consider:  

  • Developing a specific prompt that requires students to leverage information learned in the course, including materials on ICON and in-class discussions.   

  • Scaffolding the assignment to include several stages, such as a proposal, an outline, a rough draft, and a final draft. Depending on other factors, such as course size, an instructor might be able to include opportunities for students to receive and comment on peer/instructor feedback at these stages.  

  • Including a metacognitive component in which students describe their research and writing process, what they learned from it, and how they would approach a similar task in the future.   

  • Focusing assessments on current events and recent scholarship not yet included in the AI training data set.  

  • Exploring how an AI responds to your assignment prompt.  

It is not recommended to simply replace existing high-stakes assignments with graded, handwritten, in-class assignments. This may have unintended consequences for students who rely on technology for accommodations and other legitimate assistance in completing coursework (CRLT, 2022). 

The Center for Teaching's Handbook for Teaching Excellence provides additional information on how to develop authentic assessments. If you would like assistance with designing assessments for your own teaching context, contact the Center for Teaching. 

These programs bring generative AI into the same browser window a student is using to complete an assignment. They can read text displayed in a browser (this includes quiz questions from ICON, Top Hat, and other online platforms), and then suggest or automatically select an answer based on what the generative AI believes is the correct answer.
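To make that mechanism concrete, the sketch below shows the first step such a plugin performs: scraping question text out of a page’s markup before passing it to a generative model. Everything here is hypothetical; the `QuestionScraper` class and the `question_text` CSS class are illustrative assumptions, not code or markup from ICON, Top Hat, or any actual plugin.

```python
from html.parser import HTMLParser

class QuestionScraper(HTMLParser):
    """Collect the text of elements whose class list includes 'question_text'.

    A simplified, hypothetical sketch: real quiz platforms use their own
    markup, and real plugins read the live page rather than an HTML string.
    """

    def __init__(self):
        super().__init__()
        self._depth = 0      # > 0 while inside a matching element
        self.questions = []  # extracted question strings

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if "question_text" in classes:
            self._depth += 1
            self.questions.append("")
        elif self._depth:
            self._depth += 1  # track tags nested inside a question

    def handle_endtag(self, tag):
        if self._depth:
            self._depth -= 1

    def handle_data(self, data):
        if self._depth and data.strip():
            self.questions[-1] += data.strip()

page = '<div class="question_text">What year did WWI begin?</div>'
scraper = QuestionScraper()
scraper.feed(page)
print(scraper.questions)  # → ['What year did WWI begin?']
```

Once the question text is captured this way, a plugin only needs one further network call to a text-generation service to obtain a suggested answer, which is why the strategies below focus on the assessment design rather than on blocking the scraping itself.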

Here are strategies you can use to adapt your teaching in advance of the coming semester. 

  1. Include a statement about the use of these tools in your syllabi and assignments. See the sample statements in the question above. 

  2. Consider alternative ways to assess students' learning that require them to complete tasks outside of or in addition to tasks in ICON, such as oral presentations, visual or multimodal projects, case studies, and journals. 

  3. Consider incorporating alternative assessments, higher-order questions, and flipped classroom approaches in your teaching. OTLT offers educational opportunities on these topics on request. 

  4. Use a locked browser with any online quizzes or exams to prevent the use of other applications while testing. All University of Iowa courses have access to Respondus LockDown Browser. 

  5. Use Gradescope to assess paper-based quizzes and tests in a way that saves time and gives students consistent feedback at scale. 

ITS is monitoring developments in this space, raising the AI browser plugin issue with the instructional technology vendors we work with, adding a message for instructors on ICON’s quiz page, and blocking the installation of known plugins on university-managed machines in Instructional Technology Centers and testing centers. 

While there are several tools being developed to detect the use of AI in student work, it's important to remember that these emerging technologies are likely to be an imperfect solution.

GPTZero: an AI program developed by Edward Tian, a student at Princeton University, to detect whether text was generated by ChatGPT.

Turnitin: in April 2023, Turnitin released its AI detection tool. The Turnitin service team at the University of Iowa reviewed this detection tool for several months to understand its capabilities. Due to inconsistent outcomes observed at the University of Iowa and at peer institutions, Turnitin’s AI detection program will not be made available at this time. 

Skeptics of detection tools include University of Mississippi instructor Marc Watkins, whose widely quoted assertion reads: “We should be proactive in our response and not approach our teaching out of panic and mistrust of our students. What message would we send our students by using AI-powered detectors to curb their suspected use of an AI writing assistant, when future employers will likely want them to have a range of AI-related skills and competencies?” (Watkins, 2022).   

For further information about the inaccuracies in AI detection tools read this article from the University of Pittsburgh’s Teaching Center or this article from Vanderbilt University.

The Office of Teaching, Learning, and Technology continues to monitor AI technology trends and the ongoing challenges of leveraging AI detection tools in higher education.