Generative AI is rapidly transforming the academic and research landscape, offering new opportunities for discovery and innovation. It is important to approach generative AI thoughtfully, balancing its benefits against potential risks and ethical considerations. On this page, we collect information about using generative AI in academic and research settings.

  • The ITS Research Services group has published Using Artificial Intelligence (AI) Tools in Research, which explains the benefits and limitations of AI tools and provides guidance on using them in research at the University of Iowa.
  • The Office of the Executive Vice President and Provost has created a page with steps and tips to guide instructors in responding to AI in the classroom. It includes strategies for creating AI-resistant assignments, an overview of recent developments and challenges associated with AI in higher education, and encouragement for instructors to give students clear instructions and maintain ongoing discussion about what uses of AI are permissible within a course’s context.
  • The Center for Teaching and the Office of Teaching, Learning, and Technology have written the page Artificial Intelligence Tools and Teaching, which answers frequently asked questions about AI tools and teaching, such as how to address generative AI in a syllabus, how to have conversations with students about it, and how to learn more about AI’s impact on education.

Guidance on Student-Facing Chatbots

As chatbots have become easier and less expensive to deploy, interest in them has grown, and many faculty are starting to experiment with their use. Chatbots can answer questions quickly at any time and can be an effective tool for both faculty and students.

To support informed decisions about potential applications of chatbots in work, teaching and learning, and beyond, we ask faculty to consider the following guidance.

  • Be clear about your goals for the chatbot. Is your goal to provide students with basic information or with individualized learning opportunities? Is the chatbot being created to automate administrative tasks? A clear understanding of your goals will help you design and implement the chatbot.
  • Adjust the chatbot’s responses to match your goals. Many chatbot platforms let you control the tone, or voice, of conversations; choose a style that fits the purpose of the chatbot. For example, formal responses may not suit a chatbot intended for casual conversation, and a humorous or strongly emotional persona may not be a good fit for a professional setting. (A minimal configuration sketch appears after this list.)
  • Make sure that any student- or public-facing chatbot is introduced with reminders that users should not share private or sensitive information with it because, in general, the university cannot protect the privacy of information shared with chatbots.
  • Share a link to the University’s data classification guidelines to help users make informed decisions about their interactions with a chatbot.
  • The capabilities and limitations of chatbots used in academic settings should be clearly communicated to students. Students should understand that chatbots have no genuine understanding, emotions, or subjective opinions, that any output from a chatbot should be treated as supplemental or as a convenience, and that essential information should be verified with an authoritative human source. These reminders about chatbot limitations could be shared in an introductory message or prompt that appears each time students access the chatbot tool.
  • To avoid potential harm, users should be explicitly advised not to rely on any chatbot for emergencies, medical advice, or mental health support. Chatbots cannot assess emergencies or urgent situations and may give inaccurate, incomplete, or misleading information. Chatbots cannot replace trained professionals and using chatbots for health matters could have severe consequences.
  • Chatbots will not be able to answer every question. It is important that a chatbot tells users when it does not know an answer instead of “hallucinating” or providing incorrect information in its responses. This is especially important when users ask questions about personal matters, self-harm, bullying, or food insecurity.
  • To ensure the validity of interactions with users, chatbots should record their conversations so they can be reviewed later, including cases where the chatbot could not produce an answer. Identify a person responsible for the chatbot and establish a regular schedule of review for this information to monitor the chatbot’s performance. (A simple logging sketch appears after this list.)
  • The knowledge base and programming behind any chatbot will need ongoing maintenance and updates to expand its capabilities over time. Plan to allocate time and money not only for the initial implementation of a chatbot but also for the continuous improvement that supports its continued use.
  • Students may use chatbots to get individualized help, review content, research a topic, and find answers at any time and from any location. Before implementing a chatbot with students, use whatever method works best, such as informal conversations or a focus group, to understand how people will actually use it. Consider the types of questions the chatbot will be asked and review the answers it generates.
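
For faculty building a chatbot on a large language model platform, several of the points above (tone, privacy reminders, stated limitations, and “I don’t know” behavior) often come down to how the system prompt and introductory message are written. The sketch below is a minimal, hypothetical Python example: the course name, the wording, and the send_to_model placeholder are assumptions used to illustrate the idea, not a supported university tool or API.

    # A minimal, hypothetical sketch of a system prompt and introductory
    # message for a course chatbot. The wording and the send_to_model
    # placeholder should be adapted to whatever platform you actually use.

    SYSTEM_PROMPT = """
    You are a course assistant for an introductory statistics course.
    - Use a friendly, professional tone; avoid humor or strong emotion.
    - Answer only from the provided course materials.
    - If you do not know an answer, say so plainly and refer the student
      to the instructor. Do not guess or invent information.
    - Do not give medical, mental health, or emergency advice; instead
      direct the student to appropriate campus or emergency resources.
    """

    INTRO_MESSAGE = (
        "Reminder: this chatbot has no real understanding, emotions, or "
        "opinions, and its answers may be incomplete or wrong. Do not share "
        "private or sensitive information, and verify essential information "
        "with your instructor."
    )

    def send_to_model(system: str, user: str) -> str:
        # Placeholder: replace with a call to your chatbot platform's API.
        return "(model response would appear here)"

    def ask(question: str) -> str:
        """Show the standing reminder, then pass the question to the model."""
        print(INTRO_MESSAGE)
        return send_to_model(system=SYSTEM_PROMPT, user=question)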
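
The review process described above also depends on keeping a record of each exchange, including the ones the chatbot could not answer. The sketch below, again hypothetical, appends interactions to a local CSV file for the person responsible for the chatbot to review on a regular schedule; the file name, fields, and the crude “unanswered” check are illustrative assumptions, and any real log should be handled according to the university’s data classification guidelines.

    # A minimal, hypothetical sketch of interaction logging for later review.
    import csv
    from datetime import datetime, timezone

    LOG_FILE = "chatbot_interactions.csv"  # assumed location; store per your data policies

    def log_interaction(question: str, answer: str) -> None:
        """Append one exchange to the review log, flagging unanswered questions."""
        unanswered = "i don't know" in answer.lower()  # crude placeholder check
        with open(LOG_FILE, "a", newline="", encoding="utf-8") as f:
            csv.writer(f).writerow([
                datetime.now(timezone.utc).isoformat(),
                question,
                answer,
                "unanswered" if unanswered else "answered",
            ])

    # Example: the responsible person reviews this file on a set schedule.
    log_interaction("When is the midterm?", "I don't know; please ask your instructor.")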

Campus Resources:

For further information on developing messaging/communication that would resonate with students, contact Academic Support & Retention at uc-retention@uiowa.edu.

For more information on the role of generative AI at the University of Iowa, visit the ITS website.

As we continue to explore AI, and generative AI in particular, we are open to learning more about how you are using it. Faculty and staff can email their AI experiences, questions, and suggestions to ai-feedback@uiowa.edu.