- Make sure you understand your goals for the chatbot. Is your goal to provide students with basic information or with individualized learning opportunities? Is the chatbot intended to automate administrative tasks? A clear understanding of your goals will guide the design and implementation of your chatbot.
- Adjust the chatbot's responses to match your goals. Many chatbot platforms let you control the tone, or voice, of the conversation; choose a style that matches your purpose for implementing the chatbot. For example, a formal tone may not suit a chatbot meant for casual conversation, and a humorous or strongly emotional persona may not be a good fit for a professional setting.
- Introduce any student- or public-facing chatbot with a reminder that private or sensitive information should not be shared with it because, in general, the university cannot protect the privacy of information shared with chatbots.
- Share a link to the University’s data classification guidelines to help users make informed decisions about their interactions with a chatbot.
- Clearly communicate to students the capabilities and limitations of chatbots used in academic settings. Students should understand that chatbots lack genuine understanding, emotions, and subjective opinions, and that chatbot output should be treated as a supplement or convenience; essential information should be verified with an authoritative human source. These reminders about chatbot limitations could be shared in an introductory message or prompt that appears each time students access the chatbot tool.
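As one illustration (not a prescription), the "reminder that appears each time" behavior above can be sketched as an opening message prepended to every session. The wording, function name, and message format below are all hypothetical assumptions, not any product's actual API:

```python
# Hypothetical sketch: show a limitations reminder at the start of every
# chatbot session. The disclaimer text and function name are illustrative.
DISCLAIMER = (
    "Reminder: this chatbot does not truly understand, feel, or hold opinions. "
    "Treat its answers as a supplement and verify essential information with "
    "an authoritative human source."
)

def start_session(greeting: str = "How can I help you today?") -> list[str]:
    """Return the opening messages displayed each time a student opens the chatbot."""
    return [DISCLAIMER, greeting]
```

Keeping the disclaimer in the session-opening path (rather than in documentation students may never read) ensures every user sees it on every visit.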
- To avoid potential harm, explicitly advise users not to rely on any chatbot for emergencies, medical advice, or mental health support. Chatbots cannot assess emergencies or urgent situations and may give inaccurate, incomplete, or misleading information; they cannot replace trained professionals, and relying on them for health matters could have severe consequences.
- Chatbots will not be able to answer every question. It is important that a chatbot tells users when it does not know an answer instead of "hallucinating," that is, providing incorrect information in its responses. This is especially important when users raise personal or sensitive topics such as self-harm, bullying, or food insecurity.
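The "admit uncertainty" behavior above can be sketched as a response-selection policy that sits between the model and the user. The keyword list, confidence threshold, and function names below are all hypothetical assumptions for illustration, not a recommended implementation:

```python
# Hypothetical sketch: a fallback policy for a chatbot's answer pipeline.
# Sensitive topics are never answered by the model; low-confidence answers
# are replaced by an honest "I don't know" rather than a hallucination.
from typing import Optional

SENSITIVE_KEYWORDS = {"self-harm", "suicide", "bullying", "food insecurity"}

FALLBACK = ("I'm not able to answer that question reliably. "
            "Please contact a campus staff member for help.")
ESCALATION = ("This sounds like something a person should help with. "
              "Please reach out to campus support services.")

def respond(question: str, answer: Optional[str], confidence: float) -> str:
    """Return a safe reply given a candidate answer and its confidence score."""
    q = question.lower()
    if any(keyword in q for keyword in SENSITIVE_KEYWORDS):
        return ESCALATION  # route sensitive topics to people, not the model
    if answer is None or confidence < 0.7:
        return FALLBACK    # admit uncertainty instead of guessing
    return answer
```

The design choice worth noting is the ordering: the sensitive-topic check runs first, so even a high-confidence model answer is suppressed on those topics.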
- To keep interactions with users accountable, chatbots should record their conversations so they can be reviewed later, and they should also log the questions they could not answer. Identify a person responsible for the chatbot and establish a regular review schedule for this information to monitor the chatbot's performance.
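The record-and-review step above can be sketched as an append-only log that a designated reviewer audits on a schedule. The function names and the JSON-lines log format here are assumptions for illustration only:

```python
# Hypothetical sketch: serialize each chatbot interaction as one JSON line so
# a designated reviewer can later audit conversations and unanswered questions.
import json
from datetime import datetime, timezone

def format_record(question: str, answer, answered: bool) -> str:
    """Serialize one interaction as a JSON line for an append-only log file."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "question": question,
        "answer": answer,
        "answered": answered,  # False marks a question the chatbot could not handle
    }
    return json.dumps(record)

def unanswered(log_lines) -> list:
    """Return the questions the chatbot could not answer, for periodic review."""
    records = [json.loads(line) for line in log_lines]
    return [r["question"] for r in records if not r["answered"]]
```

A per-line format like this lets the responsible person pull the unanswered questions at each scheduled review and feed them back into the knowledge base.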
- The knowledge base and programming behind any chatbot will need ongoing maintenance and updates to expand its capabilities over time. Budget time and money not just for the initial implementation of a chatbot but also for continuous improvement to support its continued use.
- Students may use chatbots to get individualized help, review content, research a topic, and find answers at any time and from any location. Use interviews, focus groups, or whatever method works best to understand how people will use the chatbot before deploying it with students. Consider the types of questions the chatbot will be asked and review the answers it generates.
Campus Resources:
For further information on developing messaging/communication that would resonate with students, contact Academic Support & Retention at uc-retention@uiowa.edu.
For more information on the role of generative AI at the University of Iowa, visit the ITS website.