Using AI tools is often as simple as typing a question, but what happens to your data behind the scenes is more complex. Understanding the difference between public AI tools and Iowa-supported tools can help you choose the safest option for your work.
Two kinds of AI tools
Public AI tools include consumer websites such as free chatbots you reach through a browser. You sign in with a personal account, and your prompts are handled entirely under that vendor’s terms of use.
Iowa-supported AI tools include Copilot Chat, Microsoft 365 Copilot, and ChatGPT Edu. You sign in with your HawkID, and these tools are covered by University agreements and data-classification rules. They are not magic shields, but they do provide stronger protections and clearer expectations than public tools.
Where your data goes
Public AI tools store your prompts and responses on the vendor’s systems, sometimes for long periods. Depending on the provider, staff or automated processes may review some content to improve services. Never paste sensitive or restricted data into a public tool, even one from a company you trust, unless the tool has first passed a security review. Work with ISPO or your local IT group when you are unsure what data a tool can handle.
Iowa-supported tools keep your content within approved environments. Some services, such as Microsoft 365 Copilot and ChatGPT Edu, require additional licenses and carry associated costs.
- Copilot Chat is designed for general questions and ideas and should only be used with data classified as Public or University Internal.
- Microsoft 365 Copilot works inside apps such as Outlook, Word, Excel, and Teams, using the files and messages you already store in Microsoft 365 under existing permissions.
- ChatGPT Edu runs in a licensed environment tied to the University, and conversations are not used to train OpenAI’s public models. It is also intended for data classified as Public or University Internal; make sure you are signed in with your University account.
In all cases, you remain responsible for what you choose to paste into a prompt.
Will prompts be used to train models?
Public tools may use your inputs to improve their models, depending on the vendor and settings. Even when there is an “opt-out,” you should assume anything you paste into a public tool could be seen or analyzed within that service.
For ChatGPT Edu at Iowa, conversations are not used to train OpenAI’s public models, which reduces the risk of your prompts being repurposed. For Microsoft 365 Copilot and Copilot Chat, University guidance expects you to follow data-classification rules and treat prompts as institutional content.
Quick examples
- Generally safe in a public tool: brainstorming generic ideas for a syllabus theme or event name, without names, IDs, or internal details.
- Best in Iowa-supported tools: summarizing a Teams meeting or drafting an internal memo that references University files, done inside Microsoft 365 Copilot rather than uploading content elsewhere.
- If there is any question about whether your task or data is appropriate for AI, work with your IT leader or support team and submit the tool for a security review to confirm it is safe to use with the information you have in mind. This helps ensure University data is handled correctly in every context.
When in doubt
A quick mental checklist can help:
- Identify the data: is it Public, University/Internal, Restricted, or Critical?
- Choose a university-supported tool that matches the lowest-risk option for that data.
- Remove names, IDs, and other identifiers whenever you can.
- If you are unsure, contact your IT leadership or support team, or reach out to the AI Support Team, which can direct you to the proper University resource for reviewing tool safety and data use.
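For teams that want to build this checklist into a script or internal tool, the steps above can be sketched in a few lines of Python. Everything here is illustrative: the classification-to-tool mapping and the redaction patterns are hypothetical placeholders, not official University policy, and real decisions should always follow data-classification guidance and ISPO review.

```python
import re

# Hypothetical mapping of data classifications to tools that may handle them.
# The real pairings come from University data-classification guidance, not this sketch.
ALLOWED_TOOLS = {
    "public": {"public chatbot", "Copilot Chat", "Microsoft 365 Copilot", "ChatGPT Edu"},
    "university-internal": {"Copilot Chat", "Microsoft 365 Copilot", "ChatGPT Edu"},
    "restricted": set(),  # requires a security review before any AI use
    "critical": set(),
}

def tool_is_allowed(classification: str, tool: str) -> bool:
    """Return True if the tool may handle data of the given classification."""
    return tool in ALLOWED_TOOLS.get(classification.lower(), set())

# Illustrative identifier patterns only; real identifiers vary widely,
# so treat redaction as a helpful extra step, never a guarantee.
PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),  # email addresses
    (re.compile(r"\b\d{9}\b"), "[ID]"),                   # 9-digit ID numbers
]

def redact(text: str) -> str:
    """Strip obvious identifiers from text before it goes into a prompt."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text
```

A quick check before pasting might look like `tool_is_allowed("university-internal", "Copilot Chat")`, then passing the draft prompt through `redact()`. The point is not the code itself but the order of operations: classify first, pick the tool second, strip identifiers last.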
Remember to follow University policies and departmental guidance when you use any AI tool.
Visit the AI Tools page to compare supported options and subscribe to the AI at Iowa newsletter to stay updated on new guidance and features.