AI: Questions and Answers
Questions about features of the approved tools
Q. What do the Google and Microsoft AIs have access to?
Regardless of whether you’re using the basic or pro AI features in Microsoft Copilot or Google Gemini, the AI’s access is limited to information you already own or that has been shared with you. This includes emails, documents (e.g., Word, Excel, Google Docs, Sheets), and other files within your workspace. AI-generated responses will not pull from documents or data you do not have permission to view. In short, the AI respects your existing access rights and does not extend visibility beyond your own content.
Q. What’s the difference between the “basic” and “pro” versions of AI in HMC’s Microsoft 365 for Education and Google Workspace for Education tenants?
When we talk about the “basic” and “pro” versions, we use those terms to capture the fact that both services include some AI features by default and offer others as paid upgrades. These features vary across elements of the platforms (e.g., mail, documents, spreadsheets, Meet/Teams, etc.) and over time, as the vendors continuously add and change functionality.
Differences between the basic and pro versions typically show up in the following areas:
- The level of integration of AI with the rest of the tools in the suite
- Limits on the amount of usage allowed (paid versions have higher monthly limits)
- The number and types of AI models that are available
Regarding Google, our terms “basic” and “pro” currently refer to “Google Workspace Edition” and “Google AI Pro for Education”, respectively. There are no differences between the two with respect to the privacy and security of your data. The main differences are the level of integration of AI with each tool (Mail, Docs, Sheets, Calendar, Chat, etc.), the availability of certain AI models, and monthly limits on usage. Google provides detailed comparisons on their page Compare Gemini for Google Workspace add-ons, and details about privacy and security in the Generative AI in Google Workspace Privacy Hub.
Regarding Microsoft, “basic” and “pro” currently refer to “Microsoft 365 Copilot Chat” and “Microsoft 365 Copilot”, respectively. There are no differences with respect to the privacy and security of your data. Copilot Chat (Included) has access to web sources only, whereas Microsoft 365 Copilot (Add-On) also has access to organizational content and is fully integrated with applications like Word, Excel, and Outlook. The included version also has some limits on usage. Microsoft provides details on their page Which Copilot is right for me or my organization?
Q. Can a department create a separate, insulated AI tool restricted to its own use?
Depending on what is meant by “separate” and “insulated”, there are of course many ways to go about this.
For example, someone can create a notebook within Google’s NotebookLM, restrict it to specific sources (up to 300), and share the notebook, or parts of it, with others in their department. M365 Copilot also offers notebooks, but entire notebooks cannot be shared with others, only individual pages.
More technologically advanced approaches are also possible.
Q. Is it possible to disable integration features in the Basic or Pro versions of Gemini and Copilot, such that they do not connect with other applications like email or Google Docs?
The literal answer is “yes”, but a better answer is probably “sort of, maybe no”.
Google: Yes. Administrators can enable or disable Gemini features, including the side panel, for individuals or groups in the following Workspace services:
- Gmail
- Drive, Docs, Sheets, Slides, and Vids
- Meet
- Chat
The default setting for these Gemini features is “on”. However, turning them off in one service does not prevent users from reaching the same items via Gemini in another service; the AI still has access. For example, if you turn off Gemini in Drive and Docs, a user can still ask Gemini in Gmail about a Sheet they own. (See Google’s documentation.)
Microsoft:
If a user signs into Copilot with their @hmc.edu account but does NOT have an elevated/pro license, they will not have app integration; basic Copilot is mostly not allowed to access apps. Where it gets tricky is that, even without full app integration, Copilot is “aware” of documents currently open in Word or Excel.
Q. In consumer versions of the Gemini app I can delete chats, but I can’t do that in Workspace for Education. Why?
In the consumer version of the Gemini app (gemini.google.com, logged in with a Gmail account), you can manage your chat history and delete chats. At the moment, you cannot do that with Gemini for Education (gemini.google.com, logged in with your HMC credentials). This is because Workspace for Education is designed to allow organizations to comply with laws and regulations (about data retention, for example). See Google’s explanation.
If unimportant chats are cluttering your chat list, you can “pin” the more important ones so that they appear at the top of the list.
Questions about Data Classification
Q. Can you give more examples of data classification at different protection levels?
Here are some examples, drawn from Appendix A of the HMC Data Classification Standard:
Protection Level 4 (P4): This is the highest level of protection, for the most sensitive information.
- Credit card information
- Passwords and PINs
- Social Security numbers
Protection Level 3 (P3): This level is for information that is restricted and sensitive.
- Student education records, like grades
- HMC employee records
- Security camera recordings
Protection Level 2 (P2): This level covers information that isn’t intended for public release but isn’t highly sensitive.
- Exam questions and answers
- Drafts of unpublished research papers
- A college directory of faculty, staff, and students
Protection Level 1 (P1): This is the lowest level, for public information.
- Press releases
- Course catalogs
- Public event calendars
Q. Why is a data classification standard necessary?
Data classification is the keystone of data management, privacy, and security. Without it, it can be difficult to track which data is affected by which regulations (e.g., GLBA, FERPA), who should have access, or how long data should be kept. A practical example is how The Claremont Colleges, each a separately incorporated entity, can use data classification as a guide when sharing student data in Anthology. The full HMC Data Classification Standard is complex at first sight, but in many ways it is an improvement on the vaguer terms “confidential”, “sensitive”, and “public” that were used before.
Questions about processes
Q. How does CIS approach AI?
Where possible, CIS turns off or opts out of any functionality labeled “AI” until a review has been conducted and the functionality is approved for use. In situations where functionality can be turned on for individuals or groups without being turned on tenant-wide, we will turn it on for faculty or staff who request it, after review of the functionality. Not all functionality comes with the ability to turn it on or off; examples include the Copilot functionality built into Microsoft Windows and things like auto-suggest in Google Search (although individuals can turn off AI in Google Search by adding “-ai” to their search terms).
Q. How do I request a “pro” license for Gemini or Copilot?
Faculty or staff may request a paid “pro” license for Gemini or Copilot using our form.
Note that you do not need to request a license for the basic functionality – it is turned on for everyone.
Q. How do I request approval for an AI tool I wish to use?
If the AI is only going to be used with P1 or P2 data, it doesn’t need special approval. Some caution may still be needed with P2 data, as some P2 data is more sensitive than the rest (for example, meeting notes that do not contain P3 or P4 information are classified as P2, but you might not want them out in the world for others to see).
Before you use an unapproved AI tool with P3 or P4 data, please get in touch with either Adele Vuong or Joseph Vaughan to discuss initiating a contract with the vendor.
Other questions
Q. How do these guidelines intersect with the Pilot Program: HMC AI-Use Course Policies?
The Pilot Program initiated by Academic Affairs suggests language and approaches that faculty can use when guiding students about AI usage in their classes.
Independent of which AI usage level is chosen for a course, the AI guidelines regarding HMC data should still be followed. For example, even if your course usage level is “AI Use Level 3: Open AI Use”, that does not mean you can submit P3 or P4 data to any AI; you should still submit P3 and P4 data only to approved systems.
Q. How can I learn more about the HMC policies and AI usage?
You can read the policies themselves:
HMC Policy on Safeguarding Confidential and Sensitive Information
HMC Data Classification Standard
HMC Policy on Incidental Personal Use of Information Technology
Q. How can I submit a new question?
Use our form!