AI Policy & Guidelines

Generative Artificial Intelligence (GAI) technologies are rapidly evolving and becoming an integral part of education, research, and daily work at UW-Green Bay. To ensure responsible and ethical use of these tools, we have established guidelines and policies.

UW-Green Bay Generative AI Policy

This policy document outlines the university's approach to the use of Generative Artificial Intelligence (GAI) by students, faculty, and staff. Key considerations include data protection, security compliance, and individual responsibility in the use of non-University-provided GAI tools. Additionally, the document clarifies that no one at UW-Green Bay can be compelled to use GAI tools that require personal data input, ensuring privacy and equity.

  • Policy Title: UW-Green Bay Generative Artificial Intelligence Policy
  • Policy Number: GB-50-24-2
  • Approval Date: August 23, 2024
  • Approved By: Chancellor’s Cabinet

AI Policy (pdf)

Generative AI Guidelines

Below, you'll find key points and expanded guidance on GAI usage, tailored to our university community. For a detailed overview, please refer to the official GAI Policy document (pdf).

For All UWGB GAI Users:

Acknowledging Use of Generative AI

Use of GAI should be acknowledged not only by students but also by staff and faculty. Depending on your role, discipline, or course, you may need to use an official citation style, such as that of the American Psychological Association, or you may simply add a note at the end of a document indicating it was created in collaboration with Copilot, for example. The key is transparency.

Below are some specific citation styles for GAI.

University AI Tools

We encourage the use of Microsoft Copilot with data protection (signed in with your UWGB credentials). It is the only GAI tool with GBIT approval, and only approved tools are supported by GBIT and the Service Desk. If you log into ChatGPT, for example, and have problems, no one at UWGB can assist you. Moreover, if there is a data breach with an unapproved tool, the responsibility is yours.

Responsibility of Use

Be prepared to take individual responsibility for content accuracy and for similarity to others’ work. It is critical to review any GAI output before you use it. GAI can produce excellent information, but it also makes mistakes and can “hallucinate,” or make up information when it has a knowledge gap (e.g., inventing fictional citations). Do not assume that GAI output is free of other people’s intellectual property: GAI “scrapes” different data sources, such as the Internet, both for training and to produce output.

Bias & Quality

GAI can contain bias. It may seem that a piece of technology should be objective, but users must remember that GAI is trained by humans and/or on human-generated data from the web. That means GAI may carry some of the very same biases people do. Understand, too, that the output quality of GAI will depend on the quality of the information it takes in. A user will need to enter a “prompt” of some kind, whether a question, a statement, or an image, to obtain GAI output. Good prompting will become an important skill.

Security & Compliance

Even when using a UWGB-supported tool such as Copilot, PII and FERPA-protected data should never be entered into GAI (if unfamiliar with these terms, see the FERPA and PII links earlier in the document).

  • Remember that personally identifiable information (PII), such as names, ID numbers, and dates of birth, should not be placed in GAI. Doing so may both violate FERPA and put that information at risk. 
  • Consider the ramifications of placing part or all of your own work into GAI for feedback or editing assistance. Work you created yourself is typically your intellectual property. Unless you are using a GAI tool that explicitly says it will not use or store your inputs, such as Copilot with data protection, your work could be used to train that specific GAI model and be shared with future users. Regardless of the tool, no one can be compelled to place their intellectual property or educational record (e.g., even an essay without a name) into GAI. 
  • Understand that depending on an employee’s role, their use of GAI may vary. 

For Instructors

Specify GAI Policy

Be specific about your GAI policy in your syllabus and in all assignment instructions (e.g., in handouts, in Canvas descriptions). State explicitly your stance for each of these. 

  • Do you forbid the use of all GAI? 
  • Do you allow it in some cases (e.g., on some assignments, or for some tasks such as brainstorming) but not in others (e.g., generating text)? 

Be as explicit as possible and encourage questions and discussion. Students may be advised to look for explicit permission in course documents, so it will be helpful to all if you include it. This work may seem onerous, but it may prevent problems from arising. In addition, imagine the task ahead of students: they may have five different courses in a semester with six assignments each, and each course and all thirty assignments across them could have distinct GAI policies. It is a lot for them to keep track of, too.

Ask Students to Use Copilot

If GAI use is permitted for an assignment, ask students to use Copilot after signing in with their UWGB credentials. Using Copilot in this way is important for several reasons. The use of UWGB login information means they do not have to share PII, such as a cell phone number, to access GAI. Students cannot be compelled to share such information. Although this Copilot version is still not FERPA-compliant, Microsoft has indicated they will not save prompts or use them to train the Copilot model, so there are some security features. Copilot is also a university-sanctioned tool. GBIT will only provide technical support for approved GAI. Finally, it is free to students. Many other GAI tools have a subscription fee or are monetized in some way, producing equity issues. 

GAI Learning Objectives

Note that students will encounter GAI in many different ways at work and in their everyday lives. As such, your department or unit will be asked by the Provost’s office to create and revise learning objectives related to GAI that assist students in achieving a better understanding of it. Each major and minor has multiple learning objectives, however, and not all classes are expected to address every single one of them – it is simply not practical. Curriculum mapping to decide thoughtfully on GAI inclusion will likely be helpful to units. Teaching students about GAI is also not limited to having them practice its use. Students can also learn something by completing assignments about it (e.g., an in-class debate about the ethics of using GAI in medical diagnosis). 

Right to Refuse

If an instructor chooses to use a GAI tool for assignments that require students to input their own work, students should be told at the beginning of the class or semester which tool will be used and how, so they can do independent research on it. Students should be given the right to refuse and to request an alternate assignment with reasonable notice.

Surveillance vs. Trust

In general, having a strong surveillance atmosphere in a course and assuming students will use GAI inappropriately unless carefully monitored can build a culture of mutual distrust that is counterproductive to learning. Good student-teacher relationships in higher education have been found to correlate with multiple positive outcomes, such as engagement, learning, and positive social behavior, and such relationships have been the theme of entire conferences and books.   

There may not now (or ever) be a truly reliable GAI detector, and use of GAI cannot be “proven” in the way plagiarism might have been in the past. In general, as GAI detection improves, so does GAI and its ability to avoid discovery; moreover, the potential negative consequences of false positives for individuals are substantial. Please do not place student work into a GAI tool to try to show it is machine-generated. This article provides some evidence-based strategies to reduce academic misconduct. As noted in past communications from the Provost’s Office, information from a so-called “AI detector” cannot be used in isolation to level misconduct charges.

Academic Integrity Conversations

Have open conversations with students about academic dishonesty. The definition of that concept may be evolving with time and with technological advances. Instructors and students may be justifiably confused. For example, students may have been told throughout high school that they could/should use Grammarly to reduce typos and grammatical errors. Use of that same tool may now result in the student getting suggestions without prompting about how to rewrite some sentences. Is taking those suggestions cheating or not? GAI output results from human prompts. If a student uses GAI information that came as a result of their own prompts, is that misconduct? Did they use “other people’s” words if they are “copying” from a large language model, not a human? These are difficult questions, and reasonable people may disagree on the answers, but they can also be great fodder for discussion and learning. 

Have Questions?

If questions arise about these guidelines, proper citation, or related topics, be sure to consult with your supervisor regarding permissible use and/or any necessary training.