Guardrails and Guidance for Artificial Intelligence Use
The transformative power of Artificial Intelligence (AI) is increasingly felt across disciplines, including our community at UIndy. As we embrace the opportunities presented by AI, it is essential to ensure its responsible and ethical use within our academic and administrative roles. This document aims to foster a transparent environment for exploring and applying AI technologies while upholding academic integrity, data privacy, and ethical conduct.
Bias and discrimination
AI algorithms can perpetuate existing biases (based on factors like race, gender, or socioeconomic status) present in training data, leading to discriminatory outcomes. Users must be aware of and mitigate these risks.
Mitigation Strategies
- Auditing an AI tool alongside peer instructors prior to adoption, and intentionally looking for bias in its output, helps ensure that products adopted for use in class provide fair outcomes for all individuals in the course.
Openness to Feedback
- Encourage reporting of examples of bias in AI applications, and review that feedback with peer instructors to determine whether the product should continue to be used. Report bias to the company to see whether it is willing to address and rectify any discriminatory outcomes.
Copyright infringement
Faculty should ensure AI-generated materials do not infringe on copyrights.
Accessibility concerns
AI tools used for classes should be accessible to all students, including those with disabilities.
Data privacy concerns
AI tools often collect and process the data they are given to enhance training of the underlying model. Even if you are not submitting student work, assume that any information you submit may become part of the model's training data and could be exposed publicly.
Openness and transparency
- Faculty should be open about their use of AI tools and explain how they ensure responsible and ethical application.
- Disclosures: Clearly state whether content was generated by AI or humans, especially in research papers, presentations, and creative works.
- Limitations: Acknowledge the limitations of AI models, such as potential biases or inaccuracies, to avoid misinterpretations and ensure trustworthy results.
- Citations: When using AI-generated content, cite the specific tool or model used if appropriate. APA and MLA provide specific guidance.
- Attribution: Ensure proper attribution for AI-generated content, especially when used in scholarly or creative works.
Focus on learning outcomes
AI tools should be used to enhance learning, not replace essential human skills and interactions.
Regular review and evaluation
Continuously review and evaluate how AI tools are being used to ensure they are beneficial and align with faculty and university values and outcomes.
Training and support
Take advantage of AI training and support opportunities offered by the university and other higher-education-focused organizations. For example, Google offers a free, self-paced course designed to help educators write better prompts and incorporate AI tools into their daily work.
Examples of acceptable use
- Using AI in the drafting and development of policies, communications, or processes that do not include Sensitive Institutional Data
- Use of AI in the analysis of a de-identified dataset (see the sketch after this list)
- Use of an approved AI app, developed by a third party under contract, to evaluate academic work
- Using AI to write or debug code
- Asking students to use an AI tool to do the following within your course:
- Research assistance (e.g., literature review, data analysis)
- Creative exploration (e.g., music composition, image generation)
- Writing revision (e.g., grammar assessment, active vs. passive voice)
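One of the acceptable uses above is AI-assisted analysis of a de-identified dataset. The sketch below illustrates one way de-identification might look in practice before anything is submitted to an AI tool; the column names, file names, and hashing step are illustrative assumptions, not a University standard, and the pandas library is assumed to be available.

```python
# Minimal sketch (assumed column and file names, not a UIndy standard):
# remove direct identifiers from a spreadsheet before any AI-assisted analysis.
import hashlib

import pandas as pd

# Columns assumed to directly identify individuals; adjust for your own data.
DIRECT_IDENTIFIERS = ["student_name", "ssn", "email"]


def deidentify(path_in: str, path_out: str) -> None:
    df = pd.read_csv(path_in)

    # Drop any direct-identifier columns that are present.
    df = df.drop(columns=[c for c in DIRECT_IDENTIFIERS if c in df.columns])

    # Replace the student ID with a one-way hash so rows can still be grouped
    # without exposing the original identifier.
    if "student_id" in df.columns:
        df["student_id"] = df["student_id"].astype(str).map(
            lambda s: hashlib.sha256(s.encode()).hexdigest()[:10]
        )

    df.to_csv(path_out, index=False)


if __name__ == "__main__":
    deidentify("course_results.csv", "course_results_deidentified.csv")
```

Even after steps like these, combinations of the remaining fields can sometimes re-identify individuals, so consult IT (and the IRB for research data) before submitting any dataset to an external tool.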
Plagiarism and AI detection without human review
While AI tools can flag potential plagiarism, relying solely on them can miss nuanced cases or misinterpret citations. Human expertise is crucial to ensure fair and accurate assessments of originality. Additionally, AI detection tools have a high false-positive rate, and over-reliance on them creates an atmosphere of suspicion.
Generating entire lectures or research papers
Using AI to write complete materials undermines academic integrity and deprives students of opportunities to develop essential skills in critical thinking, research, and written expression.
Personalization without student agency
Algorithmic personalization tools for learning can be beneficial, but they become problematic if they limit student choice and exploration or reinforce stereotypes and biases. Adoption of any tools specifically for personalized learning should be vetted at the school or college level with assistance from the Faculty Academy and IT.
Examples of unacceptable use
- Submitting student work for grading to any software or service, whether or not it leverages AI, where the University does not have an established contractual relationship with the service. University contracts for software or web-based services cannot be established outside of IT and the General Counsel's Office.
- Submitting any personally identifiable information to any software or service, whether or not it leverages AI, where the University does not have an established contractual relationship that specifically allows the service to receive that information.
- Submitting or importing a spreadsheet containing Social Security Numbers and other identifying information to an AI tool or model for analysis.
- Submitting student work to any unapproved, uncontracted AI app for assessment (e.g., taking a picture of a completed math worksheet and uploading it to an AI app for grading).
- Including unattributed AI-generated content in any document represented to be your own work.
Additional detail will be provided in this section over time as our adoption of AI matures.
- Educational Workshops: Workshops on responsible AI use will be offered by the Faculty Academy.
- Faculty Consultation: Faculty members can consult with the Faculty Academy and the Learning Resource Committee on integrating AI tools into their curriculum and research.
- Support: The IT Help Desk is available to answer questions related to AI usage or availability for faculty, staff, and students.
- Library Support: The Library can help with tools for conducting library research and evaluating scholarly resources. LibGuides will be made available by the Library.
- Research Support: The IRB will publish guidance related to the use of AI in research settings.
Portions of this document were produced with the assistance of Google Gemini.