The Responsibility of Educators
AI is going to present teachers with incredible opportunities, but they will have to consider carefully how they teach and assess students now that free platforms like ChatGPT are widely available. While AI can serve as a tool to help both teachers and students learn, it cannot replace what is often the best part of higher education: the discussion, critical thinking, and mentorship that occur in and out of the classroom. There is a history of tools that initially seemed threatening to education but were later incorporated into instruction (e.g., calculators, Wikipedia), and faculty would be well-served by approaching generative AI in a similar manner (Hicks, 2023).
While there is still debate about the extent to which generative AI will transform education (Marcus, 2023; NeJame et al., 2023; Office of Educational Technology, 2023), faculty will need to encourage students to learn how to balance the information they get from AI with their own perspectives and creative expression. Students will need to learn how to use these tools because they may be required to master them in their jobs after graduation. However, it is important to stress that using artificial intelligence is not the same as thinking, which is why this technology should not control curriculum or content. Instead, faculty should set expectations and policies about the use of AI so that students have clear guidance.
Despite the challenges for academic integrity and student learning, students will need to use these tools, and educators have an obligation to instruct them in AI literacy, ethics, and awareness. Part of higher education's obligation in this regard is to include disciplines outside the STEM fields in researching and contributing to our knowledge of AI development and capabilities (UNESCO, 2022).
In a recent discussion among faculty on this topic, Nicola Marae Allain, Dean of the School of Liberal Arts and Humanities at Empire State University, looked further into the future: “So these tools are here now. They’re only going to get better, more complex, more ubiquitous, present, and available.”
We have a responsibility to our students to help them understand artificial intelligence, she points out. But we also have a responsibility “to think carefully about: How should it be used? Where should it be used? How could it be used? What do students need to know to use AI effectively in their fields?”
It’s also worth mentioning that AI tools can be incredibly helpful for faculty and staff – these changes don’t affect only our students. We can use this technology to create examples, quizzes, sample essays, in-class activities, discussion questions, study guides, and other resources for students. AI can also be used for plagiarism detection, research assistance, or to help give feedback on student work. However, just because you can use it doesn’t mean you always should. Chapter 3 of this document provides additional detail on considerations to keep in mind when deciding if and when to use AI.
The World Economic Forum (Partovi & Hongpradit, 2024) posits seven principles for using AI in education:
- Purpose: Connect the use of AI to explicit educational goals.
- Compliance: Ensure that the use of AI supports institutional policies.
- Knowledge: Teach the skills needed to use AI properly.
- Balance: Understand the risks associated with AI and do not overuse it.
- Integrity: Promote ethical use of AI.
- Agency: Keep human decision-making at the forefront.
- Evaluation: Regularly review the role and impact of AI.
The Impact of AI in Other Contexts
While it is critical to consider the ethical impact of AI like ChatGPT on academic integrity and academic dishonesty, other aspects of higher education with ethical components will also be affected. AI has been integrated into processes in human resources, financial aid, the student experience, diversity, equity, inclusion and belonging (DEIB), and institutional effectiveness. In many cases, AI integration in these domains is meant to enhance decision-making and assist in data analysis (du Boulay, 2022; Holmes et al., 2023; Naik et al., 2022; Nguyen et al., 2023). One must also consider the ethical impacts of integrating AI into educational platforms, such as expert systems, intelligent tutors/agents, or personalized learning systems/environments (PLS/E), and into teaching and learning practices (du Boulay, 2022; Hutson et al., 2022; Ungerer & Slade, 2022). Students and educators need to be aware of the type of content they are feeding into an AI tool, especially because the more it’s used, the more data it gathers.
Information Security Concerns
Information security is a crucial factor to consider when adopting generative AI tools (Piscia et al., 2023). It is important to evaluate the information required to use generative AI tools, the confidentiality of completed queries, and the potential for data breaches. Additionally, data sharing between the tool and private entities must be evaluated.