Strategies for Student Input
When considering strategies for AI use in higher education, it is critical not to put the cart before the horse by crafting usage policy with significant implications for students without student input. A holistic approach that includes student input and involvement is paramount. Carving out space to ensure faculty are familiar with, involved with, and respectful of student users in all their plurality increases the chances that these efforts help rather than impede students, while also giving some assurance that practices and policies are crafted from an informed (rather than speculative) perspective and therefore have greater potential to endure.
Institutions of higher education have to know their student bodies well enough that their approach to AI, and the policy surrounding it, bears their students in mind; at the same time, adhering to a shared standard, e.g., a SUNY-wide approach (not mandate), is just as important a litmus test for gauging where a particular institution’s students stand in relation to the system as a whole. Thus, faculty and administration should be vigilant in their awareness of student AI usage, gathering qualitative and quantitative feedback that can continually be evaluated, assessed, and synthesized into respectable, relevant, and flexible policy while also preparing students for a world with AI. According to Veera Korhonen (2024), a research expert covering United States society data, “a 2023 survey in the United States [revealed that] 85% of undergraduate students would feel more comfortable using AI tools if they were developed and vetted by trusted academic sources.” As educators in higher education draft policy, we need to remain cognizant of the fact that many students are already immersed in the world of AI and are waiting for us to catch up.
Once the dust (excitement, confusion, repulsion, narrative, and myth-making) begins to settle, more deliberate strategies for gathering student input will begin to surface (Walter, 2024). Regular, voluntary, and anonymous surveying, whether in person or digital, remains a dependable approach. Utilizing QR code-linked surveys to catch students in passing on their phones may yield more (and potentially more accurate) responses, given the familiarity and personal nature of the medium. These links can be deployed in focused environments such as classrooms and class LMS pages, or in more communal spaces such as the library, student union, dormitories, or the college website. The library in particular is a prime location, as the work of information literacy [read: AI literacy] falls directly in its wheelhouse. Given its open, communal setting and the relationships often formed there among students, faculty, and librarians (also faculty), the likelihood of more candid and in-depth responses to the survey questions raises the potential for greater “statistical power, credibility, and generalizability” (Fass-Holmes, 2022).
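For a department or library that wants to experiment with this approach, producing the QR code itself is a small task. The sketch below is purely illustrative, assuming Python with the open-source qrcode library installed (pip install "qrcode[pil]"); the survey URL is a placeholder to be replaced with a link to the institution’s actual survey instrument.

```python
# Illustrative sketch (assumes the open-source "qrcode" library and a
# placeholder URL): generate a printable QR code that links to an
# anonymous student AI-use survey.
import qrcode

# Hypothetical survey link; replace with the institution's real survey URL.
SURVEY_URL = "https://example.edu/ai-use-survey"

# Build the QR code image (a scannable black-on-white image by default).
img = qrcode.make(SURVEY_URL)

# Save for printing on flyers, slides, LMS pages, or library signage.
img.save("ai_survey_qr.png")
```

The resulting image can then be dropped into a syllabus, a slide deck, or signage in the communal spaces mentioned above.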
Aside from open and honest classroom discussion of AI use, as well as instruction around the ethical use of AI, both of which will soon be much more the norm, students should be surveyed broadly but concisely regarding which AI tools they interface with and how, the quality of the output, and their perspective on how appropriately their professors and/or institution have or have not integrated AI into the classroom, with consideration given to how they anticipate it affecting their futures. This sample survey attempts to cover these key areas in an approachable three-to-five-minute format that collects both qualitative and quantitative data. The survey tool automatically synthesizes responses into digestible, reportable data sets, which we can use to design courses, stay abreast of changes in specific degree areas, and observe how AI is being used in career fields.
Similarly, if building a campus-wide survey is not feasible, a smaller-sample alternative that still yields important feedback on student AI use is to center a survey within an instructor’s course. In-class surveys yield better answers than a survey disseminated by an unknown source and, as a bonus, help faculty learn about their students. Keeping surveys anonymous removes the fear students may feel that their answers could affect their class grades. In-class surveys have the potential to produce honest answers, especially if instructor-student relationships have grown over a semester, and a survey helps students feel integral to the design of the learning. Biesta and Stengel (2016), in “Thinking Philosophically About Teaching,” refer to the relationship between teacher and student as a partnership in the process of questioning, and there are innumerable questions to ask about the use of AI in higher education (p. 15). Additionally, reciprocal sharing, as opposed to a top-down power structure, creates a stronger course, one that reflects the interests of students and thereby correlates with greater student success (Freire, 1968). “Voices Inside Schools” by Carol R. Rodgers (2002) speaks to the sharing that is important in the classroom: “I encourage teachers to value student feedback as critical to understanding students’ learning . . . most teachers rarely take the time to engage their students in conversations about their learning” (p. 233). A survey allows a teacher to be a learner and look at the subject matter through the eyes of their students (Rodgers, 2002, p. 243).
Surveys in the classroom allow professors to make evidence-based decisions about adjusting teaching practices, such as how to incorporate AI. If a survey shows which AI tools students engage with, instructors can then build content that helps students use those tools responsibly. A survey has the added benefit of serving as a professional development tool, as it encourages instructors to engage and grow in their field on an ongoing basis. As the article “Improving Teaching With Expert Feedback—From Students” (2016) argues, information acquired from a survey improves a professor’s effectiveness and keeps the classroom student-centered: “By listening to their students, educators can continuously evolve and enhance their teaching practices.” As we work toward reciprocal relationships with our students and learn what AI will look like in the classroom, we must also remain focused on protecting our students’ identities and teaching them how to do the same.
Preparing students for life after college requires that we adequately equip them for the workplace and a broader audience. Keeping abreast of the ever-changing landscape of AI is our professional and ethical responsibility, allowing us to help our students reach their long-term goals of economic and upward mobility and readiness for the future workplace. As we build relationships between university curricula and careers, it is important that we also teach students how to protect their identities, as we have a professional obligation to safeguard student data privacy. For example, all personally identifiable information must be omitted from AI prompts to comply with the Family Educational Rights and Privacy Act (FERPA). Additionally, we must “think of AI Chatbots as public data warehouses,” so as you incorporate AI into your classroom, Copilot, a Microsoft AI tool, recommends the following:
- Understand that the data you enter into an AI chatbot may be stored by the company running the tool.
- Companies use this data for training future models, potentially including user-submitted information.
- Even if anonymized, there’s always a risk of data breaches. (OpenAI, 2024)
Teaching our students to be careful with personal information by removing specific identifiers is part of helping them stay safe with that information when they move beyond our classrooms. For instance, using Copilot instead of ChatGPT allows the user to see where Copilot is pulling information from, as it shares its sources with the user. This added benefit allows us to work with our students on checking the credibility of the sources from which Copilot pulled information. This is an important way to build students’ information literacy skills, which are part of the SUNY General Education Framework implemented in Fall 2023.
Certainly, if we are considering using survey data for research publication, it becomes critical to protect ourselves and our students through institutional support and the institutional review board (IRB) process that comes with it. Additionally, it is important to be aware of the timing of surveys in relation to the broader institutional and discipline-specific surveying calendar, as well as of student survey fatigue (Fass-Holmes, 2022).
A Statista survey (2024) revealed that “65% of students also believe that AI will improve how they learn, rather than having negative consequences on learning.” If we are to take anything away from this statistic, it is that, as educators, we have an obligation to help guide our students through the uncharted waters of AI. If we embrace where we are with AI, rather than viewing its use through a punitive lens, the changes it brings to higher education become not something to fear but a journey to share with our students.