What are some of the risks to students of relying exclusively on AI?

Students know that using AI to complete assignments carries risks comparable to those of plagiarism or other academic misconduct. They may be less aware, however, of other risk factors:

  • Reproduction of biases already prevalent in online content. Because LLMs are trained on vast amounts of existing online text and generate responses by reproducing the language patterns found there, it is inevitable that biases in that content will be reproduced in their output. This risk is important for students and instructors to consider in terms of both form and content. In terms of form, linguistic bias is perpetuated because AI relies on language models already steeped in what composition scholar Asao B. Inoue terms “White Language Supremacy”; when words like “essay,” “report,” or “academic” are used to prompt a response from the bot, it will draw on the dominant discourse patterns traditionally associated with those terms, potentially further relegating historically marginalized varieties of English to the status of “not academic.” In terms of content, stereotypes and falsehoods can be reproduced through the algorithmic categorization and combination of words and phrases, and of the people, ideas, and events those words refer to.
  • Inequity in access. While some students may be able to afford the latest AI models or the newest AI apps and plugins, others may not have the same access. In addition, some students may feel uncomfortable sharing personal information with a chatbot, which can limit how willing they are to use these tools at all.
  • Creation of false information. Chatbots are known to produce “hallucinations”: confident responses that are, in fact, false. Students may turn in work that includes false or fabricated information and must be made aware of the fallibility of AI chatbots.
  • Generic or average responses. Responses generated by AI tend to represent the most common or average response to a particular topic rather than the most insightful or in-depth one.
  • Implications for the development of critical thinking, reading, and writing skills. Education research has suggested that when students’ drive to complete assignments is externally motivated (for instance, by points or grades), their ability to remember, synthesize, and evaluate information suffers. Students should be made aware that relying exclusively on AI chatbots to produce writing for them may hinder their learning, potentially making tasks in more advanced classes or in their desired profession more difficult. This is not to suggest that the use of AI is always inconsistent with critical thinking. Students may engage with AI in ways that help deepen or advance their thinking. Indeed, prompt engineering (creating prompts and questions to elicit desired responses from a generative AI model) is a complex task that requires thought and persistence. However, if students rely solely on a chatbot to write a paper, they may miss out on developing important skills.
  • Implications for data privacy. ChatGPT and other AI tools collect data about the conversations they have with users. This data may include personal information, such as names and locations, as well as sensitive information, such as opinions and beliefs. It could potentially be accessed by the AI developers or by other parties, and there is also a risk that it could be used for purposes other than those for which it was collected. For example, it could be used for targeted advertising or to influence individuals in ways they may not be aware of.
  • Fallibility of AI-detection software. Students are at risk of being wrongfully accused of plagiarism or cheating because of the inaccuracy of AI-detection tools. Recent research has shown that multilingual writers are wrongfully accused more often than their monolingual English-speaking peers.
