You may recall units and lessons on media literacy from your English classes, covering concepts such as detecting bias and evaluating sources. Now, with the use of artificial intelligence (AI) on the rise, College of DuPage’s English department recognizes a need to address the technology’s emergence and prevalence and aims to develop a solid framework for teaching AI literacy.
Jason Snart, an English professor at COD, acknowledged the importance of addressing the rise of generative AI tools to help students, and future workforces, adapt to them.
“It’s hard to avoid,” he said. “Generative AI is such a prominent thing in the news and the media. It’s not like we sought it out as a thing to do; it was prominent. We needed to address it somehow. It’s just this black hole of this thing that students are using and being exposed to. I imagine that if not already, then certainly a few years down the road, almost every profession out there will want their people, whatever the job is, to have some level of generative AI sophistication, not to cheat or skirt the system, but to use it wisely, ethically, efficiently to produce good output. To just ignore that kind of does a disservice to students.”
According to a Forbes Advisor survey, 60% of educators already use AI to “improve and streamline daily teaching responsibilities.” More than half of respondents stated that AI “improved educational outcomes.” However, the survey also found educators are concerned about AI use in the classroom, particularly when it comes to “plagiarism in essays/work” and “reduced human interaction in learning.”
Chris Slojkowski, a Computer and Information Technology student at College of DuPage, uses AI tools to analyze internet search results.
“The AI I most commonly use is the version of ChatGPT that Microsoft offers with Bing,” they said. “I’m honestly just surprised how good it is. I mean it’s Bing. I’ve found it’s very helpful because it can draw off online sources. ChatGPT is just a language model, so being able to basically form a useful search query, search and then have it summarize the results is really helpful.”
Karina Ortiz, a pharmacy student at College of DuPage, said using AI to enhance learning and comprehension of academic concepts is reasonable, but that over-reliance is where it becomes an issue.
“I think to a certain extent it’s OK,” she said. “For example, to check your work or to see if you are on the right track. But not relying on it for your homework answer, because you might get the answer right on your homework, but when it comes to the exam, you’re not going to know the actual material because you’re just relying on the AI.
“If you’re using it just as a guide to see how you’re doing,” Ortiz continued, “if you’re on the right track to solving the problem [and to] help you understand, then I think it’s OK.”
Snart elaborated on why the English department specifically is involved in such efforts.
“Developing what we call critical Gen-AI literacy for students and for ourselves is really like the kind of thing that we’ve been teaching in composition, rhetoric and English departments forever,” he said. “Information literacy, identifying credible sources, being very intentional with language, all those kinds of things, we have a long history of doing that work. So it’s a natural fit for what we’ve always been doing. It’s just a very new variation on that stuff.”
Snart said the key to combating AI-related academic dishonesty is a balanced approach that acknowledges both the potential ethical and unethical uses of AI.
“Old-time plagiarism is just copy-and-pasting from the Internet as if you could write this thing when clearly you can’t,” he said. “So generative AI is just a different tool that allows for that. That’s still clearly the problem. It’s always been, but we want to find that middle space where we’re not cheating with it, not ignoring it because that doesn’t seem very productive, but we’re getting students to think critically about it and to use it most effectively and efficiently as this tool that could impact just about every kind of moment in the larger writing process. It’s always situational, like ‘How could this tool be most effective?’ ”
Ortiz said her professors generally encourage a balanced use of AI in a way that does not hinder learning and understanding.
“I feel like from my personal professors, they’ve encouraged it in that way to use it to maximize your understanding,” she said. “Don’t use it just to get the right answer, because when that quiz and test comes along, the professor gets to see that we don’t really know the material.”
Slojkowski acknowledged the nuanced academic challenges people often encounter, stating that AI can sometimes help in overcoming such issues.
“People don’t always have access to a teacher that works for them,” they said. “Having an entity you can talk to 24/7 that can rephrase obtuse articles, answer questions, etc., is valuable. I think some criticism is warranted, and you shouldn’t rely on it too much, but if you know how to use it, ChatGPT can act like a sort of ‘talking book’ to help you get through things. In particular, I wish this existed when I was taking a philosophy class. [It could have been] really interesting to prepare myself for discussions by having them beforehand with Bing’s AI.”
Snart pointed out that while AI is capable of generating content at rapid speeds, that same content tends to lack substance.
“It produces very readable prose,” he said. “It’s stylistically good. There’s no common mistakes, but it also doesn’t necessarily produce really great depth of substance. It’s very surface level in what it produces. It’s very repetitive. A five-bullet point thing about whatever topic I’ve asked it to outline for me like three of those five are likely to be just slight rewordings of the same basic idea. I’m just not at a point where I would ever want to turn over that kind of creative or developmental work that I like to do as a teacher to some machine. That would be an unnecessary or inefficient use of the tool because I’d have to fact-check it.”
He predicted AI will likely become a topic of discussion or study in the classroom.
“Each kind of person will develop their own assignments or their own ways of engaging with AI as a conversation with students to see what students think and how, if they have used it before, what did they think of it,” he said. “All of those particulars will be idiosyncratic to each individual professor.”
Ortiz has never had a professor use AI directly in the classroom or instruct students to use it critically, but she is intrigued by the idea.
“I’m interested to see how that [will be],” she said. “I’m not really familiar with that, so it’d be something new.”
Snart sees English professors collaborating with experts across disciplines to better understand and implement AI in higher education.
“The conferences that are sponsored by those organizations will begin to feature more and more discussions and collaborative work from people across institutions,” he said. “I think that will help to bolster our discussion here. We reached this point where we’re more visible like the English department is, but also psychology, computer science or math.”
The Modern Language Association has developed a “Student Guide to AI Literacy,” available on the MLA’s website. The English Department at COD has also developed an educator resource on AI on its website.