By Erin Peterson / Illustrations by Marcos Chin
Generative artificial intelligence has come on strong. What does that mean for teaching and learning?
Spanish and Portuguese instructor Claudia Giannini remembers the moment when a new artificial intelligence tool upended her teaching.
It instantly translated short texts, giving students in language classes a potential shortcut. “Although still imperfect, it was such a huge jump from previous machine translation systems. It was impressive,” she recalls. “But it was also a problem in the classroom.” She knew she’d have to change some of the teaching techniques she’d relied on for years, and fast.
Giannini’s experience may sound like many professors’ reaction to the November 2022 launch of ChatGPT, an artificial intelligence chatbot that communicates by text in uncannily human ways. Instead, it was 2016, the year that Google released its neural machine translation service powered by deep learning, the approach on which today’s generative AI technology is built.
While it was true that Google Translate couldn’t artfully translate a poem or literary work (or even a newspaper article), it could quickly translate some of the written assignments students typically tackle as they learn the basic building blocks of a foreign language. And for some of these students, it could seem like an easy way out of assignments.
Giannini quickly adjusted her approach. She started weighting class participation more heavily in student grades. She swapped out many written assessments for oral ones. She had students write the first drafts of their essays in class. And she strategized with her colleagues, who were facing similar challenges.
In some ways, Giannini has had a head start on understanding the transformative impact of AI in the classroom. She sees both the technology’s challenges and its potential. And as a new crop of generative AI tools—from ChatGPT to GitHub Copilot—affects education in nearly every discipline, it’s a topic that almost no one in the classroom can avoid today.
At Macalester, professors and students are not digging in their heels against the changes these tools will bring, but are instead stepping mindfully into this new world.
The future starts now
Generative AI—artificial intelligence that creates new material based on patterns it identifies in data—was barely on the radar for most faculty and students as late as October 2022. But it wasn’t long before higher education as a whole was on high alert. “I read the Chronicle of Higher Education and Inside Higher Ed every morning,” says professor of international relations and political theory Andrew Latham. “And the level of anxiety around AI, on a scale of one to ten, is an eleven.”
Although robust data is still relatively rare and change is happening quickly, early surveys suggest that AI is already influencing higher education. Two surveys conducted in August 2023, for example, found that anywhere between 20 and 38 percent of American college students were using AI tools at least monthly. Meanwhile, a survey of hundreds of Harvard University faculty members in the spring of 2023 found that just 21 percent believed AI would have a positive impact on education; 47 percent believed the impact would be negative.
At Macalester, attitudes continue to evolve. Professor of environmental studies Chris Wells, for example, admits he was dismissive of ChatGPT when he first tested it. “I had it write a bad poem—it was like a parlor trick,” he recalls. When he gave ChatGPT one of his own assignments, it returned nothing more than “slick-sounding BS” that wouldn’t pass muster in his classes.
But he kept tabs on the technology, and he began to see examples of more meaningful uses of the tool.
He was finally convinced to take ChatGPT more seriously when he heard a podcaster frame resistance to the new technology as a liability, not a moral high ground. “They said that in academia, they call the use of generative AI cheating, but in business, they call it creativity and innovation,” he says. “I just don’t see a future in which AI doesn’t become a standard part of how people think, write, and communicate. We have to figure out what it means to live in this new world.”
This past spring, in his upper-level research and writing course, “US Urban Environmental History,” he and his students had in-depth conversations about the ethics and opportunities of using generative AI tools.
In one class, for example, he asked students to share what made them most uneasy about using ChatGPT and similar technology. They identified a range of issues: its significant energy use, large language model training practices that benefit from copyrighted work in unethical ways, and its facilitation of plagiarism, for starters.
But they also discussed reasons for excitement, as well as the ethics of avoiding a technology so powerful that it could fundamentally disrupt society. “There’s a lot of hype to generative AI, but there’s also a ‘there’ there,” Wells says. “And we’re all just trying to figure that out.”
AI attempts to replicate a human artist
We hired Brooklyn, N.Y., illustrator Marcos Chin to create the artwork for this story. Then we fed AI image generators a prompt to see what they came up with, and compared the two approaches on this page. Chin wrote about the experience: “I saw this as an opportunity to dig into what my strengths are as a human being—an artist. I knew that I wouldn’t be able to compete with AI in regard to speed and the number of sketches I could make in a short period of time. But what I did have was just that—time. I had time to feel, to remember, to think, to ruminate. I spent some days thinking about concepts while pacing around my apartment, walking my dog, and having conversations with my partner. Moreover, I knew that I had lived experiences and opinions about this topic that informed my approach.”
Sketches by Marcos Chin / Images generated by Adobe Firefly
Finding the right balance
After ChatGPT’s public rollout in late 2022, Macalester faculty were immediately interested in grappling with the challenges of generative AI. By January 2023, the Serie Center for Scholarship and Teaching had organized a panel and faculty discussion about AI and teaching. Britt Abel, director of writing and a co-organizer of the event, describes the turnout as “massive.”
That interest encouraged Abel and associate library director Mozhdeh Khodarahmi to form an AI working group and a faculty and staff learning committee. Their work led to a report on AI literacy and critical thinking that offers robust guidance for faculty and students and has been praised by the Macalester community—as well as national and even international audiences.
The working group has hosted ongoing presentations with energetic discussions about the ways that instructors and students can harness the power of these tools effectively to improve their teaching and learning.
For students, AI tools can make beginning an assignment less intimidating. Ada Bruno ’24 (Cranston, R.I.), who teamed up with two students to write a paper about the use of AI at Macalester for a news reporting and writing course, says she has used AI to help her do early thinking on some projects. “If I need an idea for a project, it can be helpful for brainstorming,” she says.
Still, she admits that its limitations are abundantly clear, even with relatively simple, clearly delineated tasks. “It’ll come up with ten ideas, but it doesn’t have the same kind of energy or collaborative spirit as a face-to-face interaction,” she says.
Faculty, too, have found ways to use the tools to support their teaching. For example, Giannini has been using ChatGPT in her advanced classes. First, she asks students to analyze an issue or a text related to a class topic the way she did before the advent of generative AI. Then, she has them ask ChatGPT the same questions she posed to the class and critique its output. “They can see how much better they do in their own analyses—and they can also see how much ChatGPT ‘hallucinates’,” she says, referring to the false information that can be created by these large language models.
Abel, who is also a professor of German, says the tools can be very valuable to faculty who are early in their teaching careers. For example, a professor could ask an AI tool to provide a detailed list of potential classroom activities, such as a movie analysis or a cooking class, to support student learning at a specific language level. They could also ask ChatGPT to create a rubric to help assess student learning for this activity. “It’s pretty powerful at putting together a rubric if you’re using nationally accepted standards and coming up with specific activities related to those standards,” she says.
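To make that concrete, a prompt along those lines might read something like the following (a hypothetical example of ours, not one of Abel’s):

“Create a grading rubric for a five-minute oral presentation in an intermediate German class. Align the criteria with the ACTFL Intermediate Mid proficiency guidelines, and include categories for pronunciation, vocabulary range, grammatical accuracy, and organization, each scored on a four-point scale.”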
Wells says he finds ChatGPT most useful when he imagines it as another person. “If you use the analogy of an intern, you can think of ChatGPT as someone who works very hard and very quickly, and who is so eager to please that they will make stuff up in order to try to satisfy you,” he says.
With that mindset, he says, faculty and students can reorient their approach to the technology. For Wells, that means that he spends a significant amount of time defining the task or question in clear and often excruciatingly granular detail. He’s even developed a seven-point template that he uses for prompts that includes identifying the audience, specifying style and tone, and using examples for clarity.
This is work that requires its own kind of thinking and analysis, and students benefit from learning these skills, says Wells. “There are so many details we don’t think to stipulate, but the AI still has to decide for you,” he explains. “It’s when those default decisions don’t line up with what you want that you often get a bad output.”
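For a sense of how granular such a prompt can get, consider a hypothetical example built along the lines Wells describes (the prompt is ours, not drawn from his template):

“Audience: first-year college students with no background in environmental history. Task: explain, in three short paragraphs, how streetcars shaped the growth of American cities between 1880 and 1920. Style and tone: clear and conversational, with no jargon. Example: open with a concrete scene, the way a magazine feature might, rather than with a definition.”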
Of course, there’s a fine line between getting help from an AI tool and plagiarism. It’s why the Macalester working group developed an updated academic integrity statement that bars the unauthorized use of generative AI tools in coursework.
Still, while AI-facilitated plagiarism has been one of the most significant concerns for many educators and institutions, Abel says that Macalester’s structure, philosophy, and processes give the institution distinct advantages in an AI world. “Our faculty design really good writing assignments. We have small class sizes. We have students free write and brainstorm before they write an essay, and we have them write what writer Anne Lamott calls ‘sh***y first drafts.’ We spend a lot of time on writing, which is an iterative process, and as a result, we know our students’ voices.”
And while professors are quick to acknowledge that they would be hard pressed to detect AI cheating, they also know that the students who come to Macalester are typically hungry to do the kind of rigorous academic work that the college requires.
Latham says he often uses an athletic analogy when he talks to students about their use of AI. “If you decided that you were going to do a triathlon, and you had access to the best gym and the best coaches in the world, and you paid a bunch of money to do it, why on earth would you have someone else do the workouts for you?” he asks. “I tell them: Your education is a big investment, so make the most of it.”
I am not a robot
If AI tools have shaken up teaching and learning, they have also opened up opportunities. In some cases, they’re leading professors to rethink how they teach.
Before ChatGPT, for example, Latham had focused on having students complete traditional writing assignments. He has since replaced many of these projects with reflection papers and invitations for his students to come to his office to discuss their growth as scholars and as people. “I tell them that this is not a moment for me to judge you and to grade you. This is a moment for you to reflect on what you have actually learned,” he says. “And these papers and conversations are fantastic. I get the strong sense, in a way that I never have before, that they’re experiencing real growth as human beings. They’re not just ticking boxes and pretending that they know what I talked to them about three weeks ago.”
He pauses. “Are these reflection papers AI-proof? Probably not. But it’s pretty hard to ask an AI to write about what you’ve learned,” he says. “These are wonderful pedagogical moments, and I wish I would have done this twenty-five years ago.”
It’s this part of the AI transformation—the thoughtful analysis about what teaching and learning can look like, the re-engineering of classes to encourage critical thinking in new ways, and the increasing focus on human connection that is central to a Macalester education—that gives Latham hope about what lies ahead. “It’s not all rosy,” he says. “We’ll have to change things. We’ll have to adapt. But we can be true to our liberal arts heritage and tradition. Even in an AI world.”
Glossary
Artificial intelligence (AI). Technology that simulates human intelligence, often by mimicking communication and decision-making.
Generative artificial intelligence (GenAI). Technology that searches for patterns in large amounts of data to generate new material, such as text, code, and images.
Hallucination. Incorrect or nonsensical information generated by an AI system because of limitations in its training data or algorithms.
Large language model (LLM). A type of generative artificial intelligence trained on vast amounts of text, enabling it to understand and generate human language.
Prompt. A specific instruction or question given to an artificial intelligence system to guide it to generate a response, create content, or perform a task.
Erin Peterson is a Minneapolis-based writer.
May 17, 2024