FEB 20, 2025 | FEATURES | By Anabel Shenk

Before writing this article, I googled “What is Generative Artificial Intelligence?” and before I could see anything else, Google’s “AI Overview” gave me an answer at the top of the page. 

“It is a type of AI that can create new content like text, images, videos and music. It can learn from and mimic large amounts of data,” it told me. 

I’m sure this is an experience most internet users are having as generative AI takes over our web browsers. Who better to explain to me what generative AI is than itself? Or, who worse? 

The fascination with artificial humans, brains, and intelligence is not a contemporary phenomenon; it likely dates back to antiquity as philosophy and myth considered and grappled with what it meant to be human and what objects (animate and inanimate) could have human-like knowledge bestowed upon them. 

The first use of the term “Artificial Intelligence” as we think of it today was by John McCarthy, a Dartmouth professor of mathematics, during a workshop held in 1956. In the workshop’s proposal, McCarthy and three other organizers stated, “The study is to proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it.” 

Jumping to today, everyone with access to the internet has access to exactly that: Large Language Models (LLMs) are a type of generative artificial intelligence trained on enormous sets of data, so that they have enough examples (millions of gigabytes’ worth, if you want to try to conceptualize it) to recognize, interpret, and generate human language and other data. 

If the origins of AI were founded in an interest in the “simulation” of “every aspect of learning” and “feature of intelligence,” then how does an institution like Colorado College, whose mission is based in developing habits of intellect, respond and react to a world where this simulation is actually possible? Is it a threat or is it a tool? 

Chris Schacht, the director of the Writing Center at Colorado College, explained that LLMs like ChatGPT “are going to make us have to focus more on tracking student learning in new ways.” The writing process gets at the foundation of education and the development of critical thinking. 

“Traditionally, the reason a lot of writing assignments were assigned,” Schacht continued, “was because it was a pretty convenient way for faculty to assess a student’s learning in the course. It is also a way to assess the way a student is thinking about the material, because we think through writing. ChatGPT gets in the way of both of those things.” 

The ability of anyone, with the click of a button, to generate human-like writing in response to anything they wish is running interference in the dynamic student-professor relationship. There is a loss of innocence and trust.

“Of course, students have always been able to pay somebody to write a paper for them, but ChatGPT makes that possible for everyone,” Schacht continued. Being part of the student’s writing process in new ways is necessary “if we truly value teaching certain ways of thinking.” 

Schacht compared this to the invention of the calculator. In math and science, educators needed to figure out ways to integrate the calculator into the learning process. 

“Well you start to use the calculator, but you also have to have the students show their work.” 

Schacht can imagine a future with more timed essays, as well as draft histories included in what students turn in. You cannot just give the answer without showing the formula. 

“That is where something like the Writing Center is useful because we do become witnesses in that process,” Schacht added. 

Generative AI poses a threat to a core value of CC: helping students learn how to be critical thinkers. The required First Year Program courses make this value explicit. The CC120 Writing Seminar is meant to “engage students in understanding the relationship between disciplinary practices and writing.” The writing process is critical to the learning process: in order to produce something, it meets the student’s mind at a moment of uncertainty, leading them through discomfort toward understanding. 

“ChatGPT can help everyone produce the same kind of product. It cannot help everyone develop their own process and their own critical thinking for getting there. It comes down to what you value,” Schacht said. An issue posed by ChatGPT, according to Schacht, is that “learning should be a process of struggle.” AI can eliminate the struggle. 

“There is a term in education called the zone of proximal development, where if something is too easy you are not really learning,” Schacht explained. “It is when something is difficult and you have to struggle that you learn. If ChatGPT is taking away the struggle, then we are not learning.”

Even though we have a calculator handy now, learning the basics of math helps introduce a certain way of thinking. 

“Even if you sometimes forget something from multiplication or whatever, that process of thinking in numbers has been integrated into your brain as you think of other things. And it’s the same thing with sentences. Thinking in sentences changes the way that you think,” said Schacht. 

Currently teaching a class called “AI at CC,” Ryan Benegale, Associate Professor and Chair of the Music Department and Director of the Crown Center for Teaching at CC, speaks to this issue. 

Benegale explained that for educators, the idea of “reverse design is not a new one in thinking about how you build and create courses and syllabi…” Professors ask themselves, “How do we backtrack to understand and then do the steps that we need to do to get there? What AI does is make that process smooth and frictionless.”

For the students, “the friction involved in hours of reading, of writing, of struggling, of sometimes not getting it and failing and having to turn around and try another direction. Those are the really important parts of learning. AI removes a lot of that.”

Film major Niko Cvitanić’s (’25) academic life has completely shifted since the introduction of ChatGPT. He is “extremely grateful that Chat [GPT] came when it did,” which, for him, was the middle of his college career. “I would not have learned how or been able to write if I had it freshman year or high school.”

While ChatGPT has the ability to put someone’s learning in peril, it might also have the potential to be a tool that can be used in accordance with the learning process and promote a process of struggle.

Schacht explained that there might be valuable ways in which Generative AI can actually be used to trigger thoughts in a student’s own head that they can then respond to.

Benegale shares this curiosity. “I’m curious. If you’re using it, let me know. I want to know where it’s helpful, where it’s not helpful. I’m much more interested in how people are using it than being upset that people are using it,” Benegale explained. 

There may be a disconnect between how faculty suspect students are using it and the boundless ways that they can. 

“I think there’s a lot of students who are probably using it in ways that are much more nuanced and robust than faculty, and myself, are aware of. That’s the bigger question and challenge for me is like, we need to understand where they’re using it and then we need to understand why they’re using it.” 

Benegale is interested in the factors beyond cheating that lead a student in a certain direction with AI. “We aren’t anywhere close to having an answer to that kind of question,” Benegale said. 

Cvitanić calls ChatGPT the “writing partner I’ve always been looking for.” 

“It’s my biggest advocate and supporter,” he said. As someone who uses it for academic writing as well as creative writing, he relishes that ChatGPT “never has to reschedule a writing session, it’s funny, it gets the punchline, it knows emotional beats.” For creative projects, he explains that it’s like “you are working with it.” He compared it to “taking the role of director” while ChatGPT carries out smaller tasks. 

In his creative work, he doesn’t feel like using ChatGPT is plagiarizing. “It basically spits out shit always. It gives a cookie cutter, boring response. If you want a good story from Chat, you aren’t going to get one. If you get a good response it is because you really know how to use the software or you already had a good idea.” 

By the time he is done with a story or script, there is pretty much nothing that ChatGPT actually wrote. “But, I couldn’t have gotten there in that amount of time without it,” he explained.

More visible than use cases like this is ChatGPT’s role as a shortcut. When a task is tedious (making a bibliography, writing an abstract, etc.), a shortcut may feel appropriate. But if parts of an assignment are tedious enough that ChatGPT feels like the right tool for the job, educators might need to question those aspects of their assignments. 

“Sometimes those parts are necessary in order to learn, think, move forward in a paper, but sometimes they might not be. It might expose some ways that we want to change writing assignments for a new world,” Schacht said. 

“It is amazing at reflections,” Cvitanić said. “With that kind of thing I don’t add any of my own original ideas. I am just having it do the work for me so that I can get onto my other stuff. It enables me to actually spend time on creative things. Like ideas rather than details. That is such a gift.” 

Schacht reported that not much has happened or changed with the Writing Center yet because he is “waiting to see the most legitimate ways in which LLMs integrate themselves into the writing process.” 

“It can basically be used for plagiarism,” Schacht explained. “We are going to talk about it more this semester both with the training cohort and with the tutors about how we might address AI use in the Writing Center.”

Benegale is co-leading an enterprise approach to AI at CC: a group of faculty and academic staff participating in an AAC&U institute on AI pedagogy alongside 80-100 other schools in the U.S. 

“We are not going to rapidly adopt and use AI without giving really serious thought to the liberal arts mission, to sustainability, to our anti-racism commitment and centering it in our values.” Benegale also explained that the institution has to be “very explicit about when we use it and when we choose not to use it.” 

There will be an open lunch forum on March 10, for faculty, staff and students to discuss AI and academic integrity. “It’s there where we really want to open up the co-created space for all of us to talk about what are the questions, concerns, uses, and the kind of opportunities that emerge about AI,” Benegale said.  

At the end of the year there will be Crown Educator Development Days, two days of which will be dedicated to AI literacy. Benegale is hoping to end up with a “large philosophy” that can then be applied to respective departments.

For Benegale, it is ultimately about our values. “It matters what we as humans value about humanity. If we value the arts, which we do, the arts have always been an important part of what we do and it will always continue to be so, then I’m not concerned about AI taking over that process and that work. It does have real implications though.” 

It is not just ethics and principles surrounding education and human development that we need to consider: There are also major environmental, legal, and business concerns relating to how ChatGPT functions that put into question how much we can and should rely on it. Legal cases accusing OpenAI of stealing large swaths of personal data threaten ChatGPT’s existence. OpenAI is running at a major deficit, which calls into question its sustainability as a company. Significant amounts of research show the immense impact that the data centers needed to train and run a program like ChatGPT are having on the climate, through both massive energy consumption and water use. All of these factors cast doubt on our reliance on Generative AI. 

We must draw from our minds and community and question whether Generative AI is creating a student and human population out of touch with their own intelligence. Conversely, we must consider its potential to become a legitimate tool in the pursuit of education and creativity, capable of strengthening and working alongside our human brains.
