The Cornell Daily Sun

REYEN | Qualitative Study Reveals ChatGPT Up, Critical Thinking Down


On Thursday, OpenAI announced that ChatGPT Plus would become free for college students until the end of May. With this subscription, users will have access to GPT-4o, image generation, advanced voice mode and other research tools.

I’ve long suspected that the unparalleled degree of information access granted to us by the internet — and now, artificial intelligence — comes with a loss of internal memory function. 

A 2019 study not only echoes this concern but supports it, finding that the internet’s constant “distractions” and its capacity to “offload” mental tasks can hinder the development of higher-level critical thinking during key stages of brain growth in children and adolescents.

At Cornell, it’s safe to say that I’ve seen the effects firsthand. Recently, I asked the Cornell Sidechat community how often they use ChatGPT to complete assignments. Out of 324 respondents, 43% responded “most of the time.” Is this a function of overworked students, or of a tendency towards convenience?

Another 22% responded “when I don’t know how to do the work.” In other words, Chat is serving as a substitute for learning itself.

Forbes reports that an estimated 1 in 3 Americans between the ages of 18 and 24 use the platform. 

For all its convenience, I have to wonder if this ready-access second brain is quietly dulling our own. While getting my laptop fixed last week, I watched the employee on hand consult GPT to answer my questions about my computer’s lifespan. And when assisting a group of colleagues with printer issues, I peered over their shoulders to see the familiar “What can I help with?” in ChatGPT’s minimalist interface.

Hoping to assess just how pervasive GPT use is on campus, I’ve taken to conversing with my peers. One student I interviewed was working on an essay for which her professor permitted ChatGPT use.

“It will only hurt me if I use it,” the student said; the assignment, after all, was to create work samples for job applications.

She initially asks ChatGPT to generate an essay from the prompt and her notes, then refines awkwardly worded sections herself before prompting GPT to improve the essay once more.

Collaboration? Or outsourcing?

GPT may seem like the busy student’s solution to managing heavy course loads, job applications and club activities. But what writing skills will we have left if the tool becomes the first — and final — draft? Using GPT as a crutch removes the process of forming connections and patterns while writing, minimizing our ability to learn from feedback and improve as academic writers.

Another student uses ChatGPT to turn class notes and an essay prompt into an outline, which she later develops into a cohesive paper. (And, she informed me, the key to job application efficiency is mastering the slight tweak of an A.I.-prepared cover letter.)

Working to fine-tune the assignment, she wondered out loud: “How do I get ChatGPT to make this essay better without shortening it?”

“Did you run that through an A.I. detector?” another student added, peering over her shoulder.  

But perhaps there’s no need. 

Professors are attempting to adapt, with GPT citation clauses now appearing in syllabi after the standard academic integrity boilerplate.

Yet the prevalence of use and the lack of corresponding academic integrity hearings demonstrate that the current state of A.I., compounded with the savvy of students, is evading whatever safeguards professors may have in place. For teaching academics, it surely must be frustrating to assume the role of anti-plagiarism enforcer.

One grad TA said he runs every student writing assignment through an A.I. detector. Still, even when it seems obvious that a student has used A.I. on a writing assignment, he feels that, unless hallucinated citations are present, there is no concrete evidence that can be used as proof of cheating before an academic integrity hearing board.

(I’m saddened that my favorite punctuator — the em dash — is now apparently considered a sign of A.I. writing. Will an A.I. detector flag my work too?)

My friends and I have noticed the introduction of annotated reading assignments on Hypothesis and marked-up close readings for essay assignments, possibly an effort to mandate “showing your work” in humanities courses.

As an English major, I’m attuned to just how much of our learning comes from writing: not just expressing ideas but arriving at them as well. (For the record, I have yet to be impressed by ChatGPT’s attempts to stand in for quality, human-authored prose.) 

I concede that students must adapt to the changing times, that a Luddite mindset will only leave us behind in a fluctuating job market. However, I caution against the “offloading” of academic tasks at the expense of critical thinking.

I’m not trying to be holier-than-thou by any means. It’s objectively a waste of our tuition dollars to neglect the soft skills that ChatGPT often steps in for. And we’re not only losing out on skill development; each time A.I. helps with an assignment, we’re teaching GPT how to replace us more effectively.

The University’s shift towards pre-professionalism (consulting, tech, etc.) may be accelerating this decline. As we train students for jobs, GPT becomes less a shortcut and more a symbol of a trend away from thinking for thinking’s sake. I worry that higher education’s chief objective will become churning out corporate robots.

Are the STEM fields safe from A.I. corruption for now? One student lamented that GPT was unable to answer a coding homework question due to the specificity of the libraries required.

In my neurobiology and behavior class, our professor warned us about A.I.'s tendency to butcher scientific writing by summarizing badly, hallucinating nonexistent citations and omitting the word “not,” thus falsifying information.

Yes, Chat use at Cornell may be symptomatic of overworked students grasping at any method that might help them pass the semester. But when I looped another professor into the conversation, he told me that students rarely attend office hours. Easy access to GPT may also allow students to slide through their four years of college without voicing questions to the human experts on campus or building meaningful relationships with faculty.

Perhaps, then, this is another reason for the rejection of academia. Not only can machine learning complete basic course functions and reduce labor, but students are checking out of their work — there’s no incentive to do it themselves when a 24/7 virtual assistant waits eagerly on computer screens to transform assignment prompts into deliverables.

Whether or not I like it, last fall’s headlines confirm that “ChatGPT Is Coming For Higher Education.” Arizona State University, for one, has been working with OpenAI since early 2024 to develop ChatGPT Edu, with features such as individualized tutoring for students.

The CEO of OpenAI has said that ChatGPT will create new knowledge and provide stepping stones to answer otherwise unsolvable problems, altering the current job market in some positive ways. 

Can we trust this testimony? I doubt I’m alone in my skepticism. Several students shared my reluctance to get with the times, A.I.-wise.

“I never use it,” one student said.

I’m hopeful that the GPT Plus deal’s requirement of a university email address will encourage engagement with assignments, and that a healthy fear of academic integrity violations and University surveillance will prevent actual cheating.


Carlin Reyen

Carlin Reyen is a fourth year student in the College of Arts and Sciences. Her fortnightly column Just Carlin’ It Like It Is centers around student life, social issues, Cornell life hacks and the University’s interactions with the broader community. Carlin can be reached at creyen@cornellsun.com.

