Are AI-Generated College Essays Good Enough to Trick Admissions Officials?
In the wake of ChatGPT’s release a year ago, educators worried that students would use the artificial intelligence tool to write their college admissions essays for them. With universities now in the middle of college application season, The Washington Post decided to investigate whether these AI-generated essays could indeed fool admissions counselors.
To conduct the experiment, The Washington Post enlisted the help of a prompt engineer, an expert in guiding AI chatbots, to create college essays using ChatGPT. Two essays were produced: one in response to a question from the Common Application, widely used in college admissions, and another addressing a prompt specific to Harvard University applicants.
These AI-generated essays were then presented to Adam Nguyen, a former Ivy League college admissions counselor who had previously worked with students at Harvard and Columbia. As a control, real college admissions essays written by Jasmine Green, a Post intern currently at Harvard, were also provided.
Nguyen was asked to read the essays and identify which ones were AI-generated. The results were revealing.
According to computer science and college admissions experts, there are telltale signs that can expose AI-written essays. ChatGPT responses often omit specific details, producing essays short on supporting evidence. The writing tends to be clichéd, relying on platitudes to describe situations rather than exploring the author’s emotional experiences. The essays often become repetitive and predictable, offering neither surprise nor a sense of the writer’s journey. And when chatbots touch on topics like race, sex, or socioeconomic status, they may fall back on stereotypes.
Initially, Nguyen found the AI-generated essays impressive: they were readable and mostly free of grammatical errors. But he said that had he been reviewing them as part of an application package, he would have stopped reading.
“The essay is such a mediocre essay that it would not help the candidate’s application or chances,” Nguyen commented. “In fact, it would probably diminish it.”
Nguyen emphasized that while AI may be useful for everyday writing, it falls short in crafting college admissions essays. Admissions offices are already using AI screening tools to filter out computer-generated essays, though these tools are sometimes inaccurate and can falsely accuse students.
Importantly, admissions essays require students to reflect on their lives and shape their experiences into a compelling narrative that quickly conveys what makes them unique. According to Nguyen, ChatGPT is not equipped for this task.
Nguyen understands the appeal of AI, especially during the stressful application season, and sometimes recommends that weaker writers use AI chatbots for brainstorming. However, he warns that those who go beyond inspiration and lean heavily on AI may face consequences.
“Their essays will be terrible,” he stated, “and might not even reflect who they are.”
As students navigate the college application process, it is worth keeping the limitations of AI-generated content in mind. While it can provide a starting point, the authenticity and personal voice that admissions counselors seek are best achieved through genuine introspection and self-expression.