There is a trend we’ve seen on Reddit, on the ever-stressed r/ApplyingToCollege, where students ask ChatGPT (or Grok, or Claude, or Gemini) to rate their chances of getting into particular schools, especially their dream universities or the top Ivy League schools. This is a joke, sort of, and presented as one, but there is an undercurrent of need in these posts, a sense that maybe, just maybe, these AI tools know something you don’t, and can help predict the future.
This isn’t just something students do, as a recent New York Times article demonstrates. As chatbots become more ubiquitous, more integrated into workflows, the temptation to ask them this and other questions about college admissions inevitably grows. This is often a mistake, and we’ll explain the how and why of it in this article. Let’s jump right in!
Should You Ask AI About Your Chances of College Admission?
We’re going to start here, as that’s the title of the article, and then go on to the other flaws in AI-based workflows. The way the process generally works, based on Reddit posts (the ultimate source of information), is that students upload their entire application to their chatbot of choice, and then provide it with a list of the colleges they’re applying to. Once this is done, they ask the bot to rate their chances of getting into each one.
Now, as a process, this is nothing new. Subreddits like r/ChanceMe exist so that unqualified teenagers can guess at each other’s chances of getting into top schools, and chancing students is also something we do all the time as admissions counselors: assessing a student’s portfolio and judging from it how likely they are to gain acceptance to particular colleges.
Here is an example of how that process works when we do it, so you understand the meat of what chancing students for college entails:
- We first look at a student’s numbers: grades, test scores, the hard stats, and compare them to the averages for the college in question.
- We then look at extracurriculars: what a student has done, to what extent, and how they have actively sought to demonstrate their passions and ambitions.
- We finish with the student’s essays: how well a story is told, and how well they describe who they are and what they want from their college experience.
- These are then compared to what colleges want, based on what admissions officers have said, and what data is available. From all of this, we create a range of probabilities for acceptance.
This is not an exact science, and neither we nor anyone else can exactly replicate what admissions officers do; we aren’t in the room where it happens, even if we may have some insights into why they make the decisions they do.
Why ChatGPT Can’t Do This
So then what’s the issue with asking ChatGPT to do the same?
This comes down to the nature of chatbots, and how they work. ChatGPT and other LLMs do not “know” things in the standard sense of the word. They can produce approximations of knowledge, words arranged in patterns that make sense and even sound quite convincing, but there’s nothing to back them up, no system of knowledge they are drawing upon for their conclusions.
What this means is that when you ask ChatGPT to rate your chances, the answers it gives you have no basis in reality. The bot can make them sound good, write out reasoning, but this is all gossamer threads of nothing, unsupported and unsubstantiated. All you need do is push back, and it will completely change its mind.
The urge to use AI here is understandable; it promises some sense of control, of certainty, in a process that is rather alien to students and parents. Unfortunately, these tools can do more harm than good here, leaving you less certain of your chances than if you hadn’t asked them at all.
AI Is Bad at Giving College Application Advice
We’ve already written about the pitfalls of using AI to write your college essays, and why admissions officers don’t like it if you do, but as more and more students use AI in multiple aspects of the admissions process, we are seeing just how bad these tools are at giving admissions advice. To explain this, we’re going to go through common ways students use AI in the admissions process, and explore how it fails them.
Researching Colleges With AI
This is by far the most common use, and with Google’s introduction of AI summaries, likely an inevitable one. Many people have pointed out the factual issues with these summaries, and this holds true for colleges as well. You may get useful data, or you may instead get references to programs that don’t exist, were recently canceled, or are wildly out of date, all presented together in a single stream, without citations.
This can be quite bad, as accurate information about a college matters a great deal if you want to attend. If you are only applying to a school for a particular program or to pursue a very specific major, then you want to be sure said program really exists, and wasn’t simply hallucinated by an AI tool. Researching colleges can be hard, but using AI as a shortcut often wastes more time than it saves.
Application Reviews
Even if students don’t use AI to rate their chances, many will ask it to review their applications and essays. This is, again, a new take on a very old practice; asking peers, teachers, or counselors to review your application is a good idea. While their advice may not always be perfect, they can often notice things you have missed, even if these are just simple grammatical errors.
Now, AI can pick out grammatical errors quite well, but you should be cautious about taking its other advice. Just as your friends and strangers on Reddit may claim to know everything but actually don’t, AI chatbots are quite good at sounding authoritative without any actual sources to back them up. Thus they can give advice that is reasonable or abysmal in the same confident tone, and without prior, in-depth experience, you’ll have no way to tell whether you’ve been steered well or poorly.
AI College Counselors
There is a final trend in the use of AI in giving admissions advice, one not touched on in the article in the New York Times, but which we have seen in our field, and that is the incorporation of AI tools into the processes used by college consulting companies.
Now, AI tools are not inherently bad, and they do have legitimate uses; we use one here at Ivy Scholars to generate transcripts of our meetings with students, for instance, and it is a very useful tool indeed. But this is a specific tool used for a specific purpose, one that complements the work we do with students rather than replacing our knowledge and effort.
Other companies, big and small, are introducing AI tools into their processes in far more obtrusive ways, using them for everything from helping students draft college lists to aiding in essay brainstorming and drafting. We have explored elsewhere in this article why the use of AI tools in the admissions process is a problem, but it is especially pernicious when done by counselors.
Students and parents come to counselors because they want expert advice. The admissions process is confusing and often overwhelming, and having a guide who knows the ins and outs and who can answer your pressing questions is what they are looking for when they hire a counselor. The cost is justified by the expertise that counselors bring, and the level of personal attention given to each engagement.
The use of AI tools cancels both of these out. An AI tool used by a counseling service is unlikely to give you any better advice than that of a standard model online. Sure, that can be corrected by a counselor later, but then what’s the point of having the AI tool in the first place?
There is often a push towards the new and novel, technology for its own sake, and this is true across industries. We certainly don’t have anything against technological progress, but implementation should be considered and deliberate, done to ensure it actually improves the service offered, rather than tacked on for wow factor and the allure of the next big thing. In many cases, companies that advertise the use of AI tools in counseling are doing a disservice to their students.
Final Thoughts
AI is new and interesting, and the full impacts and implications of this technology for college admissions (and everything else) remain to be seen. At this moment, however, these tools cause far more harm than good when used by students seeking information about colleges and their chances of admission, steering them astray, however unintentionally. This is not the fault of the students themselves, and we hope this article has given you insight into how and why AI tools can cause problems when applying to college.
If you want help on your applications from real experts with real experience, then Ivy Scholars can help. Our mentors are all highly skilled in helping students apply to top colleges, and they work one-on-one with students to guide them through the often tumultuous admissions process. If you want help on your own admissions journey, schedule a free consultation with us today; we’re always happy to hear from you.

