But in 2025, we’re forced to ask a tough question:
Are we still evaluating motivation, or just writing skills, editing help, and prompt engineering?

The Problem That Everyone Silently Admits

A beautifully written motivation letter doesn’t necessarily reflect a student’s actual drive, values, or potential. It often reflects:
- Whether they had someone edit it.
- Whether they had access to writing coaches.
- Whether they had ChatGPT or other AI tools — and knew how to use them.
In fact, today’s admissions teams and scholarship committees are quietly wrestling with a new kind of bias:
tech privilege. When one applicant from a well-connected urban background submits a flawless, AI-refined letter, and another from a rural or underserved region submits a raw, unpolished one… guess who looks “more motivated”? It’s not always fair, and we know it.
Universities Are in the Same Boat

Motivation letters are also widely used in Master's and PhD admissions, where reviewers try to gauge research alignment, commitment, and long-term goals.
But again: how much of that letter reflects the applicant, and how much reflects a friend’s help, a polished template, or AI-generated fluff? Reviewers read between the lines. But reading between the lines still assumes the lines were written by the applicant.
A Better Way: Let Them Speak for Themselves

What if, instead of interpreting motivation from a curated document, we gave candidates a chance to literally speak? A short, structured video interview lets applicants explain:
- Why they’re applying.
- What they hope to achieve.
- How the opportunity aligns with their story.
And they do it in their own voice, with their own expressions, and their own thought process in real time. This isn’t about penalizing those who write well; it’s about balancing the scales for those who don’t have the same tools, time, or help.
We’re Not Saying “Ban Motivation Letters”

We’re saying: don’t let them speak louder than the person behind them. Adding an optional video interview (especially one that’s AI-assisted, fair, and available 24/7 like INSELECT) gives decision-makers an extra layer of insight. A more human layer. One that can’t be outsourced to ChatGPT.
One Final Thought

In a world of growing automation, perhaps our selection processes need a little less ghostwriting and a little more humanity.