What If Education Was Designed to Notice?
Why treating every learner as an individual may be the most important moral choice higher education faces.
The first time she realized she was falling behind, it wasn’t during an exam. It was during a homework assignment.
The math itself was not beyond her. What stopped her were the examples. They referenced sports she did not follow, contexts she did not recognize, assumptions about the world that were not hers. She kept rereading the problems, convinced she was missing something obvious.
“I understand the formula,” she said. “I just don’t understand what they were talking about.”
She wasn’t asking for less rigor. She was asking to be seen.
By the time we spoke, she was already behind. Not dramatically. Not catastrophically. Just enough that each new concept arrived before the previous one had settled. No one had noticed early enough. And slowly, almost imperceptibly, the course had begun to teach her a lesson it never intended to teach: that college might not be for people like her.
That sentence, spoken quietly by students across our institutions, should haunt us.
Nearly every educator I know entered this work with a deeper hope than to produce high graduation rates or enable successful labor-market outcomes, important as those are. Many continue to hope that students come to recognize that learning is complex, that growth is uneven, and that becoming educated means more than earning a credential. It means developing intellectually, socially, emotionally, and morally.
And yet, despite our intentions, we have built systems optimized for efficiency rather than attention. We scale uniformity. We reward compliance. We call it rigor when students adapt to structures that refuse to adapt to them.
So, I find myself asking a question that now feels unavoidable: what if we designed education around learners as individuals, not abstractions, averages, or enrollment units?
For decades, personalization in higher education has been treated as a luxury: small seminars, elite tutoring, boutique programs at the margins. Everyone else receives a standardized experience, justified by scale and defended as necessity.
But technology has quietly changed the moral landscape.
The question is no longer whether it is possible to know where a learner is struggling, how they best engage, or when intervention matters most. The question is whether we will use these tools to humanize education or whether we will use technology to automate indifference more efficiently.
At its best, AI does not replace faculty judgment or human relationship. It extends the capacity to notice. Put somewhat differently, it provides the capacity to care at scale.
Imagine two students enrolled in the same statistics course. One is a young woman, from the Bronx, who loves soccer. The other is a young man, from Beverly Hills, who follows basketball obsessively. In most courses, both students receive the same examples, the same problem sets, all written in the same “language.”