The Practice of Becoming in a World That Will Not Hold Still

April 28, 2026

What do you do when the map you were handed no longer matches the world?

A generation of students is walking into a world whose patterns shift faster than our syllabi can name them. The jobs we are preparing them for may not exist when they graduate. The disciplines we sort them into were drawn for a different epoch of knowledge. The tools they will work with were not in the room when their majors were designed. And yet we keep asking them, with a straight face, the same question we have asked for fifty years: Have you decided on a major, and what do you plan to do with your life?

We are asking the wrong question of the wrong students at the wrong time.

It is tempting to read the AI moment as a story about jobs. It is not, or not yet. It is a story about context. When context changes, patterns change. When patterns change, we must learn to see them before we can adapt, and we have to adapt before we can create. This is not a clever sequence. It is the order in which human beings, in every transition that has ever mattered, have made their way from disorientation to agency.

Otto Scharmer's Theory U gives this movement a discipline and a name. He calls it presencing, the practice of suspending our reflexive frames long enough to perceive what is actually emerging, redirecting our attention from the patterns we expect to the patterns that are forming, and letting what we sense reshape what we do. The U is the shape of the descent and return: down through observation, into stillness, back up through prototyping. It is not mystical. It is rigorous. It is also, I think, the most honest description we have of what the practice of learning actually requires of a person, and of an institution. Anyone interested in the framework can find Scharmer's work at presencing.org. I mention it because what he describes is not a metaphor for what students need. It is the thing itself.

Right now, most people, including our students, are AI users. A vanishingly small number are agent-makers: people who design systems that act on their behalf, who shape the tools rather than being shaped by them. The asymmetry between those who direct AI and those who are directed by it is going to define a generation. And we are, for the most part, training the directed.

The harder thing to say is this: the divide is going to grow, and we are going to grow it. Every semester that we recruit students into majors built for a world that is dissolving, we widen the gap between the prepared and the unprepared. We are not merely failing to keep up with change. The masses are being persuaded, with our institutional weight behind the persuasion, to spend years and considerable money studying things that are likely to be obsolete before the loans are repaid.

The major itself was an artifact of a particular epistemology: knowledge cut into disciplines, disciplines cut into credits, credits stamped into credentials, credentials traded for jobs. The seams held for a century because the world held still long enough for them to. AI is dissolving those seams in real time. What is emerging on the other side is not a new discipline. It is a new literacy, the capacity to see across disciplines, to hold a system in mind, to work with patterns rather than against them. Older traditions simply called it wisdom.

Here is the part where I am supposed to tell you what jobs will exist in 2040, what skills students will need, what curricula will future-proof them. Joseph Aoun's Robot-Proof offers one of the more thoughtful answers we have, a framework he calls humanics, weaving technological, data, and human literacies into a curriculum designed to cultivate what machines cannot. I have learned from the book and recommend it. My own argument runs underneath his. Before we ask which literacies to acquire, we must ask what kind of person we are forming, and in what practice. We do not know what jobs will exist. We do not know what knowing will mean when machines hold most of what we used to call knowledge. We do not even know what the word work will refer to in twenty years.

This should not paralyze us. It should clarify us. If we cannot prepare students for an outcome we cannot predict, we can still form them in a practice they will need no matter what comes. The honest curriculum is not preparation. It is formation. Everything else is a contract we cannot keep.

Four practices belong at the center of the work, and any of them could fill a book. I will name them and trust the reader to extend the lines.

Self-knowledge. Purpose is the only compass that works when the map fails. Students who do not know who they are will be led by whoever speaks loudest, including a model trained to please them.

Pattern literacy. Systems thinking is no longer optional. It is the discipline of seeing a system long enough and honestly enough to perceive what is emerging, rather than what we were told to expect.

Creative agency. The line that matters is no longer between the AI-fluent and the AI-illiterate. It is between those who use AI and those who build with AI. We must teach our students to be on the building side of that line, or we have to be honest that we are not.

Moral imagination. This is the practice that holds the other three together, and the one most absent from our institutions. Moral imagination is the capacity to picture, in advance, what a tool will do to the people who will live with it, and to refuse to build what should not be built. Capability without conscience is a danger we have already learned to regret in every century of our history. We are now handing the next generation tools whose consequences will outrun our ability to recall them. It is not enough to teach students to make. They have to be formed in the harder questions: what are we making, for whom, at whose cost, and toward what kind of life together? Hannah Arendt warned that the gravest harms in modern life are committed not by monsters but by people who never paused to think. Iris Murdoch insisted that to see clearly is itself a moral act. AI does not change those claims. It makes them urgent.

The cost of pretending that nothing has changed is borne entirely by students. They are the ones who graduate into a world they were not prepared for. They are the ones who carry the debt of an education that was certain about the wrong things. They deserve teachers and institutions willing to tell them the truth: that deciding is not certainty, that no curriculum can substitute for the practice of becoming, and that the question is no longer what will you major in? but who are you willing to become, and how will you keep deciding when the ground keeps moving?

This is not a problem to be solved. It is a vocation to be taken up. The students arriving on our campuses this fall are walking into a world whose patterns we ourselves cannot yet read. We owe them more than the question we have been asking. We owe them the practice we have not yet learned to teach.
