AI and the Art of Being Human: Reclaiming the Purpose of the University

“When machines learn to imitate our intelligence, we must learn to deepen our humanity.” Artificial intelligence has entered higher education not as a visitor but as a permanent resident. Every provost, president, and faculty member now faces a common question: what does it mean to teach, to lead, and to learn in a world where machines increasingly approximate human cognition?

Andrew Maynard and Jeff Abbott’s AI and the Art of Being Human offers a simple but radical answer: rather than racing to keep up with technology, we must rediscover what it means to flourish as human beings. Their argument is neither nostalgic nor technophobic. It is a moral and practical framework for aligning technological capability with human purpose.

They propose four guiding postures: Curiosity, Clarity, Intentionality, and Care. Each, taken seriously, could reorient how universities approach AI—not merely as a set of tools, but as a catalyst for re-examining our deepest commitments.

Curiosity: From Instrumental Use to Existential Inquiry

Curiosity, in their framework, is not about novelty or innovation alone; it is the courage to ask fundamental questions when the answers are uncertain. Universities have increasingly treated curiosity as instrumental—something to be monetized through grants or patents. Yet AI demands a different kind of curiosity: one that is reflective and dialogical. How do we remain curious about ourselves when the machine begins to mirror our thinking? What becomes of wonder when algorithms appear to anticipate our needs? For the academy, this means designing learning environments where students and faculty explore AI not only as coders or users but as philosophers of technology. Courses in ethics, literature, and design must converge to help students ask: What does it mean to create something that now co-creates with us?

Clarity: Seeing the System and Naming the Human

Clarity begins with recognizing the difference between what AI can do and what it should do. AI thrives on prediction; humanity thrives on discernment. Institutions risk confusing operational clarity with moral clarity. The question is not whether outputs are accurate but whether they are wise. Universities must learn to see their systems clearly: Where do algorithms reinforce inequities in admissions or advising? Where do predictive models narrow rather than expand opportunity?

Intentionality: Aligning Innovation with Mission

Intentionality is the act of choosing purpose over convenience. Maynard and Abbott describe their use of AI in writing the book—not to outsource thought, but to amplify dialogue. An intentional university would treat AI not as a mechanism of cost-reduction but as an opportunity to free human capacity for mentoring, reflection, and creativity. Administrative automation can serve the mission only if it expands, rather than contracts, the space for human encounter.

Care: Re-centering the Human in the System

Care resists the impulse to treat students, staff, or faculty as data points. It demands that we remember each algorithm affects a person—a learner with hopes, constraints, and histories. The purpose of AI is not efficiency alone but flourishing. AI should help identify students who need support, but not become a mechanism of surveillance or control. Imagine AI tools designed to listen deeply—to help students articulate purpose, to facilitate belonging, to humanize rather than quantify engagement.

The great temptation of the moment is to treat AI as either savior or threat. Maynard and Abbott offer a third path: partnership. AI can expand human capacity if we engage it as a mirror—one that reflects both our potential and our limitations. The future of higher education will not be determined by who adopts AI fastest, but by who integrates it most ethically. If curiosity reawakens wonder, clarity reveals truth, intentionality restores purpose, and care re-centers dignity, then perhaps the university can become the very space where humanity learns to live wisely with its machines.

All said, one might wonder whether universities are not, ironically, the very places where people get to experience the "real world": where students have a genuine opportunity to encounter it before trekking off into the fictive world set up by investment bankers and financial speculators, a world dominated by concerns of industry and commerce, and whose valuations, however speculative and fictive, are measured in dollars and "sense."

It seems that once students leave university life, they become quite prone to living one-dimensionally, and therefore not fully, especially if they have not taken their "education" seriously.
