Flipping the Script: Using AI to Motivate and Inspire

Every new information technology has sparked fresh panic about the future of teaching. Plato warned that writing itself would destroy memory and learning. The printing press set off fears that books would undermine scholarly authority. Thomas Edison said that motion pictures would replace schoolbooks. Radio, television, and personal computers each drew the same anxious forecasts, followed in the 2000s by a wave of predictions that online courses would empty out campuses. None of it happened. Each technology reshaped instruction while leaving the teacher-student relationship intact.

Generative AI has now joined the parade, and once again the loudest voices are asking the wrong question. The issue isn’t whether AI will replace faculty. It’s how instruction must adapt so that the human work stays at the center of teaching.

A good place to start is a recent Forbes interview with Ben Gomes, Google’s Chief Technologist for Learning and Sustainability, in which he made one simple point. The hardest problem in education is motivation. And it’s a problem AI will never solve. “Technology can improve how you learn and the details of it,” as Gomes put it, “but the why you learn is a very human thing.”[1] That sentence should sit at the heart of every conversation about reforming college instruction. AI can deliver content, flag errors, generate practice exercises, and personalize feedback faster than any human. None of that addresses the prior question of why a student would bother engaging in the first place.

Gomes grounded the point in a lifetime of watching learners. He observed that high-achieving people almost never credit a book or a tool for unlocking their potential. They credit a person. A teacher who said something, who treated them differently, who made them feel that the work of learning mattered. Once that feeling took hold, the student could run forward on their own, and the tools became accelerators. But the ignition was always human. Decades of educational research echo the finding, with thousands of studies placing teacher-student relationships and instructor clarity among the strongest predictors of learning, well ahead of any technology.[2] If motivation comes from relationships rather than from content delivery, then any reform that pushes instructors further from their students is moving in the wrong direction.

A similar point came from Josh Crossick of the tutoring platform Preply. Writing in the Observer, Crossick sketched a clear-eyed map of where AI earns its keep, noting that generative tools are excellent at transcription, synthesis, pattern matching, and content generation.[3] They can turn a class recording into a student transcript, flag mistakes, draft a tailored lesson plan, generate personalized homework, mark the responses, and brief the instructor on what to prioritize next time. They can run low-stakes practice sessions with realistic avatars that correct errors without judgment. These are real contributions, and they represent hours of grinding preparatory work that can consume enormous amounts of instructor time. Data from the Organisation for Economic Co-operation and Development (OECD) has shown that teachers across member countries consistently log hours beyond their contracts, with administrative burden cited as a leading driver of burnout and attrition.[4]

The trouble starts when efficiency gets mistaken for the whole of education. Crossick described the replacement logic as essentially nihilistic: humans are replaced, costs are saved, and the resulting system ends up less human and, oddly enough, less effective. History keeps offering the same warning. Printed books did not replace teachers. Khan Academy did not replace teachers. MIT has given away thousands of hours of free course material through OpenCourseWare since 2001, and the university still charges serious money for a human-based education. That pattern says something important about what students are really paying for. They are paying for the relationship, the accountability, the judgment calls a skilled instructor makes in ambiguous moments. An AI tool can identify your mistakes. An instructor can give you the motivation to try again.

Gomes pushed the argument further, insisting that AI should prompt a rethinking of what colleges teach, not only how they teach it. For decades, higher education has funneled students toward ever narrower specializations. AI changes that calculus. The syntax of a programming language has become vastly less important. The concepts behind programming still matter a great deal. How to break a problem into modules. How to think about interfaces. How to move between levels of abstraction. Gomes argued that education needs to shift toward this kind of conceptual literacy, since the mechanical tasks are increasingly automated. He referenced an old image of the Renaissance learner with broad competence across many domains. Modern knowledge is too vast for a literal return, yet AI does open the door to a meaningful broadening of the narrow specializations that have shaped professional life for generations.

Yet that broadening only works for students already in motion. A student with no interest in learning does not become a Renaissance learner because the tools got better. Tools amplify direction. They do not supply it. That observation sharpens the equity stakes considerably. Gomes described what he calls the 5 percent problem, the worry that if AI learning tools are expensive, only well-resourced institutions will adopt them skillfully, with the benefits going to students who already have the most. The problem cuts deeper than price. A free tool is useless if nobody in a student’s life has made them want to pick it up. The gap that matters most may not be access to technology at all. It may be access to the people who create the desire to use it. For public universities serving large and diverse student populations, that insight demands attention.

So what should change in practical terms? Rather than using AI to thin the ranks of instructors or automate the teaching relationship, colleges should use AI to free instructors from the administrative labor that keeps them away from their students. About half of a typical educator’s working hours go to lesson prep, assessment design, guidance documents, and routine management. Those are precisely the tasks where AI excels. Handing that work to AI would free faculty to spend their time on what only humans can provide. Motivation. Inspiration. Social accountability. The careful calibration of challenge for individual students. Crossick put it nicely when he argued that the goal should be to use generative AI to bring more humanity to education, so that more AI ends up meaning more humanity rather than less.

Making any of this real will require new kinds of professional development and a harder look at institutional priorities. Gomes noted that AI integration works best where instructors talk among themselves and make the technology their own. He described communities of practice where leading-edge teachers bring their colleagues along. Google’s work with the State Teachers of the Year offered a telling example. In the project, special education teachers used AI to build custom apps for students with disabilities, designing tools no product team would ever have thought to create. They did it because they cared about specific students. The motivation was theirs. The AI just made it possible. College instruction after AI will look different from what came before. The lectures will change, the assessments will change, the administrative routines will change. What should not change is the centrality of the instructor as the person who makes students feel that learning matters, and that they matter within it.

[1] Dan Fitzpatrick, “Google’s Head of Learning Says AI Can’t Solve Education’s Real Problem,” Forbes, April 5, 2026.

[2] John Hattie, Visible Learning (London and New York: Routledge, 2009).

[3] Josh Crossick, “We’re Asking the Wrong Question About A.I. in Education,” Observer, April 7, 2026.

[4] “The Demands of Teaching,” Results from TALIS 2024 (Paris: OECD Publishing, 2024).
