Flipping the Script: Using AI to Motivate and Inspire

Every new information technology has sparked fresh panic about the future of teaching. Plato warned that writing itself would destroy memory and learning. The printing press set off fears that books would undermine scholarly authority. Thomas Edison predicted that motion pictures would replace schoolbooks. Radio, television, and personal computers each drew the same anxious forecasts, followed in the 2000s by a wave of predictions that online courses would empty out campuses. None of it happened. Each technology reshaped instruction while leaving the teacher-student relationship intact.

Generative AI has now joined the parade, and once again the loudest voices are asking the wrong question. The issue isn’t whether AI will replace faculty. It’s how instruction must adapt so that the human work stays at the center of teaching.

A good place to start is a recent Forbes interview with Ben Gomes, Google’s Chief Technologist for Learning and Sustainability, in which he made one simple point. The hardest problem in education is motivation. And it’s a problem AI will never solve. “Technology can improve how you learn and the details of it,” as Gomes put it, “but the why you learn is a very human thing.”[1] That sentence should sit at the heart of every conversation about reforming college instruction. AI can deliver content, flag errors, generate practice exercises, and personalize feedback faster than any human. None of that addresses the prior question of why a student would bother engaging in the first place.

Gomes grounded the point in a lifetime of watching learners. He observed that high-achieving people almost never credit a book or a tool for unlocking their potential. They credit a person. A teacher who said something, who treated them differently, who made them feel that the work of learning mattered. Once that feeling took hold, the student could run forward on their own, and the tools became accelerators. But the ignition was always human. Decades of educational research echo the finding, with thousands of studies placing teacher-student relationships and instructor clarity among the strongest predictors of learning, well ahead of any technology.[2] If motivation comes from relationships rather than from content delivery, then any reform that pushes instructors further from their students is moving in the wrong direction.

Continue reading “Flipping the Script: Using AI to Motivate and Inspire”

American Anti-Intellectualism Fuels the New War on College

American higher education finds itself under siege, facing unprecedented political attacks that threaten its fundamental mission and autonomy. What makes these assaults particularly devastating is not just their intensity, but the fertile ground of public sentiment that has enabled them to take root and flourish. The convergence of deep-seated anti-intellectual currents with a dramatic erosion of trust in universities has created the perfect conditions for opportunistic politicians to weaponize higher education as a cultural and political battleground.

Once seen as sites of personal and social betterment, universities and colleges nationwide now struggle with a profound crisis of confidence. This shift in perception is hardly anecdotal. Recent surveys reveal that only 36 percent of Americans feel positively about higher education, reflecting serious concerns over the institution’s efficacy and fairness.[i] Moreover, a growing partisan divide complicates the erosion of trust. While 59 percent of Democrats express confidence in higher education, a staggering 81 percent of Republican voters now view the institution unfavorably. This chasm speaks volumes about the politicization of education in America, with college increasingly seen as a battleground for antagonistic ideologies.[ii]

A pragmatic shift in educational preferences complements this rift. Mirroring student attitudes is the reality that most Americans now regard trade schools and vocational training as equivalent or superior to four-year institutions in delivering practical education. This pivot reflects changing educational values and an indictment of the entire enterprise of higher education. The growing appeal of alternative educational paths suggests a fundamental reevaluation of what constitutes valuable knowledge and skills in today’s rapidly changing job market. The roots of this mistrust are multifaceted, extending beyond mere economic calculations to encompass broader socio-political undercurrents. Often associated with privilege and intellectual elitism, higher education increasingly is viewed through a lens of class-based suspicion.

Continue reading “American Anti-Intellectualism Fuels the New War on College”

Emotional Intelligence in the Classroom

Everyone knows that schools have a problem with feelings. Walk into any classroom and you’ll witness the elaborate dance of affective suppression that defines modern education. Students learn to hide frustration behind blank stares, to swallow anxiety whole, to perform engagement even when drowning in confusion. Faculty become masterful at reading the affective climate while pretending emotions don’t exist unless they become “disruptive,” at which point they’re quickly pathologized or punished. This affective hide-and-seek isn’t merely unfortunate. It’s academically devastating in ways that most educators are only beginning to understand. The recognition that emotion fundamentally shapes learning has deep roots in psychological research, though it took decades to gain educational traction.

Howard Gardner’s groundbreaking theory of multiple intelligences, introduced in the 1980s, challenged narrow definitions of cognitive ability by identifying “personal intelligences” as distinct forms of human capacity. These included both intrapersonal intelligence (understanding oneself) and interpersonal intelligence (understanding others), categories that opened space for recognizing emotional and social skills as more than personality traits.[i] Gardner’s framework provided crucial legitimacy for educators who suspected that success required more than traditional academic skills.

It was Daniel Goleman’s 1995 bestseller Emotional Intelligence that brought these ideas into mainstream conversation, arguing that EQ often matters more than IQ for success in work and relationships.[ii] Goleman synthesized research from psychology and neuroscience to demonstrate what many educators intuitively knew. Emotions are not distractions from thinking but rather integral to how thinking happens. When learners cannot recognize, understand, or regulate their emotional states, everything else becomes exponentially harder. Memory formation falters under affective stress. Attention scatters when anxiety floods the system. Continue reading “Emotional Intelligence in the Classroom”

Inclusive Teaching 2.0: The Challenge of Equity in Anti-DEI Times

These days American universities find themselves at a peculiar crossroads. With the stroke of a pen, federal actions have swept away diversity, equity, and inclusion (DEI) programs at institutions dependent on government funding. Yet in this moment of apparent retreat, one might discern not an ending but a beginning, creating the potential for a more profound transformation in how we understand the art of teaching itself.

The moment demands reinvention, not retreat. For decades now, inclusive teaching has been quietly revolutionizing classrooms, operating not by privileging some students over others, but by ensuring all students can thrive. The principle, though deceptively simple, borders on the radical: every learner deserves access to tools that support their academic growth. This principle can guide institutions toward a universal model of excellence, grounded in research, focused on outcomes, and aligned with the values of higher education.

Never has the need been greater. Today’s college student defies easy categorization. The stereotypical image of young adults attending full-time classes on residential campuses has given way to something far more complex, with students juggling work commitments, family responsibilities, and extended degree timelines to manage costs.[1] This demographic shift demands nothing less than a pedagogical evolution, one that acknowledges students’ multifaceted lives while maintaining academic rigor.

Enter Universal Design for Learning (UDL), an evidence-based framework offering a compelling vision for the future. UDL is defined as “a framework developed to improve and optimize teaching and learning for all people based on scientific insights into how humans learn.”[2] Grounded in cognitive neuroscience and educational research, UDL principles encourage educators to present information in multiple ways, offer students various methods for demonstrating understanding, and foster engagement through real-world relevance and autonomy. Continue reading “Inclusive Teaching 2.0: The Challenge of Equity in Anti-DEI Times”

The Other Campus Gender Matter

Recent months have witnessed heightened gender-related conflicts in higher education. From debates over transgender athletes to challenges against Diversity, Equity, and Inclusion (DEI) initiatives, a narrative portraying universities as ideological battlegrounds has gained traction. Political figures, including Vice President J.D. Vance, have amplified this perspective, declaring that “universities are the enemy” and casting them as sites of indoctrination rather than education.[1]

Beneath these controversies lies a significant but often unacknowledged factor: women’s dramatic ascendance in higher education. The gender gap in college enrollment has reached historic proportions, with female-identified learners constituting approximately 60% of U.S. college students.[2] This unprecedented shift, a cornerstone of women’s empowerment and presence in the workforce, appears to have become a focal point for those seeking to reshape or constrain the university system.

The gender gap in education begins well before college, with girls outperforming boys in reading and writing during elementary years and making up 60 percent of the top students graduating from high school.[3] In the 1990s, some universities began giving boys extra points on their applications, owing to concerns that students might avoid gender-imbalanced schools. While federal law never sanctioned this kind of affirmative action, a Title IX lawsuit filed against the University of Georgia in 1999 effectively ended the practice.[4]

The historical context illuminates the significance of this transformation. Before the mid-20th century, men made up 80% of college students.[5] Women’s integration into universities in substantial numbers, accelerating after World War II, represented a profound social change. This evolution challenged established gender norms, created pathways to professional careers for women, and contributed substantially to their economic and social independence.[6]

Continue reading “The Other Campus Gender Matter”

Evidence-Based Teaching Ideas for Academic Equity

From Degrees of Difficulty: The Challenge of Equity in College Teaching by David Trend, forthcoming from Worlding Books

In recent years, the premise of “evidence-based teaching” has emerged as a key strategy in addressing outcome disparities in higher education. Much like evidence-based practices in medicine and social science, this approach draws on empirical research to inform teaching methods, moving beyond practices based on personal experience or intuition. This shift represents a major change in how educators view the art of teaching itself, acknowledging that while intuition has value, it must be balanced with systematic investigation of what actually works in the classroom. The development of evidence-based teaching can be traced to the late 20th century, along with advances in cognitive science and educational psychology. As researchers gained new insights into adult learning and intellectual development, their findings found their way into the university classroom.

The earliest educational research came from simple comparative methods. Researchers typically would divide classes into control and experimental groups, with one cohort receiving standard instruction and the other a modified version. These “split-class” experiments provided the first rigorous evidence that teaching methods could affect learning outcomes significantly. While rudimentary, these early methods established the crucial principle that teaching effectiveness could be measured and improved through systematic study rather than left to innate talent alone. Educators also relied heavily on pre- and post-testing, administering assessments before and after interventions to measure knowledge gain. Though simple, this approach proved particularly effective for distinguishing teaching strategies that produced lasting comprehension from those yielding only short-term memorization. In addition, some faculty maintained teaching journals documenting their own methods and student responses, which later would be shared with others. While lacking the sophistication of conventional educational studies, these varied methods laid the groundwork for an evidence-based teaching movement asserting that teaching effectiveness could be studied and improved. Continue reading “Evidence-Based Teaching Ideas for Academic Equity”

The Value of Inclusive Teaching

From Degrees of Difficulty: The Challenge of Equity in College Teaching by David Trend, forthcoming from Worlding Books

As awareness grows about the role of structural inequities and systemic biases in student success or failure, many schools are exploring the role of instructional methods and course design in bringing equity to the educational environment. In doing so, institutions are finding that emerging teaching practices guided by evidence-based research can broaden learner success. Key to this movement is the practice of inclusive teaching, a pedagogical approach that recognizes the inherent diversity of learners and seeks to accommodate their varying needs. This philosophy is predicated on the understanding that students come from various backgrounds, possess different learning styles, and often face individual challenges in their educational pursuits. In recognizing these forms of diversity, educators can develop strategies catering to the greatest number of learners, ensuring no one is left behind. This also treats classroom diversity as an asset, enriching the learning experience for all students by introducing multiple perspectives and fostering cross-cultural understanding.

For many faculty like me, the COVID-19 pandemic catalyzed a heightened attention to inclusive principles. The sudden transition to remote learning destabilized my ongoing practices in two significant ways: first, by forcing the adoption of new instructional methods, and second, by making visible latent inequities I hadn’t previously recognized. As mentioned above, this situation led many colleges and universities to scrutinize their teaching approaches and adopt new tools and strategies to enhance fairness, flexibility, and accessibility. The pandemic also highlighted the importance of social-emotional learning and mental health support in education, prompting institutions to integrate these elements into their teaching strategies more fully. Continue reading “The Value of Inclusive Teaching”

Decentering the Teacher

The university classroom long has been dominated by teacher-centered instruction, which has shown some adaptability while retaining its fundamental characteristics. It wasn’t until the late 20th century that this approach faced significant challenges, as evidence-based practices and learning sciences began to inform educational methods. Understanding this transition requires examining the extensive history of teacher-centered education, including the influence of global pedagogical traditions and the effects of industrialization and technological advances.

Throughout educational history, our understanding of how children and young adults learn has continuously evolved. For centuries, this understanding remained notably one-dimensional, failing to account for the complexity of human learning. Prior to the 20th century, in most parts of the world, children were either seen as blank slates or miniature adults, requiring little more than information and discipline as they matured. Philosophers in the 1700s described children as possessing a natural goodness or in need of stern training. But it wasn’t until the early 1900s that Swiss psychologist Jean Piaget began charting children’s “stages” of maturity.[i] From this would emerge understandings of how youngsters transition from self-centeredness into social beings, eventually acquiring capacities to actively “construct” knowledge rather than passively taking it in. These insights about cognition and learning would eventually underlie the fields of child development and “child-centered” education. Continue reading “Decentering the Teacher”

Legacies of Western Exclusion

Education in the U.S. has a complex history, marked by intellectual progress and systematic exclusion. For over three centuries, its institutions have often prioritized certain forms of knowledge and ways of thinking, inadvertently or intentionally reinforcing intellectual hierarchies. Western philosophical traditions played a significant role in this by emphasizing reason and science while promoting a Eurocentric worldview. The influence of colonialism further complicated matters, as it led to the suppression and erasure of Indigenous knowledge systems around the world and in the U.S. This historical context left a lasting impact on the structure and focus of American higher education, influencing who has access and what is considered valuable knowledge.

Much of this can be traced to the Age of Reason of the 17th and 18th centuries, which profoundly shaped philosophical and educational frameworks in Europe and internationally. Prominent thinkers like John Locke and Immanuel Kant advanced the authority of rationalism and empiricism, influencing philosophical discourse and promoting certain disciplines over others.[i] This focus led to the development of university curricula that, while comprehensive, often functioned selectively.

The Age of Reason reinforced a Eurocentric perspective that marginalized non-Western forms of knowledge and understanding. Visions of world history that placed Europe at the pinnacle of civilization, as set forth by Georg Wilhelm Friedrich Hegel, rendered other cultures as less developed or less worthy.[ii] This prejudice led academic institutions to criticize, misrepresent, or entirely disregard non-Western philosophies, sciences, and cultural practices. Edward Said’s concept of “Orientalism” explained how Western academia constructed and perpetuated distorted views of non-Western societies, often rendering them as exotic, backward, or irrational in contrast to the supposedly rational and progressive West.[iii] This intellectual bias not only shaped academic disciplines like anthropology and geography but also influenced broader educational curricula, diplomatic relations, and colonial policies. Consequently, the university emerging from this intellectual milieu often failed to recognize or value Indigenous knowledge systems, oral traditions, and alternative epistemologies, further entrenching the dominance of Western thought in global academic discourse. Continue reading “Legacies of Western Exclusion”

The Shrinking College Premium

The “college premium” is the shorthand term for the income differential accruing to those who complete four-year degrees. Often attributed to research begun in 2011 by Georgetown University’s Center on Education and the Workforce (CEW), the college premium concept came about from estimates comparing the average lifetime earnings of college graduates ($2.3 million) to those of high school diploma holders ($1.3 million).[i] In the subsequent decade, the CEW estimate swelled from its initial $1 million to $1.2 million as the premium made college seem like a mandatory life choice.

But families often pay heavily for this benefit, as top-tier universities edge ever closer to tuition costs of $100,000. This year, Vanderbilt University came nearest to this much-watched threshold, projecting tuition of $98,426, though it also emphasized that most students receive financial aid. This trend is evident in other prestigious institutions like Brown, NYU, Tufts, and Yale, whose costs are similarly approaching six figures. While these universities cater to a specific segment, it’s noteworthy that the national average tuition is $56,000 for private colleges and $26,000 for public universities. The rising costs across the industry continue to be a significant concern.[ii]

Seen in broader terms, these costs reflect a decade-long pattern of tuition increases among all U.S. colleges and universities, amounting to twice the inflation rate.[iii] This rate of increase isn’t a deliberate decision by educational institutions; instead, it’s driven by several historical factors that converge to push tuition upward. One contributing factor is the relationship between financial aid and tuition fees. As students and their families receive more grants, there’s a tendency to be more accepting of higher tuition fees, which in turn enables colleges to increase prices. Additionally, many students harbor the misconception that any college degree will automatically lead to better earnings, leading them to spend less time evaluating the actual costs of their education.[iv] Continue reading “The Shrinking College Premium”