American Anti-Intellectualism Fuels the New War on College

American higher education finds itself under siege, facing unprecedented political attacks that threaten its fundamental mission and autonomy. What makes these assaults particularly devastating is not just their intensity, but the fertile ground of public sentiment that has enabled them to take root and flourish. The convergence of deep-seated anti-intellectual currents with a dramatic erosion of trust in universities has created the perfect conditions for opportunistic politicians to weaponize higher education as a cultural and political battleground.

Once seen as sites of personal and social betterment, universities and colleges nationwide now struggle with a profound crisis of confidence. This shift in perception is hardly anecdotal. Recent surveys reveal that only 36 percent of Americans feel positively about higher education, reflecting serious concerns over the institution’s efficacy and fairness.[i] Moreover, a growing partisan divide compounds the erosion of trust. While 59 percent of Democrats express confidence in higher education, a staggering 81 percent of Republican voters now view the institution unfavorably. This chasm speaks volumes about the politicization of education in America, with college increasingly seen as a battleground for antagonistic ideologies.[ii]

A pragmatic shift in educational preferences accompanies this rift. Mirroring student attitudes, most Americans now regard trade schools and vocational training as equivalent or superior to four-year institutions in delivering practical education. This pivot reflects both changing educational values and an indictment of the entire enterprise of higher education. The growing appeal of alternative educational paths suggests a fundamental reevaluation of what constitutes valuable knowledge and skills in today’s rapidly changing job market. The roots of this mistrust are multifaceted, extending beyond mere economic calculations to encompass broader socio-political undercurrents. Often associated with privilege and intellectual elitism, higher education is increasingly viewed through a lens of class-based suspicion.

Continue reading “American Anti-Intellectualism Fuels the New War on College”

Emotional Intelligence in the Classroom

Everyone knows that schools have a problem with feelings. Walk into any classroom and you’ll witness the elaborate dance of affective suppression that defines modern education. Students learn to hide frustration behind blank stares, to swallow anxiety whole, to perform engagement even when drowning in confusion. Faculty become masterful at reading the affective climate while pretending emotions don’t exist unless they become “disruptive,” at which point they’re quickly pathologized or punished. This affective hide-and-seek isn’t merely unfortunate. It’s academically devastating in ways that most educators are only beginning to understand.

The recognition that emotion fundamentally shapes learning has deep roots in psychological research, though it took decades to gain educational traction.

Howard Gardner’s groundbreaking theory of multiple intelligences, introduced in the 1980s, challenged narrow definitions of cognitive ability by identifying “personal intelligences” as distinct forms of human capacity. These included both intrapersonal intelligence (understanding oneself) and interpersonal intelligence (understanding others), categories that opened space for recognizing emotional and social skills as more than personality traits.[i] Gardner’s framework provided crucial legitimacy for educators who suspected that success required more than traditional academic skills.

It was Daniel Goleman’s 1995 bestseller Emotional Intelligence that brought these ideas into mainstream conversation, arguing that EQ often matters more than IQ for success in work and relationships.[ii] Goleman synthesized research from psychology and neuroscience to demonstrate what many educators intuitively knew. Emotions are not distractions from thinking but rather integral to how thinking happens. When learners cannot recognize, understand, or regulate their emotional states, everything else becomes exponentially harder. Memory formation falters under affective stress. Attention scatters when anxiety floods the system. Continue reading “Emotional Intelligence in the Classroom”

Inclusive Teaching 2.0: The Challenge of Equity in Anti-DEI Times

These days American universities find themselves at a peculiar crossroads. With the stroke of a pen, federal actions have swept away diversity, equity, and inclusion (DEI) programs at institutions dependent on government funding. Yet in this moment of apparent retreat, one might discern not an ending but a beginning, creating the potential for a more profound transformation in how we understand the art of teaching itself.

The moment demands reinvention, not retreat. For decades now, inclusive teaching has been quietly revolutionizing classrooms, operating not by privileging some students over others, but by ensuring all students can thrive. The principle, though deceptively simple, borders on the radical: every learner deserves access to tools that support their academic growth. This principle can guide institutions toward a universal model of excellence, grounded in research, focused on outcomes, and aligned with the values of higher education.

Never has the need been greater. Today’s college student defies easy categorization. The stereotypical image of young adults attending full-time classes on residential campuses has given way to something far more complex, with students juggling work commitments, family responsibilities, and extended degree timelines to manage costs.[1] This demographic shift demands nothing less than a pedagogical evolution, one that acknowledges students’ multifaceted lives while maintaining academic rigor.

Enter Universal Design for Learning (UDL), an evidence-based framework offering a compelling vision for the future. UDL is defined as “a framework developed to improve and optimize teaching and learning for all people based on scientific insights into how humans learn.”[2] Grounded in cognitive neuroscience and educational research, UDL principles encourage educators to present information in multiple ways, offer students various methods for demonstrating understanding, and foster engagement through real-world relevance and autonomy. Continue reading “Inclusive Teaching 2.0: The Challenge of Equity in Anti-DEI Times”

The Other Campus Gender Matter

Recent months have witnessed heightened gender-related conflicts in higher education. From debates over transgender athletes to challenges against Diversity, Equity, and Inclusion (DEI) initiatives, a narrative portraying universities as ideological battlegrounds has gained traction. Political figures, including Vice President J.D. Vance, have amplified this perspective, declaring that “universities are the enemy” and casting them as sites of indoctrination rather than education.[1]

Beneath these controversies lies a significant but often unacknowledged factor: women’s dramatic ascendance in higher education. The gender gap in college enrollment has reached historic proportions, with female-identified learners constituting approximately 60% of U.S. college students.[2] This unprecedented shift, a cornerstone of women’s empowerment and presence in the workforce, appears to have become a focal point for those seeking to reshape or constrain the university system.

The gender gap in education begins well before college, with girls outperforming boys in reading and writing during the elementary years and making up 60 percent of the top students graduating from high school.[3] In the 1990s, some universities began giving boys extra points on their applications, owing to concerns that students might avoid gender-imbalanced schools. While federal law never sanctioned this kind of affirmative action, a Title IX lawsuit filed against the University of Georgia in 1999 effectively ended the practice.[4]

The historical context illuminates the significance of this transformation. Before the mid-20th century, men made up 80% of college students.[5] Women’s integration into universities in substantial numbers, accelerating after World War II, represented a profound social change. This evolution challenged established gender norms, created pathways to professional careers for women, and contributed substantially to their economic and social independence.[6]

Continue reading “The Other Campus Gender Matter”

Evidence-Based Teaching Ideas for Academic Equity

From Degrees of Difficulty: The Challenge of Equity in College Teaching by David Trend, forthcoming from Worlding Books

In recent years, the practice of “evidence-based teaching” has emerged as a key strategy for addressing outcome disparities in higher education. Much like evidence-based practices in medicine and social science, this approach draws on empirical research to inform teaching methods, moving beyond practices based on personal experience or intuition. This shift represents a major change in how educators view the art of teaching itself, acknowledging that while intuition has value, it must be balanced with systematic investigation of what actually works in the classroom. The development of evidence-based teaching can be traced to the late 20th century, alongside advances in cognitive science and educational psychology. As researchers gained new insights into adult learning and intellectual development, their findings found their way into the university classroom.

The earliest educational research relied on simple comparative methods. Researchers typically would divide classes into control and experimental groups, with one cohort receiving standard instruction and the other a modified version. These “split-class” experiments provided the first rigorous evidence that teaching methods could significantly affect learning outcomes. While rudimentary, they established the crucial principle that teaching effectiveness could be measured and improved through systematic study rather than innate talent alone. Educators also relied heavily on pre- and post-testing, administering assessments before and after interventions to measure knowledge gain. Though simple, this approach proved particularly useful for distinguishing teaching strategies that produced lasting comprehension from those yielding only short-term memorization. In addition, some faculty maintained teaching journals documenting their own methods and student responses, which later would be shared with others. While lacking the sophistication of formal educational studies, these varied methods laid the groundwork for an evidence-based teaching movement asserting that teaching effectiveness could be studied and improved.
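For readers curious how these early designs worked in practice, here is a minimal sketch in Python of a split-class comparison built on pre- and post-testing. The scores, cohort sizes, and the choice of a two-sample t-test are illustrative assumptions, not a reconstruction of any study discussed here.

```python
# Illustrative sketch of an early "split-class" design: two cohorts
# take the same pre- and post-test, and we compare average gains.
# All scores below are hypothetical, invented for illustration.
from statistics import mean
from scipy.stats import ttest_ind  # requires scipy

control_pre  = [52, 48, 61, 55, 50, 58, 47, 53]   # standard instruction
control_post = [63, 60, 70, 66, 61, 69, 58, 64]
treated_pre  = [51, 49, 60, 56, 52, 57, 46, 54]   # modified instruction
treated_post = [71, 68, 78, 75, 70, 76, 65, 73]

def gains(pre, post):
    """Per-student knowledge gain: post-test minus pre-test score."""
    return [after - before for before, after in zip(pre, post)]

control_gain = gains(control_pre, control_post)
treated_gain = gains(treated_pre, treated_post)

print(f"Mean gain, standard instruction: {mean(control_gain):.1f} points")
print(f"Mean gain, modified instruction: {mean(treated_gain):.1f} points")

# A two-sample t-test asks whether the difference between cohorts is
# larger than ordinary student-to-student variation would predict.
t_stat, p_value = ttest_ind(treated_gain, control_gain)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Continue reading “Evidence-Based Teaching Ideas for Academic Equity”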

The Value of Inclusive Teaching

From Degrees of Difficulty: The Challenge of Equity in College Teaching by David Trend, forthcoming from Worlding Books

As awareness grows about the role of structural inequities and systemic biases in student success or failure, many schools are exploring the role of instructional methods and course design in bringing equity to the educational environment. In doing so, institutions are finding that emerging teaching practices guided by evidence-based research can broaden learner success. Key to this movement is the practice of inclusive teaching, a pedagogical approach that recognizes the inherent diversity of learners and seeks to accommodate their varying needs. This philosophy is predicated on the understanding that students come from various backgrounds, possess different learning styles, and often face individual challenges in their educational pursuits. In recognizing these forms of diversity, educators can develop strategies catering to the greatest number of learners, ensuring no one is left behind. This approach also treats classroom diversity as an asset, enriching the learning experience for all students by introducing multiple perspectives and fostering cross-cultural understanding.

For many faculty like me, the COVID-19 pandemic catalyzed heightened attention to inclusive principles. The sudden transition to remote learning destabilized my ongoing practices in two significant ways: first, by forcing the adoption of new instructional methods, and second, by making visible latent inequities I hadn’t previously recognized. As mentioned above, this situation led many colleges and universities to scrutinize their teaching approaches and adopt new tools and strategies to enhance fairness, flexibility, and accessibility. The pandemic also highlighted the importance of social-emotional learning and mental health support in education, prompting institutions to integrate these elements into their teaching strategies more fully. Continue reading “The Value of Inclusive Teaching”

Decentering the Teacher

The university classroom has long been dominated by teacher-centered instruction, an approach that has shown some adaptability while retaining its fundamental characteristics. It wasn’t until the late 20th century that this model faced significant challenges, as evidence-based practices and learning sciences began to inform educational methods. Understanding this transition requires examining the extensive history of teacher-centered education, including the influence of global pedagogical traditions and the effects of industrialization and technological advances.

Throughout educational history, our understanding of how children and young adults learn has continuously evolved. For centuries, this understanding remained notably one-dimensional, failing to account for the complexity of human learning. Prior to the 20th century, children in most parts of the world were seen either as blank slates or as miniature adults, requiring little more than information and discipline as they matured. Philosophers in the 1700s described children as possessing a natural goodness or as in need of stern training. But it wasn’t until the early 1900s that Swiss psychologist Jean Piaget began charting children’s “stages” of maturity.[i] From this would emerge understandings of how youngsters transition from self-centeredness into social beings, eventually acquiring capacities to actively “construct” knowledge rather than passively taking it in. These insights about cognition and learning would eventually underlie the fields of child development and “child-centered” education. Continue reading “Decentering the Teacher”

Legacies of Western Exclusion

Education in the U.S. has a complex history, marked by intellectual progress and systematic exclusion. For over three centuries, its institutions have often prioritized certain forms of knowledge and ways of thinking, inadvertently or intentionally reinforcing intellectual hierarchies. Western philosophical traditions played a significant role in this by emphasizing reason and science while promoting a Eurocentric worldview. The influence of colonialism further complicated matters, as it led to the suppression and erasure of Indigenous knowledge systems around the world and in the U.S. This historical context left a lasting impact on the structure and focus of American higher education, influencing who has access and what is considered valuable knowledge.

Much of this can be traced to the Age of Reason of the 17th and 18th centuries, which profoundly shaped philosophical and educational frameworks in Europe and internationally. Prominent thinkers like John Locke and Immanuel Kant advanced the authority of rationalism and empiricism, influencing philosophical discourse and promoting certain disciplines over others.[i] This focus led to the development of university curricula that, while comprehensive, often functioned selectively.

The Age of Reason reinforced a Eurocentric perspective that marginalized non-Western forms of knowledge and understanding. Visions of world history that placed Europe at the pinnacle of civilization, as set forth by Georg Wilhelm Friedrich Hegel, rendered other cultures as less developed or less worthy.[ii] This prejudice led academic institutions to criticize, misrepresent, or entirely disregard non-Western philosophies, sciences, and cultural practices. Edward Said’s concept of “Orientalism” explained how Western academia constructed and perpetuated distorted views of non-Western societies, often rendering them as exotic, backward, or irrational in contrast to the supposedly rational and progressive West.[iii] This intellectual bias not only shaped academic disciplines like anthropology and geography but also influenced broader educational curricula, diplomatic relations, and colonial policies. Consequently, the university emerging from this intellectual milieu often failed to recognize or value Indigenous knowledge systems, oral traditions, and alternative epistemologies, further entrenching the dominance of Western thought in global academic discourse. Continue reading “Legacies of Western Exclusion”

The Shrinking College Premium

The “college premium” is the shorthand term for the income differential accruing to those who complete four-year degrees. Often attributed to research begun in 2011 by Georgetown University’s Center on Education and the Workforce (CEW), the college premium concept emerged from estimates comparing the average lifetime earnings of college graduates ($2.3 million) to those of high school diploma holders ($1.3 million).[i] In the subsequent decade, the CEW estimate swelled from its initial $1 million to $1.2 million, as the premium made college seem like a mandatory life choice.

But families often pay heavily for this benefit, as top-tier universities edge ever closer to tuition costs of $100,000. This year, Vanderbilt University came nearest to this much-watched threshold, projecting tuition of $98,426, though it also emphasized that most students receive financial aid. The trend is evident at other prestigious institutions like Brown, NYU, Tufts, and Yale, whose costs are similarly approaching six figures. While these universities cater to a specific segment, it’s noteworthy that the national average tuition is $56,000 for private colleges and $26,000 for public universities. Rising costs across the industry continue to be a significant concern.[ii]

Seen in broader terms, these costs reflect a decade-long pattern of tuition increases across U.S. colleges and universities, amounting to twice the inflation rate.[iii] This rate of increase isn’t a single deliberate decision by educational institutions; instead, it’s driven by several historical factors that converge to push tuition upward. One contributing factor is the relationship between financial aid and tuition fees. As students and their families receive more grants, they tend to be more accepting of higher tuition, which in turn enables colleges to raise prices. Additionally, many students harbor the misconception that any college degree will automatically lead to better earnings, leading them to spend less time evaluating the actual costs of their education.[iv]
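To see what growth at twice the inflation rate implies, here is a minimal sketch in Python. The 3 percent inflation rate is an assumption chosen purely for illustration; the starting price is the average private-college tuition cited above.

```python
# Minimal sketch: compound a starting price at an assumed inflation
# rate versus tuition growing at twice that rate over a decade.
# The 3% inflation figure is hypothetical; $56,000 is the article's
# average private-college tuition.
inflation = 0.03
tuition_growth = 2 * inflation
start_price = 56_000

for year in (0, 5, 10):
    general_price = start_price * (1 + inflation) ** year
    tuition_price = start_price * (1 + tuition_growth) ** year
    print(f"Year {year:2d}: inflation-only ${general_price:>9,.0f}  "
          f"vs tuition-track ${tuition_price:>9,.0f}")
```

Under these assumed rates, the tuition track passes $100,000 within ten years while general prices reach only about $75,000, which helps explain how six-figure sticker prices arrived so quickly.

Continue reading “The Shrinking College Premium”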

When Universities Become Brands

Choosing a college from among America’s 5,775 public and private institutions can be one of the biggest decisions a young adult makes. With 25 million applicants making these choices, a large industry exists to help with the process, encompassing high-school guidance counselors, college admissions offices, professional advisors, industry organizations, books and guides, and ranking publications – all devoted to helping applicants find the “best” school for them.[i] From elite private universities to regional state colleges, for-profit institutions, and community colleges, the hierarchy of institutions is well-recognized and often shapes public opinion. This stratification raises crucial questions about access, equity, and whether the status of an institution significantly determines a graduate’s long-term success. This “brand hierarchy” is a reality of the U.S. higher education system. The public often assigns greater value to highly selective, well-resourced institutions with name recognition.

Rankings and media portrayals fuel this perception, creating an implicit understanding that some colleges are simply “better” than others. In fact, studies from the U.S. Department of Education show 74% of prospective students rating “reputation/academic quality” as the most important factor in choosing a school – more important than tuition cost (67%), proximity to home (26%), or personal recommendations (24%).[ii]

A central question for the public is whether the name of the institution on a diploma translates to tangible differences in earnings potential and life satisfaction. There’s a prevailing assumption that graduates of elite universities have a clear advantage, but the reality is more complex. Partly this has to do with the structural benefits that higher education institutions provide as a transitional ground between high school and adulthood. For many young adults, elite colleges are seen as sources of social connections, professional networks, access to organizations, recommendations, and mentoring, much of it linked to a particular college or university brand identity.

Continue reading “When Universities Become Brands”