The Other Campus Gender Matter

Recent months have witnessed heightened gender-related conflicts in higher education. From debates over transgender athletes to challenges against Diversity, Equity, and Inclusion (DEI) initiatives, a narrative portraying universities as ideological battlegrounds has gained traction. Political figures, including Vice President J.D. Vance, have amplified this perspective, declaring that “universities are the enemy” and casting them as sites of indoctrination rather than education.[1]

Beneath these controversies lies a significant but often unacknowledged factor: women’s dramatic ascendance in higher education. The gender gap in college enrollment has reached historic proportions, with female-identified learners constituting approximately 60% of U.S. college students.[2] This unprecedented shift, a cornerstone of women’s empowerment and presence in the workforce, appears to have become a focal point for those seeking to reshape or constrain the university system.

The gender gap in education begins well before college, with girls outperforming boys in reading and writing during the elementary years and making up 60 percent of the top students graduating from high school.[3] In the 1990s, some universities began giving boys extra points on their applications, owing to concerns that students might avoid gender-imbalanced schools. While federal law never sanctioned this kind of affirmative action, a Title IX lawsuit filed against the University of Georgia in 1999 effectively ended the practice.[4]

The historical context illuminates the significance of this transformation. Before the mid-20th century, men made up 80% of college students.[5] Women’s integration into universities in substantial numbers, accelerating after World War II, represented a profound social change. This evolution challenged established gender norms, created pathways to professional careers for women, and contributed substantially to their economic and social independence.[6]

Continue reading “The Other Campus Gender Matter”

Behind the New Diversity Culture Wars

From Degrees of Difficulty: The Challenge of Equity in College Teaching by David Trend, now free of charge from Worlding Books

Underlying current opposition to diversity programs lies the pervasive belief that inequity and bias barely exist in a “post-civil rights, post-feminist” era, and that efforts to redress them have gone too far. This mindset helps explain why, as American universities now face a federal ban on diversity, equity, and inclusion (DEI) initiatives, a plurality of the general public supports eliminating the programs, with a recent Economist/YouGov poll finding 45% in favor of ending DEI in education and 40% opposed.[1] Already intense in state legislatures and conservative media, this resistance reflects deeply rooted American ideologies about meritocracy and individualism that clash with efforts to address systemic inequalities in higher education. The resulting political struggle has transformed campus diversity initiatives from administrative policies into flashpoints in America’s culture wars.

These controversies are no secret. Recent measures to ban or restrict DEI and the teaching of critical race theory (CRT) in educational institutions reflect a longstanding political backlash. Leading up to the November election, 85 anti-DEI bills had been introduced in 28 state legislatures, according to the Chronicle of Higher Education’s “DEI Legislation Tracker.”[2] These often broadly worded laws created confusion and fear among educators, while chilling discussions of race, gender, sexual orientation, and disability on campuses. Continue reading “Behind the New Diversity Culture Wars”

Evidence-Based Teaching Ideas for Academic Equity

From Degrees of Difficulty: The Challenge of Equity in College Teaching by David Trend, forthcoming from Worlding Books

In recent years, the premise of “evidence-based teaching” has emerged as a key strategy for addressing outcome disparities in higher education. Much like evidence-based practices in medicine and social science, this approach draws on empirical research to inform teaching methods, moving beyond practices based on personal experience or intuition. This shift represents a major change in how educators view the art of teaching itself, acknowledging that while intuition has value, it must be balanced with systematic investigation of what actually works in the classroom. The development of evidence-based teaching can be traced to the late 20th century, alongside advances in cognitive science and educational psychology. As researchers gained new insights into adult learning and intellectual development, their findings found their way into the university classroom.

The earliest educational research relied on simple comparative methods. Researchers typically would divide classes into control and experimental groups, with one cohort receiving standard instruction and the other a modified version. These “split-class” experiments provided the first rigorous evidence that teaching methods could significantly affect learning outcomes. While rudimentary, these early methods established the crucial principle that teaching effectiveness could be measured and improved through systematic study rather than innate talent alone. Educators also relied heavily on pre- and post-testing, administering assessments before and after an intervention to measure knowledge gain. Though simple, this approach proved particularly useful for distinguishing teaching strategies that produced lasting comprehension from those that yielded only short-term memorization. In addition, some faculty kept teaching journals documenting their own methods and student responses, which they later shared with colleagues. While lacking the sophistication of formal educational studies, these varied methods laid the groundwork for an evidence-based teaching movement asserting that teaching effectiveness could be studied and improved. Continue reading “Evidence-Based Teaching Ideas for Academic Equity”
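The pre- and post-testing logic described in this excerpt can be made concrete with a small worked example. The sketch below is illustrative only and does not come from Degrees of Difficulty: the scores, cohort labels, and the choice of average normalized gain (a metric commonly used in education research) are assumptions made purely for demonstration.

```python
# Illustrative sketch (not from the book): comparing pre/post test scores
# for a hypothetical "split-class" study. All scores below are invented.

from statistics import mean

def normalized_gain(pre_scores, post_scores, max_score=100):
    """Average normalized gain per student: (post - pre) / (max_score - pre)."""
    gains = [
        (post - pre) / (max_score - pre)
        for pre, post in zip(pre_scores, post_scores)
        if pre < max_score  # skip anyone who already scored the maximum
    ]
    return mean(gains)

# Hypothetical cohort receiving standard instruction (the control group)
control_pre, control_post = [55, 62, 48, 70, 58], [63, 70, 55, 78, 66]

# Hypothetical cohort receiving the modified instruction (the experimental group)
experimental_pre, experimental_post = [54, 60, 50, 68, 57], [72, 79, 68, 85, 74]

print(f"Control gain:      {normalized_gain(control_pre, control_post):.2f}")
print(f"Experimental gain: {normalized_gain(experimental_pre, experimental_post):.2f}")
```

In a study of this kind, a higher average gain for the experimental cohort is the signal early researchers looked for, though a real analysis would also test for statistical significance and control for differences between the two groups.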

Decentering the Teacher

The university classroom long has been dominated by teacher-centered instruction, which has shown some adaptability while retaining its fundamental characteristics. It wasn’t until the late 20th century that this approach faced significant challenges, as evidence-based practices and learning sciences began to inform educational methods. Understanding this transition requires examining the extensive history of teacher-centered education, including the influence of global pedagogical traditions and the effects of industrialization and technological advances.

Throughout educational history, our understanding of how children and young adults learn has continuously evolved. For centuries, this understanding remained notably one-dimensional, failing to account for the complexity of human learning. Prior to the 20th century, in most parts of the world, children were seen either as blank slates or as miniature adults, requiring little more than information and discipline as they matured. Philosophers in the 1700s described children as possessing a natural goodness or as in need of stern training. But it wasn’t until the early 20th century that Swiss psychologist Jean Piaget began charting children’s “stages” of maturity.[i] From this would emerge understandings of how youngsters transition from self-centeredness into social beings, eventually acquiring capacities to actively “construct” knowledge rather than passively taking it in. These insights about cognition and learning would eventually underlie the fields of child development and “child-centered” education. Continue reading “Decentering the Teacher”

Legacies of Western Exclusion

Education in the U.S. has a complex history, marked by intellectual progress and systematic exclusion. For over three centuries, its institutions have often prioritized certain forms of knowledge and ways of thinking, inadvertently or intentionally reinforcing intellectual hierarchies. Western philosophical traditions played a significant role in this by emphasizing reason and science while promoting a Eurocentric worldview. The influence of colonialism further complicated matters, as it led to the suppression and erasure of Indigenous knowledge systems around the world and in the U.S. This historical context left a lasting impact on the structure and focus of American higher education, influencing who has access and what is considered valuable knowledge.

Much of this can be traced to the Age of Reason of the 17th and 18th centuries, which profoundly shaped philosophical and educational frameworks in Europe and internationally. Prominent thinkers like John Locke and Immanuel Kant advanced the authority of rationalism and empiricism, influencing philosophical discourse and promoting certain disciplines over others.[i] This focus led to the development of university curricula that, while comprehensive, often functioned selectively.

The Age of Reason reinforced a Eurocentric perspective that marginalized non-Western forms of knowledge and understanding. Visions of world history that placed Europe at the pinnacle of civilization, as set forth by Georg Wilhelm Friedrich Hegel, rendered other cultures as less developed or less worthy.[ii] This prejudice led academic institutions to criticize, misrepresent, or entirely disregard non-Western philosophies, sciences, and cultural practices. Edward Said’s concept of “Orientalism” explained how Western academia constructed and perpetuated distorted views of non-Western societies, often rendering them as exotic, backward, or irrational in contrast to the supposedly rational and progressive West.[iii] This intellectual bias not only shaped academic disciplines like anthropology and geography but also influenced broader educational curricula, diplomatic relations, and colonial policies. Consequently, the university emerging from this intellectual milieu often failed to recognize or value Indigenous knowledge systems, oral traditions, and alternative epistemologies, further entrenching the dominance of Western thought in global academic discourse. Continue reading “Legacies of Western Exclusion”

When Universities Become Brands

Choosing a college from among America’s 5,775 public and private options can be one of the biggest decisions a young adult makes. With 25 million applicants making these choices, a large industry exists to help with the process, encompassing high-school guidance counsellors, college admissions offices, professional advisors, industry organizations, books and guides, and ranking publications, all devoted to helping applicants find the “best” school for them.[i] From elite private universities to regional state colleges, for-profit institutions, and community colleges, the hierarchy of institutions is well-recognized and often shapes public opinion. This stratification raises crucial questions about access, equity, and whether the status of an institution significantly determines a graduate’s long-term success. This “brand hierarchy” is a reality of the U.S. higher education system. The public often assigns greater value to highly selective, well-resourced institutions with name recognition.

Rankings and media portrayals fuel this perception, creating an implicit understanding that some colleges are simply “better” than others. In fact, studies from the U.S. Department of Education show 74% of prospective students rating “reputation/academic quality” as the most important factor in choosing a school, ahead of tuition cost (67%), proximity to home (26%), or personal recommendations (24%).[ii]

A central question for the public is whether the name of the institution on a diploma translates into tangible differences in earnings potential and life satisfaction. There’s a prevailing assumption that graduates of elite universities have a clear advantage, but the reality is more complex. Partly this has to do with the structural benefits that higher education institutions provide as a transitional ground between high school and adulthood. For many young adults, elite colleges are seen as sources of social connections, professional networks, access to organizations, recommendations, and mentoring, much of it linked to a particular college or university brand identity.

Continue reading “When Universities Become Brands”

The Last Lecture

Teacher-centered instruction, a dominant approach for centuries, finds its roots in several ancient civilizations and has evolved through a variety of cultural contexts. History helps explain how this type of education has reflected certain social values and sustained its popularity through the transformations of the industrial era. It wasn’t until the 20th and 21st centuries that significant re-evaluations began to challenge this traditional model, based on outcome evidence and learning science. To understand the factors behind this transition, it’s crucial to examine the long trajectory of teacher-centered pedagogy, including the influence of global educational traditions and the impacts of capitalism and modernization. This exploration provides a comprehensive understanding of the present state of higher education and the diverse factors that have shaped its evolution.

The tradition of teacher-centered pedagogy finds its roots in ancient civilizations such as Greece, China, and India, predating widespread literacy and printed materials. In ancient Greece, figures like Socrates, Plato, and Aristotle engaged in dialectic methods, fostering environments where oral discourse was paramount.[1]   In China, Confucian education emphasized hierarchical relationships and moral instruction, with teachers like Confucius himself serving as the central figures of wisdom.[2] Similarly, in India, the Gurukul system placed teachers (gurus) at the heart of the educational process, guiding students (shishyas) through rigorous intellectual and spiritual training. Continue reading “The Last Lecture”

Never Smart Enough

Everyone wishes for higher intelligence. Like beauty and fitness, it’s another quality everybody seems to want. But at some point in life, most people accept what they have and just plow ahead. This sense of defined limits comes from grades, standardized tests, performance evaluations, and chosen pathways reinforced throughout life in competitive comparison. Because of this, attitudes toward intelligence become a perfect set-up for enhancement marketing. Rarely is the definition of intelligence questioned, even though the concept is extremely murky. Instead, what gets advanced is the hope of salvation, supplement, addition, or replacement of native functioning, these days offered in a dizzying array of methods, tricks, and technologies. Memory-boosting supplements like Brainmentin and Optimind flood the consumer market, often pitched to aging baby-boomers.

Students drink Red Bull or acquire ADD drugs to study for tests. Exercise and nutritional products promise sharper thinking through purportedly “natural” means. Dig a little further, and one finds unexamined values in intelligence discourse, which privilege reasoning and memory over just about anything else. Important as such traits may be, alone they can’t account for the many and diverse ways people navigate their lives, adapt to changing circumstances, or act in creative ways.

So, what is intelligence? The Cambridge Dictionary says it’s the “ability to understand and learn well, and to form judgments and opinions based on reason.” Most other sources say roughly the same thing. Yet people who study intelligence argue that single definitions just won’t do. There simply are too many variables that go into “intelligent” thinking and behavior –– among them cognition, capacity, context, experience, emotion, orientation, language, memory, motivation, and overall physical health. Continue reading “Never Smart Enough”

Updating the Self

Neuroscientists call the brain an “anticipation machine” because it spends so much time predicting the future.[i] It does this by piecing together past experiences to build scenarios of expected outcomes, in a process that reinforces itself as predictions come true. But of course predictions don’t always come true, creating uncertainty and wreaking havoc on the anticipation machine. In mild cases this expresses itself in a sense of worry that things might go wrong. But pile up a lot of bad experiences and you end up expecting the worst, in what psychologists call “anticipatory dread.”[ii] While this can be a healthy process in buffering the shock of negative events, it also can spiral into a harmful sensation of crisis.

Recent research has a lot to say about the anticipation machine’s relationship to the update impulse. Visions of the future don’t spring from a vacuum, but link to objects, expected outcomes, or something we think we want. This desiring process applies to just about everything, whether it’s a slice of pizza or the admiration of others. But here’s the fascinating part: getting things is less powerful than wanting them. That new pair of jeans might bring a thrill. But soon comes the yearning for another purchase. Neuroimaging reveals that “wanting” and “liking” occur in different parts of the brain, with the former more strongly active than the latter. Contrary to common wisdom, motivation isn’t driven simply by animalistic hungers and drives. What gets people going is the imagination, which is why advertising favors feelings over facts. Continue reading “Updating the Self”

Beyond the Slogans: Evidence-Driven DEI in Higher Education

The past year has witnessed unprecedented assaults on Diversity, Equity, and Inclusion (DEI) initiatives in universities. Often disguised as support for “traditional” values or academic freedom, these criticisms mask a deeper debate about the role and direction of higher education in a diverse society. To navigate this turbulent discussion, it’s important to move beyond slogans and delve into the evidence-based benefits of DEI, not just for educational institutions, but for the very fabric of a democratic society.

Historically, American academia has been marked by exclusion. Access to knowledge, the cornerstone of a thriving democracy, was largely reserved for privileged white students. This reality underscores the dynamic nature of tradition in higher education. True progress lies not in clinging to past practices, but in expanding access to reflect the rich tapestry of American life. DEI serves as a crucial tool in this expansion. Far from a political tool or mere slogan, it represents a data-driven approach to dismantling barriers that impede access and success for historically marginalized communities. Research paints a clear picture:

  • Improved Student Outcomes: Studies by the National Bureau of Economic Research show that diverse learning environments significantly enhance academic performance and critical thinking skills.
  • Higher Graduation Rates: The American Association of Colleges and Universities reports that campuses with robust DEI programs boast higher graduation rates, particularly for socially marginalized students.
  • Stronger Civic Engagement: Research by the National Center for Education Statistics reveals that universities with strong inclusivity practices foster greater student satisfaction and civic engagement.

Continue reading “Beyond the Slogans: Evidence-Driven DEI in Higher Education”