Decentering the Teacher

The university classroom has long been dominated by teacher-centered instruction, an approach that has adapted at the margins while retaining its fundamental characteristics. It wasn’t until the late 20th century that this approach faced significant challenges, as evidence-based practices and the learning sciences began to inform educational methods. Understanding this transition requires examining the long history of teacher-centered education, including the influence of global pedagogical traditions and the effects of industrialization and technological change.

Throughout educational history, our understanding of how children and young adults learn has continuously evolved. For centuries, this understanding remained notably one-dimensional, failing to account for the complexity of human learning. Prior to the 20th century, children in most parts of the world were seen either as blank slates or as miniature adults, requiring little more than information and discipline as they matured. Philosophers in the 1700s described children as possessing a natural goodness or as in need of stern training. But it wasn’t until the early 1900s that Swiss psychologist Jean Piaget began charting children’s “stages” of maturity.[i] From this would emerge understandings of how youngsters transition from self-centeredness into social beings, eventually acquiring capacities to actively “construct” knowledge rather than passively take it in. These insights about cognition and learning would eventually underlie the fields of child development and “child-centered” education. Continue reading “Decentering the Teacher”

Legacies of Western Exclusion

Education in the U.S. has a complex history, marked by intellectual progress and systematic exclusion. For over three centuries, its institutions have often prioritized certain forms of knowledge and ways of thinking, inadvertently or intentionally reinforcing intellectual hierarchies. Western philosophical traditions played a significant role in this by emphasizing reason and science while promoting a Eurocentric worldview. The influence of colonialism further complicated matters, as it led to the suppression and erasure of Indigenous knowledge systems around the world and in the U.S. This historical context left a lasting impact on the structure and focus of American higher education, influencing who has access and what is considered valuable knowledge.

Much of this can be traced to the Age of Reason of the 17th and 18th centuries, which profoundly shaped philosophical and educational frameworks in Europe and internationally. Prominent thinkers like John Locke and Immanuel Kant advanced the authority of rationalism and empiricism, influencing philosophical discourse and promoting certain disciplines over others.[i] This focus led to the development of university curricula that, while comprehensive, often functioned selectively.

The Age of Reason reinforced a Eurocentric perspective that marginalized non-Western forms of knowledge and understanding. Visions of world history that placed Europe at the pinnacle of civilization, as set forth by Georg Wilhelm Friedrich Hegel, rendered other cultures as less developed or worthy.[ii] This prejudice led academic institutions to criticize, misrepresent, or entirely disregard non-Western philosophies, sciences, and cultural practices. Edward Said’s concept of “Orientalism” explained how Western academia constructed and perpetuated distorted views of non-Western societies, often rendering them as exotic, backward, or irrational in contrast to the supposedly rational and progressive West.[iii] This intellectual bias not only shaped academic disciplines like anthropology and geography but also influenced broader educational curricula, diplomatic relations, and colonial policies. Consequently, the university emerging from this intellectual milieu often failed to recognize or value Indigenous knowledge systems, oral traditions, and alternative epistemologies, further entrenching the dominance of Western thought in global academic discourse. Continue reading “Legacies of Western Exclusion”

The Shrinking College Premium

The “college premium” is shorthand for the income differential accruing to those who complete four-year degrees. Often attributed to research begun in 2011 by Georgetown University’s Center on Education and the Workforce (CEW), the college premium concept came from estimates comparing the average lifetime earnings of college graduates ($2.3 million) to those of high school diploma holders ($1.3 million).[i] In the subsequent decade, the CEW estimate swelled from its initial $1 million to $1.2 million, as the premium made college seem like a mandatory life choice.

But families often pay heavily for this benefit, as top-tier universities edge ever closer to tuition costs of $100,000. This year, Vanderbilt University came nearest to this much-watched threshold, projecting tuition of $98,426, though it also emphasized that most students receive financial aid. The trend is evident at other prestigious institutions like Brown, NYU, Tufts, and Yale, whose costs are similarly approaching six figures. While these universities cater to a specific segment, it’s noteworthy that the national average tuition is $56,000 for private colleges and $26,000 for public universities. Rising costs across the industry continue to be a significant concern.[ii]

Seen in broader terms, these costs reflect a decade-long pattern of tuition increases across U.S. colleges and universities, amounting to twice the inflation rate.[iii] These increases aren’t the result of a single deliberate decision by educational institutions; instead, several converging factors push tuition upward. One contributing factor is the relationship between financial aid and tuition fees. As students and their families receive more grants, they tend to be more accepting of higher tuition, which in turn enables colleges to raise prices. Additionally, many students harbor the misconception that any college degree will automatically lead to better earnings, leading them to spend less time evaluating the actual costs of their education.[iv] Continue reading “The Shrinking College Premium”

When Universities Become Brands

Choosing a college from among America’s 5,775 public and private institutions can be one of the biggest decisions a young adult makes. With 25 million applicants making these choices, a large industry exists to help with the process, encompassing high-school guidance counselors, college admissions offices, professional advisors, industry organizations, books and guides, and ranking publications – all devoted to helping applicants find the “best” school for them.[i] From elite private universities to regional state colleges, for-profit institutions, and community colleges, the hierarchy of institutions is well recognized and often shapes public opinion. This stratification raises crucial questions about access, equity, and whether the status of an institution significantly determines a graduate’s long-term success. This “brand hierarchy” is a reality of the U.S. higher education system. The public often assigns greater value to highly selective, well-resourced institutions with name recognition.

Rankings and media portrayals fuel this perception, creating an implicit understanding that some colleges are simply “better” than others. In fact, studies from the U.S. Department of Education show 74% of prospective students rating “reputation/academic quality” as the most important factor in choosing a school –– more important than tuition cost (67%), proximity to home (26%), or personal recommendations (24%).[ii]

A central question for the public is whether the name of the institution on a diploma translates into tangible differences in earnings potential and life satisfaction. There’s a prevailing assumption that graduates of elite universities have a clear advantage, but the reality is more complex. Partly this has to do with the structural benefits that higher education institutions provide as a transitional ground between high school and adulthood. For many young adults, elite colleges are seen as sources of social connections, professional networks, access to organizations, recommendations, and mentoring, much of it linked to a particular college or university brand identity.

Continue reading “When Universities Become Brands”

The Crisis of Belonging

The structural inequities and systemic biases present in higher education profoundly affect learners’ sense of belonging, which in turn influences their academic and social experiences. Research consistently shows that students from historically minoritized backgrounds, including students of color, low-income students, and first-generation college students, often feel less connected to their institutions. This lack of belonging can have far-reaching consequences, impacting learners’ engagement with courses and materials, their sense of connection with peers and community, and their overall well-being and acceptance within the campus culture.

As institutions strive to create more inclusive environments, it is essential to understand the multifaceted ways in which belonging influences student experiences and outcomes. When learners perceive themselves as outsiders, their motivation and participation in academic activities suffer. A recent study found that students who do not feel a sense of belonging are less likely to engage in classroom discussions or participate in group projects, leading to a diminished learning experience.[1] This disengagement is particularly pronounced among learners from underrepresented groups, who may already feel alienated due to cultural and institutional biases. Such environments fail to support these learners, exacerbating feelings of isolation and disengagement. Consequently, these students are often left to navigate academic challenges without the support structures necessary for success, further entrenching existing inequities. Continue reading “The Crisis of Belonging”

The Last Lecture

Teacher-centered instruction, a dominant approach for centuries, finds its roots in several ancient civilizations and has evolved through a variety of cultural contexts. History helps explain how this type of education has reflected certain social values and sustained its popularity through the transformations of the industrial era. It wasn’t until the 20th and 21st centuries that significant re-evaluations began to challenge this traditional model, based on outcome evidence and learning science. To understand the factors behind this transition, it’s crucial to examine the long trajectory of teacher-centered pedagogy, including the influence of global educational traditions and the impacts of capitalism and modernization. This exploration provides a comprehensive understanding of the present state of higher education and the diverse factors that have shaped its evolution.

The tradition of teacher-centered pedagogy finds its roots in ancient civilizations such as Greece, China, and India, predating widespread literacy and printed materials. In ancient Greece, figures like Socrates, Plato, and Aristotle engaged in dialectic methods, fostering environments where oral discourse was paramount.[1] In China, Confucian education emphasized hierarchical relationships and moral instruction, with teachers like Confucius himself serving as the central figures of wisdom.[2] Similarly, in India, the Gurukul system placed teachers (gurus) at the heart of the educational process, guiding students (shishyas) through rigorous intellectual and spiritual training. Continue reading “The Last Lecture”

Never Smart Enough

Everyone wishes for higher intelligence. Like beauty and fitness, it’s another quality everybody seems to want. But at some point in life, most people accept what they have and just plow ahead. This sense of defined limits comes from grades, standardized tests, performance evaluations, and chosen pathways reinforced throughout life in competitive comparison. Because of this, attitudes toward intelligence become a perfect set-up for enhancement marketing. Rarely is the definition of intelligence questioned, even though the concept is extremely murky. Instead, what gets advanced is the hope of salvation, supplement, addition, or replacement of native functioning, these days offered in a dizzying array of methods, tricks, and technologies. Memory-boosting supplements like Brainmentin and Optimind flood the consumer market, often pitched to aging baby boomers.

Students drink Red Bull or acquire ADD drugs to study for tests. Exercise and nutritional products promise sharper thinking through purportedly “natural” means. Dig a little further, and one finds unexamined values in intelligence discourse, which privilege reasoning and memory over just about anything else. Important as such traits may be, alone they can’t account for the many and diverse ways people navigate their lives, adapt to changing circumstances, or act in creative ways.

So, what is intelligence? The Cambridge Dictionary says it’s the “ability to understand and learn well, and to form judgments and opinions based on reason.” Most other sources say roughly the same thing. Yet people who study intelligence argue that single definitions just won’t do. There simply are too many variables that go into “intelligent” thinking and behavior –– among them cognition, capacity, context, experience, emotion, orientation, language, memory, motivation, and overall physical health. Continue reading “Never Smart Enough”

Disability and the Politics of Cure

Disability awareness has increased in recent decades, along with a growing academic field dedicated to it. Partly this is due to the concerns of a large aging baby boomer population. Disability advocacy groups are becoming increasingly vocal. Yet despite incremental advancements in affordances and civil rights, the specter of normalcy persists. In many areas of life –– from employment and housing to education and medical care –– bias, discrimination, and uneasiness can negatively impact people with disabilities.

Even though no longer uniformly cast as outsiders, many with disabilities entering “ableist” society do so at a price. Social awkwardness or a reluctance to engage someone with a disability remains commonplace. This kind of misunderstanding can lead to stereotyping, such as the assumption that a person in a wheelchair wants or needs assistance. Confusion also surrounds certain conditions, as when a speech difference is construed as a cognitive deficit. Furthermore, people with disabilities are sometimes accused of exploiting their disabilities for unfair advantages.

Let’s face it. Majorities tend to view the world from their own perspective, expecting others to conform. This impulse is particularly strong in the U.S., where citizens cling to their rugged individualism. In one of the most competitive countries in the world, failures are more likely to be attributed to individual shortcomings than to any systemic problem. The result has been a distorted view of variances of many types, physical and mental abilities prominent among them. In what often is termed the “medical model” of disability, impairments are seen as illnesses in need of treatment or “cure.” Because the model focuses only on functional differences, it can reduce the person to a diagnosis, while “constructing disability” as a problem.

In the disability community, activists and scholars argue that a condition only becomes a disability when it is conceptualized as one. In its place, they advocate a “social model” of disability, focusing on accommodations, universal design, and inclusive attitudes. People with disabilities can, with the appropriate affordances, found a company like Apple Computer (Steve Jobs), conduct an orchestra (Itzhak Perlman), become poet laureate (Maya Angelou), or President of the United States (Franklin Roosevelt). Continue reading “Disability and the Politics of Cure”

Updating the Self

Neuroscientists call the brain an “anticipation machine” because it spends so much time predicting the future.[i] It does this by piecing together past experiences to build scenarios of expected outcomes, in a process that reinforces itself as predictions come true. But of course predictions don’t always come true, creating uncertainty and wreaking havoc on the anticipation machine. In mild cases this expresses itself as a sense of worry that things might go wrong. But pile up enough bad experiences and you end up expecting the worst, in what psychologists call “anticipatory dread.”[ii] While this can be a healthy process in buffering the shock of negative events, it can also spiral into a harmful sensation of crisis.

Recent research has a lot to say about the anticipation machine’s relationship to the update impulse. Visions of the future don’t spring from a vacuum, but link to objects, expected outcomes, or something we think we want. This desiring process applies to just about everything, whether it’s a slice of pizza or the admiration of others. But here’s the fascinating part: getting things is less powerful than wanting them. That new pair of jeans might bring a thrill, but soon comes the yearning for another purchase. Neuroimaging reveals that “wanting” and “liking” occur in different parts of the brain, with the former more strongly active than the latter. Contrary to common wisdom, motivation isn’t driven simply by animalistic hungers and drives. What gets people going is the imagination, which is why advertising favors feelings over facts. Continue reading “Updating the Self”

Beyond the Slogans: Evidence-Driven DEI in Higher Education

The past year has witnessed unprecedented assaults on Diversity, Equity, and Inclusion (DEI) initiatives in universities. Often disguised as support for “traditional” values or academic freedom, these criticisms mask a deeper debate about the role and direction of higher education in a diverse society. To navigate this turbulent discussion, it’s important to move beyond slogans and delve into the evidence-based benefits of DEI, not just for educational institutions, but for the very fabric of a democratic society.

Historically, American academia has been marked by exclusion. Access to knowledge, the cornerstone of a thriving democracy, was largely reserved for privileged white students. This reality underscores the dynamic nature of tradition in higher education. True progress lies not in clinging to past practices, but in expanding access to reflect the rich tapestry of American life. DEI serves as a crucial tool in this expansion. Far from a political tool or mere slogan, it represents a data-driven approach to dismantling barriers that impede access and success for historically marginalized communities. Research paints a clear picture:

  • Improved Student Outcomes: Studies by the National Bureau of Economic Research show that diverse learning environments significantly enhance academic performance and critical thinking skills.
  • Higher Graduation Rates: The American Association of Colleges and Universities reports that campuses with robust DEI programs boast higher graduation rates, particularly for socially marginalized students.
  • Stronger Civic Engagement: Research by the National Center for Education Statistics reveals that universities with strong inclusivity practices foster greater student satisfaction and civic engagement.

Continue reading “Beyond the Slogans: Evidence-Driven DEI in Higher Education”