The Creative Inner Child?

Pablo Picasso once quipped that “Every child is an artist; the problem is how to remain an artist once they grow up.”[i]  In this often-quoted slogan, Picasso neatly summarized idealized views of the universally creative child and the uncreative adult. In a similar fashion he would later remark, “It takes a long time to become young.” What is one to make of such laments? Nostalgia over a lost youth? A yearning to escape a pressurized grown-up life? Regardless of origins, it’s impossible to deny America’s ongoing infatuation with childhood creativity.

This fascination with childhood artistry dates to the 1700s, corresponding to evolving views of children as “blank slates” (tabula rasa) better served by nurturance and education than by discipline alone. At the same time, Enlightenment debates over individualism and personal autonomy were bringing considerable anxiety to the era, evidenced in worries that self-interest would overwhelm moral sentiments.

This set the stage for the naturalism espoused by Jean-Jacques Rousseau in his book Emile: Or, On Education, which saw an inherent “goodness” in children that becomes corrupted by adult desire and material want.[ii] With the 1800s, views of “human nature” gave way to theories of evolution and behavioral adaptation –– owing in large part to the influence of Charles Darwin and Herbert Spencer. While the resulting rationalism eventually would make education more formulaic, an artsy transcendentalism would counterbalance American culture with an advocacy for an “educated imagination.”[iii] The Romantic Era writings of Ralph Waldo Emerson, Margaret Fuller, Henry Wadsworth Longfellow, and Walt Whitman advanced themes of emotion over reason and imagination over reality –– setting in place a progressive tradition of push-back against the instrumentalist ethos of science and industry. Continue reading “The Creative Inner Child?”

Stop Blaming Students: Toward a Post-Pandemic Pedagogy

There’s trouble in the college classroom these days. But you can’t blame students. The pandemic and other disruptions of the past two years have shaken higher education to the core, casting doubt on how universities deliver instruction, pay their bills, and justify their existence. Enrollments are dropping across the nation, as students and their families increasingly see college as  overpriced, inequitable, and non-essential. More disturbing still are shifts taking place within institutions themselves, as dispirited students are losing motivation and enthusiasm for learning.  Clearly something has to change, with many pointing to the classroom as a key place to start.  But will it be enough?

“A Stunning Level of Disconnection” is the way one recent article described the situation. “Fewer students show up to class. Those who do avoid speaking when possible. Many skip the readings or the homework. They have trouble remembering what they learned and struggle on tests,” several professors reported.[1] Instructors are trying to reach and teach students, to figure out the problem, and do anything they can to fix things, with many now concluding in frustration that “It may be necessary to change the structure of college itself.” Call it a stress test for higher education – the seismic disruption of the college classroom during the COVID-19 years, and its ongoing after-shocks. At all levels of instruction, educators continue to voice alarm over the persistent malaise and underperformance of college students. Continue reading “Stop Blaming Students: Toward a Post-Pandemic Pedagogy”

The Problem with Rigor

“It’s Time to Cancel the Word Rigor,” read a recent headline in the education press.[1]  The article detailed growing concerns about hidden bias within what many see as conventional teaching practices. Here, “rigor” was taken to task for throwing up roadblocks for some students more than others, even as its exact meaning remains vague. Webster’s Dictionary defines rigor as “severity, strictness or austerity,” which educators often translate into difficult courses and large amounts of work, rationalized in the interest of excellence and high standards.[2]

While there is nothing wrong with challenging coursework, per se, this interpretation of rigor often becomes a recipe for failure for otherwise intelligent and hardworking students.  Such failures can result when rigor is used to incentivize or stratify students, as in gateway or “weed out” courses with prescribed grading targets, or situations where faculty overuse tests as motivation. Rigor discussions I have witnessed rarely consider instructional quality, teaching effectiveness, or principles of learning. Instead, faculty complain about poor student attention, comprehension, or commitment. As the Chronicle explains, “all credit or blame falls on individual students, when often it is the academic system that creates the constructs, and it’s the system we should be questioning when it erects barriers for students to surmount or make them feel that they don’t belong.”[3] Continue reading “The Problem with Rigor”

The Algorithm Rejected Me

School is where most kids first become aware of what I call the “update imperative.”  After all, education is a process of continual improvement, a step-by-step progression of knowledge acquisition and socialization. In this sense schooling represents much more than the beginning of education. For many kids it’s a time of moving from the familiarity of home into the larger world of other people, comparative judgement, and a system of tasks and rewards. Along the way, a package of attitudes and beliefs is silently conditioned: conformity to norms, obedience to authority, and the cost of failure. All of this is presented with a gradually intensifying pressure to succeed, rationalized as a rehearsal for adult life. Rarely are the ideological parameters of this “hidden curriculum” ever challenged, or even recognized. Much like work, American K-12 schools are driven largely by mandates of individual achievement and material accumulation.

By the time college applications are due, levels of anxiety can run out of control, given the role of degrees in long-term earnings.  Many students start the admissions Hunger Games as early as middle school, plotting their chances, polishing their transcripts, and doing anything they can to get good grades. Everyone knows how admissions data now flow in an age when students apply to an average of 10 schools each. Unsurprisingly perhaps, overall applications have increased by 22% in the past year alone.[i] And while the applicant side of this equation has been much publicized, what happens in the admissions office remains shrouded in mystery. Largely unknown are the secret criteria driven by algorithms to determine things like likelihood to enroll or willingness to pay. Even less known are the kinds of AI analytics used to monitor and grade students, sometimes making prejudicial judgements along the way. Continue reading “The Algorithm Rejected Me”

Why Professors Ignore the Science of Teaching

A recent article appearing in the Chronicle of Higher Education explored the apparent reluctance of college and university professors to embrace the growing body of research about how students learn and what teaching methods work best. While many faculty simply cling to what has worked for them in the past, others feel overworked and unable to consider changing. In the meantime, an increasingly diverse student population experiences growing inequity as a result.

Beth McMurtrie’s “Why the Science of Teaching is Often Ignored” opens with a discussion of a novel study published by five Harvard University researchers. The group was trying to figure out why active learning, a form of teaching that has had measurable success, often dies a slow death in the classroom. They compared the effects of a traditional lecture with active learning, where students solve problems in small groups.

The results were not surprising: students taught with active-learning methods performed better on standardized tests. The academic press praised the study for its clever design and its resonance with professors who had trouble with active learning. Yet despite being praised in some quarters, the study was criticized in others.

This mixed reaction reveals a central paradox of higher education, according to McMurtrie. Teaching and learning research has grown dramatically over the decades, encompassing thousands of experiments, journals, books, and programs to bring learning science  into classrooms. But a lot of faculty members haven’t read it, aren’t sure what to do with it, or are skeptical. Continue reading “Why Professors Ignore the Science of Teaching”

Inclusive Pedagogy

The pandemic years have been rough on college students everywhere, with record levels of academic stress and losses in student learning.  While occurring throughout higher education, these problems haven’t affected all groups the same way. Students from privileged backgrounds have fared better than the under-resourced, with disparities in network access, income, and external responsibilities exacerbating inequities. As I saw these dynamics play out in the large undergraduate general education courses I teach, I began wondering if instructional methods might be partly to blame and if changes might improve matters going forward. Working with UC Irvine’s Division of Teaching Excellence and Innovation (DTEI) helped me rethink my own teaching by searching out ways I had unconsciously been putting up roadblocks.

Usually when educators speak of “inclusion” they are thinking of course content and ways to incorporate diverse perspectives or voices previously excluded. While this approach remains a central tenet of inclusive teaching, a deeper look at the issue can reveal biases or barriers built into the teaching of even the most progressive educators. Practices of exclusion can result from habits or structures that have become so routinized in instruction that they seem natural or neutral approaches. Costly books, rigid deadlines, and high-stakes exams are among practices that privilege students with money, time flexibility, and testing skills, for example.

Faculty attitudes also can get in the way of inclusion. This often is manifest in principles of “rigor” intended to elevate worthy over unworthy students. Such attitudes create a scarcity mentality toward success rather than one that makes high achievement possible for all students. Decades of educational research have shown the deleterious effects of such practices in conflating grades with knowledge acquisition. The grade pressure that frequently drives “rigor” has been shown to affect some students more than others, while creating an atmosphere of anxiety and an emphasis on types of learning that can be easily tested. Not only does this create learning inequities, but it also tends to discourage collaboration, questioning, and diverse opinion. Continue reading “Inclusive Pedagogy”

When School is a Factory

For 20 years, I have been teaching large arts and humanities general education courses at the University of California, Irvine. These 400-student classes are part of the undergraduate “breadth requirements” common in most colleges and universities, and hence draw enrollments from across the academic disciplines. At UC Irvine, this means that most of the class comprises science, technology, engineering, and math (STEM) majors. Aside from an orientation to more practical fields, I’ve noticed a clear shift in student attitudes in recent years –– a heightened preoccupation with grades and rankings, combined with growing anxieties about future earnings. Many of my colleagues see this as well, often disparaging students more concerned with GPA metrics than learning itself, while increasingly behaving more like consumers of educational commodities. I take a more sanguine view.

Bear in mind that many of today’s college students grew up during the Great Recession, when families of all incomes had money worries. With scant knowledge of a world before 9/11, it’s little wonder that polls show millennials expecting lower earnings than their parents, seeing the United States on a downward spiral, and viewing the two-party system as fatally flawed.[i] Rising income inequality doesn’t help matters, especially at UC Irvine, where 6 in 10 students get financial aid and half are the first in their families to earn a college degree.[ii] Because of this, Irvine has been cited by the New York Times as the country’s leading “upward mobility engine” –– making the campus a national model of what public higher education can do.[iii] But it’s still not a cake-walk for degree seekers. As at most public universities in America, the majority of Irvine’s full-time students also work at jobs to make ends meet.[iv] Continue reading “When School is a Factory”

Creativity During Crisis: Do We have What it Takes?

History has shown that crisis brings out creativity, as people find themselves facing unexpected challenges and innovating out of necessity. Countless innovations and scientific breakthroughs have come from disasters and wars – penicillin, jet engines, and the internet to name but a few. Can we muster this same energy in these stay-at-home days of coronavirus?

I’ve been getting this question from University of California undergraduates in a creativity class I teach (now online, of course). Many are looking to creativity to feel better, but say they don’t know how. I reply that we are all acting creatively in countless ways, but rarely recognize our actions as such.

There is nothing unusual in this creative disconnect. Recent surveys show 80% of the American public seeing creativity as essential to their lives and work, while 70% think they just don’t have it in them. Much of this comes from misguided views about creativity, often rooted in stereotypes about natural talent, inherent genius, and artistic originality.

The problem has structural roots. For years researchers have reported a “creativity crisis” in American schools and business, generally attributed to risk-avoidance born of economic anxiety. Students are obsessed with grades and future earnings, while companies stick with what already works. This results in a climate of lingering anxiety, only amplified by the fear-driven formulas of much news and entertainment.

With nowhere to turn, a stressed-out America runs to the now-booming self-help industry, which promises salvation in finding one’s “inner artist” or regaining the “magic” of childhood. The coronavirus epidemic has pushed this trend to new heights, as consumers search for answers from external sources.

The result is a growing panic as people scramble to find, build, or otherwise maximize their creative profiles – often blaming themselves when they fail. And of course failure is inevitable, since recognized forms of creative success place it out of reach for ordinary citizens. An entrenched culture of media celebrity props up this view.

It’s time to view creativity as the universal quality it really is.  All of us have it, just as we have intelligence.  But as the failures of I.Q. testing long ago revealed, the problem lies in valuing only certain types of ability. This not only leaves out anyone who isn’t an “artist,” but it’s often loaded with biases against those lacking the time or resources to gain conventionally recognized skills.

Especially in this moment of crisis, we need to embrace the “everyday creativity” in the typical things we do in solving simple problems, improvising around the house, or making a meal from leftovers. The online popularity of DIY mask-making is a great example of this, although, once again, few see this activity as a “creative” pursuit. Ditto for postings on Facebook, Instagram, or Twitter –– all so ubiquitous that their creativity seems inconsequential.

This dismissal of everyday creativity also comes in part from beliefs that “creative” always means something “new” or “original.”  The truth is that most famous creatives borrowed from the past or built on the work of others. Michelangelo copied classical sculpture, much as Shakespeare did with ancient myths. And no, Steve Jobs did not invent the iPhone by himself.

Cognitive science says that mimicry is an essential part of human learning and communication. Researchers explain that most creativity is “combinational” in that we put together already known or found elements to create something else (like decorating a room). This can lead to “exploratory” creativity when combinations produce unpredicted outcomes (making up a new recipe).  Creativity never comes from a vacuum.

Generosity is another important part of creativity. This is because nearly all creative acts are done with someone or some recipient in mind. Psychology long has recognized that doing for others returns self-esteem to the giver. In everyday creativity this comes from the satisfaction of being appreciated, needed, or simply connected to others.

One joyful example of such social creativity is the global “clap-for-carers” phenomenon happening daily all around the world. Usually at 7 or 8 pm, homebound people go to their windows and doorways, and start clapping, hooting, or banging pans to recognize health care workers who are risking themselves to save lives.

It’s a spontaneous creativity much like that seen at rock concerts, now transformed into a collective affirmation. The beauty of these ordinary forms of creativity lies in their availability to everyone. We all can be creative if we realize that we indeed have it in us, use it all the time, and need to give ourselves credit.

Big Data vs Artists and Everyone Else

By David Trend:

Heard about Generation Z?  The demographic growing up in the 2000s? It’s a bigger group than Boomers or Millennials –– and it has one further distinction. “Members of Generation Z are ‘digital natives’ who cannot remember what it was like not to have access to the Internet –– no matter when, no matter what, no matter where,” according to Forbes Magazine. This is a group raised on networked “connecting” with others, sharing, and buying things. It’s second nature to Gen-Zers to upload their favorite music on YouTube, post images on Facebook, and sell things on Etsy or eBay. Much is being made in creative economy talk of how networks now blur traditional producer/consumer roles, manifest in the new figure of the “prosumer.” In Wikinomics: How Mass Collaboration Changes Everything, authors Don Tapscott and Anthony D. Williams effused over the democratization inherent in the new “Openness, Peering, Sharing and Acting Globally.”  Of course, there is nothing really new about home-made items, crafts, and people’s willingness to share. What’s different today is the ability to copy digitized materials and circulate them via electronic networks. Digitization also has made Generation Z the first demographic to be completely tracked by “big data” analytics.

Some creativity industry experts argue that this is nothing short of a revolution, driven by ongoing change more than any clear future. Evolutionary economist Jason Potts and collaborators have proposed what they term “Social Network Markets,” unlike the top-down models of industrial capitalism.  Characterized by fluidity and exchange through complex fields of actors, the new social network markets are less governed by competition and profit than by communication and preference. Participants are “Not ‘buying’ the property, but buying into the social space.”  Moreover, the dynamics of these new markets are highly interactive. As the Potts group put it, “a social network is defined as a connected group of individual agents who make production and consumption decisions based on the actions (signals) of other agents on the social network: a definition that gives primacy to communicative actions rather than connectivity alone.”  Almost by definition, this process rules out conventional manufacturing or professional services. Instead, the networks generate value through the production and consumption of network-valorized choices.

The beauty is that much of what is online now is free––seeming to arrive just in time in a tight economy. While a lot of the “free” stuff available online is user-generated (selfies, birthday announcements, anecdotal postings, etc.), a huge volume of material comes from other sources (news outlets, filmmakers, commercial music producers, artists). On the surface it looks like old Marxist doctrines are being reversed as items seem to be “decommodified” in the sharing economy. This idea has become an anthem of resistance in some circles. The Burning Man Festival, to take one example, has stated: “When we commodify we seek to make others, and ourselves, more like things, and less like human beings.  ‘Decommodification,’ then, is to reverse this process.  To make the world and the people in it more unique, more priceless, more human.”  This may be all well-and-good in the real-life sharing of food and weed at Burning Man. But when things get virtual, it’s usually a large corporation that owns the websites, servers, and networks that make sharing possible. Continue reading “Big Data vs Artists and Everyone Else”

Belonging Where?

By David Trend:

Throughout its existence the United States has shown a strange tendency to turn against itself, dividing citizens against each other with a vehemence rivaling the most brutal regimes on earth. Some have rationalized the resulting crisis of “belonging” in America as an understandable consequence of cultural diversity, economic stress, and global threat. After all, haven’t there always been “insiders” and “outsiders” in every culture? Aren’t competition and aggression wired into human nature?  Or is there something peculiar about the personality of the U.S.?  Could it be that prejudice is the real legacy of “American Exceptionalism,” rooted in traditions dating to the genocide of indigenous populations, the subjugation of women, the rise of slavery, the scapegoating of immigrants, and more recent assaults on the poor or anyone falling outside the realm of normalcy?

I discussed selected aspects of America’s divisive pathology in my book A Culture Divided: America’s Struggle for Unity, which was written in the closing years of the George W. Bush presidency.  Like many at the time, I had completely given up on the idea of “common ground” amid the residue of post-9/11 reactionary fervor and emerging economic recession. Media commentators were buzzing constantly about red/blue state polarization.  Opinions varied about the cause of the divide, attributing it to factors including regionalism, media sensationalism, partisan antipathy, or all of these combined. Also joining the fray were those asserting the divide was fabricated, with evenly divided elections showing most people in the middle of the curve on most issues.  My somewhat contrarian view was that the “problem” shouldn’t be regarded as a problem at all. After all, America always had been divided––through war and peace, boom and bust. Division was the country’s national brand.  But as a book about politics, A Culture Divided didn’t get to the roots or the lived experience of America’s compulsive divisiveness.

Speaking at the 50th anniversary of the Selma to Montgomery marches, President Barack Obama described America as an incomplete project––a nation caught between ideals of a perfect union and the lingering realities of their failure. While citing advances in civil liberties since the bloody apex of the Voting Rights Movement, Obama also spoke of a federal report issued just days earlier documenting structural racism and misbehavior toward African Americans by police in Ferguson, MO, where months before law enforcement officers had killed an unarmed black teenager. “We know the march is not yet over.  We know the race is not yet won,” the President stated, adding, “We know that reaching that blessed destination requires admitting as much, facing up to the truth.” Continue reading “Belonging Where?”