Welcome to Cyberschool

While technology has always played a role in education, it went into hyperdrive with the pandemic-driven move to online learning. Even before that shift, economic pressures and growing student numbers were causing a panic in education, as schools struggled to trim budgets while “accountability” scrutinized everyone. Under these conditions, some of the changes that followed presented an upside. Most dramatically, the shift to doing schoolwork at home eliminated shortfalls in classroom space and, at least temporarily, student housing as well. As the pandemic continued, the share of higher education offered online jumped from 10 percent in 2019 to 33 percent a few years later.[i] But as everyone now knows, so-called “distance learning” isn’t for everyone and doesn’t work for all kinds of material. Research shows that the one-size-fits-all character of mechanical course delivery disadvantages students of many kinds.

Online schooling isn’t as new as you might think. The idea of distance learning dates to the vocational and self-improvement correspondence courses of the eighteenth century, which arose with improvements in mail delivery systems. Often cited as an early example is a shorthand course offered by Caleb Phillips, advertised in a 1721 edition of the Boston Gazette with claims that “students may by having several lessons sent weekly to them, be as perfectly instructed as those that live in Boston.”[ii] By the 1800s all manner of vocational skills were being taught by mail, as well as hobbies like drawing and painting. The University of London became the first college to offer distance learning degrees in 1858. By the end of the century, learning by mail had become big business for institutions like the Pennsylvania-based International Correspondence Schools (ICS). Between 1895 and 1905, ICS enrollment grew from 72,000 to 900,000 students signing up to learn technical and management skills.[iii] Much of this growth was due to the innovation of sending entire textbooks rather than single lessons, along with promotion by a large in-person sales team.

Things really took off when media and telecommunications entered the picture. In the 1920s universities like Penn State began offering courses by radio, with other institutions following in the coming decades. Efforts to use “new” media technology intensified in the years immediately following World War II, when millions of U.S. military personnel returned home, many of whom had their education disrupted by the war. Slides, sound recordings, and films became part of adult education, and before long found their way into the K-12 classroom, which was experiencing similar growing pains owing to the influx of baby boom youngsters. When television came along in the late 1950s, it was quickly embraced as a labor-saving device. Proponents of closed-circuit video in K-12 schools saw it as a way of allowing a single teacher to instruct multiple classrooms of students, synchronously or asynchronously. With the creation of the U.S. Public Broadcasting Service (PBS) in 1969, local affiliate stations and universities began offering telecourses, whose popularity grew rapidly in the 1970s and 1980s.

Many of the same arguments for and against televised learning remain with us in today’s era of online learning. Alternatives to in-person instruction can democratize education by making courses available to students with competing demands on their time or for whom travel to a campus may be difficult. Asynchronous learning also allows students to learn at their own pace, studying lessons in small segments or repeating portions according to individualized needs. The biggest counterargument to distance learning is that it lacks the engagement found in in-person classes.

Computers didn’t become a serious part of K-12 schooling until the 1990s, with the broad-based availability of workstations for home use, the development of network technology, and the popular advocacy of computers in education by public figures such as Al Gore and Bill Gates. Like cable television, the internet was touted as a means of bringing the outside world into the classroom while connecting students to online resources. In its initial implementation, school computerization was also regarded as a means of leveling the cultural differences among students, much as “visual literacy” had been promoted before it. Such attitudes fit well within the progressive belief that digital media could deliver a world of greater equity and freedom.

These changes began affecting higher education in the 2000s, as the millennial “echo boom” came of age and pressure mounted to accommodate rising numbers of students. A report entitled “The Virtual University,” compiled by senior administrators and businesspeople, predicted a restructuring of post-secondary education in the first decade of the 21st century, with brick-and-mortar classrooms giving way to online learning.[iv] The highly influential report described “a global electronic campus, which students enter via a computer and thereby telecommute from home, a dormitory room, the workplace, or a community center,” as well as “the continuing education and training provided by employers and community organizations.”[v]

Echoing familiar refrains about a rapidly changing technological work world, the report noted the need for employees to retrain for “six to seven different careers in the course of a life time” and predicted that within five years 75 percent of all workers would need retraining. Computer skills, then used by 65 percent of workers, were soon used by 95 percent of employees in the first decades of the new millennium. Citing a survey by the National Home Business Association, the report said that work from a home office would rise to 50 percent of the working population, a statistic that now seems quaint in light of the massive disruptions brought about by COVID and the lasting changes that have made telecommuting so commonplace.

Compounding the need for technology would be the exponentially growing volume of information itself, with the corpus of human knowledge estimated to be doubling every seven years. With remarkable prescience, the document stated that “Students will increasingly select curricula that enhance their chances of both initial and sustained employment.”[vi] This pressure would force traditional colleges and universities to change their offerings and modes of curricular delivery as they sought “a competitive edge in a student based—or consumer driven—market.”[vii] Moreover, since it proved difficult for bureaucracy-laden universities to change quickly, much of this new education was rolled out by private for-profit enterprises, as discussed above.

These were only the most obvious examples of a much broader phenomenon taking place inside almost all colleges and universities, as campus after campus began to slowly expand vocational/technical curricula and reduce academic/humanistic offerings. This reapportioning of instructional content was accompanied by incentives to technologize teaching, or in some cases requirements to do so. Initially this meant simply posting a syllabus on the internet or being available to students via e-mail.  Nowadays most institutions have shifted to integrated Learning Management Systems (LMS) like Blackboard and Canvas. Beyond this, faculty are encouraged to translate course materials into digital formats or courseware accessible on demand or from remote locations.

Higher education continues to undergo a change that began two decades ago, as the university is transformed from a relatively autonomous non-profit educational provider into a significant site of corporately regulated capital accumulation. In a two-stage process that began with the commodification of research, the university is now increasingly reliant upon and influenced by the mandates of commercial product development. This shift began with the decline of heavy industry, as business and political leaders came to recognize the potential in value-added goods, service industries, and “knowledge-based” products. The recognition of “intellectual capital” as a potentially lucrative form of currency both enabled and benefited from the information technology boom of recent decades.

Add to this the sense of competition felt at every level of the academic environment, with students, professors, and administrators all pressured to achieve scores, reach benchmarks, and continually improve. By definition such an environment is premised on stratification, with winners requiring losers for comparison. Then factor in the role of structural inequities in reproducing advantage and disadvantage up and down the academic food chain. Finally, lest anyone think otherwise, consider the role of new institutional initiatives, educational technologies, and instructional formats in making inequities worse in the name of progress and efficiency.

[i] Melanie Hanson, “Online Education Statistics,” educationdata.org (Jul. 10, 2021)  https://educationdata.org/online-education-statistics (accessed Jul. 26, 2022).

[ii] Livia Gershon, “Three Centuries of Distance Learning,” JSTOR Daily (Apr. 13, 2020) https://daily.jstor.org/three-centuries-of-distance-learning/ (accessed Jul. 14, 2021).

[iii] “The Evolution of Distance Learning,” Florida National University (Aug. 15, 2019) https://www.fnu.edu/evolution-distance-learning/ (accessed Feb. 17, 2022).

[iv] Carol Twigg and Diana Oblinger, The Virtual University (Report from a Joint Educom/IBM Roundtable, Washington, DC, Nov. 5-6, 1996).

[v] The Virtual University, p. 1.

[vi] The Virtual University, p. 5.

[vii] The Virtual University, p. 6.
