In May 2012, Harvard University and the Massachusetts Institute of Technology announced a partnership to offer online courses, free to anyone anywhere in the world. There is a historical resonance in MIT's involvement in the MOOC (massive open online course) movement. MIT is a land-grant university, and the announcement came during the 150th anniversary of the Morrill Land Grant Act, which created the land-grant colleges. Arguably the greatest democratization of higher education in history, the Morrill Act stressed that higher education should be practical and that it should be accessible. This month historian David Staley looks back over the 150-year history of this experiment in state-funded, democratic higher education.
In December 2011, the Massachusetts Institute of Technology (MIT) announced that it was opening up some of its online courses to anyone in the world who wished to take them, an initiative called MITx. By May of 2012, MIT announced a partnership with Harvard University to offer free online courses jointly. Called edX, these initiatives are part of a larger movement called MOOCs, or "massive open online courses."
Ten years ago, MIT was one of the pioneers of the OpenCourseWare (OCW) movement, making course materials from every course taught at MIT freely available (today, materials from 2,100 courses are free to anyone); MITx and edX are the next logical step in this endeavor.
While those accessing MITx courses or materials through OCW cannot earn degrees or other credentials from MIT, the offerings are nevertheless highly sought after by students seeking access to high-quality technical courses such as "Introduction to Solid State Chemistry," "Introduction to Computer Science and Programming," "Quantitative Methods in Clinical & Public Health Research," and "Artificial Intelligence."
MOOCs and open courseware programs are designed in part to open up access to higher education.
It should come as no surprise that MIT would be a leader in making the content of its courses freely available, as MIT was one of the nation's first land-grant colleges.
The Great Democratic Experiment
In 1862, Congress passed the Morrill Act, the brainchild of Vermont Congressman and later Senator Justin Smith Morrill, which created the land-grant universities in the United States.
At its heart, the Act was based on two principles that have continued to influence the way in which we think about higher education in the United States: that it should be widely accessible (with states underwriting higher education so that it is affordable for many) and practical.
Morrill intended for these state-supported schools to educate agriculturalists and mechanics, because he understood that a rapidly industrializing nation would need people with practical skills. Although the U.S. economy has since industrialized and globalized, in many ways higher education in America is still defined by the goals of access and practicality.
The idea of greater access to college education was not unique to Morrill. Indeed, so-called "workers' colleges" such as the Agricultural College of Michigan and the People's College of New York predated the Morrill Act.
Other legislators had proposed bills similar to Morrill's, but he was the most persistent advocate for the expansion of higher education. In 1856, Morrill proposed his first resolution, which called on the Committee on Agriculture to create a Board of Agriculture and "one or more national agricultural schools upon the basis of the naval and military academies."
Morrill argued that the federal government had expanded commerce by investing in lighthouses and harbors, and that agriculture and agricultural education should similarly benefit from federal largesse, especially given what Morrill saw as the deteriorating condition of agriculture in the United States.
"Our country is growing debilitated, and we propagate the consumptive disease with all the energy of private enterprise and public patronage," he said. "Does not our general system of agriculture foreshadow ultimate decay? If so, is it beyond our constitutional power and duty to provide an incidental remedy?"
Improving the skills and knowledge of farmers was the best method to reverse this decline, in Morrill's estimation, and the federal government could play a critical role. To fund these public colleges, Morrill proposed land grants, which had been used successfully to fund the railroads.
A land grant meant that the federal government would set aside sizeable amounts of federal land—30,000 acres for each of a state's senators and representatives—which would then be sold, with the proceeds funding "the endowment, support, and maintenance of at least one college" in each state. In some cases these funds instead augmented the resources of an existing college, as in the case of MIT.
Morrill met fierce opposition to his proposals, not so much for his educational goals as for his method of funding them. While land grants had been used by the federal government before, opponents nevertheless saw land grants for education as an example of federal overreach. Many Southern states in particular rejected the idea; Clement Clay of Alabama called Morrill's efforts "one of the most monstrous, iniquitous and dangerous measures which have ever been submitted to Congress."
It should come as no surprise, then, that after several unsuccessful attempts at passing his education bills, the Morrill Act was passed only in 1862, during the Civil War, after Southern legislators had left Washington.
Morrill proposed several versions of his bill between 1856 and final passage in 1862. In each version, he proposed that the federal government's role in higher education lay in creating colleges that were both practical and accessible. Their curricula would teach agriculture, engineering, military tactics, and the sciences, as well as the classical curriculum that had heretofore defined higher education in America.
Forty-eight colleges were formed as a result of the first Morrill Act. A second Morrill Act passed in 1890 (Morrill was one of the nation's longest-serving senators), after a dozen attempts between 1872 and 1890 to expand the original act.
The second Morrill Act was aimed at the reconstituted Southern states, and included provisions that eventually led to the funding of seventeen "historically black colleges and universities" (HBCUs).
The Hatch Act of 1887 (named for Congressman William Hatch of Missouri) set aside funds to establish "agricultural experiment stations," which laid the foundation of the agricultural extension movement that eventually became central to these land-grant colleges.
Access and Practicality 2.0: The GI Bill
While the two Morrill Acts opened up access to more people who wished to attend university, the increase in attendance was gradual and halting. College enrollments skyrocketed only after the Second World War, a process facilitated by acts of Congress.
The Servicemen's Readjustment Act of 1944 (popularly known as the GI Bill) provided assistance to returning veterans of WWII to purchase homes and, importantly, to attend university. The GI Bill led to an explosion in college enrollments far larger than that seen after the two Morrill Acts.
At the time, few understood the impact the GI Bill would have on college enrollments. Optimists assumed that perhaps 10% of veterans would matriculate and that most would instead seek employment. By the fall of 1945, 8,000 GIs had enrolled in college; by 1946, that number had swelled to one million, and by 1950, to two million.
The Higher Education Act of 1965 extended GI Bill-style benefits to every American, via a series of loans and grants. In 1972, an amendment to the Higher Education Act created the Basic Educational Opportunity Grants (BEOG) program, which was soon renamed "Pell Grants" after Senator Claiborne Pell of Rhode Island.
Significantly, BEOGs were grants, not loans, and were awarded to students, not to institutions. Full-time students with financial need who maintained good grades could receive up to $1,250 (the equivalent of over $6,000 today) in yearly support to pay for college tuition.
One outcome was that colleges began to compete for such students. "The portability feature," writes historian of higher education John Thelin, "meant that hundreds of thousands of recipients now had the means to go to college and a choice as to which college to attend … Between 1972 and 1978 the Pell Grant program was popular with students and institutions, and it helped promote the appeal of 'going to college' to a new generation of students."
The combined effects of these post-war congressional acts were especially pronounced for America's community colleges.
While junior colleges first emerged in the early part of the 20th century, they exploded in the 1960s from around 400 colleges and 325,000 students in 1955 to 973 colleges and 3.4 million students. The Higher Education Facilities Act of 1963 provided funds necessary to build many of these community colleges.
All of these acts aimed to extend access to higher education, very much in keeping with the spirit of the original Morrill Act.
Practical Learning?
Morrill envisioned farmers and mechanics attending college and learning practical skills at the same time they were reading their Greek and Latin classics. Morrill, at least, seemed to wish to balance "higher learning"—learning for learning's sake at a higher intellectual, metaphysical, even spiritual plane—with what the economist Gary Becker in the 1960s would describe as the development of "human capital."
In a modern industrial and now global economy, the knowledge and skills of workers are among the most important capital assets. Therefore, investments need to be made in human capital to the same degree that we invest in other capital assets.
Colleges and universities have, especially since Morrill's time, fulfilled this role of providing students with advanced training in skills that have practical application to the world outside of the academy. That is, in addition to training scholars, American colleges and universities also train technically-skilled specialists.
Indeed, while there are certainly colleges and universities that specialize in one or the other (the local community college specializes in skills training, while the Institute for Advanced Study is the quintessential center for "higher learning"), many colleges and universities attempt to serve a dual purpose: championing the liberal arts and developing employable skills.
But given the choice, observe how many students today major in "practical" subjects rather than the liberal arts; since the 1980s, more students have received bachelor's degrees in business, for example, than in any other major.
Indeed, attendance in colleges and universities grew steadily throughout the 20th century in part because of the promise of a "better life" from a college degree. Frequently cited statistics reveal that those with college degrees earn more over a lifetime than those without college degrees. That we continue to talk of the value of a college degree in terms of skills preparation is in no small measure due to the vision of Justin Morrill.
The nature of the modern economy is such that many jobs require advanced training. Wired and National Public Radio's "Planet Money" recently teamed up to explore the next generation of "smart jobs," those that require more than a high school education but perhaps not as much as a bachelor's degree.
As Adam Davidson writes, "Smart jobs tend to scramble the line between blue-collar and white-collar. Their titles tend toward the white (technician, specialist, analyst), but the underlying industries often tend toward the blue, toward the making of physical stuff. Smart jobs can involve factories and machines, plastics and chemicals, but operating those instruments and manipulating those materials demands far more brains than brawn."
During the last presidential campaign, Republican hopeful Rick Santorum made headlines by suggesting that President Obama was an "elitist" for encouraging more people to attend and graduate from college. We should assume that Santorum was trying to score political points with his base by equating "college" with "liberal indoctrination," and that he did not actually believe that most people do not need college.
For this really would miss the point: even to become a car mechanic, as Santorum offered, still requires advanced training beyond a high school diploma. A high school shop class just will not do for an industry that is today as computer-mediated as any other. The question we face today is similar to the one faced by Morrill in the nineteenth century: who is responsible for training those mechanics?
Do-It-Yourself Education
A related question, again one that Morrill addressed, is who will pay for the training of those mechanics? Morrill clearly felt that states bear that responsibility, since developing the technical skills of the citizenry is good for the economy and society.
Since the 1980s, we have faced a very different attitude toward funding for human capital development. Morrill's social contract has frayed, and indeed some wish to dissolve it altogether.
Even before the Great Recession of 2008, many states cut funding for higher education.
Indeed, there is a common quip among some university presidents that, as their funding from state legislatures drops year after year, their universities have gone from "state-supported" to "state-assisted" to "state-located." The chancellor of the University of Wisconsin-Madison spoke seriously about "privatizing" in the face of such dwindling state support. In the current budget battles in Washington, Pell Grants have been targeted for potential cuts.
Tuition has risen as a result of state cuts. This change in funding priorities means that the costs of college increasingly fall upon individuals. That is, rather than the state supporting skills development via grants and other means to reduce tuition, individuals are asked to bear more of the costs of their own training, which means that many students must take out loans to pay for their education.
As tuition increases, the size of those loans increases. For many students, to pay off these loans requires subsequent employment in a job with a substantial enough salary to match their equally substantial loan payments. These students, as a result, choose majors that promise immediate and sufficiently high salaries.
This situation is contributing to what might be termed a "higher education bubble." The term is often used to refer to the ballooning volume of student loan debt and the potential for a wave of defaults on the order of—or perhaps even larger than—the mortgage bubble that precipitated the 2008 recession.
However, I mean "higher education bubble" in another sense: especially since 1945, access to higher education has been linked with upward social mobility and the promise of high-paying employment and career prospects.
But one of the effects of the Great Recession has been that many newly-minted graduates are unable to find employment, or are "underemployed," meaning they work in low-paying jobs that do not take advantage of their advanced degrees.
Hence the conditions for a higher education bubble: if tuition continues to increase such that students go into debt for tens of thousands, even hundreds of thousands, of dollars with no guarantees of employment to permit repayment, then more and more potential students may ask, "What is the value of a college degree?"
Imagine this scenario: there are three baristas at Starbucks, one who did not attend college and one who did and now carries large student loan payments. The third barista is trying to decide whether she should go to college. She looks at her friends and asks, "If we are all working here at Starbucks, and one friend is enduring a suffocating debt load while the other earns the same and has no loan payments, where is my incentive to go to college?"
Many students could decide to "opt out" of college, whatever else its benefits, and make the economic calculation to forgo education and debt. Since 1945, we have assumed ever-increasing numbers of young people attending college.
In the absence of funding opportunities, perhaps those numbers will start to decline, and the period from 1945 to 2008 will come to look like a great expansion of college attendance followed by a great contraction. Is it possible that we could be heading back to pre-1945 levels of college attendance?
Indeed, entrepreneur Peter Thiel is encouraging exactly that via his Thiel Fellowship, a grant awarded to 18-year-olds who forgo college. Specifically, fellows are given $100,000 to develop their ideas into businesses.
The fellowship website states: "Thiel Fellows are given a no-strings-attached grant of $100,000 to skip college and focus on their work, their research, and their self-education. They are mentored by our network of visionary thinkers, investors, scientists, and entrepreneurs, who provide guidance and business connections that can't be replicated in any classroom. Rather than just studying, you're doing."
These are young people with the aptitude for college, but Thiel is channeling their talents in another direction and toward a different model of life after high school. (But note that some of the Thiel recipients still aspire to college.)
Entrepreneurs like Thiel and others in the Do-It-Yourself (DIY) education movement seem to be encouraging a return to a pre-1862 model of human capital development that encourages young people to develop skills on their own. In such a model, there is no need for college or college credentials: your success is your credential.
Outside the Box: New Models for Higher Education
Other models have been proposed for funding higher education. Economists Milton Friedman and Gary Becker were among those advocating "human capital contracts."
Miguel Palacios Lleras offers the following scenario: "Imagine that a student who wants to attend college, but does not have the resources to do so, signs a contract with an investor in which he commits to pay 10 percent of his income for twenty years after graduation in exchange for $100,000 received today to pay for tuition and living expenses. This contract, which gives the investor a 10 percent stake in twenty years of the student's income, is an equity-like instrument."
The idea would be to replace the student loan with an investment-grade instrument that, in theory at least, would be more attractive to potential students than the prospect of an insurmountable debt load. (Presumably, a graduate who is employed as an engineer has the potential for a higher "payout" than someone who works as a social worker.)
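To make the arithmetic of such a contract concrete, here is a minimal sketch in Python comparing a 10 percent, twenty-year income share with a conventional ten-year loan on the same $100,000. Every figure in it (the starting salaries, the 3 percent annual raises, the 6.8 percent loan rate) is an illustrative assumption, not a term drawn from Palacios Lleras's scenario or from any real contract.

```python
# A rough, hypothetical comparison of an income-share ("human capital")
# contract with a conventional student loan. Every number here is an
# illustrative assumption, not data from the article.

def income_share_total(starting_salary, annual_raise, share=0.10, years=20):
    """Total paid under a contract claiming `share` of income for `years` years."""
    total, salary = 0.0, starting_salary
    for _ in range(years):
        total += share * salary
        salary *= 1 + annual_raise
    return total

def loan_total(principal, rate=0.068, years=10):
    """Total paid on a fully amortized loan with one fixed payment per year."""
    payment = principal * rate / (1 - (1 + rate) ** -years)
    return payment * years

if __name__ == "__main__":
    # Two hypothetical graduates, both funded with $100,000 today.
    for career, salary in [("engineer", 70_000), ("social worker", 35_000)]:
        paid = income_share_total(salary, annual_raise=0.03)
        print(f"{career}: repays about ${paid:,.0f} under a 10% income share")
    print(f"either graduate: repays about ${loan_total(100_000):,.0f} on a 6.8% loan")
```

The sketch only makes visible what the quoted scenario asserts: the investor's return rises or falls with the graduate's earnings, which is precisely what makes the instrument equity-like rather than debt-like.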
Critics deride such human capital contracts as a modern form of indentured servitude. It is also unclear how many college-age students would actually have access to these potential investors.
Colleges themselves might turn to something like a human capital contract instead of charging tuition.
Clarkson University recently unveiled a plan to provide free tuition to students who win a business competition (think Shark Tank for college students), with the university providing space, networking opportunities, and marketing. In exchange, Clarkson receives a 10 percent equity stake in any resulting business. In the absence of the state support Morrill advocated, colleges and their students are seeking new ways to fund their investments in higher education.
The investor in a human capital contract sees only a way to earn capital gains, and is presumably less interested in the value of the skills the student might develop. What if that investor were also concerned with skills development?
Businesses almost always benefit from an "educated workforce," hiring graduates of colleges and universities. Indeed, community colleges and career colleges often work closely with employers, establishing informal pipelines that directly link curricula with the needs of employers. The reasoning goes: tell us what skills you seek, we will train the students, and you will hire them.
What if companies became more directly responsible for higher education? That is, what if companies were in the business of directly training their own workforces rather than "outsourcing" this task to colleges, meaning a student would forgo college to be trained directly by the company?
Many companies already have their own "corporate universities" that provide more than just training, and are viewed as a way to develop talent within the organization.
The McDonald's corporate university, nicknamed "Hamburger U," recently announced that its program in business management will be accredited by Manchester Metropolitan University in the UK. In effect, one can now receive a business degree (the equivalent of an associate's degree) from McDonald's, a credential that would be highly valued even by those who do not plan to remain in foodservice. Imagine the attraction of a degree from McKinsey University or Apple U.
Higher Education in the 21st Century
We have assumed since the days of Justin Morrill that colleges simultaneously provide students with an education and with certification of that education. That is, universities are in the business of both training and credentialing students. Those two functions of the university are now in the process of being cleaved apart.
MITx and other such initiatives have separated the skills training from the credentialing. Students are free to watch an MITx lecture but do not receive a credential or certification from MIT.
(Although this is beginning to change: Antioch University recently announced that it would start offering credit to students who successfully complete course work through Coursera's menu of MOOCs, and the American Council on Education (ACE) is considering certifying some MOOCs for college credit.)
A model now emerging has students receiving a certificate for each class completed. Take enough such classes, accumulate enough certificates, or "badges," and an entire degree can be constructed. Or the course-level certificate alone might be sufficient: certificates granted for individual classes appear to have some standing in the marketplace. Indeed, a Microsoft certification holds as much value in the marketplace as some college degrees.
Students at Western Governors University (WGU), a nonprofit online university, can acquire skills through a variety of means: they can take classes through WGU, or can work by themselves in a library, or can claim expertise from prior learning.
WGU offers to credential students, to certify their learning, but that learning occurs at a pace and in a manner determined by the student, suggesting a new relationship between skills acquisition and credentialing. And as the Hamburger U example indicates, corporations themselves might enter the business of credentialing.
Accessibility and practicality will continue to shape the future of higher education in the United States.
There are several key unknowns about that future: who will fund/invest in human capital development? The state? Individuals? Corporations? How will college-level skills be certified? Will we need to certify skills? (Yes, clearly, some occupations will require such certifications: we want to know that a doctor is certified. But how that certification will work remains in question.)
Will colleges and universities continue to hold a monopoly on skills training and credentialing? Or will new models appear to replace the methods to which we have been accustomed for the last 150 years?
Justin Smith Morrill himself was not college-educated, and not because he did not desire it: he could not afford to attend college, instead moving into a very lucrative career in business. He was an autodidact, however, reading widely and voluminously.
Would a self-taught Justin Morrill have approved of Western Governors University, where his self-paced erudition and vast reading could have been certified and credentialed? I suspect that were he alive today, Justin Morrill would be devouring free online classes from MITx.
Mark Allen, ed., The Next Generation of Corporate Universities: Innovative Approaches for Developing People and Expanding Organizational Capabilities (Pfeiffer, 2007)
Robert B. Archibald and David H. Feldman, Why Does College Cost So Much? (Oxford University Press, 2010)
Richard Arum and Josipa Roksa, Academically Adrift: Limited Learning on College Campuses (University of Chicago Press, 2010)
J.M. Beach, Gateway to Opportunity? A History of the Community College in the United States (Stylus Publishing, 2011)
Gary S. Becker, Human Capital: A Theoretical and Empirical Analysis, with Special Reference to Education (National Bureau of Economic Research; distributed by Columbia University Press, 1964)
Coy F. Cross, Justin Smith Morrill: Father of the Land-Grant Colleges (Michigan State University Press, 1999)
Adam Davidson, "The Economic Rebound: It Isn't What You Think," Wired, June 2011
Digest of Education Statistics, "Degrees conferred by degree-granting institutions, by level of degree and sex of student: Selected years, 1869-70 through 2018-19," National Center for Education Statistics
Daniel Mark Fogel and Elizabeth Malson-Huddle, eds., Precipice or Crossroads? Where America's Great Public Universities Stand and Where They Are Going Midway through Their Second Century (SUNY Press, 2012)
Anya Kamenetz, DIY U: Edupunks, Edupreneurs, and the Coming Transformation of Higher Education (Chelsea Green Publishing, 2010)
Miguel Palacios Lleras, Investing in Human Capital: A Capital Markets Approach to Student Funding (Cambridge University Press, 2007)
Christopher P. Loss, Between Citizens and the State: The Politics of American Higher Education in the 20th Century (Princeton University Press, 2012)
Christopher Newfield, Unmaking the Public University: The Forty-year Assault on the Middle Class (Harvard University Press, 2008)
John Thelin, A History of American Higher Education, second edition (The Johns Hopkins University Press, 2011)
William G. Tierney and Guilbert C. Hentschke, New Players, Different Game: Understanding the Rise of For-Profit Colleges and Universities (The Johns Hopkins University Press, 2007)
Taylor Walsh, Unlocking the Gates: How and Why Leading Universities Are Opening Up Access to Their Courses (Princeton University Press, 2011)