A Speculative Endeavor

Education has become an investment. But what are its returns?

By Eleni Schirmer

The Speculator, by Francis William Edmonds, 1852. Smithsonian American Art Museum, gift of Ruth C. and Kevin McCann in memory of Dwight David Eisenhower, 1976.

Higher education in the United States is a speculative endeavor. It offers a means of inching toward something that does not quite exist but that we very badly want to realize—enlightenment, higher wages, national security. For individuals, it provides the lure of upward mobility, an illusion of escape from the lowest rungs of the labor market. For the federal government, it has charted a kind of statecraft, outlining its core commitments to military strength and economic growth, all the while absolving the state of the responsibility for ensuring that all its subjects have dignified means to live. We are told the path to decent wages and social respect must route through college.

The metric of higher education is credit; it runs on belief in future value amid present uncertainty. This has readily lent itself to the industry’s financialization, the elaborate ways of using money to make more money rather than to produce goods and services. Today financialized systems of higher education mean that colleges and universities operate as investors or borrowers or both. University revenues increasingly come from financial activities, such as profits from endowment investments, real estate acquisitions, or leveraging student tuition. Simultaneously, financial costs—such as debt, interest, and fees—command a growing portion of university expenditures. Many universities allocate as much money to debt service as they do to entire departments; in 2020–21 the University of Wisconsin–Madison’s debt-service budget was ten times the size of its student financial-aid budget. Similarly, students become either borrowers, taking on tens of thousands of dollars in debt to fund their tuition, or investors, betting that the education they are accruing will pay off in future earnings. For institutions and individuals alike, financialization compresses education into a means of accumulation.

Rising tuition and reduced public investment have created an ever-worsening student-debt crisis in the United States today, but students and families have struggled to pay for college since the nation’s founding. Federal support for higher education has primarily taken the form of tuition subsidies for students and their families rather than direct funding of colleges themselves, which would render tuition obsolete. Before the Civil War, colleges operated predominantly as cloistered institutions where the elite could learn Latin and Greek and be trained as ministers. Often privately financed by wealthy donors, many of the largest and most prestigious schools, including Yale and Georgetown, were funded by profits from slavery and colonial exploits. With little public support, these universities were generally expensive to attend—in 1860 the cost of attending a college in the East represented a skilled worker’s entire annual income—and as a result were open mostly to the moneyed, or at least those who could sacrifice years of lost wages in pursuit of a degree. The 1862 Morrill Act, which created the country’s land-grant universities, marked the federal government’s first intervention and investment in the nation’s higher education system. The legislation relied on the expropriation of nearly eleven million acres of Native lands, which were transferred to state governments. In exchange for the Native land titles, the U.S. government promised Indigenous nations all of $400,000. In reality, it paid far less and in many cases nothing at all. Today those tracts are estimated to be worth $500 million. Over time the plundered lands became a special kind of credit factory: universities.

The highest result of education is tolerance.

—Helen Keller, 1903

Throughout the nineteenth century, these newly founded universities heralded the ascendancy of science and industry, twinkling with a Baconian spirit of “useful knowledge.” Lands would be surveyed, ships would be built, and corn would rise taller. The academic institutions were also framed as a matter of national security. In the aftermath of the Civil War, universities were envisioned as a substitute for food systems previously run on Southern plantations with slave labor. Northern abolitionists, such as Republican representative Justin Morrill of Vermont, who sponsored the land-grant legislation, saw these universities’ focus on agricultural science as a means of ensuring that the United States could sell crops even after the end of slavery. In 1890 a second Morrill Act established institutions specifically for African Americans.

But what likely cemented the role of the university as a mainstay of American statecraft was its mirage of democracy. Universities inspired not just visions of better soil tilling but also the means of upward mobility for even the most calloused hands. The sons (and some daughters) of white farmers could go to college and grab handfuls of the golden knowledge it offered. Between 1890 and 1940 the doors swung open and enrollment in higher education increased fivefold. The vision was perhaps more powerful than its reality: only a slim minority of Americans ever enrolled in college. In 1940 not even 10 percent of the nation’s eighteen-to-twenty-four-year-olds attended college.

World War II took the doors off their hinges thanks to the GI Bill, which rewarded surviving soldiers with access to free higher education. College, along with federally subsidized mortgages for veterans not locked out by anti-Black redlining, was the government’s program to absorb the sixteen million returning veterans. The GI Bill channeled mostly white men into college degrees that led to middle-class occupations suitable for servicing veterans’ mortgages and providing a breadwinner’s wage. Greasing a path to higher education had the added benefit of keeping disaffected veterans from jumping directly into a labor movement that was bursting with militancy. A 1947 Bureau of Labor Statistics report noted that the six months following World War II “marked the most concentrated period of labor-management strife” up to that date. Workers no longer tolerated the restrained wages justified by the war, and the end of wartime price controls meant that living costs spiked while wages flattened. Freed from the war’s no-strike pledges, 4.6 million workers went on strike in 1946, with the average strike lasting twenty-four days. Rather than backing workers’ demands for greater wage protections, the federal government passed the 1947 Taft-Hartley Act, which limited workers’ organizing power and protected management’s ability to thwart unions. Meanwhile, the GI Bill subsidized veterans’ tuition, charting their path to higher wages while keeping them a safe distance from erupting shop floors. Seventy percent of male college graduates in 1949 were veterans. The federal government paid up to $4,400 (more than $51,000 in 2022 dollars) for each soldier’s degree and living expenses. Funded through tuition reimbursements, the GI Bill solidified the model of the federal government providing financial assistance for students.

Over the next decade, as the Cold War progressed, the federal government stepped up its higher education investments. Rather than just sop up the aftermath of war, the university would build a labor force specifically designed for the wars to come. In 1958, a year after the Soviet Union launched Sputnik 1, President Dwight D. Eisenhower signed into law the National Defense Education Act, which comprehensively funded science and technology education. The policy was motivated by an overwhelming fear that America’s science classrooms were somehow inferior to those in the USSR. The NDEA also established the first federal student-loan program by directly providing low-interest loans aimed at students who enrolled in defense-related fields, such as science, mathematics, and foreign languages.

When economists in the early 1960s noticed that countries with higher educational attainment also had more robust economic growth, a new policy paradigm began to emerge. Heavily influenced by the thinking of the economist Theodore Schultz, mainstream politicians embraced the idea of education as a means of transforming humans into a resource very much like capital. Like little coins, properly invested humans could increase the country’s supply of money, patriotism, and perhaps even democracy. Schultz believed that cultivating human capital via education would ameliorate a hardening system of economic inequality. “Truly, the most distinctive feature of our economic system is the growth in human capital,” he wrote. “Without it there would be only hard manual work and poverty except for those who have income from property.” Education appeared to offer an almost magical way of improving the national economy, bolstering national security, and bettering poor people’s fates—all without having to overhaul the system of capitalism.

Scenes at the University with Images of the Ancient Sages (detail), Japan, seventeenth century. The Metropolitan Museum of Art, Mary Griggs Burke Collection, gift of the Mary and Jackson Burke Foundation, 2015.

Schultz’s model of human capital captured the attention of key liberal policy makers, including Clark Kerr, a labor economist who served as president of the University of California system from 1958 to 1967. Inspired by the Schultzian vision of higher education as a profitable investment, Kerr set to work ensuring that the university would serve as the key vehicle for building national economic power, akin to “what the railroads did for the second half of the last century and the automobile for the first half of this century.” He was especially keen to include women, people of color, and low-income students. This program for higher education powerfully consolidated several key prongs of the liberal accord: it pledged a commitment to domestic social inclusion while preparing the “knowledge workers” necessary for the country’s petrochemical, agrochemical, and defense industries.

In 1965 this vision of higher education became the law of the land under President Lyndon B. Johnson’s Higher Education Act. The HEA also created the basic architecture of today’s student-loan industry. Whereas the 1958 NDEA had authorized the U.S. Treasury to lend money to students, student loans under the HEA originated with private banks but were backed by the federal government. These loans were first available only to low-income borrowers, but eventually all students became eligible. This significant expansion of the student-loan business proved to be a very profitable arrangement for private banks, which took all the profits but absorbed none of the risks.

In addition to shifting student loans into the hands of private banks, the HEA significantly increased federal spending for higher education. It distributed federal aid to historically Black colleges and universities, opened bridge programs to encourage first-generation students to pursue college degrees, and paved the way for the Pell Grant program, which allocates federal funds to low-income college students. Though hardly perfect, this system of federally funded higher education enabled unprecedented numbers of students to enroll in higher education at no or very low cost.

 

In the context of social struggles, an education is more like a bonfire on a winter night than a table lamp in a hallway. It illuminates society’s haunted and austere structures, the precious life within. It draws people together. It does not ensure comfort. By the mid-1960s universities provided a critical gathering place for activists as the nation plunged into a deeply unpopular war in Vietnam, and legacies of racist and sexist inequalities unleashed new currents of social-movement organizing. Colleges and universities incubated the Black power movement, women’s liberation organizing, the free speech movement, and antiwar mobilization. Students were not content simply to learn about the structures of the world; they fought to transform them. Many of these movements pushed back against the “knowledge society” that planned militaristic, capitalist endpoints for education. If education had become the means to build a new society, it could also be the means to dismantle a dysfunctional one.

It was precisely the liberatory capacity of education that quickly drove conservative policy makers and pundits to oppose publicly funded, free higher education. The radical potential of public higher education had proved dangerous because it could be harnessed to social movements capable of challenging the systems of oppression that support the status quo. To fulfill the promise of education as human capital, education had to be chained to its economic role and divorced from its revolutionary possibilities. A key leader of this effort was economist Milton Friedman. In response to protests by “intolerant radicals” at the University of Chicago, Friedman wrote in 1969, “We must do some drastic rethinking if we are to preserve the university as the home of reason, persuasion, and free discussion.”

For Friedman, that meant shifting the costs of education onto students. As a kind of capital, education counted as private property, and its cost should be shouldered by the individual beneficiary, not the public. He adamantly opposed state investment in collective education, instead favoring a voucher-style system for individuals, similar to how the GI Bill provided grants to veterans rather than eliminating tuition. Friedman’s design for higher education had in fact undergirded the 1958 Treasury-funded student loans. But both the GI Bill and the NDEA were time-bound, short-term programs; Friedman sought a long-lasting policy framework to enshrine his ideology. He found no better executor of his vision than Ronald Reagan.

Almost immediately upon becoming governor of California, Reagan campaigned to impose tuition and fees on the state’s previously free university system, a thinly veiled punishment for student activism. Guided by Friedman’s free-market ideology, Reagan recast free tuition, up until then a beloved public benefit, as an unfair public entitlement that forced taxpayers to fund the reading habits of communists, gays, and bra-burning peaceniks. Higher education, he and Friedman believed, was like all things American: it should come from hard work, sweat, and credit. Over the next two decades, as Reagan moved from Sacramento to the White House, he continued his assault on free higher education. It is thanks to his administration that it is now widely assumed that college should be funded through tuition, and tuition financed through loans, reversing decades of state-funded and either tuition-free or very low-cost public higher education. While Democratic and Republican legislators disagreed on the mechanics of the student-loan market, both parties agreed on the basic premise: the ticket to higher education would be bought through borrowed money rather than public funding.

Girls jumping rope at the Chuxi village primary school, Fujian, 2014. Photograph by Michael Yamashita. © Michael Yamashita / GEO Image Collection / Art Resource, NY.

Through the 1980s and 1990s, Congress’ major interventions in higher education loosened rules of credit. Legislators increased borrowing limits, relaxed regulations, and authorized new high-risk financial instruments for borrowers who had exhausted all other options. Parents could now take on virtually unlimited debt on behalf of their children, for example. An explosion of growth in private lenders followed the creation of student loan asset-backed securities, new financial instruments that let lenders offload long-term risk by transforming student loans into tradable securities, yielding huge gains for investors who bought student debt and resold it at a profit. At the same time, Congress rolled back basic consumer protections for student borrowers, making refinancing, deferment, and cancellation via bankruptcy (or even death) nearly impossible, while authorizing lenders to garnish wages, tax refunds, disability payments, and Social Security checks.

It was only a matter of time until lines of credit morphed into chains of debt. Today student debt hovers at $1.7 trillion, saddling 45 million borrowers with tens of thousands of dollars of debt. Financing higher education this way is a poverty tax: those who have the least access to wealth end up with the highest costs. The longer a borrower takes to pay back his or her loan, the more he or she pays in interest, raising the price tag of the degree. Regressive wage disparities in the labor market further exacerbate these inequalities. On average, a Black man makes 22 percent less per hour—and a Black woman 34 percent less—than a white man. As a result, Black borrowers end up paying more for the same degree than white students do. Women also carry higher debt loads than men, with Black women the most burdened by student debt. For all its mystique, a vision of higher education based on human capital does not transform society so much as it intensifies existing social dynamics: the elite climb higher; the poor get poorer. The notion of education as human capital is no longer a poetic metaphor but a terrifying reality: students have become capital, their debts purchased, reinvested, and gambled upon. After years of diligently making payments, many borrowers find their balance far greater than the amount they initially borrowed, thanks to compound interest. Rather than provide the path out of poverty promised by higher education, student loans trap borrowers in cycles of poverty.

Rising student debt has educational and societal consequences. Students choose majors based on the careers that will service their debt, learning how to engineer drones or navigate the jargon of business management. For the past twenty years, business has been the most popular field of study in the country, accounting for almost 20 percent of college majors. While the nation has faced an acute K–12 teacher shortage for almost a decade, less than 5 percent of all college graduates have majored in education since 2015—an alarming reversal from 1970, when 20 percent of college graduates studied education. A student now needs a job that will pay back debt rather than fulfill social-welfare needs. The United States desperately needs rural dentists, public-interest lawyers, social workers, and nurses, but the comparatively low wages and high degree costs of these professions drive people away from pursuing such work.

It isn’t just students who find themselves in debt as a result of these policies. For the past five decades, tax cuts have drained state coffers, forcing public institutions to cut budgets. These institutions have taken on increasing amounts of debt to keep their doors open. From 2009 to 2019, the institutional debt of U.S. colleges and universities rose 71.1 percent, amounting to more than $336 billion. Rising institutional debt has two main effects. First, a growing share of an institution’s revenue must be set aside to pay interest and fees before anything else; before workers get paid or buildings get renovated, funds are reserved to pay back creditors, often years in advance. Second, credit ratings become institutions’ lifelines, since a low rating makes borrowing money more difficult and expensive. Schools with large endowments maintain high ratings by keeping millions in reserves, but dipping into these accounts to pay for operations could damage their scores. This partially explains why, amid a global pandemic, universities furloughed workers and continued to charge full tuition rather than tap their endowments, which have earned record profits. Credit-rating companies scrutinize a university’s power to enact unilateral fiscal actions—collective-bargaining agreements and shared governance are seen not as democratic mechanisms vital to the institution’s health but as impositions on its ability to shut down programs, cut wages, or raise tuition. Universities invest in marketing departments, flashy student centers, and winning athletic teams to secure brand loyalty, paying customers, and revenue streams. If these tactics happen also to yield scholars, democrats, or simply future workers, that’s fine.

One must love people a good deal whom one takes pains to convince or instruct.

—Mary de la Riviere Manley, 1720

Education is not alone. Financing public goods through debt reflects a transformation of the political economy over the past five decades that the economic sociologist Wolfgang Streeck has termed the transition from a tax state to a debt state. Social services previously provided through progressive taxation, such as free tuition, are now available only by taking on debt. Now you must use debt to finance not only higher education but also health care, housing, even your own imprisonment, as carceral systems increasingly rely on court fees and fines. The threadbare welfare state runs on widespread credit rather than on taxes levied on corporations and top earners.

The resulting austerity doesn’t just mean fewer and worse public services and widening inequality as financiers increase their wealth while the vast majority goes into debt; it also diminishes democracy. Funding priorities, such as what public services a society needs, are determined by creditors; elected officials and voters are relegated to secondary actors at best. As austerity subsumes the nation, debt becomes its heartbeat.

 

What does it mean to have a nation under debt? Austerity leads not just to unequally distributed money but also to unevenly distributed time. Working people dash from job to job, scrambling to pay one damn bill after the next. When I ask my students what they would do differently in college if there were no such thing as debt, virtually all of them tell me they would focus more on their studies and work fewer jobs. A few would change majors and take more art classes.

Debt makes us rush, but it also forces us to wait—for a relief check to finally arrive, for the White House to decide whether our student loans remain in deferment—or to stay in a job or a relationship long past its expiration date. Making people wait, the sociologist Pierre Bourdieu observed, is an integral part of domination. My students enroll in college and take on debt in order to sell their future time at higher rates. They may realize their debt will be best serviced by managing a fast-food restaurant instead of working as an elementary school teacher.

I think about all this as I wait for the bus on a busy street in a mixed-income neighborhood, staring at an advertisement for a local university. A picture of a smiling woman with a curly black Afro and impeccable teeth beckons: “Return to learning!” The image suggests that her education has led her to a decent hourly wage, existential satisfaction, and a dentist. How many semesters’ worth of tuition did this ad cost? I wonder absentmindedly. When the bus stops half a mile away in an upper-class neighborhood, an advertisement on the same street displays a gleaming luxury watch. A diamond-studded second hand twirls; a bank account somewhere swells. The working class wagers how best to sell its time. The wealthy sit back and let the earnings roll in. Time itself profits the investor.

Credit, writes the economic historian William Goetzmann, is a time machine. Its seemingly magical capabilities convert current resources into future value. The heart of all finance systems lies in a simple time proposition: let this money become something more later. Credit serves as both the fast-forward and the plus button, telling a story of what might come. Debt is the story of what it all meant in the end. Creditors leverage stories of the past to justify actions in the future—whose debts get wiped clean and whose remain on the books for generations. If credit enables stories to move forward, debt indexes them. Debt covenants, the legal contracts of debt, frequently stipulate the payment order; in times of scarcity, these covenants mandate creditors’ rights to the most valuable resources, putting liens on crops and intercepting state aid for public schools to service debts. Debt creates a fixed plot, carving a line between two points: now and the next payment. It is this seizure of hope that roused the anti-austerity Spanish indignados’ cry “No Future.” In a time of debt, can there be such a thing as a future?

Education, which consists of its own system of credits, is also a time machine. It is a means to carry us to unknown horizons. It can dial the world forward. But it also steers important questions backward. What are our debts to one another? To the enslaved people who labored to build and maintain the institutions of learning? To the Indigenous nations whose land was expropriated for universities’ founding? To the millions of acts of unpaid and underpaid care labor undergirding each hour of scholarship—the wife typing out her scientist husband’s manuscript, the nanny minding the children while the professor works late, the janitor sweeping the hallways into the night? What would the civilization that paid back these debts look like? What kind of universities would be guided by these questions? I remember the look on a student’s face after reading Silvia Federici’s 1974 essay “Wages Against Housework.” Aghast, her cheeks slightly flushed, she realized she had built a vision for her life around a career that would allow her time both to make a wage and provide for a family. But why was she expected to have a family in the first place? And, more to the point, why was the labor of caring for others uncompensated? Her classmates fell silent when she asked education’s most dangerous question: Could there be another way?

The heart of this inquiry is a time proposition: an appraisal of past harms, an investment in future repair. An education driven by these questions is a speculative endeavor; it conjures a possible world into existence. But it gestures to something beyond mere accumulation—whether of wages, war power, or return on investment. An education that journeys into these questions is not an easy one. The late bell hooks wrote in 1994’s Teaching to Transgress, “Sometimes the mountaintop is difficult to reach with all our resources, factual and confessional, so we are just there collectively grasping, feeling the limitations of knowledge, longing together, yearning for a way to reach that highest point. Even this yearning is a way to know.” An education may not provide answers to these questions, but it can offer a gathering point, a place to ground the struggles necessary to answer them.

The classic image associated with higher education is the ivory tower. In some ways this is a fitting icon. A tower is used to hoard stores, surveil subjects, monitor enemy activity. So, too, has the university been used to accumulate and privatize resources and to build warfare capacity. But that is not its only tradition. The word campus comes from the Latin for “field,” and perhaps that legacy offers an alternative icon. Like a field, an education is open and muddy and ambiguous, available for play as well as production. Its outcomes are uncertain, its future dependent upon the struggles and labors that root within it.
