A familiar scene plays out again and again in American public life in the 21st century. In the wake of a mass shooting such as the one in Parkland, FL, commentators, pundits, and politicians all gather around to talk about the country’s broken mental health system and suggest its connection to the violence.

Nikolas Cruz—the suspected gunman at Marjory Stoneman Douglas High School in Parkland, FL—being arrested (left). A graph depicting mass shooting deaths in the U.S. from 1982 to 2016 (right).

Their solutions, however, are few to none. Whether Nikolas Cruz’s mental illness was a factor in the shooting is still being investigated, but the ease with which we talk about a defective mental health system stands in stark contrast to the paucity of concrete solutions on offer.

This pattern raises the question of whether the American mental health care system is in fact broken. The metrics we have don’t paint an encouraging picture.

The U.S. Department of Health and Human Services reports that one in five Americans has experienced a mental health issue, and one in ten young people have suffered a major bout of depression.

A vigil for increasing mental health care at Cook County Jail in 2014 (photo credit: Sarah-Ji).

The effects of mental illness on quality of life and health outcomes are significant. Individuals with severe mental illness such as schizophrenia, major depressive disorder, or bipolar disorder (about four percent of the population) live on average 25 years less than other Americans. As many as a third of individuals with a serious diagnosis do not receive any consistent treatment.

The mentally ill are far more likely to be the victims of violent crime than the perpetrators. Only 3 to 5 percent of violent crimes can be tied in some way to a perpetrator’s mental illness, and people with mental illnesses are ten times more likely to be the victims of violence than the general public.

And while the relationship between mental illness and poverty is complicated, having a severe mental illness increases the likelihood of living in poverty. According to some estimates, a quarter of homeless Americans are seriously mentally ill.

Darren Rainey, who suffered from schizophrenia, died in 2012 from burns to over ninety percent of his body after prison guards locked him in a shower for two hours with 180°F water (left). A graph and chart showing the percentage of inmates with and without mental health problems in state prisons in 2006 (right).

Most troubling, perhaps, is the criminalization of mental illness in the United States. At least a fifth of all prisoners in the United States have a mental illness of some kind, and between 25 and 40 percent of mentally ill people will be incarcerated at some point in their lives.

A study by Human Rights Watch revealed that prison guards routinely abuse mentally ill prisoners. Darren Rainey, a mentally ill prisoner at the Dade Correctional Institution in Florida, was boiled to death in a shower after being locked in it for more than two hours by prison guards.

More mentally ill people are sent to prison in part because fewer mental health facilities are available. The disappearance of psychiatric hospitals and asylums is part of the long-term trend toward “deinstitutionalization.” But jails and prisons have taken their place. Today, the largest mental health facilities in the United States are the Cook County Jail, the Los Angeles County Jail, and Rikers Island.

The entrance to Cook County Jail in Chicago, IL (left). The Los Angeles County Jail in downtown Los Angeles, CA (center). An aerial view of the Rikers Island jail complex in New York City (right).

So how did we get to the point where mental illness is frequently untreated or criminalized?

Activists, advocates, and professionals like to pin the blame on Ronald Reagan, particularly the Omnibus Budget Reconciliation Act of 1981, which raised defense spending while slashing domestic programs. One of the cuts was to federal funding for community mental health centers (CMHCs).

President Ronald Reagan outlining his tax plan in a televised address from the Oval Office in 1981.

However, attributing the present state of the system solely to Reagan would ignore the prevailing patterns in mental health care that came before him. Three impulses have long shaped the American approach to mental health treatment.

One is an optimistic belief in quick fixes for mental illness that would obviate long-term care, ranging from psychotropic medications to eugenics. The second is a more pessimistic determination to make the system work as cheaply as possible, often by deferring the costs to somebody else and keeping them out of public view. Last is the assumption that people with mental illnesses are undeserving of charity, either because of supposed genetic defects or because their conditions should be curable and thus not require long-term care.

Indeed, mental health care occupies a paradoxical place in the history of social welfare in the United States, where aid is socially accepted only for the “deserving needy.” People with mental illnesses rarely fit this mold. At times, behaviors deemed socially aberrant were classified as mental illness (the American Psychiatric Association designated homosexuality a mental illness until 1973).

Gay rights activists Barbara Gittings and Frank Kameny and Dr. John E. Fryer, a gay psychiatrist in disguise, at a panel discussion at a 1972 American Psychiatric Association conference, the year before the association removed homosexuality as a mental disorder from its diagnostic manual.

The Birth of the Asylum and the Hospital

The nineteenth century saw the growth of something like an organized asylum system in the United States. Asylums themselves were nothing new; London’s Bethlem Royal Hospital, better known as Bedlam, was founded in 1247. In the United States, however, the creation of asylums took time, in part because their cost was deferred to state governments, which were leery of accepting the financial burden of these institutions. Consequently, local jails often housed mentally ill individuals where no local alternative was available.

An engraving of Bethlem Royal Hospital in London, England, around 1750.

Early in the 19th century, asylums housed mostly “acute” cases, patients whose symptoms had appeared suddenly and whom doctors hoped to cure. Patients deemed “chronic” sufferers were cared for in their home communities.

The so-called chronic patients encompassed a wide range of people: those suffering from the advanced stages of neurosyphilis, people with epilepsy, dementia, Alzheimer’s disease, and even alcoholism.

The number of elderly patients in need of assistance and treatment increased in tandem with increasing lifespans during the 19th century. As county institutions grew crowded, officials transferred as many patients as they could over to new, state-run institutions in order to lower their own financial burdens.

Oregon State Hospital for the Insane opened in 1883 and is one of the oldest continuously operated hospitals on the West Coast (top left). Oregon State Hospital was both the setting for the novel (1962) and the filming location (1975) of Ken Kesey's One Flew Over the Cuckoo’s Nest (top right). The patient population at Eastern Oregon State Hospital tripled in its first fifteen years (bottom left). Built to relieve the overcrowding at Oregon State Hospital, Eastern Oregon State Hospital in Pendleton itself quickly became overpopulated (bottom right).

Oregon State Hospital’s story is typical. It housed 412 patients in 1880 and nearly 1,200 by 1898; in 1913, the state opened a second hospital to house a patient population that had more than quadrupled since 1880.

Most other states confronted similar circumstances. Some built a host of smaller institutions in different counties while others concentrated their populations in a few large institutions. But the end result was the same: hospitals proliferated and grew bigger. New York’s inpatient population (which, to be sure, had outsized proportions) was 33,124 in 1915; by 1930, it was 47,775.

New York’s first state-run facility for the mentally ill, the Lunatic Asylum at Utica, opened in 1843 and adopted “moral treatment” methods.

As the institutionalized population mushroomed, treatment of the mentally ill evolved. Doctors throughout the 19th century placed their hopes in what they called “moral treatment”: rehabilitation through exposure to “normal” habits. In many cases, these habits included work. Most institutions were attached to farms, partly to provide food for the people living there, but also to provide “restorative” labor. Others had workshops.

There is, at best, mixed evidence on whether such treatments were effective, although supporters claimed high rates of recovery for patients treated in asylums. In any event, moral treatment was only ever intended for acute cases, so it fell out of fashion under pressure from the ever-multiplying population in hospitals.

Patients performed manual tasks like shoe-making at the Willard Asylum for the Insane in New York (left). Female patients engaged in agricultural labor at a mental health facility (right).

As patient demographics changed, hospitals increasingly served as custodial institutions. Doctors working with patients suffering from dementia or late-stage neurosyphilis could not expect those in their care to improve, and the role of medical professionals shifted from therapy to caretaking.

Prevention: Eugenics as a “Cure” for Mental Illness

Discontented with the idea of being mere caretakers, psychiatrists began to work toward cures and preventive techniques in the late 19th and early 20th centuries. The most conspicuous manifestation was the growth of eugenics and forced sterilization. These “cures” targeted specific populations, such as immigrants, people of color, the poor, unmarried mothers, and the disabled.

Southern asylums in the Jim Crow era were segregated and ones for African Americans received far less funding and accordingly suffered from chronic overcrowding, abuse, and generally deplorable conditions. An investigative commission in 1909 found Montevue Asylum in Maryland to be one of the state’s worst facilities. Patients there slept on floors with minimal bedding (left), were often shackled (center), and had little space during the day (right). 

Although some psychiatrists expressed reservations about who was receiving eugenic treatment, many supported it enthusiastically. While doctors remained skeptical about the possibility of curing people with severe and persistent mental illness, preventing such illness through eugenics promised to solve the problem for future generations.

In 1896, Connecticut became the first state to prohibit marriage for “epileptics, imbeciles, and the feeble-minded.” In 1907, Indiana became the first state to mandate the sterilization of individuals on the recommendation of a board of experts. Thirty-three states ultimately adopted sterilization statutes, though some carried out a disproportionate number of operations; California alone accounted for a third of them. In all, more than 65,000 mentally ill people were sterilized.

A 1929 map of states that had implemented sterilization legislation (left). Carrie Buck and her mother Emma Buck at the Virginia Colony for Epileptics and Feebleminded in 1924 (right). Emma had been committed after accusations of immorality, prostitution, and having syphilis. Her daughter was committed after becoming pregnant at seventeen as the result of a rape. 

While we now know that these sterilizations did not prevent mental illness, courts upheld the programs. In Buck v. Bell (1927), Supreme Court Justice Oliver Wendell Holmes, Jr. argued that sterilizations did not violate people’s rights, concluding that “three generations of imbeciles are enough.”

After World War II, revelations about Nazi war crimes turned many citizens against such procedures, but they persisted in some places well into the late twentieth century, disproportionately affecting racial minorities. In Oregon, for example, the Board of Social Protection performed its last surgical sterilization in 1981 and disbanded two years later.

A protest against forced sterilizations in North Carolina around 1971 (left). A historical marker in Raleigh, NC regarding the 7,600 people sterilized in that state (right).

From Prevention to Treatment

Beginning in the early 20th century, some doctors wanted to try new treatments for mental illness rather than preventive measures. They focused on the body instead of lifestyle or psyche. In trying to find physiological origins for maladies, psychiatrists hoped they might treat schizophrenia, manic depression, and other illnesses.

Electroconvulsive therapy (ECT), which induces seizures in people through a series of electrical shocks, became one of the most famous such treatments and is still in limited use today. ECT remains controversial, not least because of its use on non-consenting individuals and its side effects.

Electroconvulsive therapy being administered at a Liverpool, England facility in 1957.

But clinical data indicates it can be effective in mitigating or eliminating symptoms for long periods of time. The same cannot be said for other treatments for schizophrenia and bipolar disorder that emerged in the 1920s.

Building on the apparent success of malaria therapy in treating neurosyphilis (a treatment that involved deliberately infecting patients with malaria), the Austrian psychiatrist Manfred Sakel introduced insulin shock therapy in 1927 as a cure for schizophrenia. He injected patients with successively larger doses of insulin, often to the point of inducing a coma, then revived them with glucose and repeated the procedure. The more fortunate patients emerged from this with considerable weight gain; the less lucky, with permanent brain damage or a persistent comatose state.

A nurse administering glucose to a patient receiving insulin shock therapy at a hospital in Essex, England in 1943 (left). An image of removed teeth from Henry Cotton's The Defective Delinquent and Insane (1921) (right).

Because the psychiatric profession was still relatively small and the bureaucracy around mental health care was primarily concentrated in hospitals, individual doctors could often experiment to see what would work.

Henry Cotton, for example, a doctor at New Jersey State Hospital from 1907 to 1930, believed that mental illness was the product of untreated infections in the body: he removed patients’ teeth, tonsils, spleens, and ovaries to try to ameliorate their symptoms. Mortality for these procedures was 30 to 45 percent.

Perhaps the most extreme example of a physical treatment was the lobotomy. Developed by Antonio Egas Moniz, the procedure severed connections between the prefrontal cortex and the rest of the brain, with doctors either drilling through the skull or inserting an implement past a person’s eye. Around 40,000 lobotomies were performed in the United States. A few individuals recovered or showed improvement, but most suffered cognitive and emotional declines, and others became incapable of caring for themselves or died.

Doctors Walter Freeman (left) and James W. Watts (right) studying an X-ray before a psychosurgical operation.

None of these treatments arrested the alarming growth of patient populations in state institutions. In the case of insulin therapy or Dr. Cotton’s surgeries, we can now see that there was no connection between the treatment and mental illness. Such treatments simply traumatized patients or inflicted lasting physical harm.

Patients vs. Budgets

The Great Depression placed further strain on these institutions, and hospitals became dangerously overcrowded. States reduced appropriations for their major state hospitals while counties sent even more people to state institutions. Spending on patient care varied widely across the nation: in 1931, New York spent $392 per patient on hospital maintenance, Massachusetts $366, Oregon $201, and Mississippi only $172.

Under these conditions, the quality of care deteriorated. Creedmoor Hospital in New York, for instance, made headlines in 1943 following an outbreak of amoebic dysentery among patients. And in Salem, Oregon, in 1942, a patient accidentally mixed rat poison into the hospital’s scrambled eggs, killing 47 people and sickening hundreds, a painful example of how sloppily the hospital was run.

In 1948, the journalist Albert Deutsch released a book called The Shame of the States in which he cataloged various abuses he witnessed in state hospitals: overcrowding, beatings, and a near absence of rehabilitative therapy.

The movie The Snake Pit (1948) brought these conditions to life, showing the different levels of a hospital, including the “snake pit,” where patients deemed beyond recovery were abandoned in a padded cell.

Journalist Albert Deutsch published a catalogue of abuses in state hospitals in 1948 (left). One of the images in Deutsch’s The Shame of the States of an overcrowded day-room in a Manhattan asylum (center). The 1948 film The Snake Pit depicted a semi-autobiographical story of a woman in an insane asylum who could not remember how she got there (right).

Such attention, along with World War II, mobilized public support for reforming mental health care. The sheer number of potential soldiers rejected for service on psychiatric grounds (1.75 million) shocked the public. And the large number of psychological casualties among servicemen, many of whom suffered from what we would now call post-traumatic stress disorder, suggested that environmental stress could contribute to psychological problems.