Editor’s Note: Although one in five Americans currently takes at least one psychiatric drug and mental disorders are recognized worldwide, global pharmaceutical industry funding for new, innovative medications is in serious decline despite promising advances in genetics. The author traces the evolution of psychiatric drug development, the reasons for its retreat, and the changes necessary to meet the growing demand.

During the past three years the global pharmaceutical industry has significantly decreased its investment in new treatments for depression, bipolar disorder, schizophrenia, and other psychiatric disorders.1 Some large companies, such as GlaxoSmithKline, have closed their psychiatric laboratories entirely. Others, such as Pfizer, have markedly decreased the size of their research programs. Yet others, such as AstraZeneca, have brought their internal research to a close and are experimenting with external collaborations on a smaller scale.

This retreat has occurred despite the fact that mental disorders are not only common worldwide, but also increasingly recognized by healthcare systems. There is, moreover, vast unmet medical need, meaning that many individuals with mental disorders remain symptomatic and often disabled despite existing treatments. For example, people suffering from the depressed phase of bipolar disorder often continue to experience severe symptoms even when they take multiple medications with serious side effects. For some significantly disabling conditions, such as the core social deficits of autism and the cognitive impairments of schizophrenia, there simply are no effective treatments. Because mental disorders are highly prevalent and our ability to treat them remains limited, these illnesses cause enormous societal burden. In aggregate, they are the world’s leading cause of disability.2

In addition, this retreat has happened despite the fact that different classes of psychiatric drugs have been among the industry’s most profitable products during the last several decades—and despite the fact that, according to Medco Health Solutions, one in five American adults now takes at least one psychiatric drug. Among the earliest commercial successes were the Valium-like benzodiazepines, used both as tranquilizers and as sleeping pills. These were followed by the Prozac-like selective serotonin reuptake inhibitor (SSRI) antidepressants. Most recently, “second-generation” antipsychotic drugs have been among the global revenue leaders for the pharmaceutical industry, serious side effects notwithstanding. It is therefore surprising that almost all industry research dollars are now invested in cancer, metabolism, autoimmunity, and other disease areas. As the expiration of patents on blockbuster drugs squeezes budgets, companies perceive their withdrawal from psychiatry as an unfortunate but rational reallocation of research resources.3 This withdrawal reflects a widely shared view that the underlying science remains immature and that therapeutic development in psychiatry is simply too difficult and too risky.

The Diagnosis 
The scientific issues facing translational psychiatry—the application of basic discoveries in neuroscience, genetics, and psychology to understanding disease and to advancing therapeutics—are daunting. The molecular and cellular underpinnings of psychiatric disorders remain unknown; there is broad disillusionment with the animal models used for decades to predict therapeutic efficacy; psychiatric diagnoses seem arbitrary and lack objective tests; and there are no validated biomarkers with which to judge the success of clinical trials.4,5 As a result, pharmaceutical companies do not see a feasible path to the discovery and development of novel and effective treatments. Given the steady stream of drugs that have gained approval during recent years for treating depression, anxiety, schizophrenia, and bipolar disorder, this scientific stall may have seemed to come out of the blue. However, payers (both insurance companies and governments) and regulatory agencies are no longer willing to pay for ever more expensive new drugs that, despite marketing efforts, have turned out to be no more than variations on very old themes.3,4

Even if current drugs recycle old mechanisms of action in the brain, the existing pharmacopeia is a great blessing to many patients and their families. That said, progress for the many patients who respond only partially or not at all to current treatments requires the discovery of medications that act differently in the brain than the limited drugs that we now possess. The molecular actions of all widely used antidepressants, antianxiety drugs, and antipsychotic drugs are relatively unchanged from their 1950s prototypes. Current antidepressants alter levels of the neurotransmitters (serotonin or norepinephrine) in synaptic connections between certain nerve cells in the brain. This is the same basic action as that of the first modern antidepressant, imipramine, discovered in 1957. Antipsychotic drugs act on several different neurotransmitter receptors in the brain, but the critical shared mechanism of all current antipsychotic drugs is blockade of dopamine D2 receptors, the same mechanism as that of the prototype antipsychotic drug chlorpromazine, discovered in 1950.

More problematic is the failure to improve efficacy, although significant progress has been made since the 1950s on safety and tolerability. Even the most recent antidepressants are no more effective than imipramine. The early antidepressants could prove deadly in overdose; SSRIs and other modern antidepressants are far safer and have far milder side effects, permitting far wider use. All antipsychotic drugs, with the exception of clozapine (discovered in the 1960s), have roughly the same efficacy as chlorpromazine. Clozapine clearly benefits some patients with schizophrenia and bipolar disorder, even when other drugs have failed. But the basis of its greater efficacy remains mysterious. Moreover, clozapine’s very severe side effects limit its use.

Notably, existing antipsychotic drugs, including clozapine, treat only a subset of the symptoms of schizophrenia, such as hallucinations and delusions. None of the existing drugs improve the cognitive symptoms of schizophrenia that are responsible for much of the resulting disability. The second-generation antipsychotic drugs have far less tendency than older drugs to cause serious motor system side effects, some of which mimic Parkinson’s disease. But the newer drugs carry their own burden of serious side effects, such as significant weight gain and elevated levels of glucose and lipids.

Arriving at the Crossroads
The mid-20th century saw the birth of modern psychopharmacology, specifically the discovery of the first medicines that could effectively treat symptoms of specific disorders. Serendipitous observation followed by intelligent follow-up played an important role in the history of medicine, famously including Alexander Fleming’s discovery of penicillin. Fleming studied rather than discarded the mold-contaminated petri dishes on which he had plated bacteria. Similar serendipity was involved in discovering the utility of lithium as well as the prototype antipsychotic, antidepressant, and benzodiazepine drugs.

In 1949, John Cade, who was interested in the properties of uric acid, recognized that it was the lithium moiety of his lithium urate salts that was sedating his guinea pigs. This led him with breathtaking rapidity to test lithium on patients with mania. In 1950, the French surgeon Henri Laborit tested the new drug chlorpromazine, originally developed as an antihistamine, as a medication to be used before general anesthesia. Based on chlorpromazine’s sedating properties, he recommended that his psychiatric colleagues, Jean Delay and Pierre Deniker, test it on agitated psychotic patients. Remarkably, the sedation turned out to be a side effect; the true benefit of chlorpromazine (later branded as Thorazine) was its ability to diminish the hallucinations and delusions of patients with schizophrenia and related disorders.

As chemists attempted to improve upon the three-ring structure of chlorpromazine, one of the compounds that emerged, imipramine, failed to treat psychosis but markedly elevated mood. Imipramine was then developed as the first of the tricyclic antidepressants, and it became the prototype antidepressant that increases the concentration of the monoamine neurotransmitters (serotonin or norepinephrine) in synapses. It works by blocking the reuptake “pump” that normally removes these neurotransmitters from synapses after they have delivered their signal. The other major antidepressant mechanism, monoamine oxidase inhibition, was based on yet another serendipitously identified drug, iproniazid. It was synthesized in attempts to produce chemical alternatives to isoniazid, a drug used to treat tuberculosis. Iproniazid failed to treat tuberculosis but markedly improved the depressed mood of the chronically ill patients in its clinical trial.

Pharmaceutical companies wanted to follow up on these remarkable discoveries in the 1950s in order to produce additional revenue-generating treatments. Today, in most fields of medicine, scientists are able to identify the molecules in the brain with which these drugs interact and then to exploit these drug targets to develop new medications. Scientists might identify new targets unrelated to existing drugs via knowledge of the genetics of disease risk or understanding of the molecular mechanisms of the illness. In the 1950s and 1960s, however, the biochemical and molecular tools for identifying neurotransmitter receptors did not exist, and the existence of neurotransmitter reuptake transporters was not known. (The discovery of antidepressant and antipsychotic drugs motivated research in the 1950s and 1960s that would eventually yield Nobel Prizes for Julius Axelrod in 1970 and Arvid Carlsson in 2000. Axelrod and colleagues discovered neurotransmitter reuptake mechanisms; Carlsson recognized that antipsychotic drugs must work by blocking dopamine receptors.)

Lacking molecular tools, pharmacologists working in psychiatry developed assays, mostly in laboratory rats, based on the effects of prototype drugs such as chlorpromazine, imipramine, and chlordiazepoxide (Librium) on animal behavior. For example, a rat placed in a beaker of cold water will swim for a time, but it will eventually stop swimming and begin to float. The antidepressant drug imipramine prolongs the period during which the rat attempts to swim. This forced swim assay was rationalized with anthropomorphic terms such as “behavioral despair,” but it was never shown to model human depression. Using the forced swim and a battery of other assays, new compounds were screened for possible antidepressant efficacy. Compounds that acted like imipramine were deemed candidate antidepressants, as long as they were not too toxic, and often tested in human clinical trials. Such assays ended up being black box drug screens.

Despite some degree of surface plausibility—the forced swim result seemed to mimic learned helplessness, a putative model of depression—the mechanism by which imipramine causes continued swimming is still not known. Moreover, imipramine increases the duration of swimming with a single dose, whereas depressed human beings generally require several weeks of treatment before therapeutic effects emerge. A large number of compounds that passed such black box assays were ultimately approved as antidepressant, antipsychotic, and anxiolytic drugs; thus, the approach seemed to be succeeding. From the very beginning, however, astute scientists predicted that this approach to drug development would result in the identification of only “me too” drugs. Unfortunately, during the last 50 years that concern has been fully borne out. Perhaps the most troubling aspect of excessive reliance on these black box assays is that potentially efficacious drugs with novel mechanisms of action (e.g., possible antidepressants that are different in mechanism from imipramine) may well have been screened out and discarded.

Since the 1950s, scientists have worked to go beyond these assays and to create animal models of human depression, bipolar disorder, schizophrenia, and other psychiatric disorders. Relying mostly on laboratory rodents, they have used a range of tools from environmental stressors to the insertion of human disease risk genes into the brains or germ lines of mice. If we are to be clear-eyed about the results to date, we would have to conclude that none of these models have proven adequate. Animal research remains critical for basic science, including investigations of basic molecular pathways on which drugs act. However, the use of animals to produce “good enough” models of real human diseases requires that the disease mechanisms be conserved in evolution between the animal selected and human beings. In short, psychiatry will advance not by rejecting animal research, but by showing appropriate circumspection about the use of animals to model diseases.

Another important lesson is that even effective drugs may not prove to be useful keys to understanding disease mechanisms. Even if drugs that block dopamine receptors treat psychotic symptoms, it does not follow that the fundamental problem is excess dopamine any more than pain relief in response to morphine suggests that the original problem is a deficiency of endogenous opiates. If the gains made in other fields of medicine are to serve as models for psychiatry, it is time to make new attempts to understand fundamental disease mechanisms and to apply what is learned to therapeutics.

Discovery and Development
Industry plays a critical role in producing new treatments. There is overlap between research in industry and in academia—each realm has different core strengths. Academics are able to tackle projects that are too risky and long-term for companies focused on increasing stock prices or returning dividends to shareholders. The equation also includes the funders of academic science—largely governments and foundations, which also have a far longer time horizon than do investors in private companies. Funded by tax and philanthropic dollars, academics can accept greater risk and can recognize that the significant leaps into the unknown by which science progresses may have no obvious practical implications in the short run, and that creative explorations commonly produce failure more often than they produce success.

Academia is therefore in a better position than industry to investigate basic mechanisms of disease, as well as other matters that may ultimately be relevant to therapeutics but are still distant from the design of products. Such research can reveal genes, proteins, or other molecules that, if activated, blocked, or otherwise modified, might exert therapeutic benefits and be labeled drug targets. If the research convincingly demonstrates a potential role in the disease processes, the research may earn the label of “validated drug target.” At this point, pharmaceutical companies generally need to take the next step, which is to search for chemical compounds that bind to and modify the target in desired ways. It is the role of medicinal chemists, most often company based, to painstakingly synthesize and study variations on promising chemical compounds in the search for a good drug—a chemical compound that has the desired effect on the target, that is not too toxic, that can be absorbed into the body, and, in the case of psychiatric medicines, that can enter the brain. Once identified, the promising drug must be tested in humans and shown to have the desired therapeutic effects without disproportionate side effects. The clinical trials that take a chemical compound from the stage of discovery to approval by regulators such as the Food and Drug Administration are extensive and costly. Large companies have the necessary resources and experience to orchestrate such trials, which are critical steps in producing new medicines for psychiatric disorders.

Recently, venture capitalists have been exploring ways to pick up the slack, but so far they have not made a significant difference. While they might make an initial investment in a biotechnology start-up, it is far more likely that their return will come through an initial public offering of stock (assuming that the start-up is able to create a successful product) and/or a decision to sell off the start-up to a large pharmaceutical company. As large companies withdraw from psychiatric drug development, interested parties have considered various solutions, including public-private partnerships that would involve government, academia, and industry in an attempt to decrease the risks inherent in drug discovery and development. One such partnership, the Alzheimer’s Disease Neuroimaging Initiative (ADNI), has found potentially useful biomarkers for clinical trials that could be used by any company.6 Such partnerships may help, but since knowledge concerning most psychiatric disorders is less advanced than that concerning Alzheimer’s disease, it is far more important to encourage basic understanding of disease processes. In short, without more basic advances, the climate does not yet seem favorable for the creation of partnerships to find biomarkers for psychiatric disorders.

Recent Advances
As long as we guard against renewed self-deception about what constitutes meaningful advances, there is good reason to feel optimistic about the long-term future of translational psychiatry—despite its palpable scientific challenges. My optimism is based partly on the extraordinary vitality of neuroscience and perhaps, even more important, on the emergence of remarkable new tools and technologies to identify the genetic risk factors for psychiatric disorders, to investigate the circuitry of the human brain, and to replace current animal models that have failed to predict efficacious new drugs that act by novel mechanisms in the brain. New ideas are, of course, central to scientific progress, but new tools can open up unexpected worlds and thus undergird the formulation of truly novel hypotheses. As brilliant as Galileo was, without advances in optics, he would not have observed the four moons of Jupiter that undergirded new models of the solar system.

A crucial clue moving forward is that many psychiatric disorders run strongly in families. Based on family and twin studies, autism, schizophrenia, bipolar disorder, and attention-deficit/hyperactivity disorder are among the most strongly familial of common disorders. We now know that, far from having simple patterns of inheritance, psychiatric disorders result from the interaction of a very large number of genes—almost certainly hundreds—with different genetic pathways of risk in different families. To state this in another way, the aggregate influences of different subsets of risk-associated genes produce high likelihoods of autism, schizophrenia, or bipolar disorder in particular families (depending on which genes are involved). Moreover, different disorders such as schizophrenia and bipolar disorder share many risk genes but also have unshared risk genes.7 Finally, genetic risk is not fate: both chance and specific environmental factors operate in the context of a genetic risk background.

Convincingly identifying the many risk-associated genetic variants—some common in populations, others very rare—has required studying tens of thousands of patients and comparing them with roughly equal numbers of healthy individuals. Such an approach was simply not feasible until modern genomic technologies made it possible to determine genotypes (to use tools such as gene chips to identify specific variants in particular places in the genome) inexpensively and comprehensively and to sequence DNA rapidly, accurately, and cheaply. Although the technologies continue to improve steadily, the cost of DNA sequencing has already declined approximately one million–fold over the last decade. Just over a decade ago, at the time of the human genome project, the cost of determining each “letter” in the genetic code was about one dollar; given that there are 3 billion such letters (or base pairs) in each human genome, the cost was $3 billion. Today, sequencing most stretches of DNA can cost as little as $0.07 per million base pairs. As a result, we have gone from knowing about a handful of likely risk-associated genetic loci for schizophrenia in 2008 to approximately 75 by the end of 2012. As larger population samples are collected, progress is accelerating in the genetic dissection of schizophrenia, bipolar disorder, and autism.
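Taken at face value, the per-base prices above imply a dramatic difference in the cost of a whole genome. A minimal back-of-the-envelope sketch, using only the approximate figures quoted in this article (illustrative, not a pricing reference):

```python
# Back-of-the-envelope per-genome sequencing costs, using the
# approximate prices quoted in the text (illustrative only).

GENOME_SIZE_BP = 3_000_000_000          # ~3 billion base pairs per human genome

# Human Genome Project era: roughly $1 per base pair.
cost_then = GENOME_SIZE_BP * 1.00       # ≈ $3 billion per genome

# Today (per the article): roughly $0.07 per million base pairs.
cost_now = (GENOME_SIZE_BP / 1_000_000) * 0.07   # ≈ $210 per genome

print(f"Then: ${cost_then:,.0f} per genome; now: ${cost_now:,.0f} per genome")
```

At a few hundred dollars of raw sequencing per genome rather than billions, comparing tens of thousands of patients with comparable numbers of healthy controls becomes an economically plausible study design.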

A list of risk-associated genes does not guarantee an understanding of disease or of new therapies. One exciting recent development is the emerging recognition that genes involved in schizophrenia, bipolar disorder, and autism do not represent a random sample of the genome. Rather, the genes are beginning to coalesce into identifiable biochemical pathways and components of familiar neural structures. More excitement comes from the finding that a large number of risk-associated genes in autism, schizophrenia, and bipolar disorder code for proteins involved in the structure and function of synapses.

Our best hope is that the genetics will unfold over the next several years, due to the efforts of large international consortia that have formed to recruit and to study patients. As genetic clues accumulate, scientists are devising new ways to investigate their neurobiological functions and dysfunctions. One interesting development is to use stem cell technologies to complement the use of laboratory animals with human neurons engineered from skin cells of healthy subjects and from patients. The leading approach is to take a small skin biopsy from the arms of volunteers and to transform skin fibroblasts into neural progenitors and into neurons. Genetic engineering can then be used to add risk-causing mutations to “healthy” neurons and to reverse risk mutations in patients’ neurons. But it is still early in this new field, and it is not yet possible to engineer the specific kinds of neurons implicated in schizophrenia by postmortem studies.

This barrier is likely to fall soon. Whether or not engineered neurons or human neural circuits on a chip prove to be good systems for studying gene function, researchers will make substantial efforts to turn genetic clues into ideas for therapeutics. Many researchers hope that such efforts will help attract the pharmaceutical industry back to psychiatry by demonstrating new paths to treatment development. The emerging genetic results may be the best clues we have ever had to the etiology of psychiatric disorders. If other areas of medicine can guide us, there is enormous promise in deprioritizing existing drugs and old-fashioned animal-based assays as investigative tools and instead focusing on actual disease mechanisms identified by genetics. Technology has only recently begun to make this possible.


