“Get over it!”
I am not talking to the people who are clinically depressed and saying “snap out
of it.” I am talking to the people who continue to question the legitimacy of
major depressive disorders and I’m saying “get over yourself.” I can tell you
right now that when you have close friends or relatives go through this, you
will no longer try to explain away concerns by telling people that everyone has
bad days, that they are simply tired and cranky, and that once they get fresh
air they’ll realize life is just dandy.
Depression is a chronic illness and we need to treat patients – and their friends and
families – with the respect and care they deserve. It’s like diabetes or high
blood pressure: it’s real and it’s treatable. So, the next time someone asks
you if depression is a real illness or someone being wimpy about their sadness,
please set them straight for me. Further, if you recognize the symptoms we’ll
talk about, dare yourself to start a conversation with the person you’re
concerned about; as far as treatment goes, the sooner the better. (I’ve already
talked a bit about how the Affordable Care Act will impact coverage of psychiatric illness. In the future, I promise you posts
about alcoholism, anxiety, schizophrenia, and obsessive compulsive disorder. And,
yes, you guessed it: these will all come to you from the perspective of someone
who has friends and family who have suffered.)
Back to the basics
A recent seminar at the medical school at Michigan left many students with questions (some as blatantly inappropriate as “when did we decide depression was an illness we should prescribe drugs for?”), so I’ll go back to cover a few basics. The Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) defines major depressive disorder as a syndrome: the diagnosis requires a patient to have at least five of the nine symptoms, nearly every day, for at least two weeks, and at least one of the five must be depressed mood or loss of interest. The symptoms are what you might presume:
depressed mood most of the day, diminished interest in almost all activities,
significant weight loss or gain, sleep changes, physical agitation, intense fatigue, excessive
guilt and feelings of worthlessness, diminished concentration and
indecisiveness, and suicidal thoughts. For this post, I’m focusing on major,
unipolar depression, but some of what we’ll talk about is also applicable to
bipolar disorder, particularly given that people with bipolar tend to
experience “lows” that are longer and more severe than the “highs.”
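If it helps to see that counting rule spelled out, here is a minimal sketch in code. The symptom labels and the function are my own shorthand, an illustration of the rule rather than anything resembling a diagnostic instrument:

```python
# Sketch of the DSM-IV counting rule described above: five or more of the
# nine symptoms, for at least two weeks, with at least one "core" symptom
# (depressed mood or diminished interest). Labels are my own shorthand.
DSM_IV_SYMPTOMS = {
    "depressed mood", "diminished interest", "weight change", "sleep change",
    "agitation or slowing", "fatigue", "worthlessness or guilt",
    "poor concentration", "suicidal thoughts",
}
CORE = {"depressed mood", "diminished interest"}

def meets_criteria(symptoms: set, duration_weeks: float) -> bool:
    present = symptoms & DSM_IV_SYMPTOMS
    return len(present) >= 5 and bool(present & CORE) and duration_weeks >= 2

# Five symptoms including a core one, three weeks running -> True
print(meets_criteria({"depressed mood", "fatigue", "sleep change",
                      "poor concentration", "worthlessness or guilt"}, 3))
```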
Over the course of a lifetime, 16% of people get depressed. That’s a lot. Of those
people, only half get treated, despite what we now know about the host of
associated problems, ranging from loss of productivity to increased cancer
mortality. A few other interesting numbers: nearly 15% of women experience
post-partum depression, about half of patients who have experienced a heart
attack also develop depression, and more than a quarter of adult Americans suffer
from a diagnosable mental disorder in a given year. Those numbers are so high
that it’s essentially impossible that you don’t know someone who has dealt with
mental illness.
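That last sentence isn’t just rhetoric; the arithmetic backs it up. Here’s a quick back-of-the-envelope check (the 25% yearly figure is from above; the social circle sizes and the crude independence assumption are mine):

```python
# If over a quarter of adults have a diagnosable mental disorder in a given
# year, the chance that *nobody* in your circle does shrinks geometrically.
# (Crudely assumes independence; circle sizes are hypothetical.)
p = 0.25
for n in (10, 25, 50):
    print(f"circle of {n}: chance no one is affected = {(1 - p) ** n:.6f}")
# circle of 10: 0.056314; of 25: 0.000753; of 50: 0.000001 (rounded)
```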
Disease progression in depression is an interesting phenomenon, and it has much to do with brain chemistry, some of which remains incompletely understood. What we do know for sure, though, is that after one episode, there is a 50% chance of another depressive period; after a second episode, there is an 80% chance of a third. For the first depressive period, a relatively great stressor is required, but the threshold for stress lowers during subsequent episodes until eventually the trigger is endogenous. Suicidal thoughts and behaviors, arguably the most alarming symptom of depression, are actually related more to impulsiveness than to severity of depression. In fact, hospitalization is often recommended largely to give patients time to change their minds. Studies of people who jumped off the Golden Gate Bridge and survived have found that the vast majority changed their minds on the way down.
Apparently for some people the impulsiveness passes rather quickly. And, no,
I’m obviously not at all proud that our quintessentially Californian bridge is
the world’s leading suicide site.
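Circling back to those recurrence numbers for a second, notice how quickly they compound. A tiny calculation using only the 50% and 80% figures cited above (chaining them like this is my simplification, not a claim from the literature):

```python
# P(at least three episodes | a first episode), chaining the conditional
# probabilities quoted above. Treating them as a simple chain is my
# simplification.
p_second = 0.50              # chance of a second episode after the first
p_third_given_second = 0.80  # chance of a third after a second
print(p_second * p_third_given_second)  # 0.4, i.e. a 40% chance overall
```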
So what’s one to do? Think about
treatment. And then start right away.
Behavioral changes are the first line of attack. Most doctors, psychologists,
psychiatrists, and therapists will recommend changes in exercise, sleep hygiene,
and the like. In conjunction with these lifestyle changes, cognitive behavioral
therapy (CBT) can have enormously positive influences. What exactly is CBT?
It’s the best thing since sliced bread (which actually sometimes gets stale,
anyhow). CBT addresses specific maladaptive behaviors and thought processes,
providing patients with explicit goals and systematic tools. It’s incredibly
effective for a whole host of disorders, including substance abuse, eating
disorders, and major depression. This – I promise you – is solid evidence-based
practice. For many psychiatric illnesses, current environmental stressors are a
much better predictor of disease progression than are early life factors. That
is to say, recent life stress and chronic stress matter more than a dysfunctional
childhood. Psychodynamic and Freudian styles of talk therapy are
decidedly out of vogue…since they are ineffective and were never based on hard
science. CBT, in contrast, won’t necessarily uncover all of your secret
histories and deepest conflicts, but will give you concrete tools to take with
you out the door and apply to your life, right that day. It helps you identify
your own, specific, individual negative beliefs and behaviors and learn to
replace them with positive, healthy ones. The principle of the therapy in some
ways stems from the age-old advice that even if you can’t change a situation
you can change the way you think and respond.
For many people, though, behavioral changes are not enough, as the brain chemistry is already too out of whack to rebalance itself without some help. There is nothing wrong with needing medication, and taking it says absolutely nothing about a person’s intellect, personality, or strength. If you are even remotely
tempted to think otherwise, please feel free to join the rest of us in this
century, along with common sense, decency, and science.
In case I get a reader who doesn’t speak Science (and since I’m not completely fluent in that language myself), we’ll stick with fairly basic information on drugs. The vast majority of doctors start off by prescribing a selective serotonin reuptake inhibitor (SSRI, such as Prozac), which increases serotonin levels in the synapses between neurons. To make a long story short, serotonin is a neurotransmitter
that relays messages between brain cells that regulate mood, appetite, sleep, learning,
and sexual desire. Interestingly, the biochemical change happens in hours yet
the antidepressant effect usually takes weeks. We don’t completely understand
the mechanistic pathways involved and we haven’t ever measured levels of
serotonin in a living human brain. We’ve measured blood levels of
neurotransmitters such as serotonin (and, yes, levels are lower in depressed
people), but we actually don’t know how well blood levels correspond to brain
levels. Additionally, there is dispute among researchers about exactly why
SSRIs are effective, with some scientists claiming that it has less to do with
the increased serotonin-mediated messages and more to do with a serotonin-induced regeneration of brain cells (neurogenesis, particularly in the hippocampus).
Other commonly used drugs include serotonin and norepinephrine reuptake inhibitors
(SNRIs, such as Cymbalta) and norepinephrine and dopamine reuptake inhibitors
(NDRIs, such as Wellbutrin). The pertinent biochemistry here is that
norepinephrine is a stress hormone affecting attention and memory and dopamine
is a neurotransmitter implicated in the reward-driven learning system. The rest
of the antidepressants out there fall into one of three categories: tricyclic
antidepressants (yup, the chemical structure has three rings; they’re not used
much anymore), atypical antidepressants (they don’t fit neatly in another
category; they’re often taken in the evening to help with sleep), and monoamine
oxidase inhibitors (MAOIs, usually prescribed as a last resort since there can
be deadly interactions with certain foods). A little side note: MAOIs were
discovered back in the 1950s, during trials for new drugs to treat tuberculosis.
It turned out that the medication had psychostimulant effects on the
tuberculosis patients. The history of medicine is crazy! (Pun intended. Pun
always intended, tasteful or otherwise.)
For many, finding the right
medication takes a while. It’s a process of trial and error that frustrates
patients for many reasons: some medications need up to two full months to take effect, the side effects are often worse at the beginning, and there can be
withdrawal symptoms if you stop too abruptly. There are a few recent
developments that can help patients find the right medication, including a blood test to
check for specific genes we think affect how your body uses antidepressants.
The cytochrome P450 (CYP450) genotyping test is one such predictor of how an
individual might metabolize a medication.
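For the curious, here’s a toy sketch of how a genotyping result like that gets bucketed into the standard metabolizer categories. The star alleles are real naming conventions for CYP2D6 variants, but the activity values and cutoffs below are simplified stand-ins, not clinical numbers:

```python
# Toy CYP2D6 "activity score" lookup: each inherited allele contributes some
# enzyme activity, and the total maps to a metabolizer label. Values and
# cutoffs are simplified illustrations, not dosing guidance.
ALLELE_ACTIVITY = {"*1": 1.0, "*2": 1.0, "*10": 0.5, "*4": 0.0}  # tiny subset

def metabolizer(allele_a, allele_b):
    score = ALLELE_ACTIVITY[allele_a] + ALLELE_ACTIVITY[allele_b]
    if score == 0:
        return "poor"         # drug may accumulate at standard doses
    if score < 1.25:
        return "intermediate"
    if score <= 2.25:
        return "normal"
    return "ultrarapid"       # drug cleared quickly; may respond poorly

print(metabolizer("*4", "*4"))   # poor
print(metabolizer("*1", "*10"))  # normal
```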
Last resorts include ketamine and electroconvulsive therapy; we usually save these for patients whose depression has resisted other forms of treatment. Ketamine causes brief psychotic episodes, working much like phencyclidine (PCP, aka “angel dust”) and other hallucinogens used recreationally. After a few hours, ketamine then works as an antidepressant through its action as a glutamate (NMDA) receptor antagonist. In striking contrast to other drug options, electroconvulsive therapy (ECT) actually involves passing electrical currents through the brain, theoretically to affect levels of neurotransmitters and growth factors. Despite the suspicion surrounding ECT and its side effects, it typically gives rapid relief even when other treatments have failed. Of all depressed people treated, only 1 to 2% receive ECT.
Sadly, there’s still so much we don’t know
Depression is undoubtedly a real illness, but it’s “misunderstood” not only in the social sense of the term, but also within the realm of science. Given that it does
have a biological basis, perhaps it’s worth considering how evolution allows
this to continue happening. Why would this ever confer an advantage? Why are
there still people in the population who are predisposed? How on earth does
this play into survival of the fittest? I have a few speculative thoughts,
based – as per usual – on personal observation and somewhat anecdotal evidence.
Rates of depression are much higher in first-world countries, which might suggest
something about how our lifestyles have changed faster than evolution can keep
up. We are experiencing stress and anxiety at a whole new level, what with this
new-fangled concept of “long-term goals” and “delayed gratification.” I’m
certainly in no position to pooh-pooh lengthy endeavors, given that I’m
committed to another decade of education and training, but I do sometimes
wonder if perhaps people in third-world countries – living closer to that hunter-gatherer
lifestyle – are better adapted to their types of daily stress. There is
evidence related to other mental illnesses, too; for instance, people with schizophrenia have a much better prognosis in rural India, likely due to the familial
and social structures in place and the markedly decreased stigmatization
compared to industrialized areas.
The strong correlation between depression and anxiety is also worth touching on. At
first glance, it may seem counterintuitive that these two are related, since
they appear confusingly contradictory, but it turns out that many people
experience both and that one predisposes you to the other. Without going into
too many details, suffice to say that once you look at the neurotransmitters
involved in states of arousal (the nervous energy kind, but also the sexual
kind) and depression, it actually makes a lot of sense. There have been some
particularly interesting recent studies with medical interns and residents (generally
stressed, anxious people) about the differences in rates of depression between
men and women. While about 20% of men developed depression, a full 40% of women
did. Further, having a child during internship increased the likelihood of
depression for women but had no effect on risk for men. More broadly, people
who attended medical school outside of the United States were significantly
less depressed. I’m not even going to try to go into that – way too many
confounders and way too much opportunity to say things that are far from
politically correct – but you can use your imagination as to why that might be.
There are some promising directions for future depression treatment research, and it’s
these questions we should be addressing, not the silly ones about whether it’s
a major problem or not. Given what we see in patients with both depression and
immune dysregulation, is it likely that cytokines play a role in disease onset?
Would anti-inflammatory drugs alter depressive symptoms? Since we see so much depression after strokes
and heart attacks, should we investigate other cardiovascular factors? As about
40% of variance in depression can be attributed to genetics, should we devote
more effort to searching for specific genetic polymorphisms? How useful is
functional magnetic resonance imaging (fMRI) in studying the changes in
activity and connectivity of depressed patients? Is it relevant that depressed
people have smaller hippocampi and that their amygdalae are more strongly
connected to their prefrontal cortices?
Scientist or not, I hope you learned something today…or, at the very least, revisited and
reflected on important issues. If you take away anything at all, let it be
this: depression is real, chronic, and treatable. Some of the most incredible
people I know have dealt with it and mental illness should never, EVER be
stigmatized. Let’s get on with it, progress with research, and talk about it
openly.
I wonder what exactly you mean by "For the first depressive period, a relatively great stressor is required..." It seems to me that if 40% of the variance in depression is correlated with genetic factors, maybe the onset doesn't need to be "caused" by a precipitating event, but rather, one's brain is just programmed to have the various levels of neurotransmitters that cause depression/anxiety?
Also, are there studies that look at blood levels of serotonin in conjunction with CBT? (I'm sure there are... point me to them?) And I was wondering if you knew off the bat: when blood levels are taken, are they taken constantly over a period of time and then averaged, or is just one point estimate usually taken? Seems like a point estimate could be misleading.
I know one of the problems with depression studies is that a lot of them use self-reporting of symptoms, which is problematic for lots of reasons. I wonder how much blood-level studies are used and how reliable they are...
Great article. Thanks for writing!
Thanks for reading, Candace! I always appreciate your thoughtful comments and questions, and it’s true that when I am not submitting articles to real journals I tend to get lax about citations. First off, you are absolutely correct that for many people there is no “causal event” for the first period. Many people you ask could point to something about their school/work/personal life that changed before the preliminary onset of depression, but there are also a huge number of people who can’t…and this of course makes their illness no less legitimate. These two articles explain some of the basic ideas surrounding major life events, genotypes, and serotonin levels. Interestingly, we don’t yet see any specific evidence for interactions between stressful life events and genotypes, though I would not be surprised if we find some in the future. Depression is definitely one of those “multifactorial” syndromes, much like Type II diabetes, where the genetic information we have is still much less useful than the behavioral/physiological information.
http://psycnet.apa.org/psycinfo/2008-01363-017
http://journals.cambridge.org/action/displayAbstract?fromPage=online&aid=270166
Most of the studies I’ve seen that look at serotonin levels in conjunction with CBT aim to evaluate the efficacy of CBT vs. SSRIs vs. both. I’ll have to look through stuff to see if I can find a good one explaining relative levels. You make a good point that an estimate at a given time point could be misleading (yes, this is what they usually do…people don’t want to give blood many, many times per day), but usually the samples are large enough that the analyses account for this type of variation. Of course levels do vary throughout the day, and can even change depending on what you eat, but ranges are fairly tight for a given individual. Also, here is a cool article explaining how we can at least measure serotonin transporter in the brain, if not actual serotonin levels.
http://archpsyc.jamanetwork.com/article.aspx?articleid=482176
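If it helps, here’s a toy simulation of your point-estimate worry (all numbers invented): a single draw can misjudge one person, yet one draw per person still recovers the group average almost exactly.

```python
# Each simulated person has a stable "true" level; a blood draw adds daily
# noise. One draw can be off for an individual, but the group mean from
# single draws converges on the true group mean. Units and spreads invented.
import random

random.seed(0)
true_levels = [random.gauss(100, 15) for _ in range(5000)]
single_draws = [t + random.gauss(0, 10) for t in true_levels]

print(f"one person: true {true_levels[0]:.1f}, single draw {single_draws[0]:.1f}")
print(f"group mean: true {sum(true_levels) / 5000:.1f}, "
      f"from draws {sum(single_draws) / 5000:.1f}")
```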
Finally, I agree with you that self-reporting of symptoms is problematic in a lot of ways, yet some of the symptoms we care about most simply HAVE to be reported by the individual. Many intake evaluations now try to get some information from others as well (or they use questions that ask in a less personal way), about productivity at work, relationships with friends, etc., but it’s still an area that could use much improvement. I think there is a big push to find more reliable blood tests that could help diagnose and predict types of successful treatment for an individual, yet I’m not convinced we’re quite there. Then again, this came out this year:
http://www.nature.com/tp/journal/v2/n4/full/tp201226a.html