Saturday, December 31, 2011

Ring in the New Year

Whew.  A bit of a hiatus there.  I hadn't quite realized how much I had overextended myself until decamping to Texas for some serious sitting around.  There I had all sorts of unrealized ambitious plans to read and get caught up, visit some local CrossFits, answer some emails, etc.  I did very little at all besides the holiday stuff, a handful of morning runs, and reading some fiction.  Since the whole family was recovering from a steady onslaught of viral preschool nastiness that began before Thanksgiving, I sorely needed the rest.  A very mild but persistent case of walking pneumonia had settled in, so it was nice to take it easy and finally get well.

In the midst of the sitting around, I was able to visit two Austin "real food" restaurants, Hudson's on the Bend (where I had rattlesnake cakes - not gluten-free but very tasty, and Hudson's has several gluten-free options) and Foreign and Domestic (steak and pig brains.  I have to say the brains at Animal were better, but the steak at F&D was fabulous, the yogurt with dill sauce sublime, and the atmosphere quintessentially Austin).  I'm told the Noble Pig is also excellent (though it is a sandwich restaurant).

One of Lance Armstrong's yellow jerseys at Hudson's on the Bend
Pig brains and huckleberries
Yogurt with dill sauce
Perhaps the most ambitious thing I did all week was to go to the rail yard of the Austin Steam Train Association.  We were able to get a behind-the-scenes tour, and the kids were thrilled (kids love trains, as do the adults at the Austin Steam Train Association).

At this point I have numerous articles and a large stack of books waiting to be read.  I'm hoping that without my class, my schedule will be a bit more forgiving.  You never know what will turn up, however, and the backlog of emails and to-dos--formidable!

Friday, December 16, 2011

Time to Freak Out. Sensibly.

There is a reason I stick to writing about relatively easily modifiable practices and how they might (possibly!) improve health and prevent disease.  I like fun exercise, real food, wool socks in the wintertime, and sunshine.  I don't like to think about the years of farm pesticide waste seeping into the groundwater, or the estrogenic compounds in plastic.  Plastic compounds are ubiquitous and incredibly convenient.  In all our packaged foods.  Sippy cups.  Tupperware.  IV bags and tubing.  Coating paper receipts.  In the lining of canned foods and soda cans.  The most famous is BPA (found primarily in receipts and number 7 plastic), but all sorts of plastics contain all sorts of weird compounds.

Image from Flickr Creative Commons
I like to live a life relatively free of processed food and gluten - but philosophical ramblings about candy cigarettes aside, I don't dive across the table and grab the birthday cake out of my kid's hand at the party.  (I'm not generally tempted by the birthday cake myself, as it is usually of the grocery-store azo-dye soybean-oil-frosted variety.  There was an incredible ice cream cake at a recent party that I'll admit to stealing a few bites from.)   There's a line between living a somewhat normal life and being completely obsessed and anxiety-ridden about food, and I certainly don't want the kids to be obsessed and anxiety-ridden about food.   Nor would I lie about my kids having celiac or peanut allergies - the last thing I want is a terrified preschool teacher calling me about the goldfish cracker my kid snatched from some other kid's lunch, wondering whether she should call an ambulance.  Nothing is totally off-limits within reason, though the healthy stuff has to be consumed first, before the leftover Halloween candy.  And yes, they do get gluten-free pretzels as a snack (they are cooked in palm oil).  And sometimes those sugar-bombs otherwise known as raisins.

So we muddle through, minimizing harm, and the way I approach plastics is to slowly transition away from them and to avoid heating anything (or putting hot food) in them.  (I try not to think about those years and years of microwaved Lean Cuisines.)  I get milk delivered from a local organic dairy in glass bottles.  Is that enough?  Some (many of you, perhaps) would say no.  But aluminum lunch containers are expensive (and have plastic lids that tend not to fit as snugly as plastic on plastic), and many of the plastic ones I have are still serviceable and attractive.  Canned foods are also tricky - on a mostly "paleo," "real foods," "avoiding processed food" diet, the major canned foods will be coconut milk and tomato products (maybe canned pumpkin?).  In general I make an effort to avoid these except for maybe once per week, figuring, again, that the dose makes the poison, and tomato sauce makes anything more palatable for the kids (a variation of the old parenting trick of putting ketchup on everything).

Ignorance is bliss, really.  At the end of November a research letter was published in JAMA - "Canned Soup Consumption and Urinary Bisphenol A: A Randomized Crossover Trial."  In this little Harvard School of Public Health study, student and staff volunteers consumed 12 ounces of either fresh (prepared without canned ingredients) or canned (Progresso brand) soup daily for lunch over a 5-day period (vegetarian varieties, of course - this is HSPH!).  After a 2-day washout, the treatment assignments were reversed.  Urine samples were taken on the 4th and 5th days of each phase.  Urinary BPA was detected in 100% of canned soup consumers and 77% of fresh soup consumers, and following the 5 days of canned soup, urinary BPA was 1221% higher than that of the fresh soup consumers.
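As a back-of-envelope check (using only the percentage reported in the letter, not the raw urinary concentrations, which I don't have in front of me), a 1221% increase works out to roughly a 13-fold difference:

```python
# Convert the reported percent increase into a fold change.
# A percent increase of p means the new value is (1 + p/100) times the old one.
percent_increase = 1221.0  # figure reported in the JAMA research letter

fold_change = 1 + percent_increase / 100
print(f"Canned-soup urinary BPA is about {fold_change:.1f}x the fresh-soup level.")
# prints: Canned-soup urinary BPA is about 13.2x the fresh-soup level.
```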

"The increase in urinary BPA concentrations following canned soup consumption is likely a transient peak of uncertain duration.  The effect of such intermittent elevations in urinary BPA concentration is unknown.  The absolute urinary BPA concentrations observed following canned soup consumption are among the most extreme reported in a nonoccupational setting."

I have to admit I'd canned (heh heh) Progresso and other pre-prepared soups from my eating list a long time ago due to the biochemistry-happy omega-6 fest in the list of ingredients… as expected from any processed food maker trying to scratch out a profit by using the least expensive commodity items.  I try to use marinara sauce from a glass jar whenever possible (we'll ignore the plastic seal around the top), and I'm looking for good convenient alternatives to canned coconut milk… but the pantry still has some canned items, to be sure.  And certainly the cardboard-box variety of foods has plastic in the lining as well, right?  I make more and more of my own bone broth, but sometimes you just need a bit of stock on hand.  Am I being hopelessly neurotic and silly worrying about plastics, BPA, and canned items (and handling receipts as little as possible)?

Well, 2011 has not been a friendly year for BPA.  A month before the research letter in JAMA alarmed the Progresso soup executives, another scary article was published in Pediatrics: Impact of Early-Life Bisphenol A Exposure on Behavior and Executive Function in Children (free full text!).  It was a prospective observational study, so the typical caveats apply.

Urine was collected from pregnant women at 16 and 26 weeks of gestation and at birth, and later from the resultant babies at 1, 2, and 3 years of age.  The results?  Well, BPA was detected in >97% of the gestational and child urine samples.  With adjustment for confounders, each 10-fold increase in gestational BPA concentration was associated with more anxious and depressed behavior on standardized scales, along with poorer emotional control.  This was more true of girls than of boys.  The children's own urinary levels didn't make much difference in behavior, and for childhood exposure there was no difference between girls and boys.

There was another scary article about exposure of infants to breastfeeding moms replete with BPA that I can't find now, and this cute article from January in JAMA about nematodes and BPA.  I avoid gluten (for the most part) due to some skin effects and general creepiness, and I don't see why I should feel differently about estrogenic compounds leaching from plastics.

But no, I don't leap across the table and grab the Capri Sun out of my kid's hand at the birthday party either.  Nor will I add a machete to my list of standard kitchen tools so that I can make coconut milk from scratch.   I drink from a plastic-free water container at the gym and the next set of lunchbox containers will be metal… but life has to be lived.  And at least I can worry about these things affecting my children, rather than tuberculosis, mines, or revolution.

Saturday, December 10, 2011

Evolution and Anorexia Nervosa

There was a bit of a dust-up in the paleo and low carb blogosphere about some comments Gary Taubes apparently made about anorexia and insulin in an interview.  He noted that insulin was used as a therapy for anorexia, thus suggesting that (perhaps) anorexia, like obesity, is a disorder of fat metabolism. My suspicion is that Gary was using those studies as an example of how insulin could cause weight gain.   On the other hand, one doesn't need exogenous insulin to refeed anorexics  - the time-tested method is to keep those far gone enough to have medically dangerous symptoms (unstable blood pressure, dropping electrolytes, or super slow heart rate) under lock and key and get calories in whatever way possible (including via a tube inserted into the stomach.)

One of my attendings at Children's Hospital characterized anorexia as "a desperate disease."  Often purging and starvation are combined (though this combination would be more correctly called "eating disorder not otherwise specified" or "anorexia nervosa, binge-eating/purging type" than strict anorexia nervosa), and there were many cases of young teenagers hiding vomit and stool in places in their rooms to conceal purging and laxative use (not surprisingly, constipation is a symptom of anorexia).

Cowboy Junkies - Bea's Song (one of the better songs ever written - right click to open in new tab)

My evolutionary psychiatry interest has always been in how psychiatric disorders have changed over the past 100 years of rapidly changing lifestyle and diet.  Anorexia nervosa is one of those illnesses that was exceedingly rare until 50 years ago, then escalated rapidly, then leveled off in prevalence, though those who are affected now encompass more children and more men than ever before.  My educated guess is that only a small percentage of us are capable of starving ourselves outright without being under lock and key, and that vulnerable population shows symptoms earlier and earlier in life as societal pressures and the obesogenic environment increase.

A quote from my previous blog post linked above (the medical literature references can be found there):

All eating disorders remain relatively rare [though in total they are more common than schizophrenia and bipolar I disorder]. Anorexia afflicts about 0.5% of women and 0.1% of men. Bulimia affects around 1-3% of women (also 0.1% of men), and binge eating disorder 3.3% of women and 0.8% of men. Anorexia nervosa remains the most deadly of all psychiatric disorders, with a 5-10% death rate within 10 years of developing symptoms, and an 18-20% death rate within 20 years. Anorexia is endemic in the fashion industry, to the point where models are now being airbrushed to add curves. The model Isabelle Caro died at age 28 of anorexia, and Ana Carolina Reston of Brazil died at age 21, still modeling with a BMI of less than 14.
Photo of Isabelle Caro from Wikipedia
The current state-of-the-art treatment of anorexia begins with refeeding, mostly because we know that semi-starvation itself causes obsessions, depression, and fixation on food.   In the hospital, patients work closely with dietitians, trying to learn how to eat a healthy amount and to establish a better relationship with food.  While medicines that promote weight gain are sometimes prescribed, antidepressants and other agents are fairly useless in a state of starvation.

You can imagine the typical well-meaning, dietitian-designed diets for these sick young people.  It's the food pyramid with way too many grains, too little fat, and a focus on "healthy" rather than good old-fashioned farm-fresh food.  And while I don't really have any objections to a food-pyramid, Mediterranean-style whole foods diet (autoimmune issues with grains notwithstanding), I know that what happens in real life is not skipping breakfast, a light lunch, and a late supper of mussels, olive oil, roasted peppers, tapenade, and homemade sourdough bread, but rather three meals and two snacks a day, a version of Weight Watchers™ with Skinny Cow ice cream sandwiches, whole grain Rice-A-Roni, cans of beans, omega-6-laden commercial salad dressing, boneless skinless chicken breasts, and "lite" yogurt.

The problem with so many meals a day is that one has to think about food constantly.  I don't think that is the best way to recover from an eating disorder, though one would have to be careful with fasting as well.  I believe intermittent fasting is a valuable practice, a way to lower food reward and to ultimately establish a good relationship with food - I don't have to have it right now, but later would probably be fine too - however, fasting can trigger binges in those who are vulnerable.  It is not verboten in those of normal or excess weight, but should be undertaken with care and support.  In my mind, the healthiest diet is one that you don't have to think about all that much - poached eggs, a beef stew with some liver chunks you cook once and eat all week long.  Cold potatoes and butter.  Forgetting to eat every now and again.

M83  Midnight City (right click to open in new tab)

I believe Jamie sent me this recent paper, Role of the evolutionarily conserved starvation response in anorexia nervosa.  It is a fascinating piece, with an in-depth consideration of biology, evolution, and insulin.

The authors speculate that "AN [anorexia nervosa] may be caused by defects in the evolutionarily conserved response to food and nutrient shortage associated with reduced calorie intake."

Some more facts about eating disorders - in 10-20% of patients, the disorder is short-lived.  In 20-30% it is chronic and unremitting.  The most seriously affected are at greatest risk for hypothyroidism, loss of bone density, electrolyte disturbances, low blood cell counts, amenorrhea, suicide, and death.

In anorexia, the physiology of starvation is paramount.  Both brain and peripheral metabolism responses come into play, orchestrated by the brain and the endocrine system (I don't think obesity is far different - I see no reason that obesity would be regulated by fat tissue or the liver when the brain and endocrine system are doing their thing).

The goal of the starvation response is to conserve energy, delay growth, preserve ATP (by increasing the efficiency of energy metabolism), and minimize oxidative damage.  In starvation, changes in the hypothalamus result in a fall in blood insulin levels and a suppression of other anorexigenic factors.  Once ketosis occurs with the depletion of glycogen stores, there is an increase in output from the sympathetic nervous system and stimulation of food-seeking behaviors.  These multiple pathways explain why fasting can be healthy, but also stressful.

One of the major biochemical pathways activated is the IGF-1/FOXO response (an insulin-like growth factor 1 pathway).  So the authors of this paper postulate something a bit similar to Gary Taubes - anorexia arises when there is defective regulation in the starvation pathway, similar to how relative insulin deficiency (in the setting of insulin resistance) is a factor in type 2 diabetes.  Meaning there is a lot going on with respect to home life, environment, stress, and temperament in eating disorders, but only a select few have the genetic capability to deliberately starve themselves in response to the environment, and those few may have differences in the IGF-1/FOXO pathway.  The researchers were able to find yeast, fruit flies, worms, and mice with genetic defects in that pathway that tend to restrict food and develop more slowly (or, alternatively, eat more and spontaneously gain weight).

Evidence for genetic vulnerability to anorexia includes the fact that eating disorders are highly heritable.  (Uruguayan model Luisel Ramos and her sister both died from anorexia in recent years.)   In genome-wide linkage analyses of families with eating disorders, many components of the starvation response pathway are located in highly suspect genetic regions.  In practical terms, the increased emphasis on thinness and subsequent dieting brings out the self-reinforcing starvation response in the genetically vulnerable.  A single episode of excessive caloric restriction seems to bring about long-term changes in neurotransmitter production mediated by FOXO.

Thus caloric restriction and weight loss predispose to additional episodes of dieting, especially in susceptible individuals with defective regulation of their starvation response, or with a perseverative bias in behavior, reflected in obsessive thoughts and compulsivity.

How do these general ideas affect treatment?  Family therapy, distress tolerance, and cognitive behavioral therapy addressing distorted body image are cornerstones of therapy for eating disorders, along with refeeding.

Should we use insulin to treat anorexia?  Well, the reactive hypoglycemia and other risks are problematic.  A more sophisticated approach would be to use IGF-1 itself - it can increase appetite and reverse the bone loss seen in anorexia.  Long-term treatment, however, tends to result in hyperplasia of the lymphatic tissue, tumor promotion, and excess accumulation of body fat.

Better that we never begin dieting in the first place.  Skipping the processed foods and ensuring there are plenty of healthy fats and nutrients for the brain and muscles seems like the optimal and commonsensical approach in that regard.  I'm not sure what to do about the fashion industry...

Saturday, December 3, 2011

Beyond the Chemical Imbalance Part 2

Love the new song from the new album by the Black Keys:  Lonely Boy (from El Camino)   I know this guy's video will get slashed soon enough, but for now… enjoy!  The Black Keys said Lonely Boy is a departure from their typical style, as it is up-tempo.  I wish they would do more like this one!

Are you peppy yet?  You ought to be, because we are going to dive back into this paper (sent to me by Jamie some weeks ago):  Beyond the serotonin hypothesis: Mitochondria, inflammation, and neurodegeneration in major depression and affective spectrum disorders.

There's all this talk about pathogenesis, chickens, and eggs.  Well, I know where it ends.  We chase the trail back to the beginnings (is it abuse?  temperament? soda?  ho-hos?  winter? rancid vegetable oil?  bad reality TV?  the jury is still out).

But here is where it ends.  Ground zero.  Ratty neurons, smoking mitochondria, and brain damage.  Inflammation.

Inflammation is the term for the complex biological response of tissues to harmful stimuli, such as pathogens, damaged cells, or irritants.  Inflammation is a protective attempt by the organism to remove the injurious stimuli as well as initiate the healing process for the tissue.
I knew there was a reason I liked this paper so much.  Two sentences of real wisdom.   The paper continues on to talk about cell death, mitochondria, and the cell "executioners" called caspases.  They are cysteine proteases that bring it when a cell needs offing.  These are the cellular equivalent of the Necro-whatevers from The Chronicles of Riddick.   Overproduction of reactive oxygen species by shoddy, inefficiently acting mitochondria is a central feature of neuronal cell death.   The sticky wicket is that mitochondrial dysfunction and cell death lead to more inflammation, dysfunction, and cell death.

The presence of an inflammatory response in major depression… is evidenced by, amongst other things, increased plasma levels of pro-inflammatory cytokines and acute phase reactants, oxidative damage to red blood cell membranes and serum phospholipids, and lowered serum zinc.
Pro-inflammatory cytokines (such as interferon) can induce depression in 70% of people treated with such agents.  Elevations of cytokines have been reported in depression, anxiety, fibromyalgia, migraines, IBS, chronic fatigue syndrome, diabetes, autoimmune arthritis… of course, says any doctor.  The so-called "mitochondrial cocktail" can improve mitochondrial function after a few months and includes the following:  CoQ10, riboflavin, at least one additional antioxidant (vitamin C, vitamin E, or alpha lipoic acid), and l-carnitine or creatine.

Older school psychopharmacologists will try the following:

Tricyclic antidepressants - they act as classical mitochondrial uncouplers, hindering ATP synthesis and enhancing ATPase activity.  They tend to change how mitochondria function in a neuroprotective way.

SSRIs: some seem to be toxic to mitochondria at large doses, but protective at lower doses.  In animal models, all antidepressants attenuate inflammation-induced brain cytokine production and prevent the development of depression induced by high dose interferon.  In fact, antidepressants seem to have this effect regardless of mechanism (SSRI, tricyclic, lithium) - which is a major argument against the monoamine theory of depression.

Lithium: seems to enhance mitochondrial function in humans and rats.  Long-term use is even better than short-term.  Lithium is the favored medicine of the gray-haired psychopharmacologist.  Between the neuroprotective effects and the anti-suicide benefit, you might expect people to encourage putting lithium in the water.

Shock therapy:  Yes, it is still around.  Frankly, there is no faster treatment for depression and it works in refractory cases, thus is often a lifesaver.  It has terrible side effects, there's no denying it.  And it seems to increase the mitochondrial efficiency in rats.

Up to 50% of patients with major depression are unresponsive to medications… here is a poem from an old Egyptian papyrus (quoted in the anchor paper):

Disease has sneaked into me.  I feel my limbs heavy.  I no longer know my own body.  Should the master physician come to me… My heart is not revived by his medicines. 

Monday, November 28, 2011

Depression - Beyond the Chemical Imbalance (Part 1)

Today we go back to the basics of depression.  Borodin's Nocturne (right click to open in new tab).

I would say there are three main theories held by the general public about the causes of depression:

1) Bootstrap theory:  you are a lazy good-for-nothing who just needs to snap out of it and get up and get yourself better.

2) Trauma theory: too much stress, death, trauma, etc.

3) The chemical imbalance:  You have an SSRI deficiency and your serotonin needs to be regulated (see this memorable old Zoloft commercial).

Of course, I don't subscribe to any of these theories entirely, though each of them holds a kernel of truth - my belief, and one that is largely supported by the literature, is that stress and genetic susceptibility lead to depressive symptoms, which are mediated by inflammatory means in the brain.  And certainly if one is capable, getting up and out and exercising and eating right can be very helpful, but sometimes asking a depressed person to wake up early and exercise is like asking someone with a broken ankle to go for a run.  The frontal lobe isn't firing on all cylinders.  There's no motivation, no zazz.

The scientifically minded probably are most familiar with theory number three.  In medical terms, the "chemical imbalance" theory is called the "monoamine hypothesis" of depression.  The monoamine theory is (I would say) largely accepted by doctors of a certain age (even psychiatrists), but it holds about as much water as the carbohydrate-insulin theory of obesity.  Back in the day there was a medication for blood pressure called reserpine.  Among other things it depletes the brain of serotonin, and does indeed tend to cause depression (it is rarely used nowadays).

The first antidepressant, a drug used to treat tuberculosis, was found serendipitously.  One of its actions was to change the concentrations of monoamines (such as serotonin and norepinephrine) in the synapse between nerves.   And thus, the monoamine hypothesis of depression was born along with a billion dollar antidepressant industry.  All the antidepressants affect the monoamines one way or another, and they work… if you are lucky, often with side effects, and maybe they protect your brain during one episode of depression, but they don't seem to protect you from the next episode if you go off the medicine when you feel better (talk therapy when compared to medicines seems to have more long term benefit, not surprisingly).

Along the way, the monoamine theory picked up a bunch of other diseases (called the affective spectrum disorders), including major depressive disorder's anxious twin generalized anxiety disorder, migraines, irritable bowel syndrome, bipolar disorders, social phobia, PTSD, OCD, fibromyalgia, and chronic fatigue syndrome, among others (1).  All of these diseases have been shown to respond (somewhat) to three or more different classes of antidepressant medication.

Problem is, when you measure serotonin in depressed people, the levels are often all over the map.  In fact, low serotonin doesn't really correlate with depression very well at all, though low serotonin in the central nervous system does correlate with suicide, violence, and insomnia.   Brain researchers quickly figured out that the monoamine hypothesis has some pretty big holes, and the mechanism of antidepressants is not about increasing serotonin and other monoamines in the synapse but rather changing the efficiency with which monoamine signals are transmitted.

Instead, the current literature-supported theory of the brain pathology of depression and the other affective spectrum disorders leads us to two things going awry - the immune system (inflammation) and mitochondrial dysfunction.

How messy is the study of depression?  Consider these facts - if we look at the modern criteria, classic unipolar major depression is a smallish subset of the whole.  31-62% of people with depression have symptoms of "atypical depression" (leaden feelings in the arms and legs, increased appetite, and increased sleep, as opposed to the classic weight loss and insomnia).  64-72% of those with atypical depression meet criteria for bipolar spectrum disorders.  Depressive disorders are often comorbid with ADHD, anxiety, and substance abuse disorders.  You can see that if we try to study a group of patients meeting criteria for "major depressive disorder" who represent the typical clinical outpatient, we will get a mix of people with various complicating neuropsychiatric problems, and any studies of so-called "pure" major depressive disorder where other problems are excluded (which is typical for pharmaceutical studies) will not necessarily be generalizable to the actual population.

Add in frequent comorbid medical conditions, and you have a whole soup of pathology.  92% of depressed inpatients have pain, typically headaches or muscle aches.  Irritable bowel and migraines are often found, along with metabolic syndrome, pre-diabetes and diabetes, and obesity.

However, rather than being taken aback by the complexities, we can look to theories of mitochondrial dysfunction and inflammation, which can scoop up the entire variable pathology (which makes these theories very pleasing to me).

So let's start with mitochondria.  As we know, these are the energy factories of the cells, and their primary mission is to make the cellular equivalent of gasoline, ATP.  Problems with the mitochondria tend to show up as symptoms with the most energetically hungry cells of the muscle and nerves.   Nutritionally, CoQ10, carnitine, B-vitamin, and selenium deficiencies can also cause mitochondrial dysfunction directly.  Mitochondria desperately need these micronutrients to do their work efficiently.

Symptoms of mitochondrial dysfunction can be non-specific, but the cognitive symptoms are very similar to those found in depression, including impairments in attention, executive function, and memory.  Tellingly, in families with known genetic problems with mitochondria, the symptoms worsen around times of stress - overwork, fasting, over-exercise, and environmental temperature extremes.  Children with mitochondrial disorders are more likely to be depressed than control children, and adults with known mitochondrial disorders are more likely to have chronic fatigue, major depressive disorder, bipolar disorder, and panic disorder than the general population.

But all of that is the typical chicken and egg clinical stuff.  Maybe people with genetic mitochondrial problems have a lot of stress, and are thus more depressed.  What's the biochemical evidence for mitochondrial dysfunction in major depressive disorders (and bipolar disorder)?  Autopsies show all sorts of interesting problems with mitochondrial proteins, unusual mitochondrial DNA mutations, and poor mitochondrial complex activity (2).  ATP production rates and respiratory chain enzyme ratios seem to be decreased in the muscle of biopsied patients with major depressive disorder and pain.  In fact, several studies have shown that patients with a high degree of somatic complaints (typically muscle aches) have much lower ATP production than average in the muscle.  In chronic fatigue patients, some similar abnormalities have been found.

And finally, in some mouse models of mitochondrial dysfunction, the mice have bipolar-like symptoms and altered levels and turnover of the monoamines.  The researchers worked out that the mitochondrial dysfunction was the cause of the monoamine depletion, leading to the mouse mood disorders.

So mitochondrial problems (which can be brought about genetically, but also by micronutrient deficiencies) can cause oxidative stress, and eventually lead to nerve damage and psychiatric symptoms.  More on the specifics of this pathology and the role of inflammation in the next post.

Thursday, November 24, 2011

Tales of the Metabolically Deranged

Happy Thanksgiving!  I'm probably in too good of a mood to write this post properly, but I have a moment right now and must seize the opportunity, and I'm going to try to make this short but cogent.  A few days ago I noted that I don't agree with Mercola and Rosedale about their characterization (paraphrased, as it wasn't quite as black and white) of glucose as toxic and human beings as on a linear path to diabetes.  Whether or not anyone cares about my opinion as a psychiatrist is another question :-) Commenter js290 wrote the following:

You should read more carefully what Dr. Rosedale wrote in the link you supplied. Your characterization of it is entirely [sic] accurate.

I wrote that I read the articles a couple of times, and found them flabbergasting.  Js290 wrote back:

The way I read it, Dr. Rosedale offered the most generalized solution. The abstraction he makes is we simply define a gradient of metabolic derangement from 0% (healthy) to 100% deranged (diabetic). His argument seems to be simply, the diet that is therapeutic for fully metabolically deranged cannot be unhealthy for the metabolically healthy.
Analogously, it's similar to most of the paleo stance on gluten grains: just because it's tolerable doesn't make it optimal.
Given that you have written about brains function on ketones, that for the same number of carbon atoms, fatty acids produce more ATP than glucose, that the body is capable of producing all the glucose it needs, Dr. Rosedale's view is by far the most generalized and better abstraction from a health perspective. 
Why come up with many different models for different use cases when a single model will work? This is how evolution and natural selection does things: the best abstraction wins.
I think js290's comment captures in a nutshell exactly why I find these theories so puzzling.  I don't see why we should use sick people to tell us what is optimal for all people.  Nor do I know the definition of "metabolically deranged" - do we mean pre-diabetic and type II diabetic?  Obese?  Metabolic syndrome?

I disagree with the characterization that the "derangements" are linear.  My understanding of physiology is that those with healthy beta cell function can do very well on a wide variety of carbohydrate intakes, and the amount shouldn't matter that much for optimal health.  We are obligated to use glucose for fuel, and we have systems in place to deal with physiologic amounts of it.  I also don't think that post-prandial glucose "spikes" are particularly abnormal or dangerous unless they are very high and last a long time, and I don't think glucose or carbohydrates as a macronutrient class per se cause diabetes.  Once you get past a tipping point and start losing beta cells, however, hyperglycemia, insulin resistance, and increasing damage occur, and you have fewer options, dietarily speaking.  Even then, a hard-core ketotic zero-carber who never cheats may do just fine, but those who cheat are now (physiologically) even more insulin resistant than they would have been if they ate enough carbs to stay out of deep ketosis all the time… so glucose "spikes" and the area under the curve for glucose and insulin would be even higher than if more carbs were eaten on a regular basis.

In addition, since the liver will make a bunch of glucose via gluconeogenesis anyway, I don't see much harm in eating moderate amounts of glucose so our livers don't need to make it, unless you need to stay in deep ketosis for medical reasons. Six of one, half a dozen of the other.

And, of course, I think there are certain brain illnesses that could very well benefit from deep ketosis (for some of these conditions it is merely theoretical, for others there are case studies or even pilot data, for epilepsy there is a lot of data) - brain cancers, migraines, epilepsy, bipolar disorder, dementia, autism, and schizophrenia.

In general, I think it is reasonable for all individuals, including healthy ones, to engage in fasting and autophagy on an intermittent basis - I know that whether I personally eat high carb or low carb on a particular day, I wake up the next morning in ketosis.  That is metabolic flexibility, a positive sign of a healthy metabolism.

Well, that is my opinion.  Starchy root vegetables and fruit are good sources of nutrients and in general less expensive than good quality meat, though perhaps not less expensive or easier to store than good quality fat, calorie for calorie.  The nutrients in starch tend to be somewhat different from the ones in fat.  I find starches pleasurable to eat, personally, and my kids certainly prefer starch and fruit to just fat, green leafy vegetables, and meat.  Perhaps they would truly prefer mountains of candy and chocolate, and their preferences shouldn't guide our health speculations… but there it is.  I simply don't think there is enough evidence to suggest that zero carb diets are optimal for everyone for longevity and health, either from an experimental perspective or from a common sense, physiologic perspective.

Tuesday, November 22, 2011

Soda Begets Zombies

Okay, not likely.  But the sugary variety might well be causing depression in those vulnerable to fructose malabsorption.  Have a look at my previous post on the subject.

Today I have a mere observational study that adds to a pile of evidence that soda ain't the best thing in the world to be drinking, behaviorally speaking. "The Twinkie Defense: the relationship between carbonated non-diet soft drinks and violence perpetration among Boston high school students."

Here's an appropriate song (right-click to open in a new tab):  Kiss With a Fist

Some bad news about behavior and soda, associatively speaking:  In Norwegian adolescents, soda consumption correlated with poor mental health.  Among American college students, those who drank more soda were less likely to be social, less able to understand emotional cues, and more likely to favor individualism (is that bad?).  There are several reasons soda might cause problems - the sugar could lead to a low blood sugar "crash" which is associated with violence (as I discussed in this post).  In addition, soda is a pretty poor source of nutrition other than straight-up calories, so if it replaces more nutritious food in the diet, big soda-drinkers could end up with micronutrient deficiencies.  And yeah, micronutrient deficiencies could lead to more violence.  No one measured if anyone was a fructose malabsorber.

The experimental design of the Boston study was pretty simple: Boston public high school students were randomly selected and asked to answer a survey.  Those who reported drinking five or more cans of non-diet soda every week comprised 30% of the sample.  The researchers controlled for a bunch of covariates (alcohol, age, gender, race, sleep, smoking, and family dinners), though I can think of several million more.

Heavy soda drinkers had similar BMIs to less heavy soda drinkers, and were no more likely to get less than 6 hours of sleep.  White, Black, and Hispanic kids were all equally likely to be heavy soda drinkers, but Asian kids were significantly less likely to be quaffing 5 or more cans a week.

Heavy soda users were far more likely to smoke or drink alcohol, to carry a knife, and to have been violent with a sibling, a date, or another young person. When the sample was split into quartiles rather than halves, the violence link remained linear, suggesting a dose-response relationship.

And that's pretty much it.  It's a rather limited self-report study with some statistical crunching, so no causal relationship can be inferred; though there are some sensible physiologic explanations as to why soda could make you knife your sister, it isn't proven here.  Brain-eating was not examined.

More posts this week!  I need to answer js290 regarding the whole linear glucose thing, and I figured it would warrant a short post rather than a comment.  Jamie has sent me a few papers, and I pulled some reviews on inflammation, atopy, and behavior.

Saturday, November 19, 2011

Handel and the Biology of Allergy, Atopy, and Suicide

With respect to classical music, I'm a much bigger fan of the classical and romantic composers (Mozart, Beethoven, and all those Russian guys like Rimsky-Korsakov) than the earlier baroque composers (such as Bach and Handel).  Baroque is beautiful and often awe-inspiring, but too structured for my taste -- like a doily or a stained glass window in a church.  Back then gardens were arranged into strict geometric designs; we caged nature.  Then pastoralism came into vogue, and we got landscape architecture that was meant to be natural and pleasing rather than quite so civilized, and the music of the time reflected this aesthetic as well.  The structure and expectations of the baroque era seemed to squeeze the lyricism out of its music, though not the beauty.

But a baroque giant, Handel, wrote the Largo from Xerxes, a selection of music that seems to wrest poetry from rigidity.  It's an amazing piece, before its time.  A game-changer - who knew Handel could be passionate?  According to Wikipedia, Mozart said of Handel, "[He] understands affect better than any of us.  When he chooses, he strikes like a thunderbolt."  Right click to open in new tab.

We have our biology, our natures.  And yet we dare impose the structure of agriculture and pharmacology  upon it… with an incomplete understanding, at best, particularly in the brain.  Such incomplete understanding will lead seemingly learned folks to suggest that we are all diabetic.  (We are not).  And that glucose is not a toxin but foods that raise blood sugar levels are toxic (er, what?).  I probably should not stick my foot into this one, as I am not a metabolism blogger, but for heaven's sake.  Without glucose in the bloodstream we are dead.  With too much glucose we are poisoned.

Physiology is all about Goldilocks, the right amounts, and the case that all starches are toxic for the vast majority of people is a very, very poor one.  And if fasting glucose and overall levels of circulating glucose are important, the strict low-carber who cheats every once in a while might be in more trouble than the healthy starchy-carb eater who will be more exquisitely insulin-sensitive.  As Kurt Harris and Melissa McEwen have noted, there is no evolutionary precedent for lifelong very low carb diets, and plenty of examples of healthy cultures who eat starchy carbs.  So with our incomplete understanding of physiology, I feel it is a safer bet to use reasonable precedent rather than a zero-carb theory of an optimal longevity diet as a prescription for most people.  Certainly there are likely exceptions - seizures, dementia, the first stage of weight loss, and many folks with diabetes.  Personally I think calling starch a poison for the majority of people makes about as much sense as demonizing saturated fat.

So.  Back to allergies and suicide.  This part of the post is for me, really.  I need a spot to look back and see the nitty gritty stuff organized in a way that makes sense to me.  Actually, this is the point of the entire blog.  My self-taught fellowship in Evolutionary Psychiatry.

In the beginning, there were Th1 and Th2 helper cells.  These are soldiers of our immune system, called lymphocytes, with different capabilities, called into action in different circumstances.  The call to war comes in the form of the release of inflammatory cytokines.  There is a whole soup of these chemosignals, also released primarily by T lymphocytes, mostly named interleukins, which are shortened to IL-1, IL-2, IL-3, etc.

In general, the release of inflammatory cytokines can lead to reduced activity and social interaction, and it can activate the HPA axis, which can lead to supernormal responses to stress, which can further ramp up the inflammatory response.  Elevated pro-inflammatory cytokines can activate the IDO enzyme, reducing serotonin production (this may explain the link between allergy and suicide).  Certain types of Th2-related cytokines also increase insomnia.  Allergy, then, does not increase the risk of suicide merely by making people feel sick, but directly through inflammatory means, leading people to withdraw from social activities and over-react to stressful situations.

What are some more specific features?  Well, proinflammatory cytokines released at the time of the allergic reaction activate the HPA axis - the glucocorticoids and catecholamines cause a suppression of Th1 and a shift to Th2 activity by inhibiting IL-12 and promoting IL-10.  The pro-inflammatory cytokines also cause dysfunction of corticosteroid receptors.  The Th2 lymphocyte produces IL-4, which can affect serotonin metabolism as well - and IL-4 is known to have more of an effect in people with certain genes than in others.

En anglais, we are talking about a direct mechanism by which a certain kind of stress, allergy, can affect the neurochemicals in the brain and therefore behavior, and it seems that some people are genetically more vulnerable to this stress than others.  That said, other researchers have postulated that the Th1 cytokines impair the serotonin machinery even more than the Th2 ones… we're back to not too hot, not too cold, not too soft, not too hard.  It all has to be just right.  And what is just right?  Depends on your genes and your epigenetics.  Superimposing an emulation of the evolutionary milieu is only a first approximation and a good guess.

Thursday, November 17, 2011

Sniffles and Suicide

Let's revisit the basic premise, the hypothesis we are searching to disprove.  Humans are not broken, but we are not optimized for our current environment, whether it be modern stress, modern social structure, modern diet, modern sleep, or modern activity.  Perturbations from the Homo sapiens "norm" of paleolithic life (minus the few adaptations we have accumulated in recent years) lead to modern human disease, mental and physical.

In a broad stroke, pathology in the human body is mediated via abnormal immune response - thus autoimmunity and inflammation.  This piece of the theory has mounting evidence both for physical and mental health diseases.  Inflammation in the brain can lead to disturbances of human behavior - to anxiety, extreme depression, and even suicide.

I know.  That sort of title will make any Evolutionary Psychiatrist's ears prick up.

First, though, suicide.  With any human condition, sometimes the best place to look for clarity and understanding is Steinbeck, who won both the Pulitzer Prize and the Nobel Prize in Literature.  And, like many men of the 20th century, John Steinbeck died of heart disease in his 60s.

He wrote a sympathetic portrayal of a suicidal man in what I consider to be his best work, East of Eden:

On another sheet he wrote, "Dear Will, No matter what you yourself may think -- please help me now. For Mother's sake -- please.  I was killed by a horse -- thrown and kicked in the head -- Please!  Your brother, Tom."

...In his bedroom he broke open a new box of shells and put one of them in the cylinder of his well-oiled Smith and Wesson .38, and he set the loaded chamber one space to the left of the firing pin.  His horse standing sleepily near the fence came to his whistle and stood drowsing while he saddled up.

It was three o’clock in the morning when he dropped the letters in the post-office at King City and mounted and turned his horse south toward the unproductive hills of the old Hamilton place.

He was a gallant gentleman.

Certainly many suicides are planned -- often, if family members get letters or phone calls ahead of time, the suicide can be prevented.  Sometimes, though, suicidal urges come on in an unbearable wave, and if someone has access to lethal means (typically firearms, hanging, or jumping), the urge becomes deadly.  A very interesting examination of people who survived jumping from the Golden Gate Bridge showed that only 10% went on to complete suicide later.  Most of the time, the urge to kill oneself is impulsive.  Often there are biological markers - increased inflammation and low serotonin, among others.

Allergies, of course, are very common.  And if suicide is in part an inflammatory issue, then one would suspect that people inflamed with allergies are more likely to commit suicide.  There are, indeed, correlations between allergy and suicide.  And while I tend to think of spring and fall as suicide seasons (spring in particular)  due to rapid change of sunlight during those times, spring and fall are also allergy seasons, with spring being the peak of hospitalizations for those with severe allergy problems.  It turns out that the link between suicide and seasons (particularly the springtime) is stronger in those with allergies.

The authors of the allergy medicine and suicide paper had an interesting premise.  They examined the theorized connection between allergy and suicide in a very biological way.  They looked at data showing increased gene expression of cytokines that mediate the activity of a certain type of immune cell, Th2 (T-helper cells type 2), in the prefrontal cortex of postmortem suicide victims.  Then they looked at the main treatments for allergies/asthma - antihistamines (like Claritin) and intranasal steroids (like Rhinocort).  An antihistamine won't change Th2 helper cell activity.  A steroid will decrease it via direct means.  The authors went county by county in the US, comparing non-sedating antihistamine prescription data and inhaled corticosteroid data with reported suicides, antidepressant prescription data, availability of psychiatrists, urban vs. rural status, demographics, and inter-county vs. intra-county variation, and crunched numbers.  And man, they crunched numbers to a degree that is way beyond my ken.  They used logarithms and differential equations and basically took the huge amount of data and crunched the heck out of it.  I can't speak to the veracity of the crunchedness as I am no statistician.  It sounds reasonable, but any mathematical skeptic who wants to look at the paper and pull it apart, feel free to get in touch.

They found that antihistamine prescriptions (excluding sedating antihistamines such as Benadryl or Vistaril, which are often prescribed for sleep, not allergies) were positively correlated with suicide rates (p=.0001), and that intranasal corticosteroid prescription rates were inversely associated with suicide rates (p=.0004).  So if prescriptions for inhaled corticosteroids were to increase by 1%, the suicide rate should decrease by 0.16% (0.04 suicides per 100,000 people).  The use of decongestants was neutral with respect to suicide risk.
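Just to sanity-check the reported effect size, here's a back-of-envelope sketch.  The implied baseline suicide rate is my own inference from the two reported numbers, not a figure stated anywhere in the paper:

```python
# The reported effect: a 1% increase in intranasal corticosteroid
# prescriptions corresponds to a 0.16% relative decrease in the
# suicide rate, quantified as 0.04 suicides per 100,000 people.
relative_drop = 0.0016   # 0.16% expressed as a fraction
absolute_drop = 0.04     # suicides per 100,000 people

# Baseline rate jointly implied by the two figures (my inference):
implied_baseline = absolute_drop / relative_drop
print(f"implied baseline: {implied_baseline:.0f} suicides per 100,000")
# prints: implied baseline: 25 suicides per 100,000
```

If I've read the numbers correctly, the two figures are internally consistent with a baseline rate on the order of 25 per 100,000 in whatever units the county-level model used.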

The discussion in the paper is complex, noting that systemic steroids are known to cause problematic psychiatric side effects (which is certainly true) but that intranasal steroids (for the most part) seem to bypass this problem and have minimal systemic effects, merely decreasing inflammation in the nose, where it counts.  Some data suggesting that intranasal steroids do cause jitteriness, anxiety, agitation, insomnia, and depression in certain people makes the findings of this study even more interesting.  Could Th2 suppression in certain cases more than compensate for the negative psychological effects of nasal corticosteroids, at least with respect to suicide? 

In East of Eden, Tom knew what his mother felt about suicide:

[She] had a strong distaste for suicide, feeling that it combined three things of which she strongly disapproved -- bad manners, cowardice, and sin.

Inflammation is, indeed, unmannerly, but I think we go too far to call it cowardice or sin.  We should look closer, look further, and truly delineate this pathology.  Suicide is the 10th leading cause of death in the world, and the 11th in the United States.  Inflammation is the number one cause of death in the modern world.

Sunday, November 13, 2011

Is Postpartum Psychosis an Autoimmune Disease?

Here's an article for the "everything is connected" file.  Also for "yes, psychiatric disease has biologic underpinnings and is medical illness" file.  Also the "inflammation in the wrong place at the wrong time is super-bad" file.  And it may be of interest to anyone who has had symptoms of autoimmune disease helped by an anti-inflammatory (paleo-type) diet.

Postpartum psychosis is rare and scary.  About 1 in 1000 women become psychotic in the first months after having a baby (though anything up to 12 months after delivery is considered "postpartum," the greatest risk is in the first month).  The most typical presentation is one of manic psychosis, with prominent insomnia, irritability, and delusions of grandeur.  However, some women will instead be depressed, delusional, and suicidal, or even have delusions that lead them to kill their babies.

Not surprisingly, a prior diagnosis of bipolar disorder is the greatest risk factor for developing postpartum psychosis.  However, most women with postpartum psychosis have no history of psychiatric illness at all (1). Often the illness requires hospitalization, and though there are no "consensus treatment guidelines," in almost all cases benzodiazepines (sedative, anti-anxiety meds, such as lorazepam) are used to help stabilize sleep-wake cycles, and in most cases antipsychotics are also used, typically with good effect.  If those aren't helping, lithium is added.

Here's another bit of info about pregnancy.  The fetus is obviously genetically different from mom, so women develop a suppressed immune system during pregnancy, in order to protect the growing beastie from mom's antibodies and killer cells.  This is why I was told to studiously avoid unpasteurized cheese and raw eggs and deli meat during pregnancy, and why healthy women in the third trimester are much more likely to develop severe complications and die from the flu than women who are not pregnant.

It is well known that in women with a dysfunctional immune system (the autoimmune diseases, such as multiple sclerosis, rheumatoid arthritis, and autoimmune thyroiditis), the autoimmune symptoms are generally greatly ameliorated during pregnancy.  However, this time of relatively low autoimmune symptoms is followed in the post-partum period by a "rebound" with greatly increased symptoms and greater autoantibody titers measured in the serum.

So is post-partum psychosis a symptom of autoimmune disease?  Specifically autoimmune thyroid disease, as thyroid disease (both hyper- and hypothyroidism) is well known to cause psychiatric symptoms, even psychosis?

Well, those societies with socialized medicine were able to gather data in such a way as to start to give us an answer to that question.  In the Netherlands, all the women in a certain area of the country who developed post-partum psychosis and ended up in the hospital were checked for autoimmune thyroid antibodies and thyroid function upon admission to the hospital.  A larger control group of other post-partum women were also checked.  Critically, women who were medicated at admission (particularly with lithium) were excluded from the study, as lithium is known to depress thyroid function.  All women with a previous history of thyroid disease, bipolar disorder, schizophrenia, or psychosis were excluded.  That left a group of 29 women with new-onset post-partum psychosis and 117 controls.

Here is what the researchers found.  5% of post-partum women in the control group had measurable autoimmune thyroid autoantibodies at 4 weeks after delivery, a sign of autoimmune thyroid disease.  This is comparable to surveys of the general population of women in the Netherlands.  None of them had measurable abnormalities in thyroid function or any symptoms.  In contrast, 19% of the post-partum psychosis patients had measurable thyroid autoantibodies at admission (again, prior to receiving any lithium or antipsychotic medication), and half of those women also had measurable thyroid abnormalities.  In the following 9 months, 67% of the postpartum psychosis women with autoimmune thyroid antibodies went on to develop measurable thyroid problems (abnormal TSH or free thyroxine).  None of the control women did.  The odds ratios for these findings were all >2, some as large as 9, which is quite significant, especially considering the size of the sample.
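As an arithmetic aside (my own sketch, not a calculation from the paper): an approximate odds ratio for the antibody finding can be recovered from the two reported prevalences alone, treating the 19% and 5% figures as exact:

```python
# Odds ratio from two prevalences: odds(p) = p / (1 - p),
# OR = odds(p_cases) / odds(p_controls).
def odds(p):
    return p / (1.0 - p)

p_psychosis = 0.19  # antibody-positive among postpartum psychosis patients
p_controls = 0.05   # antibody-positive among postpartum controls

odds_ratio = odds(p_psychosis) / odds(p_controls)
print(round(odds_ratio, 2))  # prints 4.46 -- comfortably in the ">2" range
```

The actual paper's estimates would differ slightly (they come from the raw counts and adjusted models), but the ballpark matches the ">2, some as large as 9" summary.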

Even though patients with previous bipolar disorder were excluded from this study, the researchers note that the 19% prevalence of autoimmune thyroid antibodies in these psychotic women is similar to the prevalence in women with bipolar disorder (2).  And, to really get your noggins going, twin studies of bipolar disorder show that the presence of autoimmune thyroid antibodies is correlated not only with the illness itself, but with the genetic vulnerability to the illness (3).

The researchers in this study strongly recommended that all women with postpartum psychosis be monitored for thyroperoxidase antibodies and thyroid function abnormalities, and furthermore that all women at high risk for postpartum psychosis be monitored before and throughout pregnancy and the postpartum period.  Though this was a small observational study, the advice seems very reasonable.

And, as always, we find that "post-partum psychosis" like many psychiatric symptoms is the equivalent of a fever - signaling underlying abnormalities, but not always caused by the flu.  Sometimes fevers are caused by different bugs, or cancer, or autoimmune disease.  Differentiating the underlying pathology will go a long way to informing our treatments (and helping in prevention) in the future.

Psychiatrists and other doctors reading this article will be interested in one directed more to healthcare professionals about the same study at the MGH Center for Women's Mental Health blog by Ruta Nonacs, MD PhD.  Thanks to Dr. Trevisan for the link to the blog post!

Friday, November 11, 2011

Evolutionary Psychiatry and Bipolar Disorder

A couple of papers came to my attention this week that relate to what I consider "real" evolutionary psychiatry.  That is, what are some evolutionary reasons we might have genes that make us vulnerable to psychiatric disorders?  The "real" evolutionary psychology academics consider this discussion to be "Abnormal Psychology," and like all evolutionary psychology, it is somewhat controversial.  I actually don't think much about the disorders themselves in an evolutionary light… I tend to think in terms of the human body and gut and neurons working outside design specs, thus breaking down (that is "my" version of evolutionary psychiatry).  But the overlap between one ev psych and another deserves some scrutiny, because I certainly can make the case (and often do, to patients, particularly those with bipolar disorder, ADHD, and OCD) that there are elements of these disorders that are quite adaptive to certain situations, as long as the maladaptive parts don't derail everything.

Part of my major hypothesis is that the industrial/digital age and processed, low-nutrient food has brought out more severe and different phenotypes of underlying vulnerabilities, and has also brought out seemingly new illnesses ("atypical depression").  Thus a less destructive phenotype (a mildly paranoid schizotypal person, for example, who will be more creative than the average person, as compared to full-blown schizophrenia, which impairs functioning to a terrible extent) will persist in the population due to selective advantage for certain traits.

My only previous article on "real" evolutionary psychiatry can be found here: The Creative Advantage.

Well.  It is fitting that the first of the two papers today comes from Medical Hypotheses.  Melissa McEwen sent it to me - I think she likes that journal because it is crazy and it makes her laugh.  (Here is advice for those aspiring to be published there:  The purpose of Medical Hypotheses is to publish interesting theoretical papers. The journal will consider radical, speculative and non-mainstream scientific ideas provided they are coherently expressed.)

I have no publications myself, though I have assisted in some research efforts.  My mentor would like me to cobble something together from all the research I've done for the blog on affective disorders, or something.  I never did fancy myself much of an academic… thus never felt the motivation to write anything boring enough to be published in a journal ;-)… however, it would certainly add to my street cred.  I don't know that I can write the exotic papers for Medical Hypotheses, however (Melissa sent me a paper previously about some sort of reptile origin theory of illness?  I can't remember, but it was hysterical, and it must have slipped through part of the editorial process, as it was also mostly incoherent).  Back to evolutionary psychiatry!

Evolutionary origin of bipolar disorder - revised (EOBD-R) is the paper in question, by one Julia A. Sherman of Madison, Wisconsin.  A quick Google search brings up her original EOBD (sans R) from 2002.  And as much as I make light of Medical Hypotheses, I do have to admire the originality of the paper, while noting that it is wildly speculative and the basic premise is most likely incorrect.

Ms. Sherman connects the dots between circadian rhythm issues and vulnerability to bipolar disorder (manic episodes are known to peak in the springtime, when the light increases rapidly day by day in extreme northern or southern latitudes).  In a nutshell, the EOBD suggests that bipolar disorder developed as an adaptive trait in the northern temperate zones, which had more extreme winters during the ice age.  If you were hypomanic all spring and summer, you got a lot of stuff done, and then you could slow down and "hibernate" (be depressed) during the winter and not use much energy.  The "R" or revision in question brings in the interesting bit of Neanderthal DNA that all of us non-San Bushmen (or those not otherwise directly derived from Africa without side-stepping through Europe or Asia and coming up close and very personal with other hominids) seem to have.  Ms. Sherman thinks that bipolar disorder is a Neanderthal trait.

Hmmm.  She brings in the observations of a German psychiatrist from the early 20th century, E. Kretschmer, who noted that folks with bipolar disorder tended to be of a certain "pyknic" constitutional type: a "thick trunk, relatively short limbs, and a big head on a short, thick neck."  Apparently, several other researchers in the early 20th century, when they were really into measuring heads and body sizes and making silly claims based on the measurements, found that manic-depressive patients were more likely to be pyknic (endomorphs), and schizophrenics were more likely to be "leptosomic" (ectomorphs).  A much newer study in 2003 also confirmed this finding.  Sherman believes it is striking that Neanderthals are also described as having big bellies and short limbs.  Interesting.  In addition, people of African descent apparently have a lower incidence of bipolar disorder in some studies Sherman cites.  However, there is no reliable data among truly purely African populations such as the San.

There is more to the paper, but I hit the highlights.  I'm not entirely sure what to think.  There is no question that bipolar disorder has a seasonal component and that light and dark therapies can be useful.  Hypomania, with its extended bursts of drive, energy, and creativity, can also be very adaptive.  And one can imagine that being moderately depressed during a cold, dark winter might keep a small tribe of Neanderthals out of each other's hair, when there might have been nothing much to do anyway.  I think the pyknic bit is a little ridiculous - as we know, both schizophrenia and bipolar disorder are actually related to metabolic syndrome, even in people who have never been on medicines.  I also would bet a poodle that there are San Bushmen with some bipolar disorder out there, but symptoms might be attenuated by their latitude and hunter-gatherer lifestyle with plenty of socialization and exercise (though they do like those omega-6-filled mongongo nuts), according to my own wildly speculative theorizing…

The second paper is far less... imaginative, and was published in the much more staid Journal of Affective Disorders: Creativity and affective temperaments in non-clinical professional artists: An empirical psychometric investigation.  These researchers compared 152 undergraduates in art school (or other creative majors) with 152 undergraduates in majors predicted to lead to professions "mostly requiring the application of learned rules" (like accounting, I suspect).  The students were tested with several standard measures to detect subclinical and clinical manifestations of cyclothymia (a mild variation of bipolar disorder), along with other scales of general health and demographic data.  I don't think it will surprise anyone that the creative students were significantly more likely to score in the cyclothymic range on these scales.

An interesting quote from the paper:

Positive mood, and happiness in particular, was postulated to fuel creativity. Enhanced positive affect is a feature of both hypomania and mania, which are core symptoms of bipolar disorder and may predispose people within the manic-depression/bipolar disorder spectrum to creativity. By contrast, [others] postulated that depression might provide fresh insights which can be executed into the artistic oeuvre during the energized phases of cyclothymia.

So it appears that bipolar disorder is linked to creativity, which may have an adaptive advantage.  I don't think that is particularly controversial, but it is nice to see more studies on the subject.  Now, were Neanderthals more creative than Homo sapiens?  At least in the springtime?  That might help Julia Sherman's hypothesis!

Edited to add - a Neanderthal winter love song? By Primitive Radio Gods (right click to open in new tab).

Monday, November 7, 2011

Is Some Psychiatric Disease a Manifestation of Lyme Disease?

Hello there!  I’ve been offline at home for the past week, ever since the big snowstorm that came through the northeast in late October.  The wet, heavy snow landed on trees that still had their leaves, leading to incredible amounts of damage and fallen limbs.  That night as the snow fell, we listened to branches cracking and snapping and falling, and saw arcing electricity light up the sky as wires were torn down.  By morning hundreds of thousands of people were without power, including us — our wires were pulled off the side of our house by falling branches.  Thirty-foot-tall trees were snapped in half like toothpicks.  (That picture is of the end of our driveway)

On the Friday before the storm I received my order of 60 pounds of grassfed, high quality beef from Paidom Ranch in Texas.  On Sunday morning I filled 20 freezer bags with fresh snow and stuffed them in the powerless freezers and fridge, so the meat was still nice and frozen even after a few days without power.  With no heat in the house and daytime temps in the mid-fifties, it might have been just fine anyway.  Halloween was put off until the following Friday night, though the neighborhood is still a mess.  Power came back in two days, but our wires were still on the ground, caught in a turf war between the utility company, local electricians, and the town inspector.  Back in Texas where I’m from, the utility company is responsible up to the meter.  Here there is something of a gray area between the property line and the meter, which would be okay, except you are at the mercy of the utility company as to when the power goes on and off so you can repair the lines on your property.  That led to the ridiculous situation where, for several days, we had live wires ready to electrocute anyone unlucky enough to trespass, until the town inspector intervened and made the power company fix the problem.

 Well.  As I write this article, we are still offline at the house with respect to cable TV, landline phone, and internet (so-called bundled services are so much fun!).  I’m hoping it will be back in the next several days so I can publish — fortunately the smartphone works for email and the like.   

But enough of our tale of first-world woe.  The disease I am about to discuss has long been maligned because of the attention it received for affecting wealthy Connecticut kids.  However, I’ve seen many afflicted folks in my practice here in Massachusetts, and several family members and friends have been treated for it.  So I take Lyme disease, caused by Borrelia burgdorferi, very seriously.  Borrelia, along with several other nasty organisms (such as the microbes that cause babesiosis, ehrlichiosis, Rocky Mountain spotted fever, and other rickettsial infections), is spread by dirty tick bites.

(Quick and easy tick repellent for the spring and summer: 2 tbsp coconut oil, 1 tbsp aloe vera gel, 25 drops gardenia or rose oil, and 25 drops lavender oil.  The essential oils can be found at Whole Foods.  Stir and smear on ankles, wrists, forehead, nape of neck, belly, and anywhere else ticks are liable to climb in.  Last summer, prior to using this ointment, we found 6 ticks on the kids and husband (ticks and mosquitos do not like me - I am too sour); afterwards, no ticks whatsoever.  Thanks to @OldSalt72 for the link to the recipe.)

Lyme disease is endemic to the area in which I now live, and if transmitted, can become a multisystem inflammatory disease affecting the skin, joints, heart, and nervous system.  Borrelia burgdorferi is a gram-negative spirochete in the family Treponemataceae, like Treponema pallidum, the agent of syphilis, an infection rather famous for its neurological findings.

Syphilis and tuberculosis are known in medicine as the “great imitators,” meaning these infections can be mistaken for many other problems, from psychiatric disorders to heart disease.  Lyme disease is tricky in its own right, as serologic testing supposedly yields many false negatives.  If you have a suspected deer tick attached for more than 24-48 hours here in Massachusetts, you are better off getting an extended antibiotic regimen just in case (doxycycline for adults, amoxicillin for young kids), considering the nastiness of the illness if not treated early.

In a paper from 2002, Hajek et al. screened 926 Czech psychiatric patients for antibodies to Borrelia between 1995 and 1999, along with 884 healthy controls recruited from the general population at the same time.  36% of the psychiatric patients were positive in at least one antibody class for exposure to Borrelia, versus only 18% of the healthy controls.  That’s a rip-roaring difference, to put it in scientific terms.  Since the psychiatric group was on average older than the control group, the analysis was repeated with statistical age-matching, and the results were similar.
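Just to put a number on that difference, here is a quick back-of-envelope sketch.  The counts below are approximations reconstructed from the reported percentages, not the paper's raw data:

```python
# Rough back-of-envelope on the Hajek et al. figures.  The positive
# counts are reconstructed from the reported percentages (36% of 926,
# 18% of 884), so treat the result as approximate.
def odds_ratio(case_pos, case_n, control_pos, control_n):
    """Odds ratio for seropositivity, cases vs. controls, from a 2x2 table."""
    a, b = case_pos, case_n - case_pos          # cases: positive, negative
    c, d = control_pos, control_n - control_pos  # controls: positive, negative
    return (a / b) / (c / d)

psych_pos = round(0.36 * 926)  # ~333 seropositive psychiatric patients
ctrl_pos = round(0.18 * 884)   # ~159 seropositive healthy controls

or_est = odds_ratio(psych_pos, 926, ctrl_pos, 884)
print(f"odds ratio ≈ {or_est:.2f}")  # ≈ 2.56
```

In other words, the odds of Borrelia seropositivity were roughly two and a half times higher in the psychiatric group, even before the age adjustment.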

An older study in the US showed very low seropositivity in a group of psychiatric patients - however, the general population in the US has much less exposure to Lyme (around 1%, as opposed to a much higher percentage of folks in Europe), and the assay used to detect the antibodies was rather old-fashioned (the Western blot was the gold standard in 2002, with ELISA a close second, while the older US study used a fluorescent immunoassay).

The first immune response to Borrelia is IgM antibodies, which peak 4-6 weeks after infection, followed by the more specific IgG response, which peaks at 6 weeks but, like IgM, can persist for months to years.  Antibodies circulating in immune complexes are likely to signify ongoing disease activity.

If Borrelia burgdorferi is related to psychiatric problems, there are two main mechanisms that could be responsible.  First, patients vulnerable to psychiatric disease may simply be more susceptible to Lyme or its neurotoxic effects due to genetic or other factors.  Second, Borrelia may directly cause psychiatric symptoms, which is the more parsimonious explanation.  PCR analysis and other advanced methods ought to be employed in the future, and more population surveys run, to better clarify this phenomenon.

In short, Borrelia burgdorferi is one of those suspicious, long-term, nasty and sneaky pathogens that could contribute to inflammation and issues in the brain for years or decades.  It may well be one of the many inflammatory activators increasing our psychiatric diagnoses in recent times, along with diet, lack of sleep, lack of sunshine, increased stress, and lack of exercise.  The problem with Lyme is that many people who seem to have symptoms (even those with known tick bites and classic rashes) do not test positive.  But is it really a solution to give everyone a course of doxycycline?  This is my question with respect to any infection that may have neuropsychiatric symptoms - we need more data.  In the meantime, I like anti-inflammatory, nutrient-rich diets and plenty of sleep, sunshine, and stress reduction.

Sunday, November 6, 2011


Hi all! A snowstorm took out my home Internet service a week ago, and it is still not repaired. I have a post written for when service is restored, but in the meantime, be sure to check out the "map" at the upper right for some pertinent archives.

- Posted using BlogPress from my iPhone

Thursday, October 27, 2011

Brain Shrinkage and B Vitamins

As we discussed before, humans are rather unique among primates in that our brains appear to shrink as we age.  When I first posted the article, Steve Parker, M.D.* noted that there was a recent article in PLoS ONE about B vitamins decreasing brain shrinkage, so I thought I would investigate the matter further.  Recently I've been going through this year's issues of The Carlat Report** to get some CMEs, and they mentioned the B vitamins and that study too.

Snow Patrol:  Called Out in the Dark (right click to open in new tab)

But before we get to that study, let's look at an older one, from 2002:  Homocysteine and Brain Atrophy

(I KNOW.  Homocysteine.  That bad boy of the folate cycle.  Always hanging out when everyone wants him to be recycled and go home.  Possibly cleaving your disulfide bridges and leaving your arteries and cartilage all crispy and brittle.  Maybe just a sign that you aren't eating all your B vitamins, amino acids, and various co-factors that you need.)

Homocysteine (Hcy) has been implicated as a risk factor for vascular disease as well as brain atrophy.  There is evidence to implicate Hcy in increased oxidative stress, DNA damage, the triggering of apoptosis and excitotoxicity, all important mechanisms in neurodegeneration.  Hcy… also causes damage to the vessel wall… the high prevalence of hyperhomocysteinemia in the population and its easy treatability [via B vitamin supplementation - ED] make Hcy an interesting amino acid for future intervention studies in the prevention of degenerative brain disorders.

  So if we want to break it down, homocysteine is part of the one-carbon metabolism cycle.  B12 and folate are also an important part of this cycle, and deficiencies in these are clearly related to nerve damage and neural tube defects in infants.  Deficiencies in either of these vitamins will lead to an increase in homocysteine, which is also apparently directly neurotoxic.

The brain might be particularly vulnerable to higher levels of homocysteine because it lacks two major metabolic pathways for eliminating it (biochem nerds, hold on to your hats): remethylation and transsulfuration.

All right.  There's theory, and lots of it.  What about the observational evidence?  Well, a number of cross-sectional studies have examined the relationship between elevated homocysteine and brain atrophy.  In an Australian study of stroke patients, high homocysteine was related to increased brain atrophy, and the OPTIMA and Rotterdam studies replicated this finding in the healthy elderly.  Other observational studies have shown a correlation between higher homocysteine levels and Alzheimer's disease, to the point where homocysteine levels seemed able to predict the speed of progression of the disease.

If we look at specific cognitive impairment trials, higher homocysteine levels have been shown to correlate with poorer performance on a number of cognitive tests - story recall, spatial copying, etc.  In fact, homocysteine levels accounted for "7-8% of the variance in late-life cognitive ability."

In a prospective observational study (the Framingham heart study), higher homocysteine at baseline was related to an increased risk of developing Alzheimer's later on.  Several other smaller studies have repeated this finding.

Moving on to trials - B vitamins lower homocysteine levels.  Folic acid supplementation can lower homocysteine by 25%, and B12 by a further 7%.  Betaine is somewhat less effective.  These B vitamins have been tried in mild cognitive impairment and dementia; in small, open-label trials, supplementation (typically folate and B12) has been helpful on some tests of cognitive impairment and has lowered homocysteine levels…
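To see what those percentages mean in practice, here is some purely illustrative arithmetic.  The 15 micromol/L starting level is a made-up, mildly elevated value, and whether the "further 7%" from B12 applies to the original level or the already-lowered one is my assumption, so both readings are shown:

```python
# Illustrative arithmetic only.  The baseline level is hypothetical,
# and the two readings of "a further 7%" are both computed because the
# source does not specify which is meant.
baseline_hcy = 15.0  # micromol/L, an assumed mildly elevated level

after_folate = baseline_hcy * (1 - 0.25)         # 25% lowering from folic acid
sequential = after_folate * (1 - 0.07)           # 7% off the remainder
additive = baseline_hcy * (1 - 0.25 - 0.07)      # 32% off the baseline

print(f"{after_folate:.2f}, {sequential:.2f}, {additive:.2f}")  # 11.25, 10.46, 10.20
```

Either way, the combined supplements would bring a level of 15 down to roughly 10-10.5 micromol/L, comfortably below the >13 micromol/L threshold that matters in the trial discussed next.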

Fast forward to 2010, when this freely available study came out in PLoS ONE:  Homocysteine-Lowering by B Vitamins Slows the Rate of Accelerated Brain Atrophy in Mild Cognitive Impairment: A Randomized Controlled Trial

Sounds cool, right?  Well, it turns out the first author holds a patent for a folate or B vitamin something-or-other in the treatment of Alzheimer's, so keep that in mind.  But still cool.  168 folks with mild cognitive impairment were randomized to placebo or B vitamin supplementation (0.8 mg/d folic acid, 0.5 mg/d vitamin B12, and 20 mg/d vitamin B6).  Both groups underwent baseline and serial follow-up MRIs and cognitive testing.

In the end, B vitamin supplementation seemed to slow brain atrophy compared to the control group.  The treatment response was related to baseline homocysteine levels: the rate of atrophy in patients with a baseline homocysteine level of >13 micromol/L was 53% lower in the active treatment group than in the placebo group.  A faster rate of atrophy was associated with lower cognitive testing scores.
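To make that 53% concrete, here is a tiny illustration.  The placebo atrophy rate below is an assumed number for the sake of the example, not a figure from the paper; only the 53% relative reduction is from the study:

```python
# Illustrative only: the 1.0%/yr placebo atrophy rate is hypothetical.
# The 53% relative reduction is the figure reported for the
# high-homocysteine (>13 micromol/L) subgroup.
placebo_rate = 1.0                        # assumed % brain volume lost per year
treated_rate = placebo_rate * (1 - 0.53)  # 53% lower in the treated group

print(f"treated atrophy ≈ {treated_rate:.2f} %/yr")  # 0.47 %/yr
```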

So, pretty interesting.  And certainly treating our elders with B vitamin supplementation seems to have few downsides.  A lifetime of offal eating might leave us well-served in that regard.

* Steve Parker MD is the go-to person for patients leery of my wild and woolly evolutionary approach who are looking for something a bit more… conservative.  He has a great set of evidence-based blogs and books on the Mediterranean Diet, with some ketogenic options, and has recently started a Paleo Diabetic blog as well.

** The Carlat Psychiatry Report is my go-to source for unbiased, evidence-based round-ups of everything in psychopharmacology and beyond.