Can We Stop Becoming More and More Sedentary?

The past several decades have seen us becoming a country of sitters. We spend less and less time in physical activity because, on the one hand, we don't have to and, on the other, we don't want to. Getting to work, to shops, supermarkets, social and religious events, movies, concerts, or restaurants is rarely accomplished by walking or biking to the destination. This is changing in urban areas where bike lanes are gradually replacing traffic or parking lanes. Even so, bike riders are very much in the minority, and even more so when inclement weather makes riding uncomfortable or dangerous. Walking is still a preferred mode of transporting oneself in cities like New York, where the pedestrian often arrives sooner than a car does because of traffic congestion. But many cities, and certainly suburban and rural areas, are too spread out or lack the sidewalks to make walking to work or the supermarket possible. And then there is the matter of time. A few weeks ago, I decided to walk to a supermarket located about 2 ½ miles from where I live. It was a beautiful spring Sunday and the walk was in lieu of a visit to the gym. The five-mile round trip took a good part of the morning and, after carrying a knapsack heavy with groceries back home, I am disinclined to repeat the experience.

We also don't move enough because work or school necessitates sitting at a desk in a meeting or lecture, in a library, or in an office seeing clients or patients. To be sure, some occupations require physical activity, such as running after toddlers in a daycare center or hammering sheetrock on a construction site. But many occupations now require less physical activity than in the past. Our mail carrier uses a small van to deliver the mail; several years ago he would have walked. We no longer have to walk to a bank to deposit a check or withdraw money. The cell phone takes care of cashless payments, and our groceries, along with everything else we need, can be delivered to our door. Devices that send signals remotely, like the television remote and the more sophisticated smartphone, have further reduced our need and desire to move. Why get up to turn off a light if your phone will do it? Why sweep the floor if your cute robotic device takes care of the dirt?

Of course, our almost constant use of the cell phone has also reduced our physical activity. In a nearby park, people sit on benches hunched over their phones rather than walking, and in my gym they sit on workout benches checking messages rather than lifting weights.

Thus it is not surprising that our population is now even more sedentary than it was ten years ago. A report in last week's JAMA (April 23) analyzed sedentary behavior or, more simply, hours spent sitting, in almost 52,000 participants who took part in the National Health and Nutrition Examination Survey (NHANES). Three age groups were involved: children aged 5-11, adolescents aged 12-19, and adults 20 years and older. People were asked how much time they spent each day sitting at work, with friends, commuting, reading, playing cards, watching television, or using a computer.

About two-thirds of the participants in each age group spent at least two hours a day watching television. The survey did not account for people who might binge-watch all the episodes of a particular program for hours. About half the people surveyed sat in front of their home computer for an hour or more each day, and this time was in addition to the time spent at a computer at work or at school. Moreover, the amount of sitting time may have been underestimated because the survey did not look at time people spent with their cellphones and tablets at home, in coffee shops, or while traveling. When all the sitting time was added up, the researchers found that as a country we are now sitting about eight hours a day, compared with about seven hours ten years ago.

The report describes such stark consequences of sedentary behavior that the reader feels compelled to stand up and walk around while reading about the increased risk of obesity, cardiovascular disease, cancer, diabetes and overall mortality.  However, the authors offer no specific countermeasures to decrease our sedentary behavior.

Since increasing the time we spend not sitting should have positive effects on our health, it is surprising that so little has been done to accomplish this.  Apps will monitor our activity and may increase our motivation to move more, but in a passive way. There is no app that acts as one’s mother to say, “Turn off the computer and go outside and play.”

It should be possible to program computers, tablets, phones and even television sets to make us move. If cars brake when we are too inattentive to do so ourselves, and keep us from drifting from one lane to another, our devices should be able to make us stand up, walk, stretch and maybe do some exercises. My computer shuts down to install an update even when I don't want it to. What if my computer or tablet shut down when it detected 50 minutes of inertia and wouldn't go back on until I moved? My cell phone tells me how much screen time I use, but why not have it tell me to stop bending over the screen, stand up straight, and go for a walk? We all get fidgety watching televised advertisements for drugs that will allow us to float through a field of butterflies with our partner, or scenes of cars driving through deserts or up mountainsides. What if we could program our television to substitute a virtual reality show that gets us moving through that grassy field, or hiking up a mountain, for five minutes?
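No such feature exists as described here, but the basic logic is simple. Below is a minimal sketch, in Python, of the 50-minute idea; it assumes the user is sitting whenever the script is running, since detecting real inactivity would require idle-time data from the operating system or a wearable, and the thresholds are the hypothetical ones mentioned above.

```python
import time

SIT_LIMIT_MINUTES = 50   # the 50-minute sitting threshold suggested above
BREAK_MINUTES = 5        # assumed length of the movement break

def movement_reminder():
    """Nag the user to stand up and move after a stretch of sitting.

    This sketch simply assumes the user is sitting the whole time the
    script runs; a real version would read idle-time or step-count data.
    """
    while True:
        time.sleep(SIT_LIMIT_MINUTES * 60)
        print("You've been sitting for about 50 minutes. "
              "Stand up, stretch, and walk for a few minutes.")
        time.sleep(BREAK_MINUTES * 60)  # allow time for the break before the next cycle

if __name__ == "__main__":
    movement_reminder()
```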

Technology has made us sit too much. Now is the time for technology to get us to move.

“Trends in Sedentary Behavior Among the US Population, 2001-2016,” Yang L, Cao C, Kantor E, et al., JAMA 2019; 321: 1587-1597.

Excess Skin After Major Weight Loss: Might Removing It Prevent Weight Gain?

The financial officer of an organization to which I belong decided to have bariatric surgery. Bob (not his real name) needed to lose about 200 pounds, and the operation, called the gastric sleeve, narrowed his stomach and decreased the production of ghrelin, a hormone that increases hunger. He lost about 190 pounds, significantly improved his food choices, and now exercises several times a week. But despite his success, and with it his improved health and energy, he told me that he was unhappy. “I had an image of myself as a thin person, which motivated me to always stick to the diet and work out. But now that I have lost all this weight, I feel encased in a suit of loose skin. I have to force myself to go to the gym because I think everyone is staring at the skin hanging from my arms and sagging down my thighs. I have to buy clothing in a size too large. My loose flesh prevents me from getting my arms into the sleeves of my jacket and zipping up my pants unless my clothes are baggy.”

Bob's problem is not unique. Many patients who undergo bariatric surgery and succeed in losing very large amounts of weight are confronted with bodies distorted by excess skin. This is not a problem for those who lose much smaller amounts of weight: their skin, after being stretched, regains its elasticity and returns to its normal shape, as it does, for example, after pregnancy. If large amounts of weight are lost very slowly, the skin sometimes regains its original shape as well, although this is less likely to occur in an older individual.

Surgery to promote rapid and massive weight loss, or extreme dieting and exercise as seen in the television program “The Biggest Loser,” can leave pounds of skin behind. Those of us who have not gone through the massive gain and then loss of weight might view the problem as merely cosmetic and a small price to pay for the weight loss, but it is not simply cosmetic. (“Surgical solutions to the problem of massive weight loss,” Spector J, Levine S, and Karp N, World J Gastroenterol 2006; 12: 6602-6607.) In their article describing surgical solutions to help the newly thin deal with their excess skin, Spector and his co-authors point out that patients who have large amounts of skin draped over their limbs and torso may be in chronic pain, and that the skin can easily become infected. Giordano reiterates their views in another article (“Removal of excess skin after massive weight loss: challenges and solutions,” Open Access Surgery 2015; 8: 51-60) and adds that physical impairment, including difficulty exercising or even walking, and low self-esteem are among the other problems caused by the excess skin. Moreover, dieting and exercise are unable to bring the skin back to its original elasticity.

There is a solution. It is called body contouring, a plastic surgery that removes the skin and, by doing so, reveals the body shaped by the weight loss. Bob underwent several plastic surgical procedures over a period of many months, but the result, a body that finally revealed its nearly 200-pound weight loss, was attained only at considerable cost in pain and money. He had to take time off from work, required a brief hospitalization for one procedure and, in his words, “I won’t be taking a vacation for decades to pay for everything.” He justified going through this in part because he believed his professional appearance would improve if he were able to wear clothes sized for his weight rather than for his excess skin. But he admitted another, more personal reason: “I was afraid that I would gain back the weight because I was so disappointed in how I felt and looked. In fact, my body was so distorted that I think I looked worse than when I was obese.”

The failure of patients undergoing bariatric surgery to maintain their weight loss beyond one year post-operatively has been reported. (“Long-term Metabolic Effects of Laparoscopic Sleeve Gastrectomy,” Golomb I, Ben David M, Glass A, et al., JAMA Surg 2015; 150: 1051-1057.) According to the Golomb et al. report, a significant amount of weight is regained relatively early, i.e., within the first few years, and many of the patients did not lose enough weight to reach their goals before they started to gain again.

However, for those who, like Bob, did reach their weight-loss goal, would having body-contouring surgery support their efforts to maintain their weight loss? There is no answer. The way to provide one would be to carry out a study comparing the weight maintenance of patients whose excess skin is removed with that of patients who do not get body-contouring surgery. Both groups would receive the same nutritional counseling, personal training, and psychological help, so the only difference between the groups would be the removal of excess skin. Of course, the problem with such a study is that the results may show a positive effect of skin removal on weight maintenance. And then what? The cost of such operations is almost prohibitive for most people and rarely covered by health insurance.

But perhaps this will change. Bariatric surgery is paid for by many insurance plans because studies have shown that the medical costs of obesity are much higher in the long run than the cost of the surgery. If body contouring is shown to have a significant effect on preventing weight gain after bariatric surgery, then perhaps this too will be covered by health insurance.

The better solution, of course, is to prevent the excessive weight gain necessitating the surgery.

The Silent Cause of Tiredness

Too often the response to the question “How are you?” is, “Tired.” A list of reasons justifying the fatigue usually follows: working hard and late, a household of children and/or visitors, too many outside commitments with deadlines, school papers and exams, inadequate sleep, recovering from a cold, and, of course, stress. The list could go on. Missing from this list, however, is a silent but potent cause of tiredness: iron deficiency anemia.  Iron is needed by the body to make hemoglobin, the constituent of red blood cells that transports oxygen from the lungs through the blood and delivers it to the cells. If, over a period of time, too little iron is consumed to make hemoglobin in amounts necessary to meet the needs of the body, iron deficiency anemia results.

Extreme fatigue is one of the symptoms of iron deficiency anemia, along with decreased stamina, increased vulnerability to infections, sensitivity to cold, increased heart rate, and dizziness. Pale skin is also a symptom but, as with so many of these signs, especially fatigue, other explanations can easily be summoned. Many of us assume that we are suffering from some as-yet-unidentified virus if we feel dizzy or out of breath climbing stairs. And, for many people, being pale in the winter is hardly considered unusual. We often respond to our tiredness by eating. “Maybe if I eat a snack, I will feel more energetic,” we tell ourselves as we reach for a cookie or bag of chips. We are unlikely to consider that our fatigue may be caused by an insufficient amount of iron in our diets. Unnoticed and unchecked, the depletion of iron stores continues to cause persistent fatigue that does not respond to more sleep or to getting over a viral infection.

The National Institutes of Health Office of Dietary Supplements recommends 8 mg of iron daily for men and for women past their childbearing years, and 18 mg for premenopausal women. The larger requirement for women of childbearing age is based on monthly blood loss from menstruation. Blood losses from medical conditions may also decrease iron stores: I had a neighbor who had a silent bleeding ulcer for months and was found to be severely anemic.

Iron deficiency anemia is not uncommon. (“Iron Deficiency Anemia,” Killip S, Bennett J, Chambers M, Am Fam Physician 2007; 75: 671-678.) According to this publication in American Family Physician, “The prevalence of iron deficiency anemia is 2 percent in adult men, 9 to 12 percent in non-Hispanic white women, and nearly 20 percent in black and Mexican-American women.” The trend toward intermittent fasting and cleanse diets may increase these numbers, as even a one- or two-day fast has been shown to rapidly deplete iron. (“Effect of short-term food restriction on iron metabolism, relative well-being and depression in healthy women,” Wojciak R, Eat Weight Disord 2014; 19: 321-327.)

Obtaining the necessary amount of iron from the diet is not as easy as, for example, getting enough vitamin C. Although many foods contain iron, not all the iron in a food gets into the body. There are two types of iron: heme iron and non-heme iron. Heme iron comes from animal sources and is considered more “bioavailable” than non-heme iron, meaning that it is more readily absorbed into the body from the intestinal tract.

Liver is a good source of heme iron, but this food is not universally enjoyed (except, perhaps, by cats). Lean meat and seafood, especially octopus, are also good sources, although the latter is not particularly popular either. Indeed, for most non-vegetarians, as well as vegetarians and vegans, more of our iron comes from plant sources than from animal foods. According to the Office of Dietary Supplements, about half of the iron we eat comes from fortified bread, cereal, and other grains. In fact, cereal is a good source of iron: one cup of bran flakes contains 4.5 mg of iron, which is about half the amount men and postmenopausal women need each day. Avoiding grain products means that the vegetarian or vegan eater must depend on obtaining iron from vegetables, lentils, dried beans, soy products like tofu, and nuts and seeds. The amount of iron in unfortified plant foods is low, so large quantities must be eaten each day to meet iron requirements, especially for women of childbearing age. Moreover, there is often a misperception of how much iron is in the foods we think of as good sources of this mineral.

“I eat plenty of spinach and nuts,” a friend will say, “so I am not worried about getting enough iron even if I try to avoid eating meat.” But an entire cup of cooked spinach (which is a large amount raw, since it shrinks when cooked) has only 6 mg of iron. A cup of cashew nuts has 4 mg, along with lots of calories. Two large eggs have less than 2 mg of iron, and one would have to eat an entire cup of hummus to get 5 mg.
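To make the arithmetic concrete, here is a minimal sketch, in Python, that adds up the iron in a hypothetical day's servings using the figures quoted above and compares the total with the NIH targets. The food amounts are the ones mentioned in this and the preceding paragraph; the sketch deliberately ignores bioavailability, which is discussed next.

```python
# Approximate iron content (mg) for the servings mentioned in the text.
IRON_MG = {
    "1 cup cooked spinach": 6.0,
    "1 cup cashew nuts": 4.0,
    "2 large eggs": 2.0,              # the text says "less than 2 mg"
    "1 cup hummus": 5.0,
    "1 cup fortified bran flakes": 4.5,
}

# NIH Office of Dietary Supplements daily targets cited earlier.
TARGETS_MG = {
    "men and postmenopausal women": 8.0,
    "premenopausal women": 18.0,
}

def iron_for_day(servings):
    """Total the iron in a day's servings and show it as a share of each target."""
    total = sum(IRON_MG[item] for item in servings)
    print(f"Total iron: {total:.1f} mg")
    for group, target in TARGETS_MG.items():
        print(f"  {total / target:.0%} of the daily target for {group}")

# A hypothetical meat-free day built only from the foods above.
iron_for_day(["1 cup fortified bran flakes", "1 cup cooked spinach", "1 cup hummus"])
```

Even this deliberately iron-heavy, meat-free day comfortably exceeds the 8 mg target but falls short of the 18 mg target for premenopausal women, and that is before any allowance is made for the lower absorption of plant iron described below.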

Iron in plant foods is also less “bioavailable” than the iron in animal foods. Phytates and other substances in plant foods grab hold of the iron and prevent much of it from being absorbed into the body from the intestinal tract. In fact, studies on the iron status of vegetarians have shown that they tend to have lower iron stores than non-vegetarians. (“The effect of vegetarian diets on iron status in adults: A systematic review and meta-analysis,” Haider L, Schwingshackl L, Hoffmann G, Ekmekcioglu C, Crit Rev Food Sci Nutr 2018; 58(8): 1359-1374.)

Fortunately, eating foods that are high in vitamin C counteracts the phytates' ability to keep iron from entering the body. Eating a vitamin C-rich food such as citrus fruit or juice, strawberries, broccoli, cauliflower, Brussels sprouts, or peppers (including chili peppers) together with an iron-containing food like oatmeal or tofu significantly increases the absorption of the iron, especially for people with low iron reserves.

However, if blood tests show that iron deficiency or iron deficiency anemia is present, it may be necessary to take an iron supplement, and doing so should be under the care of a physician. For many, this may be an easier solution than eating chopped liver or grilled octopus. Once the problem is resolved and iron stores are back to normal, fatigue and the other symptoms of the anemia should disappear.

Eating Late: Will It Make Us Gain Weight?

Is it true that the timing of our meals may influence our weight? For years, some nutritionists and diet consultants have told us “…not to eat dinner later than 6 pm,” or “…if you eat late at night you will gain a pound while you sleep,” or “…it is better to eat most of your calories early in the day.” Now that daylight saving time has arrived, we may find ourselves eating dinner much later than we did a few months ago, when it was dark by 5:30 or even earlier. Indeed, as the hours of daylight extend into the evening and the weather becomes benign, dinner may be pushed back even further as we grow reluctant to go inside and settle down for the evening. If the timing of our meals does make a difference, might this have an impact on our weight? Should we stick to eating dinner no later than 7 pm because ignoring this time limit means we will gain weight?

Compelling evidence supports the idea that the timing of meals may affect weight. A large study examining meal times among Seventh-Day Adventist church members in the United States and Canada suggests that we should consider rearranging our meal schedule. Researchers looked at the food records of 50,660 adult Seventh-Day Adventists and at their BMI (body mass index), a measure of weight status. Would there be a relationship between weight and the number of meals consumed, the timing of the major and smaller meals, and which meals were usually skipped? Their results might make one reconsider when to eat.
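For readers who have never calculated it, BMI is simply weight adjusted for height: weight in kilograms divided by the square of height in meters. A minimal sketch of the calculation, with arbitrary example numbers:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

# Example: 82 kg at 1.75 m gives a BMI of about 26.8, which falls in the
# conventional "overweight" range of 25-29.9 (obesity begins at 30).
print(round(bmi(82, 1.75), 1))
```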

People who ate breakfast had lower BMIs than those who habitually skipped this meal. Moreover, people who made breakfast their major meal of the day, rather than lunch or especially dinner, had a significantly lower weight than those who ate their largest meal at dinner. Eating a bigger lunch than dinner also produced lower body weight, although the differences were not as striking as between those who made breakfast their main meal of the day and those who ate their largest meal at night.  Snacks were counted as meals and, no surprise, people who ate more than three meals a day were in the heaviest category.

Breakfast consumption has also been linked to weight loss in a study in which dieting subjects ate most of their calories at breakfast or at dinner. Both groups ate the same number of calories but those who ate most of their calories at breakfast lost significantly more weight than the other group.

These results suggest that populations that traditionally eat tiny breakfasts and large evening meals might have a high rate of obesity. In two such countries, Spain and Argentina, breakfast is often only coffee and perhaps a roll or pastry, and dinner usually begins, at least in restaurants, no earlier than 10:30 pm. Yet despite their late dining and inadequate breakfasts, the prevalence of obesity doesn't even come close to what we have in the States, where we finish our dinners before they have picked up their forks to begin theirs. The prevalence of obesity in both Spain and Argentina is around 14%.

In contrast, one out of every four Americans is obese. Moreover, articles lamenting the increase in the numbers of overweight and obese individuals in these countries do not mention the lateness of the dinner hour, but instead focus on the same factors that are responsible in part for our rise in obesity: too many high calorie snacks, too little exercise, too much watching television, too little consumption of fruits and vegetables and too much fast food. Sound familiar?

Nevertheless, can we disregard the studies indicating that consuming the majority of our calories before sunset might help us in the obesity battle? Should we stop having people over for dinner or celebratory occasions involving food in the evening, and switch to brunch or breakfast instead? Should lunch be the default main meal and dinner limited to soup and a salad, or yogurt and fruit?

One problem with transferring information from studies with compelling results such as the one with the Seventh-Day Adventists is that life gets in the way of implementation. Early mornings, filled as they are with getting breakfast for the family, walking the dog, long commutes, getting the kids to daycare or school, and the myriad obligations that arise between waking up and being at work seem incompatible with preparing and consuming a large meal. Moreover, lunch, the other opportunity to eat the major meal of the day, is rarely a complete meal. Do people go home for a hot meal at lunchtime anymore? Most of us content ourselves with a salad or sandwich and consider ourselves lucky if we can eat it at a table rather than at our desk or sitting on a curb near a construction site.

Perhaps the real problem is being too hungry at dinner. If breakfast and lunch are skipped or skimpy, late afternoon-early evening hunger hijacks our control over eating while preparing dinner, at the meal itself, and afterward. We may justify our grabbing and gobbling because we have eaten so little earlier in the day. And we munch on cookies or ice cream after dinner because “they couldn’t have any more calories than the breakfast or lunch we skipped.”

It is unlikely that breakfast will become the new dinner, regardless of research on its impact on weight. But we should not minimize the importance of this meal as well as lunch in controlling our hunger late in the day. It really might work.

Antidepressants: The Hidden Contributor to Obesity

Years after weight gain was recognized as a side effect of antidepressant therapy, researchers have presented evidence of its contribution to the increase in obesity. For those patients who for years have described the devastating effect antidepressants have had on their weight, it is a 'told you so' moment. Last spring the British Medical Journal published a report by Rafael Gafoor, Helen Booth, and Martin Gulliford documenting the significant weight gain experienced by patients in Britain on a variety of antidepressants, compared with the general population. Using electronic medical records, they tracked the weight status of 53,000 British patients who had been prescribed antidepressants over ten years and compared their weight with that of a similarly large group of untreated individuals. Both groups gained weight, but a significantly larger number of those in the antidepressant-treated group did so. Moreover, weight gain did not stop after the first year of treatment but, according to their findings, continued, on average, for six more years. The drug that caused the most weight gain was mirtazapine (Remeron).

Moreover, weight gain as a side effect of antidepressant treatment was not confined to those who were overweight or obese at the start of their therapy; it included patients who were of normal weight prior to treatment. The authors conclude that the contribution of antidepressant drugs to the increase in obesity in the UK has been overlooked and that these drugs should be considered a major risk factor. Their assessment of the impact of antidepressant therapy on generating obesity applies equally to the USA where, as in the UK, it has been almost entirely ignored as a risk factor.

That antidepressants and related drugs used for bipolar disorder and other mental disorders cause weight gain is well known to patients and their mental health providers. Several years ago, my associates and I were asked to develop a weight-maintenance center at a Harvard-associated psychiatric hospital to help patients lose the weight they had gained (or were gaining) on psychotropic drugs. What was so striking about our clients was that, unlike those who have struggled with weight gain all their lives, they rarely had a problem with their weight prior to treatment: eating a healthy diet and exercising characterized their lifestyle, and few had ever needed to be on a diet.

Because the data for the BMJ report were derived from electronic records, no information about alterations in food choice brought on by drug treatment was reported. However, several papers (as cited in the reviews below) have pointed to an increase in carbohydrate intake and an absence of satiety associated with antidepressant use.

Those attending our clinic complained of an almost irresistible need to snack frequently on sweet or starchy foods, and some (usually those on mood stabilizers) reported eating a second meal an hour or so after the first because they did not feel full. A professor of psychiatry at a Boston-area hospital shared the experience of a patient on Remeron who woke up every night to eat boxes of crackers and cookies.

The BMJ report did not offer information on whether weight was lost after withdrawal from antidepressants; presumably, once the psychotropic drug is no longer in the body, appetite should return to normal. There have been reports of patients unable to lose weight despite dieting and exercising, sometimes for months and indeed years after they have stopped their medication, but this information is largely anecdotal.

Recognizing the contribution of psychotropic drugs to the rising rate of obesity may lead to interventions to prevent or diminish weight gain. Ideally a patient should be advised on diet and exercise at the initiation of drug therapy, but one wonders whether adhering to a regimen to prevent weight gain is practicable for a patient who is still symptomatic. Moreover, the dietary advice, although well intentioned, may be counterproductive if it includes restricting carbohydrates. Since the synthesis of serotonin depends on the consumption of carbohydrates, and since not only mood but also satiety depends on serotonin activity, offering a low-carbohydrate diet may only exacerbate the cravings and the absence of satiety.

Acknowledgment by practitioners of the real possibility of weight gain as a side effect of psychotropic drug treatment, along with the availability of individual and/or group weight-loss support, must be part of the treatment plan. Obesity is not a benign side effect; it has well-known health consequences and may significantly affect the individual's quality of life. Social isolation, employment discrimination, and embarrassment at a body no longer recognizable are but a few of the consequences. Consideration of a patient's weight status prior to treatment is also important; a drug like Remeron, known to cause uncontrolled eating, may catapult an overweight individual into obesity.

Those who have gained weight as a consequence of their psychotropic medication have been invisible as a sub-group within the obesity community. One hopes that this report is the first step in making us notice and help them.

Chronic Lack of Sleep May Have Serious Consequences

Insomnia is a lonely and often neglected problem. This disorder may be found in 10-30 percent of the population, and perhaps at a higher rate among the elderly, women, and people with medical and mental disorders, according to the review by Bhaskar, Hemavathy and Prasad. But chronic insomnia may be neglected by family practitioners or treated inadequately. The insomniac cannot call a medical care provider at 2 or 3 a.m. after (once more) lying awake for hours and ask the doctor for help the way one would for a medical problem during the day. Nor is waking someone up to relieve the 3 a.m. loneliness a good idea; it's likely that the person awakened would not be good company. Moreover, since almost everyone has faced sleeplessness at some point due to muscle pain, jet lag, a barking dog, or worry, those of us who suffer from insomnia only sporadically may not realize how debilitating the condition can be when it is chronic.

People in some occupations are vulnerable to sleep deprivation, either because their jobs don't give them enough time to sleep or because they have trouble falling asleep. Shift workers are prone to insomnia, and it may worsen when their sleep-wake cycle changes on days off or when they move into a new work cycle. One consequence is a significantly greater occurrence of depression among shift workers compared with other groups, according to an analysis published a few years ago.

Despite the many symptoms associated with insomnia, the consensus seems to be that it is a disorder that is under-recognized and under-treated. This may be because sleep habits are not queried by the health provider, and complaints are not offered by the patient unless linked to an obvious cause of sleeplessness, such as pain, hot flashes, reflux, sleep apnea, or medication. Health providers may have neither the time nor the expertise to treat the disorder when it is not related to such an obvious cause. Or they may rely on pharmacological interventions to induce sleep, even though these drugs have side effects and/or limited efficacy. Sleep clinics may detect the underlying cause(s) of the sleep disturbances, but usually do not offer long-term therapeutic help.

Support groups for insomniacs exist and may provide information and help when these are not available from health providers. A.W.A.K.E., which stands for “Alert, Well, And Keeping Energetic,” is a national program started years ago by the American Sleep Apnea Association to support people using a then-new device for sleep apnea, the PAP (positive airway pressure) machine. Currently the A.W.A.K.E. program has expanded its outreach to anyone in the community with sleep problems. Other support groups helping those with specific problems that interfere with sleep, such as restless legs, are also listed on Internet sites and found throughout the country. But these groups are only as good as the information offered. Someone with serious psychological side effects from lack of sleep probably won't find anyone in these support groups with the expertise to deal with their problems. One benefit, however, may be no longer feeling isolated and lonely when sleep is elusive. Perhaps these groups, at the very least, give the insomniac the name of someone to talk to at 3 a.m.

Medical residents are another group identified as vulnerable to impairments in mood and performance because of sleep deprivation. Their sleep needs are not met because their work schedules require being on call all night after working all day. The numerous television hospital dramas, with their interpersonal catastrophes, fail to mention that the hospital staff may be suffering from depression, impaired performance, and difficulties with interpersonal relationships because of inadequate sleep. The cognitive deficits associated with restricted sleep are not emphasized in these programs either, but they too are a well-researched consequence.

However, the emotional, cognitive, and physical impairments potentiated by sleep deprivation are not restricted to these two groups. In an article describing the results of a multi-site study testing an intervention to improve sleep, Freeman and his co-workers link lack of sleep to clinical depression and suggest that many insomniacs experience general mental distress at their continuing failure to achieve restful sleep. Their study targeted university students whose insomnia caused paranoia and hallucinations, consequences of insomnia that are probably not well known. The authors delivered a cognitive-behavioral intervention online over several weeks and compared its effects with conventional treatments for insomnia, such as medication and advice about avoiding caffeine, keeping regular bedtimes, and practicing relaxation techniques. Even though no therapist was present in the experimental intervention, the online treatment was effective: after 10 weeks it had significantly reduced insomnia, paranoia, and hallucinations, decreased depression and anxiety, and improved general well-being. What is striking about these results is that the improvements in mental and cognitive function were accomplished without drugs, with the educational and cognitive interventions carried out entirely online.

References

“Prevalence of chronic insomnia in adult patients and its correlation with medical comorbidities,” Bhaskar S, Hemavathy D, and Prasad S, J Family Med Prim Care 2016; 5(4): 780-784.

“Night Shift Work and Risk of Depression: Meta-analysis of Observational Studies,” Lee A, Myung S-K, Cho J, et al., J Korean Med Sci 2017; 32(7): 1091-1096.

“Sleep Deprivation and Depression,” Al-Abri M, Sultan Qaboos Univ Med J 2015; 4: 4-6.

The Unhappy Consequence of Not Being Able to Exercise

I knew she was going to become depressed. The email she sent told me that her doctor had said no tennis, swimming, golf, or rapid walking until the wound on her leg healed. She had fallen off her bike, the wound had become infected, and a short healing time had turned into weeks.

“I don’t know what to do with myself,” she  wrote.  “I am irritable, worried, depressed and anxious.  This is the longest I have gone without any physical activity.”

She has exercised all her life and has a master's degree in exercise physiology. Her outside activity used to change with the seasons, but now that she has traded life in a cold European country for the warmth of Florida, she has been able to be physically active outdoors year-round. For the time being, though, she could only prop up her leg and hope that healing would come quickly.

My friend's mood changes are well known among committed exercisers who must stop exercising. Magazines for particular sports, such as running, devote columns to alternative types of exercise to do while recovering from an injury sustained during a race, for example. And the Internet is replete with articles, blogs, and anecdotes written by those who find themselves unable to pursue their sport because some part of their body has been injured. Moreover, many research studies have been carried out to quantify, to some extent, the degree of mood change brought on by an experimentally induced cessation of exercise.

In an experiment designed to see whether runners really do experience mood changes when they stop running, forty male runners who ran regularly were divided into two groups. One group was allowed to run during the six weeks of the study; the other was not allowed to run for two weeks in the middle of the study. Depression and other mood states were rated weekly, and the ratings confirmed what my friend and others have experienced: depression, anxiety, insomnia, and general stress were elevated during the non-running weeks. When these runners were allowed to go back to running during the last two weeks of the study, their moods matched those of the group that never stopped running.

Similar findings were reported among 40 women who engaged in aerobic exercise regularly and were told to stop their aerobic activity. Their moods were compared with those of a comparison group that continued to exercise. Those who abstained from exercise exhibited depressed mood and increased fatigue compared with those who did not stop their physical activity. And the studies listed below are only a small sample of the many that have been published.

And yet, exercise withdrawal due to injury, or other factors such as caring for a sick parent or child, overwhelming work obligations, prolonged adverse weather conditions, and numerous other life events, may be overlooked as a cause of significant changes in mood. Mental health professionals recognize exercise addiction and the mood changes that occur when the exercise is stopped, either due to injury or because the amount of exercise is pathological. But my friend exemplifies an individual who is not addicted to physical activity but does it, like brushing her teeth, as part of her daily routine. Indeed, soon after we spoke, another friend who had a surgical procedure on her leg called to tell me that she was “going crazy” because she was not allowed to swim until the surgical wound was healed.

“What am I going to do?” she almost wailed to me on the phone. “How can I survive without swimming?”

How many primary care physicians inquire about changes in exercise patterns when investigating depressed or anxious mood, or increased fatigue, in a patient? Would it even occur to many (unless they also exercise regularly) to ask about changes in activity? And when a physician tells a patient that he or she can't run, go to a gym, play tennis, or walk quickly for several weeks, is any thought given to the impact of such a prohibition on the patient's mood?

Conversations about exercise focus heavily on the benefits of physical activity for mood, weight loss, sleep, cognition, and on and on, all to convince those who would rather sit than walk on a treadmill to start moving for their health. But has enough attention been paid to helping patients deal with the mood and energy changes that occur when exercise must cease for a period of time?

One problem is understanding why stopping consistent exercise should have such a negative effect on general well-being. Many who have been unable to exercise for a period of time cite an increase in stress and worry that can no longer be dampened by vigorous activity. Exercise allowed them to cope; without it, they must seek out alternatives and often don't find them. But what is it about running or biking or swimming or working out in a gym that allows our brains to increase their coping skills? Moreover, even when we know the answer (beyond such things as endorphins, which not everyone experiences, and certainly not all the time), the problem remains: what to do until exercise can begin again?

Magazine articles, Internet chatter, and blogs offer some suggestions, but what about professional help? Shouldn't a patient who is told, "No exercise for X weeks!" be referred to a physical therapist to learn what physical activity can still be done? My non-swimming friend did learn from a physical therapist that she could do yoga and Pilates; my other friend decided to do upper-body strength training. When I last checked, both were considerably less grumpy.

References

“Effects of temporary withdrawal from regular running,” Morris M, Steinberg E, Sykes A, et al., J of Psychosomatic Res 1990; 34: 493-500.

“Depressive mood symptoms and fatigue after exercise withdrawal: the potential role of decreased fitness,” Berlin A, Kop W, Deuster P, Psychosom Med 2006; 68(2): 224-230.

“Mental health consequences of exercise withdrawal: A systematic review,” Weinstein A, Koehmstedt C, and Kop W, General Hospital Psychiatry 2017; 49: 11-18.

Valentine’s Day Chocolates: It Used to Be So Uncomplicated

When Richard Cadbury decided to package chocolates in heart-shaped boxes and sell them as gifts for Valentine's Day in 1861, the most complicated result of his brilliant idea was the difficulty of choosing a particular bonbon. Should the chocolate be filled with chocolate or vanilla cream, a chocolate truffle, or a cherry in cherry liqueur? No one questioned the nutritional wisdom of eating a food whose ingredients included sugar, cocoa butter, full-cream milk powder, cocoa liquor, lecithin, vanilla and cocoa. Valentine's Day was special and so was chocolate.

Jumping ahead many decades, chocolate Valentine's Day gifts now have to be compatible with contemporary attitudes toward food. Chocolate itself has been clothed with health-giving properties; the darker, and often the more bitter, the better. Whatever ingredients chocolate contains to make it a health-giving food, however, the amounts are really too small to make much of a difference (unless one eats a 3 ½-ounce chocolate bar, containing a few hundred calories, daily). But endowing chocolate with the same positive nutritional properties as, say, kale takes away the guilt of enjoying the delectable calories.

The most obvious nutritional hazard is the calories. Who should receive the chocolates? Someone who is very thin? She or he probably wouldn't eat them for fear of gaining weight. Someone who needs to lose weight? The gift conveys the message that the recipient is fat, so what difference does it make if the gift makes her or him fatter? Is the giver saying, “I like you fat, so eat these chocolates”?

And then there are those whose personal eating profile makes a combination of sugar and fat problematic, so the type of chocolate edible presented as a gift might actually carry health risks. For example, there are many people these days with no tolerance for gluten or related proteins found in grains such as wheat and barley. There are others who cannot eat sugar, and still more who have embarked on a diet eliminating all carbohydrates. None of these people can be the recipient of baked goods made with conventional wheat flour and, for some, sugar.

A friend of mine was in a quandary because she was making a special Valentine's Day dinner for friends and learned that one of them was on a gluten-free regimen. He could, however, eat small amounts of sugar. She searched the Internet for a flour-free chocolate cake and found a recipe with very positive reviews. Her only concern, as she told me later, was a combination of ingredients that made her wonder whether she ought to have an EMT standing by when she served the cake. She said, “I laid all the ingredients on the counter: two sticks of butter, six eggs, two cups of gourmet chocolate chips, sugar, and vanilla, and really thought about aborting the recipe. I like this guy and didn't want to send his cholesterol through the roof.” She made the cake, which was delicious, and decided to serve such small portions that the heart-unfriendly ingredients couldn't do much harm.

Since the gluten-sensitive guest could eat sugar and other carbohydrates, a Valentine's dessert could theoretically have used ingredients like almond, rice, or coconut flour. But what about the advocates of a totally carbohydrate-free way of eating? The so-called keto folk avoid carbohydrates entirely because they want their bodies to stop using glucose for energy and switch to using a byproduct of fat instead. Any morsel of carbohydrate that crosses their lips will cause the body to revert to glucose. What is the giver of an edible Valentine gift to do? Answer: find or make foods that are mainly fat and sugar substitutes.

The popular keto diet limits the options, although not the calories. This diet forbids its users to eat carbohydrates in order to coerce the body into using part of the fat molecule, the fatty acids, for energy. These are converted into substances called ketones, which supply the energy formerly supplied by the body's natural energy source, glucose. The other part of the fat molecule, glycerol, is converted to glucose (don't tell anyone) to be used for energy by the brain, which much prefers glucose to fat. People on the keto diet may not know that this glucose is chemically identical to the glucose produced when chocolate or bran flakes or oatmeal is digested in the intestinal tract. What is worrisome about restricting intake to foods with little carbohydrate is that, in addition to eliminating most of the fruits, vegetables, and high-fiber carbohydrates we should be eating, the foods can be extremely high in fat and calories. The Valentine's Day keto edibles are a striking example.

Cheesecake sheathed in a chocolate shell, or drizzled with chocolate, is available commercially, and since cheesecake itself is mainly cream cheese and sour cream (and in this case artificial sweetener), its high-fat, sugar-free content makes it perfect as a keto Valentine food. Peanut butter chocolate chip cookie dough works for all sorts of diets (raw, vegan, Paleo, gluten-free, sugar-free, grain-free) and looks like it would also be appropriate, but it is not, because it contains too much carbohydrate in the form of coconut flour and almond flour. Most keto-acceptable Valentine gifts have to be homemade and, like the flourless chocolate cake, may spread the waist while spreading love. A chocolate truffle is made from cream cheese, cocoa powder, and whipping cream. Chocolate hearts are made from coconut oil and cocoa powder, with artificial sweetener. Dipping bacon strips in chocolate makes a Valentine breakfast for your keto sweetheart or, if you live in England, you can buy a heart-shaped sausage from Marks and Spencer for the breakfast table.

Somehow these recipes don't convey the traditional appeal of the old-fashioned Cadbury heart-shaped box and its many imitators. Valentine's Day was never meant to be celebrated by eating various cream cheese-based foods.

By definition, Valentine's Day is a sweet holiday with a message of friendship, affection and love. Wouldn't it be nice if we could use this day of uncomplicated messages to uncomplicate our dietary profiles as well? No one receiving the first heart-shaped boxes of chocolate in 1861 had to worry about whether eating the chocolates would throw the body into some sort of metabolic disaster. And why today should a person deciding what chocolate gift to buy for his or her sweetheart have to think about the food idiosyncrasies of the recipient? Maybe, just as one hopes messages of friendship and love are not limited to February 14, one also hopes that a reasonable approach to eating can extend beyond the day as well. Unless one has a medical reason to avoid certain foods, couldn't we decide that foods that bring such pleasure and are associated with such positive emotions should be allowed? After all, we don't limit romance and love to one day. Why should we limit a piece of chocolate to one day either?

Getting Nutrients from Food is So Old-Fashioned: Try an Intravenous Drip Instead!

I looked at the remaining drugstore-brand vitamin pills in the container and wondered whether I ought to continue taking them. A few days ago, an advertisement from a local plastic surgery/wellness/anti-aging spa offered a reduced rate for a procedure in which I could get an intravenous infusion of vitamins and minerals. According to the blurb that accompanied the offer, the “drip” would allow these essential nutrients to bypass my stomach and intestinal tract and go directly into my blood, thus avoiding the risk of some nutrients not being fully absorbed, or being altered, by the digestive process. The promotional material promised an enhanced glow to my skin, better sleep, and increased energy.

A quick search on the Internet revealed that the intravenous procedures offered by this spa are expected to be the “new” health treatment this year, and that these “drip” spas may become as ubiquitous as nail salons. The benefits of receiving a vitamin-mineral infusion a couple of times a week were compelling, according to the website advertisements. One company calls its preparation a “brain booster” and recommends its infusion before examinations (studying might help also). Improving immune function is a standard objective of most of the vitamin-mineral infusions, although none of these clinics said to skip the flu vaccine. What if someone with the flu comes into the spa to boost immunity? Should they get the vitamin infusion or go home to bed? This was not addressed.

Someone who really hates vegetables, rarely eats fruit, and dislikes swallowing vitamin pills might welcome the chance to lie on a recliner, listen to soft music, and have vitamins and minerals pumped into his or her body every few days. Throw in a pedicure and it is a perfect day of self-renewal. But why would someone want another type of infusion offered by this spa, namely an infusion of amino acids? Amino acids are in every protein we eat, and the only people who might need an external source of amino acids are those whose medical condition, such as stomach cancer or severe gastrointestinal disease, makes digesting protein difficult. Vegan diets limit protein to plant sources, and some of these foods may lack adequate amounts of specific amino acids. But so far, vegans have not been advised to skip the quinoa and instead get an infusion of amino acids.

On the other hand, getting essential nutrients without relying on food might appeal to someone attempting to maintain a pathologically low weight (models, for example). The infusions of vitamins, minerals and amino acids would be a big improvement over a diet of calorie-free soda and cigarettes.

But of course going to infusion spas rather than eating is not sustainable or sensible. There is no provision for an energy source; no infusions of glucose or fat are provided by these clinics.  And it is absurd to equate the nutritional value of a synthetic mixture of vitamins and minerals with the nutritional complexity of micronutrients in food.

But what is disturbing about the spas and clinics offering these nutrient drips is that they are making the same spurious claims that health food restaurants have been making for years. A popular health food restaurant near me promises everything short of immortality from its smoothies. A neighbor told us that he did not get the flu vaccine because one of the smoothies claimed to confer resistance to the flu virus.

A quick scan of some Internet sites promoting nutrient infusions shows similar claims. Some intravenous clinics offer a seemingly random assortment of amino acids, minerals, and vitamins to overcome depression, halt compulsive behavior, improve sleep, decrease the symptoms of mental disorders like bipolar disorder, prevent the symptoms of PMS, help smoking cessation, and, of course, promote weight loss.

These infusion bags of health have about as much scientific validity as the products sold by snake-oil hucksters who promised their powders and drinks would cure everything. One could shrug off the IV spas as harmless, but they aren't. The client does not know whether the amounts of vitamins, specific amino acids, or minerals are in the range of safe intake, whether any medication he or she is taking might be adversely affected by these infusions, or whether he or she might experience side effects. When vitamins, minerals, and amino acids are taken by mouth, they enter the body slowly, and some of the nutrients may never make it out of the digestive tract into the bloodstream. So the dose of vitamins, for example, that gets into the body by mouth tends to be smaller than the dose delivered by an intravenous solution. Moreover, does the client know how much of the vitamins, minerals, and amino acids is retained by the body and how much is eliminated in the urine? Are the people formulating the solutions medically knowledgeable about the diseases they are supposedly treating? Would they offer medical care if the solutions have no benefit? Do these clinics use licensed personnel to administer the drip? Is the environment sterile, to avoid contamination? If someone has an adverse reaction, such as a severe allergic response, is there a medical team to handle it?

The most serious aspect of these healthy drips is that their claims have no validity and, in some cases, may keep people from seeking credible medical help. No one is going to lose weight or relieve the symptoms of obsessive-compulsive disorder or bipolar disorder with a drip of some vitamins and an amino acid. Relying on the “magic” of these drips, rather than on interventions with scientific evidence supporting their utility, may work only because of a strong placebo effect. And if it does, that is fine. However, if we have truth in advertising, then the drips ought to be labeled “placebo” so the client knows what he or she is getting.

PMS Carbohydrate Craving and Personalized Weight Loss Plans

There is much talk these days about developing a personalized diet based on DNA analysis, lifestyle, food sensitivities, and the use of apps alerting a dieter to situations that might derail a diet. However, in developing an overeating profile, is enough attention being given to a condition that causes some women to eat foods that are expressly forbidden on their diets? Does the eating profile include the information that this condition occurs every month, often for five days or longer? Does the eating-control app have in its database the knowledge that if the dieter does not get the food she craves during that time, she may become very angry and even delete the app from her phone, or that cognitive changes may make her misplace the phone altogether? This monthly change is premenstrual syndrome (“PMS”), and unfortunately it may be overlooked or marginalized when planning an individualized food plan. Indeed, if the wrong foods are on the plan, the dieter may find her symptoms worsening and her ability to stay on a diet eroded.

PMS is associated with a change in hormones that occurs in the luteal, or second, half of the menstrual cycle. Estrogen levels begin to decrease and progesterone levels to increase soon after day 14 or so of the cycle. PMS typically appears a few days before menstruation and can suddenly alter mood, sleep, energy, concentration, and food cravings. Not all women experience PMS, and the severity of the symptoms varies from barely noticeable to hampering daily life. Women who experience PMS may not experience it every month or with the same degree of severity. The most severe form is called premenstrual dysphoric disorder and is similar to clinical depression except that, unlike a typical depression, it goes away by the beginning of the next menstrual cycle. PMDD, as it is called, is often treated with antidepressants.

Craving chocolate is commonly associated with PMS and is not to be taken lightly, as anecdotes describe women braving blizzards to get a chocolate bar. However, the cravings encompass both sweet and salty, crunchy carbohydrates. A weight-loss client told me, “I did not know I was premenstrual until I returned home from my weekly grocery shopping with bags of cookies, ice cream, chips, hot fudge sauce, and packaged cupcakes. My husband asked me why I hadn’t bought any real food, and I told him this was what I wanted to eat. I got my period the next day.”

Several years ago, we were able to admit normal-weight women with PMS to our MIT clinical research center to evaluate their mood and directly measure what they were eating at the beginning of their menstrual cycle; we then repeated the evaluation three weeks later, when they had PMS. Food was provided in pre-measured servings at meals, and a computerized vending machine allowed the women to obtain protein-rich snacks such as cold cuts and cheese, as well as sweet and starchy snacks such as cookies and potato chips, between meals and in the evening. When these normal-weight women were premenstrual, their calorie intake increased by more than 1100 calories a day compared with the first half of their menstrual cycle, and the additional calories came from carbohydrate-rich meals and snack foods.

Because all of these women were active and did not overeat when they were not premenstrual, their weight remained stable. However, if they had been trying to lose weight, the obvious response in developing a personalized weight-loss plan would be to insist on cutting out carbohydrates. It seems logical that if they had been on a low-carbohydrate diet, PMS would not have affected their food intake, because carbs would not have been allowed.

Perhaps. But eliminating carbohydrates would have affected their mood, and done so negatively.

Our research team discovered that the deterioration in mood, energy, focus, and control over carbohydrate intake was due to an alteration in serotonin activity, probably caused by the shift in hormones at the end of the menstrual cycle. Our research contributed to the first use of an antidepressant (Sarafem) that increases serotonin activity to relieve the symptoms of severe PMS.

Women with PMS apparently crave both sweet and starchy carbohydrates because consuming them increases serotonin levels. Eating carbohydrates is a natural solution to easing the deterioration of mood, energy, and concentration. A two-year study on the effects of a carbohydrate-rich drink on these symptoms of PMS showed this to be the case: the small amount of carbohydrate in the drink significantly decreased cravings for carbohydrate snack foods. When the women were instead given a drink containing protein, the PMS symptoms, including alterations in cognitive function, remained intense.

The test carbohydrate beverage used in our study was fat- and protein-free, and thus its calories came only from a combination of a simple sugar, glucose, and a mixture of starchy carbohydrates. Some breakfast cereals, with their sprinkling of sugar on a high-fiber, starchy, crunchy square or flake, could easily substitute for our drink.

Eliminating carbohydrates, as is still the fashion in many weight-loss plans, overlooks a significant connection between this nutrient and brain function. The brain needs carbohydrates to be consumed in order to maintain serotonin levels and activity, especially when hormonal changes decrease that activity. In short, removing carbohydrates in the interest of weight loss may be akin to tampering with nature.

References

Wurtman J, Brzezinski A, Wurtman R, and Laferrere B, “Effect of nutrient intake on premenstrual depression,” Am J of Obstetrics and Gynecology 1989; 161(5): 1228-1234.

Brzezinski A, Wurtman J, Wurtman R, Gleason R, Greenfield J, and Nader TD, “Fenfluramine suppresses the increased calorie and carbohydrate intakes and improves the mood of women with premenstrual depression,” Obstetrics and Gynecology 1990; 76(2): 296-301.

Sayegh R, Schiff I, Wurtman J, Spiers P, McDermott J, and Wurtman R, “The effect of a carbohydrate-rich beverage on mood, appetite and cognitive function in women with premenstrual syndrome,” Obstetrics and Gynecology 1995; 86: 520-528.