Category Archives: Simon Says

When Bone Soup Promises More Than It Delivers

One of my neighbors was recently diagnosed with liver and pancreatic cancer. She is rapidly losing weight because eating and digesting food causes her pain, but her weight loss may make recovery from chemotherapy more difficult. She told me she is drinking bone broth in order to obtain the nutrients she needs, and to halt her weight loss.

“Why?” I asked her when we talked today. “Everyone says it is good for me,” she answered, “everyone” upon further questioning being some relatives and a few friends. “But you need nourishment,” I protested. “You need to eat protein, you need carbohydrates for energy, and you need vitamins and minerals. You aren’t going to stop losing weight by drinking bone-flavored water.”

Fortunately, her oncologist referred her to a hospital dietician experienced in the nutritional needs of cancer patients such as my friend, and the bone broth is now watering some house plants. But this incident is an example of how popular food fads, health food supplements and neighborly advice may exacerbate, rather than solve, nutritional problems.

Bone broth, a soup containing mostly water and the flavor and some nutrients from the bones cooked in it, is a broth that people have been eating for eons. It is, in some respects, like drinking liquid, salty Jell-O. When beef bones are cooked for long periods of time, they turn into a gelatinous mass, as I discovered when I forgot about a pot of water and bones I was simmering in order to make stock for soup. (Washing the pot became a major endeavor.) This gelatin in the hands of competent cooks can be turned into aspic, a translucent covering for pates and cold chicken, or a sweet “Jell-O” type dessert. Proponents of bone broth point to the gelatin as evidence of its vast nutritional value: all the good protein and the collagen from the bones is going to decrease inflammation, fortify your bones, and lubricate your joints. What is not mentioned is that gelatin is an incomplete protein because it lacks the essential amino acid tryptophan, and contains very small amounts of another amino acid, tyrosine.

Both tryptophan and tyrosine are needed for the synthesis of new protein in our bodies. Thus, if my friend depends on the gelatin in bone broth in order to make new protein for her muscles that are wasting away, she will be unable to do so. Moreover, the collagen in bone broth is digested in the intestinal tract, and is no more able to lubricate our joints than the butter or oil we may be eating.

Ironically, if a chicken were simmered along with the bones it would turn into (drum roll please) chicken soup. The chicken is a good source of protein, and although the power of chicken soup to heal the body may be exaggerated, its ability to soothe the distress of a bad cold or flu, or maybe restore the body after a bout of chemotherapy does not seem to be in dispute.

It is disconcerting to find bone broth sold in supermarkets and online for not inconsiderable amounts of money. In the old days, before this fad, people threw a few bones in a pot of water and whatever vegetables they had to make a very cheap soup. Bones also used to be given to dog owners or sold in enormous quantities to be turned into gelatin, or the fertilizer bone meal. Paying $10.00 or more for a box of bone broth containing mostly water seems absurd.

What is so worrisome about this food fad, and the many others that pop up like mushrooms after a wet spell, is that they suggest we don’t have to rely on food for our daily nourishment or to compensate for some nutritional deficit such as lack of vitamin C or iron. The health food store, not healthy foods at the grocery store, is promoted as the path to nutritional wellness. I receive updates from several online newsletters describing the latest supplement entering the health food market. It is often astonishing to read about the promises made, without any evidence, for these products. One of many entering the market this past month includes bitter melon, cinnamon bark, fenugreek seed, olive leaf and artichoke leaf, holy basil herb and lycium fruit. These are presented in a liquid and supposedly will maintain normal blood sugar levels in people with normal blood sugar levels (italics are my own). Apparently the makers of this supplement never heard of the insulin that our pancreas secretes (for free) when we eat carbohydrates. Another product just now for sale is made from Siberian rhubarb roots and promises to help menopausal symptoms like hot flushes. The research supporting these claims and many others is often not real or reproducible, but how would a consumer know this?

My friend with cancer believed that the bone broth she was drinking, even though her weight was melting off, was nourishing her. Unfortunately, she was getting none of the nutrients she needed. People may hesitate to seek medical advice, or ignore it completely, because they are convinced that the promises made by the supplements will be the answer to their medical problems. Supplements can interfere with drugs one is already taking. Given the number of supplements on the market, and the sometimes bizarre sources of ingredients (who knew that rhubarb could be grown in Siberia?), physicians may not know whether the ingredients are dangerous. The dose of a supplement may also be entirely too high. Melatonin, for example, is commonly sold in doses of 3 mg to 10 mg, while clinical research puts the effective dose at 0.3–0.5 mg; the higher doses may dampen the body’s own melatonin production.
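The scale of that melatonin overdose is easy to see with simple arithmetic. A quick sketch using the doses cited above (here "fold" just means the ratio of a marketed pill to the researched dose):

```python
# Doses from the discussion above (milligrams)
research_dose_mg = (0.3, 0.5)   # range established by clinical research
market_dose_mg = (3.0, 10.0)    # range commonly sold in supplements

# Smallest and largest possible fold-difference between a store-bought
# pill and the evidence-based dose
min_fold = market_dose_mg[0] / research_dose_mg[1]   # 3.0 / 0.5 = 6
max_fold = market_dose_mg[1] / research_dose_mg[0]   # 10.0 / 0.3 ≈ 33

print(f"A supplement pill is {min_fold:.0f}x to {max_fold:.0f}x the researched dose")
```

In other words, even the smallest commonly sold pill is several times the dose the research supports.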

The FDA has information about the ingredients, function and side effects of many supplements, and it is worth spending time learning about a supplement that has been recommended or advertised before taking it. Some are critically important, such as those providing the vitamins and minerals an individual may not be able to obtain through food. My friend does take a vitamin-mineral supplement because she finds it too painful to eat many fruits and vegetables.

Our health is too important to be left to the sellers of health products. Checking out the scientific validity of a product may not be possible without the help of dieticians or others knowledgeable about the contents and claims of these products. But it is worth making the time to do so.

Will Stress Lead to Autoimmune Disease?

A worrisome report from a group of Icelandic scientists linking stress to autoimmune disorders appeared in a recent edition of JAMA. The media alerted us to their findings in terms that were, well, stressful: If, or more realistically when, we experience severe stress, we will be increasing the likelihood of developing diseases ranging from thyroid to hair-loss disorders. This report resonated with me, as I did develop an autoimmune skin disorder at an age when it was rare to show the first symptoms.

My physician asked whether I had been stressed earlier in the year. The answer was yes. My stress was due to worry and sorrow over a close friend’s diagnosis of a terminal disease. Would I have been “immune” from this autoimmune skin problem if the year had been less stressful? The media description of the results of the study would have you believe it to be so. An acquaintance told me about her son who is working an impossible number of hours as a first-year associate in a large law firm. “He is so stressed,” she told me, “that I am worried he will develop some awful disease.” And she quoted from a news release about the study to me:

The study looked through medical records of more than 100,000 Swedish adults who had been diagnosed with stress-related psychiatric disorders, the medical records of 126,652 siblings of these patients and 1.1 million unrelated individuals. The two latter groups had no stress-related disorders. Forty percent of those with stress-related psychiatric disorders were male and their average age, 41.

What is striking about their results is that over a 10-year follow-up period, a significantly larger number of individuals who had a stress-related psychiatric disorder were diagnosed with an autoimmune disease compared to the other two groups. Some of the diseases included Addison’s disease, rheumatoid arthritis, psoriasis, multiple sclerosis (M.S.), Crohn’s and celiac disease. Also, the risk for developing a particular disease differed. For example, there was a higher risk for celiac disease than rheumatoid arthritis.

Does this mean that a stress-filled year or short-lived acute stress will lead to a lifetime struggle with an autoimmune disease such as M.S.? The answer is no. To begin with, the stress is not simply stress; it is a diagnosed psychiatric disorder. Post-traumatic stress disorder, acute stress reaction, and adjustment disorders are listed by the authors as associated with increased risk for subsequent autoimmune disease.  Being stuck in a two-hour traffic jam on the way home from work is very stressful but unlikely to cause the development of the skin lesions associated with psoriasis.

Moreover, as the authors point out (despite the hype in the media), “The relatively modest differences in the incidence rates of autoimmune disease between the exposed and unexposed,” (i.e. stressed and non-stress disordered individuals) “…should not lead to special monitoring of people who have been diagnosed with a stress disorder.”

But this conclusion still leaves the question as to why there should be a connection, even a weak one, between PTSD and M.S. or thyroid disease. Changes in cortisol levels, changes in pro-inflammatory cytokine levels, or an overly active inflammatory response/immune system are put forth to attempt to understand the process leading to the disease. But no workable answer is available yet.

What is so sad about the study results, however, is that those who have a stress-related disorder, such as PTSD, which in itself significantly affects quality of life, might then have to endure many decades of another disease that compounds the diminished quality of life.  However, the report found that antidepressant treatment for PTSD reduced the risk of an autoimmune disease. Thus treating the stress disorder may be the answer to preventing another lifetime disorder from developing.

Song H, Fang F, Tomasson G, et al. “Association of Stress-Related Disorders With Subsequent Autoimmune Disease.” JAMA 2018;319(23):2388–2400.

Are Kids Born, or Made Into, Emotional Overeaters?

Anyone who has eaten when frustrated, angry, bored, worried, exhausted, lonely, or depressed—but not hungry—has engaged in emotional eating. (So that makes most of us.) And for most, the food eaten is less likely to be steamed broccoli, poached chicken breast, or fat-free yogurt and far more likely to be a member of the so-called carbohydrate junk food family.

We know this from studies carried out at the MIT clinical research center about 25 years ago. Emotional overeaters were offered a choice between protein snacks like miniature meatballs or luncheon meat and carbohydrate snacks like cookies and crackers. The choice was always the carbohydrate foods. The predictable choice of carbohydrates led to research confirming that the carbohydrates were chosen not from taste (the meatballs were delicious but ignored) but because eating crackers or cookies led to an increase in the mood-soothing activity of serotonin. Our conclusion, reinforced by many subsequent psychological studies, was that people used carbohydrates as a form of self-medication.

But how did we learn to do this? And indeed, did we learn to do this, or is medicating with food something we are born with?

Infants don’t eat to make their bad moods go away. They eat to make their hunger go away.   And infants don’t eat when they are not hungry.  Theoretically infants, especially those who are breastfed, do not overeat since it is almost impossible to get infants to swallow more milk when they are done feeding. The mouth closes, the head is turned away, and often sleep takes over.

So how does an infant who self-regulates her food intake turn into an emotional overeater? Some pediatric obesity researchers, such as Savage, Birch, Marini, et al.1, suggest that it is the mother’s fault. Mothers who interpret every sign of their infant’s distress as hunger will feed their infants too often. The baby may not eat, but eventually, so the researchers surmise, she comes to associate feeling bored, lonely, wet, annoyed, or whatever emotions babies feel, with being offered food.

This association seems to be strengthened when parents offer treats to the now older child to soothe her. Blissett, Haycraft and Farrow measured cookie and chocolate consumption among preschool children when they were stressed in a research setting. Children whose mothers often gave them snacks to comfort them ate more sweet snacks than children whose mothers did not offer them snacks when they were upset.

Is this how it begins? The child grows up and, when experiencing the predictable stresses of childhood, adolescence and adulthood, turns to food as a means of coping?

But there is much unanswered about this assumption, i.e. that children will turn into emotional overeating adults because they were given treats as children to help them overcome distress, boredom, or anger.

Do children growing up in cultures where food is scarce become emotional eaters? They may worry as adults about not having enough food and hoard food or overeat because they learned as children that food is not always available. But is this emotional overeating?

Do all children in a family become emotional overeaters in response to being given comfort food while growing up? Often some children in a family overeat sweet or starchy junk food and others reject these items. What makes Sally, but not Sam, reach for cookies when experiencing a negative mood state? Why doesn’t Sam also use food to feel better?

Do children, and indeed adults, feel comforted if given any food when upset, or only specific foods? The answer is obvious, at least in our culture. Foods offered and eaten in times of stress tend to be tasty, sweet or starchy, and often high in fat (cookies, chocolate, ice cream). If, theoretically, a toddler were always offered a piece of broccoli or a spoonful of cottage cheese after bumping his head or feeling confined in a stroller, would he grow up and reach for the same foods when upset? Probably not, but this is testable. If a child grows up in a community where it is common to eat hot chili peppers or munch on dried seaweed or snack on avocado, then would these be comfort foods?

Are children nurtured from early infancy in a daycare center where meal and snack times are regulated and not dependent on a child’s mood less likely to become emotional overeaters?

Might children who are denied so-called tasty junk food because of their adverse effect on weight and health, feel compelled to eat such foods when they are old enough to get the food themselves? And might they overeat such foods to compensate for the years they were denied such treats?

Clearly much research has to be done before we understand whether an emotional overeater is born or made that way. Answers may come from studies in which self-defined emotional overeaters are covertly given a food that they tend to eat when stressed, and a food that is never eaten (crackers versus cottage cheese). Their emotional state is measured before and after eating. If the emotional overeater shows an improvement in mood with one or the other test food, then the change must have come about because of some change in the brain regulation of mood, and not because of taste or the anticipation that the food will help the mood.

And perhaps, eventually, we can find what in the food gives the child or adult an emotional hug, so we can strip away the calories and leave just the good feeling behind.

Dividing a Dachshund: Cementing a Friendship

This blog is for all caretakers and friends who look out for one another.

Simon, our long-haired dachshund, runs to Mary Lou’s apartment and makes low, moaning sounds of anticipation as we wait for her to come to the door. Once in her arms, he licks every inch of her face and then runs to her kitchen.

 “Simon, you know there won’t be any treats!” I call after him. Mary Lou, slender herself, is strict about getting Simon’s weight under control, but it’s a hopeless goal.

 Mary Lou and I hug. We have not seen each other since she left for Palm Beach and we, South Beach last fall. Now it is May, and Mary Lou’s turn to have the dog.  I hand her Simon’s heartworm and tick prevention pills, his leash and harness, and take the elevator to our apartment. I miss the dog already. He won’t return to our bed (literally) until next fall. 

 It is right and fitting that Mary Lou and her husband have Simon for six months. They own half of him, although which half it is, after almost 14 years, is still contested.  We bought Simon together, not long after Frieda, my wire-haired dachshund, died.

Mary Lou and I became friends almost 30 years ago when we moved the same month into a new condominium building in Boston. My husband and I were traveling frequently for work, and she offered to care for Frieda. Their condo became the dog’s second home, and Frieda spent so much time at their medical supply company that her picture appeared on the cover of the company catalogue.

Frieda died at 16, and after we stopped grieving, Mary Lou and I agreed that it hurt too much to get another dog. Six weeks later we bought Simon. The breeder, named Jenn, was so fussy that she interviewed me on the phone before allowing us to visit. So we decided not to tell her that we were going to buy and share the dog. Our story was that I wanted a dog and Mary Lou was helping me find one.  It was a wise decision. I doubt that Jenn would have tolerated the dog being shared like a lawn mower. The puppy, whom we named Simon, seemed unconcerned. 

Sharing the puppy was the only way we managed to live through the two years it took to housebreak him. Like many of his breed, it mattered little to him that our carpets were not grass. “You take him; I am out of pee cleaner!” became a common refrain during the frequent hand-overs.  

Our somewhat erratic sharing of Simon eventually became fixed by season.  Mary Lou and her husband became snowbirds, and as their Florida apartment did not allow dogs, Simon lived with us from November to early May. We followed the snowbird migration a few years later living in a building littered with dogs.  

Dividing two dogs has cemented our friendship. Like an old married couple, we kvetch over the same things, share private details about our lives, comfort each other, gossip (too much), and occasionally go hiking.

We also get lost. Often.

There was the time we hiked with Simon on Blue Hill, a nearby 630-foot nano-mountain, and could not find our way back to our car. Using an out-of-date map (we didn’t know) and following a trail marked with barely visible dots (the trail had been abandoned), we were certain that the three of us would become a newspaper headline when our bodies were discovered. We were rescued by a hiker who pointed out our stupidity as she pointed us in the right direction.

That was our last hike. But the reason was not our phobia about getting lost again. Simon is almost blind. He has a genetic disease similar to macular degeneration. He walks slowly, his nose acting as a built-in white cane, scanning the space around him for obstacles. He manages well enough in familiarly scented areas, but rock-strewn hiking paths, typical of those on Blue Hill, are no longer possible.

And the other reason is that Mary Lou has cancer. The double whammy of her treatment protocol, radiation and chemotherapy, is stilling her normally active life. So the three of us sit together in the library of our building, which is a social space for residents. Our armchairs are close enough so that Simon’s head is on one lap and his tail on the other. (He is a very long dog.) We each rub him and talk and laugh and gossip and sometimes cry, because that is what friends do. And our love for Simon and our love for each other passes through his furry body to each of our hands and our hearts and our memories.

Can You Get Scurvy If You Eat Out Too Much?

Soon after arriving home from a short trip to Manhattan, I took a vitamin pill. No, there was nothing arduous about the return journey that required a dose of nutrients. But on the train back to Boston, I reviewed in my mind the various places where we breakfasted and dined (lunch was usually skipped) and realized that, except for a shared salad at one dinner and some fruit at a breakfast, I had failed miserably at consuming the recommended daily servings of fruit and vegetables. For a 2,000-calorie diet, the recommendation is to consume about 2–2½ cups of fruit and 2 cups of vegetables daily.

This wasn’t because I had left vegetables and fruits untouched on my plate. There were never any on the plate. The restaurants (Greek, French, and mixed American), chosen by consensus, had large selections and theoretically should have been able to supply some vegetables. Indeed, the Greek fish restaurant did have appetizers, i.e., meze, that incorporated some vegetables like eggplant and cucumbers into purées, dips and wraps (like grape leaves). But the main courses in all three restaurants presented an entrée on an otherwise naked plate. To be sure, vegetable side dishes and salads were available, but the size and, quite frankly, the cost of these extras made them less attractive. Somehow spending as much for three grilled asparagus spears as one would for a pound of the same vegetable at Whole Foods seemed like an unjustifiable extravagance.

Desserts were not considered but quick polite scans of the dessert menu (after all, if a server puts one in your hand, the least one can do is look at it) showed a uniform absence of anything resembling a fruit.

Obviously eating away from home because of business, travel or vacations is not going to cause acute malnutrition. And it is certainly possible, and not all that difficult, to choose restaurants that offer enough vegetable and fruit selections to satisfy the USDA nutrient intake recommendations as well as one’s mother. Had we been eating on our own, we would have done so.

But we have come a long way from the time when all restaurants put vegetables on the plate, gave you a salad along with the breadbasket, and included fresh fruit on the dessert menu. There was a time when cafeterias were as common as fast-food restaurants are today, and the number of cafeteria trays holding vegetables was as numerous as those containing meat, chicken or fish. To be sure, the salad may have consisted of watery iceberg lettuce and tasteless tomatoes, and the vegetables came straight from an industrial-size can, but no one expected a lunch or dinner meal to consist only of a solitary protein entrée. Fifty or sixty years ago, if you were served a plate with two lonely lamb chops or a chunk of fish and nothing else, you might have thought the server forgot to put the two veg and a potato on your plate.

Like other cultural changes that creep up on us and take hold (who remembers records and landlines?), we don’t notice the chronic absence of vegetable options in the “nice” restaurants, or our habit of putting together our own meals without including them. And as a result, we fail to notice that we may have stopped eating vegetables altogether. They have become a forgotten food.

In contrast to the ongoing debate over high- and low-carb or high- and low-fat diets, the extraordinary powers of protein to turn us back into Paleolithic cave people, and the devastating effects of gluten on the brain, no one discusses vegetables. Who debates the merits of spinach over kale or Brussels sprouts over broccoli? When was the last time the Science section of a leading newspaper featured research on the merits of vegetable consumption?

Fortunately, there are some recent trends that may forestall an outbreak of scurvy or other nutrient-deficiency diseases. Leading chefs are inventing ways of turning the ordinary carrot, string bean or beet into creative, original dishes that rival the importance of the protein selections on the menu. Vegetable-laden smoothies and juices are becoming ubiquitous; the selection of bottled vegetable juices goes far beyond V8, and juice bars allow customization of vegetable and fruit mixtures. Mixed drinks containing vegetables haven’t found their way into wine bars yet, but someone will come up with an alcoholic beverage that somehow incorporates kale. Supermarkets have, for many years now, made vegetables available for immediate consumption. No washing, peeling, slicing or dicing necessary; just chewing. And to remedy the “How do I get my family or spouse to eat vegetables?” problem, many frozen varieties are sold with sauces or suggestions on how to transform the pea or carrot into a gourmet dish.

But… the vegetables have to be bought and eaten at home, not left to gradually decompose in the vegetable bin. If eating away from home is more frequent than dining in one’s kitchen, restaurants should be chosen that offer healthy salads and vegetable side dishes at affordable prices. Most restaurants display their menus on the Internet, so it should be possible to find some that do not regard vegetables as a colorful garnish. The cost of those vegetable side dishes could be decreased if both the entrée and the vegetables and/or salad are shared. Lunch is an easy meal at which to eat vegetables, as these days many restaurants feature salads or salad bars; even airport restaurants offer a variety of freshly made salads. (Our problem in New York was that we skipped lunch.)

It takes some effort to develop scurvy; even the British sailors who did so were not vulnerable until many weeks of vitamin C deprivation. But it also takes a little effort to remember that vegetables are part of a healthy diet and should be hunted and gathered, even if the gathering is at a salad bar.   

Are Australians Becoming the New Fat Americans?

On our first recent trip to Australia, I could hardly wait to sight our first kangaroo, hopefully with a joey (baby) in its pouch. They did not disappoint, nor did the adorable but totally inert koala bears, cockatoos with designer plumage (who talked back to me), wombats, which looked like horizontal furry fireplugs, and the platypus, first seen in a 2nd grade book on mammals that lay eggs.

But what I did not expect to see were obese Australians. My uninformed image of the sheep rancher in the outback, the crocodile wrestler, or surfer barely escaping shark attacks made me assume that all Australians were lean, muscular, vigorous, tall and wind-burned. And in the first city we visited, Sydney, this was largely true. No sheep ranchers were in sight but the crowds of men and women going off to work in their suits, briefcases, and sleek hairdos were by and large thin or of normal weight. They walked fast and looked like they spent some of their leisure time in gyms, or running or biking.

However, in conversations with some health writers, including physicians, at meetings my husband and I attended, I quickly learned that the low BMIs (body mass indexes) of Sydney residents were atypical. “Just wait until you get into the suburbs, small towns and other cities,” they told me. “Then you will see how fat we Australians are becoming.” And indeed, not only were their observations accurate, they were also reinforced by daily newspaper accounts about the obesity race Australians were about to win. Even though we Americans still rank number one in our prevalence of obese adults and children, the reports stated that obesity was increasing at a much higher rate in Australia than in the States. And children, according to one long weekend newspaper article, were becoming so heavy that it was hard for some of them to walk.
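BMI, mentioned above, is nothing mysterious: weight in kilograms divided by height in meters squared, then bucketed with the standard WHO adult cutoffs. A minimal sketch (the example weight and height are made up for illustration):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def who_category(index: float) -> str:
    """Standard WHO adult cutoffs."""
    if index < 18.5:
        return "underweight"
    if index < 25.0:
        return "normal"
    if index < 30.0:
        return "overweight"
    return "obese"

# Hypothetical example: 85 kg at 1.75 m
index = bmi(85, 1.75)       # ≈ 27.8
print(who_category(index))  # overweight
```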

One of the reasons given for the rapid rise in weight gain was to enable Australians to disguise themselves as Americans when they traveled abroad, but my nutritionist /health writer acquaintances described other causes as well:

• Too large portion sizes (although not as large as ours)

• Little awareness that excessive calorie intake will cause weight gain. “People seem not to understand that eating a fast-food lunch of 2,500 calories will affect their weight,” one health journalist told me. “People just think they are getting more for their money.”

• Too much sugar in their beverages, both hot and cold. Australians love their coffee, which is understandable as it is superb, and are more likely to add sugar to their drink rather than a non-calorie sweetener. And they drink many fizzy, sugar- and fruit-flavored drinks along with sugar-filled sodas.

• Butter is consumed like water. “Watch how we eat our bread and rolls,” another told me. “We slather it on, carefully covering the entire surface of a piece of toast or roll, and would be horrified if bread were not served with butter.” She was right. At the various dinner-lecture evenings we attended, I noticed that everyone split open their roll and carefully used up the two pats of butter placed next to their plate. And at the ubiquitous breakfast buffets, the toast had a thick layer of butter before being layered with several slices of fatty bacon and/or sausage.

• Snack foods are very high in fat as well as sugar. Our low or fat-free starchy snacks like pretzels, rice crackers, and popcorn are not that common and people will, for example, eat scones, pastry tarts, doughnuts, and turnovers with an afternoon cup of coffee.

• As in the U.S., too little exercise is also linked to obesity among adults and children. Long commuting times and work hours, lack of physical education in schools, and disinterest in playtime for children add up to a sedentary lifestyle.

Advice on stopping and reversing obesity was similar to that in the States: cut out sugar, increase physical activity, and eat less meat (they are great meat consumers). Also, consume more fruits and vegetables, whole-grain products and low-fat dairy foods. But none of these recommendations addressed what I was told was the major contributing factor to obesity: alcohol intake.

Everyone I asked told me that many Australians might drink a bottle of wine every night at dinner and then really drink over weekends. A physician friend said that binge drinking was common and not just among the young.

A young female wellness advocate said that she is pressured to drink excessively when out with friends. She went on to tell me that no one talks about the calories people consume from alcohol. It is rarely mentioned as a cause of obesity. And no attempt is made to decrease alcohol intake to promote weight loss. “The reason,” she went on before I could ask, “is that drinking is a cultural thing. It is who we are, what we do, and people are not willing to change. They focus on cutting out sugar even though that has 4 calories per gram and alcohol has 7.”
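Her 7-versus-4 point is easy to put in concrete terms. A quick sketch of the calories from the alcohol alone in that nightly bottle of wine (the 750 ml bottle size and 13% ABV are my assumptions, not from the source; ethanol’s density is about 0.789 g/ml):

```python
ETHANOL_KCAL_PER_G = 7.0          # vs. 4 kcal/g for sugar, as she noted
ETHANOL_DENSITY_G_PER_ML = 0.789  # approximate density of ethanol

def alcohol_kcal(volume_ml: float, abv: float) -> float:
    """Calories contributed by the ethanol alone in a drink."""
    grams_ethanol = volume_ml * abv * ETHANOL_DENSITY_G_PER_ML
    return grams_ethanol * ETHANOL_KCAL_PER_G

# An assumed 750 ml bottle at 13% ABV: about 77 g of ethanol,
# or roughly 540 kcal before counting any sugar in the wine itself
print(round(alcohol_kcal(750, 0.13)))
```

That is a quarter of a 2,000-calorie day from the alcohol alone, which is the point she was making.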

She was right. Scanning articles suggesting ways of losing weight, I found all the familiar 21st century recommendations such as eating gluten-free foods, drinking smoothies made of lemon juice, kale, and kangaroo tail (no, not really), avoiding all sugar, fasting and feasting diets, and lap banding, an increasingly popular form of bariatric surgery to shrink the stomach.

After spending only two weeks in Australia, I hardly qualify as an expert on any aspect of their obesity problems. It took me almost this long to learn how to order coffee (black, white, long, flat white). But I suspect that just as with the U.S., the medical and financial costs of obesity will bring about changes, even in the current untouchable aspects of their butter, meat and alcohol intake. If not, most of the population will end up looking like wombats.


Are Sidewalks the Answer to Weight Loss?

We all know the mantra by now: If you want to lose weight or prevent weight gain, you have to exercise along with eating healthfully. What kind of exercise? Why walking, of course. Your doctor mumbles something about walking three or four times a week while writing out the requisitions for lab tests at the conclusion of your annual physical. You want to ask how you are going to manage to do this under the blazing sun and humidity of the summer, or the dark, cold, icy, snowy days of winter, or on leaf-slick sidewalks after a November rainstorm. Or, if you can get another question in before being ushered out of the office, where are you going to walk since you live in a neighborhood without sidewalks?

Cities and older suburban communities usually have sidewalks. Those sidewalks may not be free of snow in the winter, and they may be cracked and jagged from old tree roots pushing up the pavement, but at least residents don’t have to walk in the road. This is not the case in many parts of the country, where sidewalks and residential areas often part ways. If walking is to be done, it has to be on roads that often have no shoulders where one can stand to avoid being hit by a delivery truck or a mammoth SUV. If the side of the road has dense vegetation or rocks, even standing there may be perilous, since there is little space for one’s feet. More than once, I have stayed at a hotel/convention center in a suburban industrial park for meetings and been forced to walk or run on highways with sand and pebbles flying in my face from passing trailer trucks. And although some suburban communities, often gated, have roads relatively free of traffic, the mind-numbing effect of walking round and round streets with only houses and not a store in sight is enough to send one back inside.

Walking is the easiest, most convenient, and least expensive way to exercise, and there is data to support the notion that those who walk most may be the healthiest. [1] New Yorkers are supposed to be the fastest walkers in the country and may be among the healthiest. Today, the life expectancy of a baby born in New York is 80.9 years, which is 2.2 years more than the national average. [2] Of course, these city residents don’t walk just for the exercise; it is often the most efficient and even fastest way for them to go from point A to point B.

The Bureau of Transportation Statistics reported several years ago that in the summer of 2002, 86 percent of the country’s 205 million Americans walked at least once a month, and 40 percent walked on more than 15 days per month. The presence of sidewalks increased the tendency of adults to take walks, and the Bureau suggested that adding sidewalks to communities without them would add another 2.8 million walkers. (In all fairness, some of these non-sidewalk communities may have walking trails or parks.) [3]

To be sure, there are many alternative ways of exercising: health clubs, home treadmills, recreational sports from skiing to swimming, dancing, rock climbing, and more. All of these will certainly use up calories and increase stamina, bone strength, and perhaps even cognitive acuity. However, unlike most of these other forms of exercise, except perhaps biking and rollerblading, walking permits multi-tasking, which, in our overly busy lives, makes it a much more attractive form of exercise. Walking the dog, walking your children to school, or walking to do errands, to the post office, to a doctor’s appointment, to a neighbor’s house, or to a movie, theater, concert, or restaurant means that the walker is not only burning calories but also accomplishing some other goal. Indeed, there is a website called WalkScore that evaluates over 10,000 neighborhoods in 3,000 cities for their walkability, i.e., the ease of doing errands and getting to recreational sites on foot.

Lifestyle change is the buzzword defining what has to happen if the obese are to become and remain thin, and the thin not to become obese. The ability to walk in safety, on sidewalks that lead to somewhere interesting, must be part of this healthy lifestyle if people are to follow it. Those who are fortunate enough to live in neighborhoods with sidewalks and places to walk to are aware of the positive impact this has on their lives. Social interactions are as likely to involve taking a walk as sitting and eating. Being out on the sidewalk has often led to friendships among neighbors who otherwise always leave the house in a car.

To be sure, having sidewalks and places to walk to does not eliminate the discomfort and even hazards of bad weather, of dog owners who do not clean up after their pets, or of having to watch for cars when streets must be crossed. But it certainly makes it easier to respond to the suggestion to get out and walk, especially when it means you don’t have to worry about finding a place to park your car.