Exploring chronic disease
Chickens beware. Your meat, and that of other animals, may soon be in higher demand. The problem is that tofu, a soy product often used to replace meat, has once again been tied to negative health consequences – in this case, memory loss and dementia.
Researchers at Loughborough University in England recently published two studies in the journal Dementia and Geriatric Cognitive Disorders which found that eating high levels of some soy products may raise the risk of memory loss.
The research team, led by Professor Eef Hogervorst, tracked soy intake and subsequent memory function in 719 elderly Indonesians living in urban and rural regions of the island of Java. They found high tofu consumption – at least once a day – was associated with worse memory, particularly among subjects over 68 years of age.
A variety of compounds are found in soy products, so naturally Hogervorst and other researchers are intent on discovering which specific compound, or group of compounds, found in soy might contribute to impaired health, such as memory loss among the elderly.
Hogervorst and team seem to be leaning towards the hypothesis that phytoestrogens – micronutrients found in soy that partially mimic the hormone oestrogen – may be to blame for the mental damage observed in their study subjects. According to Rebecca Wood of the Alzheimer’s Research Trust (which funded the study), oestrogens tend to promote growth among cells. Could there be something about faster-growing cells that is detrimental to the elderly brain?
Without much evidence at hand to prove the above hypothesis correct, Hogervorst and team have also considered the possibility that high levels of oestrogens might allow cells to be damaged more frequently by free radicals, yet the mechanisms behind such a hypothesis also remain cloudy.
Finally, the researchers have hypothesized that the observed mental decline in the Indonesian subjects might not have been caused by the tofu itself, but by formaldehyde, which is sometimes used in Indonesia as a preservative. Yet previous research has also linked high tofu consumption to an increased risk of dementia in older Japanese-American males – and American soy products do not contain formaldehyde – which renders this hypothesis rather dubious as well.
So how would one explain this evidence? Hogervorst and the research team seem to have their share of hypotheses, but here’s another: the reason why soy is harmful for older adults is that the primary isoflavone found in soy, Genistein, is immunosuppressive. Dr. Marshall has shown that Genistein dysregulates the innate immune response by binding and inactivating the vitamin D receptor (VDR). Just like exogenous vitamin D (known as 25-D), chlorogenic acid (found especially in coffee), or other compounds that negatively affect VDR activity, high levels of Genistein are able to slow the innate immune response (which is controlled by the VDR). As innate immune activity decreases, fewer of the chronic bacteria implicated in causing a plethora of physical and cognitive symptoms are killed. The result is a temporary decrease in the inflammation generated when the pathogens die, and a feeling of temporary palliation.
Dr. Hogervorst may not know it, but this data is consistent with that of Dr. Martha Payne. In 2007, Payne and her team showed that vitamin D intake in older adults correlated with a higher total volume of brain lesions. What’s the common thread between these two studies? In both cases, the elderly subjects are consuming foods (or supplements, in the case of vitamin D) that slow the VDR. The resulting immunosuppression allows numerous bacteria-driven chronic diseases – including Alzheimer’s and dementia – to develop with greater ease, and within the time parameters defined by the study. Younger, and presumably healthier, subjects would likely suffer the same detrimental effects given a longer window of time.
Dr. Hogervorst has stated that there is some evidence that soy may “protect” the brains of younger and middle-aged people from damage, but that its effects on the aging brain are less clear. The statement directly supports the hypothesis that soy is immunomodulatory. The periods of “protection” Hogervorst refers to are almost certainly periods of immunosuppression, which ultimately allow enough bacterial spread in the brain that elderly patients present as symptomatic.
Since the Th1 pathogens are picked up over the course of a lifetime, elderly people also generally have higher bacterial loads than their younger counterparts. Since many of the Th1 pathogens likely create ligands that also block the VDR, elderly patients are more susceptible to compounds like Genistein that contribute to VDR dysregulation. Because their VDRs have already been rendered largely inactive by the pathogens they harbor, smaller amounts of a compound like soy can more easily push the elderly “over the edge” into a state where they are strongly immunosuppressed.
Unfortunately the great majority of mainstream researchers remain unaware of Genistein’s negative effects on the VDR. And because the same researchers also fail to factor chronic bacteria into the pathogenesis of mental disorders such as Alzheimer’s and dementia, they are continually confused by other anomalies in their study data. For example, a different study recently found that eating tempe, a fermented soy product made from the whole soy bean, is associated with better memory. Huh?
Again, it is only when the new study is viewed through the lens of the Marshall Pathogenesis that the mental “improvements” observed in subjects can be better understood. Tempe is not only high in soy but also high in folic acid – a compound that bacteria use to their advantage. Upon consumption, folic acid is enzymatically converted into a form that the body can use to produce the nucleic acids that make up our DNA. However, if a person consumes extra folic acid, the Th1 pathogens are able to use the substance to generate their own nucleic acids, making it easier for them to replicate.
The result is that the Th1 pathogens find it easier to survive in a folate-rich environment, so high levels of the substance again prevent the immune system from effectively killing the chronic pathogens implicated in causing memory loss and dementia. By temporarily keeping many of these pathogens at bay, folic acid also prevents the rise in inflammation associated with their deaths, giving – at least in the short term – a mistaken impression of “wellness.”
When will the difference between palliation and true wellness become commonly understood? We know enough now to realize that the feel-good effects of food simply can’t be taken at face value.
In the meantime, there’s no need to be obsessive about avoiding soy or folic acid. Marshall recommends that soy products in the diet be kept to a minimum (particularly among patients on the MP). And because it’s excess folic acid that tends to most greatly benefit the bacteria we harbor, he recommends eating only those foods which naturally contain the substance. Products with added folic acid, particularly white bread products, should be avoided.
It’s not unusual for people, especially those with Th1 disease, to find that soy doesn’t always work well in their diet.
But the reason why soy appears to cause problems in many people with Th1 disease remained largely speculative until several months ago, when biomedical researcher Trevor Marshall used molecular modeling software to observe the way that the primary soy isoflavone (an antioxidant), called Genistein, interacts with the Vitamin D Receptor.
His models revealed that Genistein is a partial agonist (activator) of the VDR, and that the substance forms hydrogen bonds with several of the same residues as the vitamin D metabolite 1,25-D (which also activates the VDR).
But, Marshall warned, there’s a catch. Genistein doesn’t have the “tail” of Vitamin D or Benicar, meaning that it cannot transcribe certain genes “which need co-activators requiring helix 12 to be stabilized.”
This suggests that Genistein interferes with the operation of the VDR, as well as with the operation of two other receptors (PPAR-gamma and PPAR-alpha), all of which are key to the immune system.
Huh? If I just lost you on the last few sentences, they boil down to the following: Marshall’s model shows that Genistein binds and activates the VDR. Yet it appears to modulate its activity, and the activity of several nearby receptors, in a way that can dysregulate the activity of the immune system.
But the above conclusion is based on a molecular model. “How much faith can we put into molecular modeling research?” some skeptics might ask.
A study published this week by researchers at the Department of Biological Sciences at the University of Notre Dame in Indiana suggests that molecular modeling software, when used effectively, can be pretty darn accurate.
In a paper published in the Journal of Steroid Biochemistry and Molecular Biology, the team confirmed that Genistein does indeed bind and adjust the activity of the VDR.
The Notre Dame researchers had previously characterized an area in the human VDR gene (called a promoter) that, when bound by certain substances, adjusts the activity of the Vitamin D Receptor. When the team treated this promoter region with Genistein, they found that the substance up-regulated transcription from the VDR promoter. They verified these results with a second molecular technique called Western blot analysis.
Now that the ability of Genistein to affect the VDR has also been confirmed by a variety of molecular approaches, Marshall’s message that soy intake should be restricted rings louder and clearer than ever. This is particularly true for people with Th1 disease, who are extra sensitive to substances that dysregulate VDR activity. These people also need the receptor to function optimally in order to help the immune system target the Th1 pathogens.
According to Marshall, problems with soy are dose-dependent. Anything above the cited “20 grams of roasted soybeans,” or 40 mg of isoflavones, is likely to be a problem.
For decades, researchers working with L-form bacteria have warned that while standard antibiotic therapy successfully kills classical bacterial forms, it leaves bacteria that transform into the L-form unscathed. In fact, when beta-lactam antibiotics are administered to patients with acute infection, they actually foster the growth of L-form bacteria, meaning that patients treated with these antibiotics are likely to face numerous symptoms of chronic disease in the years to come.
Because L-form bacteria grow so slowly, few researchers have made the connection between acute infection and chronic symptoms that rear their heads decades down the road. However, several research teams have finally taken note of the fact that food poisoning victims, who at one point suffered from a severe acute infection, are much more likely to develop chronic symptoms later in life.
“It’s a dirty little secret of food poisoning,” says Lauran Neergaard of the Associated Press. “E. coli and certain other foodborne illnesses can sometimes trigger serious health problems months or years after patients survived that initial bout. Scientists only now are unraveling a legacy that has largely gone unnoticed.”
In a recent interview with the Associated Press, researchers at the University of Utah described how many people who survived severe E. coli infections as children suffer serious kidney problems later in life. The research team, which has long tracked children with E. coli, found that about 10 percent of sufferers develop a life-threatening complication called hemolytic uremic syndrome, or HUS, in which their kidneys and other organs fail.
Ten to 20 years after they recover, between 30 percent and half of HUS survivors will have some kidney-related problem, says Dr. Andrew Pavia, the university’s pediatric infectious diseases chief. That includes high blood pressure caused by scarred kidneys, slowly failing kidneys, even end-stage kidney failure that requires dialysis.
“Folks often assume once you’re over the acute illness, that’s it, you’re back to normal and that’s the end of it,” said Dr. Robert Tauxe of the Centers for Disease Control and Prevention. “The long-term consequences are an important but relatively poorly documented, poorly studied area of foodborne illness.”
Donna Rosenbaum of the consumer advocacy group STOP (Safe Tables Our Priority) told the Associated Press that every week, her group hears from patients with health complaints that they suspect, or have been told, are related to food poisoning years earlier. One woman who contacted the group survived severe E. coli at age 8, only to have her colon removed in her 20s. Other people report developing diabetes after a bout of food poisoning that inflamed the pancreas. Then there are children who needed dialysis after food poisoning as toddlers and are now suffering from learning problems.
“There’s nobody to refer them to for an answer,” says Rosenbaum.
So STOP this month is beginning the first national registry of food-poisoning survivors with long-term health problems — people willing to share their medical histories with scientists in hopes of boosting much-needed research.
One person to share her story is Alyssa Chrobuck of Seattle, who at age 5 was hospitalized as part of the Jack in the Box hamburger outbreak 15 years ago, in which several people ate meat contaminated with a dangerous form of E. coli.
Today she’s a successful college student but ticks off a list of health problems unusual for a 20-year-old: high blood pressure, recurring hospitalizations for colon inflammation, a hiatal hernia, thyroid removal, endometriosis.
“I can’t eat fatty foods. I can’t eat things that are fried, never been able to eat ice cream or milkshakes,” says Chrobuck. “Would I have this many medical problems if I hadn’t had the E. coli? Definitely not. But there’s no way to tie it definitely back.”
Further evidence connecting acute infection to chronic illness later in life comes from studies showing that about 1 in 1,000 sufferers of campylobacter, a diarrhea-causing infection spread by raw poultry, develop the far more serious Guillain-Barré syndrome a month or so later. The body attacks its own nerves, causing paralysis that usually requires intensive care and a ventilator to breathe. About a third of the nation’s Guillain-Barré cases have been linked to previous campylobacter infection – even when the diarrhea was very mild – and these patients typically suffer a more severe case than those who never had food poisoning.
Similarly, a small number of people develop what’s called reactive arthritis six months or longer after a bout of salmonella. It causes joint pain, eye inflammation, sometimes painful urination, and can lead to chronic arthritis. Certain strains of shigella and yersinia bacteria, far more common abroad than in the U.S., trigger this reactive arthritis, too, Tauxe says.
It remains to be seen how long it will take Tauxe and other researchers to connect these chronic conditions to persistent infection with L-form bacteria. But when the connection is finally made, the medical community will become aware that persistent L-form infection affects the entire population, not just food poisoning victims. As biomedical researcher Trevor Marshall describes, the chronic illnesses that people develop are a direct result of the “pea soup” of L-form bacteria and other pathogens they have collected over a lifetime.
The implications of this situation are clear: when a person develops an infection, treatment must be administered that kills both the classical and L-form bacteria. Happily, the Marshall Protocol works effectively to kill L-form bacteria. This means that people can now do a short course of the Marshall Protocol after developing an acute infection, allowing them to get rid of any L-form bacteria that have formed, and saving them from a host of chronic symptoms later in life.
A decade ago, after researchers linked folic acid with a reduced rate of a birth defect called spina bifida, the FDA mandated that the substance be added to wheat flour and other grain products. Since that time, 42 other countries have implemented some form of mandatory folic acid fortification based on the same premise.
However, recent research has revealed that over the past decade, rates of colorectal cancer in the United States have risen for inexplicable reasons, even as regular colonoscopy check-ups have become more common. In Canada, where folic acid fortification was introduced a bit later, the same trend has been observed.
Two recent commentaries appearing in Nutrition Reviews address these findings and provide an overview of the existing evidence on folic acid fortification and the associated policy issues.
Dr. Solomons, author of one of the commentaries, “Food Fortification with Folic Acid: Has the Other Shoe Dropped?”, advises that a careful reconsideration of the fortification program is needed, stating, “One size of dietary folic acid exposure does not fit all. It can be beneficial to some and detrimental to others at the same time.”
Since the risk-benefit value of fortification varies according to age, Solomons suggests a reevaluation of the manner in which folic acid intended to prevent birth defects is delivered to the public. Among other things, Solomons feels that folic acid supplementation could be targeted more specifically to women of reproductive age. Then, to protect the rest of the population, reducing folic acid levels in foods for which fortification is optional (such as ready-to-eat cereals and commercial drinks) would be worth considering.
“Folic acid supplementation wields a double-edged sword,” remarks Dr. Young-In Kim, author of “Folic Acid Fortification and Supplementation: Good for Some but Not So Good for Others,” the other commentary published in the November issue. According to Kim, “It may be beneficial or harmful, depending on the timing of intervention.” Exposure to high intakes of folic acid in early life and young adulthood may provide life-long protection against cancer formation in organs such as the large intestine, whereas such exposure later in life, once cell damage has occurred, can spur tumor progression.
These results should come as no surprise to people on the Marshall Protocol, who realize that the L-form bacteria at the root of most chronic disease use extra folic acid to their own advantage. If a person consumes too much folic acid, L-form bacteria will use it to generate their own nucleic acids, facilitating their ability to replicate.
Similarly, researchers in the UK are arguing against a measure that would require an increase in the amount of folic acid added to food products in England. Their concerns stem from the results of a recent study by the Institute of Food Research in the United Kingdom, which found that the liver can easily become saturated by folic acid at the levels the UK government plans to add to the food supply (levels that would equal those mandated by the American government).
Writing in the British Journal of Nutrition, the researchers warn this could lead to unmetabolized folic acid entering the blood, which could damage health. “With doses of half the amount being proposed for fortification in the UK, the liver becomes saturated and unmetabolised folic acid floats around the blood stream,” reports Dr. Sian Astley of the research team.
“This can cause problems for people being treated for leukemia and arthritis, women being treated for ectopic pregnancies, men with a family history of bowel cancer, people with blocked arteries being treated with a stent and elderly people with poor vitamin B status.”
“For women undergoing in-vitro fertilization, it can also increase the likelihood of conceiving multiple embryos, with all the associated risks for the mother and babies.”
Studies have confirmed that unmetabolized folic acid accelerates cognitive decline in the elderly with low levels of vitamin B12. It may also increase the incidence of breast cancer in postmenopausal women.
However, since the 1980s a consensus has formed that folic acid is metabolized in the small intestine in a similar way to naturally occurring folates.
“We challenge the underlying scientific premise behind this consensus,” states Astley, who also warned it could take 20 years for any potential harmful effects of unmetabolized folic acid to become apparent.
These studies are proof positive that adding artificial levels of a substance to the food chain can have serious, unintended, and detrimental effects on much of the population. While government efforts to help the public obtain substances such as folic acid have been made in good faith, the simplistic notion that a large, diverse population should all be given the same level of a substance has run its course.
In the case of vitamin D, mandatory fortification has resulted in disastrous consequences. Armed with the understanding that “vitamin” D obtained from supplements (25-D) is not a vitamin but a potent secosteroid that represses the innate immune system and the transcription of thousands of genes, it is of utmost importance to re-evaluate the vitamin D dietary guidelines. Molecular research implies that when it comes to folic acid, as well as vitamin D, we should be much less cavalier in our eagerness to fortify the food supply.
Amy Proal graduated from Georgetown University in 2005 with a degree in biology. While at Georgetown, she wrote her senior thesis on Chronic Fatigue Syndrome and the Marshall Protocol.