Mother and infant

Maternal diet during pregnancy is thought to be one of the most influential factors in child health and development. However, dietary interventions begun during pregnancy may already have missed a critical window to improve health in childhood, as well as in adult life.

In a recent series of articles in The Lancet, researchers address the significance of nutrition in the preconception period, the time before a woman becomes pregnant. The three-article series challenges the current definition of the preconception period as the 3 months before conception: the authors argue it should extend to any time a woman is of childbearing age. This revision is based on an understanding of the biological events of the periconceptional period, or the time immediately surrounding conception. It also addresses the lack of nutritional preparedness for pregnancy among women of reproductive age and the failure of dietary interventions during pregnancy to prevent adverse health outcomes.

How does pre-pregnancy nutrition affect child health?

The periconceptional period begins before fertilization, with the maturation of sperm and oocytes, and extends until implantation of the fertilized egg. From fertilization onward, this process occurs relatively quickly (up to 9 days in humans) but is characterized by drastic developmental, genetic, and metabolic changes. The embryonic genome undergoes epigenetic modifications, or alterations that do not change the genetic code itself but rather affect how a gene is expressed by turning expression on or off. These modifications are responsive to environmental conditions and nutrient availability, and likely adapt to promote optimal survival under existing conditions. However, the established gene expression pattern may be detrimental under environmental conditions outside of the uterus, promoting disease development later in life.

Although epigenetic changes can occur throughout one’s lifetime, the periconceptional period is unique in that only a small number of cells are present. Full exposure to the environment allows this founder population of cells to establish the genetic program that persists throughout development.

How does this change current practice?

The influence of maternal nutrition during the periconceptional period on disease and development in offspring is not a new concept. Both maternal overnutrition and obesity, as well as undernutrition, are known to adversely affect metabolic regulation in offspring and to increase the risk of developing metabolic disease.

More recently, analysis of the UK National Diet and Nutrition Survey suggests that less than 10% of women of reproductive age meet the intakes recommended during pregnancy for several key micronutrients, including zinc, vitamin A, folate, and calcium. Only 30% of women meet the daily intake recommendation for iron. The lack of success of multiple-micronutrient supplementation during pregnancy in improving child health outcomes, including survival, growth, body composition, and blood pressure, indicates the importance of correcting such nutritional deficiencies well before pregnancy.

These findings suggest that preconception intervention strategies should include population-level interventions for all women of reproductive age, in addition to those targeting the 3 months before conception. This would allow adequate time to correct nutritional deficiencies before pregnancy.

Suppose you’ve been told to eat an anti-inflammatory diet, or maybe you’re a practitioner whose clients want to know whether this is right for them. Before hopping on this buzzy bandwagon, ask yourself ‘For what purpose?’

Without missing a beat, you say ‘Well, to reduce my inflammation!’

That is technically a noble intention, but the term ‘inflammation’ is used loosely in everyday conversation and is more misunderstood than one might initially believe. So let’s address the elephant in the room, dive in, and answer a few key questions: What is inflammation in the first place? What factors (dietary and otherwise) contribute to, or mitigate, it? And finally, how might we modify our diets and our behavior to reduce it?

 


What is inflammation?

In broad terms, inflammation is the immune system’s response to a stimulus.1 It can follow common injuries such as burning your finger or falling off a bicycle, after which you feel the affected area become red, warm, and puffy. This is a localized response to injury, characterized by ‘increased blood flow, capillary dilation, leucocyte infiltration, and production of chemical mediators.’2 In short, an inflammatory response means the innate (non-specific) immune system is ‘fighting against something that may turn out to be harmful.’

It turns out that while inflammation is often cast in a negative light, it is actually essential in small amounts for immune surveillance and host defense.2 In true ‘Goldilocks’ form, too little and too much inflammation both pose problems; in fact, most chronic diseases are thought to be rooted in low-grade inflammation that persists over time. This inflammation may go unnoticed by the host (you!) until overt pathologies arise, including but not limited to diabetes, cardiovascular disease, nonalcoholic fatty liver disease, obesity, autoimmune disorders, inflammatory bowel disease, and even clinical depression. This concept is called ‘the inflammation theory of disease,’ in which inflammation is the common underlying factor among the leading causes of death.3

How do we measure inflammation?

Although measuring low-grade chronic inflammation (read: a chronic, low-grade immune response) carries a number of limitations, studies frequently measure cellular biomarkers such as activated monocytes, cytokines, chemokines, various adhesion molecules, and adiponectin, as well as non-specific markers such as C-reactive protein, fibrinogen, and serum amyloid A. Key inflammatory pathways include sympathetic activity, oxidative stress, nuclear factor kappa B (NF-kB) activation, and proinflammatory cytokine production.4 Now you might wonder, ‘What does this mean for me? What modifiable factors can activate my key inflammatory pathways?’ To address this question appropriately, let us turn our attention to both dietary and behavioral moderators.


What makes up an anti-inflammatory diet?

Prolonged low-grade inflammation is associated with excessive oxidative stress and altered glucose and lipid metabolism in our fat (adipose) cells, muscle, and liver.4 Research therefore suggests that certain dietary components can modulate these key inflammatory pathways and the resulting clinical pathologies. Dr. Barry Sears explains in a review paper that “anti-inflammatory nutrition is the understanding of how individual nutrients affect the same molecular targets affected by pharmacological drugs.”5

Compelling research from large-scale, longitudinal observational studies, including the Women’s Health Initiative Observational Study6 and the Multi-Ethnic Study of Atherosclerosis (MESA),7 suggests that a calorie-appropriate diet that is low in refined carbohydrates, high in soluble fiber, high in monounsaturated fatty acids, higher in its omega-3 to omega-6 ratio, and high in polyphenols has anti-inflammatory effects on the body. A Mediterranean dietary pattern that incorporates olive oil, fish, modest lean meat consumption, and abundant fruits, vegetables, legumes, and whole grains shows more anti-inflammatory effects than a typical American dietary pattern. Other observational and interventional studies suggest that dietary patterns incorporating green and black tea, walnuts, ground flaxseed, and garlic are also associated with reduced inflammation.


Can my stress levels influence inflammation, too?

To end our discussion with dietary strategies alone would leave the story half told. In fact, “Communication between the systemic immune system and the central nervous system (CNS) is a critical but often overlooked component of the inflammatory response to tissue injury, disease or infection.”3

Behavioral studies have shown that prolonged psychological stress can activate the same pro-inflammatory pathways we’ve been discussing all along. Chronic psychological stress not only promotes over-expression of pro-inflammatory mediators, it can also promote overeating unhealthful foods in the absence of hunger.8 Repeatedly stress-eating calorie-dense, nutrient-poor foods further exacerbates psychological distress, creating a vicious cycle of stress-eating, and over time promotes adiposity, which, as described above, is itself a pro-inflammatory state.


Integrative strategies and considerations

This ‘cross-talk’ between the brain and body suggests that strictly dietary or strictly behavioral interventions are not enough to reduce inflammation on their own. Instead, we must consider integrative diet and lifestyle prevention and intervention strategies simultaneously. Going forward, we will need better biomarkers, more research on individual responses to diet (personalized nutrition!), and a better understanding of how food components and behavioral factors modulate the genetic targets involved in the inflammatory response.

 

References:

  1. What is an inflammation? National Center for Biotechnology Information. https://www.ncbi.nlm.nih.gov/pubmedhealth/PMH0072482/. Published January 7, 2015. Accessed March 16, 2018.
  2. Hunter P. Stress, food, and inflammation: psychoneuroimmunology and nutrition at the cutting edge. EMBO Reports. November 2012. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3492709/. Accessed March 16, 2018.
  3. Hunter P. The inflammation theory of disease. EMBO Reports. November 2012. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3492709/.
  4. Galland L. Diet and inflammation. Nutrition in Clinical Practice. December 7, 2010. https://journals.sagepub.com/doi/abs/10.1177/0884533610385703.
  5. Sears B, Ricordi C. Anti-inflammatory nutrition as a pharmacological approach to treat obesity. Journal of Obesity. 2011. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2952901/.
  6. Thomson CA, et al. Association between dietary energy density and obesity-associated cancer: results from the Women’s Health Initiative. Journal of the Academy of Nutrition and Dietetics. https://www.ncbi.nlm.nih.gov/pubmed/28826845.
  7. Associations of dietary long-chain n-3 polyunsaturated fatty acids and fish with biomarkers of inflammation and endothelial activation (from the Multi-Ethnic Study of Atherosclerosis [MESA]). The American Journal of Cardiology. March 4, 2009. https://www.sciencedirect.com/science/article/pii/S0002914909001088.
  8. Tryon MS, Carter CS, DeCant R, Laugero KD. Chronic stress exposure may affect the brain’s response to high calorie food cues and predispose to obesogenic eating habits. Physiology & Behavior. 2013;120:233-242.

Potential mothers, new mothers and mothers of multiples often worry about how their nutrition will affect their children. With the high rates of childhood obesity, some mothers worry more than ever about what they are putting in their bodies. Although this could be a discussion that includes pesticides on food, chemicals in cleaners and even air pollution, let’s just focus on nutrition.

 

It has been shown that a strong predictor of a child’s future BMI is the mother’s pre-pregnancy BMI (Schou-Anderson et al, 2012). This prediction has two sources: environment (how parents eat directly influences how children eat) and genetics (especially epigenetics). Epigenetics is how our cells control gene expression without changing the underlying DNA sequence, and it includes both DNA methylation and histone modification. It is consistently reported that a maternal high-fat diet can directly alter DNA binding sites (Aagaard-Tillery et al, 2008) and DNA methylation (Dudley et al, 2011) in the offspring. High-fat diets here means energy-dense diets providing >45% of total calories from fat, essentially mirroring the typical Western diet, which is full of highly palatable, highly processed, energy-dense foods. While this is certainly not a comprehensive list of publications on this topic, it is safe to say that maternal diet can influence an offspring’s risk of developing obesity through epigenetics (a nice review here). Hence the idea that whatever you eat, your baby also experiences.
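For concreteness, the >45%-of-calories cutoff can be checked from macronutrient grams using standard Atwater factors (9 kcal per gram of fat, 4 kcal per gram of carbohydrate or protein). The function name and gram values below are illustrative only, not taken from any of the studies cited:

```python
# Atwater general factors: kcal per gram of each macronutrient
ATWATER = {"fat": 9, "carb": 4, "protein": 4}

def percent_calories_from_fat(fat_g, carb_g, protein_g):
    """Return fat's share of total calories, as a percentage."""
    total_kcal = (fat_g * ATWATER["fat"]
                  + carb_g * ATWATER["carb"]
                  + protein_g * ATWATER["protein"])
    return 100 * fat_g * ATWATER["fat"] / total_kcal

# Hypothetical day of intake: 100 g fat, 150 g carbohydrate, 80 g protein
pct = percent_calories_from_fat(100, 150, 80)
print(round(pct, 1), pct > 45)  # prints: 49.5 True (high-fat by this cutoff)
```

Note that "high fat" is defined by the share of calories, not grams: a day with modest fat grams can still cross the 45% line if total calories are low.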

 

While this may not be a novel concept, it is more important than ever to educate mothers (and fathers!) about the influence their diet could have on their future children’s body composition and overall risk for obesity-associated diseases. This information may initially leave parents anxiously asking questions like “Is there anything I can do?!”, “Is the damage already done?”, or “What could I have done differently?” Our goal is to provide information that is both reassuring and accurate: with the right nutritional decisions, your child will be just fine!

 

Multiple studies have shown that interventions targeting eating patterns and exercise work for reducing obesity and the risk of associated diseases (reviewed here, here, here, here, here, etc.).

 

The trick? Implementing these changes in your family’s diet and exercise routines to change the trajectory that epigenetics may have imposed when your little one was no larger than a grain of rice.

 

Sunday morning at ASN’s 2017 Scientific Sessions and Annual Meeting began with the “Nutrigenomics and Personalized Nutrition” session hosted by the ASN Nutrient-Gene Interactions RIS. The presentations that followed addressed aspects of the question: how do genes and nutrition interact?

Silvia Berciano, Tufts University, opened the session with her presentation on “Behavior related genes, dietary preferences and anthropometric traits.” She explored how genes involved in behavioral and psychological traits influence dietary habits, which subsequently affect physical traits. Data from the Genetics of Lipid Lowering Drugs and Diet Network (GOLDN) Study were used to investigate how behavioral candidate genes, food preferences, and anthropometric traits interact. Her study found multiple genes associated with various dietary components; for example, chocolate intake was linked to a variant at the OXTR locus, which in turn was associated with waist circumference. Her results support the idea that mapping such genes could lead to a better understanding of how to personalize nutrition.

Krittikan Chanpaisaeng, Purdue University, next presented on “Femoral and L5 Spine Trabecular Bone Are Differentially Influenced by Dietary Calcium Restriction and Genetics in Growing Mice.” Her study explored site-specific effects of dietary calcium intake on trabecular bone mass in the right femur and L5 vertebra, using genetically distinct mouse lines randomized to receive adequate or low calcium intake. She found that low calcium intake had a negative effect on trabecular bone mass, especially on the structural integrity of the femur as compared with the L5 vertebra. This suggests that low calcium intake has a site-specific effect on trabecular bone mass. Further, there was large phenotypic diversity among the mice for all femur and spine parameters, indicating that genetics may be a major regulator of trabecular bone mass. These findings demonstrate that genetic mapping can help identify skeletal sites more susceptible to bone loss, which could in turn be used to identify individuals more vulnerable to osteoporosis.

From the same research group, James Fleet, Purdue University, shared his talk, “Multi-Trait Genetic Mapping Reveals Novel Loci Responsible for Genetic and Genetic-by-Diet Interactions Affecting Bone, Vitamin D, and Calcium Metabolism.” Fleet used multi-trait analysis to examine interactions between bone mass, calcium, and vitamin D phenotypes, applying Principal Components Analysis to look for patterns and relationships among the phenotypes. This approach found loci not identifiable on a single-trait map, and it can be employed to establish a framework for understanding how genetic variation interacts with calcium intake to affect components of bone mass.

Maxwell Barffour, University of California Davis, then presented his findings on “Hemoglobinopathies and Child Feeding Practices as Predictors of Anemia in Rural Laotian Children: Evidence from the Lao Zinc Study.” Anemia tends to peak around the age of complementary feeding due to inadequate feeding practices. He examined how hemoglobin E trait (HbE), a common structural hemoglobin variant in Southeast Asia, was associated with anemia risk. Children aged 6-23 months from the Lao Zinc randomized controlled trial were genotyped as normal (HbAA), heterozygous (HbEA), or homozygous (HbEE). Feeding practices were assessed using a 24-hour dietary recall administered to the parents. His study found that homozygotes had more than triple the risk of anemia and heterozygotes a 40% greater risk. His findings suggest that among breastfed children, consumption of iron-rich foods, and especially food diversification, may reduce anemia risk in this population.

Riva Sorkin, University of Toronto, examined how salivary amylase may influence consumption of a high-starch diet in her presentation, “Genetic variation in the AMY1 gene is associated with dietary carbohydrate and starch intake in a young adult population.” Salivary amylase is encoded by the AMY1 gene, which shows high copy number variation, meaning that the genomic region containing it is duplicated or deleted across chromosomes. Single nucleotide polymorphisms near the amylase genes have been shown to be associated with AMY1 copy number. Sorkin used data from the ethno-culturally diverse Toronto Nutrigenomics and Health Study and genotyped participants for AMY1 single nucleotide polymorphisms; participants also completed questionnaires about their health, lifestyle, and physical activity. She found that AMY1 polymorphisms were linked to dietary intake patterns; for example, East Asian carriers of the minor allele of rs1999478 had significantly higher total energy and sugar intakes. Her study suggests that as-yet-unknown mechanisms, including genetic polymorphisms, affect satiety and appetite regulation and should continue to be explored.

M. Elizabeth Tejero, INMEGEN, discussed how the response to a fish oil supplementation trial in Mexicans (aged 18-40 years) was influenced by genetics during her talk on “Differences in the transcriptome of responders and non-responders on glucose metabolism markers after fish oil supplementation.” Participants took fish oil capsules (DHA and EPA) for 6 weeks. Those with the largest reduction or increase in fasting insulin were then pair-matched by age, sex, change in omega-3 index, and BMI. Between these two groups there was no difference in gene expression before the trial; afterward, however, there were changes in genes related to the inflammatory response and glucose metabolism. Participants with the largest reduction in fasting insulin exhibited more changes in these genes than participants with the largest increase. Her findings support the idea that dietary intake can influence gene expression.

Next, Tolunay Aydemir, University of Florida, presented her talk, “Zip14-Mediated Zinc Transport Contributes to Regulation of Glucose Homeostasis in Intestine, Pancreas and Liver.” Her study used a ZIP14 knockout mouse model to support findings that ZIP14 is vital for regulating signaling events in glucose homeostasis and inflammation. The knockout mice showed impaired activity of the zinc-dependent insulin-degrading proteases insulin-degrading enzyme and cathepsin D, which increased the activity of the insulin receptor. These mice also displayed greater hepatic glycogen synthesis and impaired gluconeogenesis and glycolysis, linked to reduced cytosolic zinc levels. Her study showed that ZIP14-mediated zinc transport contributes to intestinal barrier function, insulin biosynthesis and secretion, insulin receptor activity, and hepatic glucose homeostasis, suggesting that ZIP14 could be a new target for treating diabetes and other glucose-related disorders.

Qiaozhu Su, University of Nebraska-Lincoln, ended the session with her presentation on “MicroRNAs in the Pathogenesis of Lipogenic Diet Induced Hyperlipidemia and Insulin Resistance.” Her study investigated how microRNAs regulate the development of hyperlipidemia and insulin resistance induced by a high-fructose diet. Illumina small-RNA sequencing was used to identify microRNAs altered in response to a high-fructose diet in mouse livers. Su found that genetic depletion of one microRNA protected the mice from fructose-induced insulin resistance, while overexpression of another led to inflammatory stress in the liver. These findings suggest it may be possible to manipulate these microRNAs to prevent the development of high-fructose-diet-induced hyperlipidemia and insulin resistance.

By Marion Roche, PhD

The target set out by the World Health Assembly is to reduce anemia in women of reproductive age by 50% by 2025. Women and girls number about 3.5 billion on our planet. To reach this World Health Assembly target, it will be essential to address anemia in the world’s 600 million adolescent girls, whose nutrition has recently been getting more attention.

The global birth rate has declined over the past decade, except among adolescent girls, who account for 17-20 million pregnancies per year. Eleven percent of all pregnancies are to adolescents, and 95% of these adolescent pregnancies occur in developing countries.

Complications from pregnancy and childbirth are the second greatest contributor to mortality for girls 15-19 years of age. Young maternal age increases the risk of anemia during pregnancy, yet adolescent women are less likely than older women to be covered by health services, including micronutrient supplementation. Compared with pregnancy in older mothers, pregnancy during adolescence is associated with a 50% increased risk of stillbirth and neonatal death, and a greater risk of preterm birth, low birth weight, and small-for-gestational-age (SGA) infants (Bhutta et al, 2013; Kozuki et al, 2013; Gibbs et al, 2012).

Reducing anemia in adolescents is often motivated by efforts to improve maternal and newborn health outcomes for pregnant adolescents; however, benefits for improving adolescent school performance and productivity at work and in their personal lives should also be valued.

Globally, iron deficiency anemia is the third most important cause of lost disability-adjusted life years (DALYs) in adolescents, at 3%, behind only alcohol and unsafe sex (Sawyer et al, 2012).

Adolescents have among the highest energy needs, yet in developing countries many struggle to meet their micronutrient needs. The World Health Organization recommends intermittent or weekly iron-folic acid (IFA) supplements for non-pregnant women of reproductive age, including adolescent girls. IFA supplementation programs have often been designed to be delivered through existing health systems, without specific strategies for reaching adolescent girls.

I have heard adolescence referred to as “the awkward years” when individuals explore self-expression and autonomy, but it is also definitely an awkward period for public health services in terms of delivering nutrition, as we often fail to reach this age group.

There have been examples of programs going beyond the health system to reach adolescent girls, such as through schools, peer outreach, factory settings where adolescents work in some countries and even sales in private pharmacies to target middle and upper income adolescent girls.

The Micronutrient Initiative implemented a pilot project with promising results in Chhattisgarh, India, where teachers distributed IFA supplements once per week to 66,709 female students during the school year, over a 2-year pilot.

It was new for schools to become involved in distributing health commodities, but engaged teachers proved to be effective advocates. There were also efforts to reach even more vulnerable out-of-school girls through the integrated child development centers, yet this proved a more challenging group of adolescents to reach; peer-to-peer outreach by the school girls offered a potential strategy. The project is now being scaled up to reach over 3.5 million school girls.

Adolescent girls have much to offer to their friends, families and communities beyond being potential future mothers. It is time to get them the nutrients they need to thrive in school, work and life.

By Meghan Anderson Thomas

The age of menarche has decreased significantly in the past century, from an average of 16-17 years old to younger than 13 years of age (Buttke, Sircar, & Martin, 2012). There are several theories as to why this may be occurring. Some believe that environmental toxins or exposure to endocrine-disrupting compounds (EDCs) may play a role; EDCs are found in household plastics, cleaners, deodorizers, and personal care products. Other theories point to increased body mass index in children and adolescents, since elevated hormone levels in obese children may be responsible for the earlier onset of puberty. Finally, nutritional factors such as breast versus bottle feeding and increased dairy and meat intake in adolescence may also play a role in puberty at younger ages.

EDCs include benzophenones, dichlorophenols, parabens, and triclosan: compounds that affect estrogen signaling by binding to the estrogen receptor and exerting downstream effects (Buttke, Sircar, & Martin, 2012). These compounds are increasingly common in everyday household use, and this type of exposure may be implicated as one cause of the decreased age of menarche. In a 2012 study by Buttke et al, urinary EDC levels were analyzed in females aged 6-11 and 12-19; females with urinary EDC levels above the 75th percentile had a significantly lower age of menarche (Buttke, Sircar, & Martin, 2012). These results are worrisome because pollutants in our environment are influencing adolescent development, a larger public health concern than previously believed. Further investigations are underway to better understand which products are the most dangerous culprits.

Obesity has become a major epidemic, with two-thirds of American adults and one-third of children overweight or obese. While obesity in adulthood can lead to a plethora of health concerns, it was previously thought that the effects of childhood obesity might be reversible. However, obesity in young females has been shown to influence early-onset puberty. Obesity increases the levels of certain hormones, including leptin, insulin, IGF-1, certain binding proteins, and androgens (Marcovecchio & Chiarelli, 2013). Early signs of puberty are not the only effects of the hormonal changes associated with obesity; hyperandrogenism may be present as well (Marcovecchio & Chiarelli, 2013). Hyperandrogenism involves increased body and facial hair, alopecia, acne, and increased libido. Both hyperandrogenism and earlier development may have profound social effects in adolescent females.

Approximately 75% of newborns are breastfed, but by just one week postpartum the incidence of breastfeeding drops to 16.2%. Approximately 20% of formula-fed infants are given soy-based formula (Andres, Moore, Linam, Casey, Cleves, & Badger, 2015). Isoflavones, organic compounds found in soy-based products that act as phytoestrogens in mammals, have raised fears of estrogenic effects such as early-onset puberty (Andres et al, 2015). The most recent study of hormonal additives was done in 1988 by the Joint FAO/WHO Expert Committee on Food Additives together with the Food and Drug Administration (FDA), and it found no concern for human consumption of hormonal additives (Larrea & Chirinos, 2007). Larrea and Chirinos later argued that this assessment is concerning because of the inadequate scientific elements on which it was based (Larrea & Chirinos, 2007). Furthermore, previous studies on the effects of hormonal additives on early onset of puberty are inconclusive, and further studies are still underway (Andres et al, 2015). The conclusions of the current longitudinal studies will be a vital factor not only in postpartum nutrition but in child and adolescent nutrition as well.

The significance of all these theories of early menarche lies in the psychosocial effects of early maturity on young girls and the unwanted attention they may receive. Early onset of puberty also gives women longer lifetime exposure to estrogen, which may be associated with several types of cancer, including breast and endometrial cancer. Estrogen exposure also increases the risk of cardiovascular disease and high cholesterol. These health-related side effects were significantly lower among women experiencing menarche at older ages. Clearly, more research is needed to investigate the multifactorial causes of early menarche in adolescents; however, current studies seem to implicate both environmental and nutritional exposures.

References
Andres A, Moore M, Linam L, Casey P, Cleves M, Badger T. Compared with feeding infants breast milk or cow-milk formulas, soy formula feeding does not affect subsequent reproductive organ size at 5 years of age. The Journal of Nutrition. March 2015.
Buttke D, Sircar K, Martin C. Exposure to endocrine-disrupting chemicals and age of menarche in adolescent girls in NHANES. Environmental Health Perspectives. 2012;120(11):2003-2008.
Larrea F, Chirinos M. Impact on human health of hormonal additives used in animal production. Rev Invest Clin. 2007;59(3):206-211.
Marcovecchio M, Chiarelli F. Obesity and growth during childhood and puberty. World Review of Nutrition and Dietetics. 2013;106:135-141.
NIH. Overweight and Obesity Statistics (2009-2010). National Institute of Diabetes and Digestive and Kidney Diseases. niddk.nih.gov. Accessed 2015.

By Ann Liu, PhD

Researchers are using carrots to produce a new tracer that will help scientists study vision and brain function. The results of this study were presented in the “Carotenoid and Retinoid Interactive Group: Bioavailability and Metabolism of Carotenoids and Vitamin A” on March 29 by Joshua Smith and John Erdman, PhD, from the University of Illinois at Urbana-Champaign.

Lutein is a carotenoid that accumulates in the retina of the eye and may protect the eyes from damage, especially age-related macular degeneration. It also accumulates in certain areas of the brain and may be beneficial for cognitive performance. However, little is known about how lutein accumulates in tissues such as the brain or how these tissues metabolize it. This led researchers to embark on a mission to develop lutein labeled with a non-radioactive, stable isotope (carbon-13) as a tracer for studying lutein metabolism in tissues.

Enter the colorful carrots. Carrots are a good source of lutein, but the amount of lutein can vary depending on the variety of carrot. Researchers tested seven different carrot cultivars that ranged in color from red to yellow to purple to see which one produced the most lutein. Then they had to culture the carrot cells in flasks and optimize the growing conditions to increase lutein production.

Once they had identified the optimal growing conditions, the carrot cells were fed carbon-13-labeled glucose. The lutein was then extracted and purified using reverse-phase high-performance liquid chromatography, and incorporation of the carbon-13 tracer was assessed using mass spectrometry. Approximately 58% of the lutein extracted from the carrot cells was uniformly labeled with carbon-13.

So what’s next for this new tracer lutein? The researchers plan to use it to study tissue accumulation of lutein in animal models before embarking on any studies in humans. They will also be going back to the lab bench to see if there are any more changes they can make to further improve their lutein yield.

This research was funded by a grant from Abbott Nutrition through the Center for Nutrition, Learning, and Memory at the University of Illinois.

By Kevin Klatt

The National Academies of Sciences, the World Health Organization, the American Association for the Advancement of Science, the European Food Safety Authority, and Food Standards Australia New Zealand are just a few of the organizations that have position papers on the use of genetic engineering as it applies to food. These reports all conclude that genetically modified foods present no unique safety threats compared with traditionally bred crops and/or have not been linked to detrimental human health outcomes (the Genetic Literacy Project has a nice infographic depicting these organizations here). Notably missing from this extensive list are nutrition organizations.

Two of the major American nutrition organizations are the Academy of Nutrition and Dietetics (AND) and the American Society for Nutrition (ASN). The AND does not currently have a position on genetic engineering; however, its member center (1) informs us that a new Evidence Analysis Library paper entitled “Advanced Technology in Food Production” is due to come out soon. ASN does not have an official position paper on genetic engineering, either. Rather, genetically engineered foods are briefly mentioned in two of their publications: “Processed Foods: contributions to nutrition” (2) and “Nutrition Research to Affect Food and A Healthy Lifespan” (3).

At a time when misinformation about genetically engineered crops is all too common in public discourse, it seems rather odd that neither of the two largest nutrition organizations is providing guidance on, or actively engaging in, this conversation. Nevertheless, the conversation continues on without nutrition. A quick look at the agenda (4) for the upcoming National Academies workshop on January 15-16, “When Science and Citizens Connect: Public Engagement on Genetically Modified Organisms,” highlights this disheartening reality: no one representing the field of nutrition is scheduled in the line-up of speakers or presenters.

There are likely many reasons why nutrition has abstained from the conversation. Genetically engineered foods inherently involve wildly interdisciplinary concepts, everything from sustainability to agricultural economics to plant genetics. Nutrition is certainly a piece of the puzzle, but it is not the whole thing. Alternatively, maybe we were not invited to the conversation. It only takes a few seconds of Googling to see that many nutrition professionals, particularly registered dietitians, have been rather vocal in the crusade against genetically engineered foods.

Regardless of the reason for nutrition’s absenteeism, the field should take a vested interest in influencing the conversation with its unique perspective. The scientific literature is filled with examples of genetic engineering with great potential for the field, even beyond the case of Golden Rice; folate-enriched tomatoes (5), calcium-enhanced carrots (6), non-browning Arctic Apples (7), and low-acrylamide potatoes (8) illustrate just a few of the ways that GE technology can be responsibly used to improve the nutrient quality of an individual’s diet. A recent paper in Nature Biotechnology thoroughly analyzed the status and market potential of transgenic biofortified crops, highlighting the wide spectrum of crops that have undergone nutrient biofortification and their potential role in human nutrition (9). As this paper highlights, the promise of GE foods is not that they will be the solution to improving diet, but rather that they can be part of the solution. We have been breeding crops with little to no consideration for the nutritional phenotype for centuries (10); nutritional scientists working with plant scientists (similar to what HarvestPlus currently does) could certainly alter that trajectory and improve the nutrient quality of the food supply. Beyond crops for human consumption, genetically engineering the feed consumed by farm animals can alter the nutrient profiles of animal foods; most recently, yeast genetically modified to produce omega-3 fatty acids (11) made headlines as a way to sustainably improve the fatty acid profiles of farmed salmon (12). Though likely far from market availability, the potential to genetically engineer animals themselves to alter their nutrient profiles has even been discussed (13).

While the promise of genetic engineering’s potential abounds, ASN’s publication on the future of nutrition research (3) asks us a rather prudent question: “Can we leverage technologies, such as biotechnology and nanotechnology, to develop novel foods and food ingredients that will improve health, both domestically and abroad, and provide credible, tangible functional health benefits?” As it stands now, the answer to that question is still unknown, and further research to identify the answer continues unguided by statements from professional nutrition organizations.

The scientific community as a whole could benefit from including nutrition in the genetic engineering conversation. As the National Academies prepare for this conference, and wonder how to improve communications between the public and scientists, I cannot help but see an alternative route through nutrition. While the National Academies seems to realize that crops resistant to already-feared agricultural chemicals offer an intangible benefit to consumers, their focus on chestnut trees, butterflies, and mosquitoes still feels distant. Addressing the conflation of genetic engineering with pesticide resistance is certainly a start, but these alternative applications of genetic engineering do not address food, which is where the controversy exists most prominently. To truly address this issue, the public is going to need to see a direct benefit from genetic engineering as it applies to food; nutrient biofortification offers a promising outlet for this. Imagine if individuals were introduced to genetically engineered foods through folate-enriched tomatoes instead of pesticide-resistant corn.

Position statements from nutrition organizations on the applications, safety aspects, and future directions of genetically engineered foods are long overdue. With the genetic engineering debate furthering consumer distrust in scientific bodies, it is all the more essential for prominent nutrition organizations to team up with other scientific bodies and enter into this conversation. The beneficiaries of our research and professional activities are those who eat food, some of which is genetically modified; we should no longer sit silent on this major food-related issue.

References
1. http://www.eatright.org/Members/content.aspx?id=6442482664
2. http://ajcn.nutrition.org/content/99/6/1525.abstract
3. http://advances.nutrition.org/content/4/5/579.full
4. http://nas-sites.org/publicinterfaces/files/2014/07/PILS-02-GMO-Interface-agenda05.pdf
5. http://www.pnas.org/content/101/38/13720.full
6. http://www.pnas.org/content/105/5/1431.long
7. http://www.arcticapples.com/arctic-apple-nutrition/
8. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2607532/
9. http://www.nature.com/nbt/journal/v33/n1/full/nbt.3110.html
10. http://www.ncbi.nlm.nih.gov/pubmed/20467463
11. http://www.ncbi.nlm.nih.gov/pubmed/20804805
12. http://civileats.com/2014/02/24/costco-to-sell-salmon-fed-gmo-yeast/
13. https://www.bio.org/sites/default/files/2011_ge%20animal_benefits_report.pdf

By Banaz Al-khalidi

November is National Diabetes Month and World Diabetes Day takes place yearly on November 14 to engage millions of people worldwide in diabetes advocacy and awareness. The International Diabetes Federation estimates that 382 million adults (20-79 years old) suffered from diabetes in 2013, which equates to a prevalence of 8.3%. To provide a better perspective by nation, the 10 countries with the highest prevalence of diabetes in 2013 were as follows: Tokelau (37.5%), Federated States of Micronesia (35%), Marshall Islands (34.9%), Kiribati (28.8%), Cook Islands (25.7%), Vanuatu (24%), Saudi Arabia (24%), Nauru (23.3%), Kuwait (23.1%), and Qatar (22.9%). However, if we were to look at the 3 countries with the greatest number of people with diabetes, China ranks the highest (98.4 million), followed by India (65.1 million) and USA (24.4 million). These figures are quite alarming.
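These prevalence and count figures are related by simple arithmetic: prevalence is the case count divided by the adult population, so the 382 million cases and 8.3% prevalence together imply a global adult population of roughly 4.6 billion. A minimal Python sketch of that check (the Chinese adult population below is a hypothetical round figure for illustration, not from the IDF data):

```python
# Prevalence = cases / population, so population = cases / prevalence.
cases_2013 = 382_000_000      # adults (20-79) with diabetes, IDF 2013 estimate
prevalence = 0.083            # 8.3%

implied_adults = cases_2013 / prevalence
print(f"Implied global adult population: {implied_adults / 1e9:.1f} billion")
# -> about 4.6 billion

# The same arithmetic shows why China tops the absolute count without
# appearing on the top-10 prevalence list: a very large population
# dilutes the rate.
china_cases = 98_400_000
china_adults = 1_000_000_000  # hypothetical round figure for illustration
print(f"Illustrative prevalence for China: {china_cases / china_adults:.1%}")
```

The contrast between Tokelau (highest rate, tiny population) and China (highest count, moderate rate) falls directly out of this ratio.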

Of those suffering from diabetes, type 2 diabetes comprises almost 90% of people with diabetes around the world. As such, type 2 diabetes is one of the fastest growing health problems in the world. So what could be driving this epidemic?

Evidence from observational studies has consistently shown that low blood levels of vitamin D are associated with an increased risk of type 2 diabetes. The results of numerous observational studies led to speculation that vitamin D insufficiency contributes to the development of type 2 diabetes. Going back to the figures presented earlier, if vitamin D insufficiency is a risk factor for type 2 diabetes, one might also speculate that countries with a higher prevalence of diabetes face a coexisting problem of type 2 diabetes and vitamin D insufficiency. For example, Tokelauans (the nationals of Tokelau), who have the highest prevalence of type 2 diabetes, may also be at risk for vitamin D insufficiency despite living in a tropical marine climate. The question then becomes: could vitamin D be a causal factor in the development of type 2 diabetes? While this might sound too simplistic, I assure you it’s not.

Interpretation of evidence on vitamin D and type 2 diabetes is complicated for a number of reasons. First and foremost, observational studies do not tell us anything about the cause-effect relationship between vitamin D and type 2 diabetes because of possible uncontrolled confounding factors, such as physical activity, that may affect both vitamin D levels and the risk of type 2 diabetes. Second, observational studies cannot inform us about reverse causation. In other words, which comes first, the chicken or the egg? Third, there are a myriad of factors that affect vitamin D levels, including environmental, cultural, genetic and physiological factors. It remains unclear then whether there is a causal link between vitamin D and type 2 diabetes.

To answer this question, a large genetic study published in The Lancet Diabetes & Endocrinology examined the causal association between low blood levels of vitamin D and risk of type 2 diabetes. The study concluded that the association between vitamin D and type 2 diabetes is unlikely to be causal. The research, a Mendelian randomization study, examined the link between type 2 diabetes risk and vitamin D by assessing genetic variants that control blood levels of vitamin D. Most importantly, this design provides powerful control for the confounding and reverse causation that are issues of concern in observational studies, which may partly explain the discrepancy between the results of earlier observational studies and this study. However, results from Mendelian randomization studies still need to be interpreted with caution, as some of the method’s underlying assumptions may remain untested.
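The core logic of Mendelian randomization can be reduced to a simple instrumental-variable calculation. The sketch below is illustrative only; the coefficients are invented, not taken from the Ye et al. study. It shows the Wald ratio estimator, in which a variant’s effect on the outcome is divided by its effect on the exposure:

```python
# Minimal Wald-ratio sketch of Mendelian randomization.
# A genetic variant that raises 25-hydroxyvitamin D serves as an
# "instrument": because alleles are allocated at random at conception,
# the variant should be free of the confounding and reverse causation
# that complicate observational studies.

# Hypothetical per-allele effect estimates (illustrative numbers only):
beta_gene_on_vitd = 0.20   # SD change in 25(OH)D per allele
beta_gene_on_t2d = 0.002   # log-odds of type 2 diabetes per allele
se_gene_on_t2d = 0.010     # standard error of the outcome effect

# Wald ratio: implied causal effect of vitamin D on type 2 diabetes.
causal_effect = beta_gene_on_t2d / beta_gene_on_vitd
# First-order (delta-method) standard error, ignoring uncertainty in
# the gene-exposure estimate:
causal_se = se_gene_on_t2d / abs(beta_gene_on_vitd)

print(f"Wald estimate: {causal_effect:.3f} (SE {causal_se:.3f})")
# A confidence interval spanning zero, as with these illustrative
# numbers, is consistent with "no causal effect".
```

The validity of the estimate rests on the usual instrumental-variable assumptions (no pleiotropy, no linkage with confounding variants), which is exactly why results still warrant cautious interpretation.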

The take-home message is that no special recommendations can be made about vitamin D levels or supplementation for people with type 2 diabetes. However, long-term randomized trials of vitamin D supplementation remain important to elucidate vitamin D’s role in type 2 diabetes.

As we recognize National Diabetes Month, it is important to remind patients that diabetes is a progressive chronic lifestyle disease that can be controlled by making healthy lifestyle changes, such as partaking in regular physical activity, eating a balanced diet, maintaining a healthy body weight, taking prescribed medications, joining a smoking cessation program, and improving sleep patterns.

Reference
Zheng Ye, Stephen J Sharp, Stephen Burgess, Robert A Scott, Fumiaki Imamura, Claudia Langenberg, Nicholas J Wareham, Nita G Forouhi. Association between circulating 25-hydroxyvitamin D and incident type 2 diabetes: a mendelian randomisation study. The Lancet Diabetes & Endocrinology, 2014; DOI: 10.1016/S2213-8587(14)70184-6.

By Sheela S. Sinharoy, MPH

The 3rd Micronutrient Forum Global Conference took place from June 2-6, 2014 in Addis Ababa, Ethiopia, with approximately 1,000 attendees and more than 80 sessions. Some of my personal highlights were:

• Lindsay Allen’s talk on biomarkers for vitamin B12. Dr. Allen argued that depending on the biomarker used, vitamin B12 deficiency may be more prevalent than iron deficiency.
• Michael Fenech’s presentations on the exposome, especially the impact of nutrient deficiencies on the integrity of DNA. He has found that the DNA damage from folate deficiency is equivalent to the damage from 10 times the allowable annual exposure to ionizing radiation.
• Daniel Raiten and Bas Kremer’s talks on the importance of a systems biology perspective. It’s good to be reminded of the need for research on nutrient-nutrient interactions and the role of nutrient “clusters” within biological systems.

The most interesting session, however, was the plenary session on the risks and benefits of iron interventions. Many of us know that iron deficiency is the most common nutritional disorder in the world. It is a major cause of anemia but not always the dominant cause. We also know that the main anemia control strategy worldwide is iron supplementation. However, in cases of anemia that are caused by factors other than iron deficiency, iron supplementation can actually be harmful, exacerbating malaria and increasing pathogenic bacteria in the gut. How, then, to determine whether or not iron supplementation is appropriate?

One possible solution came from Sant-Rayn Pasricha, one of the speakers in the plenary, who presented research on the use of the hormone hepcidin to assess iron status. He and his co-authors found that measurement of plasma hepcidin concentrations is useful for detecting iron deficiency and is more sensitive than ferritin. It is also more practical than the current approach, which involves measurements of ferritin, soluble transferrin receptor, and C-reactive protein to assess iron status.

This is of major importance, especially for those of us who work in developing countries where anemia levels are high. In Dr. Pasricha’s sample of children in The Gambia and Tanzania, 61% had anemia, but only 13% had iron deficiency anemia. Under current recommendations, all of the anemic children would be given iron supplementation, even though most of them were not iron deficient. This is not only a poor use of resources but, more importantly, potentially hazardous.
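The mismatch in those figures can be made concrete with a little arithmetic. Under a hemoglobin-only rule, every anemic child is supplemented, so the share of supplemented children who do not actually have iron deficiency anemia follows directly from the two prevalences reported above:

```python
# Share of supplemented children without iron deficiency anemia (IDA),
# using the prevalences reported for the Gambian/Tanzanian sample.
anemia_prevalence = 0.61  # 61% of children anemic
ida_prevalence = 0.13     # 13% with iron deficiency anemia

# Hemoglobin-guided supplementation targets all anemic children;
# IDA is a subset of anemia, so the non-iron-deficient fraction is:
not_iron_deficient = (anemia_prevalence - ida_prevalence) / anemia_prevalence

print(f"Supplemented: {anemia_prevalence:.0%} of children")
print(f"Of whom not iron deficient: {not_iron_deficient:.0%}")
# -> roughly 4 in 5 supplemented children would not have IDA
```

In other words, about 79% of the children targeted by the current strategy would receive iron without having iron deficiency anemia, which is the scale of the misallocation (and potential harm) Dr. Pasricha’s data imply.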

Iron supplementation is normally guided by hemoglobin levels, which measure anemia but not iron deficiency. Is it time to replace hemoglobin testing with hepcidin testing? There is currently no low-cost assay for hepcidin, so this is not yet a practical solution in the field. In the meantime, it is important to consider the risks of infection and iron overload that can follow from inappropriate supplementation.

The knowledge I obtained at the Micronutrient Forum will undoubtedly enrich my work moving forward. As I continue to make my way through the articles referenced in various presentations, I am already looking forward to the 4th Micronutrient Forum Global Conference, scheduled for 2016 in Mexico.