High red meat consumption is associated with increased risk of colorectal cancer. Various mechanisms have been proposed, including mutagenesis, alterations of the gut microbiome, and effects on local immunity and inflammation. This lack of well-defined mechanistic explanations for diet and cancer associations, coupled with our inability to derive causal inferences from population-based studies, allows us to rationalize that burger we ate at lunch or that steak we ate at dinner. The preparation and consumption of red meat is a major social and dining pleasure in Western culture, so there is resistance to concerning ourselves with the cancer risk associated with red meat. Should advertisements add a rapid-fire statement that consumption of more than half a portion of red meat per day has been associated with increased risk of cardiovascular disease, diabetes, cancer, and even death, or do we dismiss these data because they are not from randomized controlled trials? Would we heed a warning if there was evidence that burgers and steaks induced the expression of small noncoding RNAs that inhibit the expression of tumor-suppressor genes in our enterocytes? What level of evidence is necessary to convince ourselves that a dietary exposure is sufficiently causal to put a warning label on it? How do we experimentally obtain such evidence? If we knew the mechanism, perhaps we could modify the risk sufficiently such that we can have our steak and eat it too, without the warnings. Cancer Prev Res; 7(8); 777–80. ©2014 AACR.
See related article by Humphreys et al., p. 786
According to the World Cancer Research Fund/American Institute for Cancer Research (WCRF/AICR), consuming >500 g/week (∼70 g/day) of cooked red meat increases the risk of colorectal cancer, with the level of evidence considered convincing (1). In support of the WCRF/AICR evaluation of the level of evidence, Chan and colleagues reported that the risk of colorectal cancer increased by 29% for every 100 g/day of red/processed meat, with a plateau at ∼140 g/day (2). Overall, these results support a dose–response relationship between red meat intake and colorectal cancer risk. Similarly, results from the Nurses' Health Study and the Health Professionals Follow-Up Study showed a higher hazard of death from all causes, cardiovascular disease, and cancer and the potential to prevent 9.3% and 7.6% of deaths in men and women, respectively, if red meat intake was limited to 0.5 servings/day (42 g/day; ref. 3). As such, recommendations are made to the public to reduce consumption to this level. Yet despite evidence that red meat intake is likely, most likely, probably, or convincingly causally related to colorectal cancer, such findings are routinely qualified with statements such as "the study is observational, so causality cannot be inferred."
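As a rough illustration of the shape of this dose–response relationship, one can sketch a log-linear relative risk that flattens at the reported ∼140 g/day plateau. This is an assumption on our part about the functional form, not the fitted model from ref. 2:

```python
def relative_risk(intake_g_per_day, rr_per_100g=1.29, plateau=140.0):
    """Illustrative log-linear dose-response for red/processed meat.

    Assumes risk multiplies by rr_per_100g for each 100 g/day of intake
    and flattens at the reported ~140 g/day plateau. A sketch of the
    shape described in the text, not the model fitted in ref. 2.
    """
    dose = min(intake_g_per_day, plateau)
    return rr_per_100g ** (dose / 100.0)

# Relative risk at the WCRF/AICR threshold (~70 g/day) and beyond the plateau
print(round(relative_risk(70), 3))   # 1.195
print(round(relative_risk(200), 3))  # 1.428 (capped at the 140 g/day plateau)
```

Under this assumed form, intake at the WCRF/AICR threshold of ∼70 g/day would correspond to roughly a 20% increase in relative risk, and further increases beyond ∼140 g/day would add nothing.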
Unfortunately, findings from observational studies on diet and cancer risk are often met with skepticism from the scientific and medical communities and, increasingly, from the public. This uncertainty reflects concern over measurement error, the role of unmeasured confounders, and the inability to infer causality such that changing one factor in the diet would decrease an individual's risk. Thus, findings from association studies can easily be dismissed, which becomes particularly appealing with an exposure like red meat that is economically, socially, and culturally important. On the other hand, conducting randomized dietary interventions for cancer endpoints, as the necessary "gold standard" evidence to change behavior, make recommendations, and inform food policies and marketing, is simply not feasible because such trials are prohibitively expensive and logistically difficult.
Overcoming some of these inherent challenges was a major impetus in the mid-1990s to merge aspects of epidemiology, genetics, and molecular biology into the discipline of molecular epidemiology. A primary objective is to link exposures identified in the observational setting to a biologically meaningful intermediate (intermediate response biomarker) to enable inference on causality by meeting at least three of the Bradford Hill Criteria: dose–response relationship, biologic plausibility, and temporality. The identification of biologic and clinically meaningful intermediate risk biomarkers that act in the etiopathogenesis of disease (e.g., lipids, glucose, and insulin) has proved highly valuable in demonstrating that specific exposures like diet are causally linked to cardiovascular disease and diabetes.
For cancer, demonstrating causal linkage has proved to be more challenging than originally expected. This is partly due to the lack of sufficient insight into the mechanisms of early tumorigenesis from the basic sciences in the 1990s and partly due to the challenges associated with studying tissue biomarkers in humans. Over the past decade, our understanding of the molecular mechanisms and the complexity of tumorigenesis has evolved considerably. There is currently more extensive and concrete knowledge on the molecular and cellular determinants of tumor development, including more refined understanding of the cancer hallmarks (4) that include self-sufficiency in growth signals, evasion of apoptosis, insensitivity to anti-growth signals, limitless replicative potential, angiogenesis, and the ability to invade and metastasize.
In this issue of the journal, Humphreys and colleagues (5) provide a wonderful example of how mechanistic insight from the knowledge gained on cancer hallmarks and their molecular mediators can be successfully integrated into the assessment of dietary interventions in small randomized studies. Using the power of a randomized, cross-over design for a dietary intervention in only 23 healthy individuals, the authors present the first evidence that a relatively short period (4 weeks) of high red meat consumption can significantly increase proliferation in the rectal mucosa and that this increase is correlated with higher expression of oncogenic microRNAs (miRNA): the miR17-92 cluster and miR21.
miR17-92 is a polycistronic cluster located in intron 3 of the C13orf25 gene at 13q31, containing six tandem stem-loop hairpin structures that yield six mature miRNAs: miR17, miR18a, miR19a, miR20a, miR19b-1, and miR92a-1. This cluster has been designated oncomir-1, reflecting the findings that overexpression of miR17-19b in the presence of c-Myc overexpression accelerates B lymphomagenesis (6). Subsequently, the miR17-92 cluster has been shown to be critical in normal tissue development and in malignant transformation, exhibiting effects on cell growth and survival. More recent efforts have demonstrated that overexpression of miR19 is both necessary and sufficient to promote c-Myc–driven B-cell lymphomagenesis by repressing genes necessary for apoptosis (7, 8). A number of the target genes of the miR17-92 cluster have been identified (see Fig. 1) as tumor suppressors and include regulators of cell cycle (CDKN1A), apoptotic genes (PTEN, BCL2L1), and the angiogenesis inhibitors THBS1, TGFβ, and CTGF (9) that have been shown to be attenuated when the miR17-92 cluster is activated by c-Myc in colon cancer cell lines (10).
Located at 17q23.2, miR21 was the first miRNA to be classified as an oncogene. Like oncomir-1, miR21 acts by inhibiting the expression of a number of tumor-suppressor genes, including PTEN and BCL2, and in colon tumors, programmed cell death protein 4 (PDCD4)—a poorly characterized protein thought to play a role in apoptosis and shown to be decreased in colon cancer (11).
In their study, Humphreys and colleagues show that high red meat consumption (300 g/day raw weight of lean red meat, or approximately 240 g/day cooked meat) induced expression of the miR17-92 cluster and miR21 and increased the number of cells staining positive for proliferating cell nuclear antigen (PCNA). As further evidence for a direct relationship between the miRNAs and proliferation, the increase in proliferation was correlated with decreased expression of miRNA-specific target genes: CDKN1A, PTEN, and BCL2L1. Although one can argue the relevance of these results given the very high red meat exposure, these findings provide compelling evidence that high red meat consumption induces, through some mechanism (perhaps via c-Myc), the overexpression of oncogenic miRNAs in rectal mucosa. Do these findings add to our efforts to educate the public on red meat consumption and colorectal cancer risk?
Annual per-capita beef consumption in the United States declined 43%, from a peak of 91 pounds per year in the 1970s to 52 pounds in 2012, that is, from nearly 800 g/week to just under the WCRF/AICR-recommended 500 g/week (about 64 g/day), only slightly above the 0.5 serving per day (42 g/day) examined in the Harvard cohorts. Although we would like to assume that this reflects increasing knowledge of the adverse health effects of red meat, these declines are fairly recent and parallel reduced availability and higher cost of beef due to reductions in the size of U.S. cattle herds as a result of prolonged drought and climate effects. Current trends are in the right direction for prevention, but it is not clear what has motivated the decline in consumption. The evidence presented by Humphreys and colleagues on the oncogenic nature of very high red meat intake is fairly striking and certainly warrants consideration of dietary counseling for individuals who consume very high amounts of red meat. However, this study leaves us wondering whether the current average intake is sufficiently low to avoid the observed effects on oncogenic miR17-92 and miR21 expression. One wonders at this point whether red meat consumed at <0.5 servings per day is "likely, most likely, or probably" not oncogenic.
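The unit conversions behind these consumption figures can be checked with simple arithmetic; the sketch below reproduces the numbers quoted above from the annual per-capita values:

```python
G_PER_POUND = 453.592

def lb_per_year_to_g_per_week(lb_per_year):
    # Convert annual per-capita consumption (pounds) to grams per week.
    return lb_per_year * G_PER_POUND / 52.0

def lb_per_year_to_g_per_day(lb_per_year):
    # Convert annual per-capita consumption (pounds) to grams per day.
    return lb_per_year * G_PER_POUND / 365.0

# Peak 1970s consumption (~91 lb/year) vs. 2012 (~52 lb/year)
print(round(lb_per_year_to_g_per_week(91)))  # 794 g/week ("nearly 800")
print(round(lb_per_year_to_g_per_day(52)))   # 65 g/day (the text's ~64)
print(round((91 - 52) / 91 * 100))           # 43% decline
```

The small discrepancy at 52 lb/year (64.6 g/day) simply reflects rounding; either way, 2012 intake sits just under the 500 g/week recommendation but still above the 42 g/day (0.5 serving) level from the Harvard cohorts.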
For the red-meat lover, resistant starch might be an option. High dietary fiber, including resistant starches, has been consistently associated with a reduced risk of colorectal cancer (1). This protective effect has been attributed to a number of factors, including the chemopreventive properties of butyrate, a short-chain fatty acid produced during the fermentation of starch that escapes the small intestine and enters the large bowel (12). In their study, Humphreys and colleagues show that adding butyrylated high-amylose maize starch, as 40 g/day StarPlus (50%–60% resistant starch), to a high red meat diet significantly attenuates the oncogenic effects of red meat on the rectal mucosa. Further, when eaten with a butyrylated source of resistant starch that was shown previously to significantly increase butyrate levels (13), high red meat consumption did not induce the expression of miR17-92. For example, study participants who received the high red meat diet first, followed by the high red meat diet supplemented with StarPlus, had their miR17-92 levels return to their baseline, preintervention values. Notably, though, the oncogenic miR21 induced by the high red meat intake was not significantly attenuated by the supplemental starch, which suggests that the butyrate effect may be miR17-92–specific or may require longer than 4 weeks to attenuate the potential oncogenic activity of red meat.
Although the Humphreys study suffers somewhat from a small sample size and a lack of a priori knowledge on the duration of washout needed to restore rectal mucosal markers to their baseline, the findings do support the general concept that diet is a compound exposure that collectively acts to influence colorectal health. This study nicely illustrates the inherent challenges of trying to derive causal inference from a single exposure when considered in the context of the greater complexity of the diet (i.e., dietary intake of resistant starch). To our knowledge, there have been no studies testing whether the effect of red meat intake on colorectal cancer risk is modified by resistant starch consumption levels, an exposure that appears to be best captured by measuring individual butyrate production (13), which may reflect the need to account for the gut microbiome, adding another layer of complexity. The most exciting aspect of the Humphreys study is that it illustrates how a relatively simple and powerful study design, the randomized cross-over study that nicely controls for within-individual variability, combined with biologically informed molecular markers, can be conducted fairly quickly in relatively small samples (tens to a few hundred individuals) to gain insights on the biologic consequences of exposures such as dietary red meat and resistant starch consumption. Such studies are compelling, thought provoking, field advancing, and, most importantly, feasible—the aspiration of molecular epidemiology.
Disclosure of Potential Conflicts of Interest
No potential conflicts of interest were disclosed.