Prospectively gathered data from the First National Health and Nutrition Examination Survey (NHANES I) and its Epidemiologic Follow-up Study were analyzed to evaluate the risk of colorectal cancer associated with iron intake. Colorectal cancer morbidity and mortality data were available for 14,407 persons first interviewed in 1971 and followed through 1986; a total of 194 possible colorectal cancers occurred in this group over the 15-year period. Subsite analysis showed that the risk of colon cancer associated with dietary iron intake was elevated throughout the colon for both men and women, with the highest adjusted risk for the interquartile range seen in the proximal colon among women (relative risk, 1.51; 95% confidence interval, 1.41-1.60). The risk of rectal cancer was not significantly elevated for men or women. Elevated serum iron was also associated with increased risk; this effect, however, was strongest in the distal (rather than proximal) colon and was significant only among women (adjusted relative risk, 1.73; 95% confidence interval, 1.03-2.92). Mean transferrin saturation was higher among cases than controls (30.7% versus 28.7%), but total iron-binding capacity did not appear to predict the occurrence of colorectal cancer. Proportional hazards models confirmed that the effects of dietary and serum iron were not confounded by age, gender, energy consumption, fat intake, or other known risk factors for colorectal cancer. These data suggest that iron may confer an increased risk of colorectal cancer, and that the localization of risk may be attributable to the mode of epithelial exposure: luminal exposure to iron appears to increase risk proximally, whereas humoral exposure increases risk distally. These differences may be due to such factors as oxidation state, binding proteins, and the presence of other cofactors such as bile acids and products of bacterial metabolism.
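For readers checking the interval arithmetic, a relative risk and its 95% confidence interval are linked on the log scale: the interval is exp(ln RR ± 1.96 × SE), where SE is the standard error of the log relative risk. The sketch below (not from the original study; it simply illustrates the standard relationship) recovers the implied SE from the reported serum-iron estimate of 1.73 (1.03-2.92) and reconstructs the interval from it:

```python
import math


def se_from_ci(lo: float, hi: float, z: float = 1.96) -> float:
    """Recover the log-scale standard error implied by a reported 95% CI."""
    return (math.log(hi) - math.log(lo)) / (2 * z)


def rr_ci(rr: float, se: float, z: float = 1.96) -> tuple[float, float]:
    """95% confidence interval for a relative risk, given its log-scale SE."""
    log_rr = math.log(rr)
    return math.exp(log_rr - z * se), math.exp(log_rr + z * se)


# Reported adjusted RR for serum iron in the distal colon among women:
# 1.73 (95% CI, 1.03-2.92).
se = se_from_ci(1.03, 2.92)
lo, hi = rr_ci(1.73, se)
```

Reconstructing the interval this way reproduces the reported bounds to within rounding, which is a quick consistency check on published estimates.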