Micronutrients & dietary fibre
Antioxidant vitamins & minerals
There have been many case-control and cohort studies assessing the relationship between antioxidant nutrients and chronic disease outcomes, mainly in relation to cancer and CHD. In addition, there are over 15 randomised, double-blind, placebo-controlled intervention trials, some related to primary prevention and some to secondary prevention (FNB:IOM 2000). Some of these have tested single supplements while others have tested mixtures. Some have shown benefits for cancer (Blot et al 1993, Clark et al 1996, 1998) or aspects of heart disease (Stephens et al 1996). Some studies have shown no effect (GISSI 1999, Greenberg et al 1990, 1994, Hennekens et al 1996, Lee et al 1999, Stephens et al 1996) and some have reported negative effects (ATBC 1994, Omenn et al 1996), the latter generally associated with β-carotene. A brief summary is given below in relation to the individual key antioxidant micronutrients.
Vitamin A & carotenoids
The antioxidant benefits of vitamin A relate primarily to its precursor, β-carotene. Many case-control and cohort studies have shown a relationship between β-carotene intake and cancer risk reduction. However, intervention trials have been disappointing. An intervention study for skin cancer (Greenberg et al 1990) showed no effect and neither did another for polyp prevention (Greenberg et al 1994). Trials in relation to cervical cancer, including some in Australia, have also shown no effect of vitamin A (Mackerras et al 1993, 1999). In fact, the CARET trial (Omenn et al 1996) on lung cancer produced an increased risk with 30 mg β-carotene administered together with retinyl palmitate, as well as an increase in total mortality, and the ATBC trial (1994) showed an 11% increase in risk of ischaemic heart disease with β-carotene and an 18% increase in lung cancer. The Linxian cancer intervention study (Blot et al 1993, 1995) included β-carotene with vitamin E and selenium and showed a 9% reduction in total mortality, a 23% reduction in cancer mortality and a 10% decrease in stroke with the supplement mix. Hennekens et al (1996) showed no effect on CVD or cancer in men receiving 50 mg supplements on alternate days and Lee et al (1999) saw no effect in women using the same dose. The carotenoid lycopene has been associated with a reduction in risk of prostate cancer, but results are inconsistent (Giovannucci et al 1995a, Kristal & Cohen 2000).
Low levels of dietary or plasma carotenoids have also been associated with eye conditions. The presence or severity of cataracts has been linked to lower intakes or plasma levels of carotenoids in some studies (Brown et al 1999, Chasan-Taber et al 1999, Hankinson et al 1992, Lyle et al 1999, Mares-Perlman et al 1995, Seddon et al 1994) but not all (The Italian-American Cataract Study Group 1991, Vitale et al 1993). In many of the studies with positive results, effects were seen for some carotenoids but not others.
The carotenoids lutein and zeaxanthin have been associated with prevention of macular degeneration (Eye Disease Case-Control Study Group 1993, Hammond et al 1996, Seddon et al 1994, Snodderly 1995) but some studies have shown no effect (Mares-Perlman et al 1994). Mares-Perlman et al did, however, find an effect of increasing plasma lycopene, and West et al (1994) also found a protective effect for plasma β-carotene and lycopene.
Vitamin C
Several case-control and cohort studies have reported protection by vitamin C against cardiovascular disease and stroke (Enstrom et al 1992, Gale et al 1995, Khaw et al 2001, Knekt et al 1994, Nyyssonen et al 1997, Pandey et al 1995, Sahyoun et al 1996, Simon et al 1998). Other studies have shown no protective effect (Enstrom et al 1986, Kushi et al 1996b, Losonczy et al 1996, Rimm et al 1993).
Block (1991) has claimed that the epidemiologic evidence that vitamin C is protective against cancer is strongly suggestive, but others claim it is not convincing (Ames et al 1995). From case-control and cohort studies, prevention has been claimed for a range of cancers including breast, cervical, colorectal, pancreatic, lung and gastric cancers (Bandera et al 1997, Bueno de Mesquita et al 1991, Fontham et al 1988, Freudenheim et al 1990, Ghadirian et al 1991, Howe et al 1990, 1992, Knekt et al 1991, Kushi et al 1996a, Ocke et al 1997, Romney et al 1985, Shekelle et al 1981, Wassertheil-Smoller et al 1981, Yong et al 1997, Zatonski et al 1991). However, others have shown no such association (Graham et al 1992, Hinds et al 1984, Hunter et al 1993, Le Marchand et al 1989).
The few RCTs that have been conducted with vitamin C have proved disappointing (Heart Protection Study 2002, Ness et al 1999). Ness et al (1999) reported a meta-analysis of three trials of vitamin C supplements and cardiovascular disease in western populations (a total of 1,034 subjects). There was no overall reduction in mortality with vitamin C supplementation, the relative risk being 1.08. The cancer intervention studies of Blot et al (1993) in China also showed no beneficial effect of vitamin C on cancer mortality rates, nor did the Polyp Prevention Trial of Greenberg et al (1994). The FNB:IOM, in its 2000 DRI review, concluded that the data were not consistent enough to identify a level of vitamin C intake that could be used for setting recommendations in relation to cancer.
Some studies of dietary vitamin C in relation to cataracts have shown benefits (Jacques & Chylack 1991, Leske et al 1991, Robertson et al 1989) and others have not (Hankinson et al 1992, Vitale et al 1993). However, in the study of Hankinson et al (1992), long-term use of supplements was related to reduced risk. Asthma and cognitive function have also been assessed in relation to vitamin C intake. A recent Cochrane review concluded that there was no benefit of increased vitamin C intake for asthma. The results for cognition were mixed, with one study showing no benefit (Jama et al 1996) and another showing better memory performance (Perrig et al 1997).
Vitamin E
Data related to the effects of vitamin E on chronic disease status are limited, but the strongest evidence is for CHD. A number of double-blind controlled trials assessing chronic disease outcomes have been completed, including the Cambridge Heart Antioxidant Study (CHAOS) trial (Stephens et al 1996) and the GISSI-Prevenzione trial (1999) for CHD, the Heart Outcomes Prevention Evaluation (HOPE) trial (Yusuf et al 2000) for heart disease, and the Alpha-Tocopherol, Beta-Carotene (ATBC) Cancer Prevention Study, which also reported heart disease endpoints. Of the heart disease studies, only the CHAOS trial gave a positive result, a 77% decrease in risk of subsequent non-fatal myocardial infarction, although there was no benefit for cardiovascular mortality. The ATBC trial, which was undertaken in cigarette smokers, also reported a 50% increase in haemorrhagic stroke deaths with vitamin E, but no effect on lung cancer, the primary endpoint. The GISSI trial showed no effect with vitamin E, but did show an effect with omega-3 fats.
Two reviews that included meta-analyses of the data relating to vitamin E and CHD concluded that vitamin E has little effect on outcome. Eidelman et al (2004) conducted a computerised search of the English-language literature from 1990 and found seven large-scale randomised trials of the effectiveness of vitamin E in the treatment and prevention of cardiovascular disease. Data were available on myocardial infarction, stroke or cardiovascular death. Six of the seven trials showed no significant effect of vitamin E on cardiovascular disease. In a meta-analysis, vitamin E had neither a statistically significant nor a clinically important effect on any important cardiovascular event or its components (non-fatal myocardial infarction or cardiovascular death). The authors concluded that the odds ratios and confidence intervals provided strong support for a lack of statistically significant or clinically important effects of vitamin E on cardiovascular disease. Shekelle et al (2004) also undertook a systematic review of placebo-controlled RCTs, with meta-analysis where justified, and concluded that there is good evidence that vitamin E supplementation neither beneficially nor adversely affects cardiovascular outcomes.
Data in relation to vitamin E and cancer from epidemiological studies are limited. A study assessing intakes in the US NHANES I Epidemiologic Follow-up Study (Yong et al 1997) showed an inverse association with lung cancer in smokers, and a prospective cohort study also found a weak inverse relationship with lung cancer (Comstock et al 1997). Two prospective cohort studies of breast cancer (Dorgan et al 1998, Verhoeven et al 1997) and one case-control study (van't Veer et al 1996) found no relationship with vitamin E status. A case-control study of prostate cancer showed no link (Andersson et al 1996), and although an inverse association was found in one prospective study (Eichholzer et al 1996), earlier cohort studies showed no association (Comstock et al 1992, Knekt et al 1988).
There have been few intervention trials of vitamin E and cancer. One study of heavy smokers showed no benefit for lung cancer but a 34% lower incidence of prostate cancer (ATBC 1994, Heinonen et al 1998). Two small trials showed no effects on mammary dysplasia or breast disease (Ernster et al 1985, London et al 1985), and no secondary polyp-preventive effect was seen in five trials (Chen et al 1988, DeCosse et al 1989, Greenberg et al 1994, Hofstad et al 1998, McKeown-Eyssen et al 1988).
Vitamin E has also been investigated in relation to immune function (Ghalaut et al 1995, Meydani et al 1997) and cataracts (Jacques & Chylack 1991, Hankinson et al 1992, Knekt et al 1992, Leske et al 1991, Mares-Perlman et al 1994, Mohan et al 1989, Robertson et al 1989, Vitale et al 1993), with mixed results. The one intervention trial for cataracts showed no effect of 50 mg α-tocopherol/day (Teikari et al 1998).
Selenium
Selenium has been assessed in relation to both cancer and CHD. Selenium intakes greater than the RDI have also been shown to improve immune function (Broome et al 2004).
Selenoproteins have an anticancer effect in cellular and animal experiments and there are some indications of a protective role from human studies (Combs 2005). One US study, using a nested case-control design within a cohort, showed that prostate cancer risk was lower in those with higher toenail selenium levels (Yoshizawa et al 1998). However, there have been only three human intervention trials, one of which, in China, used a mixed supplement including selenium (Blot et al 1995). In this study, significantly lower total mortality occurred among those receiving supplementation with β-carotene, vitamin E and selenium. The reduction was mainly due to lower cancer rates, especially stomach cancer, with the reduced risk becoming apparent 1 to 2 years after the start of supplementation. The US trial of Clark et al (1996) showed no effect of supplements of 200 µg/day on skin cancer risk but a significant reduction in total cancer and in cancers of the prostate, lung and colorectum. However, Duffield-Lillico et al (2003) analysed this study further and found that supplementation actually increased the risk of squamous cell carcinoma and total non-melanoma skin cancer. The Selenium and Vitamin E Cancer Prevention Trial (SELECT) in the US is rapidly reaching its accrual goal 1 to 2 years ahead of schedule and should provide additional evidence about this relationship.
Some researchers suggest that intakes in the region of 100–200 µg/day, resulting in plasma levels of about 120 µg/L, may be necessary to maximise cancer prevention (Combs 2005, Thomson 2004, Whanger 2004), but the data on the long-term effects of intakes at this level are currently limited.
The available data on selenium seem to suggest that men, and particularly male smokers, may benefit more than women from supplementation in terms of lowering cancer risk (Kocyigit et al 2001, Waters et al 2004).
Evidence for a protective role of selenium against cardiovascular disease (CVD) is conflicting. Two large studies in low selenium populations indicated that low selenium status was an independent risk factor for myocardial infarction (Salonen et al 1982, Suadicani et al 1992), but others have not found this (Rayman 2000). The data from some studies, but not all (Nève 1996), suggest there may be a threshold effect operating, such that protection is only afforded to those with prior low selenium status (Huttunen 1997, Salvini et al 1995, Suadicani et al 1992).
Protection, if it does occur, is likely to be related to an antioxidant effect on the oxidative modification of lipids and aggregation of platelets. In some studies, the effect is seen only in smokers, who are known to have lower blood selenium concentrations than non-smokers (Kay & Knight 1979, Thomas 1995). The status of other antioxidants such as vitamin E might also influence the outcome.
Summary of antioxidants and chronic disease status
Studies of the effects of antioxidant nutrients have shown some promising leads, such as the potential of selenium in prostate cancer prevention, but many intervention studies show little effect or even adverse effects. The case-control and cohort studies that initially identified the antioxidant micronutrients as having preventive potential generally compare people in the population consuming their everyday diets. These studies typically indicate that subjects at or above the top quintile of their population's intake generally have a lower risk of a range of chronic diseases. This may relate to the nutrients of concern, but may also reflect more general benefits arising from consumption of the foods that contain these nutrients. As the top quintile spans the 80th to 100th centiles of intake, its midpoint is the 90th centile, so it may be prudent for people to consume a diet that provides these nutrients at levels equating to the current 90th centile of intake in the population. A dietary approach rather than a supplemental one is encouraged to maintain nutrient balance and optimise benefits. The 90th centile of intake for vitamin C in Australia and New Zealand is about 220 mg/day for adult men and 190 mg/day for adult women. For vitamin E, the 90th centile of intake is about 19 mg/day for men and 14 mg/day for women. For vitamin A, the 90th centile of intake is 1,500 µg/day for men and 1,220 µg/day for women, and for β-carotene, it is 5,800 µg/day for men and 5,000 µg/day for women.
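To illustrate the quintile arithmetic underlying this suggestion, the short sketch below (in Python, using a simulated and purely hypothetical intake distribution rather than survey data) shows that the midpoint of the top quintile coincides with the population's 90th centile.

    import numpy as np

    # Hypothetical right-skewed intake distribution (arbitrary units); not survey data.
    rng = np.random.default_rng(0)
    intakes = rng.lognormal(mean=5.0, sigma=0.4, size=100_000)

    # The top quintile comprises the values at or above the 80th centile.
    top_quintile = intakes[intakes >= np.percentile(intakes, 80)]

    print(round(float(np.percentile(intakes, 90)), 1))  # 90th centile of the whole population
    print(round(float(np.median(top_quintile)), 1))     # midpoint (median) of the top quintile
    # The two values agree, because the top quintile spans the 80th to 100th centiles.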
There are no national selenium intake data for Australia and it is known that New Zealand is a low selenium country, so reference to the 90th centile of intake is probably not useful in this case.
Folate
Apart from its well-known benefits in the prevention of neural tube defects in the foetus, folate is increasingly thought to play a role in reducing chronic disease risk. In the cardiovascular area, this relates to its role in lowering plasma homocysteine, a key risk factor for CVD. Homocysteine is a sulphur-containing amino acid derived from enzymic transformations of the essential dietary amino acid methionine. Interest in homocysteine stemmed initially from the observation that sufferers from a number of different rare genetic disorders, all of which manifest as elevated levels of circulating homocysteine, also had in common a greatly accelerated rate of atherosclerosis. This immediately raised the question of whether mild elevations of serum homocysteine were also associated with increased CVD, and increasingly it seems that the final answer is likely to be 'yes'.
In the last 20 to 30 years, numerous retrospective and prospective studies have demonstrated a relationship between moderately elevated homocysteine levels and premature vascular disease in the coronary, cerebral and peripheral arteries. Supplementation with folic acid, with or without vitamin B6, to reduce serum homocysteine levels has proved to be a successful strategy in some studies.
The randomised controlled trial of Venn et al (2002) showed that increasing dietary folate from 263 µg/day to 618 µg/day significantly increased serum folate by 37% and decreased homocysteine from 12 to 11 µmol/L over a 4-week period. The volunteers were healthy and aged 50–70 years. The same researchers (Venn et al 2003) also showed in an RCT that supplementation of healthy subjects aged 40–60 years with either 100 µg/day folic acid or 100 µg/day L-5-methyltetrahydrofolate (MTHF) resulted in significant increases in plasma folate (52% and 34%, respectively) and red cell folate (31% and 23%, respectively), and a significant reduction in plasma homocysteine (–9.3% and –14.6%, respectively). MTHF was significantly more effective than folic acid in reducing plasma homocysteine.
Another such trial, by van Oort et al (2003), showed that the minimum folic acid supplementation required to achieve 90% of the maximal reduction in plasma homocysteine in healthy older adults, aged 50–75 years, was 400 µg/day. This study investigated doses of folic acid ranging from 50 to 800 µg/day but did not record the dietary folate intake of the participants. The study of Tucker et al (2004) also showed that daily intake of folic acid, together with vitamins B12 and B6 at US RDA levels in a fortified cereal, decreased homocysteine in healthy 50–85 year-olds from 7.9 to 7.5 µmol/L.
Schnyder et al (2001) showed in an intervention trial that plasma homocysteine was reduced from 11 to 7 µmol/L and that coronary restenosis was significantly reduced (compared to controls) after daily supplementation for six months with 1,000 µg folic acid, 400 µg vitamin B12 and 10 mg vitamin B6 in patients who had undergone percutaneous coronary angioplasty.
Finally, in the prospective cohort component of the Nurses' Health Study, Rimm et al (1998) showed that women with folate intakes in the top quintile (median 696 µg/day) had a 31% reduction in risk of developing CHD compared to those in the bottom quintile (median 158 µg/day). The effect was strongest in those women who consumed more than one alcoholic drink per day, for whom the reduction in risk was 73%.
Elevated homocysteine levels have also been linked to increased fracture risk in older people. A prospective epidemiological study in older men and women (van Meurs et al 2004) showed that a homocysteine level in the highest age-specific quartile was associated with a 1.9-fold increase in the risk of fracture. This association appeared to be independent of bone mineral density and other potential risk factors for fracture, suggesting that an increased homocysteine level is a strong, independent risk factor for osteoporotic fracture in older men and women. Another prospective study (McLean et al 2004) also showed that men and women in the highest quartile of plasma homocysteine had a greater risk of hip fracture than those in the lowest quartile; the risk was almost four times as high for men and 1.9 times as high for women. These findings suggest that homocysteine concentration is an important risk factor for hip fracture in older persons.
The results of the cross-sectional study of Seshadri et al (2002) on the Framingham cohort also showed an increased relative risk of dementia and Alzheimer's disease with increasing plasma homocysteine. The risk of Alzheimer's disease for those with plasma homocysteine greater than 14 µmol/L was double that of those with lower values, and an increase in plasma homocysteine of 5 µmol/L increased the multivariate-adjusted risk of Alzheimer's disease by 40%. Relationships between folate and mental function have also been reported for depression and affective state and for learning deficits (Goodwin et al 1983, Herbert 1962, Reynolds et al 1973, Shorvon et al 1980).
Pena et al (2004), in a randomised double-blind trial, also showed that 8 weeks of treatment with 5 mg folic acid improved endothelial function by 2.6% in children and adolescents with Type 1 diabetes. It is of interest to note, however, that this level of supplementation is well above the recommended UL for the general population.
Poor folate status is also thought to influence the risk of cancer and to enhance an underlying predisposition to cancer (Heimburger et al 1987, Mason & Levesque 1996). The mechanisms involved are believed to include the induction of DNA hypomethylation, increased chromosomal fragility or diminished DNA repair, as well as secondary choline deficiency, a lessening of killer cell surveillance, mistakes in DNA synthesis and facilitation of tumorigenic virus metabolism (Kim et al 1997, Mason & Levesque 1996). However, not all studies have shown reduced cancer risk with improved folate status after confounding has been taken into account (Meenan et al 1996, Potischman et al 1991, Verreault et al 1989, Ziegler et al 1990, 1991).
Zhang et al (1999) showed that a folate intake greater than 300 µg/day was associated with a 25% reduction in breast cancer risk in those women from the Nurses' Health Study who consumed at least 15 g of alcohol per day. There are also data from two large, well-controlled prospective studies showing a protective effect of higher folate intakes on adenomatous polyps (Giovannucci et al 1993) and cancer (Giovannucci et al 1995b). Thompson et al (2001) used a case-control design to demonstrate for the first time that supplementation with folate during pregnancy reduces the risk of acute lymphocytic leukaemia in the child by 60%.
Folate at higher than RDI levels has also been shown to lower DNA damage. In a randomised, placebo-controlled intervention in young Australian adults, Fenech et al (1998) showed that an intake of 700 µg folic acid with 7 µg vitamin B12 reduced the rate of chromosome damage in lymphocytes by 25% in those individuals with above-average chromosome damage rates. No further protection was provided by increasing the intake to 2,000 µg folic acid and 20 µg vitamin B12. This study indicated that an intake of folate well above the RDI is required to minimise chromosome damage (a risk factor for cancer) in the 50% of subjects studied who were otherwise not considered to be deficient by conventional criteria.
Intakes of folate in the Australian and New Zealand populations are currently well below the RDI proposed here, with median intakes of about 300 µg/day for men and 230 µg/day for women. The current 90th centile of intake in men (416 µg/day) is close to the new RDI, and that in women (303 µg/day) is close to the new EAR. The studies above indicate that an additional 100–400 µg/day over current intakes may be required to optimise homocysteine levels and reduce overall chronic disease risk and DNA damage.
Calcium & Vitamin D
Osteoporosis is a major public health problem in Australia and New Zealand. It can be considered both a deficiency and a chronic disease. The WHO concluded in its recent report on diet, nutrition and chronic disease that adequate calcium was important for osteoporosis prevention, at least in older populations, and that in countries with high incidence of osteoporotic fracture, a calcium intake of below 400–500 mg/day among older men and women is associated with increased fracture risk.
The WHO also found that adequate vitamin D status was a key factor in osteoporosis prevention. However, it is of interest to note that in contrast to the perceived role of calcium and vitamin D in osteoporosis prevention, one recent large intervention trial involving 5,292 previously ambulatory elderly people who had already experienced a fracture showed no effect of 20 µg daily oral vitamin D3 or 1,000 mg calcium, alone or in combination, on occurrence of further fractures (Grant et al 2005).
The WHO also recognised that other dietary factors may influence long-term bone health, including high sodium intake and, paradoxically, either low or high protein intake in the elderly (WHO 2003), as well as components associated with fruit and vegetables (such as vitamin K, phytoestrogens, potassium, magnesium and boron) and physical activity.
Calcium is also one of the nutrients (along with fluoride, the amount and frequency of free sugars, phosphorus and casein) thought to influence dental caries. The cariostatic nature of cheese has been demonstrated in several experimental studies and human observational and intervention studies (Kashket & dePaola 2002, Moynihan & Petersen 2004, Rugg-Gunn et al 1984). The cariostatic nature of milk has been demonstrated in animal studies (Bowen et al 1991, Reynolds & Johnson 1981), and Rugg-Gunn et al (1984) found an inverse relationship between the consumption of milk and caries increment in a study of adolescents in England.
Although the roles of calcium and vitamin D in optimising bone health have been known for some time, a wider role for these nutrients in chronic disease prevention has been proposed in recent years. There is evidence from both observational studies and clinical trials that calcium malnutrition and hypovitaminosis D are predisposing conditions for various common chronic diseases.
It has been proposed that deficits in calcium and vitamin D increase the risk of malignancies, particularly of the colon, breast and prostate gland. Early work on colon cancer and calcium was inconsistent and led the WHO (2003) to conclude that there were insufficient data to confirm a link between calcium and colon cancer. However, a number of recent studies and re-analyses support earlier claims of a link. In assessing the effects of calcium on colorectal cancer, Cho et al (2004) pooled the primary data from 10 cohort studies in five countries. The studies included 534,536 individuals, among whom 4,992 incident cases of colorectal cancer were diagnosed. Compared to the lowest consumption group (500 mg dietary calcium/day or less), the relative risk (RR) for those consuming 600–699 mg/day was 0.83 (not statistically significant), for those consuming 700–799 mg/day it was 0.79, and for those consuming 800–899 mg/day it rose to 0.89, which was also not statistically significant. The RR decreased to 0.79 for those consuming 900–1,099 mg/day and to 0.76 in those consuming 1,100–1,299 mg/day. The authors stated that a further regression analysis showed little additional protection from calcium above about 1,000 mg/day. When subjects were also classified into vitamin D tertiles, there was no significant effect of increasing calcium intake on colon cancer risk in those in the lowest two tertiles of vitamin D intake, but there was an effect in those with the highest vitamin D status. This was despite there being no difference in colon cancer risk across the vitamin D tertiles themselves. The greatest differences in risk were between those with the lowest and those with the highest combined vitamin D and calcium status.
Shaukat et al (2005) also undertook a systematic review and meta-analysis of RCTs of calcium supplementation in relation to recurrence of colon adenomas, the precursors of colon cancer. The authors statistically combined the data from the three trials that met strict eligibility criteria. The overall RR was 0.80. The results of this meta-analysis support a role for calcium supplements in preventing recurrent adenomas. Other studies or systematic reviews which support a role for calcium in preventing recurrent adenomas or abnormal colonic cell proliferation include Baron et al (1999), Holt et al (1998) and Weingarten et al (2004).
With respect to vitamin D, in addition to the study of Cho et al (2004), a review by Giovannucci (2005) of vitamin D and cancer concluded that there is substantial evidence that a higher 25(OH)D level, obtained through increased sunlight exposure, dietary intake or supplement use, inhibits colorectal cancer. He concluded that for breast cancer there were some promising data that were, however, too sparse to be definitive, and that for prostate cancer, whilst the experimental evidence for an anti-cancer role of 25(OH)D is strong, the epidemiologic data are not supportive. Some studies suggest that higher circulating 1,25(OH)2D may be more important than 25(OH)D for protection against aggressive, poorly differentiated prostate cancer. Giovannucci (2005) suggests that a possible explanation for the disparate findings with prostate cancer is that these cancer cells may lose the ability to hydroxylate 25(OH)D to 1,25(OH)2D, and thus may rely on the circulation as the main source of 1,25(OH)2D. He further postulates that the suppression of circulating 1,25(OH)2D levels by calcium intake could explain why higher calcium and milk intakes appear to increase the risk of advanced prostate cancer.
Calcium and vitamin D have also been purported to play a protective role in chronic inflammatory and autoimmune diseases such as insulin-dependent diabetes mellitus, inflammatory bowel disease and multiple sclerosis, as well as in metabolic disorders including the metabolic syndrome and hypertension (Peterlik & Cross 2005). Deficits in calcium and vitamin D are thought to affect a wide range of chronic diseases through attenuation of signal transduction from the ligand-activated vitamin D receptor and the calcium-sensing receptor, causing perturbation of cellular functions in bone, kidney, intestine, mammary and prostate glands, endocrine pancreas, vascular endothelium and, importantly, the immune system (Peterlik & Cross 2005).
Whilst the various studies mentioned suggest a protective effect of calcium and vitamin D for a number of chronic disease outcomes, the precise level of dietary intake that would afford protection is difficult to assess from the available studies, in part because many of the benefits are seen with calcium and vitamin D in combination. In many of these studies, it is low intakes of calcium (well below the adult EARs of 840–1,100 mg or RDIs of 1,000–1,300 mg) that appear to increase relative risk, rather than intakes well above the EARs and RDIs being protective. Further discussion of calcium and vitamin D in bone health is given in the relevant chapters.
For other nutrients, such as the antioxidants and dietary fibre, a suggested dietary target has been set at the level of the 90th centile of 'current intake' in Australia and New Zealand. The 90th centile of current daily intake for calcium in adults in Australia is 1,310 mg and for New Zealand, 779 mg. The EARs for adults are already set at 840–1,100 mg/day and the RDIs at 1,000–1,300 mg/day.
The 90th centile of vitamin D intake based on the 1995 NNS in Australia has been estimated at 5.5 µg/day, close to the AI of 5 µg/day for younger adults but below that of 10–15 µg/day for older adults. Dietary intake, compared to the action of sunlight on skin, is also a relatively small contributor to vitamin D status, other than for people with very limited access to sunlight. The recent national surveys in New Zealand did not assess vitamin D intakes, but the data from the earlier national survey in 1991 showed similar values to Australia (LINZ 1992).
For these reasons, no additional suggested dietary targets are set for calcium and vitamin D.
Sodium
The association between dietary sodium and chronic disease rests on the observed relationship between higher sodium intakes and increasing blood pressure, which may lead to hypertension, stroke and myocardial infarction (Cogswell et al 2016, Trinquart et al 2016, O'Brien 2016). As both diet and chronic disease are complex entities, the scientific position is the subject of healthy debate. This debate spans the scientific literature from the physiology of sodium balance through to population health research on the dietary sodium–blood pressure relationship (Heaney 2015, Anderson et al 2015).
Briefly, sodium is the primary cation in human extracellular fluid. It has an essential role in the maintenance of key physiological parameters such as extracellular fluid volume and cellular membrane potential (FNB:IOM 2013). Sodium balance is maintained through a range of physiological systems and hormones such as the renin-angiotensin-aldosterone system, the sympathetic nervous system, atrial natriuretic peptide, the kallikrein-kinin system and other factors that regulate renal and medullary blood flow (NHMRC 2006). Recent research suggests that high sodium intakes elicit a response from a complex regulatory process that underpins osmotic balance; this process is influenced by the consumption of food and water, hormone fluctuations, and renal sodium and water excretion (Zeidel et al 2017, Kitada et al 2017, Rakova et al 2017). In the absence of excessive sweating, urinary sodium excretion in humans is approximately equivalent to intake (FNB:IOM 2013), so urinary sodium excretion is often used as a biomarker of intake.
In addition to sodium, the development of hypertension has been shown to be related to a number of other dietary factors, notably lower intakes of potassium but also lower intakes of calcium, magnesium and possibly other micronutrients, as well as lower fruit and vegetable consumption (Appel et al 1997, John et al 2002, Margetts et al 1986) and higher alcohol consumption (Marmot et al 1994, Xin et al 2001). Other key factors include overweight and metabolic syndrome (Chen et al 1995, Mulrow et al 2002), lack of physical activity (Lesniak et al 2001, Whelton et al 2002) and genetic predisposition (Corvol et al 1999, Hunt et al 1998, Svetkey et al 2001).
Historically, the large international study of salt and blood pressure, INTERSALT (Intersalt Co-operative Research Group 1988), produced the first substantial set of data on 24-hour urinary sodium excretion and blood pressure, from more than 10,000 adults in 52 groups from 32 countries. Significant positive associations were found between sodium excretion and both systolic and diastolic blood pressures. When four centres that had very low salt intakes were removed from the analysis, the overall association was not statistically significant, although an association was found between salt intake and the increase in blood pressure with age. This provided direction for further investigations. The data were later re-analysed (Elliott et al 1993, 1996), adjusting for regression dilution caused by measurement error, which yielded stronger associations, although some suggested the correction factors used may have been overestimated (Day 1997, Davey et al 1997).
Since then, a number of controlled intervention studies have been undertaken. The well-designed DASH (Dietary Approaches to Stop Hypertension) trial (Appel et al 1997) went beyond single nutrients such as sodium to address the place of the total diet, in 460 normotensive and hypertensive adults. Subjects received a control diet low in fruit, vegetables and dairy products, with a fat content typical of the average US diet, for three weeks and were then randomised to receive one of three diets for eight weeks: the control diet, a diet rich in fruit and vegetables, or a combination diet (the DASH diet) rich in fruit, vegetables and low-fat dairy products and low in saturated and total fat. The salt content of each diet was similar, and body weight, physical activity and alcohol intake were held constant throughout. Compared to a typical US diet, the DASH trial showed that a diet rich in fruits, vegetables and low-fat dairy products reduced mean blood pressure by 5.5/3.0 mm Hg (systolic/diastolic). The diet rich in fruit and vegetables produced a reduction of 2.8 mm Hg in systolic blood pressure but not in diastolic blood pressure. In hypertensive individuals, the DASH diet reduced blood pressure by 11.4/5.5 mm Hg and in non-hypertensives by 3.5/2.1 mm Hg. This study emphasised the significance of the total dietary pattern and the foods contained therein.
Importantly, a follow-up trial, DASH-Sodium (Sacks et al 2001a), assessed the combined effect of the DASH diet and reduced salt intake. About 400 adults were randomly assigned to the control or DASH diet for three months. Each subject consumed their assigned diet for 30 consecutive days at each of three levels of sodium: high (3.6 g or 150 mmol sodium/day), intermediate (2.4 g or 100 mmol/day) and low (1.2 g or 50 mmol/day). Potassium intakes were greater on the DASH diet than on the control diet, but were kept the same across all levels of sodium intake, at approximately 1.6 g/day of potassium on the control diet and 3 g/day on the DASH diet. Weight was stable throughout the study in all groups. Lowering sodium intake reduced blood pressure by 6.7/3.5 mm Hg on the control diet and by 3.0/1.6 mm Hg on the DASH diet. The combined effects on blood pressure of the DASH diet and low sodium intake were greater than either intervention alone, being 8.9/4.5 mm Hg below the control diet at the high sodium level. With this combination, mean systolic blood pressure was 11.5 mm Hg lower in participants with hypertension and 7.1 mm Hg lower in participants without hypertension. The effects were observed in those with and without hypertension, in both sexes and across racial groups. This confirmed the impact of both the total diet (DASH) and a key nutrient (sodium) on blood pressure. As with previous studies, there was healthy scientific debate, with arguments that plasma renin levels increased in the low sodium arm (Alderman 2001) and that meaningful effects were only seen in hypertensive black women in the study (McCarron 2001). These were countered with comments that diuretic therapy, which prevents CVD, also raises plasma renin, and that, whilst susceptibility to salt may vary in the population, the effects were qualitatively similar among all subgroups (Sacks et al 2001b). The DASH trials remain distinctive in addressing the interdependence between nutrient, food and whole-of-diet effects.
The assumptions behind an optimal diet for lowering chronic disease risk are that the diet comprises key foods that (a) deliver substantive amounts of required nutrients (including sodium) and (b) provide protective effects in the context of a healthy diet. However, diets comprise a range of foods, and foods deliver nutrients, reflecting the interdependence between diets, foods and nutrients. Diets also reflect the food supply, and imbalances may occur that should be addressed (Tapsell et al 2015). This appears to be the case with dietary sodium. The evidence of an association between high sodium intakes and high blood pressure has been shown consistently across various study populations and age groups, and there is a substantial volume of scientific research with which to evaluate the current evidence (SLR report).
To assess the nature of chronic disease risk in this case, a systematic review of studies comparing the effects of high versus low sodium intakes on blood pressure as the primary endpoint is required. As the risk is considered in the context of cardiovascular disease, it is also prudent to consider the effect of lowering sodium intake on other risk factors for cardiovascular disease, such as total cholesterol, HDL cholesterol and LDL cholesterol, and then on disease endpoints such as stroke, myocardial infarction and total mortality.
With increasing numbers of studies across the globe, meta-analyses of data became possible, reducing the reliance on single studies to evaluate the evidence. In the review supporting the 2017 NRVs for sodium (SD1), six recently published systematic literature reviews (SLRs) were identified that reported the effects of reduced sodium intake on blood pressure, total cholesterol, HDL cholesterol, LDL cholesterol, myocardial infarction, total mortality or stroke (NHMRC 2013, FNB:IOM 2005, FNB:IOM 2013, NHMRC 2006, Suckling et al 2012, Sacks et al 2001). A systematic review and meta-analysis was then conducted on 60 articles describing 56 studies (SD1).
The analysis combined data from trials comparing the effects of high versus low dietary sodium intakes. There was consistent evidence of an effect of reducing sodium intake on reductions in systolic blood pressure, noted to occur in the dietary range of 1,200–3,300 mg/day of sodium. The effect was seen only for blood pressure. There was a lack of evidence of effects of sodium intake on disease outcomes and mortality, but this could be expected, as blood pressure is only one of the factors likely to influence mortality and disease endpoints. There was also no effect of reducing sodium intake on cholesterol levels.
In the statistical analysis supporting the 2017 NRVs, the effects on blood pressure were examined progressively for cutpoints between 1,100 mg/day and 3,700 mg/day of sodium, in increments of 100 mg (SD2). Graphical analyses of data from these studies showed that, for cutpoints below about 2,000 mg/day, the difference in blood pressure was larger in the group of studies above the cutpoint than in the group below it (i.e. there was a stronger effect in the studies with higher sodium intakes that were categorised as belonging to a 'low sodium' group), whereas the reverse was true above 2,000 mg/day; these comparisons were not corrected for confounding by differences in the range of sodium intakes tested. At all cutpoint values, the above-cutpoint groups tested a smaller range of sodium differences than the below-cutpoint groups. When expressed per 500 mg difference in sodium, the above- and below-cutpoint groups showed the same mean difference in systolic blood pressure (about –1.5 mm Hg per 500 mg reduction in sodium excretion). The differences observed in the analysis of systolic blood pressure above and below 2,000 mg/day of sodium intake were therefore due to confounding by variation among the studies in the sodium intakes prescribed for the high and low sodium groups. The analysis did not test many small differences (for example, from 1,200 to 1,500 mg) but, as the relationship appeared to hold with increasing intakes, it was assumed that the effect was likely to be similar.
The current review of a substantial body of scientific evidence for the 2017 sodium NRVs (SD1, SD2) showed that, in the range of 1,200–3,300 mg of 24-hour sodium excretion, a dose-response relationship can be observed between a decrease in sodium intake and a decrease in systolic blood pressure of about 1.5 mm Hg per 500 mg of sodium. The meta-analysis showed a reduction in systolic blood pressure when mean population excretion was lowered from about 3,500 mg/day to 2,100 mg/day. Thus there is strong evidence that reducing sodium is a significant strategy towards optimising the diet for reducing chronic disease risk, bearing in mind that this strategy is interdependent with food choices and the overall dietary pattern.
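For illustration only, the short sketch below (in Python) applies the approximate dose-response figure quoted above to an example change in mean population sodium excretion; the helper function and the worked example are assumptions for illustration, not part of the NRV analysis itself.

    # Approximate slope reported in the review: about 1.5 mm Hg lower systolic
    # blood pressure per 500 mg/day reduction in sodium, within roughly
    # 1,200-3,300 mg/day of 24-hour sodium excretion.
    MMHG_PER_500MG_SODIUM = 1.5

    def expected_sbp_reduction(sodium_before_mg: float, sodium_after_mg: float) -> float:
        """Estimated fall in systolic blood pressure (mm Hg) for a given
        reduction in sodium excretion (mg/day), assuming the approximately
        linear dose-response relationship holds over the range of interest."""
        return (sodium_before_mg - sodium_after_mg) / 500.0 * MMHG_PER_500MG_SODIUM

    # Example: lowering mean population excretion from about 3,500 to 2,100 mg/day.
    print(f"{expected_sbp_reduction(3500, 2100):.1f} mm Hg")  # about 4.2 mm Hg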
Potassium
Potassium can blunt the effect of sodium chloride on blood pressure, mitigating salt sensitivity and lowering urinary calcium excretion (Whelton et al 1997). Morris et al (1999) studied the effect of potassium on salt sensitivity and showed that sensitivity was blunted at intakes of 4.7 g/day in African American men and 2.7 g/day in white men. Given this interrelatedness, the requirement for potassium depends to some extent on dietary sodium; however, the ideal ratio of sodium to potassium intake is not yet clear.
Higher potassium intakes have also been related to decreasing risk of kidney stones in studies in western populations in the US and Finland. Curhan et al (1993, 1997) in the US showed the lowest rate of kidney stones in the highest quintile of intakes of potassium in their studies of both men and women (4.0 and 4.7 g/day, respectively) and Hirvonen et al (1999) in Finland showed that stones were reduced at the second quartile of intake (4.6 g/day) and that there were no further reductions at higher quartiles for men and women.
Dietary fibre
Increasing dietary fibre intakes have been linked to lower rates of obesity, cardiovascular disease, diabetes and certain cancers.
Initially, dietary fibre was widely thought of as an inert bulking agent that lacked energy value and thus should have the potential to help in weight control. Methylcellulose, cellulose and other such unabsorbable materials have often been used as satiety agents for those attempting to restrict food intake. Guar supplements and other high fibre, high carbohydrate diets have been used with modest success by diabetic patients attempting to lose weight. It is thought that the small effects seen in these experimental situations might again relate to a satiating effect due to prolongation of absorption and a smoothing of blood glucose response after meals (Holt et al 1992, Jenkins 1988).
However, studies of weight loss using fibre supplements of various kinds have shown that weight loss is rarely sustained. Heaton et al (1976) could show no weight benefit from replacing white with wholemeal bread in a controlled trial, and although increased faecal fat loss on high fibre diets was demonstrated by Jenkins (1988), the loss averaged only 7 g/day. Nevertheless, the British Nutrition Foundation concluded in a 1990 report that foods rich in non-starch polysaccharide (NSP) are useful in weight reduction, probably through their satiating effect and the fact that diets high in naturally occurring fibres are generally lower in fat (and thus energy) and may take longer to chew, thereby influencing meal size.
Dietary fibre intakes have also been linked to reduced risk of CHD, mainly through an effect on plasma cholesterol. The observation by Hardinge & Stare (1954) that complete vegetarians have lower serum cholesterol concentrations than non-vegetarians has been repeated in many subsequent studies. Furthermore, vegetarians typically have higher ratios of high density lipoprotein (HDL) cholesterol to total cholesterol than either lacto-ovo-vegetarians or non-vegetarians. Although these observations arise in part as a consequence of the reduced dietary intakes of saturated fats among vegetarians, subsequent human trials have demonstrated lowered serum cholesterol concentrations in response to some, but not all, fibres or fibre-rich foods. It would appear that wheat bran, wheat wholemeal products and cellulose have no effect on serum cholesterol (Truswell & Beynen 1991, Truswell 2002). Pectin in large doses can effect a 10% reduction, oat bran and oat wholemeal products are capable of reductions of up to 23% (average 6%), and psyllium can achieve reductions of 4% for total cholesterol and 7% for LDL cholesterol (Olson et al 1997). Most studies have found guar gum capable of reducing total serum cholesterol, but further studies are needed to confirm the cholesterol-lowering effects reported for gum arabic, xanthan gum, gum acacia, karaya gum and locust bean gum (Truswell 1993).
A 1% reduction in serum cholesterol is generally considered to translate into a 2% reduction in CHD risk, suggesting substantial benefits from increased intake of dietary fibre of the types described. Three major population studies have assessed the effects of dietary fibre on CHD risk. In the Health Professionals Follow-up Study of men (Rimm et al 1996), there was a difference in fibre intake of 16.5 g/day between the highest (28.9 g/day) and lowest (12.4 g/day) intake groups, and the RR was 0.45 for fatal heart disease and 0.59 for total myocardial infarction. The Nurses' Health Study (Wolk et al 1999) showed that the difference of 11.4 g fibre/day between the highest (22.9 g/day) and lowest (11.5 g/day) consumption groups equated to an RR of 0.77 for total CHD. In the Finnish men's study (Pietinen et al 1996), the highest consumption group (34.8 g/day) had an RR of 0.68 compared to the lowest consumers (16.1 g/day).
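As a rough illustration, the sketch below (in Python) applies the 1% cholesterol to 2% CHD rule of thumb quoted above to the average cholesterol reductions cited earlier for particular fibre sources; the figures are indicative only and the helper function is hypothetical.

    # Rule of thumb quoted above: a 1% fall in serum cholesterol is generally
    # taken to translate into roughly a 2% fall in CHD risk.
    CHD_PER_CHOLESTEROL_PERCENT = 2.0

    def implied_chd_reduction(cholesterol_reduction_pct: float) -> float:
        """Rough implied percentage reduction in CHD risk for a given
        percentage reduction in total serum cholesterol."""
        return cholesterol_reduction_pct * CHD_PER_CHOLESTEROL_PERCENT

    # Average cholesterol reductions cited in the text for selected fibres.
    for fibre, chol_pct in [("oat bran (average)", 6), ("psyllium (total cholesterol)", 4), ("pectin (large doses)", 10)]:
        print(f"{fibre}: about {implied_chd_reduction(chol_pct):.0f}% lower CHD risk")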
In relation to colon cancer, despite the wealth of experimental data in cell lines and animal models providing convincing mechanisms and indicative protective effects of dietary fibre, and one large human trial showing benefits (Bingham et al 2003), several other human studies have shown little benefit of higher fibre intakes on colon cancer or markers of colon cancer risk (Alberts 2002, MacLennan et al 1995, Schatzkin et al 2000). Fruits, vegetables and cereal grains are all good sources of dietary fibre. Nearly all studies of diet and colon cancer in humans have found decreased risks associated with high intakes of fruit and vegetables. However, whilst a number of studies have also reported reduced risks in association with high cereal grain intakes, a few studies have found an increased risk, which casts an element of doubt over the conclusion that the fibre component was responsible for the apparently protective effect almost universally found for fruit and vegetables (Byers 1995). It is possible that some of the confusion relates to the fact that resistant starch (RS) was not accounted for in many early studies. International comparative studies show greater correlations between colon cancer and starch (and thus RS) intake across countries than with dietary fibre (Cassidy et al 1994).
Increased dietary fibre intakes have also been related to prevention of hormone-related cancers such as breast cancer. Pike et al (1993) argue strongly that international comparisons of breast cancer incidence are highly consistent with observed differences in circulating oestrogen levels. However, results from epidemiologic studies comparing circulating levels of steroid hormones in newly diagnosed cases of hormone-related cancers, or in high-risk groups, with those in low-risk groups or disease-free controls have been inconsistent. Breast cancer is the disease whose nutritional epidemiology has been studied most. Several case-control studies have reported decreased risks associated with fibre-rich diets (Baghurst & Rohan 1994, Lee et al 1991, Lubin et al 1986, van't Veer et al 1990). A Canadian cohort study observed a 32% reduction in breast cancer risk in the top quintile of fibre consumers relative to the bottom quintile (Rohan et al 1993), but two cohort studies in the US failed to observe an inverse relationship (Kushi et al 1992, Willett et al 1992). Dietary fibre is thought to exert its apparently protective effect through a reduction in circulating levels of oestrogen (Rose 1990), although the exact mechanism by which this occurs remains uncertain. The WHO, in its report on diet and chronic disease (WHO 2003), concluded that an effect of fibre on cancer risk was possible but that the data were insufficient.
Ecologic studies typically find an inverse association between the fibre content of the diet and the regional prevalence of diabetes (West 1974, West & Kalbfleisch 1971). However, in a survey of two populations in Micronesia, one at high risk and one at low risk of developing Type 2 diabetes, estimates of dietary fibre intake were of no predictive value regarding the risk of subsequent diabetes (King et al 1984). The similarity of these populations with respect to many other factors raises the possibility that the association observed in ecologic studies may have arisen as a consequence of differences between the study populations other than dietary fibre intake.
Contemporary research on dietary fibre and diabetes is mostly focussed on the potential benefits of dietary fibre in the management (through glycaemic control) of both Type 1 and Type 2 diabetes. Diabetics exhibit substantially higher risks of CVD than their non-diabetic counterparts, and hyperinsulinaemia, insulin resistance and over-treatment of diabetics with insulin have all been claimed to contribute to the development of premature atherosclerosis (Venn & Mann 2004, Vinik & Wing 1992). Management procedures that reduce insulin requirements are therefore highly desirable.
High fibre foods typically slow absorption through an effect on gastric emptying and/or entrapment of material in the viscous digesta that result from high fibre intakes. An author of an early report which claimed that increased fibre intakes may be beneficial for diabetics (Jenkins et al 1976) concluded nearly twenty years later
"that the value of high fiber foods lies principally in their ability to prolong absorption in the small intestine" and that... "the effects on carbohydrate and lipid metabolism can be mimicked by reducing meal size and increasing meal frequency over an extended period of time"
(Jenkins et al 1995).
The intakes of dietary fibre that appear to bring meaningful chronic disease benefits are achievable through dietary change. The upper intakes in the three CHD risk studies (Pietinen et al 1996, Rimm et al 1996, Wolk et al 1999) that brought major improvements in cardiovascular risk were 29 g/day and 35 g/day for men and 23 g/day for women. Twenty-nine g/day is just under the current 70th centile of intake for males in Australia and New Zealand and 35 g/day equates to the 80th centile. For women, 23 g/day is just under the current 70th centile. Thus, for people concerned with chronic disease risk, aiming to increase intakes towards the median intake of the highest current quintile of population intake (ie the 90th centile, of 38 g/day for men and 28 g/day for women) would appear to be a prudent strategy to reduce chronic disease risk in a manner unlikely to lead to adverse effects. Increasing intake through additional vegetables, legumes and fruits in the diet would also increase the intake of antioxidant vitamins and folate.