Similar Articles
20 similar articles found (search time: 31 ms)
1.

Background:

Several fast food companies have made commitments to reduce the levels of salt in the foods they serve, but technical issues are often cited as a barrier to achieving substantial reductions. Our objective was to examine the reported salt levels for products offered by leading multinational fast food chains.

Methods:

Data on salt content for products served by six fast food chains operating in Australia, Canada, France, New Zealand, the United Kingdom and the United States were collected by survey in April 2010. Mean salt contents (and their ranges) were calculated and compared within and between countries and companies.

Results:

We saw substantial variation in the mean salt content for different categories of products. For example, the salads we included in our survey contained 0.5 g of salt per 100 g, whereas the chicken products we included contained 1.6 g. We also saw variability between countries: chicken products from the UK contained 1.1 g of salt per 100 g, whereas chicken products from the US contained 1.8 g. Furthermore, the mean salt content of food categories varied between companies and between the same products in different countries (e.g., McDonald’s Chicken McNuggets contain 0.6 g of salt per 100 g in the UK, but 1.6 g of salt per 100 g in the US).

Interpretation:

The salt content of fast foods varies substantially, not only by type of food, but by company and country in which the food is produced. Although the reasons for this variation are not clear, the marked differences in salt content of very similar products suggest that technical reasons are not a primary explanation. In the right regulatory environment, it is likely that fast food companies could substantially reduce the salt in their products, translating to large gains for population health.

It is widely accepted that excess dietary salt causes blood pressure to increase, and that salt is a major determinant of population blood pressure levels.1,2 Recent estimates suggest that the numbers of deaths averted by moderate reductions in population salt consumption would be at least as many as those achieved by plausible reductions in population smoking rates.3 In western countries, more than three-quarters of dietary salt derives from processed foods.4 Coupled with rising rates of nutrition-related diseases worldwide,5,6 the food industry has an increasingly important role to play in public health. With consumers now purchasing larger numbers of meals outside the home, fast food increasingly contributes to population intake of dietary salt.7,8 Fast food tends to be more energy dense, contain more saturated fat and salt, contain fewer micronutrients and be eaten in larger portions than other foods.9–11 Fast food items such as fried potatoes, pizzas and sugar-sweetened soft drinks typically provide between one-third and one-half of daily energy intake but less than one-quarter of most micronutrients.7

A number of leading multinational fast food companies have shown their ability to reformulate foods to reduce salt levels.
The United Kingdom’s Food Standards Agency and the New York City Health Department now have voluntary salt reduction targets in place for packaged foods,12,13 and the National Salt Reduction Initiative in the United States has set targets for salt reduction for 25 categories of foods in restaurants.

Technical feasibility is often cited by industry as an impediment to efforts to reduce salt content, with new processes and technologies required to deliver lower-salt products.14 There is, however, little evidence to verify whether industry has reached the technical limits of salt reduction. A survey of the salt levels in fast foods would provide insight into this issue by quantifying the variability in the salt levels of similar products produced by different companies in different countries.

We sought to compile current data on salt content for products offered by six leading transnational fast food chains and to compare the results between companies, countries and products.

2.

Purpose

While there has been considerable effort to understand the environmental impact of a food or diet, nutritional effects are not usually included in food-related life cycle assessment (LCA).

Methods

We developed a novel Combined Nutritional and Environmental Life Cycle Assessment (CONE-LCA) framework that evaluates and compares in parallel the environmental and nutritional effects of foods or diets. We applied this framework to assess human health impacts, expressed in Disability Adjusted Life Years (DALYs), in a proof-of-concept case study that investigated the environmental and nutritional human health effects associated with the addition of one serving of fluid milk to the present average adult US diet. Epidemiology-based nutritional impacts and benefits linked to milk intake, such as colorectal cancer, stroke, and prostate cancer, were compared to selected environmental impacts traditionally considered in LCA (global warming and particulate matter) carried to a human health endpoint.

Results and discussion

Considering potential human health effects related to global warming, particulate matter, and nutrition, within the context of this study, findings suggest that adding one serving of milk to the current average diet could result in a health benefit for American adults, assuming that existing foods associated with substantial health benefits are not substituted, such as fruits and vegetables. The net health benefit is further increased when considering an iso-caloric substitution of less healthy foods (sugar-sweetened beverages). Further studies are needed to test whether this conclusion holds within a more comprehensive assessment of environmental and nutritional health impacts.

Conclusions

This case study provides the first quantitative epidemiology-based estimate of the complements and trade-offs between nutrition and environment human health burden expressed in DALYs, pioneering the infancy of a new approach in LCA. We recommend further testing of this CONE-LCA approach for other food items and diets, especially when making recommendations about sustainable diets and food choices.

3.
Nutrition labels have raised awareness of the energetic value of foods, and represent for many a pivotal guideline to regulate food intake. However, recent data have cast doubt on label accuracy.

Objective:

We tested label accuracy for energy and macronutrient content of prepackaged energy‐dense snack food products. We measured “true” caloric content of 24 popular snack food products in the U.S. and determined macronutrient content in 10 selected items.

Design and Methods:

Bomb calorimetry and food factors were used to estimate energy content. Macronutrient content was determined according to Official Methods of Analysis. Calorimetric measurements were performed in our metabolic laboratory between April 20th and May 18th and macronutrient content was measured between September 28th and October 7th of 2010.

Results and Conclusion:

Serving size, by weight, exceeded label statements by 1.2% [median] (25th percentile −1.4, 75th percentile 4.3, P = 0.10). When differences in serving size were accounted for, metabolizable calories were 6.8 kcal (0.5, 23.5, P = 0.0003) or 4.3% (0.2, 13.7, P = 0.001) higher than the label statement. In a small convenience sample of the tested snack foods, carbohydrate content exceeded label statements by 7.7% (0.8, 16.7, P = 0.01); however, fat and protein content were not significantly different from label statements (−12.8% [−38.6, 9.6], P = 0.23; 6.1% [−6.1, 17.5], P = 0.32). Carbohydrate content explained 40% and serving size an additional 55% of the excess calories. Among a convenience sample of energy-dense snack foods, caloric content is higher than stated on the nutrition labels, but overall well within FDA limits. This discrepancy may be explained by inaccurate carbohydrate content and serving size.
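The median-and-quartile summaries reported above can be reproduced with Python's standard statistics module. The paired (measured, label) energy values below are invented for illustration only; they are not the study's data:

```python
from statistics import quantiles

# Hypothetical paired measurements: (measured kcal, labeled kcal) per item.
# These numbers are illustrative, not taken from the study.
items = [(152.1, 140), (98.5, 100), (210.4, 200), (133.0, 130),
         (187.9, 180), (249.6, 240), (119.2, 120), (163.8, 150)]

# Percent deviation of measured energy from the label statement
deviations = [100 * (meas - label) / label for meas, label in items]

# 25th, 50th and 75th percentiles, as used in the abstract's summary
q1, med, q3 = quantiles(deviations, n=4)
print(f"median {med:+.1f}%  (25th {q1:+.1f}%, 75th {q3:+.1f}%)")
```

Reporting the interquartile range alongside the median, as the abstract does, conveys the spread of label errors without assuming they are normally distributed.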

4.

Background

To unravel true links between diet and health, it is important that dietary exposure is accurately measured. Currently, mainly self-reporting methods (e.g. food frequency questionnaires and 24-h recalls) are used to assess food intake in epidemiological studies. However, these traditional instruments are subjective measures and contain well-known biases. In particular, estimating the intake of confectionery products, such as those containing cocoa and liquorice, remains a challenge. The use of biomarkers of food intake (BFIs) may provide a more objective measurement. However, an overview of current candidate biomarkers and their validity is missing for both cocoa- and liquorice-containing foods.

Objective

The purpose of the current study was (1) to identify currently described candidate BFIs for cocoa (products) and liquorice, (2) to evaluate the validity of these identified candidate BFIs and (3) to address further validation and/or identification work to be done.

Methods

This systematic review was based on a comprehensive literature search of three databases (PubMed, Scopus and ISI Web of Science) to identify candidate BFIs. Via a second search step in the Human Metabolome Database (HMDB), the Food Database (FooDB) and Phenol-Explorer, the specificity of the candidate BFIs was evaluated, followed by an evaluation of the validity of the specific candidate BFIs via pre-defined criteria.

Results

In total, 37 papers were included for cocoa and 8 papers for liquorice. For cocoa, 164 unique candidate BFIs were obtained, and for liquorice, four were identified in total. Despite the high number of identified BFIs for cocoa, none of the metabolites was specific. Therefore, the validity of these compounds was not further examined. For liquorice intake, 18-glycyrrhetinic acid (18-GA) was found to have the highest assumed validity.

Conclusions

For cocoa, specific BFIs were missing, mainly because the individual BFIs were also found in foods having a similar composition, such as tea (polyphenols) or coffee (caffeine). However, a combination of individual BFIs might lead to discriminating profiles between cocoa (products) and foods with a similar composition. Therefore, studies directly comparing the consumption of cocoa to these similar products are needed, enabling efforts to find a unique profile per product. For liquorice, we identified 18-GA as a promising BFI; however, important information on its validity is missing; thus, more research is necessary. Our findings indicate a need for more studies to determine acceptable BFIs for both cocoa and liquorice.

5.
6.

Introduction

Epidemiological studies suggest that three daily servings of whole-grain foods (WGF) might lower cardiovascular disease risk, at least partly by lowering serum lipid levels. We assessed the effects of consuming three daily portions of whole-grain food (provided as wheat or a mixture of wheat and oats) on lipoprotein subclass size and concentration in a dietary randomised controlled trial involving middle-aged healthy individuals.

Methods

After a 4-week run-in period on a refined diet, volunteers were randomly allocated to a control (refined diet), wheat, or wheat + oats group for 12 weeks. Serving sizes were determined in order to significantly increase intakes of non-starch polysaccharides to the UK Dietary Reference Value of 18 g per day in the whole-grain groups (18.5 g and 16.8 g per day in the wheat and wheat + oats groups respectively, compared with 11.3 g per day in the control group). Outcome measures were serum lipoprotein subclasses' size and concentration. Habitual dietary intake was assessed prior to and during the intervention. Of the 233 volunteers recruited, 24 withdrew and 3 were excluded.

Results

At baseline, significant associations were found between lipoprotein size and subclasses' concentrations and some markers of cardiovascular risk, such as insulin resistance, blood pressure and serum intercellular adhesion molecule 1 concentration. Furthermore, alcohol and vitamin C intake were positively associated with an anti-atherogenic lipoprotein profile with regard to lipoprotein size and subclass distribution. However, none of the whole-grain interventions affected lipoprotein size or profile.

Conclusion

Our results indicate that three portions of whole-grain foods, irrespective of the type (wheat- or oat-based), do not reduce cardiovascular risk by beneficially altering the size and distribution of lipoprotein subclasses.

Trial Registration

www.Controlled-Trials.com ISRCTN 27657880.

7.

Background

Previous research has shown that oral processing characteristics like bite size and oral residence duration are related to the satiating efficiency of foods. Oral processing characteristics are influenced by food texture. Very little research has been done on the effect of food texture within solid foods on energy intake.

Objectives

The first objective was to investigate the effect of food hardness on energy intake at lunch, and to link this effect to differences in oral processing characteristics. The second objective was to investigate whether a reduction in energy intake at lunch would be compensated for at the subsequent dinner.

Design

Fifty subjects (11 male, BMI: 21±2 kg/m2, age: 24±2 y) participated in a cross-over study in which they consumed ad libitum from a lunch with soft foods or hard foods on two separate days. Oral processing characteristics at lunch were assessed by coding video records. Later on the same days, subjects consumed dinner ad libitum.

Results

Hard foods led to a ∼13% lower energy intake at lunch compared to soft foods (P<0.001). Hard foods were consumed with smaller bites, longer oral duration per gram food, and more chewing per gram food compared to the soft foods (P<0.05). Energy intake at dinner did not differ after both lunches (P = 0.16).

Conclusions

Hard foods led to reduced energy intake compared to soft foods, and this reduction in energy intake was sustained over the next meal. We argue that the differences in oral processing characteristics produced by the hardness of the foods explain the effect on intake. The sustained reduction in energy intake suggests that changes in food texture can be a helpful tool in reducing the overall daily energy intake.

8.

Objective

How do the holidays – and the possible New Year’s resolutions that follow – influence a household’s purchase patterns of healthier foods versus less healthy foods? This has important implications for both holiday food shopping and post-holiday shopping.

Methods

207 households were recruited to participate in a randomized controlled trial conducted at two regional grocery chain locations in upstate New York. Item-level transaction records were tracked over a seven-month period (July 2010 to March 2011). The cooperating grocer’s proprietary nutrient-rating system was used to designate “healthy” and “less healthy” items. Calorie data were extracted from online nutritional databases. Expenditures and calories purchased for the holiday period (Thanksgiving–New Year’s) and the post-holiday period (New Year’s–March) were compared to baseline (July–Thanksgiving) amounts.

Results

During the holiday season, household food expenditures increased 15% compared to baseline ($105.74 to $121.83; p<0.001), with 75% of additional expenditures accounted for by less-healthy items. Consistent with what one would expect from New Year’s resolutions, sales of healthy foods increased 29.4% ($13.24/week) after the holiday season compared to baseline, and 18.9% ($9.26/week) compared to the holiday period. Unfortunately, sales of less-healthy foods remained at holiday levels ($72.85/week holiday period vs. $72.52/week post-holiday). Calories purchased each week increased 9.3% (450 calories per serving/week) after the New Year compared to the holiday period, and increased 20.2% (890 calories per serving/week) compared to baseline.
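The headline percentage changes can be re-derived from the weekly dollar figures quoted in the abstract; a minimal sketch (all values taken from the text above, rounding follows it):

```python
def pct_change(old, new):
    """Percent change from old to new."""
    return 100.0 * (new - old) / old

# Weekly household food spending (US$), as reported in the abstract
baseline_total, holiday_total = 105.74, 121.83
less_healthy_holiday, less_healthy_post = 72.85, 72.52

spend_rise = pct_change(baseline_total, holiday_total)            # ~ +15%
less_healthy_shift = pct_change(less_healthy_holiday, less_healthy_post)

print(f"holiday spending vs baseline: {spend_rise:+.1f}%")
print(f"less-healthy post-holiday vs holiday: {less_healthy_shift:+.1f}%")
```

The second figure being close to zero is the abstract's point: less-healthy purchasing stayed at holiday levels after the New Year.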

Conclusions

Despite resolutions to eat more healthfully after New Year’s, consumers may adjust to a new “status quo” of increased less-healthy food purchasing during the holidays, and dubiously fulfill their New Year’s resolutions by spending more on healthy foods. Encouraging consumers to substitute healthy items for less-healthy items may be one way for practitioners and public health officials to help consumers fulfill New Year’s resolutions, and reverse holiday weight gain.

9.

Objective

Global health challenges include non-communicable disease burdens, ensuring food security in the context of rising food prices, and environmental constraints around food production, e.g., greenhouse gas [GHG] emissions. We therefore aimed to consider optimized solutions to the mix of food items in daily diets for a developed country population: New Zealand (NZ).

Methods

We conducted scenario development and linear programming to model 16 diets (some with uncertainty). Data inputs included nutrients in foods, food prices, food wastage and food-specific GHG emissions.
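The study's optimization used linear programming over a full New Zealand food database. As a toy illustration of the same idea, an exhaustive search over a handful of foods can find the cheapest combination of daily servings that meets nutrient floors; the food names, prices and nutrient values below are invented for illustration, not the study's inputs:

```python
from itertools import product

# Hypothetical foods: (price NZ$/serving, energy kJ/serving, protein g/serving)
foods = {
    "oats":    (0.25, 650, 4.5),
    "milk":    (0.30, 540, 8.5),
    "lentils": (0.40, 480, 9.0),
    "bread":   (0.20, 550, 3.5),
}
MIN_ENERGY, MIN_PROTEIN = 8700, 50   # rough daily floors (kJ, g)

names = list(foods)
best = None
# Brute-force search over 0-8 servings of each food (a real LP solver
# would scale to the study's much larger food list)
for servings in product(range(9), repeat=len(names)):
    energy = sum(n * foods[f][1] for n, f in zip(servings, names))
    protein = sum(n * foods[f][2] for n, f in zip(servings, names))
    if energy >= MIN_ENERGY and protein >= MIN_PROTEIN:
        cost = sum(n * foods[f][0] for n, f in zip(servings, names))
        if best is None or cost < best[0]:
            best = (cost, dict(zip(names, servings)))

print(f"cheapest feasible day: NZ${best[0]:.2f} -> {best[1]}")
```

Adding a GHG-per-serving column and a cap on total emissions turns this into the kind of joint cost/emissions optimization the study describes.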

Findings

This study identified daily dietary patterns that met key nutrient requirements for as little as a median of NZ$ 3.17 per day (US$ 2.41/d) (95% simulation interval [SI] = NZ$ 2.86 to 3.50/d). Diets that included “more familiar meals” for New Zealanders increased the cost. The optimized diets also had low GHG emission profiles compared with the estimate for the ‘typical NZ diet’: e.g., 1.62 kg CO2e/d for one scenario (95% SI = 1.39 to 1.85 kg CO2e) versus 10.1 kg CO2e/d for the typical diet. All of the optimized low-cost and low-GHG dietary patterns had likely health advantages over the current NZ dietary pattern, i.e., lower cardiovascular disease and cancer risk.

Conclusions

We identified optimal foods and dietary patterns that would lower the risk of non-communicable diseases at low cost and with low greenhouse gas emission profiles. These results could help guide central and local government decisions around which foods to focus policies on: that is, which foods are most suitable for food taxes (additions and exemptions), healthy food vouchers and subsidies, and increased use by public institutions involved in food preparation.

10.
The highly potent botulinum neurotoxins are responsible for botulism, a severe neuroparalytic disease. Strains of nonproteolytic Clostridium botulinum form neurotoxins of types B, E, and F and are the main hazard associated with minimally heated refrigerated foods. Recent developments in quantitative microbiological risk assessment (QMRA) and food safety objectives (FSO) have made food safety more quantitative and include, as inputs, probability distributions for the contamination of food materials and foods. A new method that combines a selective enrichment culture with multiplex PCR has been developed and validated to enumerate specifically the spores of nonproteolytic C. botulinum. Key features of this new method include the following: (i) it is specific for nonproteolytic C. botulinum (and does not detect proteolytic C. botulinum), (ii) the detection limit has been determined for each food tested (using carefully structured control samples), and (iii) a low detection limit has been achieved by the use of selective enrichment and large test samples. The method has been used to enumerate spores of nonproteolytic C. botulinum in 637 samples of 19 food materials included in pasta-based minimally heated refrigerated foods and in 7 complete foods. A total of 32 samples (5 egg pastas and 27 scallops) contained spores of nonproteolytic C. botulinum type B or F. The majority of samples contained <100 spores/kg, but one sample of scallops contained 444 spores/kg. Nonproteolytic C. botulinum type E was not detected. Importantly, for QMRA and FSO, the construction of probability distributions will enable the frequency of packs containing particular levels of contamination to be determined.

Food-borne botulism is a severe and deadly intoxication caused by the consumption of food containing as little as 30 to 100 ng of preformed botulinum neurotoxin (45). More than 2,500 cases of botulism were reported in Europe in 1999 and 2000, with the majority of cases in the east of the continent (44). Currently, 25 to 50 food-borne botulism cases are diagnosed annually in the United States (27). There are seven distinct botulinum neurotoxins (types A to G) and a number of subtypes (6, 26, 45). In view of the potency of the botulinum neurotoxin and the severity of botulism, four phylogenetically distinct bacteria are grouped together as the Clostridium botulinum species, solely on the basis of their ability to form botulinum neurotoxin. The divergence between these four distinct bacteria is strong enough to merit their classification as distinct species and in some cases is significantly greater than that between bacteria belonging to different genera, e.g., Bacillus subtilis and Staphylococcus aureus (7). Two of these bacteria (proteolytic C. botulinum and nonproteolytic C. botulinum) are responsible for the majority of cases of food-borne botulism. Strains of proteolytic C. botulinum produce neurotoxins of type A, B, or F, form spores of high heat resistance, and have a minimum growth temperature of approximately 12°C (39). Strains of nonproteolytic C. botulinum produce neurotoxins of type B, E, or F, form spores of moderate heat resistance, are able to grow and form toxin at 3°C (18, 48), and are recognized as the major hazard associated with minimally heated refrigerated foods (4, 37, 43, 44, 48). These new foods meet consumer demand for high-quality, convenient foods that are low in preservatives, and sales are presently increasing by about 10% per annum in many countries (3, 47).

Quantitative microbiological risk assessment (QMRA) is now established as an important microbiology food safety tool (42). Process risk models have been used to assess the safety of specific foods with respect to nonproteolytic C. botulinum and the food-borne botulism hazard (e.g., 2, 41). These process risk models benefit from high-quality information, including that on the incidence of spores of nonproteolytic C. botulinum in food materials. The implementation of food safety objectives (FSO) also benefits from the availability of high-quality information on the microbial contamination of foods and food materials (24). This information is most effective in the form of probability distributions rather than as average spore concentrations or other statistics.

The difficulty with enumerating nonproteolytic C. botulinum in foods is that there is no effective selective culture medium available. Surveys of the extent of contamination of foods and food materials have used a nonselective enrichment followed by either testing for neurotoxin using a mouse test or enzyme-linked immunosorbent assay (ELISA) or testing for the presence of neurotoxin genes using a PCR test (3, 10, 13, 35, 38, 39). This approach, however, is not optimized for nonproteolytic C. botulinum or proteolytic C. botulinum (therefore potentially failing to recover all spores of either organism) and may also not distinguish nonproteolytic C. botulinum from proteolytic C. botulinum. Heating at 80°C for 10 min followed by incubation at 35°C (54) may be reasonably selective for proteolytic C. botulinum, but there is no similar approach for nonproteolytic C. botulinum, although incubation at 28°C (54) may offer an element of selection. It is necessary, therefore, to develop a method to enumerate spores of nonproteolytic C. botulinum in food materials that is robust and optimized, as well as sensitive and specific for this particular pathogen (and does not also detect proteolytic C. botulinum). When enumerating bacteria in foods, it is essential to demonstrate the efficiency of the method by verifying that small concentrations (in the present study, spores of nonproteolytic C. botulinum) can be detected following addition to test samples.

This paper describes the development, validation, and application of a new method to enumerate spores of nonproteolytic C. botulinum in foods and in food materials. This method has been designed to generate data for the construction of probability distributions that can be used in QMRA and FSO settings. Most of the effort has been dedicated to the development and evaluation of the enrichment procedure rather than the PCR test, as the PCR test has received much attention from others (e.g., 3, 10, 16, 36, 38). A low-temperature selective-enrichment procedure is described that has been optimized specifically for nonproteolytic C. botulinum over proteolytic C. botulinum and other bacteria. In order to detect low concentrations of spores, large quantities (200 g) of food materials and foods have been tested. Specific detection of neurotoxin genes is achieved by the use of an established multiplex PCR (36), with an internal amplification control now included (25). By the use of a set of control samples inoculated with defined concentrations of spores of nonproteolytic C. botulinum, the detection limit has been estimated for each food material and food tested. The method has been used in an extensive survey of raw materials intended for use in pasta ready meals, as well as the final meals themselves. The implications for risk assessment and risk management of chilled foods are discussed.
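As a sketch of why large (200 g) test samples lower the detection limit: if spores are randomly (Poisson) distributed at concentration c spores/kg, the probability that a sample of mass m kg contains at least one spore is 1 − exp(−c·m). The 200 g sample size is from the text above; the Poisson assumption is a standard QMRA simplification, not a claim about the authors' exact statistics:

```python
import math

def p_detect(conc_per_kg, sample_kg=0.2):
    """Probability that a sample contains >= 1 spore, assuming spores
    are Poisson-distributed at the given concentration (spores/kg)."""
    return 1.0 - math.exp(-conc_per_kg * sample_kg)

# 444 spores/kg is the highest contamination level reported in the survey;
# the lower values are illustrative concentrations.
for c in (5, 25, 100, 444):
    print(f"{c:>4} spores/kg -> P(detect in 200 g sample) = {p_detect(c):.3f}")
```

At 5 spores/kg a single 200 g sample detects contamination only ~63% of the time, which is why testing many samples and building probability distributions, rather than reporting a single average, matters for QMRA and FSO.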

11.

Background

Consumers are increasingly demanding natural and beneficial foods in order to improve their health and well-being. Probiotics play an important role in meeting this demand, and dairy foods are commonly used as vehicles for such bacteria, represented predominantly by lactic acid bacteria. Driven by consumer demand, the food industry is constantly looking for novel bacterial strains, leading to studies aimed at the isolation and characterization of their beneficial features. This study aimed to characterize naturally occurring lactic acid bacteria obtained from a dairy environment in order to assess their potential use as probiotics.

Results

Preliminary screening and PCR analysis, based on 16S rRNA sequencing, were applied to select and identify 15 LAB strains from the genera Lactobacillus (n = 11), Pediococcus (n = 2) and Weissella (n = 2). All strains showed resistance to low pH and to the evaluated bile salt concentrations in vitro. The API ZYM test characterized the enzymatic activity of the strains, and high β-galactosidase activity was observed in 13 strains. All strains presented resistance to simulated gastric (3 h) and intestinal (4 h) conditions in vitro, the ability to auto- and co-aggregate with indicator microorganisms, and high cell surface hydrophobicity. Most of the strains were positive for the beneficial genes map and EFTu. All strains exhibited strong deconjugation of bile salts in vitro and all assimilated lactose.

Conclusions

The phenotypes exhibited in vitro and the presence of beneficial genes revealed the beneficial potential of the studied strains, warranting further analyses in a food matrix and in vivo to allow the development of a functional product with health-related properties.

12.

Objective

Each day, tens of millions of restaurant goers, conference attendees, college students, military personnel, and school children serve themselves at buffets – many being all-you-can-eat buffets. Knowing how the food order at a buffet triggers what a person selects could be useful in guiding diners to make healthier selections.

Method

The breakfast food selections of 124 health conference attendees were tallied at two separate seven-item buffet lines (which included cheesy eggs, potatoes, bacon, cinnamon rolls, low-fat granola, low-fat yogurt, and fruit). The food order between the two lines was reversed (least healthy to most healthy, and vice versa). Participants were randomly assigned to choose their meal from one line or the other, and researchers recorded what participants selected.

Results

With buffet foods, the first ones seen are the ones most selected. Over 75% of diners selected the first food they saw, and the first three foods a person encountered in the buffet comprised 66% of all the foods they took. Serving the less healthy foods first led diners to take 31% more total food items (p<0.001). Indeed, diners in this line more frequently chose less healthy foods in combinations, such as cheesy eggs and bacon (r = 0.47; p<0.001) or cheesy eggs and fried potatoes (r = 0.37; p<0.001). This co-selection of healthier foods was less common.

Conclusions

Three words summarize these results: first foods most. What ends up on a buffet diner’s plate is dramatically determined by the presentation order of food. Rearranging food order from healthiest to least healthy can nudge unknowing or even resistant diners toward a healthier meal, helping make them slim by design. Health-conscious diners can proactively start at the healthier end of the line, and this same basic principle of “first foods most” may be relevant in other contexts – such as when serving or passing food at family dinners.

13.

Background

Catch-up growth after an infection is essential for children to maintain good nutritional status. To prevent malnutrition, WHO recommends that children are given one additional healthy meal per day during the 2 weeks after onset of illness. We investigated to what extent ready-to-use therapeutic food (RUTF) promotes catch-up growth in children after an acute, uncomplicated episode of Plasmodium falciparum malaria.

Methods

We did an open randomised trial of children aged 6–59 months with confirmed malaria who attended a Médecins Sans Frontières-supported outpatient clinic in Katanga Province, Democratic Republic of Congo. All children received a clinical examination and malaria treatment. Patients were then randomly assigned to either an RUTF group, who received daily supplemental RUTF (a high-protein peanut-based paste) for 14 days, or to a control group, who received no supplemental food. Children were weighed at baseline and on days 14 and 28. The primary outcome was mean weight change after 14 days' RUTF. Analysis was by intention-to-treat.

Results

93 children received RUTF and 87 received no food supplementation. At day 14, the RUTF group had a mean weight gain of 353 g compared with 189 g in the control group (difference 164 g [95% CI 52–277], p = 0.005). However, at day 28 there was no statistically significant difference between the groups (539 g versus 414 g, respectively [p = 0.053]). Similarly, the rate of weight gain per kg bodyweight per day was significantly higher at day 14 in the RUTF group (2.4 g/kg per day versus 1.3 g/kg per day, p = 0.005) but at day 28 was 1.9 g/kg per day in the RUTF group versus 1.5 g/kg per day in the control group (p = 0.076).

Conclusions

Children receiving RUTF for 14 days after effective treatment of an uncomplicated malaria episode had a faster weight gain than children not given supplementation, reducing the period that children were at risk of malnutrition.

Trial Registration

ClinicalTrials.gov NCT00819858

14.
Andrea Rosanoff, Plant and Soil (2013) 368(1–2): 139–153

Aims

Decreasing mineral concentrations in the high-yield grains of the Green Revolution have coincided in time with rising global cardiovascular disease (CVD) mortality rates. Given the Magnesium (Mg) Hypothesis of CVD, it is important to assess any changes in food crop Mg concentrations over the past 50+ years.

Methods

Using current and historical published sources, Mg concentrations in “old” and “new” wheats, fruits and vegetables were listed or calculated (on a dry-weight basis) and applied to reports of the USA’s historic Mg supply, 1900–2006. The resulting trend in the USA Mg supply was compared with the USA trend in CVD mortality. Human Mg intake studies, old and new, were compared with the range of reported human Mg requirements.

Results

Acknowledging assessment difficulties: since the 1850s, wheat Mg concentrations have declined by 7–29 %, and Mg in USA and English vegetables declined by 15–23 % from the 1930s to the 1980s. The nadir of the USA food Mg supply in 1968 coincides with the USA peak in CVD mortality. As humans transition from "traditional" diets to modern processed-food diets, Mg intake declines.

Conclusions

Rising global CVD mortality may be linked to lower Mg intakes as world populations transition from traditional high-Mg foods to foods low in Mg due to declining crop Mg and processing losses.

15.
The flow of long-chain polyunsaturated fatty acids (PUFAs) of the omega-3 family, namely eicosapentaenoic acid (20:5n-3, EPA) and docosahexaenoic acid (22:6n-3, DHA), exported by amphibian metamorphs from water to terrestrial ecosystems in the Medveditsa River floodplain was quantified for the first time. The total biomass export by three amphibian species (Pelobates fuscus, Bombina bombina, and Pelophylax ridibundus) per unit area of lake surface was 0.594 g/m2 per year (mean over 2 years). The biomass flow per unit area of land was 0.726 g/ha per year (0.302 g/ha per year as organic carbon) in 2015–2016. The average annual total removal of EPA + DHA by amphibians from the floodplain lake was 1.47 mg/m2 of water surface area. Owing to the high content of EPA and DHA in their biomass, amphibians are potentially a valuable food for terrestrial predators that have no access to other sources of essential PUFAs.

16.

Purpose

Private food consumption accounts for 30 % of the total environmental impacts caused by the final consumption of Swiss households, and private expenses for gastronomy and hotels account for another 6 %. It is therefore necessary to investigate and better understand the environmental impacts of food consumption and the possibilities for reducing these impacts. This was the starting point for the collaboration between the canteen operator SV Group, the life cycle assessment (LCA) consultancy ESU-services, the energy supplier ewz and the World Wide Fund for Nature (WWF) in Switzerland, focusing on food consumption in canteens.

Methods

In a first step, an LCA study was used to analyse the environmental impacts of about 20 million meals served in 240 canteens in 2011. LCA data for 160 food items were linked to the food amounts of about 10,000 articles purchased in that year. This was supplemented by data on canteen operation and resulted in a full organisational LCA.

Results and discussion

The impacts of food purchases are about four times higher than the direct impacts of operating the canteens. The most important product groups are meat and dairy products. The project group identified improvement potentials within 14 different themes. They include measures in canteen operation (e.g. reducing food waste or using energy-efficient appliances) and measures in the supply chain (e.g. reducing vegetables grown in heated greenhouses or abandoning air-transported products). Dietary choices, such as reducing the average amount of meat per meal, are also considered an option. The results and recommendations of the detailed LCA, along with information from the other partners, were used by the SV Group to develop the programme ONE TWO WE. It assists customers (companies who commission the operation of canteens on their premises) in reaching improved levels of environmental performance. The programme aims for a 20 % cut in GHG emissions after full implementation in the participating canteens.

Conclusions

The programme started successfully, with many customers positively convinced by the proposed changes in the provision of canteen meals. An initial reduction of greenhouse gas emissions compared with the baseline was achieved. This LCA study is a good example of the value of calculating a full organisational environmental footprint for a company in the gastronomy sector and of using the results of such a study to bring down overall environmental impacts.

17.

Objective

Previous studies have shown that estimates of the calorie content of an unhealthy main-meal food tend to be lower when the food is shown alongside a healthy item (e.g. fruit or vegetables) than when it is shown alone. This effect has been called the negative calorie illusion and has been attributed to averaging of the unhealthy (vice) and healthy (virtue) foods, leading to increased perceived healthiness and reduced calorie estimates. The current study aimed to replicate and extend these findings to test the hypothesized mediating effect of ratings of food healthiness on calorie estimates.

Methods

In three online studies, participants were invited to make calorie estimates of combinations of foods. Healthiness ratings of the food were also assessed.

Results

The first two studies failed to replicate the negative calorie illusion. In a final study, the use of a reference food, closely following a procedure from a previously published study, did elicit a negative calorie illusion. No evidence was found for a mediating role of healthiness estimates.

Conclusion

The negative calorie illusion appears to be a function of the contrast between a food being judged and a reference, supporting the hypothesis that the negative calorie illusion arises from the use of a reference-dependent anchoring and adjustment heuristic and not from an 'averaging' effect, as initially proposed. This finding is consistent with existing data on sequential calorie estimates, and highlights the significant impact of the order in which foods are viewed on how they are evaluated.

18.

Purpose

The objective was to assess the environmental burden of food consumption and food losses in Germany, with the aim of defining measures to reduce environmentally relevant food losses. To support the identification of such measures, the study provides differentiated information on life cycle phases (agriculture, processing, retail, and consumption), consumption places (in-home and out-of-home), and the average German food basket, consisting of eight food categories.

Methods

To obtain information on the environmental impacts of German food consumption, the study analyzed the material flows of the food products in the German food basket, starting from the consumption phase and going backwards to agricultural production. The analysis includes all relevant impact categories, such as GWP, freshwater and marine eutrophication, particulate matter formation, and agricultural land and water use. The life cycle stages consumer, retail, wholesale, food production, and agriculture were taken into account, as were transports to and within Germany. Consumption and production data were taken from the German income and consumption sample, German production and trade statistics, and recent studies on food losses. To model German food consumption, some simplifications had to be made.

Results and discussion

Results show that German food consumption is responsible for 2.7 t of greenhouse gases per person per year. Fourteen cubic meters of blue water are used for agricultural food production per person, and 2673 m2 of agricultural land are occupied per person per year for food consumption. Between 14 and 20 % of the environmental burdens (depending on the impact category) result from food losses along the value chain. Out-of-home consumption is responsible for 8 to 28 % of the total environmental impacts (depending on the impact category). Animal products in particular cause high environmental burdens. Regarding life cycle phases, agriculture and consumption cause the highest impacts: together, they are responsible for more than 87 % of the total environmental burdens.
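The per-person figures above can be put into more familiar units with straightforward conversions. A minimal sketch, using only the values quoted in the abstract (the conversions are illustrative, not part of the study):

```python
# Per-person annual footprints, taken from the abstract.
ghg_t = 2.7            # t CO2-eq per person per year
blue_water_m3 = 14.0   # m3 of blue water per person per year
land_m2 = 2673.0       # m2 of agricultural land per person per year

# Convenience conversions.
land_ha = land_m2 / 10_000            # hectares per person per year
ghg_kg_per_day = ghg_t * 1000 / 365   # kg CO2-eq per person per day
print(round(land_ha, 4), round(ghg_kg_per_day, 1))
```

So the 2673 m2 corresponds to roughly a quarter-hectare of agricultural land per person, and the 2.7 t of greenhouse gases to about 7.4 kg CO2-eq per day.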

Conclusions

The study shows that food production and consumption, as well as food losses along the value chain, are highly relevant to Germany's environmental impacts. Animal products in particular are responsible for high environmental burdens. Thus, measures to reduce environmentally relevant food losses should focus in particular on reducing food waste of animal origin. The most relevant life cycle phases for reducing environmental impacts are agricultural production and consumption, both in households and out-of-home.

19.
Laboratory studies were conducted on food ingestion and excretion by selected species of pentatomids on different food sources to support their pest status. We compared the frequency and duration of feeding on vegetative (stem) and reproductive (seed) structures of soybean, Glycine max (L.) Merrill, and of maize, Zea mays L., by Piezodorus guildinii (Westwood), Dichelops melacanthus (Dallas), and Edessa meditabunda (F.); in addition, the amount of excreta (feces) produced was compared for D. melacanthus feeding on soybean seed and maize seedling stem. The feeding behavior of E. meditabunda and P. guildinii on soybean, and of D. melacanthus on maize, was recorded using the electropenetrography (EPG) technique. Excretion was estimated using water-sensitive paper, recording the number and area of fecal drops. Results indicated that E. meditabunda on soybean stem repeated ingestion events (of both xylem and phloem sap) over four times per bug during the 8 h of recording, at ca. 53 min per event. Dichelops melacanthus on maize seedling repeated each ingestion event over three times per bug, at ca. 24 min per event. Piezodorus guildinii feeding on soybean stem repeated each ingestion 1.2 times per bug, at ca. 40 min per event; on seed endosperm, it fed for longer, ca. 80 min per event, with each event repeated only 0.5 times per bug. The number of excretory drops was higher when D. melacanthus fed on maize seedling (9.9 drops per bug) than on soybean seed (1.4 drops per bug). A larger amount of saliva/regurgitated liquid food was expelled when bugs fed on the former than on the latter.

20.

Background

Transgenic expression of small RNAs is a prevalent approach in agrobiotechnology for the global enhancement of plant foods. Meanwhile, emerging studies have, on the one hand, emphasized the potential of transgenic microRNAs (miRNAs) as novel dietary therapeutics and, on the other, suggested potential food safety issues if harmful miRNAs are absorbed and bioactive. For these reasons, it is necessary to evaluate the bioavailability of transgenic miRNAs in genetically modified crops.

Results

As a pilot study, two transgenic Arabidopsis lines ectopically expressing unique miRNAs were compared and contrasted with the plant bioavailable small RNA MIR2911 for digestive stability and serum bioavailability. The expression levels of these transgenic miRNAs in Arabidopsis were found to be comparable to that of MIR2911 in fresh tissues. Assays of digestive stability in vitro and in vivo suggested the transgenic miRNAs and MIR2911 had comparable resistance to degradation. Healthy mice consuming diets rich in Arabidopsis lines expressing these miRNAs displayed MIR2911 in the bloodstream but no detectable levels of the transgenic miRNAs.

Conclusions

These preliminary results imply that digestive stability and high expression levels of miRNAs in plants do not readily equate to bioavailability. This initial work suggests that novel engineering strategies should be employed to enhance miRNA bioavailability when attempting to use transgenic foods as a delivery platform.
