Similar Articles
20 similar records retrieved.
1.
In the past, ecological risk assessments (ERAs) have generally overlooked the soil microbial community when evaluating the impacts of contaminants in soil. However, the soil microbial community, which includes bacteria and fungi, performs functions necessary for overall ecosystem health, including nitrogen fixation, nutrient cycling, and even degradation of contaminants. The incorporation of the soil microbial community into ERA requires the compilation of adequate toxicity data to complete the hazard assessment phase of the risk assessment. A variety of soil microbial toxicity tests exist that provide rapid and inexpensive results. Surveys of the microbial community at a contaminated site may also provide insight as to their functioning in the presence of contaminants. This paper explores the use of toxicity tests and surveys to evaluate chemical toxicity to microbes. However, many questions related to the best methodological approach to an ERA of the soil microbial community have yet to be answered.

2.
Scientists, risk practitioners, and regulators have debated the need to be proactive in assessing the potential health and environmental risks and benefits of unregulated nano-scale technologies. Many nanotech-based products and applications are already in use or development. Application of a proactive risk-based approach that considers the life cycle of the product, rather than a precautionary principle approach that would likely restrict the progress and advance of nanoscience, will be useful in helping to assess the unknown and unpredictable risks associated with nano-products, nanotoxicity, and nanopollution.

This article summarizes what is currently known regarding the potential toxicity and hazards of nanomaterials. A life-cycle perspective is used to identify important areas for further consideration and research. A conceptual framework is proposed for linking the strategies of life cycle and risk analysis within the same toolbox. This approach allows for prevention and treatment of a material's life-cycle risks, which can be considered in an integrated manner, thereby promoting continuous improvement, proactive risk reduction, and a flexible and adaptive approach to evaluating nanotechnology without stifling innovation.


3.
4.

Background

Framingham risk equations are widely used to predict cardiovascular disease based on health information from a single time point. Little is known regarding the use of information from repeat risk assessments and temporal change in estimated cardiovascular risk for the prediction of future cardiovascular events. This study aimed to compare the discrimination and risk reclassification of approaches using estimated cardiovascular risk at single and repeat risk assessments.

Methods

Using data on 12,197 individuals enrolled in EPIC-Norfolk cohort, with 12 years of follow-up, we examined rates of cardiovascular events by levels of estimated absolute risk (Framingham risk score) at the first and second health examination four years later. We calculated the area under the receiver operating characteristic curve (aROC) and risk reclassification, comparing approaches using information from single and repeat risk assessments (i.e., estimated risk at different time points).

Results

The mean Framingham risk score increased from 15.5% to 17.5% over a mean of 3.7 years from the first to second health examination. Individuals with high estimated risk (≥20%) at both health examinations had considerably higher rates of cardiovascular events than those who remained in the lowest risk category (<10%) in both health examinations (34.0 [95%CI 31.7–36.6] and 2.7 [2.2–3.3] per 1,000 person-years respectively). Using information from the most up-to-date risk assessment resulted in a small non-significant change in risk classification over the previous risk assessment (net reclassification improvement of -4.8%, p>0.05). Using information from both risk assessments slightly improved discrimination compared to information from a single risk assessment (aROC 0.76 and 0.75 respectively, p<0.001).

Conclusions

Using information from repeat risk assessments over a period of four years modestly improved prediction, compared to using data from a single risk assessment. However, this approach did not improve risk classification.
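The net reclassification improvement (NRI) reported in this study can be computed directly from paired risk categories and observed events. A minimal sketch in Python; the function name and toy data below are illustrative, not taken from the study:

```python
def nri(old_cat, new_cat, event):
    """Categorical net reclassification improvement.

    old_cat, new_cat: risk category per person (higher = riskier)
    from the earlier and later assessment; event: 1 if the outcome
    occurred during follow-up, else 0.
    """
    up_e = down_e = n_e = 0     # category movements among events
    up_ne = down_ne = n_ne = 0  # category movements among non-events
    for o, n, e in zip(old_cat, new_cat, event):
        if e:
            n_e += 1
            up_e += n > o
            down_e += n < o
        else:
            n_ne += 1
            up_ne += n > o
            down_ne += n < o
    # upward moves are "correct" for events, downward moves for non-events
    return (up_e - down_e) / n_e + (down_ne - up_ne) / n_ne
```

A negative value, as with the -4.8% reported above, means the newer assessment on balance reclassified people in the wrong direction.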

5.

Background

The Thrombolysis in Myocardial Infarction (TIMI) risk scores for Unstable Angina/Non-ST–elevation myocardial infarction (UA/NSTEMI) and ST-elevation myocardial infarction (STEMI) and the Global Registry of Acute Coronary Events (GRACE) risk scores for in-hospital and 6-month mortality are established tools for assessing risk in Acute Coronary Syndrome (ACS) patients. The objective of our study was to compare the discriminative abilities of the TIMI and GRACE risk scores in a broad-spectrum, unselected ACS population and to assess the relative contributions of model simplicity and model composition to any observed differences between the two scoring systems.

Methodology/Principal Findings

ACS patients admitted to the University of Michigan between 1999 and 2005 were divided into UA/NSTEMI (n = 2753) and STEMI (n = 698) subpopulations. The predictive abilities of the TIMI and GRACE scores for in-hospital and 6-month mortality were assessed by calibration and discrimination. There were 137 in-hospital deaths (4%), and among the survivors, 234 (7.4%) died by 6 months post-discharge. In the UA/NSTEMI population, the GRACE risk scores demonstrated better discrimination than the TIMI UA/NSTEMI score for in-hospital (C = 0.85, 95% CI: 0.81–0.89, versus 0.54, 95% CI: 0.48–0.60; p<0.01) and 6-month (C = 0.79, 95% CI: 0.76–0.83, versus 0.56, 95% CI: 0.52–0.60; p<0.01) mortality. Among STEMI patients, the GRACE and TIMI STEMI scores demonstrated comparably excellent discrimination for in-hospital (C = 0.84, 95% CI: 0.78–0.90 versus 0.83, 95% CI: 0.78–0.89; p = 0.83) and 6-month (C = 0.72, 95% CI: 0.63–0.81, versus 0.71, 95% CI: 0.64–0.79; p = 0.79) mortality. An analysis of refitted multivariate models demonstrated a marked improvement in the discriminative power of the TIMI UA/NSTEMI model with the incorporation of heart failure and hemodynamic variables. Study limitations included unaccounted for confounders inherent to observational, single institution studies with moderate sample sizes.

Conclusions/Significance

The GRACE scores provided superior discrimination as compared with the TIMI UA/NSTEMI score in predicting in-hospital and 6-month mortality in UA/NSTEMI patients, although the GRACE and TIMI STEMI scores performed equally well in STEMI patients. The observed discriminative deficit of the TIMI UA/NSTEMI score likely results from the omission of key risk factors rather than from the relative simplicity of the scoring system.
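The C-statistic used throughout this comparison is the probability that a randomly chosen patient who died received a higher risk score than a randomly chosen survivor, with ties counted as half. A self-contained sketch (the scores in the test below are toy values, not study data):

```python
def c_statistic(scores, outcomes):
    """Concordance (C-statistic / AUC): the probability that a randomly
    chosen case scores higher than a randomly chosen non-case,
    counting ties as one half."""
    cases = [s for s, y in zip(scores, outcomes) if y]
    controls = [s for s, y in zip(scores, outcomes) if not y]
    pairs = len(cases) * len(controls)
    concordant = sum((c > d) + 0.5 * (c == d)
                     for c in cases for d in controls)
    return concordant / pairs
```

A value of 0.5 means the score discriminates no better than chance, which is why the TIMI UA/NSTEMI C-statistics near 0.54–0.56 above indicate a discriminative deficit.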

6.
Two species of polychaetous annelids are dug for sale as bait from intertidal mudflats of Maine. This effort generates nearly $3.5 million in annual revenue and comprises over 90% of the baitworm fisheries in the U.S. The two species are (1) sandworms or clamworms, Nereis virens (family Nereididae), and (2) bloodworms or beakworms, Glycera dibranchiata (family Glyceridae). Numbers of baitworm diggers licensed annually in Maine increased from 449 in 1948 to a maximum of 1,455 in 1974 and have decreased since then to 801 in 1991. Sandworm landings increased from the late 1940s until the early 1960s, when they leveled off. They fluctuated between 300,000 and 400,000 lbs landed annually for the next 20 years. Between 1982 and 1991, sandworm landings ranged between 179,000 (1990) and 380,000 (1982) lbs landed per year. Bloodworm landings were at a maximum between 1960 and 1976, ranging between 140,000 and 215,000 lbs landed annually. After a sharp decline in the bloodworm fishery in the late 1970s, annual landings ranged between 102,000 (1988) and 168,000 (1982) lbs. Reasons for the fluctuations and recent decreases in landings remain unexplained. Some data suggest that market demand limits the baitworm landings for both species, while others imply that overharvesting, at least for bloodworms, may be a problem for these resources.

7.

Background

An HIV vaccine could substantially impact the epidemic. However, risk compensation (RC), or post-vaccination increase in risk behavior, could present a major challenge. The methodology used in previous studies of risk compensation has been almost exclusively individual-level in focus, and has not explored how increased risk behavior could affect the connectivity of risk networks. This study examined the impact of anticipated HIV vaccine-related RC on the structure of high-risk drug users' sexual and injection risk network.

Methods

A sample of 433 rural drug users in the US provided data on their risk relationships (i.e., those involving recent unprotected sex and/or injection equipment sharing). Dyad-specific data were collected on likelihood of increasing/initiating risk behavior if they, their partner, or they and their partner received an HIV vaccine. Using these data and social network analysis, a "post-vaccination network" was constructed and compared to the current network on measures relevant to HIV transmission, including network size, cohesiveness (e.g., diameter, component structure, density), and centrality.

Results

Participants reported 488 risk relationships. Few reported an intention to decrease condom use or increase equipment sharing (4% and 1%, respectively). RC intent was reported in 30 existing risk relationships and vaccination was anticipated to elicit the formation of five new relationships. RC resulted in a 5% increase in risk network size (n = 142 to n = 149) and a significant increase in network density. The initiation of risk relationships resulted in the connection of otherwise disconnected network components, with the largest doubling in size from five to ten.

Conclusions

This study demonstrates a new methodological approach to studying RC and reveals that behavior change following HIV vaccination could potentially impact risk network connectivity. These data will be valuable in parameterizing future network models that can determine if network-level change precipitated by RC would appreciably impact the vaccine's population-level effectiveness.
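The network measures this study compares (density, component structure) can be computed without specialist software. A sketch in pure Python, with toy edges standing in for the reported risk relationships (the node labels and edges are hypothetical):

```python
from collections import defaultdict

def density(n_nodes, edges):
    """Density of a simple undirected network: observed ties over
    possible ties (assumes no self-loops or duplicate edges)."""
    return 2 * len(edges) / (n_nodes * (n_nodes - 1))

def components(nodes, edges):
    """Connected components found by depth-first search."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, comps = set(), []
    for start in nodes:
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adj[v] - comp)
        seen |= comp
        comps.append(comp)
    return comps

# Toy pre-vaccination network: two dyads and an isolate; one
# RC-initiated tie (2, 3) then joins two components.
pre = [(1, 2), (3, 4)]
post = pre + [(2, 3)]
```

Adding a single bridging tie reduces the number of components and doubles the size of the largest one, which is exactly the kind of connectivity change the study quantifies.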

8.
One of the first immunotoxicology studies determined that exposure of ducks to DDT reduced their resistance to a virus infection. The immunotoxic potential of insecticides and herbicides has subsequently been studied extensively in laboratory animals, driven by the global distribution and use of these chemicals. (Ten of the twelve persistent organic pollutants, identified by the United Nations Environmental Program as posing the greatest threat to humans and wildlife, are pesticides; all have been reported to alter immune function under laboratory conditions.) Nevertheless, our knowledge of the human health risks associated with pesticide use and exposure is far from complete. This paper provides a brief overview of the potential effects of chemicals on the immune system, and host factors that mitigate or exacerbate immunotoxic effects. Examples of rodent studies that exemplify categories of pesticide-induced immune system effects are then provided as an introduction to a discussion of pesticide immunotoxicity in humans.

9.
10.
Temporal variation in predation risk may fundamentally influence antipredator responses of prey animals. To maximize lifetime fitness, prey must be able to optimize energy gain and minimize predation risk, and responses to current levels of risk may be influenced by background levels of risk. A ‘risk allocation’ model has recently been proposed to predict the intensity of antipredator responses that should occur as predation risk varies over time. Prey animals from high‐risk environments should respond to predators with relatively low intensities of antipredator behaviour because long periods of antipredator behaviour may result in unacceptable decreases in levels of foraging activity. Moreover, animals that are under frequent risk should devote more energy to foraging during brief pulses of safety compared with animals under infrequent attack. In this study, we experimentally tested the risk allocation hypothesis. We exposed juvenile rainbow trout, Oncorhynchus mykiss, to three levels of risk (high, moderate and low) crossed with two levels of temporal variation (exposed to risk three times a day and once a day). In accordance with the model, we found that trout exposed to risky situations more frequently responded with significantly less intense antipredator behaviour than trout exposed to risk infrequently. The intensity of response of trout exposed to moderate risk three times a day decreased to levels similar to situations of no risk. However, in contrast to the second prediction of the model, animals under frequent risk were not more active during periods of safety compared with animals under infrequent risk. Although behaviour in the face of predation risk was dependent on the broader temporal context in which risk varied, the specific predictions of the risk allocation model were only partly supported.

11.

Background

In previous meta-analyses, aspirin use has been associated with reduced risk of colorectal cancer. However, uncertainty remains on the exact dose–risk and duration–risk relationships.

Methods

We identified studies by searching several English and Chinese electronic databases and reviewing relevant articles. The dose-response meta-analysis was performed by linear trend regression and restricted cubic spline regression. Subgroup analyses were conducted to explore possible heterogeneity among studies. Potential heterogeneity was quantified with the Q statistic and the I² value. Publication bias was evaluated using funnel plots and quantified by Begg's and Egger's tests.

Results

Twelve studies were included in this meta-analysis. An inverse association between aspirin use and colorectal cancer was observed in both the overall group (RR = 0.74, 95% CI 0.64–0.83 for aspirin dose; RR = 0.80, 95% CI 0.75–0.85 for frequency of aspirin use; RR = 0.75, 95% CI 0.68–0.81 for years of aspirin use) and subgroups stratified by sex and cancer site. The dose-response meta-analysis showed a statistically significant 20% decrease in the risk of colorectal cancer per 325 mg/day increment in aspirin dose, an 18% decrease per increment of seven uses of aspirin per week, and an 18% decrease per 10-year increment in duration of aspirin use.

Conclusion

Long-term (>5 years), low-dose (75–325 mg per day) and regular aspirin use (2–7 times per week) can effectively reduce the risk of colorectal cancer.
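The per-increment risk reductions above follow from the linear trend (log-linear) model: if risk falls 20% per 325 mg/day, the slope on the log-RR scale is ln(0.80)/325, and the relative risk for any other increment can be rescaled from it. A sketch; the 100 mg/day rescaling is an illustration, not a result from the paper:

```python
import math

def rr_per_increment(rr_ref, ref_increment, new_increment):
    """Rescale a relative risk reported per `ref_increment` of exposure
    to a different increment, assuming a log-linear dose-response."""
    beta = math.log(rr_ref) / ref_increment  # slope on the log-RR scale
    return math.exp(beta * new_increment)

# A 20% reduction per 325 mg/day implies roughly a 7% reduction
# per 100 mg/day under the same model.
rr_100 = rr_per_increment(0.80, 325, 100)
```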

12.
One hundred obstetric patients were studied for evidence of puerperal thromboembolic disease. Only one case of deep vein thrombosis was detected in patients thought to be in a “high risk” category because of age or operative intervention. Clinical findings were unreliable compared with measurements of 125I-labelled fibrinogen uptake. Doppler ultrasound flow detection proved a simple screening technique but produced no abnormal findings in this series.

13.
The World Health Organization's International Programme on Chemical Safety and international partners have developed a framework for integrated assessment of human health and ecological risks and four case studies. An international workshop was convened to consider how ecological and health risk assessments might be integrated, the benefits of and obstacles to integration, and the research and mechanisms needed to facilitate implementation of integrated risk assessment. Using the case studies, workshop participants identified a number of opportunities to integrate the assessment process. Improved assessment quality, efficiency, and predictive capability were considered to be principal benefits of integration. Obstacles to acceptance and implementation of integrated risk assessment included the disciplinary and organizational barriers between ecological and health disciplines. A variety of mechanisms were offered to overcome these obstacles. Research recommendations included harmonization of exposure characterization and surveillance methods and models, development of common risk endpoints across taxa, improved understanding of mechanisms of effect at multiple scales of biological organization, and development of methods to facilitate comparison of risks among endpoints.

14.
Understanding the link between vaccine immunogenicity and efficacy is currently a major focus in HIV research. Consequently, recent developments in the HIV-1 vaccine field have led to a closer look at immune responses to known efficacious vaccines. We undertook a study to explore clinical predictors of vaccine efficacy following recombinant hepatitis B (rHBV) vaccination in a cohort of HIV-uninfected, hepatitis B virus-naïve women living in a peri-urban setting in Cape Town. Our aim was to define host biological risk factors associated with lack of vaccine uptake. We found a significant association (p = 0.009) between body mass index (BMI) and lack of a vaccine-specific IgG titre (<10 mIU/mL). Obese individuals (BMI ≥30 kg/m²) were significantly more likely to be non-responders following two rHBV vaccine doses (adjusted odds ratio of 8.75; p = 0.043). There was no observed association between vaccine responses and age, method of contraception or time from vaccination to antibody measurement. These data suggest that obesity-associated factors interfere with vaccine immunogenicity and possibly efficacy.
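The odds ratio reported above is adjusted, coming from a multivariable model; for intuition, a crude odds ratio and its Woolf (log-scale) confidence interval can be computed from a 2×2 table. A sketch with hypothetical counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: 10/100 obese non-responders vs 5/100 non-obese
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
```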

15.
The risk of low-dose radiation exposures has – for a variety of reasons – been highly politicised. This has led to a frequently exaggerated perception of the potential health effects, and to lasting public controversies. A balanced view requires a critical reassessment of the epidemiological basis of current assumptions. There is reliable quantitative information available on the increase of cancer rates due to moderate and high doses. This provides a firm basis for the derivation of probabilities of causation, e.g. after high radiation exposures. For small doses or dose rates, the situation is entirely different: potential increases of cancer rates remain hidden below the statistical fluctuations of normal rates, and the molecular mechanisms of carcinogenesis are not sufficiently well known to allow numerical predictions. Risk coefficients for radiation protection must, therefore, be based on the uncertain extrapolation of observations obtained at moderate or high doses. While extrapolation is arbitrary, it is, nevertheless, used and mostly with the conservative assumption of a linear dose dependence with no threshold (LNT model). All risk estimates are based on this hypothesis. They are, thus, virtual guidelines, rather than firm numbers. The observations on the A-bomb survivors are still the major source of information on the health effects of comparatively small radiation doses. A fairly direct inspection of the data shows that the solid cancer mortality data of the A-bomb survivors are equally consistent with linearity in dose and with reduced effectiveness at low doses. In the leukemia data a reduction is strongly indicated. With one notable exception – leukemia after prenatal exposure – these observations are in line with a multitude of observations in groups of persons exposed for medical reasons. The low-dose effects of densely ionizing radiations – such as alpha-particles from radon decay products or high-energy neutrons – are a separate important issue.
For neutrons, there is little epidemiological information. This has facilitated exaggerated claims of high neutron effects with reference to alleged dangers from transports of reactor fuel. However, in spite of limited information, it can be shown that the data from Hiroshima exclude the stated claims. New dosimetric information on neutrons may turn out to be highly informative with regard to an upper limit for the potential effects of neutrons and equally with regard to a reassessment – and a possible reduction – of risk estimates for gamma-rays. Received: 13 November 1999 / Accepted in revised form: 13 December 1999

16.

Background

An obstetrical paradox is that maternal smoking is protective for the development of preeclampsia. However, there are no prior studies investigating the risk of preeclampsia in women who were exposed to tobacco smoking during their own fetal period. We aimed to study the subsequent risk of preeclampsia in women who were exposed to tobacco smoke in utero, using a national population-based register.

Methods

Data were obtained from the Medical Birth Register of Sweden for women who were born in 1982 (smoking data first recorded) or after, who had given birth to at least one child; 153 885 pregnancies were included.

Results

The associations between intrauterine smoking exposure (three categories: non-smokers, 1–9 cigarettes/day [moderate exposure], and >9 cigarettes/day [heavy exposure]) and subsequent preeclampsia (n = 5721) were assessed using logistic regressions. In models adjusted for maternal age, parity and own smoking, the odds ratios (OR) for preeclampsia were 1.06 [95% CI: 0.99–1.13] for moderate intrauterine exposure and 1.18 [95% CI: 1.10–1.27] for heavy exposure. Estimates were slightly strengthened in non-smoking women who experienced heavy intrauterine exposure (adjusted OR 1.24 [95% CI: 1.14–1.34]). Results were no longer statistically significant after adjustment for the woman’s own BMI, gestational age and birthweight Z-scores.

Conclusion

These data revealed some evidence of a possible weak positive association between intrauterine smoking exposure and the risk of subsequent preeclampsia; however, results were not significant across all manifestations of preeclampsia and confounder adjustments. The increased risk might be mediated through exposed women’s own BMI or birthweight.
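Odds ratios like those reported here are exponentiated logistic-regression coefficients, and the Wald interval back-transforms beta ± 1.96·SE. A sketch; the standard error below is reconstructed from the reported heavy-exposure CI purely for illustration:

```python
import math

def or_from_coef(beta, se, z=1.96):
    """Odds ratio with a Wald confidence interval, from a
    logistic-regression coefficient `beta` and its standard error."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Recover an SE consistent with the reported OR 1.18 (95% CI 1.10-1.27)
se = (math.log(1.27) - math.log(1.10)) / (2 * 1.96)
or_, lo, hi = or_from_coef(math.log(1.18), se)
```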

17.

Objectives

We tested the a priori hypothesis that the self-perceived and actual presence of risks for colorectal cancer (CRC) are associated with better knowledge of the symptoms and risk factors for CRC, respectively.

Methods

One territory-wide invitation for free CRC screening from 2008 to 2012 recruited asymptomatic screening participants aged 50–70 years in Hong Kong. They completed survey items on the self-perceived and actual presence of risks for CRC (advanced age, male gender, positive family history and smoking) as predictors, and knowledge of CRC symptoms and risk factors as outcome measures, respectively. Their associations were evaluated by binary logistic regression analyses.

Results

From 10,078 eligible participants (average age 59 years), the mean knowledge scores for symptoms and risk factors were 3.23 and 4.06, respectively (both score range 0–9). Male gender (adjusted odds ratio [AOR] = 1.34, 95% C.I. 1.20–1.50, p<0.01), self-perception as not having any risks for CRC (AOR = 1.12, 95% C.I. 1.01–1.24, p = 0.033) or uncertainty about having risks (AOR = 1.94, 95% C.I. 1.55–2.43, p<0.001), smoking (AOR 1.38, 95% C.I. 1.11–1.72, p = 0.004), and the absence of family history (AOR 0.61 to 0.78 for those with positive family history, p<0.001) were associated with poorer knowledge scores (≤4) of CRC symptoms. These factors remained significant for knowledge of risk factors.

Conclusions

Males and smokers were more likely to have poorer knowledge, but a family history of CRC was associated with better knowledge. Since screening of these higher-risk individuals could lead to a greater yield of colorectal neoplasms, educational interventions targeted at male smokers are recommended.

18.
Risk assessment of GM plants: avoiding gridlock?
Cultivation of genetically modified crops is presently based largely on four crops containing few transgenes and grown in four countries. This will soon change and pose new challenges for risk assessment. A more structured approach that is as generic as possible is advocated to study consequences of gene flow. Hazards should be precisely defined and prioritized, with emphasis on quantifying elements of exposure. This requires coordinated effort between large, multidisciplinary research teams.

19.
A flexible framework for conducting nationwide multimedia, multipathway and multireceptor risk assessments (3MRA) under uncertainty was developed to estimate protective chemical concentration limits in a source area. The framework consists of two components: risk assessment and uncertainty analysis. The risk component utilizes linked source, fate/transport, exposure and risk assessment models to estimate the risk exposures for the receptors of concern. Both human and ecological receptors are included in the risk assessment framework. The flexibility of the framework is based on its ability to address problems varying in spatial scales from site-specific to regional and even national levels; and its ability to accommodate varying types of source, fate/transport, exposure and risk assessment models. The uncertainty component of the 3MRA framework is based on a two-stage Monte Carlo methodology. It allows the calculation of uncertainty in risk estimates, and the incorporation of the effects of uncertainty on the determination of regulatory concentration limits as a function of variability and uncertainty in input data, as well as potential errors in fate and transport and risk and exposure models. The framework can be adapted to handle a wide range of multimedia risk assessment problems. Two examples are presented to illustrate its use, and to demonstrate how regulatory decisions can be structured to incorporate the uncertainty in risk estimates.
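A two-stage Monte Carlo nests a variability loop inside an uncertainty loop: each outer draw fixes the uncertain parameters, the inner draws then generate a variability distribution of risk across receptors, and the spread of any summary statistic across outer draws quantifies its uncertainty. A minimal sketch with made-up lognormal inputs; all distributions, parameters, and the chosen percentiles are hypothetical, not the 3MRA framework's actual models:

```python
import random

def two_stage_mc(n_outer=200, n_inner=500, seed=42):
    """Nested Monte Carlo: outer loop samples *uncertain* parameters
    (e.g. a poorly known transfer factor), inner loop samples
    *variability* across receptors (e.g. individual intake rates).
    Returns a 5th-95th percentile uncertainty band on the
    95th-percentile risk."""
    rng = random.Random(seed)
    p95s = []
    for _ in range(n_outer):
        transfer = rng.lognormvariate(0.0, 0.3)    # uncertain parameter
        risks = sorted(transfer * rng.lognormvariate(-1.0, 0.5)
                       for _ in range(n_inner))    # variability draws
        p95s.append(risks[int(0.95 * n_inner)])    # 95th-percentile risk
    p95s.sort()
    return p95s[int(0.05 * n_outer)], p95s[int(0.95 * n_outer)]
```

A regulator could then set a concentration limit so that even the upper end of this band stays below the acceptable risk level.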

20.
This study investigated the relationship between level of stress in middle and high school students aged 12–18 and risk of atopic dermatitis. Data from the Sixth Korea Youth Risk Behavior Web-based Survey (KYRBWS-VI), a cross-sectional study among 74,980 students in 800 middle schools and high schools with a response rate of 97.7%, were analyzed. Ordinal logistic regression analyses were conducted to determine the relationship between stress and atopic dermatitis and its severity. A total of 5,550 boys and 6,964 girls reported having been diagnosed with atopic dermatitis. Younger students were more likely to have atopic dermatitis. Interestingly, the educational level of parents was found to be associated with having atopic dermatitis and having a more severe condition. In particular, girls with mothers with at least college education had a 41% higher risk of having atopic dermatitis and a severe atopic condition (odds ratio [OR] = 1.41, 95% CI, 1.22–1.63; P<0.0001) compared with those with mothers who had attended middle school at most. A similar trend was seen among both boys and girls for their father's education level. The stress level was found to be significantly associated with the risk of atopic dermatitis. Compared to boys who reported “no stress”, boys with “very high” stress had a 46% higher risk of having more severe atopic dermatitis (OR = 1.46, 95% CI, 1.20–1.78; P<0.0001), a 44% higher risk (OR = 1.44, 95% CI, 1.19–1.73; P<0.0001) with “high” stress, and a 21% higher risk (OR = 1.21, 95% CI, 1.00–1.45; P = 0.05) with “moderate” stress. In contrast, we found no statistically significant relationship between stress and atopic dermatitis in girls. This study suggests that stress and parents' education level were associated with atopic dermatitis. Specifically, degree of stress is positively correlated with the likelihood of being diagnosed with this condition and with greater severity.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)