Similar Literature
 A total of 20 similar documents were found.
1.
The impact of new technologies on human population studies   (Total citations: 4; self-citations: 0; citations by others: 4)
Human population studies involve clinical or epidemiological observations that associate environmental exposures with health endpoints and disease. Clearly, these are the most sought after data to support assessments of human health risk from environmental exposures. However, the foundations of many health risk assessments rest on experimental studies in rodents performed at high doses that elicit adverse outcomes, such as organ toxicity or tumors. Using the results of human studies and animal data, risk assessors define the levels of environmental exposures that may lead to disease in a portion of the population. These decisions on potential health risks are frequently based on the use of default assumptions that reflect limitations in our scientific knowledge. An important immediate goal of toxicogenomics, including proteomics and metabonomics, is to offer the possibility of making decisions affecting public health and public policy based on detailed toxicity, mechanistic, and exposure data in which many of the uncertainties have been eliminated. Ultimately, these global technologies will dramatically impact the practice of public health and risk assessment as applied to environmental health protection. The impact is already being felt in the practice of toxicology, where animal experimentation using highly controlled dose-time parameters is possible. It is also being seen in human population studies, where understanding human genetic variation and genomic reactions to specific environmental exposures is enhancing our ability to uncover the causes of variations in human response to environmental exposures. These new disciplines hold the promise of reducing the costs and timelines associated with animal and human studies designed to assess both the toxicity of environmental pollutants and the efficacy of therapeutic drugs. However, as with any new science, experience must be gained before the promise can be fulfilled. Given the numbers and diversity of drugs, chemicals, and environmental agents; the various species in which they are studied; and the time and dose factors that are critical to the induction of beneficial and adverse effects, it is only through the development of a profound knowledge base that toxicology and environmental health can rapidly advance. The National Institute of Environmental Health Sciences (NIEHS) National Center for Toxicogenomics, its university-based Toxicogenomics Research Consortium (TRC), and resource contracts are engaged in the development, application and standardization of the science upon which to build such a knowledge base on Chemical Effects in Biological Systems (CEBS). In addition, the NIEHS Environmental Genome Project (EGP) is working to systematically identify and characterize common sequence polymorphisms in many genes with suspected roles in determining chemical sensitivity. The rationale of the EGP is that certain genes have a greater than average influence over human susceptibility to environmental agents. If we identify and characterize the polymorphisms in those genes, we will increase our understanding of human disease susceptibility. This knowledge can be used to protect susceptible individuals from disease and to reduce adverse exposure and environmentally induced disease.

2.

Background  

Nuclear magnetic resonance spectroscopy is one of the primary tools in metabolomics analyses, where it is used to track and quantify changes in metabolite concentrations or profiles in response to perturbation through disease, toxicants or drugs. The spectra generated through such analyses are typically confounded by noise of various types, obscuring the signals and hindering downstream statistical analysis. Such issues are becoming increasingly significant as greater numbers of large-scale systems or longitudinal studies are being performed, in which many spectra from different conditions need to be compared simultaneously.

3.
Human health risks from occupational exposures are managed by limiting exposures to acceptable levels established by the American Conference of Governmental Industrial Hygienists or another similar body. Acceptable environmental exposures are benchmarked by values such as U.S. Environmental Protection Agency's Reference Doses and Reference Concentrations. The approaches to establishing these values are different, as are the groups they are intended to protect, complicating direct comparisons. Occupational limits are based on a healthy workforce in a narrow age range and do not generally consider sensitive populations. Limits for environmental exposures consider sensitive populations. In this evaluation, physiologically based pharmacokinetic modeling was used to predict tissue doses from acceptable/safe exposures as established by different organizations and agencies. Internal doses calculated for an agency's acceptable/safe exposures via oral and inhalation routes may differ substantially, but are sometimes in excellent agreement. The finding that internal doses resulting from occupational exposures are almost uniformly greater than those from environmental exposures suggests different mindsets among these groups regarding how safe is “safe.”
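As a rough illustration of the comparison described above (not the study's PBPK models), the sketch below converts a hypothetical occupational air limit and an oral reference dose into steady-state internal (blood) doses using a one-compartment model; every parameter name and value is a placeholder.

```python
# Illustrative one-compartment sketch, not the study's PBPK models.
# All parameter names and values below are hypothetical placeholders.

def internal_dose_inhalation(air_conc_mg_m3, vent_rate_m3_day, absorbed_frac, clearance_L_day):
    """Steady-state blood concentration (mg/L) from continuous inhalation."""
    intake_mg_day = air_conc_mg_m3 * vent_rate_m3_day * absorbed_frac
    return intake_mg_day / clearance_L_day

def internal_dose_oral(rfd_mg_kg_day, body_weight_kg, absorbed_frac, clearance_L_day):
    """Steady-state blood concentration (mg/L) from chronic oral intake at the RfD."""
    intake_mg_day = rfd_mg_kg_day * body_weight_kg * absorbed_frac
    return intake_mg_day / clearance_L_day

# Hypothetical occupational limit of 1 mg/m3 (8 h/day, 5 days/week) scaled to a
# continuous-equivalent concentration, versus a hypothetical RfD of 0.01 mg/kg-day.
occ = internal_dose_inhalation(1.0 * (8 / 24) * (5 / 7), vent_rate_m3_day=20,
                               absorbed_frac=0.5, clearance_L_day=100)
env = internal_dose_oral(0.01, body_weight_kg=70, absorbed_frac=1.0, clearance_L_day=100)
print(f"occupational internal dose ≈ {occ:.3g} mg/L")
print(f"environmental internal dose ≈ {env:.3g} mg/L (ratio ≈ {occ / env:.1f})")
```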

4.
5.
Diet is considered one of the most important modifiable factors influencing human health, but efforts to identify foods or dietary patterns associated with health outcomes often suffer from biases, confounding, and reverse causation. Applying Mendelian randomization (MR) in this context may provide evidence to strengthen causality in nutrition research. To this end, we first identified 283 genetic markers associated with dietary intake in 445,779 UK Biobank participants. We then converted these associations into direct genetic effects on food exposures by adjusting them for effects mediated via other traits. The SNPs which did not show evidence of mediation were then used for MR, assessing the association between genetically predicted food choices and other risk factors and health outcomes. We show that using all associated SNPs, without omitting those which show evidence of mediation, leads to biases in downstream analyses (genetic correlations, causal inference) similar to those present in observational studies. However, MR analyses using only SNPs with a direct effect on food exposures provided unequivocal evidence of causal associations between specific eating patterns and obesity, blood lipid status, and several other risk factors and health outcomes.
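A minimal sketch of a summary-statistic Mendelian randomization estimate of the kind described above, assuming the instruments have already been restricted to SNPs with a direct (unmediated) effect on the food exposure. The inverse-variance-weighted estimator shown here is a standard MR method, not necessarily the exact pipeline used in the study, and all numbers are invented for illustration.

```python
# Minimal IVW Mendelian randomization sketch from summary statistics,
# assuming instruments are already filtered to direct-effect SNPs.
import numpy as np

def ivw_mr(beta_exposure, beta_outcome, se_outcome):
    """IVW causal estimate: weighted regression of SNP-outcome on SNP-exposure
    effects through the origin, with weights 1 / se_outcome**2."""
    bx = np.asarray(beta_exposure, dtype=float)
    by = np.asarray(beta_outcome, dtype=float)
    w = 1.0 / np.asarray(se_outcome, dtype=float) ** 2
    estimate = np.sum(w * bx * by) / np.sum(w * bx ** 2)
    se = np.sqrt(1.0 / np.sum(w * bx ** 2))
    return estimate, se

# Hypothetical summary statistics for three instruments
beta_x = [0.05, 0.08, 0.03]     # SNP effects on the food exposure
beta_y = [0.010, 0.018, 0.007]  # SNP effects on the outcome (e.g. BMI)
se_y = [0.004, 0.005, 0.003]
est, se = ivw_mr(beta_x, beta_y, se_y)
print(f"IVW causal estimate = {est:.3f} (SE {se:.3f})")
```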

6.
The principle of biodosimetry is to utilize changes induced in the individual by ionizing radiation to estimate the dose and, if possible, to predict or reflect the clinically relevant response, i.e., the biological consequences of the dose. Ideally, the changes should be specific for ionizing radiation, and the response should be unaffected by prior medical or physiological variations among subjects, including changes that might be caused by the stress and trauma from a radiation event. There are two basic types of biodosimetry with different and often complementary characteristics: those based on changes in biological parameters such as gene activation or chromosomal abnormalities and those based on physical changes in tissues (detected by techniques such as EPR). In this paper, we consider the applicability of the various techniques for different scenarios: small- and large-scale exposures to levels of radiation that could lead to the acute radiation syndrome and exposures with lower doses that do not need immediate care, but should be followed for evidence of long-term consequences. The development of biodosimetry has been especially stimulated by the needs after a large-scale event where it is essential to have a means to identify those individuals who would benefit from being brought into the medical care system. Analyses of the conventional methods officially recommended for responding to such events indicate that these methods are unlikely to achieve the results needed for timely triage of thousands of victims. Emerging biodosimetric methods can fill this critically important gap.  相似文献   

7.
Preston RJ. Mutation Research 2003, 543(2):121-124
In trying to decide what type of scientific paper I could prepare as a tribute to Jim Neel, I thought back over the discussions that we had over some 25 years. Sometimes these discussions were on specific topics, such as how to extrapolate from mutation data in mice to those for humans following radiation or chemical exposures. On other occasions, our discussions were of a more philosophical nature, particularly on where the field of epidemiology might go or needed to go. For example, what types of data are needed for assessing the public health impact of exposure to environmental agents? Perhaps because I enjoyed these discussions so much, I have chosen to take a look, from a current perspective, at the field of molecular epidemiology. Jim Neel would have loved to have entered into this discussion; he would have enhanced it in his own inimitable way.

8.
The present study was conducted to determine whether adolescents and/or the elderly are more sensitive to mobile phone (MP)‐related bioeffects than young adults, and to determine this for both 2nd generation (2G) GSM and 3rd generation (3G) W‐CDMA exposures. To test this, resting alpha activity (8–12 Hz band of the electroencephalogram) was assessed because numerous studies have now reported it to be enhanced by MP exposure. Forty‐one 13–15 year olds, forty‐two 19–40 year olds, and twenty 55–70 year olds were tested using a double‐blind crossover design, where each participant received Sham, 2G and 3G exposures, separated by at least 4 days. Alpha activity during exposure, relative to baseline, was recorded and compared between conditions. Consistent with previous research, the young adults' alpha was greater in the 2G compared to the Sham condition; however, no effect was seen in the adolescent or the elderly groups, and no effect of 3G exposures was found in any group. The results provide further support for an effect of 2G exposures on resting alpha activity in young adults, but fail to support a similar enhancement in adolescents or the elderly, or in any age group as a function of 3G exposure. Bioelectromagnetics 31:434–444, 2010. © 2010 Wiley‐Liss, Inc.

9.
We evaluate risk drivers at selected U.S. Army installations by developing a database containing contaminant-pathway-receptor combinations that exceed regulatory thresholds for ecological risk (toxicity quotient greater than one), human health cancer risk (predicted incremental lifetime cancer risk greater than one in ten thousand), and noncancer human health hazard (hazard index greater than one). We compare the risk drivers from the database to reported corrective action objectives from available decision documents. For noncancer hazards, explosives (particularly in ground water) dominate the reported exceedances of regulatory thresholds in the database. PAHs in home-grown produce show the highest number of exceedances of regulatory thresholds for cancer risk. For ecological risks, PAHs in both terrestrial and aquatic environments dominate the exceedances of regulatory thresholds. All available cleanup levels were derived based on human health exposures rather than ecological exposures, except for one site. In general, ecological risks were considered to be “more uncertain,” and this was used as a basis for not relying on target levels backcalculated from ecological risk. The reverse was true for human health risks: the “conservative” assumptions incorporated into the modeling provided the justification for backcalculating health-protective target levels.
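For readers unfamiliar with the three screening metrics, the sketch below shows the standard ratio arithmetic behind them (noncancer hazard quotient, incremental lifetime cancer risk, ecological toxicity quotient). The values are hypothetical and the functions are not drawn from the Army database or its site-specific models.

```python
# Illustrative sketch of the screening arithmetic behind the three thresholds
# used in the database. All inputs are hypothetical placeholder values.

def hazard_quotient(dose_mg_kg_day, rfd_mg_kg_day):
    """Noncancer hazard quotient; values > 1 exceed the screening threshold."""
    return dose_mg_kg_day / rfd_mg_kg_day

def cancer_risk(dose_mg_kg_day, slope_factor_per_mg_kg_day):
    """Incremental lifetime cancer risk; > 1e-4 exceeds the threshold used here."""
    return dose_mg_kg_day * slope_factor_per_mg_kg_day

def toxicity_quotient(exposure_conc, ecological_benchmark):
    """Ecological toxicity quotient; values > 1 exceed the screening threshold."""
    return exposure_conc / ecological_benchmark

print(hazard_quotient(0.02, 0.005))   # 4.0   -> noncancer exceedance
print(cancer_risk(5e-4, 0.5))         # 2.5e-4 -> cancer risk exceedance
print(toxicity_quotient(12.0, 4.0))   # 3.0   -> ecological exceedance
```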

10.
Deep pressure sores (DPS) are associated with inadequate soft tissue perfusion and excessive tissue deformation over critical time durations, as well as with ischemia-reperfusion cycles and deficiency of the lymphatic system. Muscle tissue shows the lowest tolerance to pressure injuries, compared with more superficial tissues. In this communication, we present new histopathology data for muscle tissue of albino (Sprague-Dawley) rats exposed to pressures for 15 or 30 min. These data are superimposed with an extensive literature review of all previous histopathology reported for albino rat skeletal muscles subjected to pressure. The pooled data enabled a new mathematical characterization of the pressure-time threshold for cell death in striated muscle of rats, in the form of a sigmoid pressure-time relation, which extends the previous pressure-time relation to shorter exposure periods. We found that for pressure exposures shorter than 1 h, the magnitude of pressure is the important factor for causing cell death and the exposure time has little or no effect: even relatively short exposures (15 min to 1 h) to pressures greater than 32 kPa (240 mmHg) cause cell death in rat muscle tissue. For exposures of 2 h or over, again the magnitude of pressure is the important factor for causing cell death: pressures greater than 9 kPa (67 mmHg) applied for over 2 h consistently cause muscle cell death. For intermediate exposures (between 1 and 2 h), the magnitude of cell-death-causing pressure strongly depends on the time of exposure, i.e., critical pressure levels drop from 32 to 9 kPa. The present sigmoidal pressure-time cell death threshold is useful for the design of studies in albino rat models of DPS, and may also be helpful in numerical simulations of DPS development, where there is often a need to extrapolate from tissue pressures to biological damage.
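A sigmoid pressure-time threshold of this kind can be written as P(t) = K / (1 + exp(alpha*(t - t0))) + C, with the upper plateau near 32 kPa for short exposures and the lower plateau near 9 kPa for long exposures. The sketch below implements this generic form with illustrative parameter values chosen only to reproduce the plateaus quoted in the abstract; they are not the fitted coefficients from the study.

```python
# Illustrative sigmoid pressure-time injury threshold:
#   P(t) = K / (1 + exp(alpha * (t - t0))) + C
# Parameters below are illustrative, chosen to give ~32 kPa for short
# exposures and ~9 kPa beyond about 2 h; they are not the study's fit.
import math

def pressure_threshold_kpa(t_min, K=23.0, C=9.0, alpha=0.15, t0=95.0):
    """Critical pressure (kPa) above which cell death is expected at time t (min).
    Short times -> ~K + C = 32 kPa; long times -> ~C = 9 kPa."""
    return K / (1.0 + math.exp(alpha * (t_min - t0))) + C

for t in (15, 30, 60, 90, 120, 240):
    print(f"t = {t:3d} min: threshold ≈ {pressure_threshold_kpa(t):5.1f} kPa")
```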

11.
The military and civilian nuclear activities in the former Soviet Union led to unique exposures and resulted in high cumulative doses in several populations. In comparison to the atomic bomb survivors, at present the most important cohort in radiation epidemiology, collective and individual doses received by early workers in the plutonium production facilities at Mayak (Chelyabinsk), Techa River residents downstream of Mayak, populations downwind of the Semipalatinsk test site, and subpopulations of Chernobyl victims surpass the Hiroshima/Nagasaki experience in most cases. Even more importantly, the dose rates cover the full range of exposures relevant for radiation protection, i.e., acute to year-long chronic exposures from environmental contamination and bone seeking radionuclides. Parallel to the humanitarian need to mitigate health effects from these exposures, the unique opportunities for research on radiation risks related to low dose rate and chronic radiation have to be explored. Increased efforts by the global radiation research community are needed to address the many questions which cannot be answered by the acutely irradiated survivors of Hiroshima/Nagasaki. Specific attention needs to be drawn to the validation of available exposure and health records and to dose reconstruction which must include dietary sources of exposure. Preliminary intercomparison and validation exercises indicate potentially large sources of error, e.g., due to uncertainties in the reconstruction of early exposures and effects and due to continuing incorporation.

12.
The exposome is defined as “the totality of environmental exposures encountered from birth to death” and was developed to address the need for comprehensive environmental exposure assessment to better understand disease etiology. Due to the complexity of the exposome, significant efforts have been made to develop technologies for longitudinal, internal and external exposure monitoring, and bioinformatics to integrate and analyze the datasets generated. Our objective was to bring together leaders in the field of exposomics at a recent symposium, “Lifetime Exposures and Human Health: The Exposome,” held at the Yale School of Public Health, and to highlight the most recent technological advancements for measurement of the exposome, bioinformatics development, current limitations, and future needs in environmental health. In the discussions, an emphasis was placed on moving away from a one-chemical, one-health-outcome model toward a new paradigm of monitoring the totality of exposures that individuals may experience over their lifetime. This is critical to better understand the underlying biological impact on human health, particularly during windows of susceptibility. Recent advancements in metabolomics and bioinformatics are driving the field forward in biomonitoring and understanding the biological impact, and the technological and logistical challenges involved in the analyses were highlighted. In conclusion, further developments and support are needed for large-scale biomonitoring and management of big data, standardization for exposure and data analyses, bioinformatics tools for co-exposure or mixture analyses, and methods for data sharing.

13.
There is a paucity of information regarding the long-term health effects associated with exposure to static magnetic fields. Perceptual and other acute effects have been demonstrated above a threshold of about 2 T, and these form the basis for human exposure standards at present. Exposures well above this threshold are becoming increasingly common as the technology associated with magnetic resonance imaging advances. Therefore, priority should be given to assessing the health risks associated with exposures to such fields. Studies should include a prospective cohort study investigating cancer risks of workers and patients exposed to fields in excess of 2 T, a study investigating effects on human cognitive performance from repeated exposures, and a molecular biology study investigating acute changes in genomic responses in volunteers exposed to fields of up to 8 T. Animal studies investigating the effects of long-term exposure on cancer and on neurobehavioural development are also recommended, and the use of transgenic models is encouraged. In addition, dosimetric studies should be conducted using high-resolution male, female and pregnant voxel phantoms, as should theoretical studies investigating the local currents induced in the eye and in the heart by movement during exposure. Finally, studies are recommended to investigate further the ability of static magnetic fields to significantly affect radical pair reactions in biological systems.

14.
Survival of migrating salmon smolts in large rivers with and without dams   (Total citations: 1; self-citations: 0; citations by others: 1)
The mortality of salmon smolts during their migration out of freshwater and into the ocean has been difficult to measure. In the Columbia River, which has an extensive network of hydroelectric dams, the decline in abundance of adult salmon returning from the ocean since the late 1970s has been ascribed in large measure to the presence of the dams, although the completion of the hydropower system occurred at the same time as large-scale shifts in ocean climate, as measured by climate indices such as the Pacific Decadal Oscillation. We measured the survival of salmon smolts during their migration to sea using elements of the large-scale acoustic telemetry system, the Pacific Ocean Shelf Tracking (POST) array. Survival measurements using acoustic tags were comparable to those obtained independently using the Passive Integrated Transponder (PIT) tag system, which is operational at Columbia and Snake River dams. Because the technology underlying the POST array works in both freshwater and the ocean, it is therefore possible to extend the measurement of survival to large rivers lacking dams, such as the Fraser, and to the lower Columbia River and estuary, where there are no dams. Of particular note, survival during the downstream migration of at least some endangered Columbia and Snake River Chinook and steelhead stocks appears to be as high as or higher than that of the same species migrating out of the Fraser River in Canada, which lacks dams. Equally surprising, smolt survival during migration through the hydrosystem, when scaled by either the time or distance migrated, is higher than in the lower Columbia River and estuary, where dams are absent. Our results raise important questions regarding the factors that are preventing the recovery of salmon stocks in the Columbia and the future health of stocks in the Fraser River.
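One way to make the time- and distance-scaled comparison described above concrete is to convert a segment survival estimate into per-kilometre or per-day survival. The sketch below uses entirely hypothetical numbers and is not drawn from the POST or PIT-tag analyses.

```python
# Illustrative scaling of segment survival by migration distance or travel
# time so that reaches of different lengths can be compared. Numbers are
# hypothetical placeholders, not estimates from the study.

def survival_per_km(segment_survival, segment_length_km):
    return segment_survival ** (1.0 / segment_length_km)

def survival_per_day(segment_survival, travel_time_days):
    return segment_survival ** (1.0 / travel_time_days)

# Hypothetical example: 70% survival over a 400 km dammed reach travelled in
# 20 days versus 85% survival over a 150 km undammed reach travelled in 5 days.
print(survival_per_km(0.70, 400), survival_per_km(0.85, 150))
print(survival_per_day(0.70, 20), survival_per_day(0.85, 5))
```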

15.
16.
Radon is a ubiquitous natural carcinogen derived from the three primordial radionuclides of the uranium series (238U and 235U) and the thorium series (232Th). In general, it is present at very low concentrations in the outdoor or indoor environment, but a number of scenarios can give rise to significant radiological exposures. Historically, these scenarios were not recognised, and it took many centuries to understand the links between the complex behaviour of radon and its decay progeny and health risks such as lung cancer. However, in concert with the rapid evolution of the related sciences of nuclear physics and radiological health in the first half of the twentieth century, a more comprehensive understanding of the links between radon, its progeny and health impacts such as lung cancer has evolved. It is clear from uranium miner studies that acute occupational exposures lead to significant increases in cancer risk, but chronic or sub-chronic exposures, such as those in indoor residential settings, while suggestive of health risks, still entail various uncertainties. At present, prominent groups such as the BEIR or UNSCEAR committees argue that the ‘linear no threshold’ (LNT) model is the most appropriate model for radiation exposure management, based on their detailed review and analysis of uranium miner, residential, cellular and molecular studies. The LNT model implies that any additional or excess exposure to radon and its progeny increases overall risks such as lung cancer. A variety of engineering approaches are available to address radon exposure problems. Where high-radon scenarios are encountered, such as uranium mining, the most cost-effective approach is a well-engineered ventilation system. For residential radon problems, various options can be assessed, including building design and passive or active ventilation systems. This paper presents a very broad but thorough review of radon sources and behaviour (especially the importance of its radioactive decay progeny), common mining and non-mining scenarios which can give rise to significant radon and progeny exposures, followed by a review of associated health impacts, culminating in typical engineering approaches to reduce exposures and rehabilitate wastes.
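A minimal sketch of the LNT interpretation mentioned above: excess relative risk assumed proportional to long-term radon concentration, with no dose below which the added risk is zero. The risk coefficient used here is a hypothetical placeholder, not a value recommended by BEIR or UNSCEAR.

```python
# Minimal sketch of the linear no-threshold (LNT) form: excess relative risk
# proportional to long-term radon concentration. The coefficient below is a
# hypothetical placeholder for illustration only.

def excess_relative_risk(radon_bq_m3, err_per_100_bq_m3=0.1):
    """LNT form: ERR = beta * concentration (no threshold)."""
    return err_per_100_bq_m3 * (radon_bq_m3 / 100.0)

for c in (0, 50, 100, 200, 400):
    print(f"{c:4d} Bq/m3 -> excess relative risk ≈ {excess_relative_risk(c):.2f}")
```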

17.
Epidemiologic studies can play a central role in risk assessments. They are used in all risk assessment phases: hazard identification, dose-response assessment, and exposure assessment. Epidemiologic studies have often been the first to show that a particular environmental exposure is a hazard to health. They have numerous advantages with respect to other sources of data used in risk assessments, the most important being that they do not require the assumption that findings are generalizable from animals to humans. For this reason, fewer and lower uncertainty factors may be appropriate in risk characterization based on epidemiologic studies. Unfortunately, epidemiologic studies also have numerous problems, the most important being that the exposures are often not precisely measured. This article presents in detail the advantages of and problems with epidemiologic studies. It discusses two approaches to ensure their usefulness: biomarkers, and an ordinance which requires baseline and subsequent surveillance of possible exposures and health effects from newly sited, potentially polluting facilities. Biomarkers are biochemical measures of exposure, susceptibility factors, or preclinical pathological changes. Biomarkers are a way of dealing with the problems of poor exposure measures, differential susceptibility, and lack of early measures of disease occurrence that are inherent in many environmental epidemiologic studies. The advantage of biomarkers is that they can provide objective information on exposure days, months or even years later, and evidence of pathology perhaps years earlier. The ordinance makes possible the use of a powerful epidemiologic study design, the prospective cohort study, in which confounders are best measured and exposures, pathological changes, and health effects can be detected as soon as possible.

18.
Szalai G, Xie D, Wassenich M, Veres M, Ceci JD, Dewey MJ, Molotkov A, Duester G, Felder MR. Gene 2002, 291(1-2):259-270
Mouse alcohol dehydrogenase 1 (Adh1) gene expression occurs at high levels in liver and adrenal, moderate levels in kidney and intestine, and low levels in a number of other tissues, and is undetectable in thymus, spleen and brain by Northern analysis. In transgenic mice, a minigene construct containing 10 kb of upstream and 1.5 kb of downstream flanking sequence directs expression in kidney, adrenal, lung, epididymis, ovary and skin but promotes ectopic expression in thymus and spleen while failing to control expression in liver, eye, intestine and seminal vesicle. Cosmids containing either 7 kb of upstream and 21 kb of downstream or 12 kb of upstream and 23 kb of downstream sequence flanking genetically marked Adh1 additionally promote seminal vesicle expression, suggesting that downstream or intragenic sequence controls expression in this tissue. However, expression in liver, adrenal, or intestine is not promoted. The Adh1(a) allele on the cosmid expresses an enzyme electrophoretically distinct from that of the endogenous Adh1(b) allele, and the presence of the heterodimeric enzyme in expressing tissues confirms that transgene activity occurs in the same cell type as the endogenous gene. Transgene expression levels promoted by cosmids were at physiologically relevant amounts and exhibited greater copy-number dependence than observed with minigenes. Transgene mRNA expression correlated with expression measured at the enzyme level. A bacterial artificial chromosome containing 110 kb of 5'- and 104 kb of 3'-flanking sequence surrounding the Adh1 gene promoted expression in tissues at levels comparable to the endogenous gene, most importantly including liver, adrenal and intestinal tissue where high-level Adh1 expression occurs. Transgene expression in liver was in the same cell types as promoted by the endogenous gene. Although proximal elements extending 12 kb upstream and 23 kb downstream of the Adh1 gene promote expression at physiologically relevant levels in most tissues, more distal elements are additionally required to promote normal expression levels in liver, adrenal and intestinal tissue where Adh1 is most highly expressed.

19.
Validated biological monitoring methods are used in large-scale monitoring programmes involving the determination of ubiquitous environmental pollutants such as metals and pesticides. Some programmes focus on children's exposure and on policies to prevent adverse health effects. Most of these initiatives are aimed at characterizing trends, while some are designed to investigate the role of certain exposures in disease. Fewer new biological monitoring methods were presented at the present meeting than at previous meetings; all of the new methods used mass spectrometry-based detection and quantification. There is increasing use of biomarkers to study genetic polymorphisms of enzyme systems involved in toxification pathways, metabolite conjugation, and DNA repair. At the meeting, a discussion was started that could lead to further harmonization of the scientific foundations of biological monitoring in occupational health, with possible value also for applications in the field of environmental health.

20.
Cumulative effect in social contagion underlies many studies on the spread of innovation, behavior, and influence. However, few large-scale empirical studies have been conducted to validate the existence of the cumulative effect in information diffusion on social networks. In this paper, using a population-scale dataset from the largest Chinese microblogging website, we conduct a comprehensive study of the cumulative effect in information diffusion. We base our study on the diffusion network of each message, where nodes are the involved users and links characterize forwarding relationships among them. We find that multiple exposures to the same message indeed increase the possibility of forwarding it. However, additional exposures cannot further improve the chance of forwarding once the number of exposures crosses its peak at two. This finding questions the cumulative effect hypothesis in information diffusion. Furthermore, to clarify the forwarding preference among users, we investigate both structural motifs in the diffusion network and temporal patterns in the information diffusion process. Our findings provide insights for understanding the variation in message popularity and explain the characteristics of the diffusion network.
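A minimal sketch of the quantity examined above, the forwarding probability as a function of the number of exposures to a message, estimated from per-user exposure records; the data and function are illustrative, not the paper's code.

```python
# Illustrative estimate of forwarding probability by exposure count from
# records of (number_of_exposures, forwarded_or_not). Data are made up.
from collections import defaultdict

def forwarding_probability(records):
    """records: iterable of (n_exposures, forwarded) with forwarded in {0, 1}."""
    seen = defaultdict(int)
    fwd = defaultdict(int)
    for n, forwarded in records:
        seen[n] += 1
        fwd[n] += forwarded
    return {n: fwd[n] / seen[n] for n in sorted(seen)}

records = [(1, 0), (1, 0), (1, 1), (2, 1), (2, 1), (2, 0),
           (3, 0), (3, 1), (3, 0), (4, 0), (4, 0), (4, 0)]
for n, p in forwarding_probability(records).items():
    print(f"exposures = {n}: P(forward) = {p:.2f}")
```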

