Similar Articles
20 similar articles found (search time: 31 ms)
1.
We examine the joint distribution of income per capita, life expectancy, and years of schooling across countries in 1960 and in 2000. In 1960, countries were clustered in two groups: a rich, highly educated, high-longevity “developed” group and a poor, less educated, high-mortality “underdeveloped” group. By 2000, however, three groups had emerged: an underdeveloped group remaining near 1960 levels, a developed group with higher levels of education, income, and health than in 1960, and an intermediate group lying between these two. This finding is consistent both with the idea of a new “middle income trap” that countries face even if they escape the “low income trap” and with the notion that countries which escaped the poverty trap form a temporary “transition regime” along their path to the “developed” group.
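As a rough illustration of the kind of grouping described in this abstract, the sketch below fits Gaussian mixture models to synthetic country-level data (log income, life expectancy, years of schooling) and chooses the number of groups by BIC. The data and the scikit-learn-based approach are assumptions for illustration only, not the authors' method.

```python
# Illustrative sketch (not the authors' code): clustering countries in
# (log income, life expectancy, years of schooling) space with Gaussian
# mixture models, choosing the number of groups by BIC.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical data standing in for country observations:
# columns = log GDP per capita, life expectancy, mean years of schooling.
poor = rng.normal([7.0, 45.0, 2.0], [0.4, 4.0, 1.0], size=(60, 3))
mid = rng.normal([8.5, 65.0, 6.0], [0.4, 4.0, 1.5], size=(40, 3))
rich = rng.normal([10.0, 75.0, 11.0], [0.3, 2.0, 1.5], size=(40, 3))
X = np.vstack([poor, mid, rich])

# Fit mixtures with 1-4 components and keep the one with the lowest BIC.
fits = {k: GaussianMixture(n_components=k, n_init=5, random_state=0).fit(X)
        for k in range(1, 5)}
best_k = min(fits, key=lambda k: fits[k].bic(X))
labels = fits[best_k].predict(X)

print("components chosen by BIC:", best_k)
for k in range(best_k):
    print(f"group {k}: n={np.sum(labels == k)}, means={fits[best_k].means_[k].round(1)}")
```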

2.
Background: During 2017, twenty health districts (locations) in Mexico implemented a dengue outbreak Early Warning and Response System (EWARS), which processes epidemiological, meteorological and entomological alarm indicators to predict dengue outbreaks and trigger early response activities. Of the 20 priority districts, where more than one fifth of all national disease transmission in Mexico occurs, eleven were purposely selected and analyzed. Nine districts presented EWARS outbreak alarms without subsequent outbreaks (“non-outbreak districts”) and two presented alarms with subsequent dengue outbreaks (“outbreak districts”). This evaluation study assesses and compares the impact of alarm-informed response activities, and the consequences of failing to mount a timely and adequate response, across the two groups. Methods: Five indicators of dengue outbreak response (larval control, entomological studies with water container interventions, focal spraying and indoor residual spraying) were quantitatively analyzed across the two groups (“outbreak districts” and “non-outbreak districts”). For quality control purposes, only qualitative concluding remarks were derived from the fifth response indicator (fogging). Results: The average coverage of vector control responses was significantly higher in non-outbreak districts across all four indicators. In the “outbreak districts” the response activities started late and were of much lower intensity than in the “non-outbreak districts”. Vector control teams at district level showed diverse levels of compliance with local guidelines for ‘initial’, ‘early’ and ‘late’ responses to outbreak alarms, which could potentially explain the different outcomes observed following the alarms. Conclusion: Failure to respond to EWARS alarm signals in a timely and adequate manner was shown to negatively affect outbreak control, whereas districts with adequate and timely alarm-guided responses demonstrated successful outbreak prevention. This study presents important operational scenarios of failed and successful use of EWARS, but the effectiveness and cost-effectiveness of EWARS warrant investigation with more robust designs.

3.
Background: The diagnosis of tuberculosis (TB) in young children can be challenging, especially in severely malnourished children, and there is a critical need for improved diagnostics for children. We therefore evaluated the performance of a technique that measures antibodies in lymphocyte supernatant (ALS) for the diagnosis of TB in severely malnourished children presenting with suspected pneumonia. Methods: Children under 5 years of age with severe acute malnutrition and radiological features of pneumonia admitted to the Dhaka Hospital of the International Centre for Diarrhoeal Disease Research, Bangladesh, were enrolled consecutively following informed written consent. In addition to clinical and radiological assessment, gastric lavage fluid and induced sputum were collected for microbiological confirmation of TB. ALS was measured from venous blood, and results were evaluated in children classified as “confirmed TB”, “non-confirmed TB” or “not TB”. Results: Among 224 children who had ALS analysis, 12 (5.4%) had microbiologically “confirmed TB”, a further 41 (18%) had clinically diagnosed “non-confirmed TB”, and the remaining 168 (75%) were considered not to have TB. ALS was positive in 89 (40%) children and negative in 85 (39%), with a large number (47, or 21%) reported as “borderline”. These proportions were similar across the three diagnostic groups. The sensitivity and specificity of ALS when comparing “confirmed TB” to “not TB” were only 67% (95% CI: 31–91%) and 51% (95% CI: 42–60%), respectively.

Conclusions and Significance

Our data suggest that ALS is not sufficiently accurate to improve the diagnosis of TB in children with severe malnutrition.
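For readers unfamiliar with how the diagnostic-accuracy figures above are obtained, here is a minimal sketch computing sensitivity and specificity with Wilson 95% confidence intervals from a 2x2 table. The counts are hypothetical (chosen only to be roughly consistent with the reported 67% and 51%), and availability of `statsmodels` is an assumption.

```python
# Minimal sketch (assumed counts, not the study's raw 2x2 table): sensitivity and
# specificity with Wilson 95% confidence intervals, the kind of summary reported
# above for ALS versus culture-confirmed TB.
from statsmodels.stats.proportion import proportion_confint

def sens_spec(tp, fn, tn, fp, alpha=0.05):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    sens_ci = proportion_confint(tp, tp + fn, alpha=alpha, method="wilson")
    spec_ci = proportion_confint(tn, tn + fp, alpha=alpha, method="wilson")
    return sens, sens_ci, spec, spec_ci

# Hypothetical counts, roughly consistent with 67% sensitivity and 51% specificity
# (borderline ALS results excluded).
sens, sens_ci, spec, spec_ci = sens_spec(tp=8, fn=4, tn=65, fp=63)
print(f"sensitivity {sens:.0%} (95% CI {sens_ci[0]:.0%}-{sens_ci[1]:.0%})")
print(f"specificity {spec:.0%} (95% CI {spec_ci[0]:.0%}-{spec_ci[1]:.0%})")
```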

4.
Androgen function was studied in twenty-five physically healthy “primarily” impotent males classified on clinical criteria into “psychogenic” or “constitutional” groups. The mean urinary testosterone level in the former group was significantly higher than in the latter (P<0.005). Important variables significantly associated with higher urinary testosterone levels (P<0.05) were (a) “late onset” impotence, (b) a duration of less than two years, (c) stronger “sex drive,” and (d) an alternative sexual outlet to orgasm and ejaculation in the three months preceding referral; the last-mentioned appeared to be the single most important discriminatory feature. It is suggested that testosterone excretion patterns—namely, high, average, and low—may be one method of classifying impotence.

5.
Background: Computer-aided detection to identify and diagnose pulmonary tuberculosis is being explored. Although both cavitation on chest radiograph and smear-positivity on microscopy are independent risk factors for the infectiousness of pulmonary tuberculosis, it is unknown which radiographic pattern, were it detectable, would provide the greatest public health benefit, i.e. reduced transmission. Herein we provide that evidence. Objectives: 1) To determine whether pulmonary tuberculosis in a high-income, low-incidence country is more likely to present with “typical” adult-type pulmonary tuberculosis radiographic features, and 2) to determine whether those with “typical” radiographic features are more likely than those without such features to transmit the organism and/or cause secondary cases. Methods: Over a three-year period beginning January 1, 2006, consecutive adults with smear-positive pulmonary tuberculosis in the Province of Alberta, Canada, were identified and their pre-treatment radiographs scored by three independent readers as “typical” (an upper-lung-zone-predominant infiltrate, with or without cavitation but no discernible adenopathy) or “atypical” (all others). Each patient’s pre-treatment bacillary burden was carefully documented and, during a 30-month transmission window, each patient’s transmission events were recorded. Mycobacteriology, radiology and transmission were compared in those with “typical” versus “atypical” radiographs. Findings: A total of 97 smear-positive pulmonary tuberculosis cases were identified, 69 (71.1%) with and 28 (28.9%) without “typical” chest radiographs. “Typical” cases were more likely to have high bacillary burdens and cavitation (odds ratios and 95% confidence intervals: 2.75 [1.04–7.31] and 9.10 [2.51–32.94], respectively). Typical cases were also responsible for most transmission events: 78% of tuberculin skin test conversions (p<0.002), 95% of secondary cases in reported close contacts (p<0.01), and 94% of secondary cases in “unreported” contacts (p<0.02). Conclusion: As a group, smear-positive pulmonary tuberculosis patients with typical radiographic features constitute the greatest public health risk. This may have implications for automated detection systems.
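The odds ratios quoted above come from standard 2x2-table arithmetic. The sketch below shows that calculation with a Woolf (log-scale) 95% confidence interval; the counts are hypothetical and are not the study's actual data.

```python
# Illustrative sketch (hypothetical counts): odds ratio with a Woolf-type 95% CI
# for cavitation among "typical" vs "atypical" radiographs, the kind of comparison
# summarised above (e.g. OR 9.10, 95% CI 2.51-32.94).
import math

def odds_ratio(a, b, c, d, z=1.96):
    """2x2 table: a=exposed cases, b=exposed non-cases, c=unexposed cases, d=unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical 2x2: cavitation present/absent in typical (n=69) vs atypical (n=28) films.
or_, lo, hi = odds_ratio(a=45, b=24, c=5, d=23)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```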

6.
7.
Background: Many U.S.-bound refugees travel from countries where intestinal parasites (hookworm, Trichuris trichiura, Ascaris lumbricoides, and Strongyloides stercoralis) are endemic. These infections are rare in the United States and may be underdiagnosed or misdiagnosed, leading to potentially serious consequences. This evaluation examined the costs and benefits of combinations of overseas presumptive treatment of parasitic diseases vs. domestic screening and treatment vs. no program. Methods: An economic decision tree model terminating in Markov processes was developed to estimate the cost and health impacts of four interventions on an annual cohort of 27,700 U.S.-bound Asian refugees: 1) “No Program,” 2) U.S. “Domestic Screening and Treatment,” 3) “Overseas Albendazole and Ivermectin” presumptive treatment, and 4) “Overseas Albendazole and Domestic Screening for Strongyloides”. Markov transition state models were used to estimate the long-term effects of parasitic infections. Health outcome measures (for the four parasites) included outpatient cases, hospitalizations, deaths, life years, and quality-adjusted life years (QALYs). Results: The “No Program” option is the least expensive ($165,923 per cohort) and least effective option (145 outpatient cases, 4.0 hospitalizations, and 0.67 deaths discounted over a 60-year period for a one-year cohort). The “Overseas Albendazole and Ivermectin” option ($418,824) is less expensive than “Domestic Screening and Treatment” ($3,832,572) or “Overseas Albendazole and Domestic Screening for Strongyloides” ($2,182,483). According to the model outcomes, the most effective option is “Overseas Albendazole and Ivermectin,” which reduces outpatient cases, deaths, and hospitalizations by around 80%, at an estimated net cost of $458,718 per death averted, or $2,219 per QALY and $24,036 per life year gained, relative to “No Program”. Discussion: Overseas presumptive treatment for U.S.-bound refugees is a cost-effective intervention that is less expensive than, and at least as effective as, domestic screening and treatment programs. The addition of ivermectin to albendazole reduces the prevalence of chronic strongyloidiasis and the probability of rare, but potentially fatal, disseminated strongyloidiasis.
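The cost-per-QALY and cost-per-death-averted figures above are incremental cost-effectiveness ratios (cost difference divided by outcome difference). The sketch below shows that arithmetic; the QALY gain and the overseas death count are placeholders back-calculated to be roughly consistent with the abstract, not values reported by the authors.

```python
# Minimal sketch of the incremental cost-effectiveness arithmetic behind figures
# such as "$2,219 per QALY gained relative to No Program". Only the formula
# (delta cost / delta effect) reflects the abstract; several inputs are placeholders.
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per unit of effect gained."""
    return delta_cost / delta_effect

# Hypothetical per-cohort totals (QALYs lost and overseas deaths are assumed).
no_program = {"cost": 165_923, "qalys_lost": 114.0, "deaths": 0.67}
overseas = {"cost": 418_824, "qalys_lost": 0.0, "deaths": 0.12}

delta_cost = overseas["cost"] - no_program["cost"]
qalys_gained = no_program["qalys_lost"] - overseas["qalys_lost"]
deaths_averted = no_program["deaths"] - overseas["deaths"]

print(f"${icer(delta_cost, qalys_gained):,.0f} per QALY gained")
print(f"${icer(delta_cost, deaths_averted):,.0f} per death averted")
```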

8.
Background: A lack of nationwide evidence on awareness, treatment, and control (ATC) of hypertension among older adults in India has impeded targeted management of this condition. We aimed to estimate rates of hypertension ATC in the older population and to assess differences in these rates across sociodemographic groups and states in India. Methods and findings: We used a nationally representative survey of individuals aged 45 years and over and their spouses in all Indian states (except one) in 2017 to 2018. We identified hypertension by blood pressure (BP) measurement ≥140/90 mm Hg or by self-reported diagnosis if the respondent was also taking medication or observing salt/diet restriction to control BP. We distinguished those who (i) reported a diagnosis (“aware”); (ii) reported taking medication or being under salt/diet restriction to control BP (“treated”); and (iii) had measured systolic BP <140 and diastolic BP <90 mm Hg (“controlled”). We estimated age–sex adjusted hypertension prevalence and rates of ATC by consumption quintile, education, age, sex, urban–rural residence, caste, religion, marital status, living arrangement, employment status, health insurance, and state. We used concentration indices to measure socioeconomic inequalities and multivariable logistic regression to estimate fully adjusted differences in these outcomes. Study limitations included reliance on BP measurement on a single occasion, missing BP measurements for some participants, and a lack of data on nonadherence to medication. The 64,427 participants in the analysis sample had a median age of 57 years; 58% were female, and 70% were rural dwellers. We estimated hypertension prevalence to be 41.9% (95% CI 41.0 to 42.9). Among those with hypertension, we estimated that 54.4% (95% CI 53.1 to 55.7), 50.8% (95% CI 49.5 to 52.0), and 28.8% (95% CI 27.4 to 30.1) were aware, treated, and controlled, respectively. Across states, adjusted rates of ATC ranged from 27.5% (95% CI 22.2 to 32.8) to 75.9% (95% CI 70.8 to 81.1), from 23.8% (95% CI 17.6 to 30.1) to 74.9% (95% CI 69.8 to 79.9), and from 4.6% (95% CI 1.1 to 8.1) to 41.9% (95% CI 36.8 to 46.9), respectively. Age–sex adjusted rates were lower (p < 0.001) in poorer, less educated, and socially disadvantaged groups, as well as for males, rural residents, and the employed. Among individuals with hypertension, the richest fifth were 8.5 percentage points (pp) (95% CI 5.3 to 11.7; p < 0.001), 8.9 pp (95% CI 5.7 to 12.0; p < 0.001), and 7.1 pp (95% CI 4.2 to 10.1; p < 0.001) more likely to be aware, treated, and controlled, respectively, than the poorest fifth. Conclusions: Hypertension prevalence was high, and ATC of the condition were low, among older adults in India. Inequalities in these indicators point to opportunities to target hypertension management more effectively and equitably toward socially disadvantaged groups.

In a cross-sectional study, Sanjay K Mohanty and colleagues investigate the awareness, treatment, and control of hypertension amongst adults aged 45 years and over and their spouses in India.
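The socioeconomic inequalities in this study are summarised with concentration indices. Below is a minimal sketch of the standard covariance formula, C = 2·cov(h, r)/mean(h), applied to synthetic data (not the survey's), where r is each person's fractional rank in the living-standards distribution.

```python
# Illustrative sketch (synthetic data) of a concentration index:
# C = 2 * cov(h, r) / mean(h), with h a binary health indicator (e.g. being aware
# of one's hypertension) and r the fractional rank by consumption.
import numpy as np

def concentration_index(h, ses):
    order = np.argsort(ses)                    # rank people from poorest to richest
    h = np.asarray(h, dtype=float)[order]
    n = len(h)
    r = (np.arange(1, n + 1) - 0.5) / n        # fractional rank
    return 2.0 * np.cov(h, r, bias=True)[0, 1] / h.mean()

rng = np.random.default_rng(1)
consumption = rng.lognormal(mean=0.0, sigma=0.7, size=5000)
# Awareness made more likely among richer respondents (synthetic, for illustration).
p_aware = 0.35 + 0.4 * (np.argsort(np.argsort(consumption)) / 5000)
aware = rng.binomial(1, p_aware)

print(f"concentration index: {concentration_index(aware, consumption):+.3f}  (>0 = pro-rich)")
```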

9.
Background: Diet is an important factor in the prevention of chronic diseases, and analysis of secular trends in dietary patterns can be biased by energy under-reporting. The objective of the present study was therefore to analyse the impact of energy under-reporting on dietary patterns, and on secular trends in dietary patterns, defined by cluster analysis. Results: Three clusters, “healthy”, “mixed” and “western”, were identified in both surveys, with the “mixed” cluster predominant in each. Excluding energy under-reporters (EUR) reduced the proportion of the “mixed” cluster by up to 6.40% in the 2000 survey; this produced a secular-trend increase in the prevalence of the “mixed” pattern. Cross-classification analysis of all participants against plausible energy reporters’ (PER) data showed substantial agreement in cluster assignments: 68.7% in 2000 and 84.4% in 2005. Excluding EUR did not cause meaningful (≥15%) changes in the “healthy” pattern, but it did change the consumption of some food groups in the “mixed” and “western” patterns: mainly decreases of unhealthy foods within the 2000 survey and increases of unhealthy foods within the 2005 survey. Secular-trend effects of EUR were similar to those within the 2005 survey. Excluding EUR reversed the direction of secular trends in consumption of several food groups among PER in the “mixed” and “western” patterns. Conclusions: EUR affected the distribution of participants between dietary patterns within and between surveys, secular trends in food group consumption, and the amount of food consumed in all but the “healthy” pattern. Our findings emphasize the threats posed by energy under-reporting in dietary data analysis.

10.
Objectives: To assess survival in people at apparent high risk who do not develop coronary heart disease (“unwarranted survivals”) and mortality in people at low risk who die from the disease (“anomalous deaths”), and the extent to which these outcomes are explained by other, less visible, risk factors. Design: Prospective general population survey. Setting: Renfrew and Paisley, Scotland. Participants: 6068 men aged 45-64 years at screening in 1972-6, allocated to “visible” risk groups on the basis of body mass index and smoking. Results: Visible risk was a good predictor of mortality: 13% (45) of men at low risk and 45% (86) of men at high risk had died by age 70 years. Of these deaths, 12 (4%) and 44 (23%), respectively, were from coronary heart disease. In the group at low visible risk, other less visible risk factors accounted for increased risk in 83% (10/12) of men who died from coronary heart disease and 29% (84/292) of men who survived. In the high risk group, 81/107 (76%) who survived and 19/44 (43%) who died from coronary heart disease had lower risk after other factors were considered. Different risk factors modified risk (beyond smoking and body mass index) in the two groups. Among men at low visible risk, poor respiratory function, diabetes, previous coronary heart disease, and socioeconomic deprivation modified risk. Among men at high visible risk, height and cholesterol concentration modified risk. Conclusions: Differences in survival between these extreme risk groups are dramatic. Health promotion messages would be more credible if they discussed anomalies and the limits of prediction of coronary disease at an individual level.

What is already known on this topic

People pay attention to visible risk factors, such as smoking and weight, in explaining or predicting coronary events, but are aware that these behavioural risk factors fail to explain some early deaths from coronary heart disease (in those with “low risk” lifestyles) and long survival (in those with “high risk” lifestyles)

Such violations of notions of coronary candidacy undermine people's belief in the worth of modifying behavioural risk factors for coronary heart disease

What this study adds

Visible risk status was a good marker for other coronary risk factors at the extremes of the risk distribution

Most men at low visible risk (slim, never smoked) who died prematurely from coronary heart disease had poorer risk profiles on other, less visible risk factors; similarly, men at high visible risk (obese, heavy smokers) who survived often had more favourable profiles on other risk factors

11.
Cecil McIver. CMAJ. 1964;91(11):578-580.
Concepts of hypertension have changed, and changes in terminology to reflect this state of affairs are suggested. Statistically, the best mortality experience is associated with blood pressure commonly regarded as subnormal, and increments of blood pressure above this level are associated with progressive increases in mortality. The terms “normal”, “benign” and “essential” in relation to blood pressure should be abandoned. “Optimal”, “acceptable” and “hypertensive” ranges of blood pressure are suggested. Hypertension is regarded as a symptom of disease, rather than as a disease in itself, and “hypertension”, when used as a diagnostic label, should always be qualified by the primary disease, if known, or by the modifying phrase, “of unknown cause”, if not known.

12.
Background: The burden of chronic obstructive pulmonary disease (COPD) is increasing, driven by growth in both prevalence and mortality, yet there is little empirical research on the association between continuity of care and death, or on other predictors of mortality, in COPD. Objective: To investigate the association between continuity of care (COC) and COPD mortality and to identify other mortality-related factors in COPD patients. Methods: We conducted a longitudinal, population-based retrospective cohort study of adult patients with COPD from 2002 to 2012 using a nationwide health insurance claims database. The study sample included individuals aged 40 years and over who developed COPD in 2005 and survived until 2006. We performed a Cox proportional hazards regression analysis with COC analyzed as a time-dependent covariate. Results: Of the 3,090 participants, 60.8% (N = 1,879) died before the end of the study. In a Kaplan-Meier analysis, median survival was 3.92 years for individuals with high COC (COC index ≥0.75) and 2.58 years for those with low COC (COC index <0.75). In a multivariate, time-dependent analysis, low COC was associated with a 22% increased risk of all-cause mortality (HR, 1.22; 95% CI, 1.09–1.36), and not receiving oxygen therapy at home was associated with a 23% increased risk (HR, 1.23; 95% CI, 1.01–1.49). Moreover, relative to the reference group (no admission), the risk of all-cause mortality increased by 38% for individuals admitted once (HR, 1.38; 95% CI, 1.21–1.59), by 63% for those admitted twice (HR, 1.63; 95% CI, 1.34–1.99), and by 96% for those admitted three or more times (HR, 1.96; 95% CI, 1.63–2.36). Conclusions: High COC was associated with a decreased risk of all-cause mortality. In addition, home oxygen therapy and the number of hospital admissions may predict mortality in patients with COPD.
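A minimal sketch of the survival methods named above, Kaplan-Meier estimation and a Cox model with a time-dependent covariate, is shown below on synthetic data using the Python lifelines library. The library choice and the data are assumptions; the authors' software and dataset are not stated here.

```python
# Minimal sketch (synthetic data, lifelines assumed available): Kaplan-Meier
# comparison by continuity-of-care (COC) group and a Cox model with COC entered
# as a time-dependent covariate in long ("start-stop") format.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxTimeVaryingFitter

rng = np.random.default_rng(2)
n = 500
high_coc = rng.binomial(1, 0.5, n)                                   # COC index >= 0.75
time = rng.exponential(scale=np.where(high_coc, 4.0, 2.6), size=n)   # follow-up, years
event = rng.binomial(1, 0.7, n)                                      # 1 = died, 0 = censored

kmf = KaplanMeierFitter()
for grp, label in [(1, "high COC"), (0, "low COC")]:
    kmf.fit(time[high_coc == grp], event_observed=event[high_coc == grp], label=label)
    print(label, "median survival:", round(kmf.median_survival_time_, 2), "years")

# Long format so COC could vary over follow-up; each subject contributes a single
# interval here for brevity.
long_df = pd.DataFrame({
    "id": np.arange(n), "start": 0.0, "stop": time,
    "event": event, "low_coc": 1 - high_coc,
})
ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratio for low vs. high COC
```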

13.
This paper reports a quantitative genetics and genomic analysis of undesirable coat color patterns in goats. Two undesirable coat colors have routinely been recorded for the past 15 years in French Saanen goats: one fifth of Saanen females have been phenotyped “pink” (8.0%) or “pink neck” (11.5%) and consequently have not been included in the breeding program as elite animals. Heritability of the binary “pink” and “pink neck” phenotypes, estimated from 103,443 females, was 0.26 for “pink” and 0.21 for “pink neck”. Genome-wide association studies (using haplotypes or single SNPs) were implemented in a daughter design of 810 Saanen goats sired by 9 artificial insemination bucks genotyped with the goatSNP50 chip. A highly significant signal (-log10 p-value = 10.2) was associated with the “pink neck” phenotype on chromosome 11, suggesting the presence of a major gene. Highly significant signals for the “pink” phenotype were found on chromosomes 5 and 13 (-log10 p-values of 7.2 and 7.7, respectively). The most significant SNP on chromosome 13 was in the ASIP gene region, well known for its association with coat color phenotypes. Nine significant signals were also found for both traits. The highest signal for each trait was detected by both the single-SNP and haplotype approaches, whereas the smaller signals were not consistently detected by the two methods. Altogether these results demonstrate strong genetic control of the “pink” and “pink neck” phenotypes in French Saanen goats, suggesting that SNP information could be used to identify and remove undesirably colored animals from the breeding program.

14.
Background: Areca (betel) nut is considered a Group 1 human carcinogen and has been shown to be associated with other chronic diseases in addition to cancer. This paper describes the areca (betel) nut chewing trend in Guam and the health behaviors of chewers in Guam and Saipan. Methods: The areca (betel) nut module in the Guam Behavioral Risk Factor Surveillance Survey was used to calculate the 5-year (2011–2015) chewing trend. To assess the association between areca (betel) nut chewing and health risks in the Mariana Islands, a cross-sectional sample of 300 chewers, ≥18 years old, was recruited from households in Guam and Saipan. Self-reported socio-demographics, oral health behaviors, chronic disease status, diet, and physical activity were collected, and anthropometry was measured. Only areca (betel) nut-specific and demographic information was collected from youth chewers in the households. Results: The 5-year areca (betel) nut chewing prevalence in Guam was 11% and increased among Non-Chamorros, primarily other Micronesians, from 2011 (7%) to 2015 (13%). In the household survey, most adult chewers (46%) preferred areca nut with betel leaf, slaked lime, and tobacco, whereas most youth chewers (48%) preferred areca nut only. Common chronic conditions among adult chewers included diabetes (14%), hypertension (26%), and obesity (58%). Conclusion: The 5-year areca (betel) nut chewing prevalence in Guam is comparable to the world estimate (10–20%), though rising among Non-Chamorros. Adult and youth chewers may be at increased risk for oral cancer, and adult chewers have an increased risk of other chronic health conditions. Cancer prevention and intervention strategies should incorporate all aspects of health.

15.
Background: Integrated care models aim to solve the problem of fragmented and poorly coordinated care in current healthcare systems. These models aim to be patient-centered by providing continuous and coordinated care and by considering the needs and preferences of patients. The objective of this study was to evaluate the opinions and experiences of community-living older adults with regard to integrated care and support, along with the extent to which it meets their health and social needs. Methods: Semi-structured interviews were conducted with 23 older adults receiving integrated care and support through “Embrace,” an integrated care model for community-living older adults that is based on the Chronic Care Model and a population health management model. Embrace is currently fully operational in the northern region of the Netherlands. Data analysis was based on the grounded theory approach. Results: Responses of participants concerned two focus areas: 1) experiences with aging, with the themes “Struggling with health,” “Increasing dependency,” “Decreasing social interaction,” “Loss of control,” and “Fears”; and 2) experiences with Embrace, with the themes “Relationship with the case manager,” “Interactions,” and “Feeling in control, safe, and secure”. The prospect of becoming dependent and losing control was a key concept in the lives of the older adults interviewed. Embrace reinforced the participants’ ability to stay in control, even if they were dependent on others. Furthermore, participants felt safe and secure, in contrast to the fears of increasing dependency within the standard care system. Conclusion: The results indicate that integrated care and support provided through Embrace met the health and social needs of older adults, who were coping with the consequences of aging.

16.

Background

Tuberculosis (TB) is common among HIV-infected individuals in many resource-limited countries and has been associated with poor survival. We evaluated morbidity and mortality among individuals first starting antiretroviral therapy (ART) with concurrent active TB or other AIDS-defining disease using data from the “Prospective Evaluation of Antiretrovirals in Resource-Limited Settings” (PEARLS) study.

Methods

Participants were categorized retrospectively into three groups according to the presence of active confirmed or presumptive disease at ART initiation: those with pulmonary and/or extrapulmonary TB (“TB” group), those with other non-TB AIDS-defining disease (“other disease”), or those without concurrent TB or other AIDS-defining disease (“no disease”). The primary outcome was time to the first of virologic failure, HIV disease progression, or death. Since the groups differed in characteristics, proportional hazards models were used to compare the hazard of the primary outcome among study groups, adjusting for age, sex, country, screening CD4 count, baseline viral load and ART regimen.

Results

31 of 102 participants (30%) in the “TB” group, 11 of 56 (20%) in the “other disease” group, and 287 of 1413 (20%) in the “no disease” group experienced a primary outcome event (p = 0.042). This difference reflected higher mortality in the TB group: 15 (15%), 0 (0%) and 41 (3%) participants died, respectively (p<0.001). The adjusted hazard ratio comparing the “TB” and “no disease” groups was 1.39 (95% confidence interval: 0.93–2.10; p = 0.11) for the primary outcome and 3.41 (1.72–6.75; p<0.001) for death.

Conclusions

Active TB at ART initiation was associated with increased risk of mortality in HIV-1 infected patients.

17.
J. G. Fodor, C. J. Pfeiffer, V. S. Papezik. CMAJ. 1973;108(11):1369-1373.
The profile of mortality in Newfoundland was analysed for all deaths occurring in 1969 among persons 35 to 69 years of age, a total of 1036 deaths. An exceptionally high cardiovascular mortality (793 deaths/100,000) was noted for St. John's, the capital city of Newfoundland, a city which has an extremely soft drinking-water supply. This high rate corresponds to that observed in the “high mortality belt” reported for the east coast of the United States and, in conjunction with data from mainland Canada, extends the belt across the entire eastern aspect of North America. The proportion of cardiovascular deaths of men occurring outside the hospital was lower within hard drinking-water areas of Newfoundland than in the soft-water areas of the province. Thus, the statistics on cardiovascular mortality reported here confirm evidence reported elsewhere on “macro-geographic” variations in this disease(s) as well as “micro-geographic” regional variations, which may be dependent upon local environmental factors.

18.

Background

Betel nut (Areca nut) is the fruit of the Areca catechu tree. Approximately 700 million individuals regularly chew betel nut (or betel quid) worldwide and it is a known risk factor for oral cancer and esophageal cancer. We performed a meta-analysis to assess the influence of chewing betel quid on metabolic diseases, cardiovascular disease, and all-cause mortality.

Methodology/Principal Findings

We searched Medline, Cochrane Library, Web of Science, and Science Direct for pertinent articles (including their references) published between 1951 and 2013. The adjusted relative risk (RR) and 95% confidence interval were calculated using a random-effects model. Sex was used as an independent category for comparison.

Results

Of 580 potentially relevant studies, 17 studies from Asia (5 cohort studies and 12 case-control studies) covering 388,134 subjects (range: 94 to 97,244) were selected. Seven studies (N = 121,585) showed significant dose-response relationships between betel quid consumption and the risk of events. According to pooled analysis, the adjusted RR of betel quid chewers vs. non-chewers was 1.47 (P<0.001) for obesity (N = 30,623), 1.51 (P = 0.01) for metabolic syndrome (N = 23,291), 1.47 (P<0.001) for diabetes (N = 51,412), 1.45 (P = 0.06) for hypertension (N = 89,051), 1.2 (P = 0.02) for cardiovascular disease (N = 201,488), and 1.21 (P = 0.02) for all-cause mortality (N = 179,582).

Conclusion/Significance

Betel quid chewing is associated with an increased risk of metabolic disease, cardiovascular disease, and all-cause mortality. Thus, in addition to preventing oral cancer, stopping betel quid use could be a valuable public health measure for metabolic diseases that are showing a rapid increase in South-East Asia and the Western Pacific.
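The pooled relative risks above come from a random-effects meta-analysis. The sketch below implements DerSimonian-Laird pooling from per-study RRs and 95% CIs; the study inputs are made up for illustration and are not the data analysed in this meta-analysis.

```python
# Illustrative sketch (made-up study inputs): DerSimonian-Laird random-effects
# pooling of relative risks, with each study's standard error recovered from the
# width of its reported 95% CI.
import numpy as np

def dersimonian_laird(rr, lo, hi):
    y = np.log(rr)                                   # per-study log relative risks
    se = (np.log(hi) - np.log(lo)) / (2 * 1.96)      # SE from the 95% CI width
    w = 1.0 / se**2                                  # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)                  # Cochran's Q
    tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (se**2 + tau2)                      # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se_pooled = np.sqrt(1.0 / np.sum(w_re))
    return np.exp(pooled), np.exp(pooled - 1.96 * se_pooled), np.exp(pooled + 1.96 * se_pooled)

# Hypothetical per-study RRs (95% CI) for one outcome, e.g. diabetes among chewers.
rr = np.array([1.30, 1.60, 1.45, 1.70, 1.25])
lo = np.array([1.05, 1.20, 1.10, 1.15, 0.95])
hi = np.array([1.61, 2.13, 1.91, 2.51, 1.64])
pooled, ci_lo, ci_hi = dersimonian_laird(rr, lo, hi)
print(f"pooled RR = {pooled:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```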

19.

Purpose

A high percentage (50%-60%) of trauma patients die of their injuries before arrival at the hospital, yet studies of preclinical mortality that include post-mortem examinations are rare. In this review, we summarize the literature on clinical and preclinical mortality, focusing on studies that included post-mortem examinations.

Methods

A literature search was conducted using the PubMed/Medline database for relevant medical literature in English or German published within the last four decades (1980–2015). The following MeSH search terms were used in different combinations: “multiple trauma”, “epidemiology”, “mortality”, “cause of death”, and “autopsy”. The reference lists of the retrieved studies were searched as well.

Results

Marked differences in demographic parameters and injury severity between studies were identified. Moreover, the incidence of penetrating injuries has shown a wide range (between 4% and 38%). Both unimodal and bimodal concepts of trauma mortality have been favored. Studies have shown a wide variation in time intervals used to analyze the distribution of death. Thus, it is difficult to say which distribution is correct.

Conclusions

We identified variable results indicating bimodal or unimodal death distributions. Further, more standardized studies in this field are needed. We encourage investigators to choose inclusion criteria more critically and to consider factors affecting the pattern of mortality.

20.
Unexpected physical increases in the intensity of a frequently occurring “standard” auditory stimulus are experienced as obtrusive. This could be either because of a physical change, the increase in intensity of the “deviant” stimulus, or a psychological change, the violation of the expectancy for the occurrence of the lower-intensity standard stimulus. Two experiments were run in which event-related potentials (ERPs) were recorded to determine whether “psychological” increments (violation of an expectancy for a lower intensity) would be processed differently than psychological decrements (violation of an expectancy for a higher intensity). ERPs were recorded while subjects were presented with auditory tones that alternated between low and high intensity; the subjects ignored the auditory stimuli while watching a video. Deviants were created by repeating the same stimulus. In the first experiment, pairs of stimuli alternating in intensity were presented in separate increment (H-L…H-L…H-H…H-L, in which H = 80 dB SPL and L = 60 dB SPL) and decrement conditions (L-H…L-H…L-L…L-H, in which H = 90 dB SPL and L = 80 dB SPL). The paradigm employed in the second experiment consisted of an alternating intensity pattern (H-L-H-L-H-H-H-L) or (H-L-H-L-L-L-H-L). Importantly, the stimulus prior to the deviant (the standard) and the actual deviants in both increment and decrement conditions in both experiments were physically identical (80 dB SPL tones). The repetition of the lower-intensity tone therefore acted as a psychological rather than a physical decrement (a higher-intensity tone was expected), while the repetition of the higher-intensity tone acted as a psychological increment (a lower-intensity tone was expected). The psychological increments in both experiments elicited a larger-amplitude mismatch negativity (MMN) than the decrements. Thus, regardless of whether an acoustic change signals a physical increase in intensity or violates an expected decrease in intensity, a large MMN will be elicited.
