Similar Articles
1.
Scientists who serve on an editorial board have been accused of preferentially publishing their work in the journal they edit. Because reputation and academic standing depend on an uninterrupted flow of published work, the question arises whether publication occurs mainly in the self-edited journal. This investigation was designed to determine whether editorial board members of five urological journals were more likely to publish their research reports in their own journal than in other journals. A retrospective analysis was conducted of all original reports published from 2001 to 2010 by 65 editorial board members appointed in 2006 to the boards of five urological journals leading in impact factor. Publications before editorial board membership (2001–2005) and during membership (2006–2010) were identified. The impact factors of the journals were also recorded over 2001–2010 to see whether changes in impact factor correlated with where the members published. Across the five journals as a whole, scientific work was not preferentially published in the journal in which the scientists served as editor. However, significant heterogeneity among the journals was evident: one journal showed a significant increase in the number of papers published in the ‘own’ journal after assumption of editorship, three journals showed no change, and one journal showed a highly significant decrease in publishing in the ‘own’ journal after assumption of editorship.

2.

Objective

Guidelines for initiating HIV treatment are regularly revised. We explored how physicians in France have applied these evolving guidelines for ART initiation over the last decade in two different situations: chronic (CHI) and primary HIV-1 infection (PHI), since specific recommendations for PHI are also provided in France.

Methods

Data came from the ANRS PRIMO (1267 patients enrolled during PHI in 1996–2010) and COPANA (800 subjects enrolled at HIV diagnosis in 2004–2008) cohorts. We defined patients as guidelines-inconsistent if they met the criteria for ART initiation but were not treated within the following month (during PHI) or within the following 6 months (during CHI).

Results

ART initiation during PHI decreased dramatically, from 91% of patients in 1996–99 to 22% in 2007, then increased to 60% in 2010, following changes in recommendations. After the CD4 count threshold was raised to 350 cells/mm3 in 2006, however, only 55% of patients with CD4 ≤350 were treated in 2007 and 66% in 2008. During CHI, ART was more frequently initiated in patients who met the criteria at entry (96%) than in those who met them during follow-up: 83% when the treatment threshold was 200 cells/mm3 and 73% when it was 350 cells/mm3. Independent risk factors for not being treated during CHI despite meeting the criteria were lower viral load, lower educational level, and poorer living conditions.

Conclusion

HIV ART initiation guidelines are largely followed by practitioners in France. What can still be improved, however, is the time to treatment once CD4 cell counts reach the threshold. The risk factors for lack of timely treatment highlight the need to better understand how patients’ living conditions and physicians’ perceptions influence the decision to initiate treatment.

3.

Background

Antiretroviral therapy (ART) has evolved rapidly since its beginnings. This analysis describes trends in first-line ART use in Asia and their impact on treatment outcomes.

Methods

Patients in the TREAT Asia HIV Observational Database receiving first-line ART for ≥6 months were included. Predictors of treatment failure and treatment modification were assessed.

Results

Data from 4662 eligible patients were analysed. Patients started ART in 2003–2006 (n = 1419), 2007–2010 (n = 2690) and 2011–2013 (n = 553). During the observation period, tenofovir, zidovudine and abacavir use largely replaced stavudine, which was prescribed to 5.8% of ART starters in 2012/13. Efavirenz use increased at the expense of nevirapine, although both continue to be used extensively (47.5% and 34.5% of patients in 2012/13, respectively). Protease inhibitor use dropped after 2004. The rate of treatment failure or modification declined over time (22.1 [95% CI 20.7–23.5] events per 100 patient-years in 2003–2006, 15.8 [14.9–16.8] in 2007–2010, and 11.6 [9.4–14.2] in 2011–2013). Adjustment for ART regimen had little impact on the temporal decline in treatment failure rates but substantially attenuated the temporal decline in rates of modification due to adverse events. In the final multivariate model, treatment modification due to adverse events was significantly predicted by period of ART initiation (hazard ratio 0.52 [95% CI 0.33–0.81], p = 0.004 for 2011–2013 versus 2003–2006), older age (1.56 [1.19–2.04], p = 0.001 for ≥50 years versus <30 years), female sex (1.29 [1.11–1.50], p = 0.001 versus male), positive hepatitis C status (1.33 [1.06–1.66], p = 0.013 versus negative), and ART regimen (11.36 [6.28–20.54], p<0.001 for stavudine-based versus tenofovir-based regimens).

Conclusions

The observed trends in first-line ART use in Asia reflect changes in drug availability, global treatment recommendations and prescriber preferences over the past decade. These changes have contributed to a declining rate of treatment modification due to adverse events, but not to reductions in treatment failure.
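The failure/modification rates above are expressed as events per 100 patient-years with 95% confidence intervals. As a hedged illustration of how such a rate and an exact Poisson interval can be computed (this is not the TREAT Asia analysis code, and the counts below are placeholders), a minimal Python sketch:

```python
# Rate per 100 person-years with an exact (Garwood) Poisson 95% CI.
# Illustrative counts only - not data from the paper.
from scipy.stats import chi2

def rate_per_100py(events: int, person_years: float, alpha: float = 0.05):
    """Return the event rate per 100 person-years and an exact Poisson CI."""
    rate = 100.0 * events / person_years
    lower = (chi2.ppf(alpha / 2, 2 * events) / 2 / person_years * 100.0) if events > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2 / person_years * 100.0
    return rate, (lower, upper)

print(rate_per_100py(events=220, person_years=1000.0))
# -> approximately (22.0, (19.2, 25.1))
```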

4.

Introduction

New tools for malaria control, artemisinin combination therapy (ACT) and long-lasting insecticidal nets (LLINs), were recently introduced across India. We estimated the impact of universal coverage of ACT and of ACT plus LLINs in a setting of hyperendemic, forest malaria transmission.

Methods

We reviewed data collected through active and passive case detection in a vaccine trial cohort of 2,204 tribal people residing in Sundargarh district, Odisha between 2006 and 2011. We compared measures of transmission at the village and individual level in 2006–2009 versus 2010–2011 after ACT (in all villages) and LLINs (in three villages) were implemented.

Results

During 2006–2009, malaria incidence per village ranged from 156 to 512 per 1,000 persons per year and slide prevalence ranged from 28% to 53%. Routine indoor residual spraying did not prevent seasonal peaks of malaria. The post-intervention impact in 2010–2011 was dramatic: incidence fell to 14–71 per 1,000 persons per year and slide prevalence to 6–16%. After adjusting for village, ACT alone decreased the incidence of malaria by 83% (IRR 0.17, 95% CI: 0.10, 0.27), and areas using ACT plus LLINs saw a decrease of 86% (IRR 0.14, 95% CI: 0.05, 0.38). After the intervention, the age of malaria cases, their parasite density, and the proportion with fever at the time of screening all increased.

Conclusions

ACT, alone and combined with LLINs, effectively reduced malaria incidence in a closely monitored population living in a forest ecotype. It is unclear whether LLINs added benefit when prompt, good-quality antimalarial treatment was available. In spite of universal coverage, a substantial malaria burden remained.
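The village-adjusted incidence rate ratios (IRRs) quoted above are the kind of estimate a Poisson regression with a person-time offset produces. Below is a minimal sketch of that approach; the column names ('cases', 'person_years', 'post', 'village') and counts are invented for illustration and are not the study's data or code:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Illustrative village-period person-time table (not the Odisha data).
df = pd.DataFrame({
    "village": ["A", "A", "B", "B", "C", "C"],
    "post":    [0, 1, 0, 1, 0, 1],              # 0 = 2006-2009, 1 = 2010-2011
    "cases":   [120, 15, 200, 30, 90, 10],
    "person_years": [400.0, 200.0, 500.0, 250.0, 300.0, 150.0],
})

# Poisson regression of case counts on the intervention period, adjusted for
# village, with log person-years as the offset.
model = smf.glm(
    "cases ~ post + C(village)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),
)
res = model.fit()

irr = np.exp(res.params["post"])                 # IRR: post- vs pre-intervention
ci = np.exp(res.conf_int().loc["post"])
print(f"IRR = {irr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```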

5.
Objective

Client adherence is vital for effective methadone maintenance treatment (MMT). This study explores the pattern of, and the factors associated with, client adherence, drop-out and re-enrolment in the Chinese MMT programme over the period 2006–2013.

Methods

This retrospective study was conducted in 14 MMT clinics in Guangdong Province, China. We employed Kaplan-Meier survival analysis to estimate the rates of drop-out and re-enrolment of MMT clients and multivariate Cox regression to identify associated factors.

Results

Among 1,512 study participants, 79% experienced drop-out during the 7-year study period. However, 82% of dropped-out clients resumed treatment at a later time. Low education level (junior high or below versus otherwise, HR = 1.21, 1.05–1.40), low methadone dosage in the first treatment episode (<50 ml versus ≥50 ml, HR = 1.84, 1.64–2.06) and a higher proportion of positive urine tests during the first treatment episode (≥50% versus <50%, HR = 3.72, 3.30–4.20) were strong predictors of subsequent drop-out. Among the dropped-out clients, being female (HR = 1.40, 1.23–1.60), being married (HR = 1.19, 1.09–1.30), and having a higher proportion of positive urine tests in the first treatment episode (≥50% versus <50%, HR = 1.35, 1.20–1.51) were associated with a greater likelihood of subsequent re-enrolment in MMT. Clients receiving a lower methadone dosage (first treatment episode <50 ml versus ≥50 ml, HR = 1.12, 1.03–1.23; last intake before drop-out <50 ml versus ≥50 ml, HR = 1.16, 1.04–1.30) were also more likely to re-enrol.

Conclusion

Persistent cycling of clients in and out of MMT programmes is common. Insufficient dosage and a higher proportion of positive urine samples in the first treatment episode are the key determinants of subsequent client drop-out and re-enrolment. Interventions should target clients in their early stage of treatment to improve retention in the long term.
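The study describes Kaplan-Meier estimation of drop-out and a multivariate Cox model for its predictors. A hedged sketch of that workflow using the Python lifelines package, with simulated data and hypothetical variable names ('low_dose', 'low_education') rather than the Guangdong dataset:

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Simulate a toy retention dataset: lower first-episode dose shortens retention.
rng = np.random.default_rng(0)
n = 300
low_dose = rng.integers(0, 2, n)                       # 1 = first-episode dose < 50 ml
low_education = rng.integers(0, 2, n)                  # 1 = junior high or below
true_time = rng.exponential(24 / (1 + low_dose), n)    # months to drop-out
months = np.minimum(true_time, 36)                     # administrative censoring at 36 months
dropped_out = (true_time < 36).astype(int)

df = pd.DataFrame({"months": months, "dropped_out": dropped_out,
                   "low_dose": low_dose, "low_education": low_education})

# Kaplan-Meier curve for time to drop-out.
kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["dropped_out"])
print("median retention (months):", kmf.median_survival_time_)

# Multivariate Cox model; hazard ratios are exp(coef).
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="dropped_out")
cph.print_summary()
```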

6.

Background

In July 2010 a new multiple hub-and-spoke model for acute stroke care was implemented across the whole of London, UK, with continuous specialist care during the first 72 hours provided at 8 hyper-acute stroke units (HASUs) compared to the previous model of 30 local hospitals receiving acute stroke patients. We investigated differences in clinical outcomes and costs between the new and old models.

Methods

We compared outcomes and costs ‘before’ (July 2007–July 2008) vs. ‘after’ (July 2010–June 2011) the introduction of the new model, adjusted for patient characteristics and national time trends in mortality and length of stay. We constructed 90-day and 10-year decision analytic models using data from population based stroke registers, audits and published sources. Mortality and length of stay were modelled using survival analysis.

Findings

In a pooled sample of 307 patients ‘before’ and 3156 patients ‘after’, survival improved in the ‘after’ period (age-adjusted hazard ratio 0.54; 95% CI 0.41–0.72). The predicted survival rates at 90 days in the deterministic model adjusted for national trends were 87.2% ‘before’ (95% CI 86.7%–87.7%) and 88.7% ‘after’ (95% CI 88.6%–88.8%), a relative reduction in deaths of 12% (95% CI 8%–16%). Based on a cohort of 6,438 stroke patients, the model produces a total cost saving of £5.2 million per year at 90 days (95% CI £4.9–£5.5 million; £811 per patient).

Conclusion

A centralized model for acute stroke care across an entire metropolitan city appears to have reduced mortality for a reduced cost per patient, predominantly as a result of reduced hospital length of stay.

7.

Background

Smokefree legislation may protect children from exposure to secondhand smoke (SHS) from smoking parents in the home. We examined the effect of the 2007 smokefree legislation on children’s exposure to SHS in the home and on maternal action to protect children from SHS exposure in Hong Kong.

Methods

Families with a smoking father and a non-smoking mother were recruited from public clinics before (2005–2006, n = 333) and after (2007–2008, n = 742) the legislation, which led to a major extension of smokefree places in Hong Kong. Main outcomes included children’s SHS exposure in the home, nicotine levels in mothers’ and children’s hair and in the home environment, mothers’ actions to protect children from SHS, and their support for the fathers to quit.

Results

Fewer mothers post-legislation reported children’s SHS exposure in the home (87.2% versus 29.3%, p<0.01), which was consistent with measured hair nicotine levels (0.36 ng/mg versus 0.04 ng/mg, p<0.01). More mothers post-legislation reported taking their children away from cigarette smoke in the past month (6.3% versus 92.2%; p<0.01) and advising the fathers to quit more than 3 times (8.3% versus 33.8%; p<0.01). No significant change was found in the content of smoking cessation advice or in the proportion of mothers who took specific action to support the fathers to quit.

Conclusions

SHS exposure in the home decreased and maternal action to protect children from SHS increased after the 2007 smokefree legislation. Maternal support to fathers to quit showed moderate improvement. Cessation services for smokers and specific interventions for smoking families should be expanded together with smokefree legislation.

8.
Sweetened beverages, coffee, and tea are the most consumed non-alcoholic beverages and may have important health consequences. We prospectively evaluated the consumption of various types of beverages, assessed in 1995–1996, in relation to self-reported depression diagnosis after 2000 among 263,923 participants of the NIH-AARP Diet and Health Study. Odds ratios (ORs) and 95% confidence intervals (CIs) were derived from multivariate logistic regressions. The ORs (95% CIs) comparing ≥4 cans/cups per day with none were 1.30 (1.17–1.44) for soft drinks, 1.38 (1.15–1.65) for fruit drinks, and 0.91 (0.84–0.98) for coffee (all p for trend <0.0001). Null associations were observed for iced tea and hot tea. In analyses stratified by drinkers of primarily diet versus regular beverages, the ORs were 1.31 (1.16–1.47) for diet versus 1.22 (1.03–1.45) for regular soft drinks, 1.51 (1.18–1.92) for diet versus 1.08 (0.79–1.46) for regular fruit drinks, and 1.25 (1.10–1.41) for diet versus 0.94 (0.83–1.08) for regular sweetened iced tea. Finally, compared to nondrinkers, drinking coffee or tea without any sweetener was associated with a lower risk of depression; adding artificial sweeteners, but not sugar or honey, was associated with higher risks. Frequent consumption of sweetened beverages, especially diet drinks, may increase depression risk among older adults, whereas coffee consumption may lower the risk.
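The odds ratios above come from multivariate logistic regression. A minimal sketch of how such ORs and 95% CIs can be obtained with statsmodels is shown below; the data are simulated and the variable names ('depression', 'soft_drinks_per_day', 'age', 'sex') are assumptions, not the NIH-AARP variables or analysis code:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate a toy cohort in which soft-drink intake modestly raises depression risk.
rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "soft_drinks_per_day": rng.integers(0, 5, n),
    "age": rng.normal(62, 5, n),
    "sex": rng.integers(0, 2, n),
})
logit_p = -3 + 0.07 * df["soft_drinks_per_day"] + 0.01 * (df["age"] - 62)
df["depression"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Multivariate logistic regression; exponentiated coefficients are odds ratios.
res = smf.logit("depression ~ soft_drinks_per_day + age + sex", data=df).fit(disp=0)
odds_ratios = np.exp(res.params).rename("OR")
or_ci = np.exp(res.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds_ratios, or_ci], axis=1))
```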

9.
This retrospective study aimed to identify the major determinants of successful laparoscopic radical hysterectomy (LRH) versus abdominal radical hysterectomy (ARH) performed by inexperienced surgeons for stage IA2–IIA cervical cancer. A total of 161 consecutive patients with stage IA2–IIA cervical cancer who underwent RH were divided into two groups according to the surgeons’ experience with LRH: experienced surgeon versus inexperienced surgeon. After matching for age and risk factors, surgical and survival outcomes were compared. The experienced surgeon selected patients with earlier-stage disease and fewer risk factors for LRH than for ARH, but inexperienced surgeons did not. After matching, the vaginal tumor-free margin of LRH was shorter than that of ARH in the experienced surgeon group (1.3 versus 1.7 cm, p = 0.007), whereas it was longer than that of ARH in the inexperienced surgeon group (1.8 versus 1.3 cm, p = 0.035). The postoperative hospital stay after LRH was shorter than after ARH in the experienced surgeon group (5.5 versus 7.7 days, p<0.001), but did not differ from that after ARH in the inexperienced surgeon group. Vaginal tumor-free margin >1.8 cm (OR 7.33, 95% CI 1.22–40.42), stage >IB1 (OR 8.83, 95% CI 1.51–51.73), and estimated blood loss >575 mL (OR 33.95, 95% CI 4.87–236.79) were independent risk factors for a longer postoperative hospital stay in the inexperienced surgeon group. There was no difference in 5-year progression-free survival of LRH patients between the experienced and inexperienced surgeon groups after matching (55.1% versus 33.3%, p = 0.391). Selection of earlier-stage disease and a moderate vaginal tumor-free margin may be important for an inexperienced surgeon to successfully perform LRH with minimal complications in stage IA2–IIA cervical cancer.

10.
Missense mutations in leucine-rich repeat kinase 2 (LRRK2) are the most common cause of familial Parkinson’s disease (PD); however, pathways regulating LRRK2 subcellular localization, function, and turnover are not fully defined. We performed quantitative mass spectrometry–based interactome studies to identify 48 novel LRRK2 interactors, including the microtubule-associated E3 ubiquitin ligase TRIM1 (tripartite motif family 1). TRIM1 recruits LRRK2 to the microtubule cytoskeleton for ubiquitination and proteasomal degradation by binding LRRK2 residues 911–919, a nine-amino-acid segment within a flexible interdomain region (residues 853–981), which we designate the “regulatory loop” (RL). Phosphorylation of LRRK2 Ser910/Ser935 within the RL influences LRRK2’s association with cytoplasmic 14-3-3 versus microtubule-bound TRIM1. Association with TRIM1 modulates LRRK2’s interaction with Rab29 and prevents upregulation of LRRK2 kinase activity by Rab29 in an E3-ligase–dependent manner. Finally, TRIM1 rescues neurite outgrowth deficits caused by the PD-driving mutant LRRK2 G2019S. Our data suggest that TRIM1 is a critical regulator of LRRK2, controlling its degradation, localization, binding partners, kinase activity, and cytotoxicity.

11.
12.

Objective

The current study aimed to examine the effects of daily change of the Shenzhen Stock Exchange Index on cardiovascular mortality in Guangzhou and Taishan, China.

Methods

Daily mortality and stock performance data during 2006–2010 were collected to construct the time series for the two cities. A distributed lag non-linear model was utilized to examine the effect of daily stock index changes on cardiovascular mortality after controlling for potential confounding factors.

Results

We observed a delayed non-linear effect of stock index change on cardiovascular mortality: both rises and falls of the stock index were associated with increased cardiovascular deaths. In Guangzhou, the cumulative relative risk of cardiovascular mortality over lag days 15–25 was 2.08 (95% CI: 1.38–3.14) for an 800-point index drop and 2.38 (95% CI: 1.31–4.31) for an 800-point index rise. In Taishan, the corresponding cumulative relative risks were 1.65 (95% CI: 1.13–2.42) for an 800-point drop and 2.08 (95% CI: 1.26–3.42) for an 800-point rise.

Conclusions

Large ups and downs in the daily stock index might be an important predictor of cardiovascular mortality.
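The study fitted a distributed lag non-linear model; the sketch below is a deliberately simplified, linear distributed-lag Poisson regression that only illustrates the idea of lagged exposure terms and of a cumulative relative risk over lags 15–25. The data and column names are invented, and a real analysis (e.g. with spline cross-bases and seasonal confounder control, as in R's dlnm) would be considerably richer:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated daily series: cardiovascular deaths, index change (points), temperature.
rng = np.random.default_rng(2)
days = 1200
df = pd.DataFrame({
    "deaths": rng.poisson(8, days),
    "index_change": rng.normal(0, 150, days),
    "temp": rng.normal(22, 6, days),
})

# Lagged exposure columns for lags 15-25 days, the window highlighted above.
lag_cols = []
for lag in range(15, 26):
    col = f"lag{lag}"
    df[col] = df["index_change"].shift(lag)
    lag_cols.append(col)
df = df.dropna()

# Poisson regression of daily deaths on the lagged index changes plus temperature.
formula = "deaths ~ temp + " + " + ".join(lag_cols)
res = smf.glm(formula, data=df, family=sm.families.Poisson()).fit()

# Cumulative RR over lags 15-25 for a sustained 800-point change:
# exp(800 * sum of the lag coefficients). A non-linear DLNM would instead use a
# cross-basis so that rises and falls can both increase risk.
cum_rr_800 = np.exp(800 * res.params[lag_cols].sum())
print(f"Cumulative RR (lags 15-25) for an 800-point change: {cum_rr_800:.2f}")
```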

13.
14.
It is generally accepted that human influenza viruses bind glycans containing sialic acid linked α2–6 to the next sugar, that avian influenza viruses bind glycans containing the α2–3 linkage, and that mutations that change the binding specificity might change the host tropism. We noted that human H3N2 viruses showed dramatic differences in their binding specificity, and so we embarked on a study of representative human H3N2 influenza viruses from 1968 to 2012 that had been isolated and minimally passaged only in mammalian cells, never in eggs. The 45 viruses were grown in MDCK cells, purified, fluorescently labeled and screened on the Consortium for Functional Glycomics Glycan Array. Viruses isolated in the same season have similar binding specificity profiles, but the profiles show marked year-to-year variation. None of the 610 glycans on the array (166 sialylated glycans) bound to all viruses; the closest was Neu5Acα2–6(Galβ1–4GlcNAc)3 in either a linear or biantennary form, which bound 42 of the 45 viruses. The earliest human H3N2 viruses preferentially bound short, branched sialylated glycans, while recent viruses bind better to long polylactosamine chains terminating in sialic acid. Viruses isolated in 1996, 2006, 2010 and 2012 bind glycans with α2–3 linked sialic acid; for the 2006, 2010 and 2012 viruses this binding was inhibited by oseltamivir, indicating binding of α2–3 sialylated glycans by the neuraminidase. More significantly, oseltamivir inhibited entry of the 2010 and 2012 viruses into MDCK cells. All of these viruses were representative of epidemic strains that spread around the world, so all could infect and transmit between humans with high efficiency. We conclude that the year-to-year variation in receptor binding specificity is a consequence of amino acid sequence changes driven by antigenic drift, and that viruses with quite different binding specificity and avidity are equally fit to infect and transmit in the human population.

15.

Background

Centenarians are a rapidly growing demographic group worldwide, yet their health and social care needs are seldom considered. This study aims to examine trends in place of death, and factors associated with it, for centenarians in England over 10 years, to consider the policy implications of extreme longevity.

Methods and Findings

This is a population-based observational study using death registration data linked with area-level indices of multiple deprivations for people aged ≥100 years who died 2001 to 2010 in England, compared with those dying at ages 80-99. We used linear regression to examine the time trends in number of deaths and place of death, and Poisson regression to evaluate factors associated with centenarians’ place of death. The cohort totalled 35,867 people with a median age at death of 101 years (range: 100–115 years). Centenarian deaths increased 56% (95% CI 53.8%–57.4%) in 10 years. Most died in a care home with (26.7%, 95% CI 26.3%–27.2%) or without nursing (34.5%, 95% CI 34.0%–35.0%) or in hospital (27.2%, 95% CI 26.7%–27.6%). The proportion of deaths in nursing homes decreased over 10 years (−0.36% annually, 95% CI −0.63% to −0.09%, p = 0.014), while hospital deaths changed little (0.25% annually, 95% CI −0.06% to 0.57%, p = 0.09). Dying with frailty was common with “old age” stated in 75.6% of death certifications. Centenarians were more likely to die of pneumonia (e.g., 17.7% [95% CI 17.3%–18.1%] versus 6.0% [5.9%–6.0%] for those aged 80–84 years) and old age/frailty (28.1% [27.6%–28.5%] versus 0.9% [0.9%–0.9%] for those aged 80–84 years) and less likely to die of cancer (4.4% [4.2%–4.6%] versus 24.5% [24.6%–25.4%] for those aged 80–84 years) and ischemic heart disease (8.6% [8.3%–8.9%] versus 19.0% [18.9%–19.0%] for those aged 80–84 years) than were younger elderly patients. More care home beds available per 1,000 population were associated with fewer deaths in hospital (PR 0.98, 95% CI 0.98–0.99, p<0.001).

Conclusions

Centenarians are more likely than younger elderly patients to have causes of death certified as pneumonia and frailty, and less likely to have cancer or ischemic heart disease certified. Reducing reliance on hospital care at the end of life requires recognition of centenarians’ increased likelihood of “acute” decline, notably from pneumonia, wider provision of anticipatory care to enable people to remain in their usual residence, and increased care home bed capacity.

16.

Background

India, with a population of more than 1.21 billion, has the highest number of maternal deaths in the world (an estimated 56,000 in 2010), and adolescent mothers (aged 15–19) account for 9% of these deaths. Addressing the maternity care needs of adolescents may have considerable ramifications for achieving Millennium Development Goal (MDG) 5. This paper assesses the socioeconomic differentials in accessing full antenatal care and professional attendance at delivery by adolescent mothers (aged 15–19) in India during 1990–2006.

Methods and Findings

Data from three rounds of the National Family Health Survey of India conducted during 1992–93, 1998–99, and 2005–06 were analyzed. The Cochran-Armitage and Chi-squared tests, for linear and non-linear time trends respectively, were applied to examine the trend in the proportion of adolescent mothers utilizing selected maternity care services during 1990–2006. Using pooled multivariate logistic regression models, the probability of selected maternal healthcare utilization was appraised by key socioeconomic characteristics. After adjusting for potential socio-demographic and economic characteristics, the likelihood of adolescents accessing full antenatal care increased by only 4% from 1990 to 2006. However, the probability of adolescent women receiving professional attendance at delivery increased by 79% over the same period. The study also highlights stark disparities in maternity care services among adolescents between the most and the least favoured groups.

Conclusion

Maternal care interventions in India need focused programs for rural, uneducated, poor adolescent women so that they can avail themselves of measures to delay childbearing, and of better antenatal consultation and delivery care in case of pregnancy. This study strongly advocates the promotion of a comprehensive ‘adolescent scheme’ along the lines of the ‘Continuum of Maternal, Newborn and Child Health Care’ to address the unmet need for reproductive and maternal healthcare services among adolescent women in India.

17.

Background and Aim

To gain insight into patient and doctor delay in testicular cancer (TC) and factors associated with delay.

Materials and Methods

Sixty of the 66 eligible men (median age 26, range 17–45 years) diagnosed with TC at the University Medical Center Groningen completed a questionnaire on patients’ delay (the interval from symptom onset to first consultation with a general practitioner, GP) and doctors’ delay (the interval between the GP visit and the specialist visit).

Results

Median patient-reported delay was 30 (range 1–365) days. Patient delay and TC tumor stage were associated (p = .01). Less educated men and men embarrassed about their scrotal change reported longer patient delay (r = -.25 and r = .79, respectively). Neither age, marital status, TC awareness, warning signals, nor perceived limitations was associated with patient delay. Median patient-reported time from GP to specialist (doctors’ delay) was 7 (range 0–240) days. Referral time and disease stage were associated (p = .04). Six patients never reported a scrotal change. Of the 54 patients reporting a testicular change, 29 (54%) were initially ‘misdiagnosed’, leading to a median doctors’ delay of 14 (range 1–240) days, which was longer (p < .001) than in the 25 (46%) patients whose GP suspected TC (median doctors’ delay 1, range 0–7, days).

Conclusions

High variation in patients’ and doctors’ delay was found. The most important risk variables for longer patient delay were embarrassment and lower education; the most important risk variable for GPs was ‘misdiagnosis’. TC awareness programs for men and physicians are required to decrease delay in the diagnosis of TC and improve disease-free survival.

18.
Classical tree neighborhood models use size variables acting at point distances. In a new approach here, trees were spatially extended as a function of their crown sizes, represented impressionistically as points within crown areas. Extension was accompanied by plasticity in the form of crown removal or relocation under the overlap of taller trees. Root systems were assumed to be extended in a similar manner. For the 38 most abundant species in the focal size class (10–<100 cm stem girth) in two 4-ha plots at Danum (Sabah), for periods P1 (1986–1996) and P2 (1996–2007), stem growth rate and tree survival were individually regressed against stem size, and neighborhood conspecific (CON) and heterospecific (HET) basal areas within incremented steps in radius. Model parameters were critically assessed, and statistical robustness in the modeling was set by randomization testing. Classical and extended models differed importantly in their outcomes. Crown extension weakened the relationship of CON effect on growth versus plot species’ abundance, showing that models without plasticity overestimated negative density dependence. A significant negative trend of difference in CON effects on growth (P2−P1) versus CON or HET effect on survival in P1 was strongest with crown extension. Model outcomes did not then support an explanation of CON and HET effects being due to (asymmetric) competition for light alone. An alternative hypothesis is that changes in CON effects on small trees, largely incurred by a drought phase (relaxing light limitation) in P2, and following the more shaded (suppressing) conditions in P1, were likely due to species-specific (symmetric) root competition and mycorrhizal processes. The very high variation in neighborhood composition and abundances led to a strong “neighborhood stochasticity” and hence to largely idiosyncratic species’ responses. The need to better understand the roles of rooting structure and processes at the individual-tree level was highlighted.
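The "randomization testing" mentioned above amounts to comparing an observed neighborhood coefficient with a null distribution generated by permuting the neighborhood variable. A generic, hedged sketch of such a test (not the authors' procedure; the data and variable names are illustrative):

```python
import numpy as np

# Simulate growth as a function of conspecific (CON) and heterospecific (HET)
# neighborhood basal area; the true CON effect is negative.
rng = np.random.default_rng(4)
n = 300
con_ba = rng.gamma(2.0, 1.5, n)
het_ba = rng.gamma(3.0, 2.0, n)
growth = 0.8 - 0.05 * con_ba - 0.01 * het_ba + rng.normal(0, 0.3, n)

def con_coefficient(con, het, y):
    """Ordinary least squares coefficient on the CON term."""
    X = np.column_stack([np.ones_like(con), con, het])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

observed = con_coefficient(con_ba, het_ba, growth)

# Null distribution: permute the CON variable, refit, repeat.
null = np.array([con_coefficient(rng.permutation(con_ba), het_ba, growth)
                 for _ in range(2000)])
p_value = np.mean(np.abs(null) >= np.abs(observed))   # two-sided permutation p-value
print(f"observed CON effect = {observed:.3f}, permutation p = {p_value:.3f}")
```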

19.

Background

The burden of Congenital Rubella Syndrome (CRS) is typically underestimated in routine surveillance. Updated estimates are needed following the recent WHO position paper on rubella and recent GAVI initiatives, funding rubella vaccination in eligible countries. Previous estimates considered the year 1996 and only 78 (developing) countries.

Methods

We reviewed the literature to identify rubella seroprevalence studies conducted before countries introduced rubella-containing vaccination (RCV). These data and the estimated vaccination coverage in the routine schedule and mass campaigns were incorporated in mathematical models to estimate the CRS incidence in 1996 and 2000–2010 for each country, region and globally.

Results

The estimated CRS incidence decreased in the three regions (the Americas, Europe and the Eastern Mediterranean) that had introduced widespread RCV by 2010, reaching <2 per 100,000 live births (the Americas and Europe) and 25 (95% CI 4–61) per 100,000 live births (the Eastern Mediterranean). The estimated incidence in 2010 elsewhere ranged from 90 (95% CI: 46–195) per 100,000 live births in the Western Pacific, excluding China, to 116 (95% CI: 56–235) and 121 (95% CI: 31–238) per 100,000 live births in Africa and SE Asia, respectively. The highest numbers of cases were predicted in Africa (39,000, 95% CI: 18,000–80,000) and SE Asia (49,000, 95% CI: 11,000–97,000). In 2010, 105,000 (95% CI: 54,000–158,000) CRS cases were estimated globally, compared to 119,000 (95% CI: 72,000–169,000) in 1996.

Conclusions

Whilst falling dramatically in the Americas, Europe and the Eastern Mediterranean after vaccination, the estimated CRS incidence remains high elsewhere. Well-conducted seroprevalence studies can help to improve the reliability of these estimates and monitor the impact of rubella vaccination.
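The modelling approach described, which combines pre-vaccination seroprevalence with the risk of rubella infection in early pregnancy, can be illustrated with a simple catalytic model. The sketch below is a simplified stand-in for the paper's models: the seroprevalence data, the flat maternal age distribution, and the assumed 0.65 risk of CRS given infection in the first 16 weeks are all illustrative assumptions, not values from the study:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative pre-vaccination seroprevalence by age (not data from the paper).
ages = np.array([2, 5, 10, 15, 20, 30])          # age-group midpoints (years)
n_tested = np.array([100, 120, 110, 90, 80, 70])
n_seropos = np.array([25, 55, 80, 78, 74, 67])

def neg_log_lik(lam):
    """Binomial likelihood under the simple catalytic model P(a) = 1 - exp(-lam*a)."""
    p = np.clip(1 - np.exp(-lam * ages), 1e-9, 1 - 1e-9)
    return -np.sum(n_seropos * np.log(p) + (n_tested - n_seropos) * np.log(1 - p))

lam = minimize_scalar(neg_log_lik, bounds=(1e-4, 2.0), method="bounded").x

# CRS incidence per live birth: mother still susceptible at age a, infected during
# the first 16 weeks of pregnancy, times an assumed CRS risk given such infection.
maternal_ages = np.arange(15, 45)
age_weights = np.full(maternal_ages.shape, 1.0 / len(maternal_ages))  # flat fertility, for illustration
p_susceptible = np.exp(-lam * maternal_ages)
p_infected_16wk = 1 - np.exp(-lam * 16 / 52)
crs_risk_given_infection = 0.65                  # assumed value, for illustration only
crs_per_100k = 1e5 * np.sum(age_weights * p_susceptible * p_infected_16wk * crs_risk_given_infection)
print(f"force of infection = {lam:.3f}/year, CRS incidence ~ {crs_per_100k:.0f} per 100,000 live births")
```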

20.
A coaching change is an extreme but frequently occurring phenomenon in elite soccer, and its impact on team success is debatable. The aim of the current study was twofold: (i) to compare a team’s performance when coached by the new and the old coach; and (ii) to investigate the impact of a coaching change on a team’s performance according to coach- and club-related factors. All in-season coaching changes from the 2010–11 to 2017–18 seasons within the Spanish, French, English, German and Italian professional leagues were examined. Team performance was assessed as points awarded from match outcomes over 1–20 matches prior to and following the coaching change. Four independent variables (the coach’s experience, the team’s budget, whether the coach had been an elite player, and whether the coach was a novice) were included in linear regression modelling. The main results showed that a team’s short-term performance improved significantly after a change to a new coach, with this impact declining over the longer term (>10 matches). Specifically, the number of points (1.15–1.32 vs. 0.37–1.03, p < 0.05) and the moving average of points (1.19–1.31 vs. 0.37–1.04, p < 0.05) awarded per match were significantly greater after the coaching change. Further, the winning effect due to the new coach was independent of coach-related factors such as coaching experience or the new coach being a former elite player. A critical organisational decision to change coaches may provide an essential stimulus for future team success in elite soccer.
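The core comparison reported above, points per match and a moving average of points before versus after an in-season coaching change, can be sketched in a few lines. The match results and window sizes below are illustrative, not data from the study:

```python
import pandas as pd

# Points per match for one team, oldest first; the coach changes at `change_idx`.
points = pd.Series([0, 1, 0, 0, 1, 0, 3, 1, 0, 0,   # under the old coach
                    3, 1, 3, 0, 3, 1, 3, 3, 0, 1])  # under the new coach
change_idx = 10

def mean_points(series, start, stop):
    """Mean points per match over [start, stop)."""
    return series.iloc[start:stop].mean()

for k in (5, 10):
    before = mean_points(points, change_idx - k, change_idx)
    after = mean_points(points, change_idx, change_idx + k)
    print(f"last {k} before: {before:.2f} pts/match | first {k} after: {after:.2f} pts/match")

# 5-match moving average of points, a simple way to track short-term form.
print(points.rolling(window=5).mean().round(2).tolist())
```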
