Similar Articles
20 similar articles found.
1.

Background

The infectivity of the HIV-1 acute phase has been directly measured only once, from a retrospectively identified cohort of serodiscordant heterosexual couples in Rakai, Uganda. Analyses of this cohort underlie the widespread view that the acute phase is highly infectious, even more so than would be predicted from its elevated viral load, and that transmission occurring shortly after infection may therefore compromise interventions that rely on diagnosis and treatment, such as antiretroviral treatment as prevention (TasP). Here, we re-estimate the duration and relative infectivity of the acute phase, while accounting for several possible sources of bias in published estimates, including the retrospective cohort exclusion criteria and unmeasured heterogeneity in risk.

Methods and Findings

We estimated acute phase infectivity using two approaches. First, we combined viral load trajectories and viral load-infectivity relationships to estimate infectivity trajectories over the course of infection, under the assumption that elevated acute phase infectivity is caused by elevated viral load alone. Second, we estimated the relative hazard of transmission during the acute phase versus the chronic phase (RH_acute) and the acute phase duration (d_acute) by fitting a couples transmission model to the Rakai retrospective cohort using approximate Bayesian computation. Our model fit the data well and accounted for characteristics overlooked by previous analyses, including individual heterogeneity in infectiousness and susceptibility and the retrospective cohort's exclusion of couples that were recorded as serodiscordant only once before being censored by loss to follow-up, couple dissolution, or study termination. Finally, we replicated two highly cited analyses of the Rakai data on simulated data to identify biases underlying the discrepancies between previous estimates and our own.

From the Rakai data, we estimated RH_acute = 5.3 (95% credibility interval [95% CrI]: 0.79–57) and d_acute = 1.7 mo (95% CrI: 0.55–6.8). The wide credibility intervals reflect an inability to distinguish a long, mildly infectious acute phase from a short, highly infectious acute phase, given the 10-mo Rakai observation intervals. The total additional risk, measured as excess hazard-months attributable to the acute phase (EHM_acute), can be estimated more precisely: EHM_acute = (RH_acute − 1) × d_acute, and should be interpreted with respect to the 120 hazard-months generated by a constant untreated chronic phase infectivity over 10 y of infection. From the Rakai data, we estimated that EHM_acute = 8.4 (95% CrI: −0.27 to 64). This estimate is considerably lower than previously published estimates, and consistent with our independent estimate from viral load trajectories, 5.6 (95% confidence interval: 3.3–9.1). We found that previous overestimates likely stemmed from failure to account for risk heterogeneity and bias resulting from the retrospective cohort study design.

Our results reflect the interaction between the retrospective cohort exclusion criteria and high (47%) rates of censoring among incident serodiscordant couples in the Rakai study due to loss to follow-up, couple dissolution, or study termination. We estimated excess physiological infectivity during the acute phase from couples data, but not the proportion of transmission attributable to the acute phase, which would require data on the broader population's sexual network structure.
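A minimal sketch of the excess hazard-months formula above (Python). The inputs are the point estimates quoted in this abstract; the reported EHM_acute = 8.4 is a posterior summary, so the product of point estimates need not reproduce it exactly:

```python
# EHM_acute = (RH_acute - 1) x d_acute, in hazard-months, interpreted
# against the 120 hazard-months of 10 y of constant untreated
# chronic-phase infectivity.
def ehm_acute(rh_acute: float, d_acute_months: float) -> float:
    """Excess hazard-months attributable to the acute phase."""
    return (rh_acute - 1.0) * d_acute_months

print(ehm_acute(5.3, 1.7))   # ~7.3 from point estimates; the reported
                             # 8.4 is a posterior estimate, not this product
print(12 * 10)               # 120 hazard-months chronic-phase baseline
```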

Conclusions

Previous EHM_acute estimates relying on the Rakai retrospective cohort data range from 31 to 141. Our results indicate that these are substantial overestimates of HIV-1 acute phase infectivity, biased by unmodeled heterogeneity in transmission rates between couples and by inconsistent censoring. Elevated acute phase infectivity is therefore less likely to undermine TasP interventions than previously thought. Heterogeneity in infectiousness and susceptibility may still play an important role in intervention success and deserves attention in future analyses.

2.
The inactivation domain of STIM1 (IDSTIM: amino acids 470–491) has been described as necessary for Ca2+-dependent inactivation (CDI) of Ca2+ release–activated Ca2+ (CRAC) channels, but its mechanism of action is unknown. Here we identify acidic residues within IDSTIM that control the extent of CDI and examine functional interactions of IDSTIM with Orai1 pore residues W76 and Y80. Alanine scanning revealed three IDSTIM residues (D476/D478/D479) that are critical for generating full CDI. Disabling IDSTIM by a triple alanine substitution for these three residues (“STIM1 3A”) or by truncation of the entire domain (STIM1(1–469)) reduced CDI to the same residual level observed for the Orai1 pore mutant W76A (approximately one third of the extent seen with full-length STIM1). Results of noise analysis showed that STIM1(1–469) and Orai1 W76A mutants do not reduce channel open probability or unitary Ca2+ conductance, factors that determine local Ca2+ accumulation, suggesting that they diminish CDI instead by inhibiting the CDI gating mechanism. We tested for functional coupling between IDSTIM and the Orai1 pore by double-mutant cycle analysis. The effects on CDI of mutations disabling IDSTIM or W76 were not additive, demonstrating that IDSTIM and W76 are strongly coupled and act in concert to generate full-strength CDI. Interestingly, disabling IDSTIM and W76 separately gave opposite results in Orai1 Y80A channels: channels with W76 but lacking IDSTIM generated approximately two thirds of the WT extent of CDI, but those with IDSTIM but lacking W76 completely failed to inactivate. Together, our results suggest that Y80 alone is sufficient to generate residual CDI, but acts as a barrier to full CDI. Although IDSTIM is not required as a Ca2+ sensor for CDI, it acts in concert with W76 to progress beyond the residual inactivated state and enable CRAC channels to reach the full extent of inactivation.

3.
The microsporidia have recently been recognized as a group of pathogens that have potential for waterborne transmission; however, little is known about the effects of routine disinfection on microsporidian spore viability. In this study, in vitro growth of Encephalitozoon syn. Septata intestinalis, a microsporidium found in the human gut, was used as a model to assess the effect of chlorine on the infectivity and viability of microsporidian spores. Spore inoculum concentrations were determined by using spectrophotometric measurements (percent transmittance at 625 nm) and by traditional hemacytometer counting. To determine quantitative dose-response data for spore infectivity, we optimized a rabbit kidney cell culture system in 24-well plates, which facilitated calculation of a 50% tissue culture infective dose (TCID50) and a minimal infective dose (MID) for E. intestinalis. The TCID50 is a quantitative measure of infectivity and growth and is the number of organisms that must be present to infect 50% of the cell culture wells tested. The MID is a measure of a system's permissiveness to infection and of spore infectivity. A standardized MID and a standardized TCID50 have not been reported previously for any microsporidian species. Both types of doses are reported in this paper, and the values were used to evaluate the effects of chlorine disinfection on the in vitro growth of microsporidia. Spores were treated with chlorine at concentrations of 0, 1, 2, 5, and 10 mg/liter. The exposure times ranged from 0 to 80 min at 25°C and pH 7. MID data for E. intestinalis were compared before and after chlorine disinfection. A 3-log reduction (99.9% inhibition) in the E. intestinalis MID was observed at a chlorine concentration of 2 mg/liter after a minimum exposure time of 16 min. The log10 reduction results based on percent transmittance-derived spore counts were equivalent to the results based on hemacytometer-derived spore counts. Our data suggest that chlorine treatment may be an effective water treatment for E. intestinalis and that spectrophotometric methods may be substituted for labor-intensive hemacytometer methods when spores are counted in laboratory-based chlorine disinfection studies.
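A minimal sketch of the disinfection arithmetic used above (Python). Only the 2 mg/liter concentration and 16 min exposure are from the abstract; the spore counts are invented for illustration:

```python
import numpy as np

# Ct = disinfectant concentration x contact time; a 3-log10 reduction
# corresponds to 99.9% inhibition.
chlorine_mg_per_l = 2.0
contact_min = 16.0
ct = chlorine_mg_per_l * contact_min          # 32 mg*min/liter

n0, n = 1.0e6, 1.0e3                          # spore counts before/after (invented)
log_reduction = np.log10(n0 / n)              # 3.0 logs
pct_inhibition = 100.0 * (1.0 - n / n0)       # 99.9%
print(ct, log_reduction, pct_inhibition)
```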

4.
Shigellosis imposes a heavy disease burden in China, especially in children under 5 years of age. However, the age-related factors involved in the transmission of shigellosis are unclear. An age-specific Susceptible–Exposed–Infectious/Asymptomatic–Recovered (SEIAR) model was applied to shigellosis surveillance data maintained by the Hubei Province Centers for Disease Control and Prevention from 2005 to 2017. Individuals were divided into four age groups (≤ 5 years, 6–24 years, 25–59 years, and ≥ 60 years). The effective reproduction number (Reff), including infectivity (RI) and susceptibility (RS), was calculated to assess the transmissibility of the different age groups. From 2005 to 2017, 130,768 shigellosis cases were reported in Hubei Province. The SEIAR model fitted the reported data well (P < 0.001). Transmissibility (Reff) was highest from the ≤ 5 years group to the 25–59 years group (mean: 0.76, 95% confidence interval [CI]: 0.34–1.17), followed by the 6–24 years group to the 25–59 years group (mean: 0.69, 95% CI: 0.35–1.02), the ≥ 60 years group to the 25–59 years group (mean: 0.58, 95% CI: 0.29–0.86), and within the 25–59 years group (mean: 0.50, 95% CI: 0.21–0.78). Infectivity was highest in the ≤ 5 years group (RI = 1.71), whose infections were most commonly transmitted to the 25–59 years group (45.11%). Susceptibility was highest in the 25–59 years group (RS = 2.51), whose most common source of infection was the ≤ 5 years group (30.15%). Furthermore, “knock-out” simulations predicted that the greatest reduction in the number of cases would occur when cutting off the transmission routes within the ≤ 5 years group and from the 25–59 years group to the ≤ 5 years group. Transmission in the ≤ 5 years group occurred mainly within the group, but infections were most commonly introduced by individuals in the 25–59 years group. Infectivity was highest in the ≤ 5 years group and susceptibility was highest in the 25–59 years group; interventions to stop transmission should be directed at these age groups.
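For readers unfamiliar with the model class, here is a minimal single-group SEIAR sketch (Python). The study fitted a four-age-group version; all parameter values below are illustrative placeholders, not the fitted Hubei values:

```python
import numpy as np
from scipy.integrate import odeint

# Single-group SEIAR: S -> E -> (I or A) -> R. kappa discounts the
# infectiousness of asymptomatic carriers; p is the symptomatic fraction.
# All rates are illustrative assumptions.
def seiar(y, t, beta, kappa, p, omega, gamma, gamma_a):
    S, E, I, A, R = y
    N = S + E + I + A + R
    infection = beta * S * (I + kappa * A) / N
    dS = -infection
    dE = infection - omega * E
    dI = p * omega * E - gamma * I
    dA = (1 - p) * omega * E - gamma_a * A
    dR = gamma * I + gamma_a * A
    return [dS, dE, dI, dA, dR]

t = np.linspace(0, 365, 366)                  # one year, daily steps
y0 = [99_990, 0, 10, 0, 0]                    # 10 initial symptomatic cases
sol = odeint(seiar, y0, t, args=(0.4, 0.5, 0.7, 1/2.0, 1/7.0, 1/7.0))
print(sol[-1])                                # final compartment sizes
```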

5.
The endpoint dilution assay’s output, the 50% infectious dose (ID50), is calculated using the Reed-Muench or Spearman-Kärber mathematical approximations, which are biased and often miscalculated. We introduce a replacement for the ID50 that we call Specific INfection (SIN), along with a free and open-source web application, midSIN (https://midsin.physics.ryerson.ca), to calculate it. midSIN computes a virus sample’s SIN concentration using Bayesian inference based on the results of a standard endpoint dilution assay, and requires no changes to current experimental protocols. We analyzed influenza and respiratory syncytial virus samples using midSIN and demonstrated that the SIN/mL reliably corresponds to the number of infections a sample will cause per mL. It can therefore be used directly to achieve a desired multiplicity of infection, similarly to how plaque or focus forming units (PFU, FFU) are used. midSIN’s estimates are shown to be more accurate and robust than the Reed-Muench and Spearman-Kärber approximations. The impact of endpoint dilution plate design choices (dilution factor, replicates per dilution) on measurement accuracy is also explored. The simplicity of SIN as a measure and the greater accuracy provided by midSIN make them an easy and superior replacement for the TCID50 and other in vitro culture ID50 measures. We hope to see their universal adoption to measure the infectivity of virus samples.
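For contrast with midSIN’s Bayesian approach, a sketch of the classical Reed-Muench approximation it is intended to replace (Python; the dilution series and well counts are invented):

```python
import numpy as np

# Reed-Muench: accumulate infected wells up from the most dilute level and
# uninfected wells down from the least dilute, then interpolate to 50%.
log10_dil = np.array([-1, -2, -3, -4, -5])   # serial 10-fold dilutions
infected = np.array([8, 8, 5, 2, 0])         # infected wells per dilution (invented)
total = 8                                    # replicate wells per dilution

cum_inf = np.cumsum(infected[::-1])[::-1]    # cumulative infected (upward)
cum_uninf = np.cumsum(total - infected)      # cumulative uninfected (downward)
pct = cum_inf / (cum_inf + cum_uninf)

i = int(np.where(pct >= 0.5)[0].max())       # last dilution with >= 50% infected
prop_dist = (pct[i] - 0.5) / (pct[i] - pct[i + 1])
log10_id50 = log10_dil[i] - prop_dist        # one log10 per dilution step
print(f"ID50 at a 10^{log10_id50:.2f} dilution of the stock")
```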

6.
Background and Aims
Branch biomass and other attributes are important for estimating the carbon budget of forest stands and characterizing crown structure. As destructive measuring is time-consuming and labour-intensive, terrestrial laser scanning (TLS) has been used as a solution to estimate branch biomass quickly and non-destructively. However, branch information extraction from TLS data alone is challenging due to occlusion and other defects, especially for estimating individual branch attributes in coniferous trees.

Methods
This study presents a method, entitled TSMtls, to estimate individual branch biomass non-destructively and accurately by combining tree structure models and TLS data. The TSMtls method constructs the stem-taper curve from TLS data, then uses tree structure models to determine the number, basal area and biomass of individual branches at whorl level. We estimated the tree structural model parameters from 122 destructively measured Scots pine (Pinus sylvestris) trees and tested the method on six Scots pine trees that were first TLS-scanned and later destructively measured. Additionally, we estimated the branch biomass using other TLS-based approaches for comparison.

Key Results
Tree-level branch biomass estimates derived from TSMtls showed the best agreement with the destructive measurements [coefficient of variation of root mean square error (CV-RMSE) = 9.66% and concordance correlation coefficient (CCC) = 0.99], outperforming the other TLS-based approaches (CV-RMSE 12.97–57.45% and CCC 0.43–0.98). Whorl-level individual branch attribute estimates produced from TSMtls were more accurate than those produced from TLS data directly.

Conclusions
The results showed that the TSMtls method proposed in this study holds promise for extension to more species and larger areas.
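The two agreement metrics reported above can be computed as follows (Python sketch; the observed and predicted biomass values are invented, not the Scots pine data):

```python
import numpy as np

def cv_rmse(obs, pred):
    """Coefficient of variation of RMSE, as a percentage of the observed mean."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    rmse = np.sqrt(np.mean((obs - pred) ** 2))
    return 100.0 * rmse / obs.mean()

def ccc(obs, pred):
    """Lin's concordance correlation coefficient."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    mx, my = obs.mean(), pred.mean()
    vx, vy = obs.var(), pred.var()
    cov = ((obs - mx) * (pred - my)).mean()
    return 2 * cov / (vx + vy + (mx - my) ** 2)

obs = [12.1, 8.4, 15.2, 9.9, 11.3, 14.0]    # measured branch biomass (invented)
pred = [11.5, 9.0, 14.1, 10.6, 10.9, 13.2]  # model estimates (invented)
print(f"CV-RMSE = {cv_rmse(obs, pred):.2f}%, CCC = {ccc(obs, pred):.2f}")
```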

7.

Background

In sub-Saharan Africa, a large proportion of HIV-positive patients on antiretroviral therapy (ART) are lost to follow-up, some of whom have died. The objective of this study was to validate methods used to correct mortality estimates for loss to follow-up using a cohort with complete death ascertainment.

Methods

Routinely collected data from HIV patients initiating first-line antiretroviral therapy (ART) at the Infectious Diseases Institute (IDI) (Routine Cohort) were used. Three methods were used to estimate mortality after ART initiation: 1) standard Kaplan-Meier estimation (uncorrected method), which uses passively observed data; 2) the double-sampling method of Frangakis and Rubin (F&R), in which deaths obtained from patient tracing studies are given a higher weight than those passively ascertained; 3) the nomogram method proposed by Egger et al. Corrected mortality estimates in the Routine Cohort were compared with the estimates from the IDI research observational cohort (Research Cohort), which was used as the “gold standard”.

Results

We included 5,633 patients from the Routine Cohort and 559 from the Research Cohort. Uncorrected mortality estimates (95% confidence interval [CI]) in the Routine Cohort at 1, 2 and 3 years were 5.5% (4.9%–6.3%), 6.6% (5.9%–7.5%) and 7.4% (6.5%–8.5%), respectively. The F&R corrected estimates at 1, 2 and 3 years were 11.2% (5.8%–21.2%), 15.8% (9.9%–24.8%) and 18.5% (12.3%–27.2%), respectively. The estimates obtained from the Research Cohort were 15.6% (12.8%–18.9%), 17.5% (14.6%–21.0%) and 19.0% (15.3%–21.9%) at 1, 2 and 3 years, respectively. Using the nomogram method in the Routine Cohort, the corrected programme-level mortality estimate in year 1 was 11.9% (8.0%–15.7%).
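A minimal sketch of method 1, the uncorrected Kaplan-Meier estimator (Python; the follow-up data are invented). It treats losses to follow-up as non-informative censoring, which is exactly the assumption the F&R and nomogram corrections relax:

```python
import numpy as np

def kaplan_meier(months, died):
    """months: follow-up time; died: 1 = death observed, 0 = censored."""
    order = np.argsort(months)
    months, died = np.asarray(months)[order], np.asarray(died)[order]
    n_at_risk, s, curve = len(months), 1.0, []
    for t, d in zip(months, died):
        if d:
            s *= (n_at_risk - 1) / n_at_risk   # survival drops at death times only
        n_at_risk -= 1                         # censored subjects leave the risk set
        curve.append((t, s))
    return curve

# Invented illustration data: times in months, 1 = death, 0 = lost/censored.
for t, s in kaplan_meier([2, 5, 9, 12, 14, 20, 30], [1, 0, 1, 0, 1, 0, 0]):
    print(f"month {t:>2}: survival {s:.3f}, cumulative mortality {1 - s:.3f}")
```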

Conclusion

Mortality adjustments provided by the F&R and nomogram methods are adequate and should be employed to correct mortality for loss-to-follow-up in large HIV care centres in Sub-Saharan Africa.

8.
Summary  The effective diffusion coefficient of oxygen, De, was determined in different gel support materials (calcium alginate, κ-carrageenan, gellan gum, agar and agarose) which are generally used for immobilization of cells. The method used was based upon fitting Crank's model to the experimental data. The model describes solute diffusion from a well-stirred solution into gel beads that are initially free of solute. The effect of gel concentration on the De of oxygen in the gel was investigated. The results showed a decreasing De for both agar and agarose with increasing gel concentration. In the case of calcium alginate and gellan gum, a maximum in De at an intermediate gel concentration was observed. It is hypothesized that this phenomenon is due to a changing gel pore structure at increasing gel concentrations. The De of oxygen in calcium alginate, κ-carrageenan and gellan gum varied from 1.5×10⁻⁹ to 2.1×10⁻⁹ m² s⁻¹ over the gel concentration range of 0.5 to 5% (w/v).
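A sketch of the series solution underlying this kind of fit (Python). The study fitted Crank's finite-volume model for uptake from a well-stirred limited bath; shown here is the simpler constant-surface-concentration form for a sphere, with an assumed bead radius:

```python
import numpy as np

# Fractional solute uptake by a sphere of radius a, Crank's series for a
# constant surface concentration: Mt/Minf = 1 - (6/pi^2) * sum exp(-n^2 pi^2 De t / a^2)/n^2
def fractional_uptake(t, De, radius, n_terms=200):
    n = np.arange(1, n_terms + 1)
    terms = np.exp(-(n**2) * np.pi**2 * De * t / radius**2) / n**2
    return 1.0 - (6.0 / np.pi**2) * np.sum(terms)

De = 2.0e-9          # m^2/s, within the range reported above
a = 1.0e-3           # 1 mm bead radius (assumed, for illustration)
for t in (30, 120, 600):                      # seconds
    print(t, round(fractional_uptake(t, De, a), 3))
```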

9.
Background
Great progress has been made toward the elimination of trachoma as a public-health problem. Mathematical and statistical models have been used to forecast when the program will attain the goal of the elimination of active trachoma, defined as prevalence of trachomatous inflammation—follicular in 1–9 year olds (TF1–9) <5%. Here we use program data to create an empirical model predicting the year of attaining global elimination of TF1–9.

Methodology/Principal findings
We calculated the mean number of years (95% CI) observed for an implementation unit (IU) to move from a baseline TF1–9 prevalence ≥5% to the elimination threshold, based on the region (Ethiopia vs. non-Ethiopia) and baseline prevalence category. Ethiopia IUs had significantly different rates of reaching the TF1–9 elimination threshold after a trachoma impact survey (TIS) compared to non-Ethiopia IUs across all baseline categories. We used those estimates to predict when remaining active trachoma-endemic IUs (TF1–9 ≥5%) would have their last round of mass drug administration (MDA), based on the mean number of years required and the number of MDA rounds already completed. Our model predicts that elimination of TF1–9 will be achieved in 2028 in Ethiopia (95% CI: 2026–2033) and 2029 outside of Ethiopia (95% CI: 2023–2034), with some IUs in East Africa predicted to be the last requiring MDA globally.

Conclusions/Significance
Our empirical estimate is similar to those resulting from previous susceptible-infectious-susceptible (SIS) and mathematical models, suggesting that the forecast achievement of TF1–9 elimination is realistic, with the caveat that although disease elimination progress can be predicted for most IUs, there is an important minority of IUs that is not declining or has not yet started trachoma elimination activities. These IUs represent an important barrier to the timely global elimination of active trachoma.

10.
Climate feedbacks from soils can result from environmental change followed by the response of plant and microbial communities, and/or associated changes in nutrient cycling. Explicit consideration of microbial life-history traits and functions may be necessary to predict climate feedbacks owing to changes in the physiology and community composition of microbes and their associated effect on carbon cycling. Here we developed the microbial enzyme-mediated decomposition (MEND) model by incorporating microbial dormancy and the ability to track multiple isotopes of carbon. We tested two versions of MEND, that is, MEND with dormancy (MEND) and MEND without dormancy (MEND_wod), against long-term (270 days) carbon decomposition data from laboratory incubations of four soils with isotopically labeled substrates. MEND_wod adequately fitted multiple observations (total C–CO2 and 14C–CO2 respiration, and dissolved organic carbon), but at the cost of significantly underestimating the total microbial biomass. MEND improved estimates of microbial biomass by 20–71% over MEND_wod. We also quantified uncertainties in parameters and model simulations using the Critical Objective Function Index method, which is based on a global stochastic optimization algorithm, as well as on model complexity and observational data availability. Together, our model extrapolations of the incubation study show that long-term soil incubations with experimental data for multiple carbon pools are conducive to estimating both decomposition and microbial parameters. These efforts should provide essential support to future field- and global-scale simulations, and enable more confident predictions of feedbacks between environmental change and carbon cycling.

11.

Background

Herpes simplex virus type 2 (HSV-2) infection causes significant disease globally. Adolescent and adult infection may present as painful genital ulcers. Neonatal infection has high morbidity and mortality. Additionally, HSV-2 likely contributes substantially to the spread of HIV infection. The global burden of HSV-2 infection was last estimated for 2003. Here we present new global estimates for 2012 of the burden of prevalent (existing) and incident (new) HSV-2 infection among females and males aged 15–49 years, using updated methodology to adjust for test performance and estimate by World Health Organization (WHO) region.

Methods and Findings

We conducted a literature review of HSV-2 prevalence studies worldwide since 2000. We then fitted a model with constant HSV-2 incidence by age to pooled HSV-2 prevalence values by age and sex. Prevalence values were adjusted for test sensitivity and specificity. The model estimated prevalence and incidence by sex for each WHO region to obtain global burden estimates. Uncertainty bounds were computed by refitting the model to reflect the variation in the underlying prevalence data. In 2012, we estimate that there were 417 million people aged 15–49 years (range: 274–678 million) living with HSV-2 infection worldwide (11.3% global prevalence), of whom 267 million were women. We also estimate that in 2012, 19.2 million (range: 13.0–28.6 million) individuals aged 15–49 years were newly infected (0.5% of all individuals globally). The highest burden was in Africa. However, despite lower prevalence, the South-East Asia and Western Pacific regions also contributed large numbers to the global totals because of their large population sizes.
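A sketch of the constant-incidence (catalytic) model described above (Python). The age-specific prevalence values are invented, not the pooled data, and no adjustment for test sensitivity or specificity is included:

```python
import numpy as np
from scipy.optimize import curve_fit

# With a constant force of infection lam (per person-year), the expected
# prevalence after a years of exposure (from age 15) is 1 - exp(-lam * a).
def prevalence(years_since_15, lam):
    return 1.0 - np.exp(-lam * years_since_15)

ages = np.array([2.5, 7.5, 12.5, 17.5, 22.5, 27.5, 32.5])   # midpoints past 15
prev = np.array([0.02, 0.05, 0.08, 0.10, 0.12, 0.13, 0.15]) # invented values
(lam,), _ = curve_fit(prevalence, ages, prev, p0=[0.01])
print(f"fitted incidence ~ {lam:.3f} per person-year")
```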

Conclusions

The global burden of HSV-2 infection is large, leaving over 400 million people at increased risk of genital ulcer disease, HIV acquisition, and transmission of HSV-2 to partners or neonates. These estimates highlight the critical need for development of vaccines, microbicides, and other new HSV prevention strategies.

12.
Cryptosporidium parvum, which is resistant to chlorine concentrations typically used in water treatment, is recognized as a significant waterborne pathogen. Recent studies have demonstrated that chlorine dioxide is a more efficient disinfectant than free chlorine against Cryptosporidium oocysts. It is not known, however, if oocysts from different suppliers are equally sensitive to chlorine dioxide. This study used both a most-probable-number–cell culture infectivity assay and in vitro excystation to evaluate chlorine dioxide inactivation kinetics in laboratory water at pH 8 and 21°C. The two viability methods produced significantly different results (P < 0.05). Products of disinfectant concentration and contact time (Ct values) of 1,000 mg · min/liter were needed to inactivate approximately 0.5 log10 and 2.0 log10 units (99% inactivation) of C. parvum as measured by in vitro excystation and cell infectivity, respectively, suggesting that excystation is not an adequate viability assay. Purified oocysts originating from three different suppliers were evaluated and showed marked differences with respect to their resistance to inactivation when using chlorine dioxide. Ct values of 75, 550, and 1,000 mg · min/liter were required to achieve approximately 2.0 log10 units of inactivation with oocysts from different sources. Finally, the study compared the relationship between easily measured indicators, including Bacillus subtilis (aerobic) spores and Clostridium sporogenes (anaerobic) spores, and C. parvum oocysts. The bacterial spores were found to be more sensitive to chlorine dioxide than C. parvum oocysts and therefore could not be used as direct indicators of C. parvum inactivation for this disinfectant. In conclusion, it is suggested that future studies address issues such as oocyst purification protocols and the genetic diversity of C. parvum, since these factors might affect oocyst disinfection sensitivity.

13.
Measuring leaf area index (LAI) is essential for evaluating crop growth and estimating yield, thereby facilitating high-throughput phenotyping of maize (Zea mays). LAI estimation models use multi-source data from unmanned aerial vehicles (UAVs), but using multimodal data to estimate maize LAI, and the effect of tassels and soil background, remain understudied. Our research aims to (1) determine how multimodal data contribute to LAI estimation and propose a framework for estimating LAI based on remote-sensing data, (2) evaluate the robustness and adaptability of an LAI estimation model that uses multimodal data fusion and deep neural networks (DNNs) in single and whole growth stages, and (3) explore how soil background and maize tasseling affect LAI estimation. To construct multimodal datasets, our UAV collected red–green–blue, multispectral, and thermal infrared images. We then developed partial least squares regression (PLSR), support vector regression, and random forest regression models to estimate LAI. We also developed a deep learning model with three hidden layers. Models built on this multimodal data accurately estimated maize LAI. The DNN model provided the best estimate (coefficient of determination [R2] = 0.89, relative root mean square error [rRMSE] = 12.92%) for a single growth period, and the PLSR model provided the best estimate (R2 = 0.70, rRMSE = 12.78%) for a whole growth period. Tassels reduced the accuracy of LAI estimation, but the soil background provided additional image feature information, improving accuracy. These results indicate that multimodal data fusion using low-cost UAVs and DNNs can accurately and reliably estimate LAI for crops, which is valuable for high-throughput phenotyping and high-spatial precision farmland management.

Multimodal data fusion (red–green–blue, multispectral, and thermal infrared) using low-cost unmanned aerial vehicles in a deep neural network and machine learning framework estimates maize leaf area index.
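As a minimal illustration of the PLSR baseline named above (Python/scikit-learn): random features stand in for the fused RGB, multispectral, and thermal predictors, so the metrics are not comparable to the study's:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

# Synthetic stand-in data: 30 spectral/thermal features, LAI driven by a
# few of them plus noise. Feature counts and noise level are assumptions.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))
lai = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, lai, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()

# Report the same two metrics the abstract uses: R2 and rRMSE (% of mean).
rrmse = np.sqrt(mean_squared_error(y_te, pred)) / y_te.mean() * 100
print(f"R2 = {r2_score(y_te, pred):.2f}, rRMSE = {rrmse:.1f}%")
```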

14.
Background
Reliable and field-applicable diagnosis of schistosome infections in non-human animals is important for surveillance, control, and verification of interruption of human schistosomiasis transmission. This study aimed to summarize uses of available diagnostic techniques through a systematic review and meta-analysis.

Methodology and principal findings
We systematically searched the literature and reports comparing two or more diagnostic tests in non-human animals for schistosome infection. Out of 4,909 articles and reports screened, 19 met our inclusion criteria, four of which were considered in the meta-analysis. A total of 14 techniques (parasitologic, immunologic, and molecular) and nine types of non-human animals were involved in the studies. Notably, four studies compared parasitologic tests (miracidium hatching test (MHT), Kato-Katz (KK), the Danish Bilharziasis Laboratory technique (DBL), and formalin-ethyl acetate sedimentation-digestion (FEA-SD)) with quantitative polymerase chain reaction (qPCR), and sensitivity estimates (using qPCR as the reference) were extracted and included in the meta-analyses, showing significant heterogeneity across studies and animal hosts. The pooled estimate of sensitivity was 0.21 (95% confidence interval (CI): 0.03–0.48), with FEA-SD showing the highest sensitivity (0.89, 95% CI: 0.65–1.00).

Conclusions/Significance
Our findings suggest that the parasitologic technique FEA-SD and the molecular technique qPCR are the most promising techniques for schistosome diagnosis in non-human animal hosts. Future studies are needed for validation and standardization of the techniques for real-world field applications.

15.
Accurate estimates of global carbon emissions are critical for understanding global warming. This paper estimates net carbon emissions from land use change in Bolivia during the periods 1990–2000 and 2000–2010 using a model that takes into account deforestation, forest degradation, forest regrowth, gradual carbon decomposition and accumulation, as well as heterogeneity in both above ground and below ground carbon contents at the 10 by 10 km grid level. The approach permits detailed maps of net emissions by region and type of land cover. We estimate that net CO2 emissions from land use change in Bolivia increased from about 65 million tons per year during 1990–2000 to about 93 million tons per year during 2000–2010, while CO2 emissions per capita and per unit of GDP have remained fairly stable over the sample period. If we allow for estimated biomass increases in mature forests, net CO2 emissions drop to close to zero. Finally, we find these results are robust to alternative methods of calculating emissions.

16.

Background

In the face of an influenza pandemic, accurate estimates of epidemiologic parameters are required to help guide decision-making. We sought to estimate epidemiologic parameters for pandemic H1N1 influenza using data from initial reports of laboratory-confirmed cases.

Methods

We obtained data on laboratory-confirmed cases of pandemic H1N1 influenza reported in the province of Ontario, Canada, with dates of symptom onset between Apr. 13 and June 20, 2009. Incubation periods and duration of symptoms were estimated and fit to parametric distributions. We used competing-risk models to estimate risk of hospital admission and case-fatality rates. We used a Markov Chain Monte Carlo model to simulate disease transmission.

Results

The median incubation period was 4 days and the duration of symptoms was 7 days. Recovery was faster among patients less than 18 years old than among older patients (hazard ratio 1.23, 95% confidence interval 1.06–1.44). The risk of hospital admission was 4.5% (95% CI 3.8%–5.2%) and the case-fatality rate was 0.3% (95% CI 0.1%–0.5%). The risk of hospital admission was highest among patients less than 1 year old and those 65 years or older. Adults more than 50 years old comprised 7% of cases but accounted for 7 of 10 initial deaths (odds ratio 28.6, 95% confidence interval 7.3–111.2). From the simulation models, we estimated the following values (and 95% credible intervals): a mean basic reproductive number (R0, the number of new cases created by a single primary case in a susceptible population) of 1.31 (1.25–1.38), a mean latent period of 2.62 (2.28–3.12) days and a mean duration of infectiousness of 3.38 (2.06–4.69) days. From these values we estimated a serial interval (the average time from onset of infectiousness in a case to the onset of infectiousness in a person infected by that case) of 4–5 days.
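A back-of-envelope check of the reported serial interval (Python). The approximation of the serial interval as the latent period plus half the infectious period is our assumption for an SEIR-type model, not a formula stated in the paper:

```python
# Posterior means reported above.
latent = 2.62        # days
infectious = 3.38    # days

# Common SEIR-style approximation (assumption): SI ~ latent + infectious/2.
serial_interval = latent + infectious / 2
print(round(serial_interval, 1))   # ~4.3 days, within the reported 4-5 days
```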

Interpretation

The low estimates for R0 indicate that effective mitigation strategies may reduce the final epidemic impact of pandemic H1N1 influenza.

The emergence and global spread of pandemic H1N1 influenza led the World Health Organization to declare a pandemic on June 11, 2009. As the pandemic spreads, countries will need to make decisions about strategies to mitigate and control disease in the face of uncertainty.

For novel infectious diseases, accurate estimates of epidemiologic parameters can help guide decision-making. A key parameter for any new disease is the basic reproductive number (R0), defined as the average number of new cases created by a single primary case in a susceptible population. R0 affects the growth rate of an epidemic and the final number of infected people. It also informs the optimal choice of control strategies. Other key parameters that affect use of resources, disease burden and societal costs during a pandemic are duration of illness, rate of hospital admission and case-fatality rate. Early in an epidemic, the case-fatality rate may be underestimated because of the temporal lag between onset of infection and death; the delay between initial identification of a new case and death may lead to an apparent increase in deaths several weeks into an epidemic that is an artifact of the natural history of the disease.

We used data from initial reports of laboratory-confirmed pandemic H1N1 influenza to estimate epidemiologic parameters for pandemic H1N1 influenza. The parameters included R0, incubation period and duration of illness. We also estimated risk of hospital admission and case-fatality rates, which can be used to estimate the burden of illness likely to be associated with this disease.

17.
It is now clearly established that the transfusion of blood from variant CJD (v-CJD) infected individuals can transmit the disease. Since the number of asymptomatic infected donors remains unresolved, inter-individual v-CJD transmission through blood and blood-derived products is a major public health concern. Current risk assessments for transmission of v-CJD by blood and blood-derived products by transfusion rely on infectious titers measured in rodent models of Transmissible Spongiform Encephalopathies (TSE) using intra-cerebral (IC) inoculation of blood components. To address the biological relevance of this approach, we compared the efficiency of TSE transmission by blood and blood components when administered either through transfusion in sheep or by IC inoculation in transgenic mice (tg338) over-expressing ovine PrP. Transfusion of 200 µL of blood from asymptomatic infected donor sheep transmitted prion disease with 100% efficiency, thereby displaying greater virulence than the transfusion of 200 mL of normal blood spiked with brain homogenate material containing 10^3 ID50 as measured by intracerebral inoculation of tg338 mice (ID50 IC in tg338). This was consistent with a whole blood titer greater than 10^3.6 ID50 IC in tg338 per mL. However, when the same blood samples were assayed by IC inoculation into tg338 mice, the infectious titers were less than 32 ID per mL. Whereas the transfusion of crude plasma to sheep transmitted the disease with limited efficacy, white blood cells (WBC) displayed a similar ability to whole blood to infect recipients. Strikingly, fixation of WBC with paraformaldehyde did not affect the infectivity titer as measured in tg338 mice but dramatically impaired disease transmission by transfusion in sheep. These results demonstrate that TSE transmission by blood transfusion can be highly efficient and that this efficiency depends more on the viability of transfused cells than on the level of infectivity measured by IC inoculation.

18.
The effective population size (Ne) governs the loss of genetic diversity and the rate of inbreeding, and its accurate estimation is crucial for the monitoring of small populations. Here, we integrate temporal studies of the gecko Oedura reticulata to compare genetic and demographic estimators of Ne. Because geckos have overlapping generations, our goal was to demographically estimate NbI, the inbreeding effective number of breeders, and to calculate the NbI/Na ratio (Na = number of adults) for four populations. Demographically estimated NbI ranged from 1 to 65 individuals. The ratio of the effective number of breeders to census size (NbI/Na) ranged from 0.1 to 1.1. We identified the variance in reproductive success as the most important variable contributing to the reduction of this ratio. We used four methods to estimate, from the genotype data, the genetically based inbreeding effective number of breeders (NbI(gen)) and the variance effective population size (NeV(gen)). Two of these methods, a temporal moment-based approach (MBT) and a likelihood-based approach (TM3), require at least two samples in time, while the other two are single-sample estimators: the linkage disequilibrium method with bias correction (LDNe) and the program ONeSAMP. The genetically based estimates were fairly similar across methods and also similar to the demographic estimates, excluding those estimates whose upper confidence interval boundaries were uninformative. For example, LDNe and ONeSAMP estimates ranged from 14–55 and 24–48 individuals, respectively. However, the temporal methods suffered from large variation in confidence intervals and concerns about the prior information. We conclude that the single-sample estimators are an acceptable short-cut to estimate NbI for species such as geckos and will be of great importance for the monitoring of species in fragmented landscapes.
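A sketch of a moment-based temporal estimator in the spirit of the MBT method named above (Python). This uses a Waples-type sampling correction with invented allele frequencies and sample sizes; published estimators differ in detail, so treat this as illustrative only:

```python
import numpy as np

def temporal_ne(x, y, t_gen, s0, st):
    """x, y: allele frequencies at generations 0 and t (one value per locus);
    s0, st: numbers of diploid individuals sampled at each time point."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    z = (x + y) / 2.0
    fc = np.mean((x - y) ** 2 / (z - x * y))            # standardized variance
    # Subtract the expected sampling contribution, then invert E[Fc] ~ t/(2Ne).
    return t_gen / (2.0 * (fc - 1/(2*s0) - 1/(2*st)))

x = np.array([0.45, 0.30, 0.62, 0.15, 0.55])            # invented frequencies
y = np.array([0.50, 0.22, 0.70, 0.12, 0.48])
print(round(temporal_ne(x, y, t_gen=4, s0=100, st=100), 1))   # ~206
```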

19.
Scrapie of sheep and chronic wasting disease (CWD) of cervids are transmissible prion diseases. Milk and placenta have been identified as sources of scrapie prions but do not explain horizontal transmission. In contrast, CWD prions have been reported in saliva, urine and feces, which are thought to be responsible for horizontal transmission. While the titers of CWD prions have been measured in feces, levels in saliva or urine are unknown. Because sheep produce ∼17 L/day of saliva and scrapie prions are present in the tongue and salivary glands of infected sheep, we asked if scrapie prions are shed in saliva. We inoculated transgenic (Tg) mice expressing ovine prion protein, Tg(OvPrP) mice, with saliva from seven Cheviot sheep with scrapie. Six of seven samples transmitted prions to Tg(OvPrP) mice, with titers of −0.5 to 1.7 log ID50 U/ml. Similarly, inoculation of saliva samples from two mule deer with CWD transmitted prions to Tg(ElkPrP) mice, with titers of −1.1 to −0.4 log ID50 U/ml. Assuming similar shedding kinetics for salivary prions as those for fecal prions of deer, we estimated the secreted salivary prion dose over a 10-mo period to be as high as 8.4 log ID50 units for sheep and 7.0 log ID50 units for deer. These estimates are similar to the 7.9 log ID50 units of fecal CWD prions for deer. Because saliva is mostly swallowed, salivary prions may reinfect tissues of the gastrointestinal tract and contribute to fecal prion shedding. Salivary prions shed into the environment provide an additional mechanism for horizontal prion transmission.

Key words: scrapie, chronic wasting disease, saliva, horizontal transmission, titers
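The abstract's secreted-dose arithmetic can be reproduced directly (Python; the 10-month period is converted to days with an assumed 30.4 days/month):

```python
import numpy as np

# Total shed dose in log ID50 units:
# titer (log ID50 U/ml) + log10(ml per day) + log10(number of days).
titer_log = 1.7                    # upper salivary titer for sheep, log ID50 U/ml
ml_per_day = 17_000                # ~17 L/day of saliva
days = 10 * 30.4                   # 10-month shedding period (assumed conversion)

total_log_id50 = titer_log + np.log10(ml_per_day) + np.log10(days)
print(round(total_log_id50, 1))    # ~8.4, matching the reported value
```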

20.
Knowledge of survival rates of Neotropical landbirds remains limited, with estimates of apparent survival available from relatively few sites and species. Previously, capture-mark-recapture models were used to estimate apparent survival of 31 species (30 passerines, 1 Trochilidae) from eastern Ecuador based on data collected from 2001 to 2006. Here, estimates are updated with data from 2001–2012 to determine how additional years of data affect estimates; estimates for six additional species are provided. Models assuming constant survival had the highest support for 19 of 31 species when based on 12 years of data, compared to 27 when based on six; models incorporating effects of transients had the highest support for 12 of 31 species compared to four, when based on 12 and six years, respectively. Average apparent survival based on the most highly supported model (based on model averaging, when appropriate) was 0.59 (± 0.02 SE) across 30 species of passerines when based on 12 years and 0.57 (± 0.02) when based on six. Standard errors of survival estimates based on 12 years were approximately half those based on six years. Of 31 species in both data sets, estimates of apparent survival were somewhat lower for 13, somewhat higher for 17, and unchanged for one; confidence intervals for estimates based on six and 12 years of data overlapped for all species. Results indicate that estimates of apparent survival are comparable but more precise when based on longer-term data sets; the standard error of the estimates was negatively correlated with the numbers of captures (rs = −0.72) and recaptures (rs = −0.93; P < 0.001 in both cases). Thus, reasonable estimates of apparent survival may be obtained with relatively few years of data if sample sizes are sufficient.
