Similar articles
 20 similar articles found (search time: 15 ms)
1.
By a suitable transformation of the pairs of observations obtained in the successive periods of the trial, bioequivalence assessment in a standard comparative bioavailability study reduces to testing for equivalence of two continuous distributions from which unrelated samples are available. Let the two distribution functions be given by F(x) = P[X ≤ x], G(y) = P[Y ≤ y], with (X, Y) denoting an independent pair of real-valued random variables. An intuitively appealing way of putting the notion of equivalence of F and G into nonparametric terms can be based on the distance of the functional P[X > Y] from the value it takes if F and G coincide. This leads to the problem of testing the null hypothesis H0 : P[X > Y] ≤ 1/2 − ε1 or P[X > Y] ≥ 1/2 + ε2 versus H1 : 1/2 − ε1 < P[X > Y] < 1/2 + ε2, with sufficiently small ε1, ε2 ∈ (0, 1/2). The testing procedure we derive for (H0, H1), and propose to term the Mann-Whitney test for equivalence, consists of carrying out, in terms of the U-statistic estimator of P[X > Y], the uniformly most powerful level-α test for an interval hypothesis about the mean of a Gaussian distribution with fixed variance. The test is shown to be asymptotically distribution-free with respect to the significance level. In addition, results of an extensive simulation study are presented which suggest that the new test controls the level even with sample sizes as small as 10. For normally distributed data, the loss in power as against the optimal parametric procedure is found to be almost as small as in comparisons between the Mann-Whitney and the t-statistic in the conventional one- or two-sided setting, provided the power of the parametric test does not fall short of 80%.
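The core of the procedure, the U-statistic estimator of P[X > Y] combined with an equivalence decision, can be sketched as follows. This is a simplified large-sample, TOST-style approximation with illustrative margins, not the paper's exact uniformly-most-powerful critical-value construction:

```python
import numpy as np
from statistics import NormalDist

def mw_equivalence(x, y, eps1=0.1, eps2=0.1, alpha=0.05):
    """Equivalence test based on the U-statistic estimate of
    P[X > Y]. Simplified large-sample (TOST-style) sketch,
    not the paper's exact critical-value construction."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    m, n = len(x), len(y)
    gt = x[:, None] > y[None, :]
    pi_hat = gt.mean()                      # U / (m*n)
    # large-sample variance estimate based on placements
    var = (np.var(gt.mean(axis=1), ddof=1) / m
           + np.var(gt.mean(axis=0), ddof=1) / n)
    se = np.sqrt(max(var, 1e-12))
    crit = NormalDist().inv_cdf(1 - alpha)
    # declare equivalence only if pi_hat sits significantly
    # inside both margins around 1/2
    equivalent = bool((pi_hat - (0.5 - eps1)) / se > crit
                      and ((0.5 + eps2) - pi_hat) / se > crit)
    return pi_hat, equivalent
```

With two samples drawn from the same continuous distribution, pi_hat should sit near 1/2 and equivalence should be declared once the samples are reasonably large.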

2.
A novel approach to population-level assessment was applied in order to demonstrate its utility in estimating and managing the risk of zinc in a water environment. Much attention has been paid to population-level risk assessment, but there have been no attempts to determine a "safe" population-level concentration as an environmental criterion. Based on the published results of toxicity tests for various species, we first theoretically derived a threshold concentration at which a population size is unchanged by the adverse effects of zinc exposure. To derive a zinc concentration that will protect populations in natural environments, we adopted the concept of the species sensitivity distribution. Assuming the threshold concentrations of a set of species are log-normally distributed, we calculated the 95% protection level of zinc (PHC5: the population-level hazardous concentration for 5% of species), which is 107 μg/L. Meanwhile, the 95% protection criterion (HC5), based on conventional individual-level chronic toxicity, was calculated to be 14.6 μg/L. The environmentally "safe" concentration for a population-level endpoint is thus about 7 times greater than that for an individual-level endpoint. The proposed method provides guidance for a pragmatic approach to population-level ecological risk assessment and the management of chemicals.
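Under the log-normal assumption, the HC5 (or PHC5) is simply the 5th percentile of the fitted distribution. A minimal sketch with hypothetical threshold data; regulatory practice often adds small-sample extrapolation factors that this plug-in estimate omits:

```python
import math
from statistics import NormalDist, mean, stdev

def hc5(thresholds):
    """5th percentile of a log-normal species sensitivity
    distribution: the concentration expected to protect 95% of
    species. Plain plug-in estimate from species-level threshold
    concentrations (same formula whether the inputs are
    individual-level endpoints, giving HC5, or population-level
    thresholds, giving a PHC5)."""
    logs = [math.log(c) for c in thresholds]
    z05 = NormalDist().inv_cdf(0.05)        # about -1.645
    return math.exp(mean(logs) + z05 * stdev(logs))
```

Because the fit is done on logarithms, rescaling every threshold by a constant factor rescales the HC5 by the same factor.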

3.
The current study focuses on identification and prioritization of the most important risks affecting a gas power plant located in southern Iran that was selected as a case study. After identifying risky activities, plant operations, and natural disasters, a Delphi questionnaire was prepared to specify crisis- and accident-prone centers that could lead to the plant's destruction. After analyzing the questionnaires, the final criteria were determined. Subsequently, multi-criteria decision-making methods, including the technique for order preference by similarity to ideal solution (TOPSIS) and the analytical hierarchy process (AHP), were applied to prioritize the identified criteria. The relative weights of the criteria were calculated using the eigenvector method in MATLAB and EXPERT CHOICE. In some cases there was no correlation between the obtained results, so a novel integrated technique consisting of three methods (average, Borda, and Copeland) was used to reach a consensus for prioritizing the criteria. The risk assessment results indicate that gas and oil pipes, dust storms, and terrorism have the first to third priorities among the risks. Some strategies are proposed to control and mitigate the identified risks.
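Of the three consensus rules, the Borda count is the simplest to sketch. The rankings and risk names below are illustrative, not the study's actual data:

```python
def borda_consensus(rankings):
    """Borda-count aggregation: each method's ranking awards
    n-1 points to its first item, n-2 to the second, and so on;
    items are then re-ranked by total points. Sketch of one of
    the three consensus rules (average, Borda, Copeland)."""
    items = list(rankings[0])
    n = len(items)
    score = {it: 0 for it in items}
    for ranking in rankings:
        for pos, it in enumerate(ranking):
            score[it] += n - 1 - pos
    return sorted(items, key=lambda it: -score[it])
```

When two methods (say, TOPSIS and AHP) disagree on mid-ranked criteria but agree on the extremes, the consensus preserves the agreed-upon top and bottom items.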

4.
5.
6.
This paper proposes a statistical generalized species-area model (GSAM) to represent various patterns of the species-area relationship (SAR), one of the fundamental patterns in ecology. The approach generalizes many earlier models, such as the power-curve model, which is commonly used to describe the SAR mathematically. The GSAM is applied to a simulated data set of species diversity in areas of different sizes, and real-world data on insects of the order Hymenoptera are also modeled. We show that the GSAM enables identification of the best statistical model and estimates the number of species according to area.
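The power-curve special case, S = c·A^z, can be fitted by ordinary least squares in log-log space. A minimal sketch of that special case (the GSAM itself generalizes this form):

```python
import math

def fit_power_sar(areas, species):
    """Least-squares fit of the power-curve SAR, S = c * A**z,
    in log-log space. The power curve is one of the classical
    special cases the generalized model contains."""
    xs = [math.log(a) for a in areas]
    ys = [math.log(s) for s in species]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    z = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    c = math.exp(ybar - z * xbar)
    return c, z
```

On data generated exactly from a power curve, the fit recovers the parameters up to floating-point error.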

7.
The vasculature of body tissues is continuously subject to remodeling processes originating at the micro-vascular level. The formation of new blood vessels (angiogenesis) is essential for a number of physiological and pathophysiological processes such as tissue regeneration, tumor development and the integration of artificial tissues. There are currently no time-lapsed in vivo imaging techniques providing information on the vascular network at the capillary level in a non-destructive, three-dimensional and high-resolution fashion. This paper presents a novel imaging framework based on contrast-enhanced micro-computed tomography (micro-CT) for hierarchical in vivo quantification of blood vessels in mice, ranging from the largest to the smallest structures. The framework combines for the first time a standard morphometric approach with densitometric analysis. Validation tests showed that the method is precise and robust. Furthermore, the framework is sensitive in detecting different perfusion levels after the induction of a murine ischemia-reperfusion model. Correlation with both histological data and micro-CT analysis of vascular corrosion casts confirmed the accuracy of the method. The newly developed time-lapsed imaging approach shows high potential for in vivo monitoring of a number of different physiological and pathological conditions in angiogenesis and vascular development.

8.

Background

Simple surgical intervention advocated by the World Health Organization can alleviate trachomatous trichiasis (TT) and prevent subsequent blindness. A large backlog of TT cases remain unidentified and untreated. To increase identification and referral of TT cases, a novel approach using standard screening questions, a card, and simple training for Community Treatment Assistants (CTAs) to use during Mass Drug Administration (MDA) was developed and evaluated in Kongwa District, a trachoma-endemic area of central Tanzania.

Methodology/Principal Findings

A community randomized trial was conducted in 36 communities during MDA. CTAs in intervention villages received an additional half-day of training and a TT screening card in addition to the training received by CTAs in villages assigned to usual care. All MDA participants 15 years and older were screened for TT, and senior TT graders confirmed case status by evaluating all screened-positive cases. A random sample of those who screened negative for TT and those who did not present at MDA was also evaluated by the master graders. Intervention CTAs identified 5.6 times as many cases (n = 50) as those assigned to usual care (n = 9, p < 0.05). While specificity was above 90% for both groups, the sensitivity of the novel screening tool was 31.2% compared to 5.6% for the usual care group (p < 0.05).

Conclusions/Significance

CTAs appear to be viable resources for the identification of TT cases. Additional training and use of a TT screening card significantly increased the ability of CTAs to recognize and refer TT cases during MDA; however, further efforts are needed to improve case detection and reduce the number of false positive cases.

9.
Pei-Hsun Wu, Biophysical Journal, 2009, 96(12): 5103-5111
Video-based particle tracking monitors the microscopic movement of labeled biomolecules and fluorescent probes within a complex cellular environment. Information gained from this technique enables us to extract the dynamic behavior of biomolecules and the local mechanical properties inside the cell from a tracked particle's mean-square displacement (MSD). However, MSD measurements are highly susceptible to static error introduced by noise in the image acquisition process that leads to an incorrect positioning of the particle. Static error can mask the subtle effects from the local microenvironment on the MSD and potentially generate misleading conclusions about the biophysical properties of cells. An approach that greatly increases the accuracy of MSD measurements is presented herein by combining experimental data with Monte Carlo simulations to eliminate the inherent static error. This practical method of static error correction greatly advances particle-tracking techniques.
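The additive effect of static error on the MSD can be illustrated with a simple closed-form correction: localization noise of standard deviation σ adds a constant 2σ² per coordinate to a one-dimensional MSD. The paper's Monte Carlo approach replaces the assumed-known noise level used in this sketch:

```python
import numpy as np

def msd_1d(track, max_lag):
    """Time-averaged one-dimensional MSD for lags 1..max_lag."""
    return np.array([np.mean((track[lag:] - track[:-lag]) ** 2)
                     for lag in range(1, max_lag + 1)])

def subtract_static_error(msd_measured, sigma):
    """Remove the constant offset that localization noise of
    standard deviation sigma adds to a 1-D MSD (2*sigma**2 per
    coordinate). Simple additive model with sigma assumed known;
    the paper calibrates the correction by Monte Carlo instead."""
    return msd_measured - 2.0 * sigma ** 2
```

For an immobile particle imaged with noise, the measured MSD plateaus at 2σ², and the correction brings it back to approximately zero.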

10.
A method is described for estimating changes in cell cycle times during periods of rapid change in proliferation rate. This method, which depends upon the interpretation of pre- and post-velocity sedimentation fractionation continuous thymidine labelling patterns, exploits the relationship between sedimentation rate and cell cycle location. By this means, cycle times can be estimated under conditions that are difficult (if not impossible) to analyse by FLM methods.

11.

Background

While CD4 strongly predicts mortality on antiretroviral therapy (ART), estimates from programmatic data suffer from incomplete patient outcomes.

Methods

We conducted a pooled analysis of one-year mortality data on ART accounting for lost patients. We identified articles reporting one-year mortality by ART initiation CD4 count. We estimated the average mortality among those lost as the value that maximizes the fit of a regression of the natural log of mortality on the natural log of the imputed mean CD4 count in each band.

Results

We found 20 studies representing 64,426 subjects and 51 CD4 observations. Without correcting for losses, one-year mortality was >4.8% for all CD4 counts <200 cells/mm3. When searching over different values for mortality among those lost, the best-fitting model occurs at 60% mortality. In this model, those with a CD4 ≤200 had a one-year mortality above 8.7%, while those with a CD4 >500 had a one-year mortality <6.8%. Comparing those starting ART at 500 vs. 50, one-year mortality risk was reduced by 54% (6.8% vs. 12.5%). Regardless of CD4 count, mortality was substantially higher than when assuming no mortality among those lost, ranging from a 23–94% increase.

Conclusions

Our best-fitting regression estimates that every 10% increase in CD4 count at initiation is associated with a 2.8% decline in one-year mortality, including those lost. Our study supports the health benefits of higher thresholds for CD4 count initiation and suggests that reports of programmatic ART outcomes can and should adjust results for mortality among those lost.
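The search for the best-fitting mortality among those lost can be sketched as a scan over candidate values, each scored by the fit (R²) of a regression of ln(mortality) on ln(CD4). The study records below are synthetic, constructed only to exercise the procedure:

```python
import math

def r_squared(points):
    """Coefficient of determination for a simple linear fit."""
    xs, ys = zip(*points)
    n = len(xs)
    xb, yb = sum(xs) / n, sum(ys) / n
    sxy = sum((x - xb) * (y - yb) for x, y in points)
    sxx = sum((x - xb) ** 2 for x in xs)
    syy = sum((y - yb) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

def best_lost_mortality(studies, candidates):
    """Pick the assumed mortality among patients lost to
    follow-up that maximizes the fit of ln(mortality) on
    ln(CD4). `studies` holds (cd4, observed_mortality,
    fraction_lost) records; all inputs are illustrative."""
    best, best_fit = None, -1.0
    for m_lost in candidates:
        pts = [(math.log(cd4),
                math.log((1 - lost) * mort + lost * m_lost))
               for cd4, mort, lost in studies]
        fit = r_squared(pts)
        if fit > best_fit:
            best, best_fit = m_lost, fit
    return best
```

If the synthetic records are built so that a 60% loss-mortality restores an exact log-log relationship, the scan recovers that value.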

12.
An algebraic and geometrical approach is used to describe the primaeval RNA code and a proposed Extended RNA code. The former consists of all codons of the type RNY, where R means purines, Y pyrimidines, and N any of them. The latter comprises the 16 codons of the type RNY plus the codons obtained by reading the RNA code in the second (NYR type) and third (YRN type) reading frames. In each of these reading frames there are 16 triplets that altogether complete a set of 48 triplets, which specify 17 out of the 20 amino acids, including AUG, the start codon, and the three known stop codons. The other 16 codons do not pertain to the Extended RNA code and constitute the union of the triplets YYY and RRR, which we define as the RNA-less code. The codons in each of the three subsets of the Extended RNA code are represented by a four-dimensional hypercube, and the set of codons of the RNA-less code is portrayed as a four-dimensional hyperprism. Remarkably, the union of these four symmetrical pairwise disjoint sets comprises precisely the already known six-dimensional hypercube of the Standard Genetic Code (SGC) of 64 triplets. These results suggest a plausible evolutionary path by which the primaeval RNA code could have originated the SGC, via the Extended RNA code plus the RNA-less code. We argue that the life forms that probably obeyed the Extended RNA code were intermediate between the ribo-organisms of the RNA World and the last common ancestor (LCA) of the Prokaryotes, Archaea, and Eucarya, that is, the cenancestor. A general encoding function, E, which maps each codon to its corresponding amino acid or the stop signal, is also derived. In 45 out of the 64 cases, this function takes the form of a linear transformation F, which projects the whole six-dimensional hypercube onto a four-dimensional hyperface conformed by all triplets that end in cytosine. In the remaining 19 cases the function E adopts the form of an affine transformation, i.e., the composition of F with a particular translation. Graphical representations of the four local encoding functions and E are illustrated and discussed. For every amino acid and for the stop signal, a single triplet, among those that specify it, is selected as a canonical representative. From this mapping a graphical representation of the 20 amino acids and the stop signal is also derived. We conclude that the general encoding function E represents the SGC itself.
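The counting claims about the codon subsets are easy to verify by direct enumeration:

```python
def frame(p1, p2, p3):
    """All triplets whose three positions draw from the given sets."""
    return {a + b + c for a in p1 for b in p2 for c in p3}

R, Y, N = "AG", "CU", "ACGU"                 # purines, pyrimidines, any
rny = frame(R, N, Y)                          # primaeval RNA code
nyr = frame(N, Y, R)                          # second reading frame
yrn = frame(Y, R, N)                          # third reading frame
rna_less = frame(R, R, R) | frame(Y, Y, Y)    # RRR union YYY
```

Each frame-shifted set has 16 codons, the RNA-less code has 16, the four sets are pairwise disjoint, and their union is exactly the 64 triplets of the SGC; the start codon AUG falls in the NYR set and the three stop codons in the YRN set, as the abstract states.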

13.
The reliable reconstruction of tree topology from a set of homologous sequences is one of the main goals in the study of molecular evolution. If consistent estimators of distances from a multiple sequence alignment are known, the distance method is attractive because the tree reconstruction is consistent. To obtain a distance estimate d, the observed proportion of differences p (p-distance) is usually "corrected" for multiple and back substitutions by means of a functional relationship d = f(p). In this paper the conditions under which this correction of p-distances will not alter the selection of the tree topology are specified. When these conditions are not fulfilled the selection of the tree topology may depend on the correction function applied. A novel method which includes estimates of distances not only between sequence pairs, but between triplets, quadruplets, etc., is proposed to strengthen the proper selection of correction function and tree topology. A "super" tree that includes all tree topologies as special cases is introduced. Received: 17 February 1998 / Accepted: 20 July 1998
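A concrete example of a correction function d = f(p) is the Jukes-Cantor formula. Because f is strictly increasing it preserves the ordering of pairwise p-distances, yet its nonlinearity is exactly what can change which topology a distance criterion selects:

```python
import math

def jukes_cantor(p):
    """One common correction function d = f(p): the Jukes-Cantor
    formula d = -(3/4) * ln(1 - 4p/3), which inflates the
    observed proportion of differences p to account for multiple
    and back substitutions (valid for p < 3/4)."""
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)
```

The correction always returns d ≥ p, with the inflation growing rapidly as p approaches the saturation value 3/4.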

14.
Abstract

Starting from (+)-endo-5-norbornen-2-yl acetate (1), (-)-1-[(1R,3R,4R)-3-hydroxy-4-hydroxymethylcyclopentyl]-1H,3H-pyrimidine-2,4-dione (7) was synthesized in a six-step sequence.

15.

Background

Hepcidin is a 25-amino-acid, cysteine-rich, iron-regulating peptide. Increased hepcidin concentrations lead to iron sequestration in macrophages, contributing to the pathogenesis of anaemia of chronic disease, whereas decreased hepcidin is observed in iron deficiency and primary iron overload diseases such as hereditary hemochromatosis. Hepcidin quantification in human blood or urine may provide further insights into the pathogenesis of disorders of iron homeostasis and might prove a valuable tool for clinicians in the differential diagnosis of anaemia. This study describes a specific and non-operator-demanding immunoassay for hepcidin quantification in human sera.

Methods and Findings

An ELISA was developed for measuring hepcidin serum concentration using a recombinant hepcidin25-His peptide and a polyclonal antibody against this peptide, which was able to identify native hepcidin. The assay had a detection range of 10–1500 µg/L and a detection limit of 5.4 µg/L. The intra- and interassay coefficients of variation ranged from 8–15% and 5–16%, respectively. Mean linearity and recovery were 101% and 107%, respectively. Mean hepcidin levels were significantly lower in 7 patients with juvenile hemochromatosis (12.8 µg/L) and 10 patients with iron deficiency anemia (15.7 µg/L) and higher in 7 patients with Hodgkin lymphoma (116.7 µg/L) compared to 32 age-matched healthy controls (42.7 µg/L).

Conclusions

We describe a new, simple ELISA for measuring hepcidin in human serum with sufficient accuracy and reproducibility.

16.
Cryopreservation is an efficient way to store spermatozoa and plays a critical role in the livestock industry as well as in clinical practice. During cryopreservation, cryo-stress causes substantial damage to spermatozoa. In the present study, the effects of cryo-stress at various cryopreservation steps, such as dilution/cooling, cryoprotectant addition, and freezing, were studied in spermatozoa collected from 9 individual bull testes. The motility (%), motion kinematics, capacitation status, mitochondrial activity, and viability of bovine spermatozoa at each step of the cryopreservation process were assessed using computer-assisted sperm analysis, Hoechst 33258/chlortetracycline fluorescence, rhodamine 123 staining, and the hypo-osmotic swelling test, respectively. The results demonstrate that the cryopreservation steps reduced motility (%), rapid speed (%), and mitochondrial activity, whereas medium/slow speed (%) and the acrosome reaction were increased (P < 0.05). Differences (Δ) in the acrosome reaction were higher in the dilution/cooling step (P < 0.05), whereas differences (Δ) in motility, rapid speed, and non-progressive motility were higher during cryoprotectant addition and freezing as compared to dilution/cooling (P < 0.05). On the other hand, differences (Δ) in mitochondrial activity, viability, and progressive motility were higher in the freezing step (P < 0.05), while the difference (Δ) in the acrosome reaction was higher in dilution/cooling (P < 0.05). Based on these results, we propose that the freezing/thawing steps are the most critical in cryopreservation and may provide a rational basis for understanding cryo-damage. Moreover, these sperm parameters might be used as physical markers of sperm cryo-damage.

17.
A dearth of information obscures the true scale of the global illegal trade in wildlife. Herein, we introduce an automated web crawling surveillance system developed to monitor reports on illegally traded wildlife. A resource for enforcement officials as well as the general public, the freely available website, http://www.healthmap.org/wildlifetrade, provides a customizable visualization of worldwide reports on interceptions of illegally traded wildlife and wildlife products. From August 1, 2010 to July 31, 2011, publicly available English language illegal wildlife trade reports from official and unofficial sources were collected and categorized by location and species involved. During this interval, 858 illegal wildlife trade reports were collected from 89 countries. Countries with the highest number of reports included India (n = 146, 15.6%), the United States (n = 143, 15.3%), South Africa (n = 75, 8.0%), China (n = 41, 4.4%), and Vietnam (n = 37, 4.0%). Species reported as traded or poached included elephants (n = 107, 12.5%), rhinoceros (n = 103, 12.0%), tigers (n = 68, 7.9%), leopards (n = 54, 6.3%), and pangolins (n = 45, 5.2%). The use of unofficial data sources, such as online news sites and social networks, to collect information on international wildlife trade augments traditional approaches drawing on official reporting and presents a novel source of intelligence with which to monitor and collect news in support of enforcement against this threat to wildlife conservation worldwide.

18.
Bronchial thermoplasty is a non-drug procedure for severe persistent asthma that delivers thermal energy to the airway wall in a precisely controlled manner to reduce excessive airway smooth muscle. Reducing airway smooth muscle decreases the ability of the airways to constrict, thereby reducing the frequency of asthma attacks. Bronchial thermoplasty is delivered by the Alair System and is performed in three outpatient procedure visits, each scheduled approximately three weeks apart. The first procedure treats the airways of the right lower lobe, the second treats the airways of the left lower lobe and the third and final procedure treats the airways in both upper lobes. After all three procedures are performed the bronchial thermoplasty treatment is complete. Bronchial thermoplasty is performed during bronchoscopy with the patient under moderate sedation. All accessible airways distal to the mainstem bronchi between 3 and 10 mm in diameter, with the exception of the right middle lobe, are treated under bronchoscopic visualization. Contiguous and non-overlapping activations of the device are used, moving from distal to proximal along the length of the airway, and systematically from airway to airway as described previously. Although conceptually straightforward, the actual execution of bronchial thermoplasty is quite intricate and procedural duration for the treatment of a single lobe is often substantially longer than encountered during routine bronchoscopy. As such, bronchial thermoplasty should be considered a complex interventional bronchoscopy and is intended for the experienced bronchoscopist. Optimal patient management is critical in any such complex and longer duration bronchoscopic procedure. 
This article discusses the importance of careful patient selection, patient preparation, patient management, procedure duration, postoperative care and follow-up to ensure that bronchial thermoplasty is performed safely. Bronchial thermoplasty is expected to complement asthma maintenance medications by providing long-lasting asthma control and improving asthma-related quality of life of patients with severe asthma. In addition, bronchial thermoplasty has been demonstrated to reduce severe exacerbations (asthma attacks), emergency room visits for respiratory symptoms, and time lost from work, school and other daily activities due to asthma.

19.
High-throughput shotgun sequence data make it possible in principle to accurately estimate population genetic parameters without confounding by SNP ascertainment bias. One such statistic of interest is the proportion of heterozygous sites within an individual’s genome, which is informative about inbreeding and effective population size. However, in many cases, the available sequence data of an individual are limited to low coverage, preventing the confident calling of genotypes necessary to directly count the proportion of heterozygous sites. Here, we present a method for estimating an individual’s genome-wide rate of heterozygosity from low-coverage sequence data, without an intermediate step that calls genotypes. Our method jointly learns the shared allele distribution between the individual and a panel of other individuals, together with the sequencing error distributions and the reference bias. We show our method works well, first, by its performance on simulated sequence data and, second, on real sequence data where we obtain estimates using low-coverage data consistent with those from higher coverage. We apply our method to obtain estimates of the rate of heterozygosity for 11 humans from diverse worldwide populations and through this analysis reveal the complex dependency of local sequencing coverage on the true underlying heterozygosity, which complicates the estimation of heterozygosity from sequence data. We show how we can use filters to correct for the confounding arising from sequencing depth. We find in practice that ratios of heterozygosity are more interpretable than absolute estimates and show that we obtain excellent conformity of ratios of heterozygosity with previous estimates from higher-coverage data.
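Why low coverage blocks direct counting can be seen in a toy model: at a truly heterozygous site, all reads happen to show the same allele with probability 2·(1/2)^depth, so a naive caller that requires seeing both alleles systematically misses heterozygotes. This sketch assumes error-free reads and a 50/50 allele draw per read; the paper's estimator avoids genotype calling altogether:

```python
import random

def naive_het_call_rate(n_het_sites, depth, seed=0):
    """Fraction of truly heterozygous sites recovered by a naive
    caller (call 'het' only when both alleles appear among the
    reads) at a given sequencing depth. Toy model: error-free
    reads, each read drawing either allele with probability 1/2;
    the expected recovery is 1 - 2*(1/2)**depth."""
    rng = random.Random(seed)
    called = 0
    for _ in range(n_het_sites):
        reads = [rng.random() < 0.5 for _ in range(depth)]
        if any(reads) and not all(reads):   # both alleles observed
            called += 1
    return called / n_het_sites
```

At 4x coverage roughly 12.5% of heterozygous sites look homozygous, so counting called genotypes underestimates heterozygosity; the bias shrinks as depth grows, which is why estimates vary with local coverage.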

20.

Background

Zoonoses account for over half of all communicable diseases causing illness in humans. As there are limited resources available for the control and prevention of zoonotic diseases, a framework for their prioritization is necessary to ensure resources are directed into those of highest importance. Although zoonotic outbreaks are a significant burden of disease in North America, the systematic prioritization of zoonoses in this region has not been previously evaluated.

Methodology/Principal Findings

This study describes the novel use of a well-established quantitative method, conjoint analysis (CA), to identify the relative importance of 21 key characteristics of zoonotic diseases that can be used for their prioritization in Canada and the US. Relative importance weights from the CA were used to develop a point-scoring system to derive a recommended list of zoonoses for prioritization in Canada and the US. Over 1,500 participants from the general public were recruited to complete the online survey (761 from Canada and 778 from the US). Hierarchical Bayes models were fitted to the survey data to derive CA-weighted scores. Scores were applied to 62 zoonotic diseases of public health importance in Canada and the US to rank diseases in order of priority.
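The point-scoring step can be sketched as a weighted sum of characteristic scores. The characteristic names, weights, and disease profiles below are illustrative placeholders, not the study's 21 CA-derived characteristics or 62 diseases:

```python
def rank_diseases(weights, profiles):
    """Point-scoring prioritization: multiply each disease's
    characteristic scores by the importance weights (here
    standing in for the CA-derived weights) and rank by the
    total. All names and numbers are illustrative."""
    totals = {d: sum(weights[k] * v for k, v in scores.items())
              for d, scores in profiles.items()}
    return sorted(totals, key=totals.get, reverse=True)
```

A disease scoring high on heavily weighted characteristics rises to the top of the list even if it scores low elsewhere.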

Conclusions/Significance

This was the first study to describe a systematic and quantitative approach to the prioritization of zoonoses in North America involving public participants. We found individuals with no prior knowledge or experience in prioritizing zoonoses were capable of producing meaningful results using CA as a novel quantitative approach to prioritization. More similarities than differences were observed between countries, suggesting general agreement in disease prioritization between Canadians and Americans. We demonstrate CA as a potential tool for the prioritization of zoonoses; other prioritization exercises may also consider this approach.

