Similar Articles
20 similar articles found
1.
2.
Terrestrial lichen biomass is an important indicator of forage availability for caribou in northern regions, and can indicate vegetation shifts due to climate change, air pollution or changes in vascular plant community structure. Techniques for estimating lichen biomass have traditionally required destructive harvesting that is painstaking and impractical, so we developed models to estimate biomass from relatively simple cover and height measurements. We measured cover and height of forage lichens (including single-taxon and multi-taxa “community” samples, n = 144) at 73 sites on the Seward Peninsula of northwestern Alaska, and harvested lichen biomass from the same plots. We assessed biomass-to-volume relationships using zero-intercept regressions, and compared two non-destructive cover estimation methods (ocular vs. point count), four landcover types in two ecoregions, and single-taxon vs. multi-taxa samples. Additionally, we explored the feasibility of using lichen height (instead of volume) as a predictor of stand-level biomass. Although lichen taxa exhibited unique biomass and bulk density responses that varied significantly by growth form, we found that single-taxon sampling consistently underestimated true biomass and was constrained by the need for taxonomic experts. We also found that the point count method provided little to no improvement over ocular methods, despite increased effort. Estimated biomass of lichen-dominated communities (mean lichen cover: 84.9±1.4%) using multi-taxa, ocular methods differed only nominally among landcover types within ecoregions (range: 822 to 1418 g m−2). Height alone was a poor predictor of lichen biomass and should always be weighted by cover abundance. We conclude that the multi-taxa (whole-community) approach, when paired with ocular estimates, is the most reasonable and practical method for estimating lichen biomass at landscape scales in northwest Alaska.
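The zero-intercept (regression-through-the-origin) fit described above has a simple closed form: the slope is Σxy/Σx². A minimal sketch, using hypothetical plot values rather than the study's data:

```python
import numpy as np

def zero_intercept_slope(volume, biomass):
    """Least-squares slope for biomass = b * volume with no intercept term."""
    volume = np.asarray(volume, dtype=float)
    biomass = np.asarray(biomass, dtype=float)
    return float(np.sum(volume * biomass) / np.sum(volume ** 2))

# Hypothetical plot data: lichen volume (cm^3 m^-2) and harvested biomass (g m^-2)
volume = [500, 900, 1400, 2100]
biomass = [260, 480, 700, 1090]
b = zero_intercept_slope(volume, biomass)          # bulk-density-like coefficient
predicted = b * np.asarray(volume, dtype=float)    # biomass predicted from volume
```

Forcing the intercept through zero reflects the physical constraint that zero lichen volume implies zero biomass.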

3.

Background

Individual participant data (IPD) meta-analyses that obtain “raw” data from studies rather than summary data typically adopt a “two-stage” approach to analysis whereby IPD within trials generate summary measures, which are combined using standard meta-analytical methods. Recently, a range of “one-stage” approaches which combine all individual participant data in a single meta-analysis have been suggested as providing a more powerful and flexible approach. However, they are more complex to implement and require statistical support. This study uses a dataset to compare “two-stage” and “one-stage” models of varying complexity, to ascertain whether results obtained from the approaches differ in a clinically meaningful way.

Methods and Findings

We included data from 24 randomised controlled trials evaluating antiplatelet agents for the prevention of pre-eclampsia in pregnancy. We performed two-stage and one-stage IPD meta-analyses to estimate the overall treatment effect and to explore potential treatment interactions whereby particular types of women and their babies might benefit differentially from receiving antiplatelets. Two-stage and one-stage approaches gave similar results, showing a benefit of using antiplatelets (relative risk 0.90, 95% CI 0.84 to 0.97). Neither approach suggested that any particular type of woman benefited more or less from antiplatelets. There were no material differences in results between different types of one-stage model.

Conclusions

For these data, two-stage and one-stage approaches to analysis produce similar results. Although one-stage models offer a flexible environment for exploring model structure, and are useful where across-study patterns relating to types of participant, intervention and outcome mask similar relationships within trials, the additional insights they provide may not outweigh the costs of statistical support for routine application in syntheses of randomised controlled trials. Researchers considering undertaking an IPD meta-analysis should not necessarily be deterred by a perceived need for sophisticated statistical methods when combining information from large randomised trials.
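The first stage of a two-stage IPD meta-analysis reduces each trial to a summary effect; the second stage pools them, classically by fixed-effect inverse-variance weighting of log relative risks. A minimal sketch of that second stage, with hypothetical per-trial summaries rather than the trial data:

```python
import math

def pool_log_rr(log_rrs, ses):
    """Fixed-effect inverse-variance pooling of per-trial log relative risks.
    Returns the pooled RR and its 95% confidence limits."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    return math.exp(pooled), math.exp(lo), math.exp(hi)

# Hypothetical trial summaries: (log RR, standard error) for two trials
rr, lo, hi = pool_log_rr([math.log(0.85), math.log(0.95)], [0.05, 0.05])
```

A one-stage model would instead fit, for example, a single binomial regression with trial effects to all participant-level records at once.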

4.
5.
Turbidity methods offer possibilities for generating the data required to address microorganism variability in risk modeling, provided their results correspond to those of viable count methods. The objectives of this study were to identify the best approach for determining growth parameters from turbidity data collected with a Bioscreen instrument, and to characterize variability in the growth parameters of 34 Staphylococcus aureus strains of different biotypes isolated from broiler carcasses. Growth parameters were estimated by fitting primary growth models to turbidity growth curves or to detection times of serially diluted cultures, either directly or by using an analysis of variance (ANOVA) approach. The maximum specific growth rates in chicken broth at 17°C estimated by time-to-detection methods were in good agreement with viable count estimates, whereas growth models (exponential and Richards) underestimated growth rates. Time-to-detection methods were therefore selected for strain characterization. The variation of growth parameters among strains was best described by either the logistic or lognormal distribution, but definitive conclusions require a larger data set. The physiological state parameter ranged from 0.01 to 0.92 and its distribution was not significantly different from normal. Strain variability was important: the coefficient of variation of growth parameters was up to six times larger among strains than within strains. We suggest applying a time-to-detection (ANOVA) approach using turbidity measurements for convenient and accurate estimation of growth parameters. The results emphasize the need to consider the implications of strain variability for predictive modeling and risk assessment.
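The time-to-detection idea rests on exponential growth: each extra 10-fold dilution delays detection by ln(10)/μ, so the maximum specific growth rate falls out of the slope of detection time versus dilution step. A sketch with synthetic detection times (not the study's measurements):

```python
import math
import numpy as np

def mu_from_detection_times(dilution_steps, detection_times, dilution_factor=10.0):
    """Estimate the maximum specific growth rate from detection times of a
    dilution series: mu = ln(dilution_factor) / slope(detection time vs. step)."""
    slope = np.polyfit(dilution_steps, detection_times, 1)[0]
    return math.log(dilution_factor) / slope

# Synthetic series for a hypothetical culture growing at mu = 0.5 h^-1
mu_true = 0.5
steps = [0, 1, 2, 3]                                   # successive 10-fold dilutions
times = [2.0 + k * math.log(10) / mu_true for k in steps]   # detection times (h)
mu = mu_from_detection_times(steps, times)
```

Because detection always occurs at the same cell density, the lag and initial density cancel out of the slope, which is what makes the method robust.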

6.
Cellulosic feedstocks for bioenergy differ in composition and processing requirements for efficient conversion to chemicals and fuels. This study discusses and compares the processing requirements for three lignocellulosic feedstocks: soybean hulls, wheat straw, and de-starched wheat bran. They were ground with a hammer mill to investigate how differences in composition and particle size affect the hydrolysis process. Enzyme hydrolysis was conducted using cellulase from Trichoderma reesei at 50°C and pH 5. Ground fractions were also subjected to dilute sulfuric acid treatment at 125°C, 15 psi for 30 min prior to cellulase treatment. Reducing particle size of biomass resulted in segregated components of feedstock. Grinding wheat straw to particle size <132 μm reduced measured lignin content from 20% to ~5% and reduced hemicellulose content. Reducing lignin content increased the effectiveness of enzyme hydrolysis of wheat straw. Particles sized <132 μm exhibited the highest soluble sugar release upon hydrolysis for all three feedstocks studied. Hemicellulose digestion improved with dilute sulfuric acid treatment, with residual hemicellulose content <5% in all three feedstocks after acid treatment. This enhanced the cellulase action and resulted in an approximately 1.6-fold increase in sugar availability in de-starched wheat bran and ~1.5-fold for wheat straw and soybean hulls. Higher sugar availability in wheat bran after acid-mediated enzyme treatment correlated with higher ethanol yields during yeast fermentation compared with soybean hulls and wheat straw.

7.
RNA-Seq has become a powerful tool for transcriptomics research, with particular value for screening differentially expressed genes in tumors. To further elucidate the molecular mechanisms of hepatocellular carcinoma (HCC), we performed a bioinformatic analysis of an RNA-Seq dataset from GEO (GSE63863) comprising 12 paired HCC tissue samples. Statistical analysis with software implementing three different algorithms (edgeR, DESeq2, and voom) identified 976 differentially expressed genes (adj. p-value < 0.01 or FDR < 0.01, |logFC| ≥ 2), of which 422 (43.2%) were up-regulated and 554 (56.8%) down-regulated. GO enrichment analysis showed that these genes were mainly involved in molecular functions such as ion binding and oxidoreductase activity, and in biological processes such as redox reactions and cell division; KEGG pathway analysis implicated pathways including the cell cycle and retinol metabolism. STRING analysis revealed interactions among the proteins encoded by 654 of these genes, and further MCODE analysis showed that 169 of them formed four sub-networks whose hub genes were UBE2C, GNG4, TTR, and FOS; aberrant expression of these genes may play an important role in the development and progression of HCC. These results provide a preliminary basis for further elucidating the molecular pathogenesis of HCC and identifying novel biomarkers.
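The dual-threshold filter used above (FDR < 0.01 and |logFC| ≥ 2) is easy to express directly. A minimal sketch over hypothetical (gene, logFC, FDR) rows, not the GSE63863 results:

```python
def filter_degs(results, fdr_cutoff=0.01, lfc_cutoff=2.0):
    """Split genes passing the significance and fold-change thresholds
    into up- and down-regulated lists."""
    up = [g for g, lfc, fdr in results if fdr < fdr_cutoff and lfc >= lfc_cutoff]
    down = [g for g, lfc, fdr in results if fdr < fdr_cutoff and lfc <= -lfc_cutoff]
    return up, down

# Hypothetical differential-expression rows: (gene, logFC, adjusted p / FDR)
rows = [("UBE2C", 3.1, 0.001), ("TTR", -4.2, 0.0005),
        ("FOS", -2.5, 0.002), ("GAPDH", 0.3, 0.9), ("MKI67", 2.4, 0.02)]
up, down = filter_degs(rows)
```

Note that both conditions must hold: MKI67 above clears the fold-change cutoff but fails the FDR cutoff, so it is excluded.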

8.
Humans are potentially exposed to phthalate esters (PEs) through ingestion, inhalation, and dermal contact. Studies quantifying exposure to PEs include “biomarker studies” and “indirect studies.” Biomarker studies use measurements of PE metabolites in urine to back-calculate exposure to the parent diester, while indirect studies use the concentration of the PE in each medium of exposure and the rate of intake of that medium to quantify intake of the PE. In this review, exposure estimates from biomarker and indirect studies are compiled and compared for seven PEs to determine if there are regional differences and if there is a preferred approach. The indirect and biomarker methods generally agree with each other within an order of magnitude and discrepancies are explained by difficulties in accounting for use of consumer products, uncertainty concerning absorption, regional differences, and temporal changes. No single method is preferred for estimating intake of all PEs; it is suggested that biomarker estimates be used for low molecular weight PEs for which it is difficult to quantify all sources of exposure and either indirect or biomarker methods be used for higher molecular weight PEs. The indirect methods are useful in identifying sources of exposure while the biomarker methods quantify exposure.
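The indirect method described above is an arithmetic sum over exposure media: concentration times intake rate for each medium, normalized to body weight. A minimal sketch with illustrative values (the media and numbers are hypothetical, not measured data):

```python
def indirect_intake(media, body_weight_kg):
    """Daily intake (ug per kg body weight per day) summed over exposure media:
    concentration (ug per unit of medium) x intake rate (units per day)."""
    return sum(conc * rate for conc, rate in media) / body_weight_kg

# Hypothetical media: (concentration, daily intake rate) for
# food (ug/g, g/day) and indoor air (ug/m^3, m^3/day)
media = [(0.05, 1500.0), (0.5, 16.0)]
intake = indirect_intake(media, body_weight_kg=70.0)   # ug/kg bw/day
```

Biomarker studies invert the problem instead, scaling urinary metabolite excretion by molar fractions and creatinine or volume to recover intake of the parent diester.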

9.
10.
A comparison of the results of the antimicrobial susceptibility tests by the agar-dilution method, the agar-diffusion method, the Autobac-1 system and the MS2 system has been made in a number of Escherichia coli and Staphylococcus aureus strains. With the agar-dilution method as a reference the agar-diffusion method and the MS2 system showed mostly satisfactory scores. With the Autobac-1 system, Esch. coli strains gave several insufficient scores with some antibiotics. A study of the distribution of minimal inhibitory concentrations against the strains has enabled the explanation of some of the discrepancies that were observed. Furthermore, the influence of two different sets of interpretive criteria on the outcome of the results has been studied.

11.
12.
This article is concerned with comparing slope estimator precision in regression analysis and the structural relationships approach (e.g., Humak, 1983, ch. 3; Kendall and Stuart, 1977), as relevant to testing for initial value dependence in biomedical and behavioral contexts of repeated assessments (e.g., Blomqvist, 1977; Wall, 1977). The mean square error is adopted as the basis for comparing the two methods. In the general case, we argue for an informed (data-dependent) choice between regression analysis and the structural relationships approach. For the apparent majority of biomedical and behavioral studies of the phenomenon of initial value dependence, this comparison suggests that the structural relationships approach is preferable, leading to more trustworthy substantive conclusions.
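The reason ordinary regression can mislead here is the classical attenuation result: when the baseline is measured with error, the OLS slope shrinks toward zero by the reliability ratio σx²/(σx² + σe²), which is what errors-in-variables (structural) models correct for. A simulation sketch of that standard result (not the cited papers' derivation):

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta, sd_x, sd_e = 200_000, 1.0, 1.0, 1.0

x_true = rng.normal(0.0, sd_x, n)                 # true baseline values
y = beta * x_true + rng.normal(0.0, 0.5, n)       # follow-up depends on truth
x_obs = x_true + rng.normal(0.0, sd_e, n)         # baseline measured with error

# OLS slope of y on the error-contaminated baseline
b_ols = np.cov(x_obs, y)[0, 1] / np.var(x_obs, ddof=1)

# Classical attenuation: E[b_ols] = beta * sd_x^2 / (sd_x^2 + sd_e^2) = 0.5 here
expected = beta * sd_x ** 2 / (sd_x ** 2 + sd_e ** 2)
```

With equal true-score and error variances, the regression slope is biased to half its true value, whereas a structural model supplied with the error-variance ratio would recover β.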

13.
A key step in the analysis of circadian data is to make an accurate estimate of the underlying period. There are many different techniques and algorithms for determining period, all with different assumptions and with differing levels of complexity. Choosing which algorithm, which implementation and which measures of accuracy to use can offer many pitfalls, especially for the non-expert. We have developed the BioDare system, an online service allowing data-sharing (including public dissemination), data-processing and analysis. Circadian experiments are the main focus of BioDare, hence performing period analysis is a major feature of the system. Six methods have been incorporated into BioDare: Enright and Lomb-Scargle periodograms, FFT-NLLS, mFourfit, MESA and Spectrum Resampling. Here we review those six techniques, explain the principles behind each algorithm and evaluate their performance. In order to quantify the methods' accuracy, we examine the algorithms against artificial mathematical test signals and model-generated mRNA data. Our re-implementation of each method in Java allows meaningful comparisons of the computational complexity and computing time associated with each algorithm. Finally, we provide guidelines on which algorithms are most appropriate for which data types, and recommendations on experimental design to extract optimal data for analysis.
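The common core of the periodogram-style methods above is scanning a grid of candidate periods and scoring how much signal power each one explains. A minimal classical-periodogram sketch (a simplification of the methods named above, using a synthetic 24 h rhythm rather than experimental data):

```python
import numpy as np

# Synthetic circadian series: 24 h rhythm, unevenly sampled over 4 days
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 96.0, 200))                  # sample times (h)
y = np.sin(2 * np.pi * t / 24.0) + 0.1 * rng.normal(size=t.size)
y = y - y.mean()                                          # remove the mean level

periods = np.linspace(18.0, 30.0, 601)                    # candidate periods (h)
omega = 2 * np.pi / periods                               # angular frequencies
# Classical periodogram: squared projection onto cosine and sine at each frequency
power = (np.cos(np.outer(omega, t)) @ y) ** 2 + (np.sin(np.outer(omega, t)) @ y) ** 2
best_period = periods[np.argmax(power)]
```

Lomb-Scargle refines this score with per-frequency phase and normalization terms so that uneven sampling does not bias the peak, which matters for real sampling schedules.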

14.
Several natural language processing (NLP) tools, both commercial and freely available, are used to extract protein interactions from publications. The methods used by these tools range from pattern matching to dynamic programming, each with its own recall and precision. A methodical survey of these tools against manual analysis, keeping in mind the minimum interaction information a researcher would need, has not been carried out. We compared data generated using selected NLP tools with manually curated protein interaction data (PathArt and IMaps) to determine their recall and precision. The rates were found to be lower than the published scores when a normalized definition of interaction is considered. Each data point captured wrongly or missed by a tool was analyzed. Our evaluation brings forth critical failures of NLP tools and provides pointers for the development of an ideal NLP tool.
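Recall and precision against a curated gold standard reduce to set operations on normalized interaction pairs. A minimal sketch with hypothetical protein names (illustrative only):

```python
def recall_precision(predicted, curated):
    """Score extracted interaction pairs against a curated gold standard.
    Interactions are treated as undirected, so each pair is normalized
    to a sorted tuple before comparison."""
    norm = lambda pairs: {tuple(sorted(p)) for p in pairs}
    pred, gold = norm(predicted), norm(curated)
    tp = len(pred & gold)                      # true positives
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall

# Hypothetical tool output vs. a curated set
curated = [("TP53", "MDM2"), ("EGFR", "GRB2"), ("RAF1", "MAP2K1"), ("AKT1", "GSK3B")]
predicted = [("MDM2", "TP53"), ("EGFR", "GRB2"), ("EGFR", "TP53")]
p, r = recall_precision(predicted, curated)
```

The normalization step is exactly where published scores can diverge from re-evaluations: a stricter or looser definition of what counts as "the same interaction" changes both numerator and denominator.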

15.

Background

KRAS mutation assays are important companion diagnostic tests to guide anti-EGFR antibody treatment of metastatic colorectal cancer. Direct comparison of newer diagnostic methods with existing methods is an important part of validating any new technique. In this study, we have compared the Therascreen (Qiagen) ARMS assay with Competitive Allele-Specific TaqMan PCR (castPCR, Life Technologies) to determine equivalence for KRAS mutation analysis.

Methods

DNA was extracted by Maxwell (Promega) from 99 colorectal cancers. The ARMS-based Therascreen and a customized castPCR assay were performed according to the manufacturer’s instructions. All assays were performed on either an Applied Biosystems 7500 Fast Dx or a ViiA7 real-time PCR machine (both from Life Technologies). The data were collected and discrepant results re-tested with newly extracted DNA from the same blocks in both assay types.

Results

Of the 99 tumors included, Therascreen showed 62 tumors to be wild-type (WT) for KRAS, while 37 had KRAS mutations on initial testing. CastPCR showed 61 tumors to be wild-type (WT) for KRAS, while 38 had KRAS mutations. Thirteen tumors showed BRAF mutation in castPCR and in one of these there was also a KRAS mutation. The custom castPCR plate included several other KRAS mutations and BRAF V600E, not included in Therascreen, explaining the higher number of mutations detected by castPCR. Re-testing of discrepant results was required in three tumors, all of which then achieved concordance for KRAS. CastPCR assay Ct values were on average 2 cycles lower than Therascreen.

Conclusion

There was excellent correlation between the two methods. Although the castPCR assay shows lower Ct values than Therascreen, this is unlikely to be clinically significant.
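Agreement between two binary assays is usually summarized as percent agreement plus Cohen's kappa, computed from the 2x2 concordance table. A sketch with hypothetical counts chosen to resemble a 99-tumor comparison (not the study's actual table):

```python
def percent_agreement_and_kappa(n11, n10, n01, n00):
    """Agreement between two binary assays from a 2x2 concordance table:
    n11 = both mutant, n00 = both wild-type, n10/n01 = discordant calls."""
    n = n11 + n10 + n01 + n00
    po = (n11 + n00) / n                               # observed agreement
    p_mut_a, p_mut_b = (n11 + n10) / n, (n11 + n01) / n
    pe = p_mut_a * p_mut_b + (1 - p_mut_a) * (1 - p_mut_b)   # chance agreement
    kappa = (po - pe) / (1 - pe)
    return po, kappa

# Hypothetical counts for two KRAS assays on 99 tumors (illustrative only)
po, kappa = percent_agreement_and_kappa(n11=37, n10=0, n01=1, n00=61)
```

Kappa matters because with a mutation prevalence near 38%, raw percent agreement overstates concordance relative to chance.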

16.
Primary analysis of case-control studies focuses on the relationship between disease (D) and a set of covariates of interest (Y,X). A secondary application of the case-control study, often invoked in modern genetic epidemiologic association studies, is to investigate the interrelationship between the covariates themselves. The task is complicated due to the case-control sampling, and to avoid the biased sampling that arises from the design, it is typical to use the control data only. In this paper, we develop penalized regression spline methodology that uses all the data, and improves precision of estimation compared to using only the controls. A simulation study and an empirical example are used to illustrate the methodology.
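A penalized regression spline in its simplest form is a ridge-penalized fit on a spline basis, where the smooth (knot) terms are shrunk but the linear part is not. A minimal sketch with a truncated-power basis (a generic illustration, not the paper's estimator):

```python
import numpy as np

def pspline_fit(x, y, knots, lam):
    """Ridge-penalized truncated-power-basis spline: the intercept and slope
    are unpenalized; the knot coefficients are shrunk by lam."""
    x = np.asarray(x, dtype=float)
    B = np.column_stack([np.ones_like(x), x] +
                        [np.clip(x - k, 0.0, None) for k in knots])
    D = np.diag([0.0, 0.0] + [1.0] * len(knots))   # penalize knot terms only
    beta = np.linalg.solve(B.T @ B + lam * D, B.T @ np.asarray(y, dtype=float))
    return B @ beta

# Illustrative check: noiseless linear data is recovered exactly, because the
# penalty leaves the intercept and slope untouched
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x
fit = pspline_fit(x, y, knots=[0.25, 0.5, 0.75], lam=1.0)
```

The penalty parameter lam trades roughness against fit; in practice it is chosen by cross-validation or REML rather than fixed in advance.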

17.
A Comparison of Pressure-Volume Curve Data Analysis Techniques
Schulte, P. J. and Hinckley, T. M. 1985. A comparison of pressure-volume curve data analysis techniques.—J. exp. Bot. 36: 1590–1602. Computer-assisted analysis of data derived with the pressure-volume technique is currently feasible. In this study, various computer algorithms were used to analyse a variety of pressure-volume curve data sets. Comparisons were made with respect to estimates of osmotic potential, turgor loss point, symplastic fraction, and bulk modulus of elasticity. While osmotic potential estimation was fairly insensitive to the model used, estimates of the bulk modulus of elasticity appear to be highly dependent on the model used for analysis of the data. Key words: Pressure-volume, computer analysis, elasticity
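A common pressure-volume analysis step is to fit a line to the post-turgor-loss (linear) region of 1/Ψ versus relative water deficit and extrapolate to zero deficit, taking the osmotic potential at full turgor as the reciprocal of the intercept. A sketch of that standard approach on synthetic points (the function, data, and linear-region choice are illustrative assumptions, not the paper's algorithms):

```python
import numpy as np

def pv_osmotic_potential(water_deficit, psi, n_linear=4):
    """Fit the linear tail of 1/psi vs. relative water deficit and
    extrapolate to zero deficit; osmotic potential at full turgor
    is 1/intercept (MPa)."""
    inv_psi = 1.0 / np.asarray(psi, dtype=float)
    slope, intercept = np.polyfit(water_deficit[-n_linear:], inv_psi[-n_linear:], 1)
    return 1.0 / intercept

# Synthetic linear-region points constructed so that the osmotic potential
# at full turgor is -2.0 MPa
deficit = np.array([0.10, 0.14, 0.18, 0.22])
inv_psi_line = -0.5 - 1.5 * deficit          # 1/psi = 1/pi0 - c * deficit
psi = 1.0 / inv_psi_line
pi0 = pv_osmotic_potential(deficit, psi)
```

The paper's point is visible even in this toy: the intercept (osmotic potential) is stable across reasonable choices of the linear region, while curvature-dependent quantities such as the bulk elastic modulus are much more sensitive to which points and model are used.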

18.
Kidney and cardiovascular disease are widespread among populations with high prevalence of diabetes, such as American Indians participating in the Strong Heart Study (SHS). Studying these conditions simultaneously in longitudinal studies is challenging, because the morbidity and mortality associated with these diseases result in missing data, and these data are likely not missing at random. When such data are merely excluded, study findings may be compromised. In this article, a subset of 2264 participants with complete renal function data from Strong Heart Exams 1 (1989–1991), 2 (1993–1995), and 3 (1998–1999) was used to examine the performance of five methods used to impute missing data: listwise deletion, mean of serial measures, adjacent value, multiple imputation, and pattern-mixture. Three missing at random models and one non-missing at random model were used to compare the performance of the imputation techniques on randomly and non-randomly missing data. The pattern-mixture method was found to perform best for imputing renal function data that were not missing at random. Determining whether data are missing at random or not can help in choosing the imputation method that will provide the most accurate results.
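The simpler single-imputation strategies in the list above are mechanical. A minimal sketch of adjacent-value imputation across a participant's serial exams (the variable and values are hypothetical):

```python
def impute_adjacent(series):
    """Fill each missing exam value with the nearest available measurement
    from the same participant: the previous value if present, else the next."""
    out = list(series)
    for i, v in enumerate(out):
        if v is None:
            prev = next((out[j] for j in range(i - 1, -1, -1) if out[j] is not None), None)
            nxt = next((out[j] for j in range(i + 1, len(out)) if out[j] is not None), None)
            out[i] = prev if prev is not None else nxt
    return out

# Hypothetical serum creatinine (mg/dl) across three Strong Heart exams
filled = impute_adjacent([1.2, None, 1.5])
```

Such single-value fills ignore imputation uncertainty, which is exactly why multiple imputation and pattern-mixture models are preferred when missingness is informative.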

19.

Background

Traditional methods using microscopy for the detection of helminth infections have limited sensitivity. Polymerase chain reaction (PCR) assays enhance detection of helminths, particularly low burden infections. However, differences in test performance may modify the ability to detect associations between helminth infection, risk factors, and sequelae. We compared these associations using microscopy and PCR.

Methods

This cross-sectional study was nested within a randomized clinical trial conducted at 3 sites in Kenya. We performed microscopy and real-time multiplex PCR for the stool detection and quantification of Ascaris lumbricoides, Necator americanus, Ancylostoma duodenale, Strongyloides stercoralis, and Schistosoma species. We utilized regression to evaluate associations between potential risk factors or outcomes and infection as detected by either method.

Results

Of 153 HIV-positive adults surveyed, 55 (36.0%) and 20 (13.1%) were positive for one or more helminth species by PCR and microscopy, respectively (p<0.001). PCR-detected infections were associated with farming (prevalence ratio [PR] 1.57, 95% CI: 1.02, 2.40), a communal water source (PR 3.80, 95% CI: 1.01, 14.27), and no primary education (PR 1.54, 95% CI: 1.14, 2.33), whereas microscopy-detected infections were not associated with any risk factors under investigation. Microscopy-detected infections were associated with significantly lower hematocrit and hemoglobin (means of -3.56% and -0.77 g/dl) and a 48% higher risk of anemia (PR 1.48, 95% CI: 1.17, 1.88) compared to uninfected participants. Such associations were absent for PCR-detected infections unless infection intensity was considered. Infections diagnosed by either method were associated with an increased risk of eosinophilia (PCR PR 2.42, 95% CI: 1.02, 5.76; microscopy PR 2.92, 95% CI: 1.29, 6.60).

Conclusion

Newer diagnostic methods, including PCR, improve the detection of helminth infections. This heightened sensitivity may improve the identification of risk factors for infection while reducing the ability to discriminate infections associated with adverse clinical outcomes. Quantitative assays can be used to differentiate infection loads and discriminate infections associated with sequelae.
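The prevalence ratios reported in this cross-sectional design come from 2x2 counts, with a Wald confidence interval built on the log scale. A sketch with hypothetical counts (not the study's data):

```python
import math

def prevalence_ratio(a, n1, b, n2):
    """Prevalence ratio with a Wald 95% CI on the log scale:
    a/n1 = outcome prevalence in the exposed group, b/n2 = in the unexposed."""
    pr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)   # SE of log(PR)
    lo = math.exp(math.log(pr) - 1.96 * se)
    hi = math.exp(math.log(pr) + 1.96 * se)
    return pr, lo, hi

# Hypothetical counts: anemia in 30/60 infected vs. 20/80 uninfected participants
pr, lo, hi = prevalence_ratio(30, 60, 20, 80)
```

Working on the log scale keeps the interval positive and roughly symmetric in ratio terms, which is why the reported CIs (e.g., 1.17 to 1.88 around 1.48) are asymmetric around the point estimate.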

20.