Similar Documents
20 similar documents retrieved.
1.
A nonparametric hierarchical growth curve model is proposed. Different levels in the model hierarchy are intended to correspond to different sources of variation in an individual's growth. The nonparametric character of the model offers considerable flexibility in fitting the growth curves to empirical data. Here the emphasis is on prediction, and for this purpose the adopted Bayesian inferential approach seems particularly natural and efficient. A Markov chain Monte Carlo method is used to perform the numerical computations. As an illustration of the techniques, we consider the growth of children during their first two years.
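
The abstract's model is nonparametric, but the computational machinery it mentions, Markov chain Monte Carlo over a multi-level growth model, can be illustrated with a deliberately simplified parametric stand-in. The sketch below runs a random-walk Metropolis sampler for a two-level Gaussian growth-curve model; all variable names, fixed variances, and priors are illustrative assumptions, not the authors' specification.

```python
# A minimal sketch (not the authors' model): random-walk Metropolis for a
# two-level growth-curve model. Child-specific intercepts a_i vary around a
# population mean mu; a shared slope b drives growth over age t.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 5 children measured at 6 ages (0-2 years)
t = np.linspace(0.0, 2.0, 6)
true_a = rng.normal(50.0, 2.0, size=5)                 # birth sizes
y = true_a[:, None] + 12.0 * t + rng.normal(0, 1.0, (5, 6))

sigma, tau = 1.0, 2.0                                  # assumed known for brevity

def log_post(theta):
    mu, b, a = theta[0], theta[1], theta[2:]
    fit = a[:, None] + b * t                           # level 1: individual curves
    ll = -0.5 * np.sum((y - fit) ** 2) / sigma**2
    lp = -0.5 * np.sum((a - mu) ** 2) / tau**2         # level 2: between-child
    lp += -0.5 * (mu - 50) ** 2 / 100 - 0.5 * b**2 / 100   # vague priors
    return ll + lp

theta = np.concatenate([[50.0, 10.0], y[:, 0]])        # crude initial values
samples = []
for step in range(20000):
    prop = theta + rng.normal(0, 0.05, size=theta.size)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    if step >= 5000 and step % 10 == 0:                # burn-in, thinning
        samples.append(theta.copy())

samples = np.array(samples)
print("posterior mean slope:", samples[:, 1].mean())
# Prediction for a new, unobserved child: draw a_new ~ N(mu, tau) per sample
a_new = samples[:, 0] + rng.normal(0, tau, len(samples))
print("predicted size at age 2:", (a_new + samples[:, 1] * 2.0).mean())
```

The prediction step mirrors the abstract's emphasis: a new child's curve is simulated from the posterior over the population level, propagating both between-child and estimation uncertainty.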

2.
3.

Purpose

To identify non-invasive clinical parameters to predict urodynamic bladder outlet obstruction (BOO) in patients with benign prostatic hyperplasia (BPH) using causal Bayesian networks (CBN).

Subjects and Methods

From October 2004 to August 2013, 1,381 eligible BPH patients with complete data were selected for analysis. The following clinical variables were considered: age, total prostate volume (TPV), transition zone volume (TZV), prostate specific antigen (PSA), maximum flow rate (Qmax), and post-void residual volume (PVR) on uroflowmetry, and International Prostate Symptom Score (IPSS). Among these variables, the independent predictors of BOO were selected using the CBN model. The predictive performance of the CBN model using the selected variables was verified through a logistic regression (LR) model with the same dataset.

Results

Mean age, TPV, and IPSS were 6.2 (±7.3, SD) years, 48.5 (±25.9) ml, and 17.9 (±7.9), respectively. The mean BOO index was 35.1 (±25.2) and 477 patients (34.5%) had urodynamic BOO (BOO index ≥40). By using the CBN model, we identified TPV, Qmax, and PVR as independent predictors of BOO. With these three variables, the BOO prediction accuracy was 73.5%. The LR model showed a similar accuracy (77.0%). However, the area under the receiver operating characteristic curve of the CBN model was statistically smaller than that of the LR model (0.772 vs. 0.798, p = 0.020).

Conclusions

Our study demonstrated that TPV, Qmax, and PVR are independent predictors of urodynamic BOO.
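
The verification step described above, refitting the selected predictors in a logistic regression and scoring accuracy and AUC, can be sketched with scikit-learn. The synthetic data and coefficients below are illustrative stand-ins for the study's 1,381-patient dataset, not its actual values.

```python
# A minimal sketch of an LR benchmark for BOO prediction on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 1381
tpv = rng.normal(48.5, 25.9, n).clip(10)          # total prostate volume (ml)
qmax = rng.normal(12.0, 5.0, n).clip(1)           # maximum flow rate (ml/s)
pvr = rng.normal(60.0, 50.0, n).clip(0)           # post-void residual (ml)
# Synthetic label: BOO more likely with large TPV, low Qmax, high PVR
logit = 0.02 * tpv - 0.15 * qmax + 0.01 * pvr - 0.5
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([tpv, qmax, pvr])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
prob = model.predict_proba(X_te)[:, 1]
print("accuracy:", accuracy_score(y_te, prob > 0.5))
print("AUC:", roc_auc_score(y_te, prob))
```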

4.
Using a case study of Lake Chaohu, the fifth largest lake in China, we constructed a cusp model for water bloom prediction that used TP (total phosphorus), T (temperature), Chla (chlorophyll-a), and DO (dissolved oxygen). These four parameters were assumed to be the most important factors in the eutrophication and water bloom of the lake. The model was found to be accurate, with a relative error of around 10%. More convincingly, according to the catastrophe discriminant of the cusp model, a discontinuous jump of the aquatic ecosystem could be judged to have occurred in Lake Chaohu in July 2004. This conclusion is consistent with the fact that water blooms arose in August 2004. The cusp model also showed satisfactory precision when applied to forecasting the eutrophication trend and predicting water bloom in Lake Chaohu in 2005. The case study found that water bloom brought on by eutrophication can be fitted and predicted by a catastrophe model. We suggest that catastrophe models would be a constructive approach for forecasting and judging the outbreak of water blooms in lakes. In addition, by constructing and studying such catastrophe models, lake managers would be able to simulate the effects of different protection and mitigation projects and enrich the scientific basis for their optimization.
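
The "catastrophe discriminant" referred to above is standard cusp-catastrophe mathematics. How the study maps the four water-quality variables onto the control parameters is not shown in the abstract, so the sketch below uses the canonical form with invented control values.

```python
# A minimal sketch of the cusp-catastrophe criterion. For the canonical cusp
# potential V(x) = x^4/4 + a*x^2/2 + b*x, equilibria solve x^3 + a*x + b = 0,
# and the discriminant D = 4*a^3 + 27*b^2 separates the bistable region
# (D < 0: two stable states, discontinuous jumps possible) from the
# monostable one (D > 0).
import numpy as np

def cusp_discriminant(a: float, b: float) -> float:
    """Discriminant of x^3 + a*x + b; negative inside the cusp region."""
    return 4.0 * a**3 + 27.0 * b**2

def equilibria(a: float, b: float) -> np.ndarray:
    """Real equilibrium states of the cusp potential."""
    roots = np.roots([1.0, 0.0, a, b])
    return np.sort(roots[np.isclose(roots.imag, 0.0)].real)

for a, b in [(1.0, 0.1), (-3.0, 0.5)]:            # illustrative control values
    D = cusp_discriminant(a, b)
    state = "bistable: jump possible" if D < 0 else "monostable"
    print(f"a={a:+.1f}, b={b:+.1f}: D={D:+.2f} ({state}), x*={equilibria(a, b)}")
```

In the study's usage, a month whose fitted control values fall into the D < 0 region would flag the kind of discontinuous ecosystem jump reported for July 2004.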

5.
Goal, Scope and Background

Wood has many applications, and it is often in competition with other materials. Chipboard is the most common wood-based material and has attained the greatest economic development in recent years. Relevant, up-to-date environmental data are needed to allow the environmental comparison of wood with other materials. There are several examples of Life Cycle Assessment (LCA) evaluations of wood products and forest-technology systems, but no comprehensive Life Cycle Inventory (LCI) data for particleboard manufacture are available in the literature. The main focus of this study is to generate a comprehensive LCI database for the manufacture of resin-bonded wood particleboards.

Methods

In this work, International Organization for Standardization (ISO) standards and the Eco-indicator 99 methodology were used to quantify the potential environmental impact associated with the system under study. A Spanish factory considered representative of the state of the art was studied in detail. The system boundaries included all the activities taking place inside the factory as well as the activities associated with the production of the main chemicals used in the process, energy inputs and transport. All the data related to the inputs and outputs of the process were obtained by on-site measurements.

Results and Discussion

LCI methodology was used to quantify the impacts of particleboard manufacture. The inventory data of the three defined subsystems are described:
- Wood preparation: a comprehensive inventory of data including storage, debarking, particle production, storage and measurement of particles, drying, and combustion of the bark for energy purposes.
- Board shaping: data related to particle classification, resin mixing, mattress formation and the pressing stage.
- Board finishing: cooling data, finishing, storage and distribution of the final product.
The system was characterised with the Eco-indicator 99 methodology (hierarchic version) in order to identify the 'hot spots'. Damage to Human Health was mainly produced by the Board finishing subsystem. The Board shaping subsystem was the most significant contributor to damage to Ecosystem Quality and Resources.

Conclusions

With the final aim of creating a database to identify and characterise the manufacture of particleboard, special attention was paid to the inventory analysis stage of the particleboard industry. A multicriteria approach was applied in order to define the most adequate use of wood wastes. Environmental, economic and social considerations strengthen the hypothesis that the use of forest residues in particleboard manufacture is more sustainable than their use as fuel.

Recommendations and Outlook

In this work, particleboard was the product analysed, as it is one of the most common wood-based materials. Future work will focus on the study of another key wood board: Medium Density Fibreboard (MDF). Moreover, factors with strong geographical dependence, such as the electricity profile and final transport of the product, will be analysed. In addition, the definition of a widespread functional unit to study the use of wood wastes at the end-of-life stage may be another issue of outstanding interest.

6.
The aim of this article is to develop a methodological approach for assessing the influence of parameters of one or more elementary processes in the foreground system on the outcomes of a life cycle assessment (LCA) study. From this perspective, the method must be able to: (1) include foreground process modeling in order to avoid the assumption of proportionality between inventory data and reference flows; (2) quantify the influence of foreground process parameters (and, possibly, interactions between parameters); and (3) identify trends (either increasing or decreasing) for each parameter on each indicator in order to determine the most favorable direction for parametric variation. These objectives can be reached by combining foreground system modeling, a set of two different sensitivity analysis methods (each one providing different and complementary information), and LCA. The proposed method is applied to a case study of hemp-based insulation materials for buildings. The present study focuses on the agricultural stage as the foreground system and, as a first step, encompasses the entire life cycle. A set of technological recommendations was identified for hemp farmers in order to reduce the crop's environmental impacts (by 11% to 89%, depending on the impact category considered). One of the main limitations of the approach is the need for a detailed model of the foreground process. Further, the method is, at present, rather time-consuming. However, it offers long-term advantages, given that the higher level of model detail adds robustness to the LCA results.
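
The abstract does not name its two sensitivity analysis methods, but the simplest version of objective (3), finding each parameter's effect size and trend direction on an indicator, is one-at-a-time (OAT) screening. The sketch below shows OAT on a toy foreground model; the model, parameter names, and coefficients are illustrative assumptions, not the paper's.

```python
# A minimal OAT screening sketch: perturb each foreground parameter by +/-10%
# and report its influence and trend direction on an impact indicator.
def impact(params: dict) -> float:
    """Toy impact indicator (e.g., kg CO2-eq per functional unit)."""
    return (0.8 * params["n_fertilizer"]          # field emissions
            + 0.3 * params["diesel"]              # machinery use
            - 0.5 * params["yield"])              # more output per hectare

base = {"n_fertilizer": 80.0, "diesel": 60.0, "yield": 7.0}

for name in base:
    hi, lo = dict(base), dict(base)
    hi[name] *= 1.10
    lo[name] *= 0.90
    delta = impact(hi) - impact(lo)               # effect of the +/-10% swing
    direction = "reduce" if delta > 0 else "raise"
    print(f"{name:14s} d(impact)={delta:+7.2f} -> {direction} it to cut impact")
```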

7.
The objective of this study is to construct a balanced chlorine budget for a small forested catchment, focusing on the interaction between chloride (Cl_inorg) and organic-matter-bound chlorine (Cl_org). Data from the actual catchment are combined with secondary data from other sites to elucidate more clearly which parts of the cycle are fairly well known and which are more or less unknown. The budget calculations show that the principal input and output fluxes of Cl in the catchment are inorganic but that the main pool is Cl_org in the soil. In addition, the budget calculations suggest that a considerable portion of Cl_inorg in soil is transformed to Cl_org and subsequently leached to deeper soil layers, that net mineralization of Cl_org takes place in soil, preferentially in deeper soil layers, and that degrading organic matter is a major source of Cl_inorg in runoff. The loss of Cl_org through runoff is small to negligible in relation to other fluxes. It appears that dry deposition of Cl_inorg is at risk of being underestimated if Cl_inorg is assumed to be conservative in soil. The pool of organic-matter-bound chlorine in soil is considerably larger than the annual flux of chloride through the system. The estimates suggest that the amount of Cl_org in the upper 40 cm of soil at the investigated site is approximately twice as large as the Cl_inorg. Furthermore, the amount of Cl_org in biomass is small in relation to the occurrence of Cl_org in soil. Finally, the estimates indicate that the transport of volatile Cl_org from the soil to the atmosphere may influence the chlorine cycle.

8.
To generate meaningful results, life cycle assessments (LCAs) require accurate technology data that are consistent with the goal and scope of the analysis. While literature data are available for many products and processes, finding representative data for highly site-specific technologies, such as waste treatment processes, remains a challenge. This study investigated representative life cycle inventory (LCI) modeling of waste treatment technologies in consideration of variations in technological level and climate. The objectives were to demonstrate the importance of representative LCI modeling as a function of the specificity of the study, and to illustrate the necessity of iteratively refining the goal and scope of the study as data are developed. A landfill case study was performed in which 52 discrete landfill data sets were built and grouped to represent different technology options and geographical sites, potential impacts were calculated, and minimum/maximum (min-max) intervals were generated for each group. The results showed decreasing min-max intervals with increasing specificity of the scope of the study, which indicates that compatibility between the scope of the study and the LCI model is critical. This study thereby quantitatively demonstrates the influence of representative modeling on LCA results. The results indicate that technology variations and site-specific conditions (e.g., the influence of precipitation and cover permeability on landfill gas generation and collection) should be carefully addressed by a systematic analysis of the key process parameters. Therefore, a thorough understanding of the targeted waste treatment technologies is necessary to ensure that appropriate data choices are made within the boundaries of the defined scope of the study.
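
The narrowing of min-max intervals with increasing scope specificity can be illustrated with a toy grouping computation. The group labels and impact values below are invented stand-ins, not the study's 52 landfill data sets.

```python
# A minimal sketch of min-max interval analysis over grouped LCI data sets.
import pandas as pd

data = pd.DataFrame({
    "gas_collection": ["none", "none", "flare", "flare", "energy", "energy"],
    "climate":        ["wet", "dry", "wet", "dry", "wet", "dry"],
    "gwp":            [980.0, 610.0, 420.0, 300.0, 260.0, 180.0],  # kg CO2-eq/t
})

# Broad scope: one wide interval across all technologies and sites
print("all landfills: min-max =", data["gwp"].min(), "-", data["gwp"].max())

# Narrower scope: intervals shrink once the technology group is pinned down
print(data.groupby("gas_collection")["gwp"].agg(["min", "max"]))

# Narrowest scope: technology and site conditions both specified
print(data.groupby(["gas_collection", "climate"])["gwp"].agg(["min", "max"]))
```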

9.
The effectiveness of CNS-acting drugs depends on their localization, targeting, and capacity to be transported through the blood–brain barrier (BBB), which can be achieved by designing brain-targeting delivery vectors. Hence, the objective of this study was to screen the formulation and process variables affecting the performance of sertraline (Ser-HCl)-loaded pegylated and glycosylated liposomes. The prepared vectors were characterized for Ser-HCl entrapment, size, surface charge, release behavior, and in vitro transport through the BBB. Furthermore, the compatibility among liposomal components was assessed using SEM, FTIR, and DSC analysis. Through a thorough screening study, enhanced Ser-HCl entrapment, nanosized liposomes with low skewness, maximized stability, and controlled drug leakage were attained. The solid-state characterization revealed a remarkable interaction between Ser-HCl and the charging agent that determines drug entrapment and leakage. Moreover, results of liposomal transport through mouse brain endothelial polyoma cells demonstrated the greater capacity of the proposed glycosylated liposomes to target the cerebellum, owing to its higher density of GLUT1 and higher glucose utilization. This transport capacity was confirmed by the inhibiting action of both cytochalasin B and phenobarbital. Using a C6 glioma cell model, flow cytometry, time-lapse live cell imaging, and in vivo NIR fluorescence imaging demonstrated that the optimized glycosylated liposomes can be transported through the BBB by classical endocytosis, as well as by specific transcytosis. In conclusion, the current study proposes a thorough screening of important formulation and process variables affecting brain-targeting liposomes for further scale-up processes.

10.
Nitrous oxide (N2O) is one of the greenhouse gases that contribute to global warming. Spatial variability of N2O can lead to large uncertainties in prediction. However, previous studies have often ignored spatial dependency when quantifying the relationships between N2O emissions and environmental factors. Few studies have examined the impacts of various spatial correlation structures (e.g., independence, distance-based, and neighbourhood-based) on the spatial prediction of N2O emissions. This study aimed to assess the impact of three spatial correlation structures on spatial predictions and to calibrate the spatial prediction using Bayesian model averaging (BMA) based on replicated, irregular point-referenced data. The data were measured in 17 chambers randomly placed across a 271 m² field between October 2007 and September 2008 in southeastern Australia. We used a Bayesian geostatistical model and a Bayesian spatial conditional autoregressive (CAR) model to investigate and accommodate spatial dependency, and to estimate the effects of environmental variables on N2O emissions across the study site. We compared these with a Bayesian regression model with independent errors. The three approaches resulted in different derived maps of the spatial prediction of N2O emissions. We found that incorporating spatial dependency in the model not only substantially improved predictions of N2O emission from soil, but also better quantified the uncertainties of soil parameters in the study. The hybrid model structure obtained by BMA improved the accuracy of the spatial prediction of N2O emissions across the study region.
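
The BMA calibration step combines the three models' predictions with weights given by posterior model probabilities. The study derives those weights from fully Bayesian fits; the sketch below instead uses the common BIC approximation, with all numbers invented for illustration.

```python
# A minimal BMA sketch over the three error structures named above,
# with weights approximated from (illustrative) BIC scores.
import numpy as np

models = {"independent": 412.3, "geostatistical": 398.7, "CAR": 401.2}  # BICs

bic = np.array(list(models.values()))
w = np.exp(-0.5 * (bic - bic.min()))
w /= w.sum()                                       # posterior model weights

# Each model's N2O prediction (g N/ha/day) at one hypothetical location
preds = np.array([14.2, 11.8, 12.5])
print(dict(zip(models, w.round(3))))
print("BMA prediction:", float(w @ preds))
```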

11.
Demand for grapes to produce pisco in southern coastal Peru is expected to double by 2030. However, the appellation of this beverage confines production and limits the space for agricultural expansion, leading to a situation in which potential competition for resources under the established constraints is foreseen. Hence, the objective of this study is to understand the environmental impacts, focused on climate change and water consumption, linked to agricultural dynamism in the valleys of Ica and Pisco due to an increase in the demand for pisco. For this, the viticulture system was analyzed with regard to predicted changes in terms of expansion, displacement or intensification using a consequential life cycle assessment (CLCA) approach, identifying the environmental consequences of these shifts. A two-step CLCA model was used based on the results of a previous attributional study, in which marginal effects were estimated following the stochastic technology-of-choice model (STCM) operational framework. Results identified a potential for increased pisco production based on crop substitution in the valleys of Ica and Pisco, and suggest that greenhouse gas emissions and water consumption would be reduced locally but that the displaced agricultural production would reverse this tendency. Beyond the policy implications of the results for the analyzed system, the proposed approach is robust and can be applied to other highly constrained agricultural systems, namely those regulated by geographic indications.

12.
A key priority in infectious disease research is to understand the ecological and evolutionary drivers of viral diseases from data on disease incidence as well as viral genetic and antigenic variation. We propose using a simulation-based Bayesian method known as Approximate Bayesian Computation (ABC) to fit and assess phylodynamic models that simulate pathogen evolution and ecology against summaries of these data. We illustrate the versatility of the method by analyzing two spatial models describing the phylodynamics of interpandemic human influenza virus subtype A(H3N2). The first model captures antigenic drift phenomenologically with continuously waning immunity, and the second, an epochal evolution model, describes the replacement of major, relatively long-lived antigenic clusters. Combining features of long-term surveillance data from the Netherlands with features of influenza A(H3N2) hemagglutinin gene sequences sampled in northern Europe, key phylodynamic parameters can be estimated with ABC. Goodness-of-fit analyses reveal that the irregularity in interannual incidence and H3N2's ladder-like hemagglutinin phylogeny are quantitatively reproduced only under the epochal evolution model within a spatial context. However, the concomitant incidence dynamics result in a very large reproductive number and are not consistent with empirical estimates of H3N2's population-level attack rate. These results demonstrate that the interactions between the evolutionary and ecological processes impose multiple quantitative constraints on the phylodynamic trajectories of influenza A(H3N2), so that sequence and surveillance data can be used synergistically. ABC, one of several data synthesis approaches, can easily interface a broad class of phylodynamic models with various types of data, but requires careful calibration of the summaries and tolerance parameters.
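
The core ABC recipe, simulate under parameters drawn from the prior and accept draws whose summary statistics fall within a tolerance of the observed summaries, is easy to state in code. The sketch below is rejection ABC on a deliberately trivial problem (a Poisson incidence rate); the summaries and tolerance are stand-ins for the paper's far richer phylodynamic setup.

```python
# A minimal rejection-ABC sketch: infer a Poisson rate from summary statistics.
import numpy as np

rng = np.random.default_rng(2)

observed = rng.poisson(4.0, size=20)                 # toy "surveillance" data
s_obs = np.array([observed.mean(), observed.var()])  # summary statistics

def simulate(rate: float) -> np.ndarray:
    x = rng.poisson(rate, size=20)
    return np.array([x.mean(), x.var()])

accepted = []
for _ in range(100_000):
    rate = rng.uniform(0.1, 10.0)                    # draw from the prior
    if np.linalg.norm(simulate(rate) - s_obs) < 0.8: # tolerance epsilon
        accepted.append(rate)

post = np.array(accepted)
print(f"accepted {post.size} draws; posterior mean rate = {post.mean():.2f}")
```

As the abstract notes, the results hinge on the choice of summaries and on epsilon: too loose a tolerance inflates the posterior, too tight a one starves it of accepted draws.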

13.
Life cycle assessment (LCA) has been applied to assessing emerging technologies, for which large-scale production data are generally lacking. This study introduces a standardized scheme for technology and manufacturing readiness levels to contextualize a technology's development stage. We applied the scheme to a carbon nanotube (CNT) LCA and found that, regardless of synthesis technique, CNT manufacturing will become less energy intensive with increased levels of readiness. We examined the influence of production volume on LCA results using primary data from a commercial CNT manufacturer with approximately 100 grams per day of production volume and the engineering design of a scaled-up process with 1 tonne per day of production capacity. The results show that scaling up could reduce cradle-to-gate impacts by 84% to 94%, mainly as a result of the recycling of feedstock, which becomes economically viable only beyond a certain minimum production volume. This study shows that LCAs of emerging technologies based on immature data should be interpreted in conjunction with their technology and manufacturing readiness levels, and it reinforces the need to standardize and communicate information on these readiness levels and the scale of production in life cycle inventory practices.

14.
Advances in scientific computing have allowed the development of complex models that are routinely applied to problems in disease epidemiology, public health and decision making. The utility of these models depends in part on how well they can reproduce empirical data. However, fitting such models to real-world data is greatly hindered both by large numbers of input and output parameters and by long run times, such that many modelling studies lack a formal calibration methodology. We present a novel method that has the potential to improve the calibration of complex infectious disease models (hereafter called simulators). We present this in the form of a tutorial and a case study in which we history match a dynamic, event-driven, individual-based stochastic HIV simulator, using extensive demographic, behavioural and epidemiological data available from Uganda. The tutorial describes history matching and emulation. History matching is an iterative procedure that reduces the simulator's input space by identifying and discarding areas that are unlikely to provide a good match to the empirical data. History matching relies on the computational efficiency of a Bayesian representation of the simulator, known as an emulator. Emulators mimic the simulator's behaviour, but are often several orders of magnitude faster to evaluate. In the case study, we use a 22-input simulator, fitting its 18 outputs simultaneously. After 9 iterations of history matching, a non-implausible region of the simulator input space was identified that was times smaller than the original input space. Simulator evaluations made within this region were found to have a 65% probability of fitting all 18 outputs. History matching and emulation are useful additions to the toolbox of infectious disease modellers. Further research is required to explicitly address the stochastic nature of the simulator as well as to account for correlations between outputs.
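
At the core of history matching is an implausibility measure: an input x is discarded when the emulator's prediction lies too many standard deviations from the observed target once emulator and observation uncertainties are combined. The one-dimensional sketch below uses an invented stand-in emulator and the conventional cutoff of 3; it is an illustration of the idea, not the paper's 22-input pipeline.

```python
# A minimal history-matching sketch: keep inputs with implausibility < 3.
import numpy as np

rng = np.random.default_rng(3)

z, obs_var = 2.5, 0.1**2                          # observed output + variance

def emulator(x):
    """Stand-in emulator: fast mean/variance prediction of the simulator."""
    return np.sin(3 * x) + x, 0.05**2 * np.ones_like(x)

x = rng.uniform(0.0, 3.0, 100_000)                # candidate input space
mean, em_var = emulator(x)
implausibility = np.abs(z - mean) / np.sqrt(obs_var + em_var)

non_implausible = x[implausibility < 3.0]         # the usual 3-sigma cutoff
print(f"input space kept: {non_implausible.size / x.size:.1%}")
print("range:", non_implausible.min(), "-", non_implausible.max())
```

Iterating this step, refitting the emulator inside the surviving region and discarding again, is what shrinks the input space over the 9 waves described above.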

15.
We present a systematic approach for prediction purposes based on panel data, involving information about different interacting subjects at different times (here: two). The corresponding bivariate regression problem can be solved analytically for the final statistical estimation error. Furthermore, this expression simplifies for the special case in which the subjects do not change their properties between the last measurement and the prediction period. This statistical framework is applied to the prediction of soccer matches, based on information from the previous and the present season. We determine how well the outcome of soccer matches can be predicted theoretically. This optimum limit is compared with the actual quality of the prediction, taking the German premier league as an example. As a key step in the actual prediction process, one has to identify appropriate observables that reflect the strength of the individual teams as closely as possible. A criterion to distinguish different observables is presented. Surprisingly, chances for goals turn out to be much better suited than the goals themselves for characterizing the strength of a team. Routes towards further improvement of the prediction are indicated. Finally, two specific applications are discussed.
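
One plausible form of such a criterion (not necessarily the paper's exact one) is split-sample predictiveness: a good strength observable measured in one half of a season should correlate strongly with results in the other half. The synthetic sketch below, in which teams' chance creation is stable but their conversion into goals is noisy, reproduces the abstract's point that chances outpredict goals; all numbers are illustrative.

```python
# A minimal sketch comparing two strength observables via split-half correlation.
import numpy as np

rng = np.random.default_rng(4)
strength = rng.normal(0.0, 1.0, 18)               # latent strength, 18 teams

def half_season(strength):
    chances = (40 + 8 * strength + rng.normal(0, 2, strength.size)).clip(5)
    goals = rng.binomial(chances.round().astype(int), 0.12)  # noisy conversion
    return chances, goals

(c1, g1), (c2, g2) = half_season(strength), half_season(strength)

print("chances 1st half vs goals 2nd half:", np.corrcoef(c1, g2)[0, 1].round(2))
print("goals   1st half vs goals 2nd half:", np.corrcoef(g1, g2)[0, 1].round(2))
```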

16.
17.

Object

To identify pre-operative prognostic parameters for survival in patients with spinal epidural neoplastic metastasis when the primary tumour is unknown.

Methods

This study was a retrospective chart review of patients who underwent surgery for spinal epidural neoplastic metastases between February 1997 and January 2011. The inclusion criteria were as follows: known post-operative survival period, a Karnofsky Performance Score equal to or greater than 30 points and a post-operative neoplastic metastasis histological type. The Kaplan-Meier method was used to estimate post-operative survival, and the Log-Rank test was used for statistical inference.

Results

A total of 52 patients who underwent 52 surgical procedures were identified. The mean age at the time of spinal surgery was 53.92 years (std. deviation, 19.09). The median survival after surgery was 70 days (95% CI 49.97–90.02), and post-operative mortality occurred within 6 months in 38 (73.07%) patients. Lung cancer, prostate cancer, myeloma and lymphoma, the 4 most common primary tumour types, affected 32 (61.53%) patients. The three identified prognostic parameters were the following: pre-operative walking incapacity (American Spinal Injury Association, A and B), present in 86.53% of the patients (p-value = 0.107); special care dependency (Karnofsky Performance Score, 10–40 points), present in 90.38% of the patients (p-value = 0.322); and vertebral epidural neoplastic metastases that were in contact with the thecal sac (Weinstein-Boriani-Biagini, sector D), present in 94.23% of the patients (p-value = 0.643). When the three secondary prognostic parameters were combined, the mean post-operative survival was 45 days; when at least one was present, the survival was 82 days (p-value = 0.175).

Conclusions

Walking incapacity, special care dependency and contact between the neoplastic metastases and the thecal sac can help determine the ultimate survival of this patient population and, potentially, which patients would benefit from surgery versus palliation alone. A 2- to 3-month post-operative survival period justified surgical treatment.
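
A minimal sketch of this style of analysis using the lifelines package, with synthetic survival times in place of the study's patient records (the group means loosely echo the 45- and 82-day figures reported above):

```python
# Kaplan-Meier estimation and a log-rank comparison on synthetic data.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(5)

# Post-operative survival (days): patients with all three adverse parameters
# versus the rest; exponential times are an illustrative assumption.
t_adverse = rng.exponential(45.0, 30)
t_other = rng.exponential(82.0, 22)
e_adverse = np.ones_like(t_adverse)               # all deaths observed here
e_other = np.ones_like(t_other)                   # (no censoring in this toy)

kmf = KaplanMeierFitter()
kmf.fit(t_adverse, event_observed=e_adverse, label="all 3 parameters")
print("median survival (days):", kmf.median_survival_time_)

result = logrank_test(t_adverse, t_other, e_adverse, e_other)
print("log-rank p-value:", round(result.p_value, 3))
```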

18.
In recent years approximate Bayesian computation (ABC) methods have become popular in population genetics as an alternative to full-likelihood methods to make inferences under complex demographic models. Most ABC methods rely on the choice of a set of summary statistics to extract information from the data. In this article we tested the use of the full allelic distribution directly in an ABC framework. Although the ABC techniques are becoming more widely used, there is still uncertainty over how they perform in comparison with full-likelihood methods. We thus conducted a simulation study and provide a detailed examination of ABC in comparison with full likelihood in the case of a model of admixture. This model assumes that two parental populations mixed at a certain time in the past, creating a hybrid population, and that the three populations then evolve under pure drift. Several aspects of ABC methodology were investigated, such as the effect of the distance metric chosen to measure the similarity between simulated and observed data sets. Results show that in general ABC provides good approximations to the posterior distributions obtained with the full-likelihood method. This suggests that it is possible to apply ABC using allele frequencies to make inferences in cases where it is difficult to select a set of suitable summary statistics and when the complexity of the model or the size of the data set makes it computationally prohibitive to use full-likelihood methods.

19.
Government agencies, companies, and other entities are using environmental assessments, like life cycle assessment (LCA), as an input to decision-making processes. Communicating the esoteric results of an LCA to these decision makers can present challenges, and interpretation aids are commonly provided to increase understanding. One such method is normalizing results as a means of providing context for interpreting magnitudes of environmental impacts. Normalization is mostly carried out by relating the environmental impacts of a product (or process) under study to those of another product or a spatial reference area (e.g., the United States). This research is based on the idea that decision makers might also benefit from normalization that considers comparisons to their entity's (agency, company, organization, etc.) total impacts to provide additional meaning and aid in comprehension. Two hybrid normalization schemes have been developed, which include aspects of normalization to both spatially based and entity-based impacts. These have been named entity-overlaid and entity-accentuated normalization, and the schemes allow for performance-based planning or emphasizing environmental impact types that are most relevant to an entity's operational profile, respectively. A hypothetical case study is presented to demonstrate these schemes, which uses environmental data from a U.S. transportation agency as the basis for entity normalization factors. Results of this case study illustrate how entity-related references may be developed, and how this additional information may enhance the presentation of LCA results using the hybrid normalization schemes.
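
The arithmetic underlying normalization is simple division by a reference total; the hybrid schemes differ in which references are presented and emphasized. A toy sketch with invented impact and reference values:

```python
# A minimal normalization sketch: the same LCA results divided by a spatial
# reference (e.g., U.S. totals) and by an entity reference (agency totals).
product = {"climate change": 3.2e4,  "acidification": 1.1e2}    # impact units
us_total = {"climate change": 6.7e12, "acidification": 2.0e10}
entity_total = {"climate change": 8.0e5, "acidification": 1.5e3}

for category in product:
    spatial = product[category] / us_total[category]
    entity = product[category] / entity_total[category]
    # An entity-overlaid scheme would present both ratios side by side;
    # an entity-accentuated one would highlight the largest entity ratios.
    print(f"{category:15s} spatial={spatial:.2e}  entity={entity:.2e}")
```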

20.
Farm intensification options in pasture-based dairy systems are generally associated with increased stocking rates coupled with the increased use of off-farm inputs to support the additional feed demand of animals. However, as well as increasing milk production per hectare, intensification can also exacerbate adverse impacts on the environment. The objective of the present study was to investigate environmental trade-offs associated with potential intensification methods for pasture-based dairy farming systems in the Waikato region, New Zealand. The intensification scenarios selected were (1) increased pasture utilization efficiency (PUE scenario), (2) increased use of nitrogen (N) fertilizer to boost on-farm pasture production (N fertilizer scenario), and (3) increased use of brought-in feed as maize silage (MS scenario). Twelve impact categories were assessed. The PUE scenario was the environmentally preferred intensification method, and the preferred choice between the N fertilizer and MS scenarios depended upon trade-offs between different environmental impacts. Sensitivity analysis was carried out to test the effects of choice associated with: (1) the approaches used to account for indirect land-use change (ILUC) and (2) the competing product systems (conventional beef systems) used to handle the co-product dairy meat for the climate change (CC) indicator. Results showed that the magnitude of the CC indicator results was influenced by the ILUC accounting approaches and the choice associated with a global marginal beef mix, but the relative CC indicator results for the three intensification scenarios remained unchanged.
