Similar Documents
20 similar documents found (search time: 15 ms)
1.
In physiological conditions, heart period (HP) affects systolic arterial pressure (SAP) through diastolic runoff and Starling's law, but the reverse relation also holds as a result of the continuous action of baroreflex control. The prevailing mechanism sets the dominant temporal direction in the HP-SAP interactions (i.e., causality). We exploited cross-conditional entropy to assess HP-SAP causality. A traditional approach based on phases was applied for comparison. The ability of the approach to detect the lack of a causal link from SAP to HP was assessed on 8 short-term (STHT) and 11 long-term heart transplant (LTHT) recipients (i.e., less than and more than 2 yr after transplantation, respectively). In addition, spontaneous HP and SAP variabilities were extracted from 17 healthy humans (ages 21-36 yr, median age 29 yr; 9 females) at rest and during graded head-up tilt. The tilt table inclinations ranged from 15 to 75° and were changed in steps of 15°. All subjects underwent recordings at every step in random order. The approach detected the lack of a causal relation from SAP to HP in STHT recipients and the gradual restoration of the causal link from SAP to HP with time after transplantation in the LTHT recipients. The head-up tilt protocol induced a progressive shift from the prevalent causal direction from HP to SAP to the reverse causality (i.e., from SAP to HP) with tilt table inclination in healthy subjects. Transformation of phases into time shifts and comparison with baroreflex latency supported this conclusion. The proposed approach is highly efficient because it does not require knowledge of baroreflex latency. The dependence of causality on tilt table inclination suggests that "spontaneous" baroreflex sensitivity estimated using noncausal methods (e.g., spectral and cross-spectral approaches) is more reliable at the highest tilt table inclinations.
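The cross-conditional entropy idea, asking how much the past of one series reduces uncertainty about the next value of the other beyond that series' own past, can be sketched with a plain histogram estimator. The binning scheme, single-lag structure, and synthetic coupled series below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def discretize(x, bins=4):
    """Map a continuous series onto integer symbols by quantile binning."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
    return np.digitize(x, edges)

def cond_entropy(target, *conds):
    """H(target | conds) estimated from joint symbol frequencies (nats)."""
    joint = np.stack([target, *conds], axis=1)
    _, j_counts = np.unique(joint, axis=0, return_counts=True)
    _, c_counts = np.unique(np.stack(conds, axis=1), axis=0, return_counts=True)
    n = len(target)
    h_joint = -np.sum(j_counts / n * np.log(j_counts / n))
    h_cond = -np.sum(c_counts / n * np.log(c_counts / n))
    return h_joint - h_cond

def directed_index(x, y, bins=4):
    """Reduction in uncertainty about y(t) gained by conditioning on x(t-1)
    over and above y's own past: a crude causality index from x to y."""
    xs, ys = discretize(x, bins), discretize(y, bins)
    h_own = cond_entropy(ys[1:], ys[:-1])
    h_both = cond_entropy(ys[1:], ys[:-1], xs[:-1])
    return h_own - h_both

rng = np.random.default_rng(0)
x = rng.normal(size=3000)
y = np.empty_like(x)
y[0] = 0.0
for t in range(1, len(x)):          # y is driven by x with a one-beat lag
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

print(directed_index(x, y))  # coupling x -> y: clearly positive
print(directed_index(y, x))  # reverse direction: much smaller
```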

2.
The dependence of the antigen-binding activity of immobilized antibodies on the pH of the saturating buffer has been investigated. We analyzed 28 monoclonal antibodies (MCAs) produced by various hybridomas to three virus antigens, i.e., the nuclear p23 protein of hepatitis C virus (C core protein p23), the p24 protein of HIV-1, and the surface antigen of hepatitis B virus (HBsAg). Antibodies were adsorbed on the surfaces of immune plates in acidic (pH 2.8), neutral (pH 7.5), and alkaline (pH 9.5) buffers. The binding of labeled antigens, i.e., biotinylated or conjugated with horseradish peroxidase, to the immobilized antibodies was tested. It was shown that 10 of the 28 analyzed MCAs (36%) preserved their antigen-binding activity considerably better if their passive adsorption was carried out on the surface of polystyrene plates in an acidic buffer (pH 2.8). This approach allowed the construction of a highly sensitive sandwich method for HBsAg assay with a minimal reliably determined antigen concentration of 0.013–0.017 ng/ml. The described approach may be recommended for the optimization of sandwich methods and solid-phase competitive methods.

3.
Kinetics of facilitated ion transport through planar bilayer membranes are normally analyzed by electrical conductance methods. The additional use of electrical relaxation techniques, such as voltage jump, is necessary to evaluate individual rate constants. Although electrochemical impedance spectroscopy is recognized as the most powerful of the available electrical relaxation techniques, it has rarely been used in connection with these kinetic studies. According to the new approach presented in this work, three steps were followed. First, a kinetic model was proposed that has the distinct quality of being general, i.e., it properly describes both carrier and channel mechanisms of ion transport. Second, the state equations for steady-state and for impedance experiments were derived, exhibiting the input–output representation pertaining to the model's structure. With the application of a method based on the similarity transformation approach, it was possible to check that the proposed mechanism is distinguishable, i.e., no other model with a different structure exhibits the same input–output behavior for any input as the original. Additionally, the method allowed us to check whether the proposed model is globally identifiable (i.e., whether there is a single set of fit parameters for the model) when analyzed in terms of its impedance response. Thus, our model does not represent a theoretical interpretation of the experimental impedance but rather constitutes the prerequisite to select this type of experiment in order to obtain optimal kinetic identification of the system. Finally, impedance measurements were performed and the results were fitted to the proposed theoretical model in order to obtain the kinetic parameters of the system. The successful application of this approach is exemplified with results obtained for valinomycin–K+ in lipid bilayers supported on gold substrates, i.e., an arrangement capable of emulating biological membranes.

4.
Satten GA, Carroll RJ. Biometrics, 2000, 56(2): 384-388
We consider methods for analyzing categorical regression models when some covariates (Z) are completely observed but other covariates (X) are missing for some subjects. When data on X are missing at random (i.e., when the probability that X is observed does not depend on the value of X itself), we present a likelihood approach for the observed data that allows the same nuisance parameters to be eliminated in a conditional analysis as when data are complete. An example of a matched case-control study is used to demonstrate our approach.

5.
Copy number variants (CNVs) play an important role in the etiology of many diseases such as cancers and psychiatric disorders. Because of modest marginal effect sizes or the rarity of individual CNVs, collapsing rare CNVs together and evaluating their joint effect is a key approach to assessing their contribution to disease risk. While a plethora of powerful collapsing methods are available for sequence variants (e.g., SNPs) in association analysis, these methods cannot be directly applied to rare CNVs due to CNV-specific challenges: the multi-faceted nature of CNV polymorphisms (e.g., CNVs vary in size, type, dosage, and details of gene disruption) and etiological heterogeneity (e.g., heterogeneous effects of duplications and deletions that occur within a locus or in different loci). Existing CNV collapsing analysis methods (a.k.a. burden tests) tend to have suboptimal performance because they often ignore heterogeneity and evaluate only the marginal effects of a CNV feature. We introduce CCRET, a random effects test for collapsing rare CNVs when searching for disease associations. CCRET is applicable to variants measured on a multi-categorical scale, collectively models the effects of multiple CNV features, and is robust to etiological heterogeneity. Multiple confounders can be corrected for simultaneously. To evaluate the performance of CCRET, we conducted extensive simulations and analyzed large-scale schizophrenia datasets. We show that CCRET has powerful and robust performance under multiple types of etiological heterogeneity, and has performance comparable to or better than existing methods when there is no heterogeneity.
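CCRET is described as a random effects test; a variance-component (kernel) score statistic of the general SKAT-like form conveys the flavor of why such tests tolerate effect-sign heterogeneity. The dosage matrix, effect sizes, and linear kernel here are hypothetical choices for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 5                       # subjects, hypothetical CNV features
G = rng.integers(0, 3, size=(n, p)).astype(float)    # toy dosage matrix
beta = np.array([0.8, -0.6, 0.0, 0.5, 0.0])          # heterogeneous effects
logit = -0.5 + G @ beta
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Null model: intercept only; mu0 is the fitted case prevalence.
mu0 = y.mean()
resid = y - mu0

# Kernel similarity between subjects' CNV profiles (linear kernel here).
K = G @ G.T

# Variance-component score statistic: large when phenotypically similar
# subjects are also similar in CNV profile, regardless of effect signs.
Q = resid @ K @ resid
print(Q)
```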

6.
Reporter-based assays underlie many high-throughput screening (HTS) platforms, but most are limited to in vitro applications. Here, we report a simple whole-organism HTS method for quantifying changes in reporter intensity in individual zebrafish over time, termed Automated Reporter Quantification in vivo (ARQiv). ARQiv differs from current "high-content" (e.g., confocal imaging-based) whole-organism screening technologies by providing a purely quantitative data acquisition approach that affords marked improvements in throughput. ARQiv uses a fluorescence microplate reader with the specific detection functionalities necessary for robust quantification of reporter signals in vivo. This approach is: 1) rapid, achieving true HTS capacities (i.e., >50,000 units per day); 2) reproducible, attaining HTS-compatible assay quality (i.e., Z'-factors of ≥0.5); and 3) flexible, amenable to nearly any reporter-based assay in zebrafish embryos, larvae, or juveniles. ARQiv is used here to quantify changes in: 1) cell number, loss and regeneration of two different fluorescently tagged cell types (pancreatic beta cells and rod photoreceptors); 2) cell signaling, relative activity of a transgenic Notch-signaling reporter; and 3) cell metabolism, accumulation of reactive oxygen species. In summary, ARQiv is a versatile and readily accessible approach facilitating evaluation of genetic and/or chemical manipulations in living zebrafish that complements current "high-content" whole-organism screening methods by providing a first-tier in vivo HTS drug discovery platform.
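The Z'-factor criterion quoted above is a standard assay-quality statistic, Z' = 1 - 3(sd_pos + sd_neg)/|mean_pos - mean_neg|; a minimal sketch with hypothetical control-well fluorescence readings:

```python
import numpy as np

def z_prime(pos, neg):
    """Z'-factor: assay-quality metric; >= 0.5 is generally considered
    compatible with high-throughput screening."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

# Hypothetical plate-reader fluorescence values for control wells.
positive = [980, 1010, 995, 1005, 990]
negative = [102, 98, 95, 105, 100]
print(round(z_prime(positive, negative), 3))  # 0.947: well-separated assay
```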

7.
8.
Data quality     
A methodology is presented that enables incorporating expert judgment regarding the variability of input data for environmental life cycle assessment (LCA) modeling. The quality of input data in the life-cycle inventory (LCI) phase is evaluated by LCA practitioners using data quality indicators developed for this application. These indicators are incorporated into the traditional LCA inventory models that produce non-varying point estimate results (i.e., deterministic models) to develop LCA inventory models that produce results in the form of random variables that can be characterized by probability distributions (i.e., stochastic models). The outputs of these probabilistic LCA models are analyzed using classical statistical methods for better decision and policy making information. This methodology is applied to real-world beverage delivery system LCA inventory models. The inventory study results for five beverage delivery system alternatives are compared using statistical methods that account for the variance in the model output values for each alternative. Sensitivity analyses are also performed that indicate model output value variance increases as input data uncertainty increases (i.e., input data quality degrades). Concluding remarks point out the strengths of this approach as an alternative to providing the traditional qualitative assessment of LCA inventory study input data with no efficient means of examining the combined effects on the model results. Data quality assessments can now be captured quantitatively within the LCA inventory model structure. The approach produces inventory study results that are variables reflecting the uncertainty associated with the input data. These results can be analyzed using statistical methods that make efficient quantitative comparisons of inventory study alternatives possible. 
Recommendations for future research are also provided that include the screening of LCA inventory model inputs for significance and the application of selection and ranking techniques to the model outputs.
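The deterministic-to-stochastic step described above amounts to replacing point-estimate inputs with distributions and propagating them by Monte Carlo simulation. A minimal sketch with hypothetical inventory inputs and uncertainty ranges (not the study's data):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000   # Monte Carlo draws

# Hypothetical inventory inputs for one beverage delivery system, each with
# an uncertainty range derived from a data-quality assessment: better data
# quality -> tighter distribution.
glass_per_unit = rng.triangular(0.48, 0.50, 0.55, N)   # kg container / unit
energy_per_kg  = rng.triangular(7.5, 8.0, 9.2, N)      # MJ / kg glass
transport_mj   = rng.normal(1.2, 0.15, N)              # MJ / unit delivered

# The deterministic point model becomes a distribution of results.
total_mj = glass_per_unit * energy_per_kg + transport_mj
lo, hi = np.percentile(total_mj, [2.5, 97.5])
print(f"mean {total_mj.mean():.2f} MJ/unit, 95% interval [{lo:.2f}, {hi:.2f}]")
```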

9.
Reef fish distributions are patchy in time and space, with some coral reef habitats supporting higher densities (i.e., aggregations) of fish than others. Identifying and quantifying fish aggregations (particularly during spawning events) are often top priorities for coastal managers. However, the rapid mapping of these aggregations using conventional survey methods (e.g., non-technical SCUBA diving and remotely operated cameras) is limited by depth, visibility and time. Acoustic sensors (i.e., splitbeam and multibeam echosounders) are not constrained by these same limitations, and were used to concurrently map and quantify the location, density and size of reef fish along with seafloor structure in two separate locations in the U.S. Virgin Islands. Reef fish aggregations were documented along the shelf edge, an ecologically important ecotone in the region. Fish were grouped into three classes according to body size, and relationships with the benthic seascape were modeled in one area using Boosted Regression Trees. These models were validated in a second area to test their predictive performance in locations where fish have not been mapped. Models predicting the density of large fish (≥29 cm) performed well (i.e., AUC = 0.77). Water depth and standard deviation of depth were the most influential predictors at two spatial scales (100 and 300 m). Models of small (≤11 cm) and medium (12–28 cm) fish performed poorly (i.e., AUC = 0.49 to 0.68) due to the high prevalence (45–79%) of smaller fish in both locations, and the unequal prevalence of smaller fish in the training and validation areas. Integrating acoustic sensors with spatial modeling offers a new and reliable approach to rapidly identify fish aggregations and to predict the density of large fish in un-surveyed locations. This integrative approach will help coastal managers to prioritize sites, and focus their limited resources on areas that may be of higher conservation value.
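The AUC values reported above can be computed without any modeling library via the Mann-Whitney relation: AUC is the probability that a randomly chosen presence site is scored above a randomly chosen absence site. A small sketch with hypothetical predicted densities:

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U relation:
    the probability that a random positive outranks a random negative."""
    sp = np.asarray(scores_pos, float)
    sn = np.asarray(scores_neg, float)
    greater = (sp[:, None] > sn[None, :]).sum()   # pairwise comparisons
    ties = (sp[:, None] == sn[None, :]).sum()     # ties count half
    return (greater + 0.5 * ties) / (len(sp) * len(sn))

# Hypothetical predicted densities at sites with / without large fish.
present = [0.9, 0.8, 0.75, 0.6]
absent  = [0.7, 0.4, 0.3, 0.2, 0.1]
print(auc(present, absent))  # 0.95: 19 of 20 pairs correctly ordered
```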

10.
11.
Burial is one of the most fundamental processes in mass-balance calculations for substances (such as nutrients, organics, metals and radionuclides) in lakes. Substances can leave a lake by two processes: outflow, i.e., transport to a downstream system, and burial, i.e., transport by sedimentation from the lake biosphere to the geosphere. This work gives, for the first time to the best of the author's knowledge, a review of the factors and processes regulating burial and presents a general model for burial. This approach accounts for bottom dynamic conditions (i.e., where areas of fine sediment erosion, transport and accumulation prevail), sedimentation, bioturbation, mineralisation, and the depth and age of the bioactive sediment layer. This approach has been critically tested with very good results for radiocesium, radiostrontium, many metals, calcium from liming and phosphorus, but it has not been presented before in a comprehensive way. This model for burial is meant to be used in mass-balance models based on ordinary differential equations (i.e., box models) in contexts where burial is not a target y-variable but a necessary model variable (an x-variable). This means that there are also specific demands on this approach, e.g., it must be based on readily accessible driving variables so that it is not too difficult to use the model in practice within the context of an overall lake model. The factors influencing burial, e.g., the deposition of materials and the depth of the bioactive sediment layer, are also needed in calculations of sediment concentrations and to determine amounts of substances or pollutants in sediments. To carry out such calculations, one also needs information on sediment bulk density, water content and organic content. This paper also presents new empirical models for such calculations to be used in the new model for burial.
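In a box model of the kind described, burial simply adds a second first-order loss term to the mass-balance equation. A minimal sketch with illustrative, not lake-specific, parameter values:

```python
# Minimal two-flux lake box model (illustrative parameter values):
#   dM/dt = inflow - (k_out + k_bur) * M
inflow = 100.0    # substance input, kg/yr
k_out = 0.8       # outflow rate constant, 1/yr
k_bur = 0.2       # burial rate constant, 1/yr

dt, years = 0.01, 30.0
M = 0.0
for _ in range(int(years / dt)):      # forward Euler integration
    M += dt * (inflow - (k_out + k_bur) * M)

print(M)                  # approaches steady state inflow/(k_out+k_bur)
burial_flux = k_bur * M   # kg/yr permanently lost to the geosphere
print(burial_flux)
```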

12.

Background  

Common structural biology methods (i.e., NMR and molecular dynamics) often produce ensembles of molecular structures. Consequently, averaging of 3D coordinates of molecular structures (proteins and RNA) is a frequent approach to obtain a consensus structure that is representative of the ensemble. However, when the structures are averaged, artifacts can result in unrealistic local geometries, including unphysical bond lengths and angles.
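The artifact is easy to reproduce: naively averaging the coordinates of two superposed conformations that differ by a rotation shortens the apparent bond. A toy two-atom example (hypothetical geometry):

```python
import numpy as np

# Two superposed conformations of a two-atom fragment: the bond has unit
# length in both, but its orientation differs by 90 degrees.
conf_a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
conf_b = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]])

mean_conf = (conf_a + conf_b) / 2          # naive coordinate averaging
bond = np.linalg.norm(mean_conf[1] - mean_conf[0])
print(bond)   # ~0.707: the averaged "bond" is unphysically short
```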

13.
Banerjee S, Carlin BP. Biometrics, 2004, 60(1): 268-275
Several recent papers (e.g., Chen, Ibrahim, and Sinha, 1999, Journal of the American Statistical Association 94, 909-919; Ibrahim, Chen, and Sinha, 2001a, Biometrics 57, 383-388) have described statistical methods for use with time-to-event data featuring a surviving fraction (i.e., a proportion of the population that never experiences the event). Such cure rate models and their multivariate generalizations are quite useful in studies of multiple diseases to which an individual may never succumb, or from which an individual may reasonably be expected to recover following treatment (e.g., various types of cancer). In this article we extend these models to allow for spatial correlation (estimable via zip code identifiers for the subjects) as well as interval censoring. Our approach is Bayesian, where posterior summaries are obtained via a hybrid Markov chain Monte Carlo algorithm. We compare across a broad collection of rather high-dimensional hierarchical models using the deviance information criterion, a tool recently developed for just this purpose. We apply our approach to the analysis of a smoking cessation study where the subjects reside in 53 southeastern Minnesota zip codes. In addition to the usual posterior estimates, our approach yields smoothed zip code level maps of model parameters related to the relapse rates over time and the ultimate proportion of quitters (the cure rates).
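The "surviving fraction" enters through the standard mixture cure rate formulation, S(t) = pi + (1 - pi) S0(t), so population survival plateaus at the cure fraction instead of decaying to zero. A minimal sketch with an exponential susceptible-group survival and hypothetical parameters:

```python
import math

def pop_survival(t, cure_frac, rate):
    """Mixture cure rate model: S(t) = pi + (1 - pi) * S0(t), with an
    exponential susceptible-group survival S0 (illustrative choice)."""
    return cure_frac + (1.0 - cure_frac) * math.exp(-rate * t)

pi, lam = 0.3, 0.5          # hypothetical cure fraction and hazard
print(pop_survival(0.0, pi, lam))    # 1.0: everyone event-free at t = 0
print(pop_survival(50.0, pi, lam))   # ~0.3: survival plateaus at the cure rate
```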

14.
In this paper, we address the multiple peak alignment problem in sequential data analysis with an approach based on the Gaussian scale-space theory. We assume that multiple sets of detected peaks are the observed samples of a set of common peaks. We also assume that the locations of the observed peaks follow unimodal distributions (e.g., normal distribution) with their means equal to the corresponding locations of the common peaks and variances reflecting the extension of their variations. Under these assumptions, we convert the problem of estimating locations of the unknown number of common peaks from multiple sets of detected peaks into a much simpler problem of searching for local maxima in the scale-space representation. The optimization of the scale parameter is achieved using an energy minimization approach. We compare our approach with a hierarchical clustering method using both simulated data and real mass spectrometry data. We also demonstrate the merit of extending the binary peak detection method (i.e., a candidate is considered either as a peak or as a nonpeak) with a quantitative scoring measure-based approach (i.e., we assign to each candidate a possibility of being a peak).
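The conversion to a search for local maxima can be sketched directly: superpose a Gaussian kernel at each observed peak location and read off the maxima of the resulting scale-space representation. The peak locations, kernel width, and grid below are illustrative assumptions:

```python
import numpy as np

# Observed peak locations from three hypothetical runs; each run is a noisy
# sample of two common peaks near 10.0 and 20.0.
observed = np.concatenate([
    [9.8, 19.9], [10.1, 20.2], [10.0, 19.8],
])

# Scale-space representation at scale sigma: superpose a Gaussian kernel at
# each observed location, then find local maxima on a fine grid.
sigma = 0.5
grid = np.linspace(0, 30, 3001)
density = np.exp(-0.5 * ((grid[:, None] - observed[None, :]) / sigma) ** 2).sum(axis=1)

interior = (density[1:-1] > density[:-2]) & (density[1:-1] > density[2:])
common_peaks = grid[1:-1][interior]
print(common_peaks)   # two maxima, near 10 and 20
```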

15.
The very early events of the intrinsic, damage-induced apoptotic pathway, i.e., upstream of Bax activation, probably consist of physico-chemical alterations (i.e., redox, pH or Ca2+ changes) rather than subtle molecular interactions, and in spite of many studies they remain unclear. One problem is that cells undergo apoptosis in an asynchronous way, leading to heterogeneity in the cell population that impairs the results of bulk analyses. In this study, we present a flow cytometric approach for studying Ca2+ alteration in apoptosis at the single cell level. By means of a multiparametric analysis, we could discriminate different sub-populations, i.e., viable and apoptotic cells and cells in secondary necrosis, and separately analyse static as well as dynamic Ca2+ parameters in each sub-population. With this approach, we have identified a set of sequential Ca2+ changes; two very early ones occur prior to any other apoptotic alterations, whereas a later change coincides with the appearance of apoptosis. Interestingly, the two pre-apoptotic changes occur simultaneously in all treated cells, i.e., at fixed times post-treatment, whereas the later one occurs at varying times, i.e., within a wide time range, concomitantly with the other apoptotic events.

16.
Policy discussions about the feasibility of massively scaling up antiretroviral therapy (ART) to reduce HIV transmission and incidence hinge on accurately projecting the cost of such scale-up in comparison to the benefits from reduced HIV incidence and mortality. We review the available literature on modelled estimates of the cost of providing ART to different populations around the world, and suggest alternative methods of characterising cost when modelling several decades into the future. In past economic analyses of ART provision, costs were often assumed to vary by disease stage and treatment regimen, but for treatment as prevention, in particular, most analyses assume a uniform cost per patient. This approach disregards variables that can affect unit cost, such as differences in factor prices (i.e., the prices of supplies and services) and the scale and scope of operations (i.e., the sizes and types of facilities providing ART). We discuss several of these variables, and then present a worked example of a flexible cost function used to determine the effect of scale on the cost of a proposed scale-up of treatment as prevention in South Africa. Adjusting previously estimated costs of universal testing and treatment in South Africa for diseconomies of small scale, i.e., more patients being treated in smaller facilities, adds 42% to the expected future cost of the intervention.
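A flexible cost function of the kind discussed can be as simple as a unit cost with constant scale elasticity, c(n) = c0 * n^(-gamma); the parameter values below are hypothetical, not the paper's estimates:

```python
# Illustrative cost function with economies of scale: unit cost falls as a
# facility treats more patients, c(n) = c0 * n**(-gamma).
c0, gamma = 400.0, 0.15          # cost at n = 1, and scale elasticity

def unit_cost(n):
    return c0 * n ** (-gamma)

# The same 100,000 patient-years delivered in large vs small facilities:
large = 100 * (1000 * unit_cost(1000))   # 100 clinics x 1,000 patients
small = 1000 * (100 * unit_cost(100))    # 1,000 clinics x 100 patients
print(small / large)   # > 1: small-scale delivery costs more in total
```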

17.
In many civil engineering projects, the foundation soils do not provide the required mechanical properties, and therefore there is a need to improve the soil. Compaction, soil reinforcement, and soil mixing with natural or chemical additives are common stabilization methods used to improve soil mechanical properties. Environmental problems arising from traditional improvement techniques have encouraged engineers to explore new methods. Recently, a new technique in geotechnical engineering called biogeotechnology has been introduced to improve the mechanical properties of soil. It is an environmentally friendly approach that uses biological methods to solve geotechnical problems, relying on mineral-producing microorganisms. This study investigates the possibility of improving soil strength properties with microbial calcite precipitation and the effect of fine-grained percentages in this regard. In order to determine the soil strength properties, consolidated drained direct shear tests were carried out on untreated and treated soil samples. The results showed that this method is applicable to improving all soil samples (from 100% coarse-grained (i.e., sand) to 100% fine-grained (i.e., clay)). However, the strength increase in sand is much more pronounced than that for finer soils. It was found that a considerable increase in cohesion of treated soil can be achieved for soil samples with at most 10% fine content.
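Direct shear results such as these are conventionally reduced to Mohr-Coulomb parameters by fitting tau = c + sigma_n * tan(phi) across normal-stress levels; a sketch with hypothetical treated-soil data:

```python
import numpy as np

# Direct shear tests at three normal-stress levels (hypothetical data for a
# treated sandy sample); fit the Mohr-Coulomb envelope by least squares.
sigma_n = np.array([50.0, 100.0, 200.0])    # normal stress, kPa
tau     = np.array([64.0, 103.0, 181.0])    # peak shear stress, kPa

slope, cohesion = np.polyfit(sigma_n, tau, 1)   # tau = slope*sigma_n + c
phi = np.degrees(np.arctan(slope))              # friction angle, degrees
print(f"c = {cohesion:.1f} kPa, phi = {phi:.1f} deg")
```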

18.
19.
In vivo sampling of interstitial fluid by using microdialysis fibers has become a standard and accepted procedure. This sampling method is generally coupled to offline analysis of consecutive dialysate samples by high-performance liquid chromatography or capillary electrophoresis, but this combination is not the best approach for some applications, especially those which require high temporal resolution and rapid data collection. The purpose of this review is to provide information on enzyme-based online assays, i.e., continuous analysis of the dialysate as it emerges from the outlet of the sampling device. We have focused on methods developed specifically for the analysis of solutions perfused at a very slow flow rate, i.e., a feature of microdialysis and ultrafiltration techniques. These methods include flow enzyme-fluorescence assays, flow enzyme-amperometric assays, and sequential enzyme-amperometric detection. Each type of assay is discussed in terms of principle, applications, advantages, and limitations. We also comment on implantable biosensors, an obvious next step forward for in vivo monitoring of molecules in neuroscience.

20.
ONTOGENY AND THE HIERARCHY OF TYPES
Abstract— The long history of belief in a parallelism between ontogeny and a hierarchical order of natural things is reviewed. The meaning of von Baerian recapitulation is analyzed and its implications for cladistic methodology are discussed at two levels: ontogeny and homology. The basic problem inherent in the purported parallelism is that the order of natural things (i.e., the taxic approach to homology) is part of the "world of being" of Platonic ideas, whereas ontogeny and phylogeny (i.e., the transformational approach to homology) belong to Plato's "world of becoming." These two "genera of existence," as Plato put it, being and becoming, are incompatible but complementary views of nature.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号