Similar documents
20 similar documents found (search time: 15 ms)
1.
A scaling effort on perovskite solar cells is presented in which device manufacture is progressed onto flexible substrates using scalable techniques such as slot-die roll coating under ambient conditions. Printing of the back electrode, using both carbon and silver, is essential to the scaling effort. Both normal and inverted device geometries are explored, and the formation of the correct perovskite layer morphology is found to depend heavily on the surface upon which it is coated, with significant implications for manufacture. The time required to form the desired layer morphology falls in the range of 5–45 min depending on the perovskite precursor, where the shorter timescale is compatible with mass production and the longer is best suited to laboratory work. A significant loss in solar cell performance of around 50% is found when moving to a fully scalable fabrication process, comparable to what is observed for other printable solar cell technologies such as polymer solar cells. Devices processed using spin coating on indium tin oxide (ITO)-glass with an evaporated back electrode yield a power conversion efficiency (PCE) of 9.4%. The same device type and active area realized using slot-die coating on flexible ITO-polyethylene terephthalate (PET) with a printed back electrode gives a PCE of 4.9%.

2.
The transmission dynamics of infectious diseases have traditionally been described through a time-inhomogeneous Poisson process, thus assuming exponentially distributed levels of disease tolerance following the Sellke construction. Here we focus on a generalization using Weibull individual tolerance thresholds under the susceptible-exposed-infectious-removed class of models, which is widely employed in epidemics. Applications to experimental foot-and-mouth disease and historical smallpox data are discussed, and simulation results are presented. Inference is carried out using Markov chain Monte Carlo methods following a Bayesian approach. Model evaluation is performed, where the adequacy of the models is assessed using methodology based on the properties of Bayesian latent residuals, and comparison between two candidate models is also considered using a latent likelihood ratio-type test that avoids problems encountered with related methods based on Bayes factors.
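The Sellke construction mentioned above can be illustrated with a short final-size simulation. This is a minimal sketch, not the authors' code: it uses an SIR model rather than SEIR (no latent period), unit-scale Weibull thresholds, and omits the Bayesian MCMC inference entirely; the function name and parameters are illustrative.

```python
import random

def sellke_final_size(n, beta, gamma, shape, seed=0):
    """Final size of an SIR epidemic via the Sellke construction.

    Each of the n - 1 initial susceptibles carries a Weibull(shape)
    tolerance threshold (scale 1); it becomes infected once the total
    infection pressure exerted exceeds that threshold.  shape = 1.0
    recovers the classical exponentially distributed thresholds.
    """
    rng = random.Random(seed)
    # Ordered tolerance thresholds for the susceptibles.
    thresholds = sorted(rng.weibullvariate(1.0, shape) for _ in range(n - 1))
    # Exponential infectious periods with rate gamma.
    periods = [rng.expovariate(gamma) for _ in range(n)]
    infected = 1                        # one index case
    exerted = beta / n * periods[0]     # total pressure exerted so far
    for q in thresholds:
        if q > exerted:                 # no remaining susceptible reaches q
            break
        exerted += beta / n * periods[infected]
        infected += 1
    return infected
```

With shape below 1 the threshold distribution has more mass near zero, so early infections occur more easily than in the exponential case, which is exactly the kind of behaviour the Weibull generalization is meant to capture.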

3.
Schrödle B, Held L, Rue H. Biometrics 2012, 68(3): 736–744
Linking information on a movement network with space-time data on disease incidence is one of the key challenges in infectious disease epidemiology. In this article, we propose and compare two statistical frameworks for this purpose, namely parameter-driven (PD) and observation-driven (OD) models. Bayesian inference in PD models is done using integrated nested Laplace approximations, while OD models can easily be fitted with existing software using maximum likelihood. The predictive performance of both formulations is assessed using proper scoring rules. As a case study, the impact of cattle trade on the spatiotemporal spread of coxiellosis in Swiss cows, 2004–2009, is investigated.

4.
This review discusses the prevalence of, and potential for, interactive effects between herbivory and competition on plant growth and biomass; such effects typically arise when there is a mismatch between the spatial scale of herbivore behaviour (food or patch choice) and the spatial heterogeneity of the plant community. Historically, such interactive effects have been examined using two approaches. Studies using the first approach have excluded plant neighbors and herbivores in a factorial experiment and scored effects on plant biomass. Studies using the second approach have observed herbivore abundance or herbivory on plants with or without plant neighbors, and have identified a large number of mechanisms underlying such interactive effects. The two types of studies have produced somewhat conflicting results: interactive effects have been commonly observed in studies using the second approach but only rarely in studies using the first. This is most likely a consequence of a biased choice of study systems, where studies using the first approach have primarily examined mammalian herbivory while studies using the second approach have focused more on insect herbivory. Moreover, studies using the first approach have typically been very small-scale manipulations, which probably precludes most possible interactive effects in systems with mammalian herbivory. This points to the fact that studies examining interactive effects of herbivory and plant competition should more carefully consider the behaviour and life history of the herbivores included prior to designing removal experiments.

5.
6.
Efficient protein solubilization using detergents is required for in-depth proteome analysis, but successful LC-MS/MS analysis depends greatly on proper detergent removal. A commonly used sample processing method is filter-aided sample preparation (FASP), which allows protein digestion and detergent removal on the same filtration device. Many optimizations of the FASP protocol have been published, but there is no information on the influence of filtration-unit geometry on detergent removal. The aim of this study was to compare the performance of conic and flat-bottom filtration units in terms of the number of proteins identified by LC-MS/MS. We analyzed 1, 10 and 100 μg of total cell lysate prepared using lysis buffers with different SDS concentrations, and compared the FASP protocol using conic and flat-bottom filtration units to an ethanol precipitation method. Subsequently, we applied our best-performing protocol to a single murine pancreatic islet, and identified up to 2463 proteins using FASP versus 1169 proteins using ethanol precipitation. We conclude that FASP performance depends strongly on the filter shape: flat-bottom devices are better suited for low-protein samples, as they allow better SDS removal, leading to the identification of a greater number of proteins.

7.
The concept of environmental index is briefly reviewed. With a biologically plausible theoretical model, the consequences of indices being based on incomplete information about the familial environment are investigated. It is shown that using partial indices on all family members leads to both an underestimation of the familial environmental component of variance, also called cultural heritability, and, although to a possibly lesser extent, an overestimation of the genetic heritability. It is further shown that so long as complete indices are available on both parents, using identically partial indices on children will yield nearly undistorted parameter estimates. These conclusions were arrived at by using a special case of the model and may not apply in general.

8.
9.
Molecularly imprinted polymers (MIPs) using p-hydroxybenzoic acid (p-HB), p-hydroxyphenylacetic acid (p-HPA) and p-hydroxyphenylpropionic acid (p-HPPA) as templates were synthesized. The performance of the templates and their analogues on polymer-based high performance liquid chromatography (HPLC) columns was studied. The imprinting effect of the MIP using p-HB as template is more obvious than that of the MIPs using either p-HPA or p-HPPA as template, and a mixture of p-HB and p-HPA can be well separated on the MIP using p-HB as template, but not on the blank polymer. Interestingly, the recognition of the MIP (p-HB as template) toward p-HB showed a synergistic effect: the retention factor of p-HB is not simply the sum of those of phenol and benzoic acid. We also found that the imprinting effect decreased with increasing concentration of acetic acid in the mobile phase, possibly because acetic acid molecules occupy the binding sites of the polymer, thereby decreasing the concentration of available binding sites. Furthermore, polymers showing specificity to 3,4-dihydroxybenzoic acid can be prepared with p-HB as template. It is thus possible to synthesize a specific polymer for a compound that is expensive or unstable by using a structurally similar compound as the template.

10.
To deal with imbalanced data in a classification problem, this paper proposes a data balancing technique to be used in conjunction with a committee network. The proposed technique is based on the growing ring self-organizing map (GRSOM), an unsupervised learning algorithm. GRSOM balances the data by growing new data points on a well-defined ring structure, which is iteratively developed based on the winning node near the samples; the new balanced data thus still preserve the topology of the original data. The performance of the proposed method is evaluated using four real data sets from the UCI Machine Learning Repository, with classification performance measured using fivefold cross validation. Classifiers with the most common data balancing techniques, namely the Synthetic Minority Over-sampling Technique (SMOTE) and random under-sampling, are used as the baseline methods. The results reveal that a committee of classifiers constructed using GRSOM performs at least as well as the baselines, and suggest that classifiers constructed using neural networks with the backpropagation algorithm are more robust than those using the support vector machine.
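The balance-then-vote pipeline can be sketched as follows. Note this is a hedged illustration: the balancing step here is naive random oversampling standing in as a placeholder for GRSOM (whose ring-growing update is not reproduced), and all function names are hypothetical.

```python
import random
from collections import Counter

def balance_by_oversampling(X, y, seed=0):
    """Duplicate minority-class samples until every class matches the
    majority count.  A crude stand-in for GRSOM, which instead grows
    synthetic points on a ring structure around winning map nodes."""
    rng = random.Random(seed)
    counts = Counter(y)
    target = max(counts.values())
    Xb, yb = list(X), list(y)
    for cls, c in counts.items():
        idx = [i for i, lab in enumerate(y) if lab == cls]
        for _ in range(target - c):
            j = rng.choice(idx)          # resample an existing minority point
            Xb.append(X[j])
            yb.append(cls)
    return Xb, yb

def committee_predict(members, x):
    """Majority vote of a committee of classifiers on one sample.
    Each member is any callable mapping a sample to a class label."""
    votes = Counter(clf(x) for clf in members)
    return votes.most_common(1)[0][0]
```

Each committee member would be trained on its own balanced copy of the data; the vote then smooths out the variance that the resampling introduces.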

11.
12.
A method of isolating and identifying biotin polypeptides from crude cellular extracts is described. Protein samples are run on small avidin-Sepharose columns. After washing away nonspecifically bound protein, the biotin enzymes are eluted using an SDS-urea solution. The inactive polypeptides are then electrophoresed on SDS-polyacrylamide gels. The biotin polypeptides in the gels are identified by using fluorescent avidin or by analyzing for radioactive biotin-labeled polypeptides. A sensitive method for assaying biotin using avidin-Sepharose is also described.

13.
Bayesian Inference for a Random Tessellation Process
P. G. Blackwell. Biometrics 2001, 57(2): 502–507
This article describes an inhomogeneous Poisson point process in the plane with an intensity function based on a Dirichlet tessellation process and a method for using observations on the point process to make fully Bayesian inferences about the underlying tessellation. The method is implemented using a Markov chain Monte Carlo approach. An application to modeling the territories of clans of badgers, Meles meles, is described.
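A tessellation-based intensity of this kind can be simulated by thinning. The sketch below assumes a piecewise-constant intensity over the Voronoi (Dirichlet) cells of a fixed set of centres on the unit square; the Bayesian MCMC inference over the tessellation itself is not shown, and all names are illustrative.

```python
import math
import random

def poisson_sample(lam, rng):
    """Knuth's method for one Poisson(lam) draw."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def nearest_cell(x, y, centres):
    """Index of the Voronoi cell of `centres` containing (x, y)."""
    return min(range(len(centres)),
               key=lambda i: (x - centres[i][0]) ** 2 + (y - centres[i][1]) ** 2)

def tessellation_poisson(centres, rates, seed=0):
    """Inhomogeneous Poisson process on the unit square whose intensity
    is constant within each Voronoi cell, simulated by thinning a
    homogeneous process run at the maximum cell rate."""
    rng = random.Random(seed)
    lam_max = max(rates)
    points = []
    for _ in range(poisson_sample(lam_max, rng)):   # unit square has area 1
        x, y = rng.random(), rng.random()
        cell = nearest_cell(x, y, centres)
        if rng.random() < rates[cell] / lam_max:    # keep with prob rate/max
            points.append((x, y, cell))
    return points
```

Thinning keeps the simulation exact whatever the cell geometry, at the cost of wasted candidate points in low-intensity cells.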

14.
Task scheduling is one of the most challenging aspects of improving the overall performance of cloud computing and optimizing cloud utilization and Quality of Service (QoS). This paper focuses on task scheduling optimization using a novel approach based on dynamic dispatch queues (TSDQ) and hybrid meta-heuristic algorithms. We propose two hybrid meta-heuristic algorithms: the first uses fuzzy logic with the Particle Swarm Optimization algorithm (TSDQ-FLPSO), the second uses simulated annealing with Particle Swarm Optimization (TSDQ-SAPSO). Several experiments have been carried out on an open-source simulator (CloudSim) using synthetic and real data sets from real systems. The experimental results demonstrate the effectiveness of the proposed approach; optimal results are obtained using TSDQ-FLPSO compared to TSDQ-SAPSO and other existing scheduling algorithms, especially on high-dimensional problems. The TSDQ-FLPSO algorithm shows a great advantage in terms of waiting time, queue length, makespan, cost, resource utilization, degree of imbalance, and load balancing.
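The core optimization can be illustrated with a plain PSO over task-to-VM assignments. This is a simplified stand-in for the paper's TSDQ-FLPSO/TSDQ-SAPSO, with no fuzzy logic, no simulated annealing, and no dispatch queues; the fitness is makespan only, and every identifier is illustrative.

```python
import random

def makespan(assignment, task_len, vm_speed):
    """Completion time of the busiest VM under a task -> VM assignment."""
    load = [0.0] * len(vm_speed)
    for t, vm in enumerate(assignment):
        load[vm] += task_len[t] / vm_speed[vm]
    return max(load)

def pso_schedule(task_len, vm_speed, n_particles=20, iters=100, seed=0):
    """Plain PSO: continuous positions, rounded to VM indices to evaluate."""
    rng = random.Random(seed)
    n_tasks, n_vms = len(task_len), len(vm_speed)
    def decode(pos):
        return [min(n_vms - 1, max(0, int(p))) for p in pos]
    pos = [[rng.uniform(0, n_vms) for _ in range(n_tasks)]
           for _ in range(n_particles)]
    vel = [[0.0] * n_tasks for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [makespan(decode(p), task_len, vm_speed) for p in pos]
    g_i = min(range(n_particles), key=lambda i: pbest_f[i])
    g, g_f = pbest[g_i][:], pbest_f[g_i]
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and attraction weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_tasks):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (g[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = makespan(decode(pos[i]), task_len, vm_speed)
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < g_f:
                    g, g_f = pos[i][:], f
    return decode(g), g_f
```

The rounding trick keeps the swarm dynamics continuous while the objective stays discrete, a common way to apply PSO to assignment problems.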

15.
Bioprocess engineering at present concentrates on the enormous problems in the environment. What is therefore needed is a sound methodology based on the interactions between the physiology of biological reaction networks and physical processes in the environment, in analogy to bioreactor performance. The conventional methodology is empirically oriented, using pilot plant data for experimental estimates of process economics, with further details elucidated following the mechanistic approach on the microscopic level, based on assumed mechanisms (causalities). According to the new view, the new systems-based methodology uses mathematical models as approximations and includes all the interactions. Pilot plant data are needed for model falsification, using analogies on the formal macroscopic level. Bioreactor scale-up, as one application, becomes a more rapid procedure of reasonable accuracy, in which both the biokinetics and the fluid dynamics are quantified using formal macroscopic analogies. Model consistency and plausibility are the basic criteria when using model computer simulations as a decisive aid, while experiments lose their central role and are no longer the basis for evaluating the scientific work; they are simply the basis of the researcher's intuition. Another typical feature of complex systems is that model parameters are interdependent. The final decisive factor will be the mental experiment (“thinking” with the left and right sides of the brain), which can be supported using computer simulations. This evolutionary interplay between the three realities of thinking, experimenting and simulating leads to holistic progress towards a better understanding of highly complex systems, and offers the solution to the problems that will be faced in future.

16.
When analysing human movement through stereophotogrammetry, skin-markers are used. Their movement relative to the underlying bone is known as the soft tissue artefact (STA). A mathematical model to estimate subject- and marker-specific STAs generated during a given motor task is required both for skeletal kinematic estimators and for comparative assessment using simulation. This study devises and assesses such a mathematical model using the paradigmatic case of thigh STAs. The model was based on two hypotheses: (1) that the artefact mostly depends on skin sliding, and thus on the angles of the hip and knee; (2) that the relevant relationship is linear. These hypotheses were tested using data obtained from passive hip and knee movements in non-obese specimens and from running volunteers endowed with both skin- and pin-markers.

17.
Combined application of dexamethasone and metronidazole in preventing and treating acute flare-ups during root canal therapy
Objective: To observe the effect of combined dexamethasone and metronidazole in managing acute flare-ups during root canal therapy. Methods: Sixty-six cases of acute flare-up during root canal therapy were treated with combined dexamethasone and metronidazole, dexamethasone alone, or formocresol alone, and the differences between groups were observed. Results: The combined dexamethasone and metronidazole group showed better treatment outcomes than the other groups, with significant differences. Conclusion: Combined application of dexamethasone and metronidazole is effective in treating acute flare-ups during root canal therapy.

18.
An automated microarray diagnostic system for specific IgE using photoimmobilized allergens has been developed. Photoimmobilization is useful for preparing microarrays in which various types of biological components are covalently immobilized on a plate. Because the immobilization is based on a photo-induced radical cross-linking reaction, it does not require specific functional groups on the immobilized components. Here, an aqueous solution of a photoreactive poly(ethylene glycol)-based polymer was spin-coated on a plate, and an aqueous solution of each allergen was microspotted on the coated plate and allowed to dry in air. Finally, the plate was irradiated with an ultraviolet lamp for covalent immobilization. An automated machine using these plates was developed for the assay of antigen-specific IgE. First, patient serum was added to the microarray plate; after reaction of the microspotted allergen with IgE, the adsorbed IgE was detected by a peroxidase-conjugated anti-IgE antibody. The chemiluminescence intensity of the substrate decomposed by the peroxidase was automatically detected using a sensitive charge-coupled device camera. All the allergens were immobilized stably using this method, which was used to screen for allergen-specific IgE; the results were comparable with those of conventional specific IgE assays. Using this system, six different allergen-specific IgEs were assayed using 10 μL of serum within 20 min.

19.
Structure-based protein NMR assignments using native structural ensembles
An important step in NMR protein structure determination is the assignment of resonances and NOEs to the corresponding nuclei. Structure-based assignment (SBA) uses a model structure ("template") for the target protein to expedite this process. Nuclear vector replacement (NVR) is an SBA framework that combines multiple sources of NMR data (chemical shifts, RDCs, sparse NOEs, amide exchange rates, TOCSY) and has high accuracy when the template is close to the target protein's structure (less than 2 Å backbone RMSD). However, a close template may not always be available. We extend the circle of convergence of NVR for distant templates by using an ensemble of structures. This ensemble corresponds to low-frequency perturbations of the given template and is obtained using normal mode analysis (NMA). Our algorithm assigns resonances and sparse NOEs using each of the structures in the ensemble separately, and aggregates the results using a voting scheme based on maximum bipartite matching. Experimental results on human ubiquitin, using four distant template structures, show an increase in assignment accuracy. Our algorithm also improves the robustness of NVR with respect to structural noise. We provide a confidence measure for each assignment using the percentage of the structures that agree on that assignment, and use this measure to assign a subset of the peaks with even higher accuracy. We further validate our algorithm on data for two additional proteins with NVR. We then show the general applicability of our approach by applying our NMA ensemble-based voting scheme to another SBA tool, MARS. For three test proteins with corresponding templates, including the 370-residue maltose binding protein, we increase the number of reliable assignments made by MARS. Finally, we show that our voting scheme is sound and optimal, by proving that it is a maximum likelihood estimator of the correct assignments.
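The ensemble voting step can be sketched as follows. This toy version brute-forces the best one-to-one assignment over permutations instead of running a real maximum bipartite matching (so it only scales to a handful of peaks), and all identifiers are illustrative, not the NVR implementation.

```python
from collections import Counter
from itertools import permutations

def vote_assign(per_structure_assignments):
    """Aggregate per-structure peak -> residue assignments by voting,
    then pick the one-to-one assignment maximising total votes.

    Each element of `per_structure_assignments` is a dict produced by
    assigning against one ensemble member.  Returns the consensus
    assignment and a per-peak confidence (fraction of structures that
    agree with the consensus choice)."""
    peaks = sorted({p for a in per_structure_assignments for p in a})
    residues = sorted({r for a in per_structure_assignments
                       for r in a.values()})
    votes = Counter()
    for a in per_structure_assignments:
        for p, r in a.items():
            votes[(p, r)] += 1
    best, best_score = None, -1
    # Exhaustive search over injective peak -> residue maps (toy scale).
    for perm in permutations(residues, len(peaks)):
        score = sum(votes[(p, r)] for p, r in zip(peaks, perm))
        if score > best_score:
            best, best_score = dict(zip(peaks, perm)), score
    n = len(per_structure_assignments)
    confidence = {p: votes[(p, r)] / n for p, r in best.items()}
    return best, confidence
```

The confidence values mirror the paper's idea of trusting peaks on which most ensemble members agree; thresholding them yields the higher-accuracy subset of assignments.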

20.
Physical development is regarded as the somatic equivalent of the capacity for physical work. Two integrative indexes of body size and shape have been derived using component analysis, and their values can be estimated from body length, weight and chest circumference. The influence of subcutaneous fat on the size and shape indexes is removed using linear regression. A conclusion on the type of physical development may be drawn from the combination of the two indexes.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)