Similar Documents
1.
In this study, an inverse algorithm based on the conjugate gradient method and the discrepancy principle is applied to solve the inverse hyperbolic heat conduction problem of estimating the unknown time-dependent surface heat flux in a skin tissue, stratified into epidermis, dermis, and subcutaneous layers, from temperature measurements taken within the medium. The temperature distributions in the tissue can then be calculated as well. The concept of finite heat propagation velocity is applied in modeling the bioheat transfer problem. The inverse solutions are validated through numerical experiments in which two different heat flux distributions are to be determined. Temperature data obtained from the direct problem are used to simulate the temperature measurements, and the influence of measurement errors on the precision of the estimated results is also investigated. Results show that an excellent estimate of the time-dependent surface heat flux can be obtained for the test cases considered in this study.
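A minimal sketch of the inverse idea described above, under simplifying assumptions: the forward model is classical (Fourier) one-dimensional conduction in a single homogeneous layer rather than the hyperbolic multi-layer bioheat model, the flux is parameterised as a few constant-in-time segments, and a generic conjugate-gradient least-squares fit (SciPy's CG with numerical gradients, no discrepancy-principle stopping rule) stands in for the authors' adjoint-based scheme. All material values and the geometry are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

L, NX = 0.01, 21                       # slab thickness (m), grid points
ALPHA, K = 1.4e-7, 0.5                 # diffusivity (m^2/s), conductivity (W/m K)
DX = L / (NX - 1)
DT, NT = 0.5, 200                      # time step (s), number of steps
SENSOR = 4                             # measurement node (2 mm below the surface)
N_SEG = 5                              # flux parameterised as 5 constant segments

def forward(flux_segments):
    """Explicit finite-difference conduction solve; returns sensor temperatures."""
    T = np.zeros(NX)                   # temperature rise above the initial state
    readings = np.zeros(NT)
    r = ALPHA * DT / DX ** 2           # 0.28 < 0.5, so the explicit scheme is stable
    for n in range(NT):
        q = flux_segments[min(n * N_SEG // NT, N_SEG - 1)]
        Tn = T.copy()
        T[1:-1] = Tn[1:-1] + r * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2])
        T[0] = Tn[0] + 2 * r * (Tn[1] - Tn[0] + q * DX / K)   # heated surface node
        T[-1] = 0.0                                           # far side held fixed
        readings[n] = T[SENSOR]
    return readings

true_flux = np.array([500.0, 1500.0, 2500.0, 1500.0, 500.0])   # W/m^2 (illustrative)
measured = forward(true_flux) + np.random.default_rng(5).normal(0.0, 0.01, NT)

objective = lambda q: np.sum((forward(q) - measured) ** 2)
fit = minimize(objective, x0=np.full(N_SEG, 1000.0), method="CG")
print(np.round(fit.x, 1))              # should recover values close to true_flux
```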

2.
A general predictive relation for convection heat transfer from animal forms is developed. The relation is based on the convection equation for a sphere and employs a single, simple characteristic dimension to represent the animal: the cube root of the animal's volume. Its accuracy is established through comparison with available convection results for animals ranging in size and shape from spiders to cows, and it allows extrapolation to animal shapes for which data are not available. Results are also presented for the enhancement of convection heat transfer due to natural turbulence, and a procedure is outlined for estimating the convective heat loss from an animal in the natural outdoor environment.
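As a rough illustration of how such a relation would be used, the sketch below computes forced-convection heat loss with the characteristic length taken as the cube root of the body volume. The sphere-type (Ranz-Marshall-style) Nusselt correlation, the air properties, and the sphere-equivalent surface area are assumptions for illustration, not the paper's fitted relation.

```python
def convective_heat_loss(volume_m3, t_surface, t_air, wind_speed):
    # Air properties near 20 C (assumed constants)
    k_air = 0.026      # thermal conductivity, W/(m K)
    nu_air = 1.5e-5    # kinematic viscosity, m^2/s
    pr_air = 0.71      # Prandtl number

    L = volume_m3 ** (1.0 / 3.0)          # characteristic dimension = V^(1/3), m
    re = wind_speed * L / nu_air          # Reynolds number
    nu = 2.0 + 0.6 * re ** 0.5 * pr_air ** (1.0 / 3.0)  # sphere-type Nusselt number
    h = nu * k_air / L                    # convection coefficient, W/(m^2 K)

    area = 4.836 * volume_m3 ** (2.0 / 3.0)  # sphere-equivalent surface area (assumption)
    return h * area * (t_surface - t_air)    # convective heat loss, W

# Example: a 60 L body at 35 C surface temperature in 20 C air with a 2 m/s wind
print(f"{convective_heat_loss(0.060, 35.0, 20.0, 2.0):.1f} W")
```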

3.
The existing theory of quasi-stationary plasma turbulence presumes that the growth rate of plasma waves is zero. In this paper, it is proposed to determine the spectrum of such waves using the concept of undamped Vlasov waves. Results concerning the ion-acoustic velocity within this framework are presented for two models of ion-acoustic turbulence. It is shown that using the spectral properties of undamped ion-acoustic waves removes the uncertainty in estimating the time and efficiency of strong turbulent plasma heating.

4.
Modeling the spatial distribution of image properties is important for many pattern recognition problems in science and engineering. Mathematical methods are needed to quantify the variability of this spatial distribution so that an optimal classification decision can be made. However, image properties are often subject to uncertainty arising from both incomplete and imprecise information. This paper presents an integrated approach for estimating the spatial uncertainty of vagueness in images using the theory of geostatistics and the calculus of probability measures of fuzzy events. The resulting model of spatial uncertainty is used as a new image feature extraction method, on which classifiers can be trained to perform pattern recognition. Applications of the proposed algorithm to the classification of various types of image data suggest the usefulness of the proposed uncertainty modeling technique for texture feature extraction.

5.
Material flow analysis (MFA) is widely used to investigate flows and stocks of resources or pollutants in a defined system. Data availability for quantifying material flows at a national or global level is often limited owing to scarce or missing data, so MFA input data are considered inherently uncertain. In this work, an approach to characterize the uncertainty of MFA input data is presented and applied to a case study on plastics flows in major Austrian consumption sectors in the year 2010. The approach consists of data quality assessment as a basis for estimating the uncertainty of input data. Four implementations are examined, differing in how indicator scores are translated into uncertainty ranges (linear- vs. exponential-type functions) and in the underlying probability distributions (normal vs. log-normal). The case study results indicate that the way uncertainty estimates for material flows are derived has a stronger effect on the uncertainty ranges of the resulting plastics flows than the assumptions about the underlying probability distributions. Because these uncertainty estimates originate from data quality evaluation as well as uncertainty characterization, it is crucial to use a well-defined approach, built on several steps, to ensure consistent translation of the data quality underlying the material flow calculations into the associated uncertainties. Although subjectivity is inherent in uncertainty assessment in MFA, the proposed approach is consistent and provides comprehensive documentation of the choices underlying the uncertainty analysis, which is essential for interpreting the results and using MFA as a decision support tool.
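A minimal sketch of the kind of score-to-uncertainty translation the abstract describes: a data-quality indicator score is mapped to a coefficient of variation through either a linear- or an exponential-type function, and a flow is then sampled from a normal or a log-normal distribution. The specific score-to-CV mappings and the example flow are illustrative assumptions, not the paper's calibrated functions.

```python
import numpy as np

rng = np.random.default_rng(0)

def score_to_cv(score, mode="linear"):
    """Map a data-quality indicator score (1 = best, 4 = worst) to a coefficient
    of variation. Both mappings are illustrative assumptions."""
    if mode == "linear":
        return 0.05 * score                 # 5%, 10%, 15%, 20%
    return 0.05 * 2.0 ** (score - 1)        # 5%, 10%, 20%, 40%

def sample_flow(mean, score, mode="linear", dist="normal", n=10_000):
    cv = score_to_cv(score, mode)
    sigma = cv * mean
    if dist == "normal":
        return rng.normal(mean, sigma, n)
    # log-normal parameterised to match the same mean and standard deviation
    s2 = np.log(1.0 + cv ** 2)
    mu = np.log(mean) - 0.5 * s2
    return rng.lognormal(mu, np.sqrt(s2), n)

# Example: a 100 kt/yr plastics flow with a mediocre data-quality score of 3
flow = sample_flow(100.0, score=3, mode="exponential", dist="lognormal")
print(flow.mean(), np.percentile(flow, [2.5, 97.5]))
```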

6.
Recent suggestions for an improved model of heat transfer in living tissues emphasize the existence of a convective mode due to flowing blood in addition to, or even instead of, the perfusive mode proposed in Pennes' "classic" bioheat equation. In view of these suggestions, it would be beneficial to have a technique that can distinguish between these two modes of bioheat transfer. To this end, a concept that utilizes a multiprobe array of thermistors in conjunction with a revised bioheat transfer equation has been derived to distinguish between, and to quantify, the perfusive and convective contributions of blood to heat transfer in living tissues. The array consists of two or more temperature sensors, one of which also serves to locally insert a short pulse of heat into the tissue prior to the temperature measurements. A theoretical analysis shows that such a concept is feasible. The construction of the system involves the selection of several important design parameters, i.e., the distance between the probes, the heating power, and the pulse duration; the choice of these parameters is based on computer simulations of the actual experiment.

7.
This work deals with the flow and heat transfer of an upper-convected Maxwell fluid above an exponentially stretching surface. The Cattaneo-Christov heat flux model is employed in formulating the energy equation; this model can predict the effect of thermal relaxation time on the boundary layer. A similarity approach is used to normalize the governing boundary layer equations, and local similarity solutions are obtained by a shooting approach combined with a fourth-fifth-order Runge-Kutta integration technique and Newton's method. Our computations reveal that the fluid temperature has an inverse relationship with the thermal relaxation time, and that the fluid velocity is a decreasing function of the fluid relaxation time. A comparison of Fourier's law and the Cattaneo-Christov law is also presented. Results for the present problem, even in the Newtonian fluid case, are not yet available in the literature.
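To illustrate the shooting technique mentioned above without reproducing the paper's equations, the sketch below solves the classical Blasius flat-plate similarity equation by shooting on the wall value f''(0); the upper-convected Maxwell and Cattaneo-Christov relaxation terms and the coupled energy equation are omitted, and a bracketing root finder stands in for Newton's method.

```python
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Blasius equation: f''' + 0.5 f f'' = 0, with f(0) = f'(0) = 0 and f'(inf) -> 1.
def rhs(eta, y):
    f, fp, fpp = y
    return [fp, fpp, -0.5 * f * fpp]

def far_field_velocity(fpp0, eta_max=10.0):
    """Integrate from the wall with a guessed f''(0) and return f'(eta_max)."""
    sol = solve_ivp(rhs, (0.0, eta_max), [0.0, 0.0, fpp0], rtol=1e-9, atol=1e-9)
    return sol.y[1, -1]

# Shoot on the wall shear f''(0) so that the far-field velocity reaches 1.
fpp0 = brentq(lambda s: far_field_velocity(s) - 1.0, 0.1, 1.0)
print(f"f''(0) = {fpp0:.5f}  (classical value ~0.33206)")
```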

8.

A CUSUM chart method is presented as an alternative tool for continuous monitoring of an electromagnetic field-based (EMF) antifouling (AF) treatment of a heat exchanger cooled by seawater. During an initial experimental phase, biofilm growth was allowed in a heat exchanger formed of four tubes until sufficient growth had been established; continuous EMF treatment was then applied in two of the tubes. The heat transfer resistance and heat duty (heat transfer per unit time) results showed that biofilm adhesion was reduced by the EMF treatment, which yielded a 35% improvement in the heat transfer resistance values. The proposed CUSUM chart method showed that the EMF treatment increased the useful life of the heat exchanger by approximately 20 days. Thus, CUSUM charts proved to be an efficient tool for continuous monitoring of an AF treatment using data collected online and can also be used to reduce operation and maintenance costs.
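A minimal two-sided tabular CUSUM sketch of the kind of monitoring described above, applied to simulated heat-transfer-resistance readings; the reference value k, decision interval h, and the synthetic data are illustrative assumptions, not the study's values.

```python
import numpy as np

def cusum(x, target, k, h):
    """Two-sided tabular CUSUM. Returns the upper/lower statistics and the
    index of the first out-of-control sample (or None)."""
    c_plus = np.zeros(len(x))
    c_minus = np.zeros(len(x))
    alarm = None
    for i, xi in enumerate(x):
        prev_p = c_plus[i - 1] if i else 0.0
        prev_m = c_minus[i - 1] if i else 0.0
        c_plus[i] = max(0.0, xi - (target + k) + prev_p)
        c_minus[i] = max(0.0, (target - k) - xi + prev_m)
        if alarm is None and (c_plus[i] > h or c_minus[i] > h):
            alarm = i
    return c_plus, c_minus, alarm

# Example: simulated heat-transfer-resistance readings that shift upward as
# biofilm accumulates (all values are made up for illustration).
rng = np.random.default_rng(1)
baseline = rng.normal(1.0, 0.05, 30)
fouling = rng.normal(1.2, 0.05, 30)
readings = np.concatenate([baseline, fouling])

cp, cm, alarm = cusum(readings, target=1.0, k=0.025, h=0.25)
print("first out-of-control sample:", alarm)
```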

9.
The design and fabrication of a thermal chip with an array of temperature sensors and heaters for studying the micro-jet impingement cooling heat transfer process are presented. This thermal chip minimizes heat loss from the system to the ambient and provides a uniform heat flux along the wall, so that local heat transfer processes along the wall can be measured. The fabrication procedure presented achieves a chip yield of 100%, with every sensor and heater on the chip in good condition. In addition, micro-jet impingement cooling experiments are performed to obtain the micro-scale local Nusselt number along the wall, and flow visualization of the micro-impinging jet is also carried out. The experimental results indicate that both the micro-scale impinging jet flow structure and the heat transfer process along the wall are significantly different from those of the large-scale jet impingement cooling process.

10.
Knowledge of tissue thermal transport properties is imperative for any therapeutic medical tool that employs the localized application of heat to perfused biological tissue. In this study, several techniques are proposed to measure local tissue thermal diffusion by heating with a focused ultrasound field. Transient as well as near steady-state heat inputs are discussed and examined for their suitability as measurement techniques for either tissue thermal diffusivity or perfusion rate. It is shown that steady-state methods are better suited for the measurement of perfusion; however, the uncertainty in the perfusion measurement is directly related to knowledge of the tissue's intrinsic thermal diffusivity. Results are presented for a transient thermal pulse technique for measuring the thermal diffusivity of perfused and nonperfused tissues, in vitro and in vivo. Measurements conducted in plexiglas, animal muscle, kidney, and brain concur with tabulated values and show a scatter of 5-15 percent about the mean; measurements made in perfused muscle and brain compare well with the nonperfused values. An estimate of the error introduced by perfusion shows that, except for highly perfused kidney tissue, the effect of perfusion is less than the experimental scatter. This validation of the tissue heat transfer model will allow its eventual extension to the simultaneous measurement of local tissue thermal diffusivity and perfusion.
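As an illustration of a transient pulse technique (not necessarily the authors' focused-ultrasound protocol), the sketch below recovers thermal diffusivity from the time at which the temperature peaks at a known distance from an idealized instantaneous point heat source in a non-perfused medium, using t_peak = r^2 / (6 * alpha). All parameter values are assumed.

```python
import numpy as np

def point_pulse_temperature(r, t, alpha, q_over_rho_c):
    """Temperature rise at radius r and time t after an instantaneous point
    heat pulse in an infinite, conduction-only medium."""
    return q_over_rho_c / (4.0 * np.pi * alpha * t) ** 1.5 * np.exp(-r ** 2 / (4.0 * alpha * t))

alpha_true = 1.4e-7          # m^2/s, typical soft tissue (assumed)
r = 2.0e-3                   # sensor 2 mm from the heated spot
t = np.linspace(1.0, 120.0, 2000)
temp = point_pulse_temperature(r, t, alpha_true, q_over_rho_c=1.0e-6)

# For this model the response peaks at t_peak = r^2 / (6 alpha), so invert it.
t_peak = t[np.argmax(temp)]
alpha_est = r ** 2 / (6.0 * t_peak)
print(f"estimated alpha = {alpha_est:.2e} m^2/s (true {alpha_true:.2e})")
```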

11.
A simple procedure for estimating the false discovery rate
MOTIVATION: The false discovery rate (FDR) is nowadays the most widely used criterion in microarray data analysis. In the framework of estimation procedures based on the marginal distribution of the P-values, without any assumption on gene expression changes, estimators of the FDR are necessarily conservatively biased: only an upper-bound estimate can be obtained for the key quantity pi0, the probability that a gene is unmodified. In this paper, we propose a novel family of estimators for pi0 that allows the calculation of the FDR. RESULTS: A very simple method for estimating pi0, called LBE (Location Based Estimator), is presented together with results on its variability. Simulation results indicate that the proposed estimator performs well in finite samples and has the best mean square error in most cases compared with the procedures QVALUE, BUM and SPLOSH. The different procedures are then applied to real datasets. AVAILABILITY: The R function LBE is available at http://ifr69.vjf.inserm.fr/lbe CONTACT: broet@vjf.inserm.fr.
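For context, the sketch below shows how an estimate of pi0 feeds into an FDR estimate, using a simple lambda-based (Storey-style) estimator rather than the paper's LBE; the synthetic p-value mixture is illustrative.

```python
import numpy as np

def estimate_pi0(pvalues, lam=0.5):
    """Fraction of p-values above lam, rescaled: an upwardly biased estimate
    of the proportion of unmodified genes."""
    return min(1.0, np.mean(pvalues > lam) / (1.0 - lam))

def estimate_fdr(pvalues, threshold, lam=0.5):
    """Estimated FDR for the rejection rule p <= threshold."""
    pi0 = estimate_pi0(pvalues, lam)
    n_rejected = max(1, int(np.sum(pvalues <= threshold)))
    return pi0 * len(pvalues) * threshold / n_rejected

# Example: 9000 null genes (uniform p-values) + 1000 modified genes
rng = np.random.default_rng(2)
p_null = rng.uniform(size=9000)
p_alt = rng.beta(0.2, 5.0, size=1000)      # skewed toward small p-values
p = np.concatenate([p_null, p_alt])

print("pi0 estimate:", estimate_pi0(p))     # true pi0 = 0.9
print("FDR at p <= 0.01:", estimate_fdr(p, 0.01))
```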

12.
Comparisons of national standards of air kerma for conventional and mammographic diagnostic X-ray radiation qualities were conducted by the IAEA. Eleven secondary standards dosimetry laboratories provided calibration data for the circulated Exradin A3 and Radcal RC6M transfer ionization chambers. Each comparison result, expressed as the ratio of the participant and IAEA calibration coefficients, was within the acceptance limit of ±2.5%. Of the 67 results from 11 participants and 10 available beam qualities, the comparison result was within its standard uncertainty in 63 cases and within the expanded (k = 2) uncertainty in four cases. Detailed calibration uncertainty budgets from the participant laboratories are presented; the relative standard calibration uncertainty of each participant was in the range of 0.5-1.3%. These results indicate that the calibration-related uncertainty component is reasonably low for a clinical measurement. In addition to the calibration coefficient, other corrections should be applied in clinical measurements to achieve the recommended accuracy.

13.
The comparison of gene orders in a set of genomes can be used to infer their phylogenetic relationships and to reconstruct ancestral gene orders. For three genomes this is done by solving the "median problem for breakpoints"; this solution can then be incorporated into a routine for estimating optimal gene orders for all the ancestral genomes in a fixed phylogeny. For the difficult (and most prevalent) case where the genomes contain partially different sets of genes, we present a general heuristic for the median problem for induced breakpoints. A fixed-phylogeny optimization based on this heuristic is applied in a phylogenetic study of a set of completely sequenced protist mitochondrial genomes, confirming some of the recently proposed sequence-based groupings and, in turn, confirming the usefulness of the breakpoint method as a phylogenetic tool even for small genomes.
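A small sketch of the quantity being optimized: the breakpoint distance between two signed circular gene orders, induced on their shared genes. The median heuristic itself is not implemented, and the toy genomes are illustrative.

```python
def induced_breakpoints(order_a, order_b):
    """Number of signed adjacencies of circular genome A (restricted to the
    genes shared with B) that are not preserved in B."""
    shared = set(g.strip('-') for g in order_a) & set(g.strip('-') for g in order_b)
    a = [g for g in order_a if g.strip('-') in shared]
    b = [g for g in order_b if g.strip('-') in shared]

    def adjacencies(order):
        """Signed adjacencies of a circular genome, in an orientation-independent form."""
        flip = lambda g: g[1:] if g.startswith('-') else '-' + g
        pairs = set()
        n = len(order)
        for i in range(n):
            x, y = order[i], order[(i + 1) % n]
            pairs.add(min((x, y), (flip(y), flip(x))))
        return pairs

    adj_a, adj_b = adjacencies(a), adjacencies(b)
    return len(adj_a - adj_b)   # adjacencies of A broken in B

genome_1 = ['a', 'b', 'c', 'd', 'e']
genome_2 = ['a', '-c', '-b', 'd', 'e', 'f']   # segment b,c inverted; 'f' absent from genome_1
print(induced_breakpoints(genome_1, genome_2))  # -> 2
```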

14.
Electromyogram (EMG) measurements are frequently subject to uncertainty arising from technical constraints such as cross talk and maximum voluntary contraction, so individual EMGs are uncertain representations of their corresponding muscle activations. To regulate this uncertainty, we propose an EMG refinement method that adjusts EMGs by regulating the redundant contributions of the signals to the torques approximated by EMG-driven torque estimation (EDTE) with a musculoskeletal forward dynamic model. Regulating this redundancy requires accounting for the synergistic contributions of muscles, including unmeasured muscles, to the approximated torques, which are the primary cause of redundancy in EDTE. To suppress this redundancy, we use the concept of muscle synergy, a key concept in analyzing how the nervous system regulates the redundant contributions of muscles to exerted torques. Based on this concept, we design a muscle-synergy-based EDTE as a framework for EMG refinement that regulates the above uncertainty of individual EMGs while accounting for unmeasured muscles. In achieving the proposed refinement, the most important point is to suppress large changes, such as overestimation caused by enhancing the contribution of particular muscles to the estimated torques; it is therefore reasonable to refine EMGs by minimizing the change in the EMGs. To evaluate the model, we use a Bland-Altman plot, which quantitatively evaluates the proportional bias of the refined signals relative to the EMGs. Through this evaluation, we show that the proposed EDTE minimizes the bias while approximating the torques, so that the minimization optimally regulates the uncertainty of the EMGs and thereby leads to optimal EMG refinement.
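A minimal sketch of the Bland-Altman statistics used in the evaluation step (bias and 95% limits of agreement between refined and original signals); the synthetic EMG envelopes below are illustrative and not the paper's refinement method or data.

```python
import numpy as np

def bland_altman(a, b):
    """Return the bias (mean difference) and the 95% limits of agreement."""
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

rng = np.random.default_rng(3)
emg = np.abs(rng.normal(0.3, 0.1, 500))             # stand-in EMG envelope
refined = emg + rng.normal(0.0, 0.02, 500)          # small, unbiased adjustment

bias, (lo, hi) = bland_altman(refined, emg)
print(f"bias = {bias:.4f}, 95% limits of agreement = ({lo:.4f}, {hi:.4f})")
```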

15.
Most laser applications in medicine and biology involve thermal effects, and laser-tissue thermal interaction has therefore received increasing attention in recent years. However, previous work has focused mainly on laser heating of normal tissues (37 degrees C or above); to date, little is known about the mechanisms of laser heating of frozen biological tissues. Several recent experimental investigations have demonstrated that lasers have great potential in tissue cryopreservation, but the lack of theoretical interpretation limits further application in this area. The present paper proposes a numerical model for the thawing of biological tissues caused by laser irradiation. The Monte Carlo approach and the effective heat capacity method are employed, respectively, to simulate light propagation and solid-liquid phase-change heat transfer. The proposed model has four important features: (1) the tissue is considered a nonideal material, in which the phase transition occurs over a wide temperature range; (2) the solid phase, transition phase, and liquid phase have different thermophysical properties; (3) the variations in optical properties due to phase change are taken into consideration; and (4) the light distribution changes continually with the advancement of the thawing fronts. To this end, 15 thawing-front geometric configurations are presented for the Monte Carlo simulation. The least-squares parabola fitting technique is applied to approximate the shape of the thawing front, and a detailed algorithm for calculating the photon reflection/refraction behavior at the thawing front is described. Finally, we develop a coupled light/heat transport solution procedure for the laser-induced thawing of frozen tissues. The proposed model is compared with three test problems and good agreement is obtained. The calculated results show that the light reflectance/transmittance at the tissue surface changes continually with the progression of the thawing fronts and that lasers provide a heating method superior to conventional heating through surface conduction because they can achieve uniform volumetric heating. Parametric studies are performed to test the influence of the optical properties of the tissue on the thawing process. The proposed model is rather general in nature and can therefore be applied to other, nonbiological problems as long as the materials are absorbing and scattering media.

16.
Purpose

Objective uncertainty quantification (UQ) of a product life-cycle assessment (LCA) is a critical step for decision-making. Environmental impacts can be measured directly or estimated with models, in which underlying mathematical functions approximate the environmental impacts during the various LCA stages. In this study, three possible uncertainty sources of a mathematical model, i.e., input variability, model parameter (distinguished from input in this study), and model-form uncertainties, were investigated, and a simple, easy-to-implement method is proposed to quantify each source.

Methods

Various data analytics methods were used to conduct a thorough model uncertainty analysis: (1) Interval analysis was used for input uncertainty quantification. Direct sampling using Monte Carlo (MC) simulation was used for the interval analysis, and the results were compared with those of indirect nonlinear optimization as an alternative approach; a machine learning surrogate model was developed to perform both the direct MC sampling and the indirect nonlinear optimization (a sketch of the direct sampling step appears after this abstract). (2) Bayesian inference was adopted to quantify parameter uncertainty. (3) A recently introduced model correction method based on orthogonal polynomial basis functions was used to evaluate the model-form uncertainty. The methods are applied to a pavement LCA to propagate uncertainties through an energy and global warming potential (GWP) estimation model; a pavement section in the Chicago metropolitan area was used as the case study.

Results and discussion

Results indicate that each uncertainty source contributes to the overall energy and GWP output of the LCA. Input uncertainty was shown to have a significant impact on the overall GWP output; for the example case study, the GWP interval was around 50%. Parameter uncertainty results showed that an assumed ±10% uniform variation in the model parameter priors resulted in 28% variation in the GWP output. Model-form uncertainty had the lowest impact (less than 10% variation in the GWP), because the original energy model is relatively accurate in estimating the energy. However, a sensitivity study of the model-form uncertainty showed that variations of up to 180% in the results can occur for lower original model accuracies.

Conclusions

Investigating each uncertainty source of the model highlighted the importance of accurate characterization, propagation, and quantification of uncertainty. The outcome of this study is a set of independent and relatively easy-to-implement methods that provide robust grounds for objective model uncertainty analysis in LCA applications. Assumptions on inputs, parameter distributions, and model form need to be justified, and input uncertainty plays a key role in the overall pavement LCA output. The proposed model correction method as well as the interval analysis were relatively easy to implement, but research is still needed to develop a more generic and simplified MCMC simulation procedure that is fast to implement.
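A minimal sketch of the direct Monte Carlo sampling step referred to in the Methods: uncertain inputs are sampled within assumed intervals, pushed through an impact model, and the resulting GWP range is reported. The two-input linear model and all coefficients are illustrative, not the pavement LCA model of the study.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000

# Input intervals (assumed): material quantity [t] and haul distance [km]
asphalt_t = rng.uniform(900.0, 1100.0, N)
haul_km = rng.uniform(40.0, 60.0, N)

# Illustrative emission factors (kg CO2-eq per t and per t-km)
EF_MATERIAL = 55.0
EF_TRANSPORT = 0.12

# Simple impact model: production plus transport of the material
gwp = asphalt_t * EF_MATERIAL + asphalt_t * haul_km * EF_TRANSPORT

print(f"GWP interval (min-max): {gwp.min():.0f} - {gwp.max():.0f} kg CO2-eq")
print("GWP 2.5-97.5 percentile:", np.percentile(gwp, [2.5, 97.5]))
```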


17.
The novel two-step sensitive/less-sensitive serologic testing algorithm for detecting recent HIV seroconversion (STARHS) provides a simple and practical method to estimate HIV-1 incidence from cross-sectional HIV seroprevalence data, and it has been used increasingly in epidemiologic studies. However, the uncertainty of incidence estimates obtained with this algorithm has not been well described, especially for high-risk groups or when missing data are present because a fraction of sensitive enzyme immunoassay (EIA) positive specimens are not tested by the less sensitive EIA. Ad hoc methods used in practice provide incorrect confidence limits and thus may jeopardize statistical inference. In this report, we propose maximum likelihood and Bayesian methods for correctly estimating the uncertainty in incidence estimates obtained from prevalence data with a fraction missing, and we extend the methods to regression settings. Using a study of injection drug users participating in a drug detoxification program in New York City as an example, we demonstrate the impact of underestimating the uncertainty in incidence estimates with ad hoc methods. Our methods can be applied to estimate the incidence of other diseases from prevalence data obtained with similar testing algorithms when missing data are present.

18.
This paper shows the application of mathematical modeling to scale up a cycle developed with lab-scale equipment to two different production units. The method is based on a simplified model of the process parameterized with experimentally determined heat and mass transfer coefficients. In this study, the overall heat transfer coefficient between product and shelf was determined using the gravimetric procedure, while the dried-product resistance to vapor flow was determined through the pressure rise test technique. Once the model parameters were determined, the freeze-drying cycle of a parenteral product was developed via dynamic design space for a lab-scale unit, and mathematical modeling was then used to scale up this cycle to the production equipment. In this way, appropriate values were determined for the processing conditions, allowing the product dynamics observed in the small-scale freeze-dryer to be replicated in the industrial unit. The study also shows how inter-vial variability, as well as model parameter uncertainty, can be taken into account during scale-up calculations.

19.
This study proposes a new quantitative technique to identify suitable but unoccupied habitats for metapopulation studies in plants. It is based on the species composition of the habitat and on knowledge of species co-occurrence patterns, using data from a large phytosociological database as the background for estimating those patterns. If such a database is not available, the technique can still be applied by estimating the co-occurrence patterns from the same data for which the prediction is made. Using the technique, we were able to identify suitable unoccupied habitats and differentiate them from unoccupied unsuitable ones, and we also identified occupied habitats with a low probability of being suitable. Compared with the direct approach to identifying suitable habitats, which involves introducing a species to the habitat and studying its performance, the approach presented here is much easier to apply and can provide extensive information on habitat suitability for a range of species with much less effort and time.
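A minimal sketch of a co-occurrence-based suitability score in the spirit of the technique described above (a Beals-smoothing-style index computed from a presence/absence matrix); the exact formulation used in the paper may differ, and the tiny example matrix is made up.

```python
import numpy as np

def cooccurrence_suitability(presence, target):
    """presence: sites x species boolean matrix; target: index of the target species.
    Returns one co-occurrence-based suitability score per site for the target."""
    presence = presence.astype(float)
    n_sites, n_species = presence.shape
    n_k = presence.sum(axis=0)                       # sites occupied by each species
    n_jk = presence.T @ presence[:, target]          # joint occurrences with the target
    with np.errstate(divide="ignore", invalid="ignore"):
        p_jk = np.where(n_k > 0, n_jk / n_k, 0.0)    # P(target present | species k present)
    p_jk[target] = 0.0                               # exclude the target itself
    scores = np.zeros(n_sites)
    for i in range(n_sites):
        others = presence[i].copy()
        others[target] = 0.0
        s_i = others.sum()
        scores[i] = (others * p_jk).sum() / s_i if s_i else 0.0
    return scores

# 5 sites x 4 species; species 0 is the target
X = np.array([[1, 1, 1, 0],
              [1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 1, 1]], dtype=bool)
print(np.round(cooccurrence_suitability(X, target=0), 2))
```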

20.
Infrared thermography (IRT) is a technique that determines surface temperature based on the physical laws of radiative transfer. Thermal imaging cameras have been used since the 1960s to determine the surface temperature patterns of a wide range of birds and mammals and how species regulate their surface temperature in response to different environmental conditions. As a large proportion of metabolic energy is transferred from the body to the environment as heat, biophysical models have been formulated to determine metabolic heat loss. These models are based on heat transfer equations for radiation, convection, conduction, and evaporation, so the surface temperature recorded by IRT can be used to calculate heat loss from different body regions. This approach has successfully demonstrated that in birds and mammals heat loss is regulated from poorly insulated regions of the body, which act as thermal windows for the dissipation of body heat. Rather than providing absolute measurements of metabolic heat loss, IRT and biophysical models have been most useful in estimating the relative heat loss from different body regions. Further calibration studies will improve the accuracy of the models, but the strength of this approach is that it is a non-invasive method of measuring the relative energy cost of an animal in response to different environments, behaviours and physiological states. The increasing availability and portability of thermal imaging systems is likely to lead to many new insights into the thermal physiology of endotherms.
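A minimal sketch of the biophysical calculation the abstract describes: dry (radiative plus convective) heat loss from a body region whose surface temperature was measured by IRT. The emissivity, convection coefficient, temperatures, and areas below are illustrative assumptions, not values from the paper.

```python
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)

def region_heat_loss(t_surface_c, t_air_c, area_m2, emissivity=0.98, h_conv=5.0):
    """Radiative plus convective heat loss (W) from one body region."""
    ts, ta = t_surface_c + 273.15, t_air_c + 273.15
    q_rad = emissivity * SIGMA * (ts ** 4 - ta ** 4) * area_m2
    q_conv = h_conv * (ts - ta) * area_m2
    return q_rad + q_conv

# Example: compare a well-insulated trunk region with a "thermal window"
# such as the bill of a bird (temperatures and areas are made up).
regions = {"trunk": (22.0, 0.050), "bill (thermal window)": (31.0, 0.002)}
for name, (t_surf, area) in regions.items():
    q = region_heat_loss(t_surf, t_air_c=10.0, area_m2=area)
    print(f"{name:>22}: {q:.2f} W ({q / area:.0f} W/m^2)")
```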
