Similar Articles (20 results)
1.
Consumers may be exposed to formaldehyde during the use of liquid laundry detergent containing a preservative. The primary objective of this analysis was to present an approach for predicting formaldehyde air emissions from a washing machine, and the subsequent vapor concentrations in the laundry room air, using the U.S. Environmental Protection Agency's (USEPA's) Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, referred to as the IAQX model. A second objective was to identify key model input parameters for formaldehyde. This analysis recommends use of IAQX emission Model 52 because it provided the best estimates, correlating formaldehyde evaporation with the Henry's law constant and with an overall gas-phase mass transfer coefficient derived from washing machine experiments. The mass balance estimated that 99.7% of the initial formaldehyde mass in the washing machine was discharged down the drain with the wash water; the remainder was emitted to the air from the top-loading washing machine and the hot-air clothes dryer. The predicted formaldehyde exposures were acceptable and much lower than the USEPA's proposed targets for non-cancer effects and cancer risk.
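The abstract describes relating evaporation to a Henry's law constant and an overall gas-phase mass transfer coefficient. The sketch below shows that type of gas-phase-controlled emission estimate in Python; the function structure and every parameter value are illustrative assumptions, not figures from the study or from the IAQX source model itself.

```python
# Minimal sketch of a Henry's-law / gas-phase mass-transfer emission estimate
# of the kind described above. All parameter values are illustrative
# assumptions, not values from the study.

def emission_rate(c_liquid_mg_L, henry_dimensionless, k_g_m_per_h,
                  area_m2, c_air_mg_m3=0.0):
    """Gas-phase-controlled evaporation rate (mg/h).

    c_liquid_mg_L       formaldehyde concentration in the wash water (mg/L)
    henry_dimensionless dimensionless Henry's law constant (C_gas / C_liquid)
    k_g_m_per_h         overall gas-phase mass transfer coefficient (m/h)
    area_m2             exposed water surface area in the machine (m^2)
    c_air_mg_m3         formaldehyde already in the headspace air (mg/m^3)
    """
    c_liquid_mg_m3 = c_liquid_mg_L * 1000.0           # mg/L -> mg/m^3
    c_equilibrium = henry_dimensionless * c_liquid_mg_m3
    return k_g_m_per_h * area_m2 * max(c_equilibrium - c_air_mg_m3, 0.0)


# Illustrative mass balance over one wash cycle (assumed numbers).
initial_mass_mg = 5.0                                  # dosed with the detergent
rate_mg_h = emission_rate(c_liquid_mg_L=0.05,
                          henry_dimensionless=3e-5,    # assumed, order of magnitude only
                          k_g_m_per_h=1.0,
                          area_m2=0.15)
emitted_mg = rate_mg_h * 0.75                          # ~45 min cycle
print(f"emitted to air: {emitted_mg:.6f} mg "
      f"({100 * emitted_mg / initial_mass_mg:.4f}% of initial mass)")
```

In a screening workflow the emitted mass would then feed an indoor air-exchange model, as IAQX does, to obtain the laundry-room vapor concentration.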

2.
Abstract

High-throughput methods are now routinely used to rapidly screen chemicals for potential hazard. However, hazard-based decision-making excludes important exposure considerations, resulting in an incomplete estimation of chemical safety. Models to estimate exposure exist but are generally unable to keep up with high-throughput demands. The High-Throughput Exposure Assessment Tool (HEAT) is designed to efficiently predict near-field exposure to consumers and workers via inhalation, oral and dermal routes. HEAT is based on well-known modeling algorithms and provides default model parameters to support reasonably conservative exposure estimates. Underlying chemical-specific data are uploaded or entered by the end user. HEAT's main strength is its flexible tiered screening functionality, which enables exposure estimates for single or multiple chemicals simultaneously. Hypothetical case examples highlighting the application of HEAT to more complex exposure estimates for alternative and aggregate assessments are provided.
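As a rough illustration of the kind of conservative near-field screening calculation described, the sketch below pairs a one-box indoor air model with a standard average-daily-dose equation. The equation form and all default values are generic textbook assumptions, not HEAT's actual algorithms or defaults.

```python
# Generic screening-level inhalation exposure estimate, illustrating the kind
# of conservative near-field calculation the abstract refers to. The equation
# and default values are common textbook assumptions, not HEAT's internals.

def inhalation_add(air_conc_mg_m3, inhalation_rate_m3_day=20.0,
                   exposure_days_per_year=350, exposure_years=25,
                   body_weight_kg=70.0, averaging_years=25):
    """Average daily dose (mg/kg-day) for an inhaled chemical."""
    averaging_days = averaging_years * 365
    return (air_conc_mg_m3 * inhalation_rate_m3_day *
            exposure_days_per_year * exposure_years) / (
            body_weight_kg * averaging_days)

# One-box air concentration: steady emission into a ventilated room.
emission_mg_h, room_m3, air_changes_h = 2.0, 30.0, 0.5
air_conc = emission_mg_h / (room_m3 * air_changes_h)   # mg/m^3
print(f"air concentration: {air_conc:.3f} mg/m^3")
print(f"average daily dose: {inhalation_add(air_conc):.4f} mg/kg-day")
```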

3.
In this study, a procedure to automatically identify the product model by evaluating an image of the label is presented. First, optical character recognition (OCR) extracts text from the product label. Second, to identify the product model, the extracted text is compared with unique model references in a database, giving access to other model-specific information. For this comparison, a novel variation of the partial ratio matching algorithm was developed. The product-model identification procedure is integrated into an interactive web application, which allows model identification to be performed during preparation for reuse, repair, and recycling: the SmartRe application. Three datasets consisting of 466, 422, and 771 images of washing machine product labels were gathered in collaboration with (1) a professional repair company for consumer devices, (2) a nonprofit repair and reuse center that resells devices in second-hand stores, and (3) a large recycling company. Results demonstrate that 96%, 91%, and 40% of the product models were correctly read from the product label with OCR, respectively. Of these recognized models, 51%, 88%, and 76% were successfully identified with the SmartRe application by comparing the extracted text with the model database. Further analysis also demonstrated that 72% of the washing machine models identified at the nonprofit repair and reuse center were also found at the recycling facility and that 12% of these models are predicted to be less than 10 years old. This highlights the potential of the SmartRe application to assist in product triage for reuse at recycling centers.
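To make the matching step concrete, the sketch below implements a generic sliding-window "partial ratio" comparison on top of Python's difflib and applies it to a hypothetical OCR string; the SmartRe variation of the algorithm is not detailed in the abstract, so this is only an approximation of the idea.

```python
# Sketch of a sliding-window "partial ratio" string comparison built on
# difflib, approximating the kind of matching described above. The actual
# SmartRe variant of the algorithm may differ.
from difflib import SequenceMatcher


def partial_ratio(short: str, long: str) -> float:
    """Best similarity of `short` against any equally long window of `long`."""
    short, long = short.lower(), long.lower()
    if len(short) > len(long):
        short, long = long, short
    if not short:
        return 0.0
    best = 0.0
    for start in range(len(long) - len(short) + 1):
        window = long[start:start + len(short)]
        best = max(best, SequenceMatcher(None, short, window).ratio())
    return best


def identify_model(ocr_text: str, model_database: list[str]) -> tuple[str, float]:
    """Return the database model reference that best matches the OCR output."""
    return max(((m, partial_ratio(m, ocr_text)) for m in model_database),
               key=lambda pair: pair[1])


# Hypothetical example: noisy OCR text from a washing-machine label.
ocr = "Mod: WAE28I66O 8kg 1400rpm SN 0123"
models = ["WAE28160", "WAE28166", "WAB28266"]
print(identify_model(ocr, models))   # best match with its similarity score
```

Matching on windows of the longer string keeps the comparison tolerant of the serial numbers, capacities and other text that surround the model reference on a real label.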

4.
Animal models are considered to be the "gold standard" for determining the potential contact allergenicity of low molecular weight chemicals. However, governmental regulations and ethical considerations limit the use of animals for such purposes. There is therefore a need for in vitro alternative models. The human organotypic skin explant culture (HOSEC) model is reported to be a promising alternative method for the predictive testing of contact allergens. The accelerated migration of Langerhans cells from the epidermis upon exposure to contact allergens is used to identify chemicals that are potentially capable of inducing a delayed-type hypersensitivity. In the study described in this paper, the model was further refined and used in two independent laboratories to screen 23 low molecular weight compounds of known classification for their allergenicity. Each laboratory was able to accurately detect the contact allergens, despite small variations in the protocols used. However, the classification of dermal irritants, which have often been falsely classified as allergens, varied between the two laboratories. Despite the current limitations of the HOSEC model, the accuracy of the predictions made (sensitiser or non-sensitiser) compares favourably with classifications obtained with commonly used animal models. The HOSEC model has the potential to be developed further as an in vitro alternative to animal models for screening for contact allergens.

5.
The development of a biopharmaceutical production process usually occurs sequentially, and tedious optimization of each individual unit operation is very time-consuming. Here, the conditions established as optimal for one step serve as input for the following step. Yet, this strategy does not consider potential interactions between a priori distant process steps and therefore cannot guarantee optimal overall process performance. To overcome these limitations, we established a smart approach to develop and utilize integrated process models using machine learning techniques and genetic algorithms. We evaluated the application of the data-driven models to explore potential efficiency increases and compared them to a conventional development approach for one of our development products. First, we developed a data-driven integrated process model using gradient boosting machines and Gaussian processes as machine learning techniques and a genetic algorithm as a recommendation engine for two downstream unit operations, namely solubilization and refolding. Through projection of the results into our large-scale facility, we predicted a twofold increase in productivity. Second, we extended the model to a three-step model by including the capture chromatography. Here, depending on the baseline process chosen for comparison, we obtained between a 50% and 100% increase in productivity. These data show the successful application of machine learning techniques and optimization algorithms for downstream process development. Finally, our results highlight the importance of considering integrated process models for the whole process chain, including all unit operations.
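The sketch below illustrates the general pattern of such a data-driven integrated model: a gradient-boosting surrogate trained on historical (solubilization, refolding) conditions, searched with a simple genetic algorithm that recommends new conditions. The training data, parameter ranges and objective are synthetic placeholders, not the authors' process data or model.

```python
# Minimal sketch of a data-driven integrated process model: a gradient-boosting
# surrogate over two unit operations, searched with a simple genetic algorithm.
# Data, bounds and the response surface are synthetic placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic historical runs: [solubilization pH, refolding dilution factor].
X = rng.uniform([7.0, 5.0], [9.5, 20.0], size=(60, 2))
y = -(X[:, 0] - 8.5) ** 2 - 0.02 * (X[:, 1] - 12.0) ** 2 + rng.normal(0, 0.05, 60)

surrogate = GradientBoostingRegressor(random_state=0).fit(X, y)

def genetic_search(model, bounds, pop=40, gens=30, mut=0.1):
    low, high = np.array(bounds[0]), np.array(bounds[1])
    popn = rng.uniform(low, high, size=(pop, len(low)))
    for _ in range(gens):
        fitness = model.predict(popn)
        parents = popn[np.argsort(fitness)[-pop // 2:]]            # keep best half
        children = parents[rng.integers(0, len(parents), pop - len(parents))]
        children = children + rng.normal(0, mut, children.shape) * (high - low)
        popn = np.clip(np.vstack([parents, children]), low, high)
    best = popn[np.argmax(model.predict(popn))]
    return best, model.predict(best[None, :])[0]

best_conditions, predicted_response = genetic_search(surrogate, ([7.0, 5.0], [9.5, 20.0]))
print("recommended conditions:", best_conditions, "predicted response:", predicted_response)
```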

6.
A system consisting of a connected mixed and tubular bioreactor was designed to study bacterial biofilm formation and the effect of its exposure to bacteriophages under different experimental conditions. The bacterial biofilm inside the silicone tubular bioreactor was formed by continuously pumping bacterial cells at a constant physiological state for 2 h and subsequently washing with a buffer for 24 h. Bacterial and bacteriophage concentrations along the tubular bioreactor were monitored via a piercing method. The presence of biofilm and planktonic cells was demonstrated by combining the piercing method, measurement of planktonic cell concentration at the tubular bioreactor outlet, and optical microscopy. The planktonic cell formation rate was found to be 8.95 × 10−3 h−1 and increased approximately four-fold after biofilm exposure to LB medium. Exposure of the bacterial biofilm to bacteriophages in LB medium resulted in a rapid decrease in biofilm and planktonic cell concentrations, which fell below the detection limit in less than 2 h. When bacteriophages were supplied in the buffer, only a moderate decrease in the concentration of both bacterial cell types was observed. After the biofilm was washed with buffer to remove unadsorbed bacteriophages, its exposure to LB medium (without bacteriophages) again resulted in a rapid decrease in bacterial concentration, falling below the detection limit in less than 2 h.

7.
In order to make renewable fuels and chemicals from microbes, new methods are required to engineer microbes more intelligently. Computational approaches to engineering strains for enhanced chemical production typically rely on detailed mechanistic models (e.g., kinetic/stoichiometric models of metabolism), which require many experimental datasets for their parameterization, while experimental methods may require screening large mutant libraries to explore the design space for the few mutants with desired behaviors. To address these limitations, we developed an active learning and machine learning approach (ActiveOpt) to intelligently guide experiments toward an optimal phenotype with minimal measured data. ActiveOpt was applied to two separate case studies to evaluate its potential to increase valine yields and neurosporene productivity in Escherichia coli. In both cases, ActiveOpt identified the best-performing strain in fewer experiments than the original case studies used. This work demonstrates that machine learning and active learning approaches have the potential to greatly facilitate metabolic engineering efforts and help them rapidly achieve their objectives.
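The loop below sketches the general active-learning pattern the abstract describes: fit a model to the experiments measured so far, then propose the untested design with the best (slightly exploration-weighted) predicted yield. The design space, "true" yields, Bayesian ridge model and exploration weight are all illustrative assumptions, not ActiveOpt's actual algorithm.

```python
# Sketch of an active-learning loop in the spirit of ActiveOpt: fit a model to
# the experiments measured so far, then propose the untested strain design with
# the best predicted yield. Design space and "true" yields are synthetic.
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(1)

# Hypothetical design space: 200 candidate strains x 5 engineered features
# (e.g., expression levels of pathway enzymes, scaled 0-1).
designs = rng.uniform(0, 1, size=(200, 5))
true_yield = designs @ np.array([0.5, 1.2, -0.3, 0.8, 0.1]) + rng.normal(0, 0.02, 200)

measured = list(rng.choice(200, size=5, replace=False))     # initial experiments
for round_ in range(10):
    model = BayesianRidge().fit(designs[measured], true_yield[measured])
    pred, std = model.predict(designs, return_std=True)
    score = pred + 0.5 * std                                # small exploration bonus
    score[measured] = -np.inf                               # don't repeat experiments
    next_idx = int(np.argmax(score))
    measured.append(next_idx)                               # "run" the experiment
    print(f"round {round_}: tested design {next_idx}, "
          f"measured yield {true_yield[next_idx]:.3f}")

print("best yield found:", true_yield[measured].max())
```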

8.

Background

Evidence for a possible causal relationship between exposure to electromagnetic fields (EMF) emitted by high voltage transmission (HVT) lines and neurobehavioral dysfunction in children is insufficient. The present study aims to investigate the association between EMF exposure from HVT lines and neurobehavioral function in children.

Methods

Two primary schools were chosen based on monitoring data of ambient electromagnetic radiation. A cross-sectional study with 437 children (9 to 13 years old) was conducted. Exposure to EMF from HVT lines was monitored at each school. Information was collected on possible confounders and relevant exposure predictors using standardized questionnaires. Neurobehavioral function in children was evaluated using established computerized neurobehavioral tests. Data was analyzed using multivariable regression models adjusted for relevant confounders.

Results

After controlling for potential confounding factors, multivariable regression revealed that children attending a school near 500 kV HVT lines had poorer performance on the computerized neurobehavioral tests for Visual Retention and Pursuit Aiming compared to children attending a school that was not in close proximity to HVT lines.

Conclusions

The results suggest that long-term, low-level exposure to EMF from HVT lines might have a negative impact on neurobehavioral function in children. However, because only two of the four tests showed statistically significant differences and the study has potential limitations, more studies are needed to explore the effects of exposure to extremely low frequency EMF on neurobehavioral function and development in children.

9.
10.
This study demonstrates a novel model generation methodology that addresses several limitations of conventional finite element head models (FEHM). Because the methodology operates chiefly in image space, new structures can be incorporated or merged, and the mesh can be either decimated or refined both locally and globally. This methodology is employed in the development of a highly bio-fidelic FEHM from high-resolution scan data. The model is adaptable and presented here in a form optimised for impact and blast simulations. The accuracy and feasibility of the model are successfully demonstrated against a widely used experimental benchmark in impact loading and through the investigation of potential brain injury under blast overpressure loading.

11.
A screening-level risk assessment was used to identify chemicals of potential health concern emitted during the normal operation of a hypothetical state-of-the-art municipal solid waste landfill. Data on the amount of contaminants (carcinogens, non-carcinogenic systemic toxicants, odorous compounds, and particulate-bound metals) were obtained from existing facilities and used to estimate ground-level air concentrations of airborne chemicals at the point of maximum impact (property line) and at year 20 (the year of maximum emissions from the landfill). Concentrations of leachate components present in the corresponding underlying aquifer were also estimated. Intakes of chemicals experienced by a series of human receptors were then computed using either single-media or multi-media algorithms. Carcinogens of concern were selected as those contributing to a lifetime excess cancer risk (LECR) greater than 10⁻⁶; for non-carcinogenic systemic toxicants and odorous volatiles, an exposure ratio (ER = intake or concentration divided by the RfD, RfC, or odor threshold) greater than 0.1 was used as the cut-off. The results identified a final set of air emission components (n = 25) consisting mainly of carcinogenic and odorous substances, whereas two leachate components were retained. Additional analyses using more refined risk-based approaches are necessary to verify the relevance of these projections.
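The screening logic reduces to two simple cut-offs, sketched below; the chemicals, intakes, slope factors and reference values in the example are placeholders, not the study's data.

```python
# Sketch of the screening cut-offs described above: carcinogens are retained if
# the lifetime excess cancer risk (LECR) exceeds 1e-6, other compounds if the
# exposure ratio (intake or concentration / reference value) exceeds 0.1.
# The example chemicals and numbers are placeholders, not the study's data.

def lecr(intake_mg_kg_day, slope_factor_per_mg_kg_day):
    return intake_mg_kg_day * slope_factor_per_mg_kg_day

def exposure_ratio(intake_or_conc, reference_value):    # RfD, RfC or odor threshold
    return intake_or_conc / reference_value

chemicals = [
    {"name": "benzene (hypothetical intake)", "carcinogen": True,
     "intake": 2.0e-4, "slope_factor": 5.5e-2},
    {"name": "hydrogen sulfide (hypothetical conc.)", "carcinogen": False,
     "value": 1.5e-3, "reference": 2.0e-3},
]

for chem in chemicals:
    if chem["carcinogen"]:
        risk = lecr(chem["intake"], chem["slope_factor"])
        keep = risk > 1e-6
        print(f'{chem["name"]}: LECR = {risk:.2e} -> {"retain" if keep else "screen out"}')
    else:
        er = exposure_ratio(chem["value"], chem["reference"])
        keep = er > 0.1
        print(f'{chem["name"]}: ER = {er:.2f} -> {"retain" if keep else "screen out"}')
```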

12.
Abstract

An accurate and rapid toxic gas concentration prediction model plays an important role in emergency response to sudden gas leaks. However, it is difficult for existing dispersion models to meet accuracy and efficiency requirements at the same time. Although some researchers have considered developing new forecasting models with traditional machine learning, such as back propagation (BP) neural networks and support vector machines (SVM), the prediction results obtained from such models still need to be improved in terms of accuracy. New prediction models based on deep learning are therefore proposed in this paper. Deep learning has obvious advantages over traditional machine learning in prediction and classification. Deep belief networks (DBNs) as well as convolutional neural networks (CNNs) are used to build the new dispersion models. Both models are compared with the Gaussian plume model, a computational fluid dynamics (CFD) model, and models based on traditional machine learning in terms of accuracy, prediction time, and computation time. The experimental results show that the CNN model performs best considering all evaluation indexes.
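For reference, the Gaussian plume baseline that the deep-learning models are compared against can be written in a few lines. The dispersion-coefficient power laws below are a rough rural, neutral-stability approximation and, like the example numbers, are illustrative only.

```python
# Sketch of the Gaussian plume baseline mentioned above. The sigma_y / sigma_z
# power laws are a rough rural, neutral-stability approximation; all numbers
# are illustrative only.
import math

def gaussian_plume(q_g_s, u_m_s, x_m, y_m, z_m, h_m):
    """Concentration (g/m^3) at (x, y, z) downwind of an elevated point source,
    including ground reflection.

    q_g_s  continuous emission rate (g/s)
    u_m_s  wind speed (m/s), wind blowing along +x
    h_m    effective release height (m)
    """
    if x_m <= 0:
        return 0.0
    sigma_y = 0.08 * x_m * (1 + 0.0001 * x_m) ** -0.5    # assumed power laws
    sigma_z = 0.06 * x_m * (1 + 0.0015 * x_m) ** -0.5
    lateral = math.exp(-y_m ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z_m - h_m) ** 2 / (2 * sigma_z ** 2)) +
                math.exp(-(z_m + h_m) ** 2 / (2 * sigma_z ** 2)))   # ground reflection
    return q_g_s / (2 * math.pi * u_m_s * sigma_y * sigma_z) * lateral * vertical

# Concentration 500 m downwind, on the plume centreline, at breathing height.
print(f"{gaussian_plume(q_g_s=50.0, u_m_s=3.0, x_m=500, y_m=0, z_m=1.5, h_m=10):.2e} g/m^3")
```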

13.
Valant J, Drobne D. Protoplasma 2012, 249(3): 835-842
Isolated digestive gland epithelium from a model invertebrate organism was used in an ex vivo system to assess the potential of nanoparticulate TiO2 to disrupt cell membranes. Primary particle size, surface area, concentration of particles in a suspension, and duration of exposure to TiO2 particles were all found to have effects, which were observed at concentrations of nano-TiO2 as low as 1 μg/mL. The test system employed here can be used as a fast screening tool to assess the biological potential of nanoparticles with similar chemical composition but different size, concentration, or duration of exposure. We discuss the potential of ex vivo tests to avoid some of the limitations of conventional in vitro tests.

14.
Predicting the effects of amino acid substitutions on protein stability provides invaluable information for protein design, the assignment of biological function, and the understanding of disease-associated variations. To understand the effects of substitutions, computational models are preferred to time-consuming and expensive experimental methods. Several methods have been proposed for this task, including machine learning-based approaches. However, models trained using limited data have performance problems, and many model parameters tend to be overfitted. To decrease the number of model parameters and to improve the generalization potential, we calculated the amino acid contact energy change for point variations using a structure-based coarse-grained model. Using structural properties, including the contact energy (CE), and further physicochemical properties of the amino acids as input features, we developed two support vector machine classifiers. M47 predicted the stability of variant proteins with an accuracy of 87% and a Matthews correlation coefficient of 0.68 for a large dataset of 1925 variants, whereas M8 performed better when a relatively small dataset of 388 variants was used for 20-fold cross-validation. The performance of the M47 classifier on all six tested contingency table evaluation parameters is better than that of existing machine learning-based models or energy function-based protein stability classifiers.
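The sketch below shows the general shape of such a classifier: a support vector machine trained on contact-energy-change and physicochemical features, evaluated by k-fold cross-validation. The feature matrix and labels are synthetic placeholders, so the printed accuracy will not reproduce the paper's figures.

```python
# Sketch of a support vector classifier on contact-energy-change and
# physicochemical features, in the spirit of the M8/M47 classifiers described
# above. The feature matrix and labels here are synthetic placeholders.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)

# Hypothetical features per variant: [contact-energy change, hydrophobicity
# change, volume change, charge change] with a synthetic stability label.
X = rng.normal(size=(400, 4))
y = (1.5 * X[:, 0] + 0.5 * X[:, 1] - 0.3 * X[:, 2] + rng.normal(0, 0.5, 400)) > 0

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=20)          # 20-fold CV, as in the paper
print(f"mean CV accuracy on synthetic data: {scores.mean():.2f}")
```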

15.
Exposure of children to lead in the environment was assessed at the Murray Smelter Superfund site using both a deterministic risk assessment approach, the Integrated Exposure Uptake Biokinetic (IEUBK) model, and a probabilistic approach, the Integrated Stochastic Exposure (ISE) model. When site-specific data on lead in environmental media were input as point estimates into the IEUBK model, unacceptable risks were predicted for children living within five of eight study zones. The predicted soil cleanup goal was 550 ppm. Concentration and exposure data were then input into the ISE model as probability distribution functions and a one-dimensional Monte Carlo analysis (1-D MCA) was run to predict the expected distribution of exposures and blood lead values. Uncertainty surrounding these predictions was examined in a two-dimensional Monte Carlo analysis (2-D MCA). The ISE model predicted risks that were in the same rank order as those predicted by the IEUBK model, although the probability estimates of exceeding a blood lead level of 10 µg/dl (referred to as the P10) from the ISE model were uniformly lower than those predicted by the IEUBK model. The 2-D MCA allowed evaluation of the confidence around each P10 level and identified the main sources of both uncertainty and variability in the exposure estimates. The ISE model suggested that cleanup goals ranging from 1300 to 1500 ppm might be protective at this site.
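A one-dimensional Monte Carlo analysis of this kind can be sketched in a few lines: sample the exposure inputs from distributions and report the fraction of the simulated population above 10 µg/dL. The distributions below and the simple linear intake-to-blood-lead slope are assumptions for illustration; they are not the IEUBK biokinetics or the site's data.

```python
# Minimal sketch of a one-dimensional Monte Carlo exposure analysis like the
# ISE run described above. The distributions and the linear intake-to-blood-
# lead slope are illustrative assumptions, not the IEUBK biokinetic model.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

soil_pb_ppm = rng.lognormal(mean=np.log(400), sigma=0.8, size=n)      # soil lead
soil_ingest_g_day = rng.lognormal(mean=np.log(0.10), sigma=0.5, size=n)
absorption = rng.uniform(0.2, 0.4, size=n)                            # gut uptake
baseline_ug_dl = 2.0                                                  # other sources

uptake_ug_day = soil_pb_ppm * soil_ingest_g_day * absorption          # ppm = ug/g
blood_pb = baseline_ug_dl + 0.4 * uptake_ug_day     # assumed slope, ug/dL per ug/day

p10 = (blood_pb > 10.0).mean()
print(f"geometric-mean blood lead: {np.exp(np.log(blood_pb).mean()):.1f} ug/dL")
print(f"P10 (fraction of children above 10 ug/dL): {p10:.3f}")
```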

16.
The reporter strain Pseudomonas putida TOD102 (with a tod-lux fusion) was used in chemostat experiments with binary substrate mixtures to investigate the effect of potentially occurring cosubstrates on toluene degradation activity. Although toluene was simultaneously utilized with other cosubstrates, its metabolic flux (defined as the toluene utilization rate per cell) decreased with increasing influent concentrations of ethanol, acetate, or phenol. Three inhibitory mechanisms were considered to explain these trends: (1) repression of the tod gene (coding for toluene dioxygenase) by acetate and ethanol, which was quantified by a decrease in specific bioluminescence; (2) competitive inhibition of toluene dioxygenase by phenol; and (3) metabolic flux dilution (MFD) by all three cosubstrates. Based on experimental observations, MFD was modeled without any fitting parameters by assuming that the metabolic flux of a substrate in a mixture is proportional to its relative availability (expressed as a fraction of the influent total organic carbon). Thus, increasing concentrations of alternative carbon sources "dilute" the metabolic flux of toluene without necessarily repressing tod, as observed with phenol (a known tod inducer). For all cosubstrates, the MFD model slightly overpredicted the measured toluene metabolic flux. Incorporating catabolite repression (for experiments with acetate or ethanol) or competitive inhibition (for experiments with phenol) with independently obtained parameters resulted in more accurate fits of the observed decrease in toluene metabolic flux with increasing cosubstrate concentration. These results imply that alternative carbon sources (including inducers) are likely to hinder toluene utilization per unit cell, and that these effects can be accurately predicted with simple mathematical models.
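The MFD relation itself is simple enough to state directly: the predicted toluene flux is the single-substrate flux scaled by toluene's share of the influent total organic carbon. The sketch below writes that out; the carbon contents are standard elemental values and the flux and concentration numbers are illustrative.

```python
# Sketch of the metabolic flux dilution (MFD) relation as described above: the
# toluene flux per cell is scaled by toluene's share of the influent total
# organic carbon, without fitted parameters. Numbers are illustrative.

CARBON_FRACTION = {"toluene": 0.913, "ethanol": 0.522, "acetate": 0.400,
                   "phenol": 0.766}    # g C per g compound

def toc(influent_mg_L):
    """Total organic carbon (mg C/L) of an influent mixture."""
    return sum(conc * CARBON_FRACTION[name] for name, conc in influent_mg_L.items())

def mfd_flux(flux_toluene_alone, influent_mg_L):
    """Predicted toluene flux per cell when cosubstrates dilute the carbon pool."""
    toluene_c = influent_mg_L["toluene"] * CARBON_FRACTION["toluene"]
    return flux_toluene_alone * toluene_c / toc(influent_mg_L)

# Toluene alone vs. toluene plus increasing ethanol (illustrative flux units).
for ethanol in (0, 10, 50, 100):
    mix = {"toluene": 20.0, "ethanol": float(ethanol)}
    print(f"ethanol {ethanol:>3} mg/L -> predicted toluene flux "
          f"{mfd_flux(1.0, mix):.2f} x single-substrate flux")
```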

17.
18.
The purpose of the present study was to investigate the anti-apoptotic bcl-2 protein in rat brain and testes after whole-body exposure to radiation emitted from 900 MHz cellular phones. Two groups (sham and experimental) of eight Sprague-Dawley rats each were used in the study. Exposure began approximately 10 min after the rats were transferred into the exposure cages, by which time they had settled into a prone position and spontaneously selected a fixed location inside the cage. For the experimental group, the phones were in the speech condition for 20 min per day for 1 month. The same procedure was applied to the sham group, but the phones were turned off. Immunohistochemical staining of bcl-2 was performed according to the standardized avidin-biotin complex method. The results of this study showed that 20 min per day of the radiation emitted from 900 MHz cellular phones did not alter the anti-apoptotic bcl-2 protein in the brain and testes of rats. We speculate that bcl-2 may not be involved in the effects of radiation on the brain and testes of rats.

19.
In the present study, a refined microbially-influenced degradation method was used to evaluate the stability of a solidified synthetic waste containing chromium salt, cement, and fly ash in two different proportions. The experimental samples showed evidence of microbial growth by leaching of sulfate. Chromium leached by Thiobacillus thiooxidans from the experimental samples 'C1' (10.26% CrCl3·6H2O; 89.74% cement) and 'FC1' (10.26% CrCl3·6H2O; 10% fly ash; 79.74% cement) after 30 days of exposure was 14.53 mg/g and 9.53 mg/g, respectively. The corresponding concentration of chromium in the leachate was 0.189 mg/l and 0.124 mg/l, respectively, which was lower than the toxicity characteristic leaching procedure (TCLP) regulatory limit (5 mg/l). Replacement of cement by 10% fly ash in FC1 restricted the leaching of chromium more effectively. Model equations based on two shrinking-core models, namely the acid dissolution and bulk diffusion models, were used to analyze the kinetics of microbial degradation. Of the two approaches, the bulk diffusion model fit the data better than the acid dissolution model, as indicated by correlation coefficients >0.97.
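To make the kinetic comparison concrete, the sketch below fits the two standard shrinking-core forms (surface-reaction control for acid dissolution, ash-layer diffusion control for bulk diffusion) to a synthetic conversion-time series and compares the correlation coefficients; none of the numbers are from the study.

```python
# Sketch of fitting the two shrinking-core forms mentioned above (acid
# dissolution / surface-reaction control vs. bulk / ash-layer diffusion) to
# leaching data and comparing correlation coefficients. The time series of
# chromium conversion below is synthetic, not the study's data.
import numpy as np

t_days = np.array([5, 10, 15, 20, 25, 30], dtype=float)
x_leached = np.array([0.02, 0.05, 0.09, 0.13, 0.18, 0.23])   # fraction of Cr leached

def acid_dissolution(x):          # surface-reaction control
    return 1 - (1 - x) ** (1 / 3)

def bulk_diffusion(x):            # diffusion through the reacted (ash) layer
    return 1 - 3 * (1 - x) ** (2 / 3) + 2 * (1 - x)

for name, g in [("acid dissolution", acid_dissolution), ("bulk diffusion", bulk_diffusion)]:
    y = g(x_leached)
    k = np.polyfit(t_days, y, 1)[0]                  # apparent rate constant
    r = np.corrcoef(t_days, y)[0, 1]
    print(f"{name}: k = {k:.2e} per day, r = {r:.3f}")
```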

20.
Maleic acid-catalyzed hemicellulose hydrolysis in corn stover was analyzed by kinetic modeling. Kinetic constants for the Saeman and biphasic hydrolysis models were analyzed by an Arrhenius-type expansion which includes activation energy and catalyst concentration factors. The activation energy for hemicellulose hydrolysis by maleic acid was determined to be 83.3 ± 10.3 kJ/mol, which is significantly lower than the reported Ea values for the sulfuric acid-catalyzed hemicellulose hydrolysis reaction. Model analysis suggests that increasing the maleic acid concentration from 0.05 to 0.2 M improves xylose yields from 40% to 85%, while the improvement flattens to near-quantitative as the catalyst loading is increased from 0.2 to 1 M. The model was confirmed for the hydrolysis of corn stover at a 1 M maleic acid concentration at 150 °C, resulting in a xylose yield of 96% of theoretical. The refined Saeman model was used to evaluate the optimal condition for monomeric xylose yield in the maleic acid-catalyzed reaction: low-temperature reaction conditions were suggested; however, experimental results indicated that biphasic behavior dominated at low temperatures, which may be due to insufficient removal of acetyl groups. A combination of experimental data and model analysis suggests that xylose yields of around 80-90% can be achieved at reaction temperatures between 100 and 150 °C with 0.2 M maleic acid.
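The Saeman model treats hemicellulose hydrolysis and xylose degradation as consecutive first-order reactions with Arrhenius-type, acid-concentration-dependent rate constants. The sketch below implements that structure; the pre-exponential factors, acid exponents and the degradation activation energy are placeholders (only the 83.3 kJ/mol hydrolysis Ea echoes the abstract), so the computed yields will not reproduce the paper's values.

```python
# Sketch of the Saeman consecutive first-order model (hemicellulose -> xylose
# -> degradation products) with Arrhenius-type, acid-concentration-dependent
# rate constants. Pre-exponential factors, acid exponents and the degradation
# Ea are placeholders; only Ea1 echoes the 83.3 kJ/mol quoted above.
import math

R = 8.314  # J/(mol K)

def rate_constant(a, n, ea_kj_mol, acid_molar, temp_c):
    return a * acid_molar ** n * math.exp(-ea_kj_mol * 1000 / (R * (temp_c + 273.15)))

def xylose_yield(t_min, acid_molar, temp_c,
                 a1=5.0e8, n1=1.0, ea1=83.3,      # hemicellulose hydrolysis (assumed A, n)
                 a2=1.0e9, n2=1.0, ea2=110.0):    # xylose degradation (assumed)
    """Fraction of theoretical xylose present at time t (Saeman model)."""
    k1 = rate_constant(a1, n1, ea1, acid_molar, temp_c)
    k2 = rate_constant(a2, n2, ea2, acid_molar, temp_c)
    return k1 / (k2 - k1) * (math.exp(-k1 * t_min) - math.exp(-k2 * t_min))

# Illustrative sweep: yield after 60 min at 150 C for three maleic acid loadings.
for acid in (0.05, 0.2, 1.0):
    print(f"{acid:>4} M maleic acid: xylose yield ~ {xylose_yield(60, acid, 150):.0%}")
```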
