991.
Background, aim and scope  Phospholipase is an enzyme that can increase cheese yield, for instance in mozzarella production. Milk production is the most important source of environmental impacts in cheese production, and it is natural to assume that the milk saving achieved by using phospholipase reduces the overall environmental impacts of the final product. Production of industrial phospholipase is, however, also associated with environmental burdens, and it is not known whether, and to what extent, the use of phospholipase is justified by overall environmental improvements. The aim of the present study is therefore to assess the environmental impacts associated with the use of industrial phospholipase in mozzarella production and to compare them with the savings from avoided milk production. The study addresses mozzarella production in Denmark.
Methods  LCA is used as the analytical tool, and environmental modelling is carried out in the SimaPro 7.1.8 LCA software. Yield improvements refer to full-scale industrial application of phospholipase in the cheese industry. The study is a comparative analysis, and a marginal, market-oriented approach is taken. It addresses contributions to global warming, acidification, nutrient enrichment, photochemical smog formation, energy consumption and use of agricultural land. Estimation of environmental impact potentials is based on Eco-indicator 95 v.2.1 equivalency factors. Toxicity is addressed qualitatively.
Results  Across all impact indicators considered, the environmental impacts induced by phospholipase production are small compared with the savings obtained through reduced milk consumption in mozzarella production. Sensitivity analyses and data quality assessments indicate that this general outcome is robust, although results at the more detailed level are subject to considerable variation and uncertainty.
Discussion  Transport of the enzyme from the enzyme producer to the mozzarella producer is insignificant, and the general outcome of the study is considered applicable to other regions of the world where milk is produced in modern milk production systems.
Conclusions  Use of phospholipase as a yield-improvement factor is a means of reducing the environmental impact of mozzarella production.
Recommendations and perspectives  The total annual global-warming mitigation potential of phospholipase used in the production of mozzarella and other pasta filata products is on the order of 7 × 10⁸ kg CO2 equivalents. The use of phospholipase is driven by overall cost savings, and it is therefore recommended that the enzyme be given attention as a cost-efficient means of reducing greenhouse gas emissions.
992.
Purpose  To analyse the impact of different optimisation strategies on the agreement between planned and delivered doses during radiotherapy of cervical cancer.
Materials/methods  Four treatment plans differing in optimisation strategy were prepared for ten cervical cancer cases: volumetric modulated arc therapy with (_OPT) and without optimisation of the dose to the bone marrow, for each of two sets of margins applied to the clinical target volume arising from image guidance based on bones (IG(B)) or on soft tissues (IG(ST)). The plans were subjected to dosimetric verification using the ArcCHECK system and 3DVH software. Planned dose distributions were compared with the corresponding measured dose distributions in light of plan complexity and deliverability.
Results  A clinically significant impact of plan complexity on deliverability is visible only for gamma passing rates analysed in local mode and directly in the organs. While more general analyses show statistically significant differences, their clinical relevance has not been confirmed. The analysis showed that IG(ST)_OPT and IG(B)_OPT differ significantly from IG(ST) and IG(B). The clinical acceptance of IG(ST)_OPT obtained for the strict combination of gamma acceptance criteria (2%/2 mm) confirms its satisfactory deliverability. In turn, for IG(B)_OPT the 2%/2 mm combination did not meet the acceptance criteria in the case of the rectum.
Conclusion  Despite the complexity of IG(ST)_OPT, the results of the analysis confirm that its deliverability is acceptable when 2%/2 mm gamma acceptance criteria are used.
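The gamma analysis underlying these passing rates can be illustrated with a minimal, brute-force 1-D sketch. The study itself used the ArcCHECK/3DVH toolchain; this toy implementation, its function name, and its low-dose cutoff are purely illustrative, but it shows the distinction between global and local ("local mode") dose normalisation that the abstract refers to:

```python
import numpy as np

def gamma_passing_rate_1d(ref_dose, eval_dose, spacing_mm,
                          dose_pct=2.0, dta_mm=2.0, local=False,
                          low_dose_cutoff=0.1):
    """Brute-force 1-D gamma analysis (after Low et al.).

    For every reference point above the low-dose cutoff, the gamma
    value is the minimum combined dose-difference / distance-to-
    agreement metric over all evaluated points; the passing rate is
    the percentage of points with gamma <= 1.
    """
    ref_dose = np.asarray(ref_dose, dtype=float)
    eval_dose = np.asarray(eval_dose, dtype=float)
    positions = np.arange(len(ref_dose)) * spacing_mm
    global_norm = ref_dose.max()
    mask = ref_dose >= low_dose_cutoff * global_norm
    gammas = []
    for i in np.flatnonzero(mask):
        # "Local mode" normalises the dose tolerance to the local dose.
        norm = ref_dose[i] if local else global_norm
        dd = dose_pct / 100.0 * norm
        dose_term = (eval_dose - ref_dose[i]) / dd
        dist_term = (positions - positions[i]) / dta_mm
        gammas.append(np.sqrt(dose_term ** 2 + dist_term ** 2).min())
    return 100.0 * np.mean(np.asarray(gammas) <= 1.0)
```

Identical profiles pass at 100%, and local mode is systematically stricter in low-dose regions, which is consistent with the abstract's observation that only local-mode analysis revealed clinically relevant differences.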
993.
Life cycle assessment (LCA) quantifies the whole-life environmental impacts of products and is essential for helping policymakers and manufacturers transition toward sustainable practices. However, typical LCA estimates future recycling benefits as if recycling happened today. For long-lived products such as lithium-ion batteries, this may be misleading, since there is a considerable time gap between production and recycling. To explore this temporal mismatch problem, we apply future electricity scenarios from an integrated assessment model (IMAGE), using "premise" in Brightway2, to conduct a prospective LCA (pLCA) of the global warming potential of six battery chemistries and four recycling routes. We find that by 2050, electricity decarbonization under an RCP2.6 scenario mitigates production impacts by 57%, so reaching zero-carbon batteries also requires decarbonizing upstream heat, fuels, and direct emissions. For the best battery recycling case, data for 2020 give a net recycling benefit of −22 kg CO2e kWh⁻¹, which reduces the net impact of production and recycling from 71 to 49 kg CO2e kWh⁻¹. However, for recycling in 2040 with decarbonized electricity, the net recycling benefit would be nearly 75% lower (−6 kg CO2e kWh⁻¹), giving a net impact of 65 kg CO2e kWh⁻¹. This is because materials recycled in the future substitute lower-impact processes, owing to expected electricity decarbonization. Hence, more focus should be placed on mitigating production impacts today rather than relying on future recycling. These findings demonstrate the importance of pLCA in tackling problems such as temporal mismatch that are difficult to capture in typical LCA.
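The net-impact arithmetic in the abstract can be reproduced in a few lines; the production figure and recycling credits are taken directly from the text, and everything else is a trivial sum:

```python
# Net battery impact = production impact + (negative) recycling credit,
# all in kg CO2e per kWh of battery capacity; figures from the abstract.
PRODUCTION = 71.0

def net_impact(recycling_credit):
    """Combine production impact with a recycling credit (negative)."""
    return PRODUCTION + recycling_credit

impact_2020 = net_impact(-22.0)  # recycling with today's electricity
impact_2040 = net_impact(-6.0)   # recycling with a decarbonized grid

print(impact_2020)  # 49.0
print(impact_2040)  # 65.0
# Credit shrinkage: (22 - 6) / 22 ~ 73%, i.e. "nearly 75% lower".
print(round((22 - 6) / 22 * 100))  # 73
```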
994.
The advent of single-cell sequencing is providing unprecedented opportunities to disentangle tissue complexity and investigate cell identities and functions. However, the analysis of single-cell data is a challenging, multi-step process that requires both advanced computational skills and biological insight. When dealing with single-cell RNA-seq (scRNA-seq) data, the presence of technical artifacts, noise, and biological biases makes it necessary to first identify, and eventually remove, unreliable signals from low-quality cells and unwanted sources of variation that might affect the efficacy of subsequent downstream modules. Pre-processing and quality control (QC) of scRNA-seq data is a laborious process consisting of the manual combination of different computational strategies to quantify QC metrics and define optimal sets of pre-processing parameters.
Here we present popsicleR, an R package that interactively guides both skilled and unskilled command-line users through the pre-processing and QC analysis of scRNA-seq data. The package integrates, into several main wrapper functions, methods derived from widely used pipelines for the estimation of quality-control metrics, filtering of low-quality cells, data normalization, removal of technical and biological biases, and cell clustering and annotation. popsicleR starts from either the output files of the Cell Ranger pipeline from 10X Genomics or a feature-barcode matrix of raw counts generated by any scRNA-seq technology. Open-source code, installation instructions, and a case-study tutorial are freely available at https://github.com/bicciatolab/popsicleR.
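popsicleR itself is an R package, so as a language-neutral illustration of the kind of cell filtering it automates, here is a minimal sketch in Python/NumPy. The thresholds and the function name are illustrative defaults, not popsicleR's; the three metrics (total counts, detected genes, mitochondrial fraction) are, however, the standard scRNA-seq QC metrics the abstract alludes to:

```python
import numpy as np

def qc_filter_cells(counts, gene_names,
                    min_counts=500, min_genes=200, max_mito_frac=0.2):
    """Flag low-quality cells in a cells x genes raw count matrix.

    A cell is kept if it has enough total counts, enough detected
    genes, and a low enough fraction of counts on mitochondrial
    ("MT-") genes. Returns a boolean keep-mask over cells.
    """
    counts = np.asarray(counts)
    total = counts.sum(axis=1)
    n_genes = (counts > 0).sum(axis=1)
    is_mito = np.array([g.upper().startswith("MT-") for g in gene_names])
    mito_frac = counts[:, is_mito].sum(axis=1) / np.maximum(total, 1)
    return ((total >= min_counts)
            & (n_genes >= min_genes)
            & (mito_frac <= max_mito_frac))
```

A real pipeline would choose these cutoffs interactively from the QC-metric distributions, which is precisely the step popsicleR guides the user through.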
995.
Quantitative uncertainty analysis has become a common component of risk assessments. In risk assessment models, the most robust method for propagating uncertainty is Monte Carlo simulation. Many software packages available today offer Monte Carlo capabilities while requiring minimal learning time, computational time, and/or computer memory. This paper presents an evaluation of six software packages in the context of risk assessment: Crystal Ball, @Risk, Analytica, Stella II, PRISM, and Susa-PC. Crystal Ball and @Risk are spreadsheet-based programs; Analytica and Stella II are multi-level, influence-diagram-based programs designed for the construction of complex models; PRISM and Susa-PC are both public-domain programs designed for incorporating uncertainty and sensitivity into any model written in Fortran. Each software package was evaluated on the basis of five criteria, each with several sub-criteria. A 'User Preferences Table' was also developed for an additional comparison of the packages. The evaluations were based on nine weeks of experimentation with the software, including use of the associated user manuals and testing of each package on example problems. The results indicate that Stella II has the most extensive modeling capabilities and can handle linear differential equations. Crystal Ball has the best input scheme for entering uncertain parameters and the best reference materials. @Risk offers a slightly better standard output scheme and requires a little less learning time. Susa-PC has the most options for detailed statistical analysis of the results, such as multiple options for sensitivity analysis and sophisticated options for inputting correlations. Analytica is a versatile, menu- and graphics-driven package, while PRISM is a more specialized and less user-friendly program.
When choosing between software packages for uncertainty and sensitivity analysis, the choice largely depends on the specifics of the problem being modeled. However, for risk assessment problems that can be implemented on a spreadsheet, Crystal Ball is recommended because it offers the best input options, a good output scheme, adequate uncertainty and sensitivity analysis, superior reference materials, and an intuitive spreadsheet basis while requiring very little memory.
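The core technique all six packages implement, Monte Carlo propagation of input uncertainty through a model, fits in a few lines of code. The toy risk model (a simple intake-dose equation) and its input distributions below are invented for illustration and do not come from the paper:

```python
import random
import statistics

def monte_carlo(model, samplers, n=20000, seed=42):
    """Propagate input uncertainty through `model` by plain sampling.

    samplers: dict mapping parameter name -> function that draws one
    sample given a random.Random instance. Returns one model output
    per Monte Carlo trial.
    """
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        params = {name: draw(rng) for name, draw in samplers.items()}
        outputs.append(model(**params))
    return outputs

# Toy risk model: dose = concentration * intake / body weight.
def dose(conc, intake, weight):
    return conc * intake / weight

samplers = {
    "conc": lambda rng: rng.lognormvariate(0.0, 0.5),  # mg/L
    "intake": lambda rng: rng.uniform(1.0, 3.0),       # L/day
    "weight": lambda rng: rng.gauss(70.0, 10.0),       # kg
}
out = monte_carlo(dose, samplers)
print(statistics.median(out))
```

The output sample can then be summarized as percentiles or a full distribution, which is essentially what the packages' output schemes automate.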
996.
FEFCO, Groupement Ondulé and Kraft Institute have integrated the data from their recently published, updated "European Database for Corrugated Board Life Cycle Studies" into a software tool developed especially for the corrugated board industry. The tool links input and output data reported in the Database to average European data for upstream and downstream processes from BUWAL 250 [3]. The tool is intended to support companies' environmental management, since it makes it possible to find opportunities for improvement and to take the environment into consideration when designing corrugated board boxes. The entire system of corrugated packaging is the basis for the calculations. It is assumed that the fibres used for the production of the corrugated base papers are produced and recycled only within this system. This simplified, so-called closed-loop approach, described in detail in the Database report, avoids the problem of allocating, between primary and recovered fibre based paper grades, the impacts caused by primary fibre production and the final treatment of corrugated packaging that is not recycled. This means that the software tool cannot be used to compare the production of primary fibre and recovered fibre based materials as such. The tool enables users to vary parameters such as transport, box design, logistics and waste management according to their own circumstances; in this way they can introduce parameters for possible alternatives they want to investigate. The LCA results of these alternative cases can then be compared and analysed at the inventory, characterisation, normalisation and weighting levels. The user can change neither the basic data nor the methodology.
997.
Life-cycle assessment (LCA) is a technique for systematically analyzing a product from cradle to grave, that is, from resource extraction through manufacture and use to disposal. LCA is a mixed or hybrid analytical system. An inventory phase analyzes system inputs of energy and materials along with outputs of emissions and wastes throughout the life cycle, usually as quantitative mass loadings. An impact assessment phase then examines these loadings in light of potential environmental issues, using a mixed spectrum of qualitative and quantitative methods. The constraints imposed by the inventory's loss of spatial, temporal, dose-response, and threshold information raise concerns about the accuracy of impact assessment. The degree of constraint varies widely according to the environmental issue in question and the models used to extrapolate the inventory data. LCA results may have limited value in two areas: (1) local and/or transient biophysical processes, and (2) issues involving biological parameters, such as biodiversity, habitat alteration, and toxicity. The end result is that impact assessment does not measure actual effects or impacts, nor does it calculate the likelihood of an effect (risk). Rather, LCA impact assessment results are largely directional environmental indicators. The accuracy and usefulness of these indicators need to be assessed individually and in a circumstance-specific manner prior to decision making. This limits LCA's usefulness as the sole basis for comprehensive assessments and comparisons of alternatives. In conclusion, LCA may identify potential issues from a system-wide perspective, but more focused assessments using other analytical techniques are often necessary to resolve those issues.
998.
Enabling non-specialist users to interpret biologically derived Raman spectral information is a practical necessity for obtaining accurate and reliable analytical results. An integrated Raman spectral analysis software package (NWUSA) was developed for spectral processing, analysis, and feature recognition. It provides a user-friendly graphical interface for the following preprocessing tasks: spectral range selection, cosmic ray removal, polynomial-fitting-based background subtraction, Savitzky–Golay smoothing, area-under-curve normalization, and mean centering, as well as for multivariate analysis algorithms including principal component analysis (PCA), linear discriminant analysis, partial least squares discriminant analysis, support vector machines (SVM), and PCA-SVM. A spectral dataset obtained from two different samples was used to evaluate the software, demonstrating that it can quickly and accurately meet the functional requirements of spectral data processing and feature recognition. Moreover, the open-source software can not only be customized with additional functional modules to suit specific needs, but can also benefit many Raman-based investigations, especially clinical applications.
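Three of the preprocessing steps listed above chain together naturally; a minimal sketch in Python/SciPy (NWUSA itself is a separate GUI application, and the parameter values here are illustrative, not NWUSA's defaults) is:

```python
import numpy as np
from scipy.signal import savgol_filter

def preprocess_spectrum(intensities, bg_poly_order=3, window=11):
    """Illustrative Raman preprocessing chain: polynomial background
    subtraction, Savitzky-Golay smoothing, and area-under-curve
    normalization, in that order.
    """
    y = np.asarray(intensities, dtype=float)
    x = np.arange(len(y))
    # Crude background estimate: low-order polynomial fit to the
    # whole spectrum (real tools fit baseline regions only).
    background = np.polyval(np.polyfit(x, y, bg_poly_order), x)
    y = y - background
    # Savitzky-Golay smoothing.
    y = savgol_filter(y, window_length=window, polyorder=2)
    # Area-under-curve normalization; the absolute area keeps the
    # scale well-defined despite negatives left by the subtraction.
    return y / np.abs(y).sum()
```

After this chain, spectra from different acquisitions share a common scale, which is what makes the downstream PCA/SVM steps meaningful.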
999.
To efficiently simulate very large networks of interconnected neurons, particular consideration has to be given to the computer architecture being used. This article presents techniques for implementing simulators for large neural networks on a number of different computer architectures. The neuronal simulation task and the computer architectures of interest are first characterized, and the potential bottlenecks are highlighted. We then describe the experience gained from adapting an existing simulator, SWIM, to two very different architectures: vector computers and multiprocessor workstations. This work led to the implementation of a new simulation library, SPLIT, designed to allow efficient simulation of large networks on several architectures. Different computer architectures put different demands on the organization of both data structures and computations. Strict separation of such architecture considerations from the neuronal models and other simulation aspects makes it possible to construct code that is both portable and extendible.
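The separation principle described above can be illustrated with a toy sketch. The leaky-integrator update rule and both "backends" are invented for illustration (SPLIT's actual interfaces are not given in the abstract); the point is that the model layer knows nothing about how the update is executed:

```python
import numpy as np

# Model layer: the neuron update rule, independent of any execution
# strategy. dv/dt = (I - v) / tau, integrated with forward Euler.
def leaky_update(v, current, dt=0.1, tau=10.0):
    return v + dt * (current - v) / tau

# Backend 1: scalar loop, standing in for a workstation/serial port.
def step_serial(voltages, currents):
    return [leaky_update(v, i) for v, i in zip(voltages, currents)]

# Backend 2: whole-array update, standing in for a vector-computer
# port. Same model code, different organization of the computation.
def step_vector(voltages, currents):
    return leaky_update(np.asarray(voltages), np.asarray(currents))
```

Because both backends call the same model function, porting to a new architecture means writing a new `step_*` driver, not touching the neuronal model, which is the portability argument the abstract makes.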
1000.
In an LCA case study, the three most common industrial metal cleaning technologies were assessed: cleaning based on aqueous cleaning agents, on non-halogenated hydrocarbon solvents, and on halogenated hydrocarbon solvents. Besides optimisation analysis, the comparison of the cleaning processes was a main goal of the study. The function of metal cleaning processes can be described with a set of parameters called functional parameters. In order to compare different cleaning processes within an LCA, it is a precondition that all relevant functional parameters be equivalent. However, metal cleaning processes from different companies normally differ in most of the functional parameters and thus are not functionally equivalent. It is therefore necessary to calculate the material and energy flows of the processes corresponding to a reference function as a basis for comparison. This can be achieved by simulating the processes according to the functional parameters with the help of a process model. For a general comparison of the technologies, it is also necessary to consider the assessed machines at the same level of optimisation and the same scale.
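The rescaling step described above, expressing each process's flows relative to a common reference function, can be sketched as follows. The linear-scaling assumption, the parameter names, and all numbers are illustrative; the study's actual process model would capture non-linear machine behaviour:

```python
def scale_flows_to_reference(flows, functional_value, reference_value):
    """Rescale a process's material/energy flows so they correspond to
    a common reference function (e.g. cleaned surface area). Simple
    proportional scaling stands in for a real process model here.
    """
    factor = reference_value / functional_value
    return {name: amount * factor for name, amount in flows.items()}

# Two hypothetical cleaning machines reported at different throughputs:
aqueous = {"electricity_kWh": 120.0, "water_L": 800.0}  # per 40 m2
solvent = {"electricity_kWh": 90.0, "solvent_kg": 5.0}  # per 25 m2

# Bring both to a reference function of 100 m2 of cleaned surface:
aqueous_ref = scale_flows_to_reference(aqueous, 40.0, 100.0)
solvent_ref = scale_flows_to_reference(solvent, 25.0, 100.0)
print(aqueous_ref["electricity_kWh"])  # 300.0
print(solvent_ref["electricity_kWh"])  # 360.0
```

Only after this normalisation do the two inventories refer to the same function and become comparable within an LCA.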