131.
The analytical scale of most mass-spectrometry-based targeted proteomics assays is usually limited by assay performance and instrument utilization. A recently introduced method, triggered by offset, multiplexed, accurate mass, high resolution, and absolute quantitation (TOMAHAQ), combines peptide and sample multiplexing to simultaneously improve analytical scale and quantitative performance. In the present work, the critical technical requirements and data-analysis considerations for successful implementation of TOMAHAQ are discussed, based on a study of 185 target peptides across more than 200 clinical plasma samples. Importantly, significant interference is observed that originates from the TMTzero reporter ion used for the synthetic trigger peptides. This interference is unexpected, because only TMT10plex reporter ions from the target peptides should be observed under typical TOMAHAQ conditions. To unlock the promise of the technique for high-throughput quantification, a post-acquisition data correction strategy is proposed here to deconvolute the reporter-ion superposition and recover reliable data.
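A minimal numerical sketch of the kind of post-acquisition correction described above: if the fraction of the trigger peptide's TMTzero reporter signal that leaks into each TMT10-plex channel can be characterized beforehand, the estimated contribution can be subtracted channel by channel. The function name, coefficients, and intensities below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch of a post-acquisition reporter-ion correction.
# Assumption: the TMTzero reporter from the trigger peptide contributes a
# pre-characterized fraction of its signal to each TMT10-plex channel.
# All numbers below are placeholders, not measured values.

def correct_reporter_ions(observed, trigger_intensity, leak_fractions):
    """Subtract the estimated TMTzero contribution from each target channel.

    observed          : TMT10-plex reporter intensities, one per channel
    trigger_intensity : TMTzero reporter intensity of the trigger peptide
    leak_fractions    : fraction of the TMTzero signal appearing in each channel
    """
    estimated_interference = trigger_intensity * np.asarray(leak_fractions)
    corrected = np.asarray(observed, dtype=float) - estimated_interference
    return np.clip(corrected, 0.0, None)  # reporter intensities cannot be negative

# Hypothetical example: 10 channels, small leakage into the first two channels.
observed = [1200, 950, 400, 380, 610, 720, 300, 280, 450, 500]
leak = [0.02, 0.01, 0, 0, 0, 0, 0, 0, 0, 0]
print(correct_reporter_ions(observed, trigger_intensity=5000, leak_fractions=leak))
```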
132.
Microbes play important roles in human health and disease. The interaction between microbes and their hosts is reciprocal and remains largely under-explored. Current computational resources lack manually and consistently curated data connecting metagenomic data to pathogenic microbes, microbial core genes, and disease phenotypes. We developed the MicroPhenoDB database by manually curating and consistently integrating microbe-disease association data. MicroPhenoDB provides 5677 non-redundant associations between 1781 microbes and 542 human disease phenotypes across more than 22 human body sites, as well as 696,934 relationships between 27,277 unique clade-specific core genes and 685 microbes. Disease phenotypes are classified and described using the Experimental Factor Ontology (EFO). A refined score model was developed to prioritize the associations based on evidential metrics. The sequence search option in MicroPhenoDB enables rapid identification of known pathogenic microbes in samples without running the usual metagenomic data processing and assembly. MicroPhenoDB offers data browsing, searching, and visualization through user-friendly web interfaces and web-service application programming interfaces. It is the first database platform to detail the relationships between pathogenic microbes, core genes, and disease phenotypes, and it will accelerate metagenomic data analysis and assist studies in decoding microbes related to human diseases. MicroPhenoDB is available at http://www.liwzlab.cn/microphenodb and http://lilab2.sysu.edu.cn/microphenodb.
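For programmatic access via the web-service APIs mentioned above, a client might look like the sketch below. The endpoint path and parameter name are assumptions made purely for illustration; the actual API routes should be taken from the MicroPhenoDB documentation at the URLs given in the abstract.

```python
import requests

# Hypothetical usage sketch only: the endpoint path and the "microbe" parameter
# are assumed names, not the documented MicroPhenoDB API. Consult
# http://www.liwzlab.cn/microphenodb for the real web-service routes.
BASE_URL = "http://www.liwzlab.cn/microphenodb/api"  # assumed base path

def query_microbe_disease(microbe_name):
    """Fetch microbe-disease associations for one microbe (illustrative only)."""
    resp = requests.get(
        f"{BASE_URL}/associations",          # assumed endpoint
        params={"microbe": microbe_name},    # assumed parameter name
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for record in query_microbe_disease("Helicobacter pylori"):
        print(record)
```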
134.
Objective: Acute anterior myocardial infarction markedly affects the interventricular septal contraction rate and the left ventricular ejection fraction (LVEF). This study investigated the correlation between the contraction rates of the descending and ascending segments of the myocardial band and LVEF in patients with acute anterior myocardial infarction. Methods: Thirty-six patients with acute anterior myocardial infarction hospitalized in the cardiology department between April 2015 and February 2017 were enrolled, together with 39 normal controls. Left ventricular long-axis M-mode echocardiograms were obtained for all subjects, and the contraction rates of the interventricular septum, the ascending segment, and the descending segment were measured. LVEF in the infarction group was calculated with the biplane Simpson's method. Results: Compared with the normal control group, the infarction group showed no statistically significant difference in end-diastolic ascending-segment thickness (P=0.69), but a thinner end-systolic ascending-segment thickness (P=0.014) and a markedly reduced ascending-segment contraction rate (P<0.01); the infarction group also showed thinner end-diastolic (P<0.01) and end-systolic (P<0.01) descending-segment thicknesses and a markedly reduced descending-segment contraction rate (P<0.01). In the infarction group, LVEF correlated positively with the descending-segment contraction rate (r²=0.13, P=0.026) and the septal thickening rate (r²=0.19, P<0.01), but showed no correlation with the ascending-segment contraction rate (P>0.05). In the normal control group, LVEF showed no correlation with the septal, descending-segment, or ascending-segment thickening rates. Correlation analysis identified the factors associated with LVEF after infarction, and stepwise regression yielded the multiple linear regression equation LVEF = 48.206 + 18.914 * LVDD(cm) - 25.414 * LVSD(cm). Conclusion: In acute anterior myocardial infarction, the contraction rate of the descending segment of the interventricular septum is markedly impaired, and this impairment is associated with a reduced left ventricular ejection fraction. The multiple linear regression equation can be used to estimate LVEF in anterior-wall myocardial infarction.
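A minimal sketch applying the regression equation reported in the abstract. LVDD and LVSD (in cm) are used exactly as named in the study; the input values in the example are hypothetical and for illustration only.

```python
def estimate_lvef(lvdd_cm, lvsd_cm):
    """Estimate LVEF (%) for anterior-wall myocardial infarction using the
    abstract's equation: LVEF = 48.206 + 18.914 * LVDD(cm) - 25.414 * LVSD(cm)."""
    return 48.206 + 18.914 * lvdd_cm - 25.414 * lvsd_cm

# Hypothetical input values for illustration only.
print(round(estimate_lvef(lvdd_cm=5.0, lvsd_cm=3.5), 1))
```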
138.
China's high-speed economic development and reliance on the overconsumption of natural resources have led to serious environmental pollution. Environmental taxation is seen as an effective economic tool to help mitigate air pollution. To assess the effects of different environmental taxation policy scenarios, we propose a frontier-based, environmentally extended input-output optimization model with explicit emission-abatement sectors that reflects the inputs and benefits of abatement. Frontier analysis ensures that policy scenarios are assessed against the same technical-efficiency benchmark, while input-output analysis captures the wide range of economic transactions among the sectors of an economy. Four scenarios are considered in this study: increasing the specific tax rates of SO2, NOx, and soot and dust separately, and increasing all three tax rates simultaneously. Our estimation results show that raising the tax rates of SO2, NOx, and soot and dust simultaneously would have the greatest emission-reduction effect, with the SO2 tax rate making the largest contribution. Raising the soot and dust tax rate is the most environmentally friendly strategy because it delivers the highest welfare benefit through avoided health costs. The combination of frontier analysis and input-output analysis provides policy makers with a comprehensive, sector-level approach to assessing the costs and benefits of environmental taxation.
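The sketch below is not the authors' frontier-based optimization model; it only illustrates the standard environmentally extended input-output calculation such models build on: total output is obtained from the Leontief inverse applied to final demand, and total emissions follow from sectoral emission intensities. The two-sector matrix and all numbers are illustrative assumptions.

```python
import numpy as np

# Minimal environmentally extended input-output (EEIO) sketch with two sectors.
# Illustrative numbers only; not the frontier-based model described above.
A = np.array([[0.10, 0.20],    # technical coefficients: inter-industry inputs
              [0.30, 0.05]])   # required per unit of each sector's output
final_demand = np.array([100.0, 200.0])
emission_intensity = np.array([0.5, 0.2])   # e.g., tonnes SO2 per unit of output

# Total output needed to satisfy final demand: x = (I - A)^(-1) y
leontief_inverse = np.linalg.inv(np.eye(2) - A)
total_output = leontief_inverse @ final_demand

# Total emissions embodied in final demand
total_emissions = emission_intensity @ total_output
print("sector outputs:", total_output, "total emissions:", total_emissions)
```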
139.
Life cycle assessment (LCA) and environmentally extended input-output analysis (EEIOA) are two techniques commonly used to assess the environmental impacts of an activity or product. Their strengths and weaknesses are complementary, so they are regularly combined into hybrid LCAs. Several approaches to hybrid LCA exist, and they lead to different results. One of the differences is the method used to ensure that mixed LCA and EEIOA data do not overlap, referred to as correction for double counting. This aspect of hybrid LCA is often ignored in reports of hybrid assessments, and no comprehensive study has been carried out on it. This article lists, compares, and analyzes the existing methods for the correction of double counting. We first harmonize the definitions of the existing correction methods and express them in a common notation, before introducing a streamlined variant. We then compare their respective assumptions and limitations, and discuss the loss of specific information about the studied activity or product and the loss of coherent financial representation caused by some of the correction methods. This analysis clarifies which techniques are most applicable to different tasks, from hybridizing individual LCA processes to integrating complete databases. We conclude by giving recommendations for future hybrid analyses.
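To make the double-counting problem concrete, the sketch below shows one simple correction idea: inputs to a process that are already represented in the LCA foreground are zeroed out of the corresponding input-output column so they are not counted twice. This is a generic illustration under stated assumptions, not any of the specific correction methods compared in the article, and all values are invented.

```python
import numpy as np

# Illustrative double-counting correction for hybrid LCA: remove background (IO)
# inputs already covered by the LCA foreground for one process column.
# Generic sketch only; not a specific method analyzed in the article.
io_coefficients = np.array([
    [0.05, 0.10, 0.00],
    [0.20, 0.02, 0.15],
    [0.07, 0.12, 0.01],
])

def correct_double_counting(A_io, process_column, covered_sectors):
    """Return a copy of the IO coefficient matrix with the inputs to one process
    column zeroed for the sectors already covered by LCA foreground data."""
    corrected = A_io.copy()
    for sector in covered_sectors:
        corrected[sector, process_column] = 0.0
    return corrected

# Hypothetical: the LCA foreground of the process mapped to column 1 already
# covers inputs from sectors 0 and 2, so those IO inputs are set to zero.
print(correct_double_counting(io_coefficients, process_column=1, covered_sectors=[0, 2]))
```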