3,429 results found (search time: 15 ms)
1.
Jessica L. Hite, Alaina C. Pfenning-Butterworth, Rachel E. Vetter, Clayton E. Cressler. Ecology and Evolution 2020, 10(13): 6239-6245
- Food ingestion is one of the most basic features of all organisms. However, obtaining precise—and high‐throughput—estimates of feeding rates remains challenging, particularly for small, aquatic herbivores such as zooplankton, snails, and tadpoles. These animals typically consume low volumes of food that are time‐consuming to accurately measure.
- We extend a standard high‐throughput fluorometry technique, which uses a microplate reader and 96‐well plates, as a practical tool for studies in ecology, evolution, and disease biology. We outline technical and methodological details to optimize quantification of individual feeding rates, improve accuracy, and minimize sampling error.
- This high‐throughput assay offers several advantages over previous methods, including (i) substantially reduced time per sample, facilitating larger and more efficient experiments; (ii) technical replicates; and (iii) conversion of in vivo measurements to standard units (mL hr⁻¹ ind⁻¹), which enables broad‐scale comparisons across an array of taxa and studies.
- To evaluate the accuracy and feasibility of our approach, we use the zooplankton Daphnia dentifera as a case study. Our results indicate that this procedure accurately quantifies feeding rates and highlights differences among seven genotypes.
- The method detailed here has broad applicability to a diverse array of aquatic taxa, their resources, environmental contaminants (e.g., plastics), and infectious agents. We discuss simple extensions to quantify epidemiologically relevant traits, such as pathogen exposure and transmission rates, for infectious agents with oral or trophic transmission.
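As an illustrative sketch (not the paper's protocol), the conversion of end-point fluorescence readings into per-capita clearance rates in mL hr⁻¹ ind⁻¹ can follow a Coughlan-style grazing equation, assuming fluorescence is proportional to food concentration; the function name and example numbers below are hypothetical:

```python
import math

def clearance_rate(f_control, f_animal, volume_ml, hours, n_ind=1):
    """Per-capita clearance rate (mL hr^-1 ind^-1) from end-point
    fluorescence, assuming fluorescence is proportional to food
    concentration (Coughlan-style grazing equation).

    f_control : mean fluorescence of no-animal control wells
    f_animal  : fluorescence of the well containing the grazer(s)
    """
    if f_control <= 0 or f_animal <= 0:
        raise ValueError("fluorescence readings must be positive")
    return (volume_ml / (n_ind * hours)) * math.log(f_control / f_animal)

# Hypothetical example: a 0.2 mL well, 4 h incubation, one animal
# reducing fluorescence from 1000 (control mean) to 800 units.
rate = clearance_rate(1000.0, 800.0, volume_ml=0.2, hours=4.0)
```

Technical replicates would simply be repeated wells per animal, with the control mean taken over many no-animal wells to reduce sampling error.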
2.
Context: Moderate-grained data may not always represent landscape structure in adequate detail, which can cause misleading results. Certain metrics have been shown to be predictable with changes in scale; however, no studies have verified such predictions using independent fine-grained data.
Objectives: Our objective was to use independently derived land cover datasets to assess relationships between metrics based on fine- and moderate-grained data for a range of analysis extents. We focus on metrics that previous literature has shown to have predictable relationships across scales.
Methods: The study area was located in eastern Connecticut. We compared a 1 m land cover dataset to a 30 m resampled dataset derived from the 1 m data, as well as to two Landsat-based datasets. We examined 11 metrics, including cover areas and patch metrics, using analysis extents ranging from 100 to 1400 m in radius.
Results: The resampled data had very strong linear relationships to the 1 m data from which it was derived, for all metrics, regardless of the analysis extent size. Landsat-based data had strong correlations for most cover area metrics but little or no correlation for patch metrics. Increasing the analysis area improved correlations.
Conclusions: Relationships between coarse- and fine-grained data tend to be much weaker when comparing independent land cover datasets. Thus, trends across scales found by resampling land cover are likely to be unsuitable for predicting the effects of finer-scale elements in the landscape. Nevertheless, coarser data show promise in predicting fine-grained cover area metrics, provided the analysis area is sufficiently large.
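A minimal sketch of the resampling step described above, under the assumption (not stated in the abstract) that aggregation uses a majority rule within each coarse cell; the grid here is simulated, not the Connecticut data:

```python
import numpy as np

rng = np.random.default_rng(0)

def majority_resample(fine, factor):
    """Aggregate a fine-grained binary land-cover grid to a coarser
    grain by majority rule within each factor x factor block."""
    h, w = fine.shape
    blocks = fine[: h - h % factor, : w - w % factor]
    blocks = blocks.reshape(h // factor, factor, w // factor, factor)
    return (blocks.mean(axis=(1, 3)) >= 0.5).astype(int)

# Simulated 1 m binary cover map (1 = target class, ~40% cover),
# aggregated to a 30 m grain.
fine = (rng.random((300, 300)) < 0.4).astype(int)
coarse = majority_resample(fine, 30)
```

With spatially random cover like this, majority aggregation can erase dispersed minority cover entirely, one illustration of why metrics from resampled data need not track independently mapped coarse data.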
3.
Introduction: The International Atomic Energy Agency (IAEA) organized the 3rd international conference on radiation protection (RP) of patients in December 2017. This paper presents the conclusions of the session on interventional procedures (IP).
Material and methods: The IAEA conference was conducted as a series of plenary sessions followed by various thematic sessions. Keynote speakers in the session "Radiation protection of patients and staff in interventional procedures" presented information on: (1) risk management of skin injuries, (2) occupational radiation risks, and (3) RP for paediatric patients. A rapporteur then summarized the session-related papers, followed by an open question-and-answer discussion.
Results: Sixty-seven percent of the papers came from Europe. Forty-four percent were patient studies, 44% were occupational studies, and 12% were combined studies. Occupational studies were mostly on eye lens dosimetry; the rest covered scattered radiation measurements and dose tracking. The majority of patient studies concerned patient exposure, with only one study on paediatric patients. Automatic patient dose reporting is considered a first step towards dose optimization. Despite efforts, paediatric IP radiation dose data are still scarce. The keynote speakers outlined recent achievements as well as challenges in the field: forecasting technology, task-specific targeted education from educators familiar with the clinical situation, more accurate estimation of lens doses, and improved identification of high-risk professional groups.
Conclusions: Manufacturers play an important role in making patients safer. Low-dose technologies are still expensive, and manufacturers should make them affordable in less-resourced countries. Automatic patient dose reporting and real-time skin dose maps are important for dose optimization. Clinical audit and better QA processes, together with more studies on the impact of lens opacities in clinical practice and on paediatric patients, are needed.
5.
Purpose: At its introduction in 2014, dose calculation for the first MLC on a robotic SRS/SBRT platform was limited to a correction-based finite-size pencil beam (FSPB) algorithm. We report on the dosimetric accuracy of a novel Monte Carlo (MC) dose calculation algorithm for this MLC, included in the Precision™ treatment planning system.
Methods: A phantom was built of one 5.0 cm slab of lung-equivalent material (0.09–0.29 g/cc) enclosed by 3.5 cm (above) and 5 cm (below) slabs of solid water (1.045 g/cc). This was irradiated using rectangular (15.4 × 15.4 mm² to 53.8 × 53.7 mm²) and two irregular MLC fields. Radiochromic film (EBT3) was positioned perpendicular to the slabs and parallel to the beam. Calculated dose distributions were compared to film measurements using line scans and 2D gamma analysis.
Results: Measured and MC-calculated percent depth dose curves showed a characteristic dose drop within the low-density region, which was not correctly reproduced by FSPB. Superior average gamma pass rates (2%/1 mm) were found for MC (91.2 ± 1.5%) compared to FSPB (55.4 ± 2.7%). However, MC calculations exhibited localized anomalies at mass density transitions around 0.15 g/cc, which were traced to a simplification in electron transport. Absence of these anomalies was confirmed in a modified build of the MC engine, which increased gamma pass rates to 96.6 ± 1.2%.
Conclusions: The novel MC algorithm greatly improves dosimetric accuracy in heterogeneous tissue, potentially expanding the clinical use of robotic radiosurgery with MLC. In-depth, independent validation is paramount to identify and reduce the residual uncertainties in any software solution.
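For readers unfamiliar with the gamma pass rates quoted above, here is a hedged, brute-force sketch of a global 1D gamma comparison (the study used 2D analysis on film; this simplification to line scans and the function name are mine):

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dose_tol=0.02, dist_mm=1.0):
    """Global 1D gamma pass rate (default 2%/1 mm) between a reference
    (measured) and an evaluated (calculated) dose profile, by
    brute-force search over all evaluated points."""
    ref = np.asarray(ref, float)
    ev = np.asarray(ev, float)
    x = np.arange(ref.size) * spacing_mm
    d_norm = dose_tol * ref.max()  # global dose criterion
    passed = 0
    for xi, di in zip(x, ref):
        dd = (ev - di) / d_norm          # dose differences
        dx = (x - xi) / dist_mm          # distances to agreement
        if np.sqrt(dd ** 2 + dx ** 2).min() <= 1.0:
            passed += 1
    return passed / ref.size

# Identical profiles pass everywhere; a flat 5% global offset
# fails the 2%/1 mm criterion at every point.
profile = np.array([1.0, 1.0, 1.0, 1.0, 1.0])
pass_identical = gamma_pass_rate(profile, profile, spacing_mm=0.5)
pass_offset = gamma_pass_rate(profile, profile * 1.05, spacing_mm=0.5)
```

Production gamma tools interpolate the evaluated distribution between points; this sketch omits that for clarity.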
6.
Bioaccessibility measurements have the potential to improve the accuracy of risk assessments and to reduce remediation costs when they reveal that the solubility of chemicals in a matrix (e.g., soil) differs markedly from that in the critical toxicity study (i.e., the key study from which a toxicological or toxicity reference value is derived). We applied this approach to a brownfield site contaminated with chromium and found, using a combination of alkaline digestion/diphenylcarbazide complexation and X-ray absorption near edge structure analysis, that the chromium was present as Cr(III). The bioaccessibility of Cr2O3, the compound on which the reference dose for Cr(III) is based, was substantially lower (<0.1%) than that of the Cr(III) in the soils, which reached a maximum of 9%, giving relative bioaccessibility values of up to 13,000% in soil. This shows that the reference dose is based on an essentially insoluble compound, and we therefore suggest that other compounds be considered for toxicity testing and derivation of the reference dose. Two possibilities are CrCl3·6H2O and KCr(SO4)2·12H2O, which have been used for the derivation of ecological toxicity reference values and were soluble at a range of dosing levels in our bioaccessibility tests.
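The relative bioaccessibility figure above is a simple ratio; as a worked illustration (the function name is mine, and the reference value of exactly 0.1% is a stated upper bound, so the result is a lower bound on the paper's ~13,000%):

```python
def relative_bioaccessibility(soil_pct, reference_pct):
    """Relative bioaccessibility (%): bioaccessible fraction of the
    contaminant in soil relative to that of the reference compound
    used in the critical toxicity study."""
    return 100.0 * soil_pct / reference_pct

# Soil Cr(III) up to 9% bioaccessible vs < 0.1% for Cr2O3:
# the ratio therefore exceeds 9000%.
rba_lower_bound = relative_bioaccessibility(9.0, 0.1)
```

A relative bioaccessibility far above 100% signals exactly the mismatch the authors describe: the reference compound dissolves far less readily than the form actually present in the soil.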
8.
Saudi Journal of Biological Sciences 2020, 27(11): 2936-2941
In this prototype study, we synthesized tungsten-based composite materials, combining tungsten with oxides of other elements (calcium, scandium, barium, and aluminium) and powdered mouse bone, using a high-energy ball mill, and subsequently fabricated high-density pellets by spark plasma sintering. Electron microscopy showed particle sizes of 1–2 µm, confirming that the synthesized materials are micron-sized. Quantitative and qualitative analyses of the sintered pellets by electron probe microanalysis (EPMA) and energy-dispersive X-ray spectroscopy (EDS) showed a high percentage of tungsten in the scanned areas, together with the other elements of the composite. X-ray tomography revealed no pores across the 3D geometry, indicating a dense sample. The sintered pellets reached ≈99.5% density, with tungsten distributed uniformly across the scanned regions along with focused hot spots, as shown by EPMA. This study paves the way for examining the accumulation and distribution of tungsten with the other elements, toward future applications in bone tissue engineering and the in vivo characterization of tungsten.
9.
Using data from the National Longitudinal Survey of Adolescent to Adult Health, we estimate the effect of peers' alcohol consumption and alcohol prices on the drinking habits of high-school-age youth. We use the two-stage residual inclusion method to account for the endogeneity of peer drinking in nonlinear models. For our sample of high school students, we find that peer effects are statistically and economically significant for the choice to participate in drinking, but not for the frequency of drinking, including binge drinking. Although our sample has good price variation, alcohol prices are not found to be significant. These results matter for policymakers considering ways to reduce underage drinking: we conclude that low-tax states raising excise taxes on alcohol to levels similar to those of high-tax states would have no significant impact on underage drinking. Policymakers may instead choose to focus on the influence of peers and on changing social norms.
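The two-stage residual inclusion (2SRI) idea can be sketched on simulated data (this is a toy illustration of the method, not the paper's data, instruments, or specification; all variable names are hypothetical): regress the endogenous regressor on an instrument, then include the first-stage residual as an extra covariate in the nonlinear second stage.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Simulated data: an unobserved confounder u drives both peer drinking
# (the endogenous regressor) and own drinking; z is an instrument.
u = rng.normal(size=n)
z = rng.normal(size=n)
peer = 0.8 * z + 0.6 * u + rng.normal(size=n)
drink = (0.5 * peer + 0.6 * u + rng.normal(size=n) > 0).astype(float)

def ols_residuals(y, X):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

def logit_fit(y, X, iters=50):
    """Logistic regression fitted by Newton-Raphson."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ b))
        grad = X.T @ (y - p)
        H = (X * (p * (1 - p))[:, None]).T @ X
        b += np.linalg.solve(H, grad)
    return b

ones = np.ones((n, 1))
# Stage 1: regress the endogenous regressor on the instrument.
res = ols_residuals(peer, np.hstack([ones, z[:, None]]))
# Stage 2: logit of drinking on peer drinking plus stage-1 residuals.
naive = logit_fit(drink, np.hstack([ones, peer[:, None]]))
tsri = logit_fit(drink, np.hstack([ones, peer[:, None], res[:, None]]))
```

In this simulation the naive logit overstates the peer effect because the confounder raises both peer and own drinking; including the residual absorbs that confounding, which is the point of 2SRI in nonlinear models.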
10.
A new set of signals for studying the detectability of an X-ray imaging system is presented. The results obtained with these signals are intended to complement NEQ results.

The signals are generated from line spread profiles by progressively removing their lower-frequency components; the resulting high-frequency residues (HFRs) form the set of signals used in detectability studies. Detectability indexes for these HFRs are obtained using a non-prewhitening (NPW) observer. A series of edge images is used to obtain the HFRs, the covariance matrices required by the NPW model, and the MTF and NPS used in the NEQ calculations. The template used in the model is obtained by simulating the blurring and sampling of the edge images. Detectability indexes for the HFRs are compared with NEQ for different acquisition techniques using different beam qualities and doses.

The relative sensitivity shown by detectability indexes using HFRs is higher than that of NEQ, especially at lower doses. The two observers also produce different results at high doses: while the ideal Bayesian observer used by NEQ distinguishes between beam qualities, the NPW observer used with the HFRs produces no differences between them.

Delta functions used in the HFRs are the opposite of complex exponential functions in terms of their support in the spatial and frequency domains. Since NEQ can be interpreted as the detectability of these complex exponential functions, the detectability of HFRs is presented as a natural complement to NEQ in the performance assessment of an imaging system.
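As a hedged sketch of the core quantity above (the construction of the HFR here, via a moving-average high-pass step, and the white-noise covariance are illustrative assumptions, not the paper's procedure), the NPW detectability index uses the signal itself as the template:

```python
import numpy as np

def npw_detectability(signal, covariance):
    """Detectability index of a non-prewhitening (NPW) observer, whose
    template is the signal itself:
        d' = (s^T s) / sqrt(s^T K s)
    where K is the noise covariance matrix."""
    s = np.asarray(signal, float)
    K = np.asarray(covariance, float)
    return (s @ s) / np.sqrt(s @ K @ s)

# Toy high-frequency residue: a Gaussian line spread profile minus
# its low-frequency part (removed here with a moving average).
lsf = np.exp(-0.5 * (np.arange(-10, 11) / 2.0) ** 2)
low = np.convolve(lsf, np.ones(5) / 5, mode="same")
hfr = lsf - low
d_white = npw_detectability(hfr, np.eye(hfr.size) * 0.01)
```

For white noise (K = sigma^2 I) the index reduces to the signal norm over sigma; correlated noise, as captured by the measured covariance matrices, is where the NPW and ideal (prewhitening) observers diverge.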