Search results: 1,096 matching records in total.
1.
When calculating the required sample size for a desired power at a given type I error level, we often assume that we know the exact time of every subject's response whenever it occurs during the study period. In practice, however, it is very common that we only monitor subjects periodically, so we know only whether or not a response occurred during an interval. This paper gives a quantitative discussion of the effect of such data grouping, or interval censoring, on the required sample size when there are two treatment groups. Furthermore, to explore the optimal number of subjects, number of examinations per subject, and total length of the study period, the paper also provides a general guideline for choosing these quantities so as to minimize the total cost of a study for a desired power at a given α-level. A specified linear cost function incorporating the costs of recruiting subjects, of periodic examinations for responses, and of the total length of the study period is assumed, primarily for illustrative purposes.
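The abstract does not reproduce the paper's formulas or cost coefficients, so the sketch below is only a rough illustration of the trade-off it describes: a textbook two-proportion sample-size calculation combined with a hypothetical linear cost function. The coefficients (c0, c_subject, c_exam) and the rule that coarser monitoring attenuates the detectable effect are invented for illustration.

    # Illustrative sketch only: not the paper's method. Standard normal-approximation
    # sample size for comparing two proportions, plus a hypothetical linear cost.
    from scipy.stats import norm

    def n_per_group(p1, p2, alpha=0.05, power=0.80):
        """Approximate subjects per group to detect p1 vs p2."""
        za, zb = norm.ppf(1 - alpha / 2), norm.ppf(power)
        pbar = (p1 + p2) / 2
        return ((za * (2 * pbar * (1 - pbar)) ** 0.5
                 + zb * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) / (p1 - p2)) ** 2

    def total_cost(n, k, c0=1000.0, c_subject=50.0, c_exam=5.0):
        """Hypothetical linear cost: fixed cost + per-subject + per-examination."""
        return c0 + c_subject * 2 * n + c_exam * 2 * n * k

    # Fewer examinations per subject (smaller k) blur response timing; here that is
    # crudely mimicked by shrinking the detectable difference as k decreases.
    best = min(
        ((k, total_cost(n_per_group(0.30, 0.45 - 0.05 / k), k)) for k in range(1, 13)),
        key=lambda t: t[1],
    )
    print("examinations per subject =", best[0], " approx. total cost =", round(best[1]))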
2.
3.
Aim: Species distribution models are invaluable tools in biogeographical, ecological and applied biological research, but specific concerns have been raised about the validity of different modelling techniques. Here we compare two fundamentally different approaches to species distribution modelling: one based on simple occurrence data, where the lack of an ecological framework has been criticized, and the other firmly grounded in socio-ecological theory but requiring highly detailed behavioural information that is often limited in availability.
Location: (Sub-Saharan) Africa.
Methods: We used two distinct techniques to predict the realized distribution of a model species, the vervet monkey (Cercopithecus aethiops Linnaeus, 1758). A maximum entropy model was produced taking 13 environmental variables and presence-only data from 174 sites throughout Africa as input, with an additional 58 sites retained to test the model. A time-budget model considering the same environmental variables was constructed from detailed behavioural data on 20 groups representing 14 populations, with presence-only data from the remaining 218 sites reserved to test model predictions of vervet monkey occurrence. Both models were further validated against a reference species distribution map drawn up by the African Mammals Databank.
Results: Both models performed well, with the time-budget and maximum entropy algorithms correctly predicting vervet monkey presence at 78.4% and 91.4% of their respective test sites. Similarly, the time-budget model correctly predicted presence and absence at 87.4% of map pixels against the reference distribution map, and the maximum entropy model achieved a success rate of 81.8%. Finally, there was a high level of agreement (81.6%) between the presence-absence maps produced by the two models, and the environmental variables identified as most strongly driving vervet monkey distribution were the same in both models.
Main conclusions: The time-budget and maximum entropy models produced accurate and remarkably similar species distribution maps, despite fundamental differences in their conceptual and methodological approaches. Such strong convergence not only supports the credibility of the current results, but also relieves concerns about the validity of the two modelling approaches.
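As a purely illustrative sketch (not the authors' code), this shows how the kinds of figures quoted above could be computed: pixel-wise agreement between two presence/absence maps and the success rate at presence-only test sites. The arrays are random placeholders, not real model output.

    # Hypothetical example: comparing two binary prediction maps on the same grid.
    import numpy as np

    rng = np.random.default_rng(0)
    maxent_map = rng.integers(0, 2, size=(100, 100))      # 1 = predicted presence
    timebudget_map = rng.integers(0, 2, size=(100, 100))
    reference_map = rng.integers(0, 2, size=(100, 100))   # e.g. a digitized range map

    def percent_agreement(a, b):
        """Share of pixels where two presence/absence maps agree."""
        return 100.0 * np.mean(a == b)

    print("model vs model agreement: %.1f%%" % percent_agreement(maxent_map, timebudget_map))
    print("MaxEnt vs reference map:  %.1f%%" % percent_agreement(maxent_map, reference_map))

    # Success rate at presence-only test sites: fraction of known occurrence
    # pixels that a model classifies as presence.
    test_rows, test_cols = rng.integers(0, 100, 58), rng.integers(0, 100, 58)
    print("test-site success: %.1f%%" % (100.0 * maxent_map[test_rows, test_cols].mean()))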
4.
The problem solved in this paper is that of determining the minimum sample size for setting the 'normal' range of a bodily fluid. The proportions of too-low and too-high values considered 'abnormal' are chosen on medical grounds. The criterion used to determine the minimum sample size is that these proportions will not be exceeded by more than a prescribed amount with a given probability. The resulting limits are β-expectation tolerance limits with the added condition just noted, and are labeled β-expectation inner tolerance limits.
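The paper's analytical derivation is not given in the abstract; the simulation sketch below only illustrates the stated criterion, namely choosing the smallest n for which the probability that the true 'too high' proportion exceeds its nominal value by more than a prescribed amount stays small. The normal population, the β-expectation factor used, and all tuning values are assumptions.

    # Simulation sketch only, not the paper's analytical solution.
    import numpy as np
    from scipy.stats import norm, t

    def excess_probability(n, p_high=0.025, delta=0.01, reps=20_000, seed=1):
        """P(true proportion above the upper limit exceeds p_high + delta)."""
        rng = np.random.default_rng(seed)
        x = rng.standard_normal((reps, n))
        xbar, s = x.mean(axis=1), x.std(axis=1, ddof=1)
        # beta-expectation upper limit for a normal population
        k = t.ppf(1 - p_high, df=n - 1) * np.sqrt(1 + 1 / n)
        upper = xbar + k * s
        true_prop_above = norm.sf(upper)          # population is standard normal
        return np.mean(true_prop_above > p_high + delta)

    # The minimum sample size would be the smallest n keeping this probability
    # below the chosen level.
    for n in (20, 50, 100, 200, 400):
        print(n, round(excess_probability(n), 3))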
5.
The use of the negative binomial distribution in both the numerator and denominator in prospective studies leads to an unbiased estimate of the odds ratio and an exact expression for its variance. Sample sizes that minimize the variance of odds ratio estimates are specified. The variance of the odds ratio estimate is shown to be close to the Cramér-Rao lower bound.
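The exact unbiased estimator and variance expression derived in the paper are not reproduced in the abstract; the sketch below merely simulates inverse (negative binomial) sampling in two prospective cohorts and the natural odds-ratio estimate it suggests. All design values are invented.

    # Hedged simulation sketch: each group is followed until r events have occurred,
    # and Y non-events are observed along the way; r/Y then estimates the group odds.
    import numpy as np

    def simulate_or(p1=0.20, p2=0.10, r=25, reps=50_000, seed=2):
        rng = np.random.default_rng(seed)
        y1 = rng.negative_binomial(r, p1, reps)   # non-events before the r-th event
        y2 = rng.negative_binomial(r, p2, reps)
        return (r / np.maximum(y1, 1)) / (r / np.maximum(y2, 1))

    or_hat = simulate_or()
    true_or = (0.20 / 0.80) / (0.10 / 0.90)
    print("true OR %.3f  mean estimate %.3f  empirical variance %.4f"
          % (true_or, or_hat.mean(), or_hat.var()))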
6.
The use of individual-based models in the study of the spatial patterns of disturbances has opened new horizons in forest ecosystem research. However, no studies so far have addressed (i) the uncertainty in geostatistical modelling of the spatial relationships in dendrochronological data, (ii) the number of increment cores necessary to study disturbance spatial patterns, and (iii) the choice of an appropriate geostatistical model in relation to disturbance regime. In addressing these issues, we hope to contribute to advances in research methodology as well as to improve interpretations and generalizations from case studies.

We used data from the beech-dominated Žofínský Prales forest reserve (Czech Republic), where we cored 3020 trees on 74 ha. Block bootstrap and geostatistics were applied to the data, which covered five decades with highly different disturbance histories. This allowed us to assess the general behavior of various mathematical models. Uncertainty in the spatial patterns and stability of the models was measured as the length of the 95% confidence interval (CI) of model parameters.

According to the Akaike Information Criterion (AIC), the spherical model fitted best at a range of ca. 20 m, while the exponential model was best at a range of ca. 60 m. However, the best-fitting models were not always the most stable. The stability of the models grew significantly with sample size. At <500 cores the spherical model was the most stable, while the Gaussian model was very unstable at <300 cores. The pure nugget model produced the most precise nugget estimate. The choice of model should thus be based on the expected spatial relations of the forest ecosystem under study. Sill was the most stable parameter, with an error of ±6–20% for ≥1110 core series. By contrast, practical range was the most sensitive, with an error of at least ±59%. The estimation of the spatial pattern of severe disturbances was more precise than that of fine-scale disturbances.

The results suggest that with a sample size of 1000–1400 cores and a properly chosen model, one reaches a level of precision that does not increase significantly with further growth in sample size. It appears that in temperate old-growth forests controlled by fine-scale disturbances, at least 500 cores are necessary to estimate sill, nugget and relative nugget, while at least 1000 cores are needed to estimate practical range. When choosing the best model, the stability of the model should be considered together with the AIC value. Our results indicate the general limits of disturbance spatial pattern studies using dendrochronological and geostatistical methods, which can be only partially overcome by sample size or sampling design.
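As a hedged illustration of the model-comparison step described above (not the study's actual workflow), the sketch below fits spherical and exponential semivariogram models to a synthetic empirical variogram and compares them with a simple least-squares AIC. The lag spacing, parameter values and noise level are made up; real analyses would start from tree-ring release data and use block bootstrap for the confidence intervals.

    # Illustrative variogram-model comparison on synthetic data.
    import numpy as np
    from scipy.optimize import curve_fit

    def spherical(h, nugget, sill, rng_):
        h = np.asarray(h, dtype=float)
        return np.where(h < rng_,
                        nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3),
                        sill)

    def exponential(h, nugget, sill, rng_):
        return nugget + (sill - nugget) * (1 - np.exp(-3 * np.asarray(h, float) / rng_))

    lags = np.arange(5, 105, 5, dtype=float)
    gamma = spherical(lags, 0.2, 1.0, 60.0) + np.random.default_rng(3).normal(0, 0.03, lags.size)

    def fit_and_aic(model):
        popt, _ = curve_fit(model, lags, gamma, p0=[0.1, 1.0, 50.0], maxfev=10_000)
        rss = np.sum((gamma - model(lags, *popt)) ** 2)
        n, k = lags.size, 3
        return popt, n * np.log(rss / n) + 2 * k   # least-squares AIC

    for name, model in [("spherical", spherical), ("exponential", exponential)]:
        popt, aic = fit_and_aic(model)
        print("%-12s nugget=%.2f sill=%.2f range=%.1f AIC=%.1f" % (name, *popt, aic))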
7.
Mutual information and entropy transfer analyses applied to two inactive states of the human beta-2 adrenergic receptor (β2-AR) unraveled distinct communication pathways. Previously, a so-called “highly” inactive state of the receptor was observed during a 1.5-microsecond molecular dynamics simulation, in which the largest intracellular loop (ICL3) swiftly packed onto the G-protein binding cavity, becoming entirely inaccessible. Mutual information, quantifying the degree of correspondence between backbone-Cα fluctuations, was mostly shared between intra- and extracellular loop regions in the original inactive state, but shifted to entirely different regions in this latest inactive state. Interestingly, the largest amount of mutual information was always shared among the mobile regions. Irrespective of the conformational state, polar residues always contributed more to mutual information than hydrophobic residues, and polar-polar residue pairs shared the highest degree of mutual information compared with pairs incorporating hydrophobic residues. Entropy transfer, quantifying the correspondence between backbone-Cα fluctuations at different time steps, revealed a distinctive pathway directed from the extracellular site toward the intracellular portions in this recently exposed inactive state; here the direction of information flow was the reverse of that observed in the original inactive state, where the mobile ICL3 and its intracellular surroundings drove the future fluctuations of the extracellular regions.
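A minimal sketch of the core quantity, assuming a histogram-based estimator: mutual information between two fluctuation series standing in for the backbone-Cα fluctuations of two regions. The series here are synthetic; a real analysis would use aligned MD trajectory coordinates, and entropy transfer would additionally introduce a time lag between the series.

    # Histogram-based mutual information between two 1-D fluctuation series (nats).
    import numpy as np

    def mutual_information(x, y, bins=20):
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()
        px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    rng = np.random.default_rng(4)
    loop_a = rng.standard_normal(5000)
    loop_b = 0.7 * loop_a + 0.3 * rng.standard_normal(5000)   # correlated fluctuations
    print("MI(loop_a, loop_b) =", round(mutual_information(loop_a, loop_b), 3))
    print("MI(loop_a, noise)  =", round(mutual_information(loop_a, rng.standard_normal(5000)), 3))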
8.
All living structures, from archaea to humans, are open thermodynamic systems analysed through nonequilibrium thermodynamics. Nonequilibrium thermodynamics is a field with important applications to the life sciences, yet it is very often left out of life science courses. A three-step method is suggested for an easy introduction of nonequilibrium thermodynamics to life science students. The first step is to introduce the Prigogine equation dS = d_eS + d_iS and explain the meaning of the entropy exchange with the surroundings, d_eS, and of the internal entropy generation in the system, d_iS. The second step is to show that the Prigogine equation is connected to the equilibrium thermodynamics already known to the students. This can be done by deriving the Clausius inequality, dS ≥ dq/T, from the Prigogine equation applied to reversible and irreversible processes in closed systems; reversible and irreversible processes are discussed separately and the results are then combined into the Clausius inequality. The third step is to introduce the variety of applications of the Prigogine equation in the life sciences. This gives the students an opportunity to understand the entropy balance of physiological processes in cells and organisms, and makes the import and accumulation of entropy, entropy generation, and entropy export easier for them to grasp.
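Written out in the abstract's notation, the second step amounts to the following short derivation (a standard-textbook restatement, not a quotation from the paper): for a closed system the entropy exchanged with the surroundings is d_eS = dq/T, and the second law requires d_iS ≥ 0 (zero for reversible, positive for irreversible processes), so the Prigogine balance yields the Clausius inequality.

    % Sketch of the derivation in LaTeX, using the abstract's symbols
    \begin{align}
      dS &= d_eS + d_iS, \qquad d_eS = \frac{dq}{T}, \qquad d_iS \ge 0 \\
      \Rightarrow \quad dS &= \frac{dq}{T} + d_iS \;\ge\; \frac{dq}{T}.
    \end{align}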
9.
Monterey Bay, CA, is an eastern boundary upwelling system that is nitrogen-limited for much of the year. To resolve the population dynamics of microorganisms important for nutrient cycling in this region, we deployed the Environmental Sample Processor with quantitative PCR assays targeting both ribosomal RNA genes and functional genes for subclades of cyanobacteria (Synechococcus) and of ammonia-oxidizing Archaea (Thaumarchaeota). The results showed a strong correlation between Thaumarchaeota abundance and nitrate during the spring upwelling but not during the fall sampling period. In the relatively stratified fall waters, the Thaumarchaeota community reached higher numbers than in the spring, and an unexpected positive correlation with chlorophyll concentration was observed. Further, we detected drops in Synechococcus abundance that occurred on short (that is, daily) time scales. Upwelling intensity and blooms of eukaryotic phytoplankton strongly influenced Synechococcus distributions in the spring and fall, revealing what appear to be the environmental limitations of Synechococcus populations in this region. Each of these findings has implications for Monterey Bay biogeochemistry. High-resolution sampling provides a better-resolved framework within which to observe changes in the plankton community. We conclude that controls on these ecosystems change on smaller scales than are routinely assessed, and that more predictable trends will be uncovered if they are evaluated on seasonal (monthly) rather than annual or interannual scales.
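As a hedged sketch of the kind of season-by-season abundance-versus-nitrate rank correlation reported above (all variable names and values below are invented placeholders, not ESP data):

    # Season-by-season Spearman correlation of gene abundance with nitrate.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(5)
    nitrate = rng.uniform(0, 20, 60)                       # µM, daily samples
    thaum_spring = 1e4 * nitrate + rng.normal(0, 2e4, 60)  # copies per mL (synthetic)
    thaum_fall = rng.lognormal(11, 0.4, 60)                # decoupled from nitrate

    for label, abundance in [("spring", thaum_spring), ("fall", thaum_fall)]:
        rho, p = spearmanr(abundance, nitrate)
        print("%-6s  Spearman rho = %+.2f  (p = %.3g)" % (label, rho, p))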
10.
Statistical analysis of low-frequency (1.0 sec^-1 and lower) neuronal impulse activity (IA) faces a few fundamental difficulties. The most significant among them is the small number of measurements (interspike intervals) recorded within an acceptable analysis epoch. In our study, we examined the possibility of using the normalized (by its maximum value) informational entropy (H_n) to estimate the significance of changes in the IA generated by low-frequency neurons of the rostral hypothalamus after electrical stimulation of the prefrontal cortex. We compared the efficiency of the U test (Kolmogorov–Mann–Whitney) and of the H_n estimate in the analysis of the same samples of neuronal responses. The results allow us to conclude that H_n is a considerably more suitable estimate than the U test for detecting stimulation-induced modifications of the IA generated by low-frequency neurons. The direction of shifts in the H_n value makes it possible to estimate the pattern of the neuronal response; this value reflects the state of the neuron and correlates with the type of neuronal response.
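A minimal sketch, under stated assumptions, of a normalized entropy H_n = H / H_max computed from an interspike-interval histogram and compared with a Mann-Whitney U test on the same small pre- and post-stimulation samples; the binning, sample sizes and simulated stimulation effect are invented for illustration.

    # Normalized Shannon entropy of an interspike-interval histogram vs. U test.
    import numpy as np
    from scipy.stats import mannwhitneyu

    def normalized_entropy(intervals, bins=8):
        counts, _ = np.histogram(intervals, bins=bins)
        p = counts[counts > 0] / counts.sum()
        return float(-(p * np.log(p)).sum() / np.log(bins))

    rng = np.random.default_rng(6)
    pre = rng.exponential(1.0, 25)                                  # ~1 spike/s
    post = rng.exponential(1.0, 25) * rng.choice([0.3, 2.5], 25)    # more irregular

    print("H_n pre  =", round(normalized_entropy(pre), 3))
    print("H_n post =", round(normalized_entropy(post), 3))
    print("Mann-Whitney U p =", round(mannwhitneyu(pre, post).pvalue, 3))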