Sorted by relevance: 6,540 results (query time: 31 ms)
981.
Background: Healthcare systems in dengue-endemic countries are often overburdened due to the high number of patients hospitalized according to dengue management guidelines. We systematically evaluated clinical outcomes in a large cohort of patients hospitalized with acute dengue to support future triaging of patients to ambulatory versus inpatient management.

Methods/Principal findings: From June 2017 to December 2018, we conducted surveillance among children and adults with fever within the prior 7 days who were hospitalized at the largest tertiary-care (1,800-bed) hospital in the Southern Province, Sri Lanka. Patients who developed a platelet count ≤100,000/μL (the threshold for hospital admission in Sri Lanka) and who met at least two clinical criteria consistent with dengue were eligible for enrollment. We confirmed acute dengue by testing sera collected at enrollment for dengue NS1 antigen or IgM antibodies. We defined primary outcomes as per the 1997 and 2009 World Health Organization (WHO) classification criteria: dengue hemorrhagic fever (DHF; WHO 1997), dengue shock syndrome (DSS; WHO 1997), and severe dengue (WHO 2009). Overall, 1064 patients were confirmed as having acute dengue: 318 (17.4%) by NS1 rapid antigen testing and 746 (40.7%) by IgM antibody testing. Of these 1064 patients, 994 (93.4%) were adults ≥18 years and 704 (66.2%) were male. The majority of children (56, 80%) and more than half of adults (544, 54.7%) developed DHF during hospitalization, while 6 (8.6%) children and 22 (2.2%) adults developed DSS. Overall, 10 (14.3%) children and 113 (11.4%) adults developed severe dengue. A total of 2 (0.2%) patients died during hospitalization.

Conclusions: More than half of patients hospitalized with acute dengue progressed to develop DHF, while only a very small number developed DSS or severe dengue.
Developing an algorithm for triaging patients to ambulatory versus inpatient management should be the future goal to optimize utilization of healthcare resources in dengue-endemic countries.
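The enrollment rule the study describes (platelet count ≤100,000/μL plus at least two dengue-consistent clinical criteria, with laboratory confirmation by NS1 antigen or IgM serology) can be sketched as simple predicates. This is an illustrative sketch only; the function and field names are hypothetical, not taken from the study protocol.

```python
def eligible_for_enrollment(platelets_per_ul: int, n_clinical_criteria: int) -> bool:
    """Meets the hospitalization-cohort eligibility threshold described above."""
    return platelets_per_ul <= 100_000 and n_clinical_criteria >= 2


def dengue_confirmed(ns1_positive: bool, igm_positive: bool) -> bool:
    """Acute dengue confirmed by either NS1 antigen or IgM antibody testing."""
    return ns1_positive or igm_positive
```

A triage algorithm of the kind the authors propose would layer outcome-risk predictors on top of rules like these.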
982.
Stay-at-home orders and shutdowns of non-essential businesses are powerful, but socially costly, tools to control the pandemic spread of SARS-CoV-2. Mass testing strategies, which rely on widely administered frequent and rapid diagnostics to identify and isolate infected individuals, could be a potentially less disruptive management strategy, particularly where vaccine access is limited. In this paper, we assess the extent to which mass testing and isolation strategies can reduce reliance on socially costly non-pharmaceutical interventions, such as distancing and shutdowns. We develop a multi-compartmental model of SARS-CoV-2 transmission incorporating both preventative non-pharmaceutical interventions (NPIs) and testing and isolation to evaluate their combined effect on public health outcomes. Our model is designed to be a policy-guiding tool that captures important realities of the testing system, including constraints on test administration and non-random testing allocation. We show how strategic changes in the characteristics of the testing system, including test administration, test delays, and test sensitivity, can reduce reliance on preventative NPIs without compromising public health outcomes in the future. The lowest NPI levels are possible only when many tests are administered and test delays are short, given limited immunity in the population. Reducing reliance on NPIs is highly dependent on the ability of a testing program to identify and isolate unreported, asymptomatic infections. Changes in NPIs, including the intensity of lockdowns and stay-at-home orders, should be coordinated with increases in testing to ensure epidemic control; otherwise, even small additional lifting of these NPIs can lead to dramatic increases in infections, hospitalizations, and deaths.
Importantly, our results can be used to guide ramp-up of testing capacity in outbreak settings, allow for the flexible design of combined interventions based on social context, and inform future cost-benefit analyses to identify efficient pandemic management strategies.
983.
Nurses working in the hospital setting increasingly have become overburdened by managing alarms that, in many cases, provide low information value regarding patient health. The current trend, aided by disposable, wearable technologies, is to promote patient monitoring that does not require entering a patient's room. The development of telemetry alarms and middleware escalation devices adds to the continued growth of auditory, visual, and haptic alarms in the hospital environment but can fail to provide a more complete understanding of patient health. As we begin to innovate to both address alarm overload and improve patient management, perhaps using fundamentally different integration architectures, lessons from the aviation flight deck are worth considering. Commercial jet transport systems and their alarms have evolved slowly over many decades and have developed integration methods that account for operational context, provide multiple response protocol levels, and present a more integrated view of the airplane system state. We articulate three alarm system objectives: (1) supporting hazard management, (2) establishing context, and (3) supporting alarm prioritization. More generally, we present the case that alarm design in aviation can spur directions for innovation for telemetry monitoring systems in hospitals.

Healthcare, and the hospital setting in particular, has experienced rapid growth of auditory, visual, and haptic alarms. These alarms can be notoriously unreliable or can focus on narrowly defined changes to the patient's state.1 Further, this alarm proliferation has led nursing staff to become increasingly overburdened and distressed by managing alarms.2 Current alarm system architectures do not effectively integrate meaningful data that support increased patient status awareness and management.3 In contrast, commercial jet transports, over many decades, have developed integration methods that account for operational context, provide multiple response protocol levels, and present a more integrated view of airplane state to support operational decision making. Similar methods for advanced control rooms in nuclear power generation have been reviewed by Wu and Li.4

In healthcare, The Joint Commission (TJC) and hospital quality departments have generated guidance that further elevates the need to address the industry's "alarm problem." In 2014, TJC issued an accreditation requirement (National Patient Safety Goal 06.01.01) titled "Reduce patient harm associated with clinical alarm systems."5 This requirement continues to be included in the 2020 requirements for accreditation.

From the authors' perspective, this requirement is leading to solutions that will not effectively support performance of essential tasks and is moving away from the types of innovations that are being sought in aviation and other settings. For example, healthcare administrators advocate categorizing alarms into high-priority ("run"), medium-priority ("walk"), and low-priority ("shuffle") alarms independent of unit context, hospital context, situational context, and historical patient context.6 In addition, each alarm category is assigned a minimum response time.
When nurses do not meet response time targets, administrators may add staff ("telemetry monitor watchers"), increase the volume of alarms, escalate alarms to other staff to respond, increase the "startling" nature of alarms to better direct attention, and benchmark average response times by individual nurse identifiers. Although well intentioned, these approaches can sometimes add to the alarm overload problem by creating more alarms and involving more people in alarm response.

The authors, who have investigated human performance in several operational settings, believe that a need exists to reflect more broadly on the role of alarms in understanding and managing a system (be it an aircraft or a set of patients in a hospital department). Most alarms in hospitals signal when a variable is outside a prespecified range that is determined from the patient population (e.g., high heart rate), when a change in cardiac rhythm occurs (e.g., ventricular fibrillation [V-fib]), or when a problem occurs with the alarm system (e.g., change battery). These alarms support shifts in attention when the event being alarmed requires an action by a nurse and when the relative priority of the response is clear in relation to competing demands.

Certain alarms are useful for other purposes, such as aiding situation awareness about planned, routine tasks (e.g., an expected event of high heart rate has occurred, which indicates that a staff member is helping a patient to the bathroom). Increasingly, secondary alarm notification systems (SANSs), otherwise known as middleware escalation systems, are incorporating communications through alarms, such as patient call systems, staff emergency broadcasts, and demands for "code blue" teams to immediately go to a patient's bedside. Thus, alarms are used to attract attention (i.e., to orient staff to an important change).
However, from a cognitive engineering perspective, we believe alarms can also be used to support awareness, prioritization, and decision making. That is, the current siloed approach to alarm presentation in healthcare, which is driven by technology, impedes the ability to properly understand and appreciate the implications of alarms. Understanding the meaning and implications of alarms can best be achieved when they are integrated via a system interface that places the alarm in the broader context of system state. We hope that sharing our insights can spur both design and alarm management innovations for bedside telemetry monitoring devices and related middleware escalation systems and dashboards.

In this article, we provide insights from human factors research, and from the integrated glass cockpit in particular, to prompt innovation with clinical alarm systems. To draw lessons from aviation and other domains, we conducted a series of meetings among three human factors engineers with expertise in alarm design in healthcare, aviation, nuclear power generation, and military command and control domains. In the process, we identified differences in the design, use, and philosophies for managing alarms in different domains; defined alarm systems; clarified common elements in the "alarm problem" across these domains; articulated objectives for an alarm system that supports a human operator in controlling a complex process (i.e., supervisory control); and identified levels of alarm system maturity. Based on these activities, we assert that:
  1. Clinical alarm systems fail to reduce unnecessary complexity compared with the integrated glass cockpit.
  2. Aviation and clinical alarm systems share core objectives.
  3. The challenges with aviation and clinical alarm systems are similar, including where alarm systems fall short of their objectives.
  4. We can demarcate levels in the process of alarm system evolution, largely based on alarm reliability, system integration, and how system state is described. The higher levels point the way for innovation in clinical alarm systems.
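The contrast the authors draw, between fixed run/walk/shuffle categories and context-sensitive prioritization, can be illustrated with a toy scoring function in which unit, situational, and patient context adjust a base priority before a response protocol is assigned. The categories, weights, and modifier names here are entirely hypothetical and are not drawn from any clinical standard.

```python
# Map a clamped priority score to a response protocol (labels from the text).
RESPONSE_PROTOCOLS = {3: "run", 2: "walk", 1: "shuffle", 0: "log-only"}


def prioritize(base_priority: int, expected_event: bool,
               repeated_within_minutes: bool, patient_high_risk: bool) -> str:
    """Adjust a fixed base priority using contextual modifiers (assumed weights)."""
    score = base_priority
    if expected_event:            # e.g., heart-rate rise during planned ambulation
        score -= 2
    if repeated_within_minutes:   # persistence raises urgency
        score += 1
    if patient_high_risk:         # historical patient context
        score += 1
    return RESPONSE_PROTOCOLS[max(0, min(3, score))]
```

Even this crude sketch shows the design point: the same physiological alarm can legitimately demand different responses once context is integrated, which a fixed category scheme cannot express.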
984.
Host-associated microbes influence host health and function and can be a first line of defence against infections. While research increasingly shows that terrestrial plant microbiomes contribute to bacterial, fungal, and oomycete disease resistance, no comparable experimental work has investigated marine plant microbiomes or more diverse disease agents. We test the hypothesis that the eelgrass (Zostera marina) leaf microbiome increases resistance to seagrass wasting disease. From field eelgrass with paired diseased and asymptomatic tissue, 16S rRNA gene amplicon sequencing revealed that bacterial composition and richness varied markedly between diseased and asymptomatic tissue in one of the two years. This suggests that the influence of disease on eelgrass microbial communities may vary with environmental conditions. We next experimentally reduced the eelgrass microbiome with antibiotics and bleach, then inoculated plants with Labyrinthula zosterae, the causative agent of wasting disease. We detected significantly higher disease severity in eelgrass with a native microbiome than in eelgrass with an experimentally reduced microbiome. Our results over multiple experiments do not support a protective role of the eelgrass microbiome against L. zosterae. Further studies of these marine host–microbe–pathogen systems may yet reveal new relationships between plant microbiomes and disease.
985.
Multiple groundfish stocks in New England remain depleted despite management measures that have been effective elsewhere. A growing body of research suggests that environmental change driven by increasing concentrations of carbon dioxide in the atmosphere and ocean is unfolding more rapidly in New England than elsewhere, and is an important factor in the failure of these stocks to respond to management. We reviewed research on effects of changes in temperature, salinity, dissolved oxygen, pH, and ocean currents on pelagic life stages, post-settlement life stages, and reproduction of four species in the New England groundfish fishery: Atlantic cod (Gadus morhua), haddock (Melanogrammus aeglefinus), winter flounder (Pseudopleuronectes americanus), and yellowtail flounder (Limanda ferruginea). The volume of research on cod was nearly equal to that on the other three species combined. Similarly, many more studies examined effects of temperature than other factors. The majority of studies suggest adverse outcomes, with less evidence for mixed or positive effects. However, for all of the factors other than temperature, there are more knowledge gaps than known effects. Importantly, most work to date examines impacts in isolation, but effects might combine in nonlinear ways and cause stronger reductions in stock productivity than expected. Management strategies will need to account for known effects, nonlinear interactions, and uncertainties if fisheries in New England are to adapt to environmental change.
986.
A series of (Z)-4-(3-carbamoylphenylamino)-4-oxobut-2-enyl amides were synthesized and tested for their ability to inhibit the mono-(ADP-ribosyl)transferase, PARP14 (a.k.a. BAL-2; ARTD-8). Two synthetic routes were established for this series and several compounds were identified as sub-micromolar inhibitors of PARP14, the most potent of which was compound 4t, IC50 = 160 nM. Furthermore, profiling other members of this series identified compounds with >20-fold selectivity over PARP5a/TNKS1, and modest selectivity over PARP10, a closely related mono-(ADP-ribosyl)transferase.
987.
The voltage-gated sodium channel NaV1.7 has received much attention from the scientific community due to compelling human genetic data linking gain- and loss-of-function mutations to pain phenotypes. Beyond this genetic validation of NaV1.7 as a target for pain, high-quality pharmacological tools are needed to facilitate further understanding of target biology, establish target coverage requirements, and support subsequent progression into the clinic. Within the sulfonamide class of inhibitors, reduced potency on rat NaV1.7 versus human NaV1.7 was observed, rendering in vivo rat pharmacology studies challenging. Herein, we report the discovery and optimization of novel benzoxazine sulfonamide inhibitors of human, rat and mouse NaV1.7 which enabled pharmacological assessment in traditional behavioral rodent models of pain and, in turn, established a connection between formalin-induced pain and histamine-induced pruritus in mice. The latter represents a simple and efficient means of measuring target engagement.
988.
989.
An understanding of the environmental factors that determine how clam growth varies in space and time improves effective mariculture and shellfish management. We examined the importance of temperature, salinity, and chlorophyll-a in controlling the spatial pattern of growth of Mya arenaria, the commercially important soft-shell clam, in the Plum Island Sound estuary in northeastern Massachusetts, USA. We collected clams (>5.08 cm) monthly during the April to November growing season, from which we determined growth rate, maximum size (L-infinity), and time to reach a harvestable size. We also surveyed selected sites along the estuary to estimate the relationship between clam size and weight. We collected environmental data along the estuary, and our data were complemented with data collected and maintained by the Plum Island ecosystems long-term ecological research project. Clams reached harvestable size fastest and had the greatest L-infinity at the most oceanic site (Yacht Club) in the estuary. Clams had the smallest L-infinity and were slowest to reach harvestable size at the least oceanic site (Railroad Meander). The spatial pattern of clam growth was best explained by salinity, with growth increasing at higher salinity: salinity accounted for 95 % of the spatial variation of clam growth in the estuary. Snow melt in spring increases freshwater input to the estuary and results in the lowest spring salinity during a year, and this explained the upper-estuary limit of clam distribution. IPCC-projected climate change will cause sea-level rise and increasing precipitation in the northeastern USA, which will modify the spatial pattern of salinity in the region's estuaries. Our research therefore suggests that future management of M. arenaria, an important economic resource for the local economy, should be concerned with changes in salinity distribution under climate and land-use change.
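A result like "salinity accounted for 95 % of the spatial variation of clam growth" typically comes from a simple linear regression of site-level growth on site-level salinity, with R² as the variance-explained statistic. The sketch below shows that calculation in plain Python; the data points are fabricated for illustration and are not from the study.

```python
def fit_and_r2(x, y):
    """Ordinary least-squares line y = intercept + slope * x, plus R²."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot


# Hypothetical site means: salinity (ppt) vs. clam growth rate (mm/month)
salinity = [12, 18, 22, 26, 30]
growth = [0.9, 1.6, 2.1, 2.5, 3.1]
slope, intercept, r2 = fit_and_r2(salinity, growth)
```

With a strongly salinity-driven gradient like the fabricated one above, R² lands near 1, matching the magnitude of variance explained that the abstract reports.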
990.

Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司) · 京ICP备09084417号