20 similar documents retrieved.
1.
Global stability results for a generalized Lotka-Volterra system with distributed delays
The paper contains an extension of the general ODE system proposed in previous papers by the same authors, to include distributed time delays in the interaction terms. The new system describes a large class of Lotka-Volterra-like population models and epidemic models with continuous time delays. Sufficient conditions for the boundedness of solutions and for the global asymptotic stability of nontrivial equilibrium solutions are given. A detailed analysis of the epidemic system is given with respect to the conditions for global stability. For a relevant subclass of these systems an existence criterion for steady states is also given. Work supported by the Special Program Control of Infectious Diseases, C.N.R., and by the M.P.I., Italy.
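For orientation only (the paper's exact formulation is not reproduced in this abstract), a generalized Lotka-Volterra system with distributed delays in the interaction terms is typically written as
\[
\dot{x}_i(t) = x_i(t)\Big( b_i + \sum_{j=1}^{n} a_{ij} \int_{0}^{\infty} k_{ij}(s)\, x_j(t-s)\, \mathrm{d}s \Big), \qquad i = 1,\dots,n,
\]
with normalized delay kernels \(\int_0^\infty k_{ij}(s)\,\mathrm{d}s = 1\); global-stability conditions of the kind summarized above are usually stated as sign or diagonal-dominance restrictions on the interaction matrix \((a_{ij})\).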
2.
Distributed Shared Arrays (DSA) is a distributed virtual machine that supports Java-compliant multithreaded programming with
mobility support for system reconfiguration in distributed environments. The DSA programming model allows programmers to explicitly
control data distribution so as to take advantage of the deep memory hierarchy, while relieving them from error-prone orchestration
of communication and synchronization at run-time. The DSA system is developed as an integral component of mobility support
middleware for Grid computing so that DSA-based virtual machines can be reconfigured to adapt to the varying resource supplies
or demand over the course of a computation. The DSA runtime system also features a directory-based cache coherence protocol
in support of replication of user-defined sharing granularity and a communication proxy mechanism for reducing network contention.
System reconfiguration is achieved by a DSA service migration mechanism, which moves the DSA service and residing computational
agents between physical servers for load balancing and fault resilience. We demonstrate the programmability of the model in
a number of parallel applications and evaluate its performance by application benchmark programs, in particular, the impact
of the coherence granularity and service migration overhead.
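The abstract does not give implementation details of the directory-based cache coherence protocol; the Python sketch below (all names hypothetical, not taken from DSA) only illustrates the general idea of a home-node directory that tracks which nodes replicate a user-defined sharing unit and which replicas must be invalidated on a write:

    # Hypothetical illustration of a directory-based coherence entry; not DSA's actual code.
    class DirectoryEntry:
        def __init__(self):
            self.state = "uncached"      # "uncached", "shared", or "exclusive"
            self.holders = set()         # node ids currently holding a replica

        def read(self, node):
            # A read adds the requesting node to the sharer set.
            if self.state == "uncached":
                self.state = "shared"
            self.holders.add(node)

        def write(self, node):
            # A write invalidates all other replicas before granting exclusive access.
            to_invalidate = self.holders - {node}
            self.holders = {node}
            self.state = "exclusive"
            return to_invalidate         # nodes that must drop their copies

    entry = DirectoryEntry()
    entry.read(1); entry.read(2)
    print(entry.write(1))                # {2}: the replica on node 2 is invalidated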
Song Fu received the BS degree in computer science from Nanjing University of Aeronautics and Astronautics, China, in 1999, and
the MS degree in computer science from Nanjing University, China, in 2002. He is currently a PhD candidate in computer engineering
at Wayne State University. His research interests include resource management, security, and mobility issues in wide-area
distributed systems.
Cheng-Zhong Xu received the BS and MS degrees in computer science from Nanjing University in 1986 and 1989, respectively, and the Ph.D.
degree in computer science from the University of Hong Kong in 1993. He is an Associate Professor in the Department of Electrical
and Computer Engineering at Wayne State University. His research interests are in distributed and parallel
systems, particularly in resource management for high performance cluster and grid computing and scalable and secure Internet
services. He has published more than 100 peer-reviewed articles in journals and conference proceedings in these areas. He is
the author of the book Scalable and Secure Internet Services and Architecture (CRC Press, 2005) and a co-author of the book Load Balancing in Parallel Computers: Theory and Practice (Kluwer Academic, 1997). He serves on the editorial boards of J. of Parallel and Distributed Computing, J. of Parallel, Emergent,
and Distributed Systems, J. of High Performance Computing and Networking, and J. of Computers and Applications. He was the
founding program co-chair of International Workshop on Security in Systems and Networks (SSN), the general co-chair of the
IFIP 2006 International Conference on Embedded and Ubiquitous Computing (EUC06), and a member of the program committees of
numerous conferences. His research was supported in part by the US National Science Foundation, NASA, and Cray Research. He
is a recipient of the Faculty Research Award of Wayne State University in 2000, the President's Award for Excellence in Teaching
in 2002, and the Career Development Chair Award in 2003. He is a senior member of the IEEE.
Brian A. Wims was born in Washington, DC in 1967. He received the Bachelor of Science in Electrical Engineering from GMI-EMI (now called
Kettering University) in 1990, and a Master of Science in Computer Engineering from Wayne State University in 1999. His research
interests are primarily in the fields of parallel and distributed systems with applications in Mobile Agent technologies.
From 1990 to 2001, he worked in various engineering positions at General Motors, including Electrical Analysis, Software Design, and Test and Development. In 2001, he joined the General Motors IS&S department, where he is currently a Project Manager in the Computer Aided Test group. His responsibilities include managing the development of test automation applications in the Electrical, EMC, and Safety Labs.
Ramzi Basharahil was born in Aden, Yemen in 1972. He received the Bachelor of Science degree in Electrical Engineering from the United Arab Emirates University, graduating at the top of his engineering class of 1997. He obtained a Master of Science degree in 2001 from Wayne State University in the Department of Electrical and Computer Engineering. His main research interests are primarily in the fields of parallel and distributed systems, with applications to distributed processing across clusters of servers.
From 1997 to 1998, he worked as a Teaching Assistant in the Department of Electrical Engineering at the UAE University. In 2000, he joined Internet Security Systems as a security software engineer. He joined NetIQ Corporation in 2002, where he has worked since, leading the development of security event trending and event management software and participating in the design and implementation of event/log management products.
3.
A flexible multi-dimensional QoS performance measure framework for distributed heterogeneous systems
Jong-Kook Kim Debra A. Hensgen Taylor Kidd Howard Jay Siegel David St. John Cynthia Irvine Tim Levin N. Wayne Porter Viktor K. Prasanna Richard F. Freund 《Cluster computing》2006,9(3):281-296
When users’ tasks in a distributed heterogeneous computing environment (e.g., cluster of heterogeneous computers) are allocated
resources, the total demand placed on some system resources by the tasks, for a given interval of time, may exceed the availability
of those resources. In such a case, some tasks may receive degraded service or be dropped from the system. One part of a measure
to quantify the success of a resource management system (RMS) in such a distributed environment is the collective value of
the tasks completed during an interval of time, as perceived by the user, application, or policy maker. The Flexible Integrated
System Capability (FISC) measure presented here is a measure for quantifying this collective value. The FISC measure is a
flexible multi-dimensional measure such that any task attribute can be inserted and may include priorities, versions of a
task or data, deadlines, situational mode, security, application- and domain-specific QoS, and task dependencies. For an environment
where it is important to investigate how well data communication requests are satisfied, the data communication requests satisfied can be the basis of the FISC measure instead of tasks completed. The motivation behind the FISC measure is to determine the performance of resource management schemes if tasks have multiple attributes that need to be satisfied. The goal of this
measure is to compare the results of different resource management heuristics that are trying to achieve the same performance
objective but with different approaches.
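As a loose illustration only (the attributes, weights, and combining rule below are invented and are not the published FISC formula), a multi-attribute collective-value computation over the set of completed tasks could be sketched as:

    # Illustrative sketch: a priority-weighted product of per-attribute scores, summed
    # over completed tasks. The actual FISC definition is given in the cited paper.
    def collective_value(tasks, weights):
        total = 0.0
        for task in tasks:
            value = task["priority"]
            for attr, weight in weights.items():
                value *= task.get(attr, 1.0) ** weight
            total += value
        return total

    completed = [
        {"priority": 3.0, "deadline_met": 1.0, "qos": 0.9},
        {"priority": 1.0, "deadline_met": 0.5, "qos": 1.0},
    ]
    print(collective_value(completed, {"deadline_met": 1.0, "qos": 0.5}))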
This research was supported by the DARPA/ITO Quorum Program, by the DARPA/ISO BADD Program and the Office of Naval Research
under ONR grant number N00014-97-1-0804, by the DARPA/ITO AICE program under contract numbers DABT63-99-C-0010 and DABT63-99-C-0012,
and by the Colorado State University George T. Abell Endowment. Intel and Microsoft donated some of the equipment used in
this research.
Jong-Kook Kim is pursuing a Ph.D. degree from the School of Electrical and Computer Engineering at Purdue University (expected in August
2004). Jong-Kook received his M.S. degree in electrical and computer engineering from Purdue University in May 2000. He received
his B.S. degree in electronic engineering from Korea University, Seoul, Korea in 1998. He has presented his work at several
international conferences and has been a reviewer for numerous conferences and journals. His research interests include heterogeneous
distributed computing, computer architecture, performance measure, resource management, evolutionary heuristics, and power-aware
computing. He is a student member of the IEEE, IEEE Computer Society, and ACM.
Debra Hensgen is a member of the Research and Evaluation Team at OpenTV in Mountain View, California. OpenTV produces middleware for set-top
boxes in support of interactive television. She received her Ph.D. in the area of Distributed Operating Systems from the University
of Kentucky. Prior to moving to private industry, as an Associate Professor in the systems area, she worked with students
and colleagues to design and develop tools and systems for resource management, network re-routing algorithms and systems
that preserve quality of service guarantees, and visualization tools for performance debugging of parallel and distributed
systems. She has published numerous papers concerning her contributions to the Concurra toolkit for automatically generating
safe, efficient concurrent code, the Graze parallel processing performance debugger, the SAAM path information base, and the
SmartNet and MSHN Resource Management Systems.
Taylor Kidd is currently a Software Architect for Vidiom Systems in Portland, Oregon. His current work involves the writing of multi-company
industrial specifications and the architecting of software systems for the digital cable television industry. He has been
involved in the establishment of international specifications for digital interactive television in both Europe and in the
US. Prior to his current position, Dr. Kidd was a researcher for the US Navy as well as an Associate Professor at the Naval Postgraduate School. Dr. Kidd received his Ph.D. in Electrical Engineering in 1991 from the University of California,
San Diego.
H. J. Siegel was appointed the George T. Abell Endowed Chair Distinguished Professor of Electrical and Computer Engineering at Colorado
State University (CSU) in August 2001, where he is also a Professor of Computer Science. In December 2002, he became the first
Director of the CSU Information Science and Technology Center (ISTeC). ISTeC is a university-wide organization for promoting,
facilitating, and enhancing CSU’s research, education, and outreach activities pertaining to the design and innovative application
of computer, communication, and information systems. From 1976 to 2001, he was a professor at Purdue University. He received
two BS degrees from MIT, and the MA, MSE, and PhD degrees from Princeton University. His research interests include parallel
and distributed computing, heterogeneous computing, robust computing systems, parallel algorithms, parallel machine interconnection
networks, and reconfigurable parallel computer systems. He has co-authored over 300 published papers on parallel and distributed
computing and communication, is an IEEE Fellow, is an ACM Fellow, was a Coeditor-in-Chief of the Journal of Parallel and Distributed
Computing, and was on the Editorial Boards of both the IEEE Transactions on Parallel and Distributed Systems and the IEEE
Transactions on Computers. He was Program Chair/Co-Chair of three major international conferences, General Chair/Co-Chair
of four international conferences, and Chair/Co-Chair of five workshops. He has been an international keynote speaker and
tutorial lecturer, and has consulted for industry and government.
David St. John is Chief Information Officer for WeatherFlow, Inc., a weather services company specializing in coastal weather observations
and forecasts. He received a master’s degree in Engineering from the University of California, Irvine. He spent several years
as the head of staff on the Management System for Heterogeneous Networks project in the Computer Science Department of the
Naval Postgraduate School. His current relationship with cluster computing is as a user of the Regional Atmospheric Modeling
System (RAMS), a numerical weather model developed at Colorado State University. WeatherFlow runs RAMS operationally on a
Linux-based cluster.
Cynthia Irvine is a Professor of Computer Science at the Naval Postgraduate School in Monterey, California. She received her Ph.D. from
Case Western Reserve University and her B.A. in Physics from Rice University. She joined the faculty of the Naval Postgraduate
School in 1994. Previously she worked in industry on the development of high assurance secure systems. In 2001, Dr. Irvine
received the Naval Information Assurance Award. Dr. Irvine is the Director of the Center for Information Systems Security
Studies and Research at the Naval Postgraduate School. She has served on special panels for NSF, DARPA, and OSD. In the area
of computer security education Dr. Irvine has most recently served as the general chair of the Third World Conference on Information
Security Education and the Fifth Workshop on Education in Computer Security. She co-chaired the NSF workshop on Cyber-security
Workforce Needs Assessment and Educational Innovation and was a participant in the Computing Research Association/NSF sponsored
Grand Challenges in Information Assurance meeting. She is a member of the editorial board of the Journal of Information Warfare
and has served as a reviewer and/or program committee member of a variety of security related conferences. She has written
over 100 papers and articles and has supervised the work of over 80 students. Professor Irvine is a member of the ACM, the
AAS, a life member of the ASP, and a Senior Member of the IEEE.
Timothy E. Levin is a Research Associate Professor at the Naval Postgraduate School. He has spent over 18 years working in the design, development,
evaluation, and verification of secure computer systems, including operating systems, databases and networks. His current
research interests include high assurance system design and analysis, development of models and methods for the dynamic selection
of QoS security attributes, and the application of formal methods to the development of secure computer systems.
Viktor K. Prasanna received his BS in Electronics Engineering from the Bangalore University and his MS from the School of Automation, Indian
Institute of Science. He obtained his Ph.D. in Computer Science from the Pennsylvania State University in 1983. Currently,
he is a Professor in the Department of Electrical Engineering as well as in the Department of Computer Science at the University
of Southern California, Los Angeles. He is also an associate member of the Center for Applied Mathematical Sciences (CAMS)
at USC. He served as the Division Director for the Computer Engineering Division during 1994–98. His research interests include
parallel and distributed systems, embedded systems, configurable architectures and high performance computing. Dr. Prasanna
has published extensively and consulted for industries in the above areas. He has served on the organizing committees of several
international meetings in VLSI computations, parallel computation, and high performance computing. He is the Steering Co-chair
of the International Parallel and Distributed Processing Symposium [merged IEEE International Parallel Processing Symposium
(IPPS) and the Symposium on Parallel and Distributed Processing (SPDP)] and is the Steering Chair of the International Conference
on High Performance Computing (HiPC). He serves on the editorial boards of the Journal of Parallel and Distributed Computing
and the Proceedings of the IEEE. He is the Editor-in-Chief of the IEEE Transactions on Computers. He was the founding Chair
of the IEEE Computer Society Technical Committee on Parallel Processing. He is a Fellow of the IEEE.
Richard F. Freund is the originator of GridIQ’s network scheduling concepts that arose from mathematical and computing approaches he developed
for the Department of Defense in the early 1980s. Dr. Freund has over twenty-five years' experience in computational mathematics,
algorithm design, high performance computing, distributed computing, network planning, and heterogeneous scheduling. Since
1989, Dr. Freund has published over 45 journal articles in these fields. He has also been an editor of special editions of
IEEE Computer and the Journal of Parallel and Distributed Computing. In addition, he is a founder of the Heterogeneous Computing
Workshop, held annually in conjunction with the International Parallel Processing Symposium. Dr. Freund is the recipient of many awards, including the prestigious Department of Defense Meritorious Civilian Service Award in 1984 and the Lauritsen-Bennet Award from the Space and Naval Warfare Systems Command in San Diego, California.
4.
Storage of sequence data is a major concern, as the amount of data generated grows exponentially at many locations. There is therefore a need to develop techniques to store data using compression algorithms. Here we describe an optimal storage algorithm (OPTSDNA) for storing large amounts of DNA sequences of varying length. This paper provides a performance analysis of the optimal storage algorithm (OPTSDNA) of a distributed bioinformatics computing system for the analysis of DNA sequences. The OPTSDNA algorithm is used for storing DNA sequences of various sizes into a database. DNA sequences of different lengths, ranging from very small to very large, were stored using this algorithm. Storage size and response time are calculated in this work. The efficiency and performance of the algorithm are high in storage-size calculation (expressed as a percentage) when compared with other known sequential approaches.
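The abstract does not specify how OPTSDNA encodes sequences; a common baseline against which any DNA storage scheme is compared is plain 2-bit packing of the four nucleotides, sketched below (this is not the OPTSDNA algorithm itself):

    # Baseline 2-bit packing of a DNA string over {A, C, G, T}; not the OPTSDNA algorithm.
    CODE = {"A": 0, "C": 1, "G": 2, "T": 3}

    def pack(seq):
        bits = 0
        for base in seq:
            bits = (bits << 2) | CODE[base]
        return bits, len(seq)            # length is needed to restore leading 'A's

    def unpack(bits, length):
        bases = []
        for _ in range(length):
            bases.append("ACGT"[bits & 3])
            bits >>= 2
        return "".join(reversed(bases))

    packed, n = pack("ACGTACGT")
    print(packed, unpack(packed, n))     # 8 bases fit in 16 bits instead of 8 bytes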
5.
In the use of age structured population models for agricultural applications, such as the modeling of crop-pest interactions, it is often essential that the model take into account the distribution in maturation rates present in some or all of the populations. The traditional method for incorporating distributed maturation rates into crop and pest models has been the so-called distributed delay method. In this paper we review the application of the distributed delay formalism to the McKendrick equation of an age structured population. We discuss the mathematical properties of the system of ordinary differential equations arising out of the distributed delay formalism. We then discuss an alternative method involving modification of the Leslie matrix.
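The distributed-delay formalism the authors review is most often implemented via the linear chain trick, which replaces a gamma-distributed maturation delay by a cascade of ordinary differential equations. The sketch below uses invented parameters and a delayed-logistic example purely for illustration; it is not the crop-pest model of the paper:

    # Linear chain trick: a gamma-distributed delay of order k and mean tau is represented
    # by k auxiliary ODE stages. Parameters are illustrative, not taken from the paper.
    from scipy.integrate import solve_ivp

    r, K, tau, k = 0.5, 100.0, 1.0, 3        # growth rate, capacity, mean delay, chain length
    a = k / tau                              # per-stage transition rate

    def rhs(t, y):
        x, stages = y[0], y[1:]
        dx = r * x * (1.0 - stages[-1] / K)  # growth limited by the delayed density
        ds = [a * (x - stages[0])]
        ds += [a * (stages[i - 1] - stages[i]) for i in range(1, k)]
        return [dx] + ds

    sol = solve_ivp(rhs, (0.0, 40.0), [10.0] + [10.0] * k)
    print(sol.y[0, -1])                      # the population settles near K when the mean delay is small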
6.
Human cognitive evolution is characterized by two special features that are truly novel in the primate line. The first is the emergence of "mindsharing" cultures that perform cooperative cognitive work, and serve as distributed cognitive networks. The second is the emergence of a brain that is specifically adapted for functioning within those distributed networks, and cannot realize its design potential without them. This paper proposes a hypothetical neural process at the core of this brain adaptation, called the "slow process". It enables the human brain to comprehend social events of much longer duration and complexity than those that characterize primate social life. It runs in the background of human cognitive life, with the faster moving sensorimotor interface running in the foreground. Most mammals can integrate events in the shorter time zone that corresponds to working memory. However, very few can comprehend complex events that extend over several hours (for example, a game or conversation) in what may be called the "intermediate" time zone. Adult humans typically live, plan, and imagine their lives in this time range, which seems to exceed the capabilities of our closest relatives, bonobos and chimpanzees. In summary, human cognition has both an individual and a collective dimension. Individual brains and minds function within cognitive-cultural networks, or CCNs, that store and transmit knowledge. The human brain relies on cultural input even to develop the basic cognitive capacities needed to gain access to that knowledge in the first place. The postulated slow process is a top-down executive capacity that evolved specifically to manage the cultural connection, and handle the cognitive demands imposed by increasingly complex distributed systems.
7.
8.
We study the existence of traveling wave solutions for a class of diffusive population models with distributed delay, and prove that when the mean delay is sufficiently small, the equation admits a monotone traveling wave solution connecting the two equilibria.
9.
Iwaya A Nakagawa S Iwakura N Taneike I Kurihara M Kuwano T Gondaira F Endo M Hatakeyama K Yamamoto T 《FEMS microbiology letters》2005,253(2):163-170
Large-scale nosocomial outbreaks of Serratia marcescens septicaemia in Japan have had a fatality rate of 20-60% within 48 h. As a countermeasure, a real-time PCR assay was constructed for the rapid diagnosis of S. marcescens septicaemia. This assay indeed detected S. marcescens in clinical blood specimens (at ca. 10^2 CFU ml^-1), at a frequency of 0.5% in suspected cases of septicaemia. In mice, the assay provided estimates of blood S. marcescens levels at various infectious stages: namely, 10^7 to 10^8 CFU ml^-1 at a fatal stage (resulting in 100% death), 10^4-10^5 CFU ml^-1 at a moderately fatal stage (resulting in 50% or more death), and <10^3 CFU ml^-1 at a mild stage (resulting in 100% survival), consistent with actual CFU measurements. Blood bacterial levels could be an important clinical marker that reflects the severity of septicaemia. The simultaneous detection of S. marcescens and the carbapenem resistance gene was also demonstrated.
10.
We evaluated habitat quality for crested ibis (Nipponia nippon) using a geographic information system (GIS). First, we digitized the topography map, vegetation map, river map, road map and villages/towns map by ArcInfo, and gave each map layer a suitability index based on our perceptions of the needs of crested ibis. Second, we overlayed these maps to obtain an integrated map of habitat quality. Finally, we compared the calculated habitat quality with the actual distribution of crested ibis. We found that the birds were almost always located at the site of high quality (habitat suitability index [HSI]>0.6), which indicated that the factors we selected were important for crested ibis. We also found that crested ibis were never located at some sites of high quality, thus, we assume that other factors not considered in this study limit the distribution of crested ibis. Regression analysis indicated that crested ibis preferred lower elevation habitats and tolerated higher levels of human disturbance in recent years than previously reported. These results reflected a 20-year protection program for this species.
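The exact rule used to combine the layer-wise suitability indices is not given in the abstract; one conventional choice (shown here purely as an assumption) is a cell-by-cell geometric mean of the factor layers:

    # Hypothetical composite habitat suitability index (HSI) as the geometric mean of
    # per-factor layers; real layers would come from the digitized GIS rasters.
    import numpy as np

    elevation  = np.array([[0.9, 0.4], [0.7, 0.2]])   # per-cell suitabilities in [0, 1]
    vegetation = np.array([[0.8, 0.6], [0.9, 0.1]])
    water      = np.array([[1.0, 0.5], [0.6, 0.3]])

    hsi = (elevation * vegetation * water) ** (1.0 / 3.0)
    print(hsi)
    print(hsi > 0.6)        # cells above the 0.6 threshold reported in the study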
11.
12.
create: a software to create input files from diploid genotypic data for 52 genetic software programs
create is a Windows program for the creation of new and conversion of existing data input files for 52 genetic data analysis software programs. Programs are grouped into areas of sibship reconstruction, parentage assignment, genetic data analysis, and specialized applications. create is able to read in data from text, Microsoft Excel and Access sources and allows the user to specify columns containing individual and population identifiers, birth and death data, sex data, relationship information, and spatial location data. create's only constraints on source data are that one individual is contained in one row, and the genotypic data is contiguous. create is available for download at http://www.lsc.usgs.gov/CAFL/Ecology/Software.html.
13.
P.J. Dias L. Sollelis S.B. Piertney M. Snow 《Journal of experimental marine biology and ecology》2008,367(2):253-258
Shellfish aquaculture is a growing industry in Scotland, dominated by the production of the native mussel Mytilus edulis. Recently, the discovery of Mytilus galloprovincialis and Mytilus trossulus together with M. edulis and all 3 hybrids in cultivation in some Scottish sea lochs led to questions regarding the distribution of mussel species in Scotland. The establishment of an extensive sampling survey, involving the collection of mussels at 34 intertidal sites and 10 marinas around Scotland, motivated the development of a high-throughput method for identification of Mytilus alleles from samples. Three Taqman®-MGB probes and one set of primers were designed, based on the previously described Me 15/16 primers targeting the adhesive protein gene sequence, and samples were screened for the presence of M. edulis, M. galloprovincialis and M. trossulus alleles using real-time PCR. Mytilus edulis alleles were identified in samples from all 44 sites. Mytilus galloprovincialis alleles were found together with M. edulis alleles extensively in northern parts of the west and east coasts. Mytilus trossulus alleles were identified in samples from 6 sites in the west and south-west of Scotland. Because M. trossulus is generally undesirable in cultivation, and preventing the geographical spread of this species across Scotland is therefore considered beneficial by the shellfish aquaculture industry, these 6 samples were further analysed for genotype frequencies using conventional PCR. Although the distributions of the non-native species M. galloprovincialis and M. trossulus have proven to be more widespread than previously thought, there is no evidence from our study of either M. trossulus or M. galloprovincialis acting as an invasive species in Scotland. The real-time PCR method developed in this study has proven to be a rapid and effective tool for the identification of M. edulis, M. galloprovincialis and M. trossulus alleles from samples and should prove useful in future surveys and ecological or aquaculture management related studies in both unispecific and mixed species areas of these species.
14.
Zhao YL Ruan WB Yu L Zhang JY Fu JM Shain EB Huang XT Wang JG 《Journal of nematology》2010,42(2):166-172
Diagnosing and quantifying plant-parasitic nematodes is critical for efficient nematode management. Several studies have been performed intending to demonstrate nematode quantification via real-time quantitative PCR. However, most of the studies used dilution of DNA templates to make standard curves, while few studies used samples with different nematode numbers to make the standard curve, resulting in a high standard error. The objective of the present study was to develop a high quality standard curve using samples containing different numbers of the root-knot nematode Meloidogyne incognita and evaluate the results of real-time qPCR with maxRatio analysis. The results showed that a high quality standard curve was obtained with different nematode numbers using specific primers and cycle threshold (Ct)-PCR (R^2 = 0.9962, P < 0.001, n = 9). With the maxRatio analysis, the fractional cycle number (FCN)-PCR cycle curve and adjusted FCN (FCNadj)-PCR cycle curve had similar patterns as those of the Ct-PCR cycle curve. For quantification of nematodes in field soil samples, qPCR estimation with a FCNadj-PCR cycle standard curve was very close to microscope counting of second-stage juveniles (R^2 = 0.9064, P < 0.001, n = 10), qPCR estimation with a FCN-PCR cycle standard curve was comparably good (R^2 = 0.8509, P < 0.001, n = 10), and the biases with a Ct-PCR cycle standard curve were large (R^2 = 0.7154, P < 0.001, n = 10). Moreover, we found that the concentration of Triton X-100 had less of an effect on FCN as compared to Ct, with delta FCN 0.52 and delta Ct 3.94 at 0.8% Triton. The present study suggests that, combined with maxRatio methods, real-time qPCR could be a practical approach for quantifying M. incognita in field samples.
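A standard curve of the kind described relates the quantification cycle (Ct, or here FCN) linearly to the log of the starting nematode number; the sketch below uses made-up data points, not the study's measurements, to show how such a curve is fitted and then inverted for an unknown sample:

    # Fit Ct (or FCN) against log10(nematode count) and invert the fit for an unknown sample.
    # The data points are invented for illustration only.
    import numpy as np

    counts = np.array([1, 10, 100, 1000, 10000], dtype=float)
    ct     = np.array([33.1, 29.8, 26.4, 23.0, 19.7])       # hypothetical cycle values

    slope, intercept = np.polyfit(np.log10(counts), ct, 1)
    print(f"slope = {slope:.2f}, amplification efficiency = {10 ** (-1 / slope) - 1:.1%}")

    ct_unknown = 25.1
    estimate = 10 ** ((ct_unknown - intercept) / slope)
    print(f"estimated nematodes in sample: {estimate:.0f}")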
15.
Thermostable trypsin conjugates for high-throughput proteomics: synthesis and performance evaluation
Sebela M Stosová T Havlis J Wielsch N Thomas H Zdráhal Z Shevchenko A 《Proteomics》2006,6(10):2959-2963
Conjugating bovine trypsin with oligosaccharides maltotriose, raffinose and stachyose increased its thermostability and suppressed autolysis, without affecting its cleavage specificity. These conjugates accelerated the digestion of protein substrates both in solution and in gel, compared to commonly used unmodified and methylated trypsins.
16.
17.
McLean RK Graham ID Bosompra K Choudhry Y Coen SE Macleod M Manuel C McCarthy R Mota A Peckham D Tetroe JM Tucker J 《Implementation science : IS》2012,7(1):57
BACKGROUND: The Canadian Institutes of Health Research (CIHR) has defined knowledge translation (KT) as a dynamic and iterative process that includes the synthesis, dissemination, exchange, and ethically-sound application of knowledge to improve the health of Canadians, provide more effective health services and products, and strengthen the healthcare system. CIHR, the national health research funding agency in Canada, has undertaken to advance this concept through direct research funding opportunities in KT. Because CIHR is recognized within Canada and internationally for leading and funding the advancement of KT science and practice, it is essential and timely to evaluate this intervention, and specifically, these funding opportunities. DESIGN: The study will employ a novel method of participatory, utilization-focused evaluation inspired by the principles of integrated KT. It will use a mixed methods approach, drawing on both quantitative and qualitative data, and will elicit participation from CIHR-funded researchers, knowledge users, KT experts, as well as other health research funding agencies. Lines of inquiry will include an international environmental scan, document/data reviews, in-depth interviews, targeted surveys, case studies, and an expert review panel. The study will investigate how efficiently and effectively the CIHR model of KT funding programs operates, what immediate outcomes these funding mechanisms have produced, and what impact these programs have had on the broader state of health research, health research uptake, and health improvement. DISCUSSION: The protocol and results of this evaluation will be of interest to those engaged in the theory, practice, and evaluation of KT. The dissemination of the study protocol and results to both practitioners and theorists will help to fill a gap in knowledge in three areas: the role of a public research funding agency in facilitating KT, the outcomes and impacts of KT funding interventions, and how KT can best be evaluated.
18.
19.
Insect wings are deformable structures that change shape passively and dynamically owing to inertial and aerodynamic forces during flight. It is still unclear how the three-dimensional and passive change of wing kinematics owing to inherent wing flexibility contributes to unsteady aerodynamics and energetics in insect flapping flight. Here, we perform a systematic fluid-structure interaction based analysis on the aerodynamic performance of a hovering hawkmoth, Manduca, with an integrated computational model of a hovering insect with rigid and flexible wings. Aerodynamic performance of flapping wings with passive deformation or prescribed deformation is evaluated in terms of aerodynamic force, power and efficiency. Our results reveal that wing flexibility can increase downwash in wake and hence aerodynamic force: first, a dynamic wing bending is observed, which delays the breakdown of leading edge vortex near the wing tip, responsible for augmenting the aerodynamic force-production; second, a combination of the dynamic change of wing bending and twist favourably modifies the wing kinematics in the distal area, which leads to the aerodynamic force enhancement immediately before stroke reversal. Moreover, an increase in hovering efficiency of the flexible wing is achieved as a result of the wing twist. An extensive study of wing stiffness effect on aerodynamic performance is further conducted through a tuning of Young's modulus and thickness, indicating that insect wing structures may be optimized not only in terms of aerodynamic performance but also dependent on many factors, such as the wing strength, the circulation capability of wing veins and the control of wing movements.
20.
Socher E Jarikote DV Knoll A Röglin L Burmeister J Seitz O 《Analytical biochemistry》2008,375(2):318-330
The ability to accurately quantify specific nucleic acid molecules in complex biomolecule solutions in real time is important in diagnostic and basic research. Here we describe a DNA-PNA (peptide nucleic acid) hybridization assay that allows sensitive quantification of specific nucleic acids in solution and concomitant detection of select single base mutations in resulting DNA-PNA duplexes. The technique employs so-called FIT (forced intercalation) probes in which one base is replaced by a thiazole orange (TO) dye molecule. If a DNA molecule that is complementary to the FIT-PNA molecule (except at the site of the dye) hybridizes to the probe, the TO dye exhibits intense fluorescence because stacking in the duplexes enforces a coplanar arrangement even in the excited state. However, a base mismatch at either position immediately adjacent to the TO dye dramatically decreases fluorescence, presumably because the TO dye has room to undergo torsional motions that lead to rapid depletion of the excited state. Of note, we found that the use of d-ornithine rather than aminoethylglycine as the PNA backbone increases the intensity of fluorescence emitted by matched probe-target duplexes while specificity of fluorescence signaling under nonstringent conditions is also increased. The usefulness of the ornithine-containing FIT probes was demonstrated in the real-time PCR analysis providing a linear measurement range over at least seven orders of magnitude. The analysis of two important single nucleotide polymorphisms (SNPs) in the CFTR gene confirmed the ability of FIT probes to facilitate unambiguous SNP calls for genomic DNA by quantitative PCR.