Similar Documents
 20 similar documents retrieved (search time: 15 ms)
1.
With the growing security requirements of networks, biometric authentication schemes for multi-server environments have become increasingly important and widely deployed. In this paper, we propose a novel biometric-based multi-server authentication and key agreement scheme that builds on a cryptanalysis of Mishra et al.’s scheme. Informal and formal security analyses of our scheme are given, which demonstrate that it satisfies the desirable security requirements. The presented scheme provides a variety of significant functionalities, including features that most existing authentication schemes do not consider, such as user revocation or re-registration and biometric information protection. Compared with several related schemes, our scheme offers stronger security properties and lower computation cost, making it more appropriate for practical applications in remote distributed networks.
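The abstract does not reproduce the protocol itself. As a minimal illustrative sketch only (not Mishra et al.’s scheme or the one proposed here), the following Python fragment shows how a smart-card value can bind an identity, password and extracted biometric key into a hashed challenge-response login; the helper names, message format and use of SHA-256/HMAC are assumptions.

```python
import hashlib, hmac, secrets

def h(*parts: bytes) -> bytes:
    """Hash helper used throughout the sketch."""
    return hashlib.sha256(b"|".join(parts)).digest()

def register(identity: str, password: str, bio_key: bytes, server_secret: bytes) -> dict:
    """Registration centre issues a smart-card record binding ID, password and biometric key."""
    a = h(identity.encode(), password.encode(), bio_key)                 # user-side verifier
    b = hmac.new(server_secret, identity.encode(), hashlib.sha256).digest()
    return {"card_B": bytes(x ^ y for x, y in zip(a, b))}               # masked value stored on card

def login_request(identity: str, password: str, bio_key: bytes, card: dict) -> dict:
    """User recomputes the verifier, unmasks the server-bound value and proves knowledge of it."""
    a = h(identity.encode(), password.encode(), bio_key)
    b = bytes(x ^ y for x, y in zip(a, card["card_B"]))
    nonce = secrets.token_bytes(16)
    return {"id": identity, "nonce": nonce, "proof": h(b, nonce)}

def server_verify(msg: dict, server_secret: bytes) -> bool:
    """Server recomputes the expected proof from its secret and the received nonce."""
    b = hmac.new(server_secret, msg["id"].encode(), hashlib.sha256).digest()
    return hmac.compare_digest(msg["proof"], h(b, msg["nonce"]))

# Example run with a dummy biometric key (in practice derived via a fuzzy extractor):
secret = secrets.token_bytes(32)
card = register("alice", "pw123", b"bio-key-bits", secret)
assert server_verify(login_request("alice", "pw123", b"bio-key-bits", card), secret)
```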

2.
One of the most serious bottlenecks in the scientific workflows of the biodiversity sciences is the need to integrate data from different sources, software applications, and services for analysis, visualisation and publication. For more than a quarter of a century, the TDWG Biodiversity Information Standards organisation has played a central role in defining and promoting data standards and protocols that support interoperability between disparate and locally distributed systems. Although often not sufficiently recognized, TDWG standards are the foundation of many popular biodiversity informatics applications and infrastructures, ranging from small desktop software solutions to large-scale international data networks. However, individual scientists and groups of collaborating scientists have difficulties in fully exploiting the potential of standards that are often notoriously complex, lack non-technical documentation, and use different representations and underlying technologies. In the last few years, a series of initiatives such as Scratchpads, the EDIT Platform for Cybertaxonomy, and biowikifarm have started to implement and set up virtual work platforms for the biodiversity sciences that shield their users from the complexity of the underlying standards. Apart from being practical workhorses for numerous working processes in the biodiversity sciences, they can be seen as information brokers mediating between multiple data standards and protocols. The ViBRANT project will further strengthen the flexibility and power of virtual biodiversity working platforms by building software interfaces between them, thus facilitating the information flows needed for comprehensive data exchange, data indexing, web publication, and versioning. This work will make an important contribution to the shaping of an international, interoperable, and user-oriented biodiversity information infrastructure.

3.
The possible applications for biometric technologies are many and varied, and education is one of the many industries ideally suited to their use. Possible uses in schools include replacing school IDs, lunch passes, library cards, attendance rosters or timecards, as well as controlling physical access. Overall, the use of biometrics in schools is a win-win situation.

4.
Btt runs its survey of the lesser-known and lesser-used biometric technologies once every two years. These technologies often suffer from a lack of funding and credibility, making it extremely difficult for them to gather momentum. This year, however, the amount of progress made is impressive, and it seems that at least two of these biometrics could warrant their own surveys in the not too distant future…

5.
The combined use of smart cards and biometrics has gained in popularity over the last two years. While smart cards themselves may be highly secure, many smart card suppliers acknowledge the advantage that biometrics can bring to their products by linking the card to its owner. Meanwhile, the biometric industry gains a privacy-friendly medium on which to carry its templates. A number of high-profile applications have helped push the convergence of the two technologies, but with sensor-on-card products just around the corner, a remaining question is how far this convergence will actually go.

6.
New ‘omics’ technologies are changing nutritional sciences research. They make it possible to tackle increasingly complex questions, but they also increase the need for collaboration between research groups. An important challenge for successful collaboration is the management and structured exchange of the information that accompanies data-intense technologies. NuGO, the European Nutrigenomics Organization and the major collaborative network in molecular nutritional sciences, is supporting the application of modern information technologies in this area. We have developed and implemented a concept for a data management and computing infrastructure that supports collaboration between nutrigenomics researchers. The system fills the gap between "private" storage with occasional file sharing by email and the use of centralized databases. It provides flexible tools to share data, including during experiments, while preserving ownership. The NuGO Information Network is a decentralized, distributed system for data exchange based on standard web technology. Secure access to data, maintained by the individual researcher, is enabled by web services based on the BioMoby framework. A central directory provides information about the available web services. The flexibility of the infrastructure allows a wide variety of services for data processing and integration by combining several web services, including public services. This integrated information system is therefore also suited to other research collaborations.

7.
Internet application technologies, such as cloud computing and cloud storage, have increasingly changed people’s lives. Websites contain vast amounts of private personal information. In order to protect this information, network security technologies, such as database protection and data encryption, attract many researchers. The most serious problems concerning web vulnerability are e-mail address and network database leakages. These leakages have many causes; for example, malicious users can steal database contents by taking advantage of mistakes made by programmers and administrators. In order to mitigate this type of abuse, a website information disclosure assessment system is proposed in this study. The system utilizes a series of technologies, such as web crawler algorithms, SQL injection attack detection, and web vulnerability mining, to assess a website’s information disclosure. Thirty websites, randomly sampled from the top 50 world colleges, were used to collect leakage information. This testing showed the importance of increasing the security and privacy of website information for academic websites.
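The assessment pipeline is described only at a high level. A minimal sketch of the crawling and leak-detection step, assuming the third-party requests library, a hypothetical seed URL, and crude regex/keyword indicators, could look like this:

```python
import re
from urllib.parse import urljoin, urlparse
import requests  # third-party; pip install requests

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
SQL_ERROR_HINTS = ("SQL syntax", "mysql_fetch", "ORA-01756")  # crude leak indicators

def assess(seed_url: str, max_pages: int = 50) -> dict:
    """Breadth-first crawl of one site, collecting e-mail addresses and SQL error traces."""
    domain = urlparse(seed_url).netloc
    queue, seen = [seed_url], set()
    findings = {"emails": set(), "sql_error_pages": []}
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            page = requests.get(url, timeout=5).text
        except requests.RequestException:
            continue
        findings["emails"].update(EMAIL_RE.findall(page))          # exposed addresses
        if any(hint in page for hint in SQL_ERROR_HINTS):          # possible database leakage
            findings["sql_error_pages"].append(url)
        for href in re.findall(r'href="([^"]+)"', page):           # follow same-domain links only
            nxt = urljoin(url, href)
            if urlparse(nxt).netloc == domain:
                queue.append(nxt)
    return findings

# report = assess("https://www.example-college.edu/")
```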

8.
Smart card chips are increasingly being used to store and match biometric templates. In the near future, however, the security bar will be raised even further as fingerprint sensors are placed on the smart cards themselves. At that point, the combined market for smart cards and biometrics could really blossom.

9.
Forging ahead
A large percentage of the world’s population has seen science fiction movies depicting biometric systems that can be circumvented by criminals using sophisticated devices. Such misleading information may prove to be a box-office winner, but it can lead to a great deal of confusion about how a biometric device assesses whether a person is who he or she claims to be. While the marketing departments of many biometrics companies address the PR issues associated with inaccurate information, the technical divisions are also working hard to eliminate problems associated with ‘live and well’ checks. However, to my knowledge, and in spite of the steps taken by many fingerprint companies, no fingerprint system currently on the market is 100% foolproof.

10.
In recent years, there has been growing interest in finding stronger means of securitising identity against the various risks presented by the mobile, globalised world. Biometric technology has featured prominently on the policy and security agendas of many countries. It is being promoted as the solution du jour for protecting and managing the uniqueness of identity in order to combat identity theft and fraud, crime and terrorism, and illegal work and employment, and to efficiently govern various domains and services including asylum, immigration and social welfare. In this paper, I interrogate the ways in which biometrics is about the uniqueness of identity and ask what kind of identity biometrics is concerned with. I argue that, in posing such questions at the outset, we can start delimiting the distinctive bioethical stakes of biometrics beyond the all-too-familiar concerns of privacy, data protection and the like. I take my cue mostly from Cavarero’s Arendt-inspired distinction between the “what” and the “who” elements of a person, and from Ricoeur’s distinction between the “idem” and “ipse” versions of identity. By engaging with these philosophical distinctions and concepts, and with particular reference to the example of asylum policy, I examine and emphasise an important ethical issue pertaining to the practice of biometric identification. This issue relates mainly to the paradigmatic shift from the biographical story (which for so long has been the means by which an asylum application is assessed) to bio-digital samples (which are now the basis for managing and controlling the identities of asylum applicants). The purging of identity from its narrative dimension lies at the core of biometric technology’s overzealous aspiration to accuracy, precision and objectivity, and raises one of the most pressing bioethical questions vis-à-vis the realm of identification.

11.

Background  

Very often, genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate their interoperation because: 1) the platforms on which the applications run are heterogeneous, 2) their web interfaces are not machine-friendly, 3) they use non-standard formats for data input and output, 4) they do not exploit standards to define application interfaces and message exchange, and 5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of web services workflows.
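As a hedged illustration of the kind of machine-friendly interoperation this abstract calls for (not the workflow engine it describes), the following chains two hypothetical XML web services; the endpoints and response layout are invented for the example.

```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

def call_xml_service(url: str) -> ET.Element:
    """Fetch an XML document from a web service and return its parsed root element."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return ET.fromstring(resp.read())

def annotate(gene_id: str) -> list[str]:
    """Pipe the output of a (hypothetical) sequence service into an annotation service."""
    seq_root = call_xml_service(f"https://seq.example.org/service?gene={urllib.parse.quote(gene_id)}")
    sequence = seq_root.findtext("sequence", default="")
    ann_root = call_xml_service(f"https://annot.example.org/service?seq={urllib.parse.quote(sequence)}")
    return [feat.get("name", "") for feat in ann_root.findall("feature")]

# features = annotate("BRCA2")
```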

12.
The term 'biological resources' is applied to the living biological material collected, held and catalogued in culture collections: bacterial and fungal cultures; animal, human and plant cells; viruses; and isolated genetic material. A wealth of information on these materials has been accumulated in culture collections, and most of this information is accessible. Digitalisation of data has reached a high level; however, the information is still dispersed. Individual and coordinated approaches have been initiated to improve the accessibility of biological resource centres, their holdings and related information through the Internet. These approaches cover subjects such as standardisation of data handling and data accessibility, and standardisation and quality control of laboratory procedures. This article reviews some of the most important initiatives implemented so far, as well as the most recent achievements. It also discusses the improvements that could be achieved by adopting new communication standards and technologies, such as web services, with a view to a deeper and more fruitful integration of biological resource information into the bioinformatics network environment.

13.
The Joint BioEnergy Institute Inventory of Composable Elements (JBEI-ICE) is an open-source registry platform for managing information about biological parts. It is capable of recording information about ‘legacy’ parts, such as plasmids, microbial host strains and Arabidopsis seeds, as well as DNA parts in various assembly standards. ICE is built on the idea of a web of registries and thus provides strong support for distributed, interconnected use. The information deposited in an ICE installation is accessible both via a web browser and through web application programming interfaces, which allow automated access to parts by third-party programs. JBEI-ICE includes several useful browser-based graphical applications for sequence annotation, manipulation and analysis that are also open source. As with other open-source software, users are encouraged to install, use and customize JBEI-ICE and its components for their particular purposes. Through its web application programming interface, ICE provides well-developed parts-storage functionality for other synthetic biology software projects. A public instance is available at public-registry.jbei.org, where users can try out features, upload parts or simply use it for their projects. The ICE software suite is available via Google Code, a hosting site for community-driven open source projects.
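A hedged sketch of the automated, API-based access mentioned above follows; the endpoint path, authentication header and response fields are assumptions for illustration, not the documented JBEI-ICE API.

```python
import json
import urllib.parse
import urllib.request

def search_parts(base_url: str, token: str, query: str) -> list:
    """Search a registry instance for parts matching a free-text query (hypothetical route)."""
    req = urllib.request.Request(
        f"{base_url}/rest/parts?filter={urllib.parse.quote(query)}",  # assumed endpoint
        headers={"X-ICE-API-Token": token},                           # assumed auth header
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read())

# parts = search_parts("https://public-registry.jbei.org", "MY_TOKEN", "promoter")
```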

14.
MOTIVATION: Bioinformatics requires Grid technologies and protocols for building high-performance applications without focusing on the low-level detail of how the individual Grid components operate. RESULTS: The Discovery Net system is a middleware that allows service developers to integrate tools based on existing and emerging Grid standards such as web services. Once integrated, these tools can be composed into reusable workflows, which can later be deployed as new services for others to use. Using the Discovery Net system and a range of different bioinformatics tools, we built a Grid-based application for genome annotation. This includes workflows for automatic nucleotide annotation, annotation of predicted proteins, and text analysis based on metabolic profiles.
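A conceptual sketch of the service-composition idea (not the Discovery Net middleware API): existing tools are wrapped as callable services and chained into a reusable workflow that can itself be exposed as a new service. The service names below are placeholders.

```python
from typing import Callable

Service = Callable[[str], str]

def compose(*steps: Service) -> Service:
    """Chain services so the output of one step becomes the input of the next."""
    def workflow(data: str) -> str:
        for step in steps:
            data = step(data)
        return data
    return workflow

# Illustrative stand-ins for integrated Grid/web services.
def predict_genes(nucleotides: str) -> str:
    return f"genes({nucleotides})"

def annotate_proteins(genes: str) -> str:
    return f"proteins({genes})"

genome_annotation = compose(predict_genes, annotate_proteins)  # reusable, deployable workflow
print(genome_annotation("ACGT..."))
```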

15.
The ever-increasing rate at which whole-genome sequences are becoming accessible to the scientific community has created an urgent need for tools enabling the comparison of chromosomes of different species. We have applied biometric methods to available chromosome sequences and posted the results on our Comparative Genometrics (CG) web site. By genometrics, a term coined by Elston and Wilson [Genet. Epidemiol. (1990), 7, 17–19], we understand a biometric analysis of chromosomes. During the initial phase, our web site displays, for all completely sequenced prokaryotic genomes, three genometric analyses: the DNA walk [Lobry (1999) Microbiology Today, 26, 164–165] and two complementary representations, the cumulative GC- and TA-skew analyses, which can identify, at the level of whole genomes, features inherent to chromosome organization and functioning. It appears that the latter features are taxon-specific. Although primarily focused on prokaryotic chromosomes, the CG web site also contains genometric information on paradigm plasmids, phages, viruses and eukaryotic organelles. The relevant data and methods can be readily used by the scientific community for further analyses as well as for tutorial purposes. Our data posted at the CG web site are freely available on the World Wide Web at http://www.unil.ch/comparativegenometrics.
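The cumulative skew representations have a simple form: for each window, the GC skew is (G − C)/(G + C) (analogously (T − A)/(T + A) for the TA skew), and the per-window values are summed along the chromosome. A short self-contained sketch, with an arbitrarily chosen window size:

```python
def cumulative_skew(seq: str, a: str = "G", b: str = "C", window: int = 10000) -> list:
    """Cumulative (a-b)/(a+b) skew along a chromosome, e.g. GC skew with a='G', b='C',
    or TA skew with a='T', b='A'. Windows containing no a/b bases contribute zero."""
    seq = seq.upper()
    curve, total = [], 0.0
    for start in range(0, len(seq), window):
        win = seq[start:start + window]
        na, nb = win.count(a), win.count(b)
        total += (na - nb) / (na + nb) if (na + nb) else 0.0
        curve.append(total)
    return curve

# In many bacterial chromosomes the extrema of the cumulative GC-skew curve lie
# close to the replication origin and terminus.
gc_curve = cumulative_skew("ATGCGC" * 5000, "G", "C")
ta_curve = cumulative_skew("ATGCGC" * 5000, "T", "A")
```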

16.
Coverage of genetic technologies under national health reform
This article examines the extent to which the technologies expected to emerge from genetic research are likely to be covered under government-mandated health insurance programs such as those being proposed by advocates of national health reform. Genetic technologies are divided into three broad categories: genetic information services, including screening, testing, and counseling; experimental technologies; and gene therapy. The article concludes that coverage of these technologies under national health reform is uncertain. The basic benefits packages provided for in the major health reform plans are likely to provide partial coverage of experimental technologies, relatively broad coverage of information services, and varying coverage of gene therapies, on the basis of an evaluation of their costs, benefits, and the degree to which they raise objections on political and religious grounds. Genetic services that are not included in the basic benefits package will be available only to those who can purchase supplemental insurance or pay for the services with personal funds. The resulting multitiered system of access to genetic services raises serious questions of fairness.

17.
Access to public data sets is important to the scientific community as a resource for developing new experiments or validating new data. Projects such as PeptideAtlas, Ensembl and The Cancer Genome Atlas (TCGA) offer both access to public data and a repository in which to share their own data. Access to these data sets is often provided through a web page form and a web service API. Access technologies based on web protocols (e.g. HTTP) have been in use for over a decade and are widely adopted across the industry for a variety of functions (e.g. search, commercial transactions, and social media). Each architecture adapts these technologies to provide users with tools for accessing and sharing data. Both commonly used web service technologies (e.g. REST and SOAP) and custom-built solutions over HTTP are used to provide access to research data. Providing multiple access points ensures that the community can access the data in the simplest and most effective manner for their particular needs. This article examines three common access mechanisms for web-accessible data: BioMart, caBIG, and Google Data Sources. These are illustrated by implementing each over the PeptideAtlas repository and reviewed for their suitability for specific usages common to research. BioMart, Google Data Sources, and caBIG are each suitable for certain uses. The tradeoffs made in the development of each technology depend on the uses it was designed for (e.g. security versus speed), which means that an understanding of specific requirements and tradeoffs is necessary before selecting an access technology.
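As a hedged sketch of how such access points might be compared for a given usage (here, raw response time and payload size), with placeholder URLs standing in for the real BioMart, caBIG or Google Data Sources endpoints:

```python
import time
import urllib.request

# Placeholder access points; substitute the actual repository endpoints being evaluated.
ACCESS_POINTS = {
    "rest_style":  "https://data.example.org/rest/peptides?protein=P12345",
    "query_style": "https://data.example.org/query?tq=select+peptide+where+protein%3D%27P12345%27",
}

def time_fetch(url: str) -> tuple:
    """Return elapsed seconds and payload size for a single request."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=15) as resp:
        body = resp.read()
    return time.perf_counter() - start, len(body)

# for name, url in ACCESS_POINTS.items():
#     print(name, time_fetch(url))
```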

18.
Applications of information and communications technology (ICT) for the management of environmental data, if used during design and at the end of the product life cycle, can improve the environmental performance of products. This specific application of ICT for data management is called product data technology (PDT) and is based on the use of international standards developed by ISO TC184/SC4. PDT enables computerized representations of information about products, processes, and their properties that are independent of any proprietary computer system or software application. The standard product data models are designed to integrate the necessary information about the materials used in a product, and this information can be accessed and used at any point in the life cycle, from design to disposal. In this article, we present how PDT can support life cycle assessment (LCA), focusing on a series of standards for communicating design and manufacturing data and standards for business and commercial information. Examples of possibilities for using PDT and the semantic web for LCA data are introduced. The findings presented here are based on DEPUIS (Design of Environmentally-Friendly Products Using Information Standards), a project aimed at improving the eco-design of new products and services through the innovative use of new information standards.

19.
Speaker verification is the most highly commercialized of a group of technologies known as speaker recognition or voice biometrics. Other voice-biometric technologies include speaker identification, voice stress analysis and lie detection. Like most other biometric-based security technologies, speaker verification has been experiencing increased market and investor interest. Despite the dot-com crash, 2001 has been a very good year for vendors, with the number of pilots and actual deployments increasing.

20.
As applications of mobile and ubiquitous technologies have become more extensive, the communication security of those applications is emerging as the most important concern, and research is active on various techniques and system applications for individual security elements. In this paper, we propose a new technique that uses voice features to generate mobile one-time passwords (OTPs), producing safe, variable passwords for one-time use from biometric voice information, which can optionally be used for strong personal authentication. We also analysed the availability of the proposed password generation method using a dendrogram of the homomorphic variability of voice feature points and the distribution of skip-sampled feature points from 15 users’ voices. Finally, we describe application cases of the proposed mobile OTP based on skip sampling of the voice signal.
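A minimal sketch of the general idea only (not the authors’ exact algorithm): skip-sample a previously extracted voice feature vector, quantize it, and hash it together with a time counter to obtain a short one-time code. The feature extraction step and all parameter choices below are assumptions.

```python
import hashlib
import time

def skip_sample(features: list, skip: int = 3) -> list:
    """Keep every `skip`-th feature point, as in skip sampling of the voice signal."""
    return features[::skip]

def voice_otp(features: list, digits: int = 6, step_seconds: int = 60) -> str:
    """Derive a time-varying OTP from skip-sampled, coarsely quantized voice features."""
    sampled = skip_sample(features)
    quantized = bytes(int(abs(x) * 100) % 256 for x in sampled)    # coarse quantization
    counter = int(time.time() // step_seconds).to_bytes(8, "big")  # time-based variability
    digest = hashlib.sha256(quantized + counter).digest()
    return str(int.from_bytes(digest[:4], "big") % (10 ** digits)).zfill(digits)

# Example with a dummy feature vector (e.g. MFCCs computed elsewhere):
print(voice_otp([0.12, -0.53, 0.91, 0.04, -0.33, 0.27, 0.66]))
```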
