Similar Documents
20 similar documents found (search time: 15 ms)
1.

Fog-cloud computing is a promising distributed model for hosting ever-increasing Internet of Things (IoT) applications. IoT applications have different characteristics, such as deadlines, request rates, and input file sizes. Fog nodes are heterogeneous, resource-limited devices and cannot accommodate all IoT applications. Given these difficulties, designing an efficient algorithm to deploy a set of IoT applications in a fog-cloud environment is very important. In this paper, a fuzzy approach is developed to classify applications based on their characteristics; an efficient heuristic algorithm is then proposed to place the applications on virtualized computing resources. The proposed policy aims to provide high quality of service for IoT users while maximizing the profit of fog service providers by minimizing resource wastage. Extensive simulation experiments were conducted to evaluate the performance of the proposed policy. Results show that it outperforms other approaches, improving average response time by up to 13% and the percentage of deadline-satisfied requests by up to 12%, and reducing resource wastage by up to 26%.
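The classify-then-place policy described above can be sketched as follows. All membership functions, thresholds, and capacities here are invented for illustration; they are not the paper's actual fuzzy rules or parameters:

```python
# Sketch: fuzzy-style scoring of IoT applications by deadline, rate, and
# input size, followed by a greedy placement heuristic. Urgent apps are
# placed on the capacity-limited fog tier first; the rest go to the cloud.

def urgency_score(deadline_ms, rate_hz, input_kb):
    """Blend three normalized, clipped memberships into one urgency score."""
    tight_deadline = max(0.0, min(1.0, (1000 - deadline_ms) / 1000))
    high_rate = max(0.0, min(1.0, rate_hz / 100))
    small_input = max(0.0, min(1.0, (512 - input_kb) / 512))
    return (tight_deadline + high_rate + small_input) / 3

def place(apps, fog_mips):
    """Greedy first-fit: most urgent apps claim fog capacity first."""
    placement = {}
    for app in sorted(apps, key=lambda a: -urgency_score(*a[1:])):
        name = app[0]
        demand = 100  # assumed fixed MIPS demand per application
        if urgency_score(*app[1:]) > 0.5 and fog_mips >= demand:
            placement[name] = "fog"
            fog_mips -= demand
        else:
            placement[name] = "cloud"
    return placement

apps = [("cam", 100, 50, 64), ("backup", 5000, 1, 4096)]
print(place(apps, fog_mips=100))  # {'cam': 'fog', 'backup': 'cloud'}
```

A real implementation would replace the hand-made score with the paper's fuzzy inference rules and a proper resource model, but the structure (classification stage feeding a placement heuristic) is the same.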


2.
Sabitha, S.; Rajasree, M. S. Cluster Computing (2021) 24(2): 1455-1478

The exponential growth of data storage and sharing in the cloud demands an efficient access control mechanism for flexible data sharing. Attribute-Based Encryption (ABE) is a promising cryptographic solution for sharing data among users in the cloud, but it suffers from user revocation, attribute revocation, and forward- and backward-secrecy issues. Communication and computation overheads are also high, because ciphertext and secret-key sizes grow linearly with the number of attributes. In this paper, we investigate an on-demand access control mechanism for flexible, secure data sharing among randomly selected users: a tunable mechanism for sharing ciphertext classes in the cloud. It delegates the decryption rights for any set of ciphertext classes to users only if their attributes satisfy the access policy associated with the ciphertext and they possess a compact key corresponding to the intended set of ciphertext classes. It produces a constant-size ciphertext and a compact secret key, which efficiently utilizes storage space and reduces communication cost. The compact key aggregates the power of the secret keys used to encrypt the outsourced data. The method flexibly shares ciphertext classes among randomly selected users holding a specific set of attributes; all ciphertext classes outside the set remain confidential. It allows dynamic data updates by verifying a user's data-manipulation privilege with the help of a claim policy. The proposed scheme provides access control of varying granularity, at the user, file, and attribute levels; the granularity level can be chosen based on the application and user demands. Hence, it is a multi-level, tunable access control over the shared data, well suited to secure data storage. The scheme tackles the user-revocation and attribute-revocation problems, allowing the data owner to revoke a specific user or a group of users, and it prevents forward- and backward-secrecy issues.


3.

The radical technological shift brought about by connected things has led to a significant proliferation in demand for IoT devices, commonly called 'smart devices'. These devices are capable of data collection, which can help in numerous applications, particularly in healthcare. With the tremendous growth of these resource-constrained end devices, there has been a substantial increase in the variety of attacks. Because these end devices handle sensitive data whose mishandling could cause severe damage, defending the data's integrity and preserving its privacy, confidentiality, and availability are of utmost importance. Many protocols, models, architectures, and tools have been proposed to provide security; nevertheless, almost every solution propounded so far falls short of fully protecting the system in some way or another. Here, we propose a lightweight anonymous mutual authentication scheme for end devices and fog nodes.
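The general pattern behind lightweight mutual authentication can be illustrated with a pre-shared-key HMAC challenge-response. This is only the generic pattern, not the paper's scheme; the anonymity and session-key-derivation mechanisms of the actual protocol are not reproduced:

```python
# Sketch: mutual authentication between an end device and a fog node using
# a pre-shared key (PSK) and HMAC challenge-response. Binding each party's
# role into the MAC prevents a response from being reflected back.
import hashlib
import hmac
import os

PSK = os.urandom(32)  # pre-shared key, provisioned out of band

def respond(key, challenge, role):
    # MAC over role || challenge, so device and fog responses differ.
    return hmac.new(key, role + challenge, hashlib.sha256).digest()

# Device -> fog: device sends a fresh challenge; fog proves PSK knowledge.
c_dev = os.urandom(16)
fog_proof = respond(PSK, c_dev, b"fog")
assert hmac.compare_digest(fog_proof, respond(PSK, c_dev, b"fog"))

# Fog -> device: fog sends its own challenge; device proves PSK knowledge.
c_fog = os.urandom(16)
dev_proof = respond(PSK, c_fog, b"device")
assert hmac.compare_digest(dev_proof, respond(PSK, c_fog, b"device"))
print("mutual authentication ok")
```

Constant-time comparison (`hmac.compare_digest`) and fresh random challenges are the two details that matter even in a toy version.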


4.
Li, Dong; Luo, Zai; Cao, Bo. Cluster Computing (2022) 25(4): 2585-2599

Blockchain is an immutable ledger technology that stores transactions in high-security chains of blocks and can address security and privacy issues in a variety of domains. With the rapid development of smart environments and of complicated contracts between users and intelligent devices, federated learning (FL) is a new paradigm that improves the accuracy and precision of data mining while supporting information privacy and security. Much sensitive information, such as patient health records, industrial safety information, and personal banking information, in various Internet of Things (IoT) domains, including smart cities, smart healthcare, and smart industry, must be collected and processed for training and testing in a privacy-preserving and secure manner. Applying blockchain technology to intelligent learning can help maintain and sustain information security and privacy, and blockchain-based FL mechanisms are a very active topic at the cutting edge of data science and artificial intelligence. This research provides a systematic study of privacy and security in blockchain-based FL methodologies, drawing on the scientific databases to provide an objective road map of the status of this issue. According to the analytical results, blockchain-based FL has grown significantly during the past five years, and blockchain technology has been used mostly to solve problems related to patient healthcare records, image retrieval, cancer datasets, industrial equipment, and economic information in IoT applications and smart environments.


5.

The spread of the Internet of Things (IoT) is demanding new, powerful architectures for handling the huge amounts of data produced by IoT devices. In many scenarios, existing isolated solutions applied to IoT devices use a set of rules to detect, report, and mitigate malware activities or threats. This paper describes a development environment that allows the programming and debugging of such rule-based multi-agent solutions. The solution consists of the integration of a rule engine into the agent, the use of a specialized wrapping agent class with a graphical user interface for programming and testing purposes, and a mechanism for the incremental composition of behaviors. Finally, a set of examples and a comparative study were carried out to test the suitability and validity of the approach. The JADE multi-agent middleware was used for the practical implementation.
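The core of the approach, a rule engine embedded inside an agent, can be sketched with a minimal forward-chaining engine. The rules and thresholds below are invented for illustration; the paper integrates a full rule engine with JADE agents, which this toy does not reproduce:

```python
# Sketch: a minimal forward-chaining rule engine of the kind that could be
# embedded in an agent to flag suspicious device activity.

class RuleAgent:
    def __init__(self):
        self.facts = set()
        self.rules = []  # list of (frozenset_of_premises, conclusion)

    def add_rule(self, premises, conclusion):
        self.rules.append((frozenset(premises), conclusion))

    def assert_fact(self, fact):
        """Add a fact, then forward-chain to a fixed point."""
        self.facts.add(fact)
        changed = True
        while changed:
            changed = False
            for premises, conclusion in self.rules:
                if premises <= self.facts and conclusion not in self.facts:
                    self.facts.add(conclusion)
                    changed = True

agent = RuleAgent()
agent.add_rule({"high_traffic", "unknown_destination"}, "possible_exfiltration")
agent.add_rule({"possible_exfiltration"}, "raise_alert")
agent.assert_fact("high_traffic")
agent.assert_fact("unknown_destination")
print("raise_alert" in agent.facts)  # True
```

The incremental composition of behaviors mentioned in the abstract corresponds here to adding rules one at a time; chained conclusions fire automatically once their premises hold.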


6.
Keyword search on encrypted data allows one to issue a search token and conduct search operations over encrypted data while still preserving keyword privacy. In this paper, we consider the keyword search problem further and introduce a novel notion, attribute-based proxy re-encryption with keyword search, which offers a promising feature: in addition to supporting keyword search on encrypted data, it enables data owners to delegate the keyword search capability to other data users who comply with a specific access control policy. Specifically, the primitive allows (i) the data owner to outsource encrypted data to the cloud and then ask the cloud to conduct keyword search over the outsourced data with a given search token, and (ii) the data owner to delegate keyword search capability to other data users in a fine-grained access control manner, by allowing the cloud to re-encrypt the stored ciphertexts with a re-encryption key embedding some form of access control policy. We formalize the syntax and security definitions of this notion and propose two concrete constructions: a key-policy variant and a ciphertext-policy variant. In a nutshell, our constructions can be viewed as an integration of techniques from attribute-based cryptography and proxy re-encryption.

7.

Non-orthogonal multiple access (NOMA) and cognitive radio (CR) have recently emerged as potential solutions for meeting the extraordinary demands of fifth-generation (5G) and beyond (B5G) networks and supporting Internet of Things (IoT) applications. Multiple users can be served within the same orthogonal domain in NOMA via power-domain multiplexing, while CR allows secondary users (SUs) to access the licensed spectrum. This work investigates the combination of orthogonal frequency division multiple access (OFDMA), NOMA, and CR, referred to as a hybrid OFDMA-NOMA CR network. In this hybrid scheme, the licensed frequency band is divided into several channels, and a group of SUs is served in each channel using NOMA. In particular, a rate-maximization framework is developed in which user pairing on each channel, the power allocation for each user, and the secondary users' activities are jointly considered to maximize the sum rate of the hybrid OFDMA-NOMA CR network while maintaining a set of relevant NOMA and CR constraints. The resulting sum-rate maximization problem is NP-hard and cannot be solved with classical approaches. Accordingly, we propose a two-stage approach: in the first stage, a novel user pairing algorithm is proposed; in the second stage, an iterative algorithm based on sequential convex approximation evaluates a solution of the non-convex rate-maximization problem. Results show that the proposed algorithm outperforms existing schemes and that CR network features play a major role in deciding the overall network performance.
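The power-domain multiplexing at the heart of NOMA can be illustrated with the standard two-user achievable-rate formulas. The channel gains, noise level, and power split below are illustrative numbers, not values from the paper:

```python
# Sketch: achievable rates for two power-domain NOMA users on one channel.
# The strong user performs successive interference cancellation (SIC); the
# weak user decodes while treating the strong user's signal as noise.
import math

def noma_rates(p_total, split, g_strong, g_weak, noise=1.0):
    p_weak = split * p_total          # more power goes to the weak user
    p_strong = p_total - p_weak
    # Weak user: the strong user's signal appears as interference.
    r_weak = math.log2(1 + p_weak * g_weak / (p_strong * g_weak + noise))
    # Strong user: removes the weak user's signal via SIC first.
    r_strong = math.log2(1 + p_strong * g_strong / noise)
    return r_strong, r_weak

r_s, r_w = noma_rates(p_total=10, split=0.8, g_strong=4.0, g_weak=0.5)
print(round(r_s + r_w, 3), "bits/s/Hz total")  # 4.755 bits/s/Hz total
```

The paper's optimization searches over user pairings and power splits like `split` across many channels; this snippet only evaluates the objective for one fixed choice.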


8.
Hayyolalam, Vahideh; Otoum, Safa; Özkasap, Öznur. Cluster Computing (2022) 25(3): 1695-1713

Edge intelligence has become popular recently, since it brings smartness to, and copes with some shortcomings of, conventional technologies such as cloud computing, the Internet of Things (IoT), and centralized AI adoption. However, although edge intelligence contributes to smart systems such as automated driving, smart cities, and connected healthcare, it is not free from limitations. Various challenges arise when integrating AI and edge computing, one of which is addressed in this paper: the adoption of AI methods on resource-constrained edge devices. In this regard, we introduce the concept of Edge devices as a Service (EdaaS) and propose a quality of service (QoS)- and quality of experience (QoE)-aware dynamic and reliable framework for AI subtask composition. The proposed framework is evaluated with three well-known meta-heuristics on various metrics in a connected healthcare application scenario. The experimental results confirm the applicability of the framework and reveal that black widow optimization (BWO) handles the problem more efficiently than particle swarm optimization (PSO) and simulated annealing (SA): BWO outperforms PSO in 95% of the experiments and SA in all of them.
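One of the three compared meta-heuristics, simulated annealing, can be sketched on a toy subtask-to-device assignment problem. The latency matrix, cooling schedule, and cost function are invented for illustration and are unrelated to the paper's experiments:

```python
# Sketch: simulated annealing minimizing a toy QoS cost (total latency) for
# assigning AI subtasks to edge devices. Worse moves are accepted with
# probability exp(-delta/temp), which decays as the temperature cools.
import math
import random

random.seed(0)
latency = [[3, 9], [7, 2], [4, 6]]  # latency[subtask][device], assumed

def cost(assign):
    return sum(latency[t][d] for t, d in enumerate(assign))

def anneal(n_tasks=3, n_devices=2, t0=10.0, cooling=0.95, steps=500):
    cur = [random.randrange(n_devices) for _ in range(n_tasks)]
    best, temp = list(cur), t0
    for _ in range(steps):
        cand = list(cur)
        cand[random.randrange(n_tasks)] = random.randrange(n_devices)
        delta = cost(cand) - cost(cur)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            cur = cand                      # accept (always if better)
        if cost(cur) < cost(best):
            best = list(cur)                # track best-so-far
        temp *= cooling
    return best, cost(best)

best, best_cost = anneal()
print(best, best_cost)  # global optimum of this toy instance: [0, 1, 0], cost 9
```

PSO and BWO would replace the single annealed solution with a population of candidate assignments; the cost function stays the same.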


9.
Biological and medical diagnoses depend on high-quality measurements. A wearable device based on the Internet of Things (IoT) must be unobtrusive to the human body to encourage users to accept continuous monitoring. However, unobtrusive IoT devices usually deliver low-quality, unreliable measurements, because progress in the underlying sensing technology has slowed. Therefore, advanced inference techniques must be developed to address the limitations of IoT devices. This review proposes that IoT technology in biological and medical applications should be based on a new data assimilation process that fuses data at multiple scales from several sources to provide diagnoses. The required technologies, such as hypothesis testing, multiple-evidence fusion, machine learning, data assimilation, and systems biology, are ready to support the desired levels of disease diagnosis. Furthermore, cross-disciplinary integration has emerged alongside advancements in IoT. For example, multiscale modeling in systems biology, from proteins and cells to organs, integrates current developments in biology, medicine, mathematics, engineering, artificial intelligence, and semiconductor technologies. Based on the monitoring objectives of IoT devices, researchers have gradually developed ambulant, wearable, noninvasive, unobtrusive, low-cost, and pervasive monitoring devices with data assimilation methods that can overcome the devices' limitations in measurement quality. In the future, the novel features of data assimilation in systems biology and ubiquitous sensory development can describe patients' physical conditions based on few but long-term measurements.

10.
Cluster Computing - Attribute-based encryption (ABE) has evolved as an efficient and secure method for storage of data with fine-grained access control in cloud platforms. In recent years,...

11.
Cloud storage is an important application service in cloud computing; it allows data users to store and access their files anytime, from anywhere, with any device. To ensure the security of outsourced data, the data user needs to check its integrity periodically, and in some cases the identity privacy of the data user must be protected. However, in existing identity-privacy-preserving protocols, data-tag generation is mainly based on complex ring signatures or group signatures, which places a heavy burden on the data user. To ensure the identity privacy of the data user, in this paper we propose a novel identity-privacy-preserving public auditing protocol built on a chameleon hash function. It achieves the following properties: (1) the identity privacy of the data user is preserved against the cloud server; (2) the validity of the outsourced data is verified; (3) data privacy is preserved against the auditor during the auditing process; and (4) the computation cost of producing a data tag is very low. Finally, we show that our scheme is provably secure in the random oracle model; its security rests on the computational Diffie–Hellman problem and the security of the hash function.
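The building block named here, a chameleon hash, can be illustrated with the classic discrete-log construction. The parameters below are toy-sized and completely insecure; they only show the algebra that lets a trapdoor holder find collisions:

```python
# Sketch: a discrete-log chameleon hash. CH(m, r) = g^m * y^r mod p, where
# y = g^x and x is the trapdoor. Whoever knows x can "re-target" a digest
# to any new message by computing a colliding randomizer r'.
p = 467            # toy safe prime: p = 2q + 1
q = 233            # prime order of the subgroup generated by g
g = 4              # element of order q modulo p
x = 57             # trapdoor (secret key)
y = pow(g, x, p)   # public key

def ch(m, r):
    return (pow(g, m, p) * pow(y, r, p)) % p   # equals g^(m + x*r) mod p

def collide(m, r, m_new):
    # Solve m + x*r = m_new + x*r' (mod q) for r', using the trapdoor x.
    return (r + (m - m_new) * pow(x, -1, q)) % q

m, r = 101, 13
m2 = 55
r2 = collide(m, r, m2)
print(ch(m, r) == ch(m2, r2))  # True: same digest, different message
```

In the auditing protocol this collision-finding ability is what lets tags be produced cheaply while hiding the user's identity; the toy above only demonstrates the hash itself.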

12.
Cloud computing, an on-demand computation model in which large data centers (clouds) are managed by cloud providers, offers storage and computation to cloud users based on service level agreements (SLAs). Services in cloud computing are offered at relatively low cost, so the model is a great target for many applications, such as startup businesses and e-commerce. The area of cloud computing has grown rapidly in the last few years, yet it still faces obstacles. For example, there is a lack of mechanisms that guarantee to cloud users that the quality they actually receive matches the quality of service specified in their SLAs. Another example is the concern over security, privacy, and trust, since users lose control over their data and programs once these are sent to cloud providers. In this paper, we introduce a new architecture that aids the design and implementation of attestation services. These services monitor cloud-based applications to ensure software qualities such as security, privacy, trust, and usability. Our approach is user-centric, giving users more control over their own data and applications, and it is cloud-based, utilizing the power of the clouds themselves. Simulation results show that many services can be designed on top of our architecture with limited performance overhead.

13.
The current rapid growth of the Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, for which time-critical analysis and clustering of knowledge granules present highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. We propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for business intelligence (BI) applications. Our work implements a neuro-fuzzy analytic architecture, rather than a standard fuzzified approach, to discover complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and to enable knowledge users to perform further BI actions.

14.
Shao, Bilin; Ji, Yanyan. Cluster Computing (2021) 24(3): 1989-2000

In recent years, designing efficient auditing protocols with which users can verify the integrity of data stored with a cloud service provider (CSP) has become a research focus. Homomorphic message authentication codes (MACs) and homomorphic signatures are two popular techniques for designing private and public auditing protocols, respectively. On the one hand, homomorphic-MAC-based auditing protocols are efficient but unsuitable for outsourcing to a third-party auditor (TPA), who has more professional knowledge and computational ability. On the other hand, homomorphic-signature-based protocols are well suited to employing a TPA without compromising the user's signing key, but have much lower efficiency. In this paper, we propose a new auditing protocol that combines the advantages of both approaches. In particular, it is almost as efficient as the homomorphic-MAC-based protocol recently proposed by Zhang et al. Moreover, it is suitable for outsourcing to a TPA because it does not compromise the privacy of the user's signing key, as our security analysis shows. Finally, numerical analysis and experimental results demonstrate the high efficiency of our protocol.
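The homomorphic-MAC idea referred to above can be sketched with a simple linearly homomorphic MAC over a prime field: tags of individual blocks combine linearly, so one aggregated tag authenticates a linear combination of blocks. The modulus, secret scalar, and PRF below are toy choices, not the construction from the paper or from Zhang et al.:

```python
# Sketch: a linearly homomorphic MAC. tag(i, m) = alpha*m + PRF(i) mod P.
# A challenger picks coefficients c_i; the prover returns the combined
# message mu = sum(c_i * m_i) and combined tag sigma = sum(c_i * tag_i),
# which the key holder can verify without seeing the individual blocks.
import hashlib

P = 2**61 - 1                  # prime modulus
ALPHA = 123456789              # secret scalar (part of the MAC key)
K = b"prf-key"                 # secret PRF key

def prf(i):
    digest = hashlib.sha256(K + i.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") % P

def tag(i, m):
    return (ALPHA * m + prf(i)) % P

def verify_combined(indices, coeffs, combined_msg, combined_tag):
    masks = sum(c * prf(i) for i, c in zip(indices, coeffs))
    return (ALPHA * combined_msg + masks) % P == combined_tag

blocks = {1: 42, 2: 7, 3: 19}
coeffs = [2, 3, 5]
mu = sum(c * blocks[i] for i, c in zip([1, 2, 3], coeffs)) % P
sigma = sum(c * tag(i, blocks[i]) for i, c in zip([1, 2, 3], coeffs)) % P
print(verify_combined([1, 2, 3], coeffs, mu, sigma))  # True
```

Verification here requires the secret key, which is exactly why such MAC-based protocols cannot be handed to an untrusted TPA; signature-based protocols avoid this at the cost of heavier cryptography.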


15.

In recent years, cloud computing has emerged as a technology for sharing resources with users. Because cloud computing is on-demand, efficient use of resources such as memory, processors, and bandwidth is a big challenge. Despite its advantages, cloud computing is sometimes not the proper choice because of delays in responding to requests, which has led to another technology, fog computing. Fog computing reduces traffic and latency by extending cloud services to the network edge, closer to users; it can schedule resources more efficiently and utilize them in ways that dramatically affect the user's experience. This paper surveys studies on scheduling in fog/cloud computing environments, focusing on work published in journals or conferences between 2015 and 2021. Using a systematic literature review (SLR), we selected 71 studies from four major scientific databases based on their relevance. We classified these studies into five categories based on the parameters they trace and their focus area: (1) performance, (2) energy efficiency, (3) resource utilization, (4) performance and energy efficiency, and (5) performance and resource utilization. Of the studies, 42.3% focus on performance, 9.9% on energy efficiency, 7.0% on resource utilization, 21.1% on both performance and energy efficiency, and 19.7% on both performance and resource utilization. Finally, we present challenges and open issues in resource scheduling methods for fog/cloud computing environments.


16.

Many consumers participate in the smart city via smart portable gadgets such as wearables, personal gadgets, mobile devices, and sensor systems. In the IoT edge computing systems of a smart city, a fundamental difficulty is selecting reliable participants. Since not all smart IoT gadgets are dedicated, certain gadgets might deliberately damage networks or services and degrade the customer experience. This research proposes a trust-based Internet of Things (TM-IoT) cloud computing method that solves the problem by choosing trustworthy partners to enhance the quality of service of the IoT edge network in smart architectures. A smart-device choice recommendation method based on changing networks was developed, and evolutionary game theory was applied to examine the reliability and durability of the proposed trust-management technique; the Lyapunov concept was applied to establish the stability of the trust-management system. A real scenario of personal health control and air-quality monitoring and assessment in a smart city setting confirmed the efficiency of the trust-management mechanism. Experiments demonstrated that the suggested methodology plays a major part in promoting multi-intelligent-gadget collaboration in the IoT edge computing system, with an efficiency of 97%; it resists harmful threats against service suppliers more consistently and is well suited to the smart world's massive IoT edge computing systems.


17.

Real-time, accurate traffic congestion prediction can enable intelligent traffic management systems (ITMSs) that replace traditional systems to improve traffic efficiency and reduce congestion. An ITMS consists of three main layers: Internet of Things (IoT), edge, and cloud. The edge collects real-time data from different routes through IoT devices such as wireless sensors, and it can compute and store this collected data before transmitting it to the cloud for further processing. Thus, the edge is an intermediate layer between the IoT and cloud layers that receives the transmitted data and overcomes cloud challenges such as high latency. In this paper, a novel real-time traffic congestion prediction strategy (TCPS) is proposed based on the data collected in the edge's cache server at the edge layer. The proposed TCPS contains three stages: (i) a real-time congestion prediction (RCP) stage, (ii) a congestion direction detection (CD2) stage, and (iii) a width change decision (WCD) stage. The RCP stage predicts traffic congestion based on the causes of congestion in the hotspot using a fuzzy inference system. If there is congestion, the CD2 stage detects the congestion direction based on the RCP predictions using the Optimal Weighted Naive Bayes (OWNB) method. The WCD stage aims to prevent congestion from occurring: it changes the width of changeable routes (CR) after the direction of congestion has been detected in CD2. The experimental results show that the proposed TCPS outperforms other recent methodologies, achieving the highest accuracy (95%), precision (74%), and recall (75%) and the lowest error (5%).
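As a toy illustration of the weighted naive Bayes idea behind the CD2 stage, a direction can be picked by a weighted log-likelihood vote over congestion features. All probabilities, weights, feature names, and classes below are invented for the example; the paper's OWNB method optimizes the weights rather than fixing them by hand:

```python
# Sketch: weighted naive Bayes classification of a congestion direction.
# Each feature's log-likelihood is scaled by a per-feature weight before
# being summed with the class prior.
import math

# Assumed conditional probabilities P(feature | direction).
likelihood = {
    "north": {"speed_low": 0.8, "density_high": 0.7},
    "south": {"speed_low": 0.3, "density_high": 0.4},
}
prior = {"north": 0.5, "south": 0.5}
weights = {"speed_low": 1.5, "density_high": 1.0}  # hand-picked here

def classify(observed):
    scores = {}
    for direction in prior:
        log_p = math.log(prior[direction])
        for feature in observed:
            log_p += weights[feature] * math.log(likelihood[direction][feature])
        scores[direction] = log_p
    return max(scores, key=scores.get)

print(classify(["speed_low", "density_high"]))  # north
```

Setting every weight to 1 recovers plain naive Bayes; the "optimal" in OWNB refers to tuning these exponents against training data.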


18.

Background

Computer networks tend to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices, ranging from mobile phones to household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems.

Purpose

It is practically impossible to implement and test all scenarios for large-scale, complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling, as part of the Cognitive Agent-based Computing (CABC) framework, to model a complex communication network problem.

Method

We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy.

Results

The conducted experiments demonstrated two important results. First, a CABC-based modeling approach such as Agent-based Modeling can be effective for modeling complex problems in the IoT domain. Second, the specific problem of managing a carbon footprint can be solved using a multi-agent system approach.

19.
In this study, we use an improved, more accurate model to analyze the energy footprint of content downloaded from a major online newspaper by means of various combinations of user devices and access networks. Our results indicate that previous analyses based on average figures for laptops or desktop personal computers predict national and global energy consumption values that are unrealistically high. Additionally, we identify the components that contribute most of the total energy consumption during the use stage of the life cycle of digital services. We find that, depending on the type of user device and access network employed, the data center where the news content originates consumes between 4% and 48% of the total energy consumption when news articles are read and between 2% and 11% when video content is viewed. Similarly, we find that user devices consume between 7% and 90% for articles and between 0.7% and 78% for video content, depending on the type of user device and access network employed. Though increasing awareness of the energy consumption of data centers is justified, an analysis of our results shows that for individual users of the online newspaper we studied, energy use by user devices and the third-generation (3G) mobile network is usually a bigger contributor to the service footprint than the data centers. Our results also show that data transfer of video content has a significant energy use on the 3G mobile network, but less so elsewhere. Hence, a strategy of reducing video resolution would reduce the energy footprint for individual users who access content on mobile devices over the 3G network.

20.
Han, KyungHyun; Lee, Wai-Kong; Hwang, Seong Oun. Cluster Computing (2022) 25(1): 433-450

Recently, the U.S. National Institute of Standards and Technology (NIST) initiated a global competition to standardize lightweight authenticated encryption with associated data (AEAD) and hash functions. Gimli is one of the Round 2 candidates; it is designed to be efficiently implemented across various platforms, including hardware (VLSI and FPGA), microprocessors, and microcontrollers. However, the performance of Gimli on massively parallel architectures like graphics processing units (GPUs) is still unknown. A high-performance Gimli implementation on a GPU can be especially useful for Internet of Things (IoT) applications, wherein gateway devices and cloud servers need to handle a massive number of communications protected by AEAD. In this paper, we show that with careful optimization, Gimli can be implemented efficiently on desktop and embedded GPUs to achieve extremely high throughput. Our experiments show that the proposed implementation achieves 661.44 KB/s (encryption), 892.24 KB/s (decryption), and 4344.46 KB/s (hashing) on state-of-the-art GPUs.
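For orientation, the state transformation at the core of Gimli can be sketched serially. This is a from-memory rendering of the public specification, so it should be checked against the official Gimli document before being relied on; the paper's contribution, the massively parallel GPU implementation and the AEAD/hash modes, is not shown here:

```python
# Sketch: the 384-bit Gimli permutation (12 x 32-bit words, 24 rounds).
# Each round applies a 96-bit SP-box to every column, with periodic small
# swaps, big swaps, and a round-constant addition.
MASK = 0xFFFFFFFF

def rotl(v, n):
    return ((v << n) | (v >> (32 - n))) & MASK

def gimli(state):
    s = list(state)                        # 12 words of 32 bits
    for r in range(24, 0, -1):
        for j in range(4):                 # SP-box on each column j
            x = rotl(s[j], 24)
            y = rotl(s[4 + j], 9)
            z = s[8 + j]
            s[8 + j] = (x ^ (z << 1) ^ ((y & z) << 2)) & MASK
            s[4 + j] = (y ^ x ^ ((x | z) << 1)) & MASK
            s[j] = (z ^ y ^ ((x & y) << 3)) & MASK
        if r % 4 == 0:                     # small swap + round constant
            s[0], s[1], s[2], s[3] = s[1], s[0], s[3], s[2]
            s[0] ^= 0x9E377900 ^ r
        elif r % 4 == 2:                   # big swap
            s[0], s[1], s[2], s[3] = s[2], s[3], s[0], s[1]
    return s

out = gimli(list(range(12)))
print(len(out), all(w <= MASK for w in out))
```

On a GPU, many independent states are permuted in parallel (one state per thread or per warp), which is where the throughput figures in the abstract come from.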

