Journal Articles
16,655 articles found
1. Empowering Edge Computing: Public Edge as a Service for Performance and Cost Optimization
Authors: Ateeqa Jalal, Umar Farooq, Ihsan Rabbi, Afzal Badshah, Aurangzeb Khan, Muhammad Mansoor Alam, Mazliham Mohd Su’ud. 《Computers, Materials & Continua》, 2026, Issue 2, pp. 1784-1802 (19 pages).
The exponential growth of Internet of Things (IoT) devices, autonomous systems, and digital services is generating massive volumes of big data, projected to exceed 291 zettabytes by 2027. Conventional cloud computing, despite its high processing and storage capacity, suffers from increased network latency, network congestion, and high operational costs, making it unsuitable for latency-sensitive applications. Edge computing addresses these issues by processing data near the source but faces scalability challenges and elevated Total Cost of Ownership (TCO). Hybrid solutions, such as fog computing, cloudlets, and Mobile Edge Computing (MEC), attempt to balance cost and performance; however, they still struggle with limited resource sharing and high deployment expenses. This paper proposes Public Edge as a Service (PEaaS), a novel paradigm that utilizes idle resources contributed by universities, enterprises, cellular operators, and individuals under a collaborative service model. By decentralizing computation and enabling multi-tenant resource sharing, PEaaS reduces reliance on centralized cloud infrastructure, minimizes communication costs, and enhances scalability. The proposed framework is evaluated using EdgeCloudSim under varying workloads, for key metrics such as latency, communication cost, server utilization, and task failure rate. Results reveal that while the cloud's task failure rate rises sharply to 12.3% at 2000 devices, PEaaS maintains a low rate of 2.5%, closely matching edge computing. Furthermore, communication costs remain 25% lower than cloud, and latency remains below 0.3 even under peak load. These findings demonstrate that PEaaS achieves near-edge performance with reduced costs and enhanced scalability, offering a sustainable and economically viable solution for next-generation computing environments.
Keywords: big data; edge as a service; edge computing
2. Data Elements and Trustworthy Circulation: A Clearing and Settlement Architecture for Element Market Transactions Integrating Privacy Computing and Smart Contracts
Authors: Huanjing Huang. 《Journal of Electronic Research and Application》, 2025, Issue 5, pp. 86-92 (7 pages).
This article explores the characteristics of data resources from the perspective of production factors, analyzes the demand for trustworthy circulation technology, and designs a fusion architecture and related solutions, including multi-party data intersection calculation, distributed machine learning, etc. It also compares performance differences, conducts formal verification, points out the value and limitations of the architectural innovation, and looks forward to future opportunities.
Keywords: data elements; privacy computing; smart contracts
3. Hadoop-based secure storage solution for big data in cloud computing environment [Cited by 2]
Authors: Shaopeng Guan, Conghui Zhang, Yilin Wang, Wenqing Liu. 《Digital Communications and Networks》 SCIE CSCD, 2024, Issue 1, pp. 227-236 (10 pages).
In order to address the problems of a single encryption algorithm, such as low encryption efficiency and unreliable metadata for static data storage on big data platforms in the cloud computing environment, we propose a Hadoop-based big data secure storage scheme. Firstly, in order to disperse the NameNode service from a single server to multiple servers, we combine the HDFS federation and HDFS high-availability mechanisms, and use the ZooKeeper distributed coordination mechanism to coordinate each node to achieve dual-channel storage. Then, we improve the ECC encryption algorithm for the encryption of ordinary data, and adopt a homomorphic encryption algorithm to encrypt data that needs to be calculated. To accelerate the encryption, we adopt a dual-thread encryption mode. Finally, the HDFS control module is designed to combine the encryption algorithm with the storage model. Experimental results show that the proposed solution solves the problem of a single point of failure for metadata, performs well in terms of metadata reliability, and can realize server fault tolerance. The improved encryption algorithm integrates the dual-channel storage mode, and encryption storage efficiency improves by 27.6% on average.
Keywords: big data security; data encryption; Hadoop; parallel encrypted storage; ZooKeeper
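The abstract's split between an improved ECC cipher for ordinary data and a homomorphic cipher for "data that needs to be calculated" rests on a property worth seeing concretely: in an additively homomorphic scheme, multiplying ciphertexts adds the underlying plaintexts. The sketch below is a toy Paillier construction with deliberately tiny primes, purely illustrative and unrelated to the paper's actual implementation (parameters like these offer no real security):

```python
from math import gcd

# Toy Paillier parameters (tiny primes for illustration only -- never secure).
p, q = 5, 7
n, n2 = p * q, (p * q) ** 2
lam = 12                      # lcm(p-1, q-1)
g = n + 1                     # standard generator choice
mu = pow(lam, -1, n)          # inverse of L(g^lam mod n^2), which equals lam here

def L(x):
    return (x - 1) // n

def encrypt(m, r):
    """Encrypt plaintext m with randomizer r coprime to n."""
    assert 0 <= m < n and gcd(r, n) == 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(3, r=2), encrypt(4, r=11)
# Multiplying ciphertexts adds plaintexts -- the reason such schemes suit
# data that must be computed on while still encrypted.
print(decrypt((c1 * c2) % n2))  # → 7
```

The homomorphic product never exposes the operands to whoever performs the multiplication, which is the point of reserving this (slower) cipher for data that servers must compute on.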
4. Big Data Storage Architecture Design in Cloud Computing
Authors: Xuebin Chen, Shi Wang, Yanyan Dong, Xu Wang. 《国际计算机前沿大会会议论文集》, 2015, Issue B12, pp. 3-4 (2 pages).
To solve the lag problem of traditional storage technology in mass data storage and management, an application platform for big data is designed and built on a Hadoop and data warehouse integration platform, which ensures convenient management and usage of data. In order to break through master-node system bottlenecks, a storage system with better performance is designed through the introduction of cloud computing technology; it adopts a master-slave distribution pattern in which network access follows the nearest-node principle, thus reducing the burden of single access on the master node. A file block update strategy and a fault recovery mechanism are also provided to solve the management bottlenecks of traditional storage systems in data update and fault recovery, and to offer feasible technical solutions for big data storage management.
Keywords: big data; cloud computing; Hadoop; data warehouse; storage architecture
5. A Method for Trust Management in Cloud Computing: Data Coloring by Cloud Watermarking [Cited by 7]
Authors: Yu-Chao Liu, Yu-Tao Ma, Hai-Su Zhang, De-Yi Li, Gui-Sheng Chen. 《International Journal of Automation and Computing》 EI, 2011, Issue 3, pp. 280-285 (6 pages).
With the development of Internet technology and human computing, the computing environment has changed dramatically over the last three decades. Cloud computing emerges as a paradigm of Internet computing in which dynamic, scalable, and often virtualized resources are provided as services. With virtualization technology, cloud computing offers diverse services (such as virtual computing, virtual storage, virtual bandwidth, etc.) to the public by means of a multi-tenancy mode. Although users are enjoying the capabilities of super-computing and mass storage supplied by cloud computing, cloud security still remains a hot-spot problem, which is in essence the trust management between data owners and storage service providers. In this paper, we propose a data coloring method based on cloud watermarking to recognize and ensure mutual reputations. The experimental results show that the robustness of the reverse cloud generator can guarantee users' embedded social reputation identifications. Hence, our work provides a reference solution to the critical problem of cloud security.
Keywords: cloud computing; cloud security; trust management; cloud watermarking; data coloring
6. Fog-IBDIS: Industrial Big Data Integration and Sharing with Fog Computing for Manufacturing Systems [Cited by 4]
Authors: Junliang Wang, Peng Zheng, Youlong Lv, Jingsong Bao, Jie Zhang. 《Engineering》 SCIE EI, 2019, Issue 4, pp. 662-670 (9 pages).
Industrial big data integration and sharing (IBDIS) is of great significance in managing and providing data for big data analysis in manufacturing systems. A novel fog-computing-based IBDIS approach called Fog-IBDIS is proposed in order to integrate and share industrial big data with high raw-data security and low network traffic loads by moving the integration task from the cloud to the edge of networks. First, a task flow graph (TFG) is designed to model the data analysis process. The TFG is composed of several tasks, which are executed by the data owners through the Fog-IBDIS platform in order to protect raw data privacy. Second, the function of Fog-IBDIS to enable data integration and sharing is presented in five modules: TFG management, compilation and running control, the data integration model, the basic algorithm library, and the management component. Finally, a case study is presented to illustrate the implementation of Fog-IBDIS, which ensures raw data security by deploying the analysis tasks executed by the data generators, and eases the network traffic load by greatly reducing the volume of transmitted data.
Keywords: fog computing; industrial big data; integration; manufacturing system
7. On the Privacy-Preserving Outsourcing Scheme of Reversible Data Hiding over Encrypted Image Data in Cloud Computing [Cited by 11]
Authors: Lizhi Xiong, Yunqing Shi. 《Computers, Materials & Continua》 SCIE EI, 2018, Issue 6, pp. 523-539 (17 pages).
Advanced cloud computing technology provides cost savings and flexibility of services for users. With the explosion of multimedia data, more and more data owners outsource their personal multimedia data to the cloud. In the meantime, some computationally expensive tasks are also undertaken by cloud servers. However, the outsourced multimedia data and its applications may reveal the data owner's private information, because the data owners lose control of their data. Recently, this has aroused new research interest in privacy-preserving reversible data hiding over outsourced multimedia data. In this paper, two reversible data hiding schemes are proposed for encrypted image data in cloud computing: reversible data hiding by homomorphic encryption and reversible data hiding in the encrypted domain. In the former, additional bits are extracted after decryption; in the latter, they are extracted before decryption. Meanwhile, a combined scheme is also designed. This paper proposes a privacy-preserving outsourcing scheme of reversible data hiding over encrypted image data in cloud computing, which not only ensures multimedia data security without relying on the trustworthiness of cloud servers, but also guarantees that reversible data hiding can be operated over encrypted images at different stages. Theoretical analysis confirms the correctness of the proposed encryption model and justifies the security of the proposed scheme. The computation cost of the proposed scheme is acceptable and adjusts to different security levels.
Keywords: cloud data security; re-encryption; reversible data hiding; cloud computing; privacy preserving
8. Parallel Computing of a Variational Data Assimilation Model for GPS/MET Observation Using the Ray-Tracing Method [Cited by 5]
Authors: 张昕, 刘月巍, 王斌, 季仲贞. 《Advances in Atmospheric Sciences》 SCIE CAS CSCD, 2004, Issue 2, pp. 220-226 (7 pages).
The Spectral Statistical Interpolation (SSI) analysis system of NCEP is used to assimilate meteorological data from Global Positioning Satellite System (GPS/MET) refraction angles with the variational technique. Verified against radiosonde data, including GPS/MET observations in the analysis makes an overall improvement to the analysis variables of temperature, winds, and water vapor. However, the variational model with the ray-tracing method is quite expensive for numerical weather prediction and climate research. For example, about 4000 GPS/MET refraction angles need to be assimilated to produce an ideal global analysis, and just one iteration of minimization takes more than 24 hours of CPU time on NCEP's Cray C90 computer. Although efforts have been made to reduce the computational cost, it is still prohibitive for operational data assimilation. In this paper, a parallel version of the three-dimensional variational data assimilation model for GPS/MET occultation measurements, suitable for massively parallel processor architectures, is developed. The divide-and-conquer strategy is used to achieve parallelism and is implemented by message passing. The authors present the principles of the code's design and examine its performance on state-of-the-art parallel computers in China. The results show that this parallel model scales favorably as the number of processors is increased. With the Memory-IO technique implemented by the authors, the wall-clock time per iteration used for assimilating 1420 refraction angles is reduced from 45 s to 12 s using 1420 processors. This suggests that the new parallelized code has the potential to be useful in numerical weather prediction (NWP) and climate studies.
Keywords: parallel computing; variational data assimilation; GPS/MET
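The divide-and-conquer, message-passing scheme this abstract describes can be caricatured in a few lines. The sketch below is an assumption-laden stand-in: a made-up quadratic misfit plays the role of the variational cost function, and a thread pool stands in for the MPI-style worker processes; none of this reproduces the authors' code.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_cost(chunk):
    """Each worker evaluates its share of the observation-misfit sum
    (here an invented quadratic misfit between observed and simulated values)."""
    return sum((obs - sim) ** 2 for obs, sim in chunk)

def total_cost(pairs, workers=4):
    # Divide: deal the observation pairs out into one chunk per worker.
    chunks = [pairs[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Conquer: gather the partial sums from the workers and reduce them.
        return sum(pool.map(partial_cost, chunks))

print(total_cost([(1.0, 0.5), (2.0, 1.5), (3.0, 2.5)]))  # → 0.75
```

In the real model each "chunk" would be a subset of the ~4000 refraction angles, and the reduction would happen over message passing rather than shared memory, but the partition-evaluate-reduce shape is the same.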
9. Computing over Space: Status, Challenges, and Opportunities [Cited by 1]
Authors: Yaoqi Liu, Yinhe Han, Hongxin Li, Shuhao Gu, Jibing Qiu, Ting Li. 《Engineering》, 2025, Issue 11, pp. 20-25 (6 pages).
1. Introduction. The rapid expansion of satellite constellations in recent years has resulted in the generation of massive amounts of data. This surge in data, coupled with diverse application scenarios, underscores the escalating demand for high-performance computing over space. Computing over space entails the deployment of computational resources on platforms such as satellites to process large-scale data under constraints such as high radiation exposure, restricted power consumption, and minimized weight.
Keywords: satellite constellations; deployment; computational resources; data processing; space computing; radiation exposure; space; high-performance computing; power consumption
10. Fog Computing Architecture-Based Data Acquisition for WSN Applications [Cited by 2]
Authors: Guangwei Zhang, Ruifan Li. 《China Communications》 SCIE CSCD, 2017, Issue 11, pp. 69-81 (13 pages).
Efficient and effective data acquisition is of theoretical and practical importance in WSN applications because data measured and collected by a WSN is often unreliable, being accompanied by noise and error, missing values, or inconsistent data. Motivated by fog computing, which focuses on how to effectively offload computation-intensive tasks from resource-constrained devices, this paper proposes a simple yet effective data acquisition approach with the ability to filter abnormal data while meeting real-time requirements. Our method uses a cooperation mechanism leveraging both an architectural and an algorithmic approach. Firstly, the sensor node, with its limited computing resources, only detects and marks suspicious data using a lightweight algorithm. Secondly, the cluster head evaluates suspicious data by referring to the data from the other sensor nodes in the same cluster and discards abnormal data directly. Thirdly, the sink node fills in the discarded data with an approximate value using a nearest-neighbor data supplement method. Through this architecture, each node consumes only a few computational resources, and the heavy computing load is distributed across several nodes. Simulation results show that our data acquisition method is effective with respect to real-time outlier filtering and computing overhead.
Keywords: WSN; fog computing; abnormal data; data filtering; intrusion tolerance
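The three-stage cooperation described above (sensor node marks suspicious readings, cluster head cross-checks them against peers, sink imputes discarded values) can be sketched as follows. The range threshold, median tolerance, and nearest-neighbor rule here are invented for illustration; the paper's actual algorithms are not reproduced.

```python
from statistics import median

def mark_suspicious(reading, lo=0.0, hi=50.0):
    """Sensor node: flag readings outside a plausible range (lightweight check)."""
    return not (lo <= reading <= hi)

def cluster_accepts(reading, peer_readings, tol=5.0):
    """Cluster head: keep a suspicious reading only if it agrees with the
    cluster median within a tolerance; otherwise it is discarded."""
    return abs(reading - median(peer_readings)) <= tol

def sink_impute(accepted_history):
    """Sink node: replace a discarded value with its nearest accepted neighbor."""
    return accepted_history[-1]

def acquire(stream, peers):
    accepted = []
    for r in stream:
        if mark_suspicious(r) and not cluster_accepts(r, peers):
            # Fall back to the cluster median when no history exists yet.
            r = sink_impute(accepted) if accepted else median(peers)
        accepted.append(r)
    return accepted

print(acquire([21.0, 22.5, 99.0, 23.0], peers=[21.5, 22.0, 23.5]))
# → [21.0, 22.5, 22.5, 23.0]
```

The division of labor mirrors the abstract: only the cheap range check runs on the sensor, the peer comparison runs on the better-provisioned cluster head, and imputation is deferred to the sink.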
11. Fast wireless sensor for anomaly detection based on data stream in an edge-computing-enabled smart greenhouse [Cited by 5]
Authors: Yihong Yang, Sheng Ding, Yuwen Liu, Shunmei Meng, Xiaoxiao Chi, Rui Ma, Chao Yan. 《Digital Communications and Networks》 SCIE CSCD, 2022, Issue 4, pp. 498-507 (10 pages).
Edge-computing-enabled smart greenhouses are a representative application of Internet of Things (IoT) technology, which can monitor environmental information in real time and employ the information to contribute to intelligent decision-making. In this process, anomaly detection for wireless sensor data plays an important role. However, traditional anomaly detection algorithms, originally designed for anomaly detection in static data, do not properly consider the inherent characteristics of the data stream produced by wireless sensors, such as infiniteness, correlations, and concept drift, which may pose a considerable challenge to anomaly detection based on data streams and lead to low detection accuracy and efficiency. First, the data stream is usually generated quickly, which means that the data stream is infinite and enormous; hence, any traditional off-line anomaly detection algorithm that attempts to store the whole dataset or to scan the dataset multiple times will run out of memory space. Second, there exist correlations among different data streams, and traditional algorithms hardly consider these correlations. Third, the underlying data generation process or distribution may change over time; thus, traditional anomaly detection algorithms with no model update will lose their effect. Considering these issues, a novel method (called DLSHiForest) based on Locality-Sensitive Hashing and the time-window technique is proposed to solve these problems while achieving accurate and efficient detection. Comprehensive experiments are executed using a real-world agricultural greenhouse dataset to demonstrate the feasibility of our approach. Experimental results show that our proposal is practical for addressing the challenges of traditional anomaly detection while ensuring accuracy and efficiency.
Keywords: anomaly detection; data stream; DLSHiForest; smart greenhouse; edge computing
12. A LoRa-based protocol for connecting IoT edge computing nodes to provide small-data-based services [Cited by 3]
Authors: Kiyoshy Nakamura, Pietro Manzoni, Alessandro Redondi, Edoardo Longo, Marco Zennaro, Juan-Carlos Cano, Carlos T. Calafate. 《Digital Communications and Networks》 SCIE CSCD, 2022, Issue 3, pp. 257-266 (10 pages).
Data is becoming increasingly personal. Individuals regularly interact with a variety of structured data, ranging from SQLite databases on the phone to personal sensors and open government data. The "digital traces left by individuals through these interactions" are sometimes referred to as "small data". Examples of small data include driving records, biometric measurements, search histories, weather forecasts, and usage alerts. In this paper, we present a flexible protocol called LoRaCTP, based on LoRa technology, which allows data "chunks" to be transferred over large distances with very low energy expenditure. LoRaCTP provides all the mechanisms necessary to make LoRa transfer reliable by introducing a lightweight connection setup and allowing a data message to be as long as necessary. We designed this protocol as communication support for small-data edge-based IoT solutions, given its stability, low power usage, and ability to cover long distances. We evaluated our protocol using various data content sizes and communication distances to demonstrate its performance and reliability.
Keywords: small data; edge computing; LoRa; IoT
13. A Data Security Framework for Cloud Computing Services [Cited by 3]
Authors: Luis-Eduardo Bautista-Villalpando, Alain Abran. 《Computer Systems Science & Engineering》 SCIE EI, 2021, Issue 5, pp. 203-218 (16 pages).
Cyberattacks are difficult to prevent because the targeted companies and organizations often rely on new and fundamentally insecure cloud-based technologies, such as the Internet of Things. With increasing industry adoption and migration of traditional computing services to the cloud, one of the main challenges in cybersecurity is to provide mechanisms to secure these technologies. This work proposes a Data Security Framework for cloud computing services (CCS) that evaluates and improves CCS data security from a software engineering perspective, by evaluating the levels of security within the cloud computing paradigm using engineering methods and techniques applied to CCS. The framework is developed by means of a methodology based on a heuristic theory that incorporates knowledge generated by existing works as well as the experience of their implementation. The paper presents the design details of the framework, which consists of three stages: identification of data security requirements, management of data security risks, and evaluation of data security performance in CCS.
Keywords: cloud computing services; computer security; data security; data security requirements; data risk; data security measurement
14. Data Security Issues and Challenges in Cloud Computing: A Conceptual Analysis and Review [Cited by 3]
Authors: Osama Harfoushi, Bader Alfawwaz, Nazeeh A. Ghatasheh, Ruba Obiedat, Mua’ad M. Abu-Faraj, Hossam Faris. 《Communications and Network》, 2014, Issue 1, pp. 15-21 (7 pages).
Cloud computing is a set of information technology services offered to users over the web on a rental basis. Such services enable organizations to scale up or scale down their in-house foundations. Generally, cloud services are provided by a third-party supplier who owns the arrangement. Cloud computing has many advantages, such as flexibility, efficiency, scalability, integration, and capital reduction. Moreover, it provides an advanced virtual space for organizations to deploy their applications or run their operations. Notwithstanding the possible benefits of cloud computing services, organizations are reluctant to invest in cloud computing, mainly due to security concerns. Security is one of the main challenges that hinder the growth of cloud computing. At the same time, service providers strive to reduce the risks over the clouds and increase their reliability in order to build mutual trust between themselves and cloud customers. Various security issues and challenges are discussed in this research, and possible opportunities are stated.
Keywords: cloud computing; data security; infrastructure; scalability; review
15. Towards Comprehensive Provable Data Possession in Cloud Computing [Cited by 1]
Authors: LI Chaoling, CHEN Yue, TAN Pengxu, YANG Gang. 《Wuhan University Journal of Natural Sciences》 CAS, 2013, Issue 3, pp. 265-271 (7 pages).
To check remote data integrity in cloud computing, we have proposed an efficient and fully dynamic provable data possession (PDP) scheme that uses an SN (serial number)-BN (block number) table to support data block updates. In this article, we first analyze and test its performance in detail. The results show that our scheme is efficient, with low computation, storage, and communication costs. Then, we discuss how to extend the dynamic scheme to support other features, including public auditability, privacy preservation, fairness, and multiple-replica checking. After being extended, a comprehensive PDP scheme that has high efficiency and satisfies all the main requirements is provided.
Keywords: cloud computing; provable data possession; data dynamics; SN-BN table
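As a rough picture of how a serial-number-to-block-number table can support block updates without renumbering the file, consider the toy structure below. The layout is hypothetical, not the authors' exact table (a real PDP table also ties each entry to verification tags): the serial number is the block's logical position, and an update simply repoints that position at a freshly stored physical block.

```python
class SNBNTable:
    """Toy serial-number -> block-number map for dynamic PDP (illustrative only)."""

    def __init__(self):
        self.rows = []       # index = serial number (SN), value = block number (BN)
        self.next_bn = 0     # next unused physical block number

    def append_block(self):
        self.rows.append(self.next_bn)
        self.next_bn += 1

    def update_block(self, sn):
        """Modifying block sn stores a fresh physical block; the SN keeps its place,
        so no other entry needs to change."""
        self.rows[sn] = self.next_bn
        self.next_bn += 1

    def insert_block(self, sn):
        """Insertion shifts later SNs implicitly via list position, not by rewriting BNs."""
        self.rows.insert(sn, self.next_bn)
        self.next_bn += 1

t = SNBNTable()
for _ in range(3):
    t.append_block()     # physical blocks 0, 1, 2
t.update_block(1)        # SN 1 now points at new physical block 3
print(t.rows)            # → [0, 3, 2]
```

The key property this illustrates is that a single block update touches one table entry, which is what makes such schemes "fully dynamic" at low cost.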
16. Managing Computing Infrastructure for IoT Data [Cited by 1]
Authors: Sapna Tyagi, Ashraf Darwish, Mohammad Yahiya Khan. 《Advances in Internet of Things》, 2014, Issue 3, pp. 29-35 (7 pages).
Digital data have become a torrent engulfing every area of business, science, and engineering, gushing into every economy, every organization, and every user of digital technology. In the age of big data, deriving value and insight from big data using rich analytics becomes important for achieving competitiveness, success, and leadership in every field. The Internet of Things (IoT) is causing the number and types of products that emit data to grow at an unprecedented rate. Heterogeneity, scale, timeliness, complexity, and privacy problems with large data impede progress at all phases of the pipeline that can create value from data. With the push of such massive data, we are entering a new era of computing driven by novel and groundbreaking research innovation in elastic parallelism, partitioning, and scalability. Designing a scalable system for analyzing, processing, and mining huge real-world datasets has become one of the challenging problems facing both systems researchers and data management researchers. In this paper, we give an overview of computing infrastructure for IoT data processing, focusing on architecture and the major challenges of massive data. We briefly discuss emerging computing infrastructure and technologies that are promising for improving massive data management.
Keywords: big data; cloud computing; data analytics; elastic scalability; heterogeneous computing; GPU; PCM; massive data processing
17. Building a Trust Model for Secure Data Sharing (TM-SDS) in Edge Computing Using HMAC Techniques [Cited by 1]
Authors: K. Karthikeyan, P. Madhavan. 《Computers, Materials & Continua》 SCIE EI, 2022, Issue 6, pp. 4183-4197 (15 pages).
With the rapid growth of Internet of Things (IoT)-based models, the growing amount of data makes cloud computing resources insufficient. Hence, edge computing-based techniques are becoming more popular in present research domains, making data storage and processing effective at the network edges. Several advanced features, such as parallel processing and data perception, are available in edge computing. Still, there are challenges in providing privacy and data security over networks. To solve the security issues in edge computing, the Hash-based Message Authentication Code (HMAC) algorithm is used to provide solutions for preserving data from various attacks that arise from the distributed network nature. This paper proposes a Trust Model for Secure Data Sharing (TM-SDS) with the HMAC algorithm. Here, data security is ensured with local and global trust levels, alongside the centralized processing of the cloud, while conserving resources effectively. Further, the proposed model achieved a packet delivery ratio of 84.25%, which is better than existing models. The data packets are securely transmitted between entities in the proposed model, and results showed that the proposed TM-SDS model outperforms the existing models in an efficient manner.
Keywords: secure data sharing; edge computing; global trust levels; parallel processing
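For readers unfamiliar with the HMAC primitive that TM-SDS builds on, here is a minimal sketch using Python's standard library. It shows only the generic tag-and-verify mechanism, not the authors' trust-level protocol; the key and message are invented for the example.

```python
import hmac
import hashlib

def tag(key: bytes, message: bytes) -> str:
    """Compute an HMAC-SHA256 authentication tag for a message."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, received_tag: str) -> bool:
    """Recompute the tag and compare in constant time to resist timing attacks."""
    return hmac.compare_digest(tag(key, message), received_tag)

key = b"shared-secret"                 # hypothetical pre-shared key
msg = b"sensor-42:temp=21.5"           # hypothetical payload

t = tag(key, msg)
print(verify(key, msg, t))                        # True: message intact
print(verify(key, b"sensor-42:temp=99.9", t))     # False: tampered payload rejected
```

Because forging a valid tag requires the shared key, a verifier that checks tags on every exchanged packet gains exactly the tamper-evidence property that trust models of this kind rely on.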
18. Cloud Computing and Big Data: A Review of Current Service Models and Hardware Perspectives [Cited by 1]
Authors: Richard Branch, Heather Tjeerdsma, Cody Wilson, Richard Hurley, Sabine McConnell. 《Journal of Software Engineering and Applications》, 2014, Issue 8, pp. 686-693 (8 pages).
Big data applications are pervading more and more aspects of our life, encompassing commercial and scientific uses at increasing rates as we move towards exascale analytics. Examples of big data applications include storing and accessing user data in commercial clouds, mining of social data, and analysis of large-scale simulations and experiments such as the Large Hadron Collider. An increasing number of such data-intensive applications and services are relying on clouds in order to process and manage the enormous amounts of data required for continuous operation. It can be difficult to decide which of the many options for cloud processing is suitable for a given application; the aim of this paper is therefore to provide an interested user with an overview of the most important concepts of cloud computing as they relate to the processing of big data.
Keywords: big data; cloud computing; cloud storage; Software as a Service; NoSQL; architectures
19. Blockchain-Based Cognitive Computing Model for Data Security on a Cloud Platform [Cited by 1]
Authors: Xiangmin Guo, Guangjun Liang, Jiayin Liu, Xianyi Chen. 《Computers, Materials & Continua》 SCIE EI, 2023, Issue 12, pp. 3305-3323 (19 pages).
Cloud storage is widely used by large companies to store vast amounts of data and files, offering flexibility, financial savings, and security. However, information shoplifting poses significant threats, potentially leading to poor performance and privacy breaches. Blockchain-based cognitive computing can help protect and maintain information security and privacy on cloud platforms, ensuring businesses can focus on business development. To ensure data security on cloud platforms, this research proposes a blockchain-based Hybridized Data-Driven Cognitive Computing (HD2C) model. The proposed HD2C framework addresses breaches of the private information of mixed participants of the Internet of Things (IoT) in the cloud. HD2C is developed by combining Federated Learning (FL) with a blockchain consensus algorithm to connect smart contracts with Proof of Authority. The "data island" problem can be solved by FL's emphasis on privacy and lightning-fast processing, while blockchain provides a decentralized incentive structure that is impervious to poisoning. FL with blockchain allows quick consensus through smart member selection and verification. The HD2C paradigm significantly improves the computational processing efficiency of intelligent manufacturing. Extensive analysis results derived from IIoT datasets confirm the superiority of HD2C. When compared to other consensus algorithms, the foundational cost of the blockchain PoA is significant. The accuracy and memory utilization evaluation results predict the total benefits of the system. In comparison to the values 0.004 and 0.04, the value of 0.4 achieves good accuracy. According to the experimental results, the number of transactions per second has minimal impact on memory requirements. The findings of this study resulted in the development of a brand-new IIoT framework based on blockchain technology.
Keywords: blockchain; Internet of Things (IoT); blockchain-based cognitive computing; Hybridized Data-Driven Cognitive Computing (HD2C); Federated Learning (FL); Proof of Authority (PoA)
20. Improving Performance of Cloud Computing and Big Data Technologies and Applications [Cited by 1]
Authors: Zhenjiang Dong. 《ZTE Communications》, 2014, Issue 4, pp. 1-2 (2 pages).
Cloud computing technology is changing the development and usage patterns of IT infrastructure and applications. Virtualized and distributed systems, as well as unified management and scheduling, have greatly improved computing and storage. Management has become easier, and OAM costs have been significantly reduced. Cloud desktop technology is developing rapidly. With this technology, users can flexibly and dynamically use virtual machine resources, companies' efficiency in using and allocating resources is greatly improved, and information security is ensured. In most existing virtual cloud desktop solutions, computing and storage are bound together, and data is stored as image files. This limits the flexibility and expandability of systems and is insufficient for meeting customers' requirements in different scenarios.
Keywords: HBase