Funding: Supported by the Natural Science Foundation of China (Nos. 62303126, 62362008, author Z.Z, https://www.nsfc.gov.cn/, accessed on 20 December 2024); the Major Scientific and Technological Special Project of Guizhou Province ([2024]014); the Guizhou Provincial Science and Technology Projects (No. ZK[2022]General149, author Z.Z, https://kjt.guizhou.gov.cn/, accessed on 20 December 2024); the Open Project of the Key Laboratory of Computing Power Network and Information Security, Ministry of Education, under Grant 2023ZD037 (author Z.Z, https://www.gzu.edu.cn/, accessed on 20 December 2024); and the Open Research Project of the State Key Laboratory of Industrial Control Technology, Zhejiang University, China (No. ICT2024B25, author Z.Z, https://www.gzu.edu.cn/, accessed on 20 December 2024).
Abstract: With the development of cloud computing and machine learning, users can upload their data to the cloud for machine learning model training. However, dishonest clouds may infer user data, resulting in data leakage. Previous schemes achieve secure outsourced computing, but they suffer from low computational accuracy, difficulty handling heterogeneous data from multiple sources, and high computational cost, which lead to a poor user experience and expensive cloud computing. To address these problems, we propose a multi-precision, multi-sourced, and multi-key outsourcing neural network training scheme. First, we design a multi-precision functional encryption computation based on Euclidean division. Second, we design an outsourced model training algorithm based on this multi-precision functional encryption with multi-sourced heterogeneity. Finally, we conduct experiments on three datasets. The results indicate that our framework improves accuracy by 6% to 30% and reduces memory usage by a factor of 1.0×2^(24) compared to the previous best approach.
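The abstract does not spell out the paper's functional-encryption construction, but the "multi-precision via Euclidean division" idea can be illustrated in isolation: a fixed-point encoding of a real value is split by Euclidean division into a coarse quotient and a fine remainder, so contributions at different precisions can be handled as integers and later recombined. The sketch below is an assumption-laden illustration of that encoding step only; all names (encode_multiprecision, coarse_bits, fine_bits) are hypothetical.

```python
# Illustrative sketch only: not the paper's scheme. It shows how Euclidean division can
# split a fixed-point value into a coarse and a fine component that round-trip exactly.
def encode_multiprecision(x: float, coarse_bits: int = 8, fine_bits: int = 24):
    """Split a fixed-point encoding of x into a coarse quotient and fine remainder."""
    scaled = round(x * (1 << (coarse_bits + fine_bits)))   # fixed-point integer
    quotient, remainder = divmod(scaled, 1 << fine_bits)   # Euclidean division
    return quotient, remainder

def decode_multiprecision(quotient: int, remainder: int,
                          coarse_bits: int = 8, fine_bits: int = 24) -> float:
    """Recombine the two components into the original fixed-point value."""
    scaled = quotient * (1 << fine_bits) + remainder
    return scaled / (1 << (coarse_bits + fine_bits))

if __name__ == "__main__":
    q, r = encode_multiprecision(0.123456789)
    print(q, r, decode_multiprecision(q, r))   # round-trips to ~0.123456789
```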
Funding: Supported by the National Key Research and Development Program of China (grant number 2019YFE0123600).
Abstract: The power Internet of Things (IoT) is a significant technology trend and a requirement for national strategic development. With the deepening digital transformation of the power grid, China's power system has initially built a power IoT architecture comprising a perception layer, a network layer, and a platform application layer. However, owing to the structural complexity of the power system, the construction of the power IoT continues to face problems such as complex access management of massive heterogeneous equipment, diverse IoT protocol access methods, highly concurrent network communications, and weak data security protection. To address these issues, this study optimizes the existing architecture of the power IoT and designs an integrated management framework for the access of multi-source heterogeneous data in the power IoT, comprising cloud, pipe, edge, and terminal parts. It further reviews and analyzes the key technologies involved in the power IoT, such as unified management of the physical model, highly concurrent access, multi-protocol access, multi-source heterogeneous data storage management, and data security control, to provide a more flexible, efficient, secure, and easy-to-use solution for multi-source heterogeneous data access in the power IoT.
Funding: Supported by the National Natural Science Foundation of China under Grants 62471493 and 62402257; partially supported by the Natural Science Foundation of Shandong Province under Grants ZR2023LZH017, ZR2024MF066, and 2023QF025; partially supported by the Open Research Subject of the State Key Laboratory of Intelligent Game (No. ZBKF-24-12); partially supported by the Foundation of the Key Laboratory of Education Informatization for Nationalities (Yunnan Normal University), Ministry of Education (No. EIN2024C006); and partially supported by the Key Laboratory of Ethnic Language Intelligent Analysis and Security Governance of MOE (No. 202306).
Abstract: As Internet of Things (IoT) technologies continue to evolve at an unprecedented pace, intelligent big data control and information systems have become critical enablers of organizational digital transformation, facilitating data-driven decision making, fostering innovation ecosystems, and maintaining operational stability. In this study, we propose an advanced deployment algorithm for Service Function Chaining (SFC) that leverages an enhanced Practical Byzantine Fault Tolerance (PBFT) mechanism. The main goal is to tackle the issues of security and resource efficiency in SFC deployment across diverse network settings. By integrating blockchain technology and Deep Reinforcement Learning (DRL), our algorithm not only optimizes resource utilization and quality of service but also ensures robust security during SFC deployment. Specifically, the enhanced PBFT consensus mechanism (VRPBFT) significantly reduces consensus latency and improves Byzantine node detection through the introduction of a Verifiable Random Function (VRF) and a node reputation grading model. Experimental results demonstrate that, compared to traditional PBFT, the proposed VRPBFT algorithm reduces consensus latency by approximately 30% and decreases the proportion of Byzantine nodes by 40% after 100 rounds of consensus. Furthermore, the DRL-based SFC deployment algorithm (SDRL) exhibits rapid convergence during training, with improvements in long-term average revenue, request acceptance rate, and revenue/cost ratio of 17%, 14.49%, and 20.35%, respectively, over existing algorithms. Additionally, the CPU resource utilization of the SDRL algorithm reaches up to 42%, which is 27.96% higher than that of other algorithms. These findings indicate that the proposed algorithm substantially enhances resource utilization efficiency, service quality, and security in SFC deployment.
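The abstract names a VRF plus a reputation grading model as the additions to PBFT but gives no construction details. The sketch below illustrates only the general pattern such a design suggests: reputation-weighted, verifiable pseudo-random selection of the primary, so that low-reputation (suspected Byzantine) nodes are rarely chosen. A keyed hash (HMAC) stands in for a real VRF, and the reputation values and function names are hypothetical, not VRPBFT's actual mechanism.

```python
# Illustrative sketch only: an HMAC stands in for a real VRF; the reputation model is a
# placeholder. It shows reputation-weighted pseudo-random primary selection per round.
import hmac
import hashlib

def vrf_stand_in(secret_key: bytes, round_seed: bytes) -> float:
    """Map a node's secret and the round seed to a deterministic value in [0, 1)."""
    digest = hmac.new(secret_key, round_seed, hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big") / 2**64

def select_primary(nodes: dict, round_seed: bytes) -> str:
    """Pick the node with the highest reputation-weighted pseudo-random score.

    nodes maps node_id -> (secret_key, reputation in [0, 1]); low-reputation nodes
    are therefore unlikely to be chosen as primary.
    """
    scores = {
        node_id: reputation * vrf_stand_in(secret, round_seed)
        for node_id, (secret, reputation) in nodes.items()
    }
    return max(scores, key=scores.get)

if __name__ == "__main__":
    nodes = {"n1": (b"k1", 0.9), "n2": (b"k2", 0.8), "n3": (b"k3", 0.2)}
    print(select_primary(nodes, b"round-42"))
```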
Funding: Beijing Municipal Social Science Foundation (22GLC062), "Research on service function renewal of the Beijing subway station living circle driven by multiple big data"; Beijing Municipal Education Commission Social Science Project (KM202010009002); Young YuYou Talents Training Plan of North China University of Technology.
Abstract: In first-tier cities, the subway has become an important carrier of, and focus for, people's daily travel activities. By studying the distribution of public service facility POIs around Metro Line 10, using GIS to quantitatively analyze the business formats surrounding subway stations, examining the functional attributes of the stations, and discussing the distribution of urban functions from a new perspective, this paper provides guidance and advice for the construction of service facilities.
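The abstract does not describe the GIS workflow itself, but the core quantitative step in such station-area studies is usually catchment counting: tallying the public-service POIs that fall within a walking-distance radius of each station. The sketch below is a minimal, assumption-laden illustration of that step; the coordinates, radius, and category labels are hypothetical.

```python
# Illustrative sketch only: counts POIs within a fixed radius of a station by category.
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2) -> float:
    """Great-circle distance in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def pois_near_station(station, pois, radius_m=800.0):
    """Group the POIs within radius_m of the station by category."""
    counts = {}
    for lat, lon, category in pois:
        if haversine_m(station[0], station[1], lat, lon) <= radius_m:
            counts[category] = counts.get(category, 0) + 1
    return counts

if __name__ == "__main__":
    station = (39.9289, 116.4160)                       # hypothetical station location
    pois = [(39.9301, 116.4175, "hospital"),
            (39.9350, 116.4000, "school"),
            (39.9292, 116.4158, "retail")]
    print(pois_near_station(station, pois))
```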
Funding: Under the auspices of the Natural Science Foundation of China (No. 41971166).
Abstract: The urban functional area (UFA) is a core scientific issue affecting urban sustainability. The current knowledge gap is mainly reflected in the lack of multi-scale quantitative interpretation methods from the perspective of human-land interaction. In this paper, based on multi-source big data including 250 m × 250 m resolution cell phone data, 1.81×10^5 Points of Interest (POI) records, and administrative boundary data, we built a UFA identification method and demonstrated it empirically in Shenyang City, China. We argue that the method can effectively identify multi-scale, multi-type UFAs based on human activity and further reveal the spatial correlation between urban facilities and human activity. The empirical study suggests that the employment functional zones in Shenyang City are more concentrated in the central city than other single-function zones. There are more mixed-function areas in the central city, while the planned new industrial cities in Shenyang need to develop comprehensive functions. UFAs exhibit scale effects and human-land interaction patterns. We suggest that city decision makers apply multi-source big data to measure urban functional services in a more refined manner from a supply-demand perspective.
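The paper's actual identification rules are not given in the abstract; a common baseline for grid-based functional-area labeling, however, is to assign each cell the POI category with the dominant share and mark cells without a clear majority as mixed. The sketch below illustrates that baseline only; the dominance threshold and category names are hypothetical.

```python
# Illustrative sketch only: dominant-category labeling for one grid cell.
from collections import Counter

def label_cell(poi_categories: list[str], dominance_threshold: float = 0.5) -> str:
    """Label a grid cell from the categories of the POIs that fall inside it."""
    if not poi_categories:
        return "undeveloped"
    counts = Counter(poi_categories)
    category, count = counts.most_common(1)[0]
    share = count / len(poi_categories)
    return category if share >= dominance_threshold else "mixed"

if __name__ == "__main__":
    print(label_cell(["residence", "residence", "retail"]))          # residence
    print(label_cell(["residence", "retail", "office", "leisure"]))  # mixed
```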
Abstract: With the accelerating intelligent transformation of the energy system, monitoring equipment operation status and optimizing production processes in thermal power plants face the challenge of integrating multi-source heterogeneous data. In view of the heterogeneous characteristics of physical sensor data (temperature, vibration, and pressure generated by boilers, steam turbines, and other key equipment) and the real-time operating-condition data of the SCADA system, this paper proposes a multi-source heterogeneous data fusion and analysis platform for thermal power plants based on edge computing and deep learning. By constructing a multi-level fusion architecture, the platform adopts a dynamic weight allocation strategy and a 5D digital twin model to realize the collaborative analysis of physical sensor data, simulation results, and expert knowledge. The data fusion module combines Kalman filtering, wavelet transforms, and Bayesian estimation to solve the problems of time-series alignment and dimensional differences among data sources. Simulation results show that the data fusion accuracy can be improved to more than 98% and the computation delay can be kept within 500 ms. The data analysis module integrates a Dymola simulation model and the AERMOD pollutant diffusion model and supports cascade analysis of boiler combustion efficiency prediction and flue gas emission monitoring; the system response time is less than 2 s, and the data consistency verification accuracy reaches 99.5%.
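The abstract names Kalman filtering as one ingredient of the fusion module without detailing the pipeline. As a self-contained illustration of that ingredient, the sketch below is a minimal scalar Kalman filter of the kind often used to smooth a noisy sensor stream (e.g., a temperature channel) before fusion; the noise parameters q and r are hypothetical.

```python
# Illustrative sketch only: a scalar (1D) Kalman filter over a constant-state model.
def kalman_1d(measurements, q=1e-3, r=0.25, x0=0.0, p0=1.0):
    """Return filtered estimates for a stream of scalar measurements."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                     # predict: state assumed constant, uncertainty grows
        k = p / (p + r)               # Kalman gain
        x = x + k * (z - x)           # update with the new measurement
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

if __name__ == "__main__":
    noisy_temps = [540.2, 541.1, 539.7, 540.9, 542.3, 541.0]
    print(kalman_1d(noisy_temps, x0=540.0))
```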
Funding: National Natural Science Foundation of China (No. 42101346); Undergraduate Training Programs for Innovation and Entrepreneurship of Wuhan University (GeoAI Special Project) (No. 202510486196).
Abstract: Rapid urbanization and structural imbalances in Chinese megacities have exacerbated the housing supply-demand mismatch, creating an urgent need for fine-scale diagnostic tools. This study addresses this critical gap by developing the Housing Contradiction Evaluation Weighted Index (HCEWI) model, making three key contributions to high-resolution housing monitoring. First, we establish a tripartite theoretical framework integrating dynamic population pressure (PPI), housing supply potential (HSI), and functional diversity (HHI). The PPI innovatively combines mobile signaling data with principal component analysis to capture real-time commuting patterns, while the HSI introduces a novel dual-criteria system based on Local Climate Zones (LCZ), weighted by building density and residential function ratio. Second, we develop a spatiotemporal coupling architecture featuring an entropy-weighted dynamic integration mechanism with self-correcting modules, demonstrating robust performance against data noise. Third, our 25-month longitudinal analysis in Shenzhen reveals significant findings, including persistent bipolar clustering patterns, contrasting volatility between peripheral and core areas, and seasonal policy responsiveness. Methodologically, we advance urban diagnostics through 500-meter-grid monthly monitoring and process-oriented temporal operators that reveal“tentacle-like”spatial restructuring along transit corridors. Our findings provide a replicable framework for precision housing governance and demonstrate the transformative potential of mobile signaling data in implementing China's“city-specific policy”approach. We further propose targeted intervention strategies, including balance regulation for high-contradiction zones, Transit-Oriented Development (TOD) activation for low-contradiction clusters, and dynamic land conversion mechanisms for transitional areas.
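The entropy-weighted integration mechanism is named but not specified in the abstract. The sketch below shows the standard entropy-weight method for combining normalized sub-indices (such as PPI, HSI, and HHI) per grid cell, which is the usual reading of that phrase; the toy values are hypothetical and the HCEWI's self-correcting modules are not modeled.

```python
# Illustrative sketch only: standard entropy-weight method for composite indices.
import numpy as np

def entropy_weights(matrix: np.ndarray) -> np.ndarray:
    """matrix: rows = grid cells, columns = sub-indices scaled to [0, 1]."""
    eps = 1e-12
    p = (matrix + eps) / (matrix + eps).sum(axis=0)        # column-wise proportions
    n = matrix.shape[0]
    entropy = -(p * np.log(p)).sum(axis=0) / np.log(n)     # entropy per indicator
    diversity = 1.0 - entropy                               # more dispersion -> more weight
    return diversity / diversity.sum()

if __name__ == "__main__":
    cells = np.array([[0.2, 0.8, 0.5],
                      [0.9, 0.7, 0.5],
                      [0.4, 0.6, 0.5]])      # hypothetical PPI, HSI, HHI for three cells
    w = entropy_weights(cells)
    composite = cells @ w                    # weighted composite index per cell
    print(w, composite)
```

Note that a sub-index that is constant across cells (the third column above) carries no information and correctly receives zero weight under this method.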
Funding: National Natural Science Foundation of China, No. 41525004 and No. 41421001.
Abstract: This paper discusses the objective, connotations, and research issues of big geodata mining to address its significance for geographical research. Big geodata may be categorized into two domains: big earth observation data and big human behavior data. A description of big geodata includes, in addition to the“5Vs”(volume, velocity, value, variety, and veracity), a further five features: granularity, scope, density, skewness, and precision. Based on this approach, the essence of mining big geodata includes four aspects. First, flow space, where flows replace points in traditional space, will become the new presentation form for big human behavior data. Second, the objectives of mining big geodata are spatial patterns and spatial relationships. Third, the spatiotemporal distributions of big geodata can be viewed as overlays of multiple geographic patterns, and the characteristics of the data, namely heterogeneity and homogeneity, may change with scale. Fourth, data mining can be seen as a tool for discovering geographic patterns, and the patterns revealed may be attributed to human-land relationships. Big geodata mining methods may be categorized into two types according to the mining objective: classification mining and relationship mining. Future research will face a number of issues, including the aggregation and connection of big geodata, the effective evaluation of mining results, and the challenge of revealing“non-trivial”knowledge.
Funding: Supported in part by the National Natural Science Foundation of China (Nos. 61401004, 61271233, 60972038); the Plan of Introduction and Cultivation of University Leading Talents in Anhui (No. gxfxZD2016013); the Natural Science Foundation of the Higher Education Institutions of Anhui Province, China (No. KJ2010B357); the Startup Project of Anhui Normal University Doctoral Scientific Research (No. 2016XJJ129); the US National Science Foundation under grants CNS1702957 and ACI-1642133; and the Wireless Engineering Research and Education Center at Auburn University.
Abstract: Multimedia big data brings tremendous challenges as well as opportunities for multimedia applications and services. In this paper, we present a survey and tutorial on multimedia big data. After discussing the characteristics of multimedia big data, such as human-centricity, multimodality, heterogeneity, and unprecedented volume, this paper provides an overview of the state of the art in multimedia big data, reviews the latest related technologies, and discusses the technical challenges. We conclude with a discussion of open problems and future directions.
Abstract: When a Wireless Sensor Network (WSN) is combined with the Internet of Things (IoT), it can be employed in a wide range of applications, such as agriculture, Industry 4.0, health care, and smart homes, among others. Accessing the big data generated by these applications in Cloud Servers (CSs) requires higher levels of authenticity and confidentiality during communication conducted over the Internet. Signcryption is one of the most promising approaches for overcoming such obstacles, owing to its combined nature, i.e., signature and encryption. A number of researchers have developed schemes to address access control issues in the IoT literature; however, the majority of these schemes are homogeneous in nature, which is neither adequate nor practical for heterogeneous IoT environments. In addition, these schemes are based on bilinear pairing and elliptic curve cryptography, which require additional processing time and more communication overhead, making them inappropriate for real-time communication. Consequently, to solve the above issues, this paper proposes an access control scheme for IoT environments using a heterogeneous signcryption scheme with the efficiency and security hardness of the hyperelliptic curve. Besides security services such as replay attack prevention, confidentiality, integrity, unforgeability, non-repudiation, and forward secrecy, the proposed scheme has very low computational and communication costs compared to existing schemes, primarily because of the lighter keys and other parameters of the hyperelliptic curve. The AVISPA tool is used to simulate the security requirements of the proposed scheme, and the results under two back-ends (Constraint-Logic-based Attack Searcher (CL-b-AtSER) and On-the-Fly Model Checker (ON-t-FL-MCR)) proved to be SAFE when the scheme is coded in the HLPSL language. The scheme was proven capable of providing confidentiality, integrity, unforgeability, non-repudiation, and forward secrecy, and of preventing replay attacks.
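The paper's hyperelliptic-curve signcryption construction is not reproduced in the abstract. To make the term concrete, the sketch below shows the generic "sign, then encrypt under an ECDH-derived key" pattern that signcryption folds into a single primitive, using a standard NIST curve from the `cryptography` package purely as a stand-in for hyperelliptic-curve arithmetic; it is not the proposed scheme and does not model its heterogeneous key setting.

```python
# Illustrative sketch only: sign-then-encrypt over an ECDH-derived key as a conceptual
# stand-in for signcryption; a standard elliptic curve replaces the hyperelliptic curve.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def _shared_key(own_priv, peer_pub) -> bytes:
    shared = own_priv.exchange(ec.ECDH(), peer_pub)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"demo-signcryption").derive(shared)

def signcrypt(sender_priv, receiver_pub, message: bytes):
    """Sign the message, then encrypt signature + message under the shared key."""
    sig = sender_priv.sign(message, ec.ECDSA(hashes.SHA256()))
    payload = len(sig).to_bytes(2, "big") + sig + message
    nonce = os.urandom(12)
    return nonce, AESGCM(_shared_key(sender_priv, receiver_pub)).encrypt(nonce, payload, None)

def unsigncrypt(receiver_priv, sender_pub, nonce, ciphertext) -> bytes:
    """Decrypt under the shared key, then verify the sender's signature."""
    payload = AESGCM(_shared_key(receiver_priv, sender_pub)).decrypt(nonce, ciphertext, None)
    sig_len = int.from_bytes(payload[:2], "big")
    sig, message = payload[2:2 + sig_len], payload[2 + sig_len:]
    sender_pub.verify(sig, message, ec.ECDSA(hashes.SHA256()))   # raises if invalid
    return message

if __name__ == "__main__":
    sender = ec.generate_private_key(ec.SECP256R1())
    receiver = ec.generate_private_key(ec.SECP256R1())
    n, c = signcrypt(sender, receiver.public_key(), b"sensor reading: 42.0")
    print(unsigncrypt(receiver, sender.public_key(), n, c))
```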
Abstract: As the big data era arrives, it brings new challenges to massive data processing. Combining a GPU and a CPU on one chip is the trend for relieving the pressure of large-scale computing. We found that GPUs and CPUs have different memory access characteristics. The most important is that GPU programs include a large number of threads, which leads to a higher cache access frequency than CPU programs. Although the LRU policy favors programs with high memory access frequency, GPU programs do not obtain a corresponding performance boost even when more cache resources are provided, so the LRU policy is not suitable for heterogeneous multi-core processors. Based on the different memory access characteristics of GPU and CPU programs, this paper proposes an LLC dynamic replacement policy, DIPP (Dynamic Insertion/Promotion Policy), for heterogeneous multi-core processors. The core idea of the replacement policy is to reduce program miss rates and enhance overall system performance by limiting the cache resources that the GPU can acquire and reducing thread interference between programs. Experiments compare the DIPP replacement policy with LRU, with a classified discussion according to GPU program type: cache-friendly programs gain 23.29% in average performance (arithmetic mean), large-working-set programs improve by 13.95%, compute-intensive programs by 9.66%, and streaming programs by 3.8%.
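The abstract states the goal of DIPP (limit the cache the GPU can acquire) without giving the mechanism. The toy set-associative cache set below illustrates the insertion/promotion idea that such policies build on: CPU fills go to the MRU end as in LRU, GPU fills are inserted at the LRU end, and only lines that are actually reused get promoted. This is a hedged illustration of the general technique, not the paper's DIPP implementation.

```python
# Illustrative sketch only: per-source insertion policy in one cache set.
from collections import deque

class InsertionPolicyCacheSet:
    def __init__(self, ways: int = 8):
        self.ways = ways
        self.lines = deque()                 # leftmost = LRU, rightmost = MRU

    def access(self, tag, is_gpu: bool) -> bool:
        """Return True on hit; on a hit, promote the line to the MRU end."""
        if tag in self.lines:
            self.lines.remove(tag)
            self.lines.append(tag)           # promotion on reuse
            return True
        if len(self.lines) >= self.ways:
            self.lines.popleft()             # evict the LRU line
        if is_gpu:
            self.lines.appendleft(tag)       # GPU fill lands at the LRU end
        else:
            self.lines.append(tag)           # CPU fill lands at the MRU end
        return False

if __name__ == "__main__":
    s = InsertionPolicyCacheSet(ways=4)
    for t in ["c1", "c2"]:
        s.access(t, is_gpu=False)            # CPU working set
    for t in ["g1", "g2", "g3", "g4", "g5"]:
        s.access(t, is_gpu=True)             # streaming GPU fills mostly evict each other
    print(list(s.lines))                     # the CPU lines survive at the MRU side
```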