Funding: Supported in part by the National Natural Science Foundation of China (NSFC) under Grant 61901075; the Natural Science Foundation of Chongqing, China, under Grant cstc2019jcyj-msxmX0602; the Chongqing Basic and Cutting-edge Project under Grant cstc2018jcyjAX0507; and the Chongqing University of Posts and Telecommunications Doctoral Candidates High-end Talent Training Project (No. BYJS2017001).
Abstract: Cell-free wireless Heterogeneous Networks (HetNets) have emerged as a technological alternative to conventional cellular networks. In this paper, we study the spatially correlated caching strategy, the energy analysis, and the impact of the parameter β on the total energy cost of cell-free wireless HetNets whose Access Points (APs) are distributed according to a Beta Ginibre Point Process (β-GPP). We derive an approximate expression for the Successful Delivery Probability (SDP) based on the Signal-to-Interference-plus-Noise Ratio (SINR) coverage model. Both analytical and simulation results show that the proposed caching model based on β-GPP placement, which jointly accounts for path loss, fading, and interference, closely captures the caching performance of cell-free HetNets in terms of SDP. Under outage probability constraints, an analytical expression for the uplink energy cost is also derived. A further conclusion is that, with AP locations modeled by a β-GPP, the power consumption is not sensitive to β but is sensitive to the dimension of the kernel function; hence β is less restrictive, and only the truncation of the Ginibre kernel has to be chosen appropriately. These findings are new compared with the existing literature, where nodes in cell-free systems are commonly assumed to follow a Poisson Point Process, Matérn Hard-Core Process, or Poisson Cluster Process deployment.
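The abstract does not include simulation details, but the β-GPP placement it refers to can be sketched numerically. A commonly used construction obtains a β-GPP from an ordinary Ginibre point process by retaining each point independently with probability β and rescaling by √β, and the Ginibre points themselves can be approximated by the eigenvalues of a complex Gaussian matrix. The function name `sample_beta_gpp` and all parameter values below are illustrative, not from the paper:

```python
import numpy as np

def sample_beta_gpp(n, beta, rng):
    """Approximate β-GPP sample: Ginibre-ensemble eigenvalues, thinned
    with probability beta, then rescaled by sqrt(beta)."""
    # Eigenvalues of an n x n complex Gaussian matrix approximate a
    # Ginibre point process on a disk of radius ~ sqrt(n).
    g = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    pts = np.linalg.eigvals(g)
    keep = rng.random(n) < beta          # independent thinning
    return np.sqrt(beta) * pts[keep]     # rescaling restores the intensity

rng = np.random.default_rng(0)
aps = sample_beta_gpp(200, beta=0.5, rng=rng)
print(len(aps))  # roughly beta * n points survive the thinning
```

As β → 1 this recovers the fully repulsive Ginibre process, while β → 0 approaches a Poisson-like layout, which is the range of deployments the paper's β parameter spans.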
Abstract: At present, database cache models for power information systems suffer from slow running speed and a low database hit rate. To this end, this paper proposes a database cache model for power information systems based on deep machine learning. The caching model comprises program caching, Structured Query Language (SQL) preprocessing, and core caching modules. Statement efficiency is improved by adjusting operations such as multi-table joins and keyword replacement in the SQL optimizer. In the core caching module, predictive models are built using boosted regression trees: a series of regression tree models is generated with machine learning algorithms, the resource occupancy rate of the power information system is analyzed to dynamically adjust the voting selection of the regression trees, and the voting threshold of the prediction model is adjusted dynamically as well; the cache model is then re-initialized accordingly. Experimental results show that the model achieves a good cache hit rate and cache efficiency and improves the data caching performance of the power information system. It has a high hit rate and short delay time, maintains a good hit rate under different memory configurations, and occupies little space and CPU during actual operation, enabling the power information system to run efficiently and quickly.
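The abstract gives no implementation details, so as a minimal sketch of the "boosted regression trees" idea, the following fits depth-1 trees (stumps) to residuals on a single feature and uses the ensemble's prediction to score whether a query result is worth caching. The names `fit_stump`, `boost`, and `predict`, and the single-feature setup, are invented for this example and are not the authors' model:

```python
import numpy as np

def fit_stump(x, r):
    """Best single-threshold split on a 1-D feature minimizing squared error."""
    best = None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pl, pr = left.mean(), right.mean()
        err = ((left - pl) ** 2).sum() + ((right - pr) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, pl, pr)
    return best[1:]

def boost(x, y, n_rounds=20, lr=0.3):
    """Gradient boosting with stumps: each round fits the current residual."""
    stumps, pred = [], np.zeros_like(y, dtype=float)
    for _ in range(n_rounds):
        t, pl, pr = fit_stump(x, y - pred)
        stumps.append((t, pl, pr))
        pred += lr * np.where(x <= t, pl, pr)
    return stumps

def predict(stumps, x, lr=0.3):
    """Ensemble prediction: sum of shrunken stump outputs."""
    return sum(lr * np.where(x <= t, pl, pr) for t, pl, pr in stumps)

# Toy data: access frequency jumps for "hot" queries (feature >= 5).
x = np.arange(10.0)
y = np.where(x >= 5, 10.0, 0.0)
stumps = boost(x, y)
scores = predict(stumps, np.array([0.0, 9.0]))
# Cache a result when its predicted score exceeds a tunable threshold,
# mirroring the paper's dynamically adjusted voting threshold.
```

In the paper's scheme the threshold is adjusted at runtime from observed resource occupancy; here it would simply be a cutoff applied to `scores`.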
Funding: This research was funded by the National Natural Science Foundation of China (No. U21A20451), the Science and Technology Planning Project of Jilin Province (No. 20200401105GX), and the China University Industry-University-Research Innovation Fund (No. 2021FNA01003).
Abstract: In Information Centric Networking (ICN), where content is the object of exchange, in-network caching is a unique functional feature capable of handling data storage and distribution in remote sensing satellite networks. Setting up cache space at any node enables users to access data nearby, relieving processing pressure on the servers. However, existing caching strategies still suffer from a lack of global planning of cache contents and low utilization of cache resources due to the absence of fine-grained division of cache contents. To address these issues, a cooperative caching strategy (CSTL) for remote sensing satellite networks based on a two-layer caching model is proposed. The two-layer caching model is constructed by setting up separate cache spaces in the satellite network and at the ground station. Popular contents in the region are cached probabilistically at the ground station to reduce user access delay. Within the satellite network, a content classification method based on hierarchical division is proposed, and differential probabilistic caching is employed for different levels of content. The cached content is also adjusted dynamically by analyzing subsequent changes in its popularity. In the two-layer caching model, ground stations and satellite networks cache cooperatively to achieve global planning of cache contents, rationalize the utilization of cache resources, and reduce the propagation delay of remote sensing data. Simulation results show that the CSTL strategy not only achieves a high cache hit ratio compared with other caching strategies but also effectively reduces user request delay and server load, satisfying the timeliness requirement of remote sensing data transmission.
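The "hierarchical division plus differential probabilistic caching" idea above can be sketched in a few lines: rank contents by popularity, split them into tiers, and cache higher tiers with higher probability. The Zipf popularity model, the three-tier split, and the function names are assumptions made for this illustration, not details taken from the CSTL paper:

```python
import random

def zipf_popularity(n, s=0.8):
    """Normalized Zipf popularity for n contents; rank 1 is most popular."""
    w = [1.0 / (r ** s) for r in range(1, n + 1)]
    total = sum(w)
    return [x / total for x in w]

def assign_levels(pop, n_levels=3):
    """Hierarchical division: sort by popularity, split into equal tiers.
    Returns a level index per content (0 = most popular tier)."""
    order = sorted(range(len(pop)), key=lambda i: -pop[i])
    levels = [0] * len(pop)
    tier = max(1, len(pop) // n_levels)
    for rank, idx in enumerate(order):
        levels[idx] = min(rank // tier, n_levels - 1)
    return levels

def cache_prob(level, n_levels=3, p_max=0.9):
    """Differential probabilistic caching: higher tiers cached more often."""
    return p_max * (n_levels - level) / n_levels

def decide_cache(content_level, rng):
    """Satellite node flips a biased coin per arriving content."""
    return rng.random() < cache_prob(content_level)

pop = zipf_popularity(9)
levels = assign_levels(pop)
rng = random.Random(42)
decisions = [decide_cache(lv, rng) for lv in levels]
```

Re-ranking contents as popularity shifts, and then re-running `assign_levels`, corresponds to the paper's dynamic adjustment of cached content.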
Funding: Supported in part by the Shandong Provincial Natural Science Foundation under Grants ZR2023LZH017, ZR2022LZH015, and ZR2024MF066; the National Natural Science Foundation of China under Grant 62471493; the Tertiary Education Scientific Research Project of Guangzhou Municipal Education Bureau under Grant 2024312246; and the Guangzhou Higher Education Teaching Quality and Teaching Reform Project under Grant 2023KCJJD002.
Abstract: With the rapid development of generative artificial intelligence technology, traditional cloud-based centralized model training and inference face significant limitations due to high transmission latency and costs, which restrict user-side in-situ Artificial Intelligence Generated Content (AIGC) service requests. To this end, we propose the Edge Artificial Intelligence Generated Content (Edge AIGC) framework, which addresses the challenges of cloud computing by processing services in situ, close to the data source, through edge computing. However, AIGC models usually have a large parameter scale and complex computing requirements, which poses a huge challenge to the storage and computing resources of edge devices. This paper focuses on the edge intelligence model caching and resource allocation problems in the Edge AIGC framework, aiming to improve the cache hit rate and resource utilization of edge devices by optimizing the model caching strategy and resource allocation scheme, thereby realizing in-situ AIGC service processing. With the objectives of minimizing service request response time and execution cost in resource-constrained environments, we employ the Twin Delayed Deep Deterministic Policy Gradient (TD3) algorithm for optimization. Experimental results show that, compared with other methods, our model caching and resource allocation strategies improve the cache hit rate by at least 41.06% and reduce the response cost as well.
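The paper optimizes caching and allocation jointly with TD3; as a much simpler hedged sketch of the underlying trade-off, deciding which AIGC models to keep on a storage-limited edge node can be viewed as a knapsack problem and approximated greedily by expected-requests-per-GB density. The model names, sizes, and request counts below are invented for illustration:

```python
def select_models(models, capacity_gb):
    """Greedy knapsack heuristic: cache models in descending order of
    expected requests per GB until edge storage is exhausted.
    models: list of (name, size_gb, expected_requests) tuples."""
    ranked = sorted(models, key=lambda m: m[2] / m[1], reverse=True)
    cached, used = [], 0.0
    for name, size, req in ranked:
        if used + size <= capacity_gb:
            cached.append(name)
            used += size
    return cached, used

# Hypothetical model catalogue for an edge node with 16 GB of storage.
catalogue = [
    ("image-gen-xl", 7.0, 120),   # ~17.1 requests/GB
    ("text-gen-7b", 13.0, 260),   # 20.0 requests/GB
    ("speech-small", 3.0, 90),    # 30.0 requests/GB
]
cached, used = select_models(catalogue, capacity_gb=16.0)
print(cached, used)  # ['speech-small', 'text-gen-7b'] 16.0
```

A learned policy such as TD3 generalizes this one-shot heuristic by reacting to time-varying request patterns and by trading cache hits against execution cost, which a static density ranking cannot do.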