In order to support the future digital society, the sixth generation (6G) network faces the challenge of working efficiently and flexibly in a wider range of scenarios. The traditional way of system design is sequential: first obtain the electromagnetic wave propagation model of typical scenarios, then carry out the network design by offline simulation, which leaves the 6G network lacking adaptation to dynamic environments. Recently, with the aid of sensing enhancement, more environment information can be obtained. Based on this, from the radio wave propagation perspective, we propose a predictive 6G network with environment sensing enhancement, the electromagnetic wave propagation characteristics prediction enabled network (EWave Net), to further release the potential of 6G. To this end, a prediction plane is created to sense, predict, and utilize the physical environment information in EWave Net, realizing timely prediction of electromagnetic wave propagation characteristics. A two-level closed feedback workflow is also designed to enhance the sensing and prediction ability of EWave Net. Several promising application cases of EWave Net are analyzed, and the open issues to achieve this goal are finally addressed.
With the evolution of sixth generation (6G) mobile communication technology, ample attention has gone to integrated terrestrial-satellite networks. This paper notes that four typical application scenarios of integrated terrestrial-satellite networks are integrated into an ultra-dense satellite-enabled 6G network architecture. Subchannel and power allocation schemes for the downlink of ultra-dense satellite-enabled 6G heterogeneous networks are then introduced. Satellite mobile edge computing (SMEC) with edge caching in three-layer heterogeneous networks serves to reduce the link traffic of the networks. Furthermore, a scheme for interference management is presented, involving quality-of-service (QoS) and co-tier/cross-tier interference constraints. The simulation results show that the proposed schemes can significantly increase the total capacity of ultra-dense satellite-enabled 6G heterogeneous networks.
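The abstract does not spell out its power allocation scheme; as a generic illustration of how downlink power can be distributed over subchannels with different gains, the classic water-filling rule is sketched below (an assumption for illustration, not the paper's algorithm — `p_total`, `gains`, and the bisection tolerance are made up):

```python
import math

def water_filling(gains, p_total, tol=1e-9):
    """Classic water-filling power allocation (illustrative, not the
    paper's exact scheme): p_i = max(0, mu - 1/g_i), with the water
    level mu found by bisection so that the powers sum to p_total."""
    lo, hi = 0.0, p_total + max(1.0 / g for g in gains)
    while hi - lo > tol:
        mu = (lo + hi) / 2
        used = sum(max(0.0, mu - 1.0 / g) for g in gains)
        if used > p_total:
            hi = mu      # water level too high, spend less power
        else:
            lo = mu
    mu = (lo + hi) / 2
    powers = [max(0.0, mu - 1.0 / g) for g in gains]
    rate = sum(math.log2(1.0 + g * p) for g, p in zip(gains, powers))
    return powers, rate

# Stronger subchannels receive more power; very weak ones may get none.
powers, rate = water_filling([2.0, 1.0, 0.25], p_total=2.0)
```

With these example gains the water level settles at 1.75, so the weakest subchannel (gain 0.25) is switched off entirely.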
Future 6G communications are envisioned to enable a large catalogue of pioneering applications. These will range from networked Cyber-Physical Systems to edge computing devices, establishing real-time feedback control loops critical for managing Industry 5.0 deployments, digital agriculture systems, and essential infrastructures. The provision of extensive machine-type communications through 6G will render many of these innovative systems autonomous and unsupervised. While full automation will enhance industrial efficiency significantly, it concurrently introduces new cyber risks and vulnerabilities. In particular, unattended systems are highly susceptible to trust issues: malicious nodes and false information can be easily introduced into control loops. Additionally, Denial-of-Service attacks can be executed by inundating the network with valueless noise. Current anomaly detection schemes require the entire transformation of the control software to integrate new steps and can only mitigate anomalies that conform to predefined mathematical models. Solutions based on exhaustive data collection to detect anomalies are precise but extremely slow. Standard models, with their limited understanding of mobile networks, can achieve precision rates no higher than 75%. Therefore, more general and transversal protection mechanisms are needed to detect malicious behaviors transparently. This paper introduces a probabilistic trust model and control algorithm designed to address this gap. The model determines the probability of any node being trustworthy. Communication channels are pruned for those nodes whose probability is below a given threshold. The trust control algorithm comprises three primary phases, which feed the model with three different probabilities that are weighted and combined. Initially, anomalous nodes are identified using Gaussian mixture models and clustering technologies. Next, traffic patterns are studied using digital Bessel functions and the functional scalar product. Finally, the information coherence and content are analyzed. Noise content and abnormal information sequences are detected using a Volterra filter and a bank of Finite Impulse Response filters. An experimental validation based on simulation tools and environments was carried out. Results show the proposed solution can successfully detect up to 92% of malicious data injection attacks.
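The core mechanism — three per-phase probabilities, weighted, combined, and compared against a pruning threshold — can be sketched as follows. The weights, threshold, and node data are illustrative assumptions, not values from the paper:

```python
def trust_score(p_cluster, p_traffic, p_content, weights=(0.4, 0.3, 0.3)):
    """Weighted combination of the three phase probabilities described in
    the abstract (the weights here are illustrative assumptions)."""
    w1, w2, w3 = weights
    return w1 * p_cluster + w2 * p_traffic + w3 * p_content

def prune_untrusted(nodes, threshold=0.5):
    """Keep only nodes whose combined trust probability meets the threshold;
    channels to the remaining nodes would be pruned."""
    return {nid: p for nid, p in nodes.items() if trust_score(*p) >= threshold}

nodes = {
    "sensor-a": (0.9, 0.8, 0.95),   # consistent, trustworthy behaviour
    "sensor-b": (0.2, 0.1, 0.3),    # anomalous traffic and content
}
trusted = prune_untrusted(nodes)
```

Here `sensor-b` falls below the threshold in every phase and is cut off from the control loop, while `sensor-a` is retained.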
In recent years, the need for fast, efficient, and reliable wireless networks has increased dramatically. Numerous 5G networks have already been tested, while a few are in the early stages of deployment. In non-cooperative communication scenarios, the recognition of digital signal modulations helps identify communication targets and ensures effective management over them. The recent advancements in both Machine Learning (ML) and Deep Learning (DL) models demand the development of effective modulation recognition models with self-learning capability. Against this background, the current research article designs a Deep Learning enabled Intelligent Modulation Recognition of Communication Signal (DLIMR-CS) technique for next-generation networks. The aim of the proposed DLIMR-CS technique is to classify different kinds of digitally-modulated signals. In addition, a fractal feature extraction process is applied with the help of the Sevcik Fractal Dimension (SFD) approach. The extracted features are then fed into a Deep Variational Autoencoder (DVAE) model for the classification of the modulated signals. To improve the classification performance of the DVAE model, the Tunicate Swarm Algorithm (TSA) is used to fine-tune the hyperparameters of the DVAE model. A wide range of simulations was conducted to establish the enhanced performance of the proposed DLIMR-CS model. The experimental outcomes confirmed the superior recognition rate of the DLIMR-CS model over recent state-of-the-art methods under different evaluation parameters.
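The SFD feature used above has a compact closed form: normalise the waveform into the unit square, measure the curve length L, and map it to a dimension estimate. The sketch below implements one common formulation of Sevcik's estimator (the test signals are illustrative, not the paper's dataset):

```python
import math

def sevcik_fd(signal):
    """Sevcik fractal dimension of a 1-D signal: normalise into the unit
    square, measure the curve length L, and estimate
    D = 1 + ln(L) / ln(2 * (N - 1))."""
    n = len(signal)
    y_min, y_max = min(signal), max(signal)
    y_span = (y_max - y_min) or 1.0          # guard against flat signals
    ys = [(y - y_min) / y_span for y in signal]
    xs = [i / (n - 1) for i in range(n)]
    length = sum(
        math.hypot(xs[i + 1] - xs[i], ys[i + 1] - ys[i]) for i in range(n - 1)
    )
    return 1.0 + math.log(length) / math.log(2 * (n - 1))

smooth = [i / 99 for i in range(100)]        # straight ramp: D close to 1
jagged = [i % 2 for i in range(100)]         # rapid oscillation: D much larger
d_line = sevcik_fd(smooth)
d_jagged = sevcik_fd(jagged)
```

A smooth modulated envelope yields a dimension near 1, while a noisy or rapidly switching waveform pushes the estimate toward 2, which is what makes SFD useful as a discriminative feature.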
Non-orthogonal multiple access (NOMA), multiple-input multiple-output (MIMO), and mobile edge computing (MEC) are prominent technologies to meet the high data rate demand of sixth generation (6G) communication networks. In this paper, we aim to minimize the transmission delay in the MIMO-MEC system in order to improve the spectral efficiency, energy efficiency, and data rate of MEC offloading. The Dinkelbach transform and the generalized singular value decomposition (GSVD) method are used to solve the delay minimization problem. Analytical results are provided to evaluate the performance of the proposed hybrid NOMA-MIMO-MEC (H-NOMA-MIMO-MEC) system. Simulation results reveal that the H-NOMA-MIMO-MEC system can achieve better delay performance and lower energy consumption compared to OMA.
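The Dinkelbach transform turns a fractional objective N(x)/D(x) into a sequence of easier parametric problems max N(x) − λ·D(x). A minimal sketch over a finite candidate set is shown below; the rate/cost functions and the power grid are toy stand-ins, not the paper's delay model:

```python
import math

def dinkelbach_max_ratio(xs, num, den, tol=1e-9):
    """Dinkelbach's method for maximising num(x)/den(x) (den > 0) over a
    candidate set: repeatedly solve max_x num(x) - lam*den(x) and update
    lam = num(x*)/den(x*) until the parametric optimum reaches ~0."""
    lam = 0.0
    while True:
        x_best = max(xs, key=lambda x: num(x) - lam * den(x))
        gap = num(x_best) - lam * den(x_best)
        if gap < tol:
            return x_best, lam
        lam = num(x_best) / den(x_best)

grid = [0.1 * k for k in range(1, 101)]       # candidate power levels
num = lambda p: math.log2(1.0 + 4.0 * p)      # toy achievable rate
den = lambda p: 1.0 + p                       # toy energy-style cost
x_star, ratio = dinkelbach_max_ratio(grid, num, den)
```

Over a finite set the λ-updates increase monotonically and terminate at the global maximum of the ratio, which is why the method is popular for energy-efficiency and delay-ratio problems of this kind.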
The sixth generation (6G) wireless communication network is expected to provide global coverage, enhanced spectral efficiency, AI (Artificial Intelligence)-native intelligence, and more. To meet these requirements, the computational concept of decision-making with cognitive intelligence, an implementation framework adapting to foreseen innovations in networks and services, and empirical evaluations are key techniques to guarantee the generation-agnostic intelligence evolution of wireless communication networks. In this paper, we propose an Intelligent Decision Making (IDM) framework, acting as the network brain and based on the Reinforcement Learning modelling philosophy, to endow the 6G network with the capability of autonomous intelligence evolution. Besides, usage scenarios and simulations demonstrate the generality and efficiency of IDM. We hope that some of the ideas of IDM will assist research on 6G networks in a new or different light.
The forthcoming sixth generation (6G) of mobile communication networks is envisioned to be AI-native, supporting intelligent services and pervasive computing at unprecedented scale. Among the key paradigms enabling this vision, Federated Learning (FL) has gained prominence as a distributed machine learning framework that allows multiple devices to collaboratively train models without sharing raw data, thereby preserving privacy and reducing the need for centralized storage. This capability is particularly attractive for vision-based applications, where image and video data are both sensitive and bandwidth-intensive. However, the integration of FL with 6G networks presents unique challenges, including communication bottlenecks, device heterogeneity, and trade-offs between model accuracy, latency, and energy consumption. In this paper, we develop a simulation-based framework to investigate the performance of FL in representative vision tasks under 6G-like environments. We formalize the system model, incorporating both the federated averaging (FedAvg) training process and a simplified communication cost model that captures bandwidth constraints, packet loss, and variable latency across edge devices. Using standard image datasets (e.g., MNIST, CIFAR-10) as benchmarks, we analyze how factors such as the number of participating clients, the degree of data heterogeneity, and the communication frequency influence convergence speed and model accuracy. Additionally, we evaluate the effectiveness of lightweight communication-efficient strategies, including local update tuning and gradient compression, in mitigating network overhead. The experimental results reveal several key insights: (i) communication limitations can significantly degrade FL convergence in vision tasks if not properly addressed; (ii) judicious tuning of local training epochs and client participation levels enables notable improvements in both efficiency and accuracy; and (iii) communication-efficient FL strategies provide a promising pathway to balance performance with the stringent latency and reliability requirements expected in 6G. These findings highlight the synergistic role of AI and next-generation networks in enabling privacy-preserving, real-time vision applications, and they provide concrete design guidelines for researchers and practitioners working at the intersection of FL and 6G.
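The FedAvg aggregation step at the heart of this setup is a sample-count-weighted average of client updates. A minimal sketch (client weights and sizes are made-up toy values):

```python
def fedavg(client_weights, client_sizes):
    """Federated averaging: the server combines client model parameters,
    weighting each client by its local sample count (FedAvg)."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    agg = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for j in range(dim):
            agg[j] += (n / total) * w[j]
    return agg

# Two clients; the larger client pulls the global model toward its update.
global_model = fedavg([[1.0, 0.0], [3.0, 2.0]], [1, 3])
```

With sizes 1 and 3, the second client contributes 75% of the average, giving a global model of [2.5, 1.5].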
Network architectures assisted by Generative Artificial Intelligence (GAI) are envisioned as foundational elements of the sixth-generation (6G) communication system. To deliver ubiquitous intelligent services and meet diverse service requirements, the 6G network architecture should offer personalized services to various mobile devices. Federated learning (FL) with personalized local training, as a privacy-preserving machine learning (ML) approach, can be applied to address these challenges. In this paper, we propose a meta-learning-based personalized FL (PFL) method that improves both communication and computation efficiency by utilizing over-the-air computation. Its "pretraining-and-fine-tuning" principle makes it particularly suitable for enabling edge nodes to access personalized GAI services while preserving local privacy. Experimental results demonstrate the superior performance and efficacy of the proposed algorithm, and notably indicate enhanced communication efficiency without compromising accuracy.
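The "pretraining-and-fine-tuning" principle can be illustrated with a toy scalar model: start from the shared (pretrained) global parameter, then take a few local gradient steps on the client's own data. Everything below — the quadratic loss, learning rate, and client data — is an illustrative assumption, not the paper's model:

```python
def fine_tune(w_global, targets, lr=0.1, steps=20):
    """Personalisation phase: a few local gradient-descent steps on the
    client's data, starting from the shared global model (toy scalar
    model with squared loss)."""
    w = w_global
    for _ in range(steps):
        grad = sum(2.0 * (w - t) for t in targets) / len(targets)
        w -= lr * grad
    return w

def local_loss(w, targets):
    """Mean squared error of the scalar model on one client's data."""
    return sum((w - t) ** 2 for t in targets) / len(targets)

w_global = 0.0                      # pretrained shared model
client_data = [2.0, 2.2, 1.8]       # this client's personal targets
w_personal = fine_tune(w_global, client_data)
```

After fine-tuning, the personalized parameter sits near the client's local optimum (the mean of its targets), so its local loss drops well below that of the shared model — the effect PFL exploits.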
With the development of sixth generation mobile communication (6G) technology, the space-air-ground integrated network (SAGIN), as a key component of 6G, aims to achieve seamless interconnection among satellites, aerial platforms, and terrestrial systems, showing great potential in fields such as emergency communication, environmental monitoring, and intelligent transportation. However, SAGIN is characterized by heterogeneous structures, highly dynamic links, and widely distributed resources, which pose major challenges to efficient network management and optimization. In recent years, artificial intelligence (AI), with its powerful perception, learning, and autonomous decision-making capabilities, has been applied to communication networks, providing new opportunities for the intelligent evolution of SAGIN. This paper first systematically introduces the basic composition and key characteristics of the SAGIN architecture and surveys the main technical systems and suitability advantages of mainstream AI techniques for network optimization, including machine learning, graph neural networks, and reinforcement learning. It then discusses in depth the applications and latest progress of AI in typical SAGIN scenarios such as intelligent resource management, mobility management and routing optimization, aerial platform path planning, and task offloading and computation collaboration. Finally, it summarizes the challenges of applying AI in SAGIN and outlines future directions for the integrated development of AI and SAGIN. By reviewing the advantages and progress of AI techniques in SAGIN, this paper aims to provide a technical reference for research on and development of AI-empowered SAGIN.
The sixth-generation (6G) wireless communication networks are anticipated to integrate aerial, terrestrial, and maritime communication into a robust system that accomplishes trustworthy, quick, and low-latency needs, enabling maximum throughput and minimum delay for several applications. Besides, the evolution of 6G leads to the design of unmanned aerial vehicles (UAVs) providing inexpensive and effective solutions in various application areas such as healthcare and environment monitoring. In UAV networks, effective data collection with restricted energy capacity poses a major issue for achieving high-quality network communication. It can be addressed by the use of clustering techniques for UAVs in 6G networks. In this respect, this study develops a novel metaheuristic-based energy-efficient data gathering scheme for clustered unmanned aerial vehicles (MEEDG-CUAV). The proposed MEEDG-CUAV technique partitions the UAV network into various clusters and assigns a cluster head (CH) to each of them to reduce the overall energy utilization. Besides, the quantum chaotic butterfly optimization algorithm (QCBOA) with a fitness function is derived to choose CHs and construct clusters. The experimental validation of the MEEDG-CUAV technique is carried out using a benchmark dataset, and the experimental results highlight its better performance over other state-of-the-art techniques in terms of different measures.
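The abstract does not give the QCBOA fitness function, but CH-selection fitness functions typically trade off residual energy against intra-cluster distance. The sketch below shows a function of that general shape; the weight `alpha` and the exact terms are assumptions, not the published formulation:

```python
def ch_fitness(residual_energy, max_energy, mean_dist, max_dist, alpha=0.6):
    """Illustrative cluster-head fitness: favour UAVs with high residual
    energy and low average distance to cluster members. Higher is better.
    alpha balances the two normalised terms (assumed value)."""
    energy_term = residual_energy / max_energy
    distance_term = 1.0 - (mean_dist / max_dist)
    return alpha * energy_term + (1.0 - alpha) * distance_term

# A well-charged, centrally placed UAV vs. a drained, distant one.
strong_candidate = ch_fitness(90.0, 100.0, 20.0, 200.0)
weak_candidate = ch_fitness(30.0, 100.0, 150.0, 200.0)
```

A metaheuristic such as QCBOA would then search cluster assignments that maximize this fitness across all clusters, which is what drives down the overall energy utilization.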
The Internet of Things (IoT) is the fourth technological revolution in the global information industry after computers, the Internet, and mobile communication networks. It combines radio-frequency identification devices, infrared sensors, global positioning systems, and various other technologies. Information sensing equipment is connected via the Internet, thus forming a vast network. When these physical devices are connected to the Internet, the user terminal can be extended and expanded to exchange information, communicate with anything, and carry out identification, positioning, tracking, monitoring, and triggering of corresponding events on each device in the network. In real life, the IoT has a wide range of applications covering many fields, such as smart homes, smart logistics, fine agriculture and animal husbandry, national defense, and the military. One of the most significant factors in wireless channels is interference, which degrades system performance. Although the existing QR decomposition-based signal detection method is an emerging topic because of its low complexity, it does not solve the problem of poor detection performance. Therefore, this study proposes a maximum-likelihood-based QR decomposition algorithm. The main idea is to estimate the initial detection layer using the maximum likelihood principle; the other layer is then detected using a reliable decision. The optimal candidate is selected from the feedback by deploying candidate points in an unreliable scenario. Simulation results show that the proposed algorithm effectively reduces interference and propagation error compared with algorithms reported in the literature.
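For context, the plain QR-based successive interference cancellation baseline that the paper improves upon works as follows for a 2x2 real-valued BPSK channel (the channel matrix and symbols below are made-up toy values; the paper's ML-aided candidate selection is not reproduced here):

```python
import math

def qr_2x2(h):
    """Gram-Schmidt QR factorisation of a real 2x2 channel matrix H,
    returned as (Q columns, R) with R upper-triangular."""
    (a, b), (c, d) = h
    r11 = math.hypot(a, c)
    q1 = (a / r11, c / r11)
    r12 = q1[0] * b + q1[1] * d
    u = (b - r12 * q1[0], d - r12 * q1[1])
    r22 = math.hypot(u[0], u[1])
    q2 = (u[0] / r22, u[1] / r22)
    return (q1, q2), ((r11, r12), (0.0, r22))

def qr_sic_detect(h, y):
    """QR-based successive interference cancellation for 2x2 BPSK:
    rotate y by Q^T, slice the bottom layer first, cancel its
    contribution, then slice the top layer."""
    (q1, q2), r = qr_2x2(h)
    z1 = q1[0] * y[0] + q1[1] * y[1]
    z2 = q2[0] * y[0] + q2[1] * y[1]
    x2 = 1.0 if z2 / r[1][1] >= 0 else -1.0
    x1 = 1.0 if (z1 - r[0][1] * x2) / r[0][0] >= 0 else -1.0
    return (x1, x2)

h = ((2.0, 0.5), (0.3, 1.5))                      # toy channel matrix
x = (1.0, -1.0)                                   # transmitted BPSK symbols
y = (h[0][0] * x[0] + h[0][1] * x[1],             # noiseless receive vector
     h[1][0] * x[0] + h[1][1] * x[1])
detected = qr_sic_detect(h, y)
```

In the noiseless case this recovers the transmitted symbols exactly; the weakness the paper targets is error propagation when the first sliced layer is wrong under noise.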
5G networks apply adaptive modulation and coding according to the channel condition reported by the user in order to maintain mobile communication quality. However, the delay incurred by the feedback may make the channel quality indicator (CQI) obsolete. This paper addresses this issue by proposing two approaches, one based on machine learning and another on evolutionary computing, which consider the user context and the signal-to-interference-plus-noise ratio (SINR), besides the delay length, to estimate the updated SINR to be mapped into a CQI value. Our proposals are designed to run at the user equipment (UE) side, neither requiring any change in the signalling between the base station (gNB) and the UE nor overloading the gNB. They are evaluated in terms of mean squared error using 5G network simulation data, and the results show their high accuracy and feasibility for employment in 5G/6G systems.
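The final step of both approaches — mapping an estimated SINR onto a CQI index — is a monotone threshold lookup. A minimal sketch is shown below; note the thresholds are evenly spaced illustrative values, not the BLER-derived tables a real gNB/UE would use:

```python
import bisect

# Illustrative SINR-to-CQI lookup: 15 CQI levels with evenly spaced SINR
# thresholds in dB (assumed values, for demonstration only).
CQI_THRESHOLDS_DB = [-6.0 + 2.0 * i for i in range(15)]   # -6, -4, ..., 22

def sinr_to_cqi(sinr_db):
    """Map an (estimated, delay-compensated) SINR in dB to a CQI index
    in 0..15; 0 means out of range."""
    return bisect.bisect_right(CQI_THRESHOLDS_DB, sinr_db)
```

The papers' contribution is in producing a better `sinr_db` estimate despite feedback delay; the lookup itself stays unchanged, which is why no gNB-side signalling changes are needed.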
The rise of 6G networks and the exponential growth of smart city infrastructures demand intelligent, real-time traffic forecasting solutions that preserve data privacy. This paper introduces NeuroCivitas, a federated deep learning framework that integrates Convolutional Neural Networks (CNNs) for spatial pattern recognition and Long Short-Term Memory (LSTM) networks for temporal sequence modeling. Designed to meet the adaptive intelligence requirements of cognitive cities, NeuroCivitas leverages Federated Averaging (FedAvg) to ensure privacy-preserving training while significantly reducing communication overhead, by 98.7% compared to centralized models. The model is evaluated using the Kaggle Traffic Prediction Dataset comprising 48,120 hourly records from four urban junctions. It achieves an RMSE of 2.76, an MAE of 2.11, and an R^(2) score of 0.91, outperforming baseline models such as ARIMA, Linear Regression, Random Forest, and non-federated CNN-LSTM in both accuracy and scalability. Junction-wise and time-based performance analyses further validate its robustness, particularly during off-peak hours, while highlighting challenges in peak traffic forecasting. Ablation studies confirm the importance of both the CNN and LSTM layers and of temporal feature engineering in achieving optimal performance. Moreover, NeuroCivitas demonstrates stable convergence within 25 communication rounds and shows strong adaptability to non-IID data distributions. The framework is built with real-world deployment in mind, offering support for edge environments through a lightweight architecture and the potential for enhancement with differential privacy and adversarial defense mechanisms. SHAP-based explainability is integrated to improve interpretability for stakeholders. In sum, NeuroCivitas delivers an accurate, scalable, and privacy-preserving traffic forecasting solution tailored for 6G cognitive cities. Future extensions will incorporate fairness-aware optimization, real-time anomaly adaptation, multi-city validation, and advanced federated GNN comparisons to further enhance deployment readiness and societal impact.
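The three headline metrics (RMSE, MAE, R^(2)) are standard regression measures and can be computed as below; the example values are toy numbers, not the paper's results:

```python
import math

def regression_metrics(y_true, y_pred):
    """RMSE, MAE and R^2, the metrics used to evaluate the forecaster."""
    n = len(y_true)
    errors = [t - p for t, p in zip(y_true, y_pred)]
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    mae = sum(abs(e) for e in errors) / n
    mean_t = sum(y_true) / n
    ss_res = sum(e * e for e in errors)
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    r2 = 1.0 - ss_res / ss_tot          # 1.0 = perfect fit
    return rmse, mae, r2

rmse, mae, r2 = regression_metrics([1.0, 2.0, 3.0], [1.0, 2.0, 4.0])
```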
The rapid evolution of wireless technologies and the advent of 6G networks present new challenges and opportunities for Internet of Things (IoT) applications, particularly in terms of ultra-reliable, secure, and energy-efficient communication. This study explores the integration of Reconfigurable Intelligent Surfaces (RIS) into IoT networks to enhance communication performance. Unlike traditional passive reflector-based approaches, RIS is leveraged as an active optimization tool to improve both backscatter and direct communication modes, addressing critical IoT challenges such as energy efficiency, limited communication range, and double-fading effects in backscatter communication. We propose a novel computational framework that combines RIS functionality with Physical Layer Security (PLS) mechanisms, optimized through the Deep Deterministic Policy Gradient (DDPG) algorithm. This framework adaptively adjusts RIS configurations and transmitter beamforming to mitigate key challenges, including imperfect channel state information (CSI) and hardware limitations such as quantized RIS phase shifts. By optimizing both RIS settings and beamforming in real time, our approach outperforms traditional methods by significantly increasing secrecy rates, improving spectral efficiency, and enhancing energy efficiency. Notably, this framework adapts more effectively to the dynamic nature of wireless channels than conventional optimization techniques, providing scalable solutions for large-scale RIS deployments. Our results demonstrate substantial improvements in communication performance, setting a new benchmark for secure, efficient, and scalable 6G communication. This work offers valuable insights for the future of IoT networks, with a focus on computational optimization, high spectral efficiency, and energy-aware operations.
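The quantized-phase-shift limitation mentioned above can be made concrete with a simple baseline: align each RIS element's reflected path with the direct path, then round to the nearest of 2^bits hardware-supported phase levels. This is a hand-crafted sketch of the problem setting (the paper instead learns the configuration with DDPG); the channels below are made-up complex gains:

```python
import cmath
import math

def quantized_ris_phases(h_direct, g_cascaded, bits=2):
    """Align each RIS element's reflected path with the direct path,
    then round to the nearest of 2**bits discrete phase shifts."""
    step = 2.0 * math.pi / (2 ** bits)
    target = cmath.phase(h_direct)
    phases = []
    for g in g_cascaded:
        ideal = target - cmath.phase(g)     # continuous optimum
        phases.append(round(ideal / step) * step)
    return phases

def combined_gain(h_direct, g_cascaded, phases):
    """Magnitude of the direct path plus phase-shifted reflected paths."""
    total = h_direct + sum(
        g * cmath.exp(1j * p) for g, p in zip(g_cascaded, phases)
    )
    return abs(total)

h_d = 0.8 + 0.3j                               # toy direct channel
g = [0.2 - 0.5j, -0.4 + 0.1j, 0.3 + 0.3j]      # toy cascaded channels
phases = quantized_ris_phases(h_d, g)
gain_opt = combined_gain(h_d, g, phases)
gain_off = combined_gain(h_d, g, [0.0] * len(g))
```

Even with only 2-bit phases, every reflected term lands within π/4 of the direct path's phase, so the combined gain is guaranteed to exceed the direct-only channel; a learned policy has to beat this baseline while also handling imperfect CSI.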
6G IoT networks aim to provide significantly higher data rates and extremely lower latency. However, due to increasingly scarce spectrum bands and the ever-growing massive number of IoT devices (IoDs) deployed, 6G IoT networks face two critical challenges, i.e., energy limitation and severe signal attenuation. Simultaneous wireless information and power transfer (SWIPT) and cooperative relaying provide effective ways to address these two challenges. In this paper, we investigate the energy self-sustainability (ESS) of 6G IoT networks and propose an OFDM-based bidirectional multi-relay SWIPT strategy for 6G IoT networks. In the proposed strategy, the transmission process is equally divided into two phases. Specifically, in phase 1, two source nodes transmit their signals to relay nodes, which then use different subcarrier sets to decode information and harvest energy, respectively. In phase 2, the relay nodes forward the signals to the corresponding destination nodes with the harvested energy. We maximize the weighted sum transmission rate by optimizing subcarrier and power allocation. Our proposed strategy achieves a larger weighted sum transmission rate compared with the benchmark scheme.
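The phase-1 split between information-decoding and energy-harvesting subcarriers can be sketched with a simple greedy rule: decode on the strongest subcarriers, harvest on the rest. The per-subcarrier power `p_sub`, harvester efficiency `eta`, and the strongest-first rule are illustrative assumptions, not the paper's optimized allocation:

```python
import math

def split_subcarriers(gains, n_info, p_sub=1.0, eta=0.6):
    """Phase-1 SWIPT sketch at a relay: decode information on the n_info
    strongest subcarriers, harvest energy on the remaining ones (the
    harvested energy then powers phase-2 forwarding)."""
    order = sorted(range(len(gains)), key=lambda i: gains[i], reverse=True)
    info_set = set(order[:n_info])
    rate = sum(math.log2(1.0 + p_sub * gains[i]) for i in info_set)
    harvested = sum(eta * p_sub * gains[i]
                    for i in range(len(gains)) if i not in info_set)
    return info_set, rate, harvested

gains = [0.2, 1.5, 0.9, 2.4, 0.4, 1.1]     # toy subcarrier channel gains
info_set, rate, harvested = split_subcarriers(gains, n_info=3)
```

Moving a subcarrier between the two sets trades decoded rate against harvested energy, which is exactly the coupling the paper's joint subcarrier-and-power optimization resolves.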
Intelligence and perception are two operative technologies in 6G scenarios. Intelligent wireless networks and information perception require a deep fusion of artificial intelligence (AI) and wireless communications in 6G systems. Therefore, fusion is becoming a typical feature and key challenge of 6G wireless communication systems. In this paper, we focus on the critical issues and propose three application scenarios in 6G wireless systems. Specifically, we first discuss the fusion of AI and 6G networks for the enhancement of 5G-advanced technology and future wireless communication systems. Then, we introduce a wireless AI technology architecture with 6G multi-dimensional information perception, which includes physical layer technology for multi-dimensional feature information perception, full spectrum fusion technology, and intelligent wireless resource management. The discussion of key technologies for intelligent 6G wireless networks is expected to provide a guideline for future research.
Funding (EWave Net paper): supported by the National Natural Science Foundation of China (Nos. 92167202, 61925102, U21B2014, 62101069) and the National Key R&D Program of China (No. 2020YFB1805002).
Funding (ultra-dense satellite-enabled 6G networks paper): supported in part by the National Key R&D Program of China (2020YFB1806103), the National Natural Science Foundation of China under Grants 62225103 and U22B2003, the Beijing Natural Science Foundation (L212004), and the China University Industry-University-Research Collaborative Innovation Fund (2021FNA05001).
Funding (probabilistic trust model paper): funded by the Comunidad de Madrid within the framework of the Multiannual Agreement with Universidad Politécnica de Madrid to encourage research by young doctors (PRINCE project).
Funding (H-NOMA-MIMO-MEC paper): supported by the Republic of Turkey Ministry of National Education.
Funding: Supported by National Key Research and Development Project 2018YFE0205503 and the Beijing University of Posts and Telecommunications–China Mobile Research Institute Joint Innovation Center.
Abstract: The Sixth Generation (6G) wireless communication network is expected to provide global coverage, enhanced spectral efficiency, and AI (Artificial Intelligence)-native intelligence. To meet these requirements, the computational concept of cognitive Decision-Making, an implementation framework adaptable to foreseen innovations in networks and services, and empirical evaluations are key techniques for guaranteeing the generation-agnostic intelligence evolution of wireless communication networks. In this paper, we propose an Intelligent Decision Making (IDM) framework, acting as the network brain and based on the Reinforcement Learning modelling philosophy, to endow the 6G network with the capability of autonomous intelligence evolution. Usage scenarios and simulations demonstrate the generality and efficiency of IDM. We hope that some of the ideas behind IDM will cast the research of 6G networks in a new or different light.
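The reinforcement-learning feedback loop at the heart of IDM can be illustrated, in heavily simplified form, by a one-state action-value learner (all parameters and the reward model are illustrative, not the paper's):

```python
import numpy as np

def q_learning_bandit(rewards, episodes=500, alpha=0.1, eps=0.1, seed=0):
    """Minimal reinforcement-learning decision loop over a one-state MDP:
    the agent learns action values from reward feedback with an
    epsilon-greedy policy and a temporal-difference update."""
    rng = np.random.default_rng(seed)
    q = np.zeros(len(rewards))
    for _ in range(episodes):
        if rng.random() < eps:                 # explore a random action
            a = int(rng.integers(len(rewards)))
        else:                                  # exploit the best known action
            a = int(np.argmax(q))
        r = rewards[a] + 0.1 * rng.standard_normal()  # noisy reward feedback
        q[a] += alpha * (r - q[a])             # TD update toward the sample
    return q

# Three candidate network decisions with mean rewards 1.0, 0.2, 0.5.
q = q_learning_bandit([1.0, 0.2, 0.5])
```

The real IDM framework operates over full network states and actions, but the close-the-loop principle (act, observe reward, update the decision policy) is the same.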
Abstract: The forthcoming sixth generation (6G) of mobile communication networks is envisioned to be AI-native, supporting intelligent services and pervasive computing at unprecedented scale. Among the key paradigms enabling this vision, Federated Learning (FL) has gained prominence as a distributed machine learning framework that allows multiple devices to collaboratively train models without sharing raw data, thereby preserving privacy and reducing the need for centralized storage. This capability is particularly attractive for vision-based applications, where image and video data are both sensitive and bandwidth-intensive. However, the integration of FL with 6G networks presents unique challenges, including communication bottlenecks, device heterogeneity, and trade-offs between model accuracy, latency, and energy consumption. In this paper, we develop a simulation-based framework to investigate the performance of FL in representative vision tasks under 6G-like environments. We formalize the system model, incorporating both the federated averaging (FedAvg) training process and a simplified communication cost model that captures bandwidth constraints, packet loss, and variable latency across edge devices. Using standard image datasets (e.g., MNIST, CIFAR-10) as benchmarks, we analyze how factors such as the number of participating clients, the degree of data heterogeneity, and communication frequency influence convergence speed and model accuracy. Additionally, we evaluate the effectiveness of lightweight communication-efficient strategies, including local update tuning and gradient compression, in mitigating network overhead. The experimental results reveal several key insights: (i) communication limitations can significantly degrade FL convergence in vision tasks if not properly addressed; (ii) judicious tuning of local training epochs and client participation levels enables notable improvements in both efficiency and accuracy; and (iii) communication-efficient FL strategies provide a promising pathway to balance performance with the stringent latency and reliability requirements expected in 6G. These findings highlight the synergistic role of AI and next-generation networks in enabling privacy-preserving, real-time vision applications, and they provide concrete design guidelines for researchers and practitioners working at the intersection of FL and 6G.
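The FedAvg aggregation step that this simulation framework builds on can be sketched in a few lines of numpy (the communication-cost model is not reproduced here):

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """FedAvg aggregation: the global model is the dataset-size-weighted
    average of the client model parameters."""
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()             # each client's mixing weight
    stacked = np.stack([np.asarray(w, dtype=float) for w in client_weights])
    return np.tensordot(coeffs, stacked, axes=1)

# Two clients: the one holding 3x more data pulls the average toward it.
w_global = fedavg([[1.0, 0.0], [3.0, 2.0]], client_sizes=[1, 3])
```

Weighting by dataset size is what makes FedAvg equivalent to centralized SGD on the pooled data in the IID, one-local-step case; with non-IID data and many local epochs the averaged model drifts, which is exactly the heterogeneity effect the paper studies.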
Funding: Supported in part by the National Key R&D Program of China under Grant 2024YFE0200700, and in part by the National Natural Science Foundation of China under Grant 62201504.
Abstract: Network architectures assisted by Generative Artificial Intelligence (GAI) are envisioned as foundational elements of the sixth-generation (6G) communication system. To deliver ubiquitous intelligent services and meet diverse service requirements, the 6G network architecture should offer personalized services to various mobile devices. Federated learning (FL) with personalized local training, as a privacy-preserving machine learning (ML) approach, can be applied to address these challenges. In this paper, we propose a meta-learning-based personalized FL (PFL) method that improves both communication and computation efficiency by utilizing over-the-air computation. Its "pretraining-and-fine-tuning" principle makes it particularly suitable for enabling edge nodes to access personalized GAI services while preserving local privacy. Experimental results demonstrate the superior performance and efficacy of the proposed algorithm, and notably indicate enhanced communication efficiency without compromising accuracy.
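The "pretraining-and-fine-tuning" principle can be illustrated with a toy linear-regression example: a shared model is fit on pooled data, then a client personalizes it with a few local gradient steps. This sketch omits the over-the-air aggregation and the meta-learning outer loop, and all names and data are illustrative:

```python
import numpy as np

def mse_grad(w, X, y):
    """Gradient of the mean-squared-error loss for a linear model."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

def fine_tune(w, X, y, steps=50, lr=0.1):
    """A few local gradient steps personalize the shared model."""
    w = w.copy()
    for _ in range(steps):
        w -= lr * mse_grad(w, X, y)
    return w

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
# Two clients whose ground-truth regressors differ (data heterogeneity).
w_a, w_b = np.array([1.0, 0.0, -1.0]), np.array([0.0, 1.0, 1.0])
ya, yb = X @ w_a, X @ w_b
# "Pretraining": one shared model fit on the pooled data of both clients.
w_shared = np.linalg.lstsq(np.vstack([X, X]),
                           np.concatenate([ya, yb]), rcond=None)[0]
# "Fine-tuning": client A personalizes the shared model locally.
w_pers = fine_tune(w_shared, X, ya)
loss = lambda w, y: np.mean((X @ w - y) ** 2)
```

The shared model lands roughly halfway between the two clients' optima, while the personalized model fits client A's data almost exactly, which is the efficiency/accuracy trade-off PFL exploits.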
Funding: The authors extend their appreciation to the Deanship of Scientific Research at King Khalid University for funding this work under Grant Number (RGP 1/279/42). www.kku.edu.sa.
Abstract: The sixth-generation (6G) wireless communication networks are anticipated to integrate aerial, terrestrial, and maritime communication into a robust system that meets trustworthy, fast, and low-latency requirements, enabling maximum throughput and acceptable delay for several applications. Besides, the evolution of 6G leads to the design of unmanned aerial vehicles (UAVs) that provide inexpensive and effective solutions in various application areas such as healthcare and environment monitoring. In UAV networks, effective data collection under a restricted energy capacity poses a major obstacle to achieving high-quality network communication. This can be addressed by the use of clustering techniques for UAVs in 6G networks. In this regard, this study develops a novel metaheuristic-based energy-efficient data gathering scheme for clustered unmanned aerial vehicles (MEEDG-CUAV). The proposed MEEDG-CUAV technique aims to partition the UAV network into various clusters and assign a cluster head (CH) to each, so as to reduce the overall energy utilization. Besides, the quantum chaotic butterfly optimization algorithm (QCBOA) with a fitness function is derived to choose CHs and construct clusters. The experimental validation of the MEEDG-CUAV technique was carried out on a benchmark dataset, and the results highlighted its better performance over other state-of-the-art techniques in terms of different measures.
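The abstract does not spell out the QCBOA fitness function, so the following is only a plausible sketch of an energy-and-distance-based CH score with assumed weights: nodes with high residual energy and a small average distance to their peers score higher.

```python
import numpy as np

def cluster_head_fitness(energy, positions, w_energy=0.6, w_dist=0.4):
    """Illustrative CH fitness (the paper's exact QCBOA fitness is not
    reproduced here): reward high residual energy and penalize large
    average distance to the other nodes in the cluster."""
    positions = np.asarray(positions, dtype=float)
    energy = np.asarray(energy, dtype=float)
    # Pairwise distances between all nodes in the cluster.
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    avg_dist = d.sum(axis=1) / (len(positions) - 1)
    e_term = energy / energy.max()               # normalized residual energy
    d_term = 1.0 - avg_dist / avg_dist.max()     # closeness to the cluster
    return w_energy * e_term + w_dist * d_term

# Four UAVs: the energetic, centrally placed node wins the CH role.
energy = [0.9, 0.5, 0.4, 0.8]
positions = [[0, 0], [10, 0], [0, 10], [1, 1]]
ch = int(np.argmax(cluster_head_fitness(energy, positions)))
```

In the actual scheme, QCBOA searches over candidate CH assignments to maximize such a fitness rather than scoring a fixed cluster once.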
Funding: This study was supported by the Fujitsu-Waseda Digital Annealer (FWDA) Research Project and the Fujitsu Co-Creation Research Laboratory at Waseda University (joint research between Waseda University and Fujitsu Lab). The study was also partly supported by the School of Fundamental Science and Engineering, Faculty of Science and Engineering, Waseda University, Japan; by the Institute for Information & Communications Technology Planning & Evaluation (IITP) Grant funded by the Korean government (MSIT) (No. 2019-0-01343, Training Key Talents in Industrial Convergence Security); and by Research Cluster Project R20143 of the Zayed University Research Office.
Abstract: The Internet of Things (IoT) is the fourth technological revolution in the global information industry, after computers, the Internet, and mobile communication networks. It combines radio-frequency identification devices, infrared sensors, global positioning systems, and various other technologies. Information sensing equipment is connected via the Internet, thus forming a vast network. When these physical devices are connected to the Internet, the user terminal can be extended and expanded to exchange information, communicate with anything, and carry out identification, positioning, tracking, monitoring, and triggering of corresponding events on each device in the network. In real life, the IoT has a wide range of applications covering many fields, such as smart homes, smart logistics, fine agriculture and animal husbandry, national defense, and the military. One of the most significant impairments in wireless channels is interference, which degrades system performance. Although the existing QR-decomposition-based signal detection method is an emerging topic because of its low complexity, it does not solve the problem of poor detection performance. Therefore, this study proposes a maximum-likelihood-based QR decomposition algorithm. The main idea is to detect the initial layer using the maximum likelihood principle, after which the remaining layers are detected using reliable decisions. In unreliable scenarios, the optimal candidate is selected from the feedback by deploying candidate points. Simulation results show that the proposed algorithm effectively reduces the interference and propagation error compared with the algorithms reported in the literature.
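The baseline QR-decomposition detection that the proposed algorithm improves upon (without the maximum-likelihood first layer and candidate feedback) can be sketched for a real-valued toy channel as follows:

```python
import numpy as np

def qr_sic_detect(H, y, constellation=np.array([-1.0, 1.0])):
    """QR-decomposition-based successive detection for y = Hx + n:
    with H = QR, z = Q^H y is upper triangular in x, so layers are
    detected from the bottom row up, cancelling already-decided
    symbols and slicing to the nearest constellation point."""
    Q, R = np.linalg.qr(H)
    z = Q.conj().T @ y
    n = H.shape[1]
    x_hat = np.zeros(n)
    for k in range(n - 1, -1, -1):
        # Cancel interference from the symbols already decided below.
        residual = z[k] - R[k, k + 1:] @ x_hat[k + 1:]
        estimate = residual / R[k, k]
        x_hat[k] = constellation[np.argmin(np.abs(constellation - estimate))]
    return x_hat

# Noiseless 2x2 toy channel with BPSK symbols: recovery is exact.
H = np.array([[1.0, 0.4], [0.3, 1.0]])
x = np.array([1.0, -1.0])
x_hat = qr_sic_detect(H, H @ x)
```

The weakness the paper targets is visible in this loop: a wrong decision at the first detected layer propagates into every later cancellation step, which motivates detecting that layer with the maximum likelihood principle instead.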
Funding: Supported by Motorola Mobility, the National Council for Scientific and Technological Development (No. 433142/2018-9), a Research Productivity Fellowship (No. 312831/2020-0), and the Pernambuco Research Foundation (FACEPE).
Abstract: 5G networks apply adaptive modulation and coding according to the channel condition reported by the user in order to maintain mobile communication quality. However, the delay incurred by the feedback may make the channel quality indicator (CQI) obsolete. This paper addresses this issue by proposing two approaches, one based on machine learning and another on evolutionary computing, which consider the user context and the signal-to-interference-plus-noise ratio (SINR), besides the delay length, to estimate the updated SINR to be mapped into a CQI value. Our proposals are designed to run at the user equipment (UE) side, neither requiring any change in the signalling between the base station (gNB) and the UE nor overloading the gNB. They are evaluated in terms of mean squared error on 5G network simulation data, and the results show their high accuracy and feasibility for deployment in 5G/6G systems.
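A minimal sketch of the stale-CQI problem, with plain least squares standing in for the paper's ML and evolutionary estimators: predicting the current SINR from a delayed report beats using the stale value directly. The CQI thresholds and the AR(1) channel model are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def sinr_to_cqi(sinr_db, thresholds=np.arange(-6.0, 24.0, 2.0)):
    """Map an SINR (dB) to a CQI index 0..15 via illustrative thresholds
    (the real 5G mapping is vendor- and BLER-target-dependent)."""
    return int(np.searchsorted(thresholds, sinr_db))

rng = np.random.default_rng(2)
rho, delay = 0.9, 4
sinr = np.zeros(600)
for t in range(1, 600):                  # AR(1) channel-quality process
    sinr[t] = rho * sinr[t - 1] + rng.standard_normal()
stale, current = sinr[:-delay], sinr[delay:]
# Fit "current SINR ~ a * stale SINR + b" and compare against
# the naive policy of reporting the stale value unchanged.
A = np.column_stack([stale, np.ones_like(stale)])
coef, *_ = np.linalg.lstsq(A, current, rcond=None)
mse_pred = np.mean((A @ coef - current) ** 2)
mse_stale = np.mean((stale - current) ** 2)
```

The paper's estimators additionally condition on user context (e.g., mobility), which a pure autoregressive fit like this cannot capture.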
Abstract: The rise of 6G networks and the exponential growth of smart city infrastructures demand intelligent, real-time traffic forecasting solutions that preserve data privacy. This paper introduces NeuroCivitas, a federated deep learning framework that integrates Convolutional Neural Networks (CNNs) for spatial pattern recognition and Long Short-Term Memory (LSTM) networks for temporal sequence modeling. Designed to meet the adaptive intelligence requirements of cognitive cities, NeuroCivitas leverages Federated Averaging (FedAvg) to ensure privacy-preserving training while significantly reducing communication overhead, by 98.7% compared to centralized models. The model is evaluated on the Kaggle Traffic Prediction Dataset comprising 48,120 hourly records from four urban junctions. It achieves an RMSE of 2.76, an MAE of 2.11, and an R^2 score of 0.91, outperforming baseline models such as ARIMA, Linear Regression, Random Forest, and a non-federated CNN-LSTM in both accuracy and scalability. Junction-wise and time-based performance analyses further validate its robustness, particularly during off-peak hours, while highlighting challenges in peak traffic forecasting. Ablation studies confirm the importance of both the CNN and LSTM layers and of temporal feature engineering in achieving optimal performance. Moreover, NeuroCivitas demonstrates stable convergence within 25 communication rounds and shows strong adaptability to non-IID data distributions. The framework is built with real-world deployment in mind, offering support for edge environments through a lightweight architecture and the potential for enhancement with differential privacy and adversarial defense mechanisms. SHAP-based explainability is integrated to improve interpretability for stakeholders. In sum, NeuroCivitas delivers an accurate, scalable, and privacy-preserving traffic forecasting solution tailored for 6G cognitive cities. Future extensions will incorporate fairness-aware optimization, real-time anomaly adaptation, multi-city validation, and advanced federated GNN comparisons to further enhance deployment readiness and societal impact.
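The reported scores (RMSE, MAE, R^2) follow the standard regression definitions, which can be computed as:

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """RMSE, MAE and R^2 of a forecast against ground truth."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    rmse = np.sqrt(np.mean(err ** 2))            # root mean squared error
    mae = np.mean(np.abs(err))                   # mean absolute error
    ss_res = np.sum(err ** 2)                    # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot                   # coefficient of determination
    return rmse, mae, r2
```

Note that RMSE penalizes large peak-hour misses more heavily than MAE, which is consistent with the paper's observation that peak traffic is the harder forecasting regime.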
Funding: Funded by the Deanship of Scientific Research (DSR), King Abdulaziz University, Jeddah, under Grant No. (G-1436-611-225).
Abstract: The rapid evolution of wireless technologies and the advent of 6G networks present new challenges and opportunities for Internet of Things (IoT) applications, particularly in terms of ultra-reliable, secure, and energy-efficient communication. This study explores the integration of Reconfigurable Intelligent Surfaces (RIS) into IoT networks to enhance communication performance. Unlike traditional passive reflector-based approaches, RIS is leveraged as an active optimization tool to improve both backscatter and direct communication modes, addressing critical IoT challenges such as energy efficiency, limited communication range, and double-fading effects in backscatter communication. We propose a novel computational framework that combines RIS functionality with Physical Layer Security (PLS) mechanisms, optimized through the Deep Deterministic Policy Gradient (DDPG) algorithm. This framework adaptively adjusts RIS configurations and transmitter beamforming to mitigate key challenges, including imperfect channel state information (CSI) and hardware limitations such as quantized RIS phase shifts. By optimizing both RIS settings and beamforming in real time, our approach outperforms traditional methods, significantly increasing secrecy rates, improving spectral efficiency, and enhancing energy efficiency. Notably, this framework adapts more effectively to the dynamic nature of wireless channels than conventional optimization techniques, providing scalable solutions for large-scale RIS deployments. Our results demonstrate substantial improvements in communication performance, setting a new benchmark for secure, efficient, and scalable 6G communication. This work offers valuable insights for the future of IoT networks, with a focus on computational optimization, high spectral efficiency, and energy-aware operation.
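For intuition on why RIS phase control helps, here is the idealized continuous-phase co-phasing baseline (perfect CSI, no quantization), which the DDPG-based method generalizes beyond: each element's phase is chosen so its reflected path adds coherently with the direct path.

```python
import numpy as np

def ris_cophase(h_direct, g, h):
    """Closed-form RIS phase alignment for a single-antenna link:
    pick theta_i so that g_i * e^{j theta_i} * h_i has the same phase
    as the direct channel, making all paths add coherently."""
    theta = np.angle(h_direct) - np.angle(g * h)
    combined = h_direct + np.sum(g * np.exp(1j * theta) * h)
    return theta, np.abs(combined)

rng = np.random.default_rng(3)
n = 16                                   # number of RIS elements
h_d = rng.standard_normal() + 1j * rng.standard_normal()       # direct path
g = rng.standard_normal(n) + 1j * rng.standard_normal(n)       # Tx-to-RIS
h = rng.standard_normal(n) + 1j * rng.standard_normal(n)       # RIS-to-Rx
theta, gain = ris_cophase(h_d, g, h)
```

Co-phasing achieves the channel-magnitude upper bound |h_d| + sum_i |g_i||h_i|; quantized phases, imperfect CSI, and secrecy constraints break this closed form, which is precisely where the learned DDPG policy takes over.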
Funding: This work was supported by the China National Science Foundation under Grant No. 61871348; by the University Key Laboratory of Advanced Wireless Communications of Guangdong Province; by a project funded by the China Postdoctoral Science Foundation under Grant 2019T120531; by the Science and Technology Development Fund, Macao, China under Grant 0162/2019/A3; and by the Fundamental Research Funds for the Provincial Universities of Zhejiang under Grant RFA2019001.
Abstract: 6G IoT networks aim to provide significantly higher data rates and extremely low latency. However, due to increasingly scarce spectrum bands and the ever-growing massive number of IoT devices (IoDs) deployed, 6G IoT networks face two critical challenges: energy limitation and severe signal attenuation. Simultaneous wireless information and power transfer (SWIPT) and cooperative relaying provide effective ways to address these two challenges. In this paper, we investigate the energy self-sustainability (ESS) of 6G IoT networks and propose an OFDM-based bidirectional multi-relay SWIPT strategy for them. In the proposed strategy, the transmission process is equally divided into two phases. Specifically, in phase 1, two source nodes transmit their signals to relay nodes, which then use different subcarrier sets to decode information and harvest energy, respectively. In phase 2, the relay nodes forward the signals to the corresponding destination nodes using the harvested energy. We maximize the weighted sum transmission rate by optimizing the subcarrier and power allocation. The proposed strategy achieves a larger weighted sum transmission rate than the benchmark scheme.
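The power-allocation step of such a rate-maximization problem is classically solved by water-filling; a sketch over parallel subcarriers, ignoring the SWIPT and relaying constraints of the actual formulation:

```python
import numpy as np

def water_filling(gains, total_power, tol=1e-10):
    """Water-filling power allocation over subcarriers: p_k =
    max(0, mu - 1/g_k), with the water level mu found by bisection
    so that the allocated powers exactly meet the total budget."""
    g = np.asarray(gains, dtype=float)
    lo, hi = 0.0, total_power + (1.0 / g).max()   # bracket the water level
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(0.0, mu - 1.0 / g).sum() > total_power:
            hi = mu                               # level too high
        else:
            lo = mu                               # level too low
    return np.maximum(0.0, 0.5 * (lo + hi) - 1.0 / g)

# Three subcarriers with gains 1.0, 0.5, 0.1 and a power budget of 2:
# the worst subcarrier is switched off entirely.
p = water_filling([1.0, 0.5, 0.1], total_power=2.0)
```

In the paper's joint design, the subcarrier sets for information decoding and energy harvesting are optimized together with this power split, coupling the two phases through the harvested-energy budget.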
Abstract: Intelligence and perception are two operative technologies in 6G scenarios. Intelligent wireless networks and information perception require a deep fusion of artificial intelligence (AI) and wireless communications in 6G systems. Therefore, this fusion is becoming a typical feature and key challenge of 6G wireless communication systems. In this paper, we focus on the critical issues and propose three application scenarios for 6G wireless systems. Specifically, we first discuss the fusion of AI and 6G networks for the enhancement of 5G-Advanced technology and future wireless communication systems. Then, we introduce a wireless AI technology architecture with 6G multi-dimensional information perception, which includes physical-layer technology for multi-dimensional feature information perception, full-spectrum fusion technology, and intelligent wireless resource management. This discussion of key technologies for intelligent 6G wireless networks is expected to provide a guideline for future research.