Satellite Internet (SI) provides broadband access as a critical information infrastructure in 6G. However, with the integration of the terrestrial Internet, the influx of massive terrestrial traffic will bring significant threats to SI, among which DDoS attacks will intensify the erosion of limited bandwidth resources. Therefore, this paper proposes a DDoS attack tracking scheme using a multi-round iterative Viterbi algorithm to achieve high-accuracy attack path reconstruction and fast internal source locking, protecting SI from the source. Firstly, to reduce communication overhead, the logarithmic representation of the traffic volume is added to the digests after modeling SI, generating a lightweight deviation degree to construct the observation probability matrix for the Viterbi algorithm. Secondly, the path node matrix is expanded to multi-index matrices in the Viterbi algorithm to store index information for all probability values, deriving the non-repeating, maximum-probability path. Finally, multiple rounds of iterative Viterbi tracking are performed locally to track DDoS attacks based on trimming the tracking results. Simulation and experimental results show that the scheme achieves 96.8% tracking accuracy for external and internal DDoS attacks within 2.5 seconds, with a communication overhead of 268 KB/s, effectively protecting the limited bandwidth resources of SI.
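The scheme builds on the classical Viterbi recursion, which the paper extends with multi-index path matrices and multi-round trimming. As a point of reference only, the minimal sketch below shows the standard single-path Viterbi dynamic program over hypothetical transition and observation matrices; the deviation-degree observation model and the multi-index extension described above are not reproduced.

import numpy as np

def viterbi(obs, pi, A, B):
    # obs: observation indices; pi: initial state probabilities
    # A: state transition matrix; B: observation probability matrix
    n_states, T = len(pi), len(obs)
    delta = np.zeros((T, n_states))                 # best path probability ending in each state
    psi = np.zeros((T, n_states), dtype=int)        # back-pointers (the index info the paper generalises)
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        for j in range(n_states):
            scores = delta[t - 1] * A[:, j]
            psi[t, j] = np.argmax(scores)
            delta[t, j] = scores[psi[t, j]] * B[j, obs[t]]
    # backtrack the single most probable state sequence
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]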
The adoption of 5G for Railways (5G-R) is expanding, particularly in high-speed trains, due to the benefits offered by 5G technology. High-speed trains must provide seamless connectivity and Quality of Service (QoS) to ensure passengers have a satisfactory experience throughout their journey. Installing base stations in urban environments can improve coverage but can dramatically degrade the user experience due to interference. In particular, when a user with a mobile phone is a passenger in a high-speed train traversing between urban centres, the coverage and the 5G resources in general need to be adequate so as not to diminish her experience of the service. The utilization of macro, pico, and femto cells may optimize the utilization of 5G resources. In this paper, a Genetic Algorithm (GA)-based approach to address the challenges of 5G network planning for 5G-R services is presented. The network is divided into three cell types (macro, pico, and femto cells), and the optimization process is designed to achieve a balance between key objectives: providing comprehensive coverage, minimizing interference, and maximizing energy efficiency. The study focuses on environments with high user density, such as high-speed trains, where reliable and high-quality connectivity is critical. Through simulations, the effectiveness of the GA-driven framework in optimizing coverage and performance in such scenarios is demonstrated. The algorithm is compared with the Particle Swarm Optimisation (PSO) and Simulated Annealing (SA) methods, and interesting insights emerged. The GA offers a strong balance between coverage and efficiency, achieving significantly higher coverage than PSO while maintaining competitive energy efficiency and interference levels. Its steady fitness improvement and adaptability make it well suited for scenarios where wide coverage is a priority alongside acceptable performance trade-offs.
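A minimal sketch of the kind of GA loop such planning studies typically use, with a hypothetical fitness that weights coverage, interference, and energy terms. The weights, the on/off site encoding, and the placeholder evaluation functions are illustrative assumptions, not the paper's model.

import random

def estimate_coverage(genome):
    # placeholder: more active sites give more coverage, with diminishing returns
    return 1.0 - 0.5 ** sum(genome)

def estimate_interference(genome):
    # placeholder: co-channel interference grows with the density of active sites
    return (sum(genome) / len(genome)) ** 2

def fitness(genome, w_cov=1.0, w_int=0.5, w_eng=0.3):
    # weighted trade-off between coverage, interference, and energy (weights are illustrative)
    energy = sum(genome) / len(genome)
    return w_cov * estimate_coverage(genome) - w_int * estimate_interference(genome) - w_eng * energy

def plan(n_sites=20, pop_size=40, generations=100, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_sites)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                  # keep the fitter half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_sites)             # single-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_sites)] ^= 1          # single-gene mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)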
The rapid advancement of 6G communication technologies and generative artificial intelligence (AI) is catalyzing a new wave of innovation at the intersection of networking and intelligent computing. On the one hand, 6G envisions a hyper-connected environment that supports ubiquitous intelligence through ultra-low latency, high throughput, massive device connectivity, and integrated sensing and communication. On the other hand, generative AI, powered by large foundation models, has emerged as a powerful paradigm capable of creating new content.
In the current 4th generation (4G) communication network, base stations transmitting on the same frequency cause serious interference among adjacent cells, and information transmission is susceptible to impairments such as channel multipath fading and the occlusion effect. Effectively detecting spectrum signals under a low signal-to-noise ratio (SNR) directly affects the overall performance of the wireless communication network system. This paper designs an energy signal detection algorithm based on stochastic resonance technology, which transforms noise energy into useful signal energy and improves the output SNR. The energy signal detection algorithm provides effective detection of signals under low SNR and improves the performance of the whole communication system.
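For orientation only, the sketch below implements a plain energy detector (the test statistic is the average received power, compared with a threshold calibrated from noise-only trials). The stochastic-resonance pre-processing stage that the paper places in front of this decision step is not reproduced, and the calibration procedure shown is a simple Monte-Carlo stand-in for the usual closed-form chi-square threshold.

import numpy as np

def energy_detect(x, threshold):
    # classical energy detector: declare "signal present" when the average power exceeds the threshold
    statistic = float(np.mean(np.abs(x) ** 2))
    return statistic > threshold, statistic

def calibrate_threshold(noise_power, n_samples, pfa=0.01, trials=20000, rng=None):
    # set the threshold empirically from noise-only trials so that roughly a fraction pfa of them exceed it
    rng = rng or np.random.default_rng(0)
    noise = rng.normal(0.0, np.sqrt(noise_power), (trials, n_samples))
    stats = np.mean(noise ** 2, axis=1)
    return float(np.quantile(stats, 1.0 - pfa))

# usage: thr = calibrate_threshold(noise_power=1.0, n_samples=512); energy_detect(received_samples, thr)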
This study presents a comparative analysis of optimisation strategies for designing hull shapes of Autonomous Underwater Vehicles (AUVs), paying special attention to drag, lift-to-drag ratio, and delivered power. A fully integrated optimisation framework is developed accordingly, combining a single-objective Genetic Algorithm (GA) for design parameter generation, Computer-Aided Geometric Design (CAGD) for the creation of hull geometries and associated fluid domains, and a Reynolds-Averaged Navier-Stokes (RANS) solver for evaluating hydrodynamic performance metrics. This unified approach eliminates manual intervention, enabling automated determination of optimal hull configurations. Three distinct optimisation problems are addressed using the proposed methodology. First, the drag minimisation of a reference afterbody geometry (A1) at zero angle of attack is performed under constraints of fixed length and internal volume for various flow velocities spanning the range from 0.5 to 15 m/s. Second, the lift-to-drag ratio of A1 is maximised at a 6° angle of attack, maintaining constant total length and internal volume. Third, delivered power is minimised for A1 at a 0° angle of attack. The comparative analysis of results from all three optimisation cases reveals hull shapes with practical design significance. Notably, the shape optimised for minimum delivered power outperforms the other two across a range of velocities. Specifically, it achieves reductions in required power of 7.6%, 7.8%, 10.2%, and 13.04% at velocities of 0.5, 1.0, 1.5, and 2.152 m/s, respectively.
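A minimal sketch of how such a GA-plus-CFD pipeline is usually wired together, with the CAGD and RANS stages reduced to cheap placeholder functions and the fixed-internal-volume constraint handled by a penalty term. The parameterisation, solver, and constraint values are illustrative assumptions, not the framework described above.

import random

def hull_volume(params):
    # placeholder for the CAGD stage: internal volume implied by the design parameters
    return 0.5 + 0.1 * sum(abs(p) for p in params)

def rans_drag(params, velocity):
    # placeholder for the RANS stage: in the real framework this launches a CFD run
    return (1.0 + sum(p * p for p in params)) * velocity ** 2

def penalised_drag(params, velocity=1.0, target_volume=0.8, penalty=50.0):
    # drag objective with a penalty enforcing the fixed-internal-volume constraint
    return rans_drag(params, velocity) + penalty * abs(hull_volume(params) - target_volume)

def optimise(n_params=6, pop_size=24, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=penalised_drag)                   # lower penalised drag is better
        pop = pop[: pop_size // 2]                     # keep the better half
        while len(pop) < pop_size:
            a, b = rng.sample(pop[: pop_size // 2], 2)
            child = [(x + y) / 2 + rng.gauss(0, 0.05) for x, y in zip(a, b)]  # blend crossover + mutation
            pop.append(child)
    return min(pop, key=penalised_drag)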
One of the most effective technologies for 5G mobile communications is Device-to-Device (D2D) communication, also called terminal pass-through technology. It allows devices to communicate directly under the control of a base station, without requiring the base station to forward traffic. The advantages of applying D2D communication technology to cellular networks are that it can increase the communication system capacity, improve the system spectrum efficiency, increase the data transmission rate, and reduce the base station load. Aiming at the problem of co-channel interference between D2D and cellular users, this paper proposes an efficient resource allocation algorithm based on Q-learning, which treats multiple D2D users as multi-agent learners; the system throughput is determined from the corresponding state learning of the Q-value list, and the maximum-Q action is obtained through dynamic power control for the D2D users. The mutual interference between D2D users and base stations and exact channel state information are not required during the Q-learning process, and a symmetric data transmission mechanism is adopted. The proposed algorithm maximizes the system throughput by controlling the power of D2D users while guaranteeing the quality of service of the cellular users. Simulation results show that the proposed algorithm effectively improves system performance compared with existing algorithms.
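A minimal tabular Q-learning sketch of the general idea, with discrete transmit power levels as actions and a reward built from a hypothetical rate/interference trade-off. The state definition, reward shaping, and multi-agent coordination used in the paper are not reproduced; the toy environment below is an assumption for illustration only.

import random
from collections import defaultdict

POWER_LEVELS = [5, 10, 15, 20]   # candidate D2D transmit powers in dBm (illustrative)

def step(state, action):
    # toy environment: higher power raises the agent's own rate but also the interference it causes,
    # so the reward penalises the harm done to the cellular user
    power = POWER_LEVELS[action]
    rate = power ** 0.5
    interference_penalty = 0.1 * power
    next_state = action                      # quantised interference level observed next
    return rate - interference_penalty, next_state

def train(episodes=2000, alpha=0.1, gamma=0.9, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    q = defaultdict(float)                   # Q[(state, action)] for one D2D learner
    state = 0
    for _ in range(episodes):
        if rng.random() < epsilon:           # epsilon-greedy exploration
            action = rng.randrange(len(POWER_LEVELS))
        else:
            action = max(range(len(POWER_LEVELS)), key=lambda a: q[(state, a)])
        reward, next_state = step(state, action)
        best_next = max(q[(next_state, a)] for a in range(len(POWER_LEVELS)))
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = next_state
    return q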
Long Term Evolution (LTE) is designed to revolutionize mobile broadband technology with key considerations of higher data rate, improved power efficiency, low latency, and better quality of service. This work analyzes the impact of resource scheduling algorithms on the performance of LTE (4G) and WCDMA (3G) networks. In this paper, a full illustration of the LTE system is given together with different scheduling algorithms. Thereafter, 3G WCDMA and 4G LTE networks were simulated using the Simulink simulator embedded in MATLAB, and performance evaluations were carried out. The performance metrics used for the evaluations are average system throughput, packet delay, latency, and fairness of allocation, using the Round Robin, Best CQI, and Proportional Fair packet scheduling algorithms. The results of the evaluations on both networks were analysed and showed that the 4G LTE network performs better than the 3G WCDMA network under all three scheduling algorithms.
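The three schedulers differ only in how they rank users in each transmission interval: Round Robin cycles through users, Best CQI picks the user with the highest instantaneous channel quality, and Proportional Fair picks the highest ratio of instantaneous rate to long-term average rate. The sketch below shows these standard selection metrics in their textbook form; it is a generic illustration, not the simulation model used in the paper.

def round_robin(n_users, last_index):
    # serve users cyclically, ignoring channel quality
    return (last_index + 1) % n_users

def best_cqi(inst_rate):
    # inst_rate[i] = achievable instantaneous rate for user i; pick the best channel
    return max(range(len(inst_rate)), key=lambda i: inst_rate[i])

def proportional_fair(inst_rate, avg_rate):
    # pick the user with the best ratio of instantaneous to long-term average rate
    return max(range(len(inst_rate)), key=lambda i: inst_rate[i] / max(avg_rate[i], 1e-9))

def update_average(avg_rate, served, inst_rate, tc=100.0):
    # exponential moving average of served rates; users not served decay toward zero
    for i in range(len(avg_rate)):
        served_rate = inst_rate[i] if i == served else 0.0
        avg_rate[i] = (1 - 1 / tc) * avg_rate[i] + (1 / tc) * served_rate
    return avg_rate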
This paper investigates the Quality of Experience (QoE)-oriented channel access anti-jamming problem in 5th Generation Mobile Communication (5G) ultra-dense networks. Firstly, considering that the 5G base station adopts beamforming technology, an anti-jamming model under Space Division Multiple Access (SDMA) conditions is proposed. Secondly, the confrontational relationship between users and the jammer is formulated as a Stackelberg game. Besides, to achieve global optimization, we design a local cooperation mechanism for users and formulate the cooperation and competition among users as a local altruistic game. By proving that the local altruistic game is an Exact Potential Game (EPG), we further prove the existence of a pure-strategy Nash Equilibrium (NE) among users and a Stackelberg Equilibrium (SE) between users and the jammer. Thirdly, to obtain the equilibrium solutions of the proposed games, we propose an anti-jamming channel selection algorithm and improve its convergence speed through heterogeneous learning parameters. The simulation results validate the convergence and effectiveness of the proposed algorithm. Compared with the throughput optimization scheme, our proposed scheme obtains a greater network satisfaction rate. Finally, we also analyze user fairness changes during the algorithm convergence process and draw some interesting conclusions.
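As background for the game-theoretic part, the sketch below runs simple best-response dynamics for channel selection in a toy potential game: each user's utility counts the interfering neighbours (plus a fixed jammer) sharing its channel, and the altruistic flavour is imitated by also counting the interference the user causes to its neighbours. It only illustrates why such dynamics converge in an exact potential game; the SDMA model, QoE utilities, and the learning algorithm of the paper are not reproduced.

import random

def best_response_dynamics(adjacency, n_channels, jammer_channel=0, rounds=200, seed=1):
    # adjacency[i] = list of neighbours that interfere with user i (assumed symmetric)
    rng = random.Random(seed)
    n = len(adjacency)
    channel = [rng.randrange(n_channels) for _ in range(n)]

    def utility(i, c):
        clash = sum(1 for j in adjacency[i] if channel[j] == c)
        jam = 1 if c == jammer_channel else 0
        # own interference + interference caused (equal under symmetric adjacency) + jamming;
        # this symmetric structure makes the game an exact potential game, so asynchronous
        # best responses converge to a pure-strategy NE
        return -(2 * clash + jam)

    for _ in range(rounds):
        i = rng.randrange(n)                                   # asynchronous updates, one user at a time
        channel[i] = max(range(n_channels), key=lambda c: utility(i, c))
    return channel

# usage with a placeholder 4-user interference graph and 3 channels:
# print(best_response_dynamics([[1], [0, 2], [1, 3], [2]], n_channels=3))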
The principles of the G.729 algorithm are analyzed, and an optimized approach to the adaptive codebook search is proposed. Implemented on the fixed-point DSP TMS320VC5410, the optimized algorithm significantly decreases the search time, and the results show that speech quality is not degraded.
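For context, the adaptive codebook search in CELP coders such as G.729 amounts to finding the pitch lag whose delayed past excitation best matches the target signal. The sketch below shows the plain exhaustive correlation search that such optimizations start from; the specific fast-search strategy of the paper is not reproduced, and the lag range is only indicative.

import numpy as np

def delayed_vector(exc, lag, length):
    # build the adaptive-codebook vector: samples taken 'lag' back, periodically extended if lag < length;
    # exc must hold at least 'lag' past samples, newest sample last
    out = []
    start = len(exc) - lag
    for n in range(length):
        out.append(exc[start + n] if start + n < len(exc) else out[n - lag])
    return np.array(out)

def adaptive_codebook_search(target, past_excitation, lag_min=20, lag_max=143):
    # exhaustive closed-loop pitch search: pick the lag maximising the normalised correlation
    best_lag, best_score = lag_min, -np.inf
    for lag in range(lag_min, lag_max + 1):
        y = delayed_vector(past_excitation, lag, len(target))
        num = float(np.dot(target, y))
        score = num * num / (float(np.dot(y, y)) + 1e-12)
        if score > best_score:
            best_score, best_lag = score, lag
    return best_lag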
Leader election algorithms play an important role in orchestrating different processes on distributed systems, including next-generation transportation systems. The leader election phase is usually triggered after the leader has failed and has a high overhead in performance and state recovery. Further, these algorithms are not generally applicable to cloud-native microservices-based applications, where the resources available to the group and the resources participating in the group continuously change and the current leader may exit the system with prior knowledge of the exit. Our proposed algorithm, the dynamic leader selection algorithm, provides several benefits through selection (not election) of a set of future leaders, which are then alerted prior to the failure of the current leadership and handed over the leadership. A specific illustration of this algorithm is provided with reference to a peer-to-peer distribution of autonomous cars in a 5G architecture for transportation networks. The proposed algorithm increases the efficiency of applications that use leader election and finds broad applicability in microservices-based applications.
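A minimal sketch of the idea as described above: instead of electing a leader after a failure, the group maintains a ranked list of successor candidates and hands over leadership before a planned exit. The class name, successor count, and handover policy here are illustrative assumptions, not the paper's protocol.

class LeaderGroup:
    """Toy model of pre-selected successors with proactive handover."""

    def __init__(self, members):
        self.members = list(members)            # node ids, ordered by arrival
        self.leader = self.members[0]
        self._refill_successors()

    def join(self, node):
        self.members.append(node)
        self._refill_successors()

    def plan_exit(self, node):
        # a node announcing a planned exit triggers handover *before* it leaves,
        # avoiding the cost of a failure-triggered election
        if node == self.leader and self.successors:
            self.leader = self.successors[0]
        self.members.remove(node)
        self._refill_successors()

    def _refill_successors(self):
        # keep (up to) two pre-selected future leaders alerted and ready
        self.successors = [m for m in self.members if m != self.leader][:2]

# usage with hypothetical node ids:
# g = LeaderGroup(["car-1", "car-2", "car-3"]); g.plan_exit("car-1"); print(g.leader)  # -> "car-2"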
With the rapid development of the mobile internet and the internet of things (IoT), the fifth generation (5G) mobile communication system is seeing explosive growth in data traffic. In addition, low-frequency spectrum resources are becoming increasingly scarce, and there is now an urgent need to switch to higher frequency bands. Millimeter wave (mmWave) technology has several outstanding features: it is one of the most well-known 5G technologies and has the capacity to fulfil many of the requirements of future wireless networks. Importantly, it has an abundant spectrum resource, which can significantly increase the communication rate of a mobile communication system. As such, it is now considered a key technology for future mobile communications. MmWave communication technology also has a more open network architecture; it can deliver varied services and be applied in many scenarios. By contrast, traditional all-digital precoding systems have the drawbacks of high computational complexity and high power consumption. This paper examines the implementation of a new hybrid precoding system that significantly reduces both computational complexity and energy consumption. The primary idea is to generate several sub-channels with equal gain by decomposing the channel with the geometric mean decomposition (GMD). In this process, the objective function of the spectral efficiency is derived, then the basic tracking principle and least squares (LS) techniques are deployed to design the proposed hybrid precoding. Simulation results show that the proposed algorithm significantly improves system performance and reduces computational complexity by more than 45% compared to traditional algorithms.
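As a small point of reference for the least-squares part, the sketch below shows the LS step used in many hybrid precoding designs: given a target full-digital precoder and an already-chosen analog precoder, the baseband precoder is obtained with a pseudo-inverse and then renormalised to the transmit power constraint. The GMD-based construction of the target precoder is not reproduced, and the matrices are random placeholders.

import numpy as np

def ls_baseband_precoder(F_opt, F_RF):
    # F_opt: target full-digital precoder (Nt x Ns); F_RF: analog precoder (Nt x NRF)
    F_BB = np.linalg.pinv(F_RF) @ F_opt                                     # least-squares fit
    F_BB *= np.sqrt(F_opt.shape[1]) / np.linalg.norm(F_RF @ F_BB, "fro")    # power normalisation
    return F_BB

# placeholder example with random matrices (Nt=32 antennas, NRF=4 RF chains, Ns=2 streams)
rng = np.random.default_rng(0)
Nt, NRF, Ns = 32, 4, 2
F_opt = rng.standard_normal((Nt, Ns)) + 1j * rng.standard_normal((Nt, Ns))
F_RF = np.exp(1j * rng.uniform(0, 2 * np.pi, (Nt, NRF)))    # constant-modulus phase-shifter entries
print(np.linalg.norm(F_opt - F_RF @ ls_baseband_precoder(F_opt, F_RF), "fro"))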
The linear consecutive-k-out-of-n: failure (good) (Lin/Con/k/n:F(G)) system consists of n interchangeable components that have different reliabilities. These components are arranged along a line, and different component assignments change the system reliability. The optimization of a Lin/Con/k/n:F(G) system is to find an optimal component assignment that maximizes the system reliability. As the number of components increases, the computation time for this problem increases considerably. In this paper, we propose a Birnbaum importance-based ant colony optimization (BIACO) algorithm to obtain quasi-optimal assignments for such problems. We compare its performance with the Birnbaum importance-based two-stage approach (BITA) and the Birnbaum importance-based genetic local search (BIGLS) algorithm from previous studies. The experimental results show that the BIACO algorithm performs well in the optimization of Lin/Con/k/n:F(G) systems.
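For reference, the objective being maximised can be evaluated exactly for a Lin/Con/k/n:F system with a small dynamic program over the length of the trailing run of failed components. The sketch below shows only that evaluation; the BIACO search itself (pheromone updates and the Birnbaum-importance heuristic) is not reproduced, and the example reliabilities are placeholders.

def lin_con_knf_reliability(p, k):
    # p[i] = reliability of the component placed at position i;
    # the system fails iff k consecutive components fail.
    # dp[r] = probability the system still works and the trailing failure run has length r
    dp = [0.0] * k
    dp[0] = 1.0
    for pi in p:
        nxt = [0.0] * k
        nxt[0] = sum(dp) * pi               # component works: run resets to 0
        for r in range(k - 1):
            nxt[r + 1] = dp[r] * (1 - pi)   # component fails: run grows but stays below k
        dp = nxt
    return sum(dp)                          # probability that no run of k failures occurred

# example: reliability of one particular assignment (illustrative component reliabilities)
# print(lin_con_knf_reliability([0.9, 0.8, 0.95, 0.85, 0.9], k=2))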
In the open network environment, malicious attacks on trust models have become increasingly serious. Compared with single-node attacks, collusion attacks do more harm to the trust model. To solve this problem, a collusion detector based on the GN algorithm for the trust evaluation model is proposed for the open Internet environment. By analyzing the behavioral characteristics of collusion groups, the concept of flatting is defined and the G-N community mining algorithm is used to divide suspicious communities. On this basis, a collusion community detection method is proposed based on the breaking strength of suspicious communities. Simulation results show that the model has high recognition accuracy in identifying collusion nodes, so it can effectively defend against malicious attacks by collusion nodes.
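The Girvan-Newman (G-N) step itself is available off the shelf; a minimal sketch using networkx is shown below. The trust-graph construction, the flatting step, and the breaking-strength test of the paper are not reproduced, and the example graph is a placeholder.

import networkx as nx
from networkx.algorithms.community import girvan_newman

def suspicious_communities(trust_graph, n_splits=1):
    # Girvan-Newman: iteratively remove the highest-betweenness edges and
    # return the community partition after n_splits splits
    hierarchy = girvan_newman(trust_graph)
    partition = None
    for _ in range(n_splits):
        partition = next(hierarchy)
    return [sorted(c) for c in partition]

# placeholder trust graph: an edge means two nodes rate each other unusually highly
G = nx.Graph([(1, 2), (2, 3), (1, 3), (4, 5), (5, 6), (4, 6), (3, 4)])
print(suspicious_communities(G))   # -> two tightly connected groups, e.g. [[1, 2, 3], [4, 5, 6]]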
In the mobile radio industry, planning is a fundamental step for the deployment and commissioning of a telecom network. The proposed models depend on the technology and the targeted architecture. In this context, we introduce a comprehensive single-lens model for a fourth-generation mobile network, Long Term Evolution-Advanced (4G/LTE-A) technology, which includes three sub-assignments of cells in the core network. For its resolution, we propose an adaptation of the genetic evolutionary algorithm as a global solution method. This is a combinatorial optimization problem that is considered difficult. The use of this adaptive method does not necessarily lead to optimal solutions; rather, the aim is to reduce the convergence time towards a feasible solution.
6G is envisioned as the next generation of wireless communication technology, promising unprecedented data speeds, ultra-low latency, and ubiquitous connectivity. In tandem with these advancements, blockchain technology is leveraged to enhance the security, trustworthiness, and transparency of computer vision applications. With the widespread use of mobile devices equipped with cameras, the ability to capture and recognize Chinese characters in natural scenes has become increasingly important. Blockchain can facilitate privacy-preserving mechanisms in applications where privacy is paramount, such as facial recognition or personal healthcare monitoring: users can control their visual data and grant or revoke access as needed. Recognizing Chinese characters from images can provide convenience in various aspects of people's lives. However, traditional Chinese character text recognition methods often lack sufficient accuracy, leading to recognition failures or incorrect character identification. In contrast, computer vision technologies have significantly improved image recognition accuracy. This paper proposes a secure end-to-end recognition system (SE2ERS) for Chinese characters in natural scenes based on convolutional neural networks (CNN) using 6G technology. The proposed SE2ERS model uses the Weighted Hyperbolic Curve Cryptograph (WHCC) for secure data transmission in the 6G network with the blockchain model. For data transmission within the computer vision system, a 6G gradient directional histogram (GDH) is employed for character estimation. With the deployment of WHCC and GDH in the constructed SE2ERS model, secure communication is achieved for data transmission over the 6G network. The proposed SE2ERS is compared with traditional Chinese text recognition methods and evaluated in a 6G data transmission environment. Experimental results demonstrate that SE2ERS achieves an average recognition accuracy of 88% for simple Chinese characters, compared to 81.2% with traditional methods. For complex Chinese characters, the average recognition accuracy improves to 84.4% with our system, compared to 72.8% with traditional methods. Additionally, deploying the WHCC model improves data security, with a data encryption rate complexity roughly 12% higher than traditional techniques.
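On the feature-extraction side, a gradient directional histogram is the classic HOG-style descriptor. The short sketch below computes one over a grayscale character image with numpy; the bin count is an illustrative choice, and the WHCC encryption and the CNN recogniser of the paper are not reproduced.

import numpy as np

def gradient_direction_histogram(img, n_bins=9):
    # img: 2-D grayscale array; returns a histogram of gradient orientations weighted by magnitude
    img = img.astype(float)
    gy, gx = np.gradient(img)                       # finite-difference gradients (rows, columns)
    magnitude = np.hypot(gx, gy)
    angle = np.rad2deg(np.arctan2(gy, gx)) % 180.0  # unsigned orientation in [0, 180)
    hist, _ = np.histogram(angle, bins=n_bins, range=(0.0, 180.0), weights=magnitude)
    return hist / (hist.sum() + 1e-12)              # normalise the descriptor

# usage on a placeholder "image":
# rng = np.random.default_rng(0); print(gradient_direction_histogram(rng.random((32, 32))))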