Funding: Supported by the National Natural Science Foundation of China (Grant No. 11971411).
Abstract: This paper introduces a novel numerical method based on an energy-minimizing normalized residual network (EM-Norm ResNet) to compute the ground-state solution of Bose-Einstein condensates at zero or low temperatures. Starting from the three-dimensional Gross-Pitaevskii equation (GPE), we reduce it to 1D and 2D GPEs by exploiting radial and cylindrical symmetry. The ground-state solution is formulated as the minimizer of the energy functional under a normalization constraint, which is solved directly with the EM-Norm ResNet approach. The paper provides detailed solutions for the ground states in 1D, 2D (with radial symmetry), and 3D (with cylindrical symmetry). We use the Thomas-Fermi approximation as the target function to pre-train the neural network; the full network is then trained by energy minimization. In contrast to traditional numerical methods, our neural-network approach introduces two key innovations: (i) a novel normalization technique designed for high-dimensional systems within an energy-based loss function; (ii) improved training efficiency and model robustness by incorporating gradient-stabilization techniques into residual networks. Extensive numerical experiments validate the method's accuracy across different spatial dimensions.
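For orientation, the constrained minimization behind such a solver can be written in the standard textbook form below; this is generic background rather than the paper's own formulation, whose nondimensionalization and symmetry-reduced functionals may differ. Here V denotes the trapping potential and β the interaction strength.

```latex
% Standard GPE ground-state formulation (textbook form; the paper's exact
% scaling and dimension-reduced functionals may differ).
\[
  \phi_g \;=\; \arg\min_{\;\|\phi\|_{L^2}=1} E(\phi),
  \qquad
  E(\phi) \;=\; \int_{\mathbb{R}^d}
    \Bigl[\tfrac{1}{2}\,|\nabla\phi|^{2}
          + V(\mathbf{x})\,|\phi|^{2}
          + \tfrac{\beta}{2}\,|\phi|^{4}\Bigr]\, d\mathbf{x}.
\]
```

In a network-based solver of this kind, the integral is typically replaced by a quadrature or sampling estimate, and the unit-mass constraint is enforced through a normalization step rather than by post-hoc projection.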
Abstract: The emergence of different computing methods such as cloud-, fog-, and edge-based Internet of Things (IoT) systems has provided the opportunity to develop intelligent systems for disease detection. Compared to other machine learning models, deep learning models have gained more attention from the research community, as they have shown better results with large volumes of data than shallow learning. However, no comprehensive survey has been conducted on integrated IoT- and computing-based systems that deploy deep learning for disease detection. This study evaluated different machine learning and deep learning algorithms, as well as their hybrid and optimized variants, for IoT-based disease detection, using the most recent papers on IoT-based disease detection systems that include computing approaches such as cloud, edge, and fog. The analysis focused on an IoT deep learning architecture suitable for disease detection. It also identifies the different factors that require the attention of researchers to develop better IoT disease detection systems. This study can be helpful to researchers interested in developing better IoT-based disease detection and prediction systems based on deep learning using hybrid algorithms.
Funding: This research has been supported by the National Science Foundation (under grant #1723596) and the National Security Agency (under grant #H98230-17-1-0355).
Abstract: Pervasive IoT applications enable us to perceive, analyze, control, and optimize traditional physical systems. Recently, security breaches in many IoT applications have indicated that IoT applications may put physical systems at risk. Severe resource constraints and insufficient security design are two major causes of many security problems in IoT applications. As an extension of the cloud, the emerging edge computing paradigm, with its rich resources, provides a new venue for designing and deploying novel security solutions for IoT applications. Although there are some research efforts in this area, edge-based security designs for IoT applications are still in their infancy. This paper aims to present a comprehensive survey of existing IoT security solutions at the edge layer and to inspire more edge-based IoT security designs. We first present an edge-centric IoT architecture. Then, we extensively review edge-based IoT security research efforts in the context of security architecture designs, firewalls, intrusion detection systems, authentication and authorization protocols, and privacy-preserving mechanisms. Finally, we offer our insights into future research directions and open research issues.
Funding: Supported by the National Natural Science Foundation of China under Grant No. 40221503 and the China National Key Programme for Development Basic Sciences (973 Project, Grant No. G1999032801).
Abstract: The Spectral Statistical Interpolation (SSI) analysis system of NCEP is used to assimilate meteorological data from Global Positioning Satellite System (GPS/MET) refraction angles with the variational technique. Verification against radiosonde data shows that including GPS/MET observations in the analysis makes an overall improvement to the analysis variables of temperature, winds, and water vapor. However, the variational model with the ray-tracing method is quite expensive for numerical weather prediction and climate research. For example, about 4,000 GPS/MET refraction angles need to be assimilated to produce an ideal global analysis, and just one iteration of the minimization takes more than 24 hours of CPU time on NCEP's Cray C90 computer. Although efforts have been made to reduce the computational cost, it is still prohibitive for operational data assimilation. In this paper, a parallel version of the three-dimensional variational data assimilation model for GPS/MET occultation measurements, suitable for massively parallel processor architectures, is developed. The divide-and-conquer strategy is used to achieve parallelism and is implemented by message passing. The authors present the principles of the code's design and examine its performance on state-of-the-art parallel computers in China. The results show that this parallel model scales favorably as the number of processors is increased. With the Memory-IO technique implemented by the author, the wall-clock time per iteration for assimilating 1420 refraction angles is reduced from 45 s to 12 s using 1420 processors. This suggests that the new parallelized code has the potential to be useful in numerical weather prediction (NWP) and climate studies.
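The divide-and-conquer, message-passing idea can be illustrated with the minimal sketch below. It is not the authors' code: `simulate_refraction_angle` is a hypothetical stand-in for the expensive ray-tracing observation operator, and the control-vector size and observation counts are toy values.

```python
# Sketch of divide-and-conquer parallel evaluation of the observation-term
# cost/gradient: each MPI rank handles its own subset of refraction angles,
# and the partial sums are combined by reduction.  (Illustrative only.)
import numpy as np
from mpi4py import MPI

def simulate_refraction_angle(x, obs_meta):
    """Hypothetical stand-in for the expensive ray-tracing forward operator."""
    return float(np.dot(x, obs_meta))

def local_cost_and_grad(x, local_obs):
    """Observation-term cost and gradient for this rank's refraction angles."""
    cost, grad = 0.0, np.zeros_like(x)
    for meta, y_obs, sigma in local_obs:
        r = (simulate_refraction_angle(x, meta) - y_obs) / sigma
        cost += 0.5 * r * r
        grad += (r / sigma) * meta   # chain rule through the linear stand-in
    return cost, grad

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
rng = np.random.default_rng(rank)

n_ctl = 64                           # toy control-vector size
x = np.zeros(n_ctl)                  # current analysis increment (toy)
# Divide: each rank holds a disjoint chunk of the refraction-angle observations.
local_obs = [(rng.standard_normal(n_ctl), rng.standard_normal(), 1.0)
             for _ in range(10)]

# Conquer: sum partial costs and gradients across all ranks.
local_c, local_g = local_cost_and_grad(x, local_obs)
total_c = comm.allreduce(local_c, op=MPI.SUM)
total_g = np.empty_like(local_g)
comm.Allreduce(local_g, total_g, op=MPI.SUM)

if rank == 0:
    print(f"global observation cost {total_c:.3f} accumulated from {size} ranks")
```

Because each rank evaluates only its own observations, the dominant forward-operator cost divides across processors, which is the property that underlies the favorable scaling reported above.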
Funding: Supported in part by the National Natural Science Foundation of China (61902029), the R&D Program of Beijing Municipal Education Commission (No. KM202011232015), and the Project for Acceleration of University Classification Development (Nos. 5112211036, 5112211037, 5112211038).
Abstract: Nowadays, with the widespread application of the Internet of Things (IoT), mobile devices are reshaping our lives, and the data they generate has reached a massive level. Traditional centralized processing is not suitable for handling these data because of limited computing power and transmission load. Mobile Edge Computing (MEC) has been proposed to solve these problems: because mobile devices have limited computation ability and battery capacity, tasks can instead be executed on the MEC server. However, how to schedule those tasks becomes a challenge and is the main topic of this paper. We design an efficient intelligent algorithm to jointly optimize energy cost and computing resource allocation in MEC. In view of the advantages of deep learning, we propose a Deep Learning-Based Traffic Scheduling Approach (DLTSA), which translates the scheduling problem into a classification problem. Evaluation demonstrates that DLTSA can reduce energy cost and achieve better performance than traditional scheduling algorithms.
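A minimal sketch of what "translating the scheduling problem into a classification problem" can look like is given below; the feature set, labeling rule, and classifier are assumptions made for illustration, not the paper's DLTSA design.

```python
# Illustrative only: cast the offload-or-not scheduling decision as supervised
# classification over hypothetical task features.  DLTSA's actual features,
# labels, and network architecture are not specified in the abstract.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_tasks = 2000
# Hypothetical task features: [input size, CPU cycles, deadline, channel gain]
X = rng.uniform(size=(n_tasks, 4))
# Toy labeling rule: offload (1) when compute demand is high and channel is good.
y = ((X[:, 1] > 0.5) & (X[:, 3] > 0.4)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```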
Funding: Supported by FCT through the LASIGE Research Unit (UIDB/00408/2020, UIDP/00408/2020) and the Brazilian National Council for Research and Development (CNPq) (#304315/2017-6, #430274/2018-1).
Abstract: Determining how to structure vehicular network environments can be done in various ways. Here, we highlight vehicle networks' evolution from vehicular ad-hoc networks (VANETs) to the Internet of Vehicles (IoVs), listing their benefits and limitations. We also highlight the reasons for adopting wireless technologies, in particular IEEE 802.11p and 5G vehicle-to-everything, as well as the use of paradigms able to store and analyze a vast amount of data to produce intelligence, and their applications in vehicular environments. We also correlate the use of each of these paradigms with the desire to meet existing intelligent transportation systems' requirements. The presentation of each paradigm is given from a historical and logical standpoint. In particular, vehicular fog computing improves on the deficiencies of vehicular cloud computing, so the two are not mutually exclusive from the application point of view. We also emphasize some security issues that are linked to the characteristics of these paradigms and vehicular networks, showing that they complement each other and share problems and limitations. As these networks still have many opportunities to grow in both concept and application, we finally discuss concepts and technologies that we believe are beneficial. Throughout this work, we emphasize the crucial role of these concepts for the well-being of humanity.
Funding: Supported by the Center for Mining, Electro-Mechanical Research of Hanoi University of Mining and Geology (HUMG), Hanoi, Vietnam; financially supported by the Hunan Provincial Department of Education General Project (19C1744), the Hunan Province Science Foundation for Youth Scholars of China (2018JJ3510), and the Innovation-Driven Project of Central South University (2020CX040).
Abstract: Blasting is well known as an effective method for fragmenting or moving rock in open-pit mines. To evaluate the quality of blasting, the rock size distribution is used as a critical criterion in blasting operations, and a high percentage of oversized rocks generated by blasting can lead to economic and environmental damage. Therefore, this study proposed four novel intelligent models to predict the size of rock distribution in mine blasting, in order to optimize blasting parameters as well as the efficiency of blasting operations in open-pit mines. Accordingly, a nature-inspired algorithm (the firefly algorithm, FFA) and different machine learning algorithms (gradient boosting machine (GBM), support vector machine (SVM), Gaussian process (GP), and artificial neural network (ANN)) were combined for this aim, abbreviated as FFA-GBM, FFA-SVM, FFA-GP, and FFA-ANN, respectively. Predicted results from these models were then compared with each other using three statistical indicators (mean absolute error, root-mean-squared error, and correlation coefficient) and the color intensity method. For developing and simulating rock size in blasting operations, 136 blasting events with their images were collected and analyzed with the Split-Desktop software. Of these, 111 events were randomly selected for the development and optimization of the models, and the remaining 25 blasting events were used to confirm the accuracy of the proposed models. Blast design parameters were regarded as input variables to predict rock size in blasting operations. The obtained results revealed that the FFA is a robust optimization algorithm for estimating rock fragmentation in bench blasting. Among the models developed in this study, FFA-GBM provided the highest accuracy in predicting the size of fragmented rocks, while the other techniques (FFA-SVM, FFA-GP, and FFA-ANN) yielded lower computational stability and efficiency. Hence, the FFA-GBM model can be used as a powerful and precise soft-computing tool for practical engineering cases aiming to improve the quality of blasting and rock fragmentation.
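A minimal sketch of how such an FFA-plus-GBM hybrid can be wired together is shown below; it is not the study's implementation, and the hyperparameter search space, firefly settings, and synthetic data are all assumptions made for illustration.

```python
# Illustrative firefly-algorithm tuning of a gradient boosting machine:
# fireflies encode GBM hyperparameters and their "brightness" is the
# (negative) cross-validated RMSE on synthetic data standing in for the
# real blast-design dataset.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=136, n_features=6, noise=10.0, random_state=1)

# Assumed search space: (n_estimators, learning_rate, max_depth)
LOW, HIGH = np.array([50, 0.01, 2]), np.array([300, 0.3, 6])

def brightness(pos):
    """Negative CV RMSE of a GBM built from a firefly's position."""
    model = GradientBoostingRegressor(n_estimators=int(pos[0]),
                                      learning_rate=float(pos[1]),
                                      max_depth=int(round(pos[2])),
                                      random_state=0)
    scores = cross_val_score(model, X, y, cv=3,
                             scoring="neg_root_mean_squared_error")
    return scores.mean()

rng = np.random.default_rng(0)
n_fireflies, n_iters, beta0, gamma, alpha = 6, 5, 1.0, 1.0, 0.1
pos = rng.uniform(LOW, HIGH, size=(n_fireflies, 3))
light = np.array([brightness(p) for p in pos])

for _ in range(n_iters):
    for i in range(n_fireflies):
        for j in range(n_fireflies):
            if light[j] > light[i]:           # j is brighter: move i toward j
                r2 = np.sum(((pos[i] - pos[j]) / (HIGH - LOW)) ** 2)
                attract = beta0 * np.exp(-gamma * r2)
                pos[i] += attract * (pos[j] - pos[i]) \
                          + alpha * (rng.uniform(size=3) - 0.5) * (HIGH - LOW)
                pos[i] = np.clip(pos[i], LOW, HIGH)
                light[i] = brightness(pos[i])

best = pos[np.argmax(light)]
print("best n_estimators, learning_rate, max_depth:",
      int(best[0]), round(float(best[1]), 3), int(round(best[2])))
```

In practice the population size, iteration budget, and search space would be far larger, and the fitness would be computed on the real blast-design features rather than synthetic data.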
Abstract: The dominance-based rough set approach (DRSA) permits representation and analysis of all phenomena involving a monotonicity relationship between some measures or perceptions. DRSA also has merits within granular computing, as it extends the paradigm of granular computing to ordered data, specifies a syntax and modality of information granules that are appropriate for dealing with ordered data, and enables computing with words and reasoning about ordered data. Granular computing with ordered data is a very general paradigm, because other modalities of information constraints, such as veristic, possibilistic, and probabilistic modalities, also have to deal with ordered value sets (with qualifiers relative to grades of truth, possibility, and probability), which gives DRSA a large area of applications.
Abstract: In the present scenario of rapid growth in cloud computing models, many companies and users have started to share their data on cloud servers. However, when the model is not completely trusted, data owners face several security-related problems, such as user privacy breaches, data disclosure, and data corruption, during the process of data outsourcing. Several models have been proposed to address and handle security-related issues in the cloud. With that concern, this paper develops a Privacy-Preserved Data Security Approach (PP-DSA) to provide data security and data integrity for outsourced data in a cloud environment. Privacy preservation is ensured in this work with an Efficient Authentication Technique (EAT) using the group signature method applied with a Third-Party Auditor (TPA). The role of the auditor is to secure the data and guarantee shared-data integrity. Additionally, the Cloud Service Provider (CSP) and Data User (DU) can also be attackers, which are handled with the EAT. The major objective of the work is to enhance cloud security and thereby increase Quality of Service (QoS). The results are evaluated based on model effectiveness, security, and reliability, and show that the proposed model provides better results than existing works.
Abstract: Cloud computing is a novel computing paradigm that utilizes remote cloud resources to achieve high-performance computation. Cloud provides infrastructure, platform, and software as different on-demand services. China has made remarkable progress in cloud-based products and operating system technology. The government, enterprises, and research institutions are all active in the development of cloud computing-related projects. Despite the progress, many important
Funding: Beijing Postdoctoral Research Foundation (No. 2021-ZZ-077, No. 2020-YJ-006); Chongqing Industrial Control System Security Situational Awareness Platform, 2019 Industrial Internet Innovation and Development Project - Provincial Industrial Control System Security Situational Awareness Platform; Center for Research and Innovation in Software Engineering, School of Computer and Information Science (Southwest University, Chongqing 400175, China); Chongqing Graduate Education Teaching Reform Research Project (yjg203032).
Abstract: With the development of sensor technology and wireless communication technology, edge computing has an ever wider range of applications, and the privacy protection of edge computing is of great significance. In an edge computing system, in order to ensure the credibility of the source of terminal data, mobile edge computing (MEC) needs to verify the terminal node's signature on the data. During the signing process, the computing power of edge devices such as wireless terminals can easily become the bottleneck of system performance, so it is necessary to improve efficiency through computational offloading. Therefore, this paper proposes an identity-based anonymous authentication protocol for edge computing. The protocol realizes mutual authentication and obtains a shared key by encrypting the exchanged information; the encryption algorithm is implemented through a threshold identity-based proxy ring signature. When a large number of terminals offload computation, the MEC can set the priority of offloading tasks according to the user's identity and permissions, thereby improving offloading efficiency. Security analysis shows that the scheme can guarantee the anonymity and unforgeability of signatures, and the probability of a malicious node forging a signature is equivalent to solving the discrete logarithm problem. The efficiency analysis shows that, with MEC offloading, the computational complexity is significantly reduced, the computing power of edge devices is freed up, and the signing efficiency is improved.
Abstract: People with neurological disorders like Cerebral Palsy (CP) and Multiple Sclerosis (MS) suffer from associated functional gait problems. The symptoms and signs of these gait deficits differ between subjects, and even within a subject at different stages of the disease. Identifying these gait-related abnormalities helps in the treatment planning and rehabilitation process, but the current gait assessment process does not provide very specific information within the seven gait phases. The objective of this study is to investigate the possible application of granular computing to quantify gait parameters within the seven gait phases. In this process we applied fuzzy-granular computing to vertical ground reaction force (VGRF) and surface electromyography (sEMG) data to obtain characteristic values for each gait phase. A fuzzy similarity (FS) measure is used to compare patient values with an age- and sex-matched able-bodied control group. We specifically applied and tested this approach on 10 patients (4 with Cerebral Palsy and 6 with Multiple Sclerosis) to identify possible gait abnormalities. Different FS values for the VGRF of the right and left legs are observed; the VGRF analysis shows smaller FS values during the swing phase in CP and MS subjects, which is evidence of an associated stability problem. Similarly, FS values for the activity of the four selected muscles display a broad range due to differences between subjects, and degraded FS values for different muscles at different stages of the gait cycle are reported; smaller FS values are a sign of abnormal activity of the respective muscles. This approach provides individual-centered and very specific information within the gait phases that can be employed in the diagnosis, treatment, and rehabilitation process.
Abstract: The task of determining the greatest common divisor (GCD) of several polynomials, which arises in image compression, computer algebra, and speech encoding, can be formulated as a low-rank approximation problem with a Sylvester matrix. This paper presents a method based on the structured total least norm (STLN) algorithm for matrices with Sylvester structure, and we apply the algorithm to compute an approximate GCD. Both the theoretical analysis and the computational results show that the method is feasible.
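As standard background (not taken from the paper), the connection between GCDs and structured low-rank approximation runs through the Sylvester matrix: for polynomials f of degree m and g of degree n, the (m+n)×(m+n) Sylvester matrix S(f,g) encodes the GCD degree through its rank, so an approximate GCD corresponds to the nearest structured rank-deficient matrix.

```latex
% Standard facts (two-polynomial case) underlying the low-rank formulation.
\[
  \deg \gcd(f,g) \;=\; m + n \;-\; \operatorname{rank} S(f,g),
\]
\[
  \min_{\Delta f,\,\Delta g} \bigl\|(\Delta f,\Delta g)\bigr\|
  \quad \text{subject to} \quad
  \operatorname{rank}\, S(f+\Delta f,\; g+\Delta g) \;\le\; m+n-k,
\]
% i.e., find the smallest perturbation whose perturbed polynomials share an
% exact GCD of degree at least k; STLN solves this while preserving the
% Sylvester structure of the perturbation.
```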
Funding: Supported by the U.S. Department of Energy through its Advanced Grid Modeling program, the Exascale Computing Program (ECP), the Grid Modernization Laboratory Consortium (GMLC), the Advanced Research Projects Agency-Energy (ARPA-E), the National Quantum Information Science Research Centers, Co-design Center for Quantum Advantage (C2QA), and the Office of Advanced Scientific Computing Research (ASCR).
Abstract: With the global trend of pursuing clean energy and decarbonization, power systems have been evolving at a pace never seen before in the history of electrification. This evolution makes the power system more dynamic and more distributed, with higher uncertainty. These new power system behaviors bring significant challenges to power system modeling and simulation, as more data need to be analyzed for larger systems and more complex models need to be solved in a shorter time. Conventional computing approaches will not be sufficient for future power systems. This paper provides a historical review of computing for power system operation and planning, discusses technology advancements in high performance computing (HPC), and describes the drivers for employing HPC techniques. High performance computing application examples with different HPC techniques, including the latest quantum computing, are also presented to show how HPC techniques can help us be well prepared to meet the requirements of power system computing in a clean energy future.
Funding: Supported by NSFC (Nos. 11661025, 12161024), the Natural Science Foundation of Guangxi (Nos. 2020GXNSFAA159118, 2021GXNSFAA196045), the Guangxi Science and Technology Project (No. Guike AD20297006), the Training Program for 1000 Young and Middle-aged Cadre Teachers in Universities of Guangxi, and the National College Student's Innovation and Entrepreneurship Training Program (No. 202110595049).
Abstract: In this paper, we present a local functional law of the iterated logarithm for Csörgő-Révész type increments of fractional Brownian motion. The results obtained extend the work of Gantert [Ann. Probab., 1993, 21(2): 1045-1049] and of Monrad and Rootzén [Probab. Theory Related Fields, 1995, 101(2): 173-192].
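For reference (textbook background, not taken from the paper), fractional Brownian motion B_H with Hurst index H in (0,1) is the centered Gaussian process with the covariance below; the self-similarity it encodes is what Csörgő-Révész type increment results quantify.

```latex
% Covariance of fractional Brownian motion with Hurst index H (standard).
\[
  \mathbb{E}\bigl[B_H(t)\,B_H(s)\bigr]
  \;=\; \tfrac{1}{2}\bigl(|t|^{2H} + |s|^{2H} - |t-s|^{2H}\bigr),
\]
% so an increment over an interval of length a has standard deviation a^{H}.
```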
Funding: Supported by the Science and Technology Project of Guangxi (Guike AD23023002).
Abstract: In this paper, we propose a three-term conjugate gradient method for solving unconstrained optimization problems, based on the Hestenes-Stiefel (HS) and Polak-Ribière-Polyak (PRP) conjugate gradient methods. Under the standard Wolfe line search, the proposed search direction is a descent direction, and for general nonlinear functions the method is globally convergent. Finally, numerical results show that the proposed method is efficient.
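For reference, the two classical parameters the method builds on are given below in their textbook forms (the paper's specific three-term direction is not reproduced here), with g_k = ∇f(x_k), y_{k-1} = g_k − g_{k-1}, and search direction d_k.

```latex
% Classical HS and PRP conjugate gradient parameters (textbook forms).
\[
  \beta_k^{\mathrm{HS}}
    = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}},
  \qquad
  \beta_k^{\mathrm{PRP}}
    = \frac{g_k^{\top} y_{k-1}}{\|g_{k-1}\|^{2}},
  \qquad
  d_k = -g_k + \beta_k\, d_{k-1}.
\]
% A three-term variant augments d_k with an additional correction term,
% chosen so that a descent property holds under the Wolfe line search.
```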
Abstract: A hydrogen energy storage system (HESS) is one of many rising modern green innovations, using excess energy to generate hydrogen and storing it for various purposes. With that, there have been many discussions about commercializing HESS and improving it further. However, the design and sizing process can be overwhelming to comprehend, with various sources to examine, and understanding optimal design methodologies is crucial to optimize a HESS design. This review therefore aims to collect and analyse a wide range of HESS studies to summarise recent work. Two different collections of studies are examined: one was sourced by the main author for preliminary readings, and another was obtained via VOSViewer; the findings from the Web of Science platform were also examined for a more comprehensive understanding. Major findings include that the People's Republic of China has been active in HESS research, as most works and active organizations originate from this country. HESS has mainly been researched to support power generation and balance load demands, with financial analysis being the common scope of analysis. MATLAB is a common tool for HESS design, modelling, and optimization as it can handle complex calculations. An artificial neural network (ANN) has the potential to be used to model a HESS, but additional review is required as a form of future work. From a commercialization perspective, pressurized hydrogen tanks are ideal for hydrogen storage in a HESS, but other methods can be considered after additional research and development. From this review, it can be implied that modelling work will be the way forward for HESS research, but extensive collaborations and additional review are needed. Overall, this review summarises various takeaways that future research on HESS can use.
Funding: The authors thank the National Key Research and Development Program of China (No. 2021YFB3501501), the National Natural Science Foundation of China (No. 22276013), and the Beijing Natural Science Foundation (No. 2242009) for financial support, and thank Tianhe2-JK HPC for generous computer time.
Abstract: Scientists have devoted considerable effort over several decades to reducing automobile exhaust emissions, and one practical and important strategy is the catalytic conversion of nitric oxide (NO) [1]. Previous studies have shown that lanthanide (Ln) metals can catalytically reduce NO. Thus, reactions of NO with Ln to form lanthanide-nitric oxide (LnNO) complexes have been designed and serve as the simplest prototype molecules for studying NO chemisorption on metal surfaces [2].
Funding: Supported by the National Natural Science Foundation of China (NSFC) Grant No. 12274019 and the NSAF grant in NSFC with Grant No. U2230402.
Abstract: Exploring the quantum advantages of various non-classical quantum states in noisy environments is a central subject in quantum sensing. Here we provide a complete picture of the frequency-estimation precision of three important states (the Greenberger-Horne-Zeilinger (GHZ) state, the maximal spin squeezed state, and the spin coherent state) of a spin-S under both individual dephasing and collective dephasing by general Gaussian noise, ranging from the Markovian limit to the extreme non-Markovian limit. Whether or not the noise is Markovian, the spin coherent state is always worse than the classical scheme under collective dephasing, although it is equivalent to the classical scheme under individual dephasing. Moreover, the maximal spin squeezed state always gives the best sensing precision (and outperforms the widely studied GHZ state) in all cases. This establishes the general advantage of the spin squeezed state for noisy frequency estimation in many quantum sensing platforms.
Funding: Supported by the Postgraduate Scientific Research Innovation Project of Hunan Province (CX20231033).
Abstract: In this paper, we introduce the normalized L_(p) mixed intersection body and demonstrate how the normalized L_(p) mixed intersection body operator can be used to obtain the polar body operator as a limit. Moreover, we study the L_(p)-Busemann-Petty type problem for the normalized L_(p) mixed intersection bodies.