Funding: Supported by the Project of the Hubei Provincial Department of Science and Technology (Grant Nos. 2022CFB957, 2022CFB475) and the National Natural Science Foundation of China (Grant No. 11847118).
Abstract: The decoherence of high-dimensional orbital angular momentum (OAM) entanglement in the weak scintillation regime is investigated. In this study, we simulate atmospheric turbulence using multiple phase screens imprinted with anisotropic non-Kolmogorov turbulence. Entanglement negativity and fidelity are introduced to quantify the entanglement of a high-dimensional OAM state. The numerical results indicate that entanglement negativity and fidelity persist longer for a high-dimensional OAM state when the azimuthal mode has a lower value. Additionally, the evolution of higher-dimensional OAM entanglement is significantly influenced by both the OAM beam parameters and the turbulence parameters. Compared with isotropic atmospheric turbulence, anisotropic turbulence has a weaker influence on high-dimensional OAM entanglement.
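As a minimal illustration of the negativity measure mentioned above, the sketch below computes the entanglement negativity of a bipartite density matrix from its partial transpose, N(rho) = (||rho^{T_B}||_1 - 1)/2. The two-qutrit example state is hypothetical and is not taken from the paper.

```python
import numpy as np

def partial_transpose(rho, dA, dB):
    """Partial transpose over subsystem B of a (dA*dB) x (dA*dB) density matrix."""
    r = rho.reshape(dA, dB, dA, dB)          # indices: (a, b, a', b')
    return r.transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)

def negativity(rho, dA, dB):
    """Entanglement negativity N = (||rho^{T_B}||_1 - 1) / 2."""
    eigs = np.linalg.eigvalsh(partial_transpose(rho, dA, dB))
    return (np.sum(np.abs(eigs)) - 1.0) / 2.0

# Hypothetical example: maximally entangled two-qutrit OAM state |psi> = sum_l |l,l>/sqrt(3)
d = 3
psi = np.zeros(d * d)
for l in range(d):
    psi[l * d + l] = 1.0 / np.sqrt(d)
rho = np.outer(psi, psi)
print(negativity(rho, d, d))                  # -> 1.0 for a maximally entangled qutrit pair
```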
Funding: Supported by the National Natural Science Foundation of China (12201446), the Natural Science Foundation of the Jiangsu Higher Education Institutions of China (22KJB110005), and the Shuangchuang Program of Jiangsu Province (JSSCBS20220898).
Abstract: It is known that monotone recurrence relations induce a class of twist homeomorphisms on the high-dimensional cylinder, extending the class of monotone twist maps on the annulus or two-dimensional cylinder. By constructing a bounded solution of the monotone recurrence relation, we obtain the main conclusion of this paper: the induced homeomorphism has Birkhoff orbits provided there is a compact forward-invariant set. This generalizes Angenent's results in the low-dimensional case.
Funding: Funded by the National Natural Science Foundation of China (Nos. 12402142, 11832013 and 11572134), the Natural Science Foundation of Hubei Province (No. 2024AFB235), the Hubei Provincial Department of Education Science and Technology Research Project (No. Q20221714), and the Opening Foundation of Hubei Key Laboratory of Digital Textile Equipment (Nos. DTL2023019 and DTL2022012).
Abstract: Owing to their global search capability and gradient-free operation, metaheuristic algorithms are applied to a wide range of optimization problems. However, their computational demands become prohibitive for high-dimensional optimization challenges. To address this, this study introduces cooperative metaheuristics that integrate dynamic dimension reduction (DR). Building upon particle swarm optimization (PSO) and differential evolution (DE), the cooperative methods C-PSO and C-DE are developed. In the proposed methods, a modified principal component analysis (PCA) is used to reduce the dimension of the design variables, thereby decreasing computational cost. The dynamic DR strategy re-executes the modified PCA after a fixed number of iterations, so that the important dimensions are identified dynamically. Compared with a static strategy, the dynamic DR strategy identifies the important dimensions more precisely and thus accelerates convergence toward optimal solutions. Furthermore, the influence of the cumulative contribution rate threshold on problems of different dimensions is investigated. The metaheuristic algorithms (PSO, DE) and the cooperative metaheuristics (C-PSO, C-DE) are examined on 15 benchmark functions and two engineering design problems (speed reducer and composite pressure vessel). Comparative results demonstrate that the cooperative methods significantly outperform the standard methods in both solution accuracy and computational efficiency, reducing computational cost by at least 40%. The cooperative metaheuristics can be effectively applied to both high-dimensional unconstrained and constrained optimization problems.
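A minimal sketch of the dynamic dimension-reduction idea described above: every `dr_period` iterations, PCA is run on the current population and the search move is restricted to the components whose cumulative contribution rate exceeds a threshold. The names `dr_period` and `cum_threshold` are illustrative assumptions, and the update is a plain PSO step rather than the paper's C-PSO.

```python
import numpy as np

def sphere(x):                                   # benchmark objective (illustrative)
    return np.sum(x ** 2, axis=-1)

rng = np.random.default_rng(0)
dim, n_particles, iters = 50, 40, 200
dr_period, cum_threshold = 20, 0.90              # assumed control parameters

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), sphere(pos)
gbest = pbest[np.argmin(pbest_f)].copy()
basis = np.eye(dim)                              # active subspace (initially the full space)

for it in range(iters):
    if it > 0 and it % dr_period == 0:
        # Dynamic DR: PCA on the current population, keep the leading components.
        centered = pos - pos.mean(axis=0)
        _, s, vt = np.linalg.svd(centered, full_matrices=False)
        var_ratio = np.cumsum(s ** 2) / np.sum(s ** 2)
        k = int(np.searchsorted(var_ratio, cum_threshold)) + 1
        basis = vt[:k].T                         # dim x k projection basis

    # Plain PSO step carried out in the reduced coordinates, mapped back to full space.
    z, zp, zg = pos @ basis, pbest @ basis, gbest @ basis
    r1, r2 = rng.random(z.shape), rng.random(z.shape)
    vz = 0.7 * (vel @ basis) + 1.5 * r1 * (zp - z) + 1.5 * r2 * (zg - z)
    pos = pos + vz @ basis.T
    vel = vz @ basis.T

    f = sphere(pos)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("best value:", pbest_f.min())
```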
Funding: Supported in part by the Young Scientists Fund of the National Natural Science Foundation of China (Grant Nos. 82304253 and 82273709), the Foundation for Young Talents in Higher Education of Guangdong Province (Grant No. 2022KQNCX021), and the PhD Starting Project of Guangdong Medical University (Grant No. GDMUB2022054).
Abstract: Objective Humans are exposed to complex mixtures of environmental chemicals and other factors that can affect their health. Analysis of these mixture exposures presents several key challenges for environmental epidemiology and risk assessment, including high dimensionality, correlated exposures, and subtle individual effects. Methods We proposed a novel statistical approach, the generalized functional linear model (GFLM), to analyze the health effects of exposure mixtures. GFLM treats the effect of mixture exposures as a smooth function by reordering the exposures according to a specific mechanism and capturing their internal correlations, providing a meaningful estimation and interpretation. Its robustness and efficiency were evaluated under various scenarios through extensive simulation studies. Results We applied the GFLM to two datasets from the National Health and Nutrition Examination Survey (NHANES). In the first application, we examined the effects of 37 nutrients on BMI (2011–2016 cycles). The GFLM identified a significant mixture effect, with fiber and fat emerging as the nutrients with the greatest negative and positive effects on BMI, respectively. In the second application, we investigated the association between four per- and polyfluoroalkyl substances (PFAS) and gout risk (2007–2018 cycles). Unlike traditional methods, the GFLM indicated no significant association, demonstrating its robustness to multicollinearity. Conclusion The GFLM framework is a powerful tool for mixture exposure analysis, offering improved handling of correlated exposures and interpretable results. It performs robustly across various scenarios and real-world applications, advancing our understanding of complex environmental exposures and their health impacts in environmental epidemiology and toxicology.
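To illustrate the functional-linear-model idea (not the paper's exact GFLM), the sketch below reorders p exposures along an assumed smooth index, expands the coefficient function beta(t) in a small Gaussian-bump basis, and fits the resulting low-dimensional linear model by ordinary least squares. All variable names and the simulated data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, n_basis = 500, 37, 6

# Simulated exposure matrix X (n subjects x p exposures) and outcome y.
X = rng.normal(size=(n, p))
t = np.linspace(0.0, 1.0, p)                      # positions after reordering the exposures
true_beta = np.sin(2 * np.pi * t)                 # smooth "true" effect curve
y = X @ true_beta / p + rng.normal(scale=0.1, size=n)

# Smooth basis for beta(t): Gaussian bumps centered along [0, 1].
centers = np.linspace(0.0, 1.0, n_basis)
B = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / 0.15) ** 2)   # p x n_basis

# Functional linear model: y ~ integral X(t) beta(t) dt  ~  (X @ B / p) @ c
Z = X @ B / p
c, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), Z]), y, rcond=None)
beta_hat = B @ c[1:]                              # estimated smooth effect curve

print("correlation with true curve:", np.corrcoef(beta_hat, true_beta)[0, 1])
```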
Funding: Supported by the Innovation Fund Project of the Gansu Education Department (Grant No. 2021B-099).
Abstract: The objective of reliability-based design optimization (RBDO) is to minimize the design objective while satisfying the corresponding reliability requirements. However, its nested-loop structure reduces the efficiency of RBDO algorithms, which hinders their application to high-dimensional engineering problems. To address these issues, this paper proposes an efficient decoupled RBDO method combining high-dimensional model representation (HDMR) and the weight-point estimation method (WPEM). First, the RBDO model is decoupled using HDMR and WPEM. Second, Lagrange interpolation is used to approximate each univariate function. Finally, based on the results of the first two steps, the original nested-loop reliability optimization model is transformed into a deterministic design optimization model that can be solved by a range of mature constrained optimization methods without additional calculations. Two numerical examples, a planar 10-bar structure and an aviation hydraulic piping system with 28 design variables, are analyzed to illustrate the performance and practicability of the proposed method.
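The Lagrange-interpolation step mentioned above can be sketched as follows: each first-order (cut-)HDMR component is sampled at a few nodes along one variable with the others held at a cut point, and scipy's Lagrange interpolator provides a cheap univariate surrogate. The test function, cut point, and number of nodes are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.interpolate import lagrange

def g(x):
    """Hypothetical performance function of three variables (with a weak interaction)."""
    return x[0] ** 2 + np.sin(x[1]) + np.exp(0.3 * x[2]) + 0.1 * x[0] * x[1]

cut = np.zeros(3)                        # cut point for cut-HDMR
nodes = np.linspace(-2.0, 2.0, 5)        # interpolation nodes per variable (assumed)

# First-order cut-HDMR components g_i(x_i) = g(cut with x_i varied) - g(cut),
# each replaced by a univariate Lagrange interpolating polynomial.
g0 = g(cut)
components = []
for i in range(3):
    vals = []
    for v in nodes:
        x = cut.copy()
        x[i] = v
        vals.append(g(x) - g0)
    components.append(lagrange(nodes, np.array(vals)))

def g_surrogate(x):
    """First-order HDMR surrogate: g0 + sum_i g_i(x_i)."""
    return g0 + sum(poly(xi) for poly, xi in zip(components, x))

x_test = np.array([0.8, -1.2, 0.5])
print("true:", g(x_test), "surrogate:", g_surrogate(x_test))
```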
Abstract: Seismic data are commonly acquired sparsely and irregularly, which necessitates regularization of the data with anti-aliasing and anti-leakage methods during seismic data processing. We propose a novel 4D anti-aliasing and anti-leakage Fourier transform method that uses a cube-removal strategy to address the combination of irregular sampling and aliasing in high-dimensional seismic data. We compute a weighting function by stacking the spectrum along radial lines, apply this function to suppress the aliasing energy, and then iteratively pick the dominant amplitude cube to construct the Fourier spectrum. The method is very efficient thanks to the cube-removal strategy, which accelerates the convergence of the Fourier reconstruction, and a well-designed parallel architecture based on CPU/GPU collaborative computing. To better fill the acquisition holes in 5D seismic data while respecting the GPU memory limitation, we implement the anti-aliasing and anti-leakage Fourier transform in 4D and loop over the remaining spatial dimension. The workflow consists of three steps: data splitting, 4D regularization, and data merging. Numerical tests on both synthetic and field data examples demonstrate the high efficiency and effectiveness of our approach.
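The iterative "pick the dominant component and remove it" idea can be shown in one dimension: for irregularly sampled data, the sketch below repeatedly evaluates a direct (non-uniform) Fourier sum, transfers the strongest coefficient into the estimated spectrum, and subtracts its contribution from the residual. This is a generic 1D anti-leakage Fourier transform, not the paper's 4D cube-removal implementation, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Irregularly sampled signal: two tones observed at random positions.
n_samples, n_k = 80, 64
x = np.sort(rng.uniform(0.0, 1.0, n_samples))           # irregular sample coordinates
d = np.cos(2 * np.pi * 5 * x) + 0.5 * np.sin(2 * np.pi * 12 * x)

k = np.arange(-n_k // 2, n_k // 2)                       # wavenumber grid
F = np.exp(-2j * np.pi * np.outer(x, k))                 # non-uniform Fourier kernel

spectrum = np.zeros(n_k, dtype=complex)
residual = d.astype(complex)

for _ in range(20):                                      # iteration count is an assumption
    coeffs = F.conj().T @ residual / n_samples           # direct Fourier sum of the residual
    idx = int(np.argmax(np.abs(coeffs)))                 # dominant spectral component
    spectrum[idx] += coeffs[idx]
    residual -= coeffs[idx] * F[:, idx]                  # remove its leakage from the data

recon = (F @ spectrum).real                              # data reconstructed on the samples
print("relative misfit:", np.linalg.norm(recon - d) / np.linalg.norm(d))
```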
Funding: Supported by the National Natural Science Foundation of China (Grant No. 12234013) and the Natural Science Foundation of Shandong Province (Grant No. ZR2021LLZ009).
Abstract: We present a formulation of the single-trajectory entropy based on a trajectory ensemble. The single-trajectory entropy is affected by its surrounding trajectories via the distribution function. Single-trajectory entropies are studied in two typical potentials, the harmonic potential and the double-well potential, in a viscous environment using the interacting-trajectory method. The results of the trajectory method agree well with numerical methods (Monte Carlo simulation and the difference equation). Increases (decreases) of the single-trajectory entropy can be attributed to heat absorbed from (emitted to) the thermal environment. Some interesting trajectories, which correspond to rare events in these processes, are also demonstrated.
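A minimal sketch of the underlying idea, assuming the single-trajectory entropy is taken as s(t) = -ln f(x(t), t) with f estimated from the surrounding ensemble: overdamped Langevin trajectories in a harmonic potential are simulated, the ensemble density is estimated with a Gaussian kernel, and -ln f is read off along one tagged trajectory. This is a generic stochastic-thermodynamics illustration, not the paper's interacting-trajectory method.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)

# Overdamped Langevin dynamics in a harmonic potential U(x) = 0.5 * k * x^2.
k, gamma, kT, dt, steps, n_traj = 1.0, 1.0, 0.5, 1e-3, 2000, 2000
x = rng.normal(0.0, 2.0, n_traj)           # broad initial ensemble (out of equilibrium)
tagged = []                                 # positions of one tagged trajectory
entropy = []                                # its single-trajectory entropy -ln f(x, t)

for step in range(steps):
    noise = rng.normal(size=n_traj)
    x = x - (k / gamma) * x * dt + np.sqrt(2 * kT * dt / gamma) * noise
    if step % 200 == 0:
        f = gaussian_kde(x)                 # ensemble distribution at this time
        tagged.append(x[0])
        entropy.append(-np.log(f(x[0])[0]))

print("tagged positions:", np.round(tagged, 3))
print("single-trajectory entropy -ln f:", np.round(entropy, 3))
```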
Funding: Supported by the National Natural Science Foundation of China (U1808205) and the Hebei Natural Science Foundation (F2000501005).
Abstract: This paper studies the target controllability of multilayer complex networked systems in which the nodes are high-dimensional linear time-invariant (LTI) dynamical systems and the network topology is directed and weighted. The influence of inter-layer couplings on the target controllability of multilayer networks is discussed. It is found that even if one layer is not target controllable, the entire multilayer network can still be target controllable because of the inter-layer couplings. For multilayer networks with a general structure, a necessary and sufficient condition for target controllability is given by establishing the relationship between the uncontrollable subspace and the output matrix. From this condition, it follows that a system may be target controllable even if it is not state controllable. On this basis, two corollaries are derived that clarify the relationship between target controllability, state controllability, and output controllability. For multilayer networks in which the inter-layer couplings are directed chains and directed stars, sufficient conditions for target controllability are given, respectively. These conditions are easier to verify than the classic criterion.
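For context, a standard output-controllability rank test, often used as the baseline check for target controllability, can be evaluated numerically as below: a system x' = Ax + Bu with target outputs y = Cx is output controllable iff rank[CB, CAB, ..., CA^{n-1}B] equals the number of targeted states. The small matrices are hypothetical and only illustrate the test, not the paper's multilayer condition.

```python
import numpy as np

def target_controllable(A, B, C):
    """Classic output-controllability rank test for x' = Ax + Bu, y = Cx."""
    n = A.shape[0]
    blocks = [C @ np.linalg.matrix_power(A, i) @ B for i in range(n)]
    M = np.hstack(blocks)
    return np.linalg.matrix_rank(M) == C.shape[0]

# Hypothetical 4-node directed chain driven at node 0; the targets are nodes 2 and 3.
A = np.array([[0., 0., 0., 0.],
              [1., 0., 0., 0.],
              [0., 1., 0., 0.],
              [0., 0., 1., 0.]])
B = np.array([[1.], [0.], [0.], [0.]])
C = np.array([[0., 0., 1., 0.],
              [0., 0., 0., 1.]])           # selects the target nodes

print(target_controllable(A, B, C))        # True: both targets are controllable
```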
Abstract: Speech emotion recognition (SER) uses acoustic analysis to find features for emotion recognition and examines variations in voice that are caused by emotions. Because the number of features obtained from acoustic analysis is extremely high, we introduce a hybrid filter-wrapper feature selection algorithm based on an improved equilibrium optimizer for constructing an emotion recognition system. The proposed algorithm performs multi-objective emotion recognition with the minimum number of selected features and maximum accuracy. First, we use information gain and the Fisher score to sort the features extracted from the signals. Then, we employ a multi-objective ranking method to evaluate these features and assign different importance to them; features with high rankings have a large probability of being selected. Finally, we propose a repair strategy to address the problem of duplicate solutions in multi-objective feature selection, which improves the diversity of solutions and avoids falling into local traps. Using random forest and K-nearest neighbor classifiers, four English speech emotion datasets are employed to test the proposed algorithm (MBEO) as well as other multi-objective emotion identification techniques. The results show that it performs well in inverted generational distance, hypervolume, Pareto solutions, and execution time, and that MBEO is appropriate for high-dimensional English SER.
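The filter stage described above, ranking features by information gain and Fisher score, can be sketched with standard tools; here mutual information stands in for information gain, the Fisher score is computed directly, and the two ranks are averaged. The synthetic data and the simple rank-averaging rule are assumptions made for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif

X, y = make_classification(n_samples=300, n_features=40, n_informative=8, random_state=0)

def fisher_score(X, y):
    """Ratio of between-class to within-class variance for each feature."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        den += len(Xc) * Xc.var(axis=0)
    return num / (den + 1e-12)

fs = fisher_score(X, y)
mi = mutual_info_classif(X, y, random_state=0)    # stand-in for information gain

# Combine the two filter criteria by averaging their ranks (an assumed fusion rule).
rank = (np.argsort(np.argsort(-fs)) + np.argsort(np.argsort(-mi))) / 2.0
top10 = np.argsort(rank)[:10]
print("top-10 features by the combined filter ranking:", top10)
```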
Funding: Outstanding Youth Foundation of Hunan Provincial Department of Education (Grant No. 22B0911).
Abstract: In this paper, we introduce the censored composite conditional quantile coefficient (cCCQC) to rank the relative importance of each predictor in high-dimensional censored regression. The cCCQC takes advantage of all useful information across quantiles and can effectively detect nonlinear effects, including interactions and heterogeneity. Furthermore, the proposed screening method based on the cCCQC is robust to outliers and enjoys the sure screening property. Simulation results demonstrate that the proposed method performs competitively on survival datasets with high-dimensional predictors, particularly when the variables are highly correlated.
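As a rough, uncensored stand-in for the screening idea (not the cCCQC itself), the sketch below ranks predictors by a composite measure aggregated over several quantile levels: each predictor is fit marginally with a quantile regression and its absolute slopes are summed across quantiles. The quantile levels, the synthetic data, and the aggregation rule are all assumptions.

```python
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(4)
n, p = 200, 50
X = rng.normal(size=(n, p))
# Only the first three predictors carry signal.
y = 1.5 * X[:, 0] - 1.0 * X[:, 1] + 0.8 * X[:, 2] + rng.normal(scale=0.5, size=n)

taus = [0.25, 0.5, 0.75]                      # assumed composite quantile levels
score = np.zeros(p)
for j in range(p):
    xj = X[:, [j]]
    for tau in taus:
        fit = QuantileRegressor(quantile=tau, alpha=0.0).fit(xj, y)
        score[j] += abs(fit.coef_[0])         # aggregate marginal slopes across quantiles

print("top-5 screened predictors:", np.argsort(-score)[:5])
```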
Funding: Supported by the Deanship of Research and Graduate Studies (DRG) at Ajman University, Ajman, UAE (Grant No. 2023-IRG-ENIT-34).
Abstract: With the increasing dimensionality of data, high-dimensional feature selection (HFS) becomes an increasingly difficult task. Finding the best subset of features is not simple because of the breadth of the search space and the intricacy of the interactions between features. Many of the feature selection (FS) approaches currently used for these problems perform significantly worse when faced with such intricate, high-dimensional search spaces. Meta-heuristic algorithms, however, have been shown to provide sub-optimal results in an acceptable amount of time. This paper presents a new binary boosted version of the Spider Wasp Optimizer (BSWO), called Binary Boosted SWO (BBSWO), which combines a number of successful and promising strategies to deal with HFS. The shortcomings of the original BSWO, including early convergence, settling into local optima, limited exploration and exploitation, and lack of population diversity, are addressed by this new variant. The concept of chaos optimization is introduced into BSWO, where initialization is consistently produced by utilizing the properties of the sine chaos map. A new convergence parameter is then incorporated into BSWO to achieve a promising balance between exploration and exploitation. Multiple exploration mechanisms are applied in conjunction with several exploitation strategies to effectively enrich the search process within the search space. Finally, quantum-based optimization is added to enhance the diversity of the search agents. The proposed BBSWO not only finds the most suitable subset of features but also reduces the redundancy structure of the data. BBSWO was evaluated using the k-nearest neighbor (k-NN) classifier on 23 HFS problems from the biomedical domain taken from the UCI repository. The results were compared with those of traditional BSWO and other well-known meta-heuristic-based FS methods. The findings indicate that, compared with competing techniques, the proposed BBSWO can, on average, identify the smallest subsets of significant features while maintaining efficient classification accuracy with the k-NN classifier.
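Two of the ingredients named above, chaotic initialization and a k-NN wrapper fitness, are easy to sketch: a sine chaos map generates the initial continuous population, which is binarized into feature masks and scored by cross-validated k-NN accuracy penalized by subset size. The map constant, the binarization threshold, and the penalty weight are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
n_agents, n_feat = 10, X.shape[1]
rng = np.random.default_rng(5)

# Sine chaos map initialization: z_{t+1} = a * sin(pi * z_t), kept in (0, 1).
a = 1.0                                            # sine-map parameter (assumed)
z = rng.uniform(0.1, 0.9, n_feat)
pop = np.empty((n_agents, n_feat))
for i in range(n_agents):
    z = np.abs(a * np.sin(np.pi * z))
    pop[i] = z
masks = pop > 0.5                                  # binarize into feature-selection masks

def fitness(mask, alpha=0.02):
    """k-NN wrapper fitness: error rate plus a small penalty on subset size."""
    if not mask.any():
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask], y, cv=5).mean()
    return (1 - acc) + alpha * mask.sum() / n_feat

scores = np.array([fitness(m) for m in masks])
best = masks[np.argmin(scores)]
print("best initial agent selects", int(best.sum()), "features, fitness:", scores.min())
```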
Funding: Supported in part by the National Natural Science Foundation of China (62372385, 62272078, 62002337), the Chongqing Natural Science Foundation (CSTB2022NSCQ-MSX1486, CSTB2023NSCQ-LZX0069), and the Deanship of Scientific Research at King Abdulaziz University, Jeddah, Saudi Arabia (RG-12-135-43).
Abstract: High-dimensional and incomplete (HDI) matrices arise in all kinds of big-data-related practical applications. A latent factor analysis (LFA) model can perform efficient representation learning on an HDI matrix, and its hyper-parameter adaptation can be implemented through a particle swarm optimizer (PSO) to meet scalability requirements. However, conventional PSO suffers from premature convergence, which causes accuracy loss in the resulting LFA model. To address this issue, this study merges the information of each particle's state migration into its evolution process, following the principle of a generalized momentum method, to improve its search ability, thereby building a state-migration particle swarm optimizer (SPSO) whose theoretical convergence is rigorously proved in this study. SPSO is then incorporated into an LFA model to implement efficient hyper-parameter adaptation without accuracy loss. Experiments on six HDI matrices indicate that an SPSO-incorporated LFA model outperforms state-of-the-art LFA models in prediction accuracy for the missing data of an HDI matrix, with competitive computational efficiency. Hence, SPSO ensures efficient and reliable hyper-parameter adaptation in an LFA model, and thus practical and accurate representation learning for HDI matrices.
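The LFA backbone that the abstract builds on can be sketched as stochastic gradient descent over only the observed entries of a sparse rating-like matrix; the learning rate and regularization below are fixed by hand, whereas the paper adapts such hyper-parameters with its SPSO (omitted here). The data and dimensions are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)
n_rows, n_cols, rank = 200, 150, 5

# Synthetic HDI matrix: a low-rank ground truth observed on roughly 5% of its entries.
P_true = rng.normal(size=(n_rows, rank))
Q_true = rng.normal(size=(n_cols, rank))
obs = [(i, j, P_true[i] @ Q_true[j] + 0.1 * rng.normal())
       for i in range(n_rows) for j in range(n_cols) if rng.random() < 0.05]

eta, lam, epochs = 0.01, 0.05, 30      # hand-set hyper-parameters (the paper tunes these)
P = 0.1 * rng.normal(size=(n_rows, rank))
Q = 0.1 * rng.normal(size=(n_cols, rank))

for _ in range(epochs):
    for idx in rng.permutation(len(obs)):        # SGD over the observed entries only
        i, j, r = obs[idx]
        err = r - P[i] @ Q[j]
        P[i] += eta * (err * Q[j] - lam * P[i])
        Q[j] += eta * (err * P[i] - lam * Q[j])

rmse = np.sqrt(np.mean([(r - P[i] @ Q[j]) ** 2 for i, j, r in obs]))
print("training RMSE on observed entries:", round(rmse, 4))
```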
Abstract: The estimation of covariance matrices is very important in many fields, such as statistics. In real applications, data are frequently affected by high dimensionality and noise, yet most relevant studies are based on complete data. This paper studies the optimal estimation of high-dimensional covariance matrices from missing and noisy samples under the norm. First, the model with sub-Gaussian additive noise is presented. The generalized sample covariance is then modified to define a hard thresholding estimator, and the minimax upper bound is derived. After that, the minimax lower bound is derived, and it is concluded that the proposed estimator is rate-optimal. Finally, numerical simulations are performed. The results show that for missing samples with sub-Gaussian noise, if the true covariance matrix is sparse, the hard thresholding estimator outperforms the traditional estimator.
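A minimal sketch of this family of estimators, under the assumption that entries are missing completely at random: the sample covariance is formed from pairwise-complete observations (rescaled by the pairwise observation counts) and then hard-thresholded entrywise. The threshold below is a simple plug-in constant, not the paper's tuned value.

```python
import numpy as np

rng = np.random.default_rng(7)
n, p, obs_prob, noise_sd = 400, 60, 0.8, 0.3

# Sparse true covariance: identity plus one off-diagonal band.
Sigma = np.eye(p) + 0.4 * np.eye(p, k=1) + 0.4 * np.eye(p, k=-1)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
X_noisy = X + noise_sd * rng.normal(size=X.shape)            # sub-Gaussian additive noise
mask = rng.random((n, p)) < obs_prob                         # observed-entry indicator

# Generalized sample covariance from pairwise-complete, noisy observations.
Z = np.where(mask, X_noisy, 0.0)
counts = mask.T.astype(float) @ mask.astype(float)           # pairwise observation counts
S = (Z.T @ Z) / np.maximum(counts, 1.0)
S -= np.diag(np.full(p, noise_sd ** 2))                      # remove the known noise variance

# Hard thresholding: keep entries whose magnitude exceeds lam, keep the diagonal.
lam = 2.0 * np.sqrt(np.log(p) / n)
S_hat = np.where(np.abs(S) >= lam, S, 0.0)
np.fill_diagonal(S_hat, np.diag(S))

err_raw = np.linalg.norm(S - Sigma, 2)
err_thr = np.linalg.norm(S_hat - Sigma, 2)
print("spectral error  raw: %.3f  thresholded: %.3f" % (err_raw, err_thr))
```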
Funding: Supported by the National Natural Science Foundation of China (Nos. 61075065, 60774045, U1134108) and the Ph.D. Programs Foundation of the Ministry of Education of China (No. 20110162110041).
Abstract: This paper discusses consensus problems for high-dimensional networked multi-agent systems with fixed topology. The communication topology of the multi-agent system is represented by a digraph. A new consensus protocol is proposed, and the consensus convergence of the multi-agent system is analyzed based on Lyapunov stability theory. The consensus problem can be formulated as a feasibility problem with bilinear matrix inequality (BMI) constraints. Furthermore, the consensus protocol is extended to tracking and formation control. By introducing a formation structure set, each agent can obtain its individual desired trajectory. Finally, numerical simulations are provided to show the effectiveness of our strategies. The results show that agents starting from arbitrary initial states can asymptotically reach consensus. In addition, high-dimensional agents can track any target trajectory and maintain the desired formation during movement when an appropriate structure set is selected.
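For intuition about the basic consensus mechanism behind such protocols, the sketch below simulates the standard first-order protocol x_i' = -sum_j a_ij (x_i - x_j) on a small directed graph; the graph, gains, and state dimension are hypothetical, and the sketch does not implement the paper's BMI-based protocol.

```python
import numpy as np

# Directed communication graph on 4 agents (A[i, j] = weight of the edge j -> i).
A = np.array([[0., 1., 0., 0.],
              [0., 0., 1., 0.],
              [0., 0., 0., 1.],
              [1., 0., 0., 0.]])
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian of the digraph

dim, dt, steps = 3, 0.01, 3000          # each agent carries a 3-dimensional state
rng = np.random.default_rng(8)
x = rng.uniform(-5, 5, (4, dim))        # arbitrary initial states

for _ in range(steps):                  # Euler integration of x_dot = -L x (per coordinate)
    x = x - dt * (L @ x)

print("final states (rows should agree):")
print(np.round(x, 3))
```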
Funding: Supported by the National Natural Science Foundation of China (Grant No. 42030607), the Beijing Municipal Science and Technology Commission (Grant No. Z251100004525005), the National Science Foundation/National Center for Atmospheric Research, and NASA (Grant No. 80NSSC22M0129).
Abstract: The growth trajectory of hailstones within clouds has remained elusive because they cannot be traced directly, impeding the comprehension of their underlying growth mechanisms. This study investigated the vertical growth trajectories of hailstones by measuring the stable isotope signatures (²H and ¹⁸O compositions) of different shells in 27 hailstones from 9 hailstorms, which allowed us to infer the ambient temperature during hailstone growth. The vertical growth trajectories were obtained by comparing the isotopic compositions of water condensate in clouds, derived from the Adiabatic Model, with those measured in the hailstones. Although hailstone growth occurred mainly in the –10°C to –30°C temperature layer, the embryo formation height and subsequent growth trajectories varied significantly among hailstones. Embryos formed over a wide range of temperatures (–8.7°C to –33.4°C); four originated at temperatures above –15°C and 16 originated at temperatures below –20°C, suggesting ice nuclei composed of bioproteins and mineral dust, respectively. Among the 27 measured hailstones, 3 exhibited minimal vertical movement, 16 exhibited a monotonic rise or fall, and the remaining 8 exhibited alternating up-down trajectories; only one experienced "recycling" during up-down drifting. Trajectory analysis revealed that similar-sized hailstones from a single storm tended to form at similar heights, whereas those larger than 25 mm in diameter exhibited at least one period of upward growth. The vertical trajectories derived from the isotopic analysis were corroborated by radar hydrometeor observations.
Funding: Supported by the National Key R&D Program of China (No. 2022YFB3104502), the National Natural Science Foundation of China (No. 62301251), the Natural Science Foundation of Jiangsu Province of China (Project No. BK20220883), the open research fund of the National Mobile Communications Research Laboratory, Southeast University, China (No. 2024D04), and the Young Elite Scientists Sponsorship Program by CAST (No. 2023QNRC001).
Abstract: The environment of low-altitude urban airspace is complex and variable because of numerous obstacles, non-cooperative aircraft, and birds. For Unmanned Aerial Vehicles (UAVs), leveraging environmental information to achieve three-dimensional collision-free trajectory planning is the prerequisite for airspace security. However, timely information about the surrounding situation is difficult for UAVs to acquire, which introduces additional security risks. As a mature technology in traditional civil aviation, Automatic Dependent Surveillance-Broadcast (ADS-B) provides continuous surveillance of aircraft information. We therefore leverage ADS-B for surveillance and information broadcasting, and divide the airspace into multiple sub-airspaces to improve flight safety in UAV trajectory planning. Specifically, we propose the secure Sub-airSpaces Planning (SSP) algorithm and the Particle Swarm Optimization Rapidly-exploring Random Trees (PSO-RRT) algorithm for UAV trajectory planning in low-altitude airspace. The performance of the proposed algorithms is verified by simulations, and the results show that SSP reduces both the maximum number of UAVs in a sub-airspace and the length of the trajectory, while PSO-RRT reduces the cost of the UAV trajectory within a sub-airspace.
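As background for the RRT side of the planner, the sketch below grows a plain 2D rapidly-exploring random tree around circular obstacles until it reaches a goal region. It is a textbook RRT, not the paper's PSO-RRT, and the map, step size, and goal bias are assumed; for brevity, collisions are checked only at sampled points, not along segments.

```python
import numpy as np

rng = np.random.default_rng(9)
obstacles = [((4.0, 4.0), 1.5), ((7.0, 2.0), 1.0)]       # (center, radius), assumed map
start, goal = np.array([0.5, 0.5]), np.array([9.0, 9.0])
step, goal_tol = 0.5, 0.5

def collision_free(p):
    return all(np.linalg.norm(p - np.array(c)) > r for c, r in obstacles)

nodes, parents = [start], [0]
for _ in range(2000):
    # Sample a point (with 10% goal bias) and extend the nearest node toward it.
    target = goal if rng.random() < 0.1 else rng.uniform(0.0, 10.0, 2)
    dists = np.linalg.norm(np.array(nodes) - target, axis=1)
    i = int(np.argmin(dists))
    direction = target - nodes[i]
    if np.linalg.norm(direction) < 1e-9:
        continue
    new = nodes[i] + step * direction / np.linalg.norm(direction)
    if not collision_free(new):
        continue
    nodes.append(new)
    parents.append(i)
    if np.linalg.norm(new - goal) < goal_tol:
        break

# Backtrack the branch from the last added node to the start.
path = []
j = len(nodes) - 1
while j != 0:
    path.append(nodes[j])
    j = parents[j]
path.append(start)
path.reverse()
print("waypoints found:", len(path), "| last point:", np.round(path[-1], 2))
```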
Funding: Supported by the Beijing Natural Science Fund–Haidian Original Innovation Joint Fund (L232040 and L232045).
Abstract: In this paper, we investigate a multi-UAV-aided NOMA communication system in which multiple UAV-mounted aerial base stations serve ground users in downlink NOMA communication, and each UAV serves its associated users on its own bandwidth. We aim to maximize the overall common throughput over a finite time period. This is a typical mixed-integer nonlinear problem that involves both continuous-variable and combinatorial optimization. To solve it efficiently, we propose a two-layer algorithm that tackles the continuous-variable and combinatorial parts separately. Specifically, in the inner layer, given one user association scheme, the subproblems of bandwidth allocation, power allocation, and trajectory design are solved by alternating optimization. In the outer layer, a small number of candidate user association schemes are generated from an initial scheme, and the best solution is determined by comparing all candidates. In particular, a clustering algorithm based on K-means is applied to produce the candidate user association schemes, the successive convex optimization technique is adopted in the power allocation subproblem, and a logistic function approximation is employed in the trajectory design subproblem. Simulation results show that the proposed NOMA scheme outperforms three baseline schemes in downlink common throughput, including a solution proposed in the existing literature.
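The outer-layer step of generating a user association from user positions can be illustrated with a plain K-means clustering, assigning each ground user to the UAV responsible for its cluster. The user layout, the number of UAVs, and the direct cluster-to-UAV mapping are assumptions made for this sketch.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(10)
n_users, n_uavs = 30, 3

# Ground users scattered over a 1 km x 1 km area (hypothetical layout).
users = rng.uniform(0.0, 1000.0, (n_users, 2))

# K-means clusters the users; each cluster is served by one UAV hovering at its centroid.
km = KMeans(n_clusters=n_uavs, n_init=10, random_state=0).fit(users)
association = km.labels_                    # user i is associated with UAV association[i]
uav_positions = km.cluster_centers_

for k in range(n_uavs):
    print(f"UAV {k}: serves {np.sum(association == k)} users, "
          f"hover point {np.round(uav_positions[k], 1)}")
```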
Funding: Supported by the National Natural Science Foundation of China (No. 62203256).
Abstract: Generating dynamically feasible trajectories for fixed-wing Unmanned Aerial Vehicles (UAVs) in dense obstacle environments remains computationally intractable. This paper proposes Safe Flight Corridor constrained Sequential Convex Programming (SFC-SCP) to improve the computational efficiency and reliability of trajectory generation. SFC-SCP combines a front-end convex-polyhedron SFC construction with a back-end SCP-based trajectory optimization. A Sparse A* Search (SAS)-driven SFC construction method is designed to efficiently generate polyhedral SFCs according to the geometric relation between obstacles and collision-free waypoints. By transforming the nonconvex obstacle-avoidance constraints into linear inequality constraints, the SFC mitigates the infeasibility of trajectory planning and reduces computational complexity. SCP then casts the nonlinear trajectory optimization subject to the SFC into convex programming subproblems to further reduce the problem complexity. In addition, a convex optimizer based on the interior point method is customized, in which the search direction is calculated via successive elimination to further improve efficiency. Simulation experiments on dense obstacle scenarios show that SFC-SCP can rapidly generate dynamically feasible, safe trajectories. Comparative studies with state-of-the-art SCP-based methods demonstrate the efficiency and reliability of SFC-SCP. In addition, the customized convex optimizer outperforms off-the-shelf optimizers in terms of computation time.
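The key convexification step, replacing obstacle avoidance by linear corridor constraints, can be sketched as a small convex problem: a smooth 2D path is found by minimizing discrete acceleration along waypoints while each waypoint is confined to an axis-aligned box of its corridor segment. The boxes, horizon, and objective are illustrative, and scipy's general-purpose solver stands in for the paper's customized interior-point optimizer.

```python
import numpy as np
from scipy.optimize import minimize

N = 20                                                    # waypoints including endpoints
start, goal = np.array([0.0, 0.0]), np.array([10.0, 4.0])

# Axis-aligned corridor boxes (xmin, xmax, ymin, ymax) for the interior waypoints:
# two overlapping boxes that route the path around a notional obstacle.
def box(i):
    return (0.0, 6.0, 0.0, 2.5) if i < N // 2 else (4.0, 10.0, 1.5, 4.0)

def full_path(z):
    """Assemble the full waypoint sequence from the free interior waypoints."""
    inner = z.reshape(N - 2, 2)
    return np.vstack([start, inner, goal])

def smoothness(z):
    """Sum of squared second differences (a discrete acceleration penalty)."""
    p = full_path(z)
    acc = p[2:] - 2 * p[1:-1] + p[:-2]
    return np.sum(acc ** 2)

# Corridor (box) constraints become simple per-coordinate bounds.
bounds = []
for i in range(1, N - 1):
    xmin, xmax, ymin, ymax = box(i)
    bounds += [(xmin, xmax), (ymin, ymax)]

z0 = np.linspace(start, goal, N)[1:-1].ravel()            # straight-line initial guess
res = minimize(smoothness, z0, bounds=bounds, method="L-BFGS-B")
print("optimized waypoints:")
print(np.round(full_path(res.x), 2))
```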
Funding: Supported by the Key Research and Development Program of the Ministry of Science and Technology of China (grant number 2016YF0900605), the Key Research and Development Program of Hebei Province (grant number 192777129D), the Joint Fund for Iron and Steel of the Natural Science Foundation of Hebei Province (grant number H2016209058), and the National Natural Science Foundation for Regional Joint Fund of China (grant number U22A20364).
Abstract: Objective We aimed to investigate the patterns of fasting blood glucose (FBG) trajectories and analyze the relationship between various occupational hazard factors and FBG trajectories in male steelworkers. Methods The study cohort included 3,728 workers who met the selection criteria of the Tanggang Occupational Cohort (TGOC) between 2017 and 2022. A group-based trajectory model was used to identify the FBG trajectories. Environmental risk scores (ERS) were constructed using regression coefficients from the occupational hazard model as weights. Univariate and multivariate logistic regression analyses were performed to explore the effects of occupational hazard factors, summarized by the ERS, on FBG trajectories. Results FBG trajectories were categorized into three groups. An association was observed between high temperature, noise exposure, and FBG trajectory (P<0.05). Using the first quartile group of ERS1 as a reference, the fourth quartile group of ERS1 had a 1.90-fold and 2.21-fold increased risk of medium and high FBG, respectively (odds ratio [OR]=1.90, 95% confidence interval [CI]: 1.17–3.10; OR=2.21, 95% CI: 1.09–4.45). Conclusion An association was observed between occupational hazards, as quantified by the ERS, and FBG trajectories. The risk of higher FBG trajectory levels increases with increasing ERS.
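The ERS construction described above, a weighted sum of exposures with regression coefficients as weights followed by a quartile-based logistic regression, can be sketched on synthetic data; the exposure names, effect sizes, and outcome model below are entirely hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n = 3000

# Hypothetical occupational exposures (standardized): heat, noise, dust, shift work.
E = rng.normal(size=(n, 4))
beta = np.array([0.35, 0.30, 0.10, 0.05])          # assumed hazard-model coefficients
logit = -1.0 + E @ beta + 0.5 * rng.normal(size=n)
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # 1 = unfavourable trajectory

# Environmental risk score: coefficient-weighted sum of exposures, then quartiles.
ers = E @ beta
quartile = np.digitize(ers, np.quantile(ers, [0.25, 0.5, 0.75]))    # groups 0..3

# Logistic regression of the outcome on quartile indicators (Q1 as reference).
X = np.eye(4)[quartile][:, 1:]                     # dummy variables for Q2-Q4
fit = LogisticRegression().fit(X, y)
odds_ratios = np.exp(fit.coef_[0])
print("OR of Q2-Q4 vs Q1:", np.round(odds_ratios, 2))
```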
Funding: Philosophy and Social Science Planning Projects in Yunnan Province (No. QN202428) and the China Postdoctoral Science Foundation (No. 2024M752918).
Abstract: Foreign-funded overseas industrial parks (OIPs) are crucial for attracting foreign investment and promoting globalization in developing countries. However, large-scale land acquisition for these parks generates conflicts between developers and local stakeholders and increases development costs. A qualitative multi-case study was conducted to analyze the land transaction trajectories of China's OIPs. Four OIPs were selected to reveal the underlying mechanisms from the perspectives of institutional arrangements, governance mechanisms, and enterprise heterogeneity. The findings indicate that in host countries with insufficient institutional development, local governments are more inclined to engage directly in OIP land acquisition. High-level intergovernmental mechanisms facilitate land acquisition processes, although their efficacy depends largely on how administrative power is allocated across parks in host countries. The results also indicate that enterprise characteristics significantly influence land acquisition: micro-scale private enterprises lacking political connections often employ low-cost, bottom-up strategies by leveraging international experience. In summary, policy-makers in developing countries should prioritize enhancing OIP governance to mitigate transaction costs, promote diversified land supply, and optimize land allocation. By depicting China's OIP land acquisition processes, this study deepens the academic understanding of OIP governance in developing countries and related international land transactions, offering practical OIP management insights for governments in both host and parent countries.