The development of photocatalytic technology has grown significantly since its initial report and, as such, a number of screening methods have been developed to assess activity. In the field of environmental remediation, a crucial factor is the formation of highly oxidising species such as OH radicals. These radicals are often the primary driving force for the removal and breakdown of organic and inorganic contaminants. The quantification of such species is challenging due to the nature of the radical; however, indirect methods which deploy a chemical probe to capture the radical have been shown to be effective. As discussed in the work presented here, optimisation of such a method is fundamental to its efficiency. A starting coumarin concentration range of 50 mmol/L to 1000 mmol/L was used along with a catalyst loading of 0.01 g/L to 1 g/L TiO2, identifying 250 mmol/L and 0.5 g/L TiO2 as the optimum conditions for production. Under these parameters a maximum production of 35.91 mmol/L (Rmax = 0.4 mmol/L OH* min-1) was achieved, which yielded a photonic efficiency of 4.88 OH* moles photon-1 under UV irradiation. The data set presented also highlighted the limitations associated with the method, which included rapid exhaustion of the probe molecule and process inhibition through UV light saturation. Identifying both the optimum conditions and the potential limitations of the process was concluded to be key for the efficient deployment of the photocatalytic screening method.
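As a rough illustration of how a rate and photonic-efficiency figure of this kind is typically derived from probe data (the study's exact normalisation is not given here, so the time series and photon flux below are invented placeholder values), a minimal sketch:

```python
# Minimal sketch: estimating an OH-radical formation rate and photonic efficiency
# from coumarin-probe data. All numbers are illustrative assumptions, not values
# taken from the study.

oh_mmol_per_L = [0.0, 4.1, 8.0, 11.8]   # hypothetical OH* yield over time
time_min = [0, 10, 20, 30]

# Initial-rate estimate (slope over the first interval), in mmol L^-1 min^-1
r_max = (oh_mmol_per_L[1] - oh_mmol_per_L[0]) / (time_min[1] - time_min[0])

# Photonic efficiency = rate of OH* formation / rate of incident photons,
# both expressed here in mmol per litre per minute for simplicity.
photon_flux_mmol_per_L_min = 0.084      # assumed incident photon flux (placeholder)
photonic_efficiency = r_max / photon_flux_mmol_per_L_min

print(f"Rmax = {r_max:.2f} mmol/L/min, photonic efficiency = {photonic_efficiency:.2f}")
```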
Multilateral wells promise cost savings to oil and gas fields as they have the potential to reduce overall drilling distances and minimize the number of slots required for the surface facility managing the well. However, drilling a multilateral well does not always increase the flow rate when compared to two single-horizontal wells, due to competition in production inside the mother-bore. Here, a holistic approach is proposed to find the optimum balance between single and multilateral wells in an offshore oil development. In so doing, the integrated approach finds the highest Net Present Value (NPV) configuration of the field, considering drilling, subsurface, production and financial analysis. The model employs stochastic perturbation and Markov Chain Monte Carlo methods to solve the global NPV-maximisation problem. In addition, a combination of Mixed-Integer Linear Programming (MILP), an improved Dijkstra algorithm and a Levenberg-Marquardt optimiser is proposed to solve the rate allocation problem. With the outcome from this analysis, the model suggests the optimum development, including the number of multilateral and single horizontal wells that would result in the highest NPV. The results demonstrate the potential for modelling to find the optimal use of petroleum facilities and to assist with planning and decision making.
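For readers unfamiliar with the objective being maximised, a minimal sketch of a discounted-cash-flow NPV comparison between candidate developments (the cash flows and discount rate are invented for illustration and are not taken from the paper):

```python
# Minimal NPV sketch for comparing candidate well configurations.
# All cash flows and the discount rate are illustrative assumptions.

def npv(cash_flows, discount_rate):
    """Net Present Value of yearly cash flows, year 0 first."""
    return sum(cf / (1.0 + discount_rate) ** t for t, cf in enumerate(cash_flows))

# Year 0: drilling and facility capex (negative); later years: revenue minus opex.
candidate_a = [-120e6, 45e6, 42e6, 38e6, 30e6, 22e6]   # e.g. one multilateral well
candidate_b = [-150e6, 55e6, 50e6, 40e6, 28e6, 18e6]   # e.g. two single horizontal wells

for name, cf in [("multilateral", candidate_a), ("two single horizontals", candidate_b)]:
    print(name, f"NPV = {npv(cf, 0.10):,.0f} USD")
```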
It is now well known that the thermomechanical schedules applied during hot rolling of flat products provide the steel with improved mechanical properties. In this work an optimisation tool, OptiLam (OptiLam v.1), based on predictive software and capable of generating optimised rolling schedules to obtain the desired mechanical properties in the final product, is described. OptiLam includes some well-known metallurgical models which predict microstructural evolution during hot rolling and the austenite-to-ferrite transformation during cooling. Furthermore, an optimisation algorithm based on the gradient method has been added in order to design thermomechanical sequences when a specific final grain size is desired. OptiLam has been used to optimise rolling parameters, such as strain and temperature. Here, some of the results of the software validation performed by means of hot torsion tests are presented, showing also the functionality of the tool. Finally, the application of classical optimisation models, based on the gradient method, to hot rolling operations is also discussed.
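A hedged illustration of the kind of gradient-based schedule tuning described (OptiLam's metallurgical models are not reproduced here; the grain-size relation below is a fabricated stand-in used only to show the loop structure):

```python
# Minimal gradient-descent sketch: adjust a rolling parameter (e.g. finishing
# temperature) so a predicted final grain size hits a target value.
# The grain-size model is a fabricated placeholder, not OptiLam's model.

def predicted_grain_size(finishing_temp_C):
    # Assumed monotone toy relation: hotter finishing -> coarser grains.
    return 5.0 + 0.02 * (finishing_temp_C - 850.0)

def objective(temp, target_um):
    return (predicted_grain_size(temp) - target_um) ** 2

target = 8.0            # desired final grain size in micrometres
temp = 950.0            # initial guess for finishing temperature
lr, eps = 200.0, 1e-3   # step size and finite-difference step

for _ in range(200):
    grad = (objective(temp + eps, target) - objective(temp - eps, target)) / (2 * eps)
    temp -= lr * grad

print(f"suggested finishing temperature ~ {temp:.1f} C, "
      f"predicted grain size ~ {predicted_grain_size(temp):.2f} um")
```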
Zero-day attacks exploit unknown vulnerabilities and therefore evade identification by cybersecurity detection tools. This study indicates that zero-day attacks have a significant impact on computer security. A conventional signature-based detection algorithm is not efficient at recognizing zero-day attacks, as the signatures of zero-day attacks are usually not previously accessible. A machine learning (ML)-based detection algorithm is proficient in capturing statistical features of attacks and is therefore promising for zero-day attack detection. ML and deep learning (DL) are employed for designing intrusion detection systems (IDS). The continual emergence of new varieties of cyberattacks poses significant challenges for IDS solutions that depend on datasets of prior attack signatures. This manuscript presents a zero-day attack detection method employing an equilibrium optimizer with deep learning (ZDAD-EODL) to ensure cybersecurity. The ZDAD-EODL technique combines meta-heuristic feature subset selection with an optimised DL-based classification technique for zero-day attacks. Initially, a min-max scaler is utilized for normalizing the input data. For feature selection (FS), the ZDAD-EODL method utilizes the equilibrium optimizer (EO) model to choose feature subsets. In addition, the ZDAD-EODL technique employs the bi-directional gated recurrent unit (BiGRU) technique for the classification and identification of zero-day attacks. Finally, the detection performance of the BiGRU technique is further enhanced through the implementation of a subtraction-average-based optimizer (SABO)-based tuning process. The performance of the ZDAD-EODL approach is investigated on a benchmark dataset. The comparison study of the ZDAD-EODL approach showed a superior accuracy value of 98.47% over existing techniques.
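A minimal sketch of two of the pipeline stages described above, min-max scaling of the input features and a BiGRU classifier (layer sizes are arbitrary assumptions; the EO feature-selection and SABO tuning stages are not reproduced here):

```python
# Min-max scaling plus a small BiGRU classifier, sketched with assumed sizes.
import torch
import torch.nn as nn

def min_max_scale(x: torch.Tensor) -> torch.Tensor:
    """Scale each feature column to the [0, 1] range."""
    x_min = x.min(dim=0, keepdim=True).values
    x_max = x.max(dim=0, keepdim=True).values
    return (x - x_min) / (x_max - x_min + 1e-9)

class BiGRUClassifier(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64, n_classes: int = 2):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                 # x: (batch, seq_len, n_features)
        out, _ = self.gru(x)
        return self.head(out[:, -1, :])   # classify from the last time step

x = min_max_scale(torch.rand(32, 10))                     # 32 flows, 10 selected features
logits = BiGRUClassifier(n_features=10)(x.unsqueeze(1))   # each flow as a length-1 sequence
print(logits.shape)                                       # torch.Size([32, 2])
```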
Generally, Battery Energy Storage Systems (BESS) may meet the entire load requirement while the wind turbine and PV array are not yet providing energy, which enhances the reliability of the power system's distribution. The primary aim of the presented research is to produce a framework for enhancing power quality (PQ) in hybrid devices attached to renewable energy sources (RES) via an optimised fractional-order proportional-integral (FOPI) controller in a Unified Power Quality Conditioner (UPQC). The source input is made up of RES. To address the PQ difficulties, the UPQC employs series and shunt filter management techniques. Utilising the suggested Adaptive Bald Eagle optimisation algorithm (ABE-OA), the controller variables are optimally tuned for greater control. To confirm the effectiveness of the proposed method in PQ enhancement, the performance of the developed system is compared with that of previous techniques. The proposed method achieved a better rise time of 2042.5 and a settling minimum of 19,999.
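For context, a fractional-order PI controller generalises the integral action of a conventional PI controller to a non-integer integration order; in its standard transfer-function form (the specific gains and order tuned by ABE-OA in the study are not given here):

$$C(s) = K_p + \frac{K_i}{s^{\lambda}}, \qquad 0 < \lambda \le 1,$$

where the choice $\lambda = 1$ recovers the classical PI controller.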
This paper aims to frame a new rice disease prediction model that includes three major phases. Initially, median filtering (MF) is deployed during pre-processing, and then the proposed Fuzzy C-Means Clustering (FCM)-based segmentation is performed. Following that, Discrete Wavelet Transform (DWT), Scale-Invariant Feature Transform (SIFT), low-level features (colour and shape), and the proposed Local Binary Pattern (LBP)-based features are extracted and classified via a Multi-Layer Perceptron (MLP) and Long Short-Term Memory (LSTM) network, from which the predicted outcomes are obtained. For exact prediction, this work optimises the weights of the LSTM using an Inertia-Weighted Salp Swarm Optimisation (IW-SSO) model. Eventually, the performance of the IW-SSO method is established on varied metrics.
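A minimal sketch of the textbook fuzzy c-means update equations that such a segmentation step is built on (applied here to 1-D pixel intensities; this is the standard algorithm, not the paper's proposed variant):

```python
# Standard fuzzy c-means (FCM): alternating membership and cluster-centre updates.
import numpy as np

def fcm(values, n_clusters=3, m=2.0, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(values, dtype=float).reshape(-1, 1)        # (N, 1)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                         # memberships sum to 1
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]        # (C, 1)
        dist = np.abs(x - centers.T) + 1e-12                  # (N, C)
        u = 1.0 / (dist ** (2 / (m - 1)))
        u /= u.sum(axis=1, keepdims=True)
    return centers.ravel(), u

# Synthetic intensities from three assumed classes (e.g. background, leaf, lesion).
intensities = np.concatenate([np.random.normal(40, 5, 200),
                              np.random.normal(120, 10, 200),
                              np.random.normal(200, 8, 200)])
centers, memberships = fcm(intensities)
print(np.sort(centers))   # should recover roughly 40, 120, 200
```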
Due to recent improvements in forensic DNA testing kit sensitivity, there has been an increased demand in the criminal justice community to revisit past convictions or cold cases. Some of these cases have little biological evidence other than touch DNA in the form of archived latent fingerprint lift cards. In this study, a previously developed optimised workflow for this sample type was tested on aged fingerprints to determine if improved short tandem repeat (STR) profiles could be obtained. Two-year-old samples processed with the optimised workflow produced an average of approximately five more STR alleles per profile over the traditional method. The optimised workflow also produced detectable alleles in samples aged out to 28 years. Of the methods tested, the optimised workflow resulted in the most informative profiles from evidence samples more representative of the forensic need. This workflow is recommended for use with archived latent fingerprint samples, regardless of the archival time.
Global meteorology data are now widely used in various areas, but one of their applications, weather analogues, still requires exhaustive searches over the whole historical data. We present two optimisations for state-of-the-art weather analogue search algorithms: a parallelization and a heuristic search. The heuristic search (NDRank) limits the final number of results and does initial searches on a lower-resolution dataset to find candidates that, in a second phase, are locally validated. These optimisations were deployed in the Cloud and evaluated with ERA5 data from ECMWF. The proposed parallelization attained speedups close to optimal, and NDRank attains speedups higher than 4. NDRank can be applied to any parallel search, adding similar speedups. A substantial number of executions returned a set of analogues similar to the existing exhaustive search, and most of the remaining results presented a numerical value difference lower than 0.1%. The results demonstrate that it is now possible to search for weather analogues faster (even compared with parallel searches) with little to no error. Furthermore, NDRank can be applied to existing exhaustive searches, providing faster results with a small reduction in the precision of the results.
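A minimal sketch of the coarse-to-fine idea described (rank candidates on a downsampled field first, then re-score only the shortlisted candidates at full resolution); the field shapes, downsampling factor and candidate count are arbitrary assumptions, not the NDRank implementation:

```python
# Coarse-to-fine analogue search sketch with invented shapes and parameters.
import numpy as np

rng = np.random.default_rng(0)
archive = rng.standard_normal((5000, 64, 64))   # 5000 historical fields
query = rng.standard_normal((64, 64))

def coarsen(field, factor=4):
    h, w = field.shape[-2] // factor, field.shape[-1] // factor
    return field.reshape(*field.shape[:-2], h, factor, w, factor).mean(axis=(-3, -1))

# Phase 1: cheap scores on the coarse archive.
coarse_scores = np.linalg.norm(coarsen(archive) - coarsen(query), axis=(1, 2))
candidates = np.argsort(coarse_scores)[:50]      # keep the 50 best coarse matches

# Phase 2: exact scores only for the shortlisted candidates.
exact_scores = np.linalg.norm(archive[candidates] - query, axis=(1, 2))
best = candidates[np.argsort(exact_scores)[:5]]
print("top analogues:", best)
```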
This paper proposes a modified grey wolf optimiser-based adaptive super-twisting sliding mode control algorithm for the trajectory tracking and balancing of the rotary inverted pendulum system. The super-twisting sliding mode algorithm substantially alleviates the chattering present in classical sliding mode control. It provides robustness against model uncertainties and external disturbances given knowledge of the upper bounds of the uncertainties and disturbances. The gains of the super-twisting sliding mode algorithm are selected through an adaptive law. Parameters of the adaptation law are tuned using a modified grey wolf optimisation algorithm, a meta-heuristic optimisation technique. Lyapunov stability analysis is carried out to analyse the overall control system stability. The performance of the proposed control algorithm is compared with two other sliding mode control strategies from the literature, showing the better performance of the proposed control scheme.
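For reference, the standard (non-adaptive) super-twisting control law on a sliding variable s has the form below; the adaptive gain laws and the pendulum-specific sliding surface used in the paper are not reproduced here:

$$u = -k_1\,|s|^{1/2}\,\operatorname{sign}(s) + v, \qquad \dot{v} = -k_2\,\operatorname{sign}(s),$$

with gains $k_1, k_2 > 0$ chosen (here, adapted online) according to the disturbance bounds. Because the discontinuous $\operatorname{sign}(s)$ term acts through the integral channel $v$ rather than directly on $u$, the control signal is continuous, which is what attenuates chattering.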
This paper focuses on the trajectory tracking of quadrotors under bounded external disturbances. An optimised robust controller is proposed to drive the position and attitude of a quadrotor to converge to their references quickly. At first, nonsingular fast terminal sliding mode control is developed, which can guarantee not only the stability but also finite-time convergence of the closed-loop system. As the parameters of the designed controllers play a vital role in control performance, an improved beetle antennae search algorithm is proposed to optimise them. By employing the historical information of the beetle's antennae and dynamically updating the step size as well as the range of its searching, the optimisation is accelerated considerably to ensure the efficiency of the quadrotor control. The superiority of the proposed control scheme is demonstrated by simulation experiments, from which one can see that both the error and the overshoot of the trajectory tracking are reduced effectively.
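A minimal sketch of the baseline beetle antennae search (BAS) step that the improved algorithm builds on (the paper's enhancements using antennae history and dynamic step/range updates are not reproduced; the test cost function is an arbitrary example):

```python
# Baseline beetle antennae search (BAS): probe the objective on both sides of a
# random direction and step towards the better side.
import numpy as np

def bas(objective, x0, n_iter=200, d=1.0, step=1.0, eta=0.95, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        b = rng.standard_normal(x.size)
        b /= np.linalg.norm(b) + 1e-12                 # random antenna direction
        f_left, f_right = objective(x + d * b), objective(x - d * b)
        x = x - step * b * np.sign(f_left - f_right)   # move towards the lower value
        d, step = max(eta * d, 1e-3), max(eta * step, 1e-3)
    return x

# Example: tune two controller gains by minimising a quadratic surrogate cost.
cost = lambda k: (k[0] - 3.0) ** 2 + 2.0 * (k[1] - 0.5) ** 2
print(bas(cost, x0=[0.0, 0.0]))   # approaches [3.0, 0.5]
```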
With an optimised hall layout, progressive design collaborations, inspiring trends and AI-driven innovations, Heimtextil 2026 reacts to the current market situation and offers the industry a reliable constant in challenging times. Under the motto 'Lead the Change', the leading trade fair for home and contract textiles and textile design shows how challenges can be turned into opportunities. From 13 to 16 January, more than 3,100 exhibitors from 65 countries will provide a comprehensive market overview with new collections and textile solutions. As a knowledge hub, Heimtextil delivers new strategies and concrete solutions for future business success.
This study proposes a new component of the composite loss function minimised during training of Super-Resolution (SR) algorithms: the normalised structural similarity index loss LSSIMN, which has the potential to improve the natural appearance of reconstructed images. Deep learning-based SR algorithms reconstruct high-resolution images from low-resolution inputs, offering a practical means to enhance image quality without requiring superior imaging hardware, which is particularly important in medical applications where diagnostic accuracy is critical. Although recent SR methods employing convolutional and generative adversarial networks achieve high pixel fidelity, visual artefacts may persist, making the design of the training loss function essential for ensuring reliable and naturalistic image reconstruction. Our research shows, on two models (SR and the Invertible Rescaling Neural Network, IRN) trained on multiple benchmark datasets, that the LSSIMN term contributes significantly to visual quality, preserving structural fidelity on the reference datasets. The quantitative analysis shows that including this loss component yields a mean 2.88% improvement in the final structural similarity of the reconstructed images in the validation set compared to leaving it out, and a 0.218% improvement compared to using the component in non-normalised form.
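A minimal sketch of how an SSIM-based term is typically folded into a composite SR training loss (global, single-scale SSIM on grayscale tensors; the paper's exact normalisation of LSSIMN is not reproduced here):

```python
# Composite SR loss sketch: pixel L1 term plus an SSIM-based term.
import torch

def ssim_global(x, y, c1=0.01 ** 2, c2=0.03 ** 2):
    """Global single-scale SSIM between two image batches with values in [0, 1]."""
    mu_x, mu_y = x.mean(dim=(-2, -1)), y.mean(dim=(-2, -1))
    var_x, var_y = x.var(dim=(-2, -1)), y.var(dim=(-2, -1))
    cov = ((x - mu_x[..., None, None]) * (y - mu_y[..., None, None])).mean(dim=(-2, -1))
    return ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / \
           ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))

def composite_loss(sr, hr, weight_ssim=0.1):
    l1 = torch.nn.functional.l1_loss(sr, hr)
    l_ssim = 1.0 - ssim_global(sr, hr).mean()   # SSIM loss term (lower is better)
    return l1 + weight_ssim * l_ssim

sr, hr = torch.rand(4, 1, 64, 64), torch.rand(4, 1, 64, 64)
print(composite_loss(sr, hr))
```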
The integration of physics-based modelling and data-driven artificial intelligence (AI) has emerged as a transformative paradigm in computational mechanics. This perspective reviews the development and current status of AI-empowered frameworks, including data-driven methods, physics-informed neural networks, and neural operators. While these approaches have demonstrated significant promise, challenges remain in terms of robustness, generalisation, and computational efficiency. We delineate four promising research directions: (1) modular neural architectures inspired by traditional computational mechanics, (2) physics-informed neural operators for resolution-invariant operator learning, (3) intelligent frameworks for multiphysics and multiscale biomechanics problems, and (4) structural optimisation strategies based on physics constraints and reinforcement learning. These directions represent a shift toward foundational frameworks that combine the strengths of physics and data, opening new avenues for the modelling, simulation, and optimisation of complex physical systems.
Support structure, a critical component in design for additive manufacturing (DfAM), has been largely overlooked by additive manufacturing (AM) communities. The support structure stabilises overhanging sections, aids in heat dissipation, and reduces the risk of thermal warping, residual stress, and distortion, particularly in the fabrication of complex geometries that challenge traditional manufacturing methods. Despite the importance of support structures in AM, a systematic review covering all aspects of the design, optimisation, and removal of support structures remains lacking. This review provides an overview of various support structure types (contact and non-contact, as well as identical and dissimilar material configurations) and outlines optimisation methods, including geometric, topology, simulation-driven, data-driven, and multi-objective approaches. Additionally, the mechanisms of support removal, such as mechanical milling and chemical dissolution, and innovations like dissolvable supports and sensitised interfaces, are discussed. Future research directions are outlined, emphasising artificial intelligence (AI)-driven intelligent design, multi-material supports, sustainable support materials, support-free AM techniques, and innovative support removal methods, all of which are essential for advancing AM technology. Overall, this review aims to serve as a foundational reference for the design and optimisation of the support structure in AM.
The challenge of optimising multimodal functions within high-dimensional domains constitutes a notable difficulty in evolutionary computation research. Addressing this issue, this study introduces the Deep Backtracking Bare-Bones Particle Swarm Optimisation (DBPSO) algorithm, an innovative approach built upon the integration of the Deep Memory Storage Mechanism (DMSM) and the Dynamic Memory Activation Strategy (DMAS). The DMSM enhances memory retention for the globally optimal particle, promoting interaction between standard particles and their historically optimal counterparts. In parallel, the DMAS ensures that the updated position of the globally optimal particle is appropriately aligned with the deep memory repository. The efficacy of DBPSO was rigorously assessed through a series of simulations employing the CEC2017 benchmark suite. A comparative analysis juxtaposed DBPSO's performance against five contemporary evolutionary algorithms across two experimental conditions: Dimension-50 and Dimension-100. In the 50D trials, DBPSO attained an average ranking of 2.03, whereas in the 100D scenarios it improved to an average ranking of 1.9. Further examination utilising the CEC2019 benchmark functions revealed DBPSO's robustness, securing four first-place finishes, three second-place standings, and three third-place positions, culminating in an unmatched average ranking of 1.9 across all algorithms. These empirical results corroborate DBPSO's proficiency in delivering precise solutions for complex, high-dimensional optimisation challenges.
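A minimal sketch of the standard bare-bones PSO update that DBPSO extends (each particle samples its next position from a Gaussian centred midway between its personal best and the global best, with spread equal to their separation); the deep memory and backtracking mechanisms of DBPSO are not reproduced here:

```python
# Standard bare-bones PSO sketch on an example objective.
import numpy as np

def bare_bones_pso(objective, dim, n_particles=30, n_iter=300, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(*bounds, size=(n_particles, dim))
    pbest, pbest_val = x.copy(), np.apply_along_axis(objective, 1, x)
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iter):
        mean = 0.5 * (pbest + gbest)                 # midpoint of personal and global bests
        std = np.abs(pbest - gbest) + 1e-12          # spread = their separation
        x = rng.normal(mean, std)
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

sphere = lambda v: float(np.sum(v ** 2))
print(bare_bones_pso(sphere, dim=10))   # should approach the origin
```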
Impact ground pressure events occur frequently in coal mining processes, significantly affecting the personal safety of construction workers. Real-time microseismic monitoring of coal-rock rupture information can provide early warnings, and the seismic source location method is an essential indicator for evaluating a microseismic monitoring system. This paper proposes a nonlinear hybrid optimal particle swarm optimisation (PSO) microseismic positioning method based on this technique. The method first improves the PSO algorithm, using its global search capability to quickly find a feasible solution and provide a better initial value for the subsequent nonlinear optimal microseismic positioning step. This approach effectively prevents the microseismic positioning method from falling into a local optimum because of an over-reliance on the initial value. In addition, the nonlinear optimal microseismic positioning method further narrows the localisation error based on the PSO result. A simulation test demonstrates that the new method has a good positioning effect, and engineering application examples also show that the proposed method has high accuracy and strong positioning stability. The new method outperforms the separate positioning methods, both overall and in all three directions, making it more suitable for solving the microseismic positioning problem.
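A minimal sketch of the arrival-time misfit that such a source-location search minimises (homogeneous medium with a known velocity; the sensor layout, velocity and times are invented, and the paper's hybrid PSO/nonlinear scheme is replaced here by a plain objective evaluation):

```python
# Microseismic location sketch: misfit between observed and predicted P-wave
# arrival times for a candidate source (x, y, z, t0) in a homogeneous medium.
import numpy as np

sensors = np.array([[0, 0, 0], [800, 0, 0], [0, 800, 0],
                    [800, 800, 0], [400, 400, -200]], dtype=float)
v_p = 4500.0                                        # assumed P-wave velocity, m/s
true_src, true_t0 = np.array([350.0, 520.0, -310.0]), 0.12
t_obs = true_t0 + np.linalg.norm(sensors - true_src, axis=1) / v_p

def misfit(candidate):
    """Sum of squared arrival-time residuals for candidate = (x, y, z, t0)."""
    src, t0 = np.asarray(candidate[:3], dtype=float), candidate[3]
    t_pred = t0 + np.linalg.norm(sensors - src, axis=1) / v_p
    return float(np.sum((t_pred - t_obs) ** 2))

print(misfit([350, 520, -310, 0.12]))   # ~0 at the true source
print(misfit([0, 0, 0, 0]))             # larger misfit away from it
```

A swarm- or gradient-based optimiser would then minimise this misfit over the four unknowns.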
Highly efficient electrochemical treatment technology for dye-polluted wastewater is one of the hot research topics in industrial wastewater treatment. This study reported a three-dimensional electrochemical treatment process integrating graphite intercalation compound (GIC) adsorption, direct anodic oxidation, and ·OH oxidation for decolourising Reactive Black 5 (RB5) from aqueous solutions. The electrochemical process was optimised using the novel progressive central composite design-response surface methodology (CCD-NPRSM), a hybrid artificial neural network-extreme gradient boosting model (hybrid ANN-XGBoost), and classification and regression trees (CART). CCD-NPRSM and hybrid ANN-XGBoost were employed to minimise errors in evaluating the electrochemical process involving three manipulated operational parameters: current density, electrolysis (treatment) time, and initial dye concentration. The optimised decolourisation efficiencies were 99.30%, 96.63%, and 99.14% for CCD-NPRSM, hybrid ANN-XGBoost, and CART, respectively, compared to the 98.46% RB5 removal rate observed experimentally under optimum conditions: approximately 20 mA/cm^2 of current density, 20 min of electrolysis time, and 65 mg/L of RB5. The optimised mineralisation efficiencies ranged between 89% and 92% for the different models based on total organic carbon (TOC). Experimental studies confirmed that the predictive efficiency of the optimised models ranked in the descending order of hybrid ANN-XGBoost, CCD-NPRSM, and CART. Model validation using analysis of variance (ANOVA) revealed that hybrid ANN-XGBoost had a mean squared error (MSE) and a coefficient of determination (R^2) of approximately 0.014 and 0.998, respectively, for the RB5 removal efficiency, outperforming CCD-NPRSM with an MSE and R^2 of 0.518 and 0.998, respectively. Overall, the hybrid ANN-XGBoost approach is the most feasible technique for assessing the electrochemical treatment efficiency in RB5 dye wastewater decolourisation.
This article presents the design of a microfabricated bio-inspired flapping-wing nano aerial vehicle (NAV), driven by an electromagnetic system. Our approach is based on artificial wings composed of rigid bodies connected by compliant links, which optimise aerodynamic forces through replicating the complex wing kinematics of insects. The originality of this article lies in a new design methodology based on a triple equivalence between a 3D model, a multibody model, and a mass/spring model (0D), which reduces the number of parameters in the problem. This approach facilitates NAV optimisation by using only the mass/spring model, thereby simplifying the design process while maintaining high accuracy. Two wing geometries are studied and optimised in this article to produce large-amplitude wing motions (approximately 40°), enabling flapping and twisting motion in quadrature. The results are validated by experimental measurements for the large amplitude and through finite element simulations for the combined motion, confirming the effectiveness of this strategy for a NAV weighing less than 40 mg with a wingspan of under 3 cm.
This paper presents an investigation of the tribological performance of AA2024-B4C composites, with a specific focus on the influence of reinforcement and processing parameters. In this study three input parameters were varied: B4C weight percentage, milling time, and normal load, to evaluate their effects on two output parameters: wear loss and the coefficient of friction. AA2024 alloy was used as the matrix, while B4C particles were used as reinforcement. Due to the high hardness and wear resistance of B4C, the optimized composite shows strong potential for use in aerospace structural elements and automotive brake components. The optimisation of tribological behaviour was conducted using Taguchi-Grey Relational Analysis (Taguchi-GRA) and the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). A total of 27 combinations of input parameters were analysed, varying the B4C content (0, 10, and 15 wt.%), milling time (0, 15, and 25 h), and normal load (1, 5, and 10 N). Wear loss and the coefficient of friction were numerically evaluated and selected as criteria for optimisation. Artificial Neural Networks (ANNs) were also applied to the two outputs simultaneously. TOPSIS identified Alternative 1 as the optimal solution, confirming the results obtained using the Taguchi-Grey method. The optimal condition obtained (10 wt.% B4C, 25 h milling time, 10 N load) resulted in a minimum wear loss of 1.7 mg and a coefficient of friction of 0.176, confirming a significant enhancement in tribological behaviour. Based on the results, both the B4C content and the applied processing conditions have a significant impact on wear loss and frictional properties. This approach demonstrates high reliability and confidence, enabling the design of future composite materials with optimal properties for specific applications.
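A minimal sketch of the standard TOPSIS ranking procedure used for this kind of multi-criteria selection (the decision matrix and weights below are invented; both criteria, wear loss and coefficient of friction, are treated as cost criteria to be minimised):

```python
# Standard TOPSIS sketch for ranking alternatives on two cost criteria.
import numpy as np

decision = np.array([[1.7, 0.176],    # alternative 1
                     [2.4, 0.210],    # alternative 2
                     [3.1, 0.195]])   # alternative 3
weights = np.array([0.5, 0.5])
cost_criteria = np.array([True, True])               # both outputs should be minimised

norm = decision / np.linalg.norm(decision, axis=0)   # vector normalisation
weighted = norm * weights

ideal = np.where(cost_criteria, weighted.min(axis=0), weighted.max(axis=0))
anti_ideal = np.where(cost_criteria, weighted.max(axis=0), weighted.min(axis=0))

d_best = np.linalg.norm(weighted - ideal, axis=1)
d_worst = np.linalg.norm(weighted - anti_ideal, axis=1)
closeness = d_worst / (d_best + d_worst)

print("closeness coefficients:", np.round(closeness, 3))
print("best alternative:", int(np.argmax(closeness)) + 1)   # expect alternative 1
```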
An excellent cardinality estimation can make the query optimiser produce a good execution plan. Although there are some studies on cardinality estimation, the prediction results of existing cardinality estimators are inaccurate and query efficiency cannot be guaranteed either. In particular, they find it difficult to accurately capture the complex relationships between multiple tables in complex database systems. When dealing with complex queries, existing cardinality estimators cannot achieve good results. In this study, a novel cardinality estimator is proposed. It uses a BiLSTM network structure as its core technique and adds an attention mechanism. First, the columns involved in the query statements in the training set are sampled and compressed into bitmaps. Then, the Word2vec model is used to embed the word vectors of the query statements. Finally, the BiLSTM network and attention mechanism are employed to process the word vectors. The proposed model takes into consideration not only the correlation between tables but also the processing of complex predicates. Extensive experiments and an evaluation of the BiLSTM-Attention Cardinality Estimator (BACE) on the IMDB datasets are conducted. The results show that the deep learning model can significantly improve the quality of cardinality estimation, which plays a vital role in query optimisation for complex databases.
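A minimal sketch of a BiLSTM-with-attention regressor of the kind described (the vocabulary size, embedding dimensions and log-cardinality output head are arbitrary assumptions, not the exact BACE architecture):

```python
# BiLSTM + attention sketch for regressing a (log) cardinality from an
# embedded query token sequence. Sizes and the output head are assumptions.
import torch
import torch.nn as nn

class BiLSTMAttentionEstimator(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.score = nn.Linear(2 * hidden, 1)      # attention score per time step
        self.out = nn.Linear(2 * hidden, 1)        # predicted log-cardinality

    def forward(self, tokens):                     # tokens: (batch, seq_len)
        h, _ = self.lstm(self.embed(tokens))       # (batch, seq_len, 2*hidden)
        alpha = torch.softmax(self.score(h), dim=1)   # attention weights over the sequence
        context = (alpha * h).sum(dim=1)           # weighted sum of hidden states
        return self.out(context).squeeze(-1)

model = BiLSTMAttentionEstimator()
queries = torch.randint(0, 1000, (8, 20))          # 8 tokenised query statements
print(model(queries).shape)                        # torch.Size([8])
```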