Groundwater is a crucial ecological resource and a source of drinking water for a great percentage of the world population. The quality of groundwater in an area with industrial emissions and air pollution is an especially important issue that requires proper evaluation. This paper introduces a spatiotemporal deep learning model that incorporates metaheuristic optimization to predict groundwater quality in various pollution contexts. The given method combines the Spatial-Temporal-Assisted Deep Belief Network (StaDBN) with a hybrid Whale Optimization Algorithm and Tiki-Taka Algorithm (WOA-TTA) to model intricate patterns of contamination. Historical groundwater datasets with hydrochemical and temporal data are preprocessed, and pertinent, non-redundant features are determined with the Addax Optimization Algorithm (AOA). Spatial and temporal dependencies are explicitly integrated in the StaDBN architecture to facilitate representation learning, and network hyperparameters are optimized by the WOA-TTA module to increase training efficiency and predictive performance. The model was coded in Python and tested using common statistical measures, such as root mean square error (RMSE), Nash-Sutcliffe efficiency (NSE), mean absolute error (MAE), and the correlation coefficient (R). The proposed GWQP-StaDBN-WOA-TTA framework demonstrates superior predictive performance and interpretability compared to conventional machine learning and deep learning models, achieving higher correlation (R = 0.963), improved Nash-Sutcliffe efficiency (NSE = 0.84), and substantially lower prediction errors (MAE = 0.29, RMSE = 0.48), thereby validating its effectiveness for groundwater quality assessment under industrial and atmospheric pollution scenarios.
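The four evaluation measures named in this abstract (RMSE, MAE, NSE, R) have standard definitions; a minimal sketch of their computation, using made-up observation and prediction values:

```python
import numpy as np

def regression_metrics(obs, pred):
    """RMSE, MAE, Nash-Sutcliffe efficiency, and Pearson R for a prediction series."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    err = pred - obs
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    # NSE compares model error against the variance of observations around their mean
    nse = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
    r = np.corrcoef(obs, pred)[0, 1]
    return rmse, mae, nse, r

rmse, mae, nse, r = regression_metrics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
```

An NSE of 1 indicates a perfect model, 0 indicates a model no better than predicting the observed mean.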
Variable stiffness composites present a promising solution for mitigating impact loads via varying the fiber volume fraction layer-wise, thereby adjusting the panel's stiffness. Since each layer of the composite may be affected by a different failure mode, the optimal fiber volume fraction to suppress damage initiation and evolution is different across the layers. This research examines how re-allocating the fibers layer-wise enhances the composites' impact resistance. In this study, constant stiffness panels with the same fiber volume fraction throughout the layers are compared to variable stiffness ones obtained by varying the volume fraction layer-wise. A method is established that utilizes numerical analysis coupled with optimization techniques to determine the optimal fiber volume fraction in both scenarios. Three different reinforcement fibers (Kevlar, carbon, and glass) embedded in epoxy resin were studied. Panels were manufactured and tested under various loading conditions to validate the results. Kevlar reinforcement revealed the highest tensile toughness, followed by carbon and then glass fibers. Varying the reinforcement volume fraction significantly influences failure modes: higher fractions lead to matrix cracking and debonding, while lower fractions result in more fiber breakage. The optimal volume fraction for maximizing fiber breakage energy is around 45%, whereas it is about 90% for matrix cracking and debonding. A drop tower test was used to examine the composite structure's behavior under low-velocity impact, confirming the superiority of Kevlar-reinforced composites with variable stiffness. Conversely, glass-reinforced composites with constant stiffness revealed the lowest performance with the highest deflection. Across all reinforcement materials, the variable stiffness structure consistently outperformed its constant stiffness counterpart.
The increasing integration of cyber-physical components in Industry 4.0 water infrastructures has heightened the risk of false data injection (FDI) attacks, posing critical threats to operational integrity, resource management, and public safety. Traditional detection mechanisms often struggle to generalize across heterogeneous environments or adapt to sophisticated, stealthy threats. To address these challenges, we propose a novel evolutionary optimized transformer-based deep reinforcement learning framework (Evo-Transformer-DRL) designed for robust and adaptive FDI detection in smart water infrastructures. The proposed architecture integrates three powerful paradigms: a transformer encoder for modeling complex temporal dependencies in multivariate time series, a DRL agent for learning optimal decision policies in dynamic environments, and an evolutionary optimizer to fine-tune model hyperparameters. This synergy enhances detection performance while maintaining adaptability across varying data distributions. Specifically, the hyperparameters of both the transformer and DRL modules are optimized using an improved grey wolf optimizer (IGWO), ensuring a balanced trade-off between detection accuracy and computational efficiency. The model is trained and evaluated on three realistic Industry 4.0 water datasets: secure water treatment (SWaT), water distribution (WADI), and battle of the attack detection algorithms (BATADAL), which capture diverse attack scenarios in smart treatment and distribution systems. Comparative analysis against state-of-the-art baselines, including Transformer, DRL, bidirectional encoder representations from transformers (BERT), convolutional neural network (CNN), long short-term memory (LSTM), and support vector machine (SVM) models, demonstrates that our proposed Evo-Transformer-DRL framework consistently outperforms the others in key metrics such as accuracy, recall, area under the curve (AUC), and execution time. Notably, it achieves a maximum detection accuracy of 99.19%, highlighting its strong generalization capability across different testbeds. These results confirm the suitability of our hybrid framework for real-world Industry 4.0 deployment, where rapid adaptation, scalability, and reliability are paramount for securing critical infrastructure systems.
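The IGWO mentioned above builds on the standard grey wolf optimizer, in which the three best wolves (alpha, beta, delta) steer the rest of the pack. A minimal sketch of the standard GWO on a toy objective follows; this is the baseline algorithm, not the paper's improved variant, and all settings are illustrative:

```python
import numpy as np

def gwo(objective, dim, bounds, n_wolves=20, n_iters=200, seed=0):
    """Standard grey wolf optimizer (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (n_wolves, dim))
    for t in range(n_iters):
        fitness = np.apply_along_axis(objective, 1, X)
        order = np.argsort(fitness)
        alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]  # three best wolves
        a = 2.0 - 2.0 * t / n_iters            # control parameter decays 2 -> 0
        new_X = np.zeros_like(X)
        for i in range(n_wolves):
            pos = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2 * a * r1 - a              # |A| > 1 explores, |A| < 1 attacks
                C = 2 * r2
                D = np.abs(C * leader - X[i])
                pos += leader - A * D
            new_X[i] = np.clip(pos / 3.0, lo, hi)  # average pull of the three leaders
        X = new_X
    fitness = np.apply_along_axis(objective, 1, X)
    best = X[np.argmin(fitness)]
    return best, float(objective(best))

best, fbest = gwo(lambda x: np.sum(x ** 2), dim=5, bounds=(-5.0, 5.0))
```

An "improved" GWO typically modifies the decay schedule of `a` or the leader-weighting; hyperparameter tuning replaces the toy sphere objective with validation loss over the model's hyperparameter space.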
Data serves as the foundation for training and testing machine learning and artificial intelligence models. The most fundamental part of data is its attributes, or features. The feature set size changes from one dataset to another, and only the relevant features contribute meaningfully to classification accuracy. The presence of irrelevant features reduces the system's effectiveness, and classification performance often deteriorates on high-dimensional datasets due to the large search space. Thus, one of the significant obstacles affecting the performance of the learning process in the majority of machine learning and data mining techniques is the dimensionality of the datasets. Feature selection (FS) is an effective preprocessing step in classification tasks. The aim of applying FS is to exclude redundant and unrelated features while retaining the most informative ones, optimizing classification capability and compressing computational complexity. In this paper, a novel hybrid binary metaheuristic algorithm, termed hSC-FPA, is proposed by hybridizing the Flower Pollination Algorithm (FPA) and the Sine Cosine Algorithm (SCA). Hybridization balances the exploration capacity of SCA and the exploitation behavior of FPA to maintain a well-rounded search process: SCA guides the global search in the early iterations, while FPA's local pollination refines promising solutions in later stages. A binary conversion mechanism using a threshold function is implemented to handle the discrete nature of the feature selection problem. The functionality of the proposed hSC-FPA is validated on fourteen standard datasets from the UCI repository using the K-Nearest Neighbors (K-NN) classifier. Experimental results are benchmarked against the standalone SCA and FPA algorithms. The hSC-FPA consistently achieves higher classification accuracy, selects a more compact feature subset, and demonstrates superior convergence behavior. These findings support the stability and outperformance of the hybrid feature selection method presented.
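The binary conversion step described above is commonly realized with a sigmoid transfer function that turns each continuous position component into a probability of selecting the corresponding feature. A sketch under that assumption (the paper's exact threshold function may differ):

```python
import numpy as np

def binarize(position, rng):
    """Map a continuous metaheuristic position to a 0/1 feature mask via a
    sigmoid transfer function; each component becomes a selection probability."""
    prob = 1.0 / (1.0 + np.exp(-position))
    mask = (prob > rng.random(position.shape)).astype(int)
    if mask.sum() == 0:                  # guard: keep at least one feature
        mask[np.argmax(prob)] = 1
    return mask

rng = np.random.default_rng(42)
# large negative components -> rarely selected; large positive -> almost always selected
mask = binarize(np.array([-6.0, 0.0, 6.0, 2.0]), rng)
```

The resulting mask indexes the feature subset fed to the K-NN classifier, whose accuracy then serves as the fitness of the position.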
The failure of liquid storage tanks, among the most critical and widely used infrastructure systems, during severe earthquakes can have direct or indirect impacts on public safety. The significance of their safe performance even after destructive earthquakes, and their potential for continued operational use, underscores the necessity of appropriate seismic design. Hence, seismic isolation, specifically base isolation, has gained attention as a seismic control method to reduce damage to these infrastructures by increasing their vibration period. One prevalent type of seismic isolator used for tanks and other structures is the friction pendulum system (FPS) isolator. However, due to its fixed period or frequency, it may be susceptible to resonance effects during long-period earthquakes. This research explores an alternative solution by investigating the variable-curvature friction pendulum isolator (VFPI). This isolator type exhibits behavior similar to that of FPS isolators under low excitations and transforms into a pure friction system under high excitations. The study proposes optimizing a VFPI whose sliding surface is described by a polynomial function, termed the Polynomial Friction Pendulum Isolator (PFPI), by introducing a suitable objective function that minimizes the acceleration transmitted to the superstructure, thereby improving the dynamic performance of the elevated storage tank. The research utilizes two well-established metaheuristic algorithms for optimization and evaluates the effectiveness of the proposed isolator through time history analysis using the state-space procedure under various ground motion records. The results, particularly under long-period ground motions, indicate a substantial reduction in the dynamic response of an elevated liquid storage tank equipped with the optimized PFPI. This underscores the potential of the proposed solution in enhancing the seismic resilience of liquid storage tanks.
Early and accurate detection of bone cancer and marrow cell abnormalities is critical for timely intervention and improved patient outcomes. This paper proposes a novel hybrid deep learning framework that integrates a Convolutional Neural Network (CNN) with a Bidirectional Long Short-Term Memory (BiLSTM) architecture, optimized using the Firefly Optimization algorithm (FO). The proposed CNN-BiLSTM-FO model is tailored for structured biomedical data, capturing both local patterns and sequential dependencies in diagnostic features, while the Firefly Algorithm fine-tunes key hyperparameters to maximize predictive performance. The approach is evaluated on two benchmark biomedical datasets: one comprising diagnostic data for bone cancer detection and another for identifying marrow cell abnormalities. Experimental results demonstrate that the proposed method significantly outperforms standard deep learning models, including CNN, LSTM, BiLSTM, and CNN-LSTM hybrids. The CNN-BiLSTM-FO model achieves an accuracy of 98.55% for bone cancer detection and 96.04% for marrow abnormality classification. The paper also presents a detailed complexity analysis of the proposed algorithm and compares its performance across multiple evaluation metrics such as precision, recall, F1-score, and AUC. The results confirm the effectiveness of the firefly-based optimization strategy in improving classification accuracy and model robustness. This work introduces a scalable and accurate diagnostic solution with strong potential for integration into intelligent clinical decision-support systems.
Lithology identification has important basic geological research significance and engineering application value, and this paper proposes a Bayesian-optimized lithology identification method based on machine learning of rock visible and near-infrared spectral data. First, the rock spectral data are preprocessed using Savitzky-Golay (SG) smoothing to remove noise; then, the preprocessed spectra are reduced in dimension using Principal Component Analysis (PCA) to reduce data redundancy, retain the effective discriminative information, and obtain the rock spectral features; finally, a lithology identification model is established on these spectral features, with the model hyperparameters tuned by the Bayesian optimization (BO) algorithm to avoid hyperparameter combinations falling into local optima, and the predicted rock type is output. In addition, this paper conducts a comparative analysis of models based on Artificial Neural Network (ANN) and Random Forest (RF) classifiers, dimensionality-reduced versus full-band inputs, and different optimization algorithms, using the confusion matrix, accuracy, precision (P), recall (R), and F1 score (F1) as the evaluation indexes of model accuracy. The results indicate that the BO-ANN lithology identification model with dimensionality reduction achieves an accuracy of up to 99.80%, with the remaining metrics reaching up to 99.79%. Compared with the BO-RF model, it has higher identification accuracy and better stability for each rock type. The experiments and reliability analysis show that the proposed Bayesian-optimized lithology identification method has good robustness and generalization performance, which is of great significance for realizing fast and accurate lithology identification at tunnel sites.
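The SG-smoothing-then-PCA preprocessing pipeline can be sketched with NumPy alone (in practice `scipy.signal.savgol_filter` and `sklearn.decomposition.PCA` would be the usual choices); the synthetic "spectra" below are stand-ins, not real rock measurements:

```python
import numpy as np

def savgol(y, window=7, order=2):
    """Tiny Savitzky-Golay smoother: fit a local polynomial in each window
    and evaluate it at the window center."""
    half = window // 2
    ypad = np.pad(y, half, mode="edge")
    x = np.arange(window) - half
    out = np.empty(len(y))
    for i in range(len(y)):
        coeffs = np.polyfit(x, ypad[i:i + window], order)
        out[i] = np.polyval(coeffs, 0)
    return out

def pca(X, k):
    """Project rows of X onto the top-k principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 4, 50))                       # underlying smooth signal
spectra = clean + 0.1 * rng.standard_normal((30, 50))       # 30 noisy "spectra"
smoothed = np.array([savgol(s) for s in spectra])
features = pca(smoothed, k=5)                               # 30 samples x 5 features
```

The low-dimensional `features` matrix would then feed the BO-tuned ANN or RF classifier.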
Prediction of stability in the SG (Smart Grid) is essential for maintaining the consistency and reliability of power supply in grid infrastructure. Analyzing the fluctuations in power generation and consumption patterns of smart cities assists in effectively managing continuous power supply in the grid; it also helps avert overloading and permits effective energy storage. Even though many traditional techniques have predicted the consumption rate to preserve stability, prediction measures still require enhancement with minimized loss. To overcome the complications in existing studies, this paper predicts stability from the smart grid stability prediction dataset using machine learning algorithms. To accomplish this, pre-processing is performed initially to handle missing values, since mishandled missing values produce biased models, and feature scaling is applied to normalize the independent data features. The pre-processed data are then used for training and testing. Following that, regression is performed using a Modified PSO (Particle Swarm Optimization)-optimized XGBoost technique with a dynamic inertia weight update, which analyses variables such as gamma (G), reaction time (tau1-tau4), and power balance (p1-p4) to provide effective future stability in the SG. Since PSO attains the optimal solution by adjusting particle positions through dynamic inertia weights, it is integrated with XGBoost for the latter's scalability and faster computational speed. The hyperparameters of XGBoost are fine-tuned during training to achieve promising prediction outcomes. Regression results are measured through evaluation metrics: an MSE (Mean Square Error) of 0.011312781, an MAE (Mean Absolute Error) of 0.008596322, an RMSE (Root Mean Square Error) of 0.010636156, and a MAPE (Mean Absolute Percentage Error) of 0.0052, which determine the efficacy of the system.
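A linearly decreasing inertia weight is one common form of the dynamic update mentioned above. The sketch below shows the core PSO velocity rule with that schedule on a toy objective; the settings are illustrative, not the paper's modified PSO:

```python
import numpy as np

def pso(objective, dim, bounds, n=25, iters=200,
        w_max=0.9, w_min=0.4, c1=2.0, c2=2.0, seed=3):
    """PSO with a linearly decreasing inertia weight (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (n, dim))
    V = np.zeros((n, dim))
    pbest, pbest_f = X.copy(), np.apply_along_axis(objective, 1, X)
    g = pbest[np.argmin(pbest_f)].copy()
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / iters    # inertia decays each iteration
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (g - X)
        X = np.clip(X + V, lo, hi)
        f = np.apply_along_axis(objective, 1, X)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = X[improved], f[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, float(pbest_f.min())

g, fbest = pso(lambda x: np.sum(x ** 2), dim=4, bounds=(-10.0, 10.0))
```

High inertia early favors exploration of the XGBoost hyperparameter space; low inertia late refines the best candidates, with validation error as the objective.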
Effective completion design in hydraulic fracturing (HF) is crucial for optimizing production in unconventional reservoirs. Traditional geometric designs often fail to account for geological and engineering heterogeneity, leading to suboptimal stimulation. This study introduces a mechanism-guided, data-driven model for optimized completion design that covers the entire process from sweet spot evaluation to stage and cluster optimization. For geological sweet spot evaluation, a mechanism-guided weighted K-medoids clustering model was developed by assigning weights to petrophysical parameters based on their correlation with production profiles. Engineering sweet spots were characterized using bottomhole mechanical specific energy (MSEb) and minimum horizontal in-situ stress (Shmin). The completion design optimization employed dynamic programming and a hybrid multi-objective optimization approach (NSGA-II), integrating geological and engineering sweet spots with operational constraints. The study showed a positive correlation between high-quality geological sweet spots and production (average correlation coefficient of 0.34), and a negative correlation between fluid allocation and engineering sweet spots (correlation coefficient of −0.46). Field application in the Jimsar Sag, Xinjiang, demonstrated that the proposed model significantly outperforms traditional geometric designs. Test wells showed an average 186% increase in cumulative production per 100 m over three months compared to conventional wells. The key findings of this work provide a novel technical pathway for optimized completion design of unconventional reservoirs with significant engineering applicability.
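The weighted K-medoids idea above amounts to clustering with a feature-weighted distance, where the weights come from correlations with production. A sketch under that assumption, with synthetic data and made-up weights:

```python
import numpy as np

def weighted_kmedoids(X, w, k, iters=50, seed=0):
    """K-medoids with a feature-weighted Euclidean distance (minimization of
    intra-cluster cost); w holds per-feature importance weights."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(X), k, replace=False)
    for _ in range(iters):
        # distances from every point to every medoid, weighted per feature
        d = np.sqrt((((X[:, None, :] - X[medoids][None]) ** 2) * w).sum(-1))
        labels = d.argmin(1)
        new = medoids.copy()
        for c in range(k):
            idx = np.where(labels == c)[0]
            if len(idx):
                intra = np.sqrt((((X[idx][:, None] - X[idx][None]) ** 2) * w).sum(-1))
                new[c] = idx[intra.sum(1).argmin()]   # medoid minimizes intra-cluster cost
        if np.array_equal(new, medoids):
            break
        medoids = new
    return labels, medoids

rng = np.random.default_rng(1)
# two synthetic groups of "wellbore segments" in a 3-parameter space
X = np.vstack([rng.normal(0, 0.3, (20, 3)), rng.normal(3, 0.3, (20, 3))])
labels, medoids = weighted_kmedoids(X, w=np.array([0.5, 0.3, 0.2]), k=2)
```

In the paper's setting, rows would be logged intervals along the lateral and clusters would rank sweet-spot quality.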
In the dynamic landscape of software technologies, the demand for sophisticated applications across diverse industries is ever-increasing. However, predicting software defects remains a crucial challenge for ensuring the resilience and dependability of software systems. This study presents a novel software defect prediction technique that significantly enhances performance through a hybrid machine learning approach. The methodology integrates a Genetic Algorithm (GA) for precise feature selection and a Decision Tree (DT) for robust classification, and leverages the Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) algorithms for precision-driven optimization. The use of datasets from varied sources enriches the predictive power of the model. Of particular significance is the focus on enhancing the prediction process through a highly refined PSO-ACO algorithm, thereby optimizing the efficiency and effectiveness of the GA-DT hybrid model. A thorough evaluation of the proposed approach across seven software projects demonstrates that the GA-DT with PSO-ACO algorithm surpasses its counterparts in accuracy and reliability. Furthermore, the hybrid approach performs strongly in terms of F-measure, with an increase rate of 78%.
Detecting Alzheimer's disease is essential for patient care, as an accurate diagnosis influences treatment options. Classifying dementia from non-dementia in brain MRIs is challenging due to features such as hippocampal atrophy, and manual diagnosis is susceptible to error. Optimal computer-aided diagnosis (CAD) systems are therefore essential for improving accuracy and reducing misclassification risks. This study proposes an optimized ensemble method (CEOE-Net) that begins with the selection of pre-trained models, including DenseNet121, ResNet50V2, and ResNet152V2, for complementary feature extraction. Each selected model is enhanced with a channel attention (CA) block to improve the feature extraction process. In addition, this study employs the Short-Time Fourier Transform (STFT) technique with each individual model for hierarchical feature extraction before the final predictions when classifying MRI images of demented and non-demented individuals, treating these as backbone models for building the ensemble. STFT highlights subtle differences in brain structure and activity by converting spatial data into the frequency domain, particularly when combined with CA mechanisms that emphasize relevant features. The predictions generated from these models are then processed by the Chaotic Evolution Optimization (CEO) algorithm, which determines the optimal weight set for each backbone model to maximize its contribution. The CEO optimizer explores the weight distribution to ensure the most effective combination of model predictions, thus significantly improving overall ensemble classification accuracy. This study utilized three datasets for validation, including two private clinical brain MRI datasets (OSASIS and ADNI), to test the proposed model's effectiveness. Image augmentation techniques were also employed to enhance dataset diversity and improve classification performance. The proposed CEOE-Net outperforms conventional baseline models and existing methods, showing its effectiveness as a clinical tool for the accurate classification of dementia and non-dementia MRI brain images, as well as autistic and non-autistic facial features. It achieved consistent accuracies of 93.44% on OSASIS and 81.94% on ADNI.
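The final weighted combination of backbone predictions, which the CEO algorithm would tune, reduces to a convex combination of per-model class probabilities. A sketch with made-up probabilities and weights (the weights below are illustrative, not optimizer outputs):

```python
import numpy as np

def weighted_ensemble(probs, weights):
    """Combine per-model class probabilities with a normalized weight vector.
    probs has shape (models, samples, classes)."""
    weights = np.asarray(weights, float)
    weights = weights / weights.sum()                 # convex combination
    combined = np.tensordot(weights, probs, axes=1)   # -> (samples, classes)
    return combined.argmax(axis=1), combined

# three backbone models, two samples, binary (non-dementia / dementia) outputs
probs = np.array([
    [[0.6, 0.4], [0.3, 0.7]],   # model 1
    [[0.8, 0.2], [0.4, 0.6]],   # model 2
    [[0.4, 0.6], [0.2, 0.8]],   # model 3
])
labels, combined = weighted_ensemble(probs, weights=[0.5, 0.3, 0.2])
```

The optimizer's job is then to search the weight simplex for the vector that maximizes validation accuracy of `labels`.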
Fractional differential equations (FDEs) provide a powerful tool for modeling systems with memory and non-local effects, but understanding their underlying structure remains a significant challenge. While numerous numerical and semi-analytical methods exist to find solutions, new approaches are needed to analyze the intrinsic properties of the FDEs themselves. This paper introduces a novel computational framework for the structural analysis of FDEs involving iterated Caputo derivatives. The methodology is based on a transformation that recasts the original FDE into an equivalent higher-order form, represented as the sum of a closed-form, integer-order component G(y) and a residual fractional power series Ψ(x). This transformed FDE is subsequently reduced to a first-order ordinary differential equation (ODE). The primary novelty of the proposed methodology lies in treating the structure of the integer-order component G(y) not as fixed, but as a parameterizable polynomial whose coefficients can be determined via global optimization. Using particle swarm optimization, the framework identifies an optimal ODE architecture by minimizing a dual objective that balances solution accuracy against a high-fidelity reference and the magnitude of the truncated residual series. The effectiveness of the approach is demonstrated on both a linear FDE and a nonlinear fractional Riccati equation. Results demonstrate that the framework successfully identifies an optimal, low-degree polynomial ODE architecture that is not necessarily identical to the forcing function of the original FDE. This work provides a new tool for analyzing the underlying structure of FDEs and gaining deeper insights into the interplay between local and non-local dynamics in fractional systems.
Multi-instance image generation remains a challenging task in the field of computer vision. While existing diffusion models demonstrate impressive fidelity in image generation, they often struggle to precisely control each object's shape, pose, and size. Methods like layout-to-image and mask-to-image provide spatial guidance but frequently suffer from object shape distortion, overlaps, and poor consistency, particularly in complex scenes with multiple objects. To address these issues, we introduce PolyDiffusion, a contour-based diffusion framework that encodes each object's contour as a boundary-coordinate sequence, decoupling object shapes and positions. This approach allows better control over object geometry and spatial positioning, which is critical for achieving high-quality multi-instance generation. We formulate the training process as a multi-objective optimization problem balancing three key objectives: a denoising diffusion loss to maintain overall image fidelity, a cross-attention contour alignment loss to ensure precise shape adherence, and a reward-guided denoising objective that minimizes the Fréchet distance to real images. In addition, the Object Space-Aware Attention module fuses contour tokens with visual features, while a prior-guided fusion mechanism utilizes inter-object spatial relationships and class semantics to enhance consistency across multiple objects. Experimental results on benchmark datasets such as COCO-Stuff and VOC-2012 demonstrate that PolyDiffusion significantly outperforms existing layout-to-image and mask-to-image methods, achieving notable improvements in both image quality and instance-level segmentation accuracy. The implementation of PolyDiffusion is available at https://github.com/YYYYYJS/PolyDiffusion (accessed on 06 August 2025).
Accurate daily suspended sediment load (SSL) prediction is essential for sustainable water resource management, sediment control, and environmental planning. However, SSL prediction is highly complex due to its nonlinear and dynamic nature, making traditional empirical models inadequate. This study proposes a novel hybrid approach, integrating the Adaptive Neuro-Fuzzy Inference System (ANFIS) with the Gradient-Based Optimizer (GBO), to enhance SSL forecasting accuracy. The research compares the performance of ANFIS-GBO with three alternative models: standard ANFIS, ANFIS with Particle Swarm Optimization (ANFIS-PSO), and ANFIS with Grey Wolf Optimization (ANFIS-GWO). Historical SSL and streamflow data from the Bailong River Basin, China, are used to train and validate the models. The input selection process is optimized using the Multivariate Adaptive Regression Splines (MARS) method. Model performance is evaluated using statistical metrics such as Root Mean Square Error (RMSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE), Nash-Sutcliffe Efficiency (NSE), and the coefficient of determination (R²). Additionally, visual assessments, including scatter plots, Taylor diagrams, and violin plots, provide further insights into model reliability. The results indicate that including historical SSL data improves predictive accuracy, with ANFIS-GBO outperforming the other models. ANFIS-GBO achieves the lowest RMSE and MAE and the highest NSE and R², demonstrating its superior learning ability and adaptability. The findings highlight the effectiveness of nature-inspired optimization algorithms in enhancing sediment load forecasting and contribute to the advancement of AI-based hydrological modeling. Future research should explore the integration of additional environmental and climatic variables to further enhance predictive capabilities.
The surface morphology of Ceratocanthus beetle elytra was investigated for its spike surface texture and geometry using Scanning Electron Microscopy (SEM). Material properties were analyzed for both the surface and the cross-section of the elytra using the nano-indentation technique. The spike texture was significantly rigid compared with the non-textured zone, and a bi-layer system of elastic modulus (E) and hardness (H) was identified at the elytra cross-section. The normal load acting on the spike texture during free-fall conditions was estimated analytically and a deflection equation was derived. The design of the spike texture with a conical base was studied for minimization of deflection and volume using the Non-dominated Sorting Genetic Algorithm (NSGA-II) optimization technique, confirming the smart design of the natural solution. The frictional behavior of the elytra was studied using a fundamental tribology test, and the role of the oriented spike texture was investigated for frictional anisotropy. The compression resistance of the full beetle was evaluated for both conglobated and non-conglobated configurations, and tensile strengths were compared using the Brazilian test. Puncture and wear resistance of the full elytra were characterized and correlated with the beetle's defense mechanism.
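At the heart of the NSGA-II step above is non-dominated sorting: a design survives if no other design is at least as good on both objectives (deflection and volume) and strictly better overall. A sketch of that dominance check, with made-up candidate geometries:

```python
def pareto_front(points):
    """Return the non-dominated set for bi-objective minimization."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p))) and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front

# hypothetical (deflection, volume) pairs for candidate spike geometries
designs = [(0.9, 2.0), (0.5, 3.0), (0.7, 2.5), (1.1, 1.5), (0.6, 3.5)]
front = pareto_front(designs)   # (0.6, 3.5) is dominated by (0.5, 3.0)
```

NSGA-II layers this sorting with crowding-distance selection and genetic operators to evolve the full Pareto front of deflection-volume trade-offs.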
In the current digital context, safeguarding copyright is a major issue, particularly for architectural drawings produced by students. These works are frequently the result of innovative academic thinking combining creativity and technical precision, and they are particularly vulnerable to illegal reproduction when disseminated in digital format. This research suggests, for the first time, an innovative approach to copyright protection that embeds a double digital watermark to address this challenge. The solution relies on a synergistic fusion of several sophisticated methods: Krawtchouk Optimized Octonion Moments (OKOM), Quaternion Singular Value Decomposition (QSVD), and the Discrete Wavelet Transform (DWT). To improve watermark embedding, the biologically inspired Chaos-White Shark Optimization (CWSO) algorithm is used, which dynamically adapts essential parameters such as the scaling factor of the insertion. Two watermarks are thus inserted at the same time, an institutional logo and a student image, encoded in the host image (the architectural plan) through octonionic projections. This minimizes the amount of data to be embedded while increasing robustness. The suggested approach guarantees a balance between the imperceptibility of the watermark (validated by PSNR > 47 dB and SSIM > 0.99) and its resistance to different attacks (JPEG compression, noise, rotation, resizing, filtering, etc.), as proven by the normalized correlation values (NC > 0.9) obtained after extraction. This method therefore represents notable progress in securing academic works in architecture, providing effective, discreet, and reversible digital protection that does not harm the visual appearance of the original works.
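The two figures of merit quoted above, PSNR for imperceptibility and NC for extraction fidelity, have standard definitions. A sketch with a synthetic image and watermark (stand-ins, not the paper's data):

```python
import numpy as np

def psnr(original, watermarked, peak=255.0):
    """Peak signal-to-noise ratio in dB between host and watermarked images."""
    mse = np.mean((original.astype(float) - watermarked.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def normalized_correlation(w, w_extracted):
    """Normalized correlation between embedded and extracted watermark signals."""
    w, we = np.asarray(w, float), np.asarray(w_extracted, float)
    return float(np.sum(w * we) / np.sqrt(np.sum(w ** 2) * np.sum(we ** 2)))

img = np.full((64, 64), 128.0)
marked = img + 1.0                 # a uniform one-level embedding perturbation
p = psnr(img, marked)              # MSE = 1 -> 20*log10(255) ≈ 48.13 dB
nc = normalized_correlation([1, 0, 1, 1], [1, 0, 1, 1])
```

An embedding whose PSNR stays above ~40 dB is generally considered visually transparent, which is why the paper's >47 dB figures support imperceptibility.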
基金 Funding for this research work was provided under the Research Support Program for Central Labs at King Khalid University through project number CL/CO/B/6.
文摘 Groundwater is a crucial ecological resource and a source of drinking water for a large percentage of the world population. The quality of groundwater in areas with industrial emissions and air pollution is an especially important issue that requires proper evaluation. This paper introduces a spatiotemporal deep learning model that incorporates metaheuristic optimization to predict groundwater quality in various pollution contexts. The method combines the Spatial-Temporal-Assisted Deep Belief Network (StaDBN) with a hybrid Whale Optimization Algorithm and Tiki-Taka Algorithm (WOA-TTA) to model intricate patterns of contamination. Historical groundwater data sets containing hydrochemical and temporal data are preprocessed, and pertinent, non-redundant features are determined with the Addax Optimization Algorithm (AOA). Spatial and temporal dependencies are explicitly integrated into the StaDBN architecture to facilitate representation learning, and network hyperparameters are optimized by the WOA-TTA module to increase training efficiency and predictive performance. The model was coded in Python and tested against common statistical measures, such as root mean square error (RMSE), Nash-Sutcliffe efficiency (NSE), mean absolute error (MAE), and the correlation coefficient (R). The proposed GWQP-StaDBN-WOA-TTA framework demonstrates superior predictive performance and interpretability compared to conventional machine learning and deep learning models, achieving higher correlation (R = 0.963), improved Nash-Sutcliffe efficiency (NSE = 0.84), and substantially lower prediction errors (MAE = 0.29, RMSE = 0.48), thereby validating its effectiveness for groundwater quality assessment under industrial and atmospheric pollution scenarios.
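The evaluation metrics cited above (RMSE, MAE, NSE, R) follow standard definitions and can be sketched from scratch; the `obs`/`pred` arrays below are illustrative values, not the study's data:

```python
import math

def rmse(obs, pred):
    # Root mean square error
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    # Mean absolute error
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def nse(obs, pred):
    # Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations
    mean_o = sum(obs) / len(obs)
    sse = sum((o - p) ** 2 for o, p in zip(obs, pred))
    sst = sum((o - mean_o) ** 2 for o in obs)
    return 1.0 - sse / sst

def corr(obs, pred):
    # Pearson correlation coefficient R
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    return cov / (so * sp)

obs = [1.0, 2.0, 3.0, 4.0]   # illustrative observed series
pred = [1.1, 1.9, 3.2, 3.8]  # illustrative predicted series
```

A perfect prediction yields RMSE = MAE = 0, NSE = 1, and R = 1; NSE below 0 means the model is worse than predicting the observed mean.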
基金 Funded by the American University of Sharjah, United Arab Emirates, award number EN 9502-FRG19-M-E75.
文摘 Variable stiffness composites present a promising solution for mitigating impact loads by varying the fiber volume fraction layer-wise, thereby adjusting the panel's stiffness. Since each layer of the composite may be affected by a different failure mode, the optimal fiber volume fraction to suppress damage initiation and evolution differs across the layers. This research examines how re-allocating the fibers layer-wise enhances the composites' impact resistance. In this study, constant stiffness panels with the same fiber volume fraction throughout the layers are compared to variable stiffness ones obtained by varying the volume fraction layer-wise. A method is established that couples numerical analysis with optimization techniques to determine the optimal fiber volume fraction in both scenarios. Three different reinforcement fibers (Kevlar, carbon, and glass) embedded in epoxy resin were studied. Panels were manufactured and tested under various loading conditions to validate the results. Kevlar reinforcement revealed the highest tensile toughness, followed by carbon and then glass fibers. Varying the reinforcement volume fraction significantly influences failure modes: higher fractions lead to matrix cracking and debonding, while lower fractions result in more fiber breakage. The optimal volume fraction for maximizing fiber breakage energy is around 45%, whereas it is about 90% for matrix cracking and debonding. A drop tower test was used to examine the composite structure's behavior under low-velocity impact, confirming the superiority of Kevlar-reinforced composites with variable stiffness. Conversely, glass-reinforced composites with constant stiffness revealed the lowest performance, with the highest deflection. Across all reinforcement materials, the variable stiffness structure consistently outperformed its constant stiffness counterpart.
文摘 The increasing integration of cyber-physical components in Industry 4.0 water infrastructures has heightened the risk of false data injection (FDI) attacks, posing critical threats to operational integrity, resource management, and public safety. Traditional detection mechanisms often struggle to generalize across heterogeneous environments or adapt to sophisticated, stealthy threats. To address these challenges, we propose a novel evolutionary optimized transformer-based deep reinforcement learning framework (Evo-Transformer-DRL) designed for robust and adaptive FDI detection in smart water infrastructures. The proposed architecture integrates three powerful paradigms: a transformer encoder for modeling complex temporal dependencies in multivariate time series, a DRL agent for learning optimal decision policies in dynamic environments, and an evolutionary optimizer to fine-tune model hyperparameters. This synergy enhances detection performance while maintaining adaptability across varying data distributions. Specifically, hyperparameters of both the transformer and DRL modules are optimized using an improved grey wolf optimizer (IGWO), ensuring a balanced trade-off between detection accuracy and computational efficiency. The model is trained and evaluated on three realistic Industry 4.0 water datasets: secure water treatment (SWaT), water distribution (WADI), and battle of the attack detection algorithms (BATADAL), which capture diverse attack scenarios in smart treatment and distribution systems. Comparative analysis against state-of-the-art baselines, including Transformer, DRL, bidirectional encoder representations from transformers (BERT), convolutional neural network (CNN), long short-term memory (LSTM), and support vector machine (SVM) models, demonstrates that the proposed Evo-Transformer-DRL framework consistently outperforms the others on key metrics such as accuracy, recall, area under the curve (AUC), and execution time. Notably, it achieves a maximum detection accuracy of 99.19%, highlighting its strong generalization capability across different testbeds. These results confirm the suitability of the hybrid framework for real-world Industry 4.0 deployment, where rapid adaptation, scalability, and reliability are paramount for securing critical infrastructure systems.
基金 Supported by a research grant from Lahore College for Women University (LCWU), Lahore, Pakistan.
文摘 Data serves as the foundation for training and testing machine learning and artificial intelligence models. The most fundamental part of data is its attributes or features, and the feature set size changes from one dataset to another. Only the relevant features contribute meaningfully to classification accuracy; the presence of irrelevant features reduces the system's effectiveness. Classification performance often deteriorates on high-dimensional datasets due to the large search space. Thus, the dimensionality of the datasets is one of the significant obstacles affecting the performance of the learning process in the majority of machine learning and data mining techniques. Feature selection (FS) is an effective preprocessing step in classification tasks. The aim of applying FS is to exclude redundant and unrelated features while retaining the most informative ones, optimizing classification capability and compressing computational complexity. In this paper, a novel hybrid binary metaheuristic algorithm, termed hSC-FPA, is proposed by hybridizing the Flower Pollination Algorithm (FPA) and the Sine Cosine Algorithm (SCA). Hybridization combines the exploration capacity of SCA with the exploitation behavior of FPA to maintain a balanced search process: SCA guides the global search in the early iterations, while FPA's local pollination refines promising solutions in later stages. A binary conversion mechanism using a threshold function is implemented to handle the discrete nature of the feature selection problem. The proposed hSC-FPA is validated on fourteen standard datasets from the UCI repository using the K-Nearest Neighbors (K-NN) classifier, and experimental results are benchmarked against the standalone SCA and FPA algorithms. hSC-FPA consistently achieves higher classification accuracy, selects a more compact feature subset, and demonstrates superior convergence behavior. These findings support the stability and outperformance of the presented hybrid feature selection method.
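The binary conversion step described above can be illustrated with a sigmoid transfer function that maps a continuous position vector to a 0/1 feature mask; the sigmoid, threshold, and error-versus-subset-size fitness weighting below are generic conventions in binary metaheuristic FS, not necessarily the paper's exact scheme:

```python
import math
import random

def binarize(position, threshold=0.5):
    # Map each continuous dimension through a sigmoid, then threshold
    # to decide whether the corresponding feature is selected (1) or not (0).
    return [1 if 1.0 / (1.0 + math.exp(-x)) > threshold else 0 for x in position]

def fitness(mask, accuracy, alpha=0.99):
    # Typical FS fitness: weigh classification error against subset size
    # (accuracy would come from a wrapped classifier such as K-NN).
    n_selected = sum(mask)
    error = 1.0 - accuracy
    return alpha * error + (1 - alpha) * n_selected / len(mask)

random.seed(0)
pos = [random.uniform(-4, 4) for _ in range(10)]  # a candidate's position
mask = binarize(pos)                               # its binary feature mask
```

Lower fitness is better: the `alpha` weight trades classification error against how many features the mask keeps.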
文摘 The failure of liquid storage tanks, among the most critical and widely used infrastructure systems, during severe earthquakes can have direct or indirect impacts on public safety. The importance of their safe performance even after destructive earthquakes, and their potential for continued operational use, underscores the necessity of appropriate seismic design. Hence, seismic isolation, specifically base isolation, has gained attention as a seismic control method that reduces damage to these infrastructures by increasing their vibration period. One prevalent type of seismic isolator used for tanks and other structures is the friction pendulum system (FPS) isolator. However, due to its fixed period, it may be susceptible to resonance effects during long-period earthquakes. This research explores an alternative solution by investigating the variable-curvature friction pendulum isolator (VFPI). This isolator type behaves similarly to FPS isolators under low excitations and transforms into a pure friction system under high excitations. The study proposes optimizing a VFPI whose sliding surface is described by a polynomial function, termed the Polynomial Friction Pendulum Isolator (PFPI), by introducing a suitable objective function that minimizes the acceleration transmitted to the superstructure, thereby improving the dynamic performance of the elevated storage tank. The research utilizes two well-established metaheuristic algorithms for optimization and evaluates the effectiveness of the proposed isolator through time history analysis using the state space procedure under various ground motion records. Results, particularly under long-period ground motions, indicate a substantial reduction in the dynamic response of an elevated liquid storage tank equipped with the optimized PFPI, underscoring the potential of the proposed solution for enhancing the seismic resilience of liquid storage tanks.
文摘 Early and accurate detection of bone cancer and marrow cell abnormalities is critical for timely intervention and improved patient outcomes. This paper proposes a novel hybrid deep learning framework that integrates a Convolutional Neural Network (CNN) with a Bidirectional Long Short-Term Memory (BiLSTM) architecture, optimized using the Firefly Optimization algorithm (FO). The proposed CNN-BiLSTM-FO model is tailored for structured biomedical data, capturing both local patterns and sequential dependencies in diagnostic features, while the Firefly Algorithm fine-tunes key hyperparameters to maximize predictive performance. The approach is evaluated on two benchmark biomedical datasets: one comprising diagnostic data for bone cancer detection and another for identifying marrow cell abnormalities. Experimental results demonstrate that the proposed method significantly outperforms standard deep learning models, including CNN, LSTM, BiLSTM, and CNN-LSTM hybrids. The CNN-BiLSTM-FO model achieves an accuracy of 98.55% for bone cancer detection and 96.04% for marrow abnormality classification. The paper also presents a detailed complexity analysis of the proposed algorithm and compares its performance across multiple evaluation metrics such as precision, recall, F1-score, and AUC. The results confirm the effectiveness of the firefly-based optimization strategy in improving classification accuracy and model robustness. This work introduces a scalable and accurate diagnostic solution with strong potential for integration into intelligent clinical decision-support systems.
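The firefly update that drives such hyperparameter tuning follows a canonical move rule: a firefly drifts toward a brighter one with distance-decayed attractiveness plus a random perturbation. The sketch below uses generic default parameters, not the paper's settings:

```python
import math
import random

def firefly_move(xi, xj, beta0=1.0, gamma=1.0, alpha=0.2, rng=random):
    # Move firefly xi toward the brighter firefly xj: attractiveness
    # beta0 * exp(-gamma * r^2) decays with squared distance, and alpha
    # scales a small uniform random walk.
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
    beta = beta0 * math.exp(-gamma * r2)
    return [a + beta * (b - a) + alpha * (rng.random() - 0.5)
            for a, b in zip(xi, xj)]
```

With `gamma = 0` and `alpha = 0` the mover lands exactly on the brighter firefly; increasing `gamma` shortens the pull at long range, which is what lets distant fireflies explore independently.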
基金 Supported by the National Natural Science Foundation of China (Grant Nos. 52379103 and 52279103) and the Natural Science Foundation of Shandong Province (Grant No. ZR2023YQ049).
文摘 Lithology identification has important basic geological research significance and engineering application value, and this paper proposes a Bayesian-optimized lithology identification method based on machine learning of rock visible and near-infrared spectral data. First, the rock spectral data are preprocessed using Savitzky-Golay (SG) smoothing to remove noise; then, the preprocessed spectral data are reduced in dimensionality using Principal Component Analysis (PCA) to remove redundancy, retain the effective discriminative information, and obtain the rock spectral features; finally, a lithology identification model is established on these features, its hyperparameters are tuned with the Bayesian optimization (BO) algorithm to keep the hyperparameter combination from falling into a local optimum, and the predicted rock type is output. In addition, this paper conducts comparative analyses of models based on Artificial Neural Network (ANN) and Random Forest (RF) classifiers, dimensionality-reduced versus full-band inputs, and different optimization algorithms, using the confusion matrix, accuracy, precision (P), recall (R), and F1 value (F1) as evaluation indexes of model accuracy. The results indicate that the BO-ANN model after dimensionality reduction achieves an accuracy of up to 99.80%, with the remaining metrics reaching up to 99.79%. Compared with the BO-RF model, it has higher identification accuracy and better stability for each rock type. The experiments and reliability analysis show that the proposed method has good robustness and generalization performance, which is of great significance for realizing fast and accurate lithology identification at tunnel sites.
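The SG smoothing step can be illustrated with the classical five-point quadratic Savitzky-Golay kernel (convolution coefficients (-3, 12, 17, 12, -3)/35); this is a generic sketch with fixed window and order, not the authors' exact preprocessing settings:

```python
def savgol5(y):
    # Five-point quadratic/cubic Savitzky-Golay smoothing.
    # Interior points use the closed-form convolution kernel; the two
    # edge points on each side are left unchanged for simplicity.
    c = (-3.0, 12.0, 17.0, 12.0, -3.0)
    out = list(y)
    for i in range(2, len(y) - 2):
        out[i] = sum(c[k] * y[i - 2 + k] for k in range(5)) / 35.0
    return out

# A quadratic signal is reproduced exactly by a quadratic SG filter,
# which is why SG smoothing removes noise without flattening peaks.
y = [float(x * x) for x in range(7)]
smoothed = savgol5(y)
```

Unlike a plain moving average, this kernel preserves any polynomial up to cubic order at the window center, so spectral peak shapes survive the denoising.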
基金 Prince Sattam bin Abdulaziz University, project number PSAU/2023/R/1445.
文摘 Prediction of stability in the SG (Smart Grid) is essential for maintaining the consistency and reliability of the power supply in grid infrastructure. Analyzing fluctuations in the power generation and consumption patterns of smart cities assists in effectively managing a continuous power supply in the grid; it also helps avert overloading and permits effective energy storage. Even though many traditional techniques have predicted the consumption rate to preserve stability, prediction measures still require enhancement with minimized loss. To overcome the complications in existing studies, this paper predicts stability from the smart grid stability prediction dataset using machine learning algorithms. To accomplish this, pre-processing is performed initially to handle missing values, since mishandled missing values produce biased models, and feature scaling is applied to normalize the independent data features. The pre-processed data are then used for training and testing. Following that, regression is performed using a Modified PSO (Particle Swarm Optimization)-optimized XGBoost technique with a dynamic inertia weight update, which analyses variables such as gamma (G), reaction time (tau1-tau4), and power balance (p1-p4) to provide effective future stability in the SG. Since PSO attains the optimal solution by adjusting particle positions through dynamic inertia weights, it is integrated with XGBoost for its scalability and faster computational speed. The hyperparameters of XGBoost are fine-tuned during training to achieve promising prediction outcomes. Regression results are measured through evaluation metrics such as an MSE (Mean Square Error) of 0.011312781, an MAE (Mean Absolute Error) of 0.008596322, an RMSE (Root Mean Square Error) of 0.010636156, and a MAPE (Mean Absolute Percentage Error) of 0.0052, which determine the efficacy of the system.
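A linearly decreasing inertia weight is one common form of "dynamic inertia weight update"; the sketch below assumes that variant and a toy sphere objective, not the paper's exact modification or its coupling to XGBoost hyperparameters:

```python
import random

def pso_minimize(f, dim, n=20, iters=100, w_max=0.9, w_min=0.4,
                 c1=2.0, c2=2.0, seed=1):
    # Plain PSO with a linearly decreasing ("dynamic") inertia weight.
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    pbest = [row[:] for row in X]
    pval = [f(x) for x in X]
    g = min(range(n), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for t in range(iters):
        # Inertia decays from w_max to w_min over the run: early iterations
        # explore, late iterations exploit around the best-known region.
        w = w_max - (w_max - w_min) * t / max(iters - 1, 1)
        for i in range(n):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (pbest[i][d] - X[i][d])
                           + c2 * rng.random() * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            val = f(X[i])
            if val < pval[i]:
                pval[i], pbest[i] = val, X[i][:]
                if val < gval:
                    gval, gbest = val, X[i][:]
    return gbest, gval

# Toy usage: minimize the 3-D sphere function (stand-in for a real
# XGBoost validation-loss objective over its hyperparameters).
best, best_val = pso_minimize(lambda x: sum(t * t for t in x), dim=3)
```

In the paper's setting `f` would evaluate XGBoost's validation error for a candidate hyperparameter vector rather than a closed-form function.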
基金 Supported by the National Natural Science Foundation of China (Grant/Award Nos. 52310001009 and 52122401).
文摘 Effective completion design in hydraulic fracturing (HF) is crucial for optimizing production in unconventional reservoirs. Traditional geometric designs often fail to account for geological and engineering heterogeneity, leading to suboptimal stimulation. This study introduces a mechanism-guided, data-driven model for optimized completion design that covers the entire process from sweet spot evaluation to stage and cluster optimization. For geological sweet spot evaluation, a mechanism-guided weighted K-medoids clustering model was developed by assigning weights to petrophysical parameters based on their correlation with production profiles. Engineering sweet spots were characterized using bottomhole mechanical specific energy (MSEb) and minimum horizontal in-situ stress (Shmin). The completion design optimization employed dynamic programming and a hybrid multi-objective optimization approach (NSGA-II), integrating geological and engineering sweet spots with operational constraints. The study showed a positive correlation between high-quality geological sweet spots and production (average correlation coefficient of 0.34), and a negative correlation between fluid allocation and engineering sweet spots (correlation coefficient of -0.46). Field application in the Jimsar Sag, Xinjiang, demonstrated that the proposed model significantly outperforms traditional geometric designs: test wells showed an average 186% increase in cumulative production per 100 m over three months compared to conventional wells. The key findings of this work provide a novel technical pathway for optimized completion design of unconventional reservoirs with significant engineering applicability.
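NSGA-II rests on non-dominated sorting of candidate designs into Pareto fronts; a compact sketch for minimization objectives follows (the points are illustrative, not the study's design candidates):

```python
def dominates(a, b):
    # a dominates b if a is no worse in every objective and strictly
    # better in at least one (minimization convention).
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated_sort(points):
    # Peel off successive Pareto fronts: front 0 holds the points
    # dominated by nobody, front 1 those dominated only by front 0, etc.
    fronts, remaining = [], set(range(len(points)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(sorted(front))
        remaining -= set(front)
    return fronts

# Illustrative two-objective candidates (both objectives minimized)
pts = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 5)]
fronts = non_dominated_sort(pts)
```

The first front is the Pareto-optimal set among the candidates; NSGA-II then uses front rank (plus a crowding distance tiebreak) for selection.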
文摘 In the dynamic landscape of software technologies, the demand for sophisticated applications across diverse industries is ever-increasing. However, predicting software defects remains a crucial challenge for ensuring the resilience and dependability of software systems. This study presents a novel software defect prediction technique that significantly enhances performance through a hybrid machine learning approach. The methodology integrates a Genetic Algorithm (GA) for precise feature selection, a Decision Tree (DT) for robust classification, and the capabilities of Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) algorithms for precision-driven optimization. The use of datasets from varied sources enriches the predictive prowess of the model. Of particular significance is the focus on enhancing the prediction process through a highly refined PSO-ACO algorithm, thereby optimizing the efficiency and effectiveness of the GA-DT hybrid model. The proposed approach is evaluated thoroughly across seven software projects. Results demonstrate that the GA-DT with PSO-ACO algorithm surpasses its counterparts, showing superior accuracy and reliability. Furthermore, the hybrid approach demonstrates outstanding performance in terms of F-measure, with an impressive increase rate of 78%.
基金 Supported in part by the Science and Technology Major Special Project Fund of Changsha (No. kh2401010); in part by the High-Performance Computing Center of Central South University; by the National Natural Science Foundation of China (Grant Nos. 82022024 and 31970572); the Science and Technology Innovation Program of Hunan Province (2021RC4018, 2021RC5027); the Innovation-Driven Project of Central South University (Grant No. 2020CX003); and NIH grants U01 MH122591, 1U01MH116489, 1R01MH110920, and R01MH126459.
文摘 Detecting Alzheimer's disease is essential for patient care, as an accurate diagnosis influences treatment options. Classifying dementia versus non-dementia in brain MRIs is challenging due to features such as hippocampal atrophy, and manual diagnosis is susceptible to error. Optimal computer-aided diagnosis (CAD) systems are essential for improving accuracy and reducing misclassification risks. This study proposes an optimized ensemble method (CEOE-Net) that begins with the selection of pre-trained models, including DenseNet121, ResNet50V2, and ResNet152V2, for distinct feature extraction. Each selected model is enhanced with a channel attention (CA) block to improve the feature extraction process. In addition, this study employs the Short-Time Fourier Transform (STFT) technique with each individual model for hierarchical feature extraction before making final predictions in classifying MRI images of demented and non-demented individuals, treating them as backbone models for building the ensemble. STFT highlights subtle differences in brain structure and activity by converting spatial data into the frequency domain, particularly when combined with CA mechanisms that emphasize relevant features. The predictions generated from these models are then processed by the Chaotic Evolution Optimization (CEO) algorithm, which determines the optimal weight set for each backbone model to maximize its contribution. The CEO optimizer explores the weight distribution to ensure the most effective combination of model predictions, significantly improving overall ensemble performance. This study utilized three datasets for validation, including two private clinical brain MRI datasets (OASIS and ADNI), to test the proposed model's effectiveness. Image augmentation techniques were also employed to enhance dataset diversity and improve classification performance. The proposed CEOE-Net outperforms conventional baseline models and existing methods, demonstrating its effectiveness as a clinical tool for the accurate classification of dementia and non-dementia MRI brain images, as well as autistic and non-autistic facial features. It achieved consistent accuracies of 93.44% on OASIS and 81.94% on ADNI.
基金 Research Council of Lithuania (LMTLT), agreement No. S-PD-24-120; funded by the Research Council of Lithuania.
文摘 Fractional differential equations (FDEs) provide a powerful tool for modeling systems with memory and non-local effects, but understanding their underlying structure remains a significant challenge. While numerous numerical and semi-analytical methods exist for finding solutions, new approaches are needed to analyze the intrinsic properties of the FDEs themselves. This paper introduces a novel computational framework for the structural analysis of FDEs involving iterated Caputo derivatives. The methodology is based on a transformation that recasts the original FDE into an equivalent higher-order form, represented as the sum of a closed-form, integer-order component G(y) and a residual fractional power series Ψ(x). This transformed FDE is subsequently reduced to a first-order ordinary differential equation (ODE). The primary novelty of the proposed methodology lies in treating the structure of the integer-order component G(y) not as fixed, but as a parameterizable polynomial whose coefficients can be determined via global optimization. Using particle swarm optimization, the framework identifies an optimal ODE architecture by minimizing a dual objective that balances solution accuracy against a high-fidelity reference and the magnitude of the truncated residual series. The effectiveness of the approach is demonstrated on both a linear FDE and a nonlinear fractional Riccati equation. Results demonstrate that the framework successfully identifies an optimal, low-degree polynomial ODE architecture that is not necessarily identical to the forcing function of the original FDE. This work provides a new tool for analyzing the underlying structure of FDEs and gaining deeper insights into the interplay between local and non-local dynamics in fractional systems.
基金 Supported in part by the Scientific Research Fund of the National Natural Science Foundation of China (Grant No. 62372168); the Hunan Provincial Natural Science Foundation of China (Grant No. 2023JJ30266); the Research Project on Teaching Reform in Hunan Province (No. HNJG-2022-0791); the Hunan University of Science and Technology (No. 2022-44-8); and the National Social Science Funds of China (19BZX044).
文摘 Multi-instance image generation remains a challenging task in the field of computer vision. While existing diffusion models demonstrate impressive fidelity in image generation, they often struggle with precisely controlling each object's shape, pose, and size. Methods like layout-to-image and mask-to-image provide spatial guidance but frequently suffer from object shape distortion, overlaps, and poor consistency, particularly in complex scenes with multiple objects. To address these issues, we introduce PolyDiffusion, a contour-based diffusion framework that encodes each object's contour as a boundary-coordinate sequence, decoupling object shapes and positions. This approach allows for better control over object geometry and spatial positioning, which is critical for achieving high-quality multi-instance generation. We formulate the training process as a multi-objective optimization problem balancing three key objectives: a denoising diffusion loss to maintain overall image fidelity, a cross-attention contour alignment loss to ensure precise shape adherence, and a reward-guided denoising objective that minimizes the Fréchet distance to real images. In addition, the Object Space-Aware Attention module fuses contour tokens with visual features, while a prior-guided fusion mechanism utilizes inter-object spatial relationships and class semantics to enhance consistency across multiple objects. Experimental results on benchmark datasets such as COCO-Stuff and VOC-2012 demonstrate that PolyDiffusion significantly outperforms existing layout-to-image and mask-to-image methods, achieving notable improvements in both image quality and instance-level segmentation accuracy. The implementation of PolyDiffusion is available at https://github.com/YYYYYJS/PolyDiffusion (accessed on 06 August 2025).
基金 Supported by the National Natural Science Foundation of China (52350410465) and the General Projects of Guangdong Natural Science Research Projects (2023A1515011520).
文摘 Accurate daily suspended sediment load (SSL) prediction is essential for sustainable water resource management, sediment control, and environmental planning. However, SSL prediction is highly complex due to its nonlinear and dynamic nature, making traditional empirical models inadequate. This study proposes a novel hybrid approach, integrating the Adaptive Neuro-Fuzzy Inference System (ANFIS) with the Gradient-Based Optimizer (GBO), to enhance SSL forecasting accuracy. The research compares the performance of ANFIS-GBO with three alternative models: standard ANFIS, ANFIS with Particle Swarm Optimization (ANFIS-PSO), and ANFIS with Grey Wolf Optimization (ANFIS-GWO). Historical SSL and streamflow data from the Bailong River Basin, China, are used to train and validate the models. The input selection process is optimized using the Multivariate Adaptive Regression Splines (MARS) method. Model performance is evaluated using statistical metrics such as Root Mean Square Error (RMSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE), Nash-Sutcliffe Efficiency (NSE), and the Determination Coefficient (R^2). Additionally, visual assessments, including scatter plots, Taylor diagrams, and violin plots, provide further insights into model reliability. The results indicate that including historical SSL data improves predictive accuracy, with ANFIS-GBO outperforming the other models: it achieves the lowest RMSE and MAE and the highest NSE and R^2, demonstrating superior learning ability and adaptability. The findings highlight the effectiveness of nature-inspired optimization algorithms in enhancing sediment load forecasting and contribute to the advancement of AI-based hydrological modeling. Future research should explore the integration of additional environmental and climatic variables to further enhance predictive capabilities.
基金 Supported by the Ministero dell'Università e della Ricerca (MUR-PRIN 2022, 2022ATZCJN AMPHYBIA), CUP No. E53D23003040006; the Ministero dell'Istruzione, dell'Università e della Ricerca (MIUR-PON 2018 PROSCAN), CUP No. E96C18000440008; and the European Union NextGenerationEU PNRR Spoke 7 CN00000013 HPC, CUP No. E63C22000970007.
文摘 The surface morphology of Ceratocanthus beetle elytra was investigated for its spike surface texture and geometry using Scanning Electron Microscopy (SEM). Material properties were analyzed for both the surface and the cross-section of the elytra using the nano-indentation technique. The spike texture was significantly more rigid than the non-textured zone, and a bi-layer system of elastic modulus (E) and hardness (H) was identified at the elytra cross-section. The normal load acting on the spike texture under free-fall conditions was estimated analytically and a deflection equation was derived. The design of the spike texture with a conical base was studied for minimization of deflection and volume using the Non-dominated Sorting Genetic Algorithm (NSGA-II) optimization technique, confirming the smart design of the natural solution. The frictional behavior of the elytra was studied using a fundamental tribology test, and the role of the oriented spike texture in frictional anisotropy was investigated. The compression resistance of the full beetle was evaluated for both conglobated and non-conglobated configurations, and tensile strengths were compared using the Brazilian test. The puncture and wear resistance of full elytra were characterized and correlated with the beetle's defense mechanism.
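The Brazilian test mentioned above infers indirect tensile strength from the peak load of a diametrally compressed disc via the standard closed-form relation σt = 2P/(πDt); the specimen dimensions below are hypothetical, not the study's measurements:

```python
import math

def brazilian_tensile_strength(peak_load_n, diameter_m, thickness_m):
    # Indirect tensile strength (Pa) of a disc specimen loaded in
    # compression across its diameter: sigma_t = 2P / (pi * D * t).
    return 2.0 * peak_load_n / (math.pi * diameter_m * thickness_m)

# Hypothetical specimen: 5 kN peak load, 50 mm diameter, 10 mm thickness
sigma_t = brazilian_tensile_strength(5000.0, 0.050, 0.010)  # in Pa
```

The disc fails in tension along the loaded diameter even though the applied load is compressive, which is what makes the test convenient for brittle specimens.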
文摘 In the current digital context, safeguarding copyright is a major issue, particularly for architectural drawings produced by students. These works are frequently the result of innovative academic thinking combining creativity and technical precision, and they are particularly vulnerable to illegal reproduction when disseminated in digital format. This research proposes, for the first time, an innovative approach to copyright protection that embeds a double digital watermark to address this challenge. The solution relies on a synergistic fusion of several sophisticated methods: Optimized Krawtchouk Octonion Moments (OKOM), Quaternion Singular Value Decomposition (QSVD), and the Discrete Wavelet Transform (DWT). To improve watermark embedding, the biologically inspired Chaos-White Shark Optimization (CWSO) algorithm is used, which dynamically adapts essential parameters such as the insertion scaling factor. Two watermarks are thus inserted at the same time: an institutional logo and a student image, encoded in the host image (the architectural plan) through octonionic projections. This minimizes the amount of data to be embedded while increasing robustness. The suggested approach strikes a balance between the imperceptibility of the watermark (validated by PSNR values > 47 dB and SSIM > 0.99) and its resistance to different attacks (JPEG compression, noise, rotation, resizing, filtering, etc.), as shown by the normalized correlation values (NC > 0.9) obtained after extraction. This method therefore represents notable progress in securing academic works in architecture, providing effective, discreet, and reversible digital protection that does not harm the visual appearance of the original works.
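The imperceptibility and robustness figures above (PSNR > 47 dB, NC > 0.9) follow standard definitions; a stdlib sketch on toy 8-bit pixel values (illustrative data, not the paper's images):

```python
import math

def psnr(orig, marked, max_val=255.0):
    # Peak signal-to-noise ratio between original and watermarked pixels.
    mse = sum((a - b) ** 2 for a, b in zip(orig, marked)) / len(orig)
    return float("inf") if mse == 0 else 10.0 * math.log10(max_val ** 2 / mse)

def normalized_correlation(w, w_ext):
    # NC between the embedded and extracted watermark sequences:
    # 1.0 means the watermark survived the attack perfectly.
    num = sum(a * b for a, b in zip(w, w_ext))
    den = (math.sqrt(sum(a * a for a in w))
           * math.sqrt(sum(b * b for b in w_ext)))
    return num / den

orig = [100, 120, 130, 140]    # illustrative host pixels
marked = [101, 119, 131, 139]  # same pixels after embedding (+/- 1)
```

Here every pixel changed by one gray level, giving MSE = 1 and a PSNR of about 48 dB, which is consistent with the > 47 dB imperceptibility range the abstract reports.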