In the development of linear quadratic regulator (LQR) algorithms, the Riccati equation approach offers two important characteristics: it is recursive and readily meets the existence condition. However, these attributes apply only to transformed singular systems, and the efficiency of the regulator may be undermined if constraints are violated in the nonsingular versions. To address this gap, we introduce a direct approach to the LQR problem for linear singular systems that avoids any transformation and eliminates the need for regularity assumptions. To achieve this goal, we begin by formulating a quadratic cost function to derive the LQR algorithm through a penalized and weighted regression framework, and then connect it to a constrained minimization problem using Bellman's criterion. Next, we employ a backward dynamic programming strategy over a finite horizon to develop an LQR algorithm for the original system. Finally, we address the stability and convergence analysis under the reachability and observability assumptions of a hypothetical system constructed from the pencil of augmented matrices and connected using the Hamiltonian diagonalization technique.
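The quadratic cost and its Bellman connection can be sketched in standard finite-horizon form; the symbols below (E, A, B, Q, R, S) are the generic singular-system matrices, assumed for illustration rather than taken from the paper:

```latex
\min_{u_0,\dots,u_{N-1}} \; J
  = \tfrac{1}{2}\,x_N^{\top} S\, x_N
  + \tfrac{1}{2}\sum_{k=0}^{N-1}\left(x_k^{\top} Q\, x_k + u_k^{\top} R\, u_k\right)
\quad\text{s.t.}\quad E\,x_{k+1} = A\,x_k + B\,u_k ,
```

with the backward Bellman recursion

```latex
V_k(x_k) = \min_{u_k}\;\tfrac{1}{2}\left(x_k^{\top} Q\, x_k + u_k^{\top} R\, u_k\right) + V_{k+1}(x_{k+1}),
\qquad V_N(x_N) = \tfrac{1}{2}\,x_N^{\top} S\, x_N .
```

When E is singular the constraint cannot in general be solved for x_{k+1}, which is why a transformation-free, regularity-free treatment is needed.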
Breast cancer is among the leading causes of cancer mortality globally, and its diagnosis through histopathological image analysis is often prone to inter-observer variability and misclassification. Existing machine learning (ML) methods struggle with intra-class heterogeneity and inter-class similarity, necessitating more robust classification models. This study presents a hybrid model that combines deep feature extraction with deep learning (DL), an ML classifier ensemble, and Bat Swarm Optimization (BSO) hyperparameter optimization to improve breast cancer histopathology (BCH) image classification. A dataset of 804 Hematoxylin and Eosin (H&E) stained images labeled as Benign, In situ, Invasive, and Normal (ICIAR2018_BACH_Challenge) was used. ResNet50 was employed for feature extraction, while Support Vector Machine (SVM), Random Forest (RF), XGBoost (XGB), Decision Tree (DT), and AdaBoost (ADB) classifiers performed the classification. BSO tuned the hyperparameters within a soft-voting ensemble. Accuracy, precision, recall, specificity, F1-score, Receiver Operating Characteristic (ROC), and Precision-Recall (PR) curves served as performance metrics. The ensemble outperformed the individual classifiers, achieving higher accuracy (~90.0%), precision (~86.4%), recall (~86.3%), and specificity (~96.6%). The robustness of the model was verified by both ROC and PR curves, which showed AUC values of 1.00, 0.99, and 0.98 for the Benign, Invasive, and In situ classes, respectively. This ensemble model delivers a strong and clinically valid methodology for breast cancer classification that enhances precision and minimizes diagnostic errors. Future work should focus on explainable AI, multi-modal fusion, few-shot learning, and edge computing for real-world deployment.
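The soft-voting step of such an ensemble is simple to state concretely. The sketch below uses made-up class probabilities and weights (not the paper's BSO-tuned values): it averages weighted per-classifier probabilities over the four classes and takes the argmax.

```python
import numpy as np

# Hypothetical per-classifier class probabilities for 3 samples over
# 4 classes (Benign, In situ, Invasive, Normal); values are illustrative.
probs = {
    "svm": np.array([[0.7, 0.1, 0.1, 0.1],
                     [0.2, 0.5, 0.2, 0.1],
                     [0.1, 0.1, 0.7, 0.1]]),
    "rf":  np.array([[0.6, 0.2, 0.1, 0.1],
                     [0.1, 0.6, 0.2, 0.1],
                     [0.2, 0.1, 0.6, 0.1]]),
    "xgb": np.array([[0.8, 0.1, 0.05, 0.05],
                     [0.1, 0.4, 0.4, 0.1],
                     [0.1, 0.2, 0.6, 0.1]]),
}

def soft_vote(probs, weights):
    """Weighted average of class probabilities, then argmax per sample."""
    total = sum(weights[name] * p for name, p in probs.items())
    total /= sum(weights.values())
    return total, total.argmax(axis=1)

# Weights that a hyperparameter optimizer (e.g., BSO) might have tuned.
weights = {"svm": 1.0, "rf": 0.8, "xgb": 1.2}
avg, labels = soft_vote(probs, weights)
```

Because each classifier's rows sum to one, the weighted average also sums to one per sample, so the result remains a valid probability distribution.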
Detecting faces under occlusion remains a significant challenge in computer vision due to variations caused by masks, sunglasses, and other obstructions. Addressing this issue is crucial for applications such as surveillance, biometric authentication, and human-computer interaction. This paper provides a comprehensive review of face detection techniques developed to handle occluded faces. Studies are categorized into four main approaches: feature-based, machine learning-based, deep learning-based, and hybrid methods. We analyzed state-of-the-art studies within each category, examining their methodologies, strengths, and limitations on widely used benchmark datasets and highlighting their adaptability to partial and severe occlusions. The review also identifies key challenges, including dataset diversity, model generalization, and computational efficiency. Our findings reveal that deep learning methods dominate recent studies, benefiting from their ability to extract hierarchical features and handle complex occlusion patterns. More recently, researchers have increasingly explored Transformer-based architectures, such as the Vision Transformer (ViT) and Swin Transformer, to further improve detection robustness under challenging occlusion scenarios. In addition, hybrid approaches, which aim to combine traditional and modern techniques, are emerging as a promising direction for improving robustness. This review provides valuable insights for researchers aiming to develop more robust face detection systems and for practitioners seeking to deploy reliable solutions in real-world, occlusion-prone environments. Further improvements and broader datasets are required to develop more scalable, robust, and efficient models that can handle complex occlusions in real-world scenarios.
This research aims to address the challenges of fault detection and isolation (FDI) in digital grids, focusing on improving the reliability and stability of power systems. Traditional fault detection techniques, such as rule-based fuzzy systems and conventional FDI methods, often struggle with the dynamic nature of modern grids, resulting in delays and inaccuracies in fault classification. To overcome these limitations, this study introduces a Hybrid Neuro-Fuzzy Fault Detection Model that combines the adaptive learning capabilities of neural networks with the reasoning strength of fuzzy logic. The model's performance was evaluated through extensive simulations on the IEEE 33-bus test system, considering various fault scenarios, including line-to-ground faults (LGF), three-phase short circuits (3PSC), and harmonic distortions (HD). The quantitative results show that the model achieves 97.2% accuracy, a false negative rate (FNR) of 1.9%, and a false positive rate (FPR) of 2.3%, demonstrating its high precision in fault diagnosis. The qualitative analysis further highlights the model's adaptability and its potential for seamless integration into smart grids, microgrids, and renewable energy systems. By dynamically refining fuzzy inference rules, the model enhances fault detection efficiency without compromising computational feasibility. These findings contribute to the development of more resilient and adaptive fault management systems, paving the way for advanced smart grid technologies.
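At its core, the fuzzy half of such a detector evaluates a rule base over measured grid signals. The toy two-rule sketch below uses triangular memberships and entirely hypothetical rule definitions and signal ranges (the paper's actual rule base is not reproduced here):

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a to b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fault_score(current_dev, voltage_sag):
    """Tiny Mamdani-style sketch with two hypothetical rules:
    R1: IF current deviation is HIGH AND voltage sag is DEEP THEN fault.
    R2: IF current deviation is LOW THEN normal.
    Returns a fault likelihood in [0, 1]."""
    high_dev = tri(current_dev, 0.3, 1.0, 1.7)
    deep_sag = tri(voltage_sag, 0.2, 0.8, 1.4)
    low_dev = tri(current_dev, -0.5, 0.0, 0.5)
    fault = min(high_dev, deep_sag)  # fuzzy AND = min
    normal = low_dev
    # Defuzzify as a weighted average of rule outputs (fault=1, normal=0).
    total = fault + normal
    return fault / total if total else 0.0
```

A neuro-fuzzy model would then adapt the membership breakpoints (the a, b, c values) from data rather than fixing them by hand.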
Face detection is a critical component in modern security, surveillance, and human-computer interaction systems, with widespread applications in smartphones, biometric access control, and public monitoring. However, detecting faces with high levels of occlusion, such as those covered by masks, veils, or scarves, remains a significant challenge, as traditional models often fail to generalize under such conditions. This paper presents a hybrid approach that combines traditional handcrafted feature extraction techniques, namely the Histogram of Oriented Gradients (HOG) and Canny edge detection, with modern deep learning models. The goal is to improve face detection accuracy under occlusions. The proposed method leverages the structural strengths of HOG and edge-based object proposals while exploiting the feature extraction capabilities of Convolutional Neural Networks (CNNs). The effectiveness of the proposed model is assessed using a custom dataset containing 10,000 heavily occluded face images and a subset of the Common Objects in Context (COCO) dataset for non-face samples. The COCO dataset was selected for its variety and realism in background contexts. Experimental evaluations demonstrate significant performance improvements compared to baseline CNN models. Results indicate that DenseNet121 combined with HOG outperforms its counterparts in classification metrics, with an F1-score of 87.96% and precision of 88.02%. Enhanced performance is achieved through reduced false positives and improved localization accuracy with the integration of object proposals based on Canny and contour detection. While the proposed method increases inference time from 33.52 to 97.80 ms, it achieves a notable improvement in precision from 80.85% to 88.02% when comparing the baseline DenseNet121 model to its hybrid counterpart. Limitations of the method include higher computational cost and the need for careful tuning of parameters across the edge detection, handcrafted features, and CNN components. These findings highlight the potential of combining handcrafted and learned features for occluded face detection tasks.
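The HOG component of such a hybrid can be illustrated with a minimal orientation-histogram computation over one cell. This is a simplified stand-in for a full HOG descriptor; a real pipeline (e.g., `skimage.feature.hog`) adds a cell grid, block normalization, and bin interpolation.

```python
import numpy as np

def hog_cell_histogram(img, n_bins=9):
    """HOG-style sketch: pool gradient magnitudes into an unsigned
    orientation histogram (0-180 degrees) for a single cell."""
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]  # central differences
    gy[1:-1, :] = img[2:, :] - img[:-2, :]
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0
    bin_idx = (ang / (180.0 / n_bins)).astype(int) % n_bins
    hist = np.zeros(n_bins)
    np.add.at(hist, bin_idx.ravel(), mag.ravel())
    return hist / (np.linalg.norm(hist) + 1e-9)  # L2 normalization

# A synthetic vertical edge produces purely horizontal gradients,
# so all the mass lands in the 0-degree orientation bin.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
h = hog_cell_histogram(img)
```

In the hybrid detector, such histograms would be concatenated over cells and fed alongside (or fused with) CNN features.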
This paper studies techniques for reducing item exposure by utilizing automatic item generation methods. The known test item calibration method estimates item parameters from statistical data collected during examinees' prior testing. A disadvantage of this calibration method is item exposure: test items become familiar to the examinees. To reduce item exposure, an automatic item generation method is used, in which item models are constructed from already calibrated test items without losing the estimated item parameters. A technique for extracting item models from already calibrated, and therefore exposed, test items is described, which test item development specialists can use to integrate automatic item generation principles with existing testing applications.
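The notion of an item model, a calibrated stem with interchangeable surface variables, can be sketched as follows. The arithmetic item, its variable ranges, and the carry-over of parameters are hypothetical illustrations, not the paper's extraction procedure.

```python
import itertools

# Hypothetical item model extracted from a calibrated arithmetic item.
# The stem keeps the calibrated structure; only surface variables vary,
# on the assumption that the estimated item parameters (difficulty,
# discrimination) carry over to the generated sibling items.
ITEM_MODEL = {
    "stem": "A train travels {d} km in {t} hours. What is its average speed?",
    "variables": {"d": [120, 180, 240], "t": [2, 3, 4]},
    "key": lambda d, t: d / t,
}

def generate_items(model):
    """Instantiate every variable combination of the item model."""
    names = list(model["variables"])
    items = []
    for values in itertools.product(*model["variables"].values()):
        bound = dict(zip(names, values))
        items.append({
            "stem": model["stem"].format(**bound),
            "answer": model["key"](**bound),
        })
    return items

items = generate_items(ITEM_MODEL)
```

Generating sibling items on demand means no single rendered item needs heavy reuse, which is the exposure-reduction idea the paper pursues.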
The robust stability of the classic Smith predictor-based control system for uncertain fractional-order plants with interval time delays and interval coefficients is the focus of this work. Interval uncertainties are a type of parametric uncertainty that cannot be avoided when modeling real-world plants. In the considered Smith predictor control structure, the controller is assumed to be a fractional-order proportional integral derivative (FOPID) controller. To the best of the authors' knowledge, no method has been developed until now to analyze the robust stability of a Smith predictor-based fractional-order control system in the presence of simultaneous uncertainties in gain, time constants, and time delay. The three primary contributions of this study are as follows: i) a set of necessary and sufficient conditions is constructed using a graphical method to examine the robust stability of a Smith predictor-based fractional-order control system; the proposed method explicitly determines whether or not the FOPID controller can robustly stabilize it; ii) an auxiliary function is presented as a robust stability testing function to reduce the computational complexity of the robust stability analysis; and iii) two auxiliary functions are proposed to achieve the control requirements on disturbance rejection and noise reduction. Finally, four numerical examples and an experimental verification are presented to demonstrate the efficacy and significance of the suggested technique.
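For reference, the FOPID controller assumed in the Smith predictor structure has the standard transfer function with fractional integration and differentiation orders λ and μ:

```latex
C(s) = K_p + \frac{K_i}{s^{\lambda}} + K_d\, s^{\mu}, \qquad \lambda,\mu \in (0,2),
```

which reduces to the classical integer-order PID controller when λ = μ = 1. The interval plant uncertainty then enters through the gain, time constants, and delay of the plant model that the predictor compensates.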
The rapid growth of machine learning (ML) across fields has intensified the challenge of selecting the right algorithm for specific tasks, known as the Algorithm Selection Problem (ASP). Traditional trial-and-error methods have become impractical due to their resource demands. Automated Machine Learning (AutoML) systems automate this process but often neglect the group structures and sparsity in meta-features, leading to inefficiencies in algorithm recommendations for classification tasks. This paper proposes a meta-learning approach using Multivariate Sparse Group Lasso (MSGL) to address these limitations. Our method models both within-group and across-group sparsity among meta-features to manage high-dimensional data and reduce multicollinearity across eight meta-feature groups. The Fast Iterative Shrinkage-Thresholding Algorithm (FISTA) with adaptive restart efficiently solves the non-smooth optimization problem. Empirical validation on 145 classification datasets with 17 classification algorithms shows that our meta-learning method outperforms four state-of-the-art approaches, achieving 77.18% classification accuracy, 86.07% recommendation accuracy, and 88.83% normalized discounted cumulative gain.
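The FISTA core named above is an accelerated proximal gradient iteration. The sketch below solves a plain lasso problem; the paper's version adds the multivariate sparse group penalty and adaptive restart, both omitted here, and the data are synthetic.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista_lasso(A, b, lam, n_iter=500):
    """Plain FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)  # prox step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2       # momentum update
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Synthetic sparse recovery: only coordinates 2 and 7 are active.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[[2, 7]] = [3.0, -2.0]
b = A @ x_true
x_hat = fista_lasso(A, b, lam=0.1)
```

The small l1 penalty recovers the two active coefficients (with slight shrinkage) and drives the remaining entries toward zero, which is the sparsity behavior MSGL exploits group-wise over meta-features.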
Underwater Wireless Sensor Networks (UWSNs) are gaining popularity because of their potential uses in oceanography, seismic activity monitoring, environmental preservation, and underwater mapping. Yet these networks face challenges such as self-interference, long propagation delays, limited bandwidth, and changing network topologies, which are addressed by designing advanced routing protocols. In this work, we present the Underwater Fuzzy Routing Protocol for Low-power and Lossy networks (UWF-RPL), an enhanced fuzzy-based protocol that improves decision-making during path selection and traffic distribution over different network nodes. Our method extends RPL with fuzzy logic to optimize depth, energy, the Received Signal Strength Indicator (RSSI) to Expected Transmission Count (ETX) ratio, and latency. The proposed protocol outperforms other techniques in that it offers more energy efficiency, better packet delivery, lower delay, and no queue overflow. It also exhibits better scalability and reliability in dynamic underwater networks, which is of very high importance in maintaining efficient network operations and an optimized UWSN lifetime. Compared to other recent methods, it improves network convergence time (10%–23%), energy efficiency (15%), packet delivery (17%), and delay (24%).
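The fuzzy path-selection idea can be miniaturized as a fuzzy AND (min) over per-metric "good" memberships. The membership shape, the assumption that all metrics are pre-normalized to [0, 1] with 1 favorable, and the neighbor values below are illustrative, not the protocol's tuned parameters.

```python
def mu_good(x, lo=0.2, hi=0.8):
    """Linear 'good' membership: 0 below lo, 1 above hi, ramp between."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def route_score(depth, energy, rssi_etx, latency):
    """Fuzzy AND (min) of per-metric memberships: a route is only as
    good as its weakest metric."""
    return min(mu_good(depth), mu_good(energy),
               mu_good(rssi_etx), mu_good(latency))

# Two hypothetical candidate parents with normalized metrics.
neighbors = {
    "n1": dict(depth=0.9, energy=0.7, rssi_etx=0.8, latency=0.6),
    "n2": dict(depth=0.5, energy=0.9, rssi_etx=0.4, latency=0.9),
}
parent = max(neighbors, key=lambda n: route_score(**neighbors[n]))
```

Using min rather than a weighted sum penalizes a candidate with one very poor metric (e.g., a weak link despite full battery), which matches the conservative parent selection a lossy underwater channel demands.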
The use of electronic communication has increased significantly over the last few decades. Email is one of the most well-known means of electronic communication. Traditional email applications are widely used by a large population; however, illiterate and semi-illiterate people face challenges in using them. A large portion of Pakistan's population is illiterate and has little or no practice with computer usage. In this paper, we investigate the challenges illiterate and semi-illiterate people face in using email applications. In addition, we propose a solution by developing an application tailored to their needs. Research shows that illiterate people are good at learning designs that convey information with pictures instead of text only, and focus better on one object or action at a time. Our proposed solution is based on user interfaces that consist of icons and vocal/audio instructions instead of text. Further, we use background voice/audio, which is more helpful than flooding a picture with a lot of information. We tested our application with a large number of users at various skill levels, from no computer knowledge to expert. Our usability test results indicate that the application can be used by illiterate people without any training or third-party help.
We present simulation results on the evolution of the orbital motion of short-period comets with revolution periods not exceeding 6-7 years, namely comets 21P/Giacobini-Zinner, 26P/Grigg-Skjellerup and 7P/Pons-Winnecke. The calculations cover the range from the date of each object's discovery to 2100. Variations in the objects' orbital elements under the action of gravity disturbances, taking Earth's gravitational potential into account when the small body approaches, are analyzed. Corrected dates of perihelion passages can be used for scheduling observations.
One of the greatest factors that affect the economic condition of a country is its institutions. In the model of good governance, the primary elements of stronger institutions include efficiency, transparency, and accountability, and technology plays a major role in improving these elements. However, there are myriad challenges in the practical integration of technology into these institutions. The task is more challenging in a developing country, and harder still in one that is already economically weak. It is also important to mention that the challenge of digitization in the public sector is not limited to developing countries: even today, developed countries find it difficult to digitally transform their public institutions for improved policymaking and responsive service delivery. Many factors contribute to the failure of such digitization initiatives, more so in developing countries. The purpose of this paper is to identify those factors, to measure the significance of each, and to understand how to overcome them. This research considered the case study of Pakistan; however, the results are very likely to match the conditions of other developing regions around the world. Through questionnaires and interviews, valuable feedback was gathered from up to 25 senior government officers closely associated with digitization initiatives in the public sector. The responses were largely unanimous. The results indicate the most significant factors that affect government digitization in this developing region, including some that were not expected.
The advent of the COVID-19 pandemic has adversely affected the entire world and has put forth high demand for techniques that remotely manage crowd-related tasks. Video surveillance and crowd management using video analysis techniques have significantly impacted today's research, and numerous applications have been developed in this domain. This research proposes an anomaly detection technique applied to Umrah videos in the Kaaba during the COVID-19 pandemic through sparse crowd analysis. Managing the Kaaba rituals is crucial since the crowd gathers from around the world and requires proper analysis during these days of the pandemic. The Umrah videos are analyzed, and a system is devised that can track and monitor the crowd flow in the Kaaba. The crowd in these videos is sparse due to the pandemic, and we have developed a technique to track the majority crowd flow and detect any object (person) moving in a direction unlike the major flow. We detect abnormal movement by creating histograms for the vertical and horizontal flows and applying thresholds to identify the non-majority flow. Our algorithm aims to analyze the crowd through video surveillance and timely detect any abnormal activity to maintain a smooth crowd flow in the Kaaba during the pandemic.
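The majority-flow step can be sketched directly: histogram the horizontal and vertical flow components and flag vectors that fall into sparsely populated bins. The bin count and rarity threshold below are illustrative choices, not the paper's tuned values.

```python
import numpy as np

def flow_outliers(vx, vy, frac=0.05):
    """Flag flow vectors whose horizontal or vertical component falls in
    a histogram bin holding less than `frac` of all vectors."""
    vx, vy = np.asarray(vx, float), np.asarray(vy, float)
    flags = np.zeros(vx.size, dtype=bool)
    for comp in (vx, vy):
        hist, edges = np.histogram(comp, bins=8)
        # Map each value to its bin (clip the rightmost edge case).
        bin_idx = np.clip(np.digitize(comp, edges) - 1, 0, 7)
        rare = hist < frac * comp.size
        flags |= rare[bin_idx]
    return flags

# Majority moving right (+x); one person moving the opposite way.
vx = [1.0] * 40 + [-1.0]
vy = [0.0] * 41
flags = flow_outliers(vx, vy)
```

In the full system the components would come from per-person optical flow or tracking, and flagged vectors would trigger the abnormal-movement alert.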
In this paper, the authors present the development of a data modelling tool that visualizes the transformation of an Entity-Relationship Diagram (ERD) into a relational database schema. The authors' focus is the design of a tool for educational purposes and its implementation in an e-learning database course. The tool presents two stages of database design. The first stage is to draw the ERD graphically and validate it; the drawing is done by the learner. In the second stage, the system automatically transforms the ERD into a relational database schema using common rules, so the learner can more easily understand how to apply the theoretical material. A detailed description of the system's functionality and an algorithm for the conversion are proposed. Finally, the user interface and usage aspects are presented.
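Two of the common mapping rules such a tool animates can be sketched in a few lines. The mini-ERD below (Student and Course entities with an M:N "Enrolls" relationship) is a hypothetical example, not taken from the paper.

```python
# Hypothetical mini-ERD: entities map to attribute lists (first attribute
# is the primary key); relationships carry their cardinality.
entities = {
    "Student": ["student_id", "name"],
    "Course": ["course_id", "title"],
}
relationships = [("Student", "Course", "Enrolls", "M:N")]

def erd_to_schema(entities, relationships):
    """Apply two common mapping rules: each entity becomes a table, and
    each M:N relationship becomes a junction table whose composite key
    combines both participants' primary keys."""
    schema = {name: list(attrs) for name, attrs in entities.items()}
    for left, right, name, cardinality in relationships:
        if cardinality == "M:N":
            schema[name] = [entities[left][0], entities[right][0]]
    return schema

schema = erd_to_schema(entities, relationships)
```

A full converter would also handle 1:N relationships (foreign key on the N side), weak entities, and multivalued attributes; showing each rule firing step by step is exactly the tool's pedagogical point.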
This paper provides an overview of the main recommendations and approaches of a methodology for developing parallel computation applications for hybrid structures. The methodology was developed within the master's thesis project "Optimization of complex tasks' computation on hybrid distributed computational structures" accomplished by Orekhov, whose main research objective was the determination of patterns in the behavior of scaling efficiency and other parameters that define the performance of different algorithm implementations executed on hybrid distributed computational structures. The major outcomes and dependencies obtained within the project were formed into a methodology that covers the problems of applications based on parallel computations and describes their development process in detail, offering ways of avoiding potentially crucial problems. The paper is backed by real-life examples, such as clustering algorithms, instead of artificial benchmarks.
Virtualization technology plays a key role in cloud computing. Thus, the security issues of virtualization tools (hypervisors, emulators, etc.) should be given careful consideration. However, the threat of insider attacks is underestimated, and virtualization tools and hypervisors have been poorly protected from this type of attack. Furthermore, the hypervisor is one of the most critical elements in cloud computing infrastructure. First, a hypervisor vulnerability analysis is provided. Second, a formal model of an insider attack on a hypervisor is developed. Then, on the basis of the formal attack model, we propose a new methodology for evaluating hypervisor stability. Finally, certain security countermeasures that should be integrated into the hypervisor software architecture are considered.
Future 6G communications are envisioned to enable a large catalogue of pioneering applications. These will range from networked Cyber-Physical Systems to edge computing devices, establishing real-time feedback control loops critical for managing Industry 5.0 deployments, digital agriculture systems, and essential infrastructures. The provision of extensive machine-type communications through 6G will render many of these innovative systems autonomous and unsupervised. While full automation will enhance industrial efficiency significantly, it concurrently introduces new cyber risks and vulnerabilities. In particular, unattended systems are highly susceptible to trust issues: malicious nodes and false information can be easily introduced into control loops. Additionally, Denial-of-Service attacks can be executed by inundating the network with valueless noise. Current anomaly detection schemes require the entire transformation of the control software to integrate new steps and can only mitigate anomalies that conform to predefined mathematical models. Solutions based on exhaustive data collection detect anomalies precisely but are extremely slow. Standard models, with their limited understanding of mobile networks, can achieve precision rates no higher than 75%. Therefore, more general and transversal protection mechanisms are needed to detect malicious behaviors transparently. This paper introduces a probabilistic trust model and control algorithm designed to address this gap. The model determines the probability of any node being trustworthy, and communication channels are pruned for nodes whose probability falls below a given threshold. The trust control algorithm comprises three primary phases, which feed the model with three different probabilities that are weighted and combined. First, anomalous nodes are identified using Gaussian mixture models and clustering technologies. Next, traffic patterns are studied using digital Bessel functions and the functional scalar product. Finally, the information coherence and content are analyzed; noise content and abnormal information sequences are detected using a Volterra filter and a bank of Finite Impulse Response filters. An experimental validation based on simulation tools and environments was carried out. Results show the proposed solution can successfully detect up to 92% of malicious data injection attacks.
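The pruning rule, keeping a channel open only while a node's trust probability stays above a threshold, can be sketched with a simple Beta-Bernoulli stand-in for the paper's three weighted probabilities (which come from GMM clustering, traffic analysis, and coherence checks). The observation counts below are illustrative.

```python
def trust_probability(successes, failures, a=1.0, b=1.0):
    """Posterior mean of a Beta(a, b) trust prior after observing
    coherent (success) and anomalous (failure) reports from a node."""
    return (a + successes) / (a + b + successes + failures)

def prune(nodes, threshold=0.5):
    """Keep only nodes whose trust probability meets the threshold;
    channels to the rest would be dropped."""
    scored = ((name, trust_probability(s, f)) for name, (s, f) in nodes.items())
    return {name: p for name, p in scored if p >= threshold}

# Hypothetical (coherent, anomalous) report counts per node.
nodes = {"sensor-1": (18, 2), "sensor-2": (3, 12), "gateway": (40, 1)}
trusted = prune(nodes)
```

The uniform Beta(1, 1) prior means an unobserved node starts at probability 0.5, exactly on the default threshold; a stricter deployment could raise the threshold or the prior's pessimism.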
Funding: Supported by the European Union's Horizon Europe research and innovation programme (101120657), project ENFIELD (European Lighthouse to Manifest Trustworthy and Green AI); the Estonian Research Council (PRG658, PRG1463); and the Estonian Centre of Excellence in Energy Efficiency, ENER (TK230), funded by the Estonian Ministry of Education and Research.
Funding: Funded by A'Sharqiyah University, Sultanate of Oman, under Research Project grant number BFP/RGP/ICT/22/490.
文摘Detecting faces under occlusion remains a significant challenge in computer vision due to variations caused by masks,sunglasses,and other obstructions.Addressing this issue is crucial for applications such as surveillance,biometric authentication,and human-computer interaction.This paper provides a comprehensive review of face detection techniques developed to handle occluded faces.Studies are categorized into four main approaches:feature-based,machine learning-based,deep learning-based,and hybrid methods.We analyzed state-of-the-art studies within each category,examining their methodologies,strengths,and limitations based on widely used benchmark datasets,highlighting their adaptability to partial and severe occlusions.The review also identifies key challenges,including dataset diversity,model generalization,and computational efficiency.Our findings reveal that deep learning methods dominate recent studies,benefiting from their ability to extract hierarchical features and handle complex occlusion patterns.More recently,researchers have increasingly explored Transformer-based architectures,such as Vision Transformer(ViT)and Swin Transformer,to further improve detection robustness under challenging occlusion scenarios.In addition,hybrid approaches,which aim to combine traditional andmodern techniques,are emerging as a promising direction for improving robustness.This review provides valuable insights for researchers aiming to develop more robust face detection systems and for practitioners seeking to deploy reliable solutions in real-world,occlusionprone environments.Further improvements and the proposal of broader datasets are required to developmore scalable,robust,and efficient models that can handle complex occlusions in real-world scenarios.
Abstract: This research addresses the challenges of fault detection and isolation (FDI) in digital grids, focusing on improving the reliability and stability of power systems. Traditional fault detection techniques, such as rule-based fuzzy systems and conventional FDI methods, often struggle with the dynamic nature of modern grids, resulting in delays and inaccuracies in fault classification. To overcome these limitations, this study introduces a Hybrid Neuro-Fuzzy Fault Detection Model that combines the adaptive learning capabilities of neural networks with the reasoning strength of fuzzy logic. The model's performance was evaluated through extensive simulations on the IEEE 33-bus test system, considering various fault scenarios, including line-to-ground faults (LGF), three-phase short circuits (3PSC), and harmonic distortions (HD). The quantitative results show that the model achieves 97.2% accuracy, a false negative rate (FNR) of 1.9%, and a false positive rate (FPR) of 2.3%, demonstrating its high precision in fault diagnosis. The qualitative analysis further highlights the model's adaptability and its potential for seamless integration into smart grids, microgrids, and renewable energy systems. By dynamically refining fuzzy inference rules, the model enhances fault detection efficiency without compromising computational feasibility. These findings contribute to the development of more resilient and adaptive fault management systems, paving the way for advanced smart grid technologies.
Funding: Funded by A'Sharqiyah University, Sultanate of Oman, under Research Project Grant Number (BFP/RGP/ICT/22/490).
Abstract: Face detection is a critical component in modern security, surveillance, and human-computer interaction systems, with widespread applications in smartphones, biometric access control, and public monitoring. However, detecting faces with high levels of occlusion, such as those covered by masks, veils, or scarves, remains a significant challenge, as traditional models often fail to generalize under such conditions. This paper presents a hybrid approach that combines traditional handcrafted feature extraction, namely the Histogram of Oriented Gradients (HOG) and Canny edge detection, with modern deep learning models. The goal is to improve face detection accuracy under occlusions. The proposed method leverages the structural strengths of HOG and edge-based object proposals while exploiting the feature extraction capabilities of Convolutional Neural Networks (CNNs). The effectiveness of the proposed model is assessed using a custom dataset containing 10,000 heavily occluded face images and a subset of the Common Objects in Context (COCO) dataset for non-face samples. The COCO dataset was selected for its variety and realism in background contexts. Experimental evaluations demonstrate significant performance improvements compared to baseline CNN models. Results indicate that DenseNet121 combined with HOG outperforms its counterparts in classification metrics, with an F1-score of 87.96% and a precision of 88.02%. Enhanced performance is achieved through reduced false positives and improved localization accuracy with the integration of object proposals based on Canny and contour detection. While the proposed method increases inference time from 33.52 to 97.80 ms, it achieves a notable improvement in precision, from 80.85% to 88.02%, when comparing the baseline DenseNet121 model to its hybrid counterpart. Limitations of the method include higher computational cost and the need for careful tuning of parameters across the edge detection, handcrafted feature, and CNN components. These findings highlight the potential of combining handcrafted and learned features for occluded face detection tasks.
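The handcrafted-feature side of the hybrid pipeline can be illustrated with a toy HOG-style computation: central-difference gradients over a small grayscale patch, binned into an unsigned orientation histogram weighted by gradient magnitude. The real method uses full HOG descriptors and Canny-based proposals feeding a CNN; this sketch only shows the orientation-binning idea, and the patch is invented for illustration.

```python
import math

def toy_hog(patch, n_bins=9):
    """Orientation histogram weighted by gradient magnitude (HOG-style)."""
    h, w = len(patch), len(patch[0])
    hist = [0.0] * n_bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = patch[y][x + 1] - patch[y][x - 1]   # horizontal gradient
            gy = patch[y + 1][x] - patch[y - 1][x]   # vertical gradient
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0  # unsigned angle
            hist[min(int(ang / (180.0 / n_bins)), n_bins - 1)] += mag
    return hist

# A patch with a vertical edge: gradients point horizontally, so the
# magnitude mass lands in the first orientation bin.
patch = [[0, 0, 255, 255]] * 4
hist = toy_hog(patch)
print(hist.index(max(hist)))  # 0
```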
Abstract: This paper studies techniques for reducing item exposure by utilizing automatic item generation methods. The known test item calibration method estimates item parameters from statistical data collected during examinees' prior testing. The disadvantage of this calibration method is item exposure: test items become familiar to the examinees. To reduce item exposure, an automatic item generation method is used, in which item models are constructed from already calibrated test items without losing the estimated item parameters. A technique for extracting item models from already calibrated, and therefore exposed, test items is described, which test item development specialists can use to integrate automatic item generation principles with existing testing applications.
Funding: Supported by the Estonian Research Council (PRG658).
Abstract: The robust stability of the classic Smith predictor-based control system for uncertain fractional-order plants with interval time delays and interval coefficients is the emphasis of this work. Interval uncertainties are a type of parametric uncertainty that cannot be avoided when modeling real-world plants. In the considered Smith predictor control structure, it is assumed that the controller is a fractional-order proportional integral derivative (FOPID) controller. To the best of the authors' knowledge, no method has been developed until now to analyze the robust stability of a Smith predictor-based fractional-order control system in the presence of simultaneous uncertainties in gain, time constants, and time delay. The three primary contributions of this study are as follows: i) a set of necessary and sufficient conditions is constructed using a graphical method to examine the robust stability of a Smith predictor-based fractional-order control system; the proposed method explicitly determines whether or not the FOPID controller can robustly stabilize the system; ii) an auxiliary function is presented as a robust stability testing function to reduce the computational complexity of the robust stability analysis; and iii) two auxiliary functions are proposed to achieve the control requirements on disturbance rejection and noise reduction. Finally, four numerical examples and an experimental verification are presented to demonstrate the efficacy and significance of the suggested technique.
Abstract: The rapid growth of machine learning (ML) across fields has intensified the challenge of selecting the right algorithm for specific tasks, known as the Algorithm Selection Problem (ASP). Traditional trial-and-error methods have become impractical due to their resource demands. Automated Machine Learning (AutoML) systems automate this process, but often neglect the group structures and sparsity in meta-features, leading to inefficiencies in algorithm recommendations for classification tasks. This paper proposes a meta-learning approach using Multivariate Sparse Group Lasso (MSGL) to address these limitations. Our method models both within-group and across-group sparsity among meta-features to manage high-dimensional data and reduce multicollinearity across eight meta-feature groups. The Fast Iterative Shrinkage-Thresholding Algorithm (FISTA) with adaptive restart efficiently solves the non-smooth optimization problem. Empirical validation on 145 classification datasets with 17 classification algorithms shows that our meta-learning method outperforms four state-of-the-art approaches, achieving 77.18% classification accuracy, 86.07% recommendation accuracy, and 88.83% normalized discounted cumulative gain.
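The FISTA iteration at the core of this kind of solver can be shown on a minimal 1-D lasso-type problem: a gradient step on the smooth part followed by soft-thresholding, with Nesterov momentum. The actual MSGL solver handles matrix-valued, group-structured penalties; the objective and constants below are illustrative only.

```python
def soft_threshold(v, t):
    """Proximal operator of the l1 norm."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def fista_lasso(a, b, lam, steps=200):
    """Minimize 0.5*a*x^2 - b*x + lam*|x| via FISTA."""
    L = a                      # Lipschitz constant of the smooth gradient
    x, y, t = 0.0, 0.0, 1.0
    for _ in range(steps):
        grad = a * y - b
        x_new = soft_threshold(y - grad / L, lam / L)  # prox-gradient step
        t_new = (1.0 + (1.0 + 4.0 * t * t) ** 0.5) / 2.0
        y = x_new + (t - 1.0) / t_new * (x_new - x)    # momentum update
        x, t = x_new, t_new
    return x

# Closed-form solution here is soft_threshold(b, lam) / a = (3 - 1) / 2 = 1.0
print(round(fista_lasso(a=2.0, b=3.0, lam=1.0), 6))  # 1.0
```

The adaptive-restart variant mentioned in the abstract additionally resets the momentum term `t` whenever the objective (or a gradient-based surrogate) increases.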
Abstract: Underwater Wireless Sensor Networks (UWSNs) are gaining popularity because of their potential uses in oceanography, seismic activity monitoring, environmental preservation, and underwater mapping. Yet, these networks face challenges such as self-interference, long propagation delays, limited bandwidth, and changing network topologies. These challenges are addressed by designing advanced routing protocols. In this work, we present the Underwater Fuzzy Routing Protocol for Low-power and Lossy networks (UWF-RPL), an enhanced fuzzy-based protocol that improves decision-making during path selection and traffic distribution over different network nodes. Our method extends RPL with the aid of fuzzy logic to optimize depth, energy, the Received Signal Strength Indicator (RSSI) to Expected Transmission Count (ETX) ratio, and latency. The proposed protocol outperforms other techniques in that it offers more energy efficiency, better packet delivery, low delay, and no queue overflow. It also exhibits better scalability and reliability in dynamic underwater networks, which is of very high importance in maintaining efficient network operations and an optimized UWSN lifetime. Compared to other recent methods, it offers improved network convergence time (10%-23%), energy efficiency (15%), packet delivery (17%), and delay (24%).
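The fuzzy path-ranking idea behind such a protocol can be sketched as scoring each candidate parent node from its normalized depth, residual energy, RSSI/ETX ratio, and latency, then picking the best-scoring one. The membership function, weights, and node values below are illustrative assumptions; the actual UWF-RPL rule base is defined in the paper.

```python
def membership_good(value):
    """Degree to which a normalized metric (0..1, higher = better) is 'good'."""
    return max(0.0, min(1.0, value))

def path_score(depth, energy, rssi_etx, latency,
               weights=(0.25, 0.3, 0.25, 0.2)):
    """Weighted aggregation of fuzzy membership degrees for one candidate."""
    # Lower depth and latency are better, so invert them first.
    degrees = (
        membership_good(1.0 - depth),
        membership_good(energy),
        membership_good(rssi_etx),
        membership_good(1.0 - latency),
    )
    return sum(w * d for w, d in zip(weights, degrees))

# Two hypothetical candidate parents with normalized metrics:
candidates = {
    "nodeA": path_score(depth=0.2, energy=0.9, rssi_etx=0.8, latency=0.3),
    "nodeB": path_score(depth=0.6, energy=0.4, rssi_etx=0.5, latency=0.7),
}
best = max(candidates, key=candidates.get)
print(best)  # "nodeA"
```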
Funding: This work is supported by the Security Testing Lab established at the University of Engineering & Technology Peshawar under the funded project National Center for Cyber Security of the Higher Education Commission (HEC), Pakistan.
Abstract: The use of electronic communication has increased significantly over the last few decades. Email is one of the most well-known means of electronic communication. Traditional email applications are widely used by a large population; however, illiterate and semi-illiterate people face challenges in using them. A major portion of Pakistan's population is illiterate and has little or no practice with computer usage. In this paper, we investigate the challenges faced by illiterate and semi-illiterate people in using email applications. In addition, we propose a solution by developing an application tailored to their needs. Research shows that illiterate people are good at learning designs that convey information with pictures instead of text only, and focus more on one object/action at a time. Our proposed solution is based on designing user interfaces that consist of icons and vocal/audio instructions instead of text. Further, we use background voice/audio, which is more helpful than flooding a picture with a lot of information. We tested our application with a large number of users of various skill levels (from no computer knowledge to expert). The results of our usability tests indicate that the application can be used by illiterate people without any training or third-party help.
Abstract: We present simulation results on the evolution of the orbital motion of short-period comets with revolution periods not exceeding 6-7 years, namely comets 21P/Giacobini-Zinner, 26P/Grigg-Skjellerup, and 7P/Pons-Winnecke. The calculations cover the range from the date of each object's discovery to 2100. Variations in the objects' orbital elements under the action of gravitational disturbances, taking Earth's gravitational potential into account when the small body approaches, are analyzed. Corrected dates of perihelion passages can be used for scheduling observations.
Abstract: One of the greatest factors that affects the economic condition of a country is its institutions. In the model of good governance, the primary elements of stronger institutions include efficiency, transparency, and accountability, and technology plays a major role in improving these elements. However, there are myriad challenges when it comes to the practical integration of technology in these institutions for efficiency. It is more challenging when a country is developing and already economically weak. It is also important to mention that the challenges of digitization in the public sector are not limited to developing countries. It is equally challenging, even today, for already developed countries to digitally transform their public institutions for improved policymaking and responsive service delivery. Many factors contribute to the failure of such digitization initiatives, more so within developing countries. The purpose of this paper is to identify those factors, to measure the significance of each, and to realize and overcome them. This research considered the case study of Pakistan; however, the results are very likely to match the conditions of other developing regions around the world. Through questionnaires and interviews, valuable feedback was gathered from up to 25 senior government officers who are closely associated with digitization initiatives in the public sector. The responses to the questions were largely unanimous. The results indicate the most significant factors that affect government digitization in this developing region, including some factors that were not expected.
Funding: The authors extend their appreciation to the Deputyship for Research and Innovation, Ministry of Education in Saudi Arabia for funding this research work through Project Number QURDO001, project title: Intelligent Real-Time Crowd Monitoring System Using Unmanned Aerial Vehicle (UAV) Video and Global Positioning System (GPS) Data.
Abstract: The advent of the COVID-19 pandemic has adversely affected the entire world and has created high demand for techniques that remotely manage crowd-related tasks. Video surveillance and crowd management using video analysis techniques have significantly impacted today's research, and numerous applications have been developed in this domain. This research proposes an anomaly detection technique applied to Umrah videos in the Kaaba during the COVID-19 pandemic through sparse crowd analysis. Managing the Kaaba rituals is crucial since the crowd gathers from around the world and requires proper analysis during these days of the pandemic. The Umrah videos are analyzed, and a system is devised that can track and monitor the crowd flow in the Kaaba. The crowd in these videos is sparse due to the pandemic, and we have developed a technique to track the majority crowd flow and detect any object (person) moving in a direction unlike that of the major flow. We detect abnormal movement by creating histograms of the vertical and horizontal flows and applying thresholds to identify the non-majority flow. Our algorithm aims to analyze the crowd through video surveillance and detect any abnormal activity in time to maintain a smooth crowd flow in the Kaaba during the pandemic.
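The majority-flow test described in this abstract can be sketched as binning motion vectors into coarse direction histograms and flagging any vector whose bin holds less than a threshold fraction of the total mass. The vectors, the 4-way binning, and the threshold below are illustrative simplifications of the paper's horizontal/vertical flow histograms.

```python
from collections import Counter

def flag_against_flow(vectors, threshold=0.25):
    """Return indices of vectors whose direction bin is non-majority."""
    def direction(dx, dy):
        # Coarse 4-way binning by dominant axis and sign.
        if abs(dx) >= abs(dy):
            return "right" if dx >= 0 else "left"
        return "down" if dy >= 0 else "up"

    bins = [direction(dx, dy) for dx, dy in vectors]
    hist = Counter(bins)
    total = len(vectors)
    # A vector is anomalous if its bin's share of the flow is small.
    return [i for i, b in enumerate(bins) if hist[b] / total < threshold]

# Most tracked people move right; one moves up, against the flow.
flows = [(1.0, 0.1), (0.9, -0.2), (1.1, 0.0), (0.8, 0.1), (0.05, -1.0)]
print(flag_against_flow(flows))  # [4]
```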
Abstract: In this paper, the authors present the development of a data modelling tool that visualizes the transformation of an Entity-Relationship Diagram (ERD) into a relational database schema. The authors' focus is the design of a tool for educational purposes and its implementation in an e-learning database course. The tool presents two stages of database design. The first stage is to draw the ERD graphically and validate it; the drawing is done by the learner. In the second stage, the system automatically transforms the ERD into a relational database schema using common rules. Thus, the learner can more easily understand how to apply the theoretical material. A detailed description of the system's functionalities and the algorithm for the conversion are provided. Finally, the user interface and usage aspects are presented.
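Two of the common ERD-to-relational rules such a tool automates can be sketched directly: an entity maps to a table keyed by its identifier, and a many-to-many relationship maps to a junction table holding foreign keys to both sides. The entity names, attributes, and the blanket `TEXT` column type are invented for illustration.

```python
def entity_to_table(name, attrs, pk):
    """Rule: an entity becomes a table with its key attribute as primary key."""
    cols = ", ".join(f"{a} TEXT" for a in attrs)
    return f"CREATE TABLE {name} ({cols}, PRIMARY KEY ({pk}));"

def m2m_to_table(rel, left, left_pk, right, right_pk):
    """Rule: an M:N relationship becomes a junction table of both keys."""
    return (f"CREATE TABLE {rel} ({left_pk} TEXT, {right_pk} TEXT, "
            f"PRIMARY KEY ({left_pk}, {right_pk}), "
            f"FOREIGN KEY ({left_pk}) REFERENCES {left}({left_pk}), "
            f"FOREIGN KEY ({right_pk}) REFERENCES {right}({right_pk}));")

student_sql = entity_to_table("Student", ["student_id", "name"], "student_id")
enrolls_sql = m2m_to_table("Enrolls", "Student", "student_id",
                           "Course", "course_id")
print(student_sql)
print(enrolls_sql)
```

One-to-many relationships (not shown) are handled by the simpler rule of placing the "one" side's key as a foreign key column on the "many" side's table.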
Abstract: This paper provides an overview of the main recommendations and approaches of a methodology for developing parallel computation applications for hybrid structures. This methodology was developed within the master's thesis project "Optimization of complex tasks' computation on hybrid distributed computational structures" accomplished by Orekhov, during which the main research objective was the determination of patterns in the behavior of scaling efficiency and other parameters that define the performance of different algorithm implementations executed on hybrid distributed computational structures. The major outcomes and dependencies obtained within the project were formed into a methodology that covers the problems of applications based on parallel computations and describes the development process in detail, offering easy ways of avoiding potentially crucial problems. The paper is backed by real-life examples such as clustering algorithms instead of artificial benchmarks.
Abstract: Virtualization technology plays a key role in cloud computing. Thus, the security issues of virtualization tools (hypervisors, emulators, etc.) should be under precise consideration. However, the threat of insider attacks is underestimated, and virtualization tools and hypervisors have been poorly protected from this type of attack. Furthermore, the hypervisor is one of the most critical elements in cloud computing infrastructure. Firstly, a hypervisor vulnerability analysis is provided. Secondly, a formal model of an insider attack on a hypervisor is developed. Consequently, on the basis of the formal attack model, we propose a new methodology for hypervisor stability evaluation. In this paper, certain security countermeasures are considered that should be integrated into the hypervisor software architecture.
Funding: Funded by Comunidad de Madrid within the framework of the Multiannual Agreement with Universidad Politécnica de Madrid to encourage research by young doctors (PRINCE project).
Abstract: Future 6G communications are envisioned to enable a large catalogue of pioneering applications. These will range from networked Cyber-Physical Systems to edge computing devices, establishing real-time feedback control loops critical for managing Industry 5.0 deployments, digital agriculture systems, and essential infrastructures. The provision of extensive machine-type communications through 6G will render many of these innovative systems autonomous and unsupervised. While full automation will enhance industrial efficiency significantly, it concurrently introduces new cyber risks and vulnerabilities. In particular, unattended systems are highly susceptible to trust issues: malicious nodes and false information can be easily introduced into control loops. Additionally, Denial-of-Service attacks can be executed by inundating the network with valueless noise. Current anomaly detection schemes require the entire transformation of the control software to integrate new steps and can only mitigate anomalies that conform to predefined mathematical models. Solutions based on exhaustive data collection detect anomalies precisely but are extremely slow. Standard models, with their limited understanding of mobile networks, can achieve precision rates no higher than 75%. Therefore, more general and transversal protection mechanisms are needed to detect malicious behaviors transparently. This paper introduces a probabilistic trust model and control algorithm designed to address this gap. The model determines the probability of any node being trustworthy. Communication channels are pruned for those nodes whose probability falls below a given threshold. The trust control algorithm comprises three primary phases, which feed the model with three different probabilities that are weighted and combined. Initially, anomalous nodes are identified using Gaussian mixture models and clustering technologies. Next, traffic patterns are studied using digital Bessel functions and the functional scalar product. Finally, the information coherence and content are analyzed. Noise content and abnormal information sequences are detected using a Volterra filter and a bank of Finite Impulse Response filters. An experimental validation based on simulation tools and environments was carried out. Results show the proposed solution can successfully detect up to 92% of malicious data injection attacks.
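The weighting-and-pruning step of such a trust model can be sketched as a weighted mean of the three per-phase probabilities (clustering, traffic-pattern, and information-coherence phases in the abstract), followed by dropping any node below a trust threshold. The weights, threshold, and node probabilities below are illustrative assumptions, not values from the paper.

```python
def fuse_trust(p_cluster, p_traffic, p_coherence, weights=(0.4, 0.3, 0.3)):
    """Weighted combination of the three phase probabilities into one score."""
    probs = (p_cluster, p_traffic, p_coherence)
    return sum(w * p for w, p in zip(weights, probs)) / sum(weights)

def prune(nodes, threshold=0.5):
    """Keep only nodes whose fused trust meets the threshold."""
    return {n: p for n, p in nodes.items() if p >= threshold}

# Two hypothetical nodes with per-phase trust estimates:
nodes = {
    "sensor-1": fuse_trust(0.9, 0.8, 0.85),  # consistently trustworthy
    "sensor-2": fuse_trust(0.2, 0.4, 0.3),   # flagged by all three phases
}
trusted = prune(nodes)
print(sorted(trusted))  # ['sensor-1']
```

In the full scheme the three inputs would come from the GMM/clustering stage, the Bessel-based traffic analysis, and the Volterra/FIR coherence filters, respectively, with the channel to any pruned node closed.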