Funding: Supported by the Anhui Provincial Natural Science Foundation (Grant No. 2308085MA19), the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDA0410401), the National Natural Science Foundation of China (Grant No. 52202120), the National Key Research and Development Program of China (Grant No. 2023YFA1609800), and the USTC Research Funds of the Double First-Class Initiative (Grant No. YD2310002013).
Abstract: Small-angle X-ray scattering (SAXS) is an advanced technique for characterizing the particle size distribution (PSD) of nanoparticles. However, the ill-posed nature of the inverse problem in SAXS data analysis often limits the accuracy of conventional methods. This article presents GranuSAS, user-friendly software for PSD analysis that employs an algorithm integrating truncated singular value decomposition (TSVD) with the Chahine method. TSVD is used for data preprocessing, generating a set of candidate initial solutions with suppressed noise; a high-quality initial solution is then selected via the L-curve method. This candidate is iteratively refined by the Chahine algorithm, which enforces constraints such as non-negativity and improves physical interpretability. Most importantly, GranuSAS employs a parallel architecture that simultaneously yields inversion results from multiple shape models and, by evaluating how accurately each model reconstructs the scattering curve, suggests the most appropriate model for a given material system. To systematically validate the accuracy and efficiency of the software, verification was performed on both simulated and experimental datasets. The results demonstrate that the software delivers satisfactory accuracy and reliable computational efficiency, providing materials-science researchers with an easy-to-use, dependable tool that helps them fully exploit the potential of SAXS for nanoparticle characterization.
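As an editorial illustration of the two-stage scheme this abstract describes (a TSVD-regularized initial solution refined by a non-negative, multiplicative Chahine-style update), here is a minimal Python sketch. The spherical form-factor kernel, the truncation rank, and the weighted-ratio form of the update are assumptions made for illustration; this is not GranuSAS's actual implementation.

```python
import numpy as np

def tsvd_initial(K, y, rank):
    """Noise-suppressing initial estimate: truncated-SVD pseudo-inverse of K f = y."""
    U, s, Vt = np.linalg.svd(K, full_matrices=False)
    f0 = Vt[:rank].T @ ((U[:, :rank].T @ y) / s[:rank])
    return np.clip(f0, 0.0, None)  # enforce non-negativity before refinement

def chahine_refine(K, y, f0, n_iter=500, eps=1e-12):
    """Multiplicative Chahine-style correction; iterates remain non-negative."""
    f = np.clip(f0, eps, None)
    col_sum = np.clip(K.sum(axis=0), eps, None)
    for _ in range(n_iter):
        y_calc = np.clip(K @ f, eps, None)
        f *= (K.T @ (y / y_calc)) / col_sum  # weighted measured/calculated ratio
    return f

# Toy forward model: dilute spheres, I(q) = sum_j f_j * P(q, r_j), Gaussian PSD.
q, r = np.linspace(0.01, 0.5, 200), np.linspace(1.0, 50.0, 80)
qr = np.outer(q, r)
K = (3.0 * (np.sin(qr) - qr * np.cos(qr)) / qr**3) ** 2  # sphere form factor
f_true = np.exp(-0.5 * ((r - 20.0) / 4.0) ** 2)
y = np.clip(K @ f_true + 1e-4 * np.random.default_rng(0).normal(size=q.size), 1e-12, None)
f_hat = chahine_refine(K, y, tsvd_initial(K, y, rank=10))
```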
Abstract: Test case prioritization and ranking play a crucial role in software testing by improving fault detection efficiency and ensuring software reliability. While prioritization selects the most relevant test cases for optimal coverage, ranking further refines their execution order to detect critical faults earlier. This study investigates machine learning techniques to enhance both prioritization and ranking, contributing to more effective and efficient testing processes. We first employ advanced feature engineering alongside ensemble models, including Gradient Boosting, Support Vector Machine, Random Forest, and Naive Bayes classifiers, to optimize test case prioritization, achieving an accuracy score of 0.98847 and significantly improving the Average Percentage of Fault Detection (APFD). Subsequently, we introduce a deep Q-learning framework combined with a Genetic Algorithm (GA) to refine test case ranking within priority levels. This approach achieves a rank accuracy of 0.9172, demonstrating robust performance despite the increasing computational demands of specialized variation operators. Our findings highlight the effectiveness of stacked ensemble learning and reinforcement learning in optimizing test case prioritization and ranking. This integrated approach improves testing efficiency, reduces late-stage defects, and enhances overall software stability. The study provides valuable insights for AI-driven testing frameworks, paving the way for more intelligent and adaptive software quality assurance methodologies.
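For reference, the APFD metric this abstract optimizes has a standard closed form, APFD = 1 − (Σ TF_i)/(n·m) + 1/(2n), where TF_i is the 1-based position of the first test that exposes fault i, n is the number of tests, and m the number of faults. A small sketch (the data layout is an assumption for illustration):

```python
def apfd(order, faults_detected_by):
    """APFD for a test ordering.

    order: list of test ids in execution order (n tests).
    faults_detected_by: fault id -> set of test ids exposing it (m faults);
    assumes every fault is exposed by at least one test in the ordering.
    """
    n, m = len(order), len(faults_detected_by)
    position = {t: i + 1 for i, t in enumerate(order)}  # 1-based rank of each test
    tf_sum = sum(min(position[t] for t in tests) for tests in faults_detected_by.values())
    return 1.0 - tf_sum / (n * m) + 1.0 / (2 * n)

order = ["t3", "t1", "t2", "t4"]
faults = {"f1": {"t1", "t2"}, "f2": {"t3"}}
print(apfd(order, faults))  # 1 - (2 + 1)/(4 * 2) + 1/8 = 0.75
```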
Abstract: Quality engineers play a key role in software product development, covering stages such as requirements analysis, design, coding, testing, and delivery. Their responsibilities include formulating quality standards, writing test cases, conducting functional and performance tests, and optimizing the product based on feedback. In government procurement projects, quality evaluation focuses on process compliance, security, and functional compatibility. KPI evaluation trees are commonly used for quantitative assessment, and a dynamic indicator-adjustment mechanism needs to be established to cope with complex demands. In addition, risk-driven testing should be combined with agile development to set up quality gates ensuring that each iteration meets expectations. A multi-dimensional quality assurance and verification scoring mechanism can effectively enhance product reliability and reduce project risk.
Funding: Supported by the Guangdong Higher Education Association's "14th Five-Year Plan" 2024 Higher Education Research Project (24GYB03) and the Natural Science Foundation of Guangdong Province (2024A1515010255).
Abstract: With the rapid advancement of information technology, the quality assurance and evaluation of software engineering education have become pivotal concerns for higher education institutions. This paper presents a comparative study of software engineering education in China and Europe, exploring the theoretical frameworks and practical pathways employed in both regions. We first introduce and contrast the engineering education accreditation systems of the two regions, namely the Chinese engineering education accreditation framework and the European EUR-ACE (European Accreditation of Engineering Programmes) standards, highlighting their core principles and evaluation methodologies. Subsequently, we provide case studies of several universities in China and Europe, including Sun Yat-sen University, Tsinghua University, the Technical University of Munich, and Imperial College London. Finally, we offer recommendations to foster mutual learning and collaboration between Chinese and European institutions, with the aim of enhancing the overall quality of software engineering education globally. This work provides valuable insights for educational administrators, faculty members, and policymakers, contributing to the ongoing improvement and innovative development of software engineering education in both regions.
Abstract: Ensuring software quality in open-source environments requires adaptive mechanisms to enhance scalability, optimize service provisioning, and improve reliability. This study presents a dynamic correlation analysis technique that enhances software quality management in open-source environments by addressing dynamic scalability, adaptive service provisioning, and software reliability. The proposed methodology integrates a scalability metric, an optimized service-provisioning model, and a weighted entropy-based reliability assessment to systematically improve key performance parameters. Experimental evaluation on multiple open-source software (OSS) versions demonstrates significant improvements: scalability increased by 27.5%, service provisioning time was reduced by 18.3%, and software reliability improved by 22.1% compared with baseline methods. A comparative analysis with prior work further highlights the effectiveness of this approach in ensuring adaptability, efficiency, and resilience in dynamic software ecosystems. Future work will focus on real-time monitoring and AI-driven adaptive provisioning to further enhance software quality management.
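The "weighted entropy-based reliability assessment" named above is not specified further in the abstract; a common formulation in this literature is the entropy weight method, sketched here under that assumption (metric names and numbers are invented):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: rows = OSS versions, columns = reliability metrics.

    Metrics that vary more across versions carry more information and
    therefore receive larger weights.
    """
    P = np.clip(X / X.sum(axis=0, keepdims=True), 1e-12, None)   # column proportions
    e = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])        # per-metric entropy
    d = 1.0 - e                                                  # diversification degree
    return d / d.sum()

# Illustrative data: MTBF (h), defect-fix rate, uptime ratio for four releases.
X = np.array([[120.0, 0.82, 0.995],
              [150.0, 0.85, 0.997],
              [ 90.0, 0.78, 0.990],
              [160.0, 0.88, 0.998]])
w = entropy_weights(X)
reliability_score = (X / X.max(axis=0)) @ w   # weighted composite per version
```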
Abstract: The quality of a software product is a crucial factor in its success. It is therefore important to specify the right software quality requirements, which establish the basis for the desired quality of the final system/software product. Several known methodologies and processes support the specification of system/software functional requirements, starting from user needs and ending with system requirements that developers can implement through their development process. System/software quality requirements are interdependent with functional requirements, meaning the two are meant to be specified in parallel. The ISO/IEC 25000 [1] SQuaRE series of standards includes the standard ISO/IEC 25030—Software engineering—Software Quality Requirements and Evaluation—Quality requirements [2], whose main goal is to help specify software quality requirements. To date, this standard does not offer clear and concise steps that a software quality engineer could follow in order to specify them. This article presents modifications recommended for the ISO/IEC 25030 standard, including a new requirements-definition process that allows the system/software quality requirements to be specified while taking into account the published system and software quality model ISO/IEC 25010 [3] as well as all the stakeholders of the project.
Funding: Supported by the Common Chips and Basic Software Products program (2010ZX01045-001-004-3).
Abstract: [Objective] To establish a traceability mechanism for agricultural product safety and to promote the application of domestic basic software in the supervision of agricultural product quality and safety. [Method] Based on an analysis of the circulation characteristics of agricultural products such as fruits, vegetables, livestock, and poultry, business component libraries for agricultural product quality-safety management and traceability queries were designed. Using the run-time support environment provided by domestic basic software, a traceability management system for agricultural product quality and safety was constructed. [Result] The system provides government, enterprises, and consumers with an information-interaction and comprehensive management platform for agricultural product quality and safety built on domestic basic software. [Conclusion] Application demonstrations show that quality control and information traceability across the full circulation of agricultural products were achieved effectively and reliably, and the management level of agricultural product quality and safety was improved.
Abstract: Software security poses substantial risks to our society because software has become part of our daily life. Numerous techniques have been proposed to resolve or mitigate the impact of software security issues. Among them, software testing and analysis are two critical methods, and both benefit significantly from advances in deep learning. Building on the successful use of deep learning in software security, researchers have recently explored the potential of large language models (LLMs) in this area. In this paper, we systematically review the results on LLMs in software security, covering fuzzing, unit testing, program repair, bug reproduction, data-driven bug detection, and bug triage. We deconstruct these techniques into several stages and analyze how LLMs can be used in each stage. We also discuss future directions for using LLMs in software security, both for the existing uses of LLMs and for extensions from conventional deep learning research.
Abstract: When the expression "software quality" is used, we usually think of an excellent software product that fulfills our expectations, which are based on the intended use. A number of models have been proposed for evaluating software quality based on various characteristics. In this paper, the quality of a software product is defined in terms of the basic components that constitute any program or software, and a software quality prediction model based on these components is proposed. It is argued, with an example, that a software quality model that incorporates tacit knowledge will outperform other models in terms of quality.
Abstract: It is essential to study the quality of software products, as with any other product. Software products have special properties: they are intangible, so qualitative evaluation is complicated. Hence, a model for evaluating the quality of software products is of considerable interest to software managers and experts. In this paper, with a view to improving software product quality, the software production process is characterized using the maturity levels of the CMM standard framework. Because CMM does not itself provide a method for measuring and evaluating maturity levels, the processes of the CMM standard are mapped to COBIT control objectives and combined into a hybrid framework for software development. In this research, the processes were mapped using a focus group, in parallel with software production at the different CMM maturity levels mapped to the COBIT framework. To demonstrate the capabilities of the proposed framework, the hybrid evaluation model was applied in a software development organization as a case study. Based on the evaluation results, improvement measures and action plans are proposed and discussed to enhance the software production processes.
Funding: Funded by the Youth Fund of the National Natural Science Foundation of China (Grant No. 42261070).
Abstract: Spectrum-based fault localization (SBFL) generates a ranked list of suspicious elements from the program execution spectrum, but the excessive number of elements tied at the same rank results in low localization accuracy. Most researchers consider intra-class dependencies to improve localization accuracy; however, studies show that inter-class method-call faults account for more than 20% of faults, so such methods still have limitations. To address these problems, this paper proposes a two-phase software fault localization based on relational graph convolutional neural networks (Two-RGCNFL). In Phase 1, the method call dependence graph (MCDG) of the program is constructed, intra-class and inter-class dependencies in the MCDG are extracted with a relational graph convolutional neural network, and a classifier identifies the faulty methods. The GraphSMOTE algorithm is improved to alleviate the impact of class imbalance on classification accuracy. To address the tied ranking of suspiciousness values in traditional SBFL, in Phase 2, Doc2Vec is used to learn static features while spectrum information serves as dynamic features, and a RankNet model based on a Siamese multi-layer perceptron scores and ranks statements in the faulty method. Experiments on five real projects from the Defects4J benchmark show that, compared with the traditional SBFL technique and two baseline methods, our approach improves Top-1 accuracy by 262.86%, 29.59%, and 53.01%, respectively, verifying the effectiveness of Two-RGCNFL. Ablation experiments further confirm the importance of inter-class dependencies.
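For context, the classical SBFL baseline this work improves on assigns each program element a suspiciousness score computed from the execution spectrum; the widely used Ochiai formula is sketched below. The data layout is an assumption for illustration, and this is the baseline technique, not the paper's RGCN model:

```python
import numpy as np

def ochiai(covered, failed):
    """Ochiai suspiciousness from an execution spectrum.

    covered: (tests, elements) boolean coverage matrix.
    failed:  (tests,) boolean vector marking failing tests.
    """
    ef = (covered & failed[:, None]).sum(axis=0)   # covered by failing tests
    ep = (covered & ~failed[:, None]).sum(axis=0)  # covered by passing tests
    nf = failed.sum()                              # total failing tests
    denom = np.sqrt(nf * (ef + ep))
    return np.divide(ef, denom, out=np.zeros_like(denom), where=denom > 0)

cov = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1]], dtype=bool)
fail = np.array([True, False, False])
print(ochiai(cov, fail))  # elements hit by the failing test score highest
```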
Abstract: Software-defined networking (SDN) is an innovative paradigm that separates the control and data planes, introducing centralized network control. SDN is increasingly being adopted by carrier-grade networks, offering enhanced network management capabilities compared with those of traditional networks. However, because SDN is expected to ensure high service availability, it faces additional challenges. One of the most critical is efficient detection of, and recovery from, link failures in the data plane. Such failures can significantly degrade network performance and lead to service outages, making resiliency a key concern for the effective adoption of SDN. Since the recovery process depends intrinsically on timely failure detection, this research surveys and analyzes the current literature on both failure detection and recovery approaches in SDN. The survey provides a critical comparison of existing failure detection techniques, highlighting their advantages and disadvantages. It also examines current failure recovery methods, categorized as either restoration-based or protection-based, and offers a comprehensive comparison of their strengths and limitations. Finally, future research challenges and directions are discussed to address the shortcomings of existing failure recovery methods.
Funding: Financial and data support for this work was provided by the U.S. Environmental Protection Agency (No. GS-10F-0205T), and partly by the Guangdong Provincial Key Laboratory of Atmospheric Environment and Pollution Control (No. h2xj D612004 Ш), the State Environmental Protection Key Laboratory of Sources and Control of Air Pollution Complex (No. SCAPC201308), and the Atmospheric Haze Collaboration Control Technology Design project (No. XDB05030400) of the Chinese Academy of Sciences.
Abstract: This article describes the development and implementation of a novel software platform that supports real-time, science-based air quality policy making through a user-friendly interface. The software, RSM-VAT, uses a response surface modeling (RSM) methodology and serves as a visualization and analysis tool (VAT) for three-dimensional air quality data produced by atmospheric models. It features a number of powerful and intuitive data visualization functions for illustrating the complex nonlinear relationship between emission reductions and air quality benefits. A case study of the contiguous U.S. demonstrates that the enhanced RSM-VAT reproduces the air quality model results with a Normalized Mean Bias < 2% and can assist air quality policy making in near real time.
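As an editorial illustration, the response-surface idea can be reduced to fitting a cheap polynomial surrogate that maps emission-control factors to a pollutant response, so new control scenarios can be evaluated without rerunning the atmospheric model. The quadratic basis, the two-factor setup, and the numbers below are assumptions, not the actual RSM-VAT formulation:

```python
import numpy as np

def fit_quadratic_rsm(E, y):
    """Least-squares quadratic surface: y ~ c0 + c1*e1 + c2*e2 + c3*e1*e2 + c4*e1^2 + c5*e2^2."""
    e1, e2 = E[:, 0], E[:, 1]
    A = np.column_stack([np.ones_like(e1), e1, e2, e1 * e2, e1**2, e2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, e1, e2):
    return coef @ np.array([1.0, e1, e2, e1 * e2, e1**2, e2**2])

# E: emission-reduction fractions for, say, NOx and SO2 from a handful of
# full atmospheric-model runs; y: the modeled pollutant concentration.
E = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, 0.5], [0.5, 0.5],
              [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([40.0, 33.0, 36.0, 30.0, 28.0, 33.5, 24.0])
coef = fit_quadratic_rsm(E, y)
print(predict(coef, 0.25, 0.75))  # instant estimate, no new model run needed
```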
Abstract: Software-related security is a growing and legitimate concern, especially with 5G data available at our fingertips. Research in this field requires periodic comparative analysis as new techniques emerge rapidly. The purpose of this study is to review recent developments in integrating security into the software development lifecycle (SDLC) by analyzing articles published over the last two decades, and to propose a way forward. The review follows Kitchenham's review protocol and is divided into three main stages: planning, execution, and analysis. From the 100 selected articles, it becomes evident that a collaborative approach is necessary for addressing critical software security risks (CSSRs) through effective risk management and estimation techniques. Quantifying risks on a numeric scale enables a comprehensive understanding of their severity, facilitating focused resource allocation and mitigation. Through a comprehensive understanding of potential vulnerabilities and the proactive mitigation facilitated by protection poker, organizations can prioritize resources effectively to ensure the successful outcome of projects and initiatives in today's dynamic threat landscape. The review reveals that automated tools for threat analysis and security testing still need to be developed. Accurately estimating the effort required to prioritize potential security risks remains a major challenge in software security; estimation accuracy can be further improved by exploring new techniques, particularly those involving deep learning, and these estimation methods must be validated to ensure all potential security threats are addressed. Another challenge is selecting the right model for each specific security threat; for a comprehensive evaluation, researchers should use well-known benchmark checklists.
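To make the abstract's "numeric scale" point concrete, a minimal likelihood-by-impact scoring sketch follows; the 1-5 scales and the example risks are illustrative assumptions, not data from the reviewed studies:

```python
def risk_score(likelihood, impact, scale=5):
    """Ordinal risk rating: both inputs on a 1..scale scale."""
    assert 1 <= likelihood <= scale and 1 <= impact <= scale
    return likelihood * impact  # 1 (negligible) .. scale**2 (critical)

backlog = {"SQL injection in search": risk_score(4, 5),
           "verbose error messages": risk_score(3, 2)}
for risk, score in sorted(backlog.items(), key=lambda kv: kv[1], reverse=True):
    print(score, risk)  # highest scores get mitigation resources first
```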
Abstract: Software testing is a critical phase: misconceptions about ambiguities in the requirements during specification affect the testing process, making it difficult to identify all faults in the software. As requirements change continuously, irrelevancy and redundancy during testing increase. These challenges reduce fault detection capability, so the testing process needs to be improved based on changes in the requirements specification. In this research, we develop a model that resolves testing challenges through requirement prioritization and prediction in an agile environment. The research objective is to identify the most relevant and meaningful requirements through semantic analysis for correct change analysis. The similarity of requirements is then computed through case-based reasoning, which predicts requirements for reuse and restricts attention to error-prone requirements. Afterward, the Apriori algorithm maps requirement frequency to select relevant test cases, based on frequently reused (or never reused) test cases, to increase the fault detection rate. The proposed model was evaluated experimentally. The results show that semantic analysis reduces requirement redundancy and irrelevancy and correctly predicts requirements, increasing the fault detection rate and yielding high user satisfaction. The predicted requirements are mapped to test cases, increasing the fault detection rate after changes. The model improves requirement redundancy and irrelevancy by more than 90% compared with other clustering methods and the analytic hierarchy process, achieving an 80% fault detection rate at an earlier stage, and thus provides guidelines for practitioners and researchers. In the future, we will provide a working prototype of this model as a proof of concept.
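The case-based reasoning step above hinges on a requirement-similarity measure, which the abstract does not specify; a common stand-in is TF-IDF cosine similarity, sketched below under that assumption (the example requirements are invented):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def most_similar_requirements(new_req, past_reqs, top_k=3):
    """Case-based retrieval: rank past requirements by similarity to a new one."""
    vec = TfidfVectorizer(stop_words="english")
    M = vec.fit_transform(past_reqs + [new_req])   # last row is the new case
    sims = cosine_similarity(M[-1], M[:-1]).ravel()
    return sorted(zip(sims, past_reqs), reverse=True)[:top_k]

past = ["user shall reset password via email",
        "system shall log failed login attempts",
        "user shall update profile photo"]
print(most_similar_requirements("password reset by email link", past, top_k=2))
```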
Abstract: Link failure is a critical issue in large networks and must be addressed effectively. In software-defined networks (SDN), link failure recovery schemes fall into proactive and reactive approaches: reactive schemes have longer recovery times, while proactive schemes recover faster but overwhelm switch memory with flow entries. As SDN adoption grows, efficient recovery from data-plane link failures becomes crucial; data center networks (DCNs) in particular demand rapid recovery and efficient resource utilization to meet carrier-grade requirements. This paper proposes an efficient Decentralized Failure Recovery (DFR) model for SDNs that meets recovery time requirements while optimizing switch memory consumption. DFR enables switches to autonomously reroute traffic upon link failures without involving the controller, achieving fast recovery while minimizing memory usage. It employs the Fast Failover group of the OpenFlow standard for local recovery without controller communication and uses the k-shortest-path algorithm to proactively install backup paths, allowing immediate local recovery without controller intervention and enhancing overall network stability and scalability. To reduce switch memory usage, DFR aggregates flow entries: instead of matching flow entries to the destination host's MAC address, it matches packets to the destination switch's MAC address, reducing the switches' Ternary Content-Addressable Memory (TCAM) consumption. Additionally, DFR modifies Address Resolution Protocol (ARP) replies to provide source hosts with the destination switch's MAC address, facilitating flow-entry aggregation without affecting normal network operation. DFR was evaluated with the Mininet 2.3.1 network emulator and Ryu 3.1 as the SDN controller. Across different numbers of active flows, hosts per edge switch, and network sizes, the proposed model outperformed several failure recovery models (restoration-based, protection by flow entries, protection by group entries, and protection by VLAN tagging) in recovery time, switch memory consumption, and controller overhead, measured as the number of flow-entry updates needed to recover from a failure. Experimental results demonstrate that DFR achieves recovery times under 20 milliseconds, satisfying carrier-grade requirements for rapid failure recovery; it also reduces switch memory usage by up to 95% compared with traditional protection methods and minimizes controller load by eliminating controller intervention during failure recovery. These results underscore the efficiency and scalability of the DFR model, making it a practical solution for enhancing network resilience in SDN environments.
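The backup-path step named above is a k-shortest-paths computation over the topology graph; as a rough sketch of that single step (the toy topology and k are illustrative, and this is not the DFR codebase), networkx's Yen-style generator can be used:

```python
from itertools import islice
import networkx as nx

def k_shortest_paths(G, src, dst, k=3, weight="weight"):
    """First k loopless shortest paths: primary route plus pre-installable backups."""
    return list(islice(nx.shortest_simple_paths(G, src, dst, weight=weight), k))

G = nx.Graph()
G.add_weighted_edges_from([("s1", "s2", 1), ("s2", "s4", 1),
                           ("s1", "s3", 1), ("s3", "s4", 1), ("s1", "s4", 3)])
print(k_shortest_paths(G, "s1", "s4", k=2))  # e.g. [['s1','s2','s4'], ['s1','s3','s4']]
```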
Abstract: Software defect prediction (SDP) aims to find a reliable method for predicting defects in specific software projects, helping software engineers allocate limited resources to release high-quality software products. Defect prediction can be performed effectively using traditional features, but some of those features are redundant or irrelevant (their presence or absence has little effect on the prediction results). These problems can be addressed with feature selection; however, existing feature selection methods suffer from weak dimensionality reduction and low classification accuracy on the selected feature subset. To reduce the impact of these shortcomings, this paper proposes a new feature selection method, the Cubic TraverseMa Beluga whale optimization algorithm (CTMBWO), based on an improved Beluga whale optimization algorithm (BWO). The goal of this study is to determine how well CTMBWO can extract the features most important for correctly predicting software defects, improve fault prediction accuracy, reduce the number of selected features, and mitigate the risk of overfitting, thereby achieving more efficient resource utilization and better distribution of the testing workload. CTMBWO comprises three main stages: preprocessing the dataset, selecting relevant features, and evaluating the model's classification performance. Experiments on two software defect datasets (PROMISE, NASA) report classification performance on five evaluation metrics: Accuracy, F1-score, MCC, AUC, and Recall. The results indicate that the approach achieves outstanding classification performance on both datasets and improves significantly over the baseline models.
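The abstract does not give the CTMBWO operators themselves; what such wrapper feature-selection methods share is a fitness function that binarizes a continuous search position into a feature mask (commonly via a sigmoid transfer function) and trades classification error against subset size. A minimal sketch under those assumptions (the classifier choice and the 0.99 weight are illustrative):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(position, X, y, alpha=0.99):
    """Wrapper fitness for a continuous metaheuristic doing feature selection."""
    mask = 1.0 / (1.0 + np.exp(-position)) > 0.5   # sigmoid transfer -> binary mask
    if not mask.any():
        return 1.0                                 # empty subset: worst fitness
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask], y, cv=5).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.mean()  # error + size penalty

X, y = make_classification(n_samples=200, n_features=20, random_state=0)
print(fitness(np.random.default_rng(0).normal(size=20), X, y))  # lower is better
```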
Funding: National Key R&D Program of China (Nos. 2023YFC3806900, 2022YFE0141400).
Abstract: The advent of parametric design has markedly increased the complexity of buildings, and traditional construction methods struggle to meet the resulting demands, making construction robots a pivotal production tool. Since the arm span of a single robot usually does not exceed 3 meters, it cannot by itself produce large-scale building components, so the robot's working range is often extended with external axes. However, the coupled control of external axes and robots, and their kinematic solution, are key challenges. The primary technical difficulties include customizing construction robots, automatically solving for external axes, handling fixed axis joints, and controlling specific motion modes. This paper proposes solutions to these difficulties, introduces the relevant basic concepts and algorithms in detail, and encapsulates these robotics principles and algorithms as a plug-in for Grasshopper, the tool commonly used by architects, to form the FURobot software platform. The platform effectively solves the above problems, lowers the threshold for architects, and improves production efficiency. The effectiveness of the algorithms and software is verified through simulation experiments.
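As a toy illustration of the external-axis coupling problem described above, the redundant degree of freedom a linear track adds can be resolved by choosing the track position first and handing the residual target to the arm's ordinary inverse kinematics. Everything below (axis direction, travel limits, preferred reach) is an assumption for illustration, not FURobot's actual solver:

```python
import numpy as np

def split_rail_and_arm(target_xyz, rail_axis=np.array([1.0, 0.0, 0.0]),
                       rail_limits=(0.0, 10.0), preferred_reach=1.5):
    """Resolve the extra DOF of a linear external axis.

    Place the track carriage so the target sits at the arm's preferred reach
    along the track direction, clamped to the travel limits; the residual
    target (in the moving base frame) then goes to the ordinary 6-axis IK.
    """
    along = float(target_xyz @ rail_axis)
    rail_pos = float(np.clip(along - preferred_reach, *rail_limits))
    target_in_base = target_xyz - rail_pos * rail_axis
    return rail_pos, target_in_base

rail, arm_target = split_rail_and_arm(np.array([6.2, 0.8, 1.4]))
# rail == 4.7; arm_target == [1.5, 0.8, 1.4], now within the arm's envelope
```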
Funding: Supported in part by the Education Reform Key Projects of Heilongjiang Province (Grant Nos. SJGZ20220011 and SJGZ20220012) and the Excellent Project of the Ministry of Education and the China Higher Education Association on Digital Ideological and Political Education in Universities (Grant No. GXSZSZJPXM001).
Abstract: This paper proposes a multivariate data fusion based quality evaluation model for software talent cultivation. The model constructs a comprehensive ability and quality evaluation index system for college students from an engineering perspective, especially that of software engineering. As for the evaluation method, the model relies on students' behavioral data during their school years and aims to be as objective as possible, effectively weakening the negative impact of subjective personal assumptions on the evaluation results.
Abstract: In the dynamic landscape of software technologies, the demand for sophisticated applications across diverse industries is ever-increasing, yet predicting software defects remains a crucial challenge for ensuring the resilience and dependability of software systems. This study presents a novel software defect prediction technique that significantly enhances performance through a hybrid machine learning approach. The methodology integrates a Genetic Algorithm (GA) for precise feature selection with a Decision Tree (DT) for robust classification, and leverages Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) algorithms for precision-driven optimization. Datasets from varied sources enrich the model's predictive power. Particular attention is given to refining the prediction process through a highly tuned PSO-ACO algorithm, thereby optimizing the efficiency and effectiveness of the GA-DT hybrid model. The proposed approach is evaluated across seven software projects. The results demonstrate that GA-DT with the PSO-ACO algorithm surpasses its counterparts in accuracy and reliability; furthermore, the hybrid approach shows outstanding performance in terms of F-measure, with an improvement rate of 78%.
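A minimal sketch of the GA-DT core follows: a GA evolving a binary feature mask scored by a Decision Tree wrapper. The PSO-ACO refinement stage is omitted, and all hyperparameters and operators here are assumptions for illustration, not the paper's implementation:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

def ga_select_features(X, y, pop=20, gens=15, p_mut=0.05, seed=0):
    """Minimal GA for feature selection with a Decision Tree wrapper fitness."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    P = rng.random((pop, n)) < 0.5                     # initial random masks

    def fitness(mask):
        if not mask.any():
            return 0.0                                  # empty subset is useless
        clf = DecisionTreeClassifier(random_state=0)
        return cross_val_score(clf, X[:, mask], y, cv=3).mean()

    for _ in range(gens):
        scores = np.array([fitness(m) for m in P])
        parents = P[np.argsort(scores)[-(pop // 2):]]   # keep the better half
        children = []
        for i in range(pop // 2):
            a, b = parents[i % len(parents)], parents[(i + 1) % len(parents)]
            cut = rng.integers(1, n)                    # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(n) < p_mut              # bit-flip mutation
            children.append(child)
        P = np.vstack([parents, np.array(children)])
    scores = np.array([fitness(m) for m in P])
    return P[scores.argmax()]                           # best mask found

X, y = make_classification(n_samples=300, n_features=25, n_informative=6, random_state=0)
mask = ga_select_features(X, y)
print(mask.sum(), "features kept")
```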