Mental-health risk detection seeks early signs of distress in social media posts and clinical transcripts to enable timely intervention before crises. When such risks go undetected, consequences can escalate to self-harm, long-term disability, reduced productivity, and significant societal and economic burden. Despite recent advances, detecting risk from online text remains challenging due to heterogeneous language, evolving semantics, and the sequential emergence of new datasets. Effective solutions must encode clinically meaningful cues, reason about causal relations, and adapt to new domains without forgetting prior knowledge. To address these challenges, this paper presents a Continual Neuro-Symbolic Graph Learning (CNSGL) framework that unifies symbolic reasoning, causal inference, and continual learning within a single architecture. Each post is represented as a symbolic graph linking clinically relevant tags to textual content, enriched with causal edges derived from directional Point-wise Mutual Information (PMI). A two-layer Graph Convolutional Network (GCN) encodes these graphs, and a Transformer-based attention pooler aggregates node embeddings while providing interpretable tag-level importance scores. Continual adaptation across datasets is achieved through the Multi-Head Freeze (MH-Freeze) strategy, which freezes a shared encoder and incrementally trains lightweight task-specific heads (small classifiers attached to the shared embedding). Experimental evaluations across six diverse mental-health datasets, ranging from Reddit discourse to clinical interviews, demonstrate that MH-Freeze consistently outperforms existing continual-learning baselines in both discriminative accuracy and calibration reliability. Across the six datasets, MH-Freeze achieves up to 0.925 accuracy and 0.923 F1-score, with AUPRC ≥ 0.934 and AUROC ≥ 0.942, surpassing all continual-learning baselines. The results confirm the framework's ability to preserve prior knowledge, adapt to domain shifts, and maintain causal interpretability, establishing CNSGL as a promising step toward robust, explainable, and lifelong mental-health risk assessment.
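The abstract does not spell out how the directional PMI edge weights are computed; as a rough illustration of the general idea, the sketch below scores an ordered pair of tags (a → b) by how often a precedes b within the same post, relative to how often each appears on its own. The tokenised posts, the tag labels, and the precedence-based definition are assumptions made for this example, not the CNSGL formulation.

```python
import math
from collections import Counter

def directional_pmi(token_sequences, min_count=1):
    # Count, per post, whether tag a appears before tag b, then score the
    # ordered pair with PMI over document-level probabilities.
    unigrams = Counter()
    ordered_pairs = Counter()
    n_seqs = len(token_sequences)
    for seq in token_sequences:
        seen = set(seq)
        unigrams.update(seen)
        first = {}
        for i, tok in enumerate(seq):
            first.setdefault(tok, i)
        ordered_pairs.update({(a, b) for a in seen for b in seen
                              if a != b and first[a] < first[b]})
    scores = {}
    for (a, b), c_ab in ordered_pairs.items():
        if c_ab < min_count:
            continue
        p_ab = c_ab / n_seqs
        p_a = unigrams[a] / n_seqs
        p_b = unigrams[b] / n_seqs
        scores[(a, b)] = math.log(p_ab / (p_a * p_b))
    return scores

# Toy posts already reduced to clinically relevant tags (hypothetical labels).
posts = [["insomnia", "hopeless", "self_harm"],
         ["insomnia", "hopeless"],
         ["work_stress", "tired"]]
edges = directional_pmi(posts)
print(round(edges[("insomnia", "hopeless")], 3))   # positive score -> candidate edge
```

A positive score for a pair such as ("insomnia", "hopeless") would then justify a directed edge from the first tag to the second in the post graph.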
A new method for designing and implementing component-based distributed and hierarchical flexible manufacturing control software is described in this paper, built around the component concept. The proposed method aims to improve the flexibility and reliability of the control system. After describing the concepts of component-based software and distributed object technology, the architecture of the component-based control-system software is proposed on the basis of the Common Object Request Broker Architecture (CORBA). A design method for a component-based distributed and hierarchical flexible manufacturing control system is then presented. Finally, to verify the software design method, a prototype flexible manufacturing control system has been implemented in Orbix 2.3c and VC++ 6.0 and tested in connection with the physical flexible manufacturing shop at the WuXi Professional Institute.
The research purpose is the invention (construction) of a formal logical inference of the Law of Conservation of Energy within a logically formalized axiomatic epistemology-and-axiology theory Sigma from a precisely defined assumption of the a-priori-ness of knowledge. To realize this aim, the following work has been done: 1) a two-valued algebraic system of formal axiology has been defined precisely and applied to the philosophy of physics proper, namely to an almost unknown (not-recognized) formal-axiological aspect of the physical law of conservation of energy; 2) the formal axiomatic epistemology-and-axiology theory Sigma has been defined precisely and applied to physics proper for realizing the above-indicated purpose. Thus, a discrete mathematical model of the relationship between the philosophy of physics and universal epistemology united with formal axiology has been constructed. Results: 1) By accurately computing relevant compositions of evaluation-functions within the discrete mathematical model, it is demonstrated that a formal-axiological analog of the great conservation law of physics proper is a formal-axiological law of the two-valued algebra of metaphysics. (A precise algorithmic definition of the uncommon (not well-known) notion "formal-axiological law of algebra of metaphysics" is given.) 2) The hitherto unpublished and significantly new nontrivial scientific result of the investigation presented in this article is a formal logical inference of the law of conservation of energy within the formal axiomatic theory Sigma from the conjunction of the formal-axiological analog of the law of conservation of energy and the assumption of the a-priori-ness of knowledge.
In view of the flaws of component-based software (CBS) reliability modeling and analysis, namely the low recognition of the debugging process, too many assumptions, and difficulties in obtaining solutions, a CBS reliability simulation process is presented that incorporates imperfect debugging and the limitation of debugging resources. Considering the effect of imperfect debugging on the fault detection and correction processes, a CBS integration testing model is sketched as a multi-queue multichannel and finite server queuing model (MMFSQM). Compared with the parameter-based analytical method and other nonparametric approaches, the simulation approach can relax more of the usual reliability modeling assumptions and effectively describe the CBS integration testing process. A CBS reliability process simulation procedure is then developed accordingly. The proposed simulation approach is shown to be sound and effective by simulation experiment studies and analysis.
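To make the queuing idea concrete, here is a minimal, self-contained event simulation in the spirit of a multi-queue multichannel finite-server model: detected faults wait for one of a fixed number of debuggers, and each correction attempt succeeds only with some probability, otherwise the fault re-enters the queue. The rates, the number of debuggers, and the exponential distributions are illustrative assumptions; the paper's MMFSQM formulation is not reproduced here.

```python
import heapq
import random

def simulate_debugging(total_faults=100, detect_rate=2.0, correct_rate=0.8,
                       n_debuggers=3, p_fix=0.9, seed=1):
    """Toy event-driven simulation of CBS integration testing with a finite
    pool of debuggers (servers) and imperfect debugging."""
    rng = random.Random(seed)
    events, t = [], 0.0
    for _ in range(total_faults):               # pre-sample fault detection times
        t += rng.expovariate(detect_rate)
        heapq.heappush(events, (t, "detect"))
    waiting, free, corrected = 0, n_debuggers, 0
    log = []
    while events:
        t, kind = heapq.heappop(events)
        if kind == "detect":
            waiting += 1
        else:                                   # a correction attempt finished
            free += 1
            if rng.random() < p_fix:
                corrected += 1                  # perfect fix
            else:
                waiting += 1                    # imperfect fix: fault re-queued
        while free > 0 and waiting > 0:         # start new correction attempts
            waiting -= 1
            free -= 1
            heapq.heappush(events, (t + rng.expovariate(correct_rate), "finish"))
        log.append((round(t, 2), corrected))
    return log

print(simulate_debugging()[-1])   # (time of last event, faults corrected)
```

Running the simulation repeatedly with different seeds gives empirical fault detection and correction curves against which analytical assumptions can be checked.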
For the increased and varied communication requirements of modern applications on embedded systems, general-purpose protocol stacks and protocol models are not efficient because they are fixed to execute in a static mode. We present the Component-Based Communication Protocol Architecture (CCPA) to make communication dynamic and configurable. It can develop, test, and store customized components for flexible reuse. The protocols are implemented by component assembly and supported by configurable environments. This leads to a smaller memory footprint, more flexibility, better reconfigurability, better concurrency, and multiple data channel support.
Mobile phones are becoming a primary platform for information access. A major aspect of ubiquitous computing is context-aware applications, which collect information about the environment the user is in and use this information to provide better service and improve the user experience. Location awareness makes certain applications possible, e.g., recommending nearby businesses and tracking estimated routes. An Android application is able to collect useful Wi-Fi information without registering a location listener with a network-based provider. We passively collected the IDs of Wi-Fi access points and the received signal strengths. We developed and implemented an algorithm to analyse the data and designed heuristics to infer the location of the device over time, all without ever connecting to the network, thus maximally preserving the privacy of the user.
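A common heuristic for turning passive scan data into a position estimate is a signal-strength-weighted centroid over access points with known coordinates. The sketch below assumes a hypothetical table of AP positions and a simple RSSI-to-weight mapping; the paper's actual algorithm and heuristics may differ.

```python
# Hypothetical AP position table (BSSID -> (x, y) in metres); in practice this
# would come from a site survey or an external AP database.
AP_POSITIONS = {
    "aa:bb:cc:00:00:01": (0.0, 0.0),
    "aa:bb:cc:00:00:02": (10.0, 0.0),
    "aa:bb:cc:00:00:03": (0.0, 10.0),
}

def weighted_centroid(scan, positions=AP_POSITIONS):
    """Estimate device position from one passive scan given as a list of
    (bssid, rssi_dbm) pairs; stronger (less negative) RSSI gets more weight."""
    pts, weights = [], []
    for bssid, rssi in scan:
        if bssid in positions:
            pts.append(positions[bssid])
            weights.append(10 ** (rssi / 20.0))   # rough signal-strength weight
    if not pts:
        return None
    total = sum(weights)
    x = sum(w * p[0] for w, p in zip(weights, pts)) / total
    y = sum(w * p[1] for w, p in zip(weights, pts)) / total
    return (x, y)

scan = [("aa:bb:cc:00:00:01", -45), ("aa:bb:cc:00:00:02", -70),
        ("aa:bb:cc:00:00:03", -72)]
print(weighted_centroid(scan))   # estimate pulled toward the strongest AP
```

Applying such an estimator to a time-ordered sequence of scans yields a coarse trajectory of the device without any network connection.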
Human Immunodeficiency Virus (HIV) dynamics in Africa are characterised by sparse sampling of DNA sequences from infected individuals. Some sub-groups are more at risk than the general population; these sub-groups have higher infectivity rates. We developed a likelihood-inference model of a multi-type birth-death process that can be used to make inferences about an HIV epidemic in an African setting. The likelihood inference incorporates a probability of removal from the infectious pool. We simulated trees, made parameter inferences on the simulated trees, and investigated whether the model distinguishes between heterogeneous and homogeneous dynamics. The model makes fairly good parameter inferences and distinguishes well between heterogeneous and homogeneous dynamics. Parameter estimation was also performed under a sparse sampling scenario. We investigated whether trees obtained from a structured host population are more balanced than those from a non-structured host population using tree statistics that measure tree balance and imbalance. Trees from the non-structured population were more balanced according to the Colless and Sackin indices.
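The Colless and Sackin statistics mentioned at the end are standard tree-balance indices and are straightforward to compute. The sketch below does so for small binary trees encoded as nested tuples; the tree encoding is an assumption made for this example.

```python
def leaf_depths(tree, depth=0):
    """A tree is a leaf label (str) or a (left, right) tuple for an internal node."""
    if not isinstance(tree, tuple):
        return [depth]
    left, right = tree
    return leaf_depths(left, depth + 1) + leaf_depths(right, depth + 1)

def n_leaves(tree):
    return 1 if not isinstance(tree, tuple) else n_leaves(tree[0]) + n_leaves(tree[1])

def sackin(tree):
    """Sackin index: sum of leaf depths; larger values mean a less balanced tree."""
    return sum(leaf_depths(tree))

def colless(tree):
    """Colless index: sum over internal nodes of |#leaves(left) - #leaves(right)|."""
    if not isinstance(tree, tuple):
        return 0
    left, right = tree
    return abs(n_leaves(left) - n_leaves(right)) + colless(left) + colless(right)

caterpillar = ((("a", "b"), "c"), "d")     # maximally unbalanced on 4 leaves
balanced = (("a", "b"), ("c", "d"))
print(sackin(caterpillar), colless(caterpillar))   # 9 3
print(sackin(balanced), colless(balanced))         # 8 0
```

Comparing the index distributions of simulated trees from structured and non-structured populations is exactly the kind of test described in the abstract.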
For a more accurate and comprehensive assessment of the trustworthiness of a component-based software system, the fuzzy analytic hierarchy process is introduced to establish the analysis model. Combining qualitative and quantitative analyses, the impacts of different types of components on overall trustworthiness are distinguished. Considering the coupling relationships between components, dividing the system into several layers from the target layer to the scheme layer, and evaluating the advantages and disadvantages of each scheme by group decision-making, the trustworthiness of a typical J2EE-structured component-based software system is assessed. The resulting trustworthiness assessment model for software components provides an effective and practical method.
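As an illustration of how fuzzy AHP turns pairwise judgments into weights, the sketch below applies Buckley's geometric-mean method to a triangular-fuzzy comparison matrix and defuzzifies by the centroid. The two-criterion example and the fuzzy scale are assumptions; the cited work may use a different fuzzy-AHP variant and a deeper hierarchy.

```python
def fuzzy_ahp_weights(matrix):
    """Buckley's geometric-mean method for a triangular-fuzzy pairwise
    comparison matrix; matrix[i][j] is a triple (l, m, u). Returns crisp,
    normalised weights."""
    n = len(matrix)
    geo = []
    for i in range(n):
        l = m = u = 1.0
        for j in range(n):
            lj, mj, uj = matrix[i][j]
            l *= lj; m *= mj; u *= uj
        geo.append((l ** (1 / n), m ** (1 / n), u ** (1 / n)))
    sl = sum(g[0] for g in geo); sm = sum(g[1] for g in geo); su = sum(g[2] for g in geo)
    fuzzy_w = [(g[0] / su, g[1] / sm, g[2] / sl) for g in geo]   # fuzzy division
    crisp = [(a + b + c) / 3 for a, b, c in fuzzy_w]             # centroid defuzzification
    total = sum(crisp)
    return [w / total for w in crisp]

# Example: criterion 1 judged moderately more important than criterion 2.
inv = lambda t: (1 / t[2], 1 / t[1], 1 / t[0])
one = (1, 1, 1)
moderate = (2, 3, 4)
pairwise = [[one, moderate],
            [inv(moderate), one]]
print(fuzzy_ahp_weights(pairwise))   # roughly [0.74, 0.26]
```

In a full assessment, such weights would be computed at every level of the hierarchy and aggregated from the scheme layer up to the target layer.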
The global Internet is a complex network of interconnected autonomous systems (ASes). Understanding Internet inter-domain path information is crucial for understanding, managing, and improving the Internet. Path information can also help protect user privacy and security. However, due to the complicated and heterogeneous structure of the Internet, path information is not publicly available, and obtaining it is challenging given the limited measurement probes and collectors. Therefore, inferring Internet inter-domain paths from the limited data is a supplementary approach to measuring them. The purpose of this survey is to provide an overview of techniques for inferring Internet inter-domain paths published from 2005 to 2023 and to present the main lessons from these studies. To this end, we summarize the inter-domain path inference techniques based on the granularity of the paths; for each method, we describe the data sources, the key ideas, the advantages, and the limitations. To help readers understand the path inference techniques, we also summarize the background techniques for path inference, such as techniques to measure the Internet, infer AS relationships, resolve aliases, and map IP addresses to ASes. A case study of the existing techniques is also presented to show the real-world applications of inter-domain path inference. Additionally, we discuss the challenges and opportunities in inferring Internet inter-domain paths, the drawbacks of the state-of-the-art techniques, and future directions.
To address the deficiencies of component-based software (CBS) reliability modeling and analysis, such as importing too many assumptions and paying too little attention to the debugging process without adequately considering imperfect debugging and change-point (CP) problems, an approach to CBS reliability process analysis is proposed that incorporates imperfect debugging and CP. First, perfect/imperfect debugging and CP are reviewed. Based on queuing theory, a multi-queue multichannel and infinite server queuing model (MMISQM) is presented to sketch the integration test process of CBS. Meanwhile, considering the effects of imperfect debugging and CP, expressions for fault detection and correction are derived based on the MMISQM. Numerical results demonstrate that the proposed model can describe the integration test process of CBS with preferable performance, outperforming other models.
In view of the problems and weaknesses of component-based software (CBS) reliability modeling and analysis, and the lack of consideration of the real debugging circumstances of integration testing, a CBS reliability process analysis model is proposed that incorporates debugging time delay, imperfect debugging, and limited debugging resources. CBS integration testing is formulated as a multi-queue multichannel and finite server queuing model (MMFSQM) to illustrate the fault detection process (FDP) and fault correction process (FCP). A unified FCP is sketched, given debugging delay, the diversity of fault processing, and the limitations of debugging resources. Furthermore, the impacts of imperfect debugging on fault detection and correction are explicitly elaborated, and expressions for the cumulative numbers of faults detected and corrected are given. Finally, the results of numerical experiments verify the effectiveness and rationality of the proposed model. By comparison, the proposed model is superior to the other models; it is closer to the real CBS testing process and facilitates software engineers' quantitative analysis, measurement, and prediction of CBS reliability.
Since most of the available component-based software reliability models incur high computational cost and suffer from evaluation complexity for software systems with complex structures, a component-based back-propagation reliability model (CBPRM) with low complexity for complex software system reliability evaluation is presented in this paper. The proposed model is based on artificial neural networks and component reliability sensitivity analyses. These analyses are performed dynamically and assigned to the neurons to optimize the reliability evaluation. CBPRM has linearly increasing complexity and outperforms the state-based and path-based reliability models. Another advantage of CBPRM over others is its robustness: CBPRM depends on the component reliabilities and the correlative sensitivities, which are independent of the software system structure. Theoretical analysis and experimental results show that the complexity of CBPRM is evidently lower than that of the contrast models and that the reliability evaluation accuracy is acceptable when the software system structure is complex.
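CBPRM itself is specified in the abstract only as a neural model fed with component reliabilities and sensitivities, so the following is a toy stand-in rather than the published model: a one-hidden-layer network trained by plain gradient descent to map component reliabilities to a synthetic system reliability (here, a series-system product). The architecture, data, and training settings are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: 5-component system; the "true" system reliability
# used as the target is simply the series-system product of the components.
X = rng.uniform(0.7, 1.0, size=(2000, 5))
y = X.prod(axis=1, keepdims=True)

# One-hidden-layer network trained by full-batch gradient descent on MSE.
W1 = rng.normal(0, 0.5, (5, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.5
for epoch in range(2000):
    h = np.tanh(X @ W1 + b1)
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))          # sigmoid keeps output in (0, 1)
    grad_out = (out - y) * out * (1 - out) / len(X)  # backprop through loss and sigmoid
    gW2 = h.T @ grad_out; gb2 = grad_out.sum(0)
    grad_h = (grad_out @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ grad_h; gb1 = grad_h.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

test = np.array([[0.95, 0.9, 0.99, 0.85, 0.92]])
h = np.tanh(test @ W1 + b1)
pred = 1 / (1 + np.exp(-(h @ W2 + b2)))
print(pred.item(), float(test.prod()))   # compare learned estimate with the series product
```

The appeal of such a learned evaluator, as the abstract argues, is that a forward pass costs the same regardless of how complex the system structure is.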
Methods and approaches are discussed that identify and filter out interfering factors (noise) superimposed on primary signals, based on the Adaptive-Network-Based Fuzzy Inference System. The influences of the zonal winds in the equatorial eastern and middle/western Pacific on the SSTA in the equatorial region, and their contribution to the latter, are diagnosed and verified with observations of a number of significant El Niño and La Niña episodes, and new viewpoints are proposed. Wavelet decomposition and reconstruction are used to build a predictive model based on independent frequency domains, which shows some advantages in composite prediction and prediction validity. The methods presented above are non-linear, error-tolerant, and auto-adaptive/learning, in addition to offering rapid and easy access, illustrative and quantitative presentation, and analyzed results that agree generally with the facts. They are useful in diagnosing and predicting the El Niño and La Niña problems that are only roughly described in dynamics.
If the components in a component-based software system come from different sources, their characteristics may differ. Therefore, evaluating the reliability of a component-based system with a single fixed model for all components is not reasonable. To solve this problem, this paper combines single reliability growth models with an architecture-based reliability model and proposes an optimal selection approach. First, the most appropriate model for each component is selected according to the component's historical reliability data, so that the evaluation deviation is smallest. Then, system reliability is evaluated according to both the relationships among components and the usage frequency of each component. As the approach takes into account the historical data and the usage frequency of each component, the evaluation and prediction results are more accurate than those obtained using a single model.
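A minimal sketch of the two-step idea, per-component model selection followed by an architecture-weighted combination, is given below. It fits two common reliability growth curves (Goel-Okumoto and delayed S-shaped) to each component's cumulative fault data, keeps the better fit, and combines the resulting component reliabilities weighted by usage frequency. The candidate models, the grid-search fitting, the combination rule, and the toy data are assumptions, not the paper's exact procedure.

```python
import numpy as np

def fit_srgm(t, y, shape):
    """Least-squares fit of a growth model to cumulative fault counts y at times t.
    'shape' is the model's unit curve f(t; b); the scale a is solved in closed form."""
    best = None
    for b in np.linspace(0.01, 2.0, 200):
        f = shape(t, b)
        a = float(f @ y / (f @ f))                 # optimal scale for this b
        sse = float(((a * f - y) ** 2).sum())
        if best is None or sse < best[0]:
            best = (sse, a, b)
    return best                                    # (sse, a, b)

goel_okumoto = lambda t, b: 1 - np.exp(-b * t)
delayed_s    = lambda t, b: 1 - (1 + b * t) * np.exp(-b * t)

def component_reliability(t, y, mission=1.0):
    """Pick the better-fitting model for one component, then return
    R(mission | T) = exp(-(mu(T + mission) - mu(T))) at the end of testing T."""
    fits = {name: fit_srgm(t, y, f) for name, f in
            [("GO", goel_okumoto), ("DSS", delayed_s)]}
    name = min(fits, key=lambda k: fits[k][0])
    _, a, b = fits[name]
    mu = lambda x: a * (goel_okumoto(x, b) if name == "GO" else delayed_s(x, b))
    T = t[-1]
    return name, float(np.exp(-(mu(T + mission) - mu(T))))

# Toy per-component fault data and usage frequencies (hypothetical numbers).
t = np.arange(1.0, 11.0)
components = {"parser": (np.array([5, 9, 12, 14, 16, 17, 18, 18, 19, 19.0]), 3.0),
              "store":  (np.array([1, 3, 6, 9, 11, 13, 14, 15, 15, 16.0]), 1.0)}
system_rel = 1.0
for comp, (faults, visits) in components.items():
    model, r = component_reliability(t, faults)
    system_rel *= r ** visits                      # usage-frequency weighted combination
    print(comp, model, round(r, 4))
print("system reliability estimate:", round(system_rel, 4))
```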
In a component-based software development life cycle, the selection of pre-existing components is an important task. Every component that is to be reused carries an associated risk of failing to meet the functional and non-functional requirements. A component's failure would lead a developer to look for some other alternative among the possible candidate combinations of COTS, in-house, and engineered components. This means the design itself can readily change. The very process of designing a software system and selecting components thus appears heavily dependent on testing results. Instability of design becomes even more severe due to requirements change requests. Therefore, this instability of design must be mitigated by using proper design and testing approaches; otherwise, it may lead to exorbitantly high testing cost due to the repeated testing of various alternatives. How are these three activities interrelated: component-based software design, component selection, and component-based software testing? What process model is best suited to address this concern? This work explores these questions and their implications for the nature of a process model that can be convincing in the case of component-based software development.
Electrical resistivity tomography (ERT) has been used to experimentally detect shallow buried faults in urban areas in the past few years, with some progress and experience obtained. Based on the results from the Olympic Park in Beijing and from Shandong, Gansu, and Shanxi Provinces, we have generalized the method and procedure for inferring, from resistivity tomograms, a discontinuity of electrical structures (DES) that indicates a buried fault in urban areas, together with its typical electrical features. In general, the layered feature of the electrical structure is first analyzed to preliminarily determine whether a DES exists in the target area. Resistivity contours in the tomograms are then analyzed from deep to shallow. If they extend upward from deep to shallow and form an integral dislocation, a sharp flexure (convergence), or a gradient zone, a DES is inferred to exist, indicating a buried fault. Finally, horizontal tracing is carried out to define the trend of the DES. The DES can be divided into three types: AB, ABA, and AC. In the present paper, the Zhangdian-Renhe fault system in Zibo city is used as an example to illustrate how to use the method to infer the location and spatial extension of a target fault. Geologic drilling holes were placed based on our research results, and the drilling logs confirm that the results are correct. However, the method of this paper is neither exclusive nor rigid; it is expected to provide reference and assistance for inferring shallow buried faults in urban areas from resistivity tomograms in the future.
Robustness against measurement uncertainties is crucial for gas turbine engine diagnosis. While current research focuses mainly on measurement noise, measurement bias remains challenging. This study proposes a novel performance-based fault detection and identification (FDI) strategy for twin-shaft turbofan gas turbine engines and addresses these uncertainties through a first-order Takagi-Sugeno-Kang (TSK) fuzzy inference system. To handle ambient condition changes, we use parameter correction to preprocess the raw measurement data, which reduces the FDI system's complexity. Additionally, the power-level angle is set as a scheduling parameter to reduce the number of rules in the TSK-based FDI system. The data for designing, training, and testing the proposed FDI strategy are generated using a component-level turbofan engine model. The antecedent and consequent parameters of the TSK-based FDI system are optimized using the particle swarm optimization algorithm and ridge regression. A robust structure combining a specialized fuzzy inference system with the TSK-based FDI system is proposed to handle measurement biases. The performance of the first-order TSK-based FDI system and the robust FDI structure is evaluated through comprehensive simulation studies. Comparative studies confirm the superior accuracy of the first-order TSK-based FDI system in fault detection, isolation, and identification. The robust structure demonstrates a 2%-8% improvement in the success rate index under relatively large measurement bias conditions, indicating excellent robustness. Accuracy against significant bias values and computation time are also evaluated, suggesting that the proposed robust structure has desirable online performance. Overall, this study proposes a novel FDI strategy that effectively addresses measurement uncertainties.
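For readers unfamiliar with first-order TSK inference, the sketch below shows the core computation: Gaussian antecedent memberships, product firing strengths, linear rule consequents, and a normalized weighted sum. The two rules and their parameters are hypothetical; the paper's system additionally schedules rules by power-level angle and tunes parameters with particle swarm optimization and ridge regression.

```python
import numpy as np

def gauss(x, c, s):
    # Gaussian membership degree of x for a fuzzy set with centre c and width s.
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def tsk_first_order(x, rules):
    """Minimal first-order TSK inference for one input vector x. Each rule is
    (centres, sigmas, coeffs): Gaussian antecedents per input and a linear
    consequent y_r = coeffs[0] + coeffs[1:] . x. The output is the
    firing-strength-weighted average of the rule consequents."""
    firing, outputs = [], []
    for centres, sigmas, coeffs in rules:
        w = np.prod([gauss(xi, c, s) for xi, c, s in zip(x, centres, sigmas)])
        firing.append(w)
        outputs.append(coeffs[0] + np.dot(coeffs[1:], x))
    firing = np.asarray(firing)
    return float(np.dot(firing, outputs) / firing.sum())

# Two toy rules over two normalised engine measurements (hypothetical values).
rules = [(np.array([0.2, 0.3]), np.array([0.15, 0.15]), np.array([0.1, 0.5, -0.2])),
         (np.array([0.8, 0.7]), np.array([0.15, 0.15]), np.array([0.9, -0.3, 0.4]))]
print(tsk_first_order(np.array([0.25, 0.35]), rules))
```

In an FDI setting, several such outputs (one per health parameter or fault class) would be thresholded to detect, isolate, and quantify faults.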