Dempster-Shafer evidence theory is broadly employed in the research of multi-source information fusion. Nevertheless, when fusing highly conflicting evidence it may produce counterintuitive outcomes. To address this issue, a fusion approach based on a newly defined belief exponential divergence and Deng entropy is proposed. First, a belief exponential divergence is proposed as the conflict measure between bodies of evidence. Then, the credibility of each body of evidence is calculated. Afterwards, Deng entropy is used to calculate the information volume and determine the uncertainty of each body of evidence. Next, the weight of each body of evidence is obtained by integrating its credibility and uncertainty. Ultimately, the initial evidence is amended and fused using Dempster's rule of combination. The effectiveness of this approach in resolving three typical conflict paradoxes is demonstrated by numerical examples. Additionally, the proposed approach is applied to aerial target recognition and iris dataset-based classification to validate its efficacy. Results indicate that the proposed approach can enhance the accuracy of target recognition and effectively address the fusion of conflicting evidence.
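The two building blocks named in this abstract, Dempster's rule of combination and Deng entropy, are standard and easy to sketch. The mass functions below are made up for illustration, not taken from the paper:

```python
import math

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset of hypotheses -> mass)
    with Dempster's rule, normalizing out the conflicting mass."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass falling on the empty set
    k = 1.0 - conflict  # normalization factor
    return {s: v / k for s, v in combined.items()}

def deng_entropy(m):
    """Deng entropy: -sum m(A) * log2(m(A) / (2^|A| - 1))."""
    return -sum(v * math.log2(v / (2 ** len(a) - 1))
                for a, v in m.items() if v > 0)

# Illustrative mass functions over the frame {A, B}
m1 = {frozenset({'A'}): 0.6, frozenset({'A', 'B'}): 0.4}
m2 = {frozenset({'A'}): 0.5, frozenset({'B'}): 0.5}
fused = dempster_combine(m1, m2)  # -> {A}: 5/7, {B}: 2/7
```

Here 30% of the product mass is conflicting (falls on the empty set) and is renormalized away; highly conflicting inputs push that fraction toward 1, which is exactly where the classical rule becomes counterintuitive and divergence-based reweighting helps.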
For reservoirs with complex non-Gaussian geological characteristics, such as carbonate reservoirs or reservoirs with sedimentary facies distribution, it is difficult to implement history matching directly, especially for ensemble-based data assimilation methods. In this paper, we propose a multi-source information fused generative adversarial network (MSIGAN) model for parameterizing complex geologies. In MSIGAN, various kinds of information, such as facies distribution, microseismic data, and inter-well connectivity, can be integrated to learn the geological features. Two major generative models in deep learning, the variational autoencoder (VAE) and the generative adversarial network (GAN), are combined in our model. The proposed MSIGAN model is then integrated into the ensemble smoother with multiple data assimilation (ESMDA) method to conduct history matching. We tested the proposed method on two reservoir models with fluvial facies. The experimental results show that the proposed MSIGAN model can effectively learn complex geological features, which improves the accuracy of history matching.
Information spreading has been investigated for many years, but the mechanism by which information explosively catches on overnight is still under debate. This explosive spreading phenomenon was usually considered to be driven separately by social reinforcement or higher-order interactions. However, due to the limitations of empirical data and theoretical analysis, how the higher-order network structure affects explosive information spreading under social reinforcement has not been fully explored. In this work, we propose an information-spreading model that considers social reinforcement in real and synthetic higher-order networks, describable as hypergraphs. Depending on the average group size (hyperedge cardinality) and node membership (hyperdegree), we observe two different spreading behaviors: (i) the spreading process is not sensitive to social reinforcement, leaving the information localized in a small fraction of nodes; (ii) strong social reinforcement promotes the large-scale spread of information and induces an explosive transition. Moreover, a large average group size and membership are beneficial to the appearance of the explosive transition. Further, we show that heterogeneity in the node membership and group size distributions benefits information spreading. Finally, we extend the group-based approximate master equations to verify the simulation results. Our findings may help in understanding rapid information-spreading phenomena in modern society.
The dissemination of information across various locations is a ubiquitous occurrence; however, prevalent methodologies for multi-source identification frequently overlook the fact that sources may initiate dissemination at distinct initial moments. Although there are many research results on multi-source identification, the challenge of locating sources with varying initiation times using a limited subset of observational nodes remains unresolved. In this study, we provide the backward spread tree theorem and the source centrality theorem, and develop a backward spread centrality algorithm to identify all the information sources that trigger the spread at different start times. The proposed algorithm does not require prior knowledge of the number of sources, yet it can estimate both the initial spread moment and the spread duration. The core concept of this algorithm is to infer suspected sources through the source centrality theorem and then to locate the true sources among the suspects with linear programming. Extensive experiments on synthetic and real network simulations corroborate the superiority of our method in terms of both efficacy and efficiency. Furthermore, we find that our method remains robust irrespective of the number of sources and the average degree of the network. Compared with classical and state-of-the-art source identification methods, our method generally improves the AUROC value by 0.1 to 0.2.
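AUROC, the metric this abstract reports improvements on, reduces to the probability that a randomly chosen positive (a true source) is scored above a randomly chosen negative, with ties counting one half. A minimal sketch with hypothetical scores:

```python
def auroc(scores, labels):
    """AUROC as the probability that a random positive outranks a
    random negative (ties count 0.5). labels: 1 = source, 0 = not."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical centrality scores for four nodes, two of which are sources
scores = [0.9, 0.8, 0.3, 0.2]
labels = [1, 0, 1, 0]
```

With these toy numbers one positive/negative pair out of four is mis-ordered, so the AUROC is 0.75; an improvement of 0.1 to 0.2 on this scale is substantial.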
This paper addresses the challenge of accurately and timely determining the position of a train, with specific consideration given to the integration of the global navigation satellite system (GNSS) and the inertial navigation system (INS). To overcome the growing INS errors during GNSS signal interruptions, as well as the uncertainty associated with process and measurement noise, a deep learning-based method for train positioning is proposed. This method combines convolutional neural networks (CNN), long short-term memory (LSTM), and the invariant extended Kalman filter (IEKF) to enhance the perception of train positions. It effectively handles GNSS signal interruptions and mitigates the impact of noise. Experimental evaluation and comparisons with existing approaches illustrate the effectiveness and robustness of the proposed method.
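The invariant EKF itself is beyond a short sketch, but the predict/update cycle it builds on can be illustrated with a scalar Kalman filter. The noise variances and measurement sequence below are illustrative assumptions, not values from the paper:

```python
def kalman_step(x, p, z, q=0.01, r=0.5):
    """One predict/update cycle of a scalar Kalman filter.
    x, p: prior state estimate and its variance; z: new measurement;
    q, r: process and measurement noise variances (illustrative)."""
    # predict (trivial constant-position model for the sketch)
    p = p + q
    # update: blend prediction and measurement by their uncertainties
    k = p / (p + r)          # Kalman gain
    x = x + k * (z - x)
    p = (1 - k) * p
    return x, p

# Start with an uncertain estimate and fold in noisy position fixes
x, p = 0.0, 1.0
for z in [1.0, 1.2, 0.9, 1.1]:
    x, p = kalman_step(x, p, z)
```

After four measurements the estimate settles near their consensus and the variance shrinks; during a GNSS outage the update step is simply skipped, so the variance (and hence the INS drift error) grows until fixes resume, which is the failure mode the learned CNN-LSTM correction targets.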
To promote the development of the Internet of Things (IoT), there has been an increase in the coverage of the customer electric information acquisition system (CEIAS). The traditional fault location method for the distribution network considers only the information reported by the feeder terminal unit (FTU), and its fault tolerance is low when information is omitted or misreported. Therefore, this study considers the influence of distributed generations (DGs) on the distribution network, takes the CEIAS as a redundant information source, and solves the model with a binary particle swarm optimization (BPSO) algorithm. The improved Dempster-Shafer evidence theory (D-S evidence theory) is used for evidence fusion to achieve fault section location for the distribution network. An example verifies that the proposed method can locate single or multiple faults with a higher fault tolerance.
For milling tool life prediction and health management, accurate extraction and dimensionality reduction of tool wear features are the key to reducing prediction errors. In this paper, we adopt multi-source information fusion technology to extract and fuse the features of the cutting vibration signal, cutting force signal, and acoustic emission signal in the time domain, frequency domain, and time-frequency domain, and reduce the dimensionality of the sample features with the Pearson correlation coefficient to construct a sample data set. We then propose a tool life prediction model based on a CNN-SVM optimized by a genetic algorithm (GA), which uses a convolutional neural network (CNN) as the feature learner and a support vector machine (SVM) as the trainer for regression prediction. The results show that the improved model can effectively predict tool life with better generalization ability, faster network fitting, and 99.85% prediction accuracy. Compared with the BP, CNN, SVM, and CNN-SVM models, the coefficient of determination R2 improved by 4.88%, 2.96%, 2.53%, and 1.34%, respectively.
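The Pearson-correlation step used above to prune redundant wear features is straightforward; a minimal sketch, with feature vectors invented for illustration:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length feature vectors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Two hypothetical wear features sampled over four cuts: the second is
# a rescaling of the first, so |r| = 1 and one of the pair can be dropped.
rms_vibration = [1.0, 2.0, 3.0, 4.0]
peak_force    = [2.0, 4.0, 6.0, 8.0]
```

In the dimensionality-reduction pass, any feature pair whose |r| exceeds a chosen threshold (say 0.95) is collapsed to a single representative before the CNN-SVM sees the data.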
Aim: To present a quantitative method for structural complexity analysis and evaluation of information systems. Methods: Based on Petri net modeling and analysis techniques, and with the aid of mathematical tools in general net theory (GNT), a quantitative method for structure description and analysis of information systems was introduced. Results: The structural complexity index and two related factors, the element complexity factor and the connection complexity factor, were defined, and the relations between them and the parameters of the Petri net-based model of the system were derived. An application example is presented. Conclusion: The proposed method provides a theoretical basis for quantitative analysis and evaluation of structural complexity and can be applied in the general planning and design processes of information systems.
In this paper, we investigate the regularity of the spreading of information and public opinion towards two competing products in complex networks. By building a mathematical model and simulating its evolution process, we find a statistical regularity in the support rates of the two products at the steady stage. The research shows that the strength of public opinion spreading is proportional to the final support rate of a product.
This paper presents a novel framework for understanding time as an emergent phenomenon arising from quantum information dynamics. We propose that the flow of time and its directional arrow are intrinsically linked to the growth of quantum complexity and the evolution of entanglement entropy in physical systems. By integrating principles from quantum mechanics, information theory, and holography, we develop a comprehensive theory that explains how time can emerge from timeless quantum processes. Our approach unifies concepts from quantum mechanics, general relativity, and thermodynamics, providing new perspectives on longstanding puzzles such as the black hole information paradox and the arrow of time. We derive modified Friedmann equations that incorporate quantum information measures, offering novel insights into cosmic evolution and the nature of dark energy. The paper presents a series of experimental proposals to test key aspects of this theory, ranging from quantum simulations to cosmological observations. Our framework suggests a deeply information-theoretic view of the universe, challenging our understanding of the nature of reality and opening new avenues for technological applications in quantum computing and sensing. This work contributes to the ongoing quest for a unified theory of quantum gravity and information, potentially with far-reaching implications for our understanding of space, time, and the fundamental structure of the cosmos.
The robustness of complex networks has been studied for decades, with a particular focus on network attack. Research on network repair, on the other hand, has begun only recently, given its even higher complexity and the absence of an effective evaluation metric. A recently proposed network repair strategy is self-healing, which aims to repair networks for larger components at a low cost using only local information. In this paper, we discuss the effectiveness and efficiency of self-healing, which frames network repair as a multi-objective optimization problem and makes its optimality difficult to measure. This leads us to a new network repair evaluation metric. Since the time complexity of its computation is very high, we devise a greedy ranking strategy. Evaluations on both real-world and random networks show the effectiveness of our new metric and repair strategy. Our study contributes to optimal network repair algorithms and provides a gold standard for future studies on network repair.
Objective: To explore the characteristics of acupoints in the treatment of stroke using complex network and pointwise mutual information methods. Methods: The complex network and pointwise mutual information system developed by the China Academy of Chinese Medical Sciences was used to analyze the specific acupoints, compatibility, frequency, etc. Results: 174 acu-moxibustion prescriptions were collected, involving 163 acupoints. Among them, eighteen acupoints were used more than 30 times, such as Hegu (LI4), Zusanli (ST36), Quchi (LI11), and Fengshi (GB31). Thirty-one acupoint combinations were used more than 15 times, such as Quchi (LI11) with Zusanli (ST36), Quchi (LI11) with Jianyu (LI15), and Hegu (LI4) with Quchi (LI11). The most common treatment method for stroke is to dredge the Yangming and Shaoyang meridians by needling multiple acupoints located on these two meridians. The commonly used acupoints are mainly distributed in the limbs, head, and face. The most commonly used type of specific acupoint is the intersection acupoint, and specific acupoints are used more frequently than non-specific ones. Conclusion: Dredging the collaterals, dispelling wind-evil, and restoring consciousness are the main principles for the treatment of stroke. Specific acupoints on the head, face, and limbs may be the main targeted acupoints. Combining Yang meridians with other meridians is needed to improve the effects. The Yangming and Shaoyang meridians are the most used meridians, and Hegu (LI4), Quchi (LI11), and Zusanli (ST36) are the most used acupoints.
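Pointwise mutual information over acupoint co-occurrence, one of the two methods this abstract names, can be sketched directly; the tiny prescription list below is fabricated for illustration, not the study's data:

```python
import math
from collections import Counter

def pmi(prescriptions, a, b):
    """Pointwise mutual information between two acupoints, estimated from
    their co-occurrence across prescriptions:
    PMI = ln( p(a, b) / (p(a) * p(b)) ). Positive => co-used above chance."""
    n = len(prescriptions)
    cnt = Counter()
    pair = 0
    for pres in prescriptions:
        s = set(pres)
        cnt.update(s & {a, b})      # count each point's occurrences
        if a in s and b in s:
            pair += 1               # count joint occurrences
    return math.log((pair / n) / ((cnt[a] / n) * (cnt[b] / n)))

# Fabricated prescriptions using the acupoint codes from the abstract
prescriptions = [
    ['LI4', 'ST36', 'LI11'],
    ['LI4', 'LI11'],
    ['ST36', 'GB31'],
    ['LI4', 'ST36'],
]
```

Here pmi(prescriptions, 'LI4', 'LI11') is positive, flagging that pair as co-prescribed more often than their individual frequencies would predict, which is how high-PMI combinations like Quchi with Zusanli surface from the corpus.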
Since the Manufacturing Execution System (MES) is a bridge linking the upper planning system of the enterprise and the control system of the shop floor, various kinds of information with different characteristics flow through the system. The information environment of MES and its effect on MES scheduling are analyzed. A methodological proposal is given to address the problem of agile scheduling in a complex information environment, based on which a scheduling approach built on a microeconomic market and game-theoretic model is presented. The future development of this method is also discussed.
The survivability of computer systems should be guaranteed in order to improve their operational efficiency, especially that of their critical functions. This paper proposes a decentralized mechanism based on Software-Defined Architecture (SDA). The concepts of critical functions and critical states are defined, and the critical functional parameters of the target system are collected and analyzed. Experiments based on the analysis results are performed to reconfigure the implementations of the whole system. A formal model is presented for analyzing and improving the survivability of the system, and the problem investigated in this paper is reduced to an optimization problem of increasing the system survival time.
The spreading of the quantum-mechanical probability density of a three-dimensional system is quantitatively determined by means of the local information-theoretic quantities of Shannon information and information energy in both position and momentum spaces. The complexity measure, which is equivalent to the Cramer–Rao uncertainty product, is determined. We obtain the information content stored, the concentration of the quantum system, and the complexity measure numerically for n = 0, 1, 2, and 3, respectively.
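For a discretized probability density, the two quantities the abstract uses, Shannon information and (Onicescu) information energy, are direct sums; the distributions below are illustrative:

```python
import math

def shannon_entropy(p):
    """Shannon information S = -sum p_i ln p_i: larger = more spread out."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def information_energy(p):
    """Onicescu information energy E = sum p_i^2: larger = more concentrated."""
    return sum(pi * pi for pi in p)

# A uniform (maximally spread) and a peaked (concentrated) 4-bin density
uniform = [0.25] * 4
peaked = [0.85, 0.05, 0.05, 0.05]
```

The two measures move in opposite directions: the uniform density maximizes the entropy (ln 4) and minimizes the energy (0.25), while the peaked density does the reverse, which is why their combination tracks the spreading versus concentration of the wavefunction.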
With the skyrocketing development of technologies, there are many issues in the information security quantitative evaluation (ISQE) of complex heterogeneous information systems (CHISs). The development of CHISs urgently calls for an ISQE model based on security-critical components to improve the efficiency of system security evaluation. In this paper, we summarize the implications of critical components in different fields and propose a recognition algorithm for security-critical components based on a threat attack tree to support the ISQE process. The evaluation model establishes a framework for the ISQE of CHISs that is updated iteratively. First, with the support of asset identification and topology data, we rank the security importance of each asset based on the threat attack tree and obtain the security-critical components (set) of the CHIS. Then, we build the evaluation indicator tree of the evaluation target and propose an ISQE algorithm based on the coefficient of variation to calculate the security quality value of the CHIS. Moreover, we present a novel indicator measurement uncertainty aiming to better supervise the performance of the proposed model. Simulation results show the advantages of the proposed algorithm in the evaluation of CHISs.
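Coefficient-of-variation weighting, the objective weighting scheme the ISQE algorithm builds on, is simple to sketch: indicators that vary more across the evaluated assets carry more discriminating information and so receive more weight. The indicator columns below are invented for illustration:

```python
import math

def cv_weights(indicator_columns):
    """Objective indicator weights from the coefficient of variation
    CV = std / mean, normalized so the weights sum to 1."""
    cvs = []
    for col in indicator_columns:
        n = len(col)
        mean = sum(col) / n
        std = math.sqrt(sum((x - mean) ** 2 for x in col) / n)
        cvs.append(std / mean)
    total = sum(cvs)
    return [c / total for c in cvs]

# Two hypothetical indicators scored over three assets: the first is
# constant (no discriminating power), the second varies strongly.
cols = [[0.8, 0.8, 0.8], [0.2, 0.5, 0.8]]
w = cv_weights(cols)
```

With these toy scores essentially all of the weight goes to the second indicator; the security quality value is then the weight-averaged indicator score per component.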
Identifying important influencing factors is an important issue in railway accident analysis. In this paper, employing the maximal information coefficient (MIC), a good measure of dependence for two-variable relationships that can capture a wide range of associations, a complex network model for railway accident analysis is designed in which nodes denote factors of railway accidents and edges are generated between two factors whose MIC values are larger than or equal to the dependence criterion. The variation of the network structure is studied: as the dependence criterion increases, the network approaches a scale-free network. Moreover, employing the proposed network, important influencing factors are identified. We find that the annual track density-gross tonnage factor is an important factor, being a cut vertex when the dependence criterion equals 0.3. The network also reveals that railway development is unbalanced across different states, which is consistent with the facts.
Supply chain management is an essential part of an organisation's sustainability programme. Understanding the environmental, social, and economic influence and feasibility of one's suppliers and purchasers is becoming progressively familiar as all industries move towards a massive sustainable potential. To handle such developments in supply chain management, fuzzy settings and their generalisations play an important role. Keeping this role in mind, the aim of this study is to analyse the role of complex q-rung orthopair normal fuzzy (CQRONF) information in supply chain management. The major contribution of this theory is to analyse the notions of confidence CQRONF weighted averaging, confidence CQRONF ordered weighted averaging, confidence CQRONF hybrid averaging, confidence CQRONF weighted geometric, confidence CQRONF ordered weighted geometric, and confidence CQRONF hybrid geometric operators, and to examine their various properties and results. Furthermore, with the help of the CRITIC and VIKOR models, we develop a novel CQRONF-CRITIC-VIKOR model and check the sensitivity analysis of the initiated method. Moreover, using the diagnosed operators, we construct a multi-attribute decision-making tool for finding a beneficial sustainable supplier to handle complex dilemmas. Finally, the initiated operators' efficiency is proved by comparative analysis.
Fault diagnosis (FD) is essential for ensuring the reliable operation of chillers and preventing energy waste. Feature selection (FS) is a critical prerequisite for effective FD. However, current FS methods have two major gaps. First, most approaches rely on single-source ranking information (SSRI) to evaluate features individually, which results in non-robust outcomes across different models and datasets due to the one-sided nature of SSRI. Second, thermodynamic mechanism features are often overlooked, leading to incomplete initial feature libraries and making it challenging to select optimal features and achieve better diagnostic performance. To address these issues, a robust ensemble FS method based on multi-source ranking information (MSRI) is proposed. By employing an efficient strategy that maximizes relevance while keeping redundancy in check, the MSRI method fully leverages Mutual Information, Information Gain, Gain Ratio, the Gini index, Chi-squared, and Relief-F from both qualitative and quantitative perspectives. Additionally, comprehensive consideration of thermodynamic mechanism features ensures a complete initial feature library. From a methodological standpoint, a general framework for constructing the MSRI-based FS method is provided. The proposed method is applied to chiller FD and tested across ten widely used machine learning models. Thirteen optimized features are selected from the original set of forty-two, achieving an average diagnostic accuracy of 98.40% and an average F-measure above 94.94%, demonstrating the effectiveness and generalizability of the MSRI method. Compared to the SSRI approach, the MSRI method shows superior robustness, with the standard deviation of diagnostic accuracy reduced by 0.03 to 0.07 and an improvement in diagnostic accuracy ranging from 2.53% to 6.12%. Moreover, the MSRI method reduces computation time by 98.62% compared to wrapper methods, without sacrificing accuracy.
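One of the single-source rankers the MSRI ensemble draws on, Information Gain, is easy to sketch for a categorical feature; the toy chiller labels and feature values below are fabricated for illustration:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG(Y; X) = H(Y) - sum_x p(x) * H(Y | X = x) for a categorical feature."""
    n = len(labels)
    cond = 0.0
    for value in set(feature):
        subset = [y for x, y in zip(feature, labels) if x == value]
        cond += (len(subset) / n) * entropy(subset)
    return entropy(labels) - cond

# Fabricated 4-sample example: one feature separates the classes
# perfectly, the other carries no class information.
labels  = ['fault', 'fault', 'normal', 'normal']
f_good  = ['hi', 'hi', 'lo', 'lo']
f_noise = ['hi', 'lo', 'hi', 'lo']
```

The informative feature scores the full 1 bit while the noise feature scores 0; the MSRI idea is that such a score is only one of six rankings, which are then combined so no single one-sided criterion dominates the selected feature set.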
Complex evidence theory is a generalization of Dempster-Shafer evidence theory with the ability to express uncertain information. One of its key issues is the uncertainty measure of the Complex Basic Belief Assignment (CBBA); however, research on uncertainty measures in complex evidence theory remains an open issue. Therefore, in this paper we first propose the Fractal-based Complex Belief (FCB) entropy, a generalization of Fractal-based Belief (FB) entropy that is well suited to measuring the uncertainty of a CBBA. Second, on the basis of FCB entropy, we propose Fractal-based Supremum Complex Belief (FSCB) entropy and Fractal-based Infimum Complex Belief (FICB) entropy, with FSCB entropy as the upper bound and FICB entropy as the lower bound; together they constitute the proposed FCB entropy. Furthermore, we analyze its properties and physical interpretation and give numerical examples to demonstrate the rationality of the proposed method. Finally, a practical information fusion application shows that the proposed FCB entropy can reasonably measure the uncertainty of a CBBA, making it a sound approach to uncertainty measurement in complex evidence theory.
Funding: supported by the National Natural Science Foundation of China (61903305, 62073267) and the Fundamental Research Funds for the Central Universities (HXGJXM202214).
Funding: supported by the National Natural Science Foundation of China under Grants 51722406, 52074340, and 51874335; the Shandong Provincial Natural Science Foundation under Grant JQ201808; the Fundamental Research Funds for the Central Universities under Grant 18CX02097A; the Major Scientific and Technological Projects of CNPC under Grant ZD2019-183-008; the Science and Technology Support Plan for Youth Innovation of Universities in Shandong Province under Grant 2019KJH002; the National Research Council of Science and Technology Major Project of China under Grant 2016ZX05025001-006; the 111 Project under Grant B08028; and the Sinopec Science and Technology Project under Grant P20050-1.
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 12305043 and 12165016), the Natural Science Foundation of Jiangsu Province (Grant No. BK20220511), the Project of Undergraduate Scientific Research (Grant No. 22A684), and the Jiangsu Specially-Appointed Professor Program.
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 62103375, 62006106, 61877055, and 62171413), the Philosophy and Social Science Planning Project of Zhejiang Province, China (Grant No. 22NDJC009Z), the Education Ministry Humanities and Social Science Foundation of China (Grant No. 19YJCZH056), and the Natural Science Foundation of Zhejiang Province, China (Grant Nos. LY23F030003, LY22F030006, and LQ21F020005).
Abstract: The dissemination of information across various locations is a ubiquitous occurrence; however, prevalent methodologies for multi-source identification frequently overlook the fact that sources may initiate dissemination at distinct initial moments. Although there are many research results on multi-source identification, the challenge of locating sources with varying initiation times using a limited subset of observational nodes remains unresolved. In this study, we provide the backward spread tree theorem and the source centrality theorem, and develop a backward spread centrality algorithm to identify all the information sources that trigger the spread at different start times. The proposed algorithm does not require prior knowledge of the number of sources, yet it can estimate both the initial spread moment and the spread duration. The core concept of this algorithm is to infer suspected sources through the source centrality theorem and to locate the true sources among the suspected ones with linear programming. Extensive experiments on synthetic and real network simulations corroborate the superiority of our method in terms of both efficacy and efficiency. Furthermore, we find that our method remains robust irrespective of the number of sources and the average degree of the network. Compared with classical and state-of-the-art source identification methods, our method generally improves the AUROC value by 0.1 to 0.2.
Funding: Supported by the National Natural Science Foundation of China (Nos. 61925302 and 62273027) and the Beijing Natural Science Foundation (L211021).
Abstract: This paper addresses the challenge of accurately and timely determining the position of a train, with specific consideration given to the integration of the global navigation satellite system (GNSS) and the inertial navigation system (INS). To overcome the growing INS errors during interruptions of the GNSS signal, as well as the uncertainty associated with process and measurement noise, a deep learning-based method for train positioning is proposed. The method combines convolutional neural networks (CNN), long short-term memory (LSTM), and the invariant extended Kalman filter (IEKF) to enhance the perception of train positions. It effectively handles GNSS signal interruptions and mitigates the impact of noise. Experimental evaluations and comparisons with existing approaches illustrate the effectiveness and robustness of the proposed method.
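The filtering backbone of such GNSS/INS integration can be illustrated with a minimal one-dimensional Kalman filter. This sketch omits the CNN/LSTM components and uses an ordinary (not invariant) filter; `None` measurements stand in for GNSS outages, during which only the INS-driven prediction runs and the error covariance grows, as the abstract notes.

```python
def kalman_1d(z_gnss, u_ins, q=0.01, r=1.0, x0=0.0, p0=1.0):
    """Minimal 1-D Kalman filter for GNSS/INS fusion: the INS
    displacement increment u drives the prediction step, and the
    GNSS position z (when available) corrects it. q and r are the
    assumed process and measurement noise variances."""
    x, p, out = x0, p0, []
    for z, u in zip(z_gnss, u_ins):
        x, p = x + u, p + q          # predict with the INS increment
        if z is not None:            # GNSS update when a fix exists
            k = p / (p + r)          # Kalman gain
            x, p = x + k * (z - x), (1 - k) * p
        out.append(x)
    return out
```

With noisy fixes [0.9, None, 3.1, 4.0] and unit INS increments, the estimate coasts through the outage and re-converges once GNSS returns.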
Funding: Supported by the Science and Technology Project of State Grid Shandong Electric Power Company, "Research on the Data-Driven Method for Energy Internet" (Project No. 2018A-100).
Abstract: In order to promote the development of the Internet of Things (IoT), the coverage of the customer electric information acquisition system (CEIAS) has been increasing. The traditional fault location method for the distribution network considers only the information reported by the Feeder Terminal Unit (FTU), and its fault tolerance is low when information is omitted or misreported. Therefore, this study considers the influence of distributed generations (DGs) on the distribution network, takes the CEIAS as a redundant information source, and solves the model with a binary particle swarm optimization algorithm (BPSO). The improved Dempster-Shafer evidence theory (D-S evidence theory) is used for evidence fusion to locate faulty sections of the distribution network. An example verifies that the proposed method can locate single or multiple faults with a higher fault tolerance.
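The classical Dempster's rule of combination that underlies such evidence fusion (before any improvement for conflict handling) can be sketched as follows; focal elements are represented as frozensets over the frame of discernment.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.
    Masses are multiplied over all pairs of focal elements; mass
    landing on empty intersections is the conflict K, which is
    normalized away by dividing the rest by (1 - K)."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {k: v / (1 - conflict) for k, v in combined.items()}
```

The normalization by (1 - K) is precisely what produces counterintuitive results under high conflict, which motivates the improved variants discussed in this listing.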
Funding: Supported by the Basic Scientific Research Youth Program of the Education Department of Liaoning Province (No. LJKQZ2021185) and the Yingkou Enterprise and Doctor Innovation Program (QB-2021-05).
Abstract: For milling tool life prediction and health management, accurate extraction and dimensionality reduction of tool wear features are the key to reducing prediction errors. In this paper, we adopt multi-source information fusion technology to extract and fuse features of the cutting vibration signal, cutting force signal, and acoustic emission signal in the time domain, frequency domain, and time-frequency domain, and we reduce the dimensionality of the sample features via the Pearson correlation coefficient to construct a sample data set. We then propose a tool life prediction model based on CNN-SVM optimized by a genetic algorithm (GA), which uses a CNN convolutional neural network as the feature learner and an SVM support vector machine as the trainer for regression prediction. The results show that the improved model can effectively predict tool life with better generalization ability, faster network fitting, and 99.85% prediction accuracy. Compared with the BP, CNN, SVM, and CNN-SVM models, the coefficient of determination (R²) improved by 4.88%, 2.96%, 2.53%, and 1.34%, respectively.
Abstract: Aim: To present a quantitative method for structural complexity analysis and evaluation of information systems. Methods: Based on Petri net modeling and analysis techniques and with the aid of mathematical tools in general net theory (GNT), a quantitative method for structure description and analysis of information systems was introduced. Results: The structural complexity index and two related factors, i.e., the element complexity factor and the connection complexity factor, were defined, and the relations between them and the parameters of the Petri net-based model of the system were derived. An application example was presented. Conclusion: The proposed method provides a theoretical basis for quantitative analysis and evaluation of structural complexity and can be applied in the general planning and design processes of information systems.
Abstract: In this paper, we investigate the regularity of the spreading of information and public opinions about two competing products in complex networks. By building a mathematical model and simulating its evolution process, we have found the statistical regularity of the support rates for the two products at the steady stage. The research shows that the strength of public-opinion spreading is proportional to the final support rate of a product.
Abstract: This paper presents a novel framework for understanding time as an emergent phenomenon arising from quantum information dynamics. We propose that the flow of time and its directional arrow are intrinsically linked to the growth of quantum complexity and the evolution of entanglement entropy in physical systems. By integrating principles from quantum mechanics, information theory, and holography, we develop a comprehensive theory that explains how time can emerge from timeless quantum processes. Our approach unifies concepts from quantum mechanics, general relativity, and thermodynamics, providing new perspectives on longstanding puzzles such as the black hole information paradox and the arrow of time. We derive modified Friedmann equations that incorporate quantum information measures, offering novel insights into cosmic evolution and the nature of dark energy. The paper presents a series of experimental proposals to test key aspects of this theory, ranging from quantum simulations to cosmological observations. Our framework suggests a deeply information-theoretic view of the universe, challenging our understanding of the nature of reality and opening new avenues for technological applications in quantum computing and sensing. This work contributes to the ongoing quest for a unified theory of quantum gravity and information, potentially with far-reaching implications for our understanding of space, time, and the fundamental structure of the cosmos.
Funding: Supported by the Research Fund from the National Natural Science Foundation of China (Nos. 61521091, 61650110516, and 61601013).
Abstract: The robustness of complex networks has been studied for decades, with a particular focus on network attack. Research on network repair, on the other hand, has begun only recently, given the even higher complexity and the absence of an effective evaluation metric. A recently proposed network repair strategy is self-healing, which aims to repair networks for larger components at a low cost using only local information. In this paper, we discuss the effectiveness and efficiency of self-healing, which casts network repair as a multi-objective optimization problem whose optimality is difficult to measure. This leads us to a new network repair evaluation metric. Since the time complexity of its computation is very high, we devise a greedy ranking strategy. Evaluations on both real-world and random networks show the effectiveness of our new metric and repair strategy. Our study contributes to optimal network repair algorithms and provides a gold standard for future studies on network repair.
Abstract: Objective: To explore the characteristics of acupoints in the treatment of stroke using complex network and pointwise mutual information methods. Methods: The complex network and pointwise mutual information system developed by the China Academy of Chinese Medical Sciences was used to analyze the specific acupoints, compatibility, frequency, etc. Results: 174 acupuncture-moxibustion prescriptions were collected, covering 163 acupoints. Among them, eighteen acupoints were used more than 30 times, such as Hegu (LI4), Zusanli (ST36), Quchi (LI11), and Fengshi (GB31). Thirty-one acupoint combinations were used more than 15 times, such as Quchi (LI11) with Zusanli (ST36), Quchi (LI11) with Jianyu (LI15), and Hegu (LI4) with Quchi (LI11). The most commonly used treatment method for stroke is to dredge the Yangming and Shaoyang meridians by needling multiple acupoints located on these two meridians. The commonly used acupoints are mainly distributed in the limbs, head, and face. The most commonly used type of specific acupoint is the intersection acupoint, and specific acupoints are used more frequently than non-specific acupoints. Conclusion: Dredging the collaterals, dispelling wind-evil, and restoring consciousness are the main principles for the treatment of stroke. Specific acupoints on the head, face, and limbs may be the main targeted acupoints. Combining Yang meridians with other meridians is needed to improve the effects. The Yangming and Shaoyang meridians are the most used meridians, and Hegu (LI4), Quchi (LI11), and Zusanli (ST36) are the most used acupoints.
Funding: Supported by the National Natural Science Foundation of China (50105006) and the National Hi-tech R&D Program of China (2001AA412140 and 2003AA411120).
Abstract: Since the Manufacturing Execution System (MES) is a bridge linking the upper planning system of the enterprise and the control system of the shop floor, various kinds of information with different characteristics flow through the system. The information environment of MES and its effect on MES scheduling are analyzed. A methodological proposal is given to address the problem of agile scheduling in a complex information environment, based on which a microeconomic market and game-theoretic model-based scheduling approach is presented. The future development of this method is also discussed.
Abstract: The survivability of computer systems should be guaranteed in order to improve their operating efficiency, especially for their critical functions. This paper proposes a decentralized mechanism based on Software-Defined Architecture (SDA). The concepts of critical functions and critical states are defined, and the critical functional parameters of the target system are collected and analyzed. Experiments based on the analysis results are performed to reconfigure the implementation of the whole system. A formal model is presented for analyzing and improving the survivability of the system, and the problem investigated in this paper is reduced to an optimization problem of increasing the system survival time.
Abstract: The spreading of the quantum-mechanical probability density of a three-dimensional system is quantitatively determined by means of the local information-theoretic quantities of Shannon information and information energy in both position and momentum spaces. The complexity measure, which is equivalent to the Cramer–Rao uncertainty product, is determined. We have obtained the stored information content, the concentration of the quantum system, and the complexity measure numerically for n = 0, 1, 2, and 3, respectively.
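The two information measures involved can be sketched from their standard definitions. The paper works with continuous position- and momentum-space densities; the discrete analogues below are for illustration only.

```python
import math

def shannon_entropy(p):
    """Discrete Shannon information S = -sum p_i ln p_i (in nats);
    it grows as the probability distribution spreads out."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def information_energy(p):
    """Onicescu information energy E = sum p_i**2; it grows as the
    distribution concentrates, i.e. opposite in trend to the entropy."""
    return sum(pi * pi for pi in p)
```

For a uniform distribution over four states, S = ln 4 and E = 0.25; for a fully concentrated one, S = 0 and E = 1, which is the spreading-versus-concentration trade-off the abstract quantifies.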
Funding: Supported in part by the National Key R&D Program of China under Grants 2019YFB2102400 and 2016YFF0204001, and in part by the BUPT Excellent Ph.D. Students Foundation under Grant CX2019117.
Abstract: With the skyrocketing development of technologies, many issues arise in the information security quantitative evaluation (ISQE) of complex heterogeneous information systems (CHISs). The development of CHISs urgently calls for an ISQE model based on security-critical components to improve the efficiency of system security evaluation. In this paper, we summarize the implications of critical components in different fields and propose a recognition algorithm for security-critical components based on a threat attack tree to support the ISQE process. The evaluation model establishes a framework for the ISQE of CHISs that is updated iteratively. First, with the support of asset identification and topology data, we sort the security importance of each asset based on the threat attack tree and obtain the security-critical components (set) of the CHIS. Then, we build the evaluation indicator tree of the evaluation target and propose an ISQE algorithm based on the coefficient of variation to calculate the security quality value of the CHIS. Moreover, we present a novel indicator, measurement uncertainty, to better supervise the performance of the proposed model. Simulation results show the advantages of the proposed algorithm in the evaluation of CHISs.
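The coefficient-of-variation weighting step can be sketched as follows; the surrounding indicator-tree construction and score aggregation are omitted, and indicator values are assumed positive.

```python
from statistics import mean, pstdev

def cv_weights(indicator_columns):
    """Objective indicator weights via the coefficient of variation:
    w_j = (sigma_j / mu_j) / sum_k (sigma_k / mu_k). Indicators that
    vary more across the evaluated assets are assumed to discriminate
    more and therefore receive larger weight."""
    cvs = [pstdev(col) / mean(col) for col in indicator_columns]
    total = sum(cvs)
    return [c / total for c in cvs]
```

An indicator that is constant across all assets gets weight zero, since it cannot distinguish between them.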
Funding: Supported by the Fundamental Research Funds for the Central Universities under Grant No. 2016YJS087, the National Natural Science Foundation of China under Grant No. U1434209, and the Research Foundation of the State Key Laboratory of Railway Traffic Control and Safety, Beijing Jiaotong University, under Grant No. RCS2016ZJ001.
Abstract: Identifying important influencing factors is a key issue in railway accident analysis. In this paper, employing the maximal information coefficient (MIC), a good measure of dependence for two-variable relationships that can capture a wide range of associations, a complex network model for railway accident analysis is designed in which nodes denote factors of railway accidents and an edge is generated between two factors whose MIC value is larger than or equal to the dependence criterion. The variation of the network structure is studied: as the dependence criterion increases, the network becomes an approximately scale-free network. Moreover, employing the proposed network, important influencing factors are identified. We find that the annual track density-gross tonnage factor is an important one, being a cut vertex when the dependence criterion equals 0.3. From the network, it is also found that railway development is unbalanced across states, which is consistent with the facts.
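A simplified version of the edge-construction step can be sketched with plain empirical mutual information over discretized factor series. True MIC additionally maximizes a normalized MI over many grid partitions, so this is only an illustrative stand-in.

```python
import math
from collections import Counter

def mutual_information(x, y):
    """Empirical mutual information (bits) of two discrete series."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum(c / n * math.log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def factor_network(factors, criterion):
    """Edge between two accident factors iff their dependence score
    is at least the given criterion. `factors` maps factor name ->
    discretized observation series."""
    names = list(factors)
    return {(u, v) for i, u in enumerate(names) for v in names[i + 1:]
            if mutual_information(factors[u], factors[v]) >= criterion}
```

Raising `criterion` sparsifies the network, which is the knob the paper turns to study how the structure changes and which factors become cut vertices.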
Abstract: Supply chain management is an essential part of an organisation's sustainability programme. Understanding the environmental, social, and economic influence and feasibility of suppliers and purchasers is becoming progressively important as all industries move towards a massive sustainable potential. To handle such developments in supply chain management, fuzzy settings and their generalisations play an important role. Keeping this role in mind, the aim of this study is to analyse the role of complex q-rung orthopair normal fuzzy (CQRONF) information in supply chain management. The major contribution of this work is to introduce the notions of confidence CQRONF weighted averaging, confidence CQRONF ordered weighted averaging, confidence CQRONF hybrid averaging, confidence CQRONF weighted geometric, confidence CQRONF ordered weighted geometric, and confidence CQRONF hybrid geometric operators, and to examine their various properties and results. Furthermore, with the help of the CRITIC and VIKOR models, we develop the novel CQRONF-CRITIC-VIKOR model and carry out a sensitivity analysis of the initiated method. Moreover, with the help of the developed operators, we construct a multi-attribute decision-making tool for finding a beneficial sustainable supplier to handle complex dilemmas. Finally, the efficiency of the initiated operators is proved by comparative analysis.
Funding: Supported by the National Natural Science Foundation of China (No. 52478087), the China Postdoctoral Science Foundation (Nos. 2024M750799 and 2024T170238), the China Scholarship Council (No. 202308410494), the Zhongyuan Outstanding Youth Talent Program (2022), the Youth Scientist Project in Henan Province (No. 225200810087), the Program for Science & Technology Innovation Talents in Universities of Henan Province (No. 22HASTIT025), and the Program for Innovative Research Team (in Science and Technology) in University of Henan Province (No. 22IRTSTHN006).
Abstract: Fault diagnosis (FD) is essential for ensuring the reliable operation of chillers and preventing energy waste. Feature selection (FS) is a critical prerequisite for effective FD. However, current FS methods have two major gaps. First, most approaches rely on single-source ranking information (SSRI) to evaluate features individually, which yields non-robust outcomes across different models and datasets due to the one-sided nature of SSRI. Second, thermodynamic mechanism features are often overlooked, leading to incomplete initial feature libraries and making it challenging to select optimal features and achieve better diagnostic performance. To address these issues, a robust ensemble FS method based on multi-source ranking information (MSRI) is proposed. By employing an efficient strategy that maximizes relevance while maintaining proper redundancy, the MSRI method fully leverages mutual information, information gain, gain ratio, Gini index, chi-squared, and Relief-F from both qualitative and quantitative perspectives. Additionally, comprehensive consideration of thermodynamic mechanism features ensures a complete initial feature library. From a methodological standpoint, a general framework for constructing the MSRI-based FS method is provided. The proposed method is applied to chiller FD and tested across ten widely used machine learning models. Thirteen optimized features are selected from the original set of forty-two, achieving an average diagnostic accuracy of 98.40% and an average F-measure above 94.94%, demonstrating the effectiveness and generalizability of the MSRI method. Compared to the SSRI approach, the MSRI method shows superior robustness, with the standard deviation of diagnostic accuracy reduced by 0.03 to 0.07 and diagnostic accuracy improved by 2.53% to 6.12%. Moreover, the MSRI method reduces computation time by 98.62% compared to wrapper methods, without sacrificing accuracy.
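One simple way to fuse rankings from several FS criteria, in the spirit of (but not identical to) the MSRI ensemble, is Borda-style average-rank aggregation; the redundancy-control step the paper adds is omitted here.

```python
def ensemble_rank(score_tables, top_k):
    """Fuse rankings from several feature-scoring criteria (e.g.
    mutual information, information gain, gain ratio, Gini index,
    chi-squared, Relief-F) by averaging each feature's rank across
    the criteria and returning the best top_k features.
    `score_tables` maps criterion name -> {feature: score}, where a
    higher score means a better feature."""
    features = list(next(iter(score_tables.values())))
    avg_rank = {}
    for f in features:
        ranks = []
        for scores in score_tables.values():
            ordered = sorted(scores, key=scores.get, reverse=True)
            ranks.append(ordered.index(f))
        avg_rank[f] = sum(ranks) / len(ranks)
    return sorted(avg_rank, key=avg_rank.get)[:top_k]
```

Averaging over several criteria dampens the one-sidedness of any single ranking source, which is the robustness argument the abstract makes against SSRI.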
Funding: Supported by the National Natural Science Foundation of China (No. 62473067), the Chongqing Talents: Exceptional Young Talents Project, China (No. cstc2022ycjh-bgzxm0070), and the Chongqing Overseas Scholars Innovation Program, China (No. cx2022024).
Abstract: Complex evidence theory is a generalization of Dempster-Shafer evidence theory with the ability to express uncertain information. One of its key issues is the uncertainty measure of a Complex Basic Belief Assignment (CBBA); however, uncertainty measurement in complex evidence theory is still an open issue. Therefore, in this paper, we first propose the Fractal-based Complex Belief (FCB) entropy as a generalization of Fractal-based Belief (FB) entropy, which has advantages in measuring the uncertainty of a CBBA. Second, on the basis of FCB entropy, we propose the Fractal-based Supremum Complex Belief (FSCB) entropy and the Fractal-based Infimum Complex Belief (FICB) entropy, with FSCB entropy as the upper bound and FICB entropy as the lower bound; together they constitute the proposed FCB entropy. Furthermore, we analyze its properties and physical interpretation and give numerical examples to demonstrate the rationality of the proposed method. Finally, a practical information fusion application shows that the proposed FCB entropy can reasonably measure the uncertainty of a CBBA and can serve as a reasonable way to measure uncertainty in complex evidence theory.
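For orientation, the classical real-valued belief entropy on which the FB/FCB family builds is Deng entropy, also used elsewhere in this listing. The sketch below is for ordinary real-valued basic belief assignments; the complex-valued FCB entropy itself is not reproduced here.

```python
import math

def deng_entropy(m):
    """Deng entropy of a basic belief assignment:
    E_d(m) = -sum over focal elements A of m(A) * log2(m(A) / (2**|A| - 1)).
    Focal elements A are frozensets; the 2**|A| - 1 term makes larger
    focal elements contribute more uncertainty than in Shannon entropy."""
    return -sum(v * math.log2(v / (2 ** len(a) - 1))
                for a, v in m.items() if v > 0)
```

A BBA fully committed to a singleton has zero Deng entropy, while assigning all mass to a two-element set yields log2(3) bits, reflecting the extra non-specificity.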