Relationships between Unified Modeling Language (UML) diagrams are complex, and this complexity easily leads to inconsistencies between UML diagrams. This paper focuses on how to identify and check inconsistencies between UML diagrams. Thirteen consistency rules are given to identify inconsistencies between the six most frequently used types of UML diagrams in the domain of information systems analysis and design: Use Case Diagrams, Class Diagrams, Activity Diagrams, State Machine Diagrams, Sequence Diagrams, and Communication Diagrams. Four methods are given to check inconsistencies between UML diagrams: manual check, compulsory restriction, automatic maintenance, and dynamic check. These rules and methods help developers model information systems.
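One of the paper's rule types can be illustrated with a toy checker. The sketch below, with invented class and message names, encodes a plausible consistency rule of this kind (every message received in a Sequence Diagram must correspond to an operation of the receiving class in the Class Diagram); it is not one of the paper's thirteen rules verbatim.

```python
# Toy cross-diagram consistency check: sequence-diagram messages vs.
# class-diagram operations. All names here are illustrative.

def check_message_operation_rule(class_ops, sequence_messages):
    """Return the (receiver, message) pairs that violate the rule."""
    violations = []
    for receiver, message in sequence_messages:
        if message not in class_ops.get(receiver, set()):
            violations.append((receiver, message))
    return violations

class_ops = {"Order": {"cancel", "total"}, "Customer": {"notify"}}
messages = [("Order", "cancel"), ("Customer", "bill")]  # "bill" is undefined
print(check_message_operation_rule(class_ops, messages))  # [('Customer', 'bill')]
```

A checker like this supports the "automatic maintenance" and "dynamic check" styles of enforcement: it can run on every model edit and report violations immediately.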
New challenges, including how to share information on heterogeneous devices, appear in data-intensive pervasive computing environments. Data integration is a practical approach to these applications, and dealing with inconsistencies is one of its important problems. In this paper we motivate the problem of resolving data inconsistency for data integration in pervasive environments. We define data quality criteria and expense quality criteria for data sources to solve data inconsistency. In our solution, data sources that require high expense to obtain data from are first discarded using the expense quality criteria and a utility function. Since it is difficult to obtain the actual quality of data sources in a pervasive computing environment, we then introduce a fuzzy multi-attribute group decision making approach to select the appropriate data sources. The experimental results show that our solution is effective.
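The two-step selection described above can be sketched as follows. The utility threshold, attribute weights, and quality scores are invented, and a plain weighted sum stands in for the fuzzy multi-attribute group decision making step.

```python
# Two-step source selection: (1) drop sources whose expense-based utility is
# below a threshold; (2) rank survivors by a weighted quality score.
# All numbers and attribute names are illustrative assumptions.

def select_sources(sources, utility_threshold, weights):
    # Step 1: discard sources that are too expensive to query.
    affordable = [s for s in sources if s["utility"] >= utility_threshold]
    # Step 2: rank the rest by weighted quality (stand-in for the fuzzy
    # multi-attribute group decision making step in the paper).
    def score(s):
        return sum(w * s["quality"][k] for k, w in weights.items())
    return sorted(affordable, key=score, reverse=True)

sources = [
    {"name": "A", "utility": 0.9, "quality": {"accuracy": 0.8, "freshness": 0.6}},
    {"name": "B", "utility": 0.2, "quality": {"accuracy": 0.9, "freshness": 0.9}},
    {"name": "C", "utility": 0.7, "quality": {"accuracy": 0.5, "freshness": 0.9}},
]
ranked = select_sources(sources, 0.5, {"accuracy": 0.6, "freshness": 0.4})
print([s["name"] for s in ranked])  # ['A', 'C']  (B is discarded by expense)
```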
Object-oriented modeling with declarative equation-based languages often unconsciously leads to structural inconsistencies. Component-based debugging is a new structural analysis approach that addresses this problem by analyzing the structure of each component in a model to locate faulty components separately. The analysis procedure is performed recursively following a depth-first rule. It first generates fictitious equations for a component to establish a debugging environment, and then detects structural defects by applying graph-theoretical approaches to the structure of the system of equations resulting from the component. The proposed method can automatically locate the components that cause structural inconsistencies and show the user detailed error messages. This information is a great help in finding and localizing structural inconsistencies, and in some cases pinpoints them immediately.
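A common graph-theoretical test behind this kind of structural analysis is to check for a perfect matching between equations and the unknowns they reference; equations that cannot all be matched indicate a structurally singular (over- or under-constrained) component. The sketch below is a generic illustration of that test, not the paper's exact procedure.

```python
# Structural singularity test via bipartite matching between equations and
# unknowns, using a simple augmenting-path search. Generic sketch only.

def has_perfect_matching(equations):
    """equations: list of sets of variable names each equation involves."""
    match = {}  # variable -> index of the equation it is assigned to

    def try_assign(eq, seen):
        for var in equations[eq]:
            if var in seen:
                continue
            seen.add(var)
            # Assign var to eq, displacing a previous assignment if possible.
            if var not in match or try_assign(match[var], seen):
                match[var] = eq
                return True
        return False

    return all(try_assign(i, set()) for i in range(len(equations)))

print(has_perfect_matching([{"x", "y"}, {"x", "y"}]))  # True: structurally OK
print(has_perfect_matching([{"x"}, {"x"}]))            # False: x over-determined
```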
Inconsistency of multi-perspective requirements specifications is a pervasive issue during the requirements process. However, managing inconsistency is not purely a technical problem; it is always associated with a process of interaction and competition among the stakeholders involved. The main contribution of this paper is a negotiation-based approach to handling inconsistencies in multi-perspective software requirements. In particular, the priority of requirements relative to each perspective plays an important role in the negotiation among stakeholders over resolving inconsistencies. An algorithm for generating negotiation proposals and an approach to evaluating proposals are also presented.
Wastewater-based epidemiology has emerged as a transformative surveillance tool for estimating substance consumption and monitoring disease prevalence, particularly during the COVID-19 pandemic. It enables population-level monitoring of illicit drug use, pathogen prevalence, and environmental pollutant exposure. In this perspective, we summarize the key challenges specific to the Chinese context: (1) sampling inconsistencies, necessitating standardized 24-hour composite protocols with high-frequency autosamplers (≤15 min/event) to improve the representativeness of samples.
The food industry is evolving toward intelligence and digitalization, but faces challenges such as inconsistent standards and poor system compatibility due to the lack of unified technical guidance. GB/T 46511-2025, General technical requirements for food digital factory, the first general technical national standard for food digital factories, was released recently. It bridges this gap in the industry, serving as the technical support and implementation framework for the intelligent and digital transformation of enterprises in the food industry.
Knowledge-based sharing and service systems are currently a hot research topic, and knowledge fusion, especially implicit knowledge discovery, is the core of knowledge processing and optimization in such systems. In this research, a knowledge fusion framework based on agricultural ontology and fusion rules was proposed, comprising knowledge extraction, cleaning, and annotation modules based on agricultural ontology; fusion rule construction, selection, and evaluation modules based on agricultural ontology; and a knowledge fusion module for users' demands. Finally, the significance of the framework for agricultural knowledge service systems was demonstrated with a case study.
With the advancement of mobile devices, public networks, and the Internet, huge amounts of complex data, both structured and unstructured, are being captured in the hope of allowing organizations to make better business decisions, as data is now pivotal to an organization's success. These enormous amounts of data are referred to as Big Data, which enables a competitive advantage over rivals when processed and analyzed appropriately. However, Big Data analytics raises several concerns, including data management, privacy and security, finding optimal paths for transporting data, and data representation. Moreover, the structure of a network does not completely match transportation demand; there still exist bottlenecks in the network. This paper presents a new approach to finding the optimal path for valuable data movement through a given network based on the knapsack problem. Each piece of data is assigned a value according to its importance (each piece of data is defined by two arguments, size and value), and the approach tries to find the optimal path from source to destination; mathematical models are developed to adjust data flows between their shortest paths based on the 0-1 knapsack problem. We also report computational experiments using the commercial software Gurobi and a greedy algorithm (GA), respectively. The results indicate that the suggested models are effective and workable. The paper introduces two algorithms for shortest path problems: the first addresses the case where activities are stochastic and do not depend on weights; the second addresses the case that depends on weights.
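The 0-1 knapsack subproblem the models build on can be sketched as a standard dynamic program; the item sizes, values, and the capacity below are invented for illustration.

```python
# Standard 0-1 knapsack dynamic program: choose the subset of data pieces of
# maximum total value that fits the link capacity. Illustrative numbers only.

def knapsack(items, capacity):
    """items: list of (size, value). Returns the best achievable value."""
    best = [0] * (capacity + 1)
    for size, value in items:
        # Iterate capacity downward so each item is used at most once (0-1).
        for c in range(capacity, size - 1, -1):
            best[c] = max(best[c], best[c - size] + value)
    return best[capacity]

print(knapsack([(3, 4), (4, 5), (2, 3)], 6))  # 8: take the size-4 and size-2 items
```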
The individual cells in a power battery pack are inconsistent, which degrades the pack's working efficiency and service life. Based on a centralized equalization structure using a multi-output winding transformer, a three-stage hybrid equalization control strategy is designed. During charging, the scheme transfers energy from high-voltage cells to low-voltage cells, improving charging efficiency and reducing energy loss; during discharge, when the voltage difference is greater than 10 mV, high-voltage cells likewise transfer power to low-voltage cells. Between 5 mV and 10 mV, passive equalization is performed, reducing the output fluctuation of the power battery pack. During rest periods, maximal active balancing within the battery pack is performed to achieve optimum intra-group consistency. Experiments prove that the equalization control method achieves quick and effective equalization in the battery pack and energy balance among the individual cells.
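The threshold logic of the three-stage strategy can be sketched as follows. The 5 mV and 10 mV thresholds come from the abstract; the function shape and the sample cell voltages are assumptions.

```python
# Mode selection for the three-stage equalization strategy: active above
# 10 mV spread, passive between 5 and 10 mV, no action below 5 mV.
# Thresholds from the abstract; everything else is illustrative.

def equalization_mode(cell_voltages_mv):
    spread = max(cell_voltages_mv) - min(cell_voltages_mv)
    if spread > 10:
        return "active"   # transfer energy from high cells to low cells
    if spread >= 5:
        return "passive"  # dissipate excess charge resistively
    return "none"

print(equalization_mode([3300, 3312]))  # spread 12 mV -> active
print(equalization_mode([3300, 3307]))  # spread 7 mV -> passive
print(equalization_mode([3300, 3302]))  # spread 2 mV -> none
```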
According to the second law of thermodynamics, as currently understood, any given transit of a system along the reversible path proceeds with a total entropy change equal to zero. The fact that this condition is also the identifier of thermodynamic equilibrium makes each and every point along the reversible path a state of equilibrium, and the reversible path, as expressed by a noted thermodynamic author, "a dense succession of equilibrium states". The difficulties with these notions are plural. The fact, for example, that systems need to be forced out of equilibrium via the expenditure of work would make any spontaneous reversible process a consumer of work, in opposition to common thermodynamic wisdom that makes spontaneous reversible processes the most efficient transformers of work-producing potential into actual work. The solution to this and other related impasses is provided by Dialectical Thermodynamics via its previously proved notion assigning a negative entropy change to the energy-upgrading process represented by the transformation of heat into work. The said solution is here exemplified with the ideal-gas phase isomerization of butane into isobutane.
AIM: To identify the clinical outcomes of hepatocellular carcinoma (HCC) patients with inconsistent α-fetoprotein (AFP) levels, which were initially high and then low at recurrence.
METHODS: We retrospectively included 178 patients who underwent liver resection with high preoperative AFP levels (≥ 200 ng/dL). Sixty-nine HCC patients had recurrence during follow-up and were grouped by their AFP levels at recurrence: group Ⅰ, AFP ≤ 20 ng/dL (n = 16); group Ⅱ, AFP 20-200 ng/dL (n = 24); and group Ⅲ, AFP ≥ 200 ng/dL (n = 29). Their preoperative clinical characteristics, accumulated recurrence rate, and recurrence-to-death survival rate were compared. Three patients, one in each group, underwent liver resection twice for primary and recurrent HCC. AFP immunohistochemistry of primary and recurrent HCC specimens was examined.
RESULTS: In this study, 23% of patients demonstrated normal AFP levels at HCC recurrence, although their AFP levels had initially been high. There were no significant differences in clinical characteristics between the three groups except for the mean recurrence interval (21.8 ± 14.6, 12.3 ± 7.7, and 8.3 ± 6.6 mo, respectively, P < 0.001) and survival time (40.2 ± 19.9, 36.1 ± 22.4, and 21.9 ± 22.0 mo, respectively, P = 0.013). Tumor size > 5 cm, total bilirubin > 1.2 mg/dL, vessel invasion, Child classification B, group Ⅲ, and recurrence interval < 12 mo were risk factors for survival. On Cox regression analysis, vessel invasion, group Ⅲ, and recurrence interval were independent risk factors. The recurrence interval was significantly longer in group Ⅰ (P < 0.001), and the recurrence-to-death survival rate was significantly better in group Ⅱ (P = 0.016). AFP staining was strong in the primary HCC specimens and was reduced at recurrence in group Ⅰ specimens.
CONCLUSION: Patients in group Ⅰ with inconsistent AFP levels had a longer recurrence interval but a worse recurrence-to-death survival rate than those in group Ⅱ. This clinical presentation may be caused by a delay in detecting HCC recurrence.
Inconsistencies or conflicts arising in the integration of ontologies and general rules are handled by prioritizing and updating. First, a prioritized knowledge base is obtained by weighting the information. Then, following the idea of "abandoning the old for the new", the weight of each rule is set greater than that of the information in the ontologies. If the ontologies conflict with the general rules, a new knowledge base without any inconsistency or conflict is obtained by using the rules with large weights to update the small-weight information in the ontologies. Current logic programming solvers and description logic reasoners can thus be employed to implement reasoning services such as querying. Updating based on prioritizing is more suitable for handling inconsistencies than approaches that introduce non-standard semantics when knowledge bases are dynamically evolving. Moreover, a consistent knowledge base can always be maintained in a dynamic environment by updating outdated information with new information based on the weighting. Finally, the feasibility of this approach to dealing with inconsistencies is exemplified.
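The weight-based updating step can be illustrated with a toy knowledge base; the fact representation, proposition names, and weights below are invented for illustration.

```python
# Toy weight-based update: when an ontology fact conflicts with a rule-derived
# fact, the higher-weight (newer) assertion wins. Representation is invented.

def update_kb(ontology_facts, rule_facts):
    """Facts are {proposition: (truth_value, weight)}; rules carry big weights."""
    kb = dict(ontology_facts)
    for prop, (value, weight) in rule_facts.items():
        if prop in kb and kb[prop][0] != value:
            # Conflict: keep whichever assertion carries the larger weight.
            if weight > kb[prop][1]:
                kb[prop] = (value, weight)
        else:
            kb[prop] = (value, weight)
    return kb

ontology = {"penguin_flies": (True, 1)}   # outdated ontology assertion
rules = {"penguin_flies": (False, 2)}     # newer rule with greater weight
print(update_kb(ontology, rules))         # {'penguin_flies': (False, 2)}
```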
Fault detection and diagnosis (FDD) facilitates reliable operation of systems. Various approaches have been proposed for FDD in the literature, such as analytical redundancy (AR), principal component analysis (PCA), and discrete event system (DES) models. The performance of FDD schemes greatly depends on the accuracy of the sensors that measure the system parameters. Due to various reasons such as faults and communication errors, sensors may occasionally miss or report erroneous values of some system parameters to the FDD engine, resulting in measurement inconsistency of these parameters. Schemes like AR and PCA have mechanisms to handle measurement inconsistency, but they are computationally heavy. DES-based FDD techniques are widely used because of their computational simplicity, but they cannot handle measurement inconsistency efficiently: existing DES-based schemes do not use measurement inconsistent (MI) parameters for FDD. These parameters are not permanently unmeasurable or erroneous, so ignoring them may lead to weak diagnosis. To address this issue, we propose a measurement inconsistent discrete event system (MIDES) framework, which uses MI parameters for FDD at the instants they are measured by the sensors. Otherwise, when they are unmeasurable or erroneously reported, MIDES invokes an estimator diagnoser that predicts the state(s) the system is expected to be in, using the subsequent parameters measured by the other sensors. The efficacy of the proposed method is illustrated using a pump-valve system. In addition, an MIDES-based intrusion detection system has been developed for detecting rogue dynamic host configuration protocol (DHCP) server attacks by mapping the attack to a fault in the DES framework.
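The estimator idea can be illustrated with a toy DES: when a measurement is inconsistent (modeled here as a missing event), the estimator keeps every state reachable by an enabled transition and narrows the set as later measurements arrive. The transition table and event names are invented, loosely echoing the pump-valve example; this is not the paper's MIDES algorithm.

```python
# Toy DES state estimation under missing measurements. A None event means the
# sensor reading was inconsistent, so any enabled transition may have fired.

def estimate_states(transitions, initial_states, observed_events):
    """transitions: {(state, event): next_state}; None marks an unmeasured event."""
    current = set(initial_states)
    for event in observed_events:
        if event is None:
            # Measurement inconsistent: take every transition enabled in any
            # currently possible state.
            current = {transitions[(s, e)]
                       for (s, e) in transitions if s in current}
        else:
            current = {transitions[(s, event)]
                       for s in current if (s, event) in transitions}
    return current

transitions = {("idle", "start"): "pumping",
               ("pumping", "close"): "blocked",
               ("pumping", "stop"): "idle"}
# After "start" and one unmeasured event, the system is in one of two states.
print(estimate_states(transitions, {"idle"}, ["start", None]))
```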
Conventional reliability models of belt drive systems under the fatigue failure mode are mainly based on the static stress-strength interference model and its extensions, which cannot account for dynamic factors over the operational duration or be used for further availability analysis. In this paper, time-dependent reliability models, failure rate models, and availability models of belt drive systems are developed based on the system dynamic equations, taking the dynamic stress and material property degradation into account. The proposed models also consider dynamic failure dependence and imperfect maintenance. Furthermore, the issue of time scale inconsistency between system failure rate and system availability is identified and addressed in the proposed availability models. Monte Carlo simulations are carried out to validate the established models, and the results from the proposed models are consistent with those from the simulations. The case studies show that failure dependence, imperfect maintenance, and time scale inconsistency have significant influences on system availability: the independence assumption leads to underestimates of both reliability and availability, and neglecting the time scale inconsistency causes the system availability to be underestimated. These influences also show clear time-dependent characteristics.
The test of consistency is critical for the analytic hierarchy process (AHP) methodology. When a pairwise comparison matrix (PCM) fails the consistency test, the decision maker (DM) needs to make revisions. The state of the art focuses on changing a single entry or creating a new matrix based on the original inconsistent matrix so that the modified matrix satisfies the consistency requirement. However, we have noticed that inconsistency is caused not only by numerical inconsistency but also by logical inconsistency, which may play a more important role in the overall inconsistency. Therefore, to achieve satisfactory consistency, we should first change the entries that form a directed circuit to make the matrix logically consistent, and then adjust other entries within acceptable deviations to make the matrix numerically consistent while preserving most of the original comparison information. In this paper, we first present some definitions and theorems, based on which two effective methods are provided to identify directed circuits. Four optimization models are proposed to adjust the original inconsistent matrix. Finally, illustrative examples and comparison studies show the effectiveness and feasibility of our method.
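The numerical consistency test that the paragraph contrasts with logical consistency is the standard AHP check: CI = (lambda_max - n) / (n - 1) and CR = CI / RI, with the matrix accepted when CR < 0.1. A minimal pure-Python sketch using power iteration for the principal eigenvalue (the RI values are Saaty's standard random indices):

```python
# AHP consistency ratio via power iteration; RI values are Saaty's standard
# random indices for matrix orders 1..8.

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}

def consistency_ratio(matrix, iters=200):
    n = len(matrix)
    v = [1.0] * n
    for _ in range(iters):  # power iteration for the principal eigenvector
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(w)
        v = [x / norm for x in w]
    # lambda_max from the converged eigenvector, then CI and CR.
    lam = sum(sum(matrix[i][j] * v[j] for j in range(n)) / v[i]
              for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / RI[n] if RI[n] else 0.0

# A perfectly consistent 3x3 matrix (a_ij = w_i / w_j) has CR ~ 0.
perfect = [[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]]
print(round(consistency_ratio(perfect), 6))
```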
In order to evaluate the performance of the molecular Hain line probe assay (Hain LPA) for rapid detection of rifampicin and isoniazid resistance of Mycobacterium tuberculosis in China, 1612 smear-positive patients were consecutively enrolled in this study. Smear-positive sputum specimens were collected for Hain LPA and conventional drug susceptibility testing (DST). The sensitivity and specificity of Hain LPA were analyzed using conventional DST as the gold standard. The sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) for rifampicin resistance detection were 88.33%, 97.66%, 81.54%, and 98.62%, respectively. The sensitivity, specificity, PPV, and NPV for isoniazid resistance detection were 80.25%, 98.07%, 87.25%, and 96.78%, respectively. These findings suggest that Hain LPA can be an effective method worthy of broader use in China.
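The four reported measures follow directly from a 2x2 confusion matrix against the reference standard (conventional DST). The counts below are invented for illustration, not the study's data.

```python
# Diagnostic accuracy measures from a 2x2 confusion matrix.
# tp/fp/fn/tn counts here are illustrative, not from the study.

def diagnostic_measures(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # true positives among truly resistant
        "specificity": tn / (tn + fp),  # true negatives among truly susceptible
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

m = diagnostic_measures(tp=90, fp=10, fn=10, tn=190)
print(m["sensitivity"], m["specificity"])  # 0.9 0.95
```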
This paper focuses on a fast algorithm for computing the assignment reduct in inconsistent incomplete decision systems. It is quite inconvenient to judge the assignment reduct directly according to its definition. We propose a judgment theorem for the assignment reduct in inconsistent incomplete decision systems, which greatly simplifies judging this type of reduct. On that basis, we derive a novel attribute significance measure and construct the fast assignment reduction algorithm (F-ARA) for computing the assignment reduct in inconsistent incomplete decision systems. Finally, we compare F-ARA with the discernibility matrix-based method by experiments on 13 University of California at Irvine (UCI) datasets, and the experimental results show that F-ARA is efficient and feasible.
The inconsistency of lithium-ion cells degrades battery performance, lifetime, and even safety. The complexity of the cell reaction mechanism causes an irregular, asymmetrical distribution of various cell parameters, such as capacity and internal resistance. In this study, the Newman electrochemical model was used to simulate the 1 C discharge curves of 100 LiMn2O4 pouch cells with parameter variations typical of manufacturing processes, and the three-parameter Weibull probability model was used to analyze the dispersion and symmetry of the resulting discharge voltage distributions. The results showed that the dispersion of the voltage distribution was related to the rate of decrease in the discharge voltage, and the symmetry was related to the change in that rate. The effect of the cells' capacity dominated the voltage distribution thermodynamically during discharge, and the phase transformation process significantly skewed the voltage distribution. The effects of the ohmic drop and polarization voltage on the voltage distribution were primarily kinetic. The presence of current returned the right-skewed voltage distribution caused by phase transformation to a more symmetrical distribution. Thus, the Weibull parameters elucidate the electrochemical behavior during the discharge process, and this method can guide the prediction and control of cell inconsistency, as well as detection and control strategies for cell management systems.
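The three-parameter Weibull model referred to above has CDF F(x) = 1 - exp(-((x - gamma) / eta)^beta) for x >= gamma, with location gamma, scale eta, and shape beta. A small sketch with illustrative parameter values (the voltages below are not from the study):

```python
# Three-parameter Weibull CDF; parameter values are illustrative only.

import math

def weibull3_cdf(x, beta, eta, gamma):
    """F(x) = 1 - exp(-((x - gamma) / eta) ** beta) for x >= gamma, else 0."""
    if x < gamma:
        return 0.0
    return 1.0 - math.exp(-((x - gamma) / eta) ** beta)

# At x = gamma + eta the CDF is always 1 - 1/e, independent of beta.
print(round(weibull3_cdf(3.55, beta=2.0, eta=0.05, gamma=3.5), 4))  # 0.6321
```

The shape parameter beta controls the skewness of the distribution, which is what lets the fitted Weibull parameters track the symmetry changes the study reports.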
We report invalidating errors related to the statistical approach in the analysis, and data inconsistencies, in a published single-cohort study of patients with Crohn's disease. We provide corrected calculations from the available data and request that a corrected analysis be provided by the authors. These errors should be corrected.
In order to effectively diagnose an infeasible linear programming (LP) model of refinery production planning, this article proposes a three-stage strategy based on constraint classification and infeasibility analysis. In general, infeasibility arises from structural inconsistencies and data errors, and the data errors are further classified into types Ⅰ, Ⅱ, and Ⅲ. The three stages are: (1) check data as they are input, to detect and repair type Ⅰ data errors; (2) inspect whether the data satisfy material balance before solving the LP model, to identify and repair type Ⅱ data errors; (3) find the irreducible inconsistent system of the infeasible LP model and give priority-ranked diagnostic information, to recognize type Ⅲ data errors and structural inconsistencies. These stages can be executed automatically by computer, and the approach has been applied successfully to diagnose infeasible models in our graphic I/O petrochemical industry modeling system.
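The stage-2 material balance screen can be sketched as a simple per-unit check; the unit names, flow rates, and tolerance below are invented for illustration.

```python
# Stage-2 screen: before solving the LP, verify that each unit's inflows match
# its outflows within a tolerance, flagging type-II data errors.
# Unit names and flows are illustrative.

def material_balance_errors(units, tol=1e-6):
    """units: {name: (inflows, outflows)} with flows as lists of rates."""
    errors = []
    for name, (inflows, outflows) in units.items():
        imbalance = sum(inflows) - sum(outflows)
        if abs(imbalance) > tol:
            errors.append((name, imbalance))
    return errors

units = {
    "distillation": ([100.0], [60.0, 40.0]),  # balanced
    "reformer": ([50.0], [30.0, 15.0]),       # 5.0 unaccounted for
}
print(material_balance_errors(units))  # [('reformer', 5.0)]
```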
文摘Relationships between Unified Modeling Language (UML) diagrams are complex. The complexity leads to inconsistencies between UML diagrams easily. This paper focus on how to identify and check inconsistencies between UML diagrams. 13 consistency rules are given to identify inconsistencies between the most frequent 6 types of UML diagrams in the domain of information systems analysis and design. These diagrams are as follows: Use Case Diagrams, Class Diagrams, Activity Diagrams, State Machine Diagrams, Sequence Diagrams and Communication Diagrams. 4 methods are given to check inconsistencies between UML diagrams as follows: manual check, compulsory restriction, automatic maintenance, dynamic check. These rules and methods are helpful for developers to model information systems.
基金supported by the National Natural Science Foundation of China under Grant No. 60970010the National Basic Research 973 Program of China under Grant No. 2009CB320705the Specialized Research Fund for the Doctoral Program of Higher Education of China under Grant No. 20090073110026
文摘New challenges including how to share information on heterogeneous devices appear in data-intensive pervasive computing environments. Data integration is a practical approach to these applications. Dealing with inconsistencies is one of the important problems in data integration. In this paper we motivate the problem of data inconsistency solution for data integration in pervasive environments. We define data qualit~ criteria and expense quality criteria for data sources to solve data inconsistency. In our solution, firstly, data sources needing high expense to obtain data from them are discarded by using expense quality criteria and utility function. Since it is difficult to obtain the actual quality of data sources in pervasive computing environment, we introduce fuzzy multi-attribute group decision making approach to selecting the appropriate data sources. The experimental results show that our solution has ideal effectiveness.
基金Supported by the National Natural Science Foundation of China (Grant No. 60574053), the National High-Tech Development 863 Program of China (Grant No. 2003AA001031), and the National Basic Research 973 Program of China (Grant No. 2003CB716207).
文摘Object-oriented modeling with declarative equation based languages often unconsciously leads to structural inconsistencies. Component-based debugging is a new structural analysis approach that addresses this problem by analyzing the structure of each component in a model to separately locate faulty components. The analysis procedure is performed recursively based on the depth-first rule. It first generates fictitious equations for a component to establish a debugging environment, and then detects structural defects by using graph theoretical approaches to analyzing the structure of the system of equations resulting from the component. The proposed method can automatically locate components that cause the structural inconsistencies, and show the user detailed error messages. This information can be a great help in finding and localizing structural inconsistencies, and in some cases pinpoints them immediately.
基金This research is supported in part by the National Natural Science Foundation of China under Grant No.60703061the National Natural Science Fund for Distinguished Young Scholars of China under Grant No.60625204+3 种基金the National Key Research and Development Program of China under Grant No.2002CB312004the National 863 High-tech Project of China under Grant No.2006AA01Z155the Key Project of National Natural Science Foundation of China under Grant No.60496324the International Science Linkage Research Grant under the Australia-China Special Fund for Science and Technology
文摘Inconsistency of multi-perspective requirements specifications is a pervasive issue during the requirements process. However, managing inconsistency is not just a pure technical problem. It is always associated with a process of interactions and competitions among corresponding stakeholders. The main contribution of this paper is to present a negotiations approach to handling inconsistencies in multi-perspective software requirements. In particular, the priority of requirements relative to each perspective plays an important role in proceeding negotiation over resolving inconsistencies among different stakeholders. An algorithm of generating negotiation proposals and an approach to evaluating proposals are also presented in this paper, respectively.
基金supported by the National Natural Science Foundation of China(Grant no.42307534)Discovery Project(DP220101790)+1 种基金The University of Queensland ScholarshipAustralian Research Council Discovery Project(DP220101790).
文摘Wastewater-based epidemiology has emerged as a transformative surveillance tool for estimating substance consumption and monitoring disease prevalence,particularly during the COVID-19 pandemic.It enables the population-level monitoring of illicit drug use,pathogen prevalence,and environmental pollutant exposure.In this perspective,we summarize the key challenges specific to the Chinese context:(1)Sampling inconsistencies,necessitating standardized 24-hour composite protocols with high-frequency autosamplers(≤15 min/event)to improve the representativeness of samples.
文摘The food industry is evolving toward intelligence and digitalization,but is faced with challenges such as inconsistent standards and poor system compatibility due to lack of unified technical guidance.GB/T 46511-2025,General technical requirements for food digital factory,the first general technical national standard for food digital factory,was released recently.It bridges the gap in the industry,serving as the technical support and implementation framework for the intelligent and digital transformation of enterprises in the food industry.
基金Supported by Specialized Funds of CASIndividual Service System of Agricultural Information in Tibet(2012-J-08)+1 种基金Science and Technology Funds of CASMultimedia Information Service in Rural Area based on 3G Information Terminal(201219)~~
Abstract: Knowledge-based sharing and service systems are currently a hot issue, and knowledge fusion, especially for implicit knowledge discovery, has become the core of knowledge processing and optimization in such systems. In this research, a knowledge fusion framework based on agricultural ontology and fusion rules is proposed, comprising modules for knowledge extraction, cleaning, and annotation based on agricultural ontology; modules for fusion rule construction, selection, and evaluation based on agricultural ontology; and a knowledge fusion module serving users' demands. Finally, the significance of the framework for agricultural knowledge service systems is demonstrated with a case study.
Abstract: The advancement of mobile devices, public networks, and the Internet generates huge amounts of complex data; both structured and unstructured data are being captured in the hope of allowing organizations to make better business decisions, as data is now pivotal to an organization's success. These enormous amounts of data are referred to as Big Data, which enables a competitive advantage over rivals when processed and analyzed appropriately. However, Big Data analytics raises several concerns, including data management, privacy and security, finding the optimal path for transporting data, and data representation. Moreover, the structure of a network does not completely match transportation demand, i.e., there still exist bottlenecks in the network. This paper presents a new approach to finding the optimal path for moving valuable data through a given network, based on the knapsack problem. Each piece of data is assigned a value depending on its importance (each piece is defined by two arguments, size and value), and the approach tries to find the optimal path from source to destination; mathematical models are developed to adjust data flows between their shortest paths based on the 0-1 knapsack problem. We also carry out computational experiments using the commercial software Gurobi and a greedy algorithm (GA), respectively. The results indicate that the suggested models are effective and workable. This paper introduces two algorithms for studying shortest path problems: the first addresses shortest path problems with stochastic activities that do not depend on weights; the second addresses shortest path problems that depend on weights.
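As an illustration of the 0-1 knapsack formulation the abstract builds on, here is a minimal dynamic-programming sketch that selects data pieces, each defined by a size and a value, to maximize the value carried within a capacity limit. The function name and the tiny instance are our own assumptions, not the paper's models.

```python
def knapsack_01(items, capacity):
    """0-1 knapsack: items is a list of (size, value) pairs.
    Returns the maximum total value whose total size fits in capacity."""
    # dp[c] = best value achievable with total size <= c
    dp = [0] * (capacity + 1)
    for size, value in items:
        # iterate capacities downward so each item is taken at most once
        for c in range(capacity, size - 1, -1):
            dp[c] = max(dp[c], dp[c - size] + value)
    return dp[capacity]

# Hypothetical data pieces (size, value) and a link capacity of 6:
best = knapsack_01([(3, 4), (2, 3), (4, 5), (1, 2)], 6)
```

In the paper's setting the capacity would play the role of a link's transport budget along a candidate path; the 0-1 restriction reflects that a data piece is either routed whole or not at all.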
Abstract: In order to improve the working efficiency of a power battery pack and prolong its service life, the problem of inconsistency among individual cells must be addressed. Based on a centralized equalization structure using a multi-output winding transformer, a three-stage hybrid equalization control strategy is designed. The scheme lets high-voltage cells transfer energy to low-voltage cells during charging of the battery pack, improving charging efficiency and reducing energy loss; during discharge, when the voltage difference is greater than 10 mV, high-voltage cells likewise transfer power to low-voltage cells. Between 5 mV and 10 mV, passive equalization is performed, reducing the output fluctuation of the power battery pack and achieving the balancing purpose. During rest periods, active balancing within the battery pack is performed to achieve optimum intra-group consistency. Experiments show that the proposed equalization control method achieves quick and effective equalization within the battery pack and energy balance among the individual cells.
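The discharge-stage thresholds described above (passive equalization between 5 mV and 10 mV, active transfer above 10 mV) can be sketched as a simple mode-selection rule. The function name and millivolt representation are illustrative assumptions, not the paper's controller implementation.

```python
def equalization_mode(v_max_mv, v_min_mv):
    """Select a discharge-stage equalization action from the pack's
    cell-voltage spread, using the 5 mV / 10 mV thresholds above."""
    diff = v_max_mv - v_min_mv
    if diff > 10:
        return "active"    # transfer energy from high- to low-voltage cell
    elif diff >= 5:
        return "passive"   # dissipate the small excess (shunt balancing)
    else:
        return "idle"      # spread acceptable, no action needed
```

A real battery management system would evaluate such a rule periodically per cell pair, alongside the separate charging-stage and rest-stage policies the abstract describes.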
Abstract: According to the second law of thermodynamics, as currently understood, any transit of a system along the reversible path proceeds with a total entropy change equal to zero. The fact that this condition is also the identifier of thermodynamic equilibrium makes each and every point along the reversible path a state of equilibrium, and the reversible path, as expressed by a noted thermodynamic author, "a dense succession of equilibrium states". The difficulties with these notions are several. The fact, for example, that systems need to be forced out of equilibrium via the expenditure of work would make any spontaneous reversible process a consumer of work, in opposition to common thermodynamic wisdom, which makes spontaneous reversible processes the most efficient transformers of work-producing potential into actual work. The solution to this and other related impasses is provided by Dialectical Thermodynamics via its previously proved notion assigning a negative entropy change to the energy-upgrading process represented by the transformation of heat into work. The said solution is here exemplified with the ideal-gas-phase isomerization of butane into isobutane.
Abstract: AIM: To identify the clinical outcomes of hepatocellular carcinoma (HCC) patients with inconsistent α-fetoprotein (AFP) levels, which were initially high and then low at recurrence. METHODS: We retrospectively included 178 patients who underwent liver resection with high preoperative AFP levels (≥ 200 ng/dL). Sixty-nine HCC patients had recurrence during follow-up and were grouped by their AFP levels at recurrence: group Ⅰ, AFP ≤ 20 ng/dL (n = 16); group Ⅱ, AFP 20-200 ng/dL (n = 24); and group Ⅲ, AFP ≥ 200 ng/dL (n = 29). Their preoperative clinical characteristics, accumulated recurrence rates, and recurrence-to-death survival rates were compared. Three patients, one in each group, underwent liver resection twice, for primary and recurrent HCC. AFP immunohistochemistry of primary and recurrent HCC specimens was examined. RESULTS: In this study, 23% of patients demonstrated normal AFP levels at HCC recurrence. The AFP levels in these patients were initially high. There were no significant differences in clinical characteristics among the three groups except for the mean recurrence interval (21.8 ± 14.6, 12.3 ± 7.7, and 8.3 ± 6.6 mo, respectively, P < 0.001) and survival time (40.2 ± 19.9, 36.1 ± 22.4, and 21.9 ± 22.0 mo, respectively, P = 0.013). Tumor size > 5 cm, total bilirubin > 1.2 mg/dL, vessel invasion, Child classification B, group Ⅲ membership, and recurrence interval < 12 mo were risk factors for the survival rate. On Cox regression analysis, vessel invasion, group Ⅲ membership, and recurrence interval were independent risk factors. The recurrence interval was significantly longer in group Ⅰ (P < 0.001). The recurrence-to-death survival rate was significantly better in group Ⅱ (P = 0.016). AFP staining was strong in the primary HCC specimens and was reduced at recurrence in group Ⅰ specimens. CONCLUSION: Patients in group Ⅰ with inconsistent AFP levels had a longer recurrence interval but a worse recurrence-to-death survival rate than those in group Ⅱ. This clinical presentation may be caused by a delay in the detection of HCC recurrence.
Funding: The National Natural Science Foundation of China (No. 60973003)
Abstract: Inconsistencies or conflicts appearing in the integration of ontologies and general rules are handled by applying prioritizing and updating. First, a prioritized knowledge base is obtained by weighting the information; based on the idea of "abandoning the old for the new", the weight of each rule is made greater than that of the information in the ontologies. If the ontologies conflict with the general rules, a new knowledge base without any inconsistency or conflict is obtained by letting rules with large weights update information in the ontologies with small weights. Thus, current logic programming solvers and description logic reasoners can be employed to implement reasoning services such as querying. For dynamically evolving knowledge bases, updating based on prioritizing is more suitable for handling inconsistencies than approaches that introduce non-standard semantics. Moreover, a consistent knowledge base can always be maintained in a dynamic environment by updating outdated information with new information based on weighting. Finally, the feasibility of this approach to dealing with inconsistencies is illustrated with an example.
Funding: supported by TATA Consultancy Services (TCS), India, through the TCS Research Fellowship Program
Abstract: Fault detection and diagnosis (FDD) facilitates reliable operation of systems. Various approaches have been proposed for FDD in the literature, such as analytical redundancy (AR), principal component analysis (PCA), and discrete event system (DES) models. The performance of FDD schemes greatly depends on the accuracy of the sensors that measure the system parameters. For various reasons, such as faults and communication errors, sensors may occasionally miss or report erroneous values of some system parameters to the FDD engine, resulting in measurement inconsistency of these parameters. Schemes like AR and PCA have mechanisms to handle measurement inconsistency, but they are computationally heavy. DES-based FDD techniques are widely used because of their computational simplicity, but they cannot handle measurement inconsistency efficiently: existing DES-based schemes do not use measurement inconsistent (MI) parameters for FDD at all. These parameters are not permanently unmeasurable or erroneous, so ignoring them may lead to weak diagnosis. To address this issue, we propose a measurement inconsistent discrete event system (MIDES) framework, which uses MI parameters for FDD at the instants they are measured by the sensors. Otherwise, when they are unmeasurable or erroneously reported, MIDES invokes an estimator diagnoser that predicts the state(s) the system is expected to be in, using the subsequent parameters measured by the other sensors. The efficacy of the proposed method is illustrated using a pump-valve system. In addition, an MIDES-based intrusion detection system has been developed for detecting rogue dynamic host configuration protocol (DHCP) server attacks by mapping the attack to a fault in the DES framework.
Funding: Supported by the Program for Liaoning Innovative Talents in University (Grant No. LR2017070), the National Natural Science Foundation of China (Grant Nos. 51505207 and U1708255), and the Open Foundation of the Zhejiang Provincial Top Key Academic Discipline of Mechanical Engineering (Grant No. ZSTUME02A01)
Abstract: Conventional reliability models of belt drive systems in the fatigue failure mode are mainly based on the static stress-strength interference model and its extensions, which cannot account for dynamic factors over the operational duration or be used for further availability analysis. In this paper, time-dependent reliability models, failure rate models, and availability models of belt drive systems are developed based on the system dynamic equations, with the dynamic stress and material property degradation taken into account. The proposed models consider dynamic failure dependence and imperfect maintenance. Furthermore, the issue of time-scale inconsistency between system failure rate and system availability is raised and addressed in the proposed availability models. Monte Carlo simulations are carried out to validate the established models, and their results agree with those of the proposed models. The case studies show that failure dependence, imperfect maintenance, and the time-scale inconsistency have significant influences on system availability: the independence assumption for belt drive systems results in underestimation of both reliability and availability, and neglecting the time-scale inconsistency causes the system availability to be underestimated. These influences also show clearly time-dependent characteristics.
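A minimal Monte Carlo sketch of a time-dependent stress-strength interference estimate, in the spirit of the validation simulations mentioned above. All distributions, their parameters, and the linear degradation law are illustrative assumptions, not the paper's models.

```python
import numpy as np

def reliability_mc(t, n=100_000, seed=0):
    """Monte Carlo estimate of R(t) = P(strength(t) > stress), where
    strength degrades linearly with time t. Parameters are illustrative:
    initial strength ~ N(500, 30) MPa, stress ~ N(350, 40) MPa,
    degradation rate 2 MPa per unit time."""
    rng = np.random.default_rng(seed)
    strength0 = rng.normal(500.0, 30.0, n)  # initial material strength
    stress = rng.normal(350.0, 40.0, n)     # operating stress per cycle
    degraded = strength0 - 2.0 * t          # deterministic degradation
    return float(np.mean(degraded > stress))
```

Because the strength sample shifts downward with `t`, the estimated reliability decreases monotonically over time, which is the qualitative behavior a static interference model cannot capture.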
Funding: supported by the National Natural Science Foundation of China (61601501, 61502521)
Abstract: The test of consistency is critical to the analytic hierarchy process (AHP) methodology. When a pairwise comparison matrix (PCM) fails the consistency test, the decision maker (DM) needs to make revisions. The state of the art focuses on changing a single entry or creating a new matrix based on the original inconsistent matrix so that the modified matrix satisfies the consistency requirement. However, we have noticed that inconsistency arises not only from numerical inconsistency but also from logical inconsistency, which may play a more important role in the overall inconsistency. Therefore, to achieve satisfactory consistency, we should first change the entries that form a directed circuit to make the matrix logically consistent, and then adjust other entries within acceptable deviations to make the matrix numerically consistent while preserving most of the original comparison information. In this paper, we first present some definitions and theory, based on which two effective methods are provided to identify directed circuits. Four optimization models are proposed to adjust the original inconsistent matrix. Finally, illustrative examples and comparison studies show the effectiveness and feasibility of our method.
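The consistency test referred to above is conventionally Saaty's consistency ratio, CR = CI/RI with CI = (λ_max − n)/(n − 1). The sketch below is that generic check, not the paper's circuit-identification or optimization method; note how a 3×3 matrix whose entries form a directed circuit (A beats B, B beats C, C beats A), the logical inconsistency the abstract discusses, fails the test badly even though every individual entry is an ordinary judgment value.

```python
import numpy as np

# Saaty's random index values for matrix orders 1..9
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def consistency_ratio(pcm):
    """CR = CI / RI for a pairwise comparison matrix;
    CR < 0.1 is the usual acceptance threshold."""
    A = np.asarray(pcm, dtype=float)
    n = A.shape[0]
    lam_max = float(np.max(np.real(np.linalg.eigvals(A))))
    ci = (lam_max - n) / (n - 1)
    return ci / RI[n]

consistent = [[1, 2, 4], [1/2, 1, 2], [1/4, 1/2, 1]]   # fully consistent
circuit = [[1, 3, 1/3], [1/3, 1, 3], [3, 1/3, 1]]      # directed circuit
```

For the fully consistent matrix, λ_max equals n, so CR is 0; the circuit matrix yields a CR far above 0.1, even though its entry magnitudes are modest, which is why the paper argues logical repair should precede numerical adjustment.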
Funding: supported by the Bill & Melinda Gates Foundation Tuberculosis Prevention and Control Project (2009-04-01)
Abstract: In order to evaluate the performance of the molecular Hain line probe assay (Hain LPA) for rapid detection of rifampicin and isoniazid resistance of Mycobacterium tuberculosis in China, 1612 smear-positive patients were consecutively enrolled in this study. Smear-positive sputum specimens were collected for Hain LPA and conventional drug susceptibility testing (DST). The sensitivity and specificity of Hain LPA were analyzed using conventional DST as the gold standard. The sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) for rifampicin resistance detection were 88.33%, 97.66%, 81.54%, and 98.62%, respectively. The sensitivity, specificity, PPV, and NPV for isoniazid resistance detection were 80.25%, 98.07%, 87.25%, and 96.78%, respectively. These findings suggest that Hain LPA can be an effective method worthy of broader use in China.
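The four reported metrics follow directly from a 2×2 confusion table scored against the DST reference standard. A minimal helper might look like the following; the counts in the usage line are hypothetical, not the study's data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table,
    where the reference standard (here, conventional DST) defines truth."""
    return {
        "sensitivity": tp / (tp + fn),  # resistant cases correctly flagged
        "specificity": tn / (tn + fp),  # susceptible cases correctly cleared
        "ppv": tp / (tp + fp),          # flagged cases truly resistant
        "npv": tn / (tn + fn),          # cleared cases truly susceptible
    }

# Hypothetical example counts, for illustration only:
m = diagnostic_metrics(tp=80, fp=10, fn=20, tn=90)
```

Note that sensitivity and specificity are properties of the assay, while PPV and NPV also depend on the resistance prevalence in the enrolled population.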
Funding: supported by the National Natural Science Foundation of China (61363047), the Jiangxi Education Department (GJJ13760), and the Science and Technology Support Foundation of Jiangxi Province (20111BBE50008)
Abstract: This paper focuses on a fast algorithm for computing the assignment reduct in inconsistent incomplete decision systems. It is quite inconvenient to judge the assignment reduct directly according to its definition. We propose a judgment theorem for the assignment reduct in the inconsistent incomplete decision system, which greatly simplifies judging this type of reduct. On this basis, we derive a novel attribute significance measure and construct the fast assignment reduction algorithm (F-ARA) for computing the assignment reduct in inconsistent incomplete decision systems. Finally, we compare F-ARA with the discernibility-matrix-based method in experiments on 13 University of California at Irvine (UCI) datasets, and the experimental results show that F-ARA is efficient and feasible.
Funding: financially supported by the National Natural Science Foundation of China (No. U156405) and a GRINM Youth Foundation funded project
Abstract: The inconsistency of lithium-ion cells degrades battery performance, lifetime, and even safety. The complexity of the cell reaction mechanism causes an irregular, asymmetrical distribution of various cell parameters, such as capacity and internal resistance, among others. In this study, the Newman electrochemical model was used to simulate the 1 C discharge curves of 100 LiMn2O4 pouch cells with the parameter variations typically produced in manufacturing processes, and the three-parameter Weibull probability model was used to analyze the dispersion and symmetry of the resulting discharge voltage distributions. The results showed that the dispersion of the voltage distribution was related to the rate of decrease in the discharge voltage, and the symmetry was related to the change in that rate. The effect of the cells' capacity dominated the voltage distribution thermodynamically during discharge, and the phase transformation process significantly skewed the voltage distribution. The effects of the ohmic drop and polarization voltage on the voltage distribution were primarily kinetic. The presence of current returned the right-skewed voltage distribution caused by phase transformation to a more symmetrical distribution. Thus, the Weibull parameters elucidate the electrochemical behavior during the discharge process, and this method can guide the prediction and control of cell inconsistency, as well as detection and control strategies for cell management systems.
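The symmetry property analyzed above has a convenient closed form: the skewness of a Weibull distribution depends only on the shape parameter, since location and scale shift and stretch the distribution without skewing it. The sketch below is a generic illustration of that standard formula, not the paper's analysis pipeline.

```python
from math import gamma

def weibull_skewness(c):
    """Skewness of a Weibull distribution with shape parameter c.
    Positive -> right-skewed, near zero -> approximately symmetric.
    Uses the standard moment formula built from gamma functions."""
    g1, g2, g3 = (gamma(1 + k / c) for k in (1, 2, 3))
    mu = g1                 # mean of the standard (scale-1) Weibull
    var = g2 - g1 ** 2      # variance of the standard Weibull
    return (g3 - 3 * mu * var - mu ** 3) / var ** 1.5
```

For example, shape c = 1 (the exponential case) gives skewness exactly 2, c = 2 gives a moderately right-skewed distribution, and near c ≈ 3.6 the distribution becomes nearly symmetric, which is why a fitted shape parameter can diagnose the right-skew introduced by phase transformation.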
Abstract: We report invalidating errors related to the statistical approach in the analysis, as well as data inconsistencies, in a published single-cohort study of patients with Crohn's disease. We provide corrected calculations from the available data and request that a corrected analysis be provided by the authors. These errors should be corrected.
Abstract: In order to effectively diagnose an infeasible linear programming (LP) model of production planning in a refinery, this article proposes a three-stage strategy based on constraint classification and infeasibility analysis. Generally, infeasibility sources involve structural inconsistencies and data errors, and the data errors are further classified into types Ⅰ, Ⅱ, and Ⅲ. The three stages are: (1) check data as they are input to detect type Ⅰ data errors and repair them; (2) inspect whether the data are consistent with the material balance before solving the LP model, to identify type Ⅱ data errors and repair them; (3) find an irreducible inconsistent system of the infeasible LP model and give priority-ranked diagnosis information to recognize type Ⅲ data errors and structural inconsistencies. These stages can be executed automatically by computer, and the approach has been applied to diagnose infeasible models effectively in our graphic I/O petrochemical industry modeling system.