The global Internet is a complex network of interconnected autonomous systems (ASes). Understanding Internet inter-domain path information is crucial for understanding, managing, and improving the Internet. The path information can also help protect user privacy and security. However, due to the complicated and heterogeneous structure of the Internet, path information is not publicly available, and obtaining it is challenging because measurement probes and collectors are limited. Therefore, inferring Internet inter-domain paths from the limited data is a supplementary approach to measuring them. The purpose of this survey is to provide an overview of techniques for inferring Internet inter-domain paths from 2005 to 2023 and to present the main lessons from these studies. To this end, we summarize the inter-domain path inference techniques by the granularity of the paths; for each method, we describe the data sources, the key ideas, the advantages, and the limitations. To help readers understand the path inference techniques, we also summarize the background techniques they rely on, such as techniques to measure the Internet, infer AS relationships, resolve aliases, and map IP addresses to ASes. A case study of the existing techniques is also presented to show the real-world applications of inter-domain path inference. Additionally, we discuss the challenges and opportunities in inferring Internet inter-domain paths, the drawbacks of the state-of-the-art techniques, and future directions.
Methods and approaches are discussed that identify and filter out affecting factors (noise) superimposed on primary signals, based on the Adaptive-Network-Based Fuzzy Inference System. Influences of the zonal winds in the equatorial eastern and middle/western Pacific on the SSTA in the equatorial region, and their contributions to the latter, are diagnosed and verified with observations from a number of significant El Niño and La Niña episodes, and new viewpoints are proposed. The methods of wavelet decomposition and reconstruction are used to build a predictive model based on independent frequency domains, which shows some advantages in composite prediction and prediction validity. The methods presented above are non-linear, error-tolerant, and auto-adaptive/learning, in addition to offering rapid and easy access, illustrative and quantitative presentation, and analysis results that agree generally with the facts. They are useful in diagnosing and predicting the El Niño and La Niña problems that are only roughly described in dynamics.
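As a minimal sketch of the wavelet decompose-and-reconstruct idea described above, the snippet below splits a synthetic monthly SSTA series into frequency bands so that each band could be modeled separately. It assumes the PyWavelets package; the "db4" wavelet, the decomposition level, and the synthetic series are illustrative choices, not the paper's settings.

```python
# Illustrative sketch (not the authors' code): decompose a monthly SSTA series
# into frequency bands with a discrete wavelet transform, then reconstruct each
# band separately so that a predictor can be fitted per frequency domain.
# Assumes the PyWavelets package; wavelet name and level are arbitrary choices.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.arange(480)                               # 40 years of monthly anomalies
ssta = np.sin(2 * np.pi * t / 48) + 0.3 * rng.standard_normal(t.size)

level = 3
coeffs = pywt.wavedec(ssta, "db4", level=level)  # [cA3, cD3, cD2, cD1]

# Reconstruct one component per frequency band by zeroing the other coefficients.
bands = []
for i in range(len(coeffs)):
    kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    bands.append(pywt.waverec(kept, "db4")[: ssta.size])

# Each band can now be modelled independently and the forecasts summed.
print(np.allclose(sum(bands), ssta, atol=1e-8))  # the bands add back to the signal
```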
Electrical resistivity tomography (ERT) has been used experimentally to detect shallow buried faults in urban areas in the past few years, with some progress and experience obtained. Based on the results from the Olympic Park in Beijing and from Shandong, Gansu, and Shanxi Provinces, we have generalized the method and procedure for inferring a discontinuity of electrical structures (DES) indicating a buried fault in urban areas from resistivity tomograms, together with its typical electrical features. In general, the layered feature of the electrical structure is first analyzed to preliminarily define whether a DES exists in the target area. Resistivity contours in the tomograms are then analyzed from deep to shallow: if they extend upward from the deep to the shallow and form an integral dislocation, a sharp flexure (convergence), or a gradient zone, a DES indicating a buried fault is inferred to exist. Finally, horizontal tracing is carried out to define the trend of the DES. The DES can be divided into three types: AB, ABA, and AC. In the present paper, the Zhangdian-Renhe fault system in Zibo city is used as an example to illustrate how to use the method to infer the location and spatial extension of a target fault. Geologic drilling holes were placed based on our research results, and the drilling logs confirm that the results are correct. The method of this paper is, however, neither exclusive nor rigid; it is expected to provide reference and assistance for inferring shallow buried faults in urban areas from resistivity tomograms in the future.
The research purpose is the invention (construction) of a formal logical inference of the Law of Conservation of Energy within a logically formalized axiomatic epistemology-and-axiology theory Sigma, from a precisely defined assumption of the a-priori-ness of knowledge. For realizing this aim, the following work has been done: 1) a two-valued algebraic system of formal axiology has been defined precisely and applied to the philosophy of physics proper, namely, to an almost unknown (not yet recognized) formal-axiological aspect of the physical law of conservation of energy; 2) the formal axiomatic epistemology-and-axiology theory Sigma has been defined precisely and applied to physics proper for realizing the above-indicated purpose. Thus, a discrete mathematical model of the relationship between the philosophy of physics and universal epistemology united with formal axiology has been constructed. Results: 1) By accurately computing relevant compositions of evaluation-functions within the discrete mathematical model, it is demonstrated that a formal-axiological analog of the great conservation law of physics proper is a formal-axiological law of the two-valued algebra of metaphysics. (A precise algorithmic definition of the unfamiliar notion “formal-axiological law of algebra of metaphysics” is given.) 2) The hitherto unpublished and significantly new nontrivial scientific result presented in this article is a formal logical inference of the law of conservation of energy within the formal axiomatic theory Sigma from the conjunction of the formal-axiological analog of the law of conservation of energy and the assumption of the a-priori-ness of knowledge.
Human Immunodeficiency Virus (HIV) dynamics in Africa are characterised by sparse sampling of DNA sequences from infected individuals. Some sub-groups are more at risk than the general population and have higher infectivity rates. We developed a likelihood inference model of a multi-type birth-death process that can be used to make inference for an HIV epidemic in an African setting. The likelihood incorporates a probability of removal from the infectious pool. We simulated trees and performed parameter inference on the simulated trees, as well as investigating whether the model distinguishes between heterogeneous and homogeneous dynamics. The model makes fairly good parameter inferences and distinguishes well between heterogeneous and homogeneous dynamics. Parameter estimation was also performed under a sparse sampling scenario. We investigated whether trees obtained from a structured host population are more balanced than those from a non-structured host population using tree statistics that measure tree balance and imbalance; based on the Colless and Sackin indices, trees from the non-structured population were more balanced.
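For readers unfamiliar with the balance statistics mentioned above, here is a minimal sketch of the Colless imbalance index on a toy binary tree. The nested-tuple encoding is an illustrative convenience, not the data format used in the study.

```python
# Minimal sketch of the Colless imbalance statistic used to compare tree shapes.
# A binary tree is encoded as nested tuples; leaves are strings. This toy
# representation is ours, not the paper's data format.
def n_leaves(node):
    if isinstance(node, str):
        return 1
    left, right = node
    return n_leaves(left) + n_leaves(right)

def colless(node):
    """Sum over internal nodes of |#leaves(left) - #leaves(right)|."""
    if isinstance(node, str):
        return 0
    left, right = node
    return abs(n_leaves(left) - n_leaves(right)) + colless(left) + colless(right)

balanced = (("a", "b"), ("c", "d"))        # perfectly balanced tree: index 0
ladder = ((("a", "b"), "c"), "d")          # caterpillar tree: maximal imbalance
print(colless(balanced), colless(ladder))  # -> 0 3
```

Lower values indicate a more balanced tree, which is how the study compares trees from structured and non-structured host populations.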
Mobile phones are becoming a primary platform for information access. A major aspect of ubiquitous computing is context-aware applications, which collect information about the environment the user is in and use this information to provide better service and improve the user experience. Location awareness makes certain applications possible, e.g., recommending nearby businesses and tracking estimated routes. An Android application is able to collect useful Wi-Fi information without registering a location listener with a network-based provider. We passively collected the IDs of Wi-Fi access points and the received signal strengths, developed and implemented an algorithm to analyse the data, and designed heuristics to infer the location of the device over time, all without ever connecting to the network, thus maximally preserving the privacy of the user.
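One common heuristic in this setting is a centroid of known access-point positions weighted by received signal strength; the sketch below shows that idea only, not the heuristics developed in the paper. The access-point coordinates and the dBm-to-weight mapping are assumptions for illustration.

```python
# Sketch of one common location heuristic: a centroid of known access-point
# positions weighted by received signal strength. AP coordinates and the
# dBm-to-weight mapping below are illustrative assumptions, not the paper's.
observations = {          # BSSID -> RSSI in dBm (stronger = closer)
    "aa:bb:cc:00:00:01": -45,
    "aa:bb:cc:00:00:02": -70,
    "aa:bb:cc:00:00:03": -60,
}
ap_positions = {           # BSSID -> (x, y) in metres, from a prior survey
    "aa:bb:cc:00:00:01": (0.0, 0.0),
    "aa:bb:cc:00:00:02": (30.0, 0.0),
    "aa:bb:cc:00:00:03": (15.0, 20.0),
}

def estimate_position(obs, positions):
    # Map RSSI to a positive weight; 10**(rssi/20) decays roughly like inverse
    # path loss, so nearer (stronger) APs dominate the centroid.
    weights = {b: 10 ** (rssi / 20.0) for b, rssi in obs.items() if b in positions}
    total = sum(weights.values())
    x = sum(w * positions[b][0] for b, w in weights.items()) / total
    y = sum(w * positions[b][1] for b, w in weights.items()) / total
    return x, y

print(estimate_position(observations, ap_positions))
```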
Deciding the penalty of a law case has always been a complex process, which may involve much coordination. Despite judicial studies based on rules and conditions, artificial intelligence and machine learning have rarely been used to study the problem of penalty inference, leaving the large number of law cases, as well as the various factors among them, untouched. This paper aims to incorporate state-of-the-art artificial intelligence methods to explore to what extent this problem can be alleviated. We first analyze 145 000 law cases and observe that there are two sorts of labels, temporal labels and spatial labels, which have unique characteristics: temporal labels and spatial labels tend to converge towards the final penalty, on condition that the cases are of the same category. In light of this, we propose a latent-class probabilistic generative model, namely the Penalty Topic Model (PTM), to infer the topics of law cases and the temporal and spatial patterns of topics embedded in the case judgments. The learnt knowledge is then utilized to automatically cluster all cases in a unified way. We conduct extensive experiments to evaluate the performance of the proposed PTM on a real large-scale dataset of law cases, and the experimental results show the superiority of the proposed PTM.
Robustness against measurement uncertainties is crucial for gas turbine engine diagnosis. While current research focuses mainly on measurement noise, measurement bias remains challenging. This study proposes a novel performance-based fault detection and identification (FDI) strategy for twin-shaft turbofan gas turbine engines and addresses these uncertainties through a first-order Takagi-Sugeno-Kang (TSK) fuzzy inference system. To handle ambient condition changes, we use parameter correction to preprocess the raw measurement data, which reduces the FDI system's complexity. Additionally, the power-lever angle is set as a scheduling parameter to reduce the number of rules in the TSK-based FDI system. The data for designing, training, and testing the proposed FDI strategy are generated using a component-level turbofan engine model. The antecedent and consequent parameters of the TSK-based FDI system are optimized using the particle swarm optimization algorithm and ridge regression. A robust structure combining a specialized fuzzy inference system with the TSK-based FDI system is proposed to handle measurement biases. The performance of the first-order TSK-based FDI system and the robust FDI structure is evaluated through comprehensive simulation studies. Comparative studies confirm the superior accuracy of the first-order TSK-based FDI system in fault detection, isolation, and identification. The robust structure demonstrates a 2%-8% improvement in the success rate index under relatively large measurement bias conditions, indicating excellent robustness. Accuracy against significant bias values and computation time are also evaluated, suggesting that the proposed robust structure has desirable online performance. Overall, this study proposes a novel FDI strategy that effectively addresses measurement uncertainties.
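To make the first-order TSK mechanism concrete, the sketch below shows the standard inference step: each rule has a Gaussian antecedent over the inputs and a linear (first-order) consequent, and the output is the firing-strength-weighted average of the consequents. The two rules and their parameters are placeholders, not the paper's engine-diagnosis rule base (which is optimized with particle swarm optimization and ridge regression).

```python
# Minimal first-order Takagi-Sugeno-Kang (TSK) inference sketch. The rule
# parameters below are illustrative placeholders only.
import numpy as np

def gauss(x, c, s):
    return np.exp(-0.5 * ((x - c) / s) ** 2)

# Rules: (antecedent centres, antecedent spreads, consequent coefficients [a, b, bias]).
rules = [
    (np.array([0.2, 0.3]), np.array([0.15, 0.2]), np.array([1.0, -0.5, 0.1])),
    (np.array([0.8, 0.7]), np.array([0.2, 0.25]), np.array([-0.3, 0.9, 0.4])),
]

def tsk_infer(x):
    x = np.asarray(x, dtype=float)
    firing = np.array([np.prod(gauss(x, c, s)) for c, s, _ in rules])       # rule strengths
    consequents = np.array([coef[:-1] @ x + coef[-1] for _, _, coef in rules])  # linear outputs
    return float(firing @ consequents / firing.sum())                        # weighted average

print(tsk_infer([0.25, 0.35]))   # output dominated by rule 1
print(tsk_infer([0.75, 0.65]))   # output dominated by rule 2
```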
The existing blockwise empirical likelihood (BEL) method blocks the observations or their analogues, which has proven useful in some dependent-data settings. In this paper, we introduce a new BEL (NBEL) method that blocks the scoring functions in high-dimensional cases. We study the construction of confidence regions for the parameters in spatial autoregressive models with spatial autoregressive disturbances (SARAR models) with a high dimension of parameters by using the NBEL method. It is shown that the NBEL ratio statistics are asymptotically χ^(2)-type distributed, and they are used to obtain NBEL-based confidence regions for the parameters in SARAR models. A simulation study is conducted to compare the performance of the NBEL and the usual EL methods.
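For orientation, the display below is the standard empirical-likelihood ratio for estimating equations; the NBEL method described above replaces the individual scores by blocks of scoring functions, with the exact blocking scheme given in the paper. This is background notation, not a reproduction of the paper's statistic.

```latex
% Standard empirical-likelihood ratio for estimating equations g_i(\theta);
% the NBEL method blocks the scoring functions rather than using them individually.
\mathcal{R}(\theta) \;=\; \max\Big\{ \prod_{i=1}^{n} n\pi_i \;:\;
  \pi_i \ge 0,\; \sum_{i=1}^{n} \pi_i = 1,\;
  \sum_{i=1}^{n} \pi_i\, g_i(\theta) = 0 \Big\},
\qquad
-2\log \mathcal{R}(\theta_0) \;\xrightarrow{\;d\;}\; \text{a } \chi^{2}\text{-type limit}.
```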
The primary objective of this study is to assess fluoride levels in groundwater samples using machine learning approaches, alongside traditional and fuzzy-logic-based health risk assessment, in the hard-rock Arjunanadi River basin, South India. Fluoride levels in the study area vary between 0.1 and 3.10 mg/L, with 32 samples exceeding the World Health Organization (WHO) standard of 1.5 mg/L. Hydrogeochemical analyses (Durov and Gibbs) clearly show that the overall water chemistry is primarily influenced by simple dissolution, mixing, and rock-water interactions, indicating that geogenic sources are the predominant contributors to fluoride in the study area. Around 446.5 km^(2) is considered at risk. In the predictive analysis, five machine learning (ML) models were used, with the AdaBoost model performing better than the others, achieving 96% accuracy and a 4% error rate. The Traditional Health Risk Assessment (THRA) results indicate that 65% of samples imply high susceptibility to dental fluorosis, while 12% of samples imply high susceptibility to skeletal fluorosis in young age groups. The Fuzzy Inference System (FIS) model effectively manages ambiguity and linguistic factors, which are crucial when addressing health risks linked to groundwater fluoride contamination. In this model, the input variables are fluoride concentration, individual age, and ingestion rate, while the output variables are dental caries risk, dental fluorosis, and skeletal fluorosis. The overall results indicate that increased ingestion rates and prolonged exposure to contaminated water make adults and elderly people, along with very young and young age groups, vulnerable to dental and skeletal fluorosis. This study is an essential resource for local authorities, healthcare officials, and communities, aiding the mitigation of health risks associated with groundwater contamination and enhancing quality of life through improved water management and health risk assessment, in line with Sustainable Development Goals (SDGs) 3 and 6, thereby contributing to a cleaner and healthier society.
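As a rough illustration of how a fuzzy inference system turns the abstract's inputs (fluoride concentration, age, ingestion rate) into a risk score, here is a toy Mamdani-style sketch with a single combined output. The membership breakpoints and the two rules are assumptions made for illustration; they are not the rule base or thresholds from the study.

```python
# Toy Mamdani-style fuzzy risk sketch with the abstract's inputs and one combined
# fluorosis-risk output. Membership breakpoints and rules are illustrative only.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with apex at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def risk(fluoride_mg_l, age_yr, ingestion_l_day):
    # Antecedent memberships (assumed breakpoints).
    f_high = tri(fluoride_mg_l, 1.0, 2.0, 4.0)
    f_low = tri(fluoride_mg_l, 0.0, 0.5, 1.5)
    young = tri(age_yr, 0.0, 5.0, 15.0)
    intake_high = tri(ingestion_l_day, 1.5, 2.5, 4.0)

    # Two illustrative rules:
    #   high fluoride AND (young OR high intake) -> high risk
    #   low fluoride                              -> low risk
    w_high = min(f_high, max(young, intake_high))
    w_low = f_low

    # Clip the output memberships by the rule strengths, aggregate, and take the centroid.
    u = np.linspace(0.0, 1.0, 101)
    agg = np.maximum(np.minimum(w_high, tri(u, 0.4, 0.8, 1.2)),
                     np.minimum(w_low, tri(u, -0.2, 0.2, 0.6)))
    return float((u * agg).sum() / (agg.sum() + 1e-9))

print(round(risk(2.4, 6, 2.8), 2))   # high fluoride, young, heavy intake -> high risk
print(round(risk(0.4, 35, 1.0), 2))  # low fluoride -> low risk
```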
Edge Machine Learning (EdgeML) and Tiny Machine Learning (TinyML) are fast-growing fields that bring machine learning to resource-constrained devices, allowing real-time data processing and decision-making at the network's edge. However, the complexity of model conversion techniques, diverse inference mechanisms, and varied learning strategies make designing and deploying these models challenging. Additionally, deploying TinyML models on resource-constrained hardware with specific software frameworks has broadened EdgeML's applications across various sectors. These factors underscore the necessity for a comprehensive literature review, as current reviews do not systematically cover the most recent findings on these topics. Consequently, this review provides a comprehensive overview of state-of-the-art techniques in model conversion, inference mechanisms, and learning strategies within EdgeML, and in deploying these models on resource-constrained edge devices using TinyML. It identifies 90 research articles published between 2018 and 2025, categorizing them into two main areas: (1) model conversion, inference, and learning strategies in EdgeML and (2) deploying TinyML models on resource-constrained hardware using specific software frameworks. In the first category, the synthesis of selected research articles compares and critically reviews various model conversion techniques, inference mechanisms, and learning strategies. In the second category, the synthesis identifies and elaborates on the major development boards, software frameworks, sensors, and algorithms used in applications across six major sectors. As a result, this article provides valuable insights for researchers, practitioners, and developers, assisting them in choosing suitable model conversion techniques, inference mechanisms, learning strategies, hardware development boards, software frameworks, sensors, and algorithms tailored to their specific needs and applications across various sectors.
To address the high experimental cost of ammunition, the lack of field test data, and the difficulty of applying classical statistical methods to ammunition hit probability estimation, this paper assumes that the projectile dispersion of ammunition follows a two-dimensional joint normal distribution and proposes a new Bayesian inference method for ammunition hit probability based on the normal-inverse-Wishart distribution. First, the conjugate joint prior distribution of the projectile dispersion characteristic parameters is determined to be a normal-inverse-Wishart distribution, and the hyperparameters of the prior distribution are estimated from simulation data and historical measured data. Second, the field test data are integrated via Bayes' formula to obtain the joint posterior distribution of the projectile dispersion characteristic parameters, from which the hit probability of the ammunition is estimated. Finally, compared with the binomial distribution method, the proposed method can take the dispersion information of ammunition projectiles into account, so the hit probability information is used more fully, and the hit probability results are closer to the field shooting test samples. The method has strong applicability and yields more accurate hit probability estimates.
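The sketch below illustrates the conjugate normal-inverse-Wishart (NIW) update for a two-dimensional normal dispersion model, followed by a Monte Carlo estimate of hit probability on a circular target. The prior hyperparameters, the toy impact points, and the 5 m target radius are illustrative assumptions, not values from the paper.

```python
# Sketch of the NIW conjugate update for a 2-D normal projectile-dispersion model,
# then a Monte Carlo hit-probability estimate. All numerical values are made up.
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(1)

# NIW prior: mu | Sigma ~ N(mu0, Sigma/kappa0), Sigma ~ IW(Psi0, nu0).
mu0, kappa0, nu0, Psi0 = np.zeros(2), 2.0, 5.0, np.eye(2) * 4.0

# A few field impact points (x, y) in metres relative to the aim point.
x = np.array([[1.2, -0.4], [0.5, 0.9], [-0.8, 0.3], [1.9, -1.1], [0.2, 0.6]])
n, xbar = len(x), x.mean(axis=0)
S = (x - xbar).T @ (x - xbar)

# Conjugate posterior hyperparameters.
kappa_n = kappa0 + n
nu_n = nu0 + n
mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
Psi_n = Psi0 + S + (kappa0 * n / kappa_n) * np.outer(xbar - mu0, xbar - mu0)

# Monte Carlo: draw (Sigma, mu) from the posterior, then an impact point, and
# count hits inside a 5 m radius around the aim point.
hits, draws = 0, 5000
for _ in range(draws):
    Sigma = invwishart.rvs(df=nu_n, scale=Psi_n, random_state=rng)
    mu = rng.multivariate_normal(mu_n, Sigma / kappa_n)
    impact = rng.multivariate_normal(mu, Sigma)
    hits += np.hypot(*impact) <= 5.0
print("posterior hit probability ~", hits / draws)
```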
Offshore drilling costs are high, and the downhole environment is complex. Improving the rate of penetration (ROP) can effectively shorten offshore drilling cycles and improve economic benefits. It is difficult for current ROP models to guarantee both prediction accuracy and robustness at the same time. To address these issues, a new ROP prediction model was developed in this study, which treats ROP as a time-series signal (the ROP signal). The model is based on the temporal convolutional network (TCN) framework and integrates ensemble empirical mode decomposition (EEMD) and Bayesian network causal inference (BN); it is named EEMD-BN-TCN. Within the proposed model, the EEMD decomposes the original ROP signal into multiple sets of sub-signals. The BN determines the causal relationships between the sub-signals and the key physical parameters (weight on bit and revolutions per minute) and carries out a preliminary reconstruction of the sub-signals based on these relationships. The TCN then predicts the signals reconstructed by the BN. When this model was applied to an actual production well, the average absolute percentage error of the prediction decreased from 18.4% with the TCN alone to 9.2% with EEMD-BN-TCN. In addition, compared with other models, the EEMD-BN-TCN can improve the decomposed ROP signal by regulating weight on bit and revolutions per minute, ultimately enhancing ROP.
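The sketch below shows only the EEMD stage of such a pipeline: splitting a noisy ROP-like series into intrinsic mode functions that downstream models could consume. It assumes the third-party EMD-signal package (imported as PyEMD); the synthetic signal and the number of ensemble trials are illustrative choices, and this is not the paper's implementation.

```python
# Sketch of the EEMD decomposition stage only (not the BN or TCN components).
# Assumes the EMD-signal package, imported as PyEMD.
import numpy as np
from PyEMD import EEMD

t = np.linspace(0, 10, 500)
rop = (20 + 3 * np.sin(2 * np.pi * 0.4 * t)
       + 0.8 * np.random.default_rng(2).standard_normal(t.size))

eemd = EEMD(trials=50)          # ensemble of noise-assisted EMD runs
imfs = eemd.eemd(rop, t)        # rows are sub-signals, ordered high to low frequency

print("number of sub-signals:", imfs.shape[0])
# EEMD reconstruction is only approximate because of the noise averaging.
print("residual of summing the sub-signals:", np.abs(imfs.sum(axis=0) - rop).max())
```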
Unmanned Aerial Vehicles (UAVs) coupled with deep learning models such as Convolutional Neural Networks (CNNs) have been widely applied across numerous domains, including agriculture, smart city monitoring, and fire rescue operations, owing to their flexibility and versatility. However, the computation-intensive and latency-sensitive nature of CNNs presents a formidable obstacle to their deployment on resource-constrained UAVs. Some early studies have explored a hybrid approach that dynamically switches between lightweight and complex models to balance accuracy and latency. However, they often overlook scenarios involving multiple concurrent CNN streams, where competition for resources between streams can substantially impact latency and overall system performance. In this paper, we first investigate the deployment of both lightweight and complex models for multiple CNN streams in a UAV swarm. Specifically, we formulate an optimization problem to minimize the total latency across multiple CNN streams, under constraints on UAV memory and the accuracy requirement of each stream. To address this problem, we propose an algorithm called Adaptive Model Switching of collaborative inference for Multi-CNN streams (AMSM) to identify an inference strategy with low latency. Simulation results demonstrate that the proposed AMSM algorithm consistently achieves the lowest latency while meeting the accuracy requirements, compared to benchmark algorithms.
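To make the optimization problem concrete, here is a toy version of the underlying assignment: choose a lightweight or a complex model for each CNN stream to minimize total latency, subject to a memory budget and per-stream accuracy requirements. The exhaustive search and all numbers are illustrative; this is not the AMSM algorithm itself.

```python
# Toy version of the assignment problem behind AMSM, solved by brute force.
# Model profiles, stream requirements, and the memory budget are made up.
from itertools import product

# (latency_ms, accuracy, memory_MB) for the lightweight and complex variants.
models = {"light": (12.0, 0.82, 40), "complex": (45.0, 0.93, 160)}
streams = [
    {"name": "fire",    "min_acc": 0.90},
    {"name": "traffic", "min_acc": 0.80},
    {"name": "crowd",   "min_acc": 0.80},
]
memory_budget_mb = 260

best = None
for choice in product(models, repeat=len(streams)):
    lat = sum(models[m][0] for m in choice)
    mem = sum(models[m][2] for m in choice)
    ok_acc = all(models[m][1] >= s["min_acc"] for m, s in zip(choice, streams))
    if ok_acc and mem <= memory_budget_mb and (best is None or lat < best[0]):
        best = (lat, choice)

print(best)   # lowest-latency assignment meeting the accuracy and memory limits
```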
This study investigated forest recovery in the Atlantic Rainforest and Rupestrian Grassland of Brazil using the diffusive-logistic growth (DLG) model. This model simulates vegetation growth in the two mountain biomes considering spatial location, time, and two key parameters: diffusion rate and growth rate. A Bayesian framework is employed to analyze the model's parameters and assess prediction uncertainties. Satellite imagery from 1992 and 2022 was used for model calibration and validation. By solving the DLG model using the finite difference method, we predicted a 6.6%–51.1% increase in vegetation density for the Atlantic Rainforest and a 5.3%–99.9% increase for the Rupestrian Grassland over 30 years, with the latter showing slower recovery but achieving a better model fit (lower RMSE) compared to the Atlantic Rainforest. The Bayesian approach revealed well-defined parameter distributions and lower parameter values for the Rupestrian Grassland, supporting the slower recovery prediction. Importantly, the model achieved good agreement with observed vegetation patterns in unseen validation data for both biomes. While there were minor spatial variations in accuracy, the overall distributions of predicted and observed vegetation density were comparable. Furthermore, this study highlights the importance of considering uncertainty in model predictions. Bayesian inference allowed us to quantify this uncertainty, demonstrating that the model's performance can vary across locations. Our approach provides valuable insights into forest regeneration process uncertainties, enabling comparisons of modeled scenarios at different recovery stages for better decision-making in these critical mountain biomes.
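As a minimal illustration of the finite-difference solution approach mentioned above, the sketch below advances a one-dimensional diffusive-logistic growth equation, du/dt = D d²u/dx² + r u (1 − u/K), with an explicit scheme. The grid, the diffusion and growth rates, and the carrying capacity are illustrative; the study works with calibrated parameters on two-dimensional satellite-derived maps.

```python
# Sketch of an explicit finite-difference step for a 1-D diffusive-logistic
# growth (DLG) model. All parameter values below are illustrative.
import numpy as np

D, r, K = 0.05, 0.8, 1.0           # diffusion rate, growth rate, carrying capacity
dx, dt, steps = 1.0, 0.1, 300      # dt chosen small enough for explicit stability

u = np.zeros(100)
u[45:55] = 0.5                     # initial patch of vegetation density

for _ in range(steps):
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2   # periodic boundary
    u = u + dt * (D * lap + r * u * (1 - u / K))             # diffusion + logistic growth

print("mean vegetation density after 30 time units:", round(float(u.mean()), 3))
```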
Published proof test coverage (PTC) estimates for emergency shutdown valves (ESDVs) show only moderate agreement and are predominantly opinion-based. A Failure Modes, Effects, and Diagnostics Analysis (FMEDA) was undertaken using component failure rate data to predict PTC for a full stroke test and a partial stroke test. Given the subjective and uncertain aspects of the FMEDA approach, specifically the selection of component failure rates and the determination of the probability of detecting failure modes, a Fuzzy Inference System (FIS) was proposed to manage the data, addressing the inherent uncertainties. Fuzzy inference systems have been used previously for various FMEA-type assessments, but this is the first time an FIS has been employed for use with FMEDA. ESDV PTC values were generated from both the standard FMEDA and the fuzzy-FMEDA approaches using data provided by FMEDA experts. This work demonstrates that fuzzy inference systems can address the subjectivity inherent in FMEDA data, enabling reliable estimates of ESDV proof test coverage for both full and partial stroke tests. This facilitates optimized maintenance planning while ensuring safety is not compromised.
Protocol Reverse Engineering (PRE) is of great practical importance in Internet security-related fields such as intrusion detection, vulnerability mining, and protocol fuzzing. For unknown binary protocols with fixed-length fields, the accurate identification of field boundaries has a great impact on the subsequent analysis and final performance. Hence, this paper proposes a new protocol segmentation method based on information-theoretic statistical analysis for binary protocols, formulating the field segmentation of unsupervised binary protocols as a probabilistic inference problem and modeling its uncertainty. Specifically, we design four constructions relating entropy changes to protocol field segmentation, introduce random variables, and construct joint probability distributions from traffic sample observations. Probabilistic inference is then performed to identify the likely protocol segmentation points. Extensive trials on nine common public and industrial control protocols show that the proposed method yields higher-quality protocol segmentation results.
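The sketch below computes the kind of per-offset entropy signal that such segmentation relies on: the byte-value entropy at each offset across many messages, with sharp entropy changes flagged as candidate field boundaries. The toy message format and the simple threshold are illustrative assumptions; the paper performs full probabilistic inference over such statistics rather than plain thresholding.

```python
# Sketch of byte-position entropy across messages of a fixed-format protocol,
# with abrupt entropy changes flagged as candidate field boundaries.
# The toy message layout and the 1.5-bit threshold are illustrative only.
import math
import os
from collections import Counter

# Toy messages: two constant header bytes, a low-entropy counter byte,
# a constant padding byte, then a 4-byte high-entropy payload.
msgs = [bytes([0x01, 0x10, i % 4, 0x00]) + os.urandom(4) for i in range(200)]

def offset_entropy(messages, pos):
    counts = Counter(m[pos] for m in messages)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

entropies = [offset_entropy(msgs, p) for p in range(len(msgs[0]))]
boundaries = [p for p in range(1, len(entropies))
              if abs(entropies[p] - entropies[p - 1]) > 1.5]

print([round(e, 2) for e in entropies])
print("candidate field boundaries at offsets:", boundaries)
```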
Understanding the characteristics and driving factors behind changes in vegetation ecosystem resilience is crucial for mitigating both current and future impacts of climate change. Despite recent advances in resilience research, significant knowledge gaps remain regarding the drivers of resilience changes. In this study, we investigated the dynamics of ecosystem resilience across China and identified potential driving factors using the kernel normalized difference vegetation index (kNDVI) from 2000 to 2020. Our results indicate that vegetation resilience in China has exhibited an increasing trend over the past two decades, with a notable breakpoint occurring around 2012. We found that precipitation was the dominant driver of changes in ecosystem resilience, accounting for 35.82% of the variation across China, followed by monthly average maximum temperature (Tmax) and vapor pressure deficit (VPD), which explained 28.95% and 28.31% of the variation, respectively. Furthermore, we revealed that daytime and nighttime warming has asymmetric impacts on vegetation resilience, with temperature factors such as Tmin and Tmax becoming more influential, while the importance of precipitation slightly decreases after the resilience change point. Overall, our study highlights the key roles of water availability and temperature in shaping vegetation resilience and underscores the asymmetric effects of daytime and nighttime warming on ecosystem resilience.
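For readers unfamiliar with the kNDVI index used as the vegetation signal above, the short sketch below applies the commonly used simplification in which, with an RBF kernel and bandwidth sigma = 0.5 (NIR + Red), kNDVI reduces to tanh(NDVI²) (Camps-Valls et al., 2021). The band reflectance values are made up for illustration.

```python
# Sketch of the kernel NDVI (kNDVI) under the common bandwidth choice
# sigma = 0.5 * (NIR + Red), which gives kNDVI = tanh(NDVI**2).
# Reflectance values below are illustrative, not from the study's data.
import numpy as np

nir = np.array([0.45, 0.30, 0.55])   # near-infrared reflectance
red = np.array([0.10, 0.20, 0.08])   # red reflectance

ndvi = (nir - red) / (nir + red)
kndvi = np.tanh(ndvi ** 2)

print("NDVI :", np.round(ndvi, 3))
print("kNDVI:", np.round(kndvi, 3))
```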