This paper is concerned with the proper use of simple hypothesis tests in positive accounting research. Both the usefulness and the limitations of the technique are dealt with in detail.
Schizophrenia (SZ) is one of the most common mental diseases. Its main characteristics are abnormal social behavior and an inability to correctly understand real things. In recent years, the magnetic resonance imaging (MRI) technique has been widely used to study SZ. However, it remains a great challenge to reveal the essential information contained in MRI data. In this paper, we propose a biomarker selection approach based on multiple hypothesis testing techniques to explore the differences between SZ patients and healthy controls using both functional and structural MRI data, in which the biomarkers represent both abnormal brain functional connectivity and abnormal brain regions. By applying the biomarker selection approach, six abnormal brain regions and twenty-three abnormal functional connections are identified in the brains of SZ patients. Compared with healthy controls, the significantly reduced gray matter volumes are mainly distributed in the limbic lobe and the basal ganglia, while the significantly increased gray matter volumes are distributed in the frontal gyrus. Meanwhile, the significantly strengthened connections are those between the middle frontal gyrus and the superior occipital gyrus, between the superior occipital gyrus and the middle occipital gyrus, and between the middle occipital gyrus and the fusiform gyrus; the remaining connections are significantly weakened.
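The abstract does not specify which multiple-testing correction is used; as a hedged illustration only, the sketch below applies per-region two-sample tests followed by Benjamini-Hochberg false-discovery-rate control, one common way of selecting group-difference biomarkers. The toy data, variable names (gm_sz, gm_hc) and the use of SciPy/statsmodels are assumptions, not the authors' pipeline.

```python
# Hypothetical sketch: per-region tests with FDR control (not the authors' exact method).
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n_regions = 90                                   # e.g., one value per atlas region (assumed)
gm_sz = rng.normal(0.0, 1.0, (40, n_regions))    # gray-matter volumes, SZ group (toy data)
gm_hc = rng.normal(0.1, 1.0, (55, n_regions))    # gray-matter volumes, healthy controls (toy data)

# One two-sample t-test per region gives a vector of p-values.
_, pvals = stats.ttest_ind(gm_sz, gm_hc, axis=0, equal_var=False)

# Benjamini-Hochberg keeps the false discovery rate across all regions at 5%.
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
selected_regions = np.flatnonzero(reject)        # candidate "abnormal" regions
print(selected_regions, p_adj[selected_regions])
```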
Eliminating false intersections (deghosting) is a difficult problem in a passive cross-location system. Using a decentralized decision fusion topology, a new deghosting algorithm derived from hypothesis testing theory is developed. It exploits the difference between ghosts and true targets in the statistical error between their projection angles at a deghosting sensor and the angles measured by that sensor, and constructs a corresponding test statistic. Under the Gaussian assumption, ghosts and true targets are discriminated using the chi-square distribution. Simulation results show the feasibility of the algorithm.
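As a rough, hedged sketch of this kind of decision rule (not the paper's exact statistic): if the projection-angle residuals at the deghosting sensor are zero-mean Gaussian with known standard deviation under the true-target hypothesis, the normalized squared residual can be compared with a chi-square quantile. The function name and threshold level below are illustrative assumptions.

```python
# Hedged sketch of a chi-square gating decision for deghosting (illustrative only).
import numpy as np
from scipy.stats import chi2

def is_true_target(angle_residuals, sigma, alpha=0.05):
    """angle_residuals: differences between predicted and measured projection
    angles at the deghosting sensor (radians); sigma: their assumed std dev."""
    r = np.asarray(angle_residuals, dtype=float)
    statistic = np.sum((r / sigma) ** 2)          # ~ chi-square(k) under the true-target hypothesis
    threshold = chi2.ppf(1.0 - alpha, df=r.size)  # accept as a true target below the threshold
    return statistic <= threshold

# Example: small residuals pass the gate, a ghost's large residuals do not.
print(is_true_target([0.002, -0.001], sigma=0.002))
print(is_true_target([0.03, 0.04], sigma=0.002))
```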
In this paper, large sample properties of resampling tests of hypotheses on the population mean, resampled according to the empirical likelihood and the Kullback-Leibler criteria, are investigated. It is proved that under the null hypothesis both of them are superior to the classical test.
Statistical testing models of plate tectonic units and the hypothesis of their rigidity are presented using dense geodetic data, and to a certain extent the established statistic can be regarded as a quantitative index for comparing the rigidity degrees of different blocks. Several conclusions about the global megaplates and the regional tectonics of China are tested and verified by actual calculations, which testifies to the effectiveness of this method in testing the degree of rigidity and delineating block boundaries.
Testing the equality of percentiles (quantiles) between populations is an effective method for robust, nonparametric comparison, especially when the distributions are asymmetric or irregularly shaped. Unlike global nonparametric tests for homogeneity such as the Kolmogorov-Smirnov test, testing the equality of a set of percentiles (i.e., a percentile profile) yields an estimate of the location and extent of the differences between the populations along the entire domain. The Wald test using bootstrap estimates of the variance of the order statistics provides a unified method for hypothesis testing of functions of the population percentiles. Simulation studies are conducted to show the performance of the method under various scenarios and to give suggestions on its use. Several examples are given to illustrate some useful applications to real data.
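A minimal sketch of the idea, under assumptions not stated in the abstract (two independent samples, a fixed set of percentiles, and summing the two samples' bootstrap covariances): bootstrap the percentile estimates in each sample, form the vector of differences, and compare the Wald statistic with a chi-square quantile. Function and variable names are illustrative.

```python
# Hedged sketch of a Wald test on a percentile profile (not the paper's exact procedure).
import numpy as np
from scipy.stats import chi2

def wald_percentile_test(x, y, probs=(0.1, 0.25, 0.5, 0.75, 0.9), n_boot=2000, seed=0):
    rng = np.random.default_rng(seed)
    qx, qy = np.quantile(x, probs), np.quantile(y, probs)
    # Bootstrap the percentile estimates to get their covariance in each sample.
    bx = np.array([np.quantile(rng.choice(x, x.size, replace=True), probs) for _ in range(n_boot)])
    by = np.array([np.quantile(rng.choice(y, y.size, replace=True), probs) for _ in range(n_boot)])
    v = np.cov(bx, rowvar=False) + np.cov(by, rowvar=False)
    d = qx - qy                                   # profile of percentile differences
    w = d @ np.linalg.solve(v, d)                 # Wald statistic, ~ chi-square(k) under equality
    return w, chi2.sf(w, df=len(probs))

rng = np.random.default_rng(1)
w, p = wald_percentile_test(rng.exponential(1.0, 300), rng.exponential(1.3, 300))
print(w, p)
```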
We consider an extension to Sequential Probability Ratio Tests for the case in which we have uncertain costs but also the opportunity to learn about these in an adaptive manner. In doing so we demonstrate the effects that allowing uncertainty has on observation cost and on the costs associated with Type I and Type II error. The value of information relating to modelled uncertainties is derived, and the case of statistical dependence between the parameter affecting the decision outcome and the parameter affecting the unknown cost is also examined. Numerical examples of the derived theory are provided, along with a simulation comparing this adaptive learning framework to the classical one.
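For context only, here is a minimal sketch of the classical Wald SPRT with fixed target Type I/II error rates; the adaptive, uncertain-cost extension discussed in the abstract is not reproduced. The Gaussian observation model and parameter values are assumptions for illustration.

```python
# Hedged sketch: classical Wald SPRT for the mean of Gaussian observations (illustrative only).
import numpy as np

def sprt_gaussian(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
    """Sequentially test H0: mean=mu0 vs H1: mean=mu1; returns ('H0'|'H1'|'continue', n_used)."""
    upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Log-likelihood ratio increment for one Gaussian observation.
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "continue", n

rng = np.random.default_rng(2)
print(sprt_gaussian(rng.normal(1.0, 1.0, 200)))   # data drawn under H1
```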
The use of the statistical hypothesis testing procedure to determine Type I and Type II errors was linked to the measurement of sensitivity and specificity in clinical trial tests and experimental pathogen detection techniques. A theoretical analysis of establishing these types of errors was made and compared with the determination of false positives, false negatives, true positives and true negatives. Experimental laboratory detection methods used to detect Cryptosporidium spp. were used to highlight the relationship between hypothesis testing, sensitivity, specificity and predictive values. The study finds that sensitivity and specificity for the two laboratory methods used for Cryptosporidium detection were low, hence lowering the probability of detecting a "false null hypothesis" for the presence of Cryptosporidium in the water samples using either microscopy or PCR. Nevertheless, both detection procedures had high numbers of "true negatives", increasing their probability of failing to reject a "true null hypothesis", with a specificity of 1.00 for both the microscopy and PCR laboratory detection methods.
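The correspondence used in the abstract can be made concrete with a short, hedged worked example (the counts below are hypothetical, not the study's data): taking H0 as "no pathogen present", a false positive is a Type I error and a false negative is a Type II error, so specificity estimates 1 - alpha and sensitivity estimates 1 - beta (the power).

```python
# Hypothetical confusion-matrix counts (not the study's data), illustrating the mapping.
TP, FN = 18, 12      # samples that truly contain the pathogen
TN, FP = 50, 0       # samples that truly do not

sensitivity = TP / (TP + FN)   # P(test positive | pathogen present)  = 1 - beta (power)
specificity = TN / (TN + FP)   # P(test negative | pathogen absent)   = 1 - alpha
alpha = 1 - specificity        # Type I error rate: rejecting a true null (false positive)
beta = 1 - sensitivity         # Type II error rate: failing to reject a false null (false negative)

print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, alpha={alpha:.2f}, beta={beta:.2f}")
```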
Soil liquefaction is one of the complex research topics in geotechnical engineering and engineering geology. Especially after the 1964 Niigata earthquake (Japan) induced many soil liquefaction incidents, a variety of soil liquefaction studies were conducted and reported, including liquefaction potential assessment methods utilizing shear wave velocity (Vs) or SPT-N profiles (SPT: standard penetration test). This study used the Vs and SPT methods recommended by the National Center for Earthquake Engineering Research (NCEER) to examine which is more conservative, according to the assessment results on 41 liquefiable soil layers at sites in two major cities in Taiwan. Statistical hypothesis testing was used to make the analysis more quantitative and objective. Based on three sets of hypothesis tests, the hypothesis that the SPT method is more conservative than the Vs method was not rejected at a 5% level of significance.
In this paper, the problem of abnormal spectrum usage between satellite spectrum sharing systems is investigated to support multi-satellite spectrum coexistence. Given the cost of monitoring, the mobility of low-orbit satellites, and the directional nature of their signals, traditional monitoring methods are no longer suitable, especially in the case of multiple power levels. Mobile crowdsensing (MCS), as a new technology, can make full use of idle resources to complete a variety of sensing tasks. However, traditional MCS relies heavily on a centralized server and is vulnerable to single-point-of-failure attacks. We therefore replace the original centralized server with a blockchain-based distributed service provider to ensure security. In this work, we propose a blockchain-based MCS framework and explain in detail how it can achieve abnormal frequency behavior monitoring in an inter-satellite spectrum sharing system. Then, under a given false alarm probability, we propose an abnormal spectrum detection algorithm based on a mixed hypothesis test to maximize the detection probability in single power level and multiple power level scenarios, respectively. Finally, a Bad out of Good (BooG) detector is proposed to ease the computational pressure on the blockchain nodes. Simulation results show the effectiveness of the proposed framework.
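The paper's mixed hypothesis test and BooG detector are not specified here; as a hedged, generic illustration of detection under a false-alarm constraint, the sketch below sets an energy-detector threshold from a target false-alarm probability, assuming complex Gaussian noise of known variance. All names and parameters are assumptions.

```python
# Hedged sketch: energy detection with the threshold fixed by a false-alarm probability.
import numpy as np
from scipy.stats import chi2

def energy_detect(x, noise_var, pfa=1e-3):
    """x: complex baseband samples. Under H0 (noise only), 2*sum(|x|^2)/noise_var ~ chi2(2N)."""
    n = x.size
    statistic = 2.0 * np.sum(np.abs(x) ** 2) / noise_var
    threshold = chi2.ppf(1.0 - pfa, df=2 * n)
    return statistic > threshold        # True -> declare abnormal spectrum usage

rng = np.random.default_rng(3)
noise = (rng.normal(0, 1, 1000) + 1j * rng.normal(0, 1, 1000)) / np.sqrt(2)   # unit-variance noise
signal = 0.5 * np.exp(1j * 2 * np.pi * 0.1 * np.arange(1000))                 # toy unauthorized carrier
print(energy_detect(noise, 1.0), energy_detect(noise + signal, 1.0))
```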
We present here an alternative definition of the P-value for a statistical hypothesis test of a real-valued parameter of a continuous random variable X. Our approach uses neither the notion of Type I error nor the assumption that the null hypothesis is true. Instead, the new P-value involves the maximum likelihood estimator, which is usually available for a parameter such as the mean μ or standard deviation σ of a random variable X with a common distribution.
A statistical damage detection and condition assessment scheme for existing structures is developed. First, a virtual work error estimator is defined to express the discrepancy between a real structure and its analytical model, with which a system identification algorithm is derived using the improved Newton method. In order to investigate its properties in the face of measurement errors, the Monte Carlo method is introduced to simulate the measured data. Based on the identified results, their statistical distributions can be assumed, and the status of an existing structure can then be statistically evaluated by hypothesis tests. A 5-story, two-bay steel frame is used to carry out detailed numerical simulation studies, and the proposed scheme is shown to be effective.
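A hedged sketch of the statistical evaluation step only (the virtual work error estimator and identification algorithm are not reproduced): given Monte Carlo samples of an identified stiffness ratio under measurement noise, a one-sample test can check whether the ratio is significantly below its undamaged value of 1. The numbers below are toy values, not the paper's frame model.

```python
# Hedged sketch: statistically evaluating identified parameters (toy numbers, illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Stand-in for stiffness ratios (identified / nominal) from repeated Monte Carlo identifications.
identified_ratio = rng.normal(loc=0.92, scale=0.04, size=50)

# H0: the element is undamaged (mean ratio = 1); one-sided alternative: ratio < 1 indicates damage.
t_stat, p_two_sided = stats.ttest_1samp(identified_ratio, popmean=1.0)
p_one_sided = p_two_sided / 2 if t_stat < 0 else 1 - p_two_sided / 2
print(f"t={t_stat:.2f}, one-sided p={p_one_sided:.4f}, damage flagged: {p_one_sided < 0.05}")
```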
Online Food Delivery Platforms (OFDPs) have witnessed phenomenal growth in the past few years, especially this year due to the COVID-19 pandemic. The pandemic has forced many governments across the world to give momentum to OFD services and make their presence felt among customers. The presence of several multinational and national companies in this sector has intensified competition, and companies are trying to adopt various marketing strategies and explore the brand experience (BEX) dimensions that help enhance the brand equity (BE) of OFDPs. BEXs are critical for building brand loyalty (BL) and making companies profitable. Customers can have different kinds of brand experiences through feelings, emotions, affection, behavior, and intellect. The present research analyzes the factors affecting BEX and its impact on the BL and BE of OFDPs, and examines the mediating role of BL in the relationship between BEX and BE of OFDPs in the Indian context. A survey of 457 Indian customers was carried out. A questionnaire was used for data collection, and a mediation study was used to test the hypothesized relationships. Our computational analysis reveals that BEX influences the BL and BE of OFDPs. The study further indicates that BL mediates the relationship between BEX and BE of OFDPs. Effective marketing and relationship management practices will help companies enhance BEX, which in turn will enhance BL and raise the BE of their products. The study therefore provides a more thorough analysis of BEX constructs and their consequences than previous research. Some managerial implications, limitations, and directions for future research are also presented.
The relationship between wheat varieties and the occurrence of head blight was discussed, and a test of independence was carried out using a contingency table.
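A minimal hedged sketch of a contingency-table independence test of the kind described (the counts are hypothetical, not the study's observations): a chi-square test of independence between wheat variety and head-blight occurrence.

```python
# Hypothetical counts of head-blight occurrence for three wheat varieties (illustrative only).
import numpy as np
from scipy.stats import chi2_contingency

#                 blight   no blight
table = np.array([[12,        88],      # variety A
                  [25,        75],      # variety B
                  [ 8,        92]])     # variety C

chi2_stat, p_value, dof, expected = chi2_contingency(table)
print(f"chi2={chi2_stat:.2f}, dof={dof}, p={p_value:.4f}")  # small p -> occurrence depends on variety
```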
Nonlinear initial alignment is a significant research topic for the strapdown inertial navigation system (SINS). The cubature Kalman filter (CKF) is a popular tool for nonlinear initial alignment. The standard CKF assumes that the statistics of the observation noise are given before the filtering process; therefore, any unpredicted outliers in the observation noise will decrease the stability of the filter. In view of this problem, an improved CKF method with robustness is proposed. Multiple fading factors are introduced to rescale the observation noise covariance. The update stage of the filter can then be autonomously tuned, and if outliers exist in the observations, the update is weighted less. Under the Gaussian assumption of the KF, the squared Mahalanobis distance of the innovation vector is chi-square distributed. Therefore, a judging index based on the chi-square test is designed to detect noise outliers and determine whether fading tuning is required. The proposed method is applied to the nonlinear alignment of SINS, and a vehicle experiment proves the effectiveness of the proposed method.
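A hedged sketch of the outlier-judging index only (the cubature Kalman filter and the fading-factor update are not reproduced): the squared Mahalanobis distance of the innovation is compared with a chi-square quantile to decide whether the observation should be down-weighted. The threshold level and toy values are assumptions.

```python
# Hedged sketch: chi-square check on the Kalman innovation (illustrative, not the full robust CKF).
import numpy as np
from scipy.stats import chi2

def innovation_is_outlier(innovation, S, alpha=0.01):
    """innovation: z - H @ x_pred; S: innovation covariance. Under Gaussian assumptions,
    the squared Mahalanobis distance is chi-square distributed with dim(z) degrees of freedom."""
    d2 = innovation @ np.linalg.solve(S, innovation)
    return d2 > chi2.ppf(1.0 - alpha, df=innovation.size)

S = np.diag([0.04, 0.04])                                    # toy 2-D innovation covariance
print(innovation_is_outlier(np.array([0.05, -0.03]), S))     # consistent -> False
print(innovation_is_outlier(np.array([1.0, 0.8]), S))        # outlier -> True, trigger the fading tuning
```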
Multiple dominant gear meshing frequencies are present in the vibration signals collected from gearboxes, and the conventional spiky features that represent initial gear fault conditions are usually difficult to detect. In order to solve this problem, we propose a new gearbox deterioration detection technique based on autoregressive modeling and hypothesis testing. A stationary autoregressive model was built using a normal vibration signal from each shaft, and the established autoregressive model was then applied to process fault signals from each shaft of a two-stage gearbox. This paper investigates a combined technique that unites a time-varying autoregressive model and a two-sample Kolmogorov-Smirnov goodness-of-fit test to detect the deterioration of a gearing system with simultaneously variable shaft speed and variable load. The time-varying autoregressive model residuals representing both healthy and faulty gear conditions were compared with the original healthy time-synchronous average signals. Compared with the traditional kurtosis statistic, this technique for gearbox deterioration detection shows significant advantages in highlighting the presence of incipient gear faults in all of the different-speed shafts involved in the meshing motion under variable conditions.
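A hedged sketch of the detection logic only (the gearbox signal processing and time-varying modelling are not reproduced): fit an AR model to a healthy reference record, filter a new record through it, and apply a two-sample Kolmogorov-Smirnov test to compare the residual distributions. The toy signals and model order are assumptions.

```python
# Hedged sketch: AR-model residuals + two-sample KS test for deterioration detection (illustrative only).
import numpy as np
from scipy.stats import ks_2samp

def fit_ar(x, p):
    """Least-squares AR(p) fit: x[t] is regressed on x[t-1], ..., x[t-p]."""
    X = np.column_stack([x[p - 1 - k:len(x) - 1 - k] for k in range(p)])
    return np.linalg.lstsq(X, x[p:], rcond=None)[0]

def ar_residuals(x, a):
    p = len(a)
    X = np.column_stack([x[p - 1 - k:len(x) - 1 - k] for k in range(p)])
    return x[p:] - X @ a

rng = np.random.default_rng(5)
healthy = rng.normal(0.0, 1.0, 4000)       # toy stand-in for a healthy vibration record
faulty = 1.4 * rng.normal(0.0, 1.0, 4000)  # toy "deteriorated" record with larger residual energy

a = fit_ar(healthy, p=8)                   # reference model built from the healthy signal only
stat, p_value = ks_2samp(ar_residuals(healthy, a), ar_residuals(faulty, a))
print(f"KS statistic = {stat:.3f}, p = {p_value:.3g}")   # a small p flags a change in residual behaviour
```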
Ionospheric TEC (total electron content) time series are derived from GPS measurements at 13 stations around the epicenter of the 2008 Wenchuan earthquake. Defining anomaly bounds for a sliding window by the quartiles and 2 standard deviations of the TEC values, this paper analyzes the characteristics of ionospheric changes before and after the destructive event. The Neyman-Pearson signal detection method is employed to compute the probabilities of TEC abnormalities. Results show that one week before the Wenchuan earthquake, the ionospheric TEC over the epicenter and its vicinity displayed obvious abnormal disturbances, most of which were positive anomalies. The largest abnormal TEC changes appeared on May 9, three days prior to the seismic event. Signal detection shows that the largest probability of TEC abnormality on May 9 is 50.74%, indicating that the ionospheric abnormalities three days before the main shock are likely related to the preparation process of the Ms8.0 Wenchuan earthquake.
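A hedged sketch of the anomaly-bound construction described (the window length and the exact form of the bounds are assumptions): within a sliding window, a value is flagged only if it falls outside both the interquartile-range bounds and the mean plus or minus two standard deviations.

```python
# Hedged sketch: sliding-window TEC anomaly bounds from quartiles and 2-sigma (window/bounds assumed).
import numpy as np

def tec_anomalies(tec, window=27):
    """Flag samples outside both the IQR bounds and mean +/- 2*std of the preceding `window` samples."""
    flags = np.zeros(tec.size, dtype=bool)
    for i in range(window, tec.size):
        w = tec[i - window:i]
        q1, q3 = np.percentile(w, [25, 75])
        iqr_lo, iqr_hi = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
        mu, sd = w.mean(), w.std()
        x = tec[i]
        flags[i] = (x < min(iqr_lo, mu - 2 * sd)) or (x > max(iqr_hi, mu + 2 * sd))
    return flags

rng = np.random.default_rng(6)
tec = 20 + 3 * np.sin(2 * np.pi * np.arange(120) / 27) + rng.normal(0, 1, 120)   # toy TEC series
tec[100] += 12                                # inject a positive disturbance
print(np.flatnonzero(tec_anomalies(tec)))     # the injected day should be flagged
```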
A new deghosting method based on generalized triangulation is presented. First, two intersection points corresponding to the emitter position are obtained by utilizing two azimuth angles and two elevation angles from two jammed 3-D radars (or 2-D passive sensors). Then, a hypothesis-testing-based deghosting method for multiple-target scenarios is proposed using the two intersection points. In order to analyze the performance of the proposed method, the correct association probability of the true targets and the incorrect association probability of the ghost targets are defined. Finally, Monte Carlo simulations compare the proposed method with the hinge angle method in the cases of both two and three radars. The simulation results show that the proposed method performs better than the hinge angle method in the three-radar case.
The calorific value of plants is an important parameter for evaluating and indexing material cycles and energy conversion in forest ecosystems. Based on mensuration data of 150 sample sets, we analyzed the calorific value (CV) and ash content (AC) of different parts of Masson pine (Pinus massoniana) trees in southern China using hypothesis testing and regression analysis. CV and AC of the different tree parts were almost all significantly different (P<0.05). In descending order, ash-free calorific value (AFCV) ranked as foliage > branch > stem bark > root > stem wood, and AC ranked as foliage > stem bark > root > branch > stem wood. CV and AC of stem wood from the top, middle and lower sections of trees differed significantly: CV increased from the top to the lower sections of the trunk while AC decreased. The mean gross calorific value (GCV) and AFCV of the aboveground parts were significantly higher than those of the belowground parts (roots). The mean GCV, AFCV and AC of a whole Masson pine tree were 21.54 kJ/g, 21.74 kJ/g and 0.90%, respectively. CV and AC of the different tree parts were, to some extent, correlated with tree diameter, height and origin.