Schizophrenia (SZ) is one of the most common mental diseases. Its main characteristics are abnormal social behavior and an inability to correctly understand real things. In recent years, the magnetic resonance imaging (MRI) technique has been widely used to study SZ. However, it remains a great challenge to reveal the essential information contained in MRI data. In this paper, we propose a biomarker selection approach based on multiple hypothesis testing techniques to explore the differences between SZ patients and healthy controls using both functional and structural MRI data, in which the biomarkers represent both abnormal brain functional connectivity and abnormal brain regions. By applying the biomarker selection approach, six abnormal brain regions and twenty-three abnormal functional connections in the brains of SZ patients are identified. It is found that, compared with healthy controls, the significantly reduced gray matter volumes are mainly distributed in the limbic lobe and the basal ganglia, while the significantly increased gray matter volumes are distributed in the frontal gyrus. Meanwhile, the significantly strengthened connections are those between the middle frontal gyrus and the superior occipital gyrus, between the superior occipital gyrus and the middle occipital gyrus, and between the middle occipital gyrus and the fusiform gyrus; the remaining connections are significantly weakened.
Eliminating false intersections (deghosting) is a difficult problem in a passive cross-location system. Using a decentralized decision fusion topology, a new deghosting algorithm derived from hypothesis testing theory is developed. It exploits the difference between ghosts and true targets in the statistical error between their projection angles on a deghosting sensor and the angles actually measured by that sensor, and constructs a corresponding test statistic. Under the Gaussian assumption, ghosts and true targets are discriminated using the chi-square distribution. Simulation results show the feasibility of the algorithm.
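The chi-square decision rule described in this abstract can be illustrated with a minimal sketch. The function below is a hypothetical reconstruction, not the paper's implementation: it assumes a scalar projection-angle error with a known standard deviation, so that the normalized squared error follows a chi-square distribution with one degree of freedom when the candidate intersection is a true target.

```python
from scipy.stats import chi2

def is_ghost(angle_error_rad, sigma_rad, alpha=0.05):
    """Declare a candidate intersection a ghost when its normalized squared
    projection-angle error on the deghosting sensor exceeds the chi-square
    critical value (one degree of freedom for a scalar angle error)."""
    t_stat = (angle_error_rad / sigma_rad) ** 2   # ~ chi2(1) for a true target
    return t_stat > chi2.ppf(1.0 - alpha, df=1)
```

For example, with a 0.01 rad measurement standard deviation, a 0.05 rad discrepancy is flagged as a ghost while a 0.005 rad discrepancy is not.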
Testing the equality of percentiles (quantiles) between populations is an effective method for robust, nonparametric comparison, especially when the distributions are asymmetric or irregularly shaped. Unlike global nonparametric tests for homogeneity such as the Kolmogorov-Smirnov test, testing the equality of a set of percentiles (i.e., a percentile profile) yields an estimate of the location and extent of the differences between the populations along the entire domain. The Wald test, using bootstrap estimates of the variance of the order statistics, provides a unified method for hypothesis testing of functions of the population percentiles. Simulation studies are conducted to show the performance of the method under various scenarios and to give suggestions on its use. Several examples are given to illustrate some useful applications to real data.
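A bootstrap-based Wald test on a percentile profile, in the spirit of this abstract, can be sketched as follows. This is our own minimal version, not the paper's procedure: it compares three percentiles of two independent samples, estimates the covariance of the percentile differences by resampling, and rejects when the quadratic form exceeds the chi-square critical value.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)

def percentile_profile_wald(x, y, probs=(25, 50, 75), n_boot=500, alpha=0.05):
    """Wald test for equality of a percentile profile between two samples,
    with the covariance of the differences estimated by bootstrap."""
    d = np.percentile(x, probs) - np.percentile(y, probs)
    boot = np.array([
        np.percentile(rng.choice(x, x.size), probs)
        - np.percentile(rng.choice(y, y.size), probs)
        for _ in range(n_boot)
    ])
    v = np.cov(boot.T)                 # bootstrap covariance of the differences
    w = d @ np.linalg.solve(v, d)      # ~ chi2(len(probs)) under H0
    return w, w > chi2.ppf(1.0 - alpha, df=len(probs))

x = rng.normal(0.0, 1.0, 400)
y = rng.normal(0.0, 1.0, 400) + 3.0    # shifted population: profile differs
w, reject = percentile_profile_wald(x, y)
```

With a shift this large, the whole profile differs and the test rejects decisively.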
The use of the statistical hypothesis testing procedure to determine type I and type II errors was linked to the measurement of sensitivity and specificity in clinical trial tests and experimental pathogen detection techniques. A theoretical analysis of establishing these types of errors was made and compared to the determination of false positives, false negatives, true positives and true negatives. Experimental laboratory methods used to detect Cryptosporidium spp. were used to highlight the relationship between hypothesis testing, sensitivity, specificity and predictive values. The study finds that sensitivity and specificity for the two laboratory methods used for Cryptosporidium detection were low, lowering the probability of detecting a "false null hypothesis" for the presence of Cryptosporidium in the water samples using either the microscopic or the PCR method. Nevertheless, both procedures had higher "true negative" counts, increasing the probability of failing to reject a "true null hypothesis", with a specificity of 1.00 for both the microscopic and PCR laboratory detection methods.
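The correspondence between the diagnostic quantities and the two error types can be made explicit with a small helper. The counts below are hypothetical, chosen only to illustrate the definitions, not taken from the study.

```python
def diagnostic_rates(tp, fp, fn, tn):
    """Confusion-matrix view of the two error types: a false positive is a
    type I error (rejecting a true null of 'no pathogen'), a false negative
    is a type II error (failing to reject a false null)."""
    sensitivity = tp / (tp + fn)  # power: 1 - P(type II error)
    specificity = tn / (tn + fp)  # 1 - P(type I error)
    ppv = tp / (tp + fp)          # predictive value of a positive result
    npv = tn / (tn + fn)          # predictive value of a negative result
    return sensitivity, specificity, ppv, npv

se, sp, ppv, npv = diagnostic_rates(tp=40, fp=10, fn=20, tn=30)
```

With these counts, sensitivity is 40/60 and specificity 30/40, showing how a high true-negative count drives specificity even when sensitivity is modest.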
Soil liquefaction is one of the complex research topics in geotechnical engineering and engineering geology. Especially after the 1964 Niigata earthquake (Japan) induced many soil liquefaction incidents, a variety of soil liquefaction studies were conducted and reported, including liquefaction potential assessment methods utilizing shear wave velocity (V<sub>s</sub>) or SPT-N profiles (SPT: standard penetration test). This study used the V<sub>s</sub> and SPT methods recommended by the National Center for Earthquake Engineering Research (NCEER) to examine which is more conservative, based on assessment results for 41 liquefiable soil layers at sites in two major cities in Taiwan. Statistical hypothesis testing was used to make the analysis more quantitative and objective. Based on three sets of hypothesis tests, the hypothesis that the SPT method is more conservative than the V<sub>s</sub> method was not rejected at the 5% level of significance.
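One plausible form for this kind of conservatism comparison is a one-sided paired t-test on the factors of safety the two methods assign to the same layers. The framing and the numbers below are entirely our assumptions (the paper's exact test setup and data are not reproduced here): rejecting this H0 would support "SPT is more conservative".

```python
import numpy as np
from scipy.stats import t

def spt_more_conservative(fs_spt, fs_vs, alpha=0.05):
    """One-sided paired t-test on factors of safety computed for the same
    soil layers. H0: the SPT factors are not lower than the Vs factors;
    rejecting H0 supports 'SPT is more conservative'."""
    d = np.asarray(fs_spt) - np.asarray(fs_vs)
    stat = d.mean() / (d.std(ddof=1) / np.sqrt(d.size))
    return stat, stat < t.ppf(alpha, df=d.size - 1)

# hypothetical factors of safety for five layers (not the paper's data)
stat, supports_spt_conservative = spt_more_conservative(
    [0.80, 0.90, 0.70, 0.85, 0.95],
    [1.00, 1.10, 0.90, 1.05, 1.00],
)
```

Here every SPT factor of safety is lower, so the paired statistic is strongly negative and H0 is rejected.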
Coutsourides derived an ad hoc nuisance parameter removal test for testing the equality of two multiple correlation matrices of two independent p-variate normal populations, under the assumption that a sample of size n is available from each population. This paper presents a likelihood ratio test criterion for testing the equality of K multiple correlation matrices and extends the results to testing the equality of K partial correlation matrices.
With the continuous development of science, technology and other fields in today's world, statistical analysis has become an indispensable research method in daily life. Among its tools, hypothesis testing plays an important role in fields such as biology, medicine, and economics, and is effective in most scenarios. However, the suitability and effectiveness of different hypothesis testing methods vary with context, often leading to different outcomes and levels of accuracy. This paper discusses various hypothesis testing methods in statistics, such as the t-test, chi-square test, z-test, and F-test. In addition, this study analyzes various real-life cases and optimizes and improves some hypothesis testing methods. Based on a review of existing literature, the study explores how traditional hypothesis testing determines statistical significance, where the null hypothesis is rejected if the test statistic falls within the rejection region. To improve the determination of the significance level, handle uncertainty, and accommodate larger sample sizes, this paper proposes alternative methods such as the NHST test, Bayesian test, big-data sequential test, and failure rate hypothesis test from the perspectives of medicine and kinesiology.
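The rejection-region logic mentioned here can be shown concretely with a two-sample t-test. The samples are small made-up numbers for illustration; the point is that rejecting when the p-value is below alpha is equivalent to the statistic falling beyond the critical value of the t distribution.

```python
from scipy.stats import t, ttest_ind

# two small hypothetical samples
a = [5.1, 4.9, 5.0, 5.2, 4.8, 5.0, 5.1, 4.9]
b = [5.9, 6.1, 6.0, 5.8, 6.2, 6.0, 5.9, 6.1]

stat, p = ttest_ind(a, b)   # two-sample t-test assuming equal variances

# Equivalent rejection-region view: reject H0 at alpha = 0.05 when the
# statistic falls beyond the two-sided critical value of the t distribution.
crit = t.ppf(0.975, df=len(a) + len(b) - 2)
reject = abs(stat) > crit
```

With group means of 5.0 and 6.0 against small within-group spread, both views agree: H0 is rejected.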
A statistical damage detection and condition assessment scheme for existing structures is developed. First, a virtual work error estimator is defined to express the discrepancy between a real structure and its analytical model, from which a system identification algorithm is derived using the improved Newton method. To investigate its properties in the face of measurement errors, the Monte Carlo method is introduced to simulate the measured data. Based on the identified results, their statistical distributions can be assumed, and the status of an existing structure can then be statistically evaluated by hypothesis tests. A 5-story, two-bay steel frame is used to carry out detailed numerical simulation studies, and the proposed scheme proves to be effective.
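The final step, statistically evaluating a structure from the distribution of Monte Carlo identification results, can be sketched with a one-sample t-test. Everything below is a hypothetical illustration (names, values and the 10% stiffness loss are assumptions, not the paper's data or exact test).

```python
import numpy as np
from scipy.stats import ttest_1samp

rng = np.random.default_rng(3)

# Hypothetical Monte Carlo output: 50 identified stiffness values for one
# story (arbitrary units); the nominal undamaged value is 100.0.
nominal_k = 100.0
identified = rng.normal(90.0, 3.0, 50)

# One-sample t-test on the identified distribution; a significant downward
# shift from the nominal value is read as statistical evidence of damage.
stat, p = ttest_1samp(identified, nominal_k)
damaged = (p < 0.05) and (identified.mean() < nominal_k)
```

A simulated 10% stiffness loss is easily resolved against the 3% identification scatter.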
The relationship between wheat varieties and the occurrence of head blight was discussed, and an independence test was carried out using a contingency table.
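A contingency-table independence test of this kind is a standard chi-square test. The counts below are hypothetical, used only to show the mechanics.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts (not the paper's data): rows are two wheat varieties,
# columns are fields with light / moderate / severe head blight.
table = np.array([[30, 15, 5],
                  [10, 20, 20]])
stat, p, dof, expected = chi2_contingency(table)
independent = p >= 0.05   # fail to reject independence of variety and blight
```

For this 2x3 table (2 degrees of freedom) the small p-value leads to rejecting independence, i.e., variety appears associated with head-blight severity.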
The great mantle plume debate (GPD) has been going on for ~15 years (Foulger and Natland, 2003; Anderson, 2004; Niu, 2005; Davies, 2005; Foulger, 2005; Campbell, 2005; Campbell and Davies, 2006), centered on whether mantle plumes exist as a result of Earth's cooling or whether their existence is purely required for convenience in explaining certain Earth phenomena (Niu, 2005). Despite the mounting evidence that many of the so-called plumes may be localized melting anomalies, the debate is likely to continue. We recognize that the slow progress of the debate results from communication difficulties. Many debaters may not truly appreciate (1) what the mantle plume hypothesis actually is, and (2) that none of the petrological, geochemical and geophysical methods widely used can actually provide smoking-gun evidence for or against the mantle plume hypothesis. In this short paper, we clarify these issues and elaborate a geologically effective approach to test the hypothesis. According to the mantle plume hypothesis, a thermal mantle plume must originate from the thermal boundary layer at the core-mantle boundary (CMB), and a large mantle plume head is required to carry the material from the deep mantle to the surface. The plume head product in ocean basins is the oceanic plateau, which is a lithospheric terrane that is large (1000s of km across), thick (>200 km), shallow (2–4 km above the surrounding seafloors), buoyant (~1% less dense than the surrounding lithosphere), and thus must be preserved in the surface geology (Niu et al., 2003). Hawaiian volcanism has been considered the surface expression of a type mantle plume, but it does not seem to have a (known) plume head product. If this is true, the Hawaiian mantle plume in particular and the mantle plume hypothesis in general must be questioned. Therefore, whether there is an oceanic plateau-like product for the Hawaiian volcanism is key to testing the mantle plume hypothesis, and the Kamchatka-Okhotsk Sea basement is the best candidate for finding out whether it is indeed the Hawaiian mantle plume head product (Niu et al., 2003; Niu, 2004).
Nonlinear initial alignment is a significant research topic for the strapdown inertial navigation system (SINS). The cubature Kalman filter (CKF) is a popular tool for nonlinear initial alignment. The standard CKF assumes that the statistics of the observation noise are given before the filtering process; therefore, any unpredicted outliers in the observation noise will decrease the stability of the filter. In view of this problem, an improved CKF method with robustness is proposed. Multiple fading factors are introduced to rescale the observation noise covariance, so the update stage of the filter can be autonomously tuned: if outliers exist in the observations, the update is weighted less. Under the Gaussian assumption of the KF, the Mahalanobis distance of the innovation vector is Chi-square distributed; therefore, a judging index based on the Chi-square test is designed to detect noise outliers and determine whether the fading tuning is required. The proposed method is applied to the nonlinear alignment of SINS, and a vehicle experiment proves its effectiveness.
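The chi-square judging index on the innovation can be sketched as below. This is a generic gate of the kind the abstract describes, not the paper's code: the function names and the 1% significance level are our choices.

```python
import numpy as np
from scipy.stats import chi2

def innovation_is_outlier(innovation, s_cov, alpha=0.01):
    """Chi-square gate on the filter innovation: under the Gaussian
    assumption, d = v^T S^{-1} v follows a chi-square law with m = dim(v)
    degrees of freedom, so exceeding the critical value flags an observation
    outlier (the trigger for the fading-factor rescaling)."""
    d = innovation @ np.linalg.solve(s_cov, innovation)
    return d > chi2.ppf(1.0 - alpha, df=innovation.size)
```

For a 2-D innovation with unit covariance, d = 0.02 passes the gate while d = 50 is flagged, so the fading factors would inflate the observation-noise covariance only in the second case.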
Multiple dominant gear meshing frequencies are present in the vibration signals collected from gearboxes, and the conventional spiky features that represent initial gear fault conditions are usually difficult to detect. To solve this problem, we propose a new gearbox deterioration detection technique based on autoregressive modeling and hypothesis testing. A stationary autoregressive model was built using a normal vibration signal from each shaft. The established autoregressive model was then applied to process fault signals from each shaft of a two-stage gearbox. This paper investigates a combined technique, uniting a time-varying autoregressive model and a two-sample Kolmogorov-Smirnov goodness-of-fit test, to detect the deterioration of a gearing system with simultaneously variable shaft speed and variable load. The time-varying autoregressive model residuals representing both healthy and faulty gear conditions were compared with the original healthy time-synchronous average signals. Compared with the traditional kurtosis statistic, this technique has shown significant advantages in highlighting the presence of incipient gear faults in all the shafts, at their different speeds, involved in the meshing motion under variable conditions.
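The core idea, comparing autoregressive residual distributions from healthy and suspect signals with a two-sample Kolmogorov-Smirnov test, can be sketched on synthetic data. This toy version assumes a simple AR(1) model with a known coefficient and simulates a fault as inflated residual spread; the real method fits time-varying AR models to time-synchronous averages.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)

def ar1_signal(n, a=0.6, noise_scale=1.0):
    """Simulate an AR(1) signal x_t = a*x_{t-1} + e_t (a stand-in for a
    vibration signal from one shaft; values are illustrative)."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = a * x[t - 1] + rng.normal(scale=noise_scale)
    return x

def ar1_residuals(x, a=0.6):
    """Residuals after removing the (here: assumed known) AR(1) dynamics."""
    return x[1:] - a * x[:-1]

healthy = ar1_signal(2000)                  # baseline condition
faulty = ar1_signal(2000, noise_scale=3.0)  # degraded gear: wider residuals
stat, p = ks_2samp(ar1_residuals(healthy), ar1_residuals(faulty))
```

The KS test rejects decisively because the residual distributions differ in spread, which is exactly the change the kurtosis statistic can miss when meshing frequencies dominate.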
Ionospheric TEC (total electron content) time series are derived from GPS measurements at 13 stations around the epicenter of the 2008 Wenchuan earthquake. Defining anomaly bounds for a sliding window by the quartiles and 2 standard deviations of the TEC values, this paper analyzes the characteristics of ionospheric changes before and after the destructive event. The Neyman-Pearson signal detection method is employed to compute the probabilities of TEC abnormalities. Results show that one week before the Wenchuan earthquake, ionospheric TEC over the epicenter and its vicinity displayed obvious abnormal disturbances, most of which were positive anomalies. The largest abnormal TEC changes appeared on May 9, three days prior to the seismic event. Signal detection shows that the largest probability of a TEC anomaly on May 9 is 50.74%, indicating that the ionospheric anomalies three days before the main shock are likely related to the preparation process of the Ms8.0 Wenchuan earthquake.
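Sliding-window anomaly bounds of the kind described here can be sketched as follows. How the quartile band and the 2-standard-deviation band are combined below (taking the wider of the two on each side) is our assumption, not necessarily the paper's exact rule.

```python
import numpy as np

def anomaly_bounds(window):
    """Anomaly bounds for one sliding window of TEC values, combining a
    Tukey-style quartile fence with a 2-standard-deviation band around
    the median and keeping the wider bound on each side."""
    q1, med, q3 = np.percentile(window, [25, 50, 75])
    iqr = q3 - q1
    s = window.std(ddof=1)
    lower = min(q1 - 1.5 * iqr, med - 2.0 * s)
    upper = max(q3 + 1.5 * iqr, med + 2.0 * s)
    return lower, upper

window = np.arange(1.0, 11.0)   # toy window of ten TEC samples
lower, upper = anomaly_bounds(window)
```

TEC values falling outside (lower, upper) in a window would be counted as abnormal disturbances, positive anomalies being excursions above the upper bound.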
A new deghosting method based on generalized triangulation is presented. First, two intersection points corresponding to the emitter position are obtained by utilizing two azimuth angles and two elevation angles from two jammed 3-D radars (or 2-D passive sensors). Then, a hypothesis-testing-based deghosting method for multiple-target scenarios is proposed using the two intersection points. To analyze the performance of the proposed method, the correct association probability of the true targets and the incorrect association probability of the ghost targets are defined. Finally, Monte Carlo simulations are given for the proposed method compared with the hinge angle method in the cases of both two and three radars. The simulation results show that the proposed method outperforms the hinge angle method in the three-radar case.
The calorific value of plants is an important parameter for evaluating and indexing material cycles and energy conversion in forest ecosystems. Based on mensuration data of 150 sample sets, we analyzed the calorific value (CV) and ash content (AC) of different parts of Masson pine (Pinus massoniana) trees in southern China using hypothesis testing and regression analysis. CV and AC of different tree parts were almost all significantly different (P<0.05). In descending order, ash-free calorific value (AFCV) ranked as foliage > branch > stem bark > root > stem wood, and AC ranked as foliage > stem bark > root > branch > stem wood. CV and AC of stem wood from the top, middle and lower sections of trees differed significantly: CV increased from the top to the lower sections of the trunk while AC decreased. Mean gross calorific value (GCV) and AFCV of aboveground parts were significantly higher than those of belowground parts (roots). The mean GCV, AFCV and AC of a whole Masson pine tree were 21.54 kJ/g, 21.74 kJ/g and 0.90%, respectively. CV and AC of different tree parts were, to some extent, correlated with tree diameter, height and origin.
We use moderate deviations to study the signal detection problem for a diffusion model. We establish a moderate deviation principle for the log-likelihood function of the diffusion model. Then, applying the moderate deviation estimates to hypothesis testing for the signal detection problem, we give a decision region such that its error probability of the second kind tends to zero faster than the error probability of the first kind, when the error probability of the first kind is approximated by e^(-ατ(T)), where α > 0, τ(T) = o(T) and τ(T) → ∞ as the observation time T goes to infinity.
To solve the model update problem in a mean-shift based tracker, a novel mechanism is proposed. A Kalman filter is employed to update the object model by filtering the object kernel histogram using the previous model and the current candidate. A self-tuning method is used to adaptively adjust all the parameters of the filters based on an analysis of the filtering residuals. In addition, hypothesis testing serves as the criterion for deciding whether to accept the filtering result. Therefore, the tracker can handle occlusion and avoid over-updating. The experimental results show that our method can not only keep up with object appearance and scale changes but is also robust to occlusion.
We consider testing hypotheses concerning the comparison of dispersions between two parameter vectors of multinomial distributions in both one-sample and two-sample cases. The comparison criterion is the concept of Schur majorization. A new dispersion index is proposed for testing the hypotheses. The corresponding test for the one-sample problem is an exact test. For the two-sample problem, the bootstrap is used to approximate the null distribution of the test statistic and the p-value. We prove that the bootstrap test is asymptotically correct and consistent. Simulation studies for the bootstrap test are reported and a real-life example is presented.
With the help of relative entropy theory, norm theory, and bootstrap methodology, a new hypothesis testing method is proposed to verify reliability with a three-parameter Weibull distribution. Based on the relative difference information of the experimental value vector with respect to the theoretical value vector of reliability, six criteria based on the minimum weighted relative entropy norm are established to extract the optimal information vector of the Weibull parameters in the reliability experiment of product lifetime. The rejection region used in the hypothesis testing is deduced via the area of the intersection set of the estimated truth-value function and its confidence interval function for the three-parameter Weibull distribution. Case studies of simulated lifetimes, helicopter component failure, and ceramic material failure indicate that the proposed method reflects the practical situation of the reliability experiment.
Funding (schizophrenia biomarker study): supported by NSFC (Nos. 11471006 and 81601456), the Science and Technology Innovation Plan of Xi'an (No. 2019421315KYPT004JC006), and the HPC Platform of Xi'an Jiaotong University.
Funding (structural damage detection study): supported by the National Natural Science Foundation of China (No. 50538020).
Funding (mantle plume study): supported by the National Natural Science Foundation of China (41130314, 41630968), Chinese Academy of Sciences Innovation (Y42217101L), grants from the Qingdao National Laboratory for Marine Science and Technology (2015ASKJ03), and the NSFC-Shandong Joint Fund for Marine Science Research Centers (U1606401).
Funding: This work is supported by the National Natural Science Foundation of China under Grant No. 41574069 and the Major National Projects of China under Grant No. GFZX0301040303.
Abstract: Nonlinear initial alignment is a significant research topic for strapdown inertial navigation systems (SINS), and the cubature Kalman filter (CKF) is a popular tool for it. The standard CKF assumes that the statistics of the observation noise are given before the filtering process; any unpredicted outliers in the observation noise therefore decrease the stability of the filter. To address this problem, an improved CKF with robustness is proposed. Multiple fading factors are introduced to rescale the observation noise covariance, so that the update stage of the filter can be tuned autonomously: if outliers exist in the observations, the update is given less weight. Under the Gaussian assumption of the Kalman filter, the Mahalanobis distance of the innovation vector follows a Chi-square distribution, so a judging index based on the Chi-square test is designed to detect noise outliers and determine whether fading tuning is required. The proposed method is applied to the nonlinear alignment of SINS, and a vehicle experiment proves its effectiveness.
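The Chi-square outlier check described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the significance level, and the particular fading rule (inflating by the ratio of the test statistic to the threshold) are assumptions.

```python
import numpy as np
from scipy.stats import chi2

def fading_factor(innovation, S, alpha=0.05):
    """Chi-square gating of the Kalman innovation.

    innovation : (m,) innovation vector, e.g. z - H @ x_pred
    S          : (m, m) innovation covariance, e.g. H @ P @ H.T + R
    Returns a factor >= 1 used to inflate the observation noise
    covariance; 1.0 means no outlier was detected.
    """
    m = innovation.size
    d2 = float(innovation @ np.linalg.solve(S, innovation))  # squared Mahalanobis distance
    threshold = chi2.ppf(1 - alpha, df=m)                    # Chi-square acceptance bound
    return 1.0 if d2 <= threshold else d2 / threshold        # down-weight detected outliers
```

A nominal innovation leaves the update untouched (factor 1.0), while a large innovation yields a factor greater than 1 that down-weights the observation.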
Funding: Supported by the National Natural Science Foundation of China (Grant No. 50675232), a Key Project of the Ministry of Education of China, and the Chongqing Municipal Natural Science Key Foundation of China (Grant No. 2007BA6021).
Abstract: Multiple dominant gear meshing frequencies are present in the vibration signals collected from gearboxes, and the conventional spiky features that represent initial gear fault conditions are usually difficult to detect. To solve this problem, we propose a new gearbox deterioration detection technique based on autoregressive modeling and hypothesis testing. A stationary autoregressive model was built using a normal vibration signal from each shaft, and the established model was then applied to process fault signals from each shaft of a two-stage gearbox. This paper investigates a combined technique that unites a time-varying autoregressive model with a two-sample Kolmogorov-Smirnov goodness-of-fit test to detect deterioration of the gearing system under simultaneously variable shaft speed and variable load. The time-varying autoregressive model residuals representing both healthy and faulty gear conditions were compared with the original healthy time-synchronous average signals. Compared with the traditional kurtosis statistic, this technique has shown significant advantages in highlighting incipient gear faults in all the shafts involved in the meshing motion under variable conditions.
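A minimal sketch of the residual-comparison idea, under simplifying assumptions: a stationary AR model fitted by least squares (the paper uses a time-varying AR model on gearbox signals) and simulated signals in place of real vibration data. All names and the injected "fault" tone are illustrative.

```python
import numpy as np
from scipy.stats import ks_2samp

def fit_ar(x, p):
    """Least-squares fit of a stationary AR(p) model x[t] = sum_k a[k] * x[t-1-k]."""
    n = len(x)
    X = np.column_stack([x[p - 1 - k : n - 1 - k] for k in range(p)])
    a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return a

def residuals(x, a):
    """One-step-ahead prediction residuals of a fitted AR model."""
    p, n = len(a), len(x)
    X = np.column_stack([x[p - 1 - k : n - 1 - k] for k in range(p)])
    return x[p:] - X @ a

# Simulate a healthy AR(2) signal and a "faulty" one with an added tone.
rng = np.random.default_rng(0)
healthy = rng.standard_normal(2000)
for t in range(2, len(healthy)):
    healthy[t] += 0.6 * healthy[t - 1] - 0.3 * healthy[t - 2]
faulty = healthy + 2.0 * np.sin(2 * np.pi * 0.15 * np.arange(len(healthy)))

a = fit_ar(healthy, 2)
stat, pval = ks_2samp(residuals(healthy, a), residuals(faulty, a))
# A small p-value flags a change in the residual distribution (deterioration).
```

The KS test compares the whole residual distributions rather than a single moment, which is what gives this approach an edge over a kurtosis statistic under varying operating conditions.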
Funding: Supported by the Key Technology Research and Development Program of China (2008BAC35B02).
Abstract: Ionospheric TEC (total electron content) time series are derived from GPS measurements at 13 stations around the epicenter of the 2008 Wenchuan earthquake. Defining anomaly bounds for a sliding window by the quartiles and twice the standard deviation of TEC values, this paper analyzes the characteristics of ionospheric changes before and after the destructive event. The Neyman-Pearson signal detection method is employed to compute the probabilities of TEC anomalies. Results show that one week before the Wenchuan earthquake, ionospheric TEC over the epicenter and its vicinity displayed obvious abnormal disturbances, most of which were positive anomalies. The largest abnormal TEC changes appeared on May 9, three days prior to the seismic event. Signal detection shows that the largest probability of a TEC anomaly, 50.74%, occurred on May 9, indicating that the ionospheric anomalies three days before the main shock are likely related to the preparation process of the Ms8.0 Wenchuan earthquake.
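The sliding-window bound construction can be illustrated as below. The window length, the way the quartile and 2-standard-deviation criteria are combined (median ± the larger of 1.5·IQR and 2·std), and all names are illustrative assumptions, not the paper's exact definition.

```python
import numpy as np

def anomaly_bounds(tec, window=27):
    """Sliding-window bounds on a TEC series.

    For each epoch t, the bounds are built from the preceding `window`
    values as median +/- max(1.5*IQR, 2*std). The first `window` entries
    have no bounds (NaN).
    """
    n = len(tec)
    lower = np.full(n, np.nan)
    upper = np.full(n, np.nan)
    for t in range(window, n):
        w = tec[t - window:t]
        q1, med, q3 = np.percentile(w, [25, 50, 75])
        half = max(1.5 * (q3 - q1), 2.0 * np.std(w))
        lower[t], upper[t] = med - half, med + half
    return lower, upper

# demo: a synthetic TEC series with one injected positive anomaly
rng = np.random.default_rng(1)
tec = rng.normal(10.0, 1.0, 100)
tec[80] += 8.0
lower, upper = anomaly_bounds(tec)
positive_days = np.flatnonzero(tec > upper)  # epochs flagged as positive anomalies
```

Values above the upper bound are the positive anomalies the abstract refers to; values below the lower bound would be negative anomalies.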
Funding: Supported in part by the Foundation for the Author of National Excellent Doctoral Dissertation of China (200443), the National Natural Science Foundation of China (60541001), the Program for New Century Excellent Talents in University (05-0912), and the Foundation of Taishan Scholars.
Abstract: A new deghosting method based on generalized triangulation is presented. First, two intersection points corresponding to the emitter position are obtained by utilizing two azimuth angles and two elevation angles from two jammed 3-D radars (or 2-D passive sensors). Then, a hypothesis-testing-based deghosting method for multiple-target scenarios is proposed using the two intersection points. To analyze the performance of the proposed method, the correct association probability of the true targets and the incorrect association probability of the ghost targets are defined. Finally, Monte Carlo simulations compare the proposed method with the hinge-angle method for both the two-radar and three-radar cases. The simulation results show that the proposed method outperforms the hinge-angle method in the three-radar case.
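A toy 2-D version of the hypothesis-testing step may help fix ideas (the paper works with 3-D azimuth/elevation geometry; the planar setup, names, and threshold here are simplifying assumptions): project a candidate intersection point onto a deghosting sensor and test the bearing error against a Chi-square bound.

```python
import numpy as np
from scipy.stats import chi2

def is_true_target(candidate, sensor, measured_bearing, sigma, alpha=0.05):
    """Accept the candidate if its predicted bearing at the deghosting sensor
    matches the measured bearing within Gaussian noise (Chi-square test, 1 dof)."""
    dx, dy = candidate[0] - sensor[0], candidate[1] - sensor[1]
    predicted = np.arctan2(dy, dx)
    err = np.angle(np.exp(1j * (measured_bearing - predicted)))  # wrap to (-pi, pi]
    return (err / sigma) ** 2 <= chi2.ppf(1 - alpha, df=1)       # ~ chi2(1) for a true target

# a true target on the 45-degree bearing is kept; a ghost on another bearing is rejected
keep = is_true_target((10.0, 10.0), (0.0, 0.0), np.pi / 4, sigma=0.01)
drop = is_true_target((10.0, -10.0), (0.0, 0.0), np.pi / 4, sigma=0.01)
```

For a true target the normalized bearing error is Chi-square distributed, so the threshold controls the false-rejection rate; ghosts produce large systematic errors and fail the test.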
Funding: Initiated as part of the National Biomass Modeling Program in Continuous Forest Inventory (NBMP-CFI), funded by the State Forestry Administration of China.
Abstract: The calorific value of plants is an important parameter for evaluating and indexing material cycles and energy conversion in forest ecosystems. Based on mensuration data of 150 sample sets, we analyzed the calorific value (CV) and ash content (AC) of different parts of Masson pine (Pinus massoniana) trees in southern China using hypothesis testing and regression analysis. CV and AC differed significantly among almost all tree parts (P < 0.05). In descending order, ash-free calorific value (AFCV) ranked foliage > branch > stem bark > root > stem wood, and AC ranked foliage > stem bark > root > branch > stem wood. CV and AC of stem wood from the top, middle, and lower sections of trees differed significantly: CV increased from the top to the lower sections of the trunk while AC decreased. Mean gross calorific value (GCV) and AFCV of aboveground parts were significantly higher than those of the belowground parts (roots). The mean GCV, AFCV, and AC of a whole Masson pine tree were 21.54 kJ/g, 21.74 kJ/g, and 0.90%, respectively. CV and AC of different tree parts were, to some extent, correlated with tree diameter, height, and origin.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 10871153 and 11171262) and the Specialized Research Fund for the Doctoral Program of Higher Education of China (Grant No. 200804860048).
Abstract: We use moderate deviations to study the signal detection problem for a diffusion model. We establish a moderate deviation principle for the log-likelihood function of the diffusion model. Applying the moderate deviation estimates to hypothesis testing for the signal detection problem, we give a decision region such that its error probability of the second kind tends to zero faster than the error probability of the first kind, when the error probability of the first kind is approximated by e^{-ατ(T)}, where α > 0, τ(T) = o(T), and τ(T) → ∞ as the observation time T goes to infinity.
Abstract: To solve the model update problem in mean-shift based trackers, a novel mechanism is proposed. A Kalman filter is employed to update the object model by filtering the object kernel histogram using the previous model and the current candidate. A self-tuning method adaptively adjusts all parameters of the filters based on analysis of the filtering residuals. In addition, hypothesis testing serves as the criterion for deciding whether to accept the filtering result. The tracker can therefore handle occlusion and avoid over-updating. Experimental results show that our method not only keeps up with object appearance and scale changes but is also robust to occlusion.
Funding: Sponsored by the National Natural Science Foundation of China (10771126, 10801130).
Abstract: We consider hypothesis tests comparing the dispersions of two parameter vectors of multinomial distributions, in both one-sample and two-sample cases. The comparison criterion is the concept of Schur majorization. A new dispersion index is proposed for testing the hypotheses. The corresponding test for the one-sample problem is an exact test. For the two-sample problem, the bootstrap is used to approximate the null distribution of the test statistic and the p-value. We prove that the bootstrap test is asymptotically correct and consistent. Simulation studies for the bootstrap test are reported and a real-life example is presented.
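The two-sample bootstrap procedure might look like the following sketch. The paper's Schur-majorization-based dispersion index is not reproduced here; a simple Schur-convex stand-in (the sum of squared cell proportions) is used instead, and all function names are illustrative.

```python
import numpy as np

def dispersion(counts):
    """Schur-convex stand-in index: sum of squared cell proportions.
    (The paper defines its own dispersion index; this is only for illustration.)"""
    p = counts / counts.sum()
    return np.sum(p ** 2)

def bootstrap_pvalue(counts1, counts2, B=1000, seed=0):
    """Bootstrap p-value for H0: the two multinomials are equally dispersed."""
    rng = np.random.default_rng(seed)
    observed = abs(dispersion(counts1) - dispersion(counts2))
    pooled = counts1 + counts2                 # pooled cell counts under H0
    p0 = pooled / pooled.sum()
    n1, n2 = counts1.sum(), counts2.sum()
    stats = np.empty(B)
    for b in range(B):
        stats[b] = abs(dispersion(rng.multinomial(n1, p0))
                       - dispersion(rng.multinomial(n2, p0)))
    return float(np.mean(stats >= observed))

uniform = np.array([50, 50, 50, 50])
skewed = np.array([170, 10, 10, 10])
p_equal = bootstrap_pvalue(uniform, uniform)  # identical dispersion
p_diff = bootstrap_pvalue(uniform, skewed)    # clearly different dispersion
```

Resampling both samples from the pooled estimate enforces the null hypothesis, which is what makes the bootstrap approximation of the null distribution (and hence the p-value) valid.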
Funding: Supported by the National Natural Science Foundation of China (Project Nos. 51075123 and 50675011).
Abstract: With the help of relative entropy theory, norm theory, and bootstrap methodology, a new hypothesis testing method is proposed to verify reliability under a three-parameter Weibull distribution. Based on the relative difference information between the experimental value vector and the theoretical value vector of reliability, six criteria of the minimum weighted relative entropy norm are established to extract the optimal information vector of the Weibull parameters in product-lifetime reliability experiments. The rejection region used in the hypothesis testing is deduced via the area of the intersection set of the estimated truth-value function and its confidence interval function for the three-parameter Weibull distribution. Case studies of simulated lifetimes, helicopter component failure, and ceramic material failure indicate that the proposed method reflects the practical situation of the reliability experiment.
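For orientation, the three-parameter Weibull reliability function and a plain percentile-bootstrap confidence interval are sketched below. The paper's minimum-weighted-relative-entropy-norm criteria and its intersection-set rejection region are not reproduced; every name, the bootstrap size, and the confidence level are illustrative assumptions.

```python
import numpy as np
from scipy.stats import weibull_min

def reliability(t, shape, loc, scale):
    """Three-parameter Weibull reliability R(t) = exp(-((t - loc)/scale)**shape)."""
    return np.exp(-(np.maximum(t - loc, 0.0) / scale) ** shape)

def bootstrap_reliability_ci(lifetimes, t, B=200, alpha=0.1, seed=0):
    """Percentile-bootstrap interval for R(t) from refitted Weibull parameters."""
    rng = np.random.default_rng(seed)
    est = np.empty(B)
    for b in range(B):
        sample = rng.choice(lifetimes, size=len(lifetimes), replace=True)
        c, loc, scale = weibull_min.fit(sample)   # MLE of (shape, loc, scale)
        est[b] = reliability(t, c, loc, scale)
    return np.percentile(est, [100 * alpha / 2, 100 * (1 - alpha / 2)])

# demo on simulated lifetimes (shape 2, location 10, scale 5)
lifetimes = weibull_min.rvs(2.0, loc=10.0, scale=5.0, size=200, random_state=0)
lo, hi = bootstrap_reliability_ci(lifetimes, t=12.0)
```

The location parameter `loc` plays the role of the minimum (failure-free) lifetime, so R(t) = 1 for all t below it; this is what distinguishes the three-parameter family from the usual two-parameter Weibull.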