A clock bias data processing method based on interval correlation coefficient wavelet threshold denoising is proposed to handle small errors in clock bias data and improve the effectiveness of satellite clock bias prediction. Wavelet analysis is first used to decompose the satellite clock frequency data into several levels, producing high- and low-frequency coefficients for each layer. These coefficients are split into three sub-intervals, and the correlation coefficient between the high- and low-frequency coefficients is computed for each sub-interval. The sub-interval with the lowest correlation coefficient is taken as the main noise region and selected for noise threshold computation and thresholding. The clock frequency data are then restored by wavelet reconstruction and converted back to clock bias data. Finally, three kinds of satellite clock data (RTS, whu-o, and IGS-F) are used to validate the processed data. The results show that the proposed method improves the stability of the Quadratic Polynomial (QP) model's predictions for the C16 satellite by about 40%. The accuracy and stability of the Auto Regressive Integrated Moving Average (ARIMA) model improve by up to 41.8% and 14.2%, respectively, while the Wavelet Neural Network (WNN) model improves by roughly 27.8% and 63.6%, respectively. Although the method has little effect on forecasts for IGS-F series satellites, the experimental results show that it improves the accuracy and stability of QP, ARIMA, and WNN forecasts for RTS and whu-o satellite clock bias.
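The core idea of the interval-correlation step can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses a single-level Haar transform instead of the paper's multi-level decomposition, splits the detail coefficients into three sub-intervals, and soft-thresholds only the sub-interval least correlated with the approximation coefficients. The function names and the universal-threshold rule are assumptions for illustration.

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar DWT: returns (approximation, detail) coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # low-frequency (approximation)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # high-frequency (detail)
    return a, d

def haar_idwt(a, d):
    """Inverse one-level Haar DWT (exact reconstruction)."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def denoise_interval_corr(x, n_sub=3):
    """Soft-threshold only the detail sub-interval least correlated with the
    approximation coefficients (taken as the noise-dominated region)."""
    a, d = haar_dwt(x)
    bounds = np.linspace(0, len(d), n_sub + 1, dtype=int)
    corrs = []
    for i in range(n_sub):
        seg_a = a[bounds[i]:bounds[i + 1]]
        seg_d = d[bounds[i]:bounds[i + 1]]
        corrs.append(abs(np.corrcoef(seg_a, seg_d)[0, 1]))
    k = int(np.argmin(corrs))  # main noise region
    seg = d[bounds[k]:bounds[k + 1]]
    # Universal threshold with a robust noise estimate (an assumed rule).
    thr = np.median(np.abs(seg)) / 0.6745 * np.sqrt(2 * np.log(len(x)))
    d[bounds[k]:bounds[k + 1]] = np.sign(seg) * np.maximum(np.abs(seg) - thr, 0)
    return haar_idwt(a, d)
```

A low correlation between detail and approximation coefficients in a sub-interval suggests the details there carry no signal structure, which motivates thresholding only that region.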
To overcome the shortcoming that the traditional minimum error threshold method obtains satisfactory image segmentation results only when the object and background of the image strictly obey a certain type of probability distribution, we propose the regularized minimum error threshold method, of which the traditional method is a special case. A discrete probability distribution is then constructed from the separations between the segmentation threshold and the average grey-scale values of the object and background, and the information energy of this distribution is computed. The impact of the regularization parameter on the optimal segmentation threshold of the regularized method is investigated. To verify the effectiveness of the proposed method, typical grey-scale images are selected and segmentation tests are performed, and the results are compared with those of the traditional minimum error threshold method. The segmentation results and their analysis show that the regularized minimum error threshold method is feasible and produces more satisfactory results than the traditional method. It is also little affected by the addition of a certain amount of noise to an image, so it can meet the requirements for extracting real objects in noisy environments.
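For reference, the traditional minimum error (Kittler-Illingworth) criterion that this paper generalizes can be sketched as a histogram scan. This is the classic baseline only, not the regularized method the abstract proposes; it assumes the object and background are roughly Gaussian.

```python
import numpy as np

def min_error_threshold(hist):
    """Kittler-Illingworth minimum error threshold on a grey-level histogram.
    Minimizes J(T) = 1 + 2*(P1*ln s1 + P2*ln s2) - 2*(P1*ln P1 + P2*ln P2)."""
    p = np.asarray(hist, dtype=float)
    p = p / p.sum()
    g = np.arange(len(p))
    best_t, best_j = 1, np.inf
    for t in range(1, len(p) - 1):
        p1, p2 = p[:t].sum(), p[t:].sum()
        if p1 < 1e-9 or p2 < 1e-9:
            continue
        m1 = (g[:t] * p[:t]).sum() / p1
        m2 = (g[t:] * p[t:]).sum() / p2
        v1 = ((g[:t] - m1) ** 2 * p[:t]).sum() / p1  # class variances
        v2 = ((g[t:] - m2) ** 2 * p[t:]).sum() / p2
        if v1 < 1e-9 or v2 < 1e-9:
            continue
        j = 1 + 2 * (p1 * np.log(np.sqrt(v1)) + p2 * np.log(np.sqrt(v2))) \
              - 2 * (p1 * np.log(p1) + p2 * np.log(p2))
        if j < best_j:
            best_t, best_j = t, j
    return best_t
```

When the two populations really are Gaussian, the minimizer of J(T) approximates the Bayes-optimal decision boundary; the paper's regularization relaxes exactly this distributional assumption.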
As an important model for explaining the seismic rupture mode, the asperity model plays an important role in studying the stress accumulation of faults and the location of earthquake initiation. Taking the Qilian-Haiyuan fault as an example, this paper combines a geodetic method with the b-value method to propose a multi-source observation data fusion detection method, named the dual threshold search method, that accurately determines asperity boundaries. The method is based on the criterion that the asperity boundary derived from b-values should be most consistent with the boundary derived from the slip deficit rate. The optimal threshold combination of slip deficit rate and b-value is then obtained through a threshold search and used to delineate the asperity boundary. Based on this method, the study finds four potential asperities on the Qilian-Haiyuan fault: two (A1 and A2) on the Tuolaishan segment, and two more (B and C) on the Lenglongling and Jinqianghe segments, respectively. The lengths of asperities A1 and A2 on the Tuolaishan segment are 17.0 km and 64.8 km, with lower boundaries at 5.5 km and 15.5 km, respectively. Asperity B on the Lenglongling segment is 70.7 km long with a lower boundary at 10.2 km, and asperity C on the Jinqianghe segment is 42.3 km long with a lower boundary at 8.3 km.
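The dual-threshold search can be sketched as a grid search over threshold pairs that maximizes the agreement between the two asperity masks. Everything here is an illustrative assumption: the consistency measure (intersection-over-union), the convention that asperities have high slip deficit rate and low b-value, and the synthetic fields; the paper's actual consistency criterion is not spelled out in the abstract.

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two boolean masks (agreement score)."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 0.0

def dual_threshold_search(slip_deficit, b_value, sd_grid, b_grid):
    """Search the (slip-deficit-rate, b-value) threshold pair whose asperity
    masks agree best: high slip deficit rate AND low b-value."""
    best = (None, None, -1.0)
    for sd_t in sd_grid:
        for b_t in b_grid:
            score = iou(slip_deficit >= sd_t, b_value <= b_t)
            if score > best[2]:
                best = (sd_t, b_t, score)
    return best
```

With real data the two masks will never coincide exactly, so the maximizing pair defines the most mutually consistent asperity boundary rather than a perfect overlap.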
The purpose of this study is to apply different thresholding techniques to mammogram images and determine which technique best extracts malignant and benign tumors from the surrounding breast tissue. The baseline technique is the Otsu method, because it is one of the most effective methods for most real-world scenes with regard to uniformity and shape measures. We also present all the thresholding methods based on the concept of between-class variance. The experimental results show that all the tested thresholding techniques work well in detecting normal breast tissue. For abnormal tissue (breast tumors), however, only the neighborhood valley emphasis method gave the best detection of malignant tumors. The results also demonstrate that the variance and intensity contrast technique is the best at extracting the microcalcifications that represent the first signs of breast cancer.
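The between-class variance idea shared by Otsu's method and its variants can be sketched compactly. This is the standard single-threshold Otsu scan over a grey-level histogram, a baseline sketch rather than the neighborhood valley emphasis or variance-and-intensity-contrast variants discussed above.

```python
import numpy as np

def otsu_threshold(hist):
    """Otsu's method: pick the level maximizing the between-class variance
    sigma_b^2(t) = (mu_T * w0(t) - mu(t))^2 / (w0(t) * w1(t))."""
    p = np.asarray(hist, dtype=float) / np.sum(hist)
    g = np.arange(len(p))
    w0 = np.cumsum(p)        # class-0 probability up to each level
    mu = np.cumsum(g * p)    # first moment up to each level
    mu_t = mu[-1]            # global mean
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    sigma_b = np.zeros_like(p)
    sigma_b[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return int(np.argmax(sigma_b))
```

The valley-emphasis variants mentioned in the abstract keep this criterion but weight it by how empty the histogram is at the candidate threshold, favoring thresholds in histogram valleys.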
Debris flows are the type of natural disaster most closely associated with human activities. They are widely distributed and frequently activated. Rainfall is an important component of debris flows and the most active factor when they occur; it also determines the temporal and spatial distribution characteristics of the hazards. A reasonable rainfall threshold is essential to ensuring the accuracy of debris flow pre-warning. Such a threshold is important for studying the mechanisms of debris flow formation, predicting the characteristics of future activity, and designing prevention and engineering control measures. Most mountainous areas have little data on rainfall and hazards, especially in debris flow forming regions, so neither the traditional demonstration method nor the frequency calculation method can satisfy debris flow pre-warning requirements. This study characterizes pre-warning regions by their rainfall, hydrologic, and topographic conditions. An analogous area with abundant data and the same conditions as the pre-warning region is selected, and the rainfall threshold is calculated by proxy. This method resolves the problem of debris flow pre-warning in areas lacking data and provides a new approach for debris flow pre-warning in mountainous areas.
The discrimination of neutrons from gamma rays in a mixed radiation field is crucial in neutron detection tasks. Several approaches have been proposed to enhance the performance and accuracy of neutron-gamma discrimination, but their performance often depends on factors such as experimental requirements and the resulting mixed signals. The main purpose of this study is to achieve fast and accurate neutron-gamma discrimination without a priori information on the signal to be analyzed or on the experimental setup. A novel method is proposed based on two concepts. The first exploits nonnegative tensor factorization (NTF) as a blind source separation method to extract the original components from the mixture signals recorded at the output of a stilbene scintillator detector. The second is based on the principles of support vector machines (SVM) to identify and discriminate these components. In addition, the Mexican-hat function is adopted as a continuous wavelet transform to characterize the components extracted by the NTF model. The resulting scalograms are processed as colored images and segmented into two distinct classes using the Otsu thresholding method, extracting the features of interest of the neutron and gamma-ray components from the background noise. Principal component analysis is then used to select the most significant of these features, which form the training and testing datasets for the SVM. Bias-variance analysis is used to optimize the SVM model by finding the level of model complexity with the highest possible generalization performance, and the obtained results verify a suitable bias-variance trade-off value. We achieved an operational SVM prediction model for neutron-gamma classification with a high true-positive rate. The accuracy and performance of the NTF-based SVM were evaluated and validated by comparison with the charge comparison method via the figure of merit. The results indicate that the proposed approach has superior discrimination quality (figure of merit of 2.20).
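The Mexican-hat (Ricker) continuous wavelet transform used to build the scalograms can be sketched directly in numpy. This is a generic illustration of the transform itself, not the paper's pipeline; the kernel length heuristic and scale choices are assumptions.

```python
import numpy as np

def ricker(points, a):
    """Mexican-hat (Ricker) wavelet: normalized negative second derivative
    of a Gaussian, sampled at `points` positions for scale `a`."""
    t = np.arange(points) - (points - 1) / 2.0
    x = t / a
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1 - x ** 2) * np.exp(-x ** 2 / 2.0)

def cwt_scalogram(signal, widths):
    """|CWT| magnitude matrix (scales x time) by convolving the signal
    with Ricker wavelets of increasing width."""
    signal = np.asarray(signal, dtype=float)
    out = np.empty((len(widths), len(signal)))
    for i, w in enumerate(widths):
        wav = ricker(min(10 * int(w), len(signal)), w)
        out[i] = np.abs(np.convolve(signal, wav, mode="same"))
    return out
```

Each row of the matrix is the response at one scale; stacking the rows gives the 2-D scalogram image that a thresholding step such as Otsu's can then segment into signal and background classes.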
Among segmentation techniques, the Otsu thresholding method is widely used. The line intercept histogram based Otsu thresholding method (LIH Otsu method) is more resistant to Gaussian noise, highly efficient in computing time, and easily extended to multilevel thresholding, but it performs poorly when images contain salt-and-pepper noise. An improved LIH Otsu method (ILIH Otsu method) is presented that is more resistant to both Gaussian noise and salt-and-pepper noise and can likewise be extended to multilevel thresholding. To improve efficiency, an optimization algorithm based on the kinetic-molecular theory (KMTOA) is used to determine the optimal thresholds. The experimental results show that the ILIH Otsu method has stronger anti-noise ability than the two-dimensional Otsu thresholding method (2-D Otsu method), the LIH Otsu method, the K-means clustering algorithm, and the fuzzy clustering algorithm.
In this paper, a comprehensive energy function is used to formulate the three most popular objective functions (Kapur's, Otsu's, and Tsallis's) for effective multilevel color image thresholding. These energy-based objective criteria are combined with the proficient search capability of swarm-based algorithms to improve efficiency and robustness. The proposed multilevel thresholding approach accurately determines the optimal threshold values from the generated energy curve and acutely distinguishes different objects within multi-channel complex images. Performance evaluation indices and experiments on different test images illustrate that Kapur's entropy, aided by differential evolution and the bacterial foraging optimization algorithm, generates the most accurate and visually pleasing segmented images.
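Of the three criteria named above, Kapur's entropy is the one the experiments favor; its single-threshold form can be sketched as follows. This is the classic criterion only (maximize the sum of the two class entropies), not the paper's energy-curve formulation or its swarm-based multilevel search.

```python
import numpy as np

def kapur_threshold(hist):
    """Kapur's entropy criterion: choose the threshold maximizing the sum of
    the Shannon entropies of the two classes it induces."""
    p = np.asarray(hist, dtype=float) / np.sum(hist)
    best_t, best_h = 1, -np.inf
    for t in range(1, len(p)):
        w0 = p[:t].sum()
        w1 = 1.0 - w0
        if w0 < 1e-12 or w1 < 1e-12:
            continue  # one class would be empty
        p0 = p[:t][p[:t] > 0] / w0   # class-conditional distributions
        p1 = p[t:][p[t:] > 0] / w1
        h = -(p0 * np.log(p0)).sum() - (p1 * np.log(p1)).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t
```

For K thresholds the criterion generalizes to a sum of K+1 class entropies, and the search space grows combinatorially, which is why the paper pairs it with swarm optimizers such as differential evolution and bacterial foraging.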
Under the complex conditions of a nuclear power plant, many influencing factors may distort on-line monitoring data, so the data must be de-noised to ensure the accuracy of diagnosis. Based on research into wavelet analysis and threshold de-noising, a new threshold de-noising method based on the Mallat transform is proposed, which adopts a factor weighting method for threshold quantization. A specific nuclear power plant case verifies the validity and superiority of the algorithm.
The Sierra Norte de Puebla, Mexico, has a record of hundreds of mass removal processes triggered by rainfall, where the intensity and duration of the rain are the main mechanisms. To determine threshold values of precipitation as a cause of landslides, the prior, marginal, and conditional probabilities were calculated. A Bayesian method was used for one-dimensional (precipitation intensity) and two-dimensional (precipitation intensity and duration) analysis. This suggested a high probability of mass movement when precipitation exceeds 60 mm within ten days. A proposed warning system is based on classes in which the threshold is exceeded.
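The one-dimensional Bayesian calculation described above can be sketched from historical records. This is a minimal illustration with a hypothetical function name and toy data: the posterior P(landslide | rain >= R) is assembled from the prior P(L), the marginal P(rain >= R), and the likelihood P(rain >= R | L), exactly the three probabilities the abstract names.

```python
import numpy as np

def landslide_posterior(rain, slide, thresholds):
    """P(landslide | rainfall >= R) via Bayes' rule from historical records.
    `rain`: rainfall per event window; `slide`: whether a landslide occurred."""
    rain = np.asarray(rain, dtype=float)
    slide = np.asarray(slide, dtype=bool)
    p_l = slide.mean()  # prior P(L)
    post = []
    for r in thresholds:
        exceed = rain >= r
        p_e = exceed.mean()                                   # marginal P(rain >= R)
        p_e_l = exceed[slide].mean() if slide.any() else 0.0  # likelihood P(rain >= R | L)
        post.append(p_e_l * p_l / p_e if p_e > 0 else np.nan)
    return np.array(post)
```

Sweeping `thresholds` over candidate rainfall values and reading off where the posterior jumps is one way to justify a warning threshold such as the 60 mm / ten-day value reported above; the two-dimensional analysis repeats the same computation over intensity-duration pairs.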
Based on particle-in-cell technology and secondary electron emission theory, a three-dimensional simulation method for multipactor is presented in this paper. By combining the finite difference time domain method and the particle tracing method, the algorithm is self-consistent and accurate, since the interaction between electromagnetic fields and particles is properly modeled. In the time domain, the build-up of multipactor can be easily visualized, making it possible to gain deeper insight into the physical mechanism of this effect. In addition to the classic secondary electron emission model, measured practical secondary electron yields are used, which increases the accuracy of the algorithm. To validate the method, an impedance transformer and a ridge waveguide filter are studied. By analyzing the evolution of the secondaries obtained with our method, multipactor thresholds of these components are estimated and show good agreement with experimental results. Furthermore, the positions most sensitive to multipactor are determined from the phase focusing phenomenon, which is very meaningful for multipactor analysis and design.
Funding (abstract 1): 2023 Liaoning Institute of Science and Technology Doctoral Program Launch Fund (No. 2307B29).
Funding (abstract 2): Supported by the National Natural Science Foundations of China (Nos. 61136002, 61472324) and the Natural Science Foundation of Shanxi Province (No. 2014JM8331).
Funding (abstract 3): Supported by the National Key Research and Development Plan of China under Grant No. 2018YFC1503604, the National Natural Science Foundation of China under Grants No. 41721003 and No. 42074007, and the Key Laboratory of Geospace Environment and Geodesy, Ministry of Education, Wuhan University, No. 19-01-08.
Funding (abstract 5): Supported by the National Natural Science Foundation of China (Nos. 40830742 and 40901007).
Funding (abstract 6): L'Oréal-UNESCO For Women in Science Maghreb Program, Grant Agreement No. 4500410340.
Funding (abstract 8): Project (61440026) supported by the National Natural Science Foundation of China; Project (11KZ|KZ08062) supported by the Doctoral Research Project of Xiangtan University, China.
Funding (abstract 12): Project supported by the National Key Laboratory Foundation, China (Grant No. 9140C530103110C5301).