Funding: Supported by the National GNSS Research Center Program of the Defense Acquisition Program Administration and the Agency for Defense Development, and by the Ministry of Science and ICT of the Republic of Korea through the Space Core Technology Development Program (No. NRF2018M1A3A3A02065722).
Abstract: In this paper, a cardinality compensation method based on the Information-weighted Consensus Filter (ICF) using data clustering is proposed to accurately estimate the cardinality of the Cardinalized Probability Hypothesis Density (CPHD) filter. Although the joint propagation of the intensity and the cardinality distribution in the CPHD filter allows more reliable estimation of the cardinality (target number) than the PHD filter, tracking loss may occur in practice when the measurements contain heavy noise and clutter. For that reason, a cardinality compensation process is included in the CPHD filter, based on an information fusion step that combines the estimated cardinality obtained from the CPHD filter with the measured cardinality obtained through data clustering; the ICF is used for this fusion. To verify the performance of the proposed method, simulations were carried out, and they confirmed that multi-target tracking performance improved because the cardinality was estimated more accurately than with existing techniques.
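As a minimal sketch of the information fusion step described above, the scalar ICF-style combination weights each cardinality estimate by its information (inverse variance). All numbers below are hypothetical, not values from the paper:

```python
def icf_fuse(estimates, informations):
    """Information-weighted consensus fusion of scalar estimates.

    Each estimate x_i is weighted by its information I_i (inverse
    variance); the fused estimate is sum(I_i * x_i) / sum(I_i).
    """
    total_info = sum(informations)
    if total_info <= 0:
        raise ValueError("total information must be positive")
    return sum(i * x for i, x in zip(informations, estimates)) / total_info

# Fuse a filter-side cardinality estimate with a clustering-based count:
# here the filter reports 4.6 targets with higher confidence, while
# clustering reports 5.0 with lower confidence.
fused = icf_fuse([4.6, 5.0], [4.0, 1.0])
```

The fused value lands between the two inputs, pulled toward the more confident one, which is exactly the behavior the compensation step relies on.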
Funding: Supported by the National Natural Science Foundation of China (11472267 and 11372182) and the National Basic Research Program of China (2012CB937504).
文摘Although the structured light system that uses digital fringe projection has been widely implemented in three-dimensional surface profile measurement, the measurement system is susceptible to non-linear error. In this work, we propose a convenient look-up-table-based (LUT-based) method to compensate for the non-linear error in captured fringe patterns. Without extra calibration, this LUT-based method completely utilizes the captured fringe pattern by recording the full-field differences. Then, a phase compensation map is established to revise the measured phase. Experimental results demonstrate that this method works effectively.
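The LUT idea above can be sketched as follows: bin the wrapped phase, record the mean phase error per bin (standing in for the full-field differences), and subtract the looked-up error from the measured phase. This is an assumed simplification, not the paper's exact procedure:

```python
import numpy as np

def build_phase_lut(measured_phase, phase_error, n_bins=256):
    """Build a look-up table mapping wrapped phase in [0, 2*pi)
    to the mean phase error observed in each phase bin."""
    bins = (measured_phase / (2 * np.pi) * n_bins).astype(int) % n_bins
    lut = np.zeros(n_bins)
    counts = np.bincount(bins, minlength=n_bins)
    sums = np.bincount(bins, weights=phase_error, minlength=n_bins)
    nonzero = counts > 0
    lut[nonzero] = sums[nonzero] / counts[nonzero]
    return lut

def compensate(measured_phase, lut):
    """Revise the measured phase by subtracting the looked-up error."""
    n_bins = len(lut)
    bins = (measured_phase / (2 * np.pi) * n_bins).astype(int) % n_bins
    return measured_phase - lut[bins]
```

Because the error of a fringe-projection system is roughly periodic in the wrapped phase, a one-dimensional table indexed by phase is usually enough to capture it.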
Funding: Supported by the National Science and Technology Major Project of China (No. 2011ZX05024-001-03), the Natural Science Basic Research Plan in Shaanxi Province of China (No. 2021JQ-588), and the Innovation Fund for Graduate Students of Xi'an Shiyou University (No. YCS17111017).
Abstract: The low-pass filtering effect of the Earth causes the stratum to absorb and attenuate the high-frequency components of seismic signals during propagation; hence, seismic data have low resolution. Considering the limitations of traditional high-frequency compensation methods, this paper presents a new method based on the adaptive generalized S transform. Building on the attenuation law of the frequency spectrum of seismic signals, the Gaussian window function of the adaptive generalized S transform is used to fit the attenuation trend of the signals and find the optimal Gaussian window. An amplitude spectrum compensation function constructed from this optimal window is then used to modify the time-frequency spectrum of the adaptive generalized S transform and reconstruct the seismic signals, compensating for the high-frequency attenuation. Processing results on field data show that the method can compensate for the high-frequency components absorbed and attenuated by the stratum, thereby effectively improving the resolution and quality of seismic data.
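A loose stand-in for the window-fitting step can be written in a few lines: since a Gaussian trend is quadratic in frequency after taking logs, `np.polyfit` recovers it, and the compensation gain is the ratio of the fitted peak to the fitted trend, clipped to limit noise boosting. This sketch omits the S-transform itself and is only an assumption about the spectral fitting:

```python
import numpy as np

def gaussian_trend_compensation(freqs, amp, max_gain=10.0):
    """Fit a Gaussian trend A(f) = a*exp(-(f-mu)^2/(2*s^2)) to an
    amplitude spectrum and return a clipped compensation gain.

    log A(f) is quadratic in f, so an ordinary polynomial fit of
    degree 2 recovers the trend; the gain A_max / A_fit(f) lifts
    the attenuated high frequencies, capped at max_gain.
    """
    coeffs = np.polyfit(freqs, np.log(amp), 2)   # quadratic in f
    fit = np.exp(np.polyval(coeffs, freqs))      # fitted Gaussian trend
    gain = fit.max() / fit
    return np.clip(gain, 1.0, max_gain)
```

The cap on the gain matters in practice: without it, the compensation function amplifies high-frequency noise as aggressively as it amplifies signal.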
Abstract: For rapid compensation of the influence of the Earth's disturbing gravity field on trajectory calculation, the key point is how to derive analytical solutions for the partial derivatives of the burnout-point state with respect to the launch data. This paper therefore addresses two issues: first, based on an approximate analytical solution to the equation of motion for the vacuum flight section of a long-range rocket, it derives analytical solutions for the partial derivatives of the burnout-point state with respect to the rate of change of the final-stage pitch program; second, based on the propagation mechanism of initial positioning and orientation errors, it proposes an analytical formula for the partial derivatives of the burnout-point state with respect to the launch azimuth. The correction data are simulated and verified under different circumstances, with the following results: (1) the agreement between the analytical solutions and the results obtained via the difference method exceeds 90%, while the ratio of their computation times is below 0.2%, demonstrating both the accuracy of the data corrections and the speed advantage; (2) after the analytical solutions are compensated, the longitudinal landing deviation of the rocket is less than 20 m and the lateral landing deviation is less than 10 m, showing that the corrected data meet the hit-accuracy requirements of a long-range rocket.
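The comparison against the difference method in result (1) follows a generic pattern: evaluate the analytical partial derivative and a central-difference approximation at the same point, then measure their agreement. The function below is a hypothetical stand-in, not the rocket dynamics:

```python
def central_difference(f, x, h=1e-6):
    """Central-difference approximation to df/dx, the numerical
    baseline against which an analytical derivative is checked."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Hypothetical check: a toy function with a known derivative plays the
# role of the burnout-point state, and the closed form plays the role
# of the paper's analytical partial derivative.
f = lambda x: x ** 3
analytic = lambda x: 3 * x ** 2
x0 = 2.0
agreement = 1 - abs(central_difference(f, x0) - analytic(x0)) / analytic(x0)
```

For smooth dynamics the agreement is essentially 1; the analytical form wins on speed because it avoids the repeated trajectory integrations the difference method needs.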
Abstract: On the basis of an analysis of the error sources in a multibeam echosounding system, a data processing method for compensating systematic errors in multibeam surveys is proposed. To improve the accuracy of the overall swath, a data fusion technique that uses single-beam survey data as control information for combined single-beam and multibeam echosounding is then presented. Questions involved in solving the adjustment problem, such as its feasibility and numerical stability, are discussed in detail, and a two-step adjustment method is suggested. Finally, a practical survey data set is used as a case study to demonstrate the efficiency and reliability of the proposed methods.
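One way the control idea above can work, sketched under assumptions (a constant gain/offset systematic error model, which is far simpler than the paper's adjustment): fit the multibeam depths against the single-beam control depths at common points by least squares, then invert the model to correct the whole swath:

```python
import numpy as np

def adjust_swath(multibeam, control):
    """Fit multibeam = a * control + b by least squares, where the
    control depths come from single-beam data at common points, then
    invert the fitted model to correct the multibeam depths."""
    A = np.vstack([control, np.ones_like(control)]).T
    (a, b), *_ = np.linalg.lstsq(A, multibeam, rcond=None)
    return (multibeam - b) / a
```

A real two-step adjustment would estimate beam-angle-dependent terms as well, but the principle is the same: the single-beam data anchor the absolute depth scale.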
基金supported by the National Natural Science Foundation of China(61873006)Beijing Natural Science Foundation(4204087,4212040).
Abstract: Acid production with flue gas is a complex nonlinear process with multiple variables and strong coupling. Its operating data are an important basis for state monitoring, optimal control, and fault diagnosis. However, the operating environment is complex and involves a great deal of equipment, so the data obtained by the detection instruments are heavily polluted and prone to abnormalities such as missing values and outliers. To solve this problem, a data cleaning method based on an improved random forest is proposed. First, an outlier recognition model based on isolation forest is designed to identify and eliminate the outliers in the dataset. Second, an improved random forest regression model is established: a genetic algorithm optimizes the hyperparameters of the random forest regression model, finding the optimal parameter combination in the search space so the model can predict the trend of the data. Finally, the improved random forest cleaning method compensates for the missing data after the abnormal data have been eliminated, completing the data cleaning. Results show that the proposed method can accurately eliminate and compensate for the abnormal data in the acid production process and improves the accuracy of missing-data compensation. With the cleaned data, a more accurate model can be established, which matters for subsequent temperature control: the conversion rate of SO_(2) can be further improved, thereby raising the yield of sulfuric acid and the economic benefits.
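The flag-then-impute pipeline above can be illustrated with a deliberately simplified stand-in: a robust MAD score replaces the isolation forest for outlier flagging, and a fitted polynomial trend replaces the GA-tuned random forest for imputation. Only the two-stage structure is taken from the paper:

```python
import numpy as np

def clean_series(values, z_thresh=3.5):
    """Two-stage cleaning sketch: flag outliers with a robust
    median-absolute-deviation score (stand-in for isolation forest),
    then impute flagged and missing samples from a fitted trend
    (stand-in for the GA-tuned random forest regressor)."""
    x = np.array(values, dtype=float)             # work on a copy
    med = np.nanmedian(x)
    mad = np.nanmedian(np.abs(x - med)) or 1.0    # avoid divide-by-zero
    score = 0.6745 * np.abs(x - med) / mad        # robust z-score
    bad = np.isnan(x) | (score > z_thresh)
    t = np.arange(len(x))
    coeffs = np.polyfit(t[~bad], x[~bad], 2)      # trend of the good data
    x[bad] = np.polyval(coeffs, t[bad])           # compensate bad samples
    return x
```

The point of the two stages is that outliers are removed *before* the imputation model is fitted, so spikes cannot drag the trend used to fill the gaps.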
Funding: Supported in part by the 14th Five-Year Project of the Ministry of Science and Technology of China (2021YFD2000304), the Fundamental Research Funds for the Central Universities (531118010509), and the Natural Science Foundation of Hunan Province, China (2021JJ40114).
Abstract: Automatic pavement crack detection is a critical task for maintaining pavement stability and driving safety. The task is challenging because shadows on the pavement may have intensity similar to that of cracks, which interferes with crack detection performance. To date, efficient algorithm models and training datasets for dealing with the interference caused by shadows are still lacking. To fill this gap, we make the following contributions. First, we propose a new pavement shadow and crack dataset, which contains a variety of shadow and pavement pixel-size combinations. It also covers all common crack types (linear cracks and network cracks), placing higher demands on crack detection methods. Second, we design a two-step shadow-removal-oriented crack detection approach, SROCD, which improves performance by first removing the shadow and then detecting the crack; besides shadows, the method can cope with other noise disturbances. Third, we explore the mechanism by which shadows affect crack detection and, based on it, propose a data augmentation method built on differences in brightness values, which adapts to brightness changes caused by seasonal and weather variation. Finally, we introduce a residual feature augmentation algorithm to detect the small cracks that can presage sudden disasters, and the algorithm improves the overall performance of the model. We compare our method with state-of-the-art methods on existing pavement crack datasets and on the shadow-crack dataset, and the experimental results demonstrate its superiority.
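A minimal sketch of brightness-based augmentation, assuming simple multiplicative rescaling (the paper's method is built on brightness-value differences; the factors here are hypothetical):

```python
import numpy as np

def brightness_augment(image, factors=(0.6, 0.8, 1.2, 1.4)):
    """Generate brightness-shifted variants of an 8-bit image to mimic
    the brightness changes caused by seasonal and weather variation.
    Values are rounded and clipped back to the valid [0, 255] range."""
    image = np.asarray(image, dtype=np.float64)
    return [np.clip(np.rint(image * f), 0, 255).astype(np.uint8)
            for f in factors]
```

Training on such variants exposes the detector to shadow-like intensity shifts without collecting new imagery for every lighting condition.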
Funding: Projects (61001188, 1161140319) supported by the National Natural Science Foundation of China; Project (2012ZX03001034) supported by the National Science and Technology Major Project; Project (YETP1202) supported by the Beijing Higher Education Young Elite Teacher Project, China.
Abstract: Objective speech quality is difficult to measure without the input reference speech. Mapping methods using data mining are investigated and designed to improve output-based speech quality assessment. The degraded speech is first separated into three classes (unvoiced, voiced, and silence); then the consistency between the degraded speech signal and a pre-trained reference model for each class is calculated and mapped to an objective speech quality score using data mining. A fuzzy Gaussian mixture model (GMM) is used to generate the artificial reference model, trained on perceptual linear predictive (PLP) features. Mean opinion score (MOS) mapping methods including multivariate non-linear regression (MNLR), fuzzy neural network (FNN), and support vector regression (SVR) are designed and compared with the standard ITU-T P.563 method. Experimental results show that the data-mining-based assessment methods outperform ITU-T P.563. Moreover, FNN and SVR are more effective than MNLR, and FNN performs best, with a 14.50% increase in the correlation coefficient and a 32.76% decrease in the root-mean-square MOS error.
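The MOS mapping step can be sketched with a polynomial regression standing in for the MNLR mapping (the consistency values and MOS labels below are synthetic, not the paper's data):

```python
import numpy as np

def fit_mos_mapping(consistency, mos, degree=3):
    """Stand-in for the MNLR mapping: fit a low-order polynomial from
    the per-utterance consistency measure to subjective MOS labels."""
    return np.polyfit(consistency, mos, degree)

def predict_mos(coeffs, consistency):
    """Apply the fitted mapping and clamp to the valid MOS range."""
    return np.clip(np.polyval(coeffs, consistency), 1.0, 5.0)
```

The clamp reflects that MOS is defined on a 1-to-5 scale; the FNN and SVR mappings in the paper play the same role as this polynomial but with more expressive function classes.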
Funding: Supported by the National Natural Science Foundation of China [grant number 41701415], the Science Fund Project of Wuhan Institute of Technology [grant number K201724], and the Science and Technology Development Funds Project of the Department of Transportation of Hubei Province [grant number 201900001].
Abstract: Radiometric normalization, an essential step in multi-source and multi-temporal data processing, has received considerable attention. Relative Radiometric Normalization (RRN) is primarily used to eliminate radiometric inconsistency. The radiometric transforming relation between the subject image and the reference image is an essential aspect of RRN. Aiming at accurate modeling of this relation, the learning-based nonlinear regression method Support Vector machine Regression (SVR) is used to fit the complicated radiometric transforming relation for coarse-resolution data-referenced RRN. To evaluate the effectiveness of the proposed method, a series of experiments is performed, including two synthetic-data experiments and one real-data experiment, and the proposed method is compared with methods that use linear regression, an Artificial Neural Network (ANN), or a Random Forest (RF) for modeling the radiometric transforming relation. The results show that the proposed method fits the radiometric transforming relation well and can enhance RRN performance.
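For orientation, here is the linear-regression baseline the paper compares SVR against: fit reference = gain * subject + offset per band over overlapping pixels and apply the fit to the whole subject band. This is the baseline, not the SVR method itself:

```python
import numpy as np

def relative_normalize(subject, reference):
    """Linear-regression RRN baseline: estimate a per-band gain and
    offset mapping the subject image onto the reference image by
    ordinary least squares, then transform the subject band."""
    A = np.vstack([subject.ravel(), np.ones(subject.size)]).T
    (gain, offset), *_ = np.linalg.lstsq(A, reference.ravel(), rcond=None)
    return gain * subject + offset
```

SVR replaces the single global line with a kernel regression, which is what lets it follow the curved, sensor-dependent transforming relations that a gain/offset model cannot.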
Funding: Supported by the National Natural Science Foundation of China (41774021, 41404020, 41774018, 41674082, 41504018, 41674026), the State Key Laboratory of Geo-Information Engineering (SKLGIE2016-M-3-2), and the School Project (2017503902, 2018222).
Abstract: According to the "theoretical admittance" and the "observation admittance" of the actual data, the theoretical value of the effective elastic thickness in the study area is 10 km. Combining gravity anomalies and vertical gravity gradient anomalies, the admittance function is used to construct a 1′×1′ bathymetry model over the Philippine Sea with an adaptive weighting technique. The constructed bathymetry model is most accurate when the ratio of the inversion result from vertical gravity gradient anomalies to the inversion result from gravity anomalies is 2:3. Moreover, using multi-source gravity data to predict bathymetry combines the respective strengths of gravity anomalies and vertical gravity gradient anomalies over different seafloor topography, and the accuracy is better than that of a model based on gravity anomalies or vertical gravity gradient anomalies alone. Taking shipborne sounding data as the check condition, the accuracy of the predicted model is slightly lower than that of the V18.1 model but improves on the ETOPO1 and DTU10 models by 27.17% and 39.02%, respectively. Check points at which the absolute value of the relative error of the predicted model is within 5% account for 94.25% of the total.
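The reported 2:3 combination reduces, per grid node, to a fixed-weight average of the two inversion results. A minimal sketch (the depth values are hypothetical):

```python
def fuse_bathymetry(depth_vgg, depth_ga, ratio=(2, 3)):
    """Combine, node by node, the depth inverted from vertical gravity
    gradient anomalies (depth_vgg) with the depth inverted from gravity
    anomalies (depth_ga) at the reported optimal 2:3 weighting."""
    w_vgg, w_ga = ratio
    total = w_vgg + w_ga
    return [(w_vgg * v + w_ga * g) / total
            for v, g in zip(depth_vgg, depth_ga)]

fused = fuse_bathymetry([100.0], [200.0])
```

The adaptive weighting in the paper varies this ratio with the seafloor topography; the fixed 2:3 shown here is the overall optimum it reports.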
Abstract: Several methods that use an optical tracker (OT) as a standard reference for an electromagnetic tracker (EMT) have been proposed to compensate for the EMT's output error. Usually, they use a magneto-optic tool to collect the outputs of the OT and EMT simultaneously and then compare the output of the OT with that of the EMT to compensate for the EMT's error. Although the outputs of the EMT and OT can be matched using a time stamp that records when the acquisition command is sent, the accuracy decreases when the tool moves faster because of errors in the time stamp itself. A rapid method for compensating the EMT output error is proposed: a particular scan mode of the magneto-optic tool is designed for collecting the EMT and OT outputs, and a data registration method is proposed to match them. Simulation results show that the output error of the EMT can be decreased efficiently, with compensation accuracy improved by about 15% over existing methods.
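The registration problem above, matching one tracker's samples to the other's, can be sketched as nearest-timestamp pairing over two sorted streams. This is an assumed simplification of the paper's registration method:

```python
import bisect

def register_outputs(emt, ot):
    """Match each EMT sample to the OT sample with the nearest
    timestamp.  emt and ot are lists of (timestamp, value) pairs,
    each sorted by timestamp."""
    ot_times = [t for t, _ in ot]
    pairs = []
    for t, v in emt:
        i = bisect.bisect_left(ot_times, t)
        # The nearest OT sample is either just before or just after t.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(ot)]
        j = min(candidates, key=lambda k: abs(ot_times[k] - t))
        pairs.append((v, ot[j][1]))
    return pairs
```

Pairing by measured timestamps rather than by the command-issue stamp is what removes the speed-dependent mismatch the abstract describes.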