Funding: This research was funded by the National Natural Science Fund of China [grant number 41701415], the Science Fund Project of Wuhan Institute of Technology [grant number K201724], and the Science and Technology Development Funds Project of the Department of Transportation of Hubei Province [grant number 201900001].
Abstract: Radiometric normalization, as an essential step in multi-source and multi-temporal data processing, has received critical attention. Relative Radiometric Normalization (RRN) methods are primarily used to eliminate radiometric inconsistency. The radiometric transforming relation between the subject image and the reference image is an essential aspect of RRN. Aiming at accurate modeling of this relation, a learning-based nonlinear regression method, Support Vector Regression (SVR), is used to fit the complicated radiometric transforming relation for coarse-resolution-data-referenced RRN. To evaluate the effectiveness of the proposed method, a series of experiments is performed, including two synthetic-data experiments and one real-data experiment, and the proposed method is compared with methods that use linear regression, an Artificial Neural Network (ANN), or a Random Forest (RF) for radiometric transforming relation modeling. The results show that the proposed method fits the radiometric transforming relation well and can enhance RRN performance.
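The fitted transforming relation here is an off-the-shelf nonlinear regression. A minimal sketch, assuming co-located pixel pairs from the subject and reference images have already been extracted into arrays (the file names and RBF hyperparameters are illustrative, not the authors' settings):

```python
import numpy as np
from sklearn.svm import SVR

# subject_dn: digital numbers from the subject image (n_pixels,)
# reference_dn: co-located values from the coarse-resolution reference image
subject_dn = np.load("subject_band.npy")      # hypothetical input files
reference_dn = np.load("reference_band.npy")

# Fit a nonlinear radiometric transfer function with an RBF-kernel SVR.
svr = SVR(kernel="rbf", C=100.0, epsilon=0.01)
svr.fit(subject_dn.reshape(-1, 1), reference_dn)

# Normalize the whole subject band by applying the learned mapping.
normalized = svr.predict(subject_dn.reshape(-1, 1))
```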
Abstract: Building detection in very high resolution (VHR) images is crucial for mapping and analysing urban environments. Since buildings are elevated objects, elevation data need to be integrated with images for reliable detection. This process requires two critical steps: optical-elevation data co-registration and aboveground elevation calculation, both of which remain challenging to some extent. Therefore, this paper introduces optical-elevation data co-registration and normalization techniques for generating a dataset that facilitates elevation-based building detection. To achieve accurate co-registration, a dense set of stereo-based elevations is generated and co-registered to the relevant image based on their corresponding image locations. To normalize these co-registered elevations, bare-earth elevations are detected from classification information of some terrain-level features after the image co-registration. The developed method was implemented and validated: an overall detection quality of 80% was achieved, with 94% correct detection. Together, the developed techniques successfully facilitate the incorporation of stereo-based elevations for detecting buildings in VHR remote sensing images.
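The aboveground-elevation step can be illustrated with a generic normalized-DSM computation: subtract the detected bare-earth surface from the co-registered elevations and keep what rises above a height threshold. A sketch under assumed array inputs, not the paper's exact classification-based terrain detection:

```python
import numpy as np

def normalize_elevations(dsm, dtm, min_height=2.5):
    """Aboveground elevation: surface minus bare-earth terrain.

    dsm: co-registered stereo-based surface elevations (H, W)
    dtm: bare-earth elevations interpolated from terrain-level points (H, W)
    min_height: buildings are assumed to rise at least this far (metres)
    """
    ndsm = dsm - dtm                      # aboveground (normalized) elevation
    building_mask = ndsm >= min_height    # candidate elevated objects
    return ndsm, building_mask
```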
Funding: This paper is supported by the National Natural Science Foundation of China (No. 40371107).
Abstract: In order to evaluate radiometric normalization techniques, two image normalization algorithms for absolute radiometric correction of Landsat imagery were quantitatively compared in this paper: the Illumination Correction Model proposed by Markham and Irish, and the Illumination and Atmospheric Correction Model developed by the Remote Sensing and GIS Laboratory of Utah State University. Relative noise, correlation coefficient, and slope value were used as the evaluation and comparison criteria; they were derived from pseudo-invariant features identified in multitemporal Landsat image pairs of the Xiamen (厦门) and Fuzhou (福州) areas, both located in eastern Fujian (福建) Province, China. Compared with the unnormalized images, the radiometric differences between the normalized multitemporal images were significantly reduced when the seasons of the multitemporal images differed. However, there was no significant difference between the normalized and unnormalized images under similar seasonal conditions. Furthermore, the correction results of the two algorithms are similar when the images are relatively clear with a uniform atmospheric condition. Therefore, radiometric normalization procedures should be carried out if the multitemporal images have a significant seasonal difference.
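The comparison criteria can be reproduced from pseudo-invariant feature (PIF) pairs with a simple regression. A sketch assuming the PIF pixel values for the two dates are already extracted; the relative-noise formula below is one plausible definition, since the abstract does not give the exact one:

```python
import numpy as np
from scipy import stats

# pif_t1, pif_t2: radiance of pseudo-invariant features at the two dates
pif_t1 = np.load("pif_date1.npy")   # hypothetical inputs
pif_t2 = np.load("pif_date2.npy")

slope, intercept, r, p, se = stats.linregress(pif_t1, pif_t2)
# One plausible relative-noise measure: spread of PIF differences
# relative to the mean PIF level (assumed definition).
relative_noise = np.std(pif_t2 - pif_t1) / np.mean((pif_t1 + pif_t2) / 2)

print(f"slope={slope:.3f}  r={r:.3f}  relative noise={relative_noise:.3f}")
```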
Funding: Supported by the National Science and Technology Major Projects (2008ZX05025) and the Project of National Oil and Gas Resources Strategic Constituency Survey and Evaluation of the Ministry of Land and Resources, China (XQ-2007-05).
Abstract: Edge detection and enhancement techniques are commonly used to recognize the edges of geologic bodies from potential field data. We present a new edge recognition technique based on the normalized vertical derivative of the total horizontal derivative, which combines the functions of edge detection and edge enhancement. First, we calculate the total horizontal derivative (THDR) of the potential-field data and then compute the n-order vertical derivative (VDRn) of the THDR. From the n-order vertical derivative, the peak value of the total horizontal derivative (PTHDR) is obtained using a threshold value greater than 0; this PTHDR can be used for edge detection. Second, the PTHDR is divided by the total horizontal derivative and normalized by the maximum value. Finally, we use different kinds of numerical models to verify the effectiveness and reliability of the new edge recognition technique.
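A sketch of the described pipeline on a gridded field follows. The n-order vertical derivative is computed in the wavenumber domain, a standard potential-field operator that the abstract does not spell out, so treat that step as an assumption:

```python
import numpy as np

def pthdr_edges(field, dx=1.0, dy=1.0, n=1, threshold=0.0):
    """Sketch of the described edge detector on a gridded potential field.

    1) total horizontal derivative (THDR)
    2) n-order vertical derivative of the THDR (wavenumber-domain operator,
       assumed here; the abstract does not give the implementation)
    3) keep peaks above `threshold` (PTHDR), divide by the THDR,
       and normalize by the maximum value
    """
    gy, gx = np.gradient(field, dy, dx)
    thdr = np.hypot(gx, gy)

    ky = np.fft.fftfreq(field.shape[0], dy) * 2 * np.pi
    kx = np.fft.fftfreq(field.shape[1], dx) * 2 * np.pi
    k = np.hypot(*np.meshgrid(ky, kx, indexing="ij"))
    vdr_n = np.real(np.fft.ifft2(np.fft.fft2(thdr) * k**n))

    peaks = np.where(vdr_n > threshold, vdr_n, 0.0)   # PTHDR: edge detection
    out = peaks / np.maximum(thdr, 1e-12)             # enhancement step
    return out / (out.max() + 1e-12)                  # normalize to [0, 1]
```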
Funding: Supported by the National Natural Science Foundation of China (No. 61502475) and the Importation and Development of High-Caliber Talents Project of the Beijing Municipal Institutions (No. CIT&TCD201504039).
Abstract: The performance of conventional similarity measurement methods is seriously affected by the curse of dimensionality of high-dimensional data. The reason is that the differences contributed by sparse and noisy dimensions occupy a large proportion of the similarity, so that any two results tend to appear dissimilar. A similarity measurement method for high-dimensional data based on a normalized net lattice subspace is proposed. The data range of each dimension is divided into several intervals, and the components in different dimensions are mapped onto the corresponding intervals. Only components in the same or adjacent intervals are used to calculate the similarity. To validate this method, three data types are used and seven common similarity measurement methods are compared. The experimental results indicate that the relative difference of the method increases with dimensionality and is approximately two to three orders of magnitude higher than that of the conventional methods. In addition, the similarity range of this method in different dimensions is [0, 1], which makes it suitable for similarity analysis after dimensionality reduction.
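A sketch of the interval mapping follows. The final weighting (the fraction of dimensions whose components fall in the same or adjacent intervals) is an assumption, since the abstract does not give the exact similarity formula:

```python
import numpy as np

def lattice_similarity(x, y, data_min, data_max, n_intervals=10):
    """Sketch of a normalized net-lattice subspace similarity.

    x, y: two data points (n_dims,); data_min, data_max: per-dimension
    ranges used to build the lattice. Only dimensions whose components
    fall in the same or adjacent intervals contribute; the fraction of
    such dimensions gives a similarity in [0, 1] (assumed weighting).
    """
    width = (data_max - data_min) / n_intervals
    ix = np.clip(((x - data_min) // width).astype(int), 0, n_intervals - 1)
    iy = np.clip(((y - data_min) // width).astype(int), 0, n_intervals - 1)
    close = np.abs(ix - iy) <= 1          # same or adjacent interval
    return close.mean()
```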
Funding: This work was supported by the Research Support Fund (RSF) of Symbiosis International (Deemed University), Pune, India.
Abstract: In the era of digital signal processing, as in graphics and computation systems, multiplication-accumulation is one of the prime operations. A MAC unit is a vital component of digital systems such as Fast Fourier Transform (FFT) algorithms, convolution, and image processing algorithms. Normalization architectures are used extensively in digital signal processing, mainly to perform comparison and shift operations. In this paper, an evolutionary approach for designing an optimized normalization algorithm is proposed using basic logical blocks such as multiplexers and adders. The proposed normalization algorithm is further used in designing an 8×8 bit Signed Floating-Point Multiply-Accumulate (SFMAC) architecture. Since the SFMAC accepts an 8-bit significand and a 3-bit exponent, its input can lie between −(7.96872)_(10) and +(7.96872)_(10). The proposed architecture is designed and implemented in Cadence Virtuoso using 90 and 130 nm technologies (in the Generic Process Design Kit (GPDK) and the Taiwan Semiconductor Manufacturing Company (TSMC) process, respectively). To reduce the power consumption of the proposed normalization architecture, techniques such as block enabling and clock gating are used rigorously. According to the analysis done in Cadence, the proposed architecture uses the least power compared with its current predecessors.
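The comparison-and-shift behaviour that normalization performs can be illustrated in software, although the paper's realization is a multiplexer/adder hardware design. A bit-level sketch for an unsigned significand of assumed width 8:

```python
def normalize(significand: int, exponent: int, width: int = 8):
    """Shift an unsigned significand left until its MSB is 1,
    decrementing the exponent for every shift (illustrative sketch;
    the paper's hardware does this with multiplexers and adders)."""
    if significand == 0:
        return 0, 0                       # zero has no normalized form here
    msb = 1 << (width - 1)
    while not (significand & msb):
        significand = (significand << 1) & ((1 << width) - 1)
        exponent -= 1
    return significand, exponent
```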
Abstract: The analysis of spatially correlated binary data observed on lattices is an interesting topic that catches the attention of scholars in many scientific fields, including epidemiology, medicine, agriculture, biology, geology, and geography. To overcome the difficulties encountered in fitting the autologistic regression model to such data via Bayesian and/or Markov chain Monte Carlo (MCMC) techniques, the Gaussian latent variable model has been enrolled in the methodology. Assuming a normal distribution for the latent random variable may not be realistic, however, and a wrong normality assumption might bias the parameter estimates and affect the accuracy of results and inferences. This calls for more flexible prior distributions for the latent variable in spatial models. A review of the recent literature in spatial statistics shows an increasing tendency toward models involving skew distributions, especially skew-normal ones. In this study, a skew-normal latent variable model was developed for the Bayesian analysis of spatially correlated binary data acquired on uncorrelated lattices. The proposed methodology was applied to inspect the spatial dependency and related factors of tooth caries occurrence in a sample of students of Yasuj University of Medical Sciences, Yasuj, Iran. The results indicated that the skew-normal latent variable model was valid and provided a decent fit to the caries data.
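The core modeling idea, replacing the Gaussian latent variable with a skew-normal one, can be sketched in a few lines with scipy's skew-normal distribution; the spatial autologistic terms and the Bayesian fitting are omitted, and the covariate effect and skewness parameter are illustrative:

```python
import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(0)

# Sketch: binary outcomes driven by a skew-normal latent variable.
# beta (covariate effect) and a (skewness) are illustrative values.
n, beta, a = 500, 0.8, 4.0
x = rng.normal(size=n)
latent = beta * x + skewnorm.rvs(a, size=n, random_state=rng)
y = (latent > 0).astype(int)   # observed binary response (spatial terms omitted)
```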
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 62227814, 52205040, 22279070, and U21A20170), the Natural Science Basic Research Program of Shaanxi (2023-JC-QN-0140), the Young Talent Fund of Xi'an Association for Science and Technology (Grant No. 959202313096), the Key Projects of the Shaanxi Province Natural Science Foundation (Grant No. 2025JC-QYXQ-038), the Open Foundation of the State Key Laboratory of Fluid Power and Mechatronic Systems (Grant No. GZKF-202430), and the National Key Research and Development Program of China (Grant No. 2024YFB3311204).
Abstract: Deep learning has emerged as a powerful tool for predicting the remaining useful life (RUL) of batteries, contingent upon access to ample data. However, the inherent limitations of data availability from traditional or accelerated life testing pose significant challenges. To mitigate the prediction accuracy issues arising from small sample sizes in existing intelligent methods, we introduce a novel data augmentation framework for RUL prediction. This framework harnesses the high coincidence of degradation patterns exhibited by lithium-ion batteries to pinpoint the knee point, a critical juncture marking a significant shift in the degradation trajectory. By focusing on this knee point, we leverage normalizing flow models to generate virtual data, effectively augmenting the training sample size. Additionally, we integrate a Bayesian Long Short-Term Memory network, optimized with the Box-Cox transformation, to address the uncertainty inherent in predictions based on augmented data. This integration allows for a more nuanced understanding of RUL prediction uncertainties, offering valuable confidence intervals. The efficacy and superiority of the proposed framework are validated through extensive experiments on the CS2 dataset from the University of Maryland and the CrFeMnNiCo dataset from our laboratory. The results clearly demonstrate a substantial improvement in the confidence interval of RUL predictions compared to pre-optimization, highlighting the framework's ability to achieve high-precision RUL predictions even with limited data.
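Knee-point identification can be illustrated with a common heuristic: the point of the normalized degradation curve farthest from the chord joining its endpoints. The paper's own procedure, which exploits the coincidence of degradation patterns, may differ:

```python
import numpy as np

def knee_point(capacity):
    """Locate the knee of a capacity-fade curve as the point farthest
    from the straight line joining its endpoints (a common heuristic,
    not necessarily the paper's knee-identification method)."""
    n = len(capacity)
    x = np.linspace(0.0, 1.0, n)
    y = (capacity - capacity.min()) / (capacity.max() - capacity.min())
    x0, y0, x1, y1 = x[0], y[0], x[-1], y[-1]
    # Perpendicular distance of each point from the first-to-last chord.
    dist = np.abs((y1 - y0) * x - (x1 - x0) * y + x1 * y0 - y1 * x0)
    dist /= np.hypot(y1 - y0, x1 - x0)
    return int(np.argmax(dist))   # cycle index of the knee
```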
Funding: Supported by the National Natural Science Foundation of China (10571008), the Natural Science Foundation of Henan (092300410149), and the Core Teacher Foundation of Henan (2006141).
Abstract: In this article, a partially linear single-index model for longitudinal data is investigated. Generalized penalized spline least squares estimates of the unknown parameters are suggested. All parameters can be estimated simultaneously by the proposed method while the features of longitudinal data are taken into account. The existence, strong consistency, and asymptotic normality of the estimators are proved under suitable conditions. A simulation study is conducted to investigate the finite-sample performance of the proposed method. Our approach can also be used to study the pure single-index model for longitudinal data.
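Penalized spline least squares can be sketched with a truncated-power basis and a ridge penalty on the truncated terms. This generic version omits the single-index and longitudinal structure handled by the paper's generalized estimator:

```python
import numpy as np

def penalized_spline_fit(t, y, knots, lam=1.0):
    """Penalized-spline least squares sketch: cubic truncated-power basis,
    ridge penalty on the truncated terms only (basis and penalty are
    illustrative choices, not the paper's exact specification)."""
    # Polynomial part plus truncated cubic terms (t - k)^3_+
    B = np.column_stack([np.ones_like(t), t, t**2, t**3] +
                        [np.maximum(t - k, 0.0)**3 for k in knots])
    D = np.zeros(B.shape[1])
    D[4:] = 1.0                                  # penalize truncated terms
    coef = np.linalg.solve(B.T @ B + lam * np.diag(D), B.T @ y)
    return B @ coef                              # fitted values
```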
Abstract: The leaf normal distribution is an important structural characteristic of the forest canopy. Although terrestrial laser scanners (TLS) have potential for estimating canopy structural parameters, distinguishing between leaves and non-photosynthetic structures in order to retrieve leaf normals has been challenging. Here we used an approach to accurately retrieve the leaf normals of camphorwood (Cinnamomum camphora) from TLS point cloud data. First, non-photosynthetic structures were filtered out using a curvature threshold at each point. Then, the point cloud data were segmented by a voxel method and clustered by a Gaussian mixture model within each voxel. Finally, the normal vector of each cluster was computed by principal component analysis to obtain the leaf normal distribution. We also measured leaf inclination angles and estimated their distribution, which we compared with the retrieved leaf normal distribution. The correlation coefficient between measurements and retrieved results was 0.96, indicating good agreement.
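The per-voxel clustering and normal estimation steps map directly onto standard tools. A sketch assuming curvature filtering and voxelization have already been done, with an illustrative cluster count:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

def cluster_normals(points, n_clusters=3):
    """Per-voxel step: cluster leaf points with a Gaussian mixture, then
    take each cluster's smallest principal component as the leaf normal
    (cluster count is illustrative; upstream filtering is assumed)."""
    labels = GaussianMixture(n_components=n_clusters,
                             random_state=0).fit_predict(points)
    normals = []
    for c in range(n_clusters):
        pca = PCA(n_components=3).fit(points[labels == c])
        normals.append(pca.components_[-1])   # direction of least variance
    return np.array(normals)
```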
Funding: Under the auspices of the National Natural Science Foundation of China (No. 41171143, 40771064), the Program for New Century Excellent Talents in University (No. NCET-07-0398), and the Fundamental Research Funds for the Central Universities (No. lzu-jbky-2012-k35).
Abstract: Since the reform and opening-up program started in 1978, the level of urbanization has increased rapidly in China. Rapid urban expansion and restructuring have had significant impacts on the ecological environment, especially within built-up areas. In this study, ArcGIS 10, ENVI 4.5, and Visual FoxPro 6.0 were used to analyze the human impacts on vegetation in the built-up areas of 656 Chinese cities from 1992 to 2010. First, an existing algorithm was refined to extract the boundaries of the built-up areas from Defense Meteorological Satellite Program Operational Linescan System (DMSP/OLS) nighttime light data; the improved algorithm has the advantages of high accuracy and speed. Second, a mathematical model, Human Impacts (HI), was constructed to measure the impacts of human factors on vegetation during rapid urbanization based on Advanced Very High Resolution Radiometer (AVHRR) Normalized Difference Vegetation Index (NDVI) and Moderate Resolution Imaging Spectroradiometer (MODIS) NDVI. HI values greater than zero indicate relatively beneficial effects, while values less than zero indicate proportionally adverse effects. The results were analyzed from four aspects: the size of cities (metropolises, large cities, medium-sized cities, and small cities), large regions (eastern, central, western, and northeastern China), administrative divisions of China (provinces, autonomous regions, and municipalities), and vegetation zones (humid and semi-humid forest zone, semi-arid steppe zone, and arid desert zone). Finally, we discussed how human factors affected vegetation changes in the built-up areas. We found that urban planning policies and developmental stages influenced vegetation changes in the built-up areas, and that the negative human impacts followed an inverted 'U' shape, first rising and then falling with increasing urban scale. China's national policies and social and economic development affected vegetation changes in the built-up areas. The findings can provide a scientific basis for municipal planning departments, a decision-making reference for government, and scientific guidance for sustainable development in China.
Abstract: Consider the partial linear model Y = Xβ + g(T) + e, where Y is at risk of being censored from the right, g is an unknown smooth function on [0,1], β is a one-dimensional parameter to be estimated, and e is an unobserved error. In Refs. [1,2] it was proved that the estimator of the asymptotic variance of β̂_n is consistent. In this paper, we establish the limit distribution and the law of the iterated logarithm for β̂_n, and obtain the convergence rates for β̂_n and the strong uniform convergence rates for ĝ_n.
Abstract: In view of the frequent fluctuation of garlic price under the market economy and the current state of garlic prices, this paper analyzes the price fluctuation in the circulation link of the garlic industry chain and discusses the application of a multidisciplinary approach in the agricultural industry. On the basis of the big-data platform of the garlic industry chain, a GARCH model is constructed to analyze the fluctuation law of garlic price in the circulation link, and garlic industry services are provided from the angle of price fluctuation combined with economic analysis. The research shows that the rate of change of garlic price exhibits clustering ("agglomeration") and cyclical behaviour, is characterized by fragility, left skewness, and a non-normal distribution, and that the fitted values of the GARCH model are very close to the true values. Finally, the paper looks ahead to industrial service forms from the perspective of garlic price fluctuation.
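The volatility model is a standard GARCH fit. A sketch assuming the third-party arch package and a hypothetical price series file; a plain GARCH(1,1) stands in for whatever order the paper selects:

```python
import numpy as np
from arch import arch_model   # assumes the `arch` package is installed

# prices: garlic circulation-link price series (hypothetical input file)
prices = np.load("garlic_prices.npy")
returns = 100 * np.diff(np.log(prices))   # percentage log-returns

# Illustrative GARCH(1,1); the paper's chosen specification may differ.
res = arch_model(returns, vol="Garch", p=1, q=1).fit(disp="off")
print(res.summary())
```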
Funding: Supported by the National Natural Science Foundation of China (11071022) and the Key Project of the Hubei Provincial Department of Education (D20092207).
Abstract: Consider a semiparametric regression model Y_i = X_iβ + g(t_i) + e_i, 1 ≤ i ≤ n, where Y_i is censored on the right by another random variable C_i with known or unknown distribution G. Wavelet estimators of the parametric and nonparametric parts are given by wavelet smoothing and the synthetic data method. Under general conditions, the asymptotic normality of the wavelet estimators and the convergence rates of the wavelet estimators of the nonparametric component are investigated. A numerical example is given.
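The synthetic data method replaces each possibly censored response with an inverse-probability-weighted version whose conditional mean matches that of the uncensored response (a Koul-Susarla-Van Ryzin-type transform). A sketch with a known censoring distribution G; when G is unknown it would be replaced by a Kaplan-Meier estimate:

```python
import numpy as np

def synthetic_response(z, delta, G):
    """Synthetic-data transform for right-censored responses.

    z: observed values min(Y, C) as an array
    delta: 1 if uncensored, 0 if censored
    G: callable censoring distribution, G(t) = P(C <= t)
    The values delta * z / (1 - G(z)) keep the conditional mean of the
    uncensored Y, so standard estimators can then be applied to them.
    """
    return delta * z / (1.0 - G(z))
```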
Funding: Supported by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2019-2016-0-00313) supervised by the IITP (Institute for Information & communication Technology Promotion), and by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT and Future Planning (2017R1E1A1A01074345).
Abstract: The localization accuracy of magnetic field-based localization approaches is predominantly limited by two factors: smartphone heterogeneity and short data lengths. The use of multifarious smartphones cripples the performance of such approaches owing to the variability of the magnetic field data; in the same vein, shorter magnetic field data sequences decrease the localization accuracy substantially. The current study proposes the use of multiple neural networks, namely a deep neural network (DNN), a long short-term memory network (LSTM), and a gated recurrent unit network (GRN), to perform indoor localization based on the smartphone's embedded magnetic sensor. A voting scheme is introduced that takes the neural networks' predictions into consideration to estimate the current location of the user. Contrary to conventional magnetic field-based localization approaches that rely on the raw magnetic field intensity, this study utilizes normalized magnetic field data. Training of the neural networks is carried out using Galaxy S8 data, while testing is performed with three devices: LG G7, Galaxy S8, and LG Q6. Experiments are performed at different times of day to analyze the impact of time variability. Results indicate that the proposed approach minimizes the impact of smartphone variability and elevates the localization accuracy. A performance comparison with three existing approaches reveals that the proposed approach outperforms them in mean, 50%, and 75% error, even while using less magnetic field data.
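The normalization and voting steps can be sketched simply; the z-score normalization and the tie-breaking rule below are assumptions, since the abstract does not give the exact formulas:

```python
import numpy as np

def normalize_magnetic(samples):
    """Per-sequence z-score normalization of magnetic field magnitudes,
    one simple way to suppress device-specific intensity offsets."""
    mag = np.linalg.norm(samples, axis=1)        # |B| from (x, y, z) rows
    return (mag - mag.mean()) / (mag.std() + 1e-9)

def vote(dnn_pred, lstm_pred, grn_pred):
    """Majority vote across the three networks' location predictions;
    ties fall back to the most frequent-first ordering (assumed rule)."""
    preds = [dnn_pred, lstm_pred, grn_pred]
    return max(set(preds), key=preds.count)
```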
Funding: Supported by the Frontier Program of the Knowledge Innovation Program of the Chinese Academy of Sciences.
Abstract: Based on 16-day composite MODIS (Moderate Resolution Imaging Spectroradiometer) NDVI (Normalized Difference Vegetation Index) time-series data for 2004, vegetation in the North Tibet Plateau was classified, and seasonal variations of pixels selected from different vegetation types were analyzed. The Savitzky-Golay filtering algorithm was applied to the MODIS NDVI time-series data; the processed time-series curves reflect the real variation trend of vegetation growth. The NDVI time-series curves of coniferous forest, high-cold meadow, high-cold meadow steppe, and high-cold steppe all show a single-peak pattern during vegetation growth, with the maximum occurring in August. A decision-tree classification model was established from the NDVI time-series data and land surface temperature data, and vegetation was then classified through the model based on the NDVI time-series curves. An accuracy test shows that the classification results are of high accuracy and credibility, and that the model is useful for studying climate variation and estimating vegetation production at regional and even global scales.
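Savitzky-Golay smoothing of an NDVI series is a one-liner with scipy; the window length and polynomial order below are illustrative choices, not the study's reported parameters:

```python
import numpy as np
from scipy.signal import savgol_filter

# ndvi: one pixel's 16-day composite MODIS NDVI series for 2004
# (23 values per year; hypothetical input file)
ndvi = np.load("ndvi_pixel.npy")

# Smooth the annual profile while preserving the August peak shape.
smooth = savgol_filter(ndvi, window_length=7, polyorder=3)
```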