Objective Humans are exposed to complex mixtures of environmental chemicals and other factors that can affect their health. Analysis of these mixture exposures presents several key challenges for environmental epidemiology and risk assessment, including high dimensionality, correlated exposures, and subtle individual effects. Methods We proposed a novel statistical approach, the generalized functional linear model (GFLM), to analyze the health effects of exposure mixtures. The GFLM treats the effect of mixture exposures as a smooth function by reordering exposures based on specific mechanisms and capturing internal correlations, providing meaningful estimation and interpretation. Its robustness and efficiency were evaluated under various scenarios through extensive simulation studies. Results We applied the GFLM to two datasets from the National Health and Nutrition Examination Survey (NHANES). In the first application, we examined the effects of 37 nutrients on BMI (2011–2016 cycles). The GFLM identified a significant mixture effect, with fiber and fat emerging as the nutrients with the greatest negative and positive effects on BMI, respectively. In the second application, we investigated the association between four per- and polyfluoroalkyl substances (PFAS) and gout risk (2007–2018 cycles). Unlike traditional methods, the GFLM indicated no significant association, demonstrating its robustness to multicollinearity. Conclusion The GFLM framework is a powerful tool for mixture exposure analysis, offering improved handling of correlated exposures and interpretable results. It demonstrates robust performance across various scenarios and real-world applications, advancing our understanding of complex environmental exposures and their health impacts in environmental epidemiology and toxicology.
Essential proteins are an indispensable part of cells and play an extremely significant role in genetic disease diagnosis and drug development. Therefore, the prediction of essential proteins has received extensive attention from researchers. Many centrality methods and machine learning algorithms have been proposed to predict essential proteins. Nevertheless, the topological characteristics learned by centrality methods are not comprehensive enough, resulting in low accuracy. In addition, machine learning algorithms need sufficient prior knowledge to select features, and their ability to solve imbalanced classification problems needs to be further strengthened. These two factors greatly affect the performance of essential protein prediction. In this paper, we propose a deep learning framework based on temporal convolutional networks to predict essential proteins by integrating gene expression data and the protein-protein interaction (PPI) network. We use network embedding to automatically learn richer features of proteins in the PPI network. We treat gene expression data as sequence data and use temporal convolutional networks to extract sequence features. Finally, the two types of features are integrated and fed into a multi-layer neural network to complete the final classification task. The performance of our method is evaluated by comparison with seven centrality methods, six machine learning algorithms, and two deep learning models. The experimental results show that our method is more effective than the comparison methods at predicting essential proteins.
High-dimensional and incomplete (HDI) matrices are generated in all kinds of big-data-related practical applications. A latent factor analysis (LFA) model is capable of conducting efficient representation learning on an HDI matrix, and its hyper-parameter adaptation can be implemented through a particle swarm optimizer (PSO) to meet scalability requirements. However, conventional PSO is limited by premature convergence, which leads to accuracy loss in the resultant LFA model. To address this thorny issue, this study merges the information of each particle's state migration into its evolution process, following the principle of a generalized momentum method, to improve its search ability, thereby building a state-migration particle swarm optimizer (SPSO), whose theoretical convergence is rigorously proved in this study. It is then incorporated into an LFA model for efficient hyper-parameter adaptation without accuracy loss. Experiments on six HDI matrices indicate that an SPSO-incorporated LFA model outperforms state-of-the-art LFA models in terms of prediction accuracy for the missing data of an HDI matrix, with competitive computational efficiency. Hence, SPSO ensures efficient and reliable hyper-parameter adaptation in an LFA model, thus ensuring practical and accurate representation learning for HDI matrices.
In this paper, we introduce the censored composite conditional quantile coefficient (cCCQC) to rank the relative importance of each predictor in high-dimensional censored regression. The cCCQC takes advantage of all useful information across quantiles and can effectively detect nonlinear effects, including interactions and heterogeneity. Furthermore, the proposed screening method based on the cCCQC is robust to the existence of outliers and enjoys the sure screening property. Simulation results demonstrate that the proposed method performs competitively on survival datasets with high-dimensional predictors, particularly when the variables are highly correlated.
The estimation of covariance matrices is very important in many fields, such as statistics. In real applications, data are frequently affected by high dimensionality and noise, yet most relevant studies are based on complete data. This paper studies the optimal estimation of high-dimensional covariance matrices based on missing and noisy samples under the norm. First, a model with sub-Gaussian additive noise is presented. The generalized sample covariance is then modified to define a hard thresholding estimator, and its minimax upper bound is derived. After that, the minimax lower bound is derived, and it is concluded that the estimator presented in this article is rate-optimal. Finally, numerical simulation analysis is performed. The results show that for missing samples with sub-Gaussian noise, if the true covariance matrix is sparse, the hard thresholding estimator outperforms the traditional estimation method.
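The hard-thresholding step described above can be sketched minimally in numpy. This is an illustration only: the function name and threshold value are made up for the example, and the paper's estimator additionally corrects the generalized sample covariance for missingness and noise, which this sketch omits.

```python
import numpy as np

def hard_threshold_cov(X, tau):
    """Hard-thresholding covariance estimator (sketch): compute the sample
    covariance, then zero out off-diagonal entries below tau in magnitude."""
    S = np.cov(X, rowvar=False)          # ordinary sample covariance
    T = np.where(np.abs(S) >= tau, S, 0.0)
    np.fill_diagonal(T, np.diag(S))      # variances are never thresholded
    return T

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))            # 200 samples, 5 variables
T = hard_threshold_cov(X, tau=0.1)
```

For a truly sparse population covariance, this kills the small spurious off-diagonal entries that inflate estimation error in high dimensions, which is the intuition behind the rate-optimality result.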
Cadastral Information System (CIS) is designed for the office automation of cadastral management. With the development of the market economy in China, cadastral management is facing many new problems, the most crucial of which is the temporal problem: a CIS must handle both spatial data and temporal data. This paper reviews the situation of current CIS, provides a method to manage the spatio-temporal data of a CIS, and takes the CIS for Guangdong Province as an example to explain how to realize it in practice.
The performance of conventional similarity measurement methods is seriously affected by the curse of dimensionality of high-dimensional data. The reason is that differences in sparse and noisy dimensions occupy a large proportion of the similarity, making any two results appear dissimilar. A similarity measurement method for high-dimensional data based on a normalized net lattice subspace is proposed. The data range of each dimension is divided into several intervals, and the components in different dimensions are mapped onto the corresponding intervals. Only components in the same or adjacent intervals are used to calculate the similarity. To validate this method, three data types are used and seven common similarity measurement methods are compared. The experimental results indicate that the relative difference of the method increases with dimensionality and is approximately two or three orders of magnitude higher than that of the conventional methods. In addition, the similarity range of this method in different dimensions is [0, 1], which makes it suitable for similarity analysis after dimensionality reduction.
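The interval-mapping idea can be illustrated roughly as follows. This sketch assumes components are already normalized to [0, 1]; the bin count, the adjacency rule, and the weighting of close dimensions are all illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def lattice_similarity(x, y, n_bins=10):
    """Sketch of a net-lattice similarity: divide [0, 1] into n_bins equal
    intervals per dimension and let only dimensions whose two components
    fall in the same or adjacent interval contribute to the similarity."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    bx = np.minimum((x * n_bins).astype(int), n_bins - 1)  # bin index of x
    by = np.minimum((y * n_bins).astype(int), n_bins - 1)  # bin index of y
    close = np.abs(bx - by) <= 1          # same or adjacent interval only
    if not close.any():
        return 0.0
    # close dimensions contribute 1 - |x - y|; far dimensions contribute 0,
    # so sparse/noisy dimensions cannot dominate the score
    return float(np.mean(close * (1.0 - np.abs(x - y))))
```

Because distant dimensions are simply excluded, a few noisy coordinates no longer swamp the similarity, and the result stays within [0, 1] for any dimensionality.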
Time series forecasting plays an important role in various fields, such as energy, finance, transport, and weather. Temporal convolutional networks (TCNs) based on dilated causal convolution have been widely used in time series forecasting. However, two problems weaken the performance of TCNs. One is that in dilated causal convolution, causal convolution concentrates the receptive fields of outputs in the earlier part of the input sequence, so recent input information is severely lost. The other is that the distribution shift problem in time series has not been adequately solved. To address the first problem, we propose a subsequence-based dilated convolution method (SDC). By using multiple convolutional filters to convolve elements of neighboring subsequences, the method extracts temporal features from a growing receptive field via a growing subsequence rather than a single element. Ultimately, the receptive field of each output element can cover the whole input sequence. To address the second problem, we propose a difference and compensation method (DCM). The method reduces the discrepancies between and within the input sequences by difference operations and then compensates the outputs for the information lost due to differencing. Based on SDC and DCM, we construct a temporal subsequence-based convolutional network with difference (TSCND) for time series forecasting. The experimental results show that TSCND reduces prediction mean squared error by 7.3% and saves runtime compared with state-of-the-art models and the vanilla TCN.
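The core difference-then-compensate round trip behind DCM can be sketched in a few lines. This is a simplification: per the abstract, TSCND applies the compensation to the network's outputs, whereas here we only show that a first-order difference is losslessly invertible once the lost level information (the first element) is carried along.

```python
import numpy as np

def difference(seq):
    """First-order difference, which reduces level/distribution shift;
    returns the differenced sequence plus the anchor needed to invert it."""
    seq = np.asarray(seq, dtype=float)
    return np.diff(seq), seq[0]

def compensate(diffed, anchor):
    """Restore the level information removed by differencing."""
    return np.concatenate([[anchor], anchor + np.cumsum(diffed)])

seq = np.array([3.0, 5.0, 4.0, 7.0])
d, anchor = difference(seq)       # d = [2, -1, 3], anchor = 3
restored = compensate(d, anchor)  # round-trips to the original sequence
```

Differencing makes the inputs more stationary for the convolutional layers; the compensation step is what prevents that transformation from discarding the absolute level the forecast ultimately needs.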
This paper presents a conceptual data model, the STA-model, for handling the spatial, temporal, and attribute aspects of objects in GIS. The model is developed on the basis of the object-oriented modeling approach. It includes two major parts: (a) modeling single objects by STA-object elements, and (b) modeling relationships between STA-objects. As an example, the STA-model is applied to modeling land cover change data with spatial, temporal, and attribute components.
This paper integrates genetic algorithm and neural network techniques to build new temporal predictive analysis tools for geographic information systems (GIS). These new GIS tools can be readily applied in a practical and appropriate manner in spatial and temporal research to patch the gaps in GIS data mining and knowledge discovery functions. The specific achievement here is the integration of related artificial intelligence technologies into GIS software to establish a conceptual spatial and temporal analysis framework, and the use of this framework to develop an artificial intelligent spatial and temporal information analyst (ASIA) system, which is then fully utilized in an existing GIS package. A study of air pollutant forecasting provides a practical geographical case to prove the soundness and rationality of the conceptual temporal analysis framework.
For the sequential performance of wave variational data assimilation, we propose a temporal sliding method in which temporal overlap is considered. The advantage of this method is that the initial wave spectrum of the optimization process is modified by observations at later and earlier times. This temporal sliding procedure is important for marginal regions, such as the China seas, where the duration of assimilation effectiveness is 2-3 days. Experiments were performed over the whole course of Cyclone 9403 (Russ). Around the cyclone center, the maximum values of the wave elements did not change much under assimilation, because the extreme values were determined by the wind energy input, which was not yet optimized. In the area outside the cyclone center, the modification is evident, especially for wind wave growth.
Marine information has been increasing quickly. Traditional database technologies have disadvantages in manipulating large amounts of marine information, which relates to position in 3-D space and time. Recently, greater emphasis has been placed on GIS (geographical information systems) to deal with marine information. GIS has shown great success in terrestrial applications over the last decades, but its use in marine fields has been far more restricted. One of the main reasons is that most GIS systems or their data models are designed for land applications; they cannot cope well with the nature of the marine environment and marine information, and this poses a fundamental challenge to traditional GIS and its data structures. This work designed a data model, the raster-based spatio-temporal hierarchical data model (RSHDM), for marine information systems and for knowledge discovery from spatio-temporal data, which bases itself on the nature of marine data and overcomes the shortcomings of current spatio-temporal models when they are used in this field. As an experiment, a marine fishery data warehouse (FDW) for marine fishery management was set up based on the RSHDM. The experiment proved that the RSHDM handles the data well and can easily extract the aggregations that management needs at different levels.
Mapping crop distribution with remote sensing data is of great importance for agricultural production, food security, and agricultural sustainability. Winter rape is an important oil crop, which plays an important role in the cooking oil market of China. The Jianghan Plain and Dongting Lake Plain (JPDLP) are major agricultural production areas in China. Essential changes in winter rape distribution have taken place in this area during the 21st century; however, the pattern of these changes remains unknown. In this study, the spatial and temporal dynamics of winter rape from 2000 to 2017 on the JPDLP were analyzed. An artificial neural network (ANN)-based classification method was proposed to map fractional winter rape distribution by fusing moderate resolution imaging spectroradiometer (MODIS) data and high-resolution imagery. The results are as follows: (1) The total winter rape acreage on the JPDLP dropped significantly, especially on the Jianghan Plain, with a decline of about 45% between 2000 and 2017. (2) Winter rape abundance keeps changing, with about 20–30% of croplands changing their abundance drastically in every two consecutive observation years. (3) Winter rape shows obvious regional differentiation in its trend of change at the county level, with the decreasing trend observed more strongly in the traditionally dominant agricultural counties.
Underground coal fires are one of the most common and serious geohazards in most coal-producing countries in the world. Monitoring their spatio-temporal changes plays an important role in controlling and preventing the effects of coal fires and their environmental impact. In this study, the spatio-temporal changes of underground coal fires in the Khanh Hoa coal field (north-east Vietnam) were analyzed using Landsat time-series data for the 2008-2016 period. Based on land surface temperatures retrieved from Landsat thermal data, thermal anomalies related to underground coal fires were identified using the MEDIAN+1.5×IQR (IQR: interquartile range) threshold technique. The locations of underground coal fires were validated using a coal fire map produced from field survey data and cross-validated using daytime ASTER thermal infrared imagery. Based on the fires extracted from seven Landsat thermal images, the spatio-temporal changes of underground coal fire areas were analyzed. The results showed that the thermal-anomalous zones correlated with known coal fires. Cross-validation of coal fires using ASTER TIR data showed a high consistency of 79.3%. The largest coal fire area of 184.6 hectares was detected in 2010, followed by 2014 (181.1 hectares) and 2016 (178.5 hectares). Smaller coal fire areas of 133.6 and 152.5 hectares were extracted in 2011 and 2009, respectively. Underground coal fires were mainly detected in the northern and southern parts of the coal field and tend to spread to its north-west.
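The MEDIAN+1.5×IQR threshold is simple to sketch; the temperature values below are toy numbers for illustration, not actual Landsat retrievals.

```python
import numpy as np

def thermal_anomaly_mask(lst):
    """Flag pixels whose land surface temperature (LST) exceeds
    MEDIAN + 1.5 * IQR, the threshold used to delineate coal-fire-related
    thermal anomalies."""
    q1, med, q3 = np.percentile(lst, [25, 50, 75])
    return lst > med + 1.5 * (q3 - q1)

# five ordinary pixels and one hot pixel over a fire (toy values, deg C)
lst = np.array([20.0, 21.0, 22.0, 21.5, 20.5, 45.0])
mask = thermal_anomaly_mask(lst)   # only the 45.0 pixel is flagged
```

Using the median and IQR rather than the mean and standard deviation keeps the threshold itself from being dragged upward by the very anomalies it is meant to detect.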
In large spatial, temporal, and spatio-temporal databases, the concept of dimensions – time and space – is central, and their efficient processing has always been a great concern. A Geographic Information System (GIS) is widely known for its efficient and sophisticated handling of spatial databases; however, integration of the time domain introduces additional complexities, and considerable research is ongoing in this area. Though numerous approaches have already been proposed in the literature to address the time-related challenges in the GIS community, such as the development of temporal GIS data models, this study contributes to these ongoing efforts through extensions to the Parametric Data Model. Parametric database approaches (the Parametric Data Model) have been developed to handle dimensional data such as space, time, and belief. In this study, the time dimension is of specific interest, and we propose the integration of parametric approaches into GIS to absorb the temporal burden. The temporal processing of the data is done using parametric approaches that address the limitation of GIS in supporting temporal queries and temporal analysis of phenomena. Using the AutoConViz tool, developed for spatio-temporal data modeling and visualization, results can be directly integrated into standard GIS software programs such as ArcGIS and further processed seamlessly, along with and in a similar manner to other attributes.
Problems exist in similarity measurement and index tree construction that affect the performance of nearest neighbor search for high-dimensional data. The equidistance problem is solved by using the NPsim function to calculate similarity, and a sequential NPsim matrix is built to improve indexing performance. Building on these innovations, a nearest neighbor search algorithm for high-dimensional data based on the sequential NPsim matrix is proposed and compared with nearest neighbor search algorithms based on the KD-tree and SR-tree on the Munsell spectral data set. Experimental results show that the similarity of the proposed algorithm is better than that of the other algorithms and that its search speed is thousands of times faster. In addition, the slow construction of the sequential NPsim matrix can be accelerated by parallel computing.
Spatio-temporal heterogeneous data is the basis for decision-making in many fields, and checking its accuracy can provide data support for making decisions. Due to the randomness, complexity, and global and local correlation of spatio-temporal heterogeneous data in the temporal and spatial dimensions, traditional detection methods cannot guarantee both detection speed and accuracy. Therefore, this article proposes a method for detecting the accuracy of spatio-temporal heterogeneous data by fusing graph convolutional and temporal convolutional networks. Firstly, the geographic weighting function is introduced and improved to quantify the degree of association between nodes and to calculate weighted adjacency values that simplify the complex topology. Secondly, spatio-temporal convolutional units are designed based on graph convolutional neural networks and temporal convolutional networks to improve detection speed and accuracy. Finally, the proposed method is compared with three methods, ARIMA, T-GCN, and STGCN, in real scenarios to verify its effectiveness in terms of detection speed, detection accuracy, and stability. The experimental results show that the RMSE, MAE, and MAPE of this method are the smallest under both simple and complex connectivity, at 13.82/12.08, 2.77/2.41, and 16.70/14.73, respectively, and that it also achieves the shortest detection times of 672.31/887.36, respectively. In addition, the evaluation results are consistent under different processing periods and complex topology environments, which indicates that the detection accuracy of this method is the highest and that it has good research value and application prospects.
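One common form of geographic weighting is a Gaussian kernel of inter-node distance; the sketch below assumes that form, since the abstract does not specify the paper's improved weighting function, and the bandwidth is an illustrative parameter.

```python
import numpy as np

def weighted_adjacency(coords, bandwidth):
    """Build a weighted adjacency matrix from node coordinates using a
    Gaussian geographic weighting: nearer node pairs receive larger
    weights, giving the graph convolution a distance-aware topology."""
    coords = np.asarray(coords, dtype=float)
    # pairwise Euclidean distances between all nodes
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    W = np.exp(-(d / bandwidth) ** 2)
    np.fill_diagonal(W, 0.0)   # no self-loops
    return W

coords = [(0.0, 0.0), (0.0, 1.0), (0.0, 5.0)]
W = weighted_adjacency(coords, bandwidth=2.0)
```

Replacing a raw, possibly complex topology with such smoothly decaying weights is one way to "simplify the complex topology" before feeding it to the graph convolutional units.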
Viticulturists traditionally have a keen interest in studying the relationship between the biochemistry of grapevines' leaves/petioles and their associated spectral reflectance in order to understand fruit ripening rate, water status, nutrient levels, and disease risk. In this paper, we implement imaging spectroscopy (hyperspectral) reflectance data for the reflective 330-2510 nm wavelength region (986 total spectral bands) to assess vineyard nutrient status; this constitutes a high-dimensional dataset with an ill-conditioned covariance matrix. The identification of the variables (wavelength bands) that contribute useful information for nutrient assessment and prediction plays a pivotal role in multivariate statistical modeling. In recent years, researchers have developed many continuous, nearly unbiased, sparse, and accurate variable selection methods to overcome this problem. This paper compares four regularized regression methods and one functional regression method for wavelength variable selection: Elastic Net, Multi-Step Adaptive Elastic Net, Minimax Concave Penalty, iterative Sure Independence Screening, and Functional Data Analysis. Thereafter, the predictive performance of these regularized sparse models is enhanced using stepwise regression. This comparative study of regression methods using a high-dimensional and highly correlated grapevine hyperspectral dataset revealed that Elastic Net variable selection yields the best predictive ability.
Real-time database systems contain not only transaction timing constraints but also data timing constraints. This paper discusses the temporal characteristics of data in real-time databases and offers a definition of absolute and relative temporal consistency. In real-time database systems, transaction scheduling policies often consider only the deadlines of real-time transactions, which is insufficient to ensure the temporal correctness of transactions. A policy is given that schedules real-time transactions by considering both the deadlines of transactions and the "data deadline". A real-time relational data model and a real-time relational algebra based on the characteristics of temporal data are also proposed. In this model, temporal data has not only corresponding values but also validity intervals corresponding to the data values; at the same time, the model keeps historical data values. When the validity interval of a relation is [NOW, NOW], the real-time relational algebra reduces to traditional relational algebra.
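The absolute/relative consistency definitions can be made concrete with a small sketch. The field names avi and rvi (absolute/relative validity interval) are conventional in the real-time database literature; the paper's exact definitions may differ in detail.

```python
from dataclasses import dataclass

@dataclass
class TemporalDatum:
    value: float
    timestamp: float   # time at which the value was sampled
    avi: float         # absolute validity interval: maximum permissible age

def absolutely_consistent(d: TemporalDatum, now: float) -> bool:
    """Absolute consistency: the datum, on its own, is no older than its avi."""
    return now - d.timestamp <= d.avi

def relatively_consistent(data, rvi: float) -> bool:
    """Relative consistency: every datum in the set was sampled within the
    relative validity interval rvi of every other datum."""
    ts = [d.timestamp for d in data]
    return max(ts) - min(ts) <= rvi

temp = TemporalDatum(value=21.5, timestamp=100.0, avi=5.0)
pressure = TemporalDatum(value=1.2, timestamp=102.0, avi=5.0)
```

A transaction can then be scheduled against both its own deadline and the "data deadline" `timestamp + avi` of the data it reads, which is the point of the scheduling policy described above.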
Funding: supported in part by the Young Scientists Fund of the National Natural Science Foundation of China (Grant Nos. 82304253 and 82273709), the Foundation for Young Talents in Higher Education of Guangdong Province (Grant No. 2022KQNCX021), and the PhD Starting Project of Guangdong Medical University (Grant No. GDMUB2022054).
Funding: the National Natural Science Foundation of China (Nos. 11861045 and 62162040).
Funding: supported in part by the National Natural Science Foundation of China (62372385, 62272078, 62002337), the Chongqing Natural Science Foundation (CSTB2022NSCQ-MSX1486, CSTB2023NSCQ-LZX0069), and the Deanship of Scientific Research at King Abdulaziz University, Jeddah, Saudi Arabia (RG-12-135-43).
Funding: Supported by the Outstanding Youth Foundation of the Hunan Provincial Department of Education (Grant No. 22B0911).
Abstract: In this paper, we introduce the censored composite conditional quantile coefficient (cCCQC) to rank the relative importance of each predictor in high-dimensional censored regression. The cCCQC takes advantage of all useful information across quantiles and can effectively detect nonlinear effects, including interactions and heterogeneity. Furthermore, the proposed screening method based on the cCCQC is robust to the existence of outliers and enjoys the sure screening property. Simulation results demonstrate that the proposed method performs competitively on survival datasets with high-dimensional predictors, particularly when the variables are highly correlated.
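The "pooling information across quantiles" idea can be illustrated with the check (pinball) loss averaged over a grid of quantile levels (an illustrative sketch only; the cCCQC statistic itself also handles censoring and conditioning and is defined in the paper):

```python
def pinball(u, tau):
    # Quantile check (pinball) loss at level tau for residual u.
    return tau * u if u >= 0 else (tau - 1.0) * u

def composite_quantile_loss(residuals, taus):
    # Averaging the check loss over a grid of quantile levels pools
    # information across quantiles -- the "composite" idea.
    total = sum(pinball(r, t) for t in taus for r in residuals)
    return total / (len(taus) * len(residuals))
```

A predictor whose inclusion lowers this composite loss at many quantile levels is important even if its effect is invisible at the mean.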
Abstract: The estimation of covariance matrices is very important in many fields, such as statistics. In real applications, data are frequently affected by high dimensionality and noise, yet most relevant studies assume complete data. This paper studies the optimal estimation of high-dimensional covariance matrices under the norm, based on missing and noisy samples. First, a model with sub-Gaussian additive noise is presented. The generalized sample covariance is then modified to define a hard thresholding estimator, and its minimax upper bound is derived. After that, the minimax lower bound is derived, and it is concluded that the proposed estimator is rate-optimal. Finally, numerical simulations are performed. The results show that for missing samples with sub-Gaussian noise, if the true covariance matrix is sparse, the hard thresholding estimator outperforms traditional estimation methods.
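The hard thresholding step itself is simple to state in code (a minimal sketch; the function and parameter names are assumptions, and the paper's estimator additionally corrects the sample covariance for missingness and noise before thresholding):

```python
def hard_threshold(cov, tau):
    # Keep the diagonal; zero out any off-diagonal entry smaller than
    # tau in absolute value, enforcing sparsity on the estimate.
    n = len(cov)
    return [[cov[i][j] if (i == j or abs(cov[i][j]) >= tau) else 0.0
             for j in range(n)] for i in range(n)]
```

Small off-diagonal entries of a noisy sample covariance are mostly noise when the truth is sparse, so zeroing them reduces estimation error.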
Abstract: A Cadastral Information System (CIS) is designed for the office automation of cadastral management. With the development of the market economy in China, cadastral management faces many new problems, the most crucial of which is the temporal problem: a CIS must consider both spatial data and temporal data. This paper reviews the current state of CIS, provides a method for managing the spatio-temporal data of a CIS, and takes the CIS for Guangdong Province as an example to explain how to realize it in practice.
Funding: Supported by the National Natural Science Foundation of China (No. 61502475) and the Importation and Development of High-Caliber Talents Project of the Beijing Municipal Institutions (No. CIT&TCD201504039).
Abstract: The performance of conventional similarity measurement methods is seriously affected by the curse of dimensionality of high-dimensional data. The reason is that data differences in sparse and noisy dimensions occupy a large proportion of the similarity, leading to dissimilarity between any pair of results. A similarity measurement method for high-dimensional data based on a normalized net lattice subspace is proposed. The data range of each dimension is divided into several intervals, and the components in different dimensions are mapped onto the corresponding intervals. Only components in the same or adjacent intervals are used to calculate the similarity. To validate this method, three data types are used and seven common similarity measurement methods are compared. The experimental results indicate that the relative difference of the method increases with dimensionality and is approximately two or three orders of magnitude higher than that of the conventional methods. In addition, the similarity range of this method in every dimension is [0, 1], which makes it suitable for similarity analysis after dimensionality reduction.
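The interval-mapping idea can be sketched as follows (a minimal sketch under assumed names; it fixes a common data range per dimension, whereas the paper normalizes each dimension's own range):

```python
def lattice_similarity(a, b, k=10, lo=0.0, hi=1.0):
    # Divide the data range of every dimension into k intervals and map
    # each component onto its interval; only components falling in the
    # same or an adjacent interval contribute to the similarity score.
    width = (hi - lo) / k
    def cell(x):
        return min(int((x - lo) / width), k - 1)
    hits = sum(1 for ai, bi in zip(a, b) if abs(cell(ai) - cell(bi)) <= 1)
    return hits / len(a)  # always falls in [0, 1]
```

Components that land in far-apart intervals, which are typically sparse or noisy dimensions, are simply ignored rather than allowed to dominate the score.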
Funding: Supported by the National Key Research and Development Program of China (No. 2018YFB2101300), the National Natural Science Foundation of China (Grant No. 61871186), and the Dean's Fund of the Engineering Research Center of Software/Hardware Co-Design Technology and Application, Ministry of Education (East China Normal University).
Abstract: Time series forecasting plays an important role in various fields, such as energy, finance, transport, and weather. Temporal convolutional networks (TCNs) based on dilated causal convolution have been widely used in time series forecasting. However, two problems weaken the performance of TCNs. One is that in dilated causal convolution, causality concentrates the receptive fields of outputs in the earlier part of the input sequence, so recent input information is severely lost. The other is that the distribution shift problem in time series has not been adequately solved. To address the first problem, we propose a subsequence-based dilated convolution method (SDC). By using multiple convolutional filters to convolve elements of neighboring subsequences, the method extracts temporal features from a growing receptive field via a growing subsequence rather than a single element. Ultimately, the receptive field of each output element can cover the whole input sequence. To address the second problem, we propose a difference and compensation method (DCM). The method reduces the discrepancies between and within the input sequences by difference operations and then compensates the outputs for the information lost to those operations. Based on SDC and DCM, we construct a temporal subsequence-based convolutional network with difference (TSCND) for time series forecasting. The experimental results show that TSCND reduces the prediction mean squared error by 7.3% and saves runtime compared with state-of-the-art models and the vanilla TCN.
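The difference-then-compensate pattern behind DCM can be sketched at its simplest, first-order form (a hedged sketch; DCM as described operates inside the network on sequences and their outputs, not as a standalone pre/post-processing pair):

```python
def difference(seq):
    # First-order differencing removes level shifts between observations,
    # which reduces distribution discrepancy along the sequence.
    return seq[0], [seq[i + 1] - seq[i] for i in range(len(seq) - 1)]

def compensate(base, diffs):
    # Cumulative summation restores the information removed by
    # differencing, starting from the stored base value.
    out = [base]
    for d in diffs:
        out.append(out[-1] + d)
    return out
```

The key point is that the base value is kept aside so the operation is lossless: differencing stabilizes the distribution the model sees, and compensation maps predictions back to the original scale.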
Abstract: This paper presents a conceptual data model, the STA-model, for handling the spatial, temporal, and attribute aspects of objects in GIS. The model is developed on the basis of the object-oriented modeling approach and includes two major parts: (a) modeling single objects by STA-object elements, and (b) modeling relationships between STA-objects. As an example, the STA-model is applied to modeling land cover change data with spatial, temporal, and attribute components.
Abstract: This paper integrates genetic algorithm and neural network techniques to build new temporal predictive analysis tools for geographic information systems (GIS). These new GIS tools can be readily applied in a practical and appropriate manner in spatial and temporal research to patch the gaps in GIS data mining and knowledge discovery functions. The specific achievement here is the integration of related artificial intelligence technologies into GIS software to establish a conceptual spatial and temporal analysis framework, and, using this framework, the development of an artificial intelligent spatial and temporal information analyst (ASIA) system that is fully utilized within the existing GIS package. A study of air pollutant forecasting provides a practical geographical case that demonstrates the soundness of the conceptual temporal analysis framework.
Funding: Supported by the High-Tech Research and Development Program of China (863 Program, Nos. 2001AA633070 and 2003AA604040) and the National Natural Science Foundation of China (No. 40206003).
Abstract: For the sequential performance of wave variational data assimilation, we propose a temporal sliding method that takes temporal overlap into account. The advantage of this method is that the initial wave spectrum of the optimization process is modified by observations at both later and earlier times. This temporal sliding procedure is important for marginal regions, such as the China seas, where the duration of assimilation effectiveness is 2-3 days. Experiments were performed over the whole course of Cyclone 9403 (Russ). Around the cyclone center, the maximum values of the wave elements were not changed much by assimilation, because the extreme values are determined by the wind energy input, which was not yet optimized. In the area outside the cyclone center, the modification is evident, especially for wind wave growth.
Funding: Supported by the National Key Basic Research and Development Program of China under contract No. 2006CB701305, the National Natural Science Foundation of China under contract No. 40571129, and the National High-Technology Program of China under contract Nos. 2002AA639400, 2003AA604040, and 2003AA637030.
Abstract: Marine information has been increasing quickly. Traditional database technologies have disadvantages in manipulating large amounts of marine information, which relates to position in 3-D as well as time. Recently, greater emphasis has been placed on GIS (geographical information systems) to deal with marine information. GIS has shown great success in terrestrial applications over the last decades, but its use in marine fields has been far more restricted. One of the main reasons is that most GIS systems, and their data models, are designed for land applications; they cannot cope well with the nature of the marine environment and marine information, and this becomes a fundamental challenge to traditional GIS and its data structures. This work designs a data model, the raster-based spatio-temporal hierarchical data model (RSHDM), for marine information systems and for knowledge discovery from spatio-temporal data. The model is based on the nature of marine data and overcomes the shortcomings of current spatio-temporal models when they are used in this field. As an experiment, a marine fishery data warehouse (FDW) for marine fishery management was set up based on the RSHDM. The experiment proved that the RSHDM handles the data well and can easily extract the aggregations that management needs at different levels.
Funding: Supported by the Natural Science Foundation of Hubei Province, China (2017CFB434), the National Natural Science Foundation of China (41506208 and 61501200), and the Basic Research Funds for the Yellow River Institute of Hydraulic Research, China (HKYJBYW-2016-06).
Abstract: Mapping crop distribution with remote sensing data is of great importance for agricultural production, food security, and agricultural sustainability. Winter rape is an important oil crop and plays an important role in China's cooking oil market. The Jianghan Plain and Dongting Lake Plain (JPDLP) are major agricultural production areas in China. Essential changes in winter rape distribution have taken place in this area during the 21st century, but the pattern of these changes has remained unknown. In this study, the spatial and temporal dynamics of winter rape on the JPDLP from 2000 to 2017 were analyzed. An artificial neural network (ANN)-based classification method was proposed to map fractional winter rape distribution by fusing Moderate Resolution Imaging Spectroradiometer (MODIS) data and high-resolution imagery. The results are as follows: (1) The total winter rape acreage on the JPDLP dropped significantly, especially on the Jianghan Plain, with a decline of about 45% between 2000 and 2017. (2) Winter rape abundance keeps changing, with about 20-30% of croplands changing their abundance drastically in every two consecutive observation years. (3) Winter rape shows obvious regional differentiation in its trend of change at the county level, and the decreasing trend was stronger in the traditionally dominant agricultural counties.
Funding: Funded by the ministry-level Scientific and Technological Key Programs of the Ministry of Natural Resources and Environment of Viet Nam, "Application of thermal infrared remote sensing and GIS for mapping underground coal fires in the Quang Ninh coal basin" (Grant No. TNMT.2017.08.06).
Abstract: Underground coal fires are among the most common and serious geohazards in most coal-producing countries in the world. Monitoring their spatio-temporal changes plays an important role in controlling and preventing the effects of coal fires and their environmental impact. In this study, the spatio-temporal changes of underground coal fires in the Khanh Hoa coal field (north-east Viet Nam) were analyzed using Landsat time-series data for the 2008-2016 period. Based on land surface temperatures retrieved from Landsat thermal data, thermal anomalies related to underground coal fires were identified using the MEDIAN + 1.5 × IQR (IQR: interquartile range) threshold technique. The locations of underground coal fires were validated using a coal fire map produced from field survey data and cross-validated using daytime ASTER thermal infrared imagery. Based on the fires extracted from seven Landsat thermal images, the spatio-temporal changes of underground coal fire areas were analyzed. The results showed that the thermal-anomalous zones correlated with known coal fires. Cross-validation of coal fires using ASTER TIR data showed a high consistency of 79.3%. The largest coal fire area of 184.6 hectares was detected in 2010, followed by 2014 (181.1 hectares) and 2016 (178.5 hectares). Smaller coal fire areas of 133.6 and 152.5 hectares were extracted in 2011 and 2009, respectively. Underground coal fires were mainly detected in the northern and southern parts, and tend to spread to the north-west of the coal field.
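The MEDIAN + 1.5 × IQR threshold named above is straightforward to sketch (a minimal sketch; the function name is an assumption, and in practice the rule would be applied per scene to the retrieved land surface temperatures):

```python
import statistics

def thermal_anomalies(temps):
    # Flag pixels whose land surface temperature exceeds
    # MEDIAN + 1.5 * IQR of the scene's temperature distribution.
    med = statistics.median(temps)
    q1, _, q3 = statistics.quantiles(temps, n=4)
    threshold = med + 1.5 * (q3 - q1)
    return [t for t in temps if t > threshold], threshold
```

Because the rule is built from the median and IQR rather than the mean and standard deviation, a handful of very hot fire pixels cannot inflate the threshold and mask themselves.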
Abstract: In large spatial, temporal, and spatio-temporal databases, the concept of dimensions (time and space) is central, and their efficient processing has always been a great concern. A Geographic Information System (GIS) is widely known for its efficient and sophisticated handling of spatial databases; however, integration of the time domain introduces additional complexities, and considerable research is ongoing in this area. Although numerous approaches have been proposed in the literature to address time-related challenges in the GIS community, such as the development of temporal GIS data models, this study contributes to these ongoing efforts through extensions to the Parametric Data Model. Parametric database approaches have been developed to handle dimensional data such as space, time, and belief. In this study, the time dimension is of specific interest, and we propose integrating parametric approaches into GIS to absorb the temporal burden. The temporal processing of the data is done using parametric approaches that address the limitations of GIS in supporting temporal queries and temporal analysis of phenomena. Using the AutoConViz tool, developed for spatio-temporal data modeling and visualization, results can be directly integrated into standard GIS software programs such as ArcGIS and processed seamlessly along with, and in the same manner as, other attributes.
Funding: Supported by the National Natural Science Foundation of China (No. 61300078), the Importation and Development of High-Caliber Talents Project of Beijing Municipal Institutions (No. CIT&TCD201504039), the Funding Project for Academic Human Resources Development in Beijing Union University (Nos. BPHR2014A03 and Rk100201510), and the "New Start" Academic Research Projects of Beijing Union University (No. Hzk10201501).
Abstract: Problems exist in similarity measurement and index tree construction that affect the performance of nearest-neighbor search over high-dimensional data. The equidistance problem is solved by using the NPsim function to calculate similarity, and a sequential NPsim matrix is built to improve indexing performance. Building on these innovations, a nearest-neighbor search algorithm for high-dimensional data based on the sequential NPsim matrix is proposed and compared with nearest-neighbor search algorithms based on the KD-tree and SR-tree on the Munsell spectral data set. Experimental results show that the similarity of the proposed algorithm is better than that of the other algorithms, and its search speed is thousands of times faster. In addition, the slow construction of the sequential NPsim matrix can be accelerated by parallel computing.
Funding: Supported by the National Natural Science Foundation of China under Grant 42172161, the Heilongjiang Provincial Natural Science Foundation of China under Grant LH2020F003, the Heilongjiang Provincial Department of Education Project of China under Grant UNPYSCT-2020144, the Innovation Guidance Fund of Heilongjiang Province of China under Grant 15071202202, and the Science and Technology Bureau Project of Qinhuangdao Province of China under Grant 202101A226.
Abstract: Spatio-temporal heterogeneous data is the database for decision-making in many fields, and checking its accuracy can provide data support for making decisions. Because of the randomness, complexity, and global and local correlation of spatio-temporal heterogeneous data in the temporal and spatial dimensions, traditional detection methods cannot guarantee both detection speed and accuracy. Therefore, this article proposes a method for detecting the accuracy of spatio-temporal heterogeneous data that fuses graph convolutional and temporal convolutional networks. First, a geographic weighting function is introduced and improved to quantify the degree of association between nodes and to calculate weighted adjacency values, simplifying the complex topology. Second, spatio-temporal convolutional units are designed based on graph convolutional neural networks and temporal convolutional networks to improve detection speed and accuracy. Finally, the proposed method is compared with three methods, ARIMA, T-GCN, and STGCN, in real scenarios to verify its effectiveness in terms of detection speed, detection accuracy, and stability. The experimental results show that the RMSE, MAE, and MAPE of this method are the smallest for both simple and complex connectivity degrees, at 13.82/12.08, 2.77/2.41, and 16.70/14.73, respectively, and that it achieves the shortest detection times of 672.31 and 887.36, respectively. In addition, the evaluation results are consistent across different processing periods and complex topology environments, which indicates that the detection accuracy of this method is the highest and that it has good research value and application prospects.
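The geographically weighted adjacency step can be sketched with a Gaussian distance-decay kernel and a cutoff (a hedged sketch; the kernel form, bandwidth, and cutoff are assumptions, and the paper improves on a plain geographic weighting function):

```python
import math

def weighted_adjacency(coords, bandwidth=1.0, cutoff=0.1):
    # Gaussian distance-decay weighting: nearby nodes get weights near 1,
    # distant ones decay toward 0; weights under the cutoff are dropped,
    # which simplifies a complex topology to its strong links.
    n = len(coords)
    adj = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                d = math.dist(coords[i], coords[j])
                w = math.exp(-((d / bandwidth) ** 2))
                adj[i][j] = w if w >= cutoff else 0.0
    return adj
```

The resulting sparse weighted matrix is what a graph convolution layer would consume in place of a dense, fully connected topology.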
Abstract: Viticulturists traditionally have a keen interest in studying the relationship between the biochemistry of grapevines' leaves/petioles and their associated spectral reflectance in order to understand fruit ripening rate, water status, nutrient levels, and disease risk. In this paper, we use imaging spectroscopy (hyperspectral) reflectance data for the reflective 330-2510 nm wavelength region (986 spectral bands in total) to assess vineyard nutrient status; this constitutes a high-dimensional dataset with an ill-conditioned covariance matrix. The identification of the variables (wavelength bands) that contribute useful information for nutrient assessment and prediction plays a pivotal role in multivariate statistical modeling. In recent years, researchers have successfully developed many continuous, nearly unbiased, sparse, and accurate variable selection methods to overcome this problem. This paper compares four regularized regression methods and one functional regression method for wavelength variable selection: Elastic Net, Multi-Step Adaptive Elastic Net, Minimax Concave Penalty, iterative Sure Independence Screening, and Functional Data Analysis. Thereafter, the predictive performance of these regularized sparse models is enhanced using stepwise regression. This comparative study, using a high-dimensional and highly correlated grapevine hyperspectral dataset, revealed that Elastic Net yields the best predictive ability among the variable selection methods.
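The core of the Elastic Net, which wins this comparison, is a coordinate-wise update combining l1 soft-thresholding with l2 shrinkage (a minimal sketch of the standard coordinate descent update for a standardized predictor; names and the parameterization lam/alpha are assumptions, mirroring the usual lambda/alpha convention):

```python
def soft_threshold(z, gamma):
    # l1 shrinkage operator: pulls z toward zero by gamma, clipping at 0.
    if z > gamma:
        return z - gamma
    if z < -gamma:
        return z + gamma
    return 0.0

def elastic_net_update(z, lam, alpha):
    # Coordinate-wise elastic net update: soft-threshold by the l1 part
    # (lam*alpha), then shrink by the l2 part (lam*(1-alpha)).
    # alpha=1 recovers the lasso, alpha=0 recovers ridge regression.
    return soft_threshold(z, lam * alpha) / (1.0 + lam * (1.0 - alpha))
```

The l1 part zeroes out uninformative wavelength bands, while the l2 part keeps groups of highly correlated bands in the model together instead of arbitrarily picking one, which is exactly why the method suits strongly correlated hyperspectral data.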
Funding: Project 60073045 supported by the National Natural Science Foundation of China.
Abstract: Real-time database systems contain not only transaction timing constraints but also data timing constraints. This paper discusses the temporal characteristics of data in real-time databases and offers definitions of absolute and relative temporal consistency. In real-time database systems, transaction scheduling policies often consider only the deadlines of real-time transactions, which is insufficient to ensure their temporal correctness. A policy is given that schedules real-time transactions by considering both transaction deadlines and the "data deadline". A real-time relational data model and a real-time relational algebra based on the characteristics of temporal data are also proposed. In this model, temporal data carries not only values but also validity intervals corresponding to those values, and the model is able to keep historical data values. When the validity interval of a relation is [NOW, NOW], the real-time relational algebra reduces to traditional relational algebra.
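The value-plus-validity-interval idea can be sketched as records of (value, valid_from, valid_to) queried at a time point (an illustrative sketch only; the names are assumptions, and the paper defines this at the level of a relational algebra rather than single lookups):

```python
INF = float("inf")  # stands in for NOW / an open-ended validity interval

def value_at(history, t):
    # history: list of (value, valid_from, valid_to) records; return the
    # value whose validity interval covers time t, or None if the datum
    # is temporally inconsistent (stale) at t.
    for value, start, end in history:
        if start <= t <= end:
            return value
    return None
```

Keeping closed intervals for past values and an open-ended interval for the current one is what lets the model retain history while still answering "what is valid now".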