Challenges in stratigraphic modeling arise from underground uncertainty. While borehole exploration is reliable, it remains sparse due to economic and site constraints. Electrical resistivity tomography (ERT), as a cost-effective geophysical technique, can acquire high-density data; however, the uncertainty and nonuniqueness inherent in ERT impede its use for stratigraphy identification. This paper integrates ERT and onsite observations for the first time to propose a novel method for characterizing stratigraphic profiles. The method consists of two steps: (1) ERT for prior knowledge: ERT data are processed by soft clustering using the Gaussian mixture model, followed by probability smoothing to quantify its depth-dependent uncertainty; and (2) Observations for calibration: a spatial sequential Bayesian updating (SSBU) algorithm is developed to update the prior knowledge based on likelihoods derived from onsite observations, namely topsoil and boreholes. The effectiveness of the proposed method is validated through its application to a real slope site in Foshan, China. Comparative analysis with advanced borehole-driven methods highlights the superiority of incorporating ERT data in stratigraphic modeling, in terms of prediction accuracy at borehole locations and sensitivity to borehole data. Informed by ERT, the reduced sensitivity to boreholes provides a fundamental solution to the longstanding challenge of sparse measurements. The paper further discusses the impact of ERT uncertainty on the proposed model using time-lapse measurements, the impact of model resolution, and applicability in engineering projects. This study, as a breakthrough in stratigraphic modeling, bridges gaps in combining geophysical and geotechnical data to address measurement sparsity and paves the way for more economical geotechnical exploration.
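The soft-clustering step described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the synthetic resistivity values, the two-stratum assumption, and all parameter choices are invented for the example.

```python
# Sketch: soft clustering of inverted resistivity values with a Gaussian
# mixture model, yielding per-cell stratum membership probabilities.
# The two hypothetical strata (clay ~20 ohm-m, weathered rock ~200 ohm-m)
# are illustrative assumptions, not the paper's data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
resistivity = np.concatenate([
    rng.lognormal(mean=np.log(20), sigma=0.3, size=200),
    rng.lognormal(mean=np.log(200), sigma=0.3, size=200),
]).reshape(-1, 1)

# Fit in log space, since resistivity is roughly log-normally distributed.
gmm = GaussianMixture(n_components=2, random_state=0).fit(np.log(resistivity))
probs = gmm.predict_proba(np.log(resistivity))  # soft memberships, rows sum to 1
```

The `probs` matrix plays the role of the prior stratum probabilities that the SSBU step would subsequently update with borehole likelihoods.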
Human intellection is characterized by a distinct hierarchy, which can be exploited to construct a heuristic for shortest path algorithms. This paper details how to utilize hierarchical reasoning, on the basis of greedy and directional strategies, to establish a spatial heuristic that improves the running efficiency and suitability of shortest path algorithms for traffic networks. The authors divide the urban traffic network into three hierarchies and put forward a new node hierarchy division rule to avoid unreliable shortest path solutions. It is argued that the shortest path, whether shortest in distance or in time, is usually not the favorite of drivers in practice. Factors that are difficult to anticipate or quantify greatly influence drivers' choices, making them prefer a longer but more reliable or flexible path. The presented optimum path algorithm, in addition to improving the running efficiency of shortest path algorithms by up to several times, reduces the influence of those factors, conforms to the hierarchical character of human reasoning, and is more easily accepted by drivers. Moreover, it does not require completeness of the network at the lowest hierarchy, so the applicability and fault tolerance of the algorithm are improved. Experimental results show the advantages of the presented algorithm, and the authors argue that it has great potential for application in navigation systems for large-scale traffic networks.
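The baseline that the hierarchical heuristic above builds on is a plain shortest-path search. A minimal Dijkstra sketch, with a toy road network invented for illustration:

```python
# Baseline Dijkstra shortest-path search on a weighted directed graph.
# The hierarchical heuristic in the abstract prunes this search; here only
# the unpruned baseline is shown. The toy network is an illustrative assumption.
import heapq

def dijkstra(graph, source):
    """graph: {node: [(neighbor, weight), ...]} -> {node: shortest distance}."""
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

toy = {"A": [("B", 2), ("C", 5)], "B": [("C", 1), ("D", 4)], "C": [("D", 1)], "D": []}
print(dijkstra(toy, "A")["D"])  # 4.0 via A -> B -> C -> D (2 + 1 + 1)
```

A hierarchical variant would restrict the expansion at each step to the road class appropriate for the remaining distance, which is where the several-times speedup claimed above comes from.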
Clustering, in data mining, is a useful technique for discovering interesting data distributions and patterns in the underlying data, and has many application fields, such as statistical data analysis, pattern recognition, and image processing. We combine a sampling technique with the DBSCAN algorithm to cluster large spatial databases, and two sampling-based DBSCAN (SDBSCAN) algorithms are developed. One algorithm introduces the sampling technique inside DBSCAN, and the other uses a sampling procedure outside DBSCAN. Experimental results demonstrate that our algorithms are effective and efficient in clustering large-scale spatial databases.
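The "sampling outside DBSCAN" variant can be sketched as: cluster a random sample, then assign every remaining point to the cluster of its nearest sampled core point. The data, sample size, and `eps`/`min_samples` values below are illustrative assumptions, not the paper's settings.

```python
# Sketch of sampling-outside-DBSCAN: run DBSCAN only on a sample, then
# propagate labels from the sample's core points to the full dataset.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
blob1 = rng.normal(loc=(0, 0), scale=0.3, size=(500, 2))
blob2 = rng.normal(loc=(5, 5), scale=0.3, size=(500, 2))
data = np.vstack([blob1, blob2])

sample_idx = rng.choice(len(data), size=200, replace=False)
sample = data[sample_idx]

db = DBSCAN(eps=0.5, min_samples=5).fit(sample)
core = sample[db.core_sample_indices_]
core_labels = db.labels_[db.core_sample_indices_]

# Assign all points the label of their nearest core point.
nn = NearestNeighbors(n_neighbors=1).fit(core)
_, idx = nn.kneighbors(data)
labels = core_labels[idx.ravel()]
```

Only the 200-point sample pays the DBSCAN cost; the full database is touched once, by a nearest-neighbor lookup.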
Considering the characteristics of spatial straightness error, this paper puts forward an evaluation method for spatial straightness error using a Geometric Approximation Searching Algorithm (GASA). According to the minimum condition principle of form error evaluation, the mathematical model and optimization objective of the GASA are given. The algorithm avoids optimization and linearization and can be carried out in three steps. First, construct two parallel quadrates based on the two preset reference points of the spatial line; second, construct candidate centerlines by connecting each vertex of one quadrate to each vertex of the other; then calculate the distances between the measured points and the constructed centerlines. The minimum zone straightness error is obtained by repeatedly comparing and reconstructing the quadrates. The principle and steps of the algorithm are described in detail, and the mathematical formulas and a program flowchart are also given. Results show that this algorithm can evaluate spatial straightness error effectively and exactly.
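The inner computation of the third step, the distance from the measured points to one candidate centerline, can be sketched as follows. The candidate line and the sample points are illustrative assumptions.

```python
# Sketch: for one candidate centerline (point p0, direction d), the zone
# radius is the largest perpendicular distance from the measured points to
# that line. GASA repeats this over many candidate lines and keeps the best.
import numpy as np

def zone_radius(points, p0, d):
    """Max perpendicular distance from 3D points to the line p0 + t*d."""
    d = d / np.linalg.norm(d)
    diff = points - p0
    # Perpendicular distance via the cross product with the unit direction.
    dists = np.linalg.norm(np.cross(diff, d), axis=1)
    return dists.max()

pts = np.array([[0.0, 0.1, 0.0], [1.0, -0.1, 0.0], [2.0, 0.0, 0.1]])
r = zone_radius(pts, p0=np.array([0.0, 0.0, 0.0]), d=np.array([1.0, 0.0, 0.0]))
print(r)  # ~0.1: every point lies 0.1 from the x-axis
```

The minimum zone straightness error is then the smallest such radius over all candidate centerlines generated from the quadrate vertices.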
This paper describes the nearest neighbor (NN) search algorithm on the GBD (generalized BD) tree. The GBD tree is a spatial data structure suitable for two- or three-dimensional data and has good performance characteristics with respect to dynamic data environments. In GIS and CAD systems, the R-tree and its successors have been used, and NN search algorithms have been proposed to obtain good performance from the R-tree. The GBD tree, on the other hand, is superior to the R-tree with respect to exact match retrieval, because it carries auxiliary data that uniquely determines the position of an object in the structure. The proposed NN search algorithm depends on this property of the GBD tree. The NN search algorithm on the GBD tree was studied, and its performance was evaluated through experiments.
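The GBD tree is not available in common libraries; as a stand-in, the same nearest-neighbor query pattern can be shown on a k-d tree spatial index. The point set and query location are illustrative assumptions.

```python
# Sketch of an NN query against a spatial index. scipy's cKDTree stands in
# for the GBD tree here; the query pattern (build index, then ask for the
# nearest stored point) is the same.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
points = rng.uniform(0, 100, size=(1000, 2))  # 2D spatial objects
tree = cKDTree(points)

dist, idx = tree.query([50.0, 50.0], k=1)  # nearest neighbor of the query point
nearest = points[idx]
```

The paper's contribution is doing this traversal efficiently on the GBD tree's bucket structure rather than on a k-d tree.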
Addressing the issue that flight plans between Chinese city pairs typically rely on a single route, lacking alternative paths and posing challenges in responding to emergencies, this study employs the "quantile-inflection point method" to analyze specific deviation trajectories, determine deviation thresholds, and identify commonly used deviation paths. By combining multiple similarity metrics, including Euclidean distance, Hausdorff distance, and sector edit distance, with the density-based spatial clustering of applications with noise (DBSCAN) algorithm, the study clusters deviation trajectories to construct a multi-option trajectory set for city pairs. A case study of 23,578 flight trajectories between the Guangzhou and Shanghai airport clusters demonstrates the effectiveness of the proposed framework. Experimental results show that sector edit distance achieves superior clustering performance compared to Euclidean and Hausdorff distances, with higher silhouette coefficients and lower Davies-Bouldin indices, ensuring better intra-cluster compactness and inter-cluster separation. Based on the clustering results, 19 representative trajectory options are identified, covering both nominal and deviation paths, which significantly enhance route diversity and reflect actual flight practice. This provides a practical basis for optimizing flight paths and scheduling, enhancing the flexibility of route selection for flights between city pairs.
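The clustering step can be sketched with one of the named metrics: build a symmetric Hausdorff distance matrix between trajectories and feed it to DBSCAN with a precomputed metric. The three toy trajectories and the `eps` value are illustrative assumptions, and the study's sector edit distance is not reproduced here.

```python
# Sketch: trajectory clustering via a precomputed Hausdorff distance matrix.
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from sklearn.cluster import DBSCAN

trajs = [
    np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]]),
    np.array([[0.0, 0.1], [1.0, 0.1], [2.0, 0.1]]),  # near-duplicate of the first
    np.array([[0.0, 5.0], [1.0, 5.0], [2.0, 5.0]]),  # far-away deviation path
]

n = len(trajs)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        # Symmetrize the directed Hausdorff distance.
        d = max(directed_hausdorff(trajs[i], trajs[j])[0],
                directed_hausdorff(trajs[j], trajs[i])[0])
        D[i, j] = D[j, i] = d

labels = DBSCAN(eps=0.5, min_samples=1, metric="precomputed").fit_predict(D)
print(labels)  # the two nearby trajectories share a cluster; the third is separate
```

Swapping in a different metric only changes how `D` is filled; the DBSCAN call is unchanged, which is what makes the study's metric comparison clean.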
The transitional path towards a highly renewable power system based on wind and solar energy sources is investigated, considering their intermittent and spatially distributed characteristics. Using an extensive weather-driven simulation of hourly power mismatches between generation and load, we explore the interplay between geographical resource complementarity and energy storage strategies. Solar and wind resources are considered at variable spatial scales across Europe and related to the Swiss load curve, which serves as a typical demand-side reference. The optimal spatial distribution of renewable units is further assessed through a parameterized optimization method based on a genetic algorithm. It allows us to explore systematically the effective potential of combined integration strategies depending on the sizing of the system, with a focus on how overall performance is affected by the definition of network boundaries. Upper bounds on integration schemes are provided considering both renewable penetration and the needed reserve power capacity. The quantitative trade-off between grid extension, storage, and optimal wind-solar mix is highlighted. This paper also brings insights into how the optimal geographical distribution of renewable units evolves as a function of renewable penetration and grid extent.
As an essential tool for realistic description of the current or future debris environment, the Space Debris Environment Engineering Model (SDEEM) has been developed to provide support for risk assessment of spacecraft. In contrast with SDEEM2015, SDEEM2019, the latest version, extends the orbital range from Low Earth Orbit (LEO) to Geosynchronous Orbit (GEO) for the years 1958-2050. In this paper, the improved modeling algorithms used by SDEEM2019 in propagation simulation, spatial density distribution, and spacecraft flux evaluation are presented. The debris fluxes of SDEEM2019 are compared with those of three typical models, i.e., SDEEM2015, Orbital Debris Engineering Model 3.1 (ORDEM 3.1), and Meteoroid and Space Debris Terrestrial Environment Reference (MASTER-8), in terms of two assessment modes. Three orbital cases, including the Geostationary Transfer Orbit (GTO), Sun-Synchronous Orbit (SSO), and International Space Station (ISS) orbit, are selected for the spacecraft assessment mode, and the LEO region is selected for the spatial density assessment mode. The analysis indicates that, compared with previous algorithms, the variable step-size orbital propagation algorithm based on semi-major axis control is more precise, the spatial density algorithm based on the second zonal harmonic of the non-spherical Earth gravity (J2) is more applicable, and the result of the position-centered spacecraft flux algorithm is more convergent. The comparison shows that SDEEM2019 and MASTER-8 have consistent trends due to similar modeling processes, while the differences between SDEEM2019 and ORDEM 3.1 are mainly caused by different modeling approaches for uncatalogued debris.
To reduce errors from the measurement and retrieval process, a new technology of spatial heterodyne spectroscopy is proposed. The principle of this technology and the instrument, the spatial heterodyne spectrometer (SHS), are introduced. The first application of this technology will be for CO2 measurements from space on a high-spectral-resolution observation satellite. The measurement principle and the merit of combining the retrieval algorithm with three channels (the O2 A-band and the CO2 1.58 μm and 2.06 μm bands) are theoretically analyzed and numerically simulated. Experiments using an SHS prototype with a low spectral resolution of 0.4 cm⁻¹ were carried out for preliminary validation. The measurements show clear CO2 absorption lines and follow the expected signature of the theoretical spectrum, and the retrievals agree well with GOSAT CO2 products, except for a small bias of about 4 × 10⁻⁶. The results show that the capability of spatial heterodyne spectroscopy for CO2 detection is evident, and the SHS is a competent sensor.
As a promising technique to enhance the spatial resolution of remote sensing imagery, sub-pixel mapping is based on spatial dependence theory, with the assumption that land cover is spatially dependent both within pixels and between them. Spatial attraction is used as a tool to describe this dependence. First, the spatial attractions between pixels, given by the sub-pixel/pixel spatial attraction model (SPSAM), are refined by the modified SPSAM (MSPSAM), which estimates the attractions according to the distribution of sub-pixels within neighboring pixels. Then a mixed spatial attraction model (MSAM) for sub-pixel mapping is proposed that integrates the spatial attractions both within pixels and between them. To maximize the spatial attraction expressed by the MSAM, the genetic algorithm is employed to search for the optimum solution and generate the sub-pixel mapping results. Experiments show that, compared with SPSAM, MSPSAM, and the pixel swapping algorithm modified by initialization from SPSAM (MPS), MSAM provides higher accuracy and more rational sub-pixel mapping results.
About one third of the water needed for agriculture in the world is generated by melting snow. Snow cover and surface and ground water recharge are considered sustainable and renewable resources, so it is necessary to identify and study them. The aim of this study is to determine the spatial and temporal distribution of snow cover in the Sheshpir basin in Fars province (in the south of Iran). Ground-based observation of snow cover, especially in mountainous areas, is associated with many problems due to the insufficient accuracy of optical observation, as opposed to digital observation; GIS and remote sensing technology can therefore be partially effective in solving this problem. Images from the Landsat 5 TM and Landsat 7 TM satellites were used to derive snow cover maps. The images were classified in ENVI 4.8 software using the maximum likelihood algorithm, and other spatial analyses were performed in ArcGIS 9.3 software. The maximum likelihood classification was assessed for accuracy using test points: the lowest and average overall accuracy of the produced maps were 91% and 98%, respectively, demonstrating that the maximum likelihood method performs well in image classification. Overall snow cover and the review of the terrain through the years 2008-2009 and 2009-2010 showed that snow cover begins to accumulate in November and reaches its greatest extent in February; by May, no trace of snow can be observed on the surface of the basin. On average, 34% of the basin is covered in snow from November through May.
An improved circular synthetic aperture radar (CSAR) imaging algorithm of the omega-k (ω-k) type, mainly for reconstructing an image on a cylindrical surface, is proposed. In the typical CSAR ω-k algorithm, the range trajectory is approximated by a Taylor series expansion up to the quadratic terms, which severely limits the valid synthetic aperture length and the angular reconstruction range. Based on the model of the CSAR echo signal, the proposed algorithm transforms the signal directly to the two-dimensional (2D) wavenumber domain, without approximating the range trajectory. From the form of the signal spectrum in the wavenumber domain, the formula for the wavenumber-domain interpolation of the ω-k algorithm is deduced, and the wavenumber spectrum of the reference point used for bulk compression is obtained numerically. The improved CSAR ω-k imaging algorithm greatly increases the valid synthetic aperture length and the angular area, and hence improves the angular resolution of the cylindrical imaging. Additionally, the proposed algorithm can be repeated on different cylindrical surfaces to achieve three-dimensional (3D) image reconstruction. The 3D spatial resolution of the CSAR system is discussed, and simulation results validate the correctness of the analysis and the feasibility of the algorithm.
The majority of spatial data reveal some degree of spatial dependence. The term "spatial dependence" refers to the tendency for phenomena to be more similar when they occur close together than when they occur far apart in space. This property is ignored in machine learning (ML) for spatial domains of application, and most classical machine learning algorithms are generally inappropriate unless modified in some way to account for it. In this study, we propose an approach that aims to improve an ML model to detect this dependence without incorporating any spatial features in the learning process. To detect the dependence while also improving performance, a hybrid model (HM) based on two representative algorithms was used, with cross-validation to stabilize the model. Furthermore, global Moran's I and local Moran's I were used to capture the spatial dependence in the residuals. The results show that the HM performs significantly better, with an R² of 99.91%, compared to RBFNN and RF, which achieve R² values of 74.22% and 82.26%, respectively. With lower errors, the HM achieved an average test error of 0.033% and a positive global Moran's I of 0.12. We conclude that as the R² value increases, the models become weaker in terms of capturing the dependence.
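The residual diagnostic used above, global Moran's I, can be computed directly from a spatial weight matrix. The four-cell chain of residuals and the rook-contiguity weights below are illustrative assumptions.

```python
# Global Moran's I on model residuals: positive values indicate that like
# residuals cluster in space, i.e. spatial dependence the model missed.
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x and spatial weight matrix w."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    n = len(x)
    num = n * (w * np.outer(z, z)).sum()
    den = w.sum() * (z ** 2).sum()
    return num / den

# Four cells in a line; w links adjacent cells (rook contiguity on a chain).
w = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

residuals = [1.0, 1.0, -1.0, -1.0]  # like values sit next to each other
i_stat = morans_i(residuals, w)
print(round(i_stat, 4))  # 0.3333: positive spatial autocorrelation
```

A value near zero would indicate no remaining spatial structure in the residuals, which is what the hybrid model is trying to achieve without using spatial features.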
Without any prior information about the related wireless transmitting nodes, joint estimation of the position and power of a blind signal composed of multiple co-frequency radio waves is a challenging task. Measuring signal-related data with a group of distributed sensors is an efficient way to infer the various characteristics of the signal sources. In this paper, we propose a particle swarm optimization method to estimate multiple co-frequency "blind" source nodes, based on the received power data measured by the sensors. To separate the mixed signals precisely, a genetic algorithm is applied, which further improves the estimation performance of the system. The simulation results show the efficiency of the proposed algorithm.
Although many computing algorithms have been developed to analyze the relationship between land use pattern and driving forces (RLPDF), little has been done to assess and reduce the uncertainty of predictions. In this study, we investigated RLPDF based on 1990, 2005, and 2012 datasets at two spatial scales, using eight state-of-the-art single computing algorithms and four consensus methods in the Jinjing River catchment in Hunan Province, China. At the entire catchment scale, the mean AUC values were between 0.715 (ANN) and 0.948 (RF) for the single algorithms, and from 0.764 to 0.962 for the consensus methods. At the subcatchment scale, the mean AUC values were between 0.624 (CTA) and 0.972 (RF) for the single algorithms, and from 0.758 to 0.979 for the consensus methods. The results suggest that among the eight single computing algorithms, RF performed best overall for woodland and paddy field, and the consensus methods showed higher predictive performance for woodland and paddy field models than the single computing algorithms. We compared the simulation results of the best- and worst-performing algorithms for the entire catchment in 2012 and found that approximately 72.5% of woodland and 72.4% of paddy field had probabilities of occurrence of less than 0.1, while 3.6% of woodland and 14.5% of paddy field had probabilities of occurrence of more than 0.5. In other words, the simulation errors associated with using different computing algorithms can be up to 14.5% if a probability level of 0.5 is set as the threshold. The results of this study show that the choice of modeling approach can greatly affect the accuracy of RLPDF prediction; computing algorithms for specific RLPDF tasks in specific regions have to be localized and optimized.
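One simple form of the consensus methods compared above is mean-probability averaging of single-algorithm outputs, scored by AUC. The toy model probabilities and labels below are illustrative assumptions, not the study's four consensus schemes.

```python
# Sketch: average the occurrence probabilities predicted by several single
# algorithms and score the consensus with AUC.
import numpy as np
from sklearn.metrics import roc_auc_score

y_true = np.array([0, 0, 1, 1, 1, 0])          # observed land-use occurrence
p_model_a = np.array([0.2, 0.4, 0.9, 0.7, 0.6, 0.1])
p_model_b = np.array([0.3, 0.1, 0.8, 0.9, 0.7, 0.4])

p_consensus = (p_model_a + p_model_b) / 2       # mean-probability consensus
auc = roc_auc_score(y_true, p_consensus)
print(auc)  # 1.0: here every positive outranks every negative
```

The averaging dampens the idiosyncratic errors of individual algorithms, which is consistent with the consensus AUC ranges above exceeding the single-algorithm ranges at both scales.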
A random walk Metropolis-Hastings algorithm has been widely used for sampling the spatial interaction parameter of the spatial autoregressive model from a Bayesian point of view. As an alternative approach, the griddy Gibbs sampler was proposed by [1] and utilized by [2]. This paper proposes an acceptance-rejection Metropolis-Hastings algorithm as a third approach and compares these three algorithms through Monte Carlo experiments. The experimental results show that the griddy Gibbs sampler is the most efficient of the three, whether the number of observations is small or not, in terms of computation time and inefficiency factors. Moreover, it appears to work well when the grid size is 100.
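The first of the three algorithms compared above, the random walk Metropolis-Hastings sampler, can be sketched for a scalar parameter. The target density (a standard normal log-density) and the tuning constants are illustrative assumptions, not the spatial autoregressive posterior.

```python
# Minimal random walk Metropolis-Hastings sampler for a scalar parameter.
import numpy as np

def rw_metropolis(log_target, n_iter, step, init, seed=0):
    rng = np.random.default_rng(seed)
    chain = np.empty(n_iter)
    x, lp = init, log_target(init)
    for t in range(n_iter):
        prop = x + step * rng.normal()            # random walk proposal
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept with prob min(1, ratio)
            x, lp = prop, lp_prop
        chain[t] = x
    return chain

# Sample a standard normal target as a stand-in for the posterior.
chain = rw_metropolis(lambda v: -0.5 * v * v, n_iter=5000, step=1.0, init=0.0)
```

The griddy Gibbs sampler replaces the proposal/accept step with direct draws from a conditional density tabulated on a grid, which is why its inefficiency factors come out lower in the comparison.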
Funding: National Key R&D Program of China (Grant No. 2021YFC3001003); Science and Technology Development Fund, Macao SAR (File No. 0056/2023/RIB2); Guangdong Provincial Department of Science and Technology (Grant No. 2022A0505030019).
Funding: Supported by the Open Researches Fund Program of LIESMARS (WKL(00)0302).
Funding: Supported in part by the Boeing Company and Nanjing University of Aeronautics and Astronautics (NUAA) through the project "Research on Decision Support Technology of Air Traffic Operation Management in Convective Weather" (Project 2022-GT-129), and in part by the Postgraduate Research and Practice Innovation Program of NUAA (No. xcxjh20240709).
文摘Addressing the issue that flight plans between Chinese city pairs typically rely on a single route,lacking alternative paths and posing challenges in responding to emergencies,this study employs the“quantile-inflection point method”to analyze specific deviation trajectories,determine deviation thresholds,and identify commonly used deviation paths.By combining multiple similarity metrics,including Euclidean distance,Hausdorff distance,and sector edit distance,with the density-based spatial clustering of applications with noise(DBSCAN)algorithm,the study clusters deviation trajectories to construct a multi-option trajectory set for city pairs.A case study of 23578 flight trajectories between the Guangzhou airport cluster and the Shanghai airport cluster demonstrates the effectiveness of the proposed framework.Experimental results show that sector edit distance achieves superior clustering performance compared to Euclidean and Hausdorff distances,with higher silhouette coefficients and lower Davies⁃Bouldin indices,ensuring better intra-cluster compactness and inter-cluster separation.Based on clustering results,19 representative trajectory options are identified,covering both nominal and deviation paths,which significantly enhance route diversity and reflect actual flight practices.This provides a practical basis for optimizing flight paths and scheduling,enhancing the flexibility of route selection for flights between city pairs.
文摘The transitional path towards a highly renewable power system based on wind and solar energy sources is investigated considering their intermittent and spatially distributed characteristics. Using an extensive weather-driven simulation of hourly power mismatches between generation and load, we explore the interplay between geographical resource complementarity and energy storage strategies. Solar and wind resources are considered at variable spatial scales across Europe and related to the Swiss load curve, which serve as a typical demand side reference. The optimal spatial distribution of renewable units is further assessed through a parameterized optimization method based on a genetic algorithm. It allows us to explore systematically the effective potential of combined integration strategies depending on the sizing of the system, with a focus on how overall performance is affected by the definition of network boundaries. Upper bounds on integration schemes are provided considering both renewable penetration and needed reserve power capacity. The quantitative trade-off between grid extension, storage and optimal wind-solar mix is highlighted.This paper also brings insights on how optimal geographical distribution of renewable units evolves as a function of renewable penetration and grid extent.
Abstract: As an essential tool for realistic description of the current or future debris environment, the Space Debris Environment Engineering Model (SDEEM) has been developed to support risk assessment for spacecraft. In contrast with SDEEM2015, SDEEM2019, the latest version, extends the orbital range from Low Earth Orbit (LEO) to Geosynchronous Orbit (GEO) for the years 1958-2050. In this paper, the improved modeling algorithms used by SDEEM2019 for propagation simulation, spatial density distribution, and spacecraft flux evaluation are presented. The debris fluxes of SDEEM2019 are compared with those of three typical models, i.e., SDEEM2015, the Orbital Debris Engineering Model 3.1 (ORDEM 3.1), and the Meteoroid and Space Debris Terrestrial Environment Reference (MASTER-8), in terms of two assessment modes. Three orbital cases, including the Geostationary Transfer Orbit (GTO), Sun-Synchronous Orbit (SSO), and the International Space Station (ISS) orbit, are selected for the spacecraft assessment mode, and the LEO region is selected for the spatial density assessment mode. The analysis indicates that, compared with previous algorithms, the variable step-size orbital propagation algorithm based on semi-major axis control is more precise, the spatial density algorithm based on the second zonal harmonic of the non-spherical Earth gravity field (J2) is more applicable, and the result of the position-centered spacecraft flux algorithm is more convergent. The comparison shows that SDEEM2019 and MASTER-8 have consistent trends due to similar modeling processes, while the differences between SDEEM2019 and ORDEM 3.1 are mainly caused by different modeling approaches for uncatalogued debris.
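The spatial density assessment mode reduces, at its simplest, to binning object positions into spherical altitude shells and dividing by shell volume. The sketch below uses a synthetic debris population; real engineering models derive these radii from propagated orbital elements, which is not reproduced here.

```python
import numpy as np

# Toy debris population: random geocentric radii (km) between roughly
# 400 and 2000 km altitude. These positions are synthetic stand-ins.
rng = np.random.default_rng(2)
r = rng.uniform(6778, 8378, 10000)

# 100 km spherical shells spanning the same radial range.
shells = np.arange(6778.0, 8378.0 + 1, 100.0)
counts, _ = np.histogram(r, bins=shells)

# Shell volumes: difference of sphere volumes at the shell edges.
volumes = 4.0 / 3.0 * np.pi * (shells[1:]**3 - shells[:-1]**3)
density = counts / volumes   # objects per km^3 in each shell
print(density.shape)
```

Flux at a spacecraft then follows from the local density and the relative velocity distribution, which is where the position-centered algorithm described in the abstract comes in.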
Funding: Supported by the National Natural Science Foundation of China (41175037).
Abstract: To reduce errors from the measurement and retrieval process, a new technology of spatial heterodyne spectroscopy is proposed. The principle of this technology and the instrument, the spatial heterodyne spectrometer (SHS), are introduced. The first application of this technology will be for CO2 measurements from space on a high-spectral-resolution observation satellite. The measurement principle and the advantage of combining the retrieval algorithm with three channels (the O2 A-band and the CO2 1.58 μm and 2.06 μm bands) are theoretically analyzed and numerically simulated. Experiments using an SHS prototype with a low spectral resolution of 0.4 cm^-1 are carried out for preliminary validation. The measurements show clear CO2 absorption lines that follow the expected signature of the theoretical spectrum, and the retrievals agree well with GOSAT CO2 products, except for a small bias of about 4 × 10^-6. The results demonstrate the capability of spatial heterodyne spectroscopy for CO2 detection and show that the SHS is a competent sensor.
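The basic SHS principle is that a line offset from the Littrow wavenumber produces a spatial fringe on the detector, and a Fourier transform of the interferogram recovers the spectrum. A minimal numerical sketch, with purely illustrative numbers (not instrument parameters):

```python
import numpy as np

# Toy SHS interferogram: one monochromatic line offset from the Littrow
# wavenumber produces a cosine fringe across the detector.
n = 1024                      # detector pixels (illustrative)
x = np.arange(n)
fringe_freq = 37 / n          # cycles/pixel, proportional to (sigma - sigma_Littrow)
interferogram = 1.0 + np.cos(2 * np.pi * fringe_freq * x)

# Remove the DC level and transform: the fringe frequency bin maps back
# to the offset of the spectral line from the Littrow wavenumber.
spectrum = np.abs(np.fft.rfft(interferogram - interferogram.mean()))
peak_bin = int(np.argmax(spectrum))
print(peak_bin)   # → 37
```

Real retrievals then fit such recovered spectra against forward-modeled absorption in the three channels; that fitting step is beyond this sketch.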
Funding: Supported by the National Natural Science Foundation of China (60802059) and the Foundation for the Doctoral Program of Higher Education of China (200802171003).
Abstract: As a promising technique to enhance the spatial resolution of remote sensing imagery, sub-pixel mapping is based on the spatial dependence theory, with the assumption that land cover is spatially dependent both within pixels and between them. Spatial attraction is used as a tool to describe this dependence. First, the spatial attractions between pixels, described by the sub-pixel/pixel spatial attraction model (SPSAM), are refined by the modified SPSAM (MSPSAM), which estimates the attractions according to the distribution of sub-pixels within neighboring pixels. Then a mixed spatial attraction model (MSAM) for sub-pixel mapping is proposed that integrates the spatial attractions both within pixels and between them. To maximize the spatial attraction expressed by the MSAM, a genetic algorithm is employed to search for the optimum solution and generate the sub-pixel mapping results. Experiments show that, compared with SPSAM, MSPSAM, and the pixel swapping algorithm modified by initialization from SPSAM (MPS), MSAM can provide higher accuracy and more rational sub-pixel mapping results.
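The pixel-to-sub-pixel attraction idea can be sketched as follows: each sub-pixel inside a coarse pixel is attracted to a class in proportion to that class's fraction in the eight neighboring coarse pixels, weighted by inverse distance. This is a simplified SPSAM-style illustration under assumed geometry and weighting, not the MSPSAM/MSAM formulation of the paper.

```python
import numpy as np

# One coarse pixel split into S x S sub-pixels; a hypothetical 3x3 map of a
# class's fractions, with the centre pixel being the one to map.
S = 4
fractions = np.array([[1.0, 1.0, 0.0],
                      [1.0, 0.5, 0.0],
                      [1.0, 0.0, 0.0]])

def attraction(i, j):
    """Attraction of sub-pixel (i, j) inside the centre coarse pixel."""
    # Sub-pixel centre in coarse-pixel units, origin at the centre pixel's corner.
    sy, sx = (i + 0.5) / S, (j + 0.5) / S
    total = 0.0
    for ny in range(3):
        for nx in range(3):
            if (ny, nx) == (1, 1):
                continue                         # skip the pixel being mapped
            cy, cx = ny - 1 + 0.5, nx - 1 + 0.5  # neighbour pixel centre
            d = np.hypot(sy - cy, sx - cx)
            total += fractions[ny, nx] / d       # inverse-distance weighting
    return total

attr = np.array([[attraction(i, j) for j in range(S)] for i in range(S)])
# Sub-pixels nearer the left neighbours (class fraction 1.0) attract more strongly,
# so the class would be allocated to the left side of the pixel.
print(attr[:, 0].mean() > attr[:, -1].mean())
```

Allocating each pixel's known class proportions to the sub-pixels with the highest attraction values yields the sharpened map; MSAM adds within-pixel attractions and a genetic search on top of this.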
Abstract: About one third of the water needed for agriculture in the world is generated by melting snow. Snow cover and surface and ground water recharge are considered sustainable and renewable resources; it is therefore necessary to identify and study them. The aim of this study is to determine the spatial and temporal distribution of snow cover in the Sheshpir basin in Fars province (southern Iran). Ground-based observation of snow cover, especially in mountainous areas, is associated with many problems due to the insufficient accuracy of optical observation, as opposed to digital observation. Therefore, GIS and remote sensing technology can be partially effective in solving this problem. Images from the Landsat 5 TM and Landsat 7 TM satellites were used to derive snow cover maps. The images were classified in ENVI 4.8 software using the maximum likelihood algorithm, and other spatial analyses were performed in ArcGIS 9.3 software. The accuracy of the maximum likelihood classification was assessed using test points. The minimum and average overall accuracy of the produced maps were found to be 91% and 98%, respectively, demonstrating that the maximum likelihood method performs well in image classification. Analysis of snow cover over the terrain through the years 2008-2009 and 2009-2010 showed that snow cover begins to accumulate in November and reaches its greatest extent in February; by May, no trace of snow can be observed on the surface of the basin. On average, 34% of the basin is covered in snow from November through May.
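Gaussian maximum likelihood classification assigns each pixel to the class with the highest Gaussian likelihood given per-class training statistics; scikit-learn's quadratic discriminant analysis with equal priors is an equivalent stand-in for the ENVI implementation. The two-band "snow" vs "no-snow" training samples below are synthetic, not real Landsat spectra.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

# Synthetic training pixels in two bands (values are illustrative reflectances).
rng = np.random.default_rng(3)
snow = rng.normal([0.8, 0.7], 0.05, size=(200, 2))    # bright in both bands
ground = rng.normal([0.2, 0.3], 0.05, size=(200, 2))  # darker surfaces
X = np.vstack([snow, ground])
y = np.array([1] * 200 + [0] * 200)                   # 1 = snow, 0 = no snow

# QDA with equal priors = per-class Gaussian maximum likelihood.
clf = QuadraticDiscriminantAnalysis(priors=[0.5, 0.5]).fit(X, y)
preds = clf.predict([[0.75, 0.72], [0.25, 0.28]])
print(preds)   # → [1 0]
```

Accuracy assessment against independent test points, as done in the study, amounts to comparing such predictions with ground-truth labels at the test locations.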
Abstract: An improved circular synthetic aperture radar (CSAR) imaging algorithm of the omega-k (ω-k) type, mainly for reconstructing an image on a cylindrical surface, is proposed. In the typical CSAR ω-k algorithm, the range trajectory is approximated by a Taylor series expansion up to the quadratic terms, which severely limits the valid synthetic aperture length and the angular reconstruction range. Based on the model of the CSAR echo signal, the proposed algorithm transforms the signal directly into the two-dimensional (2D) wavenumber domain, without applying any approximation to the range trajectory. From the form of the signal spectrum in the wavenumber domain, the formula for the wavenumber-domain interpolation of the ω-k algorithm is deduced, and the wavenumber spectrum of the reference point used for bulk compression is obtained by a numerical method. The improved CSAR ω-k imaging algorithm greatly increases the valid synthetic aperture length and the angular area, and hence improves the angular resolution of the cylindrical imaging. Additionally, the proposed algorithm can be repeated on different cylindrical surfaces to achieve three-dimensional (3D) image reconstruction. The 3D spatial resolution of the CSAR system is discussed, and the simulation results validate the correctness of the analysis and the feasibility of the algorithm.
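The wavenumber-domain interpolation at the heart of an ω-k algorithm is a Stolt-style remapping: the spectrum, sampled on a nonuniform kz grid (kz = sqrt(kr² − kx²) per cross-range wavenumber kx), is interpolated onto a uniform kz grid column by column. The grids and spectrum below are illustrative placeholders, not the CSAR geometry of the paper.

```python
import numpy as np

# Illustrative wavenumber grids and a random placeholder 2D spectrum.
kr = np.linspace(100.0, 200.0, 256)    # radial wavenumber samples
kx = np.linspace(-50.0, 50.0, 64)      # cross-range wavenumber samples
spectrum = np.random.default_rng(4).standard_normal((kx.size, kr.size))

# Stolt-style remap: per kx column, the input samples sit at nonuniform
# kz = sqrt(kr^2 - kx^2); interpolate each column onto a uniform kz grid.
kz_out = np.linspace(80.0, 200.0, 256)
remapped = np.empty((kx.size, kz_out.size))
for i, kxi in enumerate(kx):
    kz_in = np.sqrt(np.maximum(kr**2 - kxi**2, 0.0))
    remapped[i] = np.interp(kz_out, kz_in, spectrum[i])

print(remapped.shape)
```

After this remap a 2D inverse FFT would yield the focused image; the paper's contribution is deriving the exact interpolation formula for the circular geometry instead of a quadratic range approximation.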
Abstract: The majority of spatial data reveal some degree of spatial dependence. The term "spatial dependence" refers to the tendency for phenomena to be more similar when they occur close together than when they occur far apart in space. This property is ignored in machine learning (ML) for spatial domains of application, and most classical machine learning algorithms are generally inappropriate unless modified in some way to account for it. In this study, we proposed an approach that aims to improve an ML model so that it detects this dependence without incorporating any spatial features in the learning process. To detect the dependence while also improving performance, a hybrid model (HM) based on two representative algorithms was used, and cross-validation was applied to stabilize the model. Furthermore, global and local Moran's I statistics were used to capture the spatial dependence in the residuals. The results show that the HM achieves significantly better performance, with an R² of 99.91%, than the RBFNN and RF, which reach R² values of 74.22% and 82.26%, respectively. With lower errors, the HM achieved an average test error of 0.033% and a positive global Moran's I of 0.12. We conclude that as the R² value increases, the models become weaker at capturing the dependence.
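The residual diagnostic used here, global Moran's I, can be computed directly: it is the spatially weighted autocorrelation of the deviations from the mean. A minimal sketch with binary distance-band weights and synthetic residuals (a smooth spatial trend, which should yield a strongly positive I):

```python
import numpy as np

def morans_i(values, coords, bandwidth):
    """Global Moran's I with binary distance-band weights (toy implementation)."""
    z = values - values.mean()
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = ((d > 0) & (d <= bandwidth)).astype(float)   # neighbours, self excluded
    return (len(values) / w.sum()) * (z @ w @ z) / (z @ z)

# Synthetic residuals on a 10x10 grid with a strong spatial trend: spatial
# dependence the model failed to absorb shows up as a positive Moran's I.
g = np.arange(10)
coords = np.array([(x, y) for x in g for y in g], dtype=float)
values = coords[:, 0] + coords[:, 1]
print(round(morans_i(values, coords, bandwidth=1.5), 3))
```

Values near zero would indicate spatially random residuals; the study's reported global Moran's I of 0.12 on the HM residuals indicates mild remaining dependence.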
Abstract: Without any prior information about the related wireless transmitting nodes, joint estimation of the position and power of a blind signal combining multiple co-frequency radio waves is a challenging task. Measuring signal-related data with a group of distributed sensors is an efficient way to infer the various characteristics of the signal sources. In this paper, we propose a particle swarm optimization method to estimate multiple co-frequency "blind" source nodes, based on the received power data measured by the sensors. To separate the mixed signals precisely, a genetic algorithm is applied, which further improves the estimation performance of the system. The simulation results show the efficiency of the proposed algorithm.
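The PSO step can be sketched for the single-source case: particles search for the transmitter position that best explains the powers measured at distributed sensors. The sensor layout, log-distance path-loss model, and PSO coefficients below are all assumptions for illustration; the multi-source separation via the genetic algorithm is not reproduced.

```python
import numpy as np

# Hypothetical scenario: 8 sensors in a 100 x 100 area, one transmitter.
rng = np.random.default_rng(5)
sensors = rng.uniform(0, 100, size=(8, 2))
true_pos = np.array([60.0, 40.0])

def received_power(src, tx_dbm=30.0, n_exp=2.0):
    """Log-distance path-loss model (illustrative parameters)."""
    d = np.maximum(np.linalg.norm(sensors - src, axis=1), 1.0)
    return tx_dbm - 10.0 * n_exp * np.log10(d)

measured = received_power(true_pos)          # noise-free measurements

def cost(src):
    return np.sum((received_power(src) - measured) ** 2)

# Standard PSO: velocities pulled toward personal and global bests.
n_particles, iters = 30, 200
pos = rng.uniform(0, 100, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_cost)].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    c = np.array([cost(p) for p in pos])
    improved = c < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], c[improved]
    gbest = pbest[np.argmin(pbest_cost)].copy()

print(np.round(gbest, 1))
```

Estimating the transmit power jointly would simply add `tx_dbm` as a third dimension of the particle state.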
Abstract: Although many computing algorithms have been developed to analyze the relationship between land use pattern and driving forces (RLPDF), little has been done to assess and reduce the uncertainty of predictions. In this study, we investigated RLPDF based on 1990, 2005, and 2012 datasets at two spatial scales, using eight state-of-the-art single computing algorithms and four consensus methods, in the Jinjing River catchment in Hunan Province, China. At the entire catchment scale, the mean AUC values were between 0.715 (ANN) and 0.948 (RF) for the single algorithms, and from 0.764 to 0.962 for the consensus methods. At the subcatchment scale, the mean AUC values were between 0.624 (CTA) and 0.972 (RF) for the single algorithms, and from 0.758 to 0.979 for the consensus methods. The results suggested that among the eight single computing algorithms, RF performed the best overall for woodland and paddy field, and the consensus methods showed higher predictive performance for the woodland and paddy field models than the single computing algorithms. We compared the simulation results of the best- and worst-performing algorithms for the entire catchment in 2012, and found that approximately 72.5% of woodland and 72.4% of paddy field had probabilities of occurrence of less than 0.1, while 3.6% of woodland and 14.5% of paddy field had probabilities of occurrence of more than 0.5. In other words, the simulation errors associated with using different computing algorithms can be up to 14.5% if a probability level of 0.5 is set as the threshold. The results of this study show that the choice of modeling approach can greatly affect the accuracy of RLPDF prediction; the computing algorithms for specific RLPDF tasks in specific regions have to be localized and optimized.
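A simple consensus method of the kind compared above is committee averaging: average the occurrence probabilities predicted by several single algorithms, then score the ensemble with AUC. The dataset and the two member algorithms below are stand-ins, not the eight algorithms or land-use data of the study.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary occurrence data (stand-in for land-use presence/absence).
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

# Committee of single algorithms; the consensus is the mean probability.
members = [RandomForestClassifier(random_state=0),
           LogisticRegression(max_iter=1000)]
probs = np.mean([m.fit(Xtr, ytr).predict_proba(Xte)[:, 1] for m in members],
                axis=0)

auc = roc_auc_score(yte, probs)
print(round(auc, 3))
```

Other consensus rules (weighted averaging, median, best-subset) differ only in how the member probabilities are combined before scoring.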
Abstract: The random walk Metropolis-Hastings algorithm has been widely used, from a Bayesian point of view, to sample the spatial interaction parameter of the spatial autoregressive model. As an alternative approach, the griddy Gibbs sampler was proposed by [1] and utilized by [2]. This paper proposes an acceptance-rejection Metropolis-Hastings algorithm as a third approach, and compares the three algorithms through Monte Carlo experiments. The experimental results show that the griddy Gibbs sampler is the most efficient of the three algorithms, whether the number of observations is small or not, in terms of computation time and inefficiency factors. Moreover, it appears to work well when the grid size is 100.
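The baseline sampler in this comparison, random walk Metropolis-Hastings, can be sketched on a one-dimensional target. A standard normal density stands in here for the conditional posterior of the spatial parameter; the step size is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(6)

def log_target(x):
    """Log of N(0, 1) up to a constant, a stand-in for the conditional posterior."""
    return -0.5 * x * x

# Random walk MH: symmetric Gaussian proposal, accept with ratio of targets.
n_draws, step = 20000, 1.0
chain = np.empty(n_draws)
x = 0.0
for t in range(n_draws):
    prop = x + step * rng.standard_normal()
    if np.log(rng.random()) < log_target(prop) - log_target(x):
        x = prop                     # accept the proposal
    chain[t] = x                     # otherwise keep the current state

burned = chain[5000:]                # discard burn-in
print(round(burned.mean(), 2), round(burned.std(), 2))
```

The griddy Gibbs alternative replaces the random walk with draws from the target evaluated on a fixed grid (e.g. 100 points) and inverted numerically, which removes the tuning of the step size at the cost of the grid evaluations.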