Funding: National Natural Science Foundation of China (No. 41971356, 41701446); National Key Research and Development Program of China (No. 2017YFB0503600, 2018YFB0505500, 2017YFC0602204).
Abstract: Parallel vector buffer analysis approaches can be classified into two types: the algorithm-oriented parallel strategy and the data-oriented parallel strategy. These methods do not take their applicability to existing geographic information system (GIS) platforms into consideration. To address this problem, a spatial decomposition approach for accelerating buffer analysis of vector data is proposed. The relationship between the number of vertices of each feature and the buffer analysis computing time is analyzed to generate computational intensity transformation functions (CITFs). Then, computational intensity grids (CIGs) for polylines and polygons are constructed based on the respective CITFs. Using the corresponding CIGs, a spatial decomposition method for parallel buffer analysis is developed. Based on the computational intensity of the features and the sub-domains generated in the decomposition, the features within the sub-domains are evenly assigned to parallel buffer analysis tasks for load balancing. Compared with typical regular domain decomposition methods, the new approach achieves a more balanced decomposition of computational intensity for parallel buffer analysis and attains near-linear speedups.
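As a rough illustration of the load-balancing idea, the sketch below assumes a linear CITF t(v) = a·v + b fitted from measured buffer times and a greedy largest-cost-first assignment; the coefficients, feature names, and the greedy packing are illustrative assumptions, since the abstract specifies neither the CITF form nor the assignment algorithm.

```python
# Sketch of CITF-based load balancing for parallel buffer analysis.
# Assumes a linear computational intensity transformation function (CITF)
# t(v) = a * v + b; the paper's fitted form and coefficients are not
# reproduced here.

def citf(vertex_count, a=1.0, b=0.5):
    """Estimated buffer-analysis cost of a feature with `vertex_count` vertices."""
    return a * vertex_count + b

def decompose(features, n_workers):
    """Greedily assign features to `n_workers` tasks so the summed
    computational intensity of each task is approximately equal."""
    # features: list of (feature_id, vertex_count)
    loads = [0.0] * n_workers
    tasks = [[] for _ in range(n_workers)]
    # Largest-cost-first greedy packing approximates balanced decomposition.
    for fid, nverts in sorted(features, key=lambda f: -citf(f[1])):
        w = min(range(n_workers), key=loads.__getitem__)
        tasks[w].append(fid)
        loads[w] += citf(nverts)
    return tasks, loads

if __name__ == "__main__":
    feats = [("f%d" % i, n) for i, n in enumerate([1200, 40, 900, 300, 75, 2048])]
    tasks, loads = decompose(feats, n_workers=3)
    print(tasks, loads)
```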
Funding: Supported by the China Postdoctoral Science Foundation (No. 2014M552115), the Fundamental Research Funds for the Central Universities, China University of Geosciences (Wuhan) (No. CUGL140833), and the National Key Technology Support Program of China (No. 2011BAH06B04).
Abstract: To improve the concurrent access performance of web-based spatial computing systems in a cluster, a parallel scheduling strategy for multi-core environments is proposed, which includes two levels of parallel processing: one evenly allocates tasks to the server nodes in the cluster, and the other implements load balancing inside each server node. Based on this strategy, a new web-based spatial computing model is designed, focusing on a task response ratio calculation method, a request queue buffer mechanism, and a thread scheduling strategy. Experimental results show that the new model can fully exploit the multi-core computing advantage of each server node under concurrent access and improve the average hits per second, average I/O hits, CPU utilization, and throughput. A speed-up ratio analysis of the traditional model and the new one shows that the new model performs best. The performance of the multi-core server nodes in the cluster is optimized, and resource utilization and parallel processing capabilities are enhanced; the more CPU cores available, the higher the parallel processing capability obtained.
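The abstract names a task response ratio calculation but does not give its formula; the sketch below substitutes the classic highest-response-ratio-next (HRRN) ratio, (waiting time + service time) / service time, over a simple request queue buffer. Class and method names are hypothetical.

```python
import time

# Sketch of a request queue buffer scheduled by task response ratio.
# Assumes the classic HRRN formula:
#   ratio = (waiting_time + expected_service_time) / expected_service_time

class RequestQueue:
    def __init__(self):
        self.pending = []  # list of (arrival_time, expected_service_time, request_id)

    def submit(self, request_id, expected_service_time):
        self.pending.append((time.time(), expected_service_time, request_id))

    def next_request(self):
        """Dequeue the request with the highest response ratio, so long-waiting
        requests cannot starve behind a stream of short ones."""
        if not self.pending:
            return None
        now = time.time()
        best = max(self.pending,
                   key=lambda r: ((now - r[0]) + r[1]) / r[1])
        self.pending.remove(best)
        return best[2]
```

In the two-level design the abstract describes, one such queue could sit inside each server node, while a cluster-level dispatcher spreads tasks evenly across nodes.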
Funding: Supported by the information technology (IT) research and development program of MKE/KEIT (10041682, Development of High-Definition 3D Image Processing Technologies Using Advanced Integral Imaging with Improved Depth Range).
Abstract: We propose a novel method of slice image reconstruction with controllable spatial filtering that uses the correlation of periodic delta-function arrays (PDFAs) with elemental images in computational integral imaging. Multiple PDFAs, whose spatial periods correspond to different object depths in the elemental image array (EIA), can generate a set of spatially filtered EIAs for multiple object depths, whereas the conventional method handles only a single object depth. We analyze the controllable spatial filtering effect of the proposed method. To show its feasibility, we carry out preliminary experiments for multiple objects and present the results.
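A minimal numerical sketch of the correlation idea, assuming a PDFA is a binary grid of impulses with a given pixel period and that the correlation is computed via FFT; array construction and the sweep over periods are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

# Sketch of slice reconstruction by correlating an elemental image array (EIA)
# with a periodic delta-function array (PDFA). The period `p` is assumed to
# encode one object depth.

def pdfa(shape, period):
    """Binary array with impulses on a grid of spatial period `period` (pixels)."""
    d = np.zeros(shape)
    d[::period, ::period] = 1.0
    return d

def slice_image(eia, period):
    """Cross-correlate the EIA with a PDFA of the given period via FFT;
    content replicated at that period (i.e., at one depth) is reinforced."""
    D = np.fft.fft2(pdfa(eia.shape, period))
    E = np.fft.fft2(eia)
    return np.real(np.fft.ifft2(E * np.conj(D)))

# Different periods act as depth-selective spatial filters: sweeping `period`
# yields a set of spatially filtered slice images for multiple depths.
```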
Funding: National Natural Science Foundation of China (U22A2079, 62035003, 61975014); Beijing Municipal Science and Technology Commission, Administrative Commission of Zhongguancun Science Park (Z211100004821012).
Abstract: The near-eye display feature of emerging spatial computing systems produces a distinctive visual effect that mixes virtual and real worlds. However, its application for all-day wear is greatly limited by bulky structures, energy expenditure, and continuous battery heating. Here, we propose a lightweight holographic near-eye display system that takes advantage of solar energy for self-charging. To guarantee the collection of solar energy and near-eye display without crosstalk, we implement holographic optical elements (HOEs) to diffract sunlight and signal light into a common waveguide. Small-area solar cells then convert the collected solar energy and power the system. Compact power supply components replace heavy batteries, contributing to the lightweight design. The simple acquisition and management of solar energy give the system a sustainable self-charging capability. We believe that the lightweight design and continuous energy input will significantly promote the adoption of near-eye displays in daily life.
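For intuition only, a back-of-envelope power budget for the self-charging concept; every number below is an assumption for illustration, since the abstract reports no cell areas, efficiencies, or power figures.

```python
# Illustrative power budget for solar self-charging. All values are assumed.

cell_area_cm2 = 8.0               # assumed small-area solar cell
irradiance_mw_per_cm2 = 100.0     # ~1 sun outdoors (standard test condition)
cell_efficiency = 0.20            # assumed
hoe_coupling_efficiency = 0.30    # assumed diffraction/coupling loss into waveguide

harvested_mw = (cell_area_cm2 * irradiance_mw_per_cm2
                * cell_efficiency * hoe_coupling_efficiency)
display_draw_mw = 120.0           # assumed display module draw

print(f"harvested: {harvested_mw:.0f} mW, draw: {display_draw_mw:.0f} mW")
# Harvesting need not cover the full draw instantaneously; it extends runtime
# by trickle-charging the compact power supply components.
```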
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 42177448 and 41907393).
Abstract: With growing regional economic integration, transportation systems have become critical to regional development and economic vitality but remain vulnerable to disasters. However, the regional economic ripple effect of a disaster is difficult to quantify accurately, especially considering the cumulative influence of traffic disruptions. This study integrates transportation system analysis with economic modeling to capture the regional economic ripple effect. A state-of-the-art spatial computable general equilibrium (SCGE) model is leveraged to simulate the operation of the economic system, and the marginal rate of transport cost is introduced to reflect post-disaster traffic network damage. The model is applied to the 50-year return period flood of 2020 in Hubei Province, China. The results show the following. First, when traffic disruption costs are considered, the total output loss of non-affected areas is 1.81 times that estimated without them, and non-negligible losses reach relatively remote zones of the country, such as the Northwest Comprehensive Economic Zone (36% of total ripple effects). Second, traffic disruptions significantly hinder regional trade activities, especially regional intermediate inputs, where the effect is about three times that estimated without disruption costs. The industries most sensitive to traffic disruptions were transportation, storage, and postal services (5 times), and processing and assembly manufacturing (4.4 times). Third, the longer the distance, the stronger the impact of traffic disruptions on interregional intermediate inputs. Thus, increasing investment in transportation infrastructure contributes significantly to mitigating disaster ripple effects and accelerating industrial recovery in affected areas.
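The abstract does not reproduce the SCGE formulation; one common way such models inject a marginal transport cost into interregional trade is an iceberg-style price markup, sketched below with hypothetical notation rather than the paper's exact equations.

```latex
% Hypothetical iceberg-style transport cost (not the paper's exact SCGE model):
% the delivered price of a good shipped from region i to region j carries a
% markup tau_{ij}, which post-disaster network damage raises at the margin.
\[
  p_{ij} = p_i \left( 1 + \tau_{ij} \right), \qquad
  \tau_{ij}^{\mathrm{post}} = \tau_{ij}^{\mathrm{pre}} + \Delta\tau_{ij},
\]
% where \Delta\tau_{ij} reflects damage on the route between regions i and j.
```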
Funding: Research reported is partially supported by NSF [grant numbers PLR-1349259 and IIP-1338925], FGDC [grant number G13PG00091], and NASA [grant number NNG12PP37I].
Abstract: A spatial web portal (SWP) provides a web-based gateway to discover, access, manage, and integrate worldwide geospatial resources through the Internet, and its access patterns range from regional to global interest with pronounced spikes. Although various technologies have been adopted to improve SWP performance, enabling high-speed resource access for global users to better support Digital Earth remains challenging because of the computing and communication intensity of SWP operation and the dynamic distribution of end users. This paper proposes a cloud-enabled framework for high-speed SWP access that leverages elastic resource pooling, dynamic workload balancing, and global deployment. Experimental results demonstrate that the new SWP framework outperforms the traditional computing infrastructure and better supports users of a global system such as Digital Earth. The reported methodologies and framework can be adopted to support operational geospatial systems, such as those monitoring the national geographic state across regional and global extents.
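A toy sketch of two of the framework's cloud levers, global deployment (route each user to the lowest-latency replica) and elastic resource pooling (scale the pool toward a target utilization); region names, latencies, and thresholds are invented for illustration.

```python
# Hypothetical per-user latency measurements to deployed SWP replicas (ms).
REGION_LATENCY_MS = {"us-east": 40, "eu-west": 110, "ap-southeast": 190}

def route(user_latencies):
    """Global deployment: send a user to the lowest-latency region replica."""
    return min(user_latencies, key=user_latencies.get)

def scale(current_instances, load_per_instance, target_load=0.6):
    """Elastic resource pooling: grow or shrink the pool toward a target
    utilization as access spikes arrive and subside."""
    return max(1, round(current_instances * load_per_instance / target_load))

print(route(REGION_LATENCY_MS))                           # -> "us-east"
print(scale(current_instances=4, load_per_instance=0.9))  # -> 6
```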
Funding: Supported by the National Natural Science Foundation of China (51074121), the China Postdoctoral Science Foundation (2015M572653XB), the Doctoral Fund of Xi'an University of Science and Technology (2014QDJ003), the Cultivation Fund of Xi'an University of Science and Technology (201332), and the Scientific Research Program funded by the Shaanxi Provincial Education Department.
Abstract: The spatial impulse response (SIR) method is often used as the 'gold standard' in simulating transient acoustic wave fields because of its high accuracy in the linear domain. However, a high sampling frequency is often required to achieve that accuracy, so a large amount of data must be processed. In this paper, a fast approach for computing the spatial impulse response is proposed to reduce the computational burden. The approach employs the relationship between SIRs at observation points and SIRs of their projection points on the transducer surface. Two critical parameters of the approach, the calculation sampling frequency and the interpolation sampling frequency, are then analyzed. Results show that for a 2.25 MHz rectangular transducer of size 5 mm × 10 mm, a calculation sampling frequency of 1000 MHz and an interpolation sampling frequency of 500 MHz achieve superior performance while improving computational efficiency 18-fold over direct solving.
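A stand-in sketch of the two-frequency idea only: an SIR computed on a fine grid at the calculation sampling frequency is resampled onto a coarser grid at the interpolation sampling frequency. The Gaussian pulse stands in for a real SIR, and the paper's projection-point relation is not reproduced here.

```python
import numpy as np

def resample_sir(t_calc, h_calc, f_interp):
    """Linearly interpolate an SIR sampled on t_calc onto a grid at f_interp."""
    t_out = np.arange(t_calc[0], t_calc[-1], 1.0 / f_interp)
    return t_out, np.interp(t_out, t_calc, h_calc)

f_calc = 1000e6   # 1000 MHz calculation sampling frequency (from the abstract)
f_interp = 500e6  # 500 MHz interpolation sampling frequency (from the abstract)

t_calc = np.arange(0.0, 2e-6, 1.0 / f_calc)        # 2 microsecond window, assumed
h_calc = np.exp(-((t_calc - 1e-6) / 1e-7) ** 2)    # stand-in SIR pulse, not a real SIR
t_out, h_out = resample_sir(t_calc, h_calc, f_interp)
print(len(t_calc), "->", len(t_out))               # half the samples to process
```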
Funding: Research reported is supported by NSF (CSR-1117300 and IIP-1160979), NASA (NNX07AD99G), and Microsoft Research.
Abstract: The simulation and potential forecasting of dust storms are of significant interest to public health and the environmental sciences. Dust storms have interannual variability and are typical disruptive events. The computing platform for an operational dust storm forecasting system should therefore operate in a disruptive fashion: scaling up to enable high-resolution forecasting and massive public access when dust storms occur, and scaling down when no events occur to save energy and costs. With the capability of providing a large, elastic, and virtualized pool of computational resources, cloud computing has become a new and advantageous computing paradigm for scientific problems that traditionally require a large-scale, high-performance cluster. This paper examines the viability of cloud computing for supporting dust storm forecasting. Through a holistic study systematically comparing cloud computing on Amazon EC2 to a traditional high performance computing (HPC) cluster, we find that cloud computing is emerging as a credible solution for (1) supporting dust storm forecasting by spinning up a large group of computing resources in a few minutes to satisfy the disruptive computing requirements, (2) performing high-resolution dust storm forecasting when required, (3) supporting concurrent computing requirements, (4) supporting real dust storm event forecasting for a large geographic domain, using the dust storm event in Phoenix on 05 July 2011 as an example, and (5) reducing cost by maintaining low computing support when there are no dust storm events while invoking large amounts of computing resources to perform high-resolution forecasting and respond to large numbers of concurrent public accesses.
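A minimal sketch of the scale-up/scale-down policy the paper argues for; the baseline size, burst size, and per-user increment are illustrative assumptions, not measured values from the study.

```python
def target_nodes(dust_storm_active, concurrent_users,
                 baseline=2, per_100_users=1, high_res_nodes=64):
    """Keep a small baseline pool when idle; burst to a large pool sized for
    high-resolution forecasting plus concurrent public access during an event."""
    if not dust_storm_active:
        return baseline                                   # save energy and cost
    burst = high_res_nodes + (concurrent_users // 100) * per_100_users
    return max(baseline, burst)

print(target_nodes(False, 50))    # -> 2
print(target_nodes(True, 1200))   # -> 76
```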
Abstract: Traditional data collection methods such as remote sensing and field surveying often fail to offer timely information during or immediately following disaster events. Social sensing enables all citizens to become part of a large sensor network that is low cost, more comprehensive, and always broadcasting situational awareness information. However, data collected with social sensing are often massive, heterogeneous, noisy, unreliable in some respects, arrive in continuous streams, and often lack geospatial reference information. Together, these issues represent a grand challenge to fully leveraging social sensing for emergency management decision making under extreme duress. Meanwhile, big data computing methods and technologies such as high-performance computing, deep learning, and multi-source data fusion have become critical components of using social sensing to understand the impact of and response to disaster events in a timely fashion. This special issue captures recent advancements in leveraging social sensing and big data computing to support disaster management. The papers analyze some of the promises and pitfalls of social sensing data for disaster-relevant information extraction, impact area assessment, population mapping, occurrence patterns, geographical disparities in social media use, and inclusion in larger decision support systems.