Abstract: The amount of data that must be visualized and analyzed has grown year by year, and the traditional approach of using larger computational resources or exploiting task and data parallelism seems to have reached its limit. A new paradigm for large-scale data visualization is therefore highly desirable. In this paper, we propose a new, optimized visualization pipeline that uses point rendering-based early visibility testing to reduce unnecessary rendering in the final stage of the pipeline; we name this technique "Early Visibility Test Point Rendering". In a densely populated polygonal mesh scene, where multiple triangles may cover a single pixel, unnecessary and wasteful rendering occurs in the final stage of the traditional visualization pipeline, that is, during rasterization. We therefore propose an alternative visualization pipeline that uses Early Visibility Test Point Rendering to select only the visible polygonal elements of a given visualization scene. The visibility test can be executed on the CPU side, so that only the visible polygonal elements need to be sent to the GPU for optimized rendering. We verified the effectiveness of the proposed approach using synthetic datasets as well as a real-world large-scale simulation result.
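The abstract describes the technique only at a high level, so the following is a minimal, hypothetical sketch of how a CPU-side, point rendering-based visibility test might work: each polygon is represented by one projected point, the points are splatted into a coarse depth buffer, and only polygons whose point survives the depth test are forwarded to the GPU. The function name, the two-pass structure, and the epsilon tolerance are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def early_visibility_test(points, depths, width, height, eps=1e-4):
    """Hypothetical sketch of a point-based early visibility test.

    points -- one representative projected pixel (x, y) per polygon
    depths -- the corresponding view-space depth per polygon
    Returns the indices of polygons judged visible.
    """
    # CPU-side depth buffer for the point pass.
    zbuf = np.full((height, width), np.inf)

    # Pass 1: splat one point per polygon, keeping the nearest depth
    # at each pixel.
    for (x, y), z in zip(points, depths):
        if 0 <= x < width and 0 <= y < height and z < zbuf[y, x]:
            zbuf[y, x] = z

    # Pass 2: a polygon passes the test if its point is (nearly) the
    # front-most sample at its pixel; only these would go to the GPU.
    return [i for i, ((x, y), z) in enumerate(zip(points, depths))
            if 0 <= x < width and 0 <= y < height and z <= zbuf[y, x] + eps]
```

A practical version of such a test would have to be conservative, for example by splatting several points per polygon or relaxing the depth comparison, so that partially visible polygons are not culled by mistake.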
Funding: Co-authors Zhang and Lu were supported by the DHS Center of Excellence for Natural Disasters, Coastal Infrastructure and Emergency Management (DIEM) and by DOE (No. DEFG02-06ER25733). Work by co-author Huang was funded in part through the Institute of Ultra-Scale Visualization (http://www.ultravis.org) under the auspices of the SciDAC program within the U.S. Department of Energy (No. DEFC02-06ER25778).
Abstract: Ultra-scale data analysis has created many new challenges for visualization. For example, in climate research with two-dimensional time-varying data, scientists find it crucial to study hidden temporal relationships across a set of large-scale images whose resolutions far exceed those of typical computer monitors. When scientists can visualize only a small portion (< 1/1000) of a time step at once, analyzing the temporal features of multiple time steps becomes extremely challenging. Because this problem cannot be solved simply with interaction or display technologies, this paper presents a milli-scaling approach based on downscaling algorithms with significant ratios. Our approach produces readable-sized images of multiple ultra-scale visualizations while preserving important data features and temporal relationships. Using climate visualization as the testing application, we demonstrate that our approach provides a new tool for users to effectively make sense of multiple large-format visualizations.
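The abstract does not specify the downscaling algorithms themselves, so the block below is only a hypothetical illustration of feature-preserving downscaling at large ratios: instead of block averaging, which washes out extremes, each output pixel keeps the most extreme value in its source block. The function name and the max-magnitude criterion are assumptions for illustration, not the paper's method.

```python
import numpy as np

def feature_preserving_downscale(img, factor):
    """Hypothetical sketch: downscale a 2D field by an aggressive integer
    factor, keeping the value of largest magnitude in each block so that
    localized extremes (e.g., storm cells in a climate field) survive."""
    h, w = img.shape
    h2, w2 = h // factor, w // factor
    # Group pixels into (h2, w2) blocks of size factor x factor.
    blocks = img[:h2 * factor, :w2 * factor].reshape(h2, factor, w2, factor)
    blocks = blocks.transpose(0, 2, 1, 3).reshape(h2, w2, -1)
    # Select the most extreme (max |value|) sample per block.
    idx = np.abs(blocks).argmax(axis=2)
    return np.take_along_axis(blocks, idx[..., None], axis=2)[..., 0]
```

With a factor on the order of 30, for instance, this maps a field roughly 1000 times larger than a monitor onto a readable image, which is the kind of ratio the milli-scaling setting implies; the paper's actual algorithms additionally preserve temporal relationships across time steps, which this sketch does not attempt.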