Abstract: High-Performance Computing (HPC) has advanced tremendously over the past several decades, with immense technological developments revolutionizing the ability to model, simulate, and analyze large amounts of data. Over time, these HPC systems have become much larger and more complex, featuring multi-/many-core processors with thousands of cores that are often connected to form a large-scale computing system.
Funding: This paper is an extended version of "SpotMPI: A Framework for Auction-Based HPC Computing Using Amazon Spot Instances," published in the International Symposium on Advances of Distributed Computing and Networking (ADCN 2011). Acknowledgment: This research is supported in part by National Science Foundation grant CNS 0958854 and by educational resource grants from Amazon.com.
Abstract: Cloud computing is expanding widely in the world of IT infrastructure, due in part to the cost-saving effect of economies of scale. Fair market conditions can in theory provide a healthy environment that reflects the most reasonable costs of computation. While fixed cloud pricing provides an attractive low entry barrier for compute-intensive applications, both the consumer and the supplier of computing resources can see high efficiency for their investments by participating in auction-based exchanges. There are huge incentives for the cloud provider to offer auctioned resources; from the consumer perspective, however, using these resources is a sparsely discussed challenge. This paper reports a methodology and framework designed to address the challenges of running HPC (High Performance Computing) applications on auction-based cloud clusters. The authors focus on HPC applications and describe a method for determining bid-aware checkpointing intervals. They extend a theoretical model for determining checkpoint intervals using statistical analysis of pricing histories. The latest developments in the SpotHPC framework, which aims to facilitate the managed execution of real MPI applications in auction-based cloud environments, are also introduced. The authors use their model to simulate a set of algorithms with different computing and communication densities. The results show the complex interactions between optimal bidding strategies and parallel application performance.
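The bid-aware checkpoint-interval model the authors extend builds on classical results such as Young's first-order approximation. Purely as a point of reference (the paper's model additionally folds in statistics of spot-price history, which are not reproduced here), a minimal sketch that treats the mean time between out-bid revocations as the failure rate:

```python
import math

def young_interval(checkpoint_cost_s, mtbf_s):
    """Young's first-order approximation of the optimal checkpoint
    interval, T_opt = sqrt(2 * C * MTBF): C is the time to write one
    checkpoint; MTBF here stands in for the mean time between out-bid
    revocations, which would be estimated from pricing histories."""
    return math.sqrt(2.0 * checkpoint_cost_s * mtbf_s)

# Example: 30 s checkpoints, instances out-bid on average every 2 hours.
interval = young_interval(30.0, 2 * 3600.0)
print(round(interval))  # → 657 (seconds between checkpoints)
```

A shorter revocation MTBF (aggressive low bids) drives the interval down, which is the basic trade-off the bid-aware model navigates.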
Funding: Supported by the National Natural Science Foundation of China (No. 50974107) and the Engineering Project of High School Subject Innovation (No. B07028), China.
Abstract: In this paper, an experimental study on the sulphate attack resistance of high-performance concrete (HPC) with two different water-to-binder ratios (w/b) under compressive loading is presented. The sulphate concentration, compressive strength, and mass change of the HPC specimens were determined after immersion in a Na2SO4 solution for different durations under external compressive loading applied by self-regulating loading equipment. The effects of the compressive stress, the w/b ratio, and the Na2SO4 solution concentration on the sulphate attack resistance of HPC under compressive loading were analysed. The results showed that the sulphate attack resistance of HPC under compressive loading was closely related to the stress level, the w/b ratio, and the Na2SO4 solution concentration. Applying compressive loading at a stress ratio of 0.3 or reducing the w/b ratio clearly improved the sulphate attack resistance, whereas applying compressive loading at a stress ratio of 0.6 or exposing the HPC to a more concentrated Na2SO4 solution accelerated the sulphate attack and HPC deterioration.
Funding: Financial support from the Natural Science Foundation of China (41761134089, 41977218), the Six Talent Peaks Project of Jiangsu Province (RJFW-003), and the Fundamental Research Funds for the Central Universities (14380103) is gratefully acknowledged.
Abstract: The discrete element method (DEM) can effectively simulate the discontinuity, inhomogeneity, and large deformation and failure of rock and soil. Based on an innovative matrix computing formulation of the discrete element method, the high-performance discrete element software MatDEM can handle millions of elements on one computer, enabling discrete element simulation at the engineering scale. It supports heat calculation, multi-field, and fluid-solid coupling numerical simulations. Furthermore, the software integrates pre-processing, a solver, post-processing, and powerful secondary development, allowing new discrete element software to be built on it. The basic principles of the DEM, the implementation and development of the MatDEM software, and its applications are introduced in this paper. The software and sample source code are available online (http://matdem.com).
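The abstract does not spell out MatDEM's matrix computing formulation; as an illustration of the vectorized style it implies (whole-array operations instead of per-particle loops), here is a toy 1-D bonded-particle time step. The particle chain, stiffness, and time step are invented for the example and are not MatDEM's:

```python
import numpy as np

def dem_step(x, v, m, k, l0, dt):
    """One explicit time step for a 1-D chain of particles joined by
    linear springs of stiffness k and rest length l0, computed with
    array operations rather than an element-by-element loop."""
    stretch = np.diff(x) - l0        # elongation of each bond
    f_bond = k * stretch             # tension (positive pulls ends together)
    f = np.zeros_like(x)
    f[:-1] += f_bond                 # bond pulls its left particle right
    f[1:] -= f_bond                  # ...and its right particle left
    v = v + dt * f / m
    x = x + dt * v
    return x, v

x = np.array([0.0, 1.2, 2.0])        # left bond stretched, right compressed
v = np.zeros(3)
x, v = dem_step(x, v, m=1.0, k=10.0, l0=1.0, dt=0.01)
```

Because every bond force is applied equally and oppositely to its two particles, total momentum is conserved exactly, which is a quick correctness check for this kind of vectorized contact loop.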
Abstract: Factors that affect concrete creep include mixture composition, curing conditions, ambient exposure conditions, and element geometry. Considering the influence of the concrete mixture, and in order to improve the prediction of prestress loss in important structures, an experimental test under laboratory conditions was carried out to investigate the compression creep of two high-performance concrete mixtures used for prestressed members in one bridge. Based on the experimental results, a power exponent function of creep degree was used to model the creep degree of the two HPCs for structural numerical analysis, and two series of parameters of this function for the two HPCs were calculated with an evolution-program optimization method. The experimental data were compared with the CEB-FIP 90 and ACI 209(92) models, and both code models overestimated the creep degrees of the two HPCs. It is therefore recommended that the power exponent function be used in the structural analysis of this bridge.
Abstract: In order to investigate the compression creep of two kinds of high-performance concrete mixtures used for prestressed members in a bridge, an experimental test under laboratory conditions was carried out. Based on the experimental results, a power exponent function was used to model the creep degree of these high-performance concretes (HPCs) for structural numerical analysis, and two series of parameters of this function for the HPCs were obtained with the optimization method of an evolution program. The experimental data were compared with the CEB-FIP 90 and ACI 92 models. Results show that both code models overestimate the creep degree of the two HPCs, so it is recommended that the power exponent function be used for the creep analysis of bridge structures.
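Both creep studies fit a power exponent function to measured creep degrees. The abstracts do not give the exact functional form or coefficients; assuming the common power-law form C(t) = a·t^b for illustration (the sample data below are likewise invented), the fit can be sketched as a log-linear least-squares problem:

```python
import numpy as np

# Hypothetical creep-degree measurements: loading age in days and
# creep degree (in 1e-6/MPa); not the papers' data.
t = np.array([7.0, 14.0, 28.0, 90.0, 180.0])
c = np.array([12.0, 16.5, 22.0, 35.0, 44.0])

# Linearize C(t) = a * t**b as log C = log a + b log t,
# then solve the overdetermined system by least squares.
A = np.vstack([np.ones_like(t), np.log(t)]).T
(log_a, b), *_ = np.linalg.lstsq(A, np.log(c), rcond=None)
a = np.exp(log_a)
c_fit = a * t**b
```

An evolution-program optimizer, as used in the papers, searches the parameter space stochastically instead; for a two-parameter power law the log-linear fit above gives essentially the same least-squares answer.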
Abstract: Meteorological high-performance computing resources are the support platform for running the numerical models used in weather forecasting and climate prediction. A scientific and objective method for evaluating the application of meteorological high-performance computing resources can not only provide a reference for optimizing the resources in use, but also provide a quantitative basis for future resource construction and planning. In this paper, the concepts of the utility value B and the index compliance rate E of a meteorological high-performance computing system are presented. The evaluation process, evaluation indices, and calculation method for the application benefits of high-performance computing resources are introduced.
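The abstract names the utility value B and compliance rate E but gives no formulas. Purely as a hypothetical illustration of how such benefit indices might be computed (the definitions, index names, weights, and targets below are assumptions, not the paper's):

```python
# Assumed definitions: B as a weighted sum of target-normalized index
# scores (capped at 1), E as the fraction of indices meeting target.
def utility_value(values, targets, weights):
    scores = [min(v / t, 1.0) for v, t in zip(values, targets)]
    b = sum(w * s for w, s in zip(weights, scores))
    e = sum(v >= t for v, t in zip(values, targets)) / len(values)
    return b, e

# Hypothetical indices: CPU utilization, job success rate, storage use.
b, e = utility_value(values=[0.62, 0.97, 0.55],
                     targets=[0.60, 0.95, 0.70],
                     weights=[0.5, 0.3, 0.2])
```

Under these assumptions the two numbers answer different questions: B grades overall benefit on a continuous scale, while E reports how many individual indices are in compliance.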
Abstract: Within the last few decades, increases in computational resources have contributed enormously to the progress of science and engineering (S&E). To continue making rapid advancements, the S&E community must be able to access computing resources. One way to provide such resources is through High-Performance Computing (HPC) centers. Many academic research institutions offer their own HPC centers but struggle to make the computing resources easily accessible and user-friendly. Here we present SHABU, a RESTful web API framework that enables S&E communities to access resources from Boston University's Shared Computing Center (SCC). The SHABU requirements are derived from the use cases described in this work.
Funding: Supported by the National Key Research and Development Program of China under Grant 2024YFE0210800, the National Natural Science Foundation of China under Grant 62495062, and the Beijing Natural Science Foundation under Grant L242017.
Abstract: The Dynamical Density Functional Theory (DDFT) algorithm, derived by combining classical Density Functional Theory (DFT) with the fundamental Smoluchowski dynamical equation, describes the time evolution of inhomogeneous fluid density distributions and plays a significant role in studying inhomogeneous systems. The Sunway Bluelight II supercomputer, a new generation of supercomputer developed in China, possesses powerful computational capabilities, and porting and optimizing industrial software on this platform is of significant importance. To optimize the DDFT algorithm for the Sunway Bluelight II supercomputer and the unique hardware architecture of its SW39000 processor, this work proposes three acceleration strategies to enhance computational efficiency and performance: direct parallel optimization, local-memory-constrained optimization for the CPEs, and multi-core-group collaboration and communication optimization. The approach combines the characteristics of the program's algorithm with the hardware architecture of the Sunway Bluelight II supercomputer, optimizing the storage and transmission structures to achieve a closer integration of software and hardware. For the first time, this paper presents Sunway-Dynamical Density Functional Theory (SW-DDFT). Experimental results show that SW-DDFT achieves a speedup of 6.67 times within a single core group compared with the original DDFT implementation; with six core groups (384 CPEs in total), the maximum speedup reaches 28.64 times and the parallel efficiency reaches 71%, demonstrating excellent acceleration performance.
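For intuition about the computation being accelerated: in the ideal-gas limit the DDFT evolution equation reduces to ordinary diffusion of the density field, which can be stepped explicitly on a grid. The sketch below is a serial 1-D toy of that limit, not the SW-DDFT kernels mapped onto the SW39000's core groups; grid size, time step, and initial profile are chosen for the example:

```python
import numpy as np

def ddft_ideal_step(rho, D, dx, dt):
    """Explicit FTCS update of d(rho)/dt = D * d2(rho)/dx2 with
    periodic boundaries; stable for dt <= dx*dx / (2*D)."""
    lap = (np.roll(rho, -1) - 2.0 * rho + np.roll(rho, 1)) / dx**2
    return rho + dt * D * lap

x = np.linspace(0.0, 1.0, 64, endpoint=False)
rho = 1.0 + 0.1 * np.cos(2.0 * np.pi * x)   # weakly perturbed density
for _ in range(200):
    rho = ddft_ideal_step(rho, D=1.0, dx=1.0 / 64, dt=1e-5)
```

Two invariants make good checks here: the total density (mass) is conserved exactly by the periodic stencil, and the perturbation amplitude decays monotonically toward the uniform state.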
Abstract: Generative large models, exemplified by ChatGPT, have demonstrated unprecedented general-purpose capabilities and are driving a shift in the scientific research paradigm toward AI for science. This trend is accelerating the convergence of high-performance computing and AI computing, making converged intelligent supercomputing systems the key infrastructure supporting future large-model development and scientific discovery. This paper systematically analyzes the major opportunities facing intelligent supercomputing systems at this historic juncture and examines in depth the severe challenges they encounter in computing chips, system architecture, hardware systems, the software ecosystem, reliability, and energy consumption. It argues that hardware-software co-design and close cooperation across the whole industrial chain are needed to lay the foundation for an efficient, inclusive, and sustainable new generation of intelligent supercomputing systems.
Funding: This work was supported in part by the MOST of China under Grant Nos. 2009CB723906 and 2008AA12Z109, and by the CAS under Grant No. KZCX2-YW-313.
Abstract: The study of global climate change seeks to understand: (1) the components of the Earth's varying environmental system, with a particular focus on climate; (2) how these components interact to determine present conditions; (3) the factors driving these components; (4) the history of global change and the projection of future change; and (5) how knowledge about global environmental variability and change can be applied to present-day and future decision-making. This paper addresses the use of high-performance computing (HPC) and high-throughput computing (HTC) for global change studies on the Digital Earth (DE) platform. Two aspects of using HPC/HTC on the DE platform are the processing of data from all sources, especially Earth observation data, and the simulation of global change models. HPC/HTC is an essential and efficient tool for processing vast amounts of global data, especially Earth observation data. The current trend involves running complex global climate models on potentially millions of personal computers to achieve better climate change predictions than would ever be possible using the supercomputers currently available to scientists.
Funding: Project supported by the UK NERC SINATRA project (Grant No. NE/K008781/1).
Abstract: In the last two decades, computational hydraulics has undergone rapid development following advances in data acquisition and computing technologies. Using a finite-volume Godunov-type hydrodynamic model, this work demonstrates the promise of modern high-performance computing technology for achieving real-time flood modeling at a regional scale. The software is implemented for high-performance heterogeneous computing using the OpenCL programming framework, and is developed to support simulations across multiple GPUs using a domain decomposition technique and across multiple systems through an efficient implementation of the Message Passing Interface (MPI) standard. The software is applied to a convective-storm-induced flood event in Newcastle upon Tyne, demonstrating high computational performance across a GPU cluster and good agreement with crowd-sourced observations. Issues relating to data availability, complex urban topography, and differences in drainage capacity affect the results for a small number of areas.
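A finite-volume Godunov-type update is conservative by construction: each cell is advanced by the difference of the fluxes through its two interfaces. As a minimal serial analogue (the real model solves the 2-D shallow-water equations across GPUs via OpenCL and MPI), here is the scheme for 1-D linear advection, where the Godunov interface flux reduces to simple upwinding; the grid, pulse, and CFL number are invented for the example:

```python
import numpy as np

def godunov_advection_step(u, a, dx, dt):
    """One conservative update u_i -= (dt/dx) * (F_{i+1/2} - F_{i-1/2})
    with periodic boundaries; for f(u) = a*u with a > 0 the Godunov
    interface flux is the upwind value, F_{i+1/2} = a * u_i."""
    flux = a * u                       # flux leaving each cell to the right
    return u - (dt / dx) * (flux - np.roll(flux, 1))

u = np.zeros(100)
u[10:20] = 1.0                         # square pulse of (say) water depth
for _ in range(50):
    u = godunov_advection_step(u, a=1.0, dx=1.0, dt=0.5)  # CFL = 0.5
```

Because only flux differences enter the update, the total of u is conserved to machine precision, and with CFL ≤ 1 the scheme is monotone, so no new extrema appear; these are the same invariants a shallow-water Godunov solver must preserve across GPU subdomain boundaries.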