Funding: Supported by the National Natural Science Foundation of China (Grant No. 52078493), the Natural Science Foundation of Hunan Province (Grant No. 2022JJ30700), the Natural Science Foundation for Excellent Young Scholars of Hunan (Grant No. 2021JJ20057), the Science and Technology Plan Project of Changsha (Grant No. kq2305006), and the Innovation Driven Program of Central South University (Grant No. 2023CXQD033).
Abstract: Estimation of the velocity profile within the mud depth is a long-standing and essential problem in debris flow dynamics. Until now, various velocity profiles have been proposed based on fitting analysis of experimental measurements, but these are often limited by the observation conditions, such as the number of configured sensors. The resulting linear velocity profiles therefore usually exhibit limitations in reproducing the temporally varying and nonlinear behavior during the debris flow process. In this study, we present a novel approach to explore the debris flow velocity profile in detail based on our previous 3D-HBPSPH numerical model, i.e., the three-dimensional Smoothed Particle Hydrodynamics model incorporating the Herschel-Bulkley-Papanastasiou rheology. Specifically, we propose a stratification aggregation algorithm for interpreting the details of SPH particles, which enables the recording of temporal velocities of debris flow at different mud depths. To analyze the velocity profile, we introduce a logarithm-based nonlinear model with two key parameters: a, controlling the shape of the velocity profile, and b, governing its temporal evolution. We verify the proposed velocity profile and explore its sensitivity using 34 sets of velocity data from three individual flume experiments in the previous literature. Our results demonstrate that the proposed temporally varying nonlinear velocity profile outperforms the previous linear profiles.
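The paper itself defines the exact two-parameter form; purely as an illustrative sketch of what a logarithm-based, shape-controlled profile can look like, one possible normalization is v(z)/v_surface = ln(1 + a·z/h) / ln(1 + a), which reduces to a linear profile as a → 0. The function name and this particular form are our assumption, not the paper's model.

```python
import numpy as np

def log_profile(zeta, a):
    """Hypothetical logarithm-based normalized velocity profile.

    zeta : normalized depth above the bed, 0 (bed) .. 1 (free surface)
    a    : shape parameter; a -> 0 recovers a linear profile, while a
           larger a gives a more strongly curved, plug-like profile.
    """
    zeta = np.asarray(zeta, dtype=float)
    return np.log1p(a * zeta) / np.log1p(a)

zeta = np.linspace(0.0, 1.0, 5)
print(log_profile(zeta, a=0.01))  # nearly linear for small a
print(log_profile(zeta, a=20.0))  # strongly nonlinear for large a
```

A time-varying second parameter (the paper's b) would then drive the evolution of a over the flow duration.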
Abstract: Direct soil temperature (ST) measurement is time-consuming and costly; thus, the use of simple and cost-effective machine learning (ML) tools is helpful. In this study, ML approaches, including KStar, instance-based K-nearest learning (IBK), and locally weighted learning (LWL), coupled with the resampling algorithms bagging (BA) and dagging (DA) (BA-IBK, BA-KStar, BA-LWL, DA-IBK, DA-KStar, and DA-LWL), were developed and tested for multi-step-ahead (3, 6, and 9 d ahead) ST forecasting. In addition, a linear regression (LR) model was used as a benchmark to evaluate the results. A dataset was established, with daily ST time series at 5 and 50 cm soil depths in a farmland as the models' output and meteorological data as the models' input, including mean (T_(mean)), minimum (T_(min)), and maximum (T_(max)) air temperatures, evaporation (Eva), sunshine hours (SSH), and solar radiation (SR), collected at Isfahan Synoptic Station (Iran) for 13 years (1992–2005). Six different input combination scenarios were selected based on Pearson's correlation coefficients between inputs and outputs and fed into the models. We used 70% of the data to train the models, with the remaining 30% used for model evaluation via multiple visual and quantitative metrics. Our findings showed that T_(mean) was the most effective input variable for ST forecasting in most of the developed models, while in some cases the combinations T_(mean) and T_(max), and T_(mean), T_(max), T_(min), Eva, and SSH, proved to be the best input combinations. Among the evaluated models, BA-KStar showed greater compatibility, while in most cases BA-IBK and BA-LWL provided more accurate results, depending on soil depth. For the 5 cm soil depth, BA-KStar had superior performance (i.e., Nash-Sutcliffe efficiency (NSE) = 0.90, 0.87, and 0.85 for 3, 6, and 9 d ahead forecasting, respectively); for the 50 cm soil depth, DA-KStar outperformed the other models (i.e., NSE = 0.88, 0.89, and 0.89 for 3, 6, and 9 d ahead forecasting, respectively). The results confirmed that all hybrid models had higher prediction capabilities than the LR model.
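The Nash-Sutcliffe efficiency used to rank the models above is a standard goodness-of-fit score; a minimal sketch (the function name `nse` is ours, not the paper's):

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
    performs no better than predicting the mean of the observations."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    residual_ss = np.sum((observed - simulated) ** 2)
    variance_ss = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - residual_ss / variance_ss

# A perfect forecast scores exactly 1.
print(nse([10.0, 12.0, 15.0], [10.0, 12.0, 15.0]))  # 1.0
```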
Funding: The project was supported by the National Natural Science Foundation of China (Grant Nos. 10225212, 50178016, and 10302007), the National Key Basic Research Special Foundation, and the Ministry of Education of China.
Abstract: Three-dimensional frictional contact problems are formulated as linear complementarity problems based on the parametric variational principle. Two aggregate-function-based algorithms for solving complementarity problems are proposed. One is called the self-adjusting interior point algorithm, and the other is called the aggregate function smoothing algorithm. Numerical experiments show the efficiency of the two proposed algorithms.
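The "aggregate function" in this line of work is commonly the log-sum-exp (Kreisselmeier-Steinhauser) smoothing of the max operator, applied to the complementarity conditions; a sketch of the smoothing idea under that assumption (not the paper's full algorithm):

```python
import math

def aggregate_max(values, p):
    """Log-sum-exp aggregate function: a smooth upper approximation of
    max(values) that converges to the true max as p -> infinity."""
    m = max(values)  # subtract the max for numerical stability
    return m + math.log(sum(math.exp(p * (v - m)) for v in values)) / p

values = [1.0, 3.0, 2.0]
for p in (1.0, 10.0, 100.0):
    print(p, aggregate_max(values, p))  # approaches max(values) = 3.0 as p grows
```

Because the smoothed function is differentiable, Newton-type or interior-point iterations can be applied where the original nonsmooth complementarity conditions could not.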
Funding: Supported by the National Natural Science Foundation of China (No. 50475117) and the Tianjin Natural Science Foundation (No. 06YFJMJC03700).
Abstract: Integrating heterogeneous data sources is a precondition for data sharing among enterprises. Highly efficient data updating can both save system expenses and offer real-time data. Modifying data rapidly in the pre-processing area of the data warehouse is one of the hot issues. An extract-transform-load design is proposed based on a new data algorithm called Diff-Match, which is developed by utilizing mode matching and data-filtering technology. It can accelerate data renewal, filter the heterogeneous data, and seek out different sets of data. Its efficiency has been proved by its successful application in an enterprise of electric apparatus groups.
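The core of any such incremental-refresh scheme is change detection between the source extract and the warehouse copy. The Diff-Match algorithm itself is described in the paper; the sketch below only illustrates the generic idea of "seeking out different sets of data" over keyed records (all names are ours):

```python
def diff_sets(old_rows, new_rows):
    """Generic change detection between two keyed record sets: returns
    the inserts, updates, and deletes needed to refresh the warehouse
    copy. A sketch of the idea, not the Diff-Match algorithm itself."""
    inserts = {k: v for k, v in new_rows.items() if k not in old_rows}
    deletes = [k for k in old_rows if k not in new_rows]
    updates = {k: v for k, v in new_rows.items()
               if k in old_rows and old_rows[k] != v}
    return inserts, updates, deletes

old = {1: "motor-A", 2: "switch-B"}
new = {1: "motor-A", 2: "switch-C", 3: "relay-D"}
print(diff_sets(old, new))  # ({3: 'relay-D'}, {2: 'switch-C'}, [])
```

Only the returned deltas need to be loaded, which is where the claimed savings in system expense come from.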
Funding: Supported by the Youth Science and Technology Foundation of UESTC (No. YF020803) and the National Defense Prestudy Foundation (No. 51406070201DZ0211).
Abstract: To solve the problem that most existing layered multicast protocols cannot adapt to dynamic network conditions because their layers are coarsely granulated and static, a new congestion control mechanism for dynamic adaptive layered multicast (DALM) is presented. In this mechanism, a novel feedback aggregating algorithm is put forward, which can dynamically determine the number of layers and the rate of each layer, and can efficiently improve the network bandwidth utilization ratio. Additionally, because all layers are transmitted in only one group, the intricate and time-consuming Internet Group Management Protocol (IGMP) operations caused by a receiver joining a new layer or leaving the topmost subscribed layer are thoroughly eliminated. The mechanism also avoids other problems resulting from multiple groups. Simulation results show that DALM is adaptive and TCP-friendly.
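One simple way to aggregate receiver feedback into layer rates, shown here purely as an illustrative sketch (DALM's actual algorithm also chooses the number of layers dynamically, which this sketch does not), is to place cumulative layer rates at evenly spaced quantiles of the reported bandwidths so that each receiver can subscribe up to the highest rate it can sustain:

```python
def layer_rates(reports, num_layers):
    """Sketch of feedback aggregation: pick cumulative layer rates at
    evenly spaced quantiles of the receivers' reported bandwidths.
    Illustrative only, not the DALM algorithm."""
    reports = sorted(reports)
    n = len(reports)
    rates = []
    for i in range(1, num_layers + 1):
        idx = max(0, (i * n) // num_layers - 1)
        rates.append(reports[idx])
    return sorted(set(rates))

reports = [300, 500, 550, 1000, 1500, 2000]  # kbit/s, hypothetical
print(layer_rates(reports, 3))  # [500, 1000, 2000]
```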
Funding: Supported by a German-Norwegian PPP grant and by the Indo-German Max Planck Center for Computer Science (IMPECS).
Abstract: We analyze a common feature of p-Kemeny AGGregation (p-KAGG) and p-One-Sided Crossing Minimization (p-OSCM) to provide new insights and findings of interest to both the graph drawing community and the social choice community. We obtain parameterized subexponential-time algorithms for p-KAGG, a problem in social choice theory, and for p-OSCM, a problem in graph drawing. These algorithms run in time O*(2^(O(√k log k))), where k is the parameter, and significantly improve the previous best algorithms with running times O(1.403^k) and O(1.4656^k), respectively. We also study natural "above-guarantee" versions of these problems and show them to be fixed-parameter tractable. In fact, we show that the above-guarantee versions of these problems are equivalent to a weighted variant of p-Directed Feedback Arc Set. Our results for the above-guarantee version of p-KAGG reveal an interesting contrast. We show that when the number of "votes" in the input to p-KAGG is odd, the above-guarantee version can still be solved in time O*(2^(O(√k log k))), while if it is even then the problem cannot have a subexponential-time algorithm unless the exponential time hypothesis fails (equivalently, unless FPT = M[1]).
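Kemeny aggregation asks for a consensus ranking minimizing the total Kendall-tau distance to the input votes. A brute-force sketch for tiny instances makes the objective concrete (exponential in the number of candidates, for illustration only; the paper's algorithms are far more efficient):

```python
from itertools import combinations, permutations

def kendall_tau(r1, r2):
    """Number of candidate pairs ordered differently by the two rankings."""
    pos1 = {c: i for i, c in enumerate(r1)}
    pos2 = {c: i for i, c in enumerate(r2)}
    return sum(1 for a, b in combinations(r1, 2)
               if (pos1[a] - pos1[b]) * (pos2[a] - pos2[b]) < 0)

def kemeny(votes):
    """Consensus ranking minimizing total Kendall-tau distance to votes."""
    candidates = votes[0]
    return min(permutations(candidates),
               key=lambda r: sum(kendall_tau(r, v) for v in votes))

votes = [("a", "b", "c"), ("a", "b", "c"), ("b", "a", "c")]
print(kemeny(votes))  # ('a', 'b', 'c') -- agrees with the majority
```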
Funding: Supported by the Specialized Research Fund for the Doctoral Program (SRFDP) of Higher Education of China (Grant No. 20100092110049), the Jiangsu Provincial Science Foundation Program of China (Grant No. BK2009259), the National Basic Research Program of China ("973" Project) (Grant No. 2009CB623202), and the National Natural Science Foundation of China (Grant No. 11072060).
Abstract: An aggregate generation and packing algorithm based on the Monte Carlo method is developed to express the random distribution of aggregate in cement concrete. A mesoscale model is proposed on the basis of the algorithm. In this model, the concrete consists of three parts, namely coarse aggregate, cement matrix, and the interfacial transition zone (ITZ) between them. To verify the proposed model, a three-point bending beam test was performed and a series of two-dimensional mesoscale concrete models were generated for crack behavior investigation. The results indicate that the numerical model proposed in this study is helpful in modeling the crack behavior of concrete, and that treating concrete as a heterogeneous material is very important in fracture modeling.
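Monte Carlo aggregate packing of this kind is typically a "take-and-place" loop: draw a random position for each particle and accept it only if it stays inside the specimen and does not overlap previously placed particles. A minimal 2D sketch under that assumption (circular aggregates; the paper's generator may differ in particle shape and grading):

```python
import random

def pack_circles(width, height, radii, max_tries=2000, seed=0):
    """Take-and-place Monte Carlo packing sketch: drop each circular
    aggregate at a random position and accept it only if it lies inside
    the specimen and does not overlap previously placed aggregates."""
    rng = random.Random(seed)
    placed = []  # list of (x, y, r)
    for r in sorted(radii, reverse=True):  # place large particles first
        for _ in range(max_tries):
            x = rng.uniform(r, width - r)
            y = rng.uniform(r, height - r)
            if all((x - px) ** 2 + (y - py) ** 2 >= (r + pr) ** 2
                   for px, py, pr in placed):
                placed.append((x, y, r))
                break
    return placed

circles = pack_circles(100.0, 100.0, [12, 8, 8, 5, 5, 5])
print(len(circles))  # number of aggregates successfully placed
```

Placing the largest particles first raises the achievable packing density, since small particles fit more easily into the remaining gaps.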
Funding: This work was supported by the National Natural Science Foundation of China (NSFC) (Grant No. 41871343).
Abstract: Accurate measurements of the three-dimensional structural characteristics of urban buildings and their greenhouse effect are important for evaluating the impact of urbanization on the radiation energy budget and for research on the urban heat island (UHI) effect. The decrease in evapotranspiration or the increase in sensible heat caused by urbanization is considered to be the main cause of the UHI effect, but little is known about the influence of the main factor, net radiant flux, on the urban surface heat balance. In this study, experimental observation and quantitative model simulation were used to find that, with the increase of building surface area after urbanization, the direct solar radiation flux and net radiation flux on building surfaces changed significantly. In order to accurately quantify the relationship between the positive and negative effects, this study puts forward the equivalent calculation principle of the "aggregation element", which is composed of a building's sunny face and its shadow face, together with an algorithm for the contribution of its area to the thermal effect. This research clarifies the greenhouse effect of a building with walls of glass windows. The research shows that when the difference between the absorption rates of a concrete wall and grass is −0.21, a cooling effect results. In the case of concrete walls with glass windows, the difference between the absorption rates of the building wall and grass is −0.11, which is also a cooling effect. The greenhouse effect of glass windows reduces the cooling effect value to 56% of that of a building with concrete walls. The simulation of changes in net radiant flux and flux density shows that the greenhouse effect of a 5-story building with windows yields 15.5% less cooling effect than one with concrete walls, and a 30-story building with windows reduces the cooling effect by 23.0%. The simulation results confirmed that the difference in the equivalent absorption rate of the aggregation element is the "director" of cooling and heating effects, and the area of the aggregation element is the "amplifier" of cooling and heating effects. At the same time, the simulation results prove the greenhouse effect of glass windows, which significantly reduces the cooling effect of concrete-walled buildings. The model reveals the real contribution of optimized urban design to mitigating UHI and building a comfortable environment where there is no atmospheric circulation.
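As a back-of-envelope illustration of the "director/amplifier" idea (our own sketch, not the paper's algorithm): the sign of the absorption-rate difference between the aggregation element and grass decides cooling versus heating, while the element area and irradiance scale the magnitude. The area and irradiance values below are hypothetical; only the −0.21 and −0.11 differences come from the abstract.

```python
def thermal_effect(delta_absorption, element_area, irradiance):
    """Sign of delta_absorption (building surface minus grass) selects
    cooling (negative) or heating (positive); the aggregation-element
    area acts as the amplifier. Irradiance in W/m^2, area in m^2,
    result in watts. Illustrative sketch only."""
    return delta_absorption * element_area * irradiance

concrete = thermal_effect(-0.21, 500.0, 800.0)  # concrete wall vs. grass
glazed = thermal_effect(-0.11, 500.0, 800.0)    # glazed wall vs. grass
print(concrete, glazed)             # both negative -> both cooling
print(round(glazed / concrete, 2))  # this linear sketch keeps ~0.52 of the
                                    # cooling; the paper's full model gives 56%
```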