With the improvement of equipment reliability, human factors have become the most uncertain part of a system. The Standardized Plant Analysis of Risk-Human Reliability Analysis (SPAR-H) method is a reliable method in the field of human reliability analysis (HRA) for evaluating human reliability and assessing risk in large complex systems. However, the classical SPAR-H method does not consider the dependencies among performance shaping factors (PSFs), which may cause overestimation or underestimation of the risk of the actual situation. To address this issue, this paper proposes a new method to deal with the dependencies among PSFs in SPAR-H based on the Pearson correlation coefficient. First, the dependence between every pair of PSFs is measured by the Pearson correlation coefficient. Second, the weights of the PSFs are obtained by considering the total dependence degree. Finally, the PSFs' multipliers are modified based on the weights of the corresponding PSFs and then used in the calculation of human error probability (HEP). A case study is used to illustrate the procedure and effectiveness of the proposed method.
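As a rough illustration of the weighting step described above, the sketch below measures pairwise Pearson correlations among PSF ratings and normalizes each PSF's total dependence degree into a weight. The data, the eight-PSF layout, and all variable names are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

# Hypothetical PSF rating samples: rows are observed tasks, columns are the eight SPAR-H PSFs
rng = np.random.default_rng(0)
psf_ratings = rng.normal(size=(50, 8))

# Pairwise Pearson correlations between PSFs
corr = np.corrcoef(psf_ratings, rowvar=False)   # 8 x 8 correlation matrix

# Total dependence degree of each PSF: sum of |r| with all other PSFs
total_dependence = np.abs(corr).sum(axis=0) - 1.0  # subtract the self-correlation

# Normalize into weights that could be used to adjust the PSF multipliers
weights = total_dependence / total_dependence.sum()
print(weights.round(3))
```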
Due to the induced polarization (IP) effect, sign reversals often occur in time-domain airborne electromagnetic (AEM) data. Inversions that do not consider the IP effect cannot recover the true underground electrical structures. Moreover, time-domain airborne IP data involve many parameters, and the large differences in sensitivity between parameters pose challenges to the stability and accuracy of the inversion. In this paper, we propose an inversion method for time-domain AEM data with the IP effect based on Pearson correlation constraints. This method uses the Pearson correlation coefficient from statistics to characterize the correlation between resistivity and chargeability, and constructs Pearson correlation constraints in the inversion objective function to reduce the non-uniqueness of the inversion. To verify the effectiveness of this method, we perform both Occam's inversion and Pearson correlation constrained inversion on synthetic data. The experiments show that the Pearson correlation constrained inversion is more accurate and stable than Occam's inversion. Finally, we apply the inversion to a survey dataset with and without the IP effect. The results show that the data misfit and the continuity of the inverted section are greatly improved when the IP effect is considered.
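A minimal sketch of the kind of correlation constraint described here, assuming the constraint simply penalizes a lack of Pearson correlation between the resistivity and chargeability model vectors; the function names, the penalty form, and the weighting factor beta are assumptions rather than the paper's actual objective function.

```python
import numpy as np

def pearson_penalty(log_resistivity, chargeability):
    """Penalty that grows as the two model vectors become uncorrelated."""
    r = np.corrcoef(log_resistivity, chargeability)[0, 1]
    return 1.0 - abs(r)  # small when the parameters vary together

def constrained_objective(data_misfit, log_resistivity, chargeability, beta=0.1):
    # Hypothetical combined objective: data misfit plus Pearson correlation constraint
    return data_misfit + beta * pearson_penalty(log_resistivity, chargeability)

# Toy usage with made-up layer models
rho = np.array([2.0, 1.5, 1.8, 2.2, 2.5])   # log-resistivity of five layers
eta = np.array([0.1, 0.3, 0.2, 0.1, 0.05])  # chargeability of five layers
print(constrained_objective(data_misfit=0.8, log_resistivity=rho, chargeability=eta))
```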
Prediction of reservoir fractures is key to exploring fracture-type reservoirs. When a shear wave propagates in anisotropic media containing fractures, it splits into two polarized shear waves: a fast shear wave and a slow shear wave. The polarization and time delay of the fast and slow shear waves can be used to predict the azimuth and density of fractures. The current method for identifying fracture azimuth and fracture density is the cross-correlation method, which assumes that the fast and slow shear waves are symmetrical wavelets after complete separation and uses the similarity of the wavelets to identify fracture azimuth and density; in experiments, however, its identification accuracy is poor. The Pearson correlation coefficient method is one approach for separating the fast and slow waves. It is faster and offers better noise immunity and resolution than the traditional cross-correlation method. The Pearson correlation coefficient method poses a nonlinear problem, and particle swarm optimization (PSO) is a good nonlinear global optimization method that converges quickly and is easy to implement. In this study, PSO is combined with the Pearson correlation coefficient method to identify fracture properties and improve computational efficiency.
Correlations between two time series, including the linear Pearson correlation and the nonlinear transfer entropy, have attracted significant attention. In this work, we studied the correlations between multiple stock data with the introduction of a time delay and a rolling window. In most cases, the Pearson correlation and transfer entropy share the same tendency, where a higher correlation provides more information for predicting future trends from one stock to another, while a lower correlation provides less. Considering the computational complexity of the transfer entropy and the simplicity of the Pearson correlation, using the linear correlation with time delays and a rolling window is a robust and simple method to quantify the mutual information between stocks. Predictions made by the long short-term memory method with mutual information outperform those made only with self-information when there are high correlations between two stocks.
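The lagged, rolling-window linear correlation described above can be computed directly with pandas; the synthetic return series, the three-day delay, and the 60-day window below are assumptions for illustration only.

```python
import numpy as np
import pandas as pd

# Synthetic daily returns for two stocks (illustration only; names and lags are assumptions)
rng = np.random.default_rng(1)
stock_a = pd.Series(rng.normal(size=500))
stock_b = pd.Series(0.6 * stock_a.shift(3).fillna(0.0) + rng.normal(scale=0.8, size=500))

delay, window = 3, 60
# Pearson correlation between lagged stock A and stock B over a rolling window
rolling_r = stock_a.shift(delay).rolling(window).corr(stock_b)
print(rolling_r.dropna().tail())
```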
We propose two simple regression models of the Pearson correlation coefficient of two normal responses or binary responses to assess the effect of covariates of interest. Likelihood-based inference is established to estimate the regression coefficients, upon which a bootstrap-based method is used to test the significance of the covariates of interest. Simulation studies show the effectiveness of the method in terms of type-I error control, power performance at moderate sample sizes, and robustness with respect to model mis-specification. We illustrate the application of the proposed method on real data concerning health measurements.
In view of the current difficulty of predicting the cost data of power transmission and transformation projects, a method based on the Pearson correlation coefficient, improved particle swarm optimization (IPSO), and the extreme learning machine (ELM) is proposed. In this paper, the Pearson correlation coefficient is used to screen out the main influencing factors as the independent input variables of the ELM algorithm, and IPSO based on a ladder-structure coding method is used to optimize the number of hidden-layer nodes, input weights, and bias values of the ELM. On this basis, the prediction model for the cost data of power transmission and transformation projects based on the Pearson correlation coefficient-IPSO-ELM algorithm is constructed. Analysis of calculation examples shows that the prediction accuracy of the proposed method is higher than that of other algorithms, which verifies the effectiveness of the model.
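The screening step can be illustrated with a simple filter that keeps only the factors whose absolute Pearson correlation with the cost exceeds a threshold; the data, the threshold of 0.3, and the function name are assumptions, and the subsequent IPSO-ELM stages are not shown.

```python
import numpy as np

def screen_factors(X, y, threshold=0.3):
    """Keep the columns of X whose |Pearson r| with y meets the threshold."""
    r = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    keep = np.abs(r) >= threshold
    return X[:, keep], np.where(keep)[0]

# Hypothetical stand-in data: 100 projects, 10 candidate influencing factors
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 10))
cost = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(size=100)

X_selected, kept_indices = screen_factors(X, cost)
print("factors kept as ELM inputs:", kept_indices)
```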
Many applications for the control of autonomous platforms are being developed, and one important issue is the excess of often-redundant information, which imposes a great computational cost in data processing. Taking into account the temporal coherence between consecutive frames, the PCC (Pearson's correlation coefficient) has been proposed and applied as a discarding-criteria methodology, a dynamic power management solution, an environment-observer method that automatically selects only the regions of interest, and, in the obstacle-avoidance context, as a method for collision risk estimation for vehicles in dynamic and unknown environments. Although the PCC is a great tool for aiding autonomous or semi-autonomous navigation, distortions in the imaging system, pixel noise, slight variations in the object's position relative to the camera, and other factors produce a false PCC threshold. Because there are homogeneous regions in the image, we propose using some prior known environment information in order to obtain a more realistic Pearson correlation.
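A minimal sketch of the frame-discarding idea: compute the PCC between consecutive grayscale frames and skip processing when it exceeds a similarity threshold. The frame sizes, the 0.95 threshold, and the function name are assumptions, and the proposed correction using prior environment information is not modeled.

```python
import numpy as np

def frame_pcc(prev_frame, curr_frame):
    """Pearson correlation between two grayscale frames of equal shape."""
    return np.corrcoef(prev_frame.ravel(), curr_frame.ravel())[0, 1]

# Hypothetical frames: discard the new frame when it is nearly identical to the last one
rng = np.random.default_rng(3)
prev = rng.integers(0, 256, size=(120, 160)).astype(float)
curr = prev + rng.normal(scale=2.0, size=prev.shape)  # only slight changes

if frame_pcc(prev, curr) > 0.95:  # the threshold is an assumption
    print("frame discarded as redundant")
```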
Traditional academic warning methods for students in higher vocational colleges are relatively outdated and one-dimensional, and they involve many influencing factors, so they have a limited effect on improving students' learning ability. A data set was established by collecting academic warning data of students at a certain university. The importance of the school, major, grade, and warning level for the students was analyzed using the Pearson correlation coefficient, random forest variable importance, and permutation importance. It was found that the major has a great impact on the academic warning level. Countermeasures such as dynamic adjustment of majors, reform of cognitive adaptation of courses, full-cycle academic support, and data-driven precise intervention were proposed to provide theoretical support and practical paths for universities to improve the efficiency of academic warning and enhance students' learning ability.
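One of the importance measures mentioned above, permutation importance, has a direct scikit-learn implementation; the sketch below applies it to a made-up data set in which a single "major-like" column drives the warning label. All data and column meanings are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Hypothetical warning records: the four columns stand in for school, major, grade, warning-level inputs
rng = np.random.default_rng(7)
X = rng.normal(size=(300, 4))
y = (X[:, 1] + 0.3 * rng.normal(size=300) > 0).astype(int)  # the "major" column dominates

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
result = permutation_importance(clf, X, y, n_repeats=20, random_state=0)
print("permutation importances:", result.importances_mean.round(3))
```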
This study investigates the impact of carbon tax policies on carbon emission reductions in G20 countries to support the achievement of the Net Zero Emissions target by 2060. As the G20 collectively accounts for a significant share of global greenhouse gas emissions, effective policy interventions in these nations are pivotal to addressing the climate crisis. The research employs the Pearson correlation test to quantify the statistical relationship between carbon tax rates and emission levels, alongside a content analysis of sustainability reports from G20 countries to evaluate policy implementation and outcomes. The results reveal a moderate yet statistically significant negative correlation (r = -0.30, p < 0.05), indicating that higher carbon taxes are associated with lower emission levels. Content analysis further demonstrates that countries with high and consistently enforced carbon taxes, such as Japan and South Korea, achieve more substantial emission reductions compared to nations with lower tax rates or inconsistent policy implementation. The findings emphasize that while carbon taxes serve as an effective instrument to internalize the social costs of carbon pollution, their impact is maximized when integrated with broader strategies, including investments in renewable energy, advancements in energy efficiency, and technological innovation. This research contributes to the understanding of carbon tax effectiveness and offers policy recommendations to strengthen fiscal measures as part of comprehensive climate action strategies toward achieving global sustainability targets.
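The Pearson correlation test itself is a one-liner in SciPy; the sketch below runs it on synthetic tax/emissions data whose values, sample size, and units are assumptions chosen only to show the mechanics, not to reproduce the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical country-year panel: carbon tax rate (USD/tCO2) vs. an emissions index
rng = np.random.default_rng(4)
tax_rate = rng.uniform(0, 120, size=80)
emissions = 100 - 0.2 * tax_rate + rng.normal(scale=25, size=80)

r, p = pearsonr(tax_rate, emissions)
print(f"r = {r:.2f}, p = {p:.3f}")  # a negative r with p < 0.05 mirrors the reported relationship
```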
Groundwater (GW) is a vital freshwater resource extensively exploited in the Vietnamese Mekong Delta, especially during the dry seasons. This study applies the Cumulative Rainfall Departure (CRD) method to estimate GW recharge in deep aquifers of Soc Trang Province, located in the southernmost region of Vietnam under a tropical climate. Monthly rainfall records and daily GW level data of the aquifers from 2010 to 2020 were used. The Pearson correlation between observed GW levels and CRD-modeled GW levels exceeds 0.995, indicating high model accuracy. The analysis reveals that the CRD fractions for the Upper Pleistocene (qp3), Middle Pleistocene (qp2-3), Lower Pleistocene (qp1), and Middle Pliocene (n22) aquifers are 0.085%, 0.104%, 0.130%, and 0.180%, respectively, totaling approximately 0.5% of the annual rainfall. This corresponds to an annual GW recharge of 25.86 million m3, or 70,850 m3/day, equivalent to 70% of the current GW abstraction rate of 101,000 m3/day. Given the critical role of GW as a freshwater source, implementing an enhanced GW recharge program using surface water and rainwater is strongly recommended. Additionally, the analysis suggests that the decline in GW levels due to abstraction corresponds to 0.85 times the mean annual precipitation, a finding that warrants further investigation.
This work demonstrates experimentally the close relation between return currents from relativistic laser-driven target polarization and the quality of the relativistic laser–plasma interaction for laser-driven secondary sources, taking as an example ion acceleration by target normal sheath acceleration. The Pearson linear correlation of maximum return current amplitude and proton spectrum cutoff energy is found to be in the range from ~0.70 to 0.94. kA-scale return currents rise in all interaction schemes where targets of any kind are charged by escaping laser-accelerated relativistic electrons. Their precise measurement is demonstrated using an inductive scheme that allows operation at high repetition rates. Thus, return currents can be used as a metrological online tool for the optimization of many laser-driven secondary sources and for diagnosing their stability. In particular, in two parametric studies of laser-driven ion acceleration, we carry out a noninvasive online measurement of return currents in a tape target system irradiated by the 1 PW VEGA-3 laser at Centro de Láseres Pulsados: first, the size of the irradiated area is varied at best compression of the laser pulse; second, the pulse duration is varied by means of induced group delay dispersion at best focus. This work paves the way to the development of feedback systems that operate at the high repetition rates of PW-class lasers.
Viral diseases are an important threat to crop yield, as they are responsible for losses greater than US$30 billion annually. Thus, understanding the dynamics of virus propagation within plant cells is essential for devising effective control strategies. However, viruses are complex to propagate and quantify, and existing methodologies for viral quantification tend to be expensive and time-consuming. Here, we present a rapid, cost-effective approach to quantify viral propagation using an engineered virus expressing a fluorescent reporter. Using a microplate reader, we measured viral protein levels, and we validated our findings by comparison with western blot analysis of the viral coat protein, the most common approach to quantifying viral titer. Our proposed methodology provides a practical and accessible approach to studying virus-host interactions and could contribute to enhancing our understanding of plant virology.
As an important means of alleviating water shortages, reclaimed water has been widely used for landscape water supply. However, with the emergence of large-scale epidemic diseases such as SARS, avian influenza, and COVID-19 in recent years, people are increasingly concerned about the public health safety of reclaimed water discharged into landscape water, especially the pathogenic microorganisms in it. In this study, the water quality and microorganisms of the Old Summer Palace, a landscape water body with reclaimed water as its only replenishment source, were tracked through long-term dynamic monitoring, and the health risks of indicator microorganisms were analyzed using Quantitative Microbial Risk Assessment (QMRA). It was found that the concentrations of the indicator microorganisms Enterococcus (ENT), Escherichia coli (EC), and fecal coliform (FC) generally showed an upward trend along the direction of water flow and increased by more than 0.6 log at the end of the flow. The concentrations of indicator microorganisms were higher in summer and autumn than in spring, and there was a positive correlation between the concentration of indicator microorganisms and COD. Further research suggested that the increased concentration of indicator microorganisms also led to increased health risks, which were more than 30% higher in other areas of the park than in the water inlet area and required special attention. In addition, the (water) surface operation exposure pathway had much higher health risks than other pathways, and people in related occupations are advised to take precautions to reduce the risks.
Increasing bacteria levels in the Lower Neches River caused by Hurricane Harvey have been a serious concern. This study analyzes historical water sampling measurements and real-time water quality data collected with wireless sensors to monitor and evaluate water quality under different hydrological and hydraulic conditions. The statistical and Pearson correlation analysis of historical water samples determines that alkalinity, chloride, hardness, conductivity, and pH are highly correlated, and they decrease with increasing flow rate due to dilution. The flow rate has positive correlations with Escherichia coli, total suspended solids, and turbidity, which demonstrates that runoff is one of the causes of the elevated bacteria and sediment loadings in the river. The correlation between E. coli and turbidity indicates that turbidity greater than 45 nephelometric turbidity units in the Neches River can serve as a proxy for E. coli to indicate a bacterial outbreak. A series of statistical tools and an innovative two-layer data smoothing filter are developed to detect outliers, fill missing values, and filter spikes in the sensor measurements. The correlation analysis of the sensor data illustrates that the elevated sediment/bacteria/algae in the river is caused either by first-flush and heavy rain events from December to March or by land use and land cover practices. Therefore, utilizing sensor measurements along with rainfall and discharge data is recommended to monitor and evaluate water quality, and in turn to provide early alerts for water resources management decisions.
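The paper's two-layer smoothing filter is not specified here, so the following is only a generic stand-in under stated assumptions: spikes are flagged by a global z-score, gaps are interpolated, and two smoothing passes (a rolling median, then a rolling mean) are applied. All window sizes, thresholds, and names are hypothetical.

```python
import numpy as np
import pandas as pd

def clean_sensor_series(s, z_thresh=3.0, w_median=5, w_mean=12):
    """Generic sketch: flag spikes by z-score, fill gaps by interpolation,
    then smooth with a rolling median followed by a rolling mean."""
    z = (s - s.mean()) / s.std()
    s = s.mask(z.abs() > z_thresh)             # drop outliers/spikes
    s = s.interpolate(limit_direction="both")  # fill missing values
    smoothed = s.rolling(w_median, center=True, min_periods=1).median()
    return smoothed.rolling(w_mean, center=True, min_periods=1).mean()

# Hypothetical turbidity record with spikes and gaps
rng = np.random.default_rng(5)
turbidity = pd.Series(20 + rng.normal(scale=3.0, size=300))
turbidity.iloc[[40, 150]] = 400.0  # spikes
turbidity.iloc[60:65] = np.nan     # missing values
print(clean_sensor_series(turbidity).describe())
```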
Huaihe River Basin (HRB) is located in China's north-south climatic transition zone, which is very sensitive to global climate change. Based on the daily maximum temperature, minimum temperature, and precipitation data of 40 meteorological stations and nine monthly large-scale ocean-atmospheric circulation indices during 1959–2019, we present an assessment of the spatial and temporal variations of extreme temperature and precipitation events in the HRB using nine extreme climate indices, and we analyze the teleconnection relationship between the extreme climate indices and the large-scale ocean-atmospheric circulation indices. The results show that warm extreme indices exhibit a significant (P < 0.05) increasing trend, while cold extreme indices (except for cold spell duration) and the diurnal temperature range (DTR) show a significant decreasing trend. Furthermore, all extreme temperature indices show significant mutations during 1959–2019. Spatially, a stronger warming trend occurs in the eastern HRB than in the western HRB, while maximum 5-d precipitation (Rx5day) and rainstorm days (R25) show an increasing trend in the southern, central, and northwestern regions of the HRB. The Arctic oscillation (AO), Atlantic multidecadal oscillation (AMO), and East Atlantic/Western Russia (EA/WR) pattern have a stronger correlation with the extreme climate indices than other circulation indices. AO and AMO (EA/WR) exhibit a significant (P < 0.05) negative (positive) correlation with frost days and the diurnal temperature range. Extreme warm events are strongly correlated with the variability of AMO and EA/WR in most parts of the HRB, while extreme cold events are closely related to the variability of AO and AMO in the eastern HRB. In contrast, AMO, AO, and EA/WR show limited impacts on extreme precipitation events in most parts of the HRB.
Accurate prediction of shield tunneling-induced settlement is a complex problem that requires consideration of many influential parameters. Recent studies reveal that machine learning (ML) algorithms can predict the settlement caused by tunneling. However, well-performing ML models are usually less interpretable, and irrelevant input features decrease the performance and interpretability of an ML model. Nonetheless, feature selection, a critical step in the ML pipeline, is usually ignored in most studies focused on predicting tunneling-induced settlement. This study applies four techniques, i.e., the Pearson correlation method, sequential forward selection (SFS), sequential backward selection (SBS), and the Boruta algorithm, to investigate the effect of feature selection on the model's performance when predicting the tunneling-induced maximum surface settlement (S_max). The data set used in this study was compiled from two metro tunnel projects excavated in Hangzhou, China using earth pressure balance (EPB) shields and consists of 14 input features and a single output (i.e., S_max). The ML model trained on features selected by the Boruta algorithm demonstrates the best performance in both the training and testing phases. The relevant features chosen by the Boruta algorithm further indicate that tunneling-induced settlement is affected by parameters related to tunnel geometry, geological conditions, and shield operation. The recently proposed Shapley additive explanations (SHAP) method explores how the input features contribute to the output of a complex ML model. It is observed that larger settlements are induced during shield tunneling in silty clay. Moreover, the SHAP analysis reveals that low magnitudes of face pressure at the top of the shield increase the model's output.
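Of the four techniques listed, sequential forward selection has a direct scikit-learn counterpart; the sketch below wraps it around a random forest on synthetic stand-in data. The 14-column layout, the choice of five retained features, and the estimator are assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SequentialFeatureSelector

# Hypothetical stand-in for the 14-feature settlement data set (not the paper's data)
rng = np.random.default_rng(6)
X = rng.normal(size=(200, 14))
s_max = 1.5 * X[:, 0] - 0.8 * X[:, 5] + 0.5 * X[:, 9] + rng.normal(scale=0.3, size=200)

# Sequential forward selection (SFS) wrapped around a random forest regressor
sfs = SequentialFeatureSelector(
    RandomForestRegressor(n_estimators=100, random_state=0),
    n_features_to_select=5, direction="forward", cv=3)
sfs.fit(X, s_max)
print("selected feature indices:", np.where(sfs.get_support())[0])
```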
Based on the analysis of its basic characteristics, this article investigated the disparities of the Chinese service industry among three regions (eastern, central, and western China) and the inter-provincial disparities within those regions using the Theil coefficient and cluster analysis. Then, the major factors influencing its spatial disparity were explored by correlation analysis and regression analysis. The conclusions can be drawn as follows. 1) The development of the Chinese service industry has experienced three phases since the 1980s: a rapid growth period, a slow growth period, and a recovery period. Judging from the proportions of value-added and employment, its development was clearly at a low level. In terms of industrial structure, traditional service sectors were dominant, while modern service sectors lagged behind. Moreover, its spatial disparity was distinct. 2) The level of the Chinese service industry falls into five basic regional ranks: well-developed, developed, relatively developed, underdeveloped, and undeveloped regions. As a whole, the overall structure of spatial disparity was steady in 1990–2005, but there was a notable gradient disparity in the interior structure of the service industry among different provinces. Furthermore, the overall disparity expanded rapidly in 1990–2005. The inter-provincial disparity of the service industry within the three regions, especially in eastern China, was larger than the disparity among the three regions. 3) The level of economic development, the level of urban development, the scale of market capacity, the level of transportation and telecommunication, and the abundance of human resources were the major factors influencing the development of the Chinese service industry.
Steel slag (SS) is a byproduct of the steel manufacturing industry, and environmental concerns may limit its reuse in different applications. The goal of this study was to investigate the leaching behavior of metals from SS before and after treatment by microbially induced carbonate precipitation (MICP). Toxicity characteristic leaching procedure (TCLP), synthetic precipitation leaching procedure (SPLP), and water leaching tests (WLT) were performed to evaluate the leaching behavior of major elements (Fe, Mg, and Ca) and trace elements (Ba, Cu, and Mn) in three scenarios. The concentrations of leached metals increased with the content of SS; after reaching a peak concentration, the leaching concentration decreased with further increases in SS content. The leachability of all elements considered in this study was below 0.5%. The carbonate generated by the MICP process contributed to the low leachability of metals. After bio-modification by the MICP process, the leaching concentrations of Ba from the TCLP, SPLP, and WLT tests were below 2.0 mg/L, the limit in drinking water regulated by the U.S. EPA. The concentrations of Cu leached from MICP-treated SS-sand samples were below 1.3 mg/L, the limit regulated for national secondary drinking water. According to the regulations of the U.S. EPA and the Mississippi Department of Environmental Quality (MDEQ), MICP-treated samples were classified as non-hazardous materials with respect to the leaching of metals. Judged against the maximum contaminant limits regulated by the U.S. EPA, MICP-treated SS is an eco-friendly material that can be reused as a construction material.
The effect of salinity on sludge alkaline fermentation at low temperature (20°C) was investigated, and a kinetic analysis was performed. Different doses of sodium chloride (NaCl, 0–25 g/L) were added to the fermentation system. The batch-mode results showed that the soluble chemical oxygen demand (SCOD) increased with salinity. The hydrolysates (soluble protein and polysaccharide) and the acidification products (short-chain fatty acids (SCFAs), NH₄⁺-N, and PO₄³⁻-P) increased with salinity initially, but slightly declined at higher salinity levels (20 g/L or 20–25 g/L), respectively. However, the hydrolytic acidification performance increased in the presence of salt compared to that without salt. Furthermore, the results of the Haldane inhibition kinetics analysis showed that the salt enhanced the hydrolysis rate of particulate organic matter from sludge particulates and the specific utilization of hydrolysate, and decreased the specific utilization of SCFAs. Pearson correlation coefficient analysis indicated that the importance of polysaccharide for the accumulation of SCFAs was reduced with salt addition, but the importance of protein and NH₄⁺-N for SCFA accumulation was increased.
Mastering the pattern of food loss caused by droughts and floods aids in planning the layout of agricultural production, determining the scale of drought and flood control projects, and reducing food loss. The Standardized Precipitation Evapotranspiration Index is calculated using monthly meteorological data from 1984 to 2020 in Shandong Province of China and is used to identify the province's drought and flood characteristics. Then, food losses due to droughts and floods are estimated separately from disaster loss data. Finally, the relationship between drought/flood-related factors and food losses is quantified using methods such as the Pearson correlation coefficient and linear regression. The results show that: 1) there is a trend toward aridity in Shandong Province, and the drought characteristic variables are increasing yearly while flood duration and severity are decreasing; 2) the food losses caused by droughts in Shandong Province exceed those caused by floods, and the area where droughts and floods occur frequently is located in Linyi City; 3) the impact of precipitation on food loss due to drought/flood is significant, followed by potential evapotranspiration and temperature; 4) the relationship between drought and flood conditions and food losses can be precisely quantified. An increase in accumulated drought duration of one month led to 1.939 × 10^4 t of grain loss, and an increase in cumulative flood duration of one month resulted in 1.134 × 10^4 t of grain loss. If the cumulative drought severity and average drought peak increase by one unit, food loss due to drought will increase by 1.562 × 10^4 t and 1.511 × 10^6 t, respectively. If the cumulative flood severity and average flood peak increase by one unit, food loss will increase by 8.470 × 10^3 t and 1.034 × 10^6 t, respectively.
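As an illustration of the linear-regression step, the sketch below fits grain loss against cumulative drought duration on synthetic data generated around the slope reported above; the sample values, noise level, and units are assumptions and do not reproduce the study's records.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical yearly records: cumulative drought duration (months) vs. grain loss (10^4 t),
# generated around the slope reported in the abstract purely for illustration
rng = np.random.default_rng(8)
duration = rng.uniform(0, 12, size=37)                 # 37 years, 1984-2020
loss = 1.939 * duration + rng.normal(scale=3.0, size=37)

fit = linregress(duration, loss)
print(f"slope = {fit.slope:.3f} (10^4 t per month), r = {fit.rvalue:.2f}, p = {fit.pvalue:.3g}")
```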