Funding: the National Natural Science Foundation of China (No. 62063006); the Guangxi Natural Science Foundation (Nos. 2023GXNSFAA026025 and AA24010001); the Innovation Fund of Chinese Universities Industry-University-Research (ID: 2023RY018); the Special Guangxi Industry and Information Technology Department, Textile and Pharmaceutical Division (ID: 2021 No. 231); the Special Research Project of Hechi University (ID: 2021GCC028); the Key Laboratory of AI and Information Processing, Education Department of Guangxi Zhuang Autonomous Region (Hechi University), No. 2024GXZDSY009.
Abstract: In dynamic scenarios, visual simultaneous localization and mapping (SLAM) algorithms often incorrectly incorporate dynamic points during camera pose computation, leading to reduced accuracy and robustness. This paper presents a dynamic SLAM algorithm that leverages object detection and regional dynamic probability. First, a parallel thread employs the YOLOX object detection model to gather 2D semantic information and compensate for missed detections. Next, an improved K-means++ clustering algorithm clusters bounding-box regions, adaptively determining the threshold for extracting dynamic object contours as the dynamic points change. This process divides the image into low-dynamic, suspicious-dynamic, and high-dynamic regions. In the tracking thread, the dynamic point removal module assigns dynamic probability weights to the feature points in these regions and, combined with geometric methods, detects and removes the dynamic points. The final evaluation on the public TUM RGB-D dataset shows that the proposed dynamic SLAM algorithm surpasses most existing SLAM algorithms, providing better pose estimation accuracy and robustness in dynamic environments.
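The region-weighted removal step lends itself to a short sketch. The fusion rule, weights, and threshold below are illustrative assumptions, not values from the paper; only the idea of combining a regional dynamic probability with a geometric consistency score follows the abstract.

```python
import numpy as np

# Illustrative dynamic-probability weights per region type (assumed values,
# not taken from the paper): low-, suspicious-, and high-dynamic regions.
REGION_WEIGHTS = {"low": 0.1, "suspicious": 0.5, "high": 0.9}

def filter_dynamic_points(points, region_labels, geom_scores, threshold=0.6):
    """Combine a regional dynamic-probability weight with a geometric
    consistency score (e.g., epipolar distance normalized to [0, 1]) and
    drop feature points whose combined dynamic score exceeds the threshold."""
    keep = []
    for pt, label, geom in zip(points, region_labels, geom_scores):
        score = 0.5 * REGION_WEIGHTS[label] + 0.5 * geom  # simple fusion rule
        if score <= threshold:
            keep.append(pt)
    return np.array(keep)

# Usage: a point in a low-dynamic region survives; a point in a high-dynamic
# region with large geometric error is removed.
pts = np.array([[10.0, 20.0], [30.0, 40.0], [50.0, 60.0]])
labels = ["low", "high", "suspicious"]
geom = [0.05, 0.9, 0.3]
print(filter_dynamic_points(pts, labels, geom))
```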
Funding: supported by the National Major Science and Technology Project, China (No. J2019-Ⅳ-0007-0075), and the Fundamental Research Funds for the Central Universities, China (No. JKF-20240036).
Abstract: To ensure the structural integrity of life-limiting components of aeroengines, Probabilistic Damage Tolerance (PDT) assessment is applied to evaluate the failure risk, as required by airworthiness regulations and military standards. The PDT method holds that defects such as machining scratches and service cracks exist in the tenon-groove structures of aeroengine disks. However, conducting PDT assessment is challenging due to the scarcity of effective Probability of Detection (POD) models and anomaly distribution models. Through a series of Nondestructive Testing (NDT) experiments, the POD model of real cracks in tenon-groove structures is constructed for the first time by employing the Transfer Function Method (TFM). A novel anomaly distribution model is derived from the POD model, instead of relying on the infeasible field-data accumulation method. Subsequently, a framework for calculating the Probability of Failure (POF) of the tenon-groove structures is established, and the aforementioned two models exert a significant influence on the POF results.
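As context for the POF framework, here is a minimal Monte Carlo sketch assuming a standard lognormal-link POD curve (a common form in NDT practice); the hyperparameters, growth factor, and critical size are illustrative, not the paper's TFM-fitted values.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def pod(a, mu=np.log(0.5), sigma=0.4):
    # Lognormal-link POD curve POD(a) = Phi((ln a - mu) / sigma); mu and
    # sigma here are illustrative, not the TFM-fitted values from the paper.
    return norm.cdf((np.log(a) - mu) / sigma)

# Monte Carlo POF sketch: sample initial anomaly sizes, keep those missed by
# inspection (probability 1 - POD), apply a toy growth factor, and count the
# fraction exceeding an assumed critical size.
a0 = rng.lognormal(mean=np.log(0.2), sigma=0.6, size=100_000)  # initial size, mm
missed = rng.random(a0.size) > pod(a0)
a_final = a0[missed] * 3.0                        # stand-in for a crack-growth law
pof = np.count_nonzero(a_final > 2.0) / a0.size   # 2.0 mm critical size (assumed)
print(f"estimated POF: {pof:.2e}")
```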
Funding: supported in part by the Open Fund of Intelligent Control Laboratory, China (No. ICL-2023-0202), and in part by the National Key R&D Program of China (Nos. 2021YFC2202600 and 2021YFC2202603).
Abstract: In distributed fusion, when one or more sensors are disturbed by faults, a common problem is that their local estimates are inconsistent with those of other fault-free sensors. Most existing fault-tolerant distributed fusion algorithms, such as the Covariance Union (CU) and Fault-tolerant Generalized Convex Combination (FGCC), apply only to the point-estimation case, in which local estimates and their associated error covariances are provided. A treatment focusing on fault-tolerant distributed fusion of arbitrary local Probability Density Functions (PDFs) is lacking. For this problem, we first propose functional Fuzzy c-Means (FCM) clustering algorithms induced by the Kullback-Leibler Divergence (KLD) and the reversed KLD, respectively, to soft-cluster all local PDFs. On this basis, two fault-tolerant distributed fusion algorithms for arbitrary local PDFs are developed. They select the representative PDF of the cluster with the largest sum of memberships as the fused PDF. Numerical examples verify the better fault tolerance of the two developed distributed fusion algorithms.
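For the Gaussian special case, the two building blocks named above reduce to closed forms: the KLD between two Gaussians and the standard FCM membership update. A sketch assuming a fuzzifier m = 2 and precomputed PDF-to-prototype distances:

```python
import numpy as np

def kld_gauss(m0, S0, m1, S1):
    """KL divergence KL(N(m0,S0) || N(m1,S1)) between two Gaussians."""
    k = m0.size
    S1_inv = np.linalg.inv(S1)
    d = m1 - m0
    return 0.5 * (np.trace(S1_inv @ S0) + d @ S1_inv @ d - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def fcm_memberships(dists, m=2.0):
    """Fuzzy c-means memberships from an (n_pdfs x n_clusters) distance
    matrix: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1)). Distances are
    assumed strictly positive."""
    p = 2.0 / (m - 1.0)
    ratio = dists[:, :, None] / dists[:, None, :]
    return 1.0 / np.sum(ratio ** p, axis=2)

# Usage: KLD between two toy Gaussian PDFs, then soft memberships of three
# local PDFs to two cluster prototypes; the faulty PDF clusters apart.
print(kld_gauss(np.zeros(2), np.eye(2), np.array([1.0, 0.0]), 2.0 * np.eye(2)))
d = np.array([[0.2, 3.0], [0.3, 2.5], [2.8, 0.4]])
print(fcm_memberships(d))  # each row sums to 1
```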
Abstract: We study the conditional entropy of topological dynamical systems using a family of metrics induced by probability bi-sequences. We present a Brin-Katok formula by replacing the mean metric with a family of metrics induced by a probability bi-sequence. We also establish Katok's entropy formula for conditional entropy for ergodic measures in the case of the new family of metrics.
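For context, the classical Brin-Katok local entropy formula that the paper adapts to metrics induced by probability bi-sequences reads as follows (standard notation, not the paper's):

```latex
% For a T-invariant ergodic measure \mu and Bowen balls
% B_n(x,\epsilon) = \{ y : d(T^i x, T^i y) < \epsilon,\ 0 \le i < n \},
h_\mu(T) \;=\; \lim_{\epsilon \to 0} \liminf_{n \to \infty}
  -\frac{1}{n} \log \mu\bigl(B_n(x,\epsilon)\bigr)
\quad \text{for } \mu\text{-a.e. } x.
```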
Funding: the National Key Research and Development Program of China under contract No. 2021YFC3101503; the Hunan Provincial Natural Science Foundation of China under contract No. 2023JJ10053; the National Natural Science Foundation of China under contract Nos. 42276205 and 42406195; the Youth Independent Innovation Science Foundation under contract No. ZK24-54.
Abstract: The Argo program measures temperature and salinity in the upper ocean (0–2000 m). These observations are critical for weather/climate studies, ocean circulation analysis, and sea-level monitoring. To address the limitations of traditional thresholds in Argo data quality control (QC), this study proposes a novel probability distribution-based inference method (PDIM) for temperature-salinity threshold inference. Integrating historical observations with climatological data, the method uses the historical data corresponding to each latitude-longitude grid cell, calculates temperature/salinity frequency distributions for each depth, and determines “zero probability” boundaries from the combined frequency distributions and climatology. A probability distribution model is then established to detect outliers automatically based on features of the probability density function, eliminating the traditional dependence on the normal-distribution hypothesis. When applied to global Argo datasets from the China Argo Real-time Data Center (CARDC), PDIM identifies suspicious profiles and sensor drifts with high reliability, achieving a low false positive rate (0.55% for temperature, 0.18% for salinity) while maintaining a competitive true positive rate (28.29% for temperature, 55.15% for salinity). This method is expected to improve the reliability of Argo data QC.
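A minimal sketch of the “zero probability” boundary idea for a single depth level and grid cell; the bin width and toy data are assumptions, and the paper's full PDIM additionally blends in climatology, which is omitted here.

```python
import numpy as np

def zero_prob_bounds(samples, bin_width=0.1):
    """Derive 'zero probability' acceptance bounds for one depth level:
    histogram the historical values and take the outermost bins with
    nonzero frequency as the valid range. The bin width is an assumption."""
    lo, hi = samples.min(), samples.max()
    edges = np.arange(lo, hi + bin_width, bin_width)
    counts, edges = np.histogram(samples, bins=edges)
    nz = np.nonzero(counts)[0]
    return edges[nz[0]], edges[nz[-1] + 1]

def flag_outliers(values, bounds):
    lo, hi = bounds
    return (values < lo) | (values > hi)

# Usage: historical temperatures at one depth in one lat/lon grid cell.
hist = np.random.default_rng(1).normal(10.0, 1.5, size=5000)
bounds = zero_prob_bounds(hist)
print(bounds, flag_outliers(np.array([9.8, 25.0]), bounds))
```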
Funding: supported by the Natural Science Foundation of Guangdong Province (Grant 2023A1515011667); the Science and Technology Major Project of Shenzhen (Grant KJZD20230923114809020); the Key Basic Research Foundation of Shenzhen (Grant JCYJ20220818100205012).
Abstract: Estimating probability density functions (PDFs) is critical in data analysis, particularly for complex multimodal distributions. Traditional kernel density estimator (KDE) methods often struggle to capture multimodal structures accurately because of their uniform weighting scheme, leading to mode loss and degraded estimation accuracy. This paper presents the flexible kernel density estimator (F-KDE), a novel nonparametric approach designed to address these limitations. F-KDE introduces the concept of kernel-unit inequivalence, assigning adaptive weights to each kernel unit, which better models local density variations in multimodal data. The method optimises an objective function that integrates estimation error and log-likelihood, using a particle swarm optimisation (PSO) algorithm that automatically determines the optimal weights and bandwidths. Through extensive experiments on synthetic and real-world datasets, we demonstrate that (1) the weights and bandwidths in F-KDE stabilise as the optimisation algorithm iterates, (2) F-KDE effectively captures multimodal characteristics, and (3) F-KDE outperforms state-of-the-art density estimation methods in terms of accuracy and robustness. The results confirm that F-KDE provides a valuable solution for accurately estimating multimodal PDFs.
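The kernel-unit inequivalence idea amounts to a weighted, per-kernel-bandwidth KDE. A sketch with hand-chosen weights and bandwidths standing in for the PSO-optimised ones:

```python
import numpy as np

def weighted_kde(x_grid, data, weights, bandwidths):
    """Weighted Gaussian KDE with per-kernel weights and bandwidths:
    f_hat(x) = sum_i w_i * N(x; x_i, h_i^2), with sum_i w_i = 1.
    In F-KDE the w_i and h_i come from the PSO optimisation; here
    they are supplied directly as an illustration."""
    w = np.asarray(weights) / np.sum(weights)
    z = (x_grid[:, None] - data[None, :]) / bandwidths[None, :]
    kernels = np.exp(-0.5 * z**2) / (np.sqrt(2 * np.pi) * bandwidths[None, :])
    return kernels @ w

# Bimodal toy data: heavier weights and narrower bandwidths near the small
# mode keep it from being smoothed away by a uniform scheme.
rng = np.random.default_rng(2)
data = np.concatenate([rng.normal(0, 1, 900), rng.normal(5, 0.3, 100)])
w = np.where(data > 3, 2.0, 1.0)            # assumed up-weighting of small mode
h = np.where(data > 3, 0.15, 0.5)           # assumed per-kernel bandwidths
grid = np.linspace(-4, 7, 200)
density = weighted_kde(grid, data, w, h)
print(density.sum() * (grid[1] - grid[0]))  # integrates to ~1 over the grid
```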
Funding: supported by the National Social Science Foundation of China (Grant Nos. 21BGL217 and 22CGL050) and the Philosophy and Social Science Fund of the Education Department of Jiangsu Province (Grant No. 2020SJA2346).
Abstract: Vaccination is critical for controlling infectious diseases, but negative vaccination information can lead to vaccine hesitancy. To study how the interplay between information diffusion and disease transmission affects vaccination and epidemic spread, we propose a novel two-layer multiplex network model that integrates an unaware-acceptant-negative-unaware (UANU) information diffusion model with a susceptible-vaccinated-exposed-infected-susceptible (SVEIS) epidemiological framework. The model includes individual exposure and vaccination statuses, time-varying forgetting probabilities, and information conversion thresholds. Through the microscopic Markov chain approach (MMCA), we derive the dynamic transition equations and an expression for the epidemic threshold, validated by Monte Carlo simulations. Using the MMCA equations, we predict vaccination densities and analyze the effects of parameters on vaccination, disease transmission, and the epidemic threshold. Our findings suggest that promoting positive information, curbing the spread of negative information, enhancing vaccine effectiveness, and promptly identifying asymptomatic carriers can significantly increase vaccination rates, reduce epidemic spread, and raise the epidemic threshold.
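As a simplified illustration of the MMCA machinery, here is the update for a plain single-layer SIS process; the paper's coupled UANU+SVEIS multiplex model adds vaccination, exposure, and information states that this sketch omits.

```python
import numpy as np

def mmca_sis_step(p, A, beta, mu):
    """One microscopic-Markov-chain update for a single-layer SIS process:
    q_i = prod_j (1 - beta * A_ij * p_j) is the probability that node i is
    NOT infected by any neighbor this step. The update covers staying
    susceptible and getting infected, staying infected, and recovering
    followed by immediate reinfection."""
    q = np.prod(1.0 - beta * A * p[None, :], axis=1)
    return (1.0 - p) * (1.0 - q) + (1.0 - mu) * p + mu * (1.0 - q) * p

# Usage: iterate to the stationary infection probabilities on a small graph.
A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
p = np.full(3, 0.01)
for _ in range(500):
    p = mmca_sis_step(p, A, beta=0.3, mu=0.2)
print(p)
```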
Abstract: This study aims to develop an empirical model to predict rainfall intensity in Al-Diwaniyah City, Iraq, based on a statistical analysis of probability and specified rainfall return periods. Daily rainfall data were collected for 25 years starting in 2000 and converted to rainfall intensities for five durations ranging from one to five hours. The extreme values were checked, data deviating from the group trend were removed for each duration, and the remaining values were arranged in descending order, with probabilities calculated using the Weibull formula. Statistically, the model performance for a return period of two years is considered good when compared with observed results and other methods, such as Talbot and Sherman, with a coefficient of determination (R²) > 0.97 and a Nash-Sutcliffe efficiency (NSE) > 0.80. The result is a mathematical equation describing the relationship between rainfall intensity, probability, and rainfall duration, which can be used for a specified return period with a 50% probability. Decision-makers can therefore rely on the model to improve the performance of the city's current drainage system during future flood periods.
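The Weibull plotting-position step is compact enough to show directly; the sample intensities below are illustrative, not the Al-Diwaniyah record:

```python
import numpy as np

def weibull_probabilities(intensities):
    """Weibull plotting-position formula: sort intensities in descending
    order, then P = m / (n + 1) for rank m = 1..n; return period T = 1/P."""
    x = np.sort(np.asarray(intensities))[::-1]
    m = np.arange(1, x.size + 1)
    P = m / (x.size + 1)
    return x, P, 1.0 / P

x, P, T = weibull_probabilities([42.0, 18.5, 27.3, 33.1, 22.0])
for xi, pi, ti in zip(x, P, T):
    print(f"i = {xi:5.1f} mm/h  P = {pi:.3f}  T = {ti:.2f} yr")
```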
Abstract: Decision-makers usually have an aspiration level, a target, or a benchmark they aim to achieve. This behavior can be rationalized within the expected-utility framework, which incorporates the probability of success (achieving the aspiration level) as an important aspect of decision-making. Motivated by these theories, this study defines the probability of success as the number of days a firm's return outperformed its benchmark in the portfolio formation month. Using portfolio-level and firm-level analyses, the study reveals an economically substantial and statistically significant relationship between the probability of success and expected stock returns, even after controlling for common risk factors and various firm characteristics. Additional analyses support the behavioral theory of the firm, which posits that firms act to achieve short-term aspiration levels.
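A minimal sketch of the success-probability measure as defined above, computed as the fraction (equivalently, the count) of formation-month days on which a firm beat its benchmark; the column names and toy panel are assumptions, not the paper's dataset:

```python
import pandas as pd

def probability_of_success(daily, ret_col="ret", bench_col="bench"):
    """Per-firm fraction of formation-month trading days on which the
    daily return exceeded the benchmark return."""
    wins = daily[ret_col] > daily[bench_col]
    return wins.groupby(daily["firm"]).mean()

daily = pd.DataFrame({
    "firm":  ["A"] * 4 + ["B"] * 4,
    "ret":   [0.01, -0.02, 0.03, 0.00, -0.01, 0.02, 0.01, 0.04],
    "bench": [0.00,  0.00, 0.01, 0.01,  0.00, 0.00, 0.02, 0.01],
})
print(probability_of_success(daily))  # A: 0.50, B: 0.50
```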
Funding: supported by the National Natural Science Foundation of China (No. 71501183).
Abstract: To address the high experimental cost of ammunition, the scarcity of field test data, and the difficulty of applying classical statistical methods to ammunition hit-probability estimation, this paper assumes that the projectile dispersion of ammunition follows a two-dimensional joint normal distribution and proposes a new Bayesian inference method for ammunition hit probability based on the normal-inverse Wishart distribution. First, the conjugate joint prior distribution of the projectile dispersion characteristic parameters is determined to be a normal-inverse Wishart distribution, and the hyperparameters of the prior are estimated from simulation data and historical measurements. Second, the field test data are combined via Bayes' formula to obtain the joint posterior distribution of the dispersion parameters, from which the hit probability of the ammunition is estimated. Finally, compared with the binomial distribution method, the proposed method accounts for the dispersion of the projectiles and makes fuller use of the hit-probability information, yielding hit-probability results closer to the field shooting test samples. The method is broadly applicable and produces more accurate hit-probability estimates.
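The conjugate normal-inverse-Wishart update is standard and worth stating concretely; the prior hyperparameters, toy impact points, and circular target below are illustrative assumptions, not the paper's values:

```python
import numpy as np

def niw_posterior(x, mu0, kappa0, nu0, Psi0):
    """Conjugate normal-inverse-Wishart update for 2-D impact points x
    (shape n x 2): mu_n = (k0*mu0 + n*xbar)/(k0+n), k_n = k0+n,
    nu_n = nu0+n, Psi_n = Psi0 + S + k0*n/(k0+n)*(xbar-mu0)(xbar-mu0)^T."""
    n = x.shape[0]
    xbar = x.mean(axis=0)
    S = (x - xbar).T @ (x - xbar)                  # scatter matrix
    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    nu_n = nu0 + n
    d = (xbar - mu0).reshape(-1, 1)
    Psi_n = Psi0 + S + (kappa0 * n / kappa_n) * (d @ d.T)
    return mu_n, kappa_n, nu_n, Psi_n

# Usage: update a weak NIW prior with a small field-test sample, then estimate
# hit probability by Monte Carlo under the posterior-mean dispersion.
rng = np.random.default_rng(3)
shots = rng.normal([0.1, -0.2], 0.8, size=(8, 2))       # toy impact points
mu_n, k_n, nu_n, Psi_n = niw_posterior(shots, np.zeros(2), 1.0, 4.0, np.eye(2))
Sigma_hat = Psi_n / (nu_n - 2 - 1)                      # inverse-Wishart mean, d=2
sim = rng.multivariate_normal(mu_n, Sigma_hat, size=100_000)
p_hit = np.mean(np.linalg.norm(sim, axis=1) < 1.0)      # circular target, R=1 (assumed)
print(f"estimated hit probability: {p_hit:.3f}")
```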
Funding: support from the National Science Fund for Distinguished Young Scholars (Grant No. 52125904); the National Key R&D Plan (Grant No. 2022YFC3004403); the National Natural Science Foundation of China (Grant No. 52039008).
Abstract: To address prediction errors and limited information extraction in machine learning (ML)-based interval prediction, a hybrid model is proposed for interval estimation and failure assessment of step-like landslides under uncertainty. The model decomposes displacements into trend and periodic components via Variational Mode Decomposition (VMD) and K-shape clustering. The Residual and Moving Block Bootstrap methods are used to generate pseudo datasets. Polynomial regression is adopted for trend forecasting, whereas Dense Convolutional Network (DenseNet) and Long Short-Term Memory (LSTM) networks are employed for periodic displacement prediction. An Extreme Learning Machine (ELM) estimates the noise variance, enabling the construction of Prediction Intervals (PIs) and the quantification of displacement uncertainty. Failure probabilities (P_f) are derived from the PIs using an improved tangential-angle criterion and reliability analysis. The model was validated on three step-like landslides in the Three Gorges Reservoir Area, achieving stability assessment accuracies of 99.88% (XD01), 99.93% (ZG93), 99.89% (ZG118), and 100% for ZG110 and ZG111 across the Baishuihe and Bazimen landslides. For the Shuping landslide, the predictions aligned with field observations before and after the 2014–2015 remediation, with P_f remaining near zero after 2015 except for occasional peaks. The model outperformed conventional ML approaches by yielding narrower PIs: at XD01, with a 90% PI nominal confidence level (PINC), the coverage width-based criterion (CWC) and PI average width (PIAW) were both 3.38 mm. The mean values of the PIs exhibited high accuracy, with a Mean Absolute Error (MAE) of 0.28 mm and a Root Mean Square Error (RMSE) of 0.39 mm. These results demonstrate the robustness of the proposed model in improving landslide risk assessment and decision-making under uncertainty.
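A generic sketch of how a PI is assembled from the two uncertainty sources described above (bootstrap model variance plus an estimated noise variance); this is the textbook construction, not necessarily the paper's exact procedure:

```python
import numpy as np
from scipy.stats import norm

def prediction_interval(y_hat, var_model, var_noise, pinc=0.90):
    """PI = y_hat +/- z * sqrt(var_model + var_noise), where z is the
    two-sided normal quantile for the nominal confidence level (PINC)."""
    z = norm.ppf(0.5 + pinc / 2.0)
    half = z * np.sqrt(var_model + var_noise)
    return y_hat - half, y_hat + half

# Usage: bootstrap replicates give y_hat and var_model per time step; the
# noise variance (estimated by the ELM in the paper) is assumed here.
boot = np.array([[12.1, 12.4, 12.0], [15.2, 15.6, 15.1]])  # toy replicates (mm)
y_hat, var_m = boot.mean(axis=1), boot.var(axis=1, ddof=1)
lo, hi = prediction_interval(y_hat, var_m, var_noise=0.05)
print(np.c_[lo, y_hat, hi])
```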
Funding: 2021 Annual Research Project of Yili Normal University (2021YSBS012).
Abstract: With the implementation of the General Senior High School Mathematics Curriculum Standards (2017 Edition, Revised in 2020), probability and statistics, as important carriers of the core mathematical competencies of "mathematical modeling" and "data analysis," have increasingly demonstrated their educational value. By reviewing the historical evolution of probabilistic and statistical thinking and drawing on teaching practice cases, this study explores their unique role in cultivating students' core mathematical competencies. The research proposes a project-based teaching strategy grounded in real scenarios and empowered by technology. Through cases, it demonstrates how modern educational technology can support whole-process exploration of data collection, model construction, and conclusion verification, thereby promoting the transformation of middle school probability and statistics teaching from knowledge transmission to competency development and providing a practical reference for curriculum reform.