Objective Humans are exposed to complex mixtures of environmental chemicals and other factors that can affect their health. Analysis of these mixture exposures presents several key challenges for environmental epidemiology and risk assessment, including high dimensionality, correlated exposures, and subtle individual effects. Methods We proposed a novel statistical approach, the generalized functional linear model (GFLM), to analyze the health effects of exposure mixtures. The GFLM treats the effect of mixture exposures as a smooth function by reordering exposures based on specific mechanisms and capturing internal correlations, providing meaningful estimation and interpretation. Its robustness and efficiency were evaluated under various scenarios through extensive simulation studies. Results We applied the GFLM to two datasets from the National Health and Nutrition Examination Survey (NHANES). In the first application, we examined the effects of 37 nutrients on BMI (2011–2016 cycles). The GFLM identified a significant mixture effect, with fiber and fat emerging as the nutrients with the greatest negative and positive effects on BMI, respectively. For the second application, we investigated the association between four per- and polyfluoroalkyl substances (PFAS) and gout risk (2007–2018 cycles). Unlike traditional methods, the GFLM indicated no significant association, demonstrating its robustness to multicollinearity. Conclusion The GFLM framework is a powerful tool for mixture exposure analysis, offering improved handling of correlated exposures and interpretable results. It demonstrates robust performance across various scenarios and real-world applications, advancing our understanding of complex environmental exposures and their health impacts in environmental epidemiology and toxicology.
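To make the basis-expansion idea behind such a functional linear model concrete, here is a minimal sketch: exposures are placed on an ordered index t in [0, 1], the coefficient function beta(t) is expanded in a small smooth basis (Gaussian bumps for brevity; B-splines are the more common choice), and the reduced design is fit by least squares. All names and numbers are illustrative, not the paper's implementation.

```python
# Minimal functional-linear-model sketch: y ~ integral of X(t) * beta(t) dt,
# approximated on a grid of p ordered exposures with a k-function smooth basis.
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 500, 37, 6                    # e.g., 37 nutrients as in the NHANES example
t = np.linspace(0, 1, p)                # ordered exposure index
centers = np.linspace(0, 1, k)
B = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / 0.15) ** 2)  # p x k basis

X = rng.normal(size=(n, p))             # exposure matrix
beta_true = np.sin(np.pi * t)           # smooth "mixture effect" curve
y = X @ beta_true / p + rng.normal(scale=0.5, size=n)

Z = X @ B / p                           # reduced n x k design
c, *_ = np.linalg.lstsq(Z, y, rcond=None)   # basis coefficients
beta_hat = B @ c                        # estimated smooth coefficient function
print(np.round(beta_hat[[0, p // 2, p - 1]], 2))  # curve at start, middle, end
```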
The objective of reliability-based design optimization (RBDO) is to minimize the optimization objective while satisfying the corresponding reliability requirements. However, the nested-loop characteristic reduces the efficiency of RBDO algorithms, which hinders their application to high-dimensional engineering problems. To address these issues, this paper proposes an efficient decoupled RBDO method combining high dimensional model representation (HDMR) and the weight-point estimation method (WPEM). First, we decouple the RBDO model using HDMR and WPEM. Second, Lagrange interpolation is used to approximate the univariate component functions. Finally, based on the results of the first two steps, the original nested-loop reliability optimization model is completely transformed into a deterministic design optimization model that can be solved by a series of mature constrained optimization methods without any additional calculations. Two numerical examples, a planar 10-bar structure and an aviation hydraulic piping system with 28 design variables, are analyzed to illustrate the performance and practicability of the proposed method.
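The Lagrange-interpolation step is easy to illustrate: each univariate component function is replaced by a cheap polynomial surrogate built from a handful of sample points. The function g below is a stand-in, not the paper's limit-state function.

```python
# Lagrange interpolation of a univariate function from a few nodes.
import numpy as np

def lagrange_interp(x_nodes, y_nodes, x):
    """Evaluate the Lagrange interpolating polynomial at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(x_nodes, y_nodes)):
        li = 1.0
        for j, xj in enumerate(x_nodes):
            if j != i:
                li *= (x - xj) / (xi - xj)   # i-th Lagrange basis polynomial
        total += yi * li
    return total

g = lambda u: np.exp(-u) + 0.1 * u**3        # illustrative univariate component
nodes = np.linspace(-2.0, 2.0, 5)            # five interpolation nodes
vals = g(nodes)
print(lagrange_interp(nodes, vals, 0.7), g(0.7))   # surrogate vs. true value
```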
Cloud diurnal variation is crucial for regulating cloud radiative effects and atmospheric dynamics. However, it is often overlooked in the evaluation and development of climate models. Thus, this study investigates the daily mean (CFR) and diurnal variation (CDV) of cloud fraction for high-, middle-, and low-level clouds and total cloud in the FGOALS-f3-L general circulation model. The bias of total CDV is decomposed into the model biases in the CFRs and CDVs of clouds at all three levels. Results indicate that the model generally underestimates low-level cloud fraction during the daytime and high-/middle-level cloud fraction at nighttime. The simulation biases of low clouds, especially their CDV biases, dominate the bias of total CDV. Compensation effects exist among the bias decompositions, where the negative contributions of underestimated daytime low-level cloud fraction are partially offset by the opposing contributions from biases in high-/middle-level clouds. Meanwhile, the bias contributions show notable land–ocean differences and region-dependent characteristics, consistent with the model biases in these variables. Additionally, the study estimates the influences of CFR and CDV biases on the bias of shortwave cloud radiative effects. It reveals that the impacts of CDV biases can reach half of those of CFR biases, highlighting the importance of accurate CDV representation in climate models.
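A toy illustration of the CFR/CDV split used above: an hourly cloud-fraction series is separated into its daily mean and its diurnal anomaly, and model and observation are compared in both components. The numbers are purely synthetic; the paper's actual decomposition of the total-CDV bias is more involved.

```python
# Split hourly cloud fraction into daily mean (CFR) and diurnal anomaly (CDV),
# then measure the model bias in each component separately.
import numpy as np

hours = np.arange(24)
obs = 0.40 + 0.10 * np.cos(2 * np.pi * (hours - 14) / 24)   # "observed" low cloud
mod = 0.32 + 0.06 * np.cos(2 * np.pi * (hours - 16) / 24)   # "modeled" low cloud

cfr_obs, cfr_mod = obs.mean(), mod.mean()         # daily means
cdv_obs, cdv_mod = obs - cfr_obs, mod - cfr_mod   # diurnal anomalies

print(f"CFR bias: {cfr_mod - cfr_obs:+.3f}")                         # mean-state bias
print(f"max CDV bias: {np.abs(cdv_mod - cdv_obs).max():.3f}")        # diurnal-cycle bias
```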
Pronounced climatic differences occur over subtropical South China (SC) and the tropical South China Sea (SCS), and understanding the key cloud-radiation characteristics is essential to simulating the East Asian climate. This study investigated cloud fractions and cloud radiative effects (CREs) over SC and SCS simulated by CMIP6 atmospheric models. Remarkable differences in cloud-radiation characteristics appeared over these two regions. In observations, considerable amounts of low- and middle-level clouds and a cloud radiative cooling effect appeared over SC. In contrast, high clouds prevailed over SCS, where longwave and shortwave CREs (LWCRE and SWCRE) offset each other, resulting in a weaker net cloud radiative effect (NCRE). The models underestimated the NCRE over SC, mainly due to a weaker shortwave CRE and smaller cloud fractions. Conversely, most models overestimated the NCRE over SCS because of a stronger shortwave CRE and a weaker longwave CRE. Regional CREs were closely linked to their dominant cloud fractions. Both observations and simulations showed a negative spatial correlation between total (low) cloud fraction and shortwave CRE over SC, especially in winter, and a positive correlation between high cloud fraction and longwave CRE over both regions. Compared with SCS, most models overestimated the spatial correlation between low (high) cloud fraction and SWCRE (LWCRE) over SC, with larger bias ranges among models, indicating an exaggerated cloud radiative cooling (warming) effect caused by low (high) clouds. Moreover, most models struggled to describe regional ascent and its connection with CREs over SC, while they reproduced these connections better over SCS. This study further suggests that reasonable circulation conditions are crucial to simulating cloud-radiation characteristics well over the East Asian regions.
Anti-jamming performance evaluation has recently received significant attention. For Link-16, the evaluation of anti-jamming performance and the selection of optimal anti-jamming technologies are urgent problems to be solved. A comprehensive evaluation method is proposed, which combines grey relational analysis (GRA) and the cloud model, to evaluate the anti-jamming performance of Link-16. First, on the basis of establishing the anti-jamming performance evaluation indicator system of Link-16, a linear combination of the analytic hierarchy process (AHP) and the entropy weight method (EWM) is used to calculate the combined weights. Second, the cloud model, a qualitative-quantitative concept transformation model, is introduced to evaluate the anti-jamming abilities of Link-16 under each jamming scheme. In addition, GRA calculates the correlation degree between the evaluation indicators and the anti-jamming performance of Link-16 and identifies the best anti-jamming technology. Finally, simulation results show that the proposed evaluation model achieves feasible and practical evaluation, opening a novel way for research on anti-jamming performance evaluation of Link-16.
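A compact sketch of the combined-weighting step: objective entropy weights (EWM) computed from a decision matrix are blended linearly with subjective AHP weights. The matrix, the AHP weights, and the blend coefficient are invented inputs for illustration.

```python
# Entropy weight method (EWM) plus linear blending with AHP weights.
import numpy as np

def entropy_weights(X):
    """EWM: weight each indicator by its information content (1 - entropy)."""
    P = X / X.sum(axis=0)                               # share of each alternative
    k = 1.0 / np.log(X.shape[0])
    plogp = np.where(P > 0, P * np.log(np.clip(P, 1e-12, None)), 0.0)
    e = -k * plogp.sum(axis=0)                          # entropy per indicator
    d = 1.0 - e                                         # divergence degree
    return d / d.sum()

X = np.array([[0.8, 0.6, 0.9],      # rows: jamming schemes / alternatives
              [0.5, 0.7, 0.4],      # cols: evaluation indicators
              [0.9, 0.5, 0.7]])
w_ahp = np.array([0.5, 0.3, 0.2])   # assumed subjective AHP weights
alpha = 0.5                         # linear-combination coefficient
w = alpha * w_ahp + (1 - alpha) * entropy_weights(X)
print(np.round(w, 3), round(w.sum(), 6))    # combined weights sum to 1
```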
DNAN-based insensitive melt-cast explosives have been widely utilized in insensitive munitions in recent years. When constrained DNAN-based melt-cast explosives are ignited under thermal stimulation, the base explosive exists in a molten liquid state, where high-temperature gases expand and react in the form of bubble clouds within the liquid explosive; this process is distinctly different from the dynamic crack propagation observed in solid explosives. In this study, a control model for the reaction evolution of burning bubble clouds was established to describe the reaction process and quantify the reaction violence of DNAN-based melt-cast explosives, considering the size distribution and activation mechanism of the burning bubble clouds. The feasibility of the model was verified through experimental results. The results revealed that under geometrically similar conditions, with identical confinement strength and aspect ratio, larger charge structures led to extended initial gas-flow and surface-burning processes, resulting in greater reaction equivalence and violence at the casing fracture. Under constant charge volume and size, stronger casing confinement accelerated self-enhanced burning, increasing the internal pressure, reaction degree, and reaction violence. Under constant casing thickness and radius, higher aspect ratios led to greater reaction violence at the casing fracture. Moreover, under constant charge volume and casing thickness, higher aspect ratios resulted in higher internal pressure, an increased reaction degree, and greater reaction violence at the casing fracture. Further, larger ullage volumes extended the reaction evolution time and increased the reaction violence under constant casing dimensions. Through a matched design of the opening threshold of the pressure relief holes and the relief structure area, a stable burning reaction could be maintained until completion, thereby controlling the reaction violence. The proposed model effectively reflects the effects of the intrinsic burning rate, casing confinement strength, charge size, ullage volume, and pressure relief structure on the reaction evolution process and reaction violence, providing a theoretical method for the thermal safety design and reaction violence evaluation of melt-cast explosives.
Well logging technology has accumulated a large amount of historical data through four generations of technological development, which forms the basis of well logging big data and digital assets. However, the value of these data has not been well stored, managed, and mined. The development of cloud computing technology provides a rare opportunity for a logging big data private cloud. The traditional petrophysical evaluation and interpretation model has encountered great challenges when facing new evaluation objects, and research on integrating distributed storage, processing, and learning functions into a logging big data private cloud has not yet been carried out. This study establishes a distributed logging big data private cloud platform centered on a unified learning model, which achieves the distributed storage and processing of logging big data and facilitates the learning of novel knowledge patterns via a unified logging learning model integrating physical simulation and data models in a large-scale function space, thus resolving the geo-engineering evaluation problem of geothermal fields. Following the research idea of "logging big data cloud platform – unified logging learning model – large function space – knowledge learning & discovery – application", the theoretical foundation of the unified learning model, the cloud platform architecture, data storage and learning algorithms, computing power allocation and platform monitoring, platform stability, and data security are analyzed. The designed logging big data cloud platform realizes parallel distributed storage and processing of data and learning algorithms. The feasibility of constructing a well logging big data cloud platform based on a unified learning model of physics and data is analyzed in terms of the structure, ecology, management, and security of the cloud platform. The case study shows that the logging big data cloud platform has obvious technical advantages over traditional logging evaluation methods in knowledge discovery methods, the sharing of data, software, and results, and in accuracy, speed, and complexity.
The spatial distribution of discontinuities and the size of rock blocks are key indicators for rock mass quality evaluation and rockfall risk assessment. Traditional manual measurement is often dangerous or unreachable at some high and steep rock slopes. In contrast, unmanned aerial vehicle (UAV) photogrammetry is not limited by terrain conditions and can efficiently collect high-precision three-dimensional (3D) point clouds of rock masses through all-round, multi-angle photography for rock mass characterization. In this paper, a new method based on a 3D point cloud is proposed for discontinuity identification and refined rock block modeling. The method consists of four steps: (1) establish a point cloud spatial topology, and calculate the point cloud normal vectors and average point spacing based on several machine learning algorithms; (2) extract discontinuities using the density-based spatial clustering of applications with noise (DBSCAN) algorithm, and fit the discontinuity planes by combining principal component analysis (PCA) with the natural breaks (NB) method; (3) insert points along line segments to generate an embedded discontinuity point cloud; and (4) adopt a Poisson reconstruction method for refined rock block modeling. The proposed method was applied to an outcrop of an ultra-high, steep rock slope and compared with the results of previous studies and manual surveys. The results show that the method can eliminate the influence of discontinuity undulations on orientation measurement and can describe local concave-convex characteristics in the modeling of rock blocks. The calculation results are accurate and reliable and can meet the practical requirements of engineering.
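A minimal sketch of step (2): cluster a point cloud with DBSCAN, then fit a plane to each cluster with PCA, where the plane normal is the direction of smallest variance. The data are synthetic planar patches; eps and min_samples would need tuning on real rock-mass clouds, and the sign ambiguity of the normal is ignored here. Requires scikit-learn.

```python
# DBSCAN clustering followed by PCA plane fitting on a toy point cloud.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)
# two noisy planar patches standing in for discontinuity surfaces
p1 = np.c_[rng.uniform(0, 1, 200), rng.uniform(0, 1, 200), rng.normal(0.0, 0.01, 200)]
p2 = np.c_[rng.uniform(2, 3, 200), rng.uniform(0, 1, 200), rng.normal(0.5, 0.01, 200)]
cloud = np.vstack([p1, p2])

labels = DBSCAN(eps=0.15, min_samples=10).fit_predict(cloud)
for k in sorted(set(labels) - {-1}):               # -1 marks noise points
    pts = cloud[labels == k]
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]                                # smallest-variance direction
    dip = np.degrees(np.arccos(abs(normal[2])))    # angle from horizontal
    dip_dir = np.degrees(np.arctan2(normal[0], normal[1])) % 360  # azimuth
    print(f"cluster {k}: {len(pts)} pts, dip {dip:.1f} deg, dip dir {dip_dir:.1f} deg")
```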
In 1995, the Intergovernmental Panel on Climate Change (IPCC) released a thermodynamic model based on the Greenhouse Effect, aiming to forecast global temperatures. This study delves into the intricacies of that model, and some interesting observations are revealed. The IPCC model equated average temperatures with average energy fluxes, which can cause significant errors. The model assumed that all energy fluxes remained constant and that the Earth emitted infrared radiation as if it were a blackbody; neither of those conditions exists. The IPCC's definition of Climate Change only includes events caused by human actions, excluding most causes. Satellite data aimed at the tops of clouds may have inferred a high Greenhouse Gas absorption flux. The model showed more energy coming from the atmosphere than absorbed from the sun, which may have caused a violation of the First and Second Laws of Thermodynamics. There were unexpectedly large gaps in the satellite data that aligned with various absorption bands of Greenhouse Gases, possibly caused by photon scattering associated with re-emissions. Based on science, we developed a cloud-based climate model that complied with the Radiation Laws and the First and Second Laws of Thermodynamics. The Cloud Model showed that 81.3% of the outgoing reflected and infrared radiation was attributable to the clouds and water vapor. In comparison, the involvement of CO₂ was only 0.04%, making it too minuscule to measure reliably.
The increasing development of accurate and efficient road three-dimensional (3D) modeling presents great opportunities to improve the data exchange and integration of building information modeling (BIM) models. 3D modeling of road scenes is crucial for reference in asset management, construction, and maintenance. Light detection and ranging (LiDAR) technology is increasingly employed to generate high-quality point clouds for road inventory. In this paper, we specifically investigate the use of LiDAR data for road 3D modeling. The purpose of this review is to provide references for the existing work on road 3D modeling based on LiDAR point clouds, critically discuss it, and identify challenges for further study. Besides, we introduce modeling standards for roads and discuss the components, types, and distinctions of various LiDAR measurement systems. Then, we review state-of-the-art methods and provide a detailed examination of road segmentation and feature extraction. Furthermore, we systematically introduce point cloud-based 3D modeling methods, namely parametric modeling and surface reconstruction. Parameters and rules are used to define model components based on geometric and non-geometric information, whereas surface modeling is conducted through individual faces within the geometry. Finally, we discuss and summarize future research directions in this field. This review can assist researchers in enhancing existing approaches and developing new techniques for road modeling based on LiDAR point clouds.
Latent factor (LF) models are highly effective in extracting useful knowledge from high-dimensional and sparse (HiDS) matrices, which are commonly seen in various industrial applications. An LF model usually adopts iterative optimizers, which may consume many iterations to reach a local optimum, resulting in considerable time cost. Hence, determining how to accelerate the training process for LF models has become a significant issue. To address this, this work proposes a randomized latent factor (RLF) model. It incorporates the principle of randomized learning techniques from neural networks into the LF analysis of HiDS matrices, thereby greatly alleviating the computational burden. It also extends a standard learning process for randomized neural networks to the context of LF analysis so that the resulting model represents an HiDS matrix correctly. Experimental results on three HiDS matrices from industrial applications demonstrate that, compared with state-of-the-art LF models, RLF achieves significantly higher computational efficiency and comparable prediction accuracy for missing data. It provides an important alternative approach to the LF analysis of HiDS matrices, which is especially desirable for industrial applications demanding highly efficient models.
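The randomized-learning principle borrowed from neural networks is easy to demonstrate in miniature: a random, untrained projection replaces iterative feature learning, and only the output weights are solved in closed form by (ridge) least squares, in the style of extreme learning machines. This illustrates the principle only, not the RLF model's exact formulation.

```python
# Randomized learning in miniature: fixed random hidden layer + closed-form
# ridge solution for the output weights -- no iterative training at all.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                  # inputs
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=1000)

W = rng.normal(size=(20, 200))                   # random, fixed hidden weights
b = rng.normal(size=200)
H = np.tanh(X @ W + b)                           # random nonlinear features

lam = 1e-2                                       # ridge regularization
beta = np.linalg.solve(H.T @ H + lam * np.eye(200), H.T @ y)
print("train MSE:", np.mean((H @ beta - y) ** 2))
```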
To solve the problem of target damage assessment when fragments attack a target under uncertain projectile-target intersection in an air defense intercept, this paper proposes a method for calculating the target damage probability leveraging a spatio-temporal finite multilayer fragment distribution, together with a target damage assessment algorithm based on cloud model theory. Drawing on the spatial dispersion characteristics of fragments from a projectile proximity explosion, we divide the fragment field into a finite number of distribution planes based on the time series in space, set up a fragment layer dispersion model grounded in the time series, and establish an intersection criterion for determining the effective penetration of each layer of fragments into the target. On the precondition that the multilayer fragments of the time series effectively strike the target, we also establish damage criteria for perforation and penetration damage and derive the damage probability calculation model. Taking the damage probability of the fragment layers in the spatio-temporal sequence to the target as the input state variable, we introduce cloud model theory to study the target damage assessment method. Combined with an equivalent simulation experiment, the scientific soundness and rationality of the proposed method were validated through quantitative calculations and comparative analysis.
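Before the cloud-model step, the per-layer damage probabilities must be combined into an overall figure; a toy numeric sketch of the simplest combination rule is shown below. Independence between layers is an assumption made here purely for illustration, and the p values are invented.

```python
# If each fragment layer i in the time sequence independently damages the
# target with probability p_i, the overall probability is 1 - prod(1 - p_i).
import numpy as np

p_layers = np.array([0.12, 0.30, 0.22, 0.08])    # per-layer damage probabilities
p_total = 1.0 - np.prod(1.0 - p_layers)
print(f"overall damage probability: {p_total:.3f}")   # ~0.558
```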
Because all the known integrable models possess Schwarzian forms with Möbius transformation invariance, one of the best ways to find new integrable models may be to start from suitable Möbius transformation invariant equations. In this paper, we study the Painlevé integrability of some special (3+1)-dimensional Schwarzian models.
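For context, the Schwarzian derivative underlying these Schwarzian forms, and its Möbius invariance, are the standard definitions (stated here for reference, not taken from the paper):

```latex
\{f;x\} \;=\; \frac{f'''}{f'} \;-\; \frac{3}{2}\left(\frac{f''}{f'}\right)^{2},
\qquad
\left\{\frac{af+b}{cf+d};\,x\right\} \;=\; \{f;x\}, \quad ad-bc \neq 0 .
```

Any equation written in terms of {f; x} therefore inherits invariance under Möbius transformations of f, which is the property exploited when searching for new integrable models.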
The multi-fidelity Kriging model is a promising technique in surrogate-based design, balancing model accuracy and the cost of sample generation by combining low- and high-fidelity data. However, the cost of building a multi-fidelity Kriging model increases significantly as problem complexity grows. To address this issue, we propose an efficient hierarchical Kriging modeling method. In building the low-fidelity model, distance correlation is used to determine the relative values of the hyperparameters. This transforms the maximum likelihood estimation problem into a one-dimensional optimization task, which can be solved efficiently, significantly improving modeling efficiency. The high-fidelity model is built similarly, with the low-fidelity model's hyperparameters used as the relative values for the high-fidelity model's hyperparameters. The proposed method's effectiveness is evaluated through analytical problems and a real-world engineering problem involving modeling the isentropic efficiency of a compressor rotor. Experimental results show that the proposed method reduces modeling time significantly without compromising accuracy. For the compressor rotor isentropic efficiency model, the proposed method yields over 99% cost savings compared to conventional approaches, while also achieving higher accuracy.
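A rough sketch of the one-dimensional hyperparameter idea: the per-dimension correlation hyperparameters are tied together by fixed relative values r (the paper derives these from distance correlation; here they are simply assumed), so the concentrated likelihood is maximized over a single scale factor. The constant mean is approximated by the sample mean for brevity. Requires scipy.

```python
# Kriging likelihood reduced to a 1-D search over one scale factor c,
# with the hyperparameter vector theta = c * r for fixed relative values r.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
X = rng.uniform(size=(30, 4))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=30)

r = np.array([1.0, 0.6, 0.3, 0.3])           # assumed relative hyperparameter values

def neg_log_likelihood(log_c):
    theta = np.exp(log_c) * r                # full hyperparameter vector from one scalar
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2 * theta).sum(-1)
    R = np.exp(-d2) + 1e-8 * np.eye(len(X))  # Gaussian correlation matrix + jitter
    L = np.linalg.cholesky(R)
    a = np.linalg.solve(L, y - y.mean())     # whitened residuals
    sigma2 = (a @ a) / len(X)                # concentrated process variance
    return len(X) * np.log(sigma2) + 2 * np.log(np.diag(L)).sum()

res = minimize_scalar(neg_log_likelihood, bounds=(-5, 5), method="bounded")
print("optimal scale factor:", np.exp(res.x))
```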
This paper was motivated by existing problems of cloud data storage at Imo State University, Nigeria, such as outsourced data causing the loss of data and the misuse of customer information by unauthorized users or hackers, thereby making customer/client data visible and unprotected. This also exposed clients/customers to enormous risk from defective equipment, bugs, faulty servers, and specious actions. The aim of this paper, therefore, is to analyze a secure model using Unicode Transformation Format (UTF) Base64 algorithms for storing data securely in the cloud. The Object-Oriented Hypermedia Analysis and Design Methodology (OOHADM) was adopted. Python was used to develop the security model; role-based access control (RBAC) and multi-factor authentication (MFA) algorithms were integrated to enhance security in the information system developed with HTML5, JavaScript, Cascading Style Sheets (CSS) version 3, and PHP 7. The paper also discusses related concepts, including the development of cloud computing, the characteristics of cloud computing, cloud deployment models, and cloud service models. The results showed that the proposed enhanced security model for the information systems of a corporate platform handled multiple authorization and authentication threats, and that a single login page directs all login requests from the different modules to one Single Sign-On Server (SSOS), which in turn redirects users to their requested resources/modules when authenticated, leveraging geo-location integration for physical-location validation. The newly developed system solves the shortcomings of the existing systems and reduces the time and resources incurred while using them.
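A minimal illustration of the UTF-8 plus Base64 encoding step described above, using only the Python standard library. Note that Base64 is an encoding, not encryption; real protection would need a cipher on top. The sample record is invented.

```python
# UTF-8 encode a record, Base64 it for storage, and round-trip it back.
import base64

record = "client: Jane Doe, balance: 2500"        # invented sample record
encoded = base64.b64encode(record.encode("utf-8"))
decoded = base64.b64decode(encoded).decode("utf-8")

print(encoded.decode("ascii"))   # what would be stored in the cloud
print(decoded == record)         # True -- lossless round trip
```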
In view of the limitations of traditional measurement methods in the field of building information, such as complex operation, low timeliness, and poor accuracy, a new way of combining three-dimensional scanning technology with the BIM (Building Information Modeling) model was discussed. Focusing on the efficient acquisition of building geometric information using the fast-developing 3D point cloud technology, an improved deep learning-based 3D point cloud recognition method was proposed. The method optimised the network structure based on RandLA-Net to adapt to large-scale point cloud processing requirements, while the semantic and instance features of the point cloud were integrated to significantly improve recognition accuracy and provide a precise basis for BIM model remodeling. In addition, a visual BIM model generation system was developed, which systematically transformed the point cloud recognition results into BIM component parameters, automatically constructed BIM models, and promoted the open sharing and secondary development of models. The research results not only effectively advance the automation of converting 3D point cloud data into refined BIM models, but also provide important technical support for promoting building informatisation and accelerating the construction of smart cities, showing wide application potential and practical value.
Purpose – In order to solve the problems of inaccurate calculation of index weights and of subjectivity and uncertainty in index assessment during the risk assessment process, this study aims to propose a scientific and reasonable centralized traffic control (CTC) system risk assessment method. Design/methodology/approach – First, system-theoretic process analysis (STPA) is used to conduct a risk analysis of the CTC system, and risk assessment indexes are constructed based on this analysis. Then, to enhance the accuracy of weight calculation, the fuzzy analytic hierarchy process (FAHP), the fuzzy decision-making trial and evaluation laboratory (FDEMATEL), and the entropy weight method are employed to calculate the subjective weight, relative weight, and objective weight of each index. These three types of weights are combined using game theory to obtain the combined weight for each index. To reduce subjectivity and uncertainty in the assessment process, the backward cloud generator method is utilized to obtain the numerical characteristics (NCs) of the cloud model for each index. The NCs of the indexes are then weighted to derive the comprehensive cloud for risk assessment of the CTC system, which is used to obtain the CTC system's comprehensive risk assessment. The model's similarity measurement method gauges the likeness between the comprehensive risk assessment cloud and the risk standard cloud. Finally, this process yields the risk assessment results for the CTC system. Findings – The cloud model can handle the subjectivity and fuzziness in the risk assessment process well. The cloud model-based risk assessment method was applied to the CTC system risk assessment of a railway group and achieved good results. Originality/value – This study provides a cloud model-based method for the risk assessment of CTC systems, which accurately calculates the weights of risk indexes and uses cloud models to reduce uncertainty and subjectivity in the assessment, achieving effective risk assessment of CTC systems. It can provide a reference and theoretical basis for risk management of the CTC system.
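A short sketch of the backward cloud generator referenced above: it estimates the cloud model's numerical characteristics (Ex, En, He) from sample data. These are the standard moment-based formulas for the backward generator without certainty degrees; the sample scores are invented.

```python
# Backward cloud generator: estimate (Ex, En, He) from assessment samples.
import numpy as np

def backward_cloud(x):
    ex = x.mean()                                    # expectation Ex
    en = np.sqrt(np.pi / 2) * np.abs(x - ex).mean()  # entropy En
    s2 = x.var(ddof=1)                               # sample variance
    he = np.sqrt(max(s2 - en**2, 0.0))               # hyper-entropy He
    return ex, en, he

scores = np.array([0.72, 0.65, 0.80, 0.70, 0.68, 0.75, 0.62, 0.78])
print("Ex=%.3f  En=%.3f  He=%.3f" % backward_cloud(scores))
```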
Cloud computing is an emerging technology in the rapidly growing IT world. The adoption of cloud computing is spreading quickly, from very large business organizations to small institutions, due to its many advanced features, such as the SaaS, PaaS, and IaaS service models. Consequently, many organizations are trying to implement cloud computing-based ERP systems to enjoy the benefits of cloud computing. To implement any ERP system, an organization usually faces many challenges. This research therefore shows how easily such a cloud system can be implemented in an organization. By using this ERP system, an organization can benefit in many ways; in particular, small and medium enterprises (SMEs) can enjoy the highest possible benefits from this system.
The rapid advent of artificial intelligence and big data has revolutionized the dynamic requirements on computing resources for executing specific tasks in the cloud environment. Achieving autonomic resource management is a herculean task due to the hugely distributed and heterogeneous environment. Moreover, the cloud network needs to provide autonomic resource management and deliver potential services to clients by complying with Quality-of-Service (QoS) requirements without impacting the Service Level Agreements (SLAs). However, the existing autonomic cloud resource management frameworks are not capable of handling cloud resources with dynamic requirements. In this paper, the Coot Bird Behavior Model-based Workload Aware Autonomic Resource Management Scheme (CBBM-WARMS) is proposed for handling the dynamic requirements of cloud resources through the estimation of the workload that needs to be policed by the cloud environment. CBBM-WARMS initially adopts an adaptive density peak clustering algorithm for clustering the workloads of the cloud. Then, it utilizes fuzzy logic during workload scheduling to determine the availability of cloud resources. It further uses the CBBM for potential virtual machine (VM) deployment, which contributes to the provision of optimal resources. It is designed to achieve optimal QoS with minimized time, energy consumption, SLA cost, and SLA violations. The experimental validation of the proposed CBBM-WARMS confirms a minimized SLA cost of 19.21% and a reduced SLA violation rate of 18.74%, better than the compared autonomic cloud resource management frameworks.