When assessing seismic liquefaction potential with data-driven models, addressing the uncertainties of model establishment, the interpretation of cone penetration test (CPT) data, and the decision threshold is crucial for avoiding biased data selection, ameliorating overconfident models, and remaining flexible to varying practical objectives, especially when the training and testing data are not identically distributed. A workflow leveraging Bayesian methodology was proposed to address these issues. Employing a Multi-Layer Perceptron (MLP) as the foundational model, this approach was benchmarked against empirical methods and advanced algorithms in terms of simplicity, accuracy, and resistance to overfitting. The analysis revealed that, while MLP models optimized via the maximum a posteriori algorithm suffice for straightforward scenarios, Bayesian neural networks showed great potential for preventing overfitting. Additionally, integrating decision thresholds through various evaluative principles offers insights for challenging decisions. Two case studies demonstrate the framework's capacity for nuanced interpretation of in situ data, employing a model committee for a detailed evaluation of liquefaction potential via Monte Carlo simulations and basic statistics. Overall, the proposed step-by-step workflow for analyzing seismic liquefaction incorporates multifold testing and real-world data validation, showing improved robustness against overfitting and greater versatility in addressing practical challenges. This research contributes to the seismic liquefaction assessment field by providing a structured, adaptable methodology for accurate and reliable analysis.
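For illustration, a minimal sketch of the model-committee idea named above: an ensemble of MLP classifiers is evaluated under Monte Carlo perturbation of uncertain CPT inputs, and the committee's mean probability is compared against a decision threshold. The feature set, noise levels, synthetic labels, and threshold are assumptions for demonstration, not values from the paper.

```python
# Illustrative sketch only: a committee of MLP classifiers votes on liquefaction
# potential under Monte Carlo perturbation of uncertain CPT inputs. The feature
# names, noise levels, and threshold below are hypothetical, not from the paper.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Toy training data: columns = [qc (tip resistance), fs (sleeve friction), CSR]
X_train = rng.uniform([1.0, 0.01, 0.05], [30.0, 0.5, 0.6], size=(500, 3))
y_train = (X_train[:, 2] > 0.02 * X_train[:, 0]).astype(int)  # synthetic label rule

# Model committee: same architecture, different initializations and bootstraps
committee = []
for seed in range(10):
    idx = rng.integers(0, len(X_train), len(X_train))  # bootstrap resample
    clf = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000,
                        random_state=seed).fit(X_train[idx], y_train[idx])
    committee.append(clf)

# Monte Carlo over measurement uncertainty of one new CPT sounding
x_new = np.array([12.0, 0.15, 0.25])
sigma = np.array([1.0, 0.02, 0.03])          # assumed measurement noise
samples = rng.normal(x_new, sigma, size=(1000, 3))

probs = np.mean([clf.predict_proba(samples)[:, 1] for clf in committee], axis=0)
print(f"P(liquefaction): mean={probs.mean():.2f}, std={probs.std():.2f}")
print("Trigger (threshold 0.5):", probs.mean() > 0.5)  # threshold is objective-dependent
```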
This study introduces an innovative “Big Model” strategy to enhance Bridge Structural Health Monitoring (SHM) using a Convolutional Neural Network (CNN), time-frequency analysis, and finite element analysis. Leveraging ensemble methods, collaborative learning, and distributed computing, the approach effectively manages the complexity and scale of large-scale bridge data. The CNN employs transfer learning, fine-tuning, and continuous monitoring to optimize models for adaptive and accurate structural health assessments, focusing on extracting meaningful features through time-frequency analysis. By integrating finite element analysis, time-frequency analysis, and CNNs, the strategy provides a comprehensive understanding of bridge health. Utilizing diverse sensor data, sophisticated feature extraction, and an advanced CNN architecture, the model is optimized through rigorous preprocessing and hyperparameter tuning. This approach significantly enhances the ability to make accurate predictions, monitor structural health, and support proactive maintenance practices, thereby ensuring the safety and longevity of critical infrastructure.
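A conceptual sketch of the two ingredients the abstract combines, time-frequency feature extraction and CNN transfer learning: a sensor record is converted to a log-spectrogram image and a pretrained backbone is fine-tuned on it. The sampling rate, class count, and backbone choice are assumptions for illustration, not the paper's model.

```python
# Conceptual sketch (not the paper's model): turn an acceleration record into a
# time-frequency image and fine-tune a pretrained CNN head on it. Sensor rate,
# class count, and architecture choice are assumptions for illustration.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import spectrogram
from torchvision import models

def to_tf_image(signal: np.ndarray, fs: float = 100.0) -> torch.Tensor:
    """Time-frequency analysis: log-spectrogram as a 3x224x224 pseudo-image."""
    _, _, Sxx = spectrogram(signal, fs=fs, nperseg=64)
    img = torch.log1p(torch.tensor(Sxx, dtype=torch.float32))[None, None]
    img = nn.functional.interpolate(img, size=(224, 224), mode="bilinear")
    return img.repeat(1, 3, 1, 1)  # replicate channel for an ImageNet backbone

# Transfer learning: freeze the pretrained backbone, retrain only the classifier
model = models.resnet18(weights="IMAGENET1K_V1")
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)  # healthy vs. damaged (assumed)

x = to_tf_image(np.random.randn(4096))         # stand-in for bridge sensor data
logits = model(x)
loss = nn.CrossEntropyLoss()(logits, torch.tensor([0]))
loss.backward()                                # a fine-tuning step would follow
print(logits.detach())
```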
Hydraulic fracturing technology has achieved remarkable results in improving the production of tight gas reservoirs, but its effectiveness is governed by the joint action of multiple complex factors. Traditional analysis methods have limitations in dealing with these complex and interrelated factors, and it is difficult for them to fully reveal the actual contribution of each factor to production. Machine learning-based methods explore the complex mapping relationships within large amounts of data to provide data-driven insights into the key factors driving production. In this study, a data-driven PCA-RF-VIM (Principal Component Analysis-Random Forest-Variable Importance Measures) approach to feature-importance analysis is proposed to identify the key factors driving post-fracturing production. Four types of parameters, including log parameters, geological and reservoir physical parameters, hydraulic fracturing design parameters, and reservoir stimulation parameters, were input into the PCA-RF-VIM model. The model was trained using 6-fold cross-validation and grid search, and the relative importance ranking of each factor was finally obtained. To verify the validity of the PCA-RF-VIM model, a consolidation model that uses three other independent data-driven methods (the Pearson correlation coefficient, the RF feature-importance method, and the XGBoost feature-importance method) was applied for comparison. A comparison of the two models shows that they contain almost the same parameters in the top ten, with only minor differences in one parameter. In combination with the reservoir characteristics, the reasonableness of the PCA-RF-VIM model is verified, and its importance ranking of the parameters is more consistent with the reservoir characteristics of the study area. Ultimately, ten parameters are selected as the controlling factors with the potential to influence post-fracturing gas production, as these top ten parameters account for 91.95% of the combined importance in driving natural gas production. Identifying these ten controlling factors provides engineers with new insight into reservoir selection for fracturing stimulation and fracturing parameter optimization to improve fracturing efficiency and productivity.
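A hedged sketch of how such a PCA-RF pipeline with 6-fold cross-validation and grid search might look in scikit-learn. Pushing component importances back through the PCA loadings is one plausible reading of the VIM step, not necessarily the paper's exact definition; the data and parameter names are placeholders.

```python
# A minimal sketch of a PCA-RF importance pipeline with 6-fold grid search.
# How the paper maps component importances back to raw features is not given
# here; weighting PCA loadings by RF importances is one plausible reading.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

X, y = make_regression(n_samples=300, n_features=12, noise=5.0, random_state=0)
feature_names = [f"param_{i}" for i in range(X.shape[1])]  # placeholder names

pipe = Pipeline([("pca", PCA()), ("rf", RandomForestRegressor(random_state=0))])
grid = {"pca__n_components": [4, 6, 8],
        "rf__n_estimators": [100, 300],
        "rf__max_depth": [None, 10]}
search = GridSearchCV(pipe, grid, cv=6, scoring="r2").fit(X, y)  # 6-fold CV

pca = search.best_estimator_["pca"]
rf = search.best_estimator_["rf"]
# Variable importance: RF importance per component, pushed back through loadings
vim = np.abs(pca.components_).T @ rf.feature_importances_
vim /= vim.sum()
for name, score in sorted(zip(feature_names, vim), key=lambda t: -t[1])[:5]:
    print(f"{name}: {score:.3f}")
```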
Materials development has historically been driven by human needs and desires, and this is likely to continue in the foreseeable future. The global population is expected to reach ten billion by 2050, which will promote increasingly large demands for clean and high-efficiency energy, personalized consumer products, secure food supplies, and professional healthcare. New functional materials that are made and tailored for targeted properties or behaviors will be the key to tackling this challenge. Traditionally, advanced materials are found empirically or through experimental trial-and-error approaches. As big data generated by modern experimental and computational techniques is becoming more readily available, data-driven or machine learning (ML) methods have opened new paradigms for the discovery and rational design of materials. In this review article, we provide a brief introduction to various ML methods and related software or tools. Main ideas and basic procedures for employing ML approaches in materials research are highlighted. We then summarize recent important applications of ML for the large-scale screening and optimal design of polymer and porous materials, catalytic materials, and energetic materials. Finally, concluding remarks and an outlook are provided.
A corrosion defect is recognized as one of the most severe phenomena for high-pressure pipelines, especially those that have been in service for a long time. The finite-element method and empirical formulas are therefore used for the strength prediction of such pipes with corrosion. However, the finite-element method is time-consuming, and empirical formulas have a limited application range. To improve strength prediction, this paper investigates the burst pressure of line pipelines with a single corrosion defect subjected to internal pressure based on data-driven methods. Three supervised ML (machine learning) algorithms, including the ANN (artificial neural network), the SVM (support vector machine), and LR (linear regression), are deployed to train models on experimental data. Data analysis is first conducted to determine proper pipe features for training. Hyperparameter tuning to control the learning process is then performed to fit the best strength models for corroded pipelines. Among all the proposed data-driven models, the ANN model with three neural layers has the highest training accuracy but also presents the largest variance. The SVM model provides both high training accuracy and high validation accuracy. The LR model has the best performance in terms of generalization ability. These models can serve as surrogate models via transfer learning as new data arrive in future research, facilitating sustainable and intelligent decision-making for corroded pipelines.
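A hedged sketch comparing the three model families named in the abstract on a synthetic stand-in for burst-pressure data, reporting training versus validation scores as the abstract does. The feature list (diameter, wall thickness, defect depth and length, yield strength) and the target formula are illustrative assumptions only.

```python
# Hedged sketch: comparing the three model families named in the abstract on a
# synthetic stand-in for burst-pressure data. Features (diameter, wall thickness,
# defect depth/length, yield strength) and the data itself are illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n = 400
D, t, d, L, sy = (rng.uniform(*b, n) for b in
                  [(300, 900), (6, 20), (1, 10), (20, 200), (300, 600)])
X = np.column_stack([D, t, d, L, sy])
# Rough Barlow-style trend with a defect penalty, plus noise (synthetic target)
y = 2 * sy * t / D * (1 - 0.6 * d / t) + rng.normal(0, 0.5, n)

X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.25, random_state=0)
models = {
    "ANN": make_pipeline(StandardScaler(),
                         MLPRegressor((32, 32, 32), max_iter=5000, random_state=0)),
    "SVM": make_pipeline(StandardScaler(), SVR(C=10.0)),
    "LR": LinearRegression(),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: train R2={model.score(X_tr, y_tr):.3f}, "
          f"val R2={model.score(X_va, y_va):.3f}")
```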
This paper addresses urban sustainability challenges amid global urbanization, emphasizing the need for innovative approaches aligned with the Sustainable Development Goals. While traditional tools and linear models offer insights, they fall short of presenting a holistic view of complex urban challenges. System dynamics (SD) models, which are often utilized to provide a holistic, systematic understanding of a research subject such as the urban system, emerge as valuable tools, but data scarcity and theoretical inadequacy pose challenges. The research reviews relevant papers on SD model applications in urban sustainability since 2018, categorizing them based on nine key indicators. Among the reviewed papers, data limitations and model assumptions were identified as the major challenges in applying SD models to urban sustainability. This led to exploring the transformative potential of big data analytics, an approach this study found to be rare in the field, to enhance the empirical foundation of SD models. Integrating big data could provide data-driven calibration, potentially improving predictive accuracy and reducing reliance on simplified assumptions. The paper concludes by advocating for new approaches that reduce assumptions and promote real-time applicable models, contributing to a comprehensive understanding of urban sustainability through the synergy of big data and SD models.
Background A photometric stereo method aims to recover the surface normal of a 3D object observed under varying light directions. It is an ill-defined problem because the general reflectance properties of the surface are unknown. Methods This paper reviews existing data-driven methods, with a focus on their technical insights into the photometric stereo problem. We divide these methods into two categories, per-pixel and all-pixel, according to how they process an image. We discuss the differences and relationships between these methods from the perspective of inputs, networks, and data, which are key factors in designing a deep learning approach. Results We demonstrate the performance of the models using a popular benchmark dataset. Conclusions Data-driven photometric stereo methods have shown a clear performance advantage over traditional methods. However, these methods suffer from various limitations, such as limited generalization capability. Finally, this study suggests directions for future research.
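For context, a sketch of the classical Lambertian least-squares baseline that data-driven methods are typically compared against: per-pixel recovery of the scaled normal from images under known light directions, in the spirit of Woodham's formulation. The synthetic scene below is an assumption for demonstration, and shadows and specularities are deliberately ignored.

```python
# Classical per-pixel least-squares photometric stereo on synthetic Lambertian
# data, assuming known light directions and ignoring shadows/specularities.
import numpy as np

rng = np.random.default_rng(2)

# Known light directions (m lights) and a ground-truth normal map (h x w x 3)
L = rng.normal(size=(8, 3))
L /= np.linalg.norm(L, axis=1, keepdims=True)
normals = rng.normal(size=(16, 16, 3))
normals /= np.linalg.norm(normals, axis=2, keepdims=True)
albedo = rng.uniform(0.2, 1.0, size=(16, 16))

# Lambertian image formation: I = albedo * max(L . n, 0)
I = np.clip(np.einsum("mk,hwk->hwm", L, normals), 0, None) * albedo[..., None]

# Per-pixel recovery: solve L @ (albedo * n) = i in the least-squares sense
g = np.einsum("km,hwm->hwk", np.linalg.pinv(L), I)  # pinv(L): 3 x m
rho = np.linalg.norm(g, axis=2)
n_hat = g / np.maximum(rho[..., None], 1e-9)

err = np.degrees(np.arccos(np.clip(np.sum(n_hat * normals, axis=2), -1, 1)))
print(f"mean angular error: {err.mean():.2f} deg")  # nonzero due to clipped shading
```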
Research into metamorphism plays a pivotal role in reconstructing the evolution of continents, particularly through the study of ancient rocks that are highly susceptible to metamorphic alteration due to multiple tectonic activities. In the big data era, the establishment of new data platforms and the application of big data methods have become a focus of metamorphic-rock research. Significant progress has been made in creating specialized databases, compiling comprehensive datasets, and utilizing data analytics to address complex scientific questions. However, many existing databases are inadequate for the specific requirements of metamorphic research, and a substantial amount of valuable data remains uncollected. Therefore, constructing new databases that can keep pace with the development of the data era is necessary. This article provides an extensive review of existing databases related to metamorphic rocks and discusses data-driven studies in this field. Accordingly, several crucial factors that need to be taken into consideration in the establishment of specialized metamorphic databases are identified, aiming to leverage data-driven applications to achieve broader scientific objectives in metamorphic research.
In this paper, the application of agricultural big data in agricultural economic management is explored in depth, and its potential for promoting profit growth and innovation is analyzed. However, challenges persist in data collection and integration, the limitations of analytical technologies, talent development and team building, and policy support when applying agricultural big data. Effective application strategies are proposed, including data-driven precision-agriculture practices, the construction of data integration and management platforms, data security and privacy protection strategies, and long-term planning and development strategies for agricultural big data, to maximize its impact on agricultural economic management. Future advancement requires collaborative efforts in technological innovation, talent cultivation, and policy support to realize the extensive application of agricultural big data in agricultural economic management and to ensure sustainable industrial development.
The outstanding comprehensive mechanical properties of newly developed hybrid lattice structures make them useful in engineering applications bearing multiple mechanical loads. Additive-manufacturing technologies make it possible to fabricate these highly spatially programmable structures and greatly enhance the freedom in their design. However, traditional analytical methods do not sufficiently reflect the actual vibration-damping mechanism of lattice structures and are limited by their high computational cost. In this study, a hybrid lattice structure consisting of various cells was designed based on quasi-static and vibration experiments. Subsequently, a novel parametric design method based on a data-driven approach was developed for hybrid lattices with engineered properties. The response surface method was adopted to define the sensitive optimization target. A prediction model linking the lattice geometric parameters and vibration properties was established using a backpropagation neural network. It was then integrated into a genetic algorithm to create an optimal hybrid lattice with varying geometric features and the required wide-band vibration-damping characteristics. Validation experiments demonstrated that the optimized hybrid lattice can achieve the target properties. In addition, the data-driven parametric design method can reduce computation time and be widely applied to complex structural designs when analytical and empirical solutions are unavailable.
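A minimal sketch of the surrogate-plus-GA loop described above: a backpropagation neural network learns a geometry-to-response mapping, and a simple genetic algorithm searches the geometry space over the surrogate. The objective function, bounds, and training data are synthetic stand-ins, not the paper's lattice model.

```python
# Sketch: BP neural network surrogate (geometry -> vibration response) combined
# with a minimal genetic algorithm. All numbers below are illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)

# Synthetic training set: 3 geometric parameters -> a damping-like response
X = rng.uniform(0.5, 2.0, size=(400, 3))
y = np.sin(X[:, 0] * 2) + 0.5 * X[:, 1] ** 2 - 0.3 * X[:, 2] + rng.normal(0, 0.05, 400)
surrogate = MLPRegressor((32, 32), max_iter=5000, random_state=0).fit(X, y)

# Minimal genetic algorithm over the surrogate (maximize predicted response)
pop = rng.uniform(0.5, 2.0, size=(60, 3))
for _ in range(40):
    fitness = surrogate.predict(pop)
    parents = pop[np.argsort(fitness)[-30:]]              # truncation selection
    i, j = (rng.integers(0, 30, 60) for _ in range(2))
    alpha = rng.uniform(size=(60, 1))
    pop = alpha * parents[i] + (1 - alpha) * parents[j]   # blend crossover
    pop += rng.normal(0, 0.02, pop.shape)                 # mutation
    pop = np.clip(pop, 0.5, 2.0)                          # keep within bounds

best = pop[np.argmax(surrogate.predict(pop))]
print("candidate geometry:", best.round(3))
```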
In the contemporary era, characterized by the Internet and digitalization as fundamental features, the operation and application of digital currency have gradually developed into a comprehensive structural system. This system restores the essential characteristics of currency while providing auxiliary services related to the formation, circulation, storage, application, and promotion of digital currency. Compared with traditional currency-management technologies, the big data analysis technology embedded in digital currency systems enables the rapid acquisition of information. This facilitates the identification of standard associations within currency data and provides technical support for the operational framework of digital currency.
Various investigations show that our country's audit industry is still dominated by traditional methods, which rely heavily on auditors' experience and judgment and cannot draw on the full volume of data for an audit, examining only what is selected for evaluation, a situation commonly described as "seeing the trees but not the forest." As a result, it is impossible to evaluate the real situation of the audited units objectively and accurately, and audit efficiency and effectiveness are greatly reduced. With the continued rapid development of the global economy, science and technology are advancing across many fields, and many industries increasingly rely on big data information technology. Against this background, introducing big data technology can bring significant changes to audit work. Because audit work becomes grounded in big data, auditors can rely more on this technology; at the same time, it adapts audit work to the current big data era and makes that work more convenient. In the context of big data, the use and innovation of audit methods are conducive both to the effective adjustment and improvement of the methods themselves and to the faster extraction of useful information and its effective analysis and use. This is the direction of progress and development for China's audit industry, and it is also a practical need of our auditors. Audit work in the context of big data is therefore of great significance.
The implementation of artificial intelligence (AI) in a smart society, in which the analysis of human habits is mandatory, requires automated data scheduling and analysis using smart applications, a smart infrastructure, smart systems, and a smart network. In this context, which is characterized by a large gap between training and operative processes, a dedicated method is required to manage and extract the massive amount of data and to mine the related information. The method presented in this work aims to reduce this gap with near-zero-failure advanced diagnostics (AD) for smart management, which is exploitable in any context of Society 5.0, thus reducing the risk factors at all management levels and ensuring quality and sustainability. We have also developed innovative applications for a human-centered management system to support scheduling in the maintenance of operative processes, reduce training costs, improve production yield, and create a human–machine cyberspace for smart infrastructure design. The results obtained in 12 international companies demonstrate a possible global standardization of operative processes, leading to the design of a near-zero-failure intelligent system that is able to learn and upgrade itself. Our new method provides guidance for selecting the new generation of intelligent manufacturing and smart systems in order to optimize human–machine interactions, with the related smart maintenance and education.
In this paper, a real-time online data-driven adaptive method is developed to deal with uncertainties such as high nonlinearity, strong coupling, parameter perturbation, and external disturbances in the attitude control of fixed-wing unmanned aerial vehicles (UAVs). Firstly, a model-free adaptive control (MFAC) method, which requires only input/output (I/O) data and no model information, is adopted for the control scheme design of the angular-velocity subsystem, which contains all the model information and the aforementioned uncertainties. Secondly, the internal model control (IMC) method, which features fewer tuning parameters and a convenient tuning process, is adopted for the control scheme design of the Euler-angle subsystem, whose model is certain. Simulation results show that the developed method is clearly superior to the cascade PID (CPID) method and the nonlinear dynamic inversion (NDI) method.
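A minimal sketch of compact-form dynamic-linearization MFAC for a SISO plant, the kind of I/O-only scheme the abstract describes: a pseudo-partial-derivative estimate is updated from input/output increments and drives the control law. The tuning gains and the toy plant are illustrative assumptions; the paper applies MFAC to the UAV angular-velocity subsystem.

```python
# Minimal compact-form dynamic-linearization MFAC sketch: only I/O data are
# used, no plant model. Gains and the toy plant below are assumed values.
import numpy as np

eta, mu, rho, lam = 0.8, 1.0, 0.6, 1.0   # MFAC tuning parameters (assumed)
steps = 200
y = np.zeros(steps); u = np.zeros(steps)
phi = np.full(steps, 0.5)                 # pseudo-partial-derivative estimate
y_ref = np.where(np.arange(steps) < 100, 1.0, -0.5)  # step reference

def plant(y_prev, u_prev):
    """Unknown nonlinear plant, visible to the controller only through I/O data."""
    return 0.6 * y_prev + 0.5 * np.tanh(u_prev) + 0.1 * np.sin(y_prev)

for k in range(1, steps - 1):
    y[k] = plant(y[k - 1], u[k - 1])
    du, dy = u[k - 1] - u[k - 2], y[k] - y[k - 1]
    # Pseudo-partial-derivative update (projection algorithm)
    phi[k] = phi[k - 1] + eta * du * (dy - phi[k - 1] * du) / (mu + du ** 2)
    if abs(phi[k]) < 1e-5 or np.sign(phi[k]) != np.sign(phi[0]):
        phi[k] = phi[0]                   # reset to keep the estimate well-posed
    # Control update driven by the tracking error
    u[k] = u[k - 1] + rho * phi[k] * (y_ref[k + 1] - y[k]) / (lam + phi[k] ** 2)

print(f"final tracking error: {abs(y_ref[-2] - y[-2]):.4f}")
```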
The complex sand-casting process, combined with the interactions between process parameters, makes it difficult to control casting quality, resulting in a high scrap rate. A strategy based on a data-driven model was proposed to reduce casting defects and improve production efficiency, which includes a random forest (RF) classification model, feature-importance analysis, and process-parameter optimization with Monte Carlo simulation. The collected data, covering four types of defects and the corresponding process parameters, were used to construct the RF model. Classification results show a recall rate above 90% for all categories. The Gini index was used to assess the importance of the process parameters in the formation of the various defects in the RF model. Finally, the classification model was applied to different production conditions for quality prediction. In the case of process-parameter optimization for gas-porosity defects, this model serves as the experimental process in the Monte Carlo method to estimate a better temperature distribution. The prediction model, when applied in the factory, greatly improved the efficiency of defect detection. Results show that the scrap rate decreased from 10.16% to 6.68%.
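A sketch of the three steps named above: an RF defect classifier, Gini-based feature importance, and a Monte Carlo search for process settings that lower the predicted gas-porosity risk. The parameter names, ranges, and defect rule are synthetic assumptions, not the paper's data.

```python
# Sketch: RF defect classifier + Gini importances + Monte Carlo parameter search
# to lower predicted gas-porosity risk. Data and ranges are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
names = ["pour_temp", "mold_moisture", "compaction", "vent_count"]  # assumed
X = np.column_stack([rng.uniform(1320, 1450, 2000), rng.uniform(2, 6, 2000),
                     rng.uniform(70, 95, 2000), rng.integers(2, 10, 2000)])
# Synthetic rule: porosity more likely at low temperature and high moisture
p = 1 / (1 + np.exp(0.05 * (X[:, 0] - 1380) - 1.2 * (X[:, 1] - 4)))
y = rng.uniform(size=2000) < p                     # 1 = gas-porosity defect

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
for n, imp in sorted(zip(names, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{n}: Gini importance {imp:.3f}")

# Monte Carlo optimization: sample candidate settings, keep the lowest-risk one
cand = np.column_stack([rng.uniform(1320, 1450, 5000), rng.uniform(2, 6, 5000),
                        rng.uniform(70, 95, 5000), rng.integers(2, 10, 5000)])
risk = rf.predict_proba(cand)[:, 1]
best = cand[np.argmin(risk)]
print("low-risk setting:", dict(zip(names, best.round(2))), f"risk={risk.min():.3f}")
```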
Big Bang nucleosynthesis (BBN) theory predicts the primordial abundances of the light elements ²H (referred to as deuterium, or D for short), ³He, ⁴He, and ⁷Li produced in the early universe. Among these, deuterium, the first nuclide produced by BBN, is a key primordial material for subsequent reactions. To date, the uncertainty in the predicted deuterium abundance (D/H) remains larger than the observational precision. In this study, the Monte Carlo simulation code PRIMAT was used to investigate the sensitivity of 11 important BBN reactions to the deuterium abundance. We found that the reaction-rate uncertainties of the four reactions d(d,n)³He, d(d,p)t, d(p,γ)³He, and p(n,γ)d had the largest influence on the calculated D/H uncertainty. Currently, the calculated D/H uncertainty cannot reach the observational precision even with the recent precise LUNA d(p,γ)³He rate. From the nuclear-physics side, there is still room to reduce the reaction-rate uncertainties substantially; hence, further measurements of the important reactions involved in BBN are still necessary. A photodisintegration experiment will be conducted at the Shanghai Laser Electron Gamma Source facility to precisely study the deuterium production reaction p(n,γ)d.
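A toy illustration of the Monte Carlo sensitivity idea (not PRIMAT): each reaction rate is sampled within an assumed uncertainty, propagated through a mock power-law response for D/H, and ranked by its contribution to the output scatter. The sensitivity exponents and fractional uncertainties below are placeholders standing in for what a real BBN code would produce.

```python
# Toy Monte Carlo rate-sensitivity sketch (not PRIMAT). All exponents and
# fractional uncertainties are placeholder values for illustration only.
import numpy as np

rng = np.random.default_rng(5)
reactions = ["d(d,n)3He", "d(d,p)t", "d(p,g)3He", "p(n,g)d"]
rate_sigma = np.array([0.011, 0.011, 0.02, 0.01])   # assumed fractional 1-sigma

# Lognormal rate multipliers, one column per reaction
f = np.exp(rng.normal(0.0, rate_sigma, size=(20000, 4)))

# Mock power-law response of D/H to each rate (exponents are placeholders,
# standing in for the sensitivity coefficients a real BBN code would produce)
exponents = np.array([-0.55, -0.47, -0.32, 0.59])
dh = 2.5e-5 * np.prod(f ** exponents, axis=1)

print(f"D/H = {dh.mean():.3e} +/- {dh.std():.1e}")
for name, e, s in zip(reactions, exponents, rate_sigma):
    # Per-reaction scatter share under the power-law model: |exponent| * sigma
    print(f"{name}: contributes ~{abs(e) * s * 100:.2f}% scatter to D/H")
```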
Data-driven computing in elasticity attempts to use experimental material data directly, without constructing an empirical model of the constitutive relation, to predict the equilibrium state of a structure subjected to a specified external load. Provided that a data set comprising stress-strain pairs of a material is available, a data-driven method using the kernel method and regularized least squares was developed to extract a manifold on which the points in the data set approximately lie (Kanno 2021, Jpn. J. Ind. Appl. Math.). From the perspective of physical experiments, the stress field cannot be measured directly, while the displacement and force fields are measurable. In this study, we extend the previous kernel method to the situation in which pairs of displacement and force, instead of pairs of stress and strain, are available as the input data set. A new regularized least-squares problem is formulated for this setting, and an alternating minimization algorithm is proposed to solve it.
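A much-simplified flavor of the kernel-plus-regularized-least-squares ingredient, shown only to fix ideas: fitting a smooth constitutive response from noisy stress-strain pairs with kernel ridge regression. The paper's actual formulation (manifold extraction from displacement-force data with alternating minimization) is considerably richer than this; the material law and numbers are assumed.

```python
# Simplified flavor only: kernel regularized least squares (kernel ridge) fit
# to noisy synthetic stress-strain pairs. Not the paper's actual formulation.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(6)
strain = rng.uniform(0, 0.05, 200).reshape(-1, 1)
stress = 200e3 * strain.ravel() - 1.5e6 * strain.ravel() ** 2  # softening law
stress += rng.normal(0, 50, 200)                               # measurement noise

# Gaussian-kernel regularized least squares; alpha is the regularization weight
model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=2e3).fit(strain, stress)

eps_query = np.array([[0.01], [0.03]])
print("predicted stress:", model.predict(eps_query).round(1))
```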
Big data can be utilized in work and life. Asset appraisal can also make full use of big data to improve the efficiency and effectiveness of appraisal. This paper studies the application of big data in different fields to learn how big data works in practice and what the effects are of utilizing this new tool. The paper then applies big data to appraisal in specific work environments. By collecting information, reviewing the literature, and practicing with appraisers, this paper finds some means of improving the market method by making full use of big data. The article carries the research further by applying the method in different asset-appraisal projects. Real estate, intangible assets, corporate valuations, and machinery can all be valued with the market method improved by big data. There are various details for appraisers to attend to in practical work. Some companies have already put the technology into practice and achieved great benefits, which makes the application of big data meaningful.
The big data era is coming, and it influences every aspect of human life, such as working, studying, and shopping. Data can be uploaded and recorded by digital devices such as smartphones and tablets. The volume of data can provide useful information to help learn the habits of human beings and improve the efficiency of work. The domain of asset appraisal can make full use of big data to collect and sort information involving the appraised asset and the market. On the one hand, the market method of asset appraisal needs plenty of information on reference assets and industry development. On the other hand, big data, with its traits of volume and velocity, can be utilized to collect this information. The paper reveals that taking advantage of big data in asset appraisal using the market method is an evolutionary process in which a gradual understanding of the potential of big data plays a crucial role.
The database research method is a method that analyzes, generalizes, and deduces from data on the subject investigated, using database techniques, quantitative statistics, and mathematical models. As the big data age arrives with the data explosion in modern society, International Chinese Language Teaching (ICLT) shows signs of sizable data accumulation, remarkable economic properties, strong modeling requirements, and notable cross-research trends, which make this method necessary as a new and independent research method in this area. Theoretical bases, application areas, available software and data resources, and research program designs, as well as their advantages and disadvantages, are set out in this paper. In the near future, this method will bring about a revolution in International Chinese Language Teaching.