Journal Articles
107,874 articles found
1. Spatio-Temporal Earthquake Analysis via Data Warehousing for Big Data-Driven Decision Systems
Authors: Georgia Garani, George Pramantiotis, Francisco Javier Moreno Arboleda. Computers, Materials & Continua, 2026, No. 3, pp. 1963-1988 (26 pages)
Earthquakes are highly destructive spatio-temporal phenomena whose analysis is essential for disaster preparedness and risk mitigation. Modern seismological research produces vast volumes of heterogeneous data from seismic networks, satellite observations, and geospatial repositories, creating the need for scalable infrastructures capable of integrating and analyzing such data to support intelligent decision-making. Data warehousing technologies provide a robust foundation for this purpose; however, existing earthquake-oriented data warehouses remain limited, often relying on simplified schemas, domain-specific analytics, or cataloguing efforts. This paper presents the design and implementation of a spatio-temporal data warehouse for seismic activity. The framework integrates spatial and temporal dimensions in a unified schema and introduces a novel array-based approach for managing many-to-many relationships between facts and dimensions without intermediate bridge tables. A comparative evaluation against a conventional bridge-table schema demonstrates that the array-based design improves fact-centric query performance, while the bridge-table schema remains advantageous for dimension-centric queries. To reconcile these trade-offs, a hybrid schema is proposed that retains both representations, ensuring balanced efficiency across heterogeneous workloads. The proposed framework demonstrates how spatio-temporal data warehousing can address schema complexity, improve query performance, and support multidimensional visualization. In doing so, it provides a foundation for integrating seismic analysis into broader big data-driven intelligent decision systems for disaster resilience, risk mitigation, and emergency management.
Keywords: data warehouse; data analysis; big data; decision systems; seismology; data visualization
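The abstract's array-based versus bridge-table contrast can be sketched with plain data structures. This is a minimal illustration under invented names (`facts_array`, `bridge`, `regions` are not the paper's schema):

```python
# Illustrative comparison of the two schema styles described in the abstract.
# All names and values here are invented for this sketch.
regions = {1: "Thessaly", 3: "Attica"}

# Array-based design: the fact row carries its region keys directly.
facts_array = {101: {"magnitude": 6.1, "region_ids": [1, 3]}}

# Conventional design: a separate bridge table holds (fact_id, region_id) pairs.
bridge = [(101, 1), (101, 3)]

def regions_of_fact_array(fact_id):
    # Fact-centric query: one dictionary lookup, no join needed.
    return [regions[r] for r in facts_array[fact_id]["region_ids"]]

def regions_of_fact_bridge(fact_id):
    # Fact-centric query via the bridge: scan/join over the association rows.
    return [regions[r] for f, r in bridge if f == fact_id]
```

The array-based row answers a fact-centric query with a single lookup, while the bridge design must scan or join the association table; the hybrid schema described above keeps both representations.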
2. Current Situation of Application and Development Prospects of the Statistical Analysis of Big Data
Author: Zhuoran LI. Meteorological and Environmental Research, 2026, No. 1, pp. 45-47 (3 pages)
With the advent of the big data era, modern statistics has enjoyed unprecedented development opportunities and also faced numerous new challenges. Traditional statistical computing methods are often limited by issues such as computer memory capacity and the distributed storage of data across different locations, and cannot be applied directly to large-scale data sets. Therefore, in the context of big data, designing efficient and theoretically guaranteed statistical learning and inference algorithms has become a key issue that the field of statistics urgently needs to address. This paper systematically reviews the application status of statistical analysis methods in the big data environment and analyzes future development directions, providing reference and support for the further development of the theory and methods of the statistical analysis of big data.
Keywords: big data; statistical analysis; current status; development prospects
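The memory and distribution constraint described above is often met with divide-and-conquer estimators: compute local statistics per node and combine them, rather than loading all data into one memory. A toy sketch (illustrative only, not from the paper):

```python
# Divide-and-conquer estimation sketch: each chunk simulates data held
# on a separate machine; only per-chunk summaries are combined.
def distributed_mean(chunks):
    total = sum(sum(c) for c in chunks)  # combine local sums
    n = sum(len(c) for c in chunks)      # combine local counts
    return total / n

result = distributed_mean([[1, 2, 3], [4, 5], [6]])  # 3.5
```

For the mean this combination is exact; for more complex statistics, the challenge the abstract points to is designing combiners with theoretical guarantees.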
3. A Virtual Probe Deployment Method Based on User Behavioral Feature Analysis
Authors: Bing Zhang, Wenqi Shi. Computers, Materials & Continua, 2026, No. 2, pp. 2017-2035 (19 pages)
To address the challenge of low survival rates and limited data collection efficiency in current virtual probe deployments, which results from anomaly detection mechanisms in location-based service (LBS) applications, this paper proposes a novel virtual probe deployment method based on user behavioral feature analysis. The core idea is to circumvent LBS anomaly detection by mimicking real-user behavior patterns. First, we design an automated data extraction algorithm that recognizes graphical user interface (GUI) elements to collect spatio-temporal behavior data. Then, by analyzing the automatically collected user data, we identify normal users' spatio-temporal patterns and extract features such as high-activity time windows and spatial clustering characteristics. Subsequently, an anti-detection scheduling strategy is developed, integrating spatial clustering optimization, load-balanced allocation, and time window control to generate probe scheduling schemes. Additionally, a self-correction mechanism based on an exponential backoff strategy is implemented to rectify anomalous behaviors and maintain system stability. Experiments in real-world environments demonstrate that the proposed method significantly outperforms baseline methods in terms of both probe ban rate and task completion rate, while maintaining high time efficiency. This study provides a more reliable and clandestine solution for geosocial data collection and lays the foundation for building more robust virtual probe systems.
Keywords: virtual probe; behavior feature analysis; anomaly detection; scheduling strategy; geosocial data collection
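The self-correction mechanism above is described as exponential backoff: after an anomaly is flagged, each retry waits progressively longer. A minimal sketch of such a schedule (the base, factor, and cap values are invented, not the paper's):

```python
# Exponential-backoff schedule sketch; parameter values are illustrative.
def backoff_delays(base=1.0, factor=2.0, retries=5, cap=30.0):
    """Delay (in seconds) before each successive retry, capped at `cap`."""
    return [min(cap, base * factor ** i) for i in range(retries)]

delays = backoff_delays()  # [1.0, 2.0, 4.0, 8.0, 16.0]
```

The cap keeps a long run of anomalies from stalling the probe indefinitely, which matches the stability goal stated in the abstract.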
4. GranuSAS: Software of rapid particle size distribution analysis from small angle scattering data
Authors: Qiaoyu Guo, Fei Xie, Xuefei Feng, Zhe Sun, Changda Wang, Xuechen Jiao. Chinese Physics B, 2026, No. 2, pp. 216-225 (10 pages)
Small angle x-ray scattering (SAXS) is an advanced technique for characterizing the particle size distribution (PSD) of nanoparticles. However, the ill-posed nature of inverse problems in SAXS data analysis often reduces the accuracy of conventional methods. This article proposes a user-friendly software for PSD analysis, GranuSAS, which employs an algorithm that integrates truncated singular value decomposition (TSVD) with the Chahine method. This approach employs TSVD for data preprocessing, generating a set of initial solutions with noise suppression. A high-quality initial solution is subsequently selected via the L-curve method. This selected candidate solution is then iteratively refined by the Chahine algorithm, enforcing constraints such as non-negativity and improving physical interpretability. Most importantly, GranuSAS employs a parallel architecture that simultaneously yields inversion results from multiple shape models and, by evaluating the accuracy of each model's reconstructed scattering curve, offers a suggestion for model selection in material systems. To systematically validate the accuracy and efficiency of the software, verification was performed using both simulated and experimental datasets. The results demonstrate that the proposed software delivers both satisfactory accuracy and reliable computational efficiency. It provides an easy-to-use and reliable tool for researchers in materials science, helping them fully exploit the potential of SAXS in nanoparticle characterization.
Keywords: small angle x-ray scattering; data analysis software; particle size distribution; inverse problem
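The Chahine refinement mentioned above is a multiplicative iteration: the distribution is rescaled by the ratio of measured to modeled intensities, which automatically preserves non-negativity. A toy version for a square kernel (illustrative, not GranuSAS code; it assumes one measurement channel per size bin):

```python
# Toy Chahine-style multiplicative update (not GranuSAS's implementation).
# Assumes a square kernel: measurement channel i is dominated by size bin i.
def chahine_step(f, kernel, measured):
    n = len(f)
    # Forward model: intensities predicted by the current distribution.
    modeled = [sum(kernel[i][j] * f[j] for j in range(n)) for i in range(n)]
    # Multiplicative correction keeps the distribution non-negative.
    return [f[j] * measured[j] / modeled[j] for j in range(n)]

# With an identity kernel, one step reproduces the measurements exactly.
f_new = chahine_step([1.0, 1.0], [[1.0, 0.0], [0.0, 1.0]], [2.0, 3.0])
```

In the software's pipeline, the starting `f` would come from the TSVD/L-curve preprocessing rather than a flat guess.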
5. Bending Analysis of Functionally Graded Material and Cracked Homogeneous Thin Plates Using Meshfree Numerical Manifold Method
Authors: Shouyang Huang, Hong Zheng, Xuguang Yu, Ziheng Li, Zhiwei Pan. Computer Modeling in Engineering & Sciences, 2026, No. 3, pp. 304-340 (37 pages)
Functionally graded material (FGM) plates are widely used in various engineering structures owing to their tailor-made mechanical properties, whereas cracked homogeneous plates constitute a canonical setting in fracture mechanics analysis. These two classes of problems respectively embody material non-uniformity and geometric discontinuity, thereby imposing more stringent requirements on numerical methods in terms of high-order field continuity and accurate defect representation. Based on the classical Kirchhoff-Love plate theory, a numerical manifold method (MLS-NMM) incorporating moving least squares (MLS) interpolation is developed for bending analysis of FGM plates and fracture simulation of homogeneous plates with defects. The method constructs an H^(2)-regular approximation with high-order continuous weighting functions and, combined with the separation of mathematical and physical covers, establishes a unified framework that accurately handles material gradients and cracks without mesh reconstruction. For the crack tip, a singular physical cover incorporating the Williams asymptotic field is introduced to achieve local enrichment, enabling the natural capture of displacement discontinuity and stress singularity. Stress intensity factors are extracted using the interaction integral method, and the dimensionless J-integral shows a maximum relative error below 1.2% compared with the reference solution. Numerical results indicate that MLS-NMM exhibits excellent convergence performance: using 676 mathematical nodes, the nondimensional central deflection of both FGM and homogeneous plates agrees with reference solutions with a maximum relative error below 0.81%, and no shear locking occurs. A systematic analysis reveals that for a square FGM plate simply supported on all four edges (SSSS) with a/h = 10, the nondimensional central deflection increases by 212% as the gradient index n rises from 0 to 5. For a homogeneous plate containing a central crack with c/a = 0.6, the nondimensional central deflection increases by approximately 46% compared with the intact plate. Under weak boundary constraints (e.g., SFSF), the deformation is markedly amplified, with the deflection reaching more than three times that under strong constraints (SCSC). The proposed method provides an efficient, reconstruction-free numerical tool for high-accuracy bending and fracture analyses of FGM and cracked thin-plate structures.
Keywords: Kirchhoff-Love plate theory; functionally graded materials; moving least squares interpolation; numerical manifold method; bending analysis; fracture mechanics; stress intensity factor
6. Bayesian analysis of Gamow resonances with reduced basis methods: from eigenvector continuation to post-emulation corrections
Authors: Ruo-Yu Cheng, Zhi-Cheng Xu. Nuclear Science and Techniques, 2025, No. 12, pp. 233-243 (11 pages)
To study the uncertainty quantification of resonant states in open quantum systems, we developed a Bayesian framework by integrating a reduced basis method (RBM) emulator with the Gamow coupled-channel (GCC) approach. The RBM, constructed via eigenvector continuation and trained on both bound and resonant configurations, enables the fast and accurate emulation of resonance properties across the parameter space. To identify the physical resonant states from the emulator's output, we introduce an overlap-based selection technique that effectively isolates true solutions from background artifacts. By applying this framework to the unbound nucleus ^(6)Be, we quantified the model uncertainty in the predicted complex energies. The results demonstrate relative errors of 17.48% in the real part and 8.24% in the imaginary part, while achieving a speedup of four orders of magnitude compared with the full GCC calculations. To further investigate the asymptotic behavior of the resonant-state wavefunctions within the RBM framework, we employed a Lippmann–Schwinger (L–S)-based correction scheme. This approach not only improves the consistency between eigenvalues and wavefunctions but also enables a seamless extension from real-space training data to the complex energy plane. By bridging the gap between bound-state and continuum regimes, the L–S correction significantly enhances the emulator's capability to accurately capture continuum structures in open quantum systems.
Keywords: uncertainty quantification; reduced basis method; resonance emulator; Bayesian analysis; Gamow coupled-channel model
7. A method to address the challenges of charging conditions on incremental capacity analysis: An ICA-compensation technique incorporating current interrupt methods
Authors: Jinghua Sun, Josef Kainz. Journal of Energy Chemistry, 2025, No. 9, pp. 65-80 + I0004 (17 pages)
The incremental capacity analysis (ICA) technique is notably limited by its sensitivity to variations in charging conditions, which constrains its practical applicability in real-world scenarios. This paper introduces an ICA-compensation technique to address this limitation and proposes a generalized framework for assessing the state of health (SOH) of batteries based on ICA that is applicable under differing charging conditions. This novel approach calculates the voltage profile under quasi-static conditions by subtracting the voltage increase attributable to the additional polarization effects at high currents from the measured voltage profile. This approach's efficacy is contingent upon precisely acquiring the equivalent impedance. To obtain the equivalent impedance throughout the batteries' lifespan while minimizing testing costs, this study employs a current interrupt technique in conjunction with a long short-term memory (LSTM) network to develop a predictive model for equivalent impedance. Following the derivation of ICA curves using voltage profiles under quasi-static conditions, the research explores two scenarios for SOH estimation: one utilizing only incremental capacity (IC) features and the other incorporating both IC features and IC sampling. A genetic algorithm-optimized backpropagation neural network (GABPNN) is employed for the SOH estimation. The proposed generalized framework is validated using independent training and test datasets. Variable test conditions are applied for the test set to rigorously evaluate the methodology under challenging conditions. The evaluation results demonstrate that the proposed framework achieves an estimation accuracy of 1.04% for RMSE and 0.90% for MAPE across a spectrum of charging rates ranging from 0.1 C to 1 C and starting SOCs between 0% and 70%, which constitutes a major advancement compared to established ICA methods. It also significantly enhances the applicability of conventional ICA techniques in varying charging conditions and negates the necessity for separate testing protocols for each charging scenario.
Keywords: lithium-ion batteries; incremental capacity analysis; charging conditions; state of health; current interrupt method
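The core ICA quantity is the incremental capacity dQ/dV computed from a charge curve. A minimal sketch, with a simple ohmic `current * r` term standing in as a hypothetical placeholder for the paper's polarization compensation (the actual method uses a predicted equivalent impedance):

```python
# Minimal incremental-capacity (dQ/dV) sketch. The `current * r` correction
# is a hypothetical ohmic stand-in for the paper's polarization compensation.
def ic_curve(q, v, current=0.0, r=0.0):
    v_corr = [vi - current * r for vi in v]  # shift toward quasi-static voltage
    return [(q[i + 1] - q[i]) / (v_corr[i + 1] - v_corr[i])
            for i in range(len(q) - 1)]

# Toy charge curve: capacity q (Ah) vs. terminal voltage v (V).
ic = ic_curve([0.0, 1.0, 2.0, 3.0], [3.0, 3.2, 3.3, 3.5])  # ≈ [5, 10, 5]
```

Peaks in the resulting curve (here the middle value) correspond to voltage plateaus, which is what the SOH features described above are extracted from.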
8. Discriminating Among Relatively Efficient Units in Data Envelopment Analysis: A Comparison of Alternative Methods and Some Extensions (Cited: 1)
Author: Antreas D. Athanassopoulos. American Journal of Operations Research, 2012, No. 1, pp. 1-9 (9 pages)
This paper concentrates on methods for comparing activity units found relatively efficient by data envelopment analysis (DEA). The use of the basic DEA models does not provide direct information regarding the performance of such units. The paper provides a systematic framework of alternative ways for ranking DEA-efficient units. The framework contains criteria derived as by-products of the basic DEA models and also criteria derived from complementary DEA analysis that needs to be carried out. The proposed framework is applied to rank a set of relatively efficient restaurants on the basis of their market efficiency.
Keywords: data envelopment analysis; cross-efficiency; super-efficiency; absolute ranking; linear programming
9. Intelligent Electrocardiogram Analysis in Medicine: Data, Methods, and Applications
Authors: Yu-Xia Guan, Ying An, Feng-Yi Guo, Wei-Bai Pan, Jian-Xin Wang. Chinese Medical Sciences Journal (CAS, CSCD), 2023, No. 1, pp. 38-48 (11 pages)
Electrocardiogram (ECG) is a low-cost, simple, fast, and non-invasive test. It can reflect the heart's electrical activity and provide valuable diagnostic clues about the health of the entire body. Therefore, ECG has been widely used in various biomedical applications such as arrhythmia detection, disease-specific detection, mortality prediction, and biometric recognition. In recent years, ECG-related studies have been carried out using a variety of publicly available datasets, with many differences in the datasets used, data preprocessing methods, targeted challenges, and modeling and analysis techniques. Here we systematically summarize and analyze the ECG-based automatic analysis methods and applications. Specifically, we first reviewed 22 commonly used ECG public datasets and provided an overview of data preprocessing processes. Then we described some of the most widely used applications of ECG signals and analyzed the advanced methods involved in these applications. Finally, we elucidated some of the challenges in ECG analysis and provided suggestions for further research.
Keywords: electrocardiogram; database; preprocessing; machine learning; medical big data analysis
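A typical ECG preprocessing step surveyed in reviews like this one is baseline-wander removal. A minimal moving-average sketch (illustrative only, not taken from the paper):

```python
# Baseline-wander removal sketch: estimate the slow trend with a causal
# moving average, then subtract it from the raw signal.
def moving_average(x, w):
    # Window of up to w samples ending at index i.
    return [sum(x[max(0, i - w + 1):i + 1]) / (i - max(0, i - w + 1) + 1)
            for i in range(len(x))]

def detrend(x, w=3):
    return [xi - bi for xi, bi in zip(x, moving_average(x, w))]
```

In practice the window would be chosen relative to the sampling rate so that QRS complexes are preserved while the drifting baseline is removed; many pipelines use high-pass filtering instead.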
10. Collecting Statistical Methods for the Analysis of Climate Data as Service for Adaptation Projects
Authors: Barbara Hennemuth, Steffen Bender, Katharina Bülow, Norman Dreier, Peter Hoffmann, Elke Keup-Thiel, Christoph Mudersbach. American Journal of Climate Change, 2015, No. 1, pp. 9-21 (13 pages)
The development of adaptation measures to climate change relies on data from climate models or impact models. In order to analyze these large data sets or an ensemble of these data sets, the use of statistical methods is required. In this paper, the methodological approach to collecting, structuring and publishing the methods, which have been used or developed by former or present adaptation initiatives, is described. The intention is to communicate achieved knowledge and thus support future users. A key component is the participation of users in the development process. Main elements of the approach are standardized, template-based descriptions of the methods including the specific applications, references, and method assessment. All contributions have been quality checked, sorted, and placed in a larger context. The result is a report on statistical methods which is freely available as printed or online version. Examples of how to use the methods are presented in this paper and are also included in the brochure.
Keywords: statistical methods; collection; climate data; climate adaptation
11. Integrated interpretation of dual frequency induced polarization measurement based on wavelet analysis and metal factor methods (Cited: 3)
Authors: 韩世礼, 张术根, 柳建新, 胡厚继, 张文山. Transactions of Nonferrous Metals Society of China (SCIE, EI, CAS, CSCD), 2013, No. 5, pp. 1465-1471 (7 pages)
In mineral exploration, the apparent resistivity and apparent frequency (or apparent polarizability) parameters of the induced polarization method are commonly utilized to describe induced polarization anomalies. When the target geological structure is significantly complicated, these parameters may fail to reflect the nature of the anomaly source, and wrong conclusions may be drawn. A wavelet approach and a metal factor method were used to comprehensively interpret the induced polarization anomaly of complex geologic bodies in the Adi Bladia mine. The db5 wavelet basis was used to conduct two-scale decomposition and reconstruction, which effectively suppressed the noise interference of greenschist-facies regional metamorphism and magma intrusion, concentrating the energy and rendering the boundary problem unobservable. On that basis, the ore-induced anomaly was effectively extracted by the metal factor method.
Keywords: dual frequency induced polarization method; wavelet analysis; metal factor; Arabian-Nubian shield; volcanogenic massive sulfide deposit
12. Statistical Methods of SNP Data Analysis and Applications
Authors: Alexander Bulinski, Oleg Butkovsky, Victor Sadovnichy, Alexey Shashkin, Pavel Yaskov, Alexander Balatskiy, Larisa Samokhodskaya, Vsevolod Tkachuk. Open Journal of Statistics, 2012, No. 1, pp. 73-87 (15 pages)
We develop various statistical methods important for multidimensional genetic data analysis. Theorems justifying the application of these methods are established. We concentrate on multifactor dimensionality reduction, logic regression, random forests, and stochastic gradient boosting, along with their new modifications. We use complementary approaches to study the risk of complex diseases such as cardiovascular ones. The roles of certain combinations of single nucleotide polymorphisms and non-genetic risk factors are examined. To perform the data analysis concerning coronary heart disease and myocardial infarction, the Lomonosov Moscow State University supercomputer "Chebyshev" was employed.
Keywords: genetic data; statistical analysis; multifactor dimensionality reduction; ternary logic regression; random forests; stochastic gradient boosting; independent rule; single nucleotide polymorphisms; coronary heart disease; myocardial infarction
13. Systematic Review of Graphical Visual Methods in Honeypot Attack Data Analysis
Authors: Gbenga Ikuomenisan, Yasser Morgan. Journal of Information Security, 2022, No. 4, pp. 210-243 (34 pages)
Mitigating increasing cyberattack incidents may require strategies such as reinforcing organizations' networks with honeypots and effectively analyzing attack traffic for detection of zero-day attacks and vulnerabilities. To effectively detect and mitigate cyberattacks, both computerized and visual analyses are typically required. However, most security analysts are not adequately trained in visualization principles and/or methods, which is required for effective visual perception of useful attack information hidden in attack data. Additionally, honeypots have proven useful in cyberattack research, but no studies have comprehensively investigated visualization practices in the field. In this paper, we reviewed visualization practices and methods commonly used in the discovery and communication of attack patterns based on honeypot network traffic data. Using the PRISMA methodology, we identified and screened 218 papers and evaluated only 37 papers having a high impact. Most honeypot papers conducted summary statistics of honeypot data based on static data metrics such as IP address, port, and packet size. They visually analyzed honeypot attack data using simple graphical methods (such as line, bar, and pie charts) that tend to hide useful attack information. Furthermore, only a few papers conducted extended attack analysis, and commonly visualized attack data using scatter and linear plots. Papers rarely included simple yet sophisticated graphical methods, such as box plots and histograms, which allow for critical evaluation of analysis results. While a significant number of automated visualization tools have incorporated visualization standards by default, the construction of effective and expressive graphical methods for easy pattern discovery and explainable insights still requires applied knowledge and skill of visualization principles and tools, and occasionally, an interdisciplinary collaboration with peers. We, therefore, suggest the need, going forward, for non-classical graphical methods for visualizing attack patterns and communicating analysis results. We also recommend training investigators in visualization principles and standards for effective visual perception and presentation.
Keywords: honeypot data analysis; network intrusion detection; visualization and visual analysis; graphical methods and perception; systematic literature review
14. Techno-Economic and Sustainability Analysis of Potential Cooling Methods in Irish Data Centres
Authors: Lee Gibbons, Tim Persoons, Sajad Alimohammadi. Journal of Electronics Cooling and Thermal Control, 2021, No. 3, pp. 35-54 (20 pages)
11% of Irish electricity was consumed by data centres in 2020. The Irish data centre industry and the cooling methods it utilises require reformative actions in the coming years to meet EU energy policies. The resale of heat, alternative cooling methods, or carbon reduction methods are all possibilities for conforming to these policies. This study aims to determine the technical and economic viability of reselling waste heat from data centres. This was determined using a novel application of thermodynamics to quantify the waste heat recovery potential in Irish data centres, together with an economic comparison against current methods of heat generation. The paper also explores policy surrounding waste heat recovery within the industry. The Recoverable Carnot Equivalent Power (RCEP), the maximum useable heat that can be recovered from a data centre rack, is theoretically calculated for the three potential cooling methods for Irish data centres: air, hybrid, and immersion cooling. The study establishes, under current operating conditions optimised for cooling performance, that air cooling has the highest potential RCEP of 0.39 kW/rack, approximately 8% of the input electrical power captured as useable heat. This indicates that Irish data centres have the energy potential to be heat providers in the Irish economy. The study highlights the technical and economic aspects of prevalent cooling techniques and determines that the air cooling heat recovery cost can be reduced to 0.01 €/kWhth using offsetting, which is financially competitive with current heating solutions in Ireland.
Keywords: Ireland; data centres; techno-economic; novel cooling methods; heat resell; sustainability; energy demand
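The Carnot-equivalent idea behind RCEP can be sketched as recovered heat scaled by the Carnot factor between the source and ambient temperatures. The paper's exact RCEP definition and operating temperatures are not reproduced here, so treat this as a hedged illustration:

```python
# Hedged sketch: work-equivalent of recovered waste heat via the Carnot
# factor 1 - T_ambient / T_source (temperatures in kelvin). The paper's
# precise RCEP formulation may differ from this simplification.
def carnot_equivalent_power(q_waste_kw, t_source_k, t_ambient_k):
    return q_waste_kw * (1.0 - t_ambient_k / t_source_k)

# Example with invented numbers: 5 kW of low-grade heat at 310 K vs. 293 K.
example_kw = carnot_equivalent_power(5.0, 310.0, 293.0)
```

Because data-centre exhaust heat is low-grade (source and ambient temperatures are close), the Carnot factor is small, which is consistent with only a fraction of input power being recoverable as useable heat.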
15. Data Analysis Methods and Signal Processing Techniques in Gravitational Wave Detection
Author: Bojun Yan. Journal of Applied Mathematics and Physics, 2024, No. 11, pp. 3774-3783 (10 pages)
Gravitational wave detection is one of the most cutting-edge research areas in modern physics, with its success relying on advanced data analysis and signal processing techniques. This study provides a comprehensive review of data analysis methods and signal processing techniques in gravitational wave detection. The research begins by introducing the characteristics of gravitational wave signals and the challenges faced in their detection, such as extremely low signal-to-noise ratios and complex noise backgrounds. It then systematically analyzes the application of time-frequency analysis methods in extracting transient gravitational wave signals, including wavelet transforms and Hilbert-Huang transforms. The study focuses on discussing the crucial role of matched filtering techniques in improving signal detection sensitivity and explores strategies for template bank optimization. Additionally, the research evaluates the potential of machine learning algorithms, especially deep learning networks, in rapidly identifying and classifying gravitational wave events. The study also analyzes the application of Bayesian inference methods in parameter estimation and model selection, as well as their advantages in handling uncertainties. However, the research also points out the challenges faced by current technologies, such as dealing with non-Gaussian noise and improving computational efficiency. To address these issues, the study proposes a hybrid analysis framework combining physical models and data-driven methods. Finally, the research looks ahead to the potential applications of quantum computing in future gravitational wave data analysis. This study provides a comprehensive theoretical foundation for the optimization and innovation of gravitational wave data analysis methods, contributing to the advancement of gravitational wave astronomy.
Keywords: gravitational wave detection; data analysis; signal processing; matched filtering; machine learning
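Matched filtering, highlighted above, correlates the data stream with a known waveform template at every lag and reports the best-matching offset. A bare-bones sketch (illustrative; a real detector pipeline works with noise-weighted inner products in the frequency domain):

```python
# Bare-bones matched filter: slide the template across the data and
# keep the lag with the largest correlation score.
def matched_filter(data, template):
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(data) - len(template) + 1):
        score = sum(d * t for d, t in zip(data[lag:], template))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag, best_score

# The template [1, 2, 1] aligns with the buried pulse at offset 2.
lag, score = matched_filter([0, 0, 1, 2, 1, 0, 0], [1, 2, 1])
```

Template-bank optimization, as discussed in the abstract, amounts to choosing a set of such templates that covers the signal parameter space with acceptable loss in this correlation score.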
16. Artificial intelligence-assisted non-metallic inclusion particle analysis in advanced steels using machine learning: A review
Authors: Gonghao Lian, Xiaoming Liu, Qiang Wang, Chunguang Shen, Yi Wang, Wangzhong Mu. International Journal of Minerals, Metallurgy and Materials, 2026, No. 2, pp. 401-416 (16 pages)
The detection and characterization of non-metallic inclusions are essential for clean steel production. Recently, imaging analysis combined with high-dimensional data processing of metallic materials using artificial intelligence (AI)-based machine learning (ML) has developed rapidly. This technique has achieved impressive results in the field of inclusion classification in process metallurgy. The present study surveys the ML modeling of inclusion prediction in advanced steels, including the detection, classification, and feature prediction of inclusions in different steel grades. Studies on clean steel with different features based on data and image analysis via ML are summarized. Regarding the data analysis, the inclusion prediction methodology based on ML establishes a connection between the experimental parameters and inclusion characteristics and analyzes the importance of the experimental parameters. Regarding the image analysis, the focus is placed on the classification of different types of inclusions via deep learning, in comparison with data analysis. Finally, further development of inclusion analyses using ML-based methods is recommended. This work paves the way for the application of AI-based methodologies for ultraclean-steel studies from a sustainable metallurgy perspective.
Keywords: machine learning; inclusion classification; image analysis; data analysis; clean steel
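The ML-based inclusion classification surveyed in this entry can be illustrated with a deliberately minimal sketch: a nearest-centroid classifier over two hypothetical morphological features (equivalent diameter, aspect ratio). The classes, feature values, and training points below are illustrative assumptions, not data from the review:

```python
import math

# Hypothetical feature vectors (equivalent diameter in um, aspect ratio) for two
# inclusion classes often discussed in clean-steel studies; values are invented.
TRAIN = {
    "oxide":   [(2.1, 1.1), (3.0, 1.2), (2.5, 1.0)],
    "sulfide": [(4.0, 3.5), (5.2, 4.1), (4.6, 3.8)],
}

def centroid(points):
    """Mean feature vector of a class."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

def classify(sample):
    """Nearest-centroid rule: assign the class whose mean feature vector is closest."""
    cents = {label: centroid(pts) for label, pts in TRAIN.items()}
    return min(cents, key=lambda lb: math.dist(sample, cents[lb]))

print(classify((2.4, 1.1)))   # compact, near-spherical -> "oxide"
print(classify((4.9, 3.9)))   # elongated -> "sulfide"
```

Real studies in the survey use deep networks on micrographs; this sketch only shows the shared core idea of mapping measured inclusion features to a class label.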
Bibliometric analysis of papers on inflammation in glaucoma from 2000 to 2025
17
Authors: Wen-Li Chen, Xue Wu, Li-Xia Zhang 《International Journal of Ophthalmology (English edition)》 2026, No. 3: 590-599 (10 pages)
AIM: To perform a bibliometric analysis of publications focusing on inflammatory mechanisms in glaucoma, thereby comprehensively understanding the current research status and identifying potential frontier directions for future studies. METHODS: A systematic search was conducted in the Web of Science Core Collection (WoSCC) database to retrieve relevant literature published from January 1, 2000, to August 31, 2025 (data accessed on September 12, 2025). Multiple data visualization tools were employed to conduct in-depth analyses of the included publications, covering aspects such as publication quantity and quality, evolutionary trends of research hotspots, keyword co-occurrence networks, and collaborative patterns among countries/regions, institutions, and authors. RESULTS: A total of 3381 articles related to glaucoma inflammation were extracted from WoSCC. The analysis showed that the USA had the highest research output in this field (29.04%, n=982), followed by China (18.40%, n=622) and the UK (6.01%, n=203). Based on citation frequency and burst intensity, the USA also ranked as the most influential country. Baudouin C and Sun X were identified as the most productive authors, while Journal of Glaucoma and Investigative Ophthalmology & Visual Science were the journals with the highest number of published relevant articles. Additionally, keyword analysis revealed that "neuroinflammation", "retinal ganglion cells (RGCs)", "pathophysiology", and "traditional Chinese medicine" are emerging research hotspots in the field of immune-inflammatory responses in glaucoma. CONCLUSION: This study presents a comprehensive bibliometric overview of research on glaucoma-related inflammation, indicating that this field has received extensive scientific attention with a steady upward trend in research activity. Furthermore, it establishes a theoretical basis for the development of neuroinflammation-targeted therapeutic strategies for glaucoma and emphasizes the necessity of strengthening interdisciplinary collaboration to promote the clinical translation of research findings.
Keywords: GLAUCOMA; inflammatory mechanism; bibliometric analysis; data visualization; research hotspot; NEUROINFLAMMATION
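The keyword co-occurrence networks described in this entry's METHODS reduce, at their core, to counting how often two keywords appear in the same record. A minimal sketch, with toy records standing in for WoSCC keyword lists (the records below are illustrative, not drawn from the 3381 analyzed articles):

```python
from collections import Counter
from itertools import combinations

# Toy stand-ins for per-article keyword lists; contents are invented for illustration.
records = [
    ["glaucoma", "neuroinflammation", "retinal ganglion cells"],
    ["glaucoma", "neuroinflammation", "microglia"],
    ["glaucoma", "retinal ganglion cells", "oxidative stress"],
]

def cooccurrence(recs):
    """Count how often each unordered keyword pair appears in the same record."""
    pairs = Counter()
    for kw in recs:
        # sorted + set makes each pair canonical and counted once per record
        pairs.update(combinations(sorted(set(kw)), 2))
    return pairs

print(cooccurrence(records).most_common(2))
# the two strongest edges both involve "glaucoma" (count 2 each)
```

Tools such as VOSviewer or CiteSpace build on exactly this pair-count matrix, then layer clustering and layout on top of it.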
Commentary on: Intensity modifies the association between continuous bouts of physical activity and risk of mortality: A prospective UK Biobank cohort analysis
18
Authors: Barbara E. Ainsworth, Zhenghua Cai 《Journal of Sport and Health Science》 2026, No. 2: 77-79 (3 pages)
Rowlands et al.¹ present an analysis of accelerometer data from the UK Biobank cohort, examining variations in the duration, intensity, and accumulation of moderate-intensity physical activity (MPA) and vigorous-intensity physical activity (VPA) sufficient to reduce the risk of all-cause mortality. In this study, the authors questioned if shorter durations (i.e., 1, 2, 3, 4, 5, 10, 15, and 20 min/day) of MPA and VPA, performed continuously or accumulated throughout the day, would equally reduce the risks of all-cause mortality as longer-duration MPA and VPA recommended in the physical activity (PA) guidelines.
Keywords: INTENSITY; ACCELEROMETER; MORTALITY; ASSOCIATION; risk; prospective cohort analysis; accelerometer data; UK Biobank
Stability analysis of soft-hard interbedded anti-inclined rock slope under rainfall based on deformation compatibility
19
Authors: GUO Jianjun, WU Zhenwei, CAO Heng, ZHANG Wei, WANG Junjie 《Journal of Mountain Science》 2026, No. 1: 380-393 (14 pages)
Rock slope instability is a prevalent geological hazard that imposes significant adverse impacts on engineering activities. Although existing studies have focused on homogeneous rock slopes, the theoretical models for quantifying the stability of soft-hard interbedded anti-inclined slopes remain underdeveloped, primarily due to the complex force transfer mechanisms involved. This study proposes a novel theoretical model for the stability analysis of soft-hard interbedded anti-inclined slopes under rainfall conditions. The framework models stratified rock layers as layered cantilever beams with material heterogeneity. Based on the principle of deformation compatibility, it comprehensively accounts for interlayer force transfer and strength degradation resulting from differential deformations among rock layers. Furthermore, it integrates the critical instability length induced by the self-weight of rock layers to determine the fracture depth. The proposed method was validated against engineering case studies and physical model tests, with errors falling within an acceptable range. Compared to existing theoretical methods, the proposed method provides a more realistic representation of the slope's stress field. The analysis results demonstrate that rainfall not only reduces the inclination angle of the failure surface but also leads to an approximate 30% decrease in the safety factor. The proposed theoretical model is particularly useful for quickly calculating the stability of soft-hard interbedded anti-inclined rock slopes under rainfall conditions, compared to complex and time-consuming numerical simulations.
Keywords: Soft-hard interbedded; Anti-inclined slope; RAINFALL; Stability analysis; Theoretical method
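The cantilever-beam idea behind this entry can be illustrated, in heavily simplified form, for a single rock layer: maximum bending stress at the fixed end under the self-weight component normal to the layer, with rainfall represented only as a degraded tensile strength. This sketch deliberately ignores interlayer force transfer and deformation compatibility, which are the paper's actual contributions; all input values are assumed for illustration:

```python
import math

def bending_stress(gamma, thickness, length, dip_deg):
    """Max tensile stress (Pa) at the fixed end of a rock layer treated as a
    cantilever of unit width, loaded by the self-weight component normal to it."""
    w = gamma * thickness * math.cos(math.radians(dip_deg))  # load per unit area
    moment = w * length**2 / 2.0           # fixed-end moment per unit width
    section_modulus = thickness**2 / 6.0   # rectangular section, unit width
    return moment / section_modulus

def safety_factor(tensile_strength, gamma, thickness, length, dip_deg):
    """Ratio of available tensile strength to self-weight bending stress."""
    return tensile_strength / bending_stress(gamma, thickness, length, dip_deg)

# Assumed values: 26 kN/m^3 unit weight, 0.5 m layer, 6 m overhang, 60 deg dip,
# 5 MPa dry tensile strength, 30% strength loss after rainfall (hypothetical).
sf_dry = safety_factor(5e6, 26e3, 0.5, 6.0, 60)
sf_wet = safety_factor(0.7 * 5e6, 26e3, 0.5, 6.0, 60)
print(round(sf_dry, 2), round(sf_wet, 2))  # prints 1.78 1.25
```

Even this toy model reproduces the qualitative trend reported in the abstract: degrading strength under rainfall directly scales down the safety factor.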
Modal analysis on a fluid-conveying pipe subject to elastic supports with unknown-but-bounded parameters
20
Authors: Sha Wei, Xulong Li, Xiong Yan, Hu Ding, Liqun Chen 《Acta Mechanica Sinica》 2026, No. 1: 310-324 (15 pages)
Uncertain parameters are widespread in engineering systems. This study investigates the modal analysis of a fluid-conveying pipe subjected to elastic supports with unknown-but-bounded parameters. The governing equation for the elastically supported fluid-conveying pipe is transformed into ordinary differential equations using the Galerkin truncation method. The Chebyshev interval approach, integrated with the assumed mode method, is then used to investigate the effects of uncertainties in support stiffness, fluid speed, and pipe length on the natural frequencies and mode shapes of the pipe. Additionally, both symmetrical and asymmetrical support stiffnesses are discussed. The accuracy and effectiveness of the Chebyshev interval approach are verified through comparison with the Monte Carlo method. The results reveal that, for the same deviation coefficient, uncertainties in symmetrical support stiffness have a greater impact on the first four natural frequencies than those of the asymmetrical one. There may be significant differences in the sensitivity of natural frequencies and mode shapes of the same order to uncertain parameters. Notably, mode shapes susceptible to uncertain parameters exhibit wider fluctuation intervals near the elastic supports, requiring more attention.
Keywords: Fluid-conveying pipe; Elastic support; UNCERTAINTY; Modal analysis; Chebyshev interval method
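Interval modal analysis with unknown-but-bounded parameters can be illustrated on a single-degree-of-freedom surrogate (omega = sqrt(k/m)) rather than the pipe PDE itself: because omega is monotonic in the stiffness k, the exact frequency bounds come from the interval endpoints, and a Monte Carlo sweep (the verification tool used in this paper) must fall inside them. The stiffness, mass, and deviation values below are assumptions, and the Chebyshev expansion is replaced here by direct endpoint evaluation, which is valid only for this monotonic toy case:

```python
import math
import random

def natural_freq(k, m):
    """Natural frequency (rad/s) of a single-DOF surrogate: omega = sqrt(k/m)."""
    return math.sqrt(k / m)

def interval_bounds(k_mid, dev, m):
    """Exact bounds for k in [k_mid*(1-dev), k_mid*(1+dev)]: omega is monotonic
    in k, so the extremes sit at the interval endpoints."""
    return natural_freq(k_mid * (1 - dev), m), natural_freq(k_mid * (1 + dev), m)

def monte_carlo_bounds(k_mid, dev, m, n=20000, seed=0):
    """Sampled bounds; always contained in (and slightly tighter than) the exact ones."""
    rng = random.Random(seed)
    vals = [natural_freq(k_mid * (1 + dev * (2 * rng.random() - 1)), m)
            for _ in range(n)]
    return min(vals), max(vals)

lo, hi = interval_bounds(1e5, 0.1, 10.0)         # 10% deviation in support stiffness
mc_lo, mc_hi = monte_carlo_bounds(1e5, 0.1, 10.0)
print(round(lo, 2), round(hi, 2))                # prints 94.87 104.88
```

For the non-monotonic, multi-parameter pipe problem, endpoint evaluation no longer suffices, which is precisely why the paper resorts to a Chebyshev polynomial surrogate over the parameter box.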