Earthquakes are highly destructive spatio-temporal phenomena whose analysis is essential for disaster preparedness and risk mitigation. Modern seismological research produces vast volumes of heterogeneous data from seismic networks, satellite observations, and geospatial repositories, creating the need for scalable infrastructures capable of integrating and analyzing such data to support intelligent decision-making. Data warehousing technologies provide a robust foundation for this purpose; however, existing earthquake-oriented data warehouses remain limited, often relying on simplified schemas, domain-specific analytics, or cataloguing efforts. This paper presents the design and implementation of a spatio-temporal data warehouse for seismic activity. The framework integrates spatial and temporal dimensions in a unified schema and introduces a novel array-based approach for managing many-to-many relationships between facts and dimensions without intermediate bridge tables. A comparative evaluation against a conventional bridge-table schema demonstrates that the array-based design improves fact-centric query performance, while the bridge-table schema remains advantageous for dimension-centric queries. To reconcile these trade-offs, a hybrid schema is proposed that retains both representations, ensuring balanced efficiency across heterogeneous workloads. The proposed framework demonstrates how spatio-temporal data warehousing can address schema complexity, improve query performance, and support multidimensional visualization. In doing so, it provides a foundation for integrating seismic analysis into broader big data-driven intelligent decision systems for disaster resilience, risk mitigation, and emergency management.
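For concreteness, a minimal Python sketch of the two designs contrasted above; all table and column names here are illustrative assumptions, not the paper's actual schema:

```python
# Hypothetical sketch of the two schema designs; names are illustrative.

# Bridge-table design: a separate table links each earthquake fact to the
# (possibly many) regions it affected.
facts = {1: {"magnitude": 6.1}, 2: {"magnitude": 5.4}}
regions = {10: "Coastal zone", 11: "Inland basin"}
bridge = [(1, 10), (1, 11), (2, 11)]  # (fact_id, region_id) pairs

# Array-based design: the fact row itself carries an array of region keys,
# so fact-centric lookups need no join through a bridge table.
facts_array = {
    1: {"magnitude": 6.1, "region_ids": [10, 11]},
    2: {"magnitude": 5.4, "region_ids": [11]},
}

# Fact-centric query ("which regions did event 1 affect?"):
# one row fetch in the array design ...
print([regions[r] for r in facts_array[1]["region_ids"]])
# ... versus a scan of the bridge table in the conventional design.
print([regions[r] for f, r in bridge if f == 1])

# Dimension-centric query ("which events affected region 11?") favors the
# bridge table, which can be indexed naturally on region_id.
print([f for f, r in bridge if r == 11])
```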
With the advent of the big data era, modern statistics has enjoyed unprecedented development opportunities and also faced numerous new challenges. Traditional statistical computing methods are often limited by issues such as computer memory capacity and the distributed storage of data across different locations, and cannot be applied directly to large-scale data sets. Therefore, in the context of big data, designing efficient and theoretically guaranteed statistical learning and inference algorithms has become a key issue that the field of statistics urgently needs to address. In this paper, the application status of statistical analysis methods in the big data environment is systematically reviewed, and future development directions are analyzed to provide reference and support for the further development of the theory and methods of big data statistical analysis.
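As an illustration of the kind of algorithm this review covers, here is a divide-and-conquer least-squares sketch; it is a standard technique, not drawn from any specific paper surveyed above:

```python
import numpy as np

# Divide-and-conquer least squares: one classic answer to the memory limits
# noted above.  Each block contributes only its sufficient statistics
# (X_k'X_k and X_k'y_k), so the full data set never has to fit in memory;
# the merged estimate equals the full-data OLS solution exactly.
rng = np.random.default_rng(0)
beta_true = np.array([2.0, -1.0, 0.5])

xtx = np.zeros((3, 3))
xty = np.zeros(3)
for _ in range(100):                       # 100 blocks, e.g. 100 machines
    X = rng.standard_normal((10_000, 3))   # block loaded, used, discarded
    y = X @ beta_true + rng.standard_normal(10_000)
    xtx += X.T @ X
    xty += X.T @ y

print(np.linalg.solve(xtx, xty))           # ~ [2.0, -1.0, 0.5]
```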
To address the challenge of low survival rates and limited data collection efficiency in current virtual probe deployments, which results from anomaly detection mechanisms in location-based service (LBS) applications, this paper proposes a novel virtual probe deployment method based on user behavioral feature analysis. The core idea is to circumvent LBS anomaly detection by mimicking real-user behavior patterns. First, we design an automated data extraction algorithm that recognizes graphical user interface (GUI) elements to collect spatio-temporal behavior data. Then, by analyzing the automatically collected user data, we identify normal users' spatio-temporal patterns and extract features such as high-activity time windows and spatial clustering characteristics. Subsequently, an anti-detection scheduling strategy is developed, integrating spatial clustering optimization, load-balanced allocation, and time window control to generate probe scheduling schemes. Additionally, a self-correction mechanism based on an exponential backoff strategy is implemented to rectify anomalous behaviors and maintain system stability. Experiments in real-world environments demonstrate that the proposed method significantly outperforms baseline methods in terms of both probe ban rate and task completion rate, while maintaining high time efficiency. This study provides a more reliable and covert solution for geosocial data collection and lays the foundation for building more robust virtual probe systems.
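A minimal sketch of an exponential-backoff self-correction loop of the kind described; the base delay, cap, retry count, and jitter are assumptions, since the paper's parameters are not given here:

```python
import random
import time

def backoff_delays(base=1.0, cap=300.0, max_retries=6):
    """Hypothetical exponential-backoff schedule; base/cap values are
    assumed for illustration."""
    for attempt in range(max_retries):
        # Full jitter keeps probes from retrying in lockstep, which itself
        # would look anomalous to a detector.
        yield random.uniform(0, min(cap, base * 2 ** attempt))

def correct_probe(probe_ok, **kwargs):
    """Retry a flagged probe with exponentially growing pauses.
    `probe_ok` is any callable that re-checks the probe's status."""
    for delay in backoff_delays(**kwargs):
        time.sleep(delay)
        if probe_ok():
            return True
    return False  # probe stays banned; reschedule it elsewhere
```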
Small-angle x-ray scattering (SAXS) is an advanced technique for characterizing the particle size distribution (PSD) of nanoparticles. However, the ill-posed nature of the inverse problems in SAXS data analysis often reduces the accuracy of conventional methods. This article presents a user-friendly software package for PSD analysis, GranuSAS, which employs an algorithm that integrates truncated singular value decomposition (TSVD) with the Chahine method. The approach uses TSVD for data preprocessing, generating a set of initial solutions with noise suppression; a high-quality initial solution is then selected via the L-curve method. This candidate solution is iteratively refined by the Chahine algorithm, which enforces constraints such as non-negativity and improves physical interpretability. Most importantly, GranuSAS employs a parallel architecture that simultaneously yields inversion results from multiple shape models and, by evaluating the accuracy of each model's reconstructed scattering curve, suggests which model best fits the material system. To systematically validate the accuracy and efficiency of the software, verification was performed using both simulated and experimental datasets. The results demonstrate that the software delivers both satisfactory accuracy and reliable computational efficiency, providing an easy-to-use tool that helps researchers in materials science fully exploit the potential of SAXS in nanoparticle characterization.
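A rough Python sketch of the TSVD-plus-Chahine idea, assuming a discretized scattering kernel K (m q-points by n size bins); GranuSAS's actual implementation details, including its L-curve selection, are not reproduced here:

```python
import numpy as np

def tsvd_initial(K, g, k):
    """Truncated-SVD least-squares solution keeping the k largest singular
    values (a rough stand-in for the preprocessing step; k would be chosen
    via the L-curve in the actual software)."""
    U, s, Vt = np.linalg.svd(K, full_matrices=False)
    f0 = Vt[:k].T @ ((U[:, :k].T @ g) / s[:k])
    return np.clip(f0, 1e-12, None)  # Chahine needs a positive start

def chahine(K, g, f, n_iter=200):
    """Multiplicative Chahine-style refinement: each size bin is scaled by
    a kernel-weighted ratio of measured to modeled intensity, which keeps
    the distribution non-negative by construction."""
    for _ in range(n_iter):
        ratio = g / (K @ f + 1e-30)
        f = f * (K.T @ ratio) / (K.sum(axis=0) + 1e-30)
    return f
```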
Functionally graded material (FGM) plates are widely used in engineering structures owing to their tailor-made mechanical properties, whereas cracked homogeneous plates constitute a canonical setting in fracture mechanics analysis. These two classes of problems embody material non-uniformity and geometric discontinuity, respectively, thereby imposing more stringent requirements on numerical methods in terms of high-order field continuity and accurate defect representation. Based on classical Kirchhoff-Love plate theory, a numerical manifold method incorporating moving least squares (MLS) interpolation (MLS-NMM) is developed for the bending analysis of FGM plates and fracture simulation of homogeneous plates with defects. The method constructs an H^2-regular approximation with high-order continuous weighting functions and, combined with the separation of mathematical and physical covers, establishes a unified framework that accurately handles material gradients and cracks without mesh reconstruction. At the crack tip, a singular physical cover incorporating the Williams asymptotic field provides local enrichment, naturally capturing the displacement discontinuity and stress singularity. Stress intensity factors are extracted using the interaction integral method, and the dimensionless J-integral shows a maximum relative error below 1.2% compared with the reference solution. Numerical results indicate that the MLS-NMM exhibits excellent convergence: with 676 mathematical nodes, the nondimensional central deflection of both FGM and homogeneous plates agrees with reference solutions to within a maximum relative error of 0.81%, and no shear locking occurs. A systematic analysis reveals that for an FGM square plate simply supported on all four edges (SSSS) with a/h = 10, the nondimensional central deflection increases by 212% as the gradient index n rises from 0 to 5. For a homogeneous plate containing a central crack with c/a = 0.6, the nondimensional central deflection increases by approximately 46% compared with the intact plate. Under weak boundary constraints (e.g., SFSF), the deformation is markedly amplified, with the deflection exceeding three times that under strong constraints (SCSC). The proposed method provides an efficient, reconstruction-free numerical tool for high-accuracy bending and fracture analyses of FGM and cracked thin-plate structures.
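A one-dimensional toy of the MLS ingredient, using a cubic-spline weight and a linear basis; the paper's H^2-regular, higher-order construction and the NMM cover machinery are beyond this sketch:

```python
import numpy as np

def cubic_spline_weight(r):
    """C^2 cubic-spline window; the H^2-regular approximation in the paper
    would use an even smoother weight than this common choice."""
    return np.where(r <= 0.5, 2/3 - 4*r**2 + 4*r**3,
           np.where(r <= 1.0, 4/3 - 4*r + 4*r**2 - 4/3*r**3, 0.0))

def mls_shape(x, nodes, support):
    """1-D MLS shape functions with a linear basis p = [1, x]: a toy
    version of the interpolation used on the mathematical cover."""
    p = lambda xi: np.array([1.0, xi])
    w = cubic_spline_weight(np.abs(x - nodes) / support)
    A = sum(wi * np.outer(p(xi), p(xi)) for wi, xi in zip(w, nodes))
    phi = np.array([w[i] * p(x) @ np.linalg.solve(A, p(nodes[i]))
                    for i in range(len(nodes))])
    return phi  # partition of unity: phi.sum() ~ 1 inside the domain

nodes = np.linspace(0.0, 1.0, 11)
print(mls_shape(0.37, nodes, support=0.25).sum())  # ~ 1.0
```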
To study the uncertainty quantification of resonant states in open quantum systems, we developed a Bayesian framework by integrating a reduced basis method (RBM) emulator with the Gamow coupled-channel (GCC) approach. The RBM, constructed via eigenvector continuation and trained on both bound and resonant configurations, enables the fast and accurate emulation of resonance properties across the parameter space. To identify the physical resonant states from the emulator's output, we introduce an overlap-based selection technique that effectively isolates true solutions from background artifacts. By applying this framework to the unbound nucleus ^6Be, we quantified the model uncertainty in the predicted complex energies. The results demonstrate relative errors of 17.48% in the real part and 8.24% in the imaginary part, while achieving a speedup of four orders of magnitude compared with the full GCC calculations. To further investigate the asymptotic behavior of the resonant-state wavefunctions within the RBM framework, we employed a Lippmann–Schwinger (L–S)-based correction scheme. This approach not only improves the consistency between eigenvalues and wavefunctions but also enables a seamless extension from real-space training data to the complex energy plane. By bridging the gap between bound-state and continuum regimes, the L–S correction significantly enhances the emulator's capability to accurately capture continuum structures in open quantum systems.
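A toy eigenvector-continuation emulator for a real symmetric Hamiltonian illustrates the RBM idea; the GCC application additionally involves complex-scaled resonant states and the overlap-based selection described above:

```python
import numpy as np
from scipy.linalg import eigh

def ec_emulator(h0, h1, train_params, target_param, n_states=1):
    """Toy eigenvector-continuation emulator for H(a) = h0 + a*h1.
    Ground-state vectors at the training parameters span a reduced basis;
    the target Hamiltonian is projected onto it and a small generalized
    eigenproblem is solved."""
    basis = []
    for a in train_params:
        _, vecs = np.linalg.eigh(h0 + a * h1)
        basis.append(vecs[:, 0])          # training ground states
    X = np.array(basis).T                 # columns = snapshots
    H = X.T @ (h0 + target_param * h1) @ X
    N = X.T @ X                           # overlap (norm) matrix
    vals, _ = eigh(H, N)                  # generalized eigenproblem
    return vals[:n_states]

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50)); h0 = A + A.T
B = rng.standard_normal((50, 50)); h1 = B + B.T
approx = ec_emulator(h0, h1, train_params=[0.0, 0.5, 1.0, 1.5], target_param=0.8)
exact = np.linalg.eigvalsh(h0 + 0.8 * h1)[0]
print(approx[0], exact)  # emulator tracks the exact ground state closely
```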
The incremental capacity analysis (ICA) technique is notably limited by its sensitivity to variations in charging conditions, which constrains its practical applicability in real-world scenarios. This paper introduces an ICA-compensation technique to address this limitation and proposes a generalized framework for assessing the state of health (SOH) of batteries based on ICA that is applicable under differing charging conditions. The approach calculates the voltage profile under quasi-static conditions by subtracting the voltage rise attributable to the additional polarization effects at high currents from the measured voltage profile. Its efficacy is contingent upon precisely acquiring the equivalent impedance. To obtain the equivalent impedance throughout the batteries' lifespan while minimizing testing costs, this study employs a current interrupt technique in conjunction with a long short-term memory (LSTM) network to develop a predictive model for equivalent impedance. Following the derivation of ICA curves from the quasi-static voltage profiles, the research explores two scenarios for SOH estimation: one utilizing only incremental capacity (IC) features and the other incorporating both IC features and IC sampling. A genetic algorithm-optimized backpropagation neural network (GABPNN) is employed for the SOH estimation. The proposed framework is validated using independent training and test datasets, with variable test conditions applied to the test set to rigorously evaluate the methodology under challenging conditions. The evaluation results demonstrate that the framework achieves an estimation accuracy of 1.04% RMSE and 0.90% MAPE across charging rates from 0.1 C to 1 C and starting SOCs between 0% and 70%, a major advance over established ICA methods. It also significantly enhances the applicability of conventional ICA techniques under varying charging conditions and removes the need for separate testing protocols for each charging scenario.
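A hedged sketch of the compensation step, with the equivalent impedance reduced to a single lumped value:

```python
import numpy as np

def ic_curve(q, v_measured, current, r_eq, dv=0.005):
    """Subtract the ohmic/polarization rise I*R_eq from the measured charge
    voltage to approximate the quasi-static profile, then differentiate
    capacity w.r.t. voltage on a uniform voltage grid.  r_eq here is one
    lumped impedance; the paper instead predicts it over the battery's
    life with an LSTM."""
    v_quasi = v_measured - current * r_eq
    grid = np.arange(v_quasi.min(), v_quasi.max(), dv)
    q_on_grid = np.interp(grid, v_quasi, q)   # requires monotonic v_quasi
    ic = np.gradient(q_on_grid, grid)         # dQ/dV
    return grid, ic
```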
This paper concentrates on methods for comparing activity units found relatively efficient by data envelopment analysis (DEA). The use of the basic DEA models does not provide direct information regarding the performance of such units. The paper provides a systematic framework of alternative ways for ranking DEA-efficient units. The framework contains criteria derived as by-products of the basic DEA models and also criteria derived from complementary DEA analysis that needs to be carried out. The proposed framework is applied to rank a set of relatively efficient restaurants on the basis of their market efficiency.
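For reference, here is the basic input-oriented CCR envelopment model that such rankings start from, as a SciPy sketch with toy data; ranking criteria such as super-efficiency modify this program to separate the units that all score 1:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR envelopment model for unit o:
    min theta  s.t.  X@lam <= theta*X[:,o],  Y@lam >= Y[:,o],  lam >= 0.
    Units scoring theta = 1 are the DEA-efficient ones that the ranking
    framework then has to compare."""
    m, n = X.shape                                  # m inputs, n units
    s = Y.shape[0]                                  # s outputs
    c = np.r_[1.0, np.zeros(n)]                     # minimize theta
    A_in = np.c_[-X[:, o], X]                       # X@lam - theta*x_o <= 0
    A_out = np.c_[np.zeros(s), -Y]                  # -Y@lam <= -y_o
    A = np.r_[A_in, A_out]
    b = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A, b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun  # efficiency score in (0, 1]

# Two inputs, one output, four toy units (columns); values are illustrative.
X = np.array([[2.0, 4.0, 3.5, 5.0], [3.0, 1.0, 2.5, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(4)])
# units 0 and 1 are efficient (1.0); units 2 and 3 score below 1
```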
Electrocardiogram (ECG) is a low-cost, simple, fast, and non-invasive test. It reflects the heart's electrical activity and provides valuable diagnostic clues about the health of the entire body. ECG has therefore been widely used in biomedical applications such as arrhythmia detection, disease-specific detection, mortality prediction, and biometric recognition. In recent years, ECG-related studies have been carried out using a variety of publicly available datasets, with many differences in the datasets used, data preprocessing methods, targeted challenges, and modeling and analysis techniques. Here we systematically summarize and analyze ECG-based automatic analysis methods and applications. Specifically, we first review 22 commonly used public ECG datasets and provide an overview of data preprocessing processes. We then describe some of the most widely used applications of ECG signals and analyze the advanced methods involved in these applications. Finally, we elucidate some of the challenges in ECG analysis and provide suggestions for further research.
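One common preprocessing recipe as a sketch; the filter band and the 50 Hz notch are typical choices, not a universal standard (US-sourced recordings would use a 60 Hz notch):

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

def preprocess_ecg(sig, fs=360.0):
    """Sketch of a typical ECG preprocessing pipeline: 0.5-40 Hz band-pass
    to remove baseline wander and high-frequency noise, a 50 Hz notch for
    powerline interference, then z-score normalization."""
    b, a = butter(3, [0.5, 40.0], btype="bandpass", fs=fs)
    sig = filtfilt(b, a, sig)
    b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)
    sig = filtfilt(b, a, sig)
    return (sig - sig.mean()) / sig.std()
```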
The development of adaptation measures to climate change relies on data from climate models or impact models. In order to analyze these large data sets, or an ensemble of such data sets, the use of statistical methods is required. In this paper, the methodological approach to collecting, structuring, and publishing the methods that have been used or developed by former or present adaptation initiatives is described. The intention is to communicate achieved knowledge and thus support future users. A key component is the participation of users in the development process. The main elements of the approach are standardized, template-based descriptions of the methods, including the specific applications, references, and method assessment. All contributions have been quality-checked, sorted, and placed in a larger context. The result is a report on statistical methods that is freely available in printed and online versions. Examples of how to use the methods are presented in this paper and are also included in the brochure.
In mineral exploration, the apparent resistivity and apparent frequency (or apparent polarizability) parameters of the induced polarization method are commonly utilized to describe induced polarization anomalies. When the target geological structure is significantly complicated, these parameters can fail to reflect the nature of the anomaly source, and wrong conclusions may be drawn. A wavelet approach and a metal factor method were used to comprehensively interpret the induced polarization anomaly of complex geologic bodies in the Adi Bladia mine. A db5 wavelet basis was used to conduct two-scale decomposition and reconstruction, which effectively suppressed the noise interference from greenschist-facies regional metamorphism and magma intrusion, concentrating the energy and eliminating the boundary problem. On this basis, the ore-induced anomaly was effectively extracted by the metal factor method.
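A sketch of the two ingredients, assuming a sampled apparent-frequency profile; the thresholding rule and the metal-factor constant are common conventions rather than the paper's exact choices:

```python
import numpy as np
import pywt

def denoise_ip_profile(apparent_freq, level=2):
    """Two-scale db5 decomposition/reconstruction: soft-threshold the detail
    coefficients so broad ore-induced anomalies survive while short-
    wavelength interference is suppressed.  The universal threshold used
    here is an assumption, not taken from the paper."""
    coeffs = pywt.wavedec(apparent_freq, "db5", level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise-scale estimate
    thr = sigma * np.sqrt(2 * np.log(len(apparent_freq)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, "db5")

def metal_factor(apparent_freq, apparent_res, scale=1000.0):
    """Metal factor combines the two IP parameters; a common definition is
    MF = scale * Fs / rho_a, amplifying conductive, polarizable sources."""
    return scale * apparent_freq / apparent_res
```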
We develop various statistical methods important for multidimensional genetic data analysis. Theorems justifying the application of these methods are established. We concentrate on multifactor dimensionality reduction, logic regression, random forests, and stochastic gradient boosting, along with their new modifications. We use complementary approaches to study the risk of complex diseases such as cardiovascular ones. The roles of certain combinations of single nucleotide polymorphisms and non-genetic risk factors are examined. To perform the data analysis concerning coronary heart disease and myocardial infarction, the Lomonosov Moscow State University supercomputer "Chebyshev" was employed.
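A toy illustration of one of the listed methods, random forests, applied to synthetic SNP data with an interaction effect; the data and effect sizes are synthetic, for illustration only:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy stand-in for a case/control study: 1000 subjects, 20 SNPs coded as
# 0/1/2 minor-allele counts plus two non-genetic covariates.  The label
# depends on an interaction of SNP0 and SNP1, the kind of joint effect
# that MDR and tree ensembles are used to uncover.
rng = np.random.default_rng(1)
snps = rng.integers(0, 3, size=(1000, 20))
covars = rng.standard_normal((1000, 2))
X = np.hstack([snps, covars])
y = ((snps[:, 0] > 0) & (snps[:, 1] > 0)).astype(int) ^ (rng.random(1000) < 0.1)

clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
print(np.argsort(clf.feature_importances_)[::-1][:4])  # SNP0/SNP1 rank first
```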
Mitigating increasing cyberattack incidents may require strategies such as reinforcing organizations' networks with honeypots and effectively analyzing attack traffic to detect zero-day attacks and vulnerabilities. To effectively detect and mitigate cyberattacks, both computerized and visual analyses are typically required. However, most security analysts are not adequately trained in the visualization principles and methods required for effective visual perception of the useful attack information hidden in attack data. Additionally, honeypots have proven useful in cyberattack research, but no studies have comprehensively investigated visualization practices in the field. In this paper, we review visualization practices and methods commonly used in the discovery and communication of attack patterns based on honeypot network traffic data. Using the PRISMA methodology, we identified and screened 218 papers and evaluated the 37 with high impact. Most honeypot papers conducted summary statistics of honeypot data based on static data metrics such as IP address, port, and packet size. They visually analyzed honeypot attack data using simple graphical methods (such as line, bar, and pie charts) that tend to hide useful attack information. Furthermore, only a few papers conducted extended attack analysis, commonly visualizing attack data with scatter and linear plots. Papers rarely included simple yet sophisticated graphical methods, such as box plots and histograms, which allow for critical evaluation of analysis results. While a significant number of automated visualization tools incorporate visualization standards by default, constructing effective and expressive graphical methods for easy pattern discovery and explainable insights still requires applied knowledge and skill in visualization principles and tools, and occasionally interdisciplinary collaboration with peers. We therefore suggest the need, going forward, for non-classical graphical methods for visualizing attack patterns and communicating analysis results, and we recommend training investigators in visualization principles and standards for effective visual perception and presentation.
In 2020, 11% of Irish electricity was consumed by data centres. The Irish data centre industry and the cooling methods it utilises require reformative actions in the coming years to meet EU energy policies. The resale of heat, alternative cooling methods, or carbon reduction methods are all possibilities for conforming to these policies. This study aims to determine the technical and economic viability of the resale of waste heat from data centres. This was assessed using a novel application of thermodynamics to determine the waste heat recovery potential of Irish data centres, with current methods of heat generation used for economic comparison. The paper also explores policy surrounding waste heat recovery within the industry. The Recoverable Carnot Equivalent Power (RCEP), the maximum useable heat that can be recovered from a data centre rack, is theoretically calculated for the three potential cooling methods for Irish data centres: air, hybrid, and immersion cooling. This study establishes that, under current operating conditions optimised for cooling performance, air cooling has the highest potential RCEP, at 0.39 kW/rack. This is approximately 8% of the input electrical power captured as useable heat, indicating that Irish data centres have the energy potential to be heat providers in the Irish economy. The study highlights the technical and economic aspects of prevalent cooling techniques and determines that air-cooling heat recovery cost can be reduced to 0.01 €/kWhth using offsetting, which is financially competitive with current heating solutions in Ireland.
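A back-of-envelope version of the Carnot-equivalent calculation; the temperatures and rack heat load below are assumed values chosen to roughly reproduce the reported ~8% figure:

```python
# Hedged illustration of the RCEP idea; all inputs are assumptions.
T_hot = 308.15      # K, ~35 degC exhaust air from an air-cooled rack (assumed)
T_ambient = 283.15  # K, ~10 degC Irish ambient (assumed)
Q_rack = 4.8        # kW of rack heat rejected (assumed)

carnot_factor = 1.0 - T_ambient / T_hot     # exergy fraction of the heat
rcep = Q_rack * carnot_factor
print(f"Carnot factor: {carnot_factor:.3f}, RCEP: {rcep:.2f} kW/rack")
# -> Carnot factor ~0.081, RCEP ~0.39 kW/rack, consistent with the ~8% figure
```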
Gravitational wave detection is one of the most cutting-edge research areas in modern physics, with its success relying on advanced data analysis and signal processing techniques. This study provides a comprehensive review of data analysis methods and signal processing techniques in gravitational wave detection. The research begins by introducing the characteristics of gravitational wave signals and the challenges faced in their detection, such as extremely low signal-to-noise ratios and complex noise backgrounds. It then systematically analyzes the application of time-frequency analysis methods in extracting transient gravitational wave signals, including wavelet transforms and Hilbert-Huang transforms. The study focuses on discussing the crucial role of matched filtering techniques in improving signal detection sensitivity and explores strategies for template bank optimization. Additionally, the research evaluates the potential of machine learning algorithms, especially deep learning networks, in rapidly identifying and classifying gravitational wave events. The study also analyzes the application of Bayesian inference methods in parameter estimation and model selection, as well as their advantages in handling uncertainties. However, the research also points out the challenges faced by current technologies, such as dealing with non-Gaussian noise and improving computational efficiency. To address these issues, the study proposes a hybrid analysis framework combining physical models and data-driven methods. Finally, the research looks ahead to the potential applications of quantum computing in future gravitational wave data analysis. This study provides a comprehensive theoretical foundation for the optimization and innovation of gravitational wave data analysis methods, contributing to the advancement of gravitational wave astronomy.
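A minimal matched-filter sketch under a flat (white) noise spectrum; real pipelines whiten by the detector noise PSD and maximize over a template bank, which this toy omits:

```python
import numpy as np

def matched_filter_snr(data, template):
    """Frequency-domain matched filter for white noise: the circular
    cross-correlation of data with the template, normalized by the
    template norm, so the output is in units of the noise std."""
    n = len(data)
    d = np.fft.rfft(data)
    h = np.fft.rfft(template)
    snr_series = np.fft.irfft(d * np.conj(h), n)
    return np.abs(snr_series) / np.sqrt(np.sum(np.abs(template) ** 2))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4096)
chirp = np.sin(2 * np.pi * (30 + 60 * t) * t) * np.exp(-((t - 0.5) / 0.1) ** 2)
data = np.roll(chirp, 1200) * 0.5 + rng.standard_normal(4096)  # buried signal
snr = matched_filter_snr(data, chirp)
print(int(np.argmax(snr)))  # peaks near the injected shift of 1200 samples
```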
The detection and characterization of non-metallic inclusions are essential for clean steel production. Recently, imaging analysis combined with high-dimensional data processing of metallic materials using artificial intelligence (AI)-based machine learning (ML) has developed rapidly, achieving impressive results in inclusion classification for process metallurgy. The present study surveys ML modeling of inclusion prediction in advanced steels, including the detection, classification, and feature prediction of inclusions in different steel grades. Studies on clean steel with different features based on data and image analysis via ML are summarized. Regarding data analysis, the ML-based inclusion prediction methodology establishes a connection between the experimental parameters and inclusion characteristics and analyzes the importance of the experimental parameters. Regarding image analysis, the focus is placed on the classification of different types of inclusions via deep learning, in comparison with data analysis. Finally, further development of inclusion analyses using ML-based methods is recommended. This work paves the way for the application of AI-based methodologies to ultraclean-steel studies from a sustainable metallurgy perspective.
AIM: To perform a bibliometric analysis of publications focusing on inflammatory mechanisms in glaucoma, thereby comprehensively understanding the current research status and identifying potential frontier directions for future studies. METHODS: A systematic search was conducted in the Web of Science Core Collection (WoSCC) database to retrieve relevant literature published from January 1, 2000, to August 31, 2025 (data accessed on September 12, 2025). Multiple data visualization tools were employed to conduct in-depth analyses of the included publications, covering publication quantity and quality, evolutionary trends of research hotspots, keyword co-occurrence networks, and collaborative patterns among countries/regions, institutions, and authors. RESULTS: A total of 3381 articles related to glaucoma inflammation were extracted from WoSCC. The analysis showed that the USA had the highest research output in this field (29.04%, n=982), followed by China (18.40%, n=622) and the UK (6.01%, n=203). Based on citation frequency and burst intensity, the USA also ranked as the most influential country. Baudouin C and Sun X were identified as the most productive authors, while Journal of Glaucoma and Investigative Ophthalmology & Visual Science were the journals with the highest number of relevant articles. Additionally, keyword analysis revealed that "neuroinflammation", "retinal ganglion cells (RGCs)", "pathophysiology", and "traditional Chinese medicine" are emerging research hotspots in the field of immune-inflammatory responses in glaucoma. CONCLUSION: This study presents a comprehensive bibliometric overview of research on glaucoma-related inflammation, indicating that this field has received extensive scientific attention with a steady upward trend in research activity. Furthermore, it establishes a theoretical basis for the development of neuroinflammation-targeted therapeutic strategies for glaucoma and emphasizes the necessity of strengthening interdisciplinary collaboration to promote the clinical translation of research findings.
Rowlands et al.1 present an analysis of accelerometer data from the UK Biobank cohort, examining variations in the duration, intensity, and accumulation of moderate-intensity physical activity (MPA) and vigorous-intensity physical activity (VPA) sufficient to reduce the risk of all-cause mortality. In this study, the authors questioned whether shorter durations (i.e., 1, 2, 3, 4, 5, 10, 15, and 20 min/day) of MPA and VPA, performed continuously or accumulated throughout the day, would reduce the risk of all-cause mortality as effectively as the longer-duration MPA and VPA recommended in the physical activity (PA) guidelines.
Rock slope instability is a prevalent geological hazard that imposes significant adverse impacts on engineering activities. Although existing studies have focused on homogeneous rock slopes, theoretical models for quantifying the stability of soft-hard interbedded anti-inclined slopes remain underdeveloped, primarily because of the complex force transfer mechanisms involved. This study proposes a novel theoretical model for the stability analysis of soft-hard interbedded anti-inclined slopes under rainfall conditions. The framework models stratified rock layers as layered cantilever beams with material heterogeneity. Based on the principle of deformation compatibility, it comprehensively accounts for interlayer force transfer and the strength degradation resulting from differential deformations among rock layers. Furthermore, it integrates the critical instability length induced by the self-weight of the rock layers to determine the fracture depth. The proposed method was validated against engineering case studies and physical model tests, with errors falling within an acceptable range. Compared with existing theoretical methods, the proposed method provides a more realistic representation of the slope's stress field. The analysis results demonstrate that rainfall not only reduces the inclination angle of the failure surface but also leads to an approximately 30% decrease in the safety factor. The proposed model is particularly useful for quickly assessing the stability of soft-hard interbedded anti-inclined rock slopes under rainfall, compared with complex and time-consuming numerical simulations.
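A deliberately simplified single-layer check conveys the cantilever idea; the paper's model couples many heterogeneous layers through deformation compatibility, which this toy omits, and all parameter values below are assumptions:

```python
import numpy as np

# Toy single-layer check, NOT the paper's multi-layer model: one rock layer
# of an anti-inclined slope treated as a cantilever loaded by the component
# of its self-weight acting normal to bedding.  All values are illustrative.
gamma = 26e3        # unit weight, N/m^3 (assumed)
t, L = 0.8, 6.0     # layer thickness and overhanging length, m (assumed)
dip = np.radians(60)            # bedding dip angle (assumed)
sigma_t = 2.0e6                 # tensile strength, Pa; rainfall degrades this

q = gamma * t * np.cos(dip)     # distributed normal load per unit area
M = q * L**2 / 2                # root bending moment per unit width
sigma_max = 6 * M / t**2        # bending tensile stress at the layer base
print(f"safety factor ~ {sigma_t / sigma_max:.2f}")
# Degrading sigma_t by 30% (rainfall) scales the factor down proportionally.
```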
Uncertain parameters are widespread in engineering systems. This study investigates the modal analysis of a fluid-conveying pipe subjected to elastic supports with unknown-but-bounded parameters. The governing equation for the elastically supported fluid-conveying pipe is transformed into ordinary differential equations using the Galerkin truncation method. The Chebyshev interval approach, integrated with the assumed mode method, is then used to investigate the effects of uncertainties in support stiffness, fluid speed, and pipe length on the natural frequencies and mode shapes of the pipe. Both symmetrical and asymmetrical support stiffnesses are discussed. The accuracy and effectiveness of the Chebyshev interval approach are verified through comparison with the Monte Carlo method. The results reveal that, for the same deviation coefficient, uncertainties in symmetrical support stiffness have a greater impact on the first four natural frequencies than those in the asymmetrical case. The sensitivity of natural frequencies and mode shapes of the same order to uncertain parameters can differ significantly. Notably, mode shapes susceptible to uncertain parameters exhibit wider fluctuation intervals near the elastic supports, requiring more attention.
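A one-parameter sketch of the Chebyshev interval idea: build a Chebyshev surrogate of the response over the uncertain parameter's interval and bound its extrema. The response function below is an assumed stand-in, not the pipe model:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def interval_bounds_chebyshev(f, lo, hi, degree=6):
    """Sample the response at Chebyshev points of [lo, hi], fit a
    degree-`degree` Chebyshev surrogate, and take its extrema on a dense
    grid as the interval bounds.  The pipe problem applies this per
    natural frequency and per uncertain parameter."""
    nodes = np.cos(np.pi * (np.arange(degree + 1) + 0.5) / (degree + 1))
    x = 0.5 * (hi + lo) + 0.5 * (hi - lo) * nodes       # map to [lo, hi]
    series = C.chebfit(2 * (x - lo) / (hi - lo) - 1,
                       [f(xi) for xi in x], degree)
    vals = C.chebval(np.linspace(-1, 1, 2001), series)
    return vals.min(), vals.max()

# Toy response: a first natural frequency shifted by an uncertain support
# stiffness k (illustrative closed form, not the paper's equations).
f = lambda k: np.sqrt(97.4 + 0.8 * k)                   # assumed response
print(interval_bounds_chebyshev(f, lo=50.0, hi=150.0))  # ~ (11.7, 14.7)
```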
基金supported by theNationalNatural Science Foundation of China(No.U23A20305)National Key Research and Development Program of China(No.2022YFB3102900)+1 种基金Innovation Scientists and Technicians Troop Construction Projects of Henan Province,China(No.254000510007)Key Research and Development Project of Henan Province(No.221111321200).
文摘To address the challenge of low survival rates and limited data collection efficiency in current virtual probe deployments,which results from anomaly detection mechanisms in location-based service(LBS)applications,this paper proposes a novel virtual probe deployment method based on user behavioral feature analysis.The core idea is to circumvent LBS anomaly detection by mimicking real-user behavior patterns.First,we design an automated data extraction algorithm that recognizes graphical user interface(GUI)elements to collect spatio-temporal behavior data.Then,by analyzing the automatically collected user data,we identify normal users’spatio-temporal patterns and extract their features such as high-activity time windows and spatial clustering characteristics.Subsequently,an antidetection scheduling strategy is developed,integrating spatial clustering optimization,load-balanced allocation,and time window control to generate probe scheduling schemes.Additionally,a self-correction mechanism based on an exponential backoff strategy is implemented to rectify anomalous behaviors andmaintain system stability.Experiments in real-world environments demonstrate that the proposed method significantly outperforms baseline methods in terms of both probe ban rate and task completion rate,while maintaining high time efficiency.This study provides a more reliable and clandestine solution for geosocial data collection and lays the foundation for building more robust virtual probe systems.
基金Project supported by the Project of the Anhui Provincial Natural Science Foundation(Grant No.2308085MA19)Strategic Priority Research Program of the Chinese Academy of Sciences(Grant No.XDA0410401)+2 种基金the National Natural Science Foundation of China(Grant No.52202120)the National Key Research and Development Program of China(Grant No.2023YFA1609800)USTC Research Funds of the Double First-Class Initiative(Grant No.YD2310002013)。
文摘Small angle x-ray scattering(SAXS)is an advanced technique for characterizing the particle size distribution(PSD)of nanoparticles.However,the ill-posed nature of inverse problems in SAXS data analysis often reduces the accuracy of conventional methods.This article proposes a user-friendly software for PSD analysis,GranuSAS,which employs an algorithm that integrates truncated singular value decomposition(TSVD)with the Chahine method.This approach employs TSVD for data preprocessing,generating a set of initial solutions with noise suppression.A high-quality initial solution is subsequently selected via the L-curve method.This selected candidate solution is then iteratively refined by the Chahine algorithm,enforcing constraints such as non-negativity and improving physical interpretability.Most importantly,GranuSAS employs a parallel architecture that simultaneously yields inversion results from multiple shape models and,by evaluating the accuracy of each model's reconstructed scattering curve,offers a suggestion for model selection in material systems.To systematically validate the accuracy and efficiency of the software,verification was performed using both simulated and experimental datasets.The results demonstrate that the proposed software delivers both satisfactory accuracy and reliable computational efficiency.It provides an easy-to-use and reliable tool for researchers in materials science,helping them fully exploit the potential of SAXS in nanoparticle characterization.
基金supported by Beijing Natural Science Foundation(L233025)。
文摘Functionally graded material(FGM)plates are widely used in various engineering structures owing to their tailor-made mechanical properties,whereas cracked homogeneous plates constitute a canonical setting in fracture mechanics analysis.These two classes of problems respectively embody material non-uniformity and geometric discontinuity,thereby imposing more stringent requirements on numerical methods in terms of high-order field continuity and accurate defect representation.Based on the classical Kirchhoff-Love plate theory,a numerical manifold method(MLS-NMM)incorporating moving least squares(MLS)interpolation is developed for bending analysis of FGM plates and fracture simulation of homogeneous plates with defects.The method constructs an H^(2)-regular approximation with high-order continuous weighting functions and,combined with the separation of mathematical and physical covers,establishes a unified framework that accurately handles material gradients and cracks without mesh reconstruction.For the crack tip,a singular physical cover incorporating the Williams asymptotic field is introduced to achieve local enrichment,enabling the natural capture of displacement discontinuity and stress singularity.Stress intensity factors are extracted using the interaction integral method,and the dimensionless J-integral shows a maximum relative error below 1.2%compared with the reference solution.Numerical results indicate that MLS-NMM exhibits excellent convergence performance:using 676 mathematical nodes,the nondimensional central deflection of both FGM and homogeneous plates agrees with reference solutions with a maximum relative error below 0.81%,and no shear locking occurs.A systematic analysis reveals that for a simply supported on all four edges(SSSS)FGM square plate with a/h=10,the nondimensional central deflection increases by 212%as the gradient index nrises from 0 to 5.For a homogeneous plate containing a central crack with c/a=0.6,the nondimensional central deflection increases by approximately 46%compared with the intact plate.Under weak boundary constraints(e.g.,SFSF),the deformation is markedly amplified,with the deflection reaching more than three times that under strong constraints(SCSC).The proposed method provides an efficient,reconstruction-free numerical tool for high-accuracy bending and fracture analyses of FGM and cracked thin-plate structures.
基金supported by the National Key Research and Development Program(MOST 2023YFA1606404 and MOST 2022YFA1602303)the National Natural Science Foundation of China(Nos.12347106,12147101,and 12447122)the China Postdoctoral Science Foundation(No.2024M760489).
文摘To study the uncertainty quantification of resonant states in open quantum systems,we developed a Bayesian framework by integrating a reduced basis method(RBM)emulator with the Gamow coupled-channel(GCC)approach.The RBM,constructed via eigenvector continuation and trained on both bound and resonant configurations,enables the fast and accurate emulation of resonance properties across the parameter space.To identify the physical resonant states from the emulator’s output,we introduce an overlap-based selection technique that effectively isolates true solutions from background artifacts.By applying this framework to unbound nucleus ^(6)Be,we quantified the model uncertainty in the predicted complex energies.The results demonstrate relative errors of 17.48%in the real part and 8.24%in the imaginary part,while achieving a speedup of four orders of magnitude compared with the full GCC calculations.To further investigate the asymptotic behavior of the resonant-state wavefunctions within the RBM framework,we employed a Lippmann–Schwinger(L–S)-based correction scheme.This approach not only improves the consistency between eigenvalues and wavefunctions but also enables a seamless extension from real-space training data to the complex energy plane.By bridging the gap between bound-state and continuum regimes,the L–S correction significantly enhances the emulator’s capability to accurately capture continuum structures in open quantum systems.
基金funded by the Bavarian State Ministry of ScienceResearch and Art(Grant number:H.2-F1116.WE/52/2)。
文摘The incremental capacity analysis(ICA)technique is notably limited by its sensitivity to variations in charging conditions,which constrains its practical applicability in real-world scenarios.This paper introduces an ICA-compensation technique to address this limitation and propose a generalized framework for assessing the state of health(SOH)of batteries based on ICA that is applicable under differing charging conditions.This novel approach calculates the voltage profile under quasi-static conditions by subtracting the voltage increase attributable to the additional polarization effects at high currents from the measured voltage profile.This approach's efficacy is contingent upon precisely acquiring the equivalent impedance.To obtain the equivalent impedance throughout the batteries'lifespan while minimizing testing costs,this study employs a current interrupt technique in conjunction with a long short-term memory(LSTM)network to develop a predictive model for equivalent impedance.Following the derivation of ICA curves using voltage profiles under quasi-static conditions,the research explores two scenarios for SOH estimation:one utilizing only incremental capacity(IC)features and the other incorporating both IC features and IC sampling.A genetic algorithm-optimized backpropagation neural network(GABPNN)is employed for the SOH estimation.The proposed generalized framework is validated using independent training and test datasets.Variable test conditions are applied for the test set to rigorously evaluate the methodology under challenging conditions.These evaluation results demonstrate that the proposed framework achieves an estimation accuracy of 1.04%for RMSE and 0.90%for MAPE across a spectrum of charging rates ranging from 0.1 C to 1 C and starting SOCs between 0%and 70%,which constitutes a major advancement compared to established ICA methods.It also significantly enhances the applicability of conventional ICA techniques in varying charging conditions and negates the necessity for separate testing protocols for each charging scenario.
文摘This paper concentrates on methods for comparing activity units found relatively efficient by data envelopment analysis (DEA). The use of the basic DEA models does not provide direct information regarding the performance of such units. The paper provides a systematic framework of alternative ways for ranking DEA-efficient units. The framework contains criteria derived as by-products of the basic DEA models and also criteria derived from complementary DEA analysis that needs to be carried out. The proposed framework is applied to rank a set of relatively efficient restaurants on the basis of their market efficiency.
基金Supported by the NSFC-Zhejiang Joint Fund for the Integration of Industrialization and Informatization(U1909208)the Science and Technology Major Project of Changsha(kh2202004)the Changsha Municipal Natural Science Foundation(kq2202106).
文摘Electrocardiogram(ECG)is a low-cost,simple,fast,and non-invasive test.It can reflect the heart’s electrical activity and provide valuable diagnostic clues about the health of the entire body.Therefore,ECG has been widely used in various biomedical applications such as arrhythmia detection,disease-specific detection,mortality prediction,and biometric recognition.In recent years,ECG-related studies have been carried out using a variety of publicly available datasets,with many differences in the datasets used,data preprocessing methods,targeted challenges,and modeling and analysis techniques.Here we systematically summarize and analyze the ECGbased automatic analysis methods and applications.Specifically,we first reviewed 22 commonly used ECG public datasets and provided an overview of data preprocessing processes.Then we described some of the most widely used applications of ECG signals and analyzed the advanced methods involved in these applications.Finally,we elucidated some of the challenges in ECG analysis and provided suggestions for further research.
文摘The development of adaptation measures to climate change relies on data from climate models or impact models. In order to analyze these large data sets or an ensemble of these data sets, the use of statistical methods is required. In this paper, the methodological approach to collecting, structuring and publishing the methods, which have been used or developed by former or present adaptation initiatives, is described. The intention is to communicate achieved knowledge and thus support future users. A key component is the participation of users in the development process. Main elements of the approach are standardized, template-based descriptions of the methods including the specific applications, references, and method assessment. All contributions have been quality checked, sorted, and placed in a larger context. The result is a report on statistical methods which is freely available as printed or online version. Examples of how to use the methods are presented in this paper and are also included in the brochure.
基金Project(41174103)supported by the National Natural Science Foundation of ChinaProject(2010-211)supported by the Foreign Mineral Resources Venture Exploration Special Fund of China
文摘In mineral exploration, the apparent resistivity and apparent frequency (or apparent polarizability) parameters of induced polarization method are commonly utilized to describe the induced polarization anomaly. When the target geology structure is significantly complicated, these parameters would fail to reflect the nature of the anomaly source, and wrong conclusions may be obtained. A wavelet approach and a metal factor method were used to comprehensively interpret the induced polarization anomaly of complex geologic bodies in the Adi Bladia mine. Db5 wavelet basis was used to conduct two-scale decomposition and reconstruction, which effectively suppress the noise interference of greenschist facies regional metamorphism and magma intrusion, making energy concentrated and boundary problem unobservable. On the basis of that, the ore-induced anomaly was effectively extracted by the metal factor method.
文摘We develop various statistical methods important for multidimensional genetic data analysis. Theorems justifying application of these methods are established. We concentrate on the multifactor dimensionality reduction, logic regression, random forests, stochastic gradient boosting along with their new modifications. We use complementary approaches to study the risk of complex diseases such as cardiovascular ones. The roles of certain combinations of single nucleotide polymorphisms and non-genetic risk factors are examined. To perform the data analysis concerning the coronary heart disease and myocardial infarction the Lomonosov Moscow State University supercomputer “Chebyshev” was employed.
文摘Mitigating increasing cyberattack incidents may require strategies such as reinforcing organizations’ networks with Honeypots and effectively analyzing attack traffic for detection of zero-day attacks and vulnerabilities. To effectively detect and mitigate cyberattacks, both computerized and visual analyses are typically required. However, most security analysts are not adequately trained in visualization principles and/or methods, which is required for effective visual perception of useful attack information hidden in attack data. Additionally, Honeypot has proven useful in cyberattack research, but no studies have comprehensively investigated visualization practices in the field. In this paper, we reviewed visualization practices and methods commonly used in the discovery and communication of attack patterns based on Honeypot network traffic data. Using the PRISMA methodology, we identified and screened 218 papers and evaluated only 37 papers having a high impact. Most Honeypot papers conducted summary statistics of Honeypot data based on static data metrics such as IP address, port, and packet size. They visually analyzed Honeypot attack data using simple graphical methods (such as line, bar, and pie charts) that tend to hide useful attack information. Furthermore, only a few papers conducted extended attack analysis, and commonly visualized attack data using scatter and linear plots. Papers rarely included simple yet sophisticated graphical methods, such as box plots and histograms, which allow for critical evaluation of analysis results. While a significant number of automated visualization tools have incorporated visualization standards by default, the construction of effective and expressive graphical methods for easy pattern discovery and explainable insights still requires applied knowledge and skill of visualization principles and tools, and occasionally, an interdisciplinary collaboration with peers. We, therefore, suggest the need, going forward, for non-classical graphical methods for visualizing attack patterns and communicating analysis results. We also recommend training investigators in visualization principles and standards for effective visual perception and presentation.
文摘11% of Irish electricity was consumed by data centres in 2020. The Irish data centre industry and the cooling methods utilised require reformative actions in the coming years to meet EU Energy policies. The resell of heat, alternative cooling methods or carbon reduction methods are all possibilities to conform to these policies. This study aims to determine the viability of the resell of waste heat from data centres both technically and economically. This was determined using a novel application of thermodynamics to determine waste heat recovery potential in Irish data centres, and the current methods of heat generation for economical comparison. This paper also explores policy surrounding waste heat recovery within the industry. The Recoverable Carnot Equivalent Power (RCEP) is theoretically calculated for the three potential cooling methods for Irish data centres. These are air, hybrid, and immersion cooling techniques. This is the maximum useable heat that can be recovered from a data centre rack. This study is established under current operating conditions which are optimised for cooling performance, that air cooling has the highest potential RCEP of 0.39 kW/rack. This is approximately 8% of the input electrical power that can be captured as useable heat. Indicating that Irish data centres have the energy potential to be heat providers in the Irish economy. This study highlighted the technical and economic aspects of prevalent cooling techniques and determined air cooling heat recovery cost can be reduced to 0.01 €/kWhth using offsetting. This is financially competitive with current heating solutions in Ireland.
文摘Gravitational wave detection is one of the most cutting-edge research areas in modern physics, with its success relying on advanced data analysis and signal processing techniques. This study provides a comprehensive review of data analysis methods and signal processing techniques in gravitational wave detection. The research begins by introducing the characteristics of gravitational wave signals and the challenges faced in their detection, such as extremely low signal-to-noise ratios and complex noise backgrounds. It then systematically analyzes the application of time-frequency analysis methods in extracting transient gravitational wave signals, including wavelet transforms and Hilbert-Huang transforms. The study focuses on discussing the crucial role of matched filtering techniques in improving signal detection sensitivity and explores strategies for template bank optimization. Additionally, the research evaluates the potential of machine learning algorithms, especially deep learning networks, in rapidly identifying and classifying gravitational wave events. The study also analyzes the application of Bayesian inference methods in parameter estimation and model selection, as well as their advantages in handling uncertainties. However, the research also points out the challenges faced by current technologies, such as dealing with non-Gaussian noise and improving computational efficiency. To address these issues, the study proposes a hybrid analysis framework combining physical models and data-driven methods. Finally, the research looks ahead to the potential applications of quantum computing in future gravitational wave data analysis. This study provides a comprehensive theoretical foundation for the optimization and innovation of gravitational wave data analysis methods, contributing to the advancement of gravitational wave astronomy.
基金support from the National Key Research and Development Program of China(No.2024YFB3713705)is acknowledgedWangzhong Mu would like to acknowledge the Strategic Mobility,Sweden(SSF,No.SM22-0039)+1 种基金the Swedish Foundation for International Cooperation in Research and Higher Education(STINT,No.IB2022-9228)the Jernkontoret(Sweden)for supporting this clean steel research.Gonghao Lian would like to acknowledge China Scholarship Council(CSC,No.202306080032).
文摘The detection and characterization of non-metallic inclusions are essential for clean steel production.Recently,imaging analysis combined with high-dimensional data processing of metallic materials using artificial intelligence(AI)-based machine learning(ML)has developed rapidly.This technique has achieved impressive results in the field of inclusion classification in process metallurgy.The present study surveys the ML modeling of inclusion prediction in advanced steels,including the detection,classification,and feature prediction of inclusions in different steel grades.Studies on clean steel with different features based on data and image analysis via ML are summarized.Regarding the data analysis,the inclusion prediction methodology based on ML establishes a connection between the experimental parameters and inclusion characteristics and analyzes the importance of the experimental parameters.Regarding the image analysis,the focus is placed on the classification of different types of inclusions via deep learning,in comparison with data analysis.Finally,further development of inclusion analyses using ML-based methods is recommended.This work paves the way for the application of AIbased methodologies for ultraclean-steel studies from a sustainable metallurgy perspective.
Funding: Supported by the National Natural Science Foundation of China (No. 82074500), the Beijing Natural Science Foundation (No. 7252273), the CACMS Innovation Fund (No. CI2021A02605), the Administration of Traditional Chinese Medicine of Zhejiang Province (No. 2024ZR029), and the Science and Technology Program of Wenzhou City (No. Y2023210).
Abstract: AIM: To perform a bibliometric analysis of publications focusing on inflammatory mechanisms in glaucoma, thereby comprehensively characterizing the current research status and identifying potential frontier directions for future studies. METHODS: A systematic search was conducted in the Web of Science Core Collection (WoSCC) database to retrieve relevant literature published from January 1, 2000, to August 31, 2025 (data accessed on September 12, 2025). Multiple data visualization tools were employed to conduct in-depth analyses of the included publications, covering publication quantity and quality, evolutionary trends of research hotspots, keyword co-occurrence networks, and collaborative patterns among countries/regions, institutions, and authors. RESULTS: A total of 3381 articles related to glaucoma inflammation were retrieved from the WoSCC. The analysis showed that the USA had the highest research output in this field (29.04%, n=982), followed by China (18.40%, n=622) and the UK (6.01%, n=203). Based on citation frequency and burst intensity, the USA also ranked as the most influential country. Baudouin C and Sun X were identified as the most productive authors, while Journal of Glaucoma and Investigative Ophthalmology & Visual Science were the journals with the highest number of relevant articles. Additionally, keyword analysis revealed that "neuroinflammation", "retinal ganglion cells (RGCs)", "pathophysiology", and "traditional Chinese medicine" are emerging research hotspots in the field of immune-inflammatory responses in glaucoma. CONCLUSION: This study presents a comprehensive bibliometric overview of research on glaucoma-related inflammation, indicating that the field has received extensive scientific attention with a steady upward trend in research activity. It also establishes a theoretical basis for developing neuroinflammation-targeted therapeutic strategies for glaucoma and emphasizes the need to strengthen interdisciplinary collaboration to promote the clinical translation of research findings.
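A keyword co-occurrence network of the kind used in this analysis reduces, at its core, to counting keyword pairs across records. The sketch below shows that core step on a few hypothetical records; an actual analysis would parse a WoSCC export and feed the counts to a visualization tool such as VOSviewer.

```python
from collections import Counter
from itertools import combinations

# Hypothetical author-keyword lists for three records; real inputs would be
# parsed from a WoSCC export file.
records = [
    ["glaucoma", "neuroinflammation", "retinal ganglion cells"],
    ["glaucoma", "retinal ganglion cells", "oxidative stress"],
    ["glaucoma", "neuroinflammation", "microglia"],
]

cooccurrence = Counter()
for keywords in records:
    # Each unordered keyword pair within a record adds one co-occurrence edge.
    for a, b in combinations(sorted(set(keywords)), 2):
        cooccurrence[(a, b)] += 1

for (a, b), count in cooccurrence.most_common(5):
    print(f"{a} -- {b}: {count}")
```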
Abstract: Rowlands et al.[1] present an analysis of accelerometer data from the UK Biobank cohort, examining variations in the duration, intensity, and accumulation of moderate-intensity physical activity (MPA) and vigorous-intensity physical activity (VPA) sufficient to reduce the risk of all-cause mortality. In this study, the authors asked whether shorter daily durations (i.e., 1, 2, 3, 4, 5, 10, 15, and 20 min/day) of MPA and VPA, performed continuously or accumulated throughout the day, would reduce the risk of all-cause mortality as effectively as the longer durations of MPA and VPA recommended in the physical activity (PA) guidelines.
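The continuous-versus-accumulated distinction at the heart of that question can be made concrete with a small sketch: given a minute-level intensity series, accumulated minutes count every qualifying epoch, while a continuous bout requires consecutive epochs. The cut-points and synthetic series below are assumptions for illustration, not those used by Rowlands et al.

```python
import numpy as np

rng = np.random.default_rng(2)
mets = rng.gamma(shape=2.0, scale=1.2, size=1440)   # one synthetic day of 1-min MET values

MPA_CUT, VPA_CUT = 3.0, 6.0    # assumed MET thresholds for moderate / vigorous

vpa = mets >= VPA_CUT
accumulated_vpa = int(vpa.sum())                    # every qualifying minute counts

# Longest continuous VPA bout: the longest run of consecutive qualifying minutes.
longest, current = 0, 0
for flag in vpa:
    current = current + 1 if flag else 0
    longest = max(longest, current)

print(f"accumulated VPA: {accumulated_vpa} min/day; longest continuous bout: {longest} min")
```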
Funding: Supported by the Chongqing Water Conservancy Science and Technology Project (grant number: CQSLK-202329), the Natural Science Foundation of Chongqing, China (grant number: CSTB2022NSCQ-MSX0991), the National Natural Science Foundation of China (grant number: 52378327), and the Chongqing Natural Science Foundation Innovation Development Joint Fund (grant number: CSTB2022NSCQ-LZX0049).
Abstract: Rock slope instability is a prevalent geological hazard that imposes significant adverse impacts on engineering activities. Although existing studies have focused on homogeneous rock slopes, theoretical models for quantifying the stability of soft-hard interbedded anti-inclined slopes remain underdeveloped, primarily due to the complex force-transfer mechanisms involved. This study proposes a novel theoretical model for the stability analysis of soft-hard interbedded anti-inclined slopes under rainfall conditions. The framework models stratified rock layers as layered cantilever beams with material heterogeneity. Based on the principle of deformation compatibility, it comprehensively accounts for interlayer force transfer and the strength degradation resulting from differential deformations among rock layers. Furthermore, it integrates the critical instability length induced by the self-weight of the rock layers to determine the fracture depth. The proposed method was validated against engineering case studies and physical model tests, with errors falling within an acceptable range. Compared with existing theoretical methods, it provides a more realistic representation of the slope's stress field. The analysis results demonstrate that rainfall not only reduces the inclination angle of the failure surface but also leads to an approximately 30% decrease in the safety factor. The proposed theoretical model is particularly useful for quickly assessing the stability of soft-hard interbedded anti-inclined rock slopes under rainfall conditions, compared with complex and time-consuming numerical simulations.
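To give a feel for the cantilever-beam ingredient (a drastic single-layer simplification, not the authors' full deformation-compatibility model), the classical flexural-toppling estimate sets the fixed-end bending stress of a self-weight-loaded cantilever equal to the rock's tensile strength and solves for the critical length. All parameter values below are assumed for illustration; rainfall-induced degradation would enter through a reduced tensile strength.

```python
import math

gamma = 26e3               # rock unit weight (N/m^3), assumed
h = 0.6                    # layer thickness (m), assumed
sigma_t = 2.0e6            # tensile strength (Pa); lower it to mimic rainfall degradation
alpha = math.radians(60)   # layer dip angle, assumed

# Distributed self-weight load per unit width acting normal to bedding:
q = gamma * h * math.cos(alpha)                 # N/m per metre of width

# Cantilever of length L: fixed-end stress sigma = (q*L^2/2) / (h^2/6) = 3*q*L^2/h^2.
# Setting sigma = sigma_t gives the critical instability length:
L_cr = math.sqrt(sigma_t * h**2 / (3 * q))
print(f"critical cantilever length ~ {L_cr:.1f} m")
```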
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 12272211, 12072181, and 12121002).
Abstract: Uncertain parameters are widespread in engineering systems. This study investigates the modal analysis of a fluid-conveying pipe on elastic supports with unknown-but-bounded parameters. The governing equation of the elastically supported fluid-conveying pipe is transformed into ordinary differential equations using the Galerkin truncation method. The Chebyshev interval approach, integrated with the assumed-mode method, is then used to investigate the effects of uncertainties in support stiffness, fluid speed, and pipe length on the natural frequencies and mode shapes of the pipe. Both symmetrical and asymmetrical support stiffnesses are discussed. The accuracy and effectiveness of the Chebyshev interval approach are verified through comparison with the Monte Carlo method. The results reveal that, for the same deviation coefficient, uncertainties in symmetrical support stiffness have a greater impact on the first four natural frequencies than those in asymmetrical support stiffness. The sensitivity of natural frequencies and mode shapes of the same order to uncertain parameters can differ significantly. Notably, mode shapes that are susceptible to uncertain parameters exhibit wider fluctuation intervals near the elastic supports and therefore require more attention.
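For intuition on how unknown-but-bounded parameters propagate into frequency intervals, here is a minimal Monte Carlo sketch for the simplest related case: a pinned-pinned (rather than elastically supported) fluid-conveying pipe, with the Coriolis term neglected so that a one-term Galerkin sine mode yields a closed-form first natural frequency. All parameter values and interval bounds are assumed; the paper's Chebyshev interval approach replaces this brute-force sampling with far fewer evaluations.

```python
import numpy as np

EI = 500.0            # bending stiffness (N*m^2), assumed
M, m = 2.0, 3.0       # fluid / pipe mass per unit length (kg/m), assumed

rng = np.random.default_rng(3)
U = rng.uniform(4.0, 6.0, 100_000)     # unknown-but-bounded flow speed (m/s)
L = rng.uniform(1.9, 2.1, 100_000)     # unknown-but-bounded pipe length (m)

k = np.pi / L                          # first-mode wavenumber for sin(pi*x/L)
# One-term Galerkin, Coriolis neglected: (M+m)*q_ddot + (EI*k^4 - M*U^2*k^2)*q = 0
omega1 = np.sqrt((EI * k**4 - M * U**2 * k**2) / (M + m))   # rad/s, real below the critical speed

print(f"first natural frequency in [{omega1.min():.2f}, {omega1.max():.2f}] rad/s")
```

The width of the printed interval is the kind of frequency band the Chebyshev interval approach bounds analytically instead of by sampling.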