Journal Articles
1,037 articles found
1. Spatio-Temporal Earthquake Analysis via Data Warehousing for Big Data-Driven Decision Systems
Authors: Georgia Garani, George Pramantiotis, Francisco Javier Moreno Arboleda. Computers, Materials & Continua, 2026, Issue 3, pp. 1963-1988 (26 pages).
Earthquakes are highly destructive spatio-temporal phenomena whose analysis is essential for disaster preparedness and risk mitigation. Modern seismological research produces vast volumes of heterogeneous data from seismic networks, satellite observations, and geospatial repositories, creating the need for scalable infrastructures capable of integrating and analyzing such data to support intelligent decision-making. Data warehousing technologies provide a robust foundation for this purpose; however, existing earthquake-oriented data warehouses remain limited, often relying on simplified schemas, domain-specific analytics, or cataloguing efforts. This paper presents the design and implementation of a spatio-temporal data warehouse for seismic activity. The framework integrates spatial and temporal dimensions in a unified schema and introduces a novel array-based approach for managing many-to-many relationships between facts and dimensions without intermediate bridge tables. A comparative evaluation against a conventional bridge-table schema demonstrates that the array-based design improves fact-centric query performance, while the bridge-table schema remains advantageous for dimension-centric queries. To reconcile these trade-offs, a hybrid schema is proposed that retains both representations, ensuring balanced efficiency across heterogeneous workloads. The proposed framework demonstrates how spatio-temporal data warehousing can address schema complexity, improve query performance, and support multidimensional visualization. In doing so, it provides a foundation for integrating seismic analysis into broader big data-driven intelligent decision systems for disaster resilience, risk mitigation, and emergency management.
Keywords: data warehouse, data analysis, big data, decision systems, seismology, data visualization
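The array-versus-bridge-table trade-off this abstract describes can be sketched in miniature. The snippet below is purely illustrative (plain Python, invented table names and keys, not the paper's schema): a fact-centric lookup is answered from the fact row alone in the array design, while a dimension-centric lookup is the natural fit for a bridge table.

```python
# Illustrative sketch of the trade-off between an in-row array of dimension
# keys and a separate bridge table for fact-dimension many-to-many links.
# All names and values are invented for illustration.

# Array-based fact table: each event row carries its affected-region keys.
facts_array = [
    {"event_id": 1, "magnitude": 6.1, "region_ids": [10, 11]},
    {"event_id": 2, "magnitude": 4.7, "region_ids": [11]},
    {"event_id": 3, "magnitude": 5.3, "region_ids": [10, 12]},
]

# Equivalent bridge-table design: a separate (event_id, region_id) link table.
bridge = [(1, 10), (1, 11), (2, 11), (3, 10), (3, 12)]

def regions_of_event(event_id):
    """Fact-centric query: the array design answers from the fact row, no join."""
    for f in facts_array:
        if f["event_id"] == event_id:
            return f["region_ids"]
    return []

def events_in_region(region_id):
    """Dimension-centric query: the bridge table is the natural access path."""
    return sorted({e for e, r in bridge if r == region_id})

print(regions_of_event(1))   # [10, 11]
print(events_in_region(10))  # [1, 3]
```

The hybrid schema proposed in the paper would keep both representations, routing each query type to the cheaper one.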
2. GranuSAS: Software of rapid particle size distribution analysis from small angle scattering data
Authors: Qiaoyu Guo, Fei Xie, Xuefei Feng, Zhe Sun, Changda Wang, Xuechen Jiao. Chinese Physics B, 2026, Issue 2, pp. 216-225 (10 pages).
Small angle x-ray scattering (SAXS) is an advanced technique for characterizing the particle size distribution (PSD) of nanoparticles. However, the ill-posed nature of inverse problems in SAXS data analysis often reduces the accuracy of conventional methods. This article proposes a user-friendly software for PSD analysis, GranuSAS, which employs an algorithm that integrates truncated singular value decomposition (TSVD) with the Chahine method. This approach employs TSVD for data preprocessing, generating a set of initial solutions with noise suppression. A high-quality initial solution is subsequently selected via the L-curve method. This selected candidate solution is then iteratively refined by the Chahine algorithm, enforcing constraints such as non-negativity and improving physical interpretability. Most importantly, GranuSAS employs a parallel architecture that simultaneously yields inversion results from multiple shape models and, by evaluating the accuracy of each model's reconstructed scattering curve, offers a suggestion for model selection in material systems. To systematically validate the accuracy and efficiency of the software, verification was performed using both simulated and experimental datasets. The results demonstrate that the proposed software delivers both satisfactory accuracy and reliable computational efficiency. It provides an easy-to-use and reliable tool for researchers in materials science, helping them fully exploit the potential of SAXS in nanoparticle characterization.
Keywords: small angle x-ray scattering, data analysis software, particle size distribution, inverse problem
3. Diversity, Complexity, and Challenges of Viral Infectious Disease Data in the Big Data Era: A Comprehensive Review (cited by 1)
Authors: Yun Ma, Lu-Yao Qin, Xiao Ding, Ai-Ping Wu. Chinese Medical Sciences Journal, 2025, Issue 1, pp. 29-44, I0005 (17 pages).
Viral infectious diseases, characterized by their intricate nature and wide-ranging diversity, pose substantial challenges in the domain of data management. The vast volume of data generated by these diseases, spanning from the molecular mechanisms within cells to large-scale epidemiological patterns, has surpassed the capabilities of traditional analytical methods. In the era of artificial intelligence (AI) and big data, there is an urgent necessity for the optimization of these analytical methods to more effectively handle and utilize the information. Despite the rapid accumulation of data associated with viral infections, the lack of a comprehensive framework for integrating, selecting, and analyzing these datasets has left numerous researchers uncertain about which data to select, how to access it, and how to utilize it most effectively in their research. This review endeavors to fill these gaps by exploring the multifaceted nature of viral infectious diseases and summarizing relevant data across multiple levels, from the molecular details of pathogens to broad epidemiological trends. The scope extends from the micro-scale to the macro-scale, encompassing pathogens, hosts, and vectors. In addition to data summarization, this review thoroughly investigates various dataset sources. It also traces the historical evolution of data collection in the field of viral infectious diseases, highlighting the progress achieved over time. Simultaneously, it evaluates the current limitations that impede data utilization. Furthermore, we propose strategies to surmount these challenges, focusing on the development and application of advanced computational techniques, AI-driven models, and enhanced data integration practices. By providing a comprehensive synthesis of existing knowledge, this review is designed to guide future research and contribute to more informed approaches in the surveillance, prevention, and control of viral infectious diseases, particularly within the context of the expanding big-data landscape.
Keywords: viral infectious diseases, big data, data diversity and complexity, data standardization, artificial intelligence, data analysis
4. Enhancing Stellar Spectra with Diffusion Probabilistic Models: A Novel Approach to Denoising Low SNR Astronomical Data
Authors: Jingzhen Sun, Yude Bu, Jiangchuan Zhang, Mengmeng Zhang, Shanshan Li, Ke Wang, Yuhang Zhang, Zhenping Yi, Xiaoming Kong, Meng Liu, Minglei Wu. Research in Astronomy and Astrophysics, 2025, Issue 10, pp. 69-77 (9 pages).
Astronomical spectra are vital for deriving stellar properties, yet low signal-to-noise ratio (SNR) spectra often obscure key features, complicating accurate analysis. This study presents spec-DDPM, a novel deep learning approach based on denoising diffusion probabilistic models (DDPM), aimed at denoising low SNR spectra to improve stellar parameter estimation. Leveraging the LAMOST DR10 data set, we developed spec-DDPM using a tailored U-Net architecture (spec-Unet) to iteratively predict and remove noise. The model was trained on 28,500 low and high SNR spectral pairs and benchmarked against conventional methods, including principal component analysis, wavelet techniques, and a modified DnCNN model. The spec-DDPM demonstrated superior performance, with reduced mean absolute error, elevated structural similarity index measure, and enhanced spectral loss metrics. It effectively preserved critical spectral features and corrected continuum distortions. Validation experiments further confirmed its ability to improve stellar parameter estimation with reduced errors. These results underscore spec-DDPM's potential to elevate spectral data quality, offering applications in restoring defective spectra and refining large-scale astronomical surveys. This work highlights the transformative role of deep learning in astronomical data processing.
Keywords: techniques: spectroscopic, methods: statistical, data analysis
5. Research on the Development Strategies of Real-time Data Analysis and Decision-support Systems
Author: Wei Tang. Journal of Electronic Research and Application, 2025, Issue 2, pp. 204-210 (7 pages).
With the advent of the big data era, real-time data analysis and decision-support systems have been recognized as essential tools for enhancing enterprise competitiveness and optimizing the decision-making process. This study aims to explore the development strategies of real-time data analysis and decision-support systems, and analyze their application status and future development trends in various industries. The article first reviews the basic concepts and importance of real-time data analysis and decision-support systems, and then discusses in detail the key technical aspects such as system architecture, data collection and processing, analysis methods, and visualization techniques.
Keywords: real-time data analysis, decision-support system, big data, system architecture, data processing, visualization technology
6. Visualization of Industrial Big Data: State-of-the-Art and Future Perspectives
Authors: Tongkang Zhang, Jinliang Ding, Zheng Liu, Wenjun Zhang. Engineering, 2025, Issue 9, pp. 85-101 (17 pages).
As industrial production progresses toward digitalization, massive amounts of data have been collected, transmitted, and stored, with characteristics of large-scale, high-dimensional, heterogeneous, and spatiotemporal dynamics. The high complexity of industrial big data poses challenges for the practical decision-making of domain experts, leading to ever-increasing needs for integrating computational intelligence with human perception into traditional data analysis. Industrial big data visualization integrates theoretical methods and practical technologies from multiple disciplines, including data mining, information visualization, computer graphics, and human-computer interaction, providing a highly effective means of understanding and exploring complex industrial processes. This review summarizes the state-of-the-art approaches, characterizes them with six visualization methods, and categorizes them based on analytical tasks and applications. Furthermore, key research challenges and potential future directions are identified.
Keywords: industrial big data, data analysis, visual analytics, information visualization, human-computer interaction
7. Gene Expression Data Analysis Based on Mixed Effects Model
Author: Yuanbo Dai. Journal of Computer and Communications, 2025, Issue 2, pp. 223-235 (13 pages).
DNA microarray technology is an extremely effective technique for studying gene expression patterns in cells, and the main challenge currently faced by this technology is how to analyze the large amount of gene expression data generated. To address this, this paper employs a mixed-effects model to analyze gene expression data. In terms of data selection, 1176 genes from the white mouse gene expression dataset under two experimental conditions were chosen, setting up two conditions: pneumococcal infection and no infection, and constructing a mixed-effects model. After preprocessing the gene chip information, the data were imported into the model, preliminary results were calculated, and permutation tests were performed to biologically validate the preliminary results using GSEA. The final dataset consists of 20 groups of gene expression data from pneumococcal infection, which categorizes functionally related genes based on the similarity of their expression profiles, facilitating the study of genes with unknown functions.
Keywords: mixed effects model, gene expression data analysis, gene analysis, gene chip
8. A Flexible Exponential Log-Logistic Distribution for Modeling Complex Failure Behaviors in Reliability and Engineering Data
Authors: Hadeel Al Qadi, Fatimah M. Alghamdi, Hamada H. Hassan, Mohamed E. Mead, Ahmed Z. Afify. Computer Modeling in Engineering & Sciences, 2025, Issue 8, pp. 2029-2061 (33 pages).
Parametric survival models are essential for analyzing time-to-event data in fields such as engineering and biomedicine. While the log-logistic distribution is popular for its simplicity and closed-form expressions, it often lacks the flexibility needed to capture complex hazard patterns. In this article, we propose a novel extension of the classical log-logistic distribution, termed the new exponential log-logistic (NExLL) distribution, designed to provide enhanced flexibility in modeling time-to-event data with complex failure behaviors. The NExLL model incorporates a new exponential generator to expand the shape adaptability of the baseline log-logistic distribution, allowing it to capture a wide range of hazard rate shapes, including increasing, decreasing, J-shaped, reversed J-shaped, modified bathtub, and unimodal forms. A key feature of the NExLL distribution is its formulation as a mixture of log-logistic densities, offering both symmetric and asymmetric patterns suitable for diverse real-world reliability scenarios. We establish several theoretical properties of the model, including closed-form expressions for its probability density function, cumulative distribution function, moments, hazard rate function, and quantiles. Parameter estimation is performed using seven classical estimation techniques, with extensive Monte Carlo simulations used to evaluate and compare their performance under various conditions. The practical utility and flexibility of the proposed model are illustrated using two real-world datasets from reliability and engineering applications, where the NExLL model demonstrates superior fit and predictive performance compared to existing log-logistic-based models. This contribution advances the toolbox of parametric survival models, offering a robust alternative for modeling complex aging and failure patterns in reliability, engineering, and other applied domains.
Keywords: failure rate, new exponential class, log-logistic distribution, maximum likelihood, order statistics, real-life data analysis
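The motivation for the NExLL extension can be seen numerically from the baseline log-logistic hazard, h(t) = (b/a)(t/a)^(b-1) / (1 + (t/a)^b), which is only ever decreasing (shape b <= 1) or unimodal (b > 1). The sketch below illustrates just this baseline limitation; the NExLL density itself is not reproduced here, and the parameter values are arbitrary.

```python
# Baseline log-logistic hazard: decreasing for shape beta <= 1, unimodal for
# beta > 1 -- the limited repertoire the NExLL extension is designed to widen.

def loglogistic_hazard(t, alpha, beta):
    """Hazard of the log-logistic distribution, scale alpha, shape beta."""
    z = (t / alpha) ** beta
    return (beta / alpha) * (t / alpha) ** (beta - 1) / (1 + z)

ts = [0.5, 1.0, 2.0, 4.0, 8.0]
h_dec = [loglogistic_hazard(t, alpha=1.0, beta=0.8) for t in ts]  # beta <= 1
h_uni = [loglogistic_hazard(t, alpha=1.0, beta=2.5) for t in ts]  # beta > 1

# Monotone decreasing hazard for beta = 0.8.
assert all(a > b for a, b in zip(h_dec, h_dec[1:]))
# Unimodal hazard for beta = 2.5: the maximum sits at an interior point.
assert h_uni.index(max(h_uni)) not in (0, len(h_uni) - 1)
```

Shapes such as bathtub or modified bathtub cannot arise from this two-parameter baseline, which is the gap the paper's generator targets.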
9. Topology Data Analysis-Based Error Detection for Semantic Image Transmission with Incremental Knowledge-Based HARQ
Authors: Ni Fei, Li Rongpeng, Zhao Zhifeng, Zhang Honggang. China Communications, 2025, Issue 1, pp. 235-255 (21 pages).
Semantic communication (SemCom) aims to achieve high-fidelity information delivery under low communication consumption by only guaranteeing semantic accuracy. Nevertheless, semantic communication still suffers from unexpected channel volatility, and thus developing a re-transmission mechanism (e.g., hybrid automatic repeat request [HARQ]) becomes indispensable. In that regard, instead of discarding previously transmitted information, the incremental knowledge-based HARQ (IK-HARQ) is deemed a more effective mechanism that could sufficiently utilize the information semantics. However, considering the possible existence of semantic ambiguity in image transmission, a simple bit-level cyclic redundancy check (CRC) might compromise the performance of IK-HARQ. Therefore, there emerges a strong incentive to revolutionize the CRC mechanism, thus more effectively reaping the benefits of both SemCom and HARQ. In this paper, built on top of Swin-transformer-based joint source-channel coding (JSCC) and IK-HARQ, we propose a semantic image transmission framework, SC-TDA-HARQ. In particular, different from the conventional CRC, we introduce a topological data analysis (TDA)-based error detection method, which capably digs out the inner topological and geometric information of images, to capture semantic information and determine the necessity for re-transmission. Extensive numerical results validate the effectiveness and efficiency of the proposed SC-TDA-HARQ framework, especially under the limited bandwidth condition, and manifest the superiority of the TDA-based error detection method in image transmission.
Keywords: error detection, incremental knowledge-based HARQ, joint source-channel coding, semantic communication, Swin transformer, topological data analysis
10. Correction: Data analysis framework for silicon strip detector in compact spectrometer for heavy-ion experiments
Authors: Xiao-Bao Wei, Yu-Hao Qin, Sheng Xiao, Da-Wei Si, Dong Guo, Zhi Qin, Fen-Hai Guan, Xin-Yue Diao, Bo-Yuan Zhang, Bai-Ting Tian, Jun-Huai Xu, Tian-Ren Zhuo, Yi-Bo Hao, Zeng-Xiang Wang, Shi-Tao Wang, Chun-Wang Ma, Yi-Jie Wang, Zhi-Gang Xiao. Nuclear Science and Techniques, 2025, Issue 11, p. 367 (1 page).
In the 'Track decoding' section of this article, one paragraph was inadvertently omitted after the text "…shows the flow diagram of the Tr2-1121 track mode." The missing paragraph is provided below.
Keywords: track mode, flow diagram, data analysis, heavy-ion experiments, silicon strip detector, compact spectrometer, track decoding
11. Evaluating fracture volume loss during production process by comparative analysis of initial and second flowback data
Authors: Chong Cao, Tamer Moussa, Hassan Dehghanpour. International Journal of Coal Science & Technology, 2025, Issue 3, pp. 274-290 (17 pages).
The fracture volume gradually changes with the depletion of fracture pressure during the production process. However, there are few flowback models available so far that can estimate the fracture volume loss using pressure transient and rate transient data. The initial flowback involves producing back the fracturing fluid after hydraulic fracturing, while the second flowback involves producing back the preloading fluid injected into the parent wells before fracturing of child wells. The main objective of this research is to compare the initial and second flowback data to capture the changes in fracture volume after production and preload processes. Such a comparison is useful for evaluating well performance and optimizing fracturing operations. We construct rate-normalized pressure (RNP) versus material balance time (MBT) diagnostic plots using both initial and second flowback data (FB1 and FB2, respectively) of six multi-fractured horizontal wells completed in the Niobrara and Codell formations in the DJ Basin. In general, the slope of the RNP plot during the FB2 period is higher than that during the FB1 period, indicating a potential loss of fracture volume from the FB1 to the FB2 period. We estimate the changes in effective fracture volume (Vef) by analyzing the changes in the RNP slope and total compressibility between these two flowback periods. Vef during FB2 is in general 3%-45% lower than that during FB1. We also compare the drive mechanisms for the two flowback periods by calculating the compaction-drive index (CDI), hydrocarbon-drive index (HDI), and water-drive index (WDI). The dominant drive mechanism during both flowback periods is CDI, but its contribution is reduced by 16% in the FB2 period. This drop is generally compensated by a relatively higher HDI during this period. The loss of effective fracture volume might be attributed to the pressure depletion in fractures, which occurs during the production period and can extend to 800 days.
Keywords: second flowback data analysis, infill development, preloading effect, effective fracture volume loss, flowback rate-transient analysis
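The diagnostic in this abstract reduces to two simple quantities: rate-normalized pressure RNP = (p_i - p_wf)/q and material balance time MBT = Q/q. Under a tank-type (boundary-dominated) flowback model the RNP-vs-MBT plot is linear with slope roughly 1/(c_t V), so a steeper slope implies a smaller effective volume. The sketch below uses synthetic numbers under that assumption; it is not the paper's workflow.

```python
# Hedged sketch: RNP-vs-MBT slope -> effective volume under an idealized
# tank model (slope m = 1/(ct * V)). All values are synthetic.

def rnp(p_init, p_wf, rate):
    """Rate-normalized pressure: drawdown divided by flow rate."""
    return (p_init - p_wf) / rate

def mbt(cum, rate):
    """Material balance time: cumulative production divided by rate."""
    return cum / rate

ct, V = 1e-4, 5_000.0        # total compressibility (1/kPa), volume (m3)
m_true = 1.0 / (ct * V)      # expected RNP-vs-MBT slope for this tank

# Ideal straight-line data, then recover the slope and volume back from it.
mbts = [10.0, 20.0, 40.0]
rnps = [m_true * t for t in mbts]
m_est = (rnps[-1] - rnps[0]) / (mbts[-1] - mbts[0])
V_est = 1.0 / (ct * m_est)
```

Comparing V_est obtained from the FB1 and FB2 slopes is, in spirit, how a higher second-flowback slope translates into the fracture-volume loss the authors report.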
12. Analysis of the Impact of Legal Digital Currencies on Bank Big Data Practices
Author: Zhengkun Xiu. Journal of Electronic Research and Application, 2025, Issue 1, pp. 23-27 (5 pages).
This paper analyzes the advantages of legal digital currencies and explores their impact on bank big data practices. By combining bank big data collection and processing, it clarifies that legal digital currencies can enhance the efficiency of bank data processing, enrich data types, and strengthen data analysis and application capabilities. In response to future development needs, it is necessary to strengthen data collection management, enhance data processing capabilities, and innovate big data application models, providing references for bank big data practices and promoting the transformation and upgrading of the banking industry in the context of legal digital currencies.
Keywords: legal digital currency, bank big data, data processing efficiency, data analysis and application, countermeasures and suggestions
13. GT-scopy: A Data Processing and Enhancing Package (Level 1.0-1.5) for Ground Solar Telescopes, Based on the 1.6 m Goode Solar Telescope
Authors: Ding Yuan, Wei Wu, Song Feng, Libo Fu, Wenda Cao, Jianchuan Zheng, Lin Mei. Research in Astronomy and Astrophysics, 2025, Issue 11, pp. 191-197 (7 pages).
The increasing demand for high-resolution solar observations has driven the development of advanced data processing and enhancement techniques for ground-based solar telescopes. This study focuses on developing a Python-based package (GT-scopy) for data processing and enhancement for giant solar telescopes, with application to the 1.6 m Goode Solar Telescope (GST) at Big Bear Solar Observatory. The objective is to develop modern data processing software for refining existing data acquisition, processing, and enhancement methodologies to achieve atmospheric effect removal and accurate alignment at the sub-pixel level, particularly within processing levels 1.0-1.5. In this research, we implemented an integrated and comprehensive data processing procedure that includes image de-rotation, zone-of-interest selection, coarse alignment, correction for atmospheric distortions, and fine alignment at the sub-pixel level with an advanced algorithm. The results demonstrate a significant improvement in image quality, with enhanced visibility of fine solar structures both in sunspots and quiet-Sun regions. The enhanced data processing package developed in this study significantly improves the utility of data obtained from the GST, paving the way for more precise solar research and contributing to a better understanding of solar dynamics. This package can be adapted for other ground-based solar telescopes, such as the Daniel K. Inouye Solar Telescope (DKIST), the European Solar Telescope (EST), and the 8 m Chinese Giant Solar Telescope, potentially benefiting the broader solar physics community.
Keywords: techniques: image processing, methods: data analysis, astronomical instrumentation, methods and techniques
14. Single-Cell and Multi-Dimensional Data Analysis of the Key Role of IDH2 in Cervical Squamous Cell Carcinoma Progression
Authors: Xiaojuan Liu, Zhenpeng Zhu, Chenyang Hou, Hui Ma, Xiaoyan Li, Chunxing Ma, Lisha Shu, Huiying Zhang. Biomedical and Environmental Sciences, 2025, Issue 6, pp. 773-778 (6 pages).
Cervical cancer, a leading malignancy globally, poses a significant threat to women's health, with an estimated 604,000 new cases and 342,000 deaths reported in 2020 [1]. As cervical cancer is closely linked to human papillomavirus (HPV) infection, early detection relies on HPV screening; however, late-stage prognosis remains poor, underscoring the need for novel diagnostic and therapeutic targets [2].
Keywords: cervical squamous cell carcinoma, IDH, cervical cancer, multi-dimensional data analysis, novel diagnostic and therapeutic targets, cervical cancer prognosis, human papillomavirus (HPV) infection, early detection
15. Detecting the Lunar Wrinkle Ridges Through Deep Learning Based on DEM and Aspect Data
Authors: Xin Lu, Jiacheng Sun, Gaofeng Shu, Jianhui Zhao, Ning Li. Research in Astronomy and Astrophysics, 2025, Issue 8, pp. 167-179 (13 pages).
Lunar wrinkle ridges are an important stress geological structure on the Moon, which reflect the stress state and geological activity on the Moon. They provide important insights into the evolution of the Moon and are key factors influencing future lunar activity, such as the choice of landing sites. However, automatic extraction of lunar wrinkle ridges is a challenging task due to their complex morphology and ambiguous features. Traditional manual extraction methods are time-consuming and labor-intensive. To achieve automated and detailed detection of lunar wrinkle ridges, we have constructed a lunar wrinkle ridge data set, incorporating previously unused aspect data to provide edge information, and proposed a Dual-Branch Ridge Detection Network (DBR-Net) based on deep learning technology. This method employs a dual-branch architecture and an Attention Complementary Feature Fusion module to address the issue of insufficient lunar wrinkle ridge features. Through comparisons with the results of various deep learning approaches, it is demonstrated that the proposed method exhibits superior detection performance. Furthermore, the trained model was applied to lunar mare regions, generating a distribution map of lunar mare wrinkle ridges; a significant linear relationship between the length and area of the lunar wrinkle ridges was obtained through statistical analysis, and six previously unrecorded potential lunar wrinkle ridges were detected. The proposed method upgrades the automated extraction of lunar wrinkle ridges to pixel-level precision and verifies the effectiveness of DBR-Net in lunar wrinkle ridge detection.
Keywords: Moon, methods: data analysis, planets and satellites: surfaces, techniques: image processing
16. BYSpec: An Automatic Data Reduction Package for BFOSC and YFOSC Spectroscopic Data
Authors: Zi-Chong Zhang, Jun-Bo Zhang, Ju-Jia Zhang, De-Yang Song, Jing Chen, Ming-Yi Ding, Nan Zhou, Liang Wang, Kai Zhang. Research in Astronomy and Astrophysics, 2025, Issue 2, pp. 182-196 (15 pages).
BFOSC and YFOSC are the most frequently used instruments on the Xinglong 2.16 m telescope and the Lijiang 2.4 m telescope, respectively. We developed a software package named "BYSpec" (BFOSC and YFOSC Spectra Reduction Package) dedicated to automatically reducing the long-slit and echelle spectra obtained by these two instruments. The package supports bias and flat-fielding correction, order location, background subtraction, automatic wavelength calibration, and absolute flux calibration. The optimal extraction method maximizes the signal-to-noise ratio and removes most of the cosmic rays imprinted in the spectra. A comparison with the 1D spectra reduced with IRAF verifies the reliability of the results. This open-source software is publicly available to the community.
Keywords: methods: data analysis, instrumentation: spectrographs, techniques: image processing
17. Enhancing Healthcare Data Privacy in Cloud IoT Networks Using Anomaly Detection and Optimization with Explainable AI (ExAI)
Authors: Jitendra Kumar Samriya, Virendra Singh, Gourav Bathla, Meena Malik, Varsha Arya, Wadee Alhalabi, Brij B. Gupta. Computers, Materials & Continua, 2025, Issue 8, pp. 3893-3910 (18 pages).
The integration of the Internet of Things (IoT) into healthcare systems improves patient care, boosts operational efficiency, and contributes to cost-effective healthcare delivery. However, overcoming several associated challenges, such as data security, interoperability, and ethical concerns, is crucial to realizing the full potential of IoT in healthcare. Real-time anomaly detection plays a key role in protecting patient data and maintaining device integrity amidst the additional security risks posed by interconnected systems. In this context, this paper presents a novel method for healthcare data privacy analysis. The technique is based on the identification of anomalies in cloud-based IoT networks, and it is optimized using explainable artificial intelligence. For anomaly detection, the Radial Boltzmann Gaussian Temporal Fuzzy Network (RBGTFN) is used in the process of performing information privacy analysis for healthcare data. Remora Colony Swarm Optimization is then used to carry out the optimization of the network. The performance of the model in identifying anomalies across a variety of healthcare data is evaluated in an experimental study measuring accuracy, precision, latency, Quality of Service (QoS), and scalability. A remarkable 95% precision, 93% latency, 89% quality of service, 98% detection accuracy, and 96% scalability were obtained by the proposed model.
Keywords: healthcare data privacy analysis, anomaly detection, cloud IoT network, explainable artificial intelligence, temporal fuzzy network
18. Multi-Source Heterogeneous Data Fusion Analysis Platform for Thermal Power Plants
Authors: Jianqiu Wang, Jianting Wen, Hui Gao, Chenchen Kang. Journal of Architectural Research and Development, 2025, Issue 6, pp. 24-28 (5 pages).
With the acceleration of the intelligent transformation of energy systems, the monitoring of equipment operation status and the optimization of production processes in thermal power plants face the challenge of multi-source heterogeneous data integration. In view of the heterogeneous characteristics of physical sensor data, including temperature, vibration, and pressure generated by boilers, steam turbines, and other key equipment, as well as real-time working condition data from the SCADA system, this paper proposes a multi-source heterogeneous data fusion and analysis platform for thermal power plants based on edge computing and deep learning. By constructing a multi-level fusion architecture, the platform adopts a dynamic weight allocation strategy and a 5D digital twin model to realize the collaborative analysis of physical sensor data, simulation calculation results, and expert knowledge. The data fusion module combines Kalman filtering, wavelet transform, and Bayesian estimation methods to solve the problems of data time-series alignment and dimension differences. Simulation results show that the data fusion accuracy can be improved to more than 98%, and the calculation delay can be controlled within 500 ms. The data analysis module integrates the Dymola simulation model and the AERMOD pollutant diffusion model, and supports the cascade analysis of boiler combustion efficiency prediction and flue gas emission monitoring; the system response time is less than 2 seconds, and the data consistency verification accuracy reaches 99.5%.
Keywords: thermal power plant, multi-source heterogeneous data, data fusion, analysis platform, edge computing
19. Stenting for symptomatic intracranial arterial stenosis with different qualifying arteries: a preplanned pooled individual patient data analysis
Authors: Tianhua Li, Jichang Luo, Xuesong Bai, Eyad Almallouhi, Peng Gao, Delin Liu, Ran Xu, Wenlong Xu, Guangdong Lu, Haozhi Gong, Xiao Zhang, Taoyuan Lu, Jie Wang, Renjie Yang, Zixuan Xing, Guangjie Liu, Yufu Dai, Colin P Derdeyn, Liqun Jiao, Tao Wang. Stroke & Vascular Neurology, 2025, Issue 4, pp. 422-430 (9 pages).
Background: The efficacy of percutaneous transluminal angioplasty and stenting (PTAS) relative to medical management in treating symptomatic intracranial arterial stenosis (ICAS) varies with the qualifying artery. This study evaluates PTAS against medical therapy alone for ICAS involving the internal carotid artery (ICA), middle cerebral artery (MCA), vertebral artery (VA), and basilar artery (BA). Methods: This is a pooled analysis of individual patient data from two randomised controlled trials evaluating PTAS versus medical management for symptomatic ICAS with different qualifying arteries. The primary outcome was stroke or death within 30 days post-enrolment, or stroke in the region of the qualifying artery beyond 30 days through 1 year. An intention-to-treat approach was employed, and hazard ratios (HRs) with 95% CIs were used to convey risk estimates. Results: Data from 809 individuals were pooled from the Stenting vs Aggressive Medical Management for Preventing Recurrent Stroke in Intracranial Stenosis trial and the China Angioplasty and Stenting for Symptomatic Intracranial Severe Stenosis trial; 400 were assigned to PTAS and 409 to medical therapy alone. For the primary outcome, patients with symptomatic BA stenosis had a significantly higher risk with PTAS than with medical therapy (17.17% vs 7.77%; difference 9.40%; HR 2.38 (1.03 to 5.52); p=0.04). PTAS showed no significant difference for symptomatic ICA (26.67% vs 16.67%; HR 1.68 (0.78 to 3.62); p=0.19), MCA (8.28% vs 9.79%; HR 0.85 (0.42 to 1.74); p=0.66), or VA stenosis (9.52% vs 10.71%; HR 0.91 (0.32 to 2.62); p=0.86) compared with medical therapy. Conclusions: PTAS significantly increases the risk of both short-term and long-term stroke in patients with symptomatic BA stenosis; without significant technological advances to mitigate these risks, PTAS offers limited benefit. For symptomatic ICA, MCA, and VA stenosis, PTAS provided no significant advantage.
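As a back-of-the-envelope check (not the trial's Cox model), the reported 1-year primary-outcome rates for BA stenosis can be compared directly. The crude risk ratio computed from the raw percentages differs from the time-to-event HR of 2.38 reported in the abstract, because the Cox model accounts for censoring and follow-up time.

```python
# Crude risk comparison from the BA-stenosis event rates quoted above.
# This is NOT the study's HR: a hazard ratio comes from a time-to-event
# model, while these are simple end-of-study proportions.

ptas_rate, medical_rate = 17.17, 7.77           # % with primary outcome

risk_ratio = ptas_rate / medical_rate           # crude ratio, ignores timing
risk_difference = ptas_rate - medical_rate      # absolute excess risk, in points

print(f"crude risk ratio: {risk_ratio:.2f}")    # ~2.21 (Cox HR was 2.38)
print(f"risk difference:  {risk_difference:.2f} percentage points")  # 9.40
```

The 9.40-point absolute difference means roughly one extra primary-outcome event per 11 BA-stenosis patients treated with PTAS instead of medical therapy over the study horizon.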
Keywords: percutaneous transluminal angioplasty and stenting (PTAS), medical management, internal carotid artery (ICA), middle cerebral artery (MCA), pooled analysis, individual patient data, intracranial arterial stenosis (ICAS)
20. Seismic data analysis based on spatial subsets (cited by 2)
Authors: 蔡希玲, 刘学伟, 李虹, 钱宇明, 吕英梅. Applied Geophysics (SCIE, CSCD), 2009, Issue 4, pp. 384-392, 395 (10 pages).
There are limitations in applying conventional methods to the massive volumes of seismic data acquired with high-density spatial sampling, since processors usually obtain the properties of raw data from common shot gathers or other datasets located at certain points or along lines. This paper proposes a novel method for observing seismic data on time slices taken from spatial subsets. The composition of a spatial subset and the distinctive character of orthogonal and oblique subsets are described, and pre-stack subsets are shown by 3D visualization. In seismic data processing, spatial subsets can be used: (1) to check trace distribution uniformity and regularity; (2) to observe the main features of ground-roll and linear noise; (3) to find abnormal traces from slices of datasets; and (4) to QC the results of pre-stack noise attenuation. The field data application shows that seismic data analysis in spatial subsets is an effective method that can lead to better discrimination among various wavefields and help obtain more information.
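The time-slice view described above can be sketched with a synthetic pre-stack volume. The array shape, the planted anomalous trace, and the 5x-median threshold are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of time-slice analysis on a spatial subset: a synthetic
# volume shaped (time, inline, crossline) with one deliberately bad trace,
# which a per-trace RMS map over the slices then exposes (use (3) above:
# finding abnormal traces from slices of the dataset).
import numpy as np

rng = np.random.default_rng(0)
volume = rng.normal(0.0, 1.0, size=(500, 40, 40))   # nt x nx x ny samples
volume[:, 12, 7] *= 25.0                            # plant one anomalous trace

t_slice = volume[250]                               # a single time slice (nx x ny)
rms = np.sqrt((volume ** 2).mean(axis=0))           # RMS amplitude per trace

# Flag traces whose RMS is far above the median of the subset.
bad = np.argwhere(rms > 5.0 * np.median(rms))
print(bad)                                          # -> [[12  7]]
```

Because the subset is held as a regular 3D array, the same RMS map also serves the other listed uses: gaps in trace coverage show up as empty cells, and coherent noise such as ground-roll appears as banded amplitude patterns on individual time slices.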
Keywords: spatial subset, 3D visualization, high-density sampling, noise attenuation, data analysis