Journal Articles
359,249 articles found
1. Purposeless repeated acquisition time-lapse seismic data processing (cited 4 times)
Authors: Li Jingye, Chen Xiaohong, Zhao Wei, Zhang Yunpeng. Petroleum Science (SCIE, CAS, CSCD), 2008, No. 1, pp. 31-36.
In China, most oil fields are continental sedimentary deposits with strong heterogeneity, which on the one hand makes reservoir prospecting and development more difficult, but on the other hand provides more scope for finding residual oil in mature fields. Time-lapse seismic reservoir monitoring is one of the most important techniques for delineating residual oil distribution. In line with the demand for, and development of, time-lapse seismic reservoir monitoring in China, purposeless repeated acquisition time-lapse seismic data processing was studied. The four key steps of this processing workflow, namely amplitude-preserved processing with relative consistency, rebinning, match filtering, and difference calculation, were analyzed by combining theory with real seismic data processing. Quality control during real time-lapse seismic processing was also emphasized.
Keywords: time-lapse seismic, purposeless repeated acquisition, rebinning, match filtering, amplitude-preserved processing
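The match-filtering and difference steps described above can be sketched in a few lines: fit a least-squares cross-equalization filter that shapes the baseline survey to the monitor survey, then subtract. This is an illustrative sketch, not the paper's implementation; the function names and filter length (odd) are assumptions.

```python
import numpy as np

def design_matching_filter(baseline, monitor, flen=11):
    """Least-squares filter f so that convolve(baseline, f, 'same') ~= monitor (flen odd)."""
    n, pad = len(baseline), flen // 2
    xp = np.pad(baseline, (pad, pad))
    # Row i of A holds the baseline samples so that A @ f == np.convolve(baseline, f, "same")
    A = np.array([[xp[i + 2 * pad - j] for j in range(flen)] for i in range(n)])
    f, *_ = np.linalg.lstsq(A, monitor, rcond=None)
    return f

def timelapse_difference(baseline, monitor, flen=11):
    """Match the baseline to the monitor survey, then subtract;
    the residual highlights production-induced reservoir changes."""
    f = design_matching_filter(baseline, monitor, flen)
    matched = np.convolve(baseline, f, mode="same")
    return monitor - matched
```

When the monitor trace differs from the baseline only by a short convolutional distortion, the residual collapses to near zero, so genuine reservoir changes stand out in the difference section.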
2. Time-lapse microgravity monitoring technology for underground gas storage and its application based on Robust Principal Component Analysis
Authors: Peng Xiang, Xue-guo Chen, Hong-mei Luo, Juan Zhang, Ling-wei Meng, Tao Guo. Applied Geophysics, 2025, No. 3, pp. 770-783, 895-896.
Underground gas storage (UGS) is an important guarantee for national energy strategic reserves, but monitoring the distribution of stored gas has always been challenging. Time-lapse microgravity monitoring can infer the movement of subsurface material from density changes between surveys. Simulation results indicate that the technology provides strong support for dynamic UGS monitoring. However, processing time-lapse microgravity data requires field separation to isolate the gravity anomaly of the target body. To obtain more accurate and stable field separation results, this paper exploits the low-rank nature of the regional field and the sparsity of the local field in potential field data, and adopts a method based on Robust Principal Component Analysis (RPCA) for field separation. In a study of the gas injection process at the Y21 UGS, microgravity measurement and processing results show that the gas-enriched areas are approximately annular and located at structural high points, which matches the geological structure. Owing to the boundary faults, the time-lapse microgravity results suggest that groundwater moves toward the structural high points while natural gas moves mainly to the southwest, indicating the direction of underground fluid movement during gas injection.
Keywords: time-lapse microgravity monitoring, RPCA, UGS
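As a hedged sketch of the field-separation idea, RPCA (Principal Component Pursuit) splits a data matrix into a low-rank part (regional field) plus a sparse part (local anomaly). The inexact augmented Lagrangian solver below uses common default parameters, not necessarily the paper's choices.

```python
import numpy as np

def _soft(x, t):
    """Elementwise soft-thresholding (proximal operator of the l1 norm)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def rpca(D, lam=None, iters=200, tol=1e-7):
    """Split D ~= L + S: L low-rank (regional field), S sparse (local field),
    via the inexact augmented Lagrangian method for Principal Component Pursuit."""
    m, n = D.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))   # standard PCP weight
    mu = 1.25 / np.linalg.norm(D, 2)
    Y = np.zeros_like(D)
    S = np.zeros_like(D)
    normD = np.linalg.norm(D)
    for _ in range(iters):
        # Low-rank update: singular value thresholding
        U, sig, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
        L = (U * _soft(sig, 1.0 / mu)) @ Vt
        # Sparse update: elementwise soft-thresholding
        S = _soft(D - L + Y / mu, lam / mu)
        Y += mu * (D - L - S)
        mu *= 1.5
        if np.linalg.norm(D - L - S) <= tol * normD:
            break
    return L, S
```

On a gridded gravity map, `L` would play the role of the smooth regional field and `S` the localized target-body anomaly.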
3. Mechanism and Application of Dynamic Monitoring for Seepage Processes in Earth-Rock Dams Using the Time-Lapse Electrical Resistivity Method
Authors: Cai Zu-gen, Yang Ming-sheng, Wang Jian-jun, Xu Yao-hui. Applied Geophysics, 2025, No. 4, pp. 1464-1474, 1502.
In leakage detection for reservoir dam bodies, traditional geophysical methods achieve only one-off detection. Meanwhile, owing to the non-uniqueness of geophysical inversion, improving the fidelity of inversion profiles has become a key challenge in the industry. This study constructs a long-term real-time monitoring system using the time-lapse resistivity method, reveals the spatiotemporal evolution of dam leakage, and provides technical support for accurate treatment. By integrating the Internet of Things (IoT), 5G, and AI technologies, real-time data acquisition, real-time transmission, and automatic inversion are realized. Through dynamic imaging of the electrical anomalies in the leakage area and comparison of corresponding rainfall events, leakage extent, and resistivity changes, the reliability and efficiency of dam leakage detection are significantly improved. This enables long-term dynamic monitoring of dam leakage and offers a new perspective for the safe operation and maintenance of reservoirs.
Keywords: geophysical detection, time-lapse resistivity method, IoT and 5G technologies, dynamic monitoring
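The core time-lapse quantity behind such monitoring is simply the percent resistivity change between surveys; water ingress lowers resistivity, so strong decreases flag candidate seepage zones. A minimal sketch (the -20% threshold is an illustrative assumption, not from the paper):

```python
import numpy as np

def resistivity_change(rho_base, rho_monitor, threshold=-20.0):
    """Percent change of inverted resistivity between a baseline and a later survey.
    Cells with change <= threshold are flagged as candidate seepage zones."""
    change = 100.0 * (rho_monitor - rho_base) / rho_base
    return change, change <= threshold
```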
4. Accurate reconstruction method of virtual shot records in passive source time-lapse monitoring based on SBA network
Authors: Ying-He Wu, Shu-Lin Pan, Kai Chen, Yao-Jie Chen, Da-Wei Liu, Zi-Yu Qin, Sheng-Bo Yi, Ze-Yang Liu. Petroleum Science, 2025, No. 9, pp. 3548-3564.
Passive source imaging can reconstruct body-wave reflections similar to those of active sources through seismic interferometry (SI). It has become a low-cost, environmentally friendly alternative to active-source seismic surveying, showing great potential. However, the method faces many practical challenges, including uneven distribution of subsurface sources and complex survey environments. These conditions seriously degrade the reconstruction quality of virtual shot records, leaving imaging results unreliable and greatly limiting passive source seismic exploration. In addition, the quality of the reconstructed records depends directly on the duration of the noise records, yet in practice it is often difficult to obtain long, high-quality noise segments containing body-wave events. To address these problems, we propose a deep learning method for reconstructing passive source virtual shot records and apply it to passive source time-lapse monitoring. The method combines a UNet and a BiLSTM (Bidirectional Long Short-Term Memory) network to extract spatial and temporal features respectively, and introduces a spatial attention mechanism to form a hybrid SUNet-BiLSTM-Attention (SBA) network trained with supervision. Through pre-training and fine-tuning, the network can accurately reconstruct passive source virtual shot records directly from short noisy segments containing body-wave events. Experiments on synthetic data show that the reconstructed virtual shot records have high resolution and signal-to-noise ratio (SNR), providing high-quality data for subsequent monitoring and imaging. Finally, to further validate the method, we applied it to field data collected from a gas storage site in northwest China. The field results effectively improve the quality of virtual records and yield more reliable time-lapse imaging, demonstrating significant practical value.
Keywords: passive source virtual shot reconstruction, passive source time-lapse monitoring, SUNet-BiLSTM-Attention network
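For context, the seismic-interferometry step that such networks learn to improve can be sketched as cross-correlation of ambient-noise records: correlating every receiver with a reference receiver turns the reference into a virtual source. A toy numpy version (the array layout is an assumption; real workflows stack many noise windows to raise SNR):

```python
import numpy as np

def virtual_shot_gather(noise, ref=0):
    """Cross-correlate each receiver's noise record with the reference receiver's.
    noise has shape (n_receivers, n_samples); the result approximates the response
    to a virtual source located at the reference receiver."""
    nrec, nt = noise.shape
    gather = np.empty((nrec, 2 * nt - 1))
    for i in range(nrec):
        gather[i] = np.correlate(noise[i], noise[ref], mode="full")
    return gather  # lag axis spans -(nt-1) .. +(nt-1); index nt-1 is zero lag
```

The travel time from the virtual source to each receiver appears as the lag of the correlation peak.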
5. Dynamic reservoir monitoring using similarity analysis of passive source time-lapse seismic images: Application to waterflooding front monitoring in Shengli Oilfield, China
Authors: Ying-He Wu, Shu-Lin Pan, Hai-Qiang Lan, Jing-Yi Chen, Jose Badal, Yao-Jie Chen, Zi-Lin Zhang, Zi-Yu Qin. Petroleum Science, 2025, No. 3, pp. 1062-1079.
In common oilfield practice, the injection of water and gas into reservoirs is a crucial technique for increasing production. Controlling the waterflooding front during oil/gas exploitation is of great concern to reservoir engineers. Monitoring the waterflooding front plays an important role in adjusting the well network and in later production, exploiting the remaining oil potential and ultimately improving the recovery rate. Microseismic monitoring, numerical simulation, four-dimensional seismic, and other methods have long been used for waterflooding front monitoring, but reconciling their reliability and cost remains a significant challenge. To achieve real-time, reliable, and cost-effective monitoring, we propose a method for waterflooding front monitoring through similarity analysis of passive source time-lapse seismic images. Passive source seismic data collected from oil fields typically have an extremely low signal-to-noise ratio (SNR), which makes structural imaging difficult. The proposed method visualizes and analyzes underground changes by highlighting time-lapse images and provides a strategy for subsurface monitoring from long-term passive source data under low-SNR conditions. First, we verify the feasibility of the method on a theoretical model. Then, we analyze the correlation coefficient (similarity) of the passive source time-lapse seismic images to enhance image differences and identify the simulated waterflooding fronts. Finally, the method is applied to actual waterflooding front monitoring in Shengli Oilfield, China. The monitoring results are consistent with the actual development conditions, demonstrating that the method has great potential for practical application and is well suited to routine development monitoring in oil fields.
Keywords: passive source time-lapse seismic imaging, seismic interferometry, dynamic reservoir monitoring, similarity analysis, waterflooding front monitoring, Shengli Oilfield
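The similarity-analysis step can be sketched as a windowed correlation coefficient between the baseline and monitor images: blocks where similarity drops mark zones changed by the flood front. A minimal sketch under assumed inputs (window size and block tiling are illustrative):

```python
import numpy as np

def local_similarity(img0, img1, win=8):
    """Windowed correlation coefficient between baseline and monitor images;
    low similarity marks zones changed between the two surveys."""
    rows, cols = img0.shape[0] // win, img0.shape[1] // win
    sim = np.ones((rows, cols))
    for r in range(rows):
        for c in range(cols):
            a = img0[r * win:(r + 1) * win, c * win:(c + 1) * win].ravel()
            b = img1[r * win:(r + 1) * win, c * win:(c + 1) * win].ravel()
            if a.std() > 0 and b.std() > 0:
                sim[r, c] = np.corrcoef(a, b)[0, 1]
    return sim
```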
6. Time-Lapse Geoenvironmental Hydrocarbon Attenuation in Nigeria: Integrated Geoelectrical, Geochemical and Geotechnical Site Characterization
Authors: Nurudeen Ahmed Onomhoale, Nik Norsyahariati Nik Daud, Ipoola Ajani Okunlola, Syazwani Idrus, Siti Nur Aliaa Roslan. Journal of Environmental & Earth Sciences, 2025, No. 8, pp. 408-436.
Hydrocarbon contamination from oil spills presents geoenvironmental and geoengineering challenges, notably in Eleme, Nigeria. This study integrates electrical resistivity tomography (ERT), soil total petroleum hydrocarbon (TPH) analysis, and geotechnical testing for monitoring and characterizing a treated spill site over six months. Four 100 m ERT lines, L1 to L4, with electrode spacings of 1.5 m, 3 m, 6 m, 9 m, 12 m, and 15 m, were established for the first and second sampling phases. Twenty-one soil samples (12 for TPH and 9 for mechanical analysis) were obtained from 5 boreholes, BH1 to BH4 at the study site and BH5 at a control site, across the phases at 0.5 m, 3.0 m, and 5.0 m depths along the ERT lines. ERT results reveal resistivity reductions averaging 18% in shallow zones of active degradation, correlating with an average 41% TPH decrease. Specific gravity averaged 2.49 in the spill soils versus 2.58 in the control samples, reflecting hydrocarbon-induced density reductions of 3.5%. Particle-size analysis showed the spill soils contained >50% fines, increasing water retention and reducing permeability by 30%. Consolidation tests highlighted increased compressibility, with settlements of 1.89 mm in spill soils versus 1.01 mm in control samples, a 47% increase attributable to hydrocarbon degradation. Correlation analysis shows slower consolidation at BH3 (−0.62 Cv) with a moderate settlement increase (0.25), while BH4 exhibits much higher compressibility (0.95) but minimal Cv impact (0.23), indicating greater structural weakness with higher residual TPH. Degradation reduced TPH by 19%-64% in shallow zones, with persistent contamination at deeper layers exceeding regulatory limits, emphasising the need for ongoing monitoring and targeted remediation for long-term stability and sustainability.
Keywords: degradation monitoring, geoenvironmental hydrocarbon contamination, time-lapse electrical resistivity tomography (TL-ERT), soil mechanical properties, temporal geochemical assessment, correlation matrix
7. A Composite Loss-Based Autoencoder for Accurate and Scalable Missing Data Imputation
Authors: Thierry Mugenzi, Cahit Perkgoz. Computers, Materials & Continua, 2026, No. 1, pp. 1985-2005.
Missing data presents a crucial challenge in data analysis, especially in high-dimensional datasets, where it often leads to biased conclusions and degraded model performance. In this study, we present a novel autoencoder-based imputation framework that integrates a composite loss function to enhance robustness and precision. The proposed loss combines (i) a guided, masked mean squared error focusing on missing entries; (ii) a noise-aware regularization term to improve resilience against data corruption; and (iii) a variance penalty to encourage expressive yet stable reconstructions. We evaluate the model across four missingness mechanisms, Missing Completely at Random, Missing at Random, Missing Not at Random, and Missing Not at Random with quantile censorship, under systematically varied feature counts, sample sizes, and missingness ratios ranging from 5% to 60%. Four publicly available real-world datasets (Stroke Prediction, Pima Indians Diabetes, Cardiovascular Disease, and Framingham Heart Study) were used, and the results show that the proposed model consistently outperforms baseline methods, including traditional and deep learning-based techniques. An ablation study reveals the additive value of each component of the loss function. Additionally, we assessed the downstream utility of the imputed data through classification tasks, where datasets imputed by the proposed method yielded the highest receiver operating characteristic area-under-the-curve scores across all scenarios. The model demonstrates strong scalability and robustness, improving with larger datasets and higher feature counts. These results underscore its capacity to produce not only numerically accurate but also semantically useful imputations, making it a promising solution for robust data recovery in clinical applications.
Keywords: missing data imputation, autoencoder, deep learning, missingness mechanisms
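The three loss terms can be sketched directly in numpy (the weights and the exact form of the noise and variance terms are assumptions; the paper defines its own formulation):

```python
import numpy as np

def composite_loss(x_true, x_recon, miss_mask, lam_noise=0.1, lam_var=0.01):
    """Composite imputation loss sketch: (i) masked MSE on the missing entries,
    (ii) a noise-aware MSE on observed entries for robustness to corruption,
    (iii) a variance-matching penalty that discourages collapsed reconstructions."""
    masked_mse = np.mean((x_true[miss_mask] - x_recon[miss_mask]) ** 2)
    noise_term = np.mean((x_true[~miss_mask] - x_recon[~miss_mask]) ** 2)
    var_pen = np.mean((x_recon.var(axis=0) - x_true.var(axis=0)) ** 2)
    return masked_mse + lam_noise * noise_term + lam_var * var_pen
```

In training, `x_true` would hold the ground-truth values on artificially masked entries, and the loss would be minimized over the autoencoder's parameters.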
8. Advances in Machine Learning for Explainable Intrusion Detection Using Imbalanced Datasets in Cybersecurity with Harris Hawks Optimization
Authors: Amjad Rehman, Tanzila Saba, Mona M. Jamjoom, Shaha Al-Otaibi, Muhammad I. Khan. Computers, Materials & Continua, 2026, No. 1, pp. 1804-1818.
Modern intrusion detection systems (IDS) face persistent challenges in coping with the rapid evolution of cyber threats, high-volume network traffic, and imbalanced datasets. Traditional models often lack the robustness and explainability required to detect novel and sophisticated attacks effectively. This study introduces an advanced, explainable machine learning framework for multi-class IDS using the KDD99 and related IDS datasets, which reflect real-world network behavior through a blend of normal and diverse attack classes. The methodology begins with careful data preprocessing, incorporating both RobustScaler and QuantileTransformer to handle outliers and skewed feature distributions and produce standardized, model-ready inputs. Dimensionality reduction is achieved via the Harris Hawks Optimization (HHO) algorithm, a nature-inspired metaheuristic modeled on hawks' hunting strategies. HHO efficiently identifies the most informative features by optimizing a fitness function based on classification performance. Following feature selection, SMOTE is applied to the training data to resolve class imbalance by synthetically augmenting underrepresented attack types. A stacked architecture is then employed, combining the strengths of XGBoost, SVM, and Random Forest as base learners. This layered approach improves prediction robustness and generalization by balancing bias and variance across diverse classifiers. The model was evaluated using standard classification metrics: precision, recall, F1-score, and overall accuracy. The best overall performance was an accuracy of 99.44% on UNSW-NB15, demonstrating the model's effectiveness. After balancing, the model showed a clear improvement in detecting attacks. We tested the model on four datasets and performed an ablation study to check the effect of each component; the proposed model is also computationally efficient. To support transparency and trust in decision-making, explainable AI (XAI) techniques are incorporated that provide both global and local insight into feature contributions and offer intuitive visualizations for individual predictions. This makes the framework suitable for practical deployment in cybersecurity environments that demand both precision and accountability.
Keywords: intrusion detection, XAI, machine learning, ensemble method, cybersecurity, imbalanced data
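The SMOTE balancing step can be illustrated with a minimal, dependency-free sketch: each synthetic minority sample is a random point on the segment between a minority sample and one of its k nearest minority neighbors. Real pipelines would use imbalanced-learn's SMOTE; the function name and k are illustrative.

```python
import numpy as np

def smote_like(X_min, n_new, k=5, rng=None):
    """SMOTE-style oversampling: interpolate between minority samples and
    their k nearest minority-class neighbors to create n_new synthetic rows."""
    if rng is None:
        rng = np.random.default_rng(0)
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)                  # never pick yourself as neighbor
    nn = np.argsort(d, axis=1)[:, :k]
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        j = nn[i, rng.integers(min(k, len(X_min) - 1))]
        t = rng.random()
        synthetic.append(X_min[i] + t * (X_min[j] - X_min[i]))
    return np.vstack(synthetic)
```

Because every synthetic point is a convex combination of two real minority samples, the oversampled class stays inside the original feature ranges rather than duplicating rows.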
9. Enhanced Capacity Reversible Data Hiding Based on Pixel Value Ordering in Triple Stego Images
Authors: Kim Sao Nguyen, Ngoc Dung Bui. Computers, Materials & Continua, 2026, No. 1, pp. 1571-1586.
Reversible data hiding (RDH) enables secret data embedding while preserving complete recovery of the cover image, making it crucial for applications requiring image integrity. The pixel value ordering (PVO) technique used with multiple stego images provides good image quality but often yields low embedding capacity. To address this, the paper proposes a high-capacity RDH scheme based on PVO that generates three stego images from a single cover image. The cover image is partitioned into non-overlapping blocks whose pixels are sorted in ascending order. Four secret bits are embedded into each block's maximum pixel value, and three additional bits are embedded into the second-largest value when the pixel difference exceeds a predefined threshold. A similar strategy is applied on the minimum side of the block, including the second-smallest pixel value. This design enables each block to embed up to 14 bits of secret data. Experimental results demonstrate that the proposed method achieves significantly higher embedding capacity and improved visual quality compared with existing triple-stego RDH approaches, advancing the field of reversible steganography.
Keywords: reversible data hiding (RDH), pixel value ordering (PVO), RDH based on three stego images
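The max-side PVO rule underlying such schemes can be sketched as a reversible 1-bit embed. This is a deliberate simplification: the paper embeds up to 14 bits per block across three stego images, and real code must also track pixel positions, which is omitted here.

```python
def pvo_embed_max(block, bit):
    """Classic PVO on the block maximum: embed when the gap to the second-largest
    pixel is exactly 1, shift when it is larger, leave ties (gap 0) untouched."""
    x = sorted(block)
    d = x[-1] - x[-2]
    if d == 1:
        x[-1] += bit          # stego gap 1 -> bit 0, gap 2 -> bit 1
    elif d > 1:
        x[-1] += 1            # shift so stego gap > 2 stays distinguishable
    return x                  # d == 0: unchanged, carries no payload

def pvo_extract_max(stego):
    """Recover the original sorted block and the embedded bit (None if no payload)."""
    x = sorted(stego)
    d = x[-1] - x[-2]
    if d == 1:
        return x, 0                  # bit 0: pixel was never modified
    if d == 2:
        x[-1] -= 1
        return x, 1                  # bit 1: undo the +1
    if d > 2:
        x[-1] -= 1
        return x, None               # shifted block: restore, no payload
    return x, None                   # gap 0: untouched block
```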
10. Impact of Data Processing Techniques on AI Models for Attack-Based Imbalanced and Encrypted Traffic within IoT Environments
Authors: Yeasul Kim, Chaeeun Won, Hwankuk Kim. Computers, Materials & Continua, 2026, No. 1, pp. 247-274.
With the increasing emphasis on personal information protection, encryption through security protocols has become a critical requirement for data transmission and reception. Nevertheless, IoT ecosystems are heterogeneous networks in which outdated systems coexist with the latest devices, ranging from non-encrypted to fully encrypted ones. Given the limited payload visibility in this context, this study investigates AI-based attack detection methods that leverage encrypted traffic metadata, eliminating the need for decryption and minimizing system performance degradation across these heterogeneous devices. Using the UNSW-NB15 and CICIoT-2023 datasets, encrypted and unencrypted traffic were categorized by security protocol, and AI-based intrusion detection experiments were conducted for each traffic type based on metadata. To mitigate class imbalance, eight different data sampling techniques were applied, and their effectiveness was comparatively analyzed using two ensemble models and three deep learning (DL) models from various perspectives. The experimental results confirmed that metadata-based attack detection is feasible using only encrypted traffic. On the UNSW-NB15 dataset, the F1-score for encrypted traffic was approximately 0.98, about 4.3% higher than that for unencrypted traffic (approximately 0.94). Analysis of the encrypted traffic in the CICIoT-2023 dataset using the same method showed a significantly lower F1-score of roughly 0.43, indicating that dataset quality and preprocessing have a substantial impact on detection performance. Furthermore, when data sampling techniques were applied to encrypted traffic, recall on the UNSW-NB15 (encrypted) dataset improved by up to 23.0%, and on the CICIoT-2023 (encrypted) dataset by 20.26%, a similar level of improvement. Notably, on CICIoT-2023, the F1-score and Receiver Operating Characteristic Area Under the Curve (ROC-AUC) increased by 59.0% and 55.94%, respectively. These results suggest that data sampling can be beneficial even in encrypted environments, although the extent of the improvement may vary with data quality, model architecture, and sampling strategy.
Keywords: encrypted traffic attack detection, data sampling techniques, AI-based detection, IoT environment
11. Graph-Based Unified Settlement Framework for Complex Electricity Markets: Data Integration and Automated Refund Clearing
Authors: Xiaozhe Guo, Suyan Long, Ziyu Yue, Yifan Wang, Guanting Yin, Yuyang Wang, Zhaoyuan Wu. Energy Engineering, 2026, No. 1, pp. 56-90.
The increasing complexity of China's electricity market creates substantial challenges for settlement automation, data consistency, and operational scalability. Existing provincial settlement systems are fragmented, lack a unified data structure, and depend heavily on manual intervention to process high-frequency and retroactive transactions. To address these limitations, a graph-based unified settlement framework is proposed to enhance automation, flexibility, and adaptability in electricity market settlements. A flexible attribute-graph model represents heterogeneous multi-market data, enabling standardized integration, rapid querying, and seamless adaptation to evolving business requirements. An extensible operator library supports configurable settlement rules, and a suite of modular tools (dataset generation, formula configuration, billing templates, and task scheduling) facilitates end-to-end automated settlement processing. A robust refund-clearing mechanism is further incorporated, using sandbox execution, data-version snapshots, dynamic lineage tracing, and real-time change-capture technologies to enable rapid and accurate recalculation under dynamic policy and data revisions. Case studies based on real-world data from regional Chinese markets validate the approach, demonstrating marked improvements in computational efficiency, system robustness, and automation. Moreover, enhanced settlement accuracy and high temporal granularity improve price-signal fidelity, promote cost-reflective tariffs, and incentivize energy-efficient and demand-responsive behavior among market participants. The method supports equitable and transparent market operations and provides a generalizable, scalable foundation for modern electricity settlement platforms in increasingly complex and dynamic market environments.
Keywords: electricity market, market settlement, data model, graph database, market refund clearing
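The attribute-graph idea can be illustrated with a toy property graph in which nodes and edges carry free-form attribute dicts, so a new market product needs no schema migration. This is a teaching sketch, not the paper's data model; all names and numbers are hypothetical.

```python
class AttrGraph:
    """Minimal property graph: nodes and edges are free-form attribute dictionaries."""
    def __init__(self):
        self.nodes = {}
        self.edges = []

    def add_node(self, nid, **attrs):
        self.nodes[nid] = attrs

    def add_edge(self, src, dst, **attrs):
        self.edges.append((src, dst, attrs))

    def settle(self, participant):
        """Toy settlement operator: total = sum(quantity * price) over the
        participant's contract edges."""
        return sum(a["qty"] * a["price"] for s, _, a in self.edges
                   if s == participant and a.get("kind") == "contract")

# Hypothetical mini market: one generator with two contracts into the spot market.
g = AttrGraph()
g.add_node("genco1", type="generator", region="north")
g.add_node("spot", type="market")
g.add_edge("genco1", "spot", kind="contract", qty=120.0, price=0.42)
g.add_edge("genco1", "spot", kind="contract", qty=80.0, price=0.55)
```

A refund-clearing recalculation would, in this picture, amount to replaying the same operator over a versioned snapshot of the edges.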
12. Efficient Arabic Essay Scoring with Hybrid Models: Feature Selection, Data Optimization, and Performance Trade-Offs
Authors: Mohamed Ezz, Meshrif Alruily, Ayman Mohamed Mostafa, Alaa S. Alaerjan, Bader Aldughayfiq, Hisham Allahem, Abdulaziz Shehab. Computers, Materials & Continua, 2026, No. 1, pp. 2274-2301.
Automated essay scoring (AES) systems have gained significant importance in educational settings, offering a scalable, efficient, and objective method for evaluating student essays. However, developing AES systems for Arabic poses distinct challenges due to the language's complex morphology, diglossia, and the scarcity of annotated datasets. This paper presents a hybrid approach to Arabic AES that combines text-based, vector-based, and embedding-based similarity measures to improve scoring accuracy while minimizing the training data required. Using a large Arabic essay dataset categorized into thematic groups, the study conducted four experiments to evaluate the impact of feature selection, data size, and model performance. Experiment 1 established a baseline using a non-machine-learning approach that selects the top-N correlated features to predict essay scores. The subsequent experiments employed 5-fold cross-validation. Experiment 2 showed that combining embedding-based, text-based, and vector-based features in a Random Forest (RF) model achieved an R2 of 88.92% and an accuracy of 83.3% within a 0.5-point tolerance. Experiment 3 further refined the feature selection process, demonstrating that 19 correlated features yielded optimal results, improving R2 to 88.95%. Experiment 4 introduced a data-efficiency training approach in which training portions were increased from 5% to 50%; using just 10% of the data achieved near-peak performance, with an R2 of 85.49%, an effective trade-off between performance and computational cost. These findings highlight the potential of the hybrid approach for developing scalable Arabic AES systems, especially in low-resource environments, addressing linguistic challenges while ensuring efficient data usage.
Keywords: automated essay scoring, text-based features, vector-based features, embedding-based features, feature selection, optimal data efficiency
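The three families of similarity features can be sketched as follows, comparing a student essay against a model answer. This is a hedged illustration; the paper's exact features, tokenizers, and embedding models differ.

```python
import math

def jaccard(a_tokens, b_tokens):
    """Text-based similarity: token-set overlap."""
    A, B = set(a_tokens), set(b_tokens)
    return len(A & B) / len(A | B) if A | B else 1.0

def cosine(u, v):
    """Cosine similarity for either sparse term vectors or dense embeddings."""
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(y * y for y in v))
    return dot / (nu * nv) if nu and nv else 0.0

def hybrid_features(essay_tokens, model_tokens, essay_vec, model_vec,
                    essay_emb, model_emb):
    """One text-based, one vector-based, and one embedding-based similarity,
    concatenated as inputs to a downstream regressor (feature choice is illustrative)."""
    return [jaccard(essay_tokens, model_tokens),
            cosine(essay_vec, model_vec),
            cosine(essay_emb, model_emb)]
```

A Random Forest regressor trained on such feature vectors would then map similarity scores to essay grades.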
13. Individual Software Expertise Formalization and Assessment from Project Management Tool Databases
Authors: Traian-Radu Plosca, Alexandru-Mihai Pescaru, Bianca-Valeria Rus, Daniel-Ioan Curiac. Computers, Materials & Continua, 2026, No. 1, pp. 389-411.
Objective expertise evaluation of individuals, as a prerequisite for team formation, has long been a desideratum in large software development companies. With rapid advancements in machine learning methods and reliable data stored in project management tools' datasets, automating this evaluation becomes a natural step forward. In this context, our approach quantifies software developer expertise using metadata from task-tracking systems. We mathematically formalize two categories of expertise: technology-specific expertise, denoting the skills required for a particular technology, and general expertise, encapsulating overall knowledge of the software industry. We then automatically classify the zones of expertise associated with each task a developer has worked on, using Bidirectional Encoder Representations from Transformers (BERT)-like models to handle the unique characteristics of project tool datasets effectively. Finally, our method evaluates the proficiency of each software specialist across completed projects from both technology-specific and general perspectives. The method was experimentally validated, yielding promising results.
Keywords: expertise formalization, transformer-based models, natural language processing, augmented data, project management tool, skill classification
14. Harnessing deep learning for the discovery of latent patterns in multi-omics medical data
Authors: Okechukwu Paul-Chima Ugwu, Fabian C. Ogenyi, Chinyere Nkemjika Anyanwu, Melvin Nnaemeka Ugwu, Esther Ugo Alum, Mariam Basajja, Joseph Obiezu Chukwujekwu Ezeonwumelu, Daniel Ejim Uti, Ibe Michael Usman, Chukwuebuka Gabriel Eze, Simeon Ikechukwu Egba. Medical Data Mining, 2026, No. 1, pp. 32-45.
With the rapid growth of biomedical data, particularly multi-omics data spanning genomics, transcriptomics, proteomics, metabolomics, and epigenomics, medical research and clinical decision-making confront both new opportunities and obstacles. The huge and diversified nature of these datasets cannot always be managed with traditional data analysis methods. As a consequence, deep learning has emerged as a strong tool for analysing multi-omics data thanks to its ability to handle complex, non-linear relationships. This paper explores the fundamental concepts of deep learning and how they are used in multi-omics medical data mining. We demonstrate how autoencoders, variational autoencoders, multimodal models, attention mechanisms, transformers, and graph neural networks enable pattern analysis and recognition across omics data. Deep learning has proved effective in disease classification, biomarker identification, gene network learning, and prediction of therapeutic efficacy. We also consider critical problems such as data quality, model explainability, reproducibility of findings, and computational requirements, as well as future directions: combining omics with clinical and imaging data, explainable AI, federated learning, and real-time diagnostics. Overall, this study emphasises the need for cross-disciplinary collaboration to advance deep learning-based multi-omics research for precision medicine and the understanding of complex disorders.
Keywords: deep learning, multi-omics integration, biomedical data mining, precision medicine, graph neural networks, autoencoders and transformers
15. AI-driven integration of multi-omics and multimodal data for precision medicine
Author: Heng-Rui Liu. Medical Data Mining, 2026, No. 1, pp. 1-2.
High-throughput transcriptomics has evolved from bulk RNA-seq to single-cell and spatial profiling, yet its clinical translation still depends on effective integration across diverse omics and data modalities. Emerging foundation models and multimodal learning frameworks are enabling scalable and transferable representations of cellular states, while advances in interpretability and real-world data integration are bridging the gap between discovery and clinical application. This paper outlines a concise roadmap for AI-driven, transcriptome-centered multi-omics integration in precision medicine (Figure 1).
关键词 high-throughput transcriptomics multi-omics single cell multimodal learning frameworks foundation models omics data modalities emerging AI-driven precision medicine
Multimodal artificial intelligence integrates imaging,endoscopic,and omics data for intelligent decision-making in individualized gastrointestinal tumor treatment
16
作者 Hui Nian Yi-Bin Wu +5 位作者 Yu Bai Zhi-Long Zhang Xiao-Huang Tu Qi-Zhi Liu De-Hua Zhou Qian-Cheng Du 《Artificial Intelligence in Gastroenterology》 2026年第1期1-19,共19页
Gastrointestinal tumors require personalized treatment strategies due to their heterogeneity and complexity. Multimodal artificial intelligence (AI) addresses this challenge by integrating diverse data sources, including computed tomography (CT), magnetic resonance imaging (MRI), endoscopic imaging, and genomic profiles, to enable intelligent decision-making for individualized therapy. This approach leverages AI algorithms to fuse imaging, endoscopic, and omics data, facilitating comprehensive characterization of tumor biology, prediction of treatment response, and optimization of therapeutic strategies. By combining CT and MRI for structural assessment, endoscopic data for real-time visual inspection, and genomic information for molecular profiling, multimodal AI enhances the accuracy of patient stratification and treatment personalization. The clinical implementation of this technology demonstrates potential for improving patient outcomes, advancing precision oncology, and supporting individualized care in gastrointestinal cancers. Ultimately, multimodal AI serves as a transformative tool in oncology, bridging data integration with clinical application to effectively tailor therapies.
关键词 Multimodal artificial intelligence Gastrointestinal tumors Individualized therapy Intelligent diagnosis Treatment optimization Prognostic prediction data fusion Deep learning Precision medicine
Cosmic Acceleration and the Hubble Tension from Baryon Acoustic Oscillation Data
17
作者 Xuchen Lu Shengqing Gao Yungui Gong 《Chinese Physics Letters》 2026年第1期327-332,共6页
We investigate the null tests of cosmic accelerated expansion using the baryon acoustic oscillation (BAO) data measured by the Dark Energy Spectroscopic Instrument (DESI) and reconstruct the dimensionless Hubble parameter E(z) from the DESI BAO Alcock-Paczynski (AP) data with a Gaussian process to perform the null test. We find strong evidence of accelerated expansion from the DESI BAO AP data. By reconstructing the deceleration parameter q(z) from the DESI BAO AP data, we find that accelerated expansion persisted until z ≈ 0.7 at the 99.7% confidence level. Additionally, to provide insights into the Hubble tension problem, we propose combining the reconstructed E(z) with D_H/r_d data to derive a model-independent result r_d·h = 99.8 ± 3.1 Mpc. This result is consistent with measurements from cosmic microwave background (CMB) anisotropies using the ΛCDM model. We also propose a model-independent method for reconstructing the comoving angular diameter distance D_M(z) from the distance modulus μ, using SNe Ia data, and combine this result with DESI BAO data of D_M/r_d to constrain the value of r_d. We find that the value of r_d derived from this model-independent method is smaller than that obtained from CMB measurements, with a significant discrepancy of at least 4.17σ. All the conclusions drawn in this paper are independent of cosmological models and gravitational theories.
关键词 baryon acoustic oscillation BAO data cosmic accelerated expansion dimensionless Hubble parameter reconstructing deceleration parameter null test Gaussian process
A Convolutional Neural Network-Based Deep Support Vector Machine for Parkinson’s Disease Detection with Small-Scale and Imbalanced Datasets
18
作者 Kwok Tai Chui Varsha Arya +2 位作者 Brij B.Gupta Miguel Torres-Ruiz Razaz Waheeb Attar 《Computers, Materials & Continua》 2026年第1期1410-1432,共23页
Parkinson’s disease (PD) is a debilitating neurological disorder affecting over 10 million people worldwide. PD classification models using voice signals as input are common in the literature. Deep learning algorithms are believed to further enhance performance; nevertheless, this is challenging given the small-scale and imbalanced nature of PD datasets. This paper proposes a convolutional neural network-based deep support vector machine (CNN-DSVM) that automates feature extraction with the CNN and extends the conventional SVM to a DSVM for better classification performance on small-scale PD datasets. A customized kernel function reduces biased classification towards the majority class (healthy candidates in our setting). An improved generative adversarial network (IGAN) was designed to generate additional training data to enhance the model’s performance. In performance evaluation, the proposed algorithm achieves a sensitivity of 97.6% and a specificity of 97.3%. The performance comparison is evaluated from five perspectives, including comparisons with different data generation algorithms, feature extraction techniques, kernel functions, and existing works. Results reveal the effectiveness of the IGAN algorithm, which improves sensitivity and specificity by 4.05%–4.72% and 4.96%–5.86%, respectively, and of the CNN-DSVM algorithm, which improves sensitivity by 1.24%–57.4% and specificity by 1.04%–163% while reducing biased detection towards the majority class. Ablation experiments confirm the effectiveness of the individual components. Two future research directions are also suggested.
关键词 Convolutional neural network data generation deep support vector machine feature extraction generative artificial intelligence imbalanced dataset medical diagnosis Parkinson’s disease small-scale dataset
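The abstract does not spell out the customized kernel, but the core idea of countering majority-class bias can be sketched with a class-weighted linear SVM trained by subgradient descent on an imbalanced toy set (90 "healthy" vs 10 "PD" samples). All data, weights, and hyperparameters here are illustrative, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Imbalanced toy voice-feature data: 90 "healthy" (y = -1) vs
# 10 "PD" (y = +1) samples in 2-D; locations are illustrative.
X = np.vstack([rng.normal(loc=-1.5, size=(90, 2)),
               rng.normal(loc=+1.5, size=(10, 2))])
y = np.array([-1.0] * 90 + [1.0] * 10)

# Inverse-frequency class weights: mistakes on the minority class
# cost more in the hinge loss, countering majority-class bias.
w_class = np.where(y > 0, len(y) / (2.0 * 10), len(y) / (2.0 * 90))

w, b, lr, lam = np.zeros(2), 0.0, 0.05, 0.01
for _ in range(300):
    margins = y * (X @ w + b)
    active = margins < 1.0                # samples violating the margin
    coef = w_class * y * active
    w -= lr * (lam * w - coef @ X / len(y))
    b += lr * coef.sum() / len(y)

pred = np.sign(X @ w + b)
sens = float(np.mean(pred[y > 0] == 1))   # minority-class recall
spec = float(np.mean(pred[y < 0] == -1))
print(sens, spec)
```

Without the class weights, the hinge loss is dominated by the 90 majority samples and the boundary drifts toward the minority class, exactly the bias the paper's customized kernel is designed to suppress.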
Landslide monitoring in southwestern China via time-lapse electrical resistivity tomography 被引量:14
19
作者 徐冬 胡祥云 +1 位作者 单春玲 李睿恒 《Applied Geophysics》 SCIE CSCD 2016年第1期1-12,217,共13页
The dynamic monitoring of landslides in engineering geology has focused on the correlation among landslide stability, rainwater infiltration, and subsurface hydrogeology. However, the understanding of this complicated correlation is still poor and inadequate. Thus, in this study, we investigated a typical landslide in southwestern China via time-lapse electrical resistivity tomography (TLERT) in November 2013 and August 2014. We studied landslide mechanisms based on the spatiotemporal characteristics of surface-water infiltration and flow within the landslide body. Combined with borehole data, the inverted resistivity models accurately defined the interface between Quaternary sediments and bedrock. Preferential flow pathways attributed to fracture zones and fissures were also delineated. In addition, we found that surface water permeates through these pathways into the slipping mass and drains away as fissure water in the fractured bedrock, probably causing the weakly weathered layer to gradually soften and erode, eventually leading to a landslide. Clearly, TLERT dynamic monitoring can provide precursory information on critical sliding and can be used in landslide stability analysis and prediction.
关键词 time-lapse electrical resistivity tomography LANDSLIDE HYDROGEOPHYSICS MONITORING preferential flow
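Time-lapse interpretation of the kind this abstract describes hinges on the change in resistivity between the baseline and repeat surveys: a marked drop flags zones where water has infiltrated. The grid values and the 20% threshold below are synthetic placeholders, not the paper's inverted models.

```python
import numpy as np

# Baseline resistivity model (ohm-m) on a small 2-D grid, and a
# repeat-survey model with a hypothetical wetted (conductive) zone.
rho_t0 = np.full((4, 6), 200.0)
rho_t1 = rho_t0.copy()
rho_t1[1:3, 2:4] = 120.0                 # resistivity drop after rainfall

# Percent change between surveys; infiltration lowers resistivity,
# so strongly negative cells mark candidate preferential-flow paths.
pct_change = 100.0 * (rho_t1 - rho_t0) / rho_t0
infiltration_mask = pct_change < -20.0   # >20% drop flags a cell

print(int(infiltration_mask.sum()))      # → 4 flagged cells
```

On real TLERT sections the same ratio map is what gets overlaid on the borehole-calibrated sediment/bedrock interface to trace where surface water enters the slipping mass.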
Time-lapse selection of early IVF/ICSI embryos and its clinical outcomes 被引量:4
20
作者 陈明颢 黄军 +1 位作者 钟影 全松 《南方医科大学学报》 CAS CSCD 北大核心 2015年第12期1760-1764,1781,共6页
Objective: To evaluate the value of time-lapse imaging for early embryo observation and selection by comparing the clinical outcomes of IVF/ICSI embryos selected with time-lapse versus conventional morphological methods. Methods: Data from 139 IVF/ICSI cycles were retrospectively analyzed. According to the embryo selection method, cycles were divided into a time-lapse monitoring (TLM) group (n=68) and a control group (n=71). The βHCG-positive rate, clinical pregnancy rate, and embryo implantation rate were compared between the two groups, with subgroup analyses by maternal age and fertilization method. Results: In the TLM group, the βHCG-positive rate, clinical pregnancy rate, and embryo implantation rate were 66.2%, 61.8%, and 47.1%, respectively, versus 47.9%, 43.7%, and 30.3% in the control group; all three rates were significantly higher in the TLM group (P<0.05). Subgroup analysis showed that patients aged 31-35 years benefited more from time-lapse selection than those aged ≤30 years; time-lapse significantly improved the βHCG-positive, clinical pregnancy, and embryo implantation rates in IVF cycles, but not in ICSI or TESA cycles. Conclusion: Compared with conventional methods, dynamic time-lapse monitoring with morphokinetic evaluation and selection of embryos achieves better clinical outcomes; older patients (>30 years) and those undergoing IVF cycles benefit most.
关键词 time-lapse conventional morphological method embryo selection clinical pregnancy rate embryo implantation rate
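The significance claim in this abstract (P<0.05 for the clinical pregnancy rates of 61.8% in 68 TLM cycles vs 43.7% in 71 control cycles) can be sanity-checked with a two-proportion z-test. The counts below are rounded back from the reported percentages, so this is an illustrative re-check, not the paper's exact statistics.

```python
import math

# Group sizes from the abstract; event counts implied by the rates.
n1, n2 = 68, 71
x1 = round(0.618 * n1)   # ~42 clinical pregnancies, TLM group
x2 = round(0.437 * n2)   # ~31 clinical pregnancies, control group

p1, p2 = x1 / n1, x2 / n2
p = (x1 + x2) / (n1 + n2)                        # pooled proportion
se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
z = (p1 - p2) / se

print(round(z, 2))  # |z| > 1.96 corresponds to two-sided P < 0.05
```

The statistic lands just above the 1.96 cutoff, consistent with the abstract's report that the difference is significant at P<0.05.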