Journal Articles
106,261 articles found
1. Discriminating Among Relatively Efficient Units in Data Envelopment Analysis: A Comparison of Alternative Methods and Some Extensions (Cited by: 1)
Authors: Antreas D. Athanassopoulos. American Journal of Operations Research, 2012, Issue 1, pp. 1-9 (9 pages)
This paper concentrates on methods for comparing activity units found relatively efficient by data envelopment analysis (DEA). The use of the basic DEA models does not provide direct information regarding the performance of such units. The paper provides a systematic framework of alternative ways for ranking DEA-efficient units. The framework contains criteria derived as by-products of the basic DEA models and also criteria derived from complementary DEA analysis that needs to be carried out. The proposed framework is applied to rank a set of relatively efficient restaurants on the basis of their market efficiency.
Keywords: Data envelopment analysis; Cross-efficiency; Super-efficiency; Absolute ranking; Linear programming
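As a rough illustration of the super-efficiency criterion mentioned in the abstract (the formulation and toy data below are assumptions of this sketch, not taken from the paper), an input-oriented super-efficiency score for one unit can be computed with a small linear program:

```python
# Hypothetical illustration of an input-oriented super-efficiency DEA score.
# Toy data: 5 units, 2 inputs, 1 output (all values invented).
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0], [5.0, 4.0], [2.5, 3.5]])  # inputs
Y = np.array([[1.0], [1.2], [0.9], [1.5], [1.1]])                            # outputs

def super_efficiency(k, X, Y):
    """Solve min theta s.t. sum_{j!=k} lam_j*x_j <= theta*x_k and sum_{j!=k} lam_j*y_j >= y_k."""
    others = [j for j in range(len(X)) if j != k]
    n = len(others)
    c = np.r_[1.0, np.zeros(n)]                      # minimise theta
    # input constraints: -theta*x_ik + sum_j lam_j*x_ij <= 0
    A_in = np.c_[-X[k][:, None], X[others].T]
    b_in = np.zeros(X.shape[1])
    # output constraints: -sum_j lam_j*y_rj <= -y_rk
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y[others].T]
    b_out = -Y[k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1))
    return res.fun if res.success else np.nan

scores = [super_efficiency(k, X, Y) for k in range(len(X))]
print(scores)  # scores above 1 indicate units that remain efficient with slack to spare
```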
2. Gene Expression Data Analysis Based on Mixed Effects Model
Authors: Yuanbo Dai. Journal of Computer and Communications, 2025, Issue 2, pp. 223-235 (13 pages)
DNA microarray technology is an extremely effective technique for studying gene expression patterns in cells, and the main challenge currently faced by this technology is how to analyze the large amount of gene expression data generated. To address this, this paper employs a mixed-effects model to analyze gene expression data. In terms of data selection, 1176 genes from the white mouse gene expression dataset under two experimental conditions were chosen, setting up two conditions: pneumococcal infection and no infection, and constructing a mixed-effects model. After preprocessing the gene chip information, the data were imported into the model, preliminary results were calculated, and permutation tests were performed to biologically validate the preliminary results using GSEA. The final dataset consists of 20 groups of gene expression data from pneumococcal infection, which categorizes functionally related genes based on the similarity of their expression profiles, facilitating the study of genes with unknown functions.
Keywords: Mixed effects model; Gene expression data analysis; Gene analysis; Gene chip
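A minimal sketch of the kind of mixed-effects fit described above, assuming a long-format table with a fixed infection-condition effect and a random intercept per gene (the data and model form here are invented for illustration, not the paper's):

```python
# Minimal sketch (not the paper's exact model): fixed effect for infection
# condition, random intercept per gene, fitted with statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for g in [f"g{i}" for i in range(50)]:
    base = rng.normal(5.0, 1.0)                # gene-specific baseline (random effect)
    shift = rng.normal(0.8, 0.3)               # infection effect (fixed, plus noise)
    for cond in ("control", "infected"):
        for rep in range(3):
            expr = base + (shift if cond == "infected" else 0.0) + rng.normal(0, 0.2)
            rows.append({"gene": g, "condition": cond, "expr": expr})
df = pd.DataFrame(rows)

model = smf.mixedlm("expr ~ condition", df, groups=df["gene"])  # random intercept per gene
result = model.fit()
print(result.summary())   # the condition coefficient estimates the average expression shift
```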
3. Extraction of effective response for controlled-source electromagnetic data based on clustering analysis
Authors: Cong Zhou, Zhan-zi Qin, Liang Yang, Tara P. Banjade, Xiao-fei Zhou. Applied Geophysics, 2025, Issue 4, pp. 1297-1312, 1499 (17 pages)
The issue of strong noise has increasingly become a bottleneck restricting the precision and application space of electromagnetic exploration methods. Noise suppression and extraction of effective electromagnetic response information under a strong noise background is a crucial scientific task to be addressed. To solve the noise suppression problem of the controlled-source electromagnetic method in strong interference areas, we propose an approach based on complex-plane 2D k-means clustering for data processing. Based on the stability of the controlled-source signal response, clustering analysis is applied to classify the spectra of different sources and noises in multiple time segments. By identifying the power spectra with controlled-source characteristics, it helps to improve the quality of the controlled-source response extraction. This paper presents the principle and workflow of the proposed algorithm, and demonstrates the feasibility and effectiveness of the new algorithm through synthetic and real data examples. The results show that, compared with the conventional Robust denoising method, the clustering algorithm has a stronger suppression effect on common noise, can identify high-quality signals, and improves the preprocessing data quality of the controlled-source electromagnetic method.
Keywords: Controlled-source electromagnetic method; Data processing; Cluster analysis; Noise
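A rough sketch of the clustering idea described above, assuming the complex spectral value at the transmitter frequency is collected per time segment and clustered in the complex plane (synthetic data; not the authors' implementation):

```python
# The tight cluster is taken as the stable controlled-source response,
# the diffuse one as interference-dominated segments.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
fs, f_tx, seg_len, n_seg = 1024, 64.0, 1024, 200
t = np.arange(seg_len) / fs

spectra = []
for i in range(n_seg):
    sig = np.sin(2 * np.pi * f_tx * t + 0.3)               # stable source response
    if i % 4 == 0:                                          # sporadic strong interference
        sig = sig + rng.uniform(2, 6) * np.sin(2 * np.pi * f_tx * t + rng.normal(2.0, 0.3))
    sig = sig + rng.normal(0, 0.2, seg_len)                 # background noise
    spec = np.fft.rfft(sig)
    spectra.append(spec[int(f_tx * seg_len / fs)])          # complex value at f_tx
spectra = np.array(spectra)

features = np.column_stack([spectra.real, spectra.imag])    # 2D complex-plane features
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

# Keep the cluster with the smaller internal spread as the "effective response".
spreads = [features[labels == k].std() for k in (0, 1)]
good = int(np.argmin(spreads))
estimate = spectra[labels == good].mean()
print(f"kept {np.sum(labels == good)} of {n_seg} segments, |response| ≈ {abs(estimate):.1f}")
```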
4. Research on the Development Strategies of Real-time Data Analysis and Decision-support Systems
Authors: Wei Tang. Journal of Electronic Research and Application, 2025, Issue 2, pp. 204-210 (7 pages)
With the advent of the big data era, real-time data analysis and decision-support systems have been recognized as essential tools for enhancing enterprise competitiveness and optimizing the decision-making process. This study aims to explore the development strategies of real-time data analysis and decision-support systems, and analyze their application status and future development trends in various industries. The article first reviews the basic concepts and importance of real-time data analysis and decision-support systems, and then discusses in detail the key technical aspects such as system architecture, data collection and processing, analysis methods, and visualization techniques.
Keywords: Real-time data analysis; Decision-support system; Big data; System architecture; Data processing; Visualization technology
5. Analysis of the Impact of Legal Digital Currencies on Bank Big Data Practices
Authors: Zhengkun Xiu. Journal of Electronic Research and Application, 2025, Issue 1, pp. 23-27 (5 pages)
This paper analyzes the advantages of legal digital currencies and explores their impact on bank big data practices. By combining bank big data collection and processing, it clarifies that legal digital currencies can enhance the efficiency of bank data processing, enrich data types, and strengthen data analysis and application capabilities. In response to future development needs, it is necessary to strengthen data collection management, enhance data processing capabilities, innovate big data application models, and provide references for bank big data practices, promoting the transformation and upgrading of the banking industry in the context of legal digital currencies.
Keywords: Legal digital currency; Bank big data; Data processing efficiency; Data analysis and application; Countermeasures and suggestions
6. Multi-Source Heterogeneous Data Fusion Analysis Platform for Thermal Power Plants
Authors: Jianqiu Wang, Jianting Wen, Hui Gao, Chenchen Kang. Journal of Architectural Research and Development, 2025, Issue 6, pp. 24-28 (5 pages)
With the acceleration of the intelligent transformation of the energy system, the monitoring of equipment operation status and the optimization of production processes in thermal power plants face the challenge of multi-source heterogeneous data integration. In view of the heterogeneous characteristics of physical sensor data, including temperature, vibration and pressure generated by boilers, steam turbines and other key equipment, as well as real-time working condition data from the SCADA system, this paper proposes a multi-source heterogeneous data fusion and analysis platform for thermal power plants based on edge computing and deep learning. By constructing a multi-level fusion architecture, the platform adopts a dynamic weight allocation strategy and a 5D digital twin model to realize the collaborative analysis of physical sensor data, simulation calculation results and expert knowledge. The data fusion module combines Kalman filter, wavelet transform and Bayesian estimation methods to solve the problems of data time-series alignment and dimension difference. Simulation results show that the data fusion accuracy can be improved to more than 98%, and the calculation delay can be controlled within 500 ms. The data analysis module integrates the Dymola simulation model and the AERMOD pollutant diffusion model, and supports the cascade analysis of boiler combustion efficiency prediction and flue gas emission monitoring; the system response time is less than 2 seconds, and the data consistency verification accuracy reaches 99.5%.
Keywords: Thermal power plant; Multi-source heterogeneous data; Data fusion analysis platform; Edge computing
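Of the fusion tools listed above, the Kalman filter is the easiest to illustrate. The sketch below fuses two hypothetical temperature sensors with a scalar filter; all noise levels and the random-walk model are assumptions of the example, not the platform's design:

```python
# Minimal sketch, not the platform's implementation: a scalar Kalman filter that
# fuses two temperature sensors with different (assumed known) noise levels.
import numpy as np

rng = np.random.default_rng(2)
true_temp = 540 + np.cumsum(rng.normal(0, 0.05, 300))        # slowly drifting true value
z1 = true_temp + rng.normal(0, 2.0, 300)                     # noisy sensor 1
z2 = true_temp + rng.normal(0, 0.5, 300)                     # more precise sensor 2

x, P = z1[0], 4.0             # state estimate and its variance
Q, R1, R2 = 0.01, 4.0, 0.25   # process noise and measurement variances (assumed)
fused = []
for k in range(300):
    P += Q                                   # predict (random-walk model)
    for z, R in ((z1[k], R1), (z2[k], R2)):  # sequential update with each sensor
        K = P / (P + R)                      # Kalman gain
        x += K * (z - x)
        P *= (1 - K)
    fused.append(x)

print("RMSE sensor1:", np.sqrt(np.mean((z1 - true_temp) ** 2)).round(3))
print("RMSE fused  :", np.sqrt(np.mean((np.array(fused) - true_temp) ** 2)).round(3))
```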
7. The Role of Big Data Analysis in Digital Currency Systems
Authors: Zhengkun Xiu. Proceedings of Business and Economic Studies, 2025, Issue 1, pp. 1-5 (5 pages)
In the contemporary era, characterized by the Internet and digitalization as fundamental features, the operation and application of digital currency have gradually developed into a comprehensive structural system. This system restores the essential characteristics of currency while providing auxiliary services related to the formation, circulation, storage, application, and promotion of digital currency. Compared to traditional currency management technologies, big data analysis technology, which is primarily embedded in digital currency systems, enables the rapid acquisition of information. This facilitates the identification of standard associations within currency data and provides technical support for the operational framework of digital currency.
Keywords: Big data; Digital currency; Computational methods; Transaction speed
8. Evaluating fracture volume loss during production process by comparative analysis of initial and second flowback data
Authors: Chong Cao, Tamer Moussa, Hassan Dehghanpour. International Journal of Coal Science & Technology, 2025, Issue 3, pp. 274-290 (17 pages)
The fracture volume gradually changes with the depletion of fracture pressure during the production process. However, there are few flowback models available so far that can estimate the fracture volume loss using pressure transient and rate transient data. The initial flowback involves producing back the fracturing fluid after hydraulic fracturing, while the second flowback involves producing back the preloading fluid injected into the parent wells before fracturing of child wells. The main objective of this research is to compare the initial and second flowback data to capture the changes in fracture volume after production and preload processes. Such a comparison is useful for evaluating well performance and optimizing fracturing operations. We construct rate-normalized pressure (RNP) versus material balance time (MBT) diagnostic plots using both initial and second flowback data (FB1 and FB2, respectively) of six multi-fractured horizontal wells completed in the Niobrara and Codell formations in the DJ Basin. In general, the slope of the RNP plot during the FB2 period is higher than that during the FB1 period, indicating a potential loss of fracture volume from the FB1 to the FB2 period. We estimate the changes in effective fracture volume (Vef) by analyzing the changes in the RNP slope and total compressibility between these two flowback periods. Vef during FB2 is in general 3%-45% lower than that during FB1. We also compare the drive mechanisms for the two flowback periods by calculating the compaction-drive index (CDI), hydrocarbon-drive index (HDI), and water-drive index (WDI). The dominant drive mechanism during both flowback periods is CDI, but its contribution is reduced by 16% in the FB2 period. This drop is generally compensated by a relatively higher HDI during this period. The loss of effective fracture volume might be attributed to the pressure depletion in fractures, which occurs during the production period and can extend 800 days.
Keywords: Second flowback data analysis; Infill development; Preloading effect; Effective fracture volume loss; Flowback rate-transient analysis
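The diagnostic variables named in the abstract have standard flowback-RTA definitions: rate-normalized pressure RNP = (p_i − p_wf)/q and material balance time MBT = Q/q. The sketch below builds both from invented data and reads an effective volume off the slope via a simplified tank-model relation; the relation, units and numbers are illustrative assumptions, not the paper's workflow:

```python
# Illustrative only (toy data, simplified tank-model interpretation): build the
# RNP-vs-MBT plot variables and estimate the slope, which is taken here to be
# inversely related to total compressibility times effective fracture volume.
import numpy as np

t = np.arange(1, 200.0)                        # hours
q = 80.0 * t ** -0.5                           # declining flowback rate, m3/h (toy)
p_i = 30.0                                     # initial fracture pressure, MPa (toy)
p_wf = p_i - 0.002 * np.cumsum(q)              # flowing pressure, MPa (toy)

Q = np.cumsum(q)                               # cumulative produced volume
RNP = (p_i - p_wf) / q                         # rate-normalized pressure
MBT = Q / q                                    # material balance time

slope, intercept = np.polyfit(MBT, RNP, 1)     # straight-line fit of RNP vs MBT
c_t = 5e-4                                     # assumed total compressibility, 1/MPa
V_eff = 1.0 / (c_t * slope)                    # toy effective-volume estimate
print(f"RNP slope = {slope:.4g}, implied effective volume ≈ {V_eff:.3g} m3 (toy scale)")
```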
9. Leveraging Bayesian methods for addressing multi-uncertainty in data-driven seismic liquefaction assessment
Authors: Zhihui Wang, Roberto Cudmani, Andrés Alfonso Peña Olarte, Chaozhe Zhang, Pan Zhou. Journal of Rock Mechanics and Geotechnical Engineering, 2025, Issue 4, pp. 2474-2491 (18 pages)
When assessing seismic liquefaction potential with data-driven models, addressing the uncertainties of establishing models, interpreting cone penetration test (CPT) data and setting decision thresholds is crucial for avoiding biased data selection, ameliorating overconfident models, and being flexible to varying practical objectives, especially when the training and testing data are not identically distributed. A workflow characterized by leveraging Bayesian methodology was proposed to address these issues. Employing a Multi-Layer Perceptron (MLP) as the foundational model, this approach was benchmarked against empirical methods and advanced algorithms for its efficacy in simplicity, accuracy, and resistance to overfitting. The analysis revealed that, while MLP models optimized via the maximum a posteriori algorithm suffice for straightforward scenarios, Bayesian neural networks showed great potential for preventing overfitting. Additionally, integrating decision thresholds through various evaluative principles offers insights for challenging decisions. Two case studies demonstrate the framework's capacity for nuanced interpretation of in situ data, employing a model committee for a detailed evaluation of liquefaction potential via Monte Carlo simulations and basic statistics. Overall, the proposed step-by-step workflow for analyzing seismic liquefaction incorporates multifold testing and real-world data validation, showing improved robustness against overfitting and greater versatility in addressing practical challenges. This research contributes to the seismic liquefaction assessment field by providing a structured, adaptable methodology for accurate and reliable analysis.
Keywords: Data-driven method; Bayes analysis; Seismic liquefaction; Uncertainty; Neural network
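As a loose illustration of the model-committee idea mentioned above (not the paper's Bayesian neural network), the sketch below trains a small bootstrap committee of MLP classifiers on synthetic CPT-like features, averages their liquefaction probabilities, and applies an adjustable decision threshold; every feature, label and threshold here is invented:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(4)
n = 400
X = np.column_stack([rng.uniform(2, 30, n),      # normalized tip resistance (toy)
                     rng.uniform(0.1, 5, n),     # friction ratio (toy)
                     rng.uniform(0.05, 0.5, n)]) # cyclic stress ratio (toy)
y = (X[:, 2] * 40 - X[:, 0] * 0.6 + rng.normal(0, 1.5, n) > 0).astype(int)

committee = []
for seed in range(10):                           # bootstrap resampling -> 10 members
    idx = rng.integers(0, n, n)
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=seed)
    committee.append(clf.fit(X[idx], y[idx]))

x_new = np.array([[12.0, 1.0, 0.25]])            # one hypothetical CPT sounding
probs = np.array([m.predict_proba(x_new)[0, 1] for m in committee])
threshold = 0.4                                  # conservative decision threshold (assumed)
print(f"P(liquefaction) = {probs.mean():.2f} ± {probs.std():.2f}",
      "-> liquefiable" if probs.mean() > threshold else "-> non-liquefiable")
```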
10. General Improvement of Image Interpolation-Based Data Hiding Methods Using Multiple-Based Number Conversion
Authors: Da-Chun Wu, Bing-Han. Computer Modeling in Engineering & Sciences, 2025, Issue 7, pp. 535-580 (46 pages)
Data hiding methods involve embedding secret messages into cover objects to enable covert communication in a way that is difficult to detect. In data hiding methods based on image interpolation, the image size is reduced and then enlarged through interpolation, followed by the embedding of secret data into the newly generated pixels. A general improvement approach for embedding secret messages is proposed. The approach may be regarded as a general model for enhancing the data embedding capacity of various existing image interpolation-based data hiding methods. This enhancement is achieved by expanding the range of pixel values available for embedding secret messages, removing the limitations of many existing methods, where the range is restricted to powers of two to facilitate the direct embedding of bit-based messages. This improvement is accomplished through the application of multiple-based number conversion to the secret message data. The method converts the message bits into a multiple-based number and uses an algorithm to embed each digit of this number into an individual pixel, thereby enhancing the message embedding efficiency, as proved by a theorem derived in this study. The proposed improvement method has been tested through experiments on three well-known image interpolation-based data hiding methods. The results show that the proposed method can enhance the three data embedding rates by approximately 14%, 13%, and 10%, respectively, create stego-images with good quality, and resist RS steganalysis attacks. These experimental results indicate that the use of the multiple-based number conversion technique to improve the three interpolation-based methods for embedding secret messages increases the number of message bits embedded in the images. For many image interpolation-based data hiding methods other than the three tested ones, which use power-of-two pixel-value ranges for message embedding, the proposed improvement method is also expected to be effective for enhancing their data embedding capabilities.
Keywords: Data hiding; Image interpolation; Interpolation-based hiding methods; Steganography; Multiple-based number conversion
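The core conversion step described above can be illustrated in a few lines: message bits are re-expressed as digits of a mixed-radix ("multiple-based") number whose bases match each pixel's embedding range. The helper functions and the per-pixel bases below are hypothetical, not the paper's algorithm:

```python
# Hedged sketch of the general idea only: turn secret bits into mixed-radix
# digits whose bases match each pixel's embedding range, then recover the bits.
def bits_to_digits(bits: str, bases: list[int]) -> list[int]:
    """Express the bit string as digits d_i in a mixed-radix system with the given bases."""
    value = int(bits, 2) if bits else 0
    digits = []
    for b in bases:                # least-significant digit first
        digits.append(value % b)
        value //= b
    if value:
        raise ValueError("not enough embedding capacity for this message")
    return digits

def digits_to_bits(digits: list[int], bases: list[int], n_bits: int) -> str:
    """Invert bits_to_digits."""
    value, weight = 0, 1
    for d, b in zip(digits, bases):
        value += d * weight
        weight *= b
    return format(value, f"0{n_bits}b")

bases = [5, 7, 4, 9, 6, 8]             # per-pixel ranges allowed by the cover image (toy)
msg = "1011011010"
digits = bits_to_digits(msg, bases)    # one digit would be embedded per interpolated pixel
print(digits, digits_to_bits(digits, bases, len(msg)) == msg)
```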
11. Application of Big Data Technology in User Behavior Analysis of E-commerce Platforms
Authors: Yanzhao Jia. Journal of Electronic Research and Application, 2025, Issue 3, pp. 104-110 (7 pages)
With the rapid development of the Internet and e-commerce, e-commerce platforms have accumulated huge amounts of user behavior data. The emergence of big data technology provides a powerful means for in-depth analysis of these data and insight into user behavior patterns and preferences. This paper elaborates on the application of big data technology in the analysis of user behavior on e-commerce platforms, including the technical methods of data collection, storage, processing and analysis, as well as the specific applications in the construction of user profiles, precision marketing, personalized recommendation, user retention and churn analysis, etc., and discusses the challenges and countermeasures faced in the application. Through the study of actual cases, it demonstrates the remarkable effectiveness of big data technology in enhancing the competitiveness of e-commerce platforms and user experience.
Keywords: Big data technology; E-commerce platform; User behavior analysis
12. A review of test methods for uniaxial compressive strength of rocks: Theory, apparatus and data processing
Authors: Wei-Qiang Xie, Xiao-Li Liu, Xiao-Ping Zhang, Quan-Sheng Liu, En-Zhi Wang. Journal of Rock Mechanics and Geotechnical Engineering, 2025, Issue 3, pp. 1889-1905 (17 pages)
The uniaxial compressive strength (UCS) of rocks is a vital geomechanical parameter widely used for rock mass classification, stability analysis, and engineering design in rock engineering. Various UCS testing methods and apparatuses have been proposed over the past few decades. The objective of the present study is to summarize the status and development in theories, test apparatuses, and data processing of the existing testing methods for UCS measurement. It starts with elaborating the theories of these test methods. Then the test apparatus and development trends for UCS measurement are summarized, followed by a discussion on rock specimens for test apparatus and data processing methods. Next, the method selection for UCS measurement is recommended. It reveals that the rock failure mechanism in the UCS testing methods can be divided into compression-shear, compression-tension, composite failure mode, and no obvious failure mode. The trends of these apparatuses are towards automation, digitization, precision, and multi-modal testing. Two size correction methods are commonly used. One is to develop an empirical correlation between the measured indices and the specimen size. The other is to use a standard specimen to calculate the size correction factor. Three to five input parameters are commonly utilized in soft computing models to predict the UCS of rocks. The selection of the test methods for UCS measurement can be carried out according to the testing scenario and the specimen size. Engineers can gain a comprehensive understanding of the UCS testing methods and their potential developments in various rock engineering endeavors.
Keywords: Uniaxial compressive strength (UCS); UCS testing methods; Test apparatus; Data processing
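For the size-correction discussion above, one widely quoted empirical form, the Hoek-Brown relation sigma_cd = sigma_c50 * (50/d)^0.18, can serve as an example; whether the review recommends this particular correction is not stated in the abstract, and the numbers below are invented:

```python
# Illustrative size correction only: scale the UCS measured on a specimen of
# diameter d to its 50 mm-diameter equivalent, sigma_c50 (toy values).
def ucs_to_50mm(ucs_measured_mpa: float, diameter_mm: float) -> float:
    """Convert a measured UCS to the 50 mm-diameter equivalent using (50/d)^0.18."""
    return ucs_measured_mpa / (50.0 / diameter_mm) ** 0.18

for d, ucs in [(38, 92.0), (54, 85.0), (100, 78.0)]:
    print(f"d = {d:>3} mm: measured {ucs:.1f} MPa -> sigma_c50 ≈ {ucs_to_50mm(ucs, d):.1f} MPa")
```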
13. Bayesian analysis of Gamow resonances with reduced basis methods: from eigenvector continuation to post-emulation corrections
Authors: Ruo-Yu Cheng, Zhi-Cheng Xu. Nuclear Science and Techniques, 2025, Issue 12, pp. 233-243 (11 pages)
To study the uncertainty quantification of resonant states in open quantum systems, we developed a Bayesian framework by integrating a reduced basis method (RBM) emulator with the Gamow coupled-channel (GCC) approach. The RBM, constructed via eigenvector continuation and trained on both bound and resonant configurations, enables the fast and accurate emulation of resonance properties across the parameter space. To identify the physical resonant states from the emulator's output, we introduce an overlap-based selection technique that effectively isolates true solutions from background artifacts. By applying this framework to the unbound nucleus ^6Be, we quantified the model uncertainty in the predicted complex energies. The results demonstrate relative errors of 17.48% in the real part and 8.24% in the imaginary part, while achieving a speedup of four orders of magnitude compared with the full GCC calculations. To further investigate the asymptotic behavior of the resonant-state wavefunctions within the RBM framework, we employed a Lippmann–Schwinger (L–S)-based correction scheme. This approach not only improves the consistency between eigenvalues and wavefunctions but also enables a seamless extension from real-space training data to the complex energy plane. By bridging the gap between bound-state and continuum regimes, the L–S correction significantly enhances the emulator's capability to accurately capture continuum structures in open quantum systems.
Keywords: Uncertainty quantification; Reduced basis method; Resonance emulator; Bayesian analysis; Gamow coupled-channel model
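A toy version of the eigenvector-continuation idea behind the RBM emulator (a random symmetric matrix pencil stands in for the Gamow coupled-channel Hamiltonian; this illustrates the generic method, not the paper's emulator):

```python
# Snapshots of the ground state at a few training couplings span a reduced
# basis; a new coupling is then handled by a small generalized eigenproblem.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(5)
dim = 200

def sym(m):
    return (m + m.T) / 2

H0 = sym(rng.normal(size=(dim, dim)))   # fixed part of the Hamiltonian (toy)
H1 = sym(rng.normal(size=(dim, dim)))   # part multiplied by the coupling (toy)

def ground_state(g):
    vals, vecs = eigh(H0 + g * H1)
    return vals[0], vecs[:, 0]

train_g = [0.1, 0.4, 0.7, 1.0]
basis = np.column_stack([ground_state(g)[1] for g in train_g])   # snapshot basis

def emulate(g):
    H = H0 + g * H1
    h = basis.T @ H @ basis          # projected Hamiltonian (4x4)
    n = basis.T @ basis              # overlap matrix: the basis is not orthogonal
    vals, _ = eigh(h, n)             # generalized eigenvalue problem
    return vals[0]

g_test = 0.55
print("emulated:", round(emulate(g_test), 4), " exact:", round(ground_state(g_test)[0], 4))
```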
14. Topology Data Analysis-Based Error Detection for Semantic Image Transmission with Incremental Knowledge-Based HARQ
Authors: Ni Fei, Li Rongpeng, Zhao Zhifeng, Zhang Honggang. China Communications, 2025, Issue 1, pp. 235-255 (21 pages)
Semantic communication (SemCom) aims to achieve high-fidelity information delivery under low communication consumption by only guaranteeing semantic accuracy. Nevertheless, semantic communication still suffers from unexpected channel volatility, and thus developing a re-transmission mechanism (e.g., hybrid automatic repeat request [HARQ]) becomes indispensable. In that regard, instead of discarding previously transmitted information, the incremental knowledge-based HARQ (IK-HARQ) is deemed a more effective mechanism that could sufficiently utilize the information semantics. However, considering the possible existence of semantic ambiguity in image transmission, a simple bit-level cyclic redundancy check (CRC) might compromise the performance of IK-HARQ. Therefore, there emerges a strong incentive to revolutionize the CRC mechanism, thus more effectively reaping the benefits of both SemCom and HARQ. In this paper, built on top of Swin Transformer-based joint source-channel coding (JSCC) and IK-HARQ, we propose a semantic image transmission framework, SC-TDA-HARQ. In particular, different from the conventional CRC, we introduce a topological data analysis (TDA)-based error detection method, which capably digs out the inner topological and geometric information of images, to capture semantic information and determine the necessity for re-transmission. Extensive numerical results validate the effectiveness and efficiency of the proposed SC-TDA-HARQ framework, especially under the limited bandwidth condition, and manifest the superiority of the TDA-based error detection method in image transmission.
Keywords: Error detection; Incremental knowledge-based HARQ; Joint source-channel coding; Semantic communication; Swin transformer; Topological data analysis
15. A method to address the challenges of charging conditions on incremental capacity analysis: An ICA-compensation technique incorporating current interrupt methods
Authors: Jinghua Sun, Josef Kainz. Journal of Energy Chemistry, 2025, Issue 9, pp. 65-80, I0004 (17 pages)
The incremental capacity analysis (ICA) technique is notably limited by its sensitivity to variations in charging conditions, which constrains its practical applicability in real-world scenarios. This paper introduces an ICA-compensation technique to address this limitation and propose a generalized framework for assessing the state of health (SOH) of batteries based on ICA that is applicable under differing charging conditions. This novel approach calculates the voltage profile under quasi-static conditions by subtracting the voltage increase attributable to the additional polarization effects at high currents from the measured voltage profile. This approach's efficacy is contingent upon precisely acquiring the equivalent impedance. To obtain the equivalent impedance throughout the batteries' lifespan while minimizing testing costs, this study employs a current interrupt technique in conjunction with a long short-term memory (LSTM) network to develop a predictive model for equivalent impedance. Following the derivation of ICA curves using voltage profiles under quasi-static conditions, the research explores two scenarios for SOH estimation: one utilizing only incremental capacity (IC) features and the other incorporating both IC features and IC sampling. A genetic algorithm-optimized backpropagation neural network (GABPNN) is employed for the SOH estimation. The proposed generalized framework is validated using independent training and test datasets. Variable test conditions are applied for the test set to rigorously evaluate the methodology under challenging conditions. These evaluation results demonstrate that the proposed framework achieves an estimation accuracy of 1.04% for RMSE and 0.90% for MAPE across a spectrum of charging rates ranging from 0.1 C to 1 C and starting SOCs between 0% and 70%, which constitutes a major advancement compared to established ICA methods. It also significantly enhances the applicability of conventional ICA techniques in varying charging conditions and negates the necessity for separate testing protocols for each charging scenario.
Keywords: Lithium-ion batteries; Incremental capacity analysis; Charging conditions; State of health; Current interrupt method
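The two steps described above, polarization compensation of the measured voltage and differentiation of capacity with respect to voltage, can be sketched as follows (toy open-circuit-voltage curve and an assumed equivalent impedance, not the paper's LSTM-based model):

```python
# Sketch of the two ingredients (toy data): (1) compensate the measured charging
# voltage by the polarization drop I*R_eq, and (2) differentiate capacity with
# respect to voltage to obtain the incremental capacity curve dQ/dV.
import numpy as np

Q = np.linspace(0, 2.5, 500)                                  # charged capacity, Ah (toy)
ocv = 3.4 + 0.25 * Q + 0.03 * np.sin(4 * np.pi * Q / 2.5)     # quasi-static voltage (toy)
I, R_eq = 2.5, 0.03                       # 1C current and equivalent impedance (assumed)
v_measured = ocv + I * R_eq               # high-current charging voltage

v_quasi = v_measured - I * R_eq           # ICA compensation step
dQ_dV = np.gradient(Q, v_quasi)           # incremental capacity curve

peak_v = v_quasi[np.argmax(dQ_dV)]
print(f"main IC peak at ≈ {peak_v:.3f} V, height ≈ {dQ_dV.max():.2f} Ah/V")
```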
16. Precision Comparison and Analysis of Multi-stereo Fusion and Multi-view Matching Based on High-Resolution Satellite Data
Authors: LIU Tengfei, HUANG Xu, HUANG Zefeng. Transactions of Nanjing University of Aeronautics and Astronautics, 2025, Issue 5, pp. 577-588 (12 pages)
High-resolution sub-meter satellite data play an increasingly crucial role in the 3D real-scene China construction initiative. Current research on 3D reconstruction using high-resolution satellite data primarily focuses on two approaches: multi-stereo fusion and multi-view matching. While algorithms based on these two methodologies for multi-view image 3D reconstruction have reached relative maturity, no systematic comparison has been conducted specifically on satellite data to evaluate the relative merits of multi-stereo fusion versus multi-view matching methods. This paper conducts a comparative analysis of the practical accuracy of both approaches using high-resolution satellite datasets from diverse geographical regions. To ensure fairness in accuracy comparison, both methodologies employ non-local dense matching for cost optimization. Results demonstrate that the multi-stereo fusion method outperforms multi-view matching in all evaluation metrics, exhibiting approximately 1.2% higher average matching accuracy and 10.7% superior elevation precision in the experimental datasets. Therefore, for 3D modeling applications using satellite data, we recommend adopting the multi-stereo fusion approach for digital surface model (DSM) product generation.
Keywords: Multi-stereo fusion reconstruction; Multi-view matching reconstruction; Non-local dense matching method; Occlusion detection; High-resolution satellite data
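As a simplified picture of what multi-stereo fusion does (the per-cell median merge and all data below are assumptions of this sketch, not the paper's pipeline), per-pair DSMs can be merged cell by cell:

```python
# Height maps reconstructed from individual stereo pairs are merged per grid
# cell with a robust statistic, here the median, ignoring cells a pair failed
# to match (NaN). Terrain, noise and hole rates are invented.
import numpy as np

rng = np.random.default_rng(6)
truth = np.outer(np.linspace(100, 120, 50), np.ones(50))      # sloping terrain, metres

pair_dsms = []
for k in range(5):                                            # five stereo pairs (toy)
    dsm = truth + rng.normal(0, 0.8, truth.shape)             # per-pair matching noise
    holes = rng.random(truth.shape) < 0.1                     # 10% failed matches
    dsm[holes] = np.nan
    pair_dsms.append(dsm)

fused = np.nanmedian(np.stack(pair_dsms), axis=0)             # multi-stereo fusion step
rmse_single = np.sqrt(np.nanmean((pair_dsms[0] - truth) ** 2))
rmse_fused = np.sqrt(np.nanmean((fused - truth) ** 2))
print(f"single-pair RMSE {rmse_single:.2f} m  ->  fused RMSE {rmse_fused:.2f} m")
```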
17. Correction: Data analysis framework for silicon strip detector in compact spectrometer for heavy-ion experiments
Authors: Xiao-Bao Wei, Yu-Hao Qin, Sheng Xiao, Da-Wei Si, Dong Guo, Zhi Qin, Fen-Hai Guan, Xin-Yue Diao, Bo-Yuan Zhang, Bai-Ting Tian, Jun-Huai Xu, Tian-Ren Zhuo, Yi-Bo Hao, Zeng-Xiang Wang, Shi-Tao Wang, Chun-Wang Ma, Yi-Jie Wang, Zhi-Gang Xiao. Nuclear Science and Techniques, 2025, Issue 11, p. 367 (1 page)
In the section 'Track decoding' of this article, one of the paragraphs was inadvertently missed out after the text '…shows the flow diagram of the Tr2-1121 track mode.' The missed paragraph is provided below.
Keywords: Track mode; Flow diagram; Data analysis; Heavy-ion experiments; Silicon strip detector; Compact spectrometer; Track decoding
18. Single-Cell and Multi-Dimensional Data Analysis of the Key Role of IDH2 in Cervical Squamous Cell Carcinoma Progression
Authors: Xiaojuan Liu, Zhenpeng Zhu, Chenyang Hou, Hui Ma, Xiaoyan Li, Chunxing Ma, Lisha Shu, Huiying Zhang. Biomedical and Environmental Sciences, 2025, Issue 6, pp. 773-778 (6 pages)
Cervical cancer, a leading malignancy globally, poses a significant threat to women's health, with an estimated 604,000 new cases and 342,000 deaths reported in 2020 [1]. As cervical cancer is closely linked to human papilloma virus (HPV) infection, early detection relies on HPV screening; however, late-stage prognosis remains poor, underscoring the need for novel diagnostic and therapeutic targets [2].
Keywords: Cervical squamous cell carcinoma; IDH2; Cervical cancer; Multi-dimensional data analysis; Novel diagnostic and therapeutic targets; Prognosis; Human papilloma virus (HPV) infection; Early detection
19. ADGAP: a user-friendly online ancient DNA database and genome analysis platform
Authors: Yanwei Chen, Yu Xu, Kongyang Zhu, Chuan-Chao Wang. Journal of Genetics and Genomics, 2025, Issue 8, pp. 1058-1061 (4 pages)
The analysis of ancient genomics provides opportunities to explore human population history across both temporal and geographic dimensions (Haak et al., 2015; Wang et al., 2021, 2024). To enhance the accessibility and utility of these ancient genomic datasets, a range of databases and advanced statistical models have been developed, including the Allen Ancient DNA Resource (AADR) (Mallick et al., 2024) and AdmixTools (Patterson et al., 2012). While upstream processes such as sequencing and raw data processing have been streamlined by resources like the AADR, the downstream analysis of these datasets, encompassing population genetics inference and spatiotemporal interpretation, remains a significant challenge. The AADR provides a unified collection of published ancient DNA (aDNA) data, yet its file-based format and reliance on command-line tools, such as those in AdmixTools (Patterson et al., 2012), require advanced computational expertise for effective exploration and analysis. These requirements can present significant challenges for researchers lacking advanced computational expertise, limiting the accessibility and broader application of these valuable genomic resources.
Keywords: Database; Raw data processing; Analysis; Ancient genomics; Upstream processes; Ancient DNA; Human population history; Allen Ancient DNA Resource (AADR); Ancient genomic datasets
20. Intelligent Electrocardiogram Analysis in Medicine: Data, Methods, and Applications
Authors: Yu-Xia Guan, Ying An, Feng-Yi Guo, Wei-Bai Pan, Jian-Xin Wang. Chinese Medical Sciences Journal (CAS, CSCD), 2023, Issue 1, pp. 38-48 (11 pages)
Electrocardiogram (ECG) is a low-cost, simple, fast, and non-invasive test. It can reflect the heart's electrical activity and provide valuable diagnostic clues about the health of the entire body. Therefore, ECG has been widely used in various biomedical applications such as arrhythmia detection, disease-specific detection, mortality prediction, and biometric recognition. In recent years, ECG-related studies have been carried out using a variety of publicly available datasets, with many differences in the datasets used, data preprocessing methods, targeted challenges, and modeling and analysis techniques. Here we systematically summarize and analyze the ECG-based automatic analysis methods and applications. Specifically, we first reviewed 22 commonly used ECG public datasets and provided an overview of data preprocessing processes. Then we described some of the most widely used applications of ECG signals and analyzed the advanced methods involved in these applications. Finally, we elucidated some of the challenges in ECG analysis and provided suggestions for further research.
Keywords: Electrocardiogram; Database; Preprocessing; Machine learning; Medical big data analysis
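A generic example of the preprocessing step surveyed above, assuming a 0.5-40 Hz band-pass filter for baseline-wander removal on a synthetic trace (the sampling rate and filter band are illustrative choices, not tied to any dataset in the review):

```python
# Generic ECG preprocessing sketch (synthetic signal): suppress baseline wander
# and out-of-band noise with a zero-phase band-pass filter before feature
# extraction or model training.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 360.0                                        # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
beats = np.zeros_like(t)
beats[(np.arange(len(t)) % int(0.8 * fs)) == 0] = 1.0           # crude R-peak train
ecg = np.convolve(beats, np.hanning(20), mode="same")           # toy QRS shapes
raw = ecg + 0.4 * np.sin(2 * np.pi * 0.3 * t) \
          + 0.05 * np.random.default_rng(7).normal(size=len(t)) # drift + noise

b, a = butter(3, [0.5 / (fs / 2), 40.0 / (fs / 2)], btype="band")
clean = filtfilt(b, a, raw)                       # zero-phase band-pass filtering
print("signal std before/after filtering:", raw.std().round(3), clean.std().round(3))
```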