Journal Articles
107,913 articles found
1. Spatio-Temporal Earthquake Analysis via Data Warehousing for Big Data-Driven Decision Systems
Authors: Georgia Garani, George Pramantiotis, Francisco Javier Moreno Arboleda. Computers, Materials & Continua, 2026, Issue 3, pp. 1963-1988 (26 pages)
Earthquakes are highly destructive spatio-temporal phenomena whose analysis is essential for disaster preparedness and risk mitigation. Modern seismological research produces vast volumes of heterogeneous data from seismic networks, satellite observations, and geospatial repositories, creating the need for scalable infrastructures capable of integrating and analyzing such data to support intelligent decision-making. Data warehousing technologies provide a robust foundation for this purpose; however, existing earthquake-oriented data warehouses remain limited, often relying on simplified schemas, domain-specific analytics, or cataloguing efforts. This paper presents the design and implementation of a spatio-temporal data warehouse for seismic activity. The framework integrates spatial and temporal dimensions in a unified schema and introduces a novel array-based approach for managing many-to-many relationships between facts and dimensions without intermediate bridge tables. A comparative evaluation against a conventional bridge-table schema demonstrates that the array-based design improves fact-centric query performance, while the bridge-table schema remains advantageous for dimension-centric queries. To reconcile these trade-offs, a hybrid schema is proposed that retains both representations, ensuring balanced efficiency across heterogeneous workloads. The proposed framework demonstrates how spatio-temporal data warehousing can address schema complexity, improve query performance, and support multidimensional visualization. In doing so, it provides a foundation for integrating seismic analysis into broader big data-driven intelligent decision systems for disaster resilience, risk mitigation, and emergency management.
Keywords: data warehouse, data analysis, big data, decision systems, seismology, data visualization
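As an illustrative aside (a sketch, not the paper's implementation), the trade-off the abstract describes can be reduced to plain Python: with an array-valued attribute on the fact row, a fact-centric lookup needs no join, while a dimension-centric query must scan every array; a bridge table inverts that trade-off. All names and values below are hypothetical.

```python
# Hypothetical seismic fact rows: each earthquake references several
# affected-region dimension keys (a many-to-many relationship).
facts_array = [
    {"quake_id": 1, "magnitude": 6.1, "region_keys": [10, 11]},
    {"quake_id": 2, "magnitude": 5.4, "region_keys": [11, 12]},
]

# Bridge-table representation of the same relationship.
bridge = [(1, 10), (1, 11), (2, 11), (2, 12)]

# Fact-centric query: which regions were hit by quake 1?
# Array schema: direct lookup on the fact row, no join needed.
regions_a = next(f["region_keys"] for f in facts_array if f["quake_id"] == 1)
# Bridge schema: requires a join-like scan of the bridge table.
regions_b = [r for (q, r) in bridge if q == 1]

# Dimension-centric query: which quakes affected region 11?
# Bridge schema answers with a simple filter ...
quakes_b = [q for (q, r) in bridge if r == 11]
# ... while the array schema must inspect every array.
quakes_a = [f["quake_id"] for f in facts_array if 11 in f["region_keys"]]

print(regions_a, regions_b, quakes_a, quakes_b)
```

A hybrid schema, as the abstract proposes, would simply keep both structures and route each query to the cheaper one.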
2. Current Situation of Application and Development Prospects of the Statistical Analysis of Big Data
Authors: Zhuoran LI. Meteorological and Environmental Research, 2026, Issue 1, pp. 45-47 (3 pages)
With the advent of the big data era, modern statistics has enjoyed unprecedented development opportunities and also faced numerous new challenges. Traditional statistical computing methods are often limited by issues such as computer memory capacity and the distributed storage of data across different locations, and cannot be applied directly to large-scale data sets. Therefore, in the context of big data, designing efficient and theoretically guaranteed statistical learning and inference algorithms has become a key issue that the field of statistics urgently needs to address. In this paper, the application status of statistical analysis methods in the big data environment is systematically reviewed, and future development directions are analyzed to provide reference and support for the further development of the theory and methods of the statistical analysis of big data.
Keywords: big data, statistical analysis, current status, development prospects
3. A Virtual Probe Deployment Method Based on User Behavioral Feature Analysis
Authors: Bing Zhang, Wenqi Shi. Computers, Materials & Continua, 2026, Issue 2, pp. 2017-2035 (19 pages)
To address the challenge of low survival rates and limited data collection efficiency in current virtual probe deployments, which results from anomaly detection mechanisms in location-based service (LBS) applications, this paper proposes a novel virtual probe deployment method based on user behavioral feature analysis. The core idea is to circumvent LBS anomaly detection by mimicking real-user behavior patterns. First, we design an automated data extraction algorithm that recognizes graphical user interface (GUI) elements to collect spatio-temporal behavior data. Then, by analyzing the automatically collected user data, we identify normal users' spatio-temporal patterns and extract their features, such as high-activity time windows and spatial clustering characteristics. Subsequently, an anti-detection scheduling strategy is developed, integrating spatial clustering optimization, load-balanced allocation, and time window control to generate probe scheduling schemes. Additionally, a self-correction mechanism based on an exponential backoff strategy is implemented to rectify anomalous behaviors and maintain system stability. Experiments in real-world environments demonstrate that the proposed method significantly outperforms baseline methods in terms of both probe ban rate and task completion rate, while maintaining high time efficiency. This study provides a more reliable and clandestine solution for geosocial data collection and lays the foundation for building more robust virtual probe systems.
Keywords: virtual probe, behavior feature analysis, anomaly detection, scheduling strategy, geosocial data collection
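The exponential-backoff self-correction idea mentioned in the abstract can be sketched generically (the function name, parameters, and jitter term below are assumptions, not the authors' code): after each detected anomaly the probe waits twice as long, up to a cap, with random jitter so probes do not retry in lockstep.

```python
import random

def backoff_delays(base=1.0, cap=300.0, max_retries=6, jitter=0.1, seed=0):
    """Illustrative exponential-backoff schedule: after the k-th anomalous
    event, wait base * 2**k seconds (capped at `cap`), perturbed by a
    small multiplicative jitter."""
    rng = random.Random(seed)
    delays = []
    for attempt in range(max_retries):
        delay = min(cap, base * (2 ** attempt))
        delay *= 1 + rng.uniform(-jitter, jitter)
        delays.append(delay)
    return delays

print(backoff_delays())
```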
4. GranuSAS: Software for Rapid Particle Size Distribution Analysis from Small Angle Scattering Data
Authors: Qiaoyu Guo, Fei Xie, Xuefei Feng, Zhe Sun, Changda Wang, Xuechen Jiao. Chinese Physics B, 2026, Issue 2, pp. 216-225 (10 pages)
Small angle x-ray scattering (SAXS) is an advanced technique for characterizing the particle size distribution (PSD) of nanoparticles. However, the ill-posed nature of inverse problems in SAXS data analysis often reduces the accuracy of conventional methods. This article proposes a user-friendly software package for PSD analysis, GranuSAS, which employs an algorithm that integrates truncated singular value decomposition (TSVD) with the Chahine method. This approach employs TSVD for data preprocessing, generating a set of initial solutions with noise suppression. A high-quality initial solution is subsequently selected via the L-curve method. This selected candidate solution is then iteratively refined by the Chahine algorithm, enforcing constraints such as non-negativity and improving physical interpretability. Most importantly, GranuSAS employs a parallel architecture that simultaneously yields inversion results from multiple shape models and, by evaluating the accuracy of each model's reconstructed scattering curve, offers a suggestion for model selection in material systems. To systematically validate the accuracy and efficiency of the software, verification was performed using both simulated and experimental datasets. The results demonstrate that the proposed software delivers both satisfactory accuracy and reliable computational efficiency. It provides an easy-to-use and reliable tool for researchers in materials science, helping them fully exploit the potential of SAXS in nanoparticle characterization.
Keywords: small angle x-ray scattering, data analysis software, particle size distribution, inverse problem
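The TSVD step the abstract describes can be illustrated on a generic ill-posed system (a toy matrix standing in for a scattering kernel, not GranuSAS's actual model): keeping only the k largest singular values suppresses the noise that the small ones would amplify.

```python
import numpy as np

def tsvd_solve(A, b, k):
    """Solve the ill-posed system A x ≈ b by truncated SVD: invert only
    the k largest singular values, zeroing the noise-amplifying rest."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.zeros_like(s)
    s_inv[:k] = 1.0 / s[:k]
    return Vt.T @ (s_inv * (U.T @ b))

# A mildly ill-conditioned toy system (Vandermonde matrix) standing in
# for the scattering kernel; x_true plays the role of the size distribution.
rng = np.random.default_rng(0)
A = np.vander(np.linspace(0.1, 1.0, 8), 5, increasing=True)
x_true = np.array([1.0, -2.0, 0.5, 0.0, 1.5])
b = A @ x_true + rng.normal(0, 1e-3, 8)   # noisy "measurement"

x_k = tsvd_solve(A, b, k=4)      # truncated (regularized) solution
x_full = tsvd_solve(A, b, k=5)   # plain least squares for comparison
print(x_k, x_full)
```

In GranuSAS the truncation level is reportedly chosen by the L-curve method before the Chahine refinement; here k is simply fixed for illustration.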
5. Bending Analysis of Functionally Graded Material and Cracked Homogeneous Thin Plates Using Meshfree Numerical Manifold Method
Authors: Shouyang Huang, Hong Zheng, Xuguang Yu, Ziheng Li, Zhiwei Pan. Computer Modeling in Engineering & Sciences, 2026, Issue 3, pp. 304-340 (37 pages)
Functionally graded material (FGM) plates are widely used in various engineering structures owing to their tailor-made mechanical properties, whereas cracked homogeneous plates constitute a canonical setting in fracture mechanics analysis. These two classes of problems respectively embody material non-uniformity and geometric discontinuity, thereby imposing more stringent requirements on numerical methods in terms of high-order field continuity and accurate defect representation. Based on the classical Kirchhoff-Love plate theory, a numerical manifold method (MLS-NMM) incorporating moving least squares (MLS) interpolation is developed for bending analysis of FGM plates and fracture simulation of homogeneous plates with defects. The method constructs an H^2-regular approximation with high-order continuous weighting functions and, combined with the separation of mathematical and physical covers, establishes a unified framework that accurately handles material gradients and cracks without mesh reconstruction. For the crack tip, a singular physical cover incorporating the Williams asymptotic field is introduced to achieve local enrichment, enabling the natural capture of displacement discontinuity and stress singularity. Stress intensity factors are extracted using the interaction integral method, and the dimensionless J-integral shows a maximum relative error below 1.2% compared with the reference solution. Numerical results indicate that MLS-NMM exhibits excellent convergence performance: using 676 mathematical nodes, the nondimensional central deflection of both FGM and homogeneous plates agrees with reference solutions with a maximum relative error below 0.81%, and no shear locking occurs. A systematic analysis reveals that for a simply supported on all four edges (SSSS) FGM square plate with a/h=10, the nondimensional central deflection increases by 212% as the gradient index n rises from 0 to 5. For a homogeneous plate containing a central crack with c/a=0.6, the nondimensional central deflection increases by approximately 46% compared with the intact plate. Under weak boundary constraints (e.g., SFSF), the deformation is markedly amplified, with the deflection reaching more than three times that under strong constraints (SCSC). The proposed method provides an efficient, reconstruction-free numerical tool for high-accuracy bending and fracture analyses of FGM and cracked thin-plate structures.
Keywords: Kirchhoff-Love plate theory, functionally graded materials, moving least squares interpolation, numerical manifold method, bending analysis, fracture mechanics, stress intensity factor
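The moving least squares interpolation at the heart of MLS-NMM can be sketched in one dimension (a minimal sketch under assumed choices: quadratic basis, Wendland-type weight, unit interval; the paper's actual 2-D formulation and cover structure are far richer). The defining property shown here is polynomial reproduction: with a quadratic basis, a quadratic field is recovered exactly.

```python
import numpy as np

def mls_value(x, nodes, values, h=0.5, m=2):
    """1-D moving least squares fit at point x: weighted least squares
    on the polynomial basis p(t) = [1, t, ..., t^m] with a compactly
    supported Wendland-type weight of radius h."""
    p = lambda t: np.array([t**i for i in range(m + 1)])
    r = np.abs(nodes - x) / h
    w = np.where(r < 1, (1 - r)**4 * (4 * r + 1), 0.0)   # C2 Wendland weight
    P = np.array([p(t) for t in nodes])                  # basis at all nodes
    A = P.T @ (w[:, None] * P)                           # moment matrix
    bvec = P.T @ (w * values)
    return p(x) @ np.linalg.solve(A, bvec)

nodes = np.linspace(0.0, 1.0, 11)
values = nodes**2                 # sample a quadratic "deflection" field
print(mls_value(0.37, nodes, values))
```

Because the quadratic lies in the span of the basis, the MLS fit reproduces it to machine precision, which is the consistency property high-order plate approximations rely on.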
6. Nonlinear Data Reconciliation Method Based on Kernel Principal Component Analysis (cited 6 times)
Authors: Yan Weiwu, Shao Huihe (Department of Automation, Shanghai Jiaotong University, Shanghai 200030, China). Chinese Journal of Mechanical Engineering (SCIE, EI, CAS, CSCD), 2003, Issue 2, pp. 117-119 (3 pages)
In industrial process settings, principal component analysis (PCA) is a general method for data reconciliation. However, PCA is sometimes unsuitable for nonlinear feature analysis and limited in application to nonlinear industrial processes. Kernel PCA (KPCA) is an extension of PCA and can be used for nonlinear feature analysis. A nonlinear data reconciliation method based on KPCA is proposed. The basic idea of this method is that the original data are first mapped to a high-dimensional feature space by a nonlinear function, and PCA is implemented in that feature space. Nonlinear feature analysis is then carried out and the data are reconstructed by using the kernel. The data reconciliation method based on KPCA is applied to a ternary distillation column. Simulation results show that this method can filter the noise in measurements of a nonlinear process, and the reconciled data can represent the true information of the nonlinear process.
Keywords: principal component analysis, kernel, data reconciliation, nonlinear
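The map-project-reconstruct loop the abstract describes can be sketched with scikit-learn's `KernelPCA` (an illustrative stand-in, not the authors' 2003 implementation; the quadratic toy process, kernel choice, and component count are all assumptions):

```python
import numpy as np
from sklearn.decomposition import KernelPCA

# Noisy measurements of a nonlinear (quadratic) process relationship.
rng = np.random.default_rng(42)
x = np.linspace(-1, 1, 200)
clean = np.column_stack([x, x**2])
X = clean + rng.normal(0, 0.05, (200, 2))

# Map to feature space with an RBF kernel, keep the leading components,
# then reconstruct back in input space (the pre-image) to filter noise.
kpca = KernelPCA(n_components=4, kernel="rbf", gamma=2.0,
                 fit_inverse_transform=True)
X_reconciled = kpca.inverse_transform(kpca.fit_transform(X))

err_raw = np.mean((X - clean) ** 2)
err_rec = np.mean((X_reconciled - clean) ** 2)
print(err_raw, err_rec)
```

The reconstruction step is exactly where a linear PCA would fail on this curved manifold, which is the motivation the abstract gives for moving to the kernelized version.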
7. Fractal Method for Statistical Analysis of Geological Data (cited 2 times)
Authors: Meng Xianguo, Zhao Pengda (China University of Geosciences, Wuhan 430074). Journal of Earth Science (SCIE, CAS, CSCD), 1991, Issue 1, pp. 114-119 (6 pages)
This paper establishes the phase space in the light of spatial series data, discusses the fractal structure of geological data in terms of correlation functions, and studies the chaos of these data. In addition, it introduces R/S analysis from time series analysis into spatial series to calculate the structural fractal dimensions of ranges and standard deviations for spatial series data, and to establish the fractal dimension matrix and the procedures for plotting the fractal dimension anomaly diagram with vector distances of fractal dimension. Finally, examples of its application are given.
Keywords: geological data, fractal method, fractal dimension, spatial series, R/S analysis
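The R/S technique the paper carries over from time series to spatial series can be sketched in its classic form (a textbook Hurst-exponent estimate on synthetic data; the window sizes and data are assumptions, not the paper's geological series):

```python
import numpy as np

def rescaled_range(series):
    """R/S statistic of one window: range of the cumulative
    mean-adjusted sum divided by the standard deviation."""
    z = np.cumsum(series - np.mean(series))
    return (z.max() - z.min()) / np.std(series)

def hurst_exponent(series, window_sizes=(8, 16, 32, 64, 128)):
    """Estimate the Hurst exponent H as the slope of
    log(R/S) versus log(window size)."""
    log_n, log_rs = [], []
    for n in window_sizes:
        chunks = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean([rescaled_range(c) for c in chunks])))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(1)
white = rng.normal(size=1024)    # uncorrelated noise: H near 0.5
walk = np.cumsum(white)          # random walk: strongly persistent, H near 1
print(hurst_exponent(white), hurst_exponent(walk))
```

In the paper's setting the "series" would be a spatial traverse of geological measurements rather than a time record, but the estimator is identical.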
8. Heteroscedastic Regression Analysis Method for Mixed Data (cited 1 time)
Authors: FU Hui-min, YUE Xiao-rui. Journal of Aerospace Power (EI, CAS, CSCD, PKU Core), 2011, Issue 4, pp. 721-726 (6 pages)
The heteroscedastic regression model was established and a heteroscedastic regression analysis method was presented for mixed data composed of complete data, type-I censored data and type-II censored data from the location-scale distribution. The best unbiased estimates of the regression coefficients, as well as the confidence limits of the location parameter and scale parameter, were given. Furthermore, the point estimates and confidence limits of percentiles were obtained. Thus, the traditional multiple regression analysis method, which is only suitable for complete data from a normal distribution, can be extended to the cases of heteroscedastic mixed data and the location-scale distribution, so the presented method has a broad range of promising applications.
Keywords: heteroscedastic regression analysis, censored data, performance test, life prediction, reliability analysis
9. 3D Slope Stability Analysis Considering Strength Anisotropy by a Microstructure Tensor Enhanced Elasto-Plastic Finite Element Method (cited 1 time)
Authors: Wencheng Wei, Hongxiang Tang, Xiaoyu Song, Xiangji Ye. Journal of Rock Mechanics and Geotechnical Engineering, 2025, Issue 3, pp. 1664-1684 (21 pages)
This article presents a microstructure tensor enhanced elasto-plastic finite element (FE) method to address strength anisotropy in three-dimensional (3D) soil slope stability analysis. The gravity increase method (GIM) is employed to analyze the stability of 3D anisotropic soil slopes. The accuracy of the proposed method is first verified against data in the literature. We then simulate a 3D soil slope with a straight slope surface, as well as convex and concave slope surfaces with a 90° turning corner, to study the 3D effect on slope stability and the failure mechanism under anisotropic conditions. Based on our numerical results, the end effect significantly impacts the failure mechanism and safety factor. The anisotropy degree notably affects the safety factor, with higher degrees leading to deeper landslides. Concave slopes can be approximated by straight slopes with suitable boundary conditions to assess their stability. Furthermore, a case study of the Saint-Alban test embankment A in Quebec, Canada, is provided to demonstrate the applicability of the proposed FE model.
Keywords: strength anisotropy, elasto-plastic finite element method (FEM), three-dimensional (3D) soil slope, gravity increase method (GIM), stability analysis, case study
10. Application of the Tikhonov Regularization Method to Wind Retrieval from Scatterometer Data I. Sensitivity Analysis and Simulation Experiments (cited 1 time)
Authors: 钟剑, 黄思训, 杜华栋, 张亮. Chinese Physics B (SCIE, EI, CAS, CSCD), 2011, Issue 3, pp. 274-283 (10 pages)
The scatterometer is an instrument that provides all-day, large-scale wind field information, and its application, especially to wind retrieval, has long attracted meteorologists. Certain factors cause large direction errors, so it is important to find where the error mainly comes from: does it mainly result from the background field, the normalized radar cross-section (NRCS), or the method of wind retrieval? First, based on SDP2.0, the simulated 'true' NRCS is calculated from the simulated 'true' wind through the geophysical model function NSCAT2. The simulated background field is configured by adding noise to the simulated 'true' wind under the non-divergence constraint. Likewise, the simulated 'measured' NRCS is formed by adding noise to the simulated 'true' NRCS. Then, sensitivity experiments are carried out, and the new regularization method is used to improve the ambiguity removal in simulation experiments. The results show that the accuracy of wind retrieval is more sensitive to noise in the background than in the measured NRCS; compared with the two-dimensional variational (2DVAR) ambiguity removal method, the accuracy of wind retrieval can be improved with the new Tikhonov regularization method by choosing an appropriate regularization parameter, especially in the case of large background error. The work provides important information and a new method for wind retrieval with real data.
Keywords: scatterometer, variational optimization analysis, wind retrieval, regularization method
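The core of Tikhonov regularization can be shown on a tiny ill-conditioned system (a generic sketch; the 2×2 matrix and damping parameter below are assumptions unrelated to the paper's retrieval operator): the damped normal equations keep noise from blowing up the solution.

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Tikhonov-regularized least squares:
    minimize ||A x - b||^2 + lam * ||x||^2,
    via the normal equations (A^T A + lam I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Ill-conditioned toy problem standing in for the retrieval operator.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
x_true = np.array([1.0, 1.0])
b = A @ x_true + np.array([0.0, 1e-4])     # tiny "measurement" noise

x_naive = np.linalg.solve(A, b)            # noise amplified by conditioning
x_reg = tikhonov_solve(A, b, lam=1e-3)     # damped, stays near x_true
print(x_naive, x_reg)
```

Choosing `lam` is the crux in practice; the paper studies exactly that choice for the wind-retrieval cost function.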
11. Discriminating Among Relatively Efficient Units in Data Envelopment Analysis: A Comparison of Alternative Methods and Some Extensions (cited 1 time)
Authors: Antreas D. Athanassopoulos. American Journal of Operations Research, 2012, Issue 1, pp. 1-9 (9 pages)
This paper concentrates on methods for comparing activity units found relatively efficient by data envelopment analysis (DEA). The use of the basic DEA models does not provide direct information regarding the performance of such units. The paper provides a systematic framework of alternative ways for ranking DEA-efficient units. The framework contains criteria derived as by-products of the basic DEA models and also criteria derived from complementary DEA analysis that needs to be carried out. The proposed framework is applied to rank a set of relatively efficient restaurants on the basis of their market efficiency.
Keywords: data envelopment analysis, cross-efficiency, super-efficiency, absolute ranking, linear programming
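For readers unfamiliar with the "basic DEA models" the abstract builds on, here is a minimal input-oriented CCR envelopment model solved with `scipy.optimize.linprog` (an illustrative sketch with three hypothetical units, not the paper's restaurant data; the ranking refinements the paper surveys start from scores like these):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR envelopment LP for DMU o:
        min theta  s.t.  X^T lam <= theta * x_o,  Y^T lam >= y_o,  lam >= 0,
    with decision vector v = [theta, lam_1, ..., lam_n].
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                       # minimize theta
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])     # X^T lam - theta*x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])      # -Y^T lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[o]])
    bounds = [(None, None)] + [(0, None)] * n
    return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).fun

# Three hypothetical units with one input and one output.
X = np.array([[2.0], [4.0], [4.0]])
Y = np.array([[2.0], [4.0], [2.0]])
scores = [round(ccr_efficiency(X, Y, o), 4) for o in range(3)]
print(scores)
```

Units 0 and 1 both score 1.0 (they lie on the constant-returns frontier), which is precisely the tie that cross-efficiency and super-efficiency methods are designed to break.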
12. Gene Expression Data Analysis Based on Mixed Effects Model
Authors: Yuanbo Dai. Journal of Computer and Communications, 2025, Issue 2, pp. 223-235 (13 pages)
DNA microarray technology is an extremely effective technique for studying gene expression patterns in cells, and the main challenge currently faced by this technology is how to analyze the large amount of gene expression data generated. To address this, this paper employs a mixed-effects model to analyze gene expression data. In terms of data selection, 1176 genes from a white mouse gene expression dataset under two experimental conditions were chosen, setting up two conditions, pneumococcal infection and no infection, and constructing a mixed-effects model. After preprocessing the gene chip information, the data were imported into the model, preliminary results were calculated, and permutation tests were performed to biologically validate the preliminary results using GSEA. The final dataset consists of 20 groups of gene expression data from pneumococcal infection, which categorizes functionally related genes based on the similarity of their expression profiles, facilitating the study of genes with unknown functions.
Keywords: mixed effects model, gene expression data analysis, gene analysis, gene chip
13. Extraction of Effective Response for Controlled-Source Electromagnetic Data Based on Clustering Analysis
Authors: Cong Zhou, Zhan-zi Qin, Liang Yang, Tara P. Banjade, Xiao-fei Zhou. Applied Geophysics, 2025, Issue 4, pp. 1297-1312, 1499 (17 pages)
The issue of strong noise has increasingly become a bottleneck restricting the precision and application scope of electromagnetic exploration methods. Noise suppression and the extraction of effective electromagnetic response information against a strong noise background is a crucial scientific task to be addressed. To solve the noise suppression problem of the controlled-source electromagnetic method in strong interference areas, we propose a data processing approach based on complex-plane 2D k-means clustering. Based on the stability of the controlled-source signal response, clustering analysis is applied to classify the spectra of different sources and noises across multiple time segments. Identifying the power spectra with controlled-source characteristics helps to improve the quality of the controlled-source response extraction. This paper presents the principle and workflow of the proposed algorithm, and demonstrates its feasibility and effectiveness through synthetic and real data examples. The results show that, compared with the conventional robust denoising method, the clustering algorithm has a stronger suppression effect on common noise, can identify high-quality signals, and improves the quality of preprocessed data for the controlled-source electromagnetic method.
Keywords: controlled-source electromagnetic method, data processing, cluster analysis, noise
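The complex-plane clustering idea can be sketched with a plain k-means on (Re, Im) pairs (synthetic numbers, not the paper's field data; the cluster-selection rule "keep the larger, stable cluster" is an assumption for illustration):

```python
import numpy as np

def kmeans2d(points, k=2, iters=50, seed=0):
    """Plain k-means on 2-D points (here: real/imaginary parts of
    single-frequency spectral estimates from successive time segments)."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        centers = np.array([points[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels, centers

# Hypothetical spectra at the transmitter frequency: a stable
# controlled-source response (tight cluster around 5+2j) mixed with
# scattered noise-dominated segments around the origin.
rng = np.random.default_rng(3)
signal = 5 + 2j + 0.05 * (rng.normal(size=40) + 1j * rng.normal(size=40))
noise = 2.0 * (rng.normal(size=15) + 1j * rng.normal(size=15))
spectra = np.concatenate([signal, noise])
pts = np.column_stack([spectra.real, spectra.imag])

labels, centers = kmeans2d(pts)
sizes = np.bincount(labels, minlength=2)
keep = sizes.argmax()                       # larger cluster = stable response
estimate = spectra[labels == keep].mean()   # effective response estimate
print(estimate)
```

The stability argument in the abstract is visible here: the controlled-source segments repeat the same complex value, so they form the tight cluster, while noise segments scatter.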
14. Research on the Development Strategies of Real-Time Data Analysis and Decision-Support Systems
Authors: Wei Tang. Journal of Electronic Research and Application, 2025, Issue 2, pp. 204-210 (7 pages)
With the advent of the big data era, real-time data analysis and decision-support systems have been recognized as essential tools for enhancing enterprise competitiveness and optimizing the decision-making process. This study aims to explore the development strategies of real-time data analysis and decision-support systems, and analyzes their application status and future development trends in various industries. The article first reviews the basic concepts and importance of real-time data analysis and decision-support systems, and then discusses in detail key technical aspects such as system architecture, data collection and processing, analysis methods, and visualization techniques.
Keywords: real-time data analysis, decision-support system, big data, system architecture, data processing, visualization technology
15. Analysis of the Impact of Legal Digital Currencies on Bank Big Data Practices
Authors: Zhengkun Xiu. Journal of Electronic Research and Application, 2025, Issue 1, pp. 23-27 (5 pages)
This paper analyzes the advantages of legal digital currencies and explores their impact on bank big data practices. By examining bank big data collection and processing, it clarifies that legal digital currencies can enhance the efficiency of bank data processing, enrich data types, and strengthen data analysis and application capabilities. In response to future development needs, it is necessary to strengthen data collection management, enhance data processing capabilities, and innovate big data application models, providing references for bank big data practices and promoting the transformation and upgrading of the banking industry in the context of legal digital currencies.
Keywords: legal digital currency, bank big data, data processing efficiency, data analysis and application, countermeasures and suggestions
16. Multi-Source Heterogeneous Data Fusion Analysis Platform for Thermal Power Plants
Authors: Jianqiu Wang, Jianting Wen, Hui Gao, Chenchen Kang. Journal of Architectural Research and Development, 2025, Issue 6, pp. 24-28 (5 pages)
With the accelerating intelligent transformation of the energy system, the monitoring of equipment operation status and the optimization of production processes in thermal power plants face the challenge of multi-source heterogeneous data integration. In view of the heterogeneous characteristics of physical sensor data (temperature, vibration and pressure generated by boilers, steam turbines and other key equipment) and the real-time working-condition data of the SCADA system, this paper proposes a multi-source heterogeneous data fusion and analysis platform for thermal power plants based on edge computing and deep learning. By constructing a multi-level fusion architecture, the platform adopts a dynamic weight allocation strategy and a 5D digital twin model to realize the collaborative analysis of physical sensor data, simulation calculation results and expert knowledge. The data fusion module combines Kalman filtering, wavelet transform and Bayesian estimation to solve the problems of time-series alignment and dimension differences in the data. Simulation results show that the data fusion accuracy can be improved to more than 98%, and the computation delay can be controlled within 500 ms. The data analysis module integrates the Dymola simulation model and the AERMOD pollutant diffusion model, and supports cascade analysis of boiler combustion efficiency prediction and flue gas emission monitoring; system response time is less than 2 s, and data consistency verification accuracy reaches 99.5%.
Keywords: thermal power plant, multi-source heterogeneous data, data fusion analysis platform, edge computing
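Of the fusion components the abstract lists, the Kalman filter is the simplest to illustrate. Below is a scalar (random-walk state) Kalman filter smoothing a noisy sensor reading; the temperature value, noise levels, and variances are assumptions for the sketch, not plant parameters from the paper.

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=0.04, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a slowly varying sensor signal:
    q = process (random-walk) variance, r = measurement variance.
    Returns the filtered state estimate after each measurement."""
    x, p = x0, p0
    out = []
    for z in measurements:
        p = p + q                  # predict: state uncertainty grows
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)        # update with measurement z
        p = (1 - k) * p
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(7)
true_temp = 450.0                            # hypothetical steady boiler temperature
z = true_temp + rng.normal(0, 0.2, 500)      # noisy sensor readings
est = kalman_1d(z, x0=z[0])
print(est[-1])
```

In a fusion platform, per-sensor filters like this typically feed the weighted combination stage, with the weights informed by each filter's residual variance.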
17. Public Service Satisfaction of Huilongguan Sports and Cultural Park Based on the Importance-Performance Analysis (IPA) Method
Authors: ZHANG Xinti, CHEN Xi, HAN Juejia. Journal of Landscape Research, 2025, Issue 4, pp. 44-50 (7 pages)
With Beijing Huilongguan Sports and Cultural Park as the research object, this study investigated public service satisfaction in the park using the Importance-Performance Analysis (IPA) method. A questionnaire covering six dimensions, including public transportation, sanitation and environment, and supporting facility construction, was designed. A total of 208 valid samples were collected, and SPSS was employed for reliability and validity tests as well as IPA analysis. The findings were as follows: ① Visitors were generally quite satisfied with the overall public services in Huilongguan Sports and Cultural Park. ② The highest satisfaction levels were observed for sanitation and environment services and the sports and cultural atmosphere, while lower satisfaction was noted for supporting facility construction and public information services. ③ The advantage enhancement zone includes sanitation and environment services and the sports and cultural atmosphere; the continuous maintenance zone includes public transportation services and security management and maintenance; the subsequent opportunity zone includes supporting facility construction and public information services; and there are no dimensions in the urgent improvement zone. The study recommends strengthening service connotations from three aspects: enhancing facilities with sports as the core, optimizing services with a people-centered approach, and upgrading the information platform through technological efficiency. Additionally, a multi-stakeholder collaborative mechanism, involving the government in coordinating policy resources, the operator in improving implementation efficiency, and the public in supervision and evaluation, is proposed to drive the enhancement of public service quality at Huilongguan Sports and Cultural Park.
Keywords: Importance-Performance Analysis (IPA) method, sports park, public service, satisfaction
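The IPA quadrant logic is simple enough to show directly (a generic sketch with made-up scores loosely echoing the six dimensions; the paper's actual means come from its 208-sample SPSS analysis): each dimension is placed by comparing its mean importance and performance against the grand means.

```python
def ipa_quadrants(items):
    """Classic IPA classification.
    items: {name: (importance, performance)} on a common rating scale."""
    imp_mean = sum(i for i, _ in items.values()) / len(items)
    perf_mean = sum(p for _, p in items.values()) / len(items)
    quadrants = {}
    for name, (imp, perf) in items.items():
        if imp >= imp_mean and perf >= perf_mean:
            quadrants[name] = "keep up the good work"     # advantage zone
        elif imp >= imp_mean:
            quadrants[name] = "concentrate here"          # urgent improvement
        elif perf >= perf_mean:
            quadrants[name] = "possible overkill"
        else:
            quadrants[name] = "low priority"              # subsequent opportunity
    return quadrants

# Hypothetical (importance, performance) scores on a 1-5 scale.
scores = {
    "sanitation and environment": (4.6, 4.5),
    "sports and cultural atmosphere": (4.5, 4.4),
    "public transportation": (4.2, 4.1),
    "security management": (4.3, 4.2),
    "supporting facilities": (3.8, 3.4),
    "public information services": (3.7, 3.3),
}
for name, quad in ipa_quadrants(scores).items():
    print(f"{name}: {quad}")
```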
18. The Role of Big Data Analysis in Digital Currency Systems
Authors: Zhengkun Xiu. Proceedings of Business and Economic Studies, 2025, Issue 1, pp. 1-5 (5 pages)
In the contemporary era, characterized by the Internet and digitalization as fundamental features, the operation and application of digital currency have gradually developed into a comprehensive structural system. This system restores the essential characteristics of currency while providing auxiliary services related to the formation, circulation, storage, application, and promotion of digital currency. Compared to traditional currency management technologies, big data analysis technology, which is primarily embedded in digital currency systems, enables the rapid acquisition of information. This facilitates the identification of standard associations within currency data and provides technical support for the operational framework of digital currency.
Keywords: big data, digital currency, computational methods, transaction speed
Evaluating fracture volume loss during production process by comparative analysis of initial and second flowback data
19
作者 Chong Cao Tamer Moussa Hassan Dehghanpour 《International Journal of Coal Science & Technology》 2025年第3期274-290,共17页
The fracture volume gradually changes as fracture pressure depletes during production. However, few flowback models available so far can estimate the fracture volume loss using pressure-transient and rate-transient data. The initial flowback involves producing back the fracturing fluid after hydraulic fracturing, while the second flowback involves producing back the preloading fluid injected into the parent wells before fracturing of the child wells. The main objective of this research is to compare the initial and second flowback data to capture the changes in fracture volume after the production and preload processes. Such a comparison is useful for evaluating well performance and optimizing fracturing operations. We construct rate-normalized pressure (RNP) versus material-balance time (MBT) diagnostic plots using both initial and second flowback data (FB1 and FB2, respectively) of six multi-fractured horizontal wells completed in the Niobrara and Codell formations in the DJ Basin. In general, the slope of the RNP plot during the FB2 period is higher than that during the FB1 period, indicating a potential loss of fracture volume from the FB1 to the FB2 period. We estimate the changes in effective fracture volume (V_ef) by analyzing the changes in the RNP slope and total compressibility between these two flowback periods. V_ef during FB2 is in general 3%-45% lower than that during FB1. We also compare the drive mechanisms for the two flowback periods by calculating the compaction-drive index (CDI), hydrocarbon-drive index (HDI), and water-drive index (WDI). The dominant drive mechanism during both flowback periods is compaction drive, but its contribution is reduced by 16% in the FB2 period. This drop is generally compensated by a relatively higher HDI during the FB2 period. The loss of effective fracture volume might be attributed to the pressure depletion in fractures, which occurs during the production period and can extend over 800 days.
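The RNP-versus-MBT diagnostic described in the abstract can be sketched numerically. RNP = (p_i − p_wf)/q and MBT = N_p/q are standard rate-transient definitions; the single-tank material-balance relation slope ≈ 1/(c_t·V_ef) used below to back out an effective fracture volume is an illustrative assumption, not the authors' exact model, and all variable names are hypothetical.

```python
import numpy as np

def rnp_mbt(p_i, p_wf, q, N_p):
    """Rate-normalized pressure and material-balance time for flowback data.

    p_i  : initial fracture pressure (psi), scalar
    p_wf : flowing bottomhole pressures (psi), array
    q    : flowback rates (bbl/d), array
    N_p  : cumulative produced volumes (bbl), array
    """
    p_wf, q, N_p = map(np.asarray, (p_wf, q, N_p))
    rnp = (p_i - p_wf) / q   # psi per (bbl/d)
    mbt = N_p / q            # days
    return rnp, mbt

def effective_fracture_volume(rnp, mbt, c_t):
    """Estimate V_ef from the straight-line RNP-vs-MBT slope under a
    single-tank material-balance assumption: slope ~ 1 / (c_t * V_ef)."""
    slope, _intercept = np.polyfit(mbt, rnp, 1)
    return 1.0 / (c_t * slope)
```

Comparing the slope (and hence V_ef) fitted separately to FB1 and FB2 data gives the relative fracture-volume change reported in the abstract.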
Keywords: Second flowback data analysis; Infill development; Preloading effect; Effective fracture volume loss; Flowback rate-transient analysis
Data credibility evaluation method for formation water in oil and gas fields and its influencing factors
20
Authors: LI Wei, XIE Wuren, WU Saijun, SHUAI Yanhua, MA Xingzhi 《Petroleum Exploration and Development》 2025, No. 2, pp. 361-376 (16 pages)
Formation water samples in oil and gas fields may be polluted during testing, trial production, collection, storage, transportation, and analysis, so that the measured properties may not truly reflect the formation water. This paper discusses identification methods and a data credibility evaluation method for formation water in oil and gas fields of petroliferous basins within China. The results of the study show that: (1) the identification methods for formation water include basic single-factor methods based on physical characteristics, water composition characteristics, water type characteristics, and characteristic coefficients, as well as a comprehensive data credibility evaluation method proposed on this basis, which mainly relies on correlation analysis of the sodium chloride coefficient and the desulfurization coefficient combined with geological background evaluation; (2) the basic identification methods enable preliminary identification of hydrochemical data and preliminary screening of data on site, while the proposed comprehensive method realizes the evaluation by classifying CaCl2-type water into types A-I to A-VI and NaHCO3-type water into types B-I to B-IV, so that researchers can evaluate the credibility of hydrochemical data in depth and analyze influencing factors; (3) when the basic methods are used to identify formation water, any formation water containing anions such as CO_(3)^(2-), OH^(-) and NO_(3)^(-), or formation water whose sodium chloride coefficient and desulfurization coefficient do not match the geological setting, has been invaded by surface water or polluted by working fluid; (4) when the comprehensive method is used, the data credibility of A-I, A-II, B-I and B-II formation water can be evaluated effectively and accurately only if combined with geological setting analysis of factors such as formation environment, sampling conditions, condensate water, acid fluid, leaching of ancient weathering crust, and ancient atmospheric fresh water, even though such formation water is generally considered highly credible.
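The single-factor characteristic coefficients named in the abstract can be computed directly from ionic concentrations. A minimal sketch, assuming concentrations given in milliequivalents per litre (meq/L); the screening thresholds used here (rNa+/rCl- ≤ 0.85, desulfurization coefficient ≤ 1) are illustrative values commonly associated with well-preserved CaCl2-type water, not thresholds taken from this paper.

```python
def characteristic_coefficients(na_meq, cl_meq, so4_meq):
    """Sulin-style characteristic coefficients from ion concentrations
    in meq/L: sodium chloride coefficient rNa+/rCl- and
    desulfurization coefficient 100 * rSO4^2- / rCl-."""
    sodium_chloride = na_meq / cl_meq
    desulfurization = 100.0 * so4_meq / cl_meq
    return sodium_chloride, desulfurization

def preliminary_screen(na_meq, cl_meq, so4_meq,
                       na_cl_max=0.85, desulf_max=1.0):
    """Single-factor pre-screen: flag a sample as possibly polluted
    (surface-water invasion or working-fluid contamination) when either
    coefficient exceeds an illustrative threshold.  The default
    thresholds are assumptions for demonstration, not values from
    the paper."""
    na_cl, desulf = characteristic_coefficients(na_meq, cl_meq, so4_meq)
    credible = (na_cl <= na_cl_max) and (desulf <= desulf_max)
    return {"na_cl": na_cl, "desulfurization": desulf, "credible": credible}
```

In the paper's workflow, such single-factor screening is only the first step; the comprehensive A-I to A-VI / B-I to B-IV classification additionally requires geological-setting analysis.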
Keywords: oil and gas field hydrogeology; formation water; hydrochemical data; data credibility evaluation method; hydrochemical characteristic indicator; influencing factor