The “First Multidisciplinary Forum on COVID-19,” as one of the pivotal medical forums organized by China during the initial outbreak of the pandemic, garnered significant attention from numerous countries worldwide. Ensuring the seamless execution of the forum’s translation necessitated exceptionally high standards for simultaneous interpreters. This paper, through the lens of the Translation as Adaptation and Selection theory within Eco-Translatology, conducts an analytical study of the live simultaneous interpretation at the First Multidisciplinary Forum on COVID-19. It examines the interpreters’ adaptations and selections across multiple dimensions, namely linguistic, cultural, and communicative, with the aim of elucidating the guiding role and recommendations that Eco-Translatology can offer to simultaneous interpretation. Furthermore, it seeks to provide insights that may enhance the quality of interpreters’ oral translations.
The Husimi function (Q-function) of a quantum state is the distribution function of the density operator in the coherent state representation. It is widely used in theoretical research, such as in quantum optics. The Wehrl entropy is the Shannon entropy of the Husimi function, and is nonzero even for pure states. This entropy has been extensively studied in mathematical physics. Recent research also suggests a significant connection between the Wehrl entropy and many-body quantum entanglement in spin systems. We investigate the statistical interpretation of the Husimi function and the Wehrl entropy, taking the system of N spin-1/2 particles as an example. Due to the completeness of coherent states, the Husimi function and Wehrl entropy can be explained via the positive operator-valued measurement (POVM) theory, although the coherent states do not form an orthonormal basis. Here, with the help of Bayes’ theorem, we provide an alternative probabilistic interpretation for the Husimi function and the Wehrl entropy. This interpretation is based on direct measurements of the system, and thus does not require the introduction of an ancillary system as in the POVM theory. Moreover, under this interpretation the classical correspondences of the Husimi function and the Wehrl entropy are just the phase-space probability distribution function of N classical tops and its associated entropy, respectively. Therefore, this explanation contributes to a better understanding of the relationship between the Husimi function, the Wehrl entropy, and classical-quantum correspondence. The generalization of this statistical interpretation to continuous-variable systems is also discussed.
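The quantities above can be computed directly in the simplest case. Below is a minimal numerical sketch for a single spin-1/2 (not the N-particle system of the paper), using spin coherent states |θ,φ⟩ = cos(θ/2)|↑⟩ + e^{iφ} sin(θ/2)|↓⟩ and the convention S_W = −((2j+1)/4π)∫ Q ln Q dΩ; note that conventions for the prefactor vary in the literature.

```python
import numpy as np

def wehrl_entropy(rho, n_theta=400, n_phi=400):
    """Wehrl entropy of a spin-1/2 density matrix rho (2x2, basis |up>, |down>).

    Uses spin coherent states |theta,phi> = cos(theta/2)|up> + e^{i phi} sin(theta/2)|down>
    and the convention S_W = -(2j+1)/(4 pi) * Int Q ln Q dOmega with j = 1/2.
    """
    dth = np.pi / n_theta
    dph = 2 * np.pi / n_phi
    theta = (np.arange(n_theta) + 0.5) * dth    # midpoint grid over the sphere
    phi = (np.arange(n_phi) + 0.5) * dph
    TH, PH = np.meshgrid(theta, phi, indexing="ij")
    c_up = np.cos(TH / 2)                        # coherent-state spinor components
    c_dn = np.exp(1j * PH) * np.sin(TH / 2)
    # Husimi function Q = <z|rho|z>
    Q = (np.abs(c_up) ** 2 * rho[0, 0].real
         + np.abs(c_dn) ** 2 * rho[1, 1].real
         + 2 * np.real(np.conj(c_up) * c_dn * rho[0, 1]))
    integrand = -Q * np.log(np.clip(Q, 1e-300, None)) * np.sin(TH)
    return (2 / (4 * np.pi)) * np.sum(integrand) * dth * dph

# coherent (pure spin-up) state: minimal Wehrl entropy 2j/(2j+1) = 1/2
rho_up = np.array([[1, 0], [0, 0]], dtype=complex)
# maximally mixed state: Q = 1/2 everywhere, so S_W = ln 2
rho_mix = np.eye(2, dtype=complex) / 2
print(wehrl_entropy(rho_up))   # ~0.5
print(wehrl_entropy(rho_mix))  # ~0.693
```

The coherent-state value 1/2 is the Lieb-Solovej minimum 2j/(2j+1) for j = 1/2, illustrating that the Wehrl entropy is nonzero even for pure states.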
The Interpretation of Nursing Guidelines for Intravenous Thrombolysis in Acute Ischemic Stroke offers comprehensive recommendations across five key domains: hospital organizational management, patient condition monitoring, complication observation and management, positioning and mobility away from the bed, and quality assurance. These Guidelines encompass all phases of intravenous thrombolysis care for patients experiencing acute ischemic stroke. This article aims to elucidate the Guidelines by discussing their developmental background, formulation process, usage recommendations, and the interpretation of evolving perspectives, thereby providing valuable insights for clinical practice.
Dynamic stress adjustment in deep-buried, high-geostress hard rock tunnels frequently triggers catastrophic failures such as rockbursts and collapses. While a comprehensive understanding of this process is critical for evaluating surrounding rock stability, its dynamic evolution is often overlooked in engineering practice. This study systematically summarizes a novel classification framework for stress adjustment types: stabilizing (two-zoned), shallow failure (three-zoned), and deep failure (four-zoned), each characterized by distinct stress adjustment stages. A dynamic interpretation technology system is developed based on microseismic monitoring, integrating key microseismic parameters (energy index EI, apparent stress σa, microseismic activity S), spatial clustering of seismic source parameters, and microseismic paths. This approach enables precise identification of evolutionary stages, stress adjustment types, and failure precursors, thereby elucidating the intrinsic linkage between geomechanical processes (stress redistribution) and failure risks. The study establishes criteria and procedures for identifying stress adjustment types and their associated failure risks, which were successfully applied in the Grand Canyon Tunnel of the E-han Highway to detect 50 instances of disaster risk. The findings offer invaluable insights into understanding the evolution of stress adjustment and pinpointing the disaster risks linked to hard rock in comparable high-geostress tunnels.
Accurate determination of rockhead is crucial for underground construction. Traditionally, borehole data are mainly used for this purpose. However, borehole drilling is costly, time-consuming, and sparsely distributed. Non-invasive geophysical methods, particularly those using passive seismic surface waves, have emerged as viable alternatives for geological profiling and rockhead detection. This study proposes three interpretation methods for rockhead determination using passive seismic surface wave data from Microtremor Array Measurement (MAM) and Horizontal-to-Vertical Spectral Ratio (HVSR) tests. These are: (1) the Wavelength-Normalized phase velocity (WN) method, in which a nonlinear relationship between rockhead depth and wavelength is established; (2) the Statistically Determined shear wave velocity (SD-Vs) method, in which the representative Vs value for rockhead is automatically determined using a statistical method; and (3) the empirical HVSR method, in which the rockhead is determined by interpreting resonant frequencies using a reliably calibrated empirical equation. These methods were implemented to determine rockhead depths at 28 locations across two distinct geological formations in Singapore, and the results were evaluated against borehole data. The WN method determines rockhead depths accurately and reliably with minimal absolute errors (average RMSE = 3.11 m), demonstrating robust performance across both geological formations. Its advantage lies in interpreting dispersion curves alone, without the need for an inversion process. The SD-Vs method is practical in engineering owing to its simplicity. The empirical HVSR method reasonably determines rockhead depths with moderate accuracy, benefiting from a reliably calibrated empirical equation.
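For the empirical HVSR route, the link between resonant frequency and interface depth is commonly rooted in the quarter-wavelength relation f0 ≈ Vs/(4h). The paper's own calibrated equation is not given in the abstract, so the sketch below uses this textbook relation as a stand-in, together with the RMSE scoring used to evaluate the methods against borehole depths.

```python
import math

def rockhead_depth_quarter_wavelength(f0_hz, vs_mps):
    """Depth to a strong impedance contrast from the HVSR resonant frequency.

    Textbook quarter-wavelength relation: f0 = Vs / (4 h)  =>  h = Vs / (4 f0).
    vs_mps is the average shear-wave velocity of the soil column (an assumption
    here; the paper uses its own calibrated empirical equation).
    """
    return vs_mps / (4.0 * f0_hz)

def rmse(predicted, observed):
    """Root-mean-square error against borehole depths, as used to score the methods."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(predicted))

# hypothetical site: resonance at 2.5 Hz, average soil Vs of 250 m/s
print(rockhead_depth_quarter_wavelength(2.5, 250.0))  # 25.0 m
```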
The Agadem block is an area of major oil interest located in the large sedimentary basin of Termit, in the southeast of the Republic of Niger. Since the 1950s, this basin has been the subject of geological and geophysical research. However, despite the extensive research carried out, a geophysical contribution in terms of magnetic properties and their implications for the structure of the Agadem block is essential to improve existing knowledge. The present study aims to investigate the structural characteristics of the Agadem block associated with magnetic anomalies. To this end, after data preparation, several filtering techniques were applied to the aeromagnetic data to identify and map deep geological structures. The reduction-to-the-pole map shows long-wavelength negative anomalies in the southeastern half of the block and short-wavelength positive anomalies in the northwestern part, embedded in a large positive anomaly occupying the lower northern half of the block. The maps of the total horizontal derivative and tilt angle show lineaments globally distributed along the NW-SE direction, in accordance with the structural style of the study area. The resulting map highlights numerous lineaments that may be associated with faults hidden by the sedimentary cover. Euler deconvolution allowed us to locate magnetic sources and estimate their depths, which reach up to 4000 m. The compilation of the results allowed us to locate zones of high and low intensities, which correspond respectively to horsts and grabens, the major structures of the Agadem block.
In recent years, deep learning has been widely applied in synthetic aperture radar (SAR) image processing. However, the collection of large-scale labeled SAR images is challenging and costly, and classification accuracy is often poor when only limited SAR images are available. To address this issue, we propose a novel framework for sparse SAR target classification under few-shot conditions, termed the transfer learning-based interpretable lightweight convolutional neural network (TL-IL-CNN). Additionally, we employ enhanced gradient-weighted class activation mapping (Grad-CAM) to mitigate the “black box” effect often associated with deep learning models and to explore the mechanisms by which a CNN classifies various sparse SAR targets. Initially, we apply a novel bidirectional iterative soft thresholding (BiIST) algorithm to generate sparse images of superior quality compared to those produced by traditional matched filtering (MF) techniques. Subsequently, we pretrain multiple shallow CNNs on a simulated SAR image dataset. Using the sparse SAR dataset as input to the CNNs, we assess the efficacy of transfer learning in sparse SAR target classification and propose the integrated TL-IL-CNN to further enhance classification accuracy. Finally, Grad-CAM is utilized to provide visual explanations for the predictions made by the classification framework. Experimental results on the MSTAR dataset reveal that the proposed TL-IL-CNN achieves nearly 90% classification accuracy with only 20% of the training data required under standard operating conditions (SOC), surpassing typical deep learning methods such as the vision Transformer (ViT) in the small-sample setting. Remarkably, it performs even better under extended operating conditions (EOC). Furthermore, the application of Grad-CAM elucidates the CNN’s differentiation process among various sparse SAR targets. The experiments indicate that whether the model focuses on the target or on the background can differ among target classes. The study contributes to an enhanced understanding of the interpretability of such results and enables us to infer the classification outcomes for each category more accurately.
The discrete fracture system of a rock mass plays a crucial role in controlling the stability of rock slopes. To fully account for the geometric shape and distribution characteristics of jointed rock masses, terrestrial laser scanning (TLS) was employed to acquire high-resolution point-cloud data, and an automatic discontinuity-identification technology was coupled to automatically interpret and characterize geometric information such as orientation, trace length, spacing, and set number of the discontinuities. The discrete element method (DEM) was applied to study the influence of the geometric morphology and distribution characteristics of discontinuities on slope stability by generating a discrete fracture network (DFN) with the same statistical characteristics as the actual discontinuities. Based on slope data from the Yebatan Hydropower Station, a simulation was conducted to verify the applicability of the automatic discontinuity-identification technology and the discrete fracture network-discrete element method (DFN-DEM). Various geological parameters, including trace length, persistence, and density, were examined to investigate the morphological evolution and response characteristics of rock slope excavation under different joint combination conditions through simulation. The simulation results indicate that joint parameters affect slope stability, with density having the most significant impact. The impact of joint parameters on stability is relatively small within a reasonable range but becomes significant beyond a certain threshold, further validating that the accuracy of field geological surveys is critical for simulation. This study provides a scientific basis for the construction of complex rock slope models, engineering assessments, and disaster prevention and mitigation, and is of great value in both theory and engineering applications.
Computer analysis of electrocardiograms (ECGs) was introduced more than 50 years ago, with the aim to improve efficiency and clinical workflow.[1,2] However, inaccuracies have been documented in the literature.[3,4] Research indicates that emergency department (ED) clinician interruptions occur every 4-10 min, which is significantly more common than in other specialties.[5] This increases cognitive load and error rates and impacts patient care and clinical efficiency.[1,2,5] De-prioritization protocols have been introduced in certain centers in the United Kingdom (UK), removing the need for clinician ECG interpretation where ECGs have been interpreted as normal by the machine.
With the successful application and breakthrough of deep learning technology in image segmentation, there has been continuous development in the field of seismic facies interpretation using convolutional neural networks. These intelligent and automated methods significantly reduce manual labor, particularly in the laborious task of manually labeling seismic facies. However, the extensive demand for training data imposes limitations on their wider application. To overcome this challenge, we adopt the UNet architecture as the foundational network structure for seismic facies classification, which has demonstrated effective segmentation results even with small-sample training data. Additionally, we integrate spatial pyramid pooling and dilated convolution modules into the network architecture to enhance the perception of spatial information across a broader range. A seismic facies classification test on the public data from the F3 block verifies the superior performance of our proposed improved network structure in delineating seismic facies boundaries. Comparative analysis against the traditional UNet model reveals that our method achieves more accurate predictive classification results, as evidenced by various evaluation metrics for image segmentation; the classification accuracy reaches 96%. Furthermore, the results of seismic facies classification in the seismic slice dimension provide further confirmation of the superior performance of our proposed method, which accurately defines the extent of different seismic facies. This approach holds significant potential for analyzing geological patterns and extracting valuable depositional information.
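The dilated convolutions mentioned above widen the receptive field without adding parameters: a kernel of size k with dilation d spans (k−1)d+1 samples. A minimal 1-D numpy illustration of that effect (a sketch, not the paper's network):

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation):
    """'Valid' 1-D convolution with gaps of (dilation-1) samples between kernel taps."""
    k = len(kernel)
    span = (k - 1) * dilation + 1          # effective receptive field of this layer
    out = np.empty(len(x) - span + 1)
    for i in range(len(out)):
        out[i] = sum(kernel[j] * x[i + j * dilation] for j in range(k))
    return out

x = np.arange(10, dtype=float)
# same three weights; dilation stretches the span from 3 samples to 7
print(dilated_conv1d(x, [1.0, 1.0, 1.0], dilation=1))  # [ 3.  6.  9. 12. 15. 18. 21. 24.]
print(dilated_conv1d(x, [1.0, 1.0, 1.0], dilation=3))  # [ 9. 12. 15. 18.]
```

Stacking such layers with growing dilation, as the spatial pyramid pooling and dilated modules do, lets the network aggregate context over broad ranges at low cost.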
For departmental legal norms concerning citizens’ basic rights, when multiple interpretations are possible based on individual case circumstances, interpreters representing public authority need to apply the method of constitutional interpretation to select the interpretation conclusions that do not violate the Constitution. This means selecting interpretations at the constitutional level that do not overly restrict citizens’ basic rights and understanding the specific connotations of legal norms under the principle of “not infringing on citizens’ basic rights.” The Constitution, as a framework order, does not require interpreters to choose the most constitutionally aligned interpretation among various constitutional interpretations. If a legal norm has no constitutional interpretation conclusion in an individual case, this indicates that the application of that norm in the case is unconstitutional, and the interpreter should avoid applying the legal norm in that case. Regarding judgment standards, interpreters should apply the principle of proportionality to determine whether each legal interpretation conclusion concerning basic-rights-related legal norms complies with the Constitution. Out of respect for the legislature, the application of the sub-principles of proportionality should consider the boundaries of interpretative action.
Artificial Intelligence (AI) and Machine Learning (ML) technologies, particularly Deep Learning (DL), have demonstrated significant potential in the interpretation of Remote Sensing (RS) imagery, covering tasks such as scene classification, object detection, land-cover/land-use classification, change detection, and multi-view stereo reconstruction. Large-scale training samples are essential for ML/DL models to achieve optimal performance. However, the current organization of training samples is ad hoc and vendor-specific, lacking an integrated approach that can effectively manage training samples from different vendors to meet the demands of various RS AI tasks. This article proposes a solution to these challenges by designing and implementing LuoJiaSET, a large-scale training sample database system for intelligent interpretation of RS imagery. LuoJiaSET accommodates over five million training samples, providing support for cross-dataset queries and serving as a comprehensive training data store for RS AI model training and calibration. It overcomes challenges related to label semantic categories, structural heterogeneity in label representation, and interoperable data access.
The application of whole genome sequencing is expanding in clinical diagnostics across various genetic disorders, and the significance of non-coding variants in penetrant diseases is increasingly being demonstrated. Therefore, it is urgent to improve the diagnostic yield by exploring the pathogenic mechanisms of variants in non-coding regions. However, the interpretation of non-coding variants remains a significant challenge, due to the complex functional regulatory mechanisms of non-coding regions and the current limitations of available databases and tools. Hence, we develop the non-coding variant annotation database (NCAD, http://www.ncawdb.net/), encompassing comprehensive insights into 665,679,194 variants, regulatory elements, and element interaction details. Integrating data from 96 sources, spanning both GRCh37 and GRCh38 versions, NCAD v1.0 provides vital information to support the genetic diagnosis of non-coding variants, including allele frequencies of 12 diverse populations, with a particular focus on the population frequency information for 230,235,698 variants in 20,964 Chinese individuals. Moreover, it offers prediction scores for variant functionality, five categories of regulatory elements, and four types of non-coding RNAs. With its rich data and comprehensive coverage, NCAD serves as a valuable platform, empowering researchers and clinicians with profound insights into non-coding regulatory mechanisms while facilitating the interpretation of non-coding variants.
During injection treatments, bottomhole pressure measurements may significantly mismatch modeling results. We devise a computationally effective technique for interpretation of fluid injection in a wellbore interval with multiple geological layers based on bottomhole pressure measurements. The permeability, porosity, and compressibility of each layer are initially set up, while the skin factor and the partitioning of injected fluids among the zones during injection are found as the solution of the problem. The problem takes into account Darcy flow and chemical interactions between the injected acids, diverter fluids, and reservoir rock typical of modern matrix acidizing treatments. Using the synchronously recorded injection rate and bottomhole pressure, we evaluate skin factor changes in each layer and actual fluid placement into the reservoir during different pumping jobs: matrix acidizing, water control, sand control, scale squeezes, and water flooding. The model is validated by comparison with a simulator used in industry. It provides the opportunity to estimate the efficiency of a matrix treatment job and the role of every injection stage, and to control fluid delivery to each layer in real time. The presented interpretation technique significantly improves the accuracy of matrix treatment analysis by coupling the hydrodynamic model with records of pressure and injection rate during the treatment.
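The partitioning of injected fluid among layers can be illustrated in the simplest steady-state limit: under radial Darcy flow into layers open to a common bottomhole pressure, each layer's intake scales as k·h/(ln(re/rw)+s). The sketch below shows only that proportionality (the paper's model additionally handles transients, chemistry, and time-varying skin); the layer properties are hypothetical.

```python
import math

def layer_rates(q_total, layers, re=100.0, rw=0.1):
    """Split an injection rate among layers open to a common wellbore pressure.

    Steady-state radial Darcy flow: q_i proportional to (k_i * h_i) / (ln(re/rw) + s_i).
    layers: list of dicts with permeability k (consistent units), thickness h (m),
    and skin factor s (dimensionless). re/rw: drainage and wellbore radii (m).
    """
    inj = [L["k"] * L["h"] / (math.log(re / rw) + L["s"]) for L in layers]
    total = sum(inj)
    return [q_total * j / total for j in inj]

layers = [
    {"k": 100e-15, "h": 5.0, "s": 0.0},   # clean layer
    {"k": 100e-15, "h": 5.0, "s": 10.0},  # identical layer, but damaged (positive skin)
]
rates = layer_rates(1000.0, layers)
print(rates)  # the damaged layer takes the smaller share of the 1000-unit rate
```

This is why rising skin in one zone silently redirects treatment fluid to the others, which is exactly what the inversion against measured pressure is designed to detect.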
Gas chromatography-mass spectrometry (GC-MS) is an extremely important analytical technique that is widely used in organic geochemistry. It is the only approach to capture biomarker features of organic matter, and it provides key evidence for oil-source correlation and thermal maturity determination. However, the conventional way of processing and interpreting the mass chromatogram is both time-consuming and labor-intensive, which increases research cost and restrains extensive application of this method. To overcome this limitation, a correlation model is developed based on a convolutional neural network (CNN) to link the mass chromatogram and biomarker features of samples from the Triassic Yanchang Formation, Ordos Basin, China. In this way, the mass chromatogram can be automatically interpreted. This research first performs dimensionality reduction for 15 biomarker parameters via factor analysis and then quantifies the biomarker features using two indexes (i.e., MI and PMI) that represent the organic matter thermal maturity and parent material type, respectively. Subsequently, training, interpretation, and validation are performed multiple times using different CNN models to optimize the model structure and hyper-parameter settings, with the mass chromatogram used as the input and the obtained MI and PMI values used for supervision (labels). The optimized model presents high accuracy in automatically interpreting the mass chromatogram, with R² values typically above 0.85 and 0.80 for the thermal maturity and parent material interpretation results, respectively. The significance of this research is twofold: (i) developing an efficient technique for geochemical research; (ii) more importantly, demonstrating the potential of artificial intelligence in organic geochemistry and providing vital references for future related studies.
The Pennsylvanian unconformity, which is a detrital surface, separates the beds of the Permian-aged strata from the Lower Paleozoic in the Central Basin Platform. Seismic data interpretation indicates that the unconformity is an angular unconformity, overlying multiple normal faults and accompanied by a thrust fault that maximizes the region's structural complexity. Additionally, the Pennsylvanian angular unconformity creates pinch-outs between the beds above and below. We computed the spectral decomposition and reflector convergence attributes and analyzed them to characterize the angular unconformity and faults. The spectral decomposition attribute divides the broadband seismic data into different spectral bands to resolve thin beds and show thickness variations. In contrast, the reflector convergence attribute highlights the location and direction of the pinch-outs as they dip south at angles between 2° and 6°. After reviewing findings from RGB blending of the spectrally decomposed frequencies along the Pennsylvanian unconformity, we observed channel-like features and multiple linear bands in addition to the faults and pinch-outs. It can be inferred that the identified linear bands could result from different lithologies associated with the tilting of the beds, and the faults may influence hydrocarbon migration or act as flow barriers that entrap hydrocarbon accumulations. The identification of this angular unconformity and the associated features in the study area is vital for the following reasons: 1) the unconformity surface represents a natural stratigraphic boundary; 2) the stratigraphic pinch-outs act as fluid-flow connectivity boundaries; 3) the areal extents of compartmentalized reservoir boundaries created by the angular unconformity are better defined; and 4) fault displacements are better understood when planning well locations, as faults can be flow barriers or permeability conduits, depending on facies heterogeneity and/or the seal effectiveness of a fault, which can affect hydrocarbon production. The methodology utilized in this study is a further step in the characterization of reservoirs and can be used to expand our knowledge and obtain more information about the Goldsmith Field.
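Spectral decomposition of the kind used above can be sketched as a sliding Gaussian-windowed Fourier transform, which measures the amplitude of each target frequency along the trace; three such bands are what an RGB blend would combine. A minimal numpy version (commercial attribute implementations differ in windowing and normalization):

```python
import numpy as np

def spectral_decomposition(trace, dt, freqs, sigma=0.02):
    """Amplitude of each target frequency at every sample, via a sliding Gaussian window.

    trace: 1-D seismic trace; dt: sample interval (s); freqs: target frequencies (Hz);
    sigma: window half-width (s). Returns an array of shape (len(freqs), len(trace)).
    """
    n = len(trace)
    t = np.arange(n) * dt
    out = np.zeros((len(freqs), n))
    for fi, f in enumerate(freqs):
        carrier = np.exp(-2j * np.pi * f * t)          # complex exponential at frequency f
        for i in range(n):
            w = np.exp(-0.5 * ((t - t[i]) / sigma) ** 2)  # Gaussian window centered at sample i
            out[fi, i] = np.abs(np.sum(trace * w * carrier)) * dt
    return out

# a 30 Hz burst should light up the 30 Hz band far more than the 60 Hz band
dt = 0.002
t = np.arange(256) * dt
trace = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 0.25) / 0.05) ** 2)
amp = spectral_decomposition(trace, dt, [30.0, 60.0])
print(amp[0].max() > amp[1].max())  # True
```

Because thin beds tune at frequencies inversely proportional to their thickness, comparing such bands along an interpreted horizon reveals the thickness variations the abstract describes.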
Predictive modeling of photocatalytic NO removal is highly desirable for efficient air pollution abatement. However, great challenges remain in precisely predicting photocatalytic performance and understanding the interactions of diverse features in catalytic systems. Herein, a dataset of g-C₃N₄-based catalysts with 255 data points was collected from peer-reviewed publications, and a machine learning (ML) model was proposed to predict the NO removal rate. The results show that the Gradient Boosting Decision Tree (GBDT) demonstrated the greatest prediction accuracy, with R² of 0.999 and 0.907 on the training and test data, respectively. The SHAP values and feature importance analysis revealed that the empirical categories for NO removal rate, in order of importance, were catalyst characteristics > reaction process > preparation conditions. Moreover, partial dependence plots opened the ML black box to further quantify the marginal contributions of the input features (e.g., doping ratio, flow rate, and pore volume) to the model outputs. This ML approach presents a purely data-driven, interpretable framework, which provides new insights into the influence of catalyst characteristics, reaction process, and preparation conditions on NO removal.
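The GBDT at the core of this workflow fits an additive ensemble in which each new tree approximates the negative gradient of the loss (for squared loss, simply the residuals) of the current prediction. A from-scratch sketch with single-feature decision stumps, purely to show the mechanism (the study used a full GBDT implementation plus SHAP on multi-feature catalyst data):

```python
import numpy as np

def fit_stump(x, y):
    """Best single-threshold split on feature x minimizing squared error on y."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best = None
    for i in range(1, len(xs)):
        thr = (xs[i - 1] + xs[i]) / 2
        left, right = ys[:i].mean(), ys[i:].mean()
        err = ((ys[:i] - left) ** 2).sum() + ((ys[i:] - right) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, thr, left, right)
    _, thr, left, right = best
    return lambda q: np.where(q < thr, left, right)

def boost(x, y, n_rounds=50, lr=0.1):
    """Gradient boosting for squared loss: each stump is fit to the current residuals."""
    pred = np.full(len(y), y.mean())
    for _ in range(n_rounds):
        stump = fit_stump(x, y - pred)   # residuals = negative gradient of squared loss
        pred = pred + lr * stump(x)      # shrunken additive update
    return pred

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, 200)
pred = boost(x, y)
print(np.mean((y - pred) ** 2) < np.var(y))  # True: boosting drives training MSE below var(y)
```

Real GBDT libraries generalize this to deeper trees, many features, and regularization, which is what makes the SHAP and partial dependence analyses in the paper possible.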
The periphery of the Qinghai-Tibet Plateau is renowned for its susceptibility to landslides. However, the northwestern margin of this region, characterised by limited human activity and challenging transportation, remains insufficiently explored concerning landslide occurrence and distribution. With the planning and construction of the Xinjiang-Xizang Railway, a comprehensive investigation into disastrous landslides in this area is essential for effective disaster preparedness and mitigation strategies. Using a human-computer interaction interpretation approach, the authors established a landslide database encompassing 13,003 landslides, collectively spanning an area of 3351.24 km² (36°N-40°N, 73°E-78°E). The database incorporates diverse topographical and environmental parameters, including regional elevation, slope angle, slope aspect, distance to faults, distance to roads, distance to rivers, annual precipitation, and stratum. The statistical characteristics of the number and area of landslides, landslide number density (LND), and landslide area percentage (LAP) are analyzed. The authors found a predominant concentration of landslide origins within high-slope-angle regions, with the highest incidence observed in intervals characterised by average slopes of 20° to 30°, maximum slope angles above 80°, and orientations towards the north (N), northeast (NE), and southwest (SW). Additionally, elevations above 4.5 km, distances to rivers below 1 km, and rainfall between 20-30 mm and 30-40 mm emerge as particularly susceptible to landslide development. The study area's geological composition primarily comprises Mesozoic and Upper Paleozoic outcrops. Both faults and human engineering activities have differing degrees of influence on landslide development. Furthermore, the significance of the landslide database, the relationship between landslide distribution and environmental factors, and the geometric and morphological characteristics of landslides are discussed. The landslide H/L ratios in the study area are mainly concentrated between 0.4 and 0.64. This indicates that landslide mobility in the region is relatively low, and the authors speculate that landslides in this region were more likely triggered by earthquakes or located in the meizoseismal area.
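The H/L ratio quoted above is the standard landslide mobility index: total elevation drop H over horizontal runout length L, with lower values meaning longer runout for a given drop. A trivial sketch with hypothetical numbers:

```python
def mobility_ratio(drop_height_m, runout_length_m):
    """H/L: total elevation drop over horizontal runout length (lower = more mobile)."""
    return drop_height_m / runout_length_m

# the study area's landslides cluster between H/L = 0.4 and 0.64 (relatively low mobility)
h_over_l = mobility_ratio(300.0, 600.0)   # hypothetical 300 m drop over 600 m runout
print(h_over_l, 0.4 <= h_over_l <= 0.64)  # 0.5 True
```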
The complex pore structure of carbonate reservoirs hinders the correlation between porosity and permeability. In view of the sedimentation, diagenesis, testing, and production characteristics of carbonate reservoirs in the study area, combined with current trends and advances in well log interpretation techniques for carbonate reservoirs, a log interpretation technology route of “geological information constraint + deep learning” was developed. Principal component analysis (PCA) was employed to establish lithology identification criteria with an accuracy of 91%. The Bayesian stepwise discriminant method was used to construct a sedimentary microfacies identification method with an accuracy of 90.5%. Based on production data, the main lithologies and sedimentary microfacies of effective reservoirs were determined, and 10 petrophysical facies with effective reservoir characteristics were identified. Constrained by petrophysical facies, the mean interpretation error of porosity compared to core analysis results is 2.7%, and the ratio of interpreted permeability to core-analysis permeability is within one order of magnitude, averaging 3.6. The research results demonstrate that deep learning algorithms can uncover correlations in carbonate reservoir well logging data. Integrating geological and production data and selecting appropriate machine learning algorithms can significantly improve the accuracy of well log interpretation for carbonate reservoirs.
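The PCA step above projects correlated log curves onto a few orthogonal components before classification. A minimal numpy sketch on synthetic "log" data (the paper's actual curves and identification criteria are not given in the abstract):

```python
import numpy as np

def pca(X, n_components):
    """Principal components of row-sample matrix X via eigendecomposition of the covariance."""
    Xc = X - X.mean(axis=0)                     # center each log curve
    cov = Xc.T @ Xc / (len(X) - 1)
    eigval, eigvec = np.linalg.eigh(cov)        # eigh returns ascending eigenvalues
    idx = np.argsort(eigval)[::-1][:n_components]
    return Xc @ eigvec[:, idx], eigval[idx] / eigval.sum()

rng = np.random.default_rng(1)
base = rng.normal(size=(500, 1))
# three synthetic "log curves" dominated by one shared factor, as correlated logs often are
logs = np.hstack([2 * base, -base, 0.5 * base]) + 0.05 * rng.normal(size=(500, 3))
scores, explained = pca(logs, 2)
print(explained[0] > 0.95)  # True: the first component captures almost all the variance
```

Feeding such low-dimensional scores into a classifier is the usual route to lithology identification criteria of the kind the paper reports.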
Abstract: The “First Multidisciplinary Forum on COVID-19,” as one of the pivotal medical forums organized by China during the initial outbreak of the pandemic, garnered significant attention from numerous countries worldwide. Ensuring the seamless execution of the forum’s translation necessitated exceptionally high standards for simultaneous interpreters. This paper, through the lens of the Translation as Adaptation and Selection theory within Eco-Translatology, conducts an analytical study of the live simultaneous interpretation at the First Multidisciplinary Forum on COVID-19. It examines the interpreters’ adaptations and selections across multiple dimensions, namely linguistic, cultural, and communicative, with the aim of elucidating the guiding role and recommendations that Eco-Translatology can offer to simultaneous interpretation. Furthermore, it seeks to provide insights that may enhance the quality of interpreters’ oral translations.
Funding: Supported by the National Key Research and Development Program of China [Grant No. 2022YFA1405300 (PZ)] and the Innovation Program for Quantum Science and Technology (Grant No. 2023ZD0300700).
Abstract: The Husimi function (Q-function) of a quantum state is the distribution function of the density operator in the coherent state representation. It is widely used in theoretical research, such as in quantum optics. The Wehrl entropy is the Shannon entropy of the Husimi function, and is nonzero even for pure states. This entropy has been extensively studied in mathematical physics. Recent research also suggests a significant connection between the Wehrl entropy and many-body quantum entanglement in spin systems. We investigate the statistical interpretation of the Husimi function and the Wehrl entropy, taking the system of N spin-1/2 particles as an example. Due to the completeness of coherent states, the Husimi function and Wehrl entropy can be explained via positive operator-valued measurement (POVM) theory, although the coherent states do not form an orthonormal basis. Here, with the help of Bayes’ theorem, we provide an alternative probabilistic interpretation of the Husimi function and the Wehrl entropy. This interpretation is based on direct measurements of the system, and thus does not require the introduction of an ancillary system as in POVM theory. Moreover, under this interpretation the classical correspondences of the Husimi function and the Wehrl entropy are simply the phase-space probability distribution function of N classical tops and its associated entropy, respectively. Therefore, this explanation contributes to a better understanding of the relationship between the Husimi function, the Wehrl entropy, and classical-quantum correspondence. The generalization of this statistical interpretation to continuous-variable systems is also discussed.
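As a quick numerical complement to the abstract above (not part of the paper), the sketch below evaluates the Husimi function and Wehrl entropy for the simplest case, a single spin-1/2 in the pure state |↑⟩, using the standard spin-coherent-state definitions. The computed entropy can be checked against the closed-form value 1/2 + ln(2π).

```python
import numpy as np

# Single spin-1/2 (j = 1/2) in the pure state |up>.
# Spin coherent state: |theta, phi> = cos(theta/2)|up> + e^{i phi} sin(theta/2)|down>.
# Husimi function:     Q(theta, phi) = (2j + 1)/(4 pi) * <theta, phi| rho |theta, phi>.
N = 1000
theta = (np.arange(N) + 0.5) * np.pi / N        # midpoint grid over [0, pi]
phi = (np.arange(N) + 0.5) * 2.0 * np.pi / N    # midpoint grid over [0, 2 pi]
TH, PH = np.meshgrid(theta, phi, indexing="ij")

# For rho = |up><up|: Q = cos^2(theta/2) / (2 pi), independent of phi.
Q = np.cos(TH / 2.0) ** 2 / (2.0 * np.pi)

# Solid-angle quadrature weights sin(theta) dtheta dphi (midpoint rule).
w = np.sin(TH) * (np.pi / N) * (2.0 * np.pi / N)

norm = np.sum(Q * w)                   # Q integrates to 1 over the sphere
S_wehrl = np.sum(-Q * np.log(Q) * w)   # Wehrl entropy; analytic value 1/2 + ln(2 pi)

print(norm)     # ~1.0
print(S_wehrl)  # ~2.3379
```

The nonzero entropy for a pure state, noted in the abstract, is visible here: even |↑⟩ has a spread-out Husimi distribution on the sphere.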
Abstract: The Interpretation of Nursing Guidelines for Intravenous Thrombolysis in Acute Ischemic Stroke offers comprehensive recommendations across five key domains: hospital organizational management, patient condition monitoring, complication observation and management, positioning and mobility away from the bed, and quality assurance. These Guidelines encompass all phases of intravenous thrombolysis care for patients experiencing acute ischemic stroke. This article aims to elucidate the Guidelines by discussing their developmental background, formulation process, usage recommendations, and the interpretation of evolving perspectives, thereby providing valuable insights for clinical practice.
Funding: Supported by the National Natural Science Foundation of China (Nos. 42177173, U23A20651 and 42130719) and the Outstanding Youth Science Fund Project of Sichuan Provincial Natural Science Foundation (No. 2025NSFJQ0003).
Abstract: Dynamic stress adjustment in deep-buried, high-geostress hard rock tunnels frequently triggers catastrophic failures such as rockbursts and collapses. While a comprehensive understanding of this process is critical for evaluating surrounding rock stability, its dynamic evolution is often overlooked in engineering practice. This study systematically summarizes a novel classification framework for stress adjustment types, namely stabilizing (two-zoned), shallow failure (three-zoned), and deep failure (four-zoned), characterized by distinct stress adjustment stages. A dynamic interpretation technology system is developed based on microseismic monitoring, integrating key microseismic parameters (energy index EI, apparent stress σa, microseismic activity S), seismic source parameter space clustering, and microseismic paths. This approach enables precise identification of evolutionary stages, stress adjustment types, and failure precursors, thereby elucidating the intrinsic linkage between geomechanical processes (stress redistribution) and failure risks. The study establishes criteria and procedures for identifying stress adjustment types and their associated failure risks, which were successfully applied in the Grand Canyon Tunnel of the E-han Highway to detect 50 instances of disaster risk. The findings offer invaluable insights into understanding the evolution of stress adjustment and pinpointing the disaster risks linked to hard rock in comparable high-geostress tunnels.
Funding: Partially supported by the Singapore Ministry of National Development and the National Research Foundation, Prime Minister’s Office, Singapore, under the Land and Liveability National Innovation Challenge (L2 NIC) Research Program (Grant No. L2NICCFP2-2015-1), and by the National Research Foundation (NRF) of Singapore under the Virtual Singapore program (Grant No. NRF2019VSG-GMS-001).
Abstract: Accurate determination of rockhead is crucial for underground construction. Traditionally, borehole data are mainly used for this purpose. However, borehole drilling is costly, time-consuming, and sparsely distributed. Non-invasive geophysical methods, particularly those using passive seismic surface waves, have emerged as viable alternatives for geological profiling and rockhead detection. This study proposes three interpretation methods for rockhead determination using passive seismic surface wave data from Microtremor Array Measurement (MAM) and Horizontal-to-Vertical Spectral Ratio (HVSR) tests. These are: (1) the Wavelength-Normalized phase velocity (WN) method, in which a nonlinear relationship between rockhead depth and wavelength is established; (2) the Statistically Determined shear wave velocity (SD-Vs) method, in which the representative Vs value for rockhead is automatically determined using a statistical method; and (3) the empirical HVSR method, in which the rockhead is determined by interpreting resonant frequencies using a reliably calibrated empirical equation. These methods were implemented to determine rockhead depths at 28 locations across two distinct geological formations in Singapore, and the results were evaluated against borehole data. The WN method can determine rockhead depths accurately and reliably with minimal absolute errors (average RMSE = 3.11 m), demonstrating robust performance across both geological formations. Its advantage lies in interpreting dispersion curves alone, without the need for an inversion process. The SD-Vs method is practical in engineering practice owing to its simplicity. The empirical HVSR method reasonably determines rockhead depths with moderate accuracy, benefiting from a reliably calibrated empirical equation.
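The abstract does not reproduce the calibrated empirical equation; as an illustrative stand-in (not the authors’ method), the classical quarter-wavelength relation f0 = Vs/(4H) is commonly used to relate an HVSR resonant peak to the depth of a strong impedance contrast. A minimal sketch, with hypothetical site values:

```python
def rockhead_depth_quarter_wavelength(f0_hz: float, vs_ms: float) -> float:
    """Depth H to a strong impedance contrast from the HVSR resonant
    frequency, via the classical quarter-wavelength rule f0 = Vs / (4 H)."""
    if f0_hz <= 0.0 or vs_ms <= 0.0:
        raise ValueError("resonant frequency and Vs must be positive")
    return vs_ms / (4.0 * f0_hz)

# Hypothetical site: soft soil with Vs ~ 200 m/s, resonant peak at 2.5 Hz.
depth = rockhead_depth_quarter_wavelength(2.5, 200.0)
print(depth)  # 20.0 m
```

A site-calibrated equation, as used in the study, would replace this generic rule with coefficients fitted against borehole control.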
Abstract: The Agadem block is an area of major oil interest located in the large sedimentary basin of Termit, in the south-east of the Republic of Niger. Since the 1950s, this basin has been the subject of geological and geophysical research. However, despite the extensive work carried out, a geophysical contribution concerning magnetic properties and their implications for the structure of the Agadem block remains essential to improve existing knowledge. The present study aims to characterize the structural features of the Agadem block associated with magnetic anomalies. After data preparation, several filtering techniques were applied to the aeromagnetic data to identify and map deep geological structures. The reduction-to-the-pole map shows long-wavelength negative anomalies in the southeast half of the block and short-wavelength positive anomalies in the northwest part, embedded in a large positive anomaly occupying the lower northern half of the block. The maps of the total horizontal derivative and tilt angle show lineaments globally distributed along the NW-SE direction, in accordance with the structural style of the study area. The resulting map highlights numerous lineaments that may be associated with faults hidden by the sedimentary cover. Euler deconvolution allowed us to locate magnetic sources and estimate their depths, which reach up to 4000 m. The compilation of the results allowed us to locate zones of high and low intensity, which correspond respectively to horsts and grabens as the major structures of the Agadem block.
Funding: Supported in part by the National Natural Science Foundation of China (Nos. 62271248, 62401256), in part by the Natural Science Foundation of Jiangsu Province (Nos. BK20230090, BK20241384), and in part by the Key Laboratory of Land Satellite Remote Sensing Application, Ministry of Natural Resources of China (No. KLSMNR-K202303).
Abstract: In recent years, deep learning has been widely applied in synthetic aperture radar (SAR) image processing. However, the collection of large-scale labeled SAR images is challenging and costly, and classification accuracy is often poor when only limited SAR images are available. To address this issue, we propose a novel framework for sparse SAR target classification under few-shot cases, termed the transfer learning-based interpretable lightweight convolutional neural network (TL-IL-CNN). Additionally, we employ enhanced gradient-weighted class activation mapping (Grad-CAM) to mitigate the “black box” effect often associated with deep learning models and to explore the mechanisms by which a CNN classifies various sparse SAR targets. Initially, we apply a novel bidirectional iterative soft thresholding (BiIST) algorithm to generate sparse images of superior quality compared to those produced by traditional matched filtering (MF) techniques. Subsequently, we pretrain multiple shallow CNNs on a simulated SAR image dataset. Using the sparse SAR dataset as input to the CNNs, we assess the efficacy of transfer learning in sparse SAR target classification and suggest the integration of TL-IL-CNN to further enhance classification accuracy. Finally, Grad-CAM is utilized to provide visual explanations for the predictions made by the classification framework. The experimental results on the MSTAR dataset reveal that the proposed TL-IL-CNN achieves nearly 90% classification accuracy with only 20% of the training data required under standard operating conditions (SOC), surpassing typical deep learning methods such as the vision Transformer (ViT) in the small-sample context. Remarkably, it even presents better performance under extended operating conditions (EOC). Furthermore, the application of Grad-CAM elucidates the CNN’s differentiation process among various sparse SAR targets. The experiments indicate that the regions of the target and background on which the model focuses can differ among target classes. The study contributes to an enhanced understanding of the interpretability of such results and enables us to infer the classification outcomes for each category more accurately.
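The abstract does not detail the enhanced Grad-CAM variant; the sketch below (not from the paper) implements the standard Grad-CAM formula on synthetic arrays as a reference point: the class-specific weights are the spatially averaged gradients, and the heatmap is the ReLU of the weighted sum of feature maps.

```python
import numpy as np

def grad_cam(activations: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """Standard Grad-CAM for one class score.

    activations, gradients: arrays shaped (K, H, W), where K is the number of
    feature maps in the chosen convolutional layer and gradients holds
    d(class score)/d(activations)."""
    weights = gradients.mean(axis=(1, 2))             # a_k: global-average-pooled grads
    cam = np.tensordot(weights, activations, axes=1)  # sum_k a_k * A_k -> (H, W)
    return np.maximum(cam, 0.0)                       # ReLU keeps positive evidence

# Synthetic stand-ins for a real layer's activations and backpropagated gradients.
rng = np.random.default_rng(0)
A = rng.random((8, 16, 16))
dY = rng.standard_normal((8, 16, 16))
heatmap = grad_cam(A, dY)
print(heatmap.shape)  # (16, 16)
```

In the paper’s setting, the heatmap would be upsampled to the SAR chip size to show which pixels drove each class decision.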
Funding: We acknowledge funding support from the National Key R&D Program of China (Grant No. 2022YFC3080100), the National Natural Science Foundation of China (Grant No. 42102316), and the opening fund of the State Key Laboratory of Hydraulics and Mountain River Engineering, Sichuan University (Grant No. SKHL2306).
Abstract: The discrete fracture system of a rock mass plays a crucial role in controlling the stability of rock slopes. To fully account for the geometric shape and distribution characteristics of jointed rock masses, terrestrial laser scanning (TLS) was employed to acquire high-resolution point-cloud data, and a developed automatic discontinuity-identification technology was coupled to automatically interpret and characterize geometric information such as orientation, trace length, spacing, and set number of the discontinuities. The discrete element method (DEM) was applied to study the influence of the geometric morphology and distribution characteristics of discontinuities on slope stability by generating a discrete fracture network (DFN) with the same statistical characteristics as the actual discontinuities. Based on slope data from the Yebatan Hydropower Station, a simulation was conducted to verify the applicability of the automatic discontinuity-identification technology and the discrete fracture network-discrete element method (DFN-DEM). Various geological parameters, including trace length, persistence, and density, were examined to investigate the morphological evolution and response characteristics of rock slope excavation under different joint combination conditions. The simulation results indicate that joint parameters affect slope stability, with density having the most significant impact. The impact of joint parameters on stability is relatively small within a reasonable range but becomes significant beyond a certain threshold, further validating that the accuracy of field geological surveys is critical for simulation. This study provides a scientific basis for the construction of complex rock slope models, engineering assessments, and disaster prevention and mitigation, and is of great value in both theory and engineering applications.
Abstract: Computer analysis of electrocardiograms (ECGs) was introduced more than 50 years ago, with the aim of improving efficiency and clinical workflow. [1,2] However, inaccuracies have been documented in the literature. [3,4] Research indicates that emergency department (ED) clinician interruptions occur every 4-10 min, which is significantly more common than in other specialties. [5] This increases cognitive load and error rates and impacts patient care and clinical efficiency. [1,2,5] De-prioritization protocols have been introduced in certain centers in the United Kingdom (UK), removing the need for clinician ECG interpretation where ECGs have been interpreted as normal by the machine.
Funding: Funded by the Fundamental Research Project of the CNPC Geophysical Key Lab (2022DQ0604-4) and the Strategic Cooperation Technology Projects of China National Petroleum Corporation and China University of Petroleum-Beijing (ZLZX 202003).
Abstract: With the successful application and breakthrough of deep learning technology in image segmentation, there has been continuous development in the field of seismic facies interpretation using convolutional neural networks. These intelligent and automated methods significantly reduce manual labor, particularly the laborious task of manually labeling seismic facies. However, the extensive demand for training data limits their wider application. To overcome this challenge, we adopt the UNet architecture as the foundational network structure for seismic facies classification, which has demonstrated effective segmentation results even with small-sample training data. Additionally, we integrate spatial pyramid pooling and dilated convolution modules into the network architecture to enhance the perception of spatial information across a broader range. The seismic facies classification test on the public data from the F3 block verifies the superior performance of our proposed improved network structure in delineating seismic facies boundaries. Comparative analysis against the traditional UNet model reveals that our method achieves more accurate predictive classification results, as evidenced by various evaluation metrics for image segmentation; the classification accuracy reaches 96%. Furthermore, the results of seismic facies classification in the seismic slice dimension provide further confirmation of the superior performance of our proposed method, which accurately defines the extent of different seismic facies. This approach holds significant potential for analyzing geological patterns and extracting valuable depositional information.
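For readers unfamiliar with dilated convolution, the following 1-D sketch (not from the paper) shows why it broadens spatial perception: a k-tap kernel with dilation d covers (k-1)·d + 1 input samples instead of k, enlarging the receptive field without adding parameters.

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation):
    """'Valid' 1-D dilated convolution (cross-correlation form, as in CNNs).

    Returns the output sequence and the effective receptive field, i.e. how
    many input samples each output value depends on."""
    k = len(kernel)
    span = (k - 1) * dilation + 1            # effective receptive field
    n_out = len(x) - span + 1
    y = np.array([
        sum(kernel[j] * x[i + j * dilation] for j in range(k))
        for i in range(n_out)
    ])
    return y, span

x = np.arange(10, dtype=float)
y, rf = dilated_conv1d(x, np.array([1.0, 1.0, 1.0]), dilation=2)
print(rf)  # 5: a 3-tap kernel with dilation 2 sees 5 samples
print(y)   # each output is x[i] + x[i+2] + x[i+4]
```

Stacking layers with increasing dilation, as in the modified UNet described above, grows the receptive field exponentially while keeping the kernels small.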
Abstract: For departmental legal norms concerning citizens’ basic rights, when multiple interpretations are possible based on individual case circumstances, interpreters representing public authority need to apply the method of constitutional interpretation to screen out interpretation conclusions that do not violate the Constitution. This means selecting interpretations at the constitutional level that do not overly restrict citizens’ basic rights and understanding the specific connotations of legal norms under the principle of “not infringing on citizens’ basic rights.” The Constitution, as a framework order, does not require interpreters to choose the most constitutionally aligned interpretation among various constitutional interpretations. If a legal norm has no constitutional interpretation conclusion in an individual case, the application of that norm in the case is unconstitutional, and the interpreter should avoid applying the legal norm in that case. Regarding judgment standards, interpreters should apply the principle of proportionality to determine whether each legal interpretation conclusion concerning basic-rights-related legal norms complies with the Constitution. Out of respect for the legislature, the application of the sub-principles of proportionality should consider the boundaries of interpretative action.
Funding: Supported by the National Natural Science Foundation of China [grant number 42071354] and by the Fundamental Research Funds for the Central Universities [grant numbers 2042022dx0001 and WUT: 223108001].
Abstract: Artificial Intelligence (AI) and Machine Learning (ML) technologies, particularly Deep Learning (DL), have demonstrated significant potential in the interpretation of Remote Sensing (RS) imagery, covering tasks such as scene classification, object detection, land-cover/land-use classification, change detection, and multi-view stereo reconstruction. Large-scale training samples are essential for ML/DL models to achieve optimal performance. However, the current organization of training samples is ad hoc and vendor-specific, lacking an integrated approach that can effectively manage training samples from different vendors to meet the demands of various RS AI tasks. This article proposes a solution to these challenges by designing and implementing LuoJiaSET, a large-scale training sample database system for intelligent interpretation of RS imagery. LuoJiaSET accommodates over five million training samples, supports cross-dataset queries, and serves as a comprehensive training data store for RS AI model training and calibration. It overcomes challenges related to label semantic categories, structural heterogeneity in label representation, and interoperable data access.
Funding: Supported by the National Natural Science Foundation of China (82171836) and the 1·3·5 project for disciplines of excellence, West China Hospital, Sichuan University (ZYJC20002).
Abstract: The application of whole genome sequencing is expanding in clinical diagnostics across various genetic disorders, and the significance of non-coding variants in penetrant diseases is increasingly being demonstrated. Therefore, it is urgent to improve the diagnostic yield by exploring the pathogenic mechanisms of variants in non-coding regions. However, the interpretation of non-coding variants remains a significant challenge, due to the complex functional regulatory mechanisms of non-coding regions and the current limitations of available databases and tools. Hence, we develop the non-coding variant annotation database (NCAD, http://www.ncawdb.net/), encompassing comprehensive insights into 665,679,194 variants, regulatory elements, and element interaction details. Integrating data from 96 sources, spanning both GRCh37 and GRCh38 versions, NCAD v1.0 provides vital information to support the genetic diagnosis of non-coding variants, including allele frequencies of 12 diverse populations, with a particular focus on the population frequency information for 230,235,698 variants in 20,964 Chinese individuals. Moreover, it offers prediction scores for variant functionality, five categories of regulatory elements, and four types of non-coding RNAs. With its rich data and comprehensive coverage, NCAD serves as a valuable platform, empowering researchers and clinicians with profound insights into non-coding regulatory mechanisms while facilitating the interpretation of non-coding variants.
Abstract: During injection treatments, bottomhole pressure measurements may significantly mismatch modeling results. We devise a computationally effective technique for interpretation of fluid injection in a wellbore interval with multiple geological layers based on bottomhole pressure measurements. The permeability, porosity and compressibility in each layer are initially set up, while the skin factor and the partitioning of injected fluids among the zones during injection are found as a solution of the problem. The problem takes into account Darcy flow and chemical interactions between the injected acids, diverter fluids and reservoir rock typical of modern matrix acidizing treatments. Using the synchronously recorded injection rate and bottomhole pressure, we evaluate skin factor changes in each layer and actual fluid placement into the reservoir during different pumping jobs: matrix acidizing, water control, sand control, scale squeezes and water flooding. The model is validated by comparison with a simulator used in industry. It provides the opportunity to estimate the efficiency of a matrix treatment job and the role of every injection stage, and to control fluid delivery to each layer in real time. The presented interpretation technique significantly improves the accuracy of matrix treatment analysis by coupling the hydrodynamic model with records of pressure and injection rate during the treatment.
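The authors’ multilayer model is not given in the abstract; the classic single-layer building block such models extend is steady-state radial Darcy flow with a skin term, sketched below with hypothetical fluid and layer parameters (SI units):

```python
import math

def injection_overpressure(q, mu, k, h, re, rw, skin):
    """Steady-state radial Darcy flow with skin (SI units):
    dP = q * mu / (2 * pi * k * h) * (ln(re / rw) + s).
    q: injection rate into the layer [m^3/s], mu: viscosity [Pa.s],
    k: permeability [m^2], h: layer thickness [m],
    re/rw: drainage and wellbore radii [m], skin: dimensionless skin factor."""
    return q * mu / (2.0 * math.pi * k * h) * (math.log(re / rw) + skin)

# Hypothetical layer: 0.002 m^3/s, 1 mPa.s, 1e-13 m^2 (~100 mD), 5 m thick.
dp_clean = injection_overpressure(0.002, 1e-3, 1e-13, 5.0, 100.0, 0.1, skin=0.0)
dp_damaged = injection_overpressure(0.002, 1e-3, 1e-13, 5.0, 100.0, 0.1, skin=5.0)
print(dp_damaged > dp_clean)  # True: positive skin raises injection pressure
```

Inverting this relationship layer by layer against the recorded rate and pressure is the essence of tracking skin evolution during a treatment.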
Funding: Financially supported by the China Postdoctoral Science Foundation (Grant No. 2023M730365) and the Natural Science Foundation of Hubei Province of China (Grant No. 2023AFB232).
Abstract: Gas chromatography-mass spectrometry (GC-MS) is an extremely important analytical technique that is widely used in organic geochemistry. It is the only approach to capture biomarker features of organic matter and provides key evidence for oil-source correlation and thermal maturity determination. However, the conventional way of processing and interpreting the mass chromatogram is both time-consuming and labor-intensive, which increases research cost and restricts extensive application of this method. To overcome this limitation, a correlation model is developed based on the convolutional neural network (CNN) to link the mass chromatogram and biomarker features of samples from the Triassic Yanchang Formation, Ordos Basin, China. In this way, the mass chromatogram can be automatically interpreted. This research first performs dimensionality reduction for 15 biomarker parameters via factor analysis and then quantifies the biomarker features using two indexes (i.e., MI and PMI) that represent the organic matter thermal maturity and parent material type, respectively. Subsequently, training, interpretation, and validation are performed multiple times using different CNN models to optimize the model structure and hyper-parameter settings, with the mass chromatogram used as the input and the obtained MI and PMI values used for supervision (labels). The optimized model presents high accuracy in automatically interpreting the mass chromatogram, with R2 values typically above 0.85 and 0.80 for the thermal maturity and parent material interpretation results, respectively. The significance of this research is twofold: (i) developing an efficient technique for geochemical research; (ii) more importantly, demonstrating the potential of artificial intelligence in organic geochemistry and providing vital references for future related studies.
Abstract: The Pennsylvanian unconformity, which is a detrital surface, separates the beds of the Permian-aged strata from the Lower Paleozoic in the Central Basin Platform. Seismic data interpretation indicates that the unconformity is an angular unconformity, overlying multiple normal faults and accompanied by a thrust fault that maximizes the region’s structural complexity. Additionally, the Pennsylvanian angular unconformity creates pinch-outs between the beds above and below. We computed the spectral decomposition and reflector convergence attributes and analyzed them to characterize the angular unconformity and faults. The spectral decomposition attribute divides the broadband seismic data into different spectral bands to resolve thin beds and show thickness variations. In contrast, the reflector convergence attribute highlights the location and direction of the pinch-outs as they dip south at angles between 2° and 6°. After reviewing findings from RGB blending of the spectrally decomposed frequencies along the Pennsylvanian unconformity, we observed channel-like features and multiple linear bands in addition to the faults and pinch-outs. It can be inferred that the identified linear bands could be the result of different lithologies associated with the tilting of the beds, and the faults may influence hydrocarbon migration or act as flow barriers to entrap hydrocarbon accumulations. The identification of this angular unconformity and the associated features in the study area is vital for the following reasons: 1) the unconformity surface represents a natural stratigraphic boundary; 2) the stratigraphic pinch-outs act as fluid flow connectivity boundaries; 3) the areal extent of compartmentalized reservoir boundaries created by the angular unconformity is better defined; and 4) fault displacements are better understood when planning well locations, as faults can be flow barriers or permeability conduits, depending on facies heterogeneity and/or the seal effectiveness of a fault, which can affect hydrocarbon production. The methodology utilized in this study is a further step in the characterization of reservoirs and can be used to expand our knowledge and obtain more information about the Goldsmith Field.
Funding: Supported by the National Natural Science Foundation of China (Nos. 22172019, 22225606, 22176029) and the Excellent Youth Foundation of Sichuan Scientific Committee Grant in China (No. 2021JDJQ0006).
Abstract: Predictive modeling of photocatalytic NO removal is highly desirable for efficient air pollution abatement. However, great challenges remain in precisely predicting photocatalytic performance and understanding the interactions of diverse features in the catalytic systems. Herein, a dataset of g-C3N4-based catalysts with 255 data points was collected from peer-reviewed publications and a machine learning (ML) model was proposed to predict the NO removal rate. The results show that the Gradient Boosting Decision Tree (GBDT) demonstrated the greatest prediction accuracy, with R2 of 0.999 and 0.907 on the training and test data, respectively. The SHAP value and feature importance analysis revealed that the empirical categories for the NO removal rate, in order of importance, were catalyst characteristics > reaction process > preparation conditions. Moreover, the partial dependence plots opened the ML black box to further quantify the marginal contributions of the input features (e.g., doping ratio, flow rate, and pore volume) to the model outcomes. This ML approach presents a purely data-driven, interpretable framework, which provides new insights into the influence of catalyst characteristics, reaction process, and preparation conditions on NO removal.
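As a toy illustration of the GBDT principle used above (not the authors’ model or data), the sketch below boosts depth-1 regression stumps on a synthetic 1-D problem; each stump fits the current residual and is added with a small learning rate:

```python
import numpy as np

def fit_stump(x, residual):
    """Best single-split (depth-1) regression tree for squared error, 1-D feature."""
    best_sse, best_split = np.inf, None
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pl, pr = left.mean(), right.mean()
        sse = ((left - pl) ** 2).sum() + ((right - pr) ** 2).sum()
        if sse < best_sse:
            best_sse, best_split = sse, (t, pl, pr)
    return best_split

def gbdt_fit_predict(x, y, n_trees=50, lr=0.1):
    """Plain gradient boosting for squared loss: start from the mean, then
    repeatedly fit a stump to the residual and add it with shrinkage lr."""
    pred = np.full_like(y, y.mean())
    for _ in range(n_trees):
        t, pl, pr = fit_stump(x, y - pred)
        pred = pred + lr * np.where(x <= t, pl, pr)
    return pred

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 6.0, 200))
y = np.sin(x) + 0.1 * rng.standard_normal(200)  # hypothetical noisy target
pred = gbdt_fit_predict(x, y)
mse = ((pred - y) ** 2).mean()
print(mse)  # well below the variance of y
```

Production GBDT libraries add deeper trees, subsampling, and regularization, but the residual-fitting loop is the same idea that the SHAP analysis above then explains feature by feature.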
Funding: Supported by the National Key Research and Development Program of China (2021YFB3901205) and the National Institute of Natural Hazards, Ministry of Emergency Management of China (2023-JBKY-57).
Abstract: The periphery of the Qinghai-Tibet Plateau is renowned for its susceptibility to landslides. However, the northwestern margin of this region, characterised by limited human activity and challenging transportation, remains insufficiently explored concerning landslide occurrence and dispersion. With the planning and construction of the Xinjiang-Xizang Railway, a comprehensive investigation into disastrous landslides in this area is essential for effective disaster preparedness and mitigation strategies. Using a human-computer interaction interpretation approach, the authors established a landslide database encompassing 13,003 landslides, collectively spanning an area of 3351.24 km² (36°N-40°N, 73°E-78°E). The database incorporates diverse topographical and environmental parameters, including regional elevation, slope angle, slope aspect, distance to faults, distance to roads, distance to rivers, annual precipitation, and stratum. The statistical characteristics of the number and area of landslides, landslide number density (LND), and landslide area percentage (LAP) are analyzed. The authors found a predominant concentration of landslide origins within high-slope-angle regions, with the highest incidence observed in intervals characterised by average slopes of 20° to 30°, maximum slope angles above 80°, and orientations towards the north (N), northeast (NE), and southwest (SW). Additionally, elevations above 4.5 km, distances to rivers below 1 km, and rainfall between 20-30 mm and 30-40 mm emerge as particularly susceptible to landslide development. The study area’s geological composition primarily comprises Mesozoic and Upper Paleozoic outcrops. Both faults and human engineering activities have varying degrees of influence on landslide development. Furthermore, the significance of the landslide database, the relationship between landslide distribution and environmental factors, and the geometric and morphological characteristics of landslides are discussed. The landslide H/L ratios in the study area are mainly concentrated between 0.4 and 0.64, indicating that landslide mobility in the region is relatively low; the authors speculate that landslides in this region were more likely triggered by earthquakes or are located in the meizoseismal area.
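The H/L ratio quoted above is simply the total height drop divided by the horizontal runout length (the tangent of the reach angle); lower values mean longer runout for a given drop, i.e. higher mobility. A minimal example with hypothetical geometry:

```python
def mobility_ratio(drop_height_m: float, runout_length_m: float) -> float:
    """Landslide H/L ratio (equivalent friction coefficient).
    Lower values indicate higher mobility (longer runout per unit drop)."""
    if runout_length_m <= 0.0:
        raise ValueError("runout length must be positive")
    return drop_height_m / runout_length_m

# Hypothetical landslide: 320 m vertical drop over 640 m horizontal runout.
hl = mobility_ratio(320.0, 640.0)
print(hl)                  # 0.5
print(0.4 <= hl <= 0.64)   # True: inside the band reported for the study area
```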
Funding: Funded by the Science and Technology Project of Changzhou City (Grant No. CJ20210120) and the Research Start-up Fund of Changzhou University (Grant No. ZMF21020056).
Abstract: The complex pore structure of carbonate reservoirs hinders the correlation between porosity and permeability. In view of the sedimentation, diagenesis, testing, and production characteristics of carbonate reservoirs in the study area, combined with current trends and advances in well log interpretation techniques for carbonate reservoirs, a log interpretation technology route of “geological information constraint + deep learning” was developed. Principal component analysis (PCA) was employed to establish lithology identification criteria with an accuracy of 91%. The Bayesian stepwise discriminant method was used to construct a sedimentary microfacies identification method with an accuracy of 90.5%. Based on production data, the main lithologies and sedimentary microfacies of effective reservoirs were determined, and 10 petrophysical facies with effective reservoir characteristics were identified. Constrained by petrophysical facies, the mean interpretation error of porosity compared to core analysis results is 2.7%, and the ratio of interpreted permeability to core analysis is within one order of magnitude, averaging 3.6. The research results demonstrate that deep learning algorithms can uncover the correlations in carbonate reservoir well logging data. Integrating geological and production data and selecting appropriate machine learning algorithms can significantly improve the accuracy of well log interpretation for carbonate reservoirs.
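As a generic illustration of the PCA step (the authors’ log curves and identification criteria are not given in the abstract), the sketch below runs PCA via SVD on a synthetic log matrix in which one latent factor drives four hypothetical curves; the first component then captures most of the variance, which is what makes it useful as a lithology discriminant:

```python
import numpy as np

def pca(X, n_components):
    """PCA via SVD on mean-centered data.
    Returns the component scores and the explained-variance ratios."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = S ** 2 / (len(X) - 1)
    return Xc @ Vt[:n_components].T, var[:n_components] / var.sum()

rng = np.random.default_rng(42)
# Synthetic "log" matrix: 300 depth samples x 4 curves (think GR, DEN, CNL, DT),
# all driven by one dominant latent factor plus small noise.
factor = rng.standard_normal((300, 1))
loadings = np.array([[1.0, -0.8, 0.6, 0.9]])
X = factor @ loadings + 0.1 * rng.standard_normal((300, 4))

scores, evr = pca(X, 2)
print(scores.shape)      # (300, 2)
print(evr[0] > 0.9)      # True: the first component captures the dominant factor
```

On real logs, thresholds on the leading component scores (calibrated against core descriptions) would serve as the lithology identification criteria the abstract reports.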
Abstract: Logging data and their interpretation results are among the most important basic data for understanding reservoirs and oilfield development. Standardized and unified logging interpretation results play a decisive role in fine reservoir description and reservoir development. To address the conflict between the development performance and the initial interpretation results of the Yan 9 reservoir in the Hujianshan area of the Ordos Basin, the four characteristics of the reservoir are analyzed by combining current well production performance, logging, oil test, and production test data, and by making full use of core, coring, logging, thin section analysis and high-pressure mercury injection data. A more scientific and reasonable calculation model of reservoir logging parameters is established, and the reserves are recalculated after the second logging interpretation standard is determined. The research improves the accuracy of logging interpretation and provides an effective basis for subsequent production development and potential horizons.