Journal Articles: 800 results found
Snow and sea ice thermodynamics in the Arctic: Model validation and sensitivity study against SHEBA data (Cited: 6)
1
Authors: Cheng Bin, Timo Vihma, Zhang Zhanhai, Li Zhijun, Wu Huiding. Chinese Journal of Polar Science, 2008, No. 2, pp. 108-122 (15 pages)
The evolution of Arctic sea ice and its snow cover during the SHEBA year was simulated with a high-resolution thermodynamic snow/ice model (HIGHTSI). Attention was paid to the impact of albedo on the snow and sea ice mass balance, the effect of snow on the total ice mass balance, and the model's vertical resolution. The SHEBA annual simulation used the best available external forcing data set, created by the Sea Ice Model Intercomparison Project. The HIGHTSI control run reasonably reproduced the observed snow and ice thickness. A number of albedo schemes were incorporated into HIGHTSI to study the feedback between albedo and snow and ice thickness. Snow thickness turned out to be an essential variable in the albedo parameterization. Albedo schemes dependent on surface temperature were liable to excessive positive feedback generated by errors in the modelled surface temperature. Superimposed ice formation should be taken into account in the annual Arctic sea ice mass balance.
Keywords: Arctic; sea ice; model validation and sensitivity study; SHEBA data
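The snow-thickness dependence highlighted in the abstract can be sketched as a simple parameterization in which thin snow only partially masks the darker ice below. The scheme, albedo constants, and masking depth here are illustrative assumptions, not HIGHTSI's actual formulation:

```python
def surface_albedo(h_snow, alpha_ice=0.55, alpha_snow=0.82, h_ref=0.1):
    """Snow-thickness-weighted surface albedo (hypothetical scheme).

    Albedo rises linearly from the bare-ice value toward the deep-snow
    value as snow depth h_snow (m) approaches the masking depth h_ref (m),
    making snow thickness the controlling variable, as the study found.
    """
    weight = min(1.0, h_snow / h_ref)
    return alpha_ice + (alpha_snow - alpha_ice) * weight
```

A scheme like this avoids the surface-temperature dependence that the study found prone to runaway positive feedback from modelled temperature errors.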
Improvement of Wired Drill Pipe Data Quality via Data Validation and Reconciliation (Cited: 2)
2
Authors: Dan Sui, Olha Sukhoboka, Bernt Sigve Aadnøy. International Journal of Automation and Computing (EI, CSCD), 2018, No. 5, pp. 625-636 (12 pages)
Wired drill pipe (WDP) technology is one of the most promising data acquisition technologies in today's oil and gas industry. For the first time, it allows sensors to be positioned along the drill string, which enables collecting and transmitting valuable data not only from the bottom hole assembly (BHA), but also along the entire length of the wellbore to the drill floor. The technology has received industry acceptance as a viable alternative to the typical logging while drilling (LWD) method. Recently, more and more WDP applications can be found in challenging drilling environments around the world, bringing many innovations to the industry. Nevertheless, much of the data acquired from WDP can be noisy and, in some circumstances, of very poor quality. Diverse factors contribute to poor data quality; the most common sources include mis-calibrated sensors, sensor drift, errors during data transmission, and abnormal conditions in the well. The challenge of improving data quality has attracted growing attention from researchers over the past decade. This paper proposes a promising solution to this challenge by correcting the raw WDP data and estimating unmeasurable parameters to reveal downhole behaviors. An advanced data processing method, data validation and reconciliation (DVR), is employed, which uses the redundant data from multiple WDP sensors to filter and remove noise from the measurements and to ensure the coherence of all sensors and models. Moreover, it can distinguish accurate measurements from inaccurate ones. In addition, the data with improved quality can be used to estimate crucial drilling-process parameters that are not directly measurable, hence providing better model calibrations for integrated well planning and real-time operations.
Keywords: data quality; wired drill pipe (WDP); data validation and reconciliation (DVR); drilling models
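The reconciliation step the abstract describes can be sketched for the linear-constraint case: find the smallest variance-weighted correction that makes redundant measurements satisfy the model. The toy mass balance, sensor accuracies, and names below are illustrative assumptions, not the paper's drilling model:

```python
import numpy as np

def reconcile(m, sigma, A, b):
    """Linear data validation and reconciliation (DVR) sketch.

    Minimizes sum(((x_i - m_i) / sigma_i)**2) subject to A @ x = b,
    i.e. the smallest variance-weighted correction that makes the
    redundant measurements m consistent with the linear constraints.
    """
    S = np.diag(np.asarray(sigma, float) ** 2)   # measurement covariance
    r = A @ m - b                                # constraint residual
    correction = S @ A.T @ np.linalg.solve(A @ S @ A.T, r)
    return m - correction

# Toy mass balance: flow in = flow out, measured by meters of unequal accuracy.
m = np.array([100.0, 96.0])      # noisy flow readings
sigma = np.array([2.0, 1.0])     # meter standard deviations
A = np.array([[1.0, -1.0]])      # balance constraint: x0 - x1 = 0
x_hat = reconcile(m, sigma, A, np.array([0.0]))
```

The less accurate meter absorbs most of the correction, which is how DVR exploits sensor redundancy to separate good measurements from bad ones.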
Ecosystem services mapping and modelling. Where is the validation?
3
Authors: Paulo Pereira, Miguel Inacio, Damia Barcelo, Wenwu Zhao. Geography and Sustainability, 2025, No. 3, pp. 13-16 (4 pages)
Ecosystem services (ES) mapping and models have advanced in recent years, and assessments have transitioned from qualitative to quantitative. Although this is an important advancement, the validation step in ES mapping and modelling has been overlooked, which raises an important question about the credibility of the outcomes. This has been an important and unsolved issue in the ES research community that needs to be tackled. This highlight paper discusses the importance of validating individual ES maps and models. It is important to conduct this using field or proximal/remote sensing raw data, not data from other models or stakeholder evaluation. A validation step should be mandatory in ES frameworks, since it can assess a model's veracity, help identify its weaknesses and strengths, and ultimately represent a scientific advance in the field. This is easier to apply to biophysical mapping and models of regulating and provisioning ES than to cultural ES, as the latter rely more on perception and cultural context. Likewise, ES supply models are easier to validate than demand and flow models. Robust and well-grounded models are essential for ensuring the reliability of individual ES maps and models and should be integrated into decision-making processes. Although challenges arise related to the costs of data collection (in several cases prohibitive) and the time and expertise needed for sampling and analysis, validation is an imperative step that needs to be considered in the future. It will benefit ES research and improve decision-making and well-being.
Keywords: ecosystem services; validation; frameworks; data; decision-making
Orthogonal Series Estimation of Nonparametric Regression Measurement Error Models with Validation Data
4
Author: Zanhua Yin. Applied Mathematics, 2017, No. 12, pp. 1820-1831 (12 pages)
In this article, we study an estimation method for the nonparametric regression measurement error model based on validation data. The estimation procedures are based on orthogonal series estimation and truncated series approximation methods, without specifying any structural equation or distributional assumption. The convergence rates of the proposed estimator are derived. By example and through simulation, the method is shown to be robust against misspecification of the measurement error model.
Keywords: ill-posed inverse problems; measurement errors; nonparametric regression; orthogonal series; validation data
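The truncated orthogonal-series idea can be sketched in its simplest form: expand the regression function in an orthonormal basis and keep only the first J terms. The cosine basis and least-squares fit below are illustrative; the paper's validation-data machinery for handling measurement error is not shown:

```python
import numpy as np

def cosine_basis(x, J):
    """First J orthonormal cosine basis functions on [0, 1]."""
    x = np.asarray(x, float)
    cols = [np.ones_like(x)]
    cols += [np.sqrt(2.0) * np.cos(j * np.pi * x) for j in range(1, J)]
    return np.column_stack(cols)

def series_regression(x, y, J):
    """Truncated orthogonal-series estimate of E[Y | X = x].

    J truncates the series; choosing it is the bias/variance trade-off
    that convergence-rate analyses of such estimators are about.
    """
    theta = np.linalg.lstsq(cosine_basis(x, J), y, rcond=None)[0]
    return lambda x_new: cosine_basis(x_new, J) @ theta

x = np.linspace(0.0, 1.0, 50)
y = 1.0 + np.cos(np.pi * x)      # exactly representable with J = 3 terms
g_hat = series_regression(x, y, J=3)
```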
Estimation of Nonparametric Multiple Regression Measurement Error Models with Validation Data
5
Authors: Zanhua Yin, Fang Liu. Open Journal of Statistics, 2015, No. 7, pp. 808-819 (12 pages)
In this article, we develop estimation approaches for nonparametric multiple regression measurement error models when both independent validation data on covariables and primary data on the response variable and surrogate covariables are available. An estimator that integrates Fourier series estimation and truncated series approximation methods is derived without assuming any error model structure between the true covariables and surrogate variables. Most importantly, the proposed methodology can be readily extended, with the assistance of validation data, to the case where only some of the covariates are measured with error. Under mild conditions, we derive the convergence rates of the proposed estimators. The finite-sample properties of the estimators are investigated through simulation studies.
Keywords: ill-posed inverse problem; linear operator; measurement errors; nonparametric regression; validation data
The Proposal of Data Warehouse Validation
6
Authors: Pavol Tanuska, Michal Kebisek, Oliver Moravcik, Pavel Vazan. Computer Technology and Application, 2011, No. 8, pp. 650-657 (8 pages)
An analysis of relevant standards and guidelines revealed a lack of information on actions and activities concerning data warehouse testing. The absence of a comprehensive data warehouse testing methodology is particularly critical in the data warehouse implementation phase. The aim of this article is to suggest basic data warehouse testing activities as the final part of a data warehouse testing methodology. The testing activities to be implemented in the data warehouse testing process can be split into four logical units: multidimensional database testing, data pump testing, metadata testing, and OLAP (Online Analytical Processing) testing. The main testing activities include revision of the multidimensional database schema, optimizing the number of fact tables, handling the data explosion problem, and testing the correctness of data aggregation and summation.
Keywords: data warehouse; test case; testing activities; methodology; validation; UML (Unified Modeling Language)
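The aggregation-correctness activity listed above can be illustrated with a toy check that a pre-computed aggregate table matches a fresh aggregation of the fact table. The table and column names are invented for the sketch:

```python
import sqlite3

# One testing activity from the methodology: verify that a stored
# aggregate (agg_sales) equals a fresh GROUP BY over the fact table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE fact_sales (region TEXT, amount REAL);
    INSERT INTO fact_sales VALUES ('north', 10), ('north', 15), ('south', 7);
    CREATE TABLE agg_sales AS
        SELECT region, SUM(amount) AS total FROM fact_sales GROUP BY region;
""")
# Any row returned here is an aggregation error to report.
mismatches = con.execute("""
    SELECT f.region FROM
        (SELECT region, SUM(amount) AS total
           FROM fact_sales GROUP BY region) f
        JOIN agg_sales a ON a.region = f.region AND a.total <> f.total
""").fetchall()
```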
Validation of Treatment Planning Dose Calculations: Experience Working with Medical Physics Practice Guideline 5.a. (Cited: 2)
7
Authors: Jinyu Xue, Jared D. Ohrt, James Fan, Peter Balter, Joo Han Park, Leonard Kim, Steven M. Kirsner, Geoffrey S. Ibbott. International Journal of Medical Physics, Clinical Engineering and Radiation Oncology, 2017, No. 1, pp. 57-72 (16 pages)
The recently published Medical Physics Practice Guideline 5.a. (MPPG 5.a.) from the American Association of Physicists in Medicine (AAPM) sets the minimum requirements for treatment planning system (TPS) dose algorithm commissioning and quality assurance (QA). The guideline recommends validation tests and tolerances based primarily on published AAPM task group reports and the criteria used by IROC Houston. We performed the commissioning and validation of the dose algorithms for both megavoltage photon and electron beams on three linacs following MPPG 5.a. We designed the validation experiments to highlight the evaluation methods and tolerance criteria recommended by the guideline. Comparison of dose profiles using in-water scans proved to be an effective technique for basic photon and electron validation. IMRT/VMAT dose calculation is recommended to be tested with TG-119 and clinical cases, but no consensus tolerance exists. Extensive validation tests provided a better understanding of the accuracy and limitations of a specific dose calculation algorithm. We believe that some tests and evaluation criteria given in the guideline can be further refined.
Keywords: dose calculation algorithm; treatment planning system; beam data modeling; validation test; MPPG 5.a.
SVR-Miner: Mining Security Validation Rules and Detecting Violations in Large Software (Cited: 1)
8
Authors: Liang Bin, Xie Subin, Shi Wenchang, Liang Zhaohui, Chen Hong. China Communications (SCIE, CSCD), 2011, No. 4, pp. 84-98 (15 pages)
For various reasons, many of the security programming rules applicable to specific software have not been recorded in official documents, and hence can hardly be employed by static analysis tools for detection. In this paper, we propose a new approach, named SVR-Miner (Security Validation Rules Miner), which uses frequent sequence mining techniques [1-4] to automatically infer implicit security validation rules from large software written in the C programming language. Unlike past work in this area, SVR-Miner introduces three techniques (sensitive threads, program slicing [5-7], and equivalent statement computation) to improve the accuracy of the mined rules. Experiments with the Linux kernel demonstrate the effectiveness of the approach. With ten given sensitive threads, SVR-Miner automatically generated 17 security validation rules and detected 8 violations, 5 of which had been published by the Linux Kernel Organization before we detected them. We recently reported the other three to the Linux Kernel Organization.
Keywords: static analysis; data mining; automated validation rule extraction; automated violation detection
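The core mining idea (a frequently observed "check before sensitive call" pattern becomes a validation rule, and traces missing the check become violations) can be sketched on call traces. The trace format, support threshold, and function names below are invented; SVR-Miner's sensitive threads, slicing, and equivalent-statement analysis are not modelled:

```python
from collections import Counter

def mine_and_check(traces, sensitive="do_write", min_support=0.8):
    """Mine 'f must precede sensitive' rules, then flag violating traces.

    traces: list of call sequences (lists of function names).
    A function becomes a rule if it precedes `sensitive` in at least
    min_support of the traces that call `sensitive`.
    """
    support, uses = Counter(), 0
    for t in traces:
        if sensitive in t:
            uses += 1
            for f in set(t[:t.index(sensitive)]):
                support[f] += 1
    if not uses:
        return set(), []
    rules = {f for f, c in support.items() if c / uses >= min_support}
    violations = [i for i, t in enumerate(traces)
                  if sensitive in t and not rules <= set(t[:t.index(sensitive)])]
    return rules, violations
```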
Fuzzy Clustering Validity for Spatial Data (Cited: 1)
9
Authors: HU Chunchun, MENG Lingkui, SHI Wenzhong. Geo-Spatial Information Science, 2008, No. 3, pp. 191-196 (6 pages)
The validity measurement of fuzzy clustering is a key problem. Once a clustering is formed, some mechanism is needed to verify its validity. To make mining more accountable and comprehensible, and to obtain a usable spatial pattern, it is necessary to detect whether the data set has a clustered structure before clustering. This paper discusses a detection method for clustered patterns and a fuzzy clustering algorithm, and studies the validity function of the fuzzy clustering result from two aspects, reflecting the uncertainty of classification during fuzzy partition and the spatial location features of spatial data; it then proposes a new validity function of fuzzy clustering for spatial data. The experimental results indicate that the new validity function can accurately measure the validity of fuzzy clustering results. In particular, for fuzzy clustering of spatial data, it is robust and its classification result is better when compared with other indices.
Keywords: fuzzy clustering; spatial data; validity; uncertainty
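The abstract does not give the new validity function itself; for illustration, here is the standard Xie-Beni index, one of the existing validity indices such a function would be compared against (lower values mean compact, well-separated fuzzy clusters):

```python
import numpy as np

def xie_beni(X, centers, U, m=2.0):
    """Xie-Beni validity index for a fuzzy partition (lower is better).

    Standard index, shown for illustration only; it is not the new
    spatial validity function proposed in the paper.
    X: (n, d) data, centers: (c, d), U: (c, n) memberships, m: fuzzifier.
    """
    d2 = ((centers[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # (c, n)
    compactness = (U ** m * d2).sum()
    sep = ((centers[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(sep, np.inf)        # ignore each center vs itself
    return compactness / (X.shape[0] * sep.min())
```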
Validation of the multi-satellite merged sea surface salinity in the South China Sea
10
Authors: Huipeng WANG, Junqiang SONG, Chengwu ZHAO, Xiangrong YANG, Hongze LENG, Nan ZHOU. Journal of Oceanology and Limnology (SCIE, CAS, CSCD), 2023, No. 6, pp. 2033-2044 (12 pages)
Sea surface salinity (SSS) is an essential variable for ocean dynamics and climate research. The Soil Moisture and Ocean Salinity (SMOS), Aquarius, and Soil Moisture Active Passive (SMAP) satellite missions all provide SSS measurements. The European Space Agency (ESA) Climate Change Initiative Sea Surface Salinity (CCI-SSS) project merged these three satellite SSS datasets to produce the CCI L4 SSS products. We validated the accuracy of the four satellite products (CCI, SMOS, Aquarius, and SMAP) using in-situ gridded data and Argo floats in the South China Sea (SCS). Compared with in-situ gridded data, the CCI achieved the best performance (RMSD: 0.365) on monthly time scales. The RMSDs of SMOS, Aquarius, and SMAP (0.389, 0.409, and 0.391, respectively) are close, with SMOS holding a slight advantage over Aquarius and SMAP. Large discrepancies can be found near the coastline and in the shelf seas. Meanwhile, CCI, with a lower RMSD (0.295), performs better than the single-satellite data (SMOS: 0.517; SMAP: 0.297) on weekly time scales when compared with Argo floats. Overall, the merged CCI product has the smallest RMSD among the four satellite products in the SCS on both weekly and monthly time scales, which illustrates the improved accuracy of the merged CCI relative to the individual satellite data.
Keywords: sea surface salinity (SSS); South China Sea (SCS); Argo; multi-satellite merged data; validation
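The RMSD score used throughout the comparison reduces to a straightforward definition; the collocation and gridding steps that pair satellite and in-situ values are not shown here:

```python
import numpy as np

def rmsd(satellite, in_situ):
    """Root-mean-square difference between collocated satellite and
    in-situ salinity values, skipping pairs with missing data."""
    d = np.asarray(satellite, float) - np.asarray(in_situ, float)
    d = d[~np.isnan(d)]
    return float(np.sqrt(np.mean(d ** 2)))
```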
Direct Pointwise Comparison of FE Predictions to StereoDIC Measurements: Developments and Validation Using Double Edge-Notched Tensile Specimen
11
Authors: Troy Myers, Michael A. Sutton, Hubert Schreier, Alistair Tofts, Sreehari Rajan Kattil. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 8, pp. 1263-1298 (36 pages)
To compare finite element analysis (FEA) predictions and stereovision digital image correlation (StereoDIC) strain measurements at the same spatial positions throughout a region of interest, a field comparison procedure is developed. The procedure includes (a) conversion of the finite element data into a triangular mesh, (b) selection of a common coordinate system, (c) determination of the rigid body transformation that places both measurements and FEA data in the same system, and (d) interpolation of the FEA nodal information to the same spatial locations as the StereoDIC measurements using barycentric coordinates. For an aluminum Al-6061 double edge-notched tensile specimen, FEA results are obtained using both the von Mises isotropic yield criterion and Hill's quadratic anisotropic yield criterion, with the unknown Hill model parameters determined using full-field specimen strain measurements for the nominally plane stress specimen. Using Hill's quadratic anisotropic yield criterion, the point-by-point comparison of experimentally based full-field strains and stresses to finite element predictions is shown to be in excellent agreement, confirming the effectiveness of the field comparison process.
Keywords: StereoDIC; spatial co-registration; data transformation; finite element simulations; point-wise comparison of measurements and FEA predictions; double edge notch specimen; model validation
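Step (d) of the procedure, reduced to a single mesh triangle, can be sketched as follows; locating which triangle contains each measurement point, and the co-registration steps (a)-(c), are omitted:

```python
import numpy as np

def barycentric_interp(tri, nodal_vals, p):
    """Interpolate nodal FEA values to a point p inside triangle tri.

    tri: (3, 2) vertex coordinates; nodal_vals: (3,) values at vertices.
    Solves for the barycentric weights (which sum to 1) and returns the
    weighted nodal value, exact for fields linear over the element.
    """
    T = np.vstack([tri.T, np.ones(3)])            # rows: x, y, 1
    w = np.linalg.solve(T, np.array([p[0], p[1], 1.0]))
    return float(w @ nodal_vals)

tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
vals = np.array([0.0, 1.0, 2.0])     # nodal values of the linear field x + 2*y
v = barycentric_interp(tri, vals, (0.25, 0.25))
```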
Knowledge Discovery for Query Formulation for Validation of a Bayesian Belief Network
12
Authors: Gursel Serpen, Michael Riesen. Journal of Intelligent Learning Systems and Applications, 2010, No. 3, pp. 156-166 (11 pages)
This paper proposes machine learning techniques to discover knowledge in a dataset in the form of if-then rules for the purpose of formulating queries for validation of a Bayesian belief network model of the same data. Although domain expertise is often available, the query formulation task is tedious and laborious, and hence automation of query formulation is desirable. In an effort to automate the query formulation process, a machine learning algorithm is leveraged to discover knowledge in the form of if-then rules in the data from which the Bayesian belief network model under validation was also induced. The set of if-then rules is processed and filtered through domain expertise to identify a subset of "interesting" and "significant" rules. This subset is formulated into corresponding queries to be posed, for validation purposes, to the Bayesian belief network induced from the same dataset. The promise of the proposed methodology was assessed through an empirical study on a real-life dataset, the National Crime Victimization Survey, which has over 250 attributes and well over 200,000 data points. The study demonstrated that the proposed approach is feasible and partially automates the query formulation process for validating a complex probabilistic model, yielding substantial savings in the required human expert involvement and investment.
Keywords: rule induction; semi-automated query generation; Bayesian net validation; knowledge acquisition bottleneck; crime data; National Crime Victimization Survey
IEC 61850 SCL Validation Using UML Model in Modern Digital Substation
13
Authors: Byungtae Jang, Alidu Abubakari, Namdae Kim. Smart Grid and Renewable Energy, 2018, No. 8, pp. 127-149 (23 pages)
The IEC 61850 standard stipulates the Substation Configuration Description Language (SCL) file as a means to define the substation equipment, IED functions, and the communication mechanism for the substation area network. The SCL file is an eXtensible Markup Language (XML) based file that describes the configuration of the substation's Intelligent Electronic Devices (IEDs), including their associated functions. It also contains all IED capabilities, including a data model structured into objects for easy descriptive modeling. The effective functioning of the SCL file relies on appropriate validation techniques that check the data model for errors due to non-conformity with the IEC 61850 standard. In this research, we extend the conventional SCL validation algorithm to develop a more advanced validator that can validate the standard data model using the Unified Modeling Language (UML). Using the rule-based SCL validation tool, we implement validation test cases for a more comprehensive understanding of the various validation functionalities. The algorithm and the implemented test cases show that the proposed validation tool can improve SCL information validation and also help automation engineers comprehend the IEC 61850 substation system architecture.
Keywords: IEC 61850; substation automation; IED; XML; UML; XMI; schema; rule-based SCL validation; syntax; semantics; data model; SCL editor
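A rule-based check of the kind described can be sketched with the standard library: parse an SCL fragment and apply a project rule that plain schema validation would not express. The rule itself (every IED must carry a non-empty `name`) and the fragment are illustrative; the element names and namespace follow IEC 61850-6:

```python
import xml.etree.ElementTree as ET

# Hypothetical rule: every IED element must have a non-empty "name"
# attribute. The second IED below deliberately violates it.
SCL_FRAGMENT = """
<SCL xmlns="http://www.iec.ch/61850/2003/SCL">
  <IED name="Protection_IED1"><Services/></IED>
  <IED><Services/></IED>
</SCL>
"""

NS = {"scl": "http://www.iec.ch/61850/2003/SCL"}
root = ET.fromstring(SCL_FRAGMENT)
violations = [i for i, ied in enumerate(root.findall("scl:IED", NS))
              if not ied.get("name")]
```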
Statistical Tests of the Validation of TCO Satellite Measurements, Recorded Simultaneously by TOMS-OMI (2005) and OMI-OMPS (2012-2018)
14
Authors: Mario Molina-Almaraz, Jose Luis Pinedo-Vega, Carlos Ríos-Martínez, Fernando Mireles-García. Atmospheric and Climate Sciences (CAS), 2023, No. 2, pp. 159-174 (16 pages)
Two statistical validation methods were used to evaluate the confidence level of the Total Column Ozone (TCO) measurements recorded simultaneously by satellite systems, one using the normal distribution and another using the Mann-Whitney test. First, the reliability of the TCO measurements was studied hemispherically. While similar coincidences and significance levels > 0.05 were found with the two statistical tests, an enormous variability in the significance levels throughout the year was also exposed. Then, using the same statistical comparison methods, a latitudinal study was carried out to elucidate the geographical distribution that gave rise to this variability. Our study reveals that the 2005 TOMS and OMI measurements coincide at only 50% of the latitudes, which explains the variability. This implies that for 2005 the TOMS measurements are not completely reliable, except within the -50° to -15° latitude band in the Southern Hemisphere and the +15° to +50° latitude band in the Northern Hemisphere. In the case of OMI-OMPS, we observe that between 2011 and 2016 the measurements of both satellite systems are reasonably similar, with a confidence level higher than 95%. However, in 2017 a band 20° of latitude wide, centered on the equator, appeared in which the significance levels were far below 0.05, indicating that one of the measurement systems had begun to fail. In 2018 the fault was not only located at the equator but was also replicated in various bands in the Southern Hemisphere. We interpret this as evidence of irreversible failure in one of the measurement systems.
Keywords: TOMS-OMI and OMPS data; global statistical validation; total column ozone; Mann-Whitney test
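The Mann-Whitney comparison can be sketched with the standard library using the normal approximation, which is reasonable for the sample sizes involved in monthly latitudinal comparisons. This is a simplified illustration: midranks handle ties, but no tie correction is applied to the variance:

```python
from statistics import NormalDist

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test (normal approximation, no tie
    correction in the variance). Returns (U, p) for sample x vs y."""
    n1, n2 = len(x), len(y)
    pooled = sorted([(v, 0) for v in x] + [(v, 1) for v in y])
    rank_sum_x, i = 0.0, 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j][0] == pooled[i][0]:
            j += 1                       # tied group occupies ranks i+1 .. j
        midrank = (i + 1 + j) / 2.0
        rank_sum_x += midrank * sum(1 for k in range(i, j) if pooled[k][1] == 0)
        i = j
    U = rank_sum_x - n1 * (n1 + 1) / 2.0
    mu = n1 * n2 / 2.0
    sigma = (n1 * n2 * (n1 + n2 + 1) / 12.0) ** 0.5
    z = (U - mu) / sigma
    p = 2.0 * (1.0 - NormalDist().cdf(abs(z)))
    return U, p
```

A significance level above 0.05 is the paper's criterion for treating two satellite records as statistically coincident.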
Road Surface Condition and Monitoring System Utilizing Motorcycle (ROCOM): System Development, Validation and Field Test
15
Authors: Muhammad Marizwan Abdul Manan, Muhammad Ruhaizat Abd Ghani. Journal of Traffic and Transportation Engineering, 2018, No. 6, pp. 282-291 (10 pages)
Motorcycles are the riskiest mode of travel in Malaysia; they are also very sensitive to road surface condition. Taking advantage of this, a system of software applications that analyses motorcycle motion and maps out risky road sections, ROCOM, was developed. The system consists of three major components: the ROCOM Data Logger app, which uses a smartphone to collect acceleration data; the ROCOM Risk Mapping app, a web-based application; and ROCOM Visual Tracking, a standalone application. ROCOM is able to detect adverse acceleration (> 2 g) or vibration on the road surface comparably to the High Accuracy GPS Data Logging for Vehicle Testing system (VBOX). Risk-mapping validation along a section of the motorcycle lane on Federal Route 2 showed that ROCOM not only produces a similar risk-mapping pattern, but its route-tracking capability on the map is far superior to the VBOX's. The pilot and field test results showed that ROCOM works best with the smartphone mounted on the motorcycle handlebar or basket, and it can detect various road anomalies with an overall detection rate of 62%, with a high detection rate when passing over uneven road surfaces.
Keywords: road surface condition; mobile and web-based application; visual tracking; data validation
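The > 2 g criterion reduces to a threshold scan over the logged acceleration samples; the function name and sample format here are illustrative, not the app's actual implementation:

```python
def adverse_events(accel_g, threshold=2.0):
    """Return sample indices where acceleration magnitude (in g) exceeds
    the threshold, marking candidate rough-surface locations."""
    return [i for i, a in enumerate(accel_g) if abs(a) > threshold]
```

In the real system each flagged index would be joined with its GPS fix to place the anomaly on the risk map.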
A Framework for Cloud Validation in Pharma
16
Author: Pravin Ullagaddi. Journal of Computer and Communications, 2024, No. 9, pp. 103-118 (16 pages)
The pharmaceutical industry's increasing adoption of cloud-based technologies has introduced new challenges in computerized systems validation (CSV). This paper explores the evolving landscape of cloud validation in pharmaceutical manufacturing, focusing on ensuring data integrity and regulatory compliance in the digital era. We examine the unique characteristics of cloud-based systems and their implications for traditional validation approaches. A comprehensive review of current regulatory frameworks, including FDA and EMA guidelines, provides context for discussing cloud-specific validation challenges. The paper introduces a risk-based approach to cloud CSV, detailing methodologies for assessing and mitigating the risks associated with cloud adoption in pharmaceutical environments. Key considerations for maintaining data integrity in cloud systems are analyzed, particularly when applying ALCOA+ principles in distributed computing environments. The article presents strategies for adapting traditional Installation Qualification (IQ), Operational Qualification (OQ), and Performance Qualification (PQ) models to cloud-based systems, highlighting the importance of continuous validation in dynamic cloud environments. It also explores emerging trends, including the integration of artificial intelligence and edge computing in pharmaceutical manufacturing, and their implications for future validation strategies. This research contributes to the evolving body of knowledge on cloud validation in pharmaceuticals by proposing a framework that balances regulatory compliance with the agility offered by cloud technologies. The findings suggest that while cloud adoption presents unique challenges, a well-structured, risk-based approach to validation can ensure the integrity and compliance of cloud-based systems in pharmaceutical manufacturing.
Keywords: computerized systems validation; risk-based approach; data integrity; pharmaceutical manufacturing; cloud validation
The Importance of Integrating Geological Mapping Information with Validated Assay Data for Generating Accurate Geological Wireframes in Orebody Modelling of Mineral Deposit in Mineral Resource Estimation: A Case Study in AngloGold Ashanti, Obuasi Mine
17
Authors: Joshua Wereko Opong, Chiri G. Amedjoe, Andy Asante, Matthew Coffie Wilson. International Journal of Geosciences, 2022, No. 6, pp. 426-437 (12 pages)
The basis of accurate mineral resource estimates is a geological model that replicates the nature and style of the orebody. Key inputs into the generation of a good geological model are the sample data and mapping information. The Obuasi Mine sample data, which carried many legacy issues, were subjected to a robust validation process and integrated with mapping information to generate an accurate geological orebody model for mineral resource estimation in Block 8 Lower. Validation of the sample data focused on replacing missing collar coordinates and missing assays, correcting the magnetic declination used to convert downhole surveys from magnetic to true north, fixing missing lithology, and finally assigning confidence numbers to all sample data. The replaced coordinates ensured that the sample data plotted at their correct locations in space as intended from the planning stage. The magnetic declination, which had been held constant through the years even though it changes annually, was also corrected in the validation project. The corrected declination ensured that drillholes were plotted on their accurate trajectories per the planned azimuths and reflected the true positions of the intercepted mineralized fissure(s), which was previously not the case and had marked a major blot on the modelling of the Obuasi orebody. Incorporating the mapped data with the validated sample data in the wireframes resulted in a better interpretation of the orebody. The updated mineral resource, generated by domaining quartz separately from the sulphides and compared with the old resource, showed that the sulphide tonnes in the old estimate had been overestimated by 1% and the grade by 8.5%.
Keywords: mineral resource estimation; geological models; sample data validation; assay data; geological mapping
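The declination correction described above amounts to shifting each survey azimuth by the declination valid for that survey's date rather than a fixed historical constant. A minimal sketch, assuming the common east-positive sign convention:

```python
def true_azimuth(magnetic_azimuth_deg, declination_deg):
    """Convert a downhole survey azimuth from magnetic to true north.

    declination_deg is the declination valid for the survey date
    (east-positive convention assumed); the result wraps to [0, 360).
    """
    return (magnetic_azimuth_deg + declination_deg) % 360.0
```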
Potential error underestimation of cross-validation in missing value reconstruction in ocean satellite data
18
Authors: Menghan Yu, Hao Qin, Haoyu Jiang. Acta Oceanologica Sinica, 2025, No. 10, pp. 218-226 (9 pages)
Ocean remote sensing datasets often suffer from missing values for various reasons, yet many scientific applications require spatiotemporally seamless data. Data reconstruction methods are commonly used to obtain such gap-free datasets. In reconstructing satellite remote sensing data, randomly masking original data for progressive cross-validation is a common way to assess reconstruction performance. In this study, the accuracy of this validation method is analysed. We artificially constructed two data-missing patterns using sea surface temperature (SST) data in the East China Sea, one simulating natural cloud coverage and the other randomly masking the same percentage of original data, and compared the reconstruction results for the two types of masking. The root mean square error (RMSE) for the dataset simulating real cloud coverage is more than 50% higher than that for the randomly masked dataset, regardless of the data-missing rate. This result implies that the error of satellite data gap-filling is underestimated when random masking of original data is used for progressive cross-validation, which should be treated with care in applications.
Keywords: gap filling; Data Interpolating Empirical Orthogonal Function (DINEOF); validation
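The paper's finding can be reproduced qualitatively on a toy 1-D field: at the same missing-data rate, scattered validation points make gap-filling look far more accurate than a contiguous cloud-like gap, because scattered points always have close valid neighbours. The field, masks, and interpolation-based filler below are illustrative stand-ins, not DINEOF:

```python
import numpy as np

x = np.linspace(0, 1, 101)
field = np.sin(2 * np.pi * x)            # stand-in for a smooth SST transect

def fill_rmse(mask):
    """Mask points, reconstruct by linear interpolation from the valid
    points, and return the RMSE at the masked locations."""
    valid = ~mask
    filled = np.interp(x, x[valid], field[valid])
    return float(np.sqrt(np.mean((filled[mask] - field[mask]) ** 2)))

scattered = np.zeros(101, bool)
scattered[5:100:5] = True                # 19 isolated validation points
cloud = np.zeros(101, bool)
cloud[20:39] = True                      # one contiguous 19-point "cloud"
rmse_scattered, rmse_cloud = fill_rmse(scattered), fill_rmse(cloud)
```

Even in this toy setting the cloud-shaped gap yields an error an order of magnitude larger than the scattered mask at the same missing rate, the same direction of bias the paper reports for real cloud patterns.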
Data Validation with the Validator Framework for Computer English Vocabulary Management (Cited: 1)
19
Authors: Liu Dan, Zhao Dan. Techniques of Automation and Applications (自动化技术与应用), 2015, No. 12, pp. 26-29 (4 pages)
Building a vocabulary management system is an effective way to cope with the rapid growth of computer vocabulary, which is otherwise difficult to learn and use. This paper studies data input validation based on the Validator framework to support the development of such a system. First, the structure of the Validator framework is examined and its components are briefly described. Then, a data structure for vocabulary storage is designed on the MySQL database management system. Finally, data validation is implemented by creating the validation.xml configuration file, configuring the Validator plug-in in struts-config.xml, changing the parent class of the ActionForm, and configuring the Validator invocation. The approach is simple to configure and convenient to use.
Keywords: computer English; vocabulary management; data validation; Validator framework
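The validation.xml step can be illustrated with a hypothetical Struts 1 Validator entry for a vocabulary-entry form; the form, field, and message-resource names below are invented for the sketch:

```xml
<!-- Hypothetical validation.xml entry: the "word" field is required
     and limited to 64 characters. -->
<form-validation>
  <formset>
    <form name="wordForm">
      <field property="word" depends="required,maxlength">
        <arg key="wordForm.word"/>
        <var>
          <var-name>maxlength</var-name>
          <var-value>64</var-value>
        </var>
      </field>
    </form>
  </formset>
</form-validation>
```

As the abstract outlines, the Validator plug-in is then registered in struts-config.xml and the form bean's parent class is switched to a Validator-aware ActionForm so these rules run on submission.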
Research and Application of Oracle GoldenGate Veridata Data Verification Technology (Cited: 1)
20
Author: Wang Wenge. Electric Power Information and Communication Technology (电力信息与通信技术), 2013, No. 11, pp. 16-20 (5 pages)
Synchronous and asynchronous data replication technologies are widely used in building enterprise data centers and disaster recovery centers to obtain shared or disaster recovery data, but many factors can cause the data at the two ends of replication to become inconsistent, and such inconsistencies are hard to detect promptly. They can lead to data recovery based on inaccurate data and to enterprise decisions based on inaccurate data. By introducing Oracle GoldenGate Veridata data verification technology, large volumes of data shared among different applications can be verified online, ensuring that the data are trustworthy and providing a data consistency and integrity verification method that can be run routinely.
Keywords: Oracle GoldenGate Veridata; data verification; data consistency; data replication
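Veridata's comparison engine is proprietary, but the underlying idea (hash each row on both sides of the replication link and report keys whose digests differ or that exist on only one side) can be sketched with invented data:

```python
import hashlib

def out_of_sync(source_rows, target_rows):
    """Row-hash comparison sketch in the spirit of Veridata.

    source_rows / target_rows map primary key -> row tuple. Returns
    the sorted keys whose row digests differ, including rows present
    on only one side.
    """
    def digest(row):
        return hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()
    src = {k: digest(v) for k, v in source_rows.items()}
    tgt = {k: digest(v) for k, v in target_rows.items()}
    return sorted(k for k in src.keys() | tgt.keys()
                  if src.get(k) != tgt.get(k))
```

Comparing digests instead of full rows keeps the network traffic small, which is what makes routine online verification of large replicated tables practical.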