Journal Articles
2,782 articles found
Detection of Turbulence Anomalies Using a Symbolic Classifier Algorithm in Airborne Quick Access Recorder (QAR) Data Analysis (Cited by: 1)
Authors: Zibo ZHUANG, Kunyun LIN, Hongying ZHANG, Pak-Wai CHAN. 《Advances in Atmospheric Sciences》 SCIE CAS CSCD, 2024, Issue 7, pp. 1438-1449 (12 pages)
As the risks associated with air turbulence are intensified by climate change and the growth of the aviation industry, it has become imperative to monitor and mitigate these threats to ensure civil aviation safety. The eddy dissipation rate (EDR) has been established as the standard metric for quantifying turbulence in civil aviation. This study aims to explore a universally applicable symbolic classification approach based on genetic programming to detect turbulence anomalies using quick access recorder (QAR) data. The detection of atmospheric turbulence is approached as an anomaly detection problem. Comparative evaluations demonstrate that this approach performs on par with direct EDR calculation methods in identifying turbulence events. Moreover, comparisons with alternative machine learning techniques indicate that the proposed technique is the optimal methodology currently available. In summary, the use of symbolic classification via genetic programming enables accurate turbulence detection from QAR data, comparable to that with established EDR approaches and surpassing that achieved with machine learning algorithms. This finding highlights the potential of integrating symbolic classifiers into turbulence monitoring systems to enhance civil aviation safety amidst rising environmental and operational hazards.
Keywords: turbulence detection; symbolic classifier; quick access recorder data
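The symbolic-classification idea can be sketched in miniature. The toy below scores randomly generated formula trees over feature rows and keeps the best separator; a real genetic-programming system would evolve a population with crossover and mutation rather than pure random search, and the features and data here are hypothetical, not the paper's.

```python
import random
import operator

# Candidate operators for the formula trees (illustrative subset).
OPS = [operator.add, operator.sub, operator.mul]

def random_expr(n_features, rng, depth=2):
    """Build a random expression tree over feature indices and constants."""
    if depth == 0 or rng.random() < 0.3:
        if rng.random() < 0.7:
            return ("x", rng.randrange(n_features))   # feature leaf
        return ("c", rng.uniform(-2, 2))              # constant leaf
    return ("op", rng.choice(OPS),
            random_expr(n_features, rng, depth - 1),
            random_expr(n_features, rng, depth - 1))

def evaluate(expr, row):
    kind = expr[0]
    if kind == "x":
        return row[expr[1]]
    if kind == "c":
        return expr[1]
    _, fn, left, right = expr
    return fn(evaluate(left, row), evaluate(right, row))

def fit_symbolic_classifier(X, y, n_candidates=300, seed=0):
    """Return the formula whose sign best separates the two classes."""
    rng = random.Random(seed)
    best, best_acc = None, -1.0
    for _ in range(n_candidates):
        expr = random_expr(len(X[0]), rng)
        preds = [1 if evaluate(expr, row) > 0 else 0 for row in X]
        acc = sum(p == t for p, t in zip(preds, y)) / len(y)
        if acc > best_acc:
            best, best_acc = expr, acc
    return best, best_acc
```

The appeal of the symbolic form is that the winning expression is a readable formula over the input features, unlike most black-box classifiers.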
Algorithms of mining data records from website automatically
Authors: Qiu Yong, Lan Yongjie. 《Journal of Southeast University (English Edition)》 EI CAS, 2006, Issue 3, pp. 423-425 (3 pages)
In order to improve the accuracy and integrality of mining data records from the web, the concepts of isomorphic page and directory page and three algorithms are proposed. An isomorphic web page is a set of web pages that have a uniform structure, differing only in main information. A web page that contains many links to isomorphic web pages is called a directory page. Algorithm 1 finds directory pages in a website using an adjacent-link similarity analysis method: it first sorts the links, then counts the links in each directory; if the count is greater than a given threshold, it finds the similar sub-page links in the directory and outputs the results. A function for isomorphic web page judgment is also proposed. Algorithm 2 mines data records from an isomorphic page using a noise information filter, based on the fact that the noise information is the same in two isomorphic pages and only the main information differs. Algorithm 3 mines data records from an entire website using spider technology. Experiments show that the proposed algorithms mine data records more intactly than existing algorithms. Mining data records from isomorphic pages is an efficient method.
Keywords: data mining; data record; website; isomorphic page
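The noise-filter idea behind Algorithm 2 can be sketched simply: tokens shared by two isomorphic pages are treated as template noise, and the positions where the pages differ carry the data records. This assumes the pages are token-aligned, which real pages would need tree alignment to guarantee.

```python
def mine_main_info(page_a, page_b):
    """Extract the main information from two isomorphic pages by keeping
    only the token positions where the pages differ (shared tokens are
    treated as template noise). A simplified sketch of the noise filter."""
    tokens_a = page_a.split()
    tokens_b = page_b.split()
    record_a = [a for a, b in zip(tokens_a, tokens_b) if a != b]
    record_b = [b for a, b in zip(tokens_a, tokens_b) if a != b]
    return record_a, record_b
```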
Digital Continuity Guarantee Approach of Electronic Record Based on Data Quality Theory (Cited by: 7)
Authors: Yongjun Ren, Jian Qi, Yaping Cheng, Jin Wang, Osama Alfarraj. 《Computers, Materials & Continua》 SCIE EI, 2020, Issue 6, pp. 1471-1483 (13 pages)
Since the British National Archives put forward the concept of digital continuity in 2007, several developed countries have worked out digital continuity action plans. However, technologies for guaranteeing digital continuity are still lacking. This paper first analyzes the requirements of digital continuity guarantee for electronic records based on data quality theory, then points out the necessity of data quality guarantee for electronic records. Moreover, we convert the digital continuity guarantee of electronic records into ensuring their consistency, completeness, and timeliness, and construct the first technology framework of digital continuity guarantee for electronic records. Finally, temporal functional dependency technology is utilized to build the first integration method to ensure the consistency, completeness, and timeliness of electronic records.
Keywords: electronic record; digital continuity; data quality
Constructing Large Scale Cohort for Clinical Study on Heart Failure with Electronic Health Record in Regional Healthcare Platform: Challenges and Strategies in Data Reuse (Cited by: 2)
Authors: Daowen Liu, Liqi Lei, Tong Ruan, Ping He. 《Chinese Medical Sciences Journal》 CAS CSCD, 2019, Issue 2, pp. 90-102 (13 pages)
Regional healthcare platforms collect clinical data from hospitals in specific areas for the purpose of healthcare management. It is a common requirement to reuse the data for clinical research. However, we have to face challenges such as inconsistent terminology in electronic health records (EHR) and the complexities of data quality and data formats on regional healthcare platforms. In this paper, we propose a methodology and process for constructing large-scale cohorts, which form the basis of causality and comparative-effectiveness relationships in epidemiology. We first constructed a Chinese terminology knowledge graph to deal with the diversity of vocabularies on the regional platform. Second, we built special disease case repositories (e.g., a heart failure repository) that utilize the graph to search for related patients and to normalize the data. Based on the requirements of clinical research aimed at exploring the effectiveness of taking statins on 180-day readmission in patients with heart failure, we built a large-scale retrospective cohort with 29,647 cases of heart failure patients from the heart failure repository. After propensity score matching, a study group (n=6,346) and a control group (n=6,346) with parallel clinical characteristics were acquired. Logistic regression analysis showed that taking statins had a negative correlation with 180-day readmission in heart failure patients. This paper presents the workflow and an application example of big data mining based on regional EHR data.
Keywords: electronic health records; clinical terminology; knowledge graph; clinical special disease case repository; evaluation of data quality; large-scale cohort study
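The matching step that produced the balanced statin and control groups can be sketched as greedy 1:1 nearest-neighbor propensity score matching. The scores here are assumed to come from an upstream logistic model; the caliper value and data are illustrative.

```python
def greedy_psm(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbor propensity score matching.

    `treated`/`control` are lists of (patient_id, propensity_score).
    Each treated patient is paired with the closest still-unmatched
    control whose score lies within `caliper`; matched controls are
    removed so each is used at most once.
    """
    pairs = []
    available = list(control)
    for pid, score in sorted(treated, key=lambda pc: pc[1]):
        if not available:
            break
        # nearest remaining control by propensity score
        best = min(available, key=lambda pc: abs(pc[1] - score))
        if abs(best[1] - score) <= caliper:
            pairs.append((pid, best[0]))
            available.remove(best)
    return pairs
```

Greedy matching is order-dependent; production studies often use optimal matching instead, but the greedy form shows the mechanics.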
Data Masking for Chinese Electronic Medical Records with Named Entity Recognition (Cited by: 1)
Authors: Tianyu He, Xiaolong Xu, Zhichen Hu, Qingzhan Zhao, Jianguo Dai, Fei Dai. 《Intelligent Automation & Soft Computing》 SCIE, 2023, Issue 6, pp. 3657-3673 (17 pages)
With the rapid development of information technology, the electronification of medical records has gradually become a trend. In China, the population base is huge and the supporting medical institutions are numerous, so this reality drives the conversion of paper medical records to electronic medical records. Electronic medical records are the basis for establishing a smart hospital and an important guarantee for achieving medical intelligence, and the massive amount of electronic medical record data is also an important data set for conducting research in the medical field. However, electronic medical records contain a large amount of private patient information, which must be desensitized before they are used as open resources. Therefore, data masking for Chinese electronic medical records with named entity recognition is proposed in this paper. First, the text is vectorized to satisfy the required format of the model input. Second, since input sentences vary in length and the relationship between sentences in context is not negligible, a neural network model for named entity recognition based on bidirectional long short-term memory (BiLSTM) with conditional random fields (CRF) is constructed. Finally, the data masking operation is performed based on the named entity recognition results, mainly using regular-expression filtering encryption and principal component analysis (PCA) word-vector compression and replacement. In addition, comparison experiments with the hidden Markov model (HMM), LSTM-CRF, and BiLSTM models are conducted. The experimental results show that the method achieves 92.72% accuracy, 92.30% recall, and a 92.51% F1-score, higher than the other models.
Keywords: named entity recognition; Chinese electronic medical records; data masking; principal component analysis; regular expression
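The masking stage that follows entity recognition can be sketched as two passes: replace the spans the NER model flagged, then apply rule-based regular expressions for ID-like strings. The patterns below are illustrative, not the paper's actual rules.

```python
import re

# Illustrative identifier patterns (not the paper's rule set).
PATTERNS = [
    (re.compile(r"\b\d{18}\b"), "[ID]"),      # 18-digit resident ID numbers
    (re.compile(r"\b1\d{10}\b"), "[PHONE]"),  # 11-digit mobile numbers
]

def mask_record(text, entities=()):
    """Mask NER-detected entity strings first, then regex-detected
    identifiers, so structured identifiers are caught even when the
    model misses them."""
    for surface in entities:
        text = text.replace(surface, "[ENT]")
    for pattern, token in PATTERNS:
        text = pattern.sub(token, text)
    return text
```

Ordering matters: the 18-digit ID rule runs before the phone rule so an ID starting with 1 is not partially consumed as a phone number.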
Maintaining Consistency Based on Timely Updating Records List in Workflow Data
6
作者 WU Hongli YIN Baolin ZHAO Xiao XIANG Gang 《Wuhan University Journal of Natural Sciences》 CAS 2008年第4期481-484,共4页
In order to settle the problem of workflow data consistency in a distributed environment, an invalidation strategy based on a timely updating records list is put forward. The strategy improves on the classical invalidation strategy by adopting the method of updating the records list together with a recovery mechanism for update messages. When the request cycle of a duplication is too long, the strategy uses the records list to pause sending update messages; when the long-cycle duplication is requested again, it uses the recovery mechanism to resume the update messages. This strategy not only ensures the consistency of workflow data, but also reduces unnecessary network traffic. Theoretical comparison with common strategies shows that the unnecessary network traffic of this strategy is lower and more stable, and simulation results validate this conclusion.
Keywords: distributed environment; workflow data; records list; update cycle threshold; consistency maintenance
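The pause/resume behavior described above can be sketched as a small bookkeeping class: a replica whose gap between duplication requests exceeds a threshold stops receiving update messages, and resumes once it requests again at a normal cycle. The threshold and field names are illustrative.

```python
class UpdatingRecordList:
    """Sketch of the timely-updating-records-list strategy: stop pushing
    update messages to long-cycle replicas, resume on renewed activity."""

    def __init__(self, cycle_threshold):
        self.cycle_threshold = cycle_threshold
        self.last_request = {}   # replica -> time of its last request
        self.paused = set()      # replicas not receiving update messages

    def on_request(self, replica, now):
        prev = self.last_request.get(replica)
        if prev is not None and now - prev > self.cycle_threshold:
            # long request cycle: pause further update messages
            self.paused.add(replica)
        else:
            # recovery mechanism: normal cycle, resume update messages
            self.paused.discard(replica)
        self.last_request[replica] = now

    def update_targets(self):
        """Replicas that should currently receive update messages."""
        return set(self.last_request) - self.paused
```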
Data Processing of Fault Recorder in Power System
7
作者 Yu Qin Yu Bai Minghao Wen 《Energy and Power Engineering》 2017年第4期46-52,共7页
This article studies the fault recorder in power systems and introduces the Comtrade format. It then uses C++ programs to read the recorded fault data, and adopts Fourier analysis and the symmetrical component method to filter and extract fundamental waves. Finally, the effectiveness of the data processing method introduced in this paper is verified with CAAP software.
Keywords: fault recorder; reading and processing recorded fault data; result verification
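The fundamental-wave extraction step can be sketched as a full-cycle Fourier filter: correlating one cycle of samples with the fundamental-frequency complex exponential yields the fundamental phasor while rejecting DC offset and integer harmonics. (The paper works in C++; Python is used here for a compact sketch.)

```python
import cmath
import math

def fundamental_phasor(samples, samples_per_cycle):
    """Full-cycle Fourier filter on one cycle of recorded samples.

    Returns (rms_magnitude, phase) of the fundamental component; DC and
    integer harmonics cancel over the full cycle.
    """
    n_samples = samples_per_cycle
    acc = 0j
    for n in range(n_samples):
        acc += samples[n] * cmath.exp(-2j * math.pi * n / n_samples)
    phasor = 2 * acc / n_samples          # peak-value phasor
    return abs(phasor) / math.sqrt(2), cmath.phase(phasor)
```

Applied per phase, these phasors are exactly what the symmetrical component transform then combines into positive-, negative-, and zero-sequence quantities.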
Test Calibration of the Paleoclimatic Proxy Data with Chinese Historical Records
8
作者 De'er Zhang 《Advances in Climate Change Research》 SCIE 2011年第1期38-42,共5页
The calibration of paleoclimate proxies is one of the key problems in the study of paleoclimate at present. Historical documentary records of climate are suitable for calibrating the dating and the climatic implication of proxy data in a climatological sense. A test calibration correcting the Delingha tree-ring precipitation series with Chinese historical documentary records shows that among the 44 extreme dry cases in 1401-1950 AD, 42 cases (95.5%) are believable; thus the long Delingha tree-ring precipitation series is highly reliable. Another test, validating the monsoon intensity proxy data based on the Zhanjiang Huguangyan sediments against historical records, indicates that the winter monsoon intensities designated by the maar lake Ti-content series are entirely opposite to the harsh winters depicted in historical documents for 800-900 AD. As a result, serious doubt is raised about the climatic implication of this paleo-monsoon proxy series.
Keywords: proxy data calibration; paleoclimate; historical documentary records of climate
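The test-calibration logic reduces to an agreement check: a proxy-flagged extreme year counts as believable if a documentary record falls within a tolerance of it, and the confirmed fraction summarizes the proxy's reliability. Years and tolerance below are illustrative, not the paper's data.

```python
def calibrate_extremes(proxy_extreme_years, documented_years, tolerance=0):
    """Check proxy-flagged extreme years against documentary records.

    A proxy year is confirmed if a documented event lies within
    `tolerance` years of it; the ratio of confirmed years measures the
    proxy series' believability.
    """
    documented = sorted(documented_years)
    confirmed = [
        year for year in proxy_extreme_years
        if any(abs(year - d) <= tolerance for d in documented)
    ]
    return confirmed, len(confirmed) / len(proxy_extreme_years)
```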
Medi-Block Record Secure Data Sharing in Healthcare System: Issues, Solutions and Challenges
Authors: Zuriati Ahmad Zukarnain, Amgad Muneer, Nur Atirah Mohamad Nassir, Akram A. Almohammedi. 《Computer Systems Science & Engineering》 SCIE EI, 2023, Issue 12, pp. 2725-2740 (16 pages)
With the advancements in the era of artificial intelligence, blockchain, cloud computing, and big data, there is a need for secure, decentralized medical record storage and retrieval systems. While cloud storage solves storage issues, it is challenging to realize secure sharing of records over the network. The Medi-block record has brought a new digitalization method for patients' medical records in the healthcare system. This technology provides a symmetrical process between the hospital and doctors when patients urgently need to go to a different or nearby hospital. It enables electronic medical records to be available with the correct authentication and restricts access to medical data retrieval. The Medi-block record is a consumer-centered healthcare data system that brings reliable and transparent datasets for medical records. This study presents an extensive review of proposed solutions aiming to protect the privacy and integrity of medical data by securing data sharing for Medi-block records. It also proposes a comprehensive investigation of recent advances in different methods of securing data sharing, such as Blockchain technology, Access Control, Privacy-Preserving, Proxy Re-Encryption, and Service-On-Chain approaches. Finally, we highlight the open issues and identify the challenges regarding secure data sharing for Medi-block records in healthcare systems.
Keywords: Medi-block record; healthcare system; Blockchain technology; secure data sharing
Design of universal data structure and interface based on dynamic self-incremental record
10
作者 Yan Li Peng Ji 《International Journal of Technology Management》 2015年第3期115-116,共2页
In data management system software, records of different lengths need to be stored in an array, and the number of records often increases as the software is used. A universal data structure is presented in this design; it provides a unified interface for dynamically storing records of different lengths, so that developers can call the unified interface directly for data storage, simplifying the design of data management systems.
Keywords: data management system; length-variable data record; dynamic self-incremental array
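One common way to realize such a unified interface is an offset-indexed growing buffer: records of any length are appended to one byte array, and a per-record (offset, length) index lets callers store and fetch without caring about record size. This is a sketch of the general technique, not the paper's exact structure; all names are illustrative.

```python
class RecordArray:
    """Variable-length records in one self-growing byte buffer,
    addressed through a uniform append/get interface."""

    def __init__(self):
        self._buf = bytearray()
        self._index = []          # per-record (offset, length)

    def append(self, record: bytes) -> int:
        """Store a record of any length; returns its record id."""
        self._index.append((len(self._buf), len(record)))
        self._buf.extend(record)  # buffer grows as records accumulate
        return len(self._index) - 1

    def get(self, record_id: int) -> bytes:
        offset, length = self._index[record_id]
        return bytes(self._buf[offset:offset + length])

    def __len__(self):
        return len(self._index)
```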
Guarantee Mechanism of Data Continuity for Electronic Record Based on Linked Data
11
作者 Yuyi Huo Shi Zhou +2 位作者 Ruiguo Hu Yongjun Ren Jinyue Xia 《Journal of Cyber Security》 2020年第3期151-156,共6页
In the field of electronic record management, especially in the current big data environment, data continuity has become a new topic that is as important as security and needs to be studied. This paper decomposes the data continuity guarantee of electronic records into a set of data protection requirements consisting of data relevance, traceability, and comprehensibility, and proposes using linked data technology to provide an integrated guarantee mechanism that meets these three requirements.
Keywords: electronic record; data continuity; guarantee mechanism
Secure approach to sharing digitized medical data in a cloud environment (Cited by: 2)
Authors: Kukatlapalli Pradeep Kumar, Boppuru Rudra Prathap, Michael Moses Thiruthuvanathan, Hari Murthy, Vinay Jha Pillai. 《Data Science and Management》, 2024, Issue 2, pp. 108-118 (11 pages)
Without proper security mechanisms, medical records stored electronically can be accessed more easily than physical files. Patient health information is scattered throughout the hospital environment, including laboratories, pharmacies, and daily medical status reports. The electronic format of medical reports ensures that all information is available in a single place. However, it is difficult to store and manage large amounts of data: dedicated servers and a data center are needed, and self-managed data centers are expensive for hospitals. Storing data in a cloud is a cheaper alternative, and cloud data can be retrieved anywhere, anytime, from any device connected to the Internet. Therefore, doctors can easily access a patient's medical history and diagnose diseases according to the context, which also helps prescribe the correct medicine in an appropriate way. The systematic storage of medical records could help reduce medical errors in hospitals. The challenge is to store medical records on a third-party cloud server while addressing privacy and security concerns, as these servers are often only semi-trusted; sensitive medical information must be protected, since open access to records, and modifications performed on them, may even cause patient fatalities. Patient-centric health record security is a major concern. End-to-end file encryption before outsourcing data to a third-party cloud server ensures security. This paper presents a method combining the Advanced Encryption Standard and the elliptic-curve Diffie-Hellman method, designed to increase the efficiency of medical record security for users. Comparisons of existing and proposed techniques are presented at the end of the article, with a focus on analyzing the security approaches of the elliptic-curve and secret-sharing methods. This study aims to provide a high level of security for patient health records.
Keywords: electronic medical records; cloud computing; data privacy; attribute-based encryption; authentication
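The hybrid pattern the paper combines (Diffie-Hellman key agreement to derive a shared key, then symmetric encryption of the record) can be illustrated with a deliberately tiny stand-in: classic Diffie-Hellman over a small prime group instead of ECDH, and a hash-derived XOR keystream instead of AES. This is a toy for showing the data flow only; real records require a vetted cryptographic library.

```python
import hashlib
import secrets

# Toy group: the prime 2**64 - 59 with generator 2. Far too small for
# real security; it only stands in for the elliptic-curve group.
P = 2**64 - 59
G = 2

def dh_keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    """Both parties derive the same key from g^(ab) mod p."""
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_cipher(key, data):
    """Toy AES stand-in: XOR with a SHA-256 counter keystream.
    Applying it twice with the same key recovers the plaintext."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))
```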
Construction and Application of the “华西黉医” (West China Hongyi) Medical Large Model (Cited by: 2)
Authors: Shi Rui, Zheng Bing, Yao Xun, Yang Hao, Yang Xuchen, Zhang Siyuan, Wang Zhenwu, Liu Dongfeng, Dong Jing, Xie Jiaxi, Ma Hu, He Zhiyang, Jiang Cheng, Qiao Feng, Luo Fengming, Huang Jin. 《中国胸心血管外科临床杂志》 (Chinese Journal of Clinical Thoracic and Cardiovascular Surgery), PKU Core, 2025, Issue 5, pp. 587-593
Objective: To construct the “华西黉医” (West China Hongyi) large model and explore its effectiveness in assisted medical record generation. Methods: Following a full-chain medical large-model construction paradigm of data annotation, model training, and scenario incubation, a 72-billion-parameter medical large model was built through multimodal data fusion, domain-adaptive training, and adaptation to domestically produced hardware. On this basis, an assisted medical record generation system was developed by combining speech recognition, knowledge graphs, and reinforcement learning. Results: Taking assisted discharge summary generation as an example, after pilot departments adopted the system, the average writing time per record fell from 21 min to 5 min, a 3.2-fold efficiency gain, with a system output accuracy of 92.4%. Conclusion: It is feasible for medical institutions to build autonomous and controllable medical large models and to incubate application systems from them, offering a reference path for artificial intelligence construction at similar institutions.
Keywords: medical large model; data annotation; multimodal learning; medical record generation; artificial intelligence
Anomaly Detection Method for Relay Protection Sampling Circuits Based on Comparison of Homologous Fault Recording Data (Cited by: 3)
Authors: Dai Zhihui, Zhang Fuze, Han Xiao. 《电力系统保护与控制》 (Power System Protection and Control), PKU Core, 2025, Issue 1, pp. 147-159
In smart substations under reconstruction, sampling modes are complex and relay protection devices struggle to detect slight anomalies in sampling circuits, so latent circuit defects are exposed only after serious delay. To address this, the sampling modes and secondary equipment configuration of smart substations in the reconstruction period are analyzed, and an anomaly detection method for relay protection sampling circuits based on comparison of homologous fault recording data is proposed. First, a BERT (bidirectional encoder representations from transformers) language model and a cosine similarity algorithm are used to match channels of homologous recording data. Then, resampling and the Manhattan distance are used to unify sampling frequencies and align waveforms in the time domain. Finally, an improved algorithm based on dynamic time warping (DTW) is proposed, and anomaly criteria for the sampling circuit are set by combining it with sample-point offsets. Case analysis shows that the method matches homologous channels, achieves consistent waveform alignment, and identifies abnormal states with higher sensitivity and accuracy than the classical DTW algorithm. The anomaly criteria effectively detect abnormal states in relay protection sampling circuits, ensuring safe and reliable operation of smart substations.
Keywords: relay protection device; sampling circuit; anomaly detection; improved DTW algorithm; fault recording data
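The baseline that the improved algorithm builds on is classic DTW: a dynamic program over all monotonic alignments of two waveforms, so locally stretched or shifted signals still compare as similar. A minimal version:

```python
def dtw_distance(series_a, series_b):
    """Classic dynamic time warping distance between two sequences.

    cost[i][j] holds the best cumulative alignment cost of the first i
    samples of series_a with the first j samples of series_b.
    """
    inf = float("inf")
    n, m = len(series_a), len(series_b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            step = abs(series_a[i - 1] - series_b[j - 1])
            cost[i][j] = step + min(cost[i - 1][j],      # stretch series_b
                                    cost[i][j - 1],      # stretch series_a
                                    cost[i - 1][j - 1])  # match
    return cost[n][m]
```

Two identical waveforms warp to distance zero even when one repeats samples, which is exactly the tolerance to time-axis distortion the comparison needs.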
Homologous Fault Recording Data Matching Method Based on the Sentence-MacBERT Model (Cited by: 1)
Authors: Dai Zhihui, Zhang Fuze, Han Xiao, Wang Guannan. 《电力系统保护与控制》 (Power System Protection and Control), PKU Core, 2025, Issue 8, pp. 159-167
Because recording standards have differed across periods and manufacturers interpret the standards inconsistently, the channel names of homologous fault recording data vary idiosyncratically and channel index numbers differ, making homologous matching of recording data difficult. To address this, a homologous recording data matching method based on the Sentence-MacBERT (sentence-masked language model as correction bidirectional encoder representations from transformers) model is proposed. First, the format characteristics of recording files are analyzed and a verification information table is constructed from them. Then, recording files are automatically checked against that table. Finally, a Sentence-MacBERT homologous-channel matching model is built on top of the BERT (bidirectional encoder representations from transformers) model to complete the matching. Case analysis shows that the verification table supports automatic checking of recording files and raises alarm messages for files that fail to parse. The Sentence-MacBERT model matches channel names well, effectively completing homologous matching of recording data and helping operating personnel with fault analysis.
Keywords: fault recording data; Sentence-MacBERT; automatic verification; channel name; homologous matching
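The embedding-plus-cosine-similarity matching step can be sketched with a crude stand-in for the sentence model: represent each channel name as character-bigram counts and pair it with the most cosine-similar name in the other file. The real method embeds names with Sentence-MacBERT; the bigram vectors and channel names here are illustrative.

```python
from collections import Counter
from math import sqrt

def ngram_vector(name, n=2):
    """Character bigram counts as a toy stand-in for sentence embeddings."""
    return Counter(name[i:i + n] for i in range(len(name) - n + 1))

def cosine(vec_a, vec_b):
    dot = sum(vec_a[k] * vec_b[k] for k in vec_a)
    norm = sqrt(sum(v * v for v in vec_a.values())) * \
           sqrt(sum(v * v for v in vec_b.values()))
    return dot / norm if norm else 0.0

def match_channels(names_a, names_b):
    """Pair each channel name in one file with its most similar name in
    the other file by cosine similarity of the name vectors."""
    return {
        a: max(names_b, key=lambda b: cosine(ngram_vector(a), ngram_vector(b)))
        for a in names_a
    }
```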
Construction and Application of a Clinical Data Extraction and Integration Platform for Specialized Disease Database Building (Cited by: 1)
Authors: Si Chaozeng, Hu Yu, Zhang Sheng, Tian Yanbin, Zhang Weishuo, Li Lin, Xia Jiefeng. 《中国医疗设备》 (China Medical Devices), 2025, Issue 3, pp. 64-69, 90
Objective: To build an efficient and stable method for extracting and integrating multi-source heterogeneous clinical data, enabling data integration and application for specialized disease database construction. Methods: Clinical data resources were acquired with extraction algorithms including an incremental-accumulation model and a self-zipper (history-chain) model; data engine technology was applied to connect and integrate the data with the specialized disease databases, and standardized data quality and data security management systems were established. Results: By the end of 2023, the extraction and integration platform had successfully connected clinical data for 31 specialized disease databases. After the platform was adopted, the number of functional modules, average construction period, enrolled sample size, number of included indicators, and indicator missingness of the specialized disease databases all improved significantly compared with before (P<0.05). Conclusion: The clinical data integration method enables accurate extraction, transformation, and integration of data in specialized disease database construction, improves construction efficiency and data quality, provides data support for clinical research, and promotes the translation of clinical research results.
Keywords: data extraction; data integration; specialized disease database; clinical diagnosis and treatment data; hospital information system; electronic medical record
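The incremental-accumulation idea can be sketched as watermark-based extraction: each run pulls only rows whose modification timestamp is newer than the watermark saved by the previous run, then advances the watermark, so repeated runs accumulate changes without full reloads. Field names below are illustrative.

```python
def incremental_extract(source_rows, last_watermark):
    """Watermark-based incremental extraction step.

    Returns the rows changed since `last_watermark` and the new
    watermark to persist for the next run.
    """
    new_rows = [r for r in source_rows if r["updated_at"] > last_watermark]
    next_watermark = max((r["updated_at"] for r in new_rows),
                         default=last_watermark)
    return new_rows, next_watermark
```

A self-zipper (history-chain) table would additionally close out the superseded version of each changed row, keeping full history rather than only the latest state.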
Medication Patterns of Traditional Chinese Medicine in Treating Henoch-Schönlein Purpura Nephritis Based on the Ancient and Modern Medical Case Cloud Platform (Cited by: 1)
Authors: Wang Heyong, Xiong Lanyue, Sun Sheng, Chen Yang, Yang Lan, Zhou Jingying, He Xiaoping. 《四川中医》 (Sichuan Journal of Traditional Chinese Medicine), 2025, Issue 3, pp. 203-208
Objective: To study the medication patterns of traditional Chinese medicine (TCM) in treating Henoch-Schönlein purpura nephritis (HSPN) using the Ancient and Modern Medical Case Cloud Platform, providing a reference for better clinical treatment of HSPN. Methods: Effective medical cases of TCM treatment of HSPN recorded in relevant journals and literature were retrieved, screened, and normalized, and the platform's data mining tools were used to analyze medication patterns, including herb usage frequency, properties and flavors, meridian tropism, TCM syndromes, and core prescriptions. Results: 132 prescriptions involving 325 herbs were included. The most frequently used herbs were Rehmannia root (dihuang), tree peony bark (mudanpi), licorice (gancao), Imperata rhizome (baimaogen), red peony root (chishao), and small thistle (xiaoji); properties and flavors were mainly cold, neutral, sweet, and bitter; the core base formula comprised red peony root, tree peony bark, Rehmannia root, licorice, and Imperata rhizome. Conclusion: TCM treatment should address both root and manifestation: for the root, reinforcing the spleen and kidney, boosting qi, nourishing yin, and supporting vital qi; for the manifestation, clearing heat, detoxifying, cooling the blood, and stanching bleeding.
Keywords: Ancient and Modern Medical Case Cloud Platform; data mining; traditional Chinese medicine; Henoch-Schönlein purpura nephritis; medication patterns
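The frequency-analysis step of such data mining reduces to counting how often each herb appears across the normalized prescriptions and ranking the most common. The prescription data below is illustrative, not the study's dataset.

```python
from collections import Counter

def herb_frequencies(prescriptions, top_n=5):
    """Count each herb once per prescription and return the top_n
    most frequent herbs as (herb, count) pairs."""
    counts = Counter(herb for p in prescriptions for herb in set(p))
    return counts.most_common(top_n)
```

The same counting generalizes to properties, flavors, and meridians once each herb is mapped to its attributes.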
Reflections on Building a Hospital Electronic Medical Record Quality Control System in the Context of High-Quality Development (Cited by: 2)
Authors: Dong Zhencai. 《现代医院》 (Modern Hospital), 2025, Issue 1, pp. 65-67, 71
Objective: To establish an electronic medical record (EMR) quality control evaluation system, effectively strengthen data quality control capability, accelerate the formation of an EMR theoretical framework, and advance the high-quality development of public hospitals centered on EMR quality control system construction. Methods: Based on the Hall three-dimensional structure model, a quality management architecture integrating the time, logic, and knowledge dimensions was constructed to bring supervision mechanisms into full play, raise management capability, and form an efficient management system that is applicable to clinical work and continuously improving. Results: Defect rates of active medical records declined steadily across categories, with large drops in the defect rates of consultation records and operative records. Conclusion: An inpatient EMR quality management system based on the Hall three-dimensional structure can effectively promote the efficient operation of the EMR system and play a functional guiding role.
Keywords: electronic medical record; data quality; indicators; system
Risk Governance of Medical Record Data Quality Based on PDCA (Cited by: 2)
Authors: Liu Yuan, Wang Gang. 《中国卫生质量管理》 (Chinese Health Quality Management), 2025, Issue 2, pp. 38-41
Objective: To analyze the data quality risks facing medical records during a hospital's digital transformation and to take countermeasures to improve medical record data quality. Methods: Using the PDCA management method, patients' basic information on medical record front sheets was reviewed and tracked, and improvement measures were implemented, including process re-engineering, refinement of the standard database and quality-control rules, construction of a visualization platform for medical record data indicators, and professional training for relevant staff. Results: The completeness of patients' basic information on the front sheet improved markedly, missing and erroneous data dropped substantially, and the overall quality of the front sheet rose effectively. Conclusion: Homogenized management of information collection processes, intelligent validation mechanisms in the system, strengthened data quality training, and multi-department collaboration mechanisms can improve the completeness and accuracy of front-sheet information and raise the hospital's management capability.
Keywords: digital transformation; medical records; PDCA; data quality
An XGBoost-Based Model for Automatic Identification of Head-Truncated Seismic Records
Authors: Li Shanyou, Xie Bonan, Lu Jianqi, Xie Zhinan, Li Wei, Chen Xin. 《应用基础与工程科学学报》 (Journal of Basic Science and Engineering), PKU Core, 2025, Issue 2, pp. 338-348
More than about half of strong-motion observation records suffer from truncated signal heads. Automatically removing head-truncated seismic records from massive datasets is an important need for research on seismic P-wave parameter algorithms. An automatic identification model for head-truncated ground-motion records was built with the extreme gradient boosting tree (XGBoost) method, using 83,825 vertical-component acceleration records from 970 earthquakes recorded by Japan's K-NET network as the training/test dataset. The model identifies positive samples (records with intact heads) with a 92.07% success rate and negative samples (head-truncated records) with a 98.93% success rate. Compared with a traditional model based on Fisher linear discrimination on the same test set, the XGBoost model greatly improves the positive-sample identification rate while keeping the negative-sample rate high. The results show that the model identifies head-truncated and intact records with high precision; when P-wave parameters must be extracted automatically from massive strong-motion data, the model can automatically remove head-truncated records to keep them from contaminating data quality.
Keywords: massive seismic data; head-truncated seismic records; XGBoost; ensemble learning; seismic P wave; data cleaning
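The intuition the classifier exploits can be sketched as simple waveform features: a record with an intact head starts in pre-event noise, so the amplitude of its first few percent of samples is small relative to the whole record, while a truncated record starts mid-shaking. The two ratios below are the kind of inputs a gradient-boosted classifier could be trained on; they are assumptions for illustration, not the paper's actual feature set.

```python
def head_truncation_features(samples, head_frac=0.05):
    """Two illustrative features for spotting a head-truncated record:
    RMS of the leading window relative to the whole record, and the
    first sample's amplitude relative to the record's peak."""
    head_len = max(1, int(len(samples) * head_frac))
    head = samples[:head_len]

    def rms(xs):
        return (sum(x * x for x in xs) / len(xs)) ** 0.5

    peak = max(abs(x) for x in samples)
    return {
        "head_rms_ratio": rms(head) / (rms(samples) or 1.0),
        "first_sample_ratio": abs(samples[0]) / (peak or 1.0),
    }
```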