Journal Literature
959 articles found
1. Advances in five-dimensional seismic data interpretation and reservoir prediction
Authors: Xingyao YIN, Kun LI, Zhaoyun ZONG, Fanchang ZHANG, Zhengqian MA. Science China Earth Sciences, 2026, Issue 2, pp. 395–415 (21 pages)
Five-dimensional seismic data encompasses seismic reflection wavefield information across three-dimensional space, offset, and observation azimuth. The interpretation of such data offers a novel approach for high-precision characterization of complex oil and gas reservoirs. This paper reviews key scientific issues and foundational research related to five-dimensional seismic data interpretation, with a particular emphasis on major advances in techniques involving rock physics theories, seismic attribute analysis, seismic inversion optimization, fracture prediction, in-situ stress estimation, and fluid identification, both domestically and internationally. It further explores the opportunities, challenges, and future directions in the development of theories and methods for interpreting five-dimensional seismic data. Theoretical research and real applications have shown that constructing a five-dimensional seismic rock physics model—incorporating temperature and pressure conditions, strong heterogeneity and anisotropy, and other microscopic rock physics mechanisms—provides the physical basis for seismically identifying different types of complex reservoirs. Additionally, the development of robust inversion and quantitative interpretation methods tailored to fractured reservoirs can address issues such as computational instability and low information utilization often associated with massive high-dimensional datasets. Innovations in fracture prediction technology, leveraging multi-dimensional information fusion attributes—including five-dimensional geometric attributes, azimuthal elastic modulus ellipse fitting, Fourier series decomposition, and azimuthal inversion attributes—have proven effective in enhancing fracture prediction accuracy. Moreover, the establishment of five-dimensional seismic prediction methods for engineering sweet spots (e.g., reservoir brittleness and in-situ stress) based on anisotropy theory enables effective evaluation of the fracturability of subsurface formations. The application of five-dimensional seismic interpretation theory and technology provides a new pathway for predicting complex reservoirs and oil-gas identification.
Keywords: five-dimensional seismic data; seismic inversion; reservoir prediction; seismic rock physics; fracture prediction
2. Bernoulli-based random undersampling schemes for 2D seismic data regularization (Cited: 4)
Authors: 蔡瑞, 赵群, 佘德平, 杨丽, 曹辉, 杨勤勇. Applied Geophysics, SCIE CSCD, 2014, Issue 3, pp. 321–330, 351–352 (12 pages)
Seismic data regularization is an important preprocessing step in seismic signal processing. Traditional seismic acquisition methods follow the Shannon–Nyquist sampling theorem, whereas compressive sensing (CS) provides a fundamentally new paradigm to overcome limitations in data acquisition. Besides the sparse representation of the seismic signal in some transform domain and the 1-norm reconstruction algorithm, the seismic data regularization quality of CS-based techniques strongly depends on random undersampling schemes. For 2D seismic data, discrete uniform-based methods have been investigated, where some seismic traces are randomly sampled with equal probability. However, in theory and practice, some seismic traces need to be sampled with different probabilities to satisfy the assumptions of CS. Therefore, designing new undersampling schemes is imperative. We propose a Bernoulli-based random undersampling scheme and its jittered version to determine the regular traces that are randomly sampled with different probabilities, while both schemes comply with the Bernoulli process distribution. We performed experiments using the Fourier and curvelet transforms and the spectral projected gradient for 1-norm reconstruction algorithm (SPGL1), and ten different random seeds. According to the signal-to-noise ratio (SNR) between the original and reconstructed seismic data, the detailed experimental results from 2D numerical and physical simulation data show that the proposed novel schemes perform overall better than the discrete uniform schemes.
Keywords: seismic data regularization; compressive sensing; Bernoulli distribution; sparse transform; undersampling; 1-norm reconstruction algorithm
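As a rough illustration of the sampling step only (a minimal sketch, not the authors' exact construction — the `cell` size and the per-cell acceptance rule below are simplifying assumptions), a Bernoulli scheme keeps each regular trace independently with probability `p`, and a jittered variant confines the random picks so the kept traces stay evenly spread:

```python
import random

def bernoulli_undersample(n_traces, p):
    """Keep each regular trace independently with probability p,
    so the kept trace indices follow a Bernoulli process."""
    return [i for i in range(n_traces) if random.random() < p]

def jittered_bernoulli_undersample(n_traces, p, cell=4):
    """Jittered variant: at most one kept trace per cell of `cell`
    consecutive traces, so the random picks cannot cluster."""
    kept = []
    for start in range(0, n_traces, cell):
        stop = min(start + cell, n_traces)
        # expected picks per cell is roughly p * cell
        if random.random() < min(1.0, p * cell):
            kept.append(random.randrange(start, stop))
    return kept

random.seed(0)
idx = bernoulli_undersample(400, 0.5)
jidx = jittered_bernoulli_undersample(400, 0.125, cell=4)
```

The jittered version trades a little randomness for a guarantee against large gaps, which is what makes jittered sampling attractive for reconstruction.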
3. Diverse Deep Matrix Factorization With Hypergraph Regularization for Multi-View Data Representation
Authors: Haonan Huang, Guoxu Zhou, Naiyao Liang, Qibin Zhao, Shengli Xie. IEEE/CAA Journal of Automatica Sinica, SCIE EI CSCD, 2023, Issue 11, pp. 2154–2167 (14 pages)
Deep matrix factorization (DMF) has been demonstrated to be a powerful tool for capturing the complex hierarchical information of multi-view data. However, existing multi-view DMF methods mainly explore the consistency of multi-view data, while neglecting the diversity among different views as well as the high-order relationships of the data, resulting in the loss of valuable complementary information. In this paper, we design a hypergraph regularized diverse deep matrix factorization (HDDMF) model for multi-view data representation (MDR), to jointly utilize multi-view diversity and a high-order manifold in a multi-layer factorization framework. A novel diversity enhancement term is designed to exploit the structural complementarity between different views of the data. Hypergraph regularization is utilized to preserve the high-order geometric structure of the data in each view. An efficient iterative optimization algorithm is developed to solve the proposed model, with theoretical convergence analysis. Experimental results on five real-world data sets demonstrate that the proposed method significantly outperforms state-of-the-art multi-view learning approaches.
Keywords: deep matrix factorization (DMF); diversity; hypergraph regularization; multi-view data representation (MDR)
4. Iteratively Weighted Least Square Inversion of 3D Seismic Data Regularization under Constraints of Local Plane Wave Model
Authors: Liu Yujin, Li Zhenchun. 石油地球物理勘探 (Oil Geophysical Prospecting), EI CSCD, PKU Core, 2012, Issue A02, pp. 41–47 (7 pages)
Keywords: petroleum; geophysical prospecting; geological survey; oil and gas resources
5. Data-Path Placement Based on Regularity Extraction and Implementation
Authors: 杨长旗, 洪先龙, 蔡懿慈, 经彤, 吴为民. Journal of Semiconductors, EI CAS CSCD, PKU Core, 2004, Issue 8, pp. 925–936 (12 pages)
An algorithm named DPP is addressed. In it, a new model based on the concept of irregularity degree is proposed to evaluate the regularity of cells. It generates the structural regularity of cells by exploiting the signal flow of the circuit. Then, it converts the bit-slice structure into parallel constraints to enable the Q place algorithm. The design flow and the main algorithms are introduced. Finally, satisfactory experimental results of the tool, compared with the Cadence placement tool SE, are discussed.
Keywords: data path; regularity extraction; bit-slice structure; Q place
6. Regularized least-squares migration of simultaneous-source seismic data with adaptive singular spectrum analysis (Cited: 12)
Authors: Chuang Li, Jian-Ping Huang, Zhen-Chun Li, Rong-Rong Wang. Petroleum Science, SCIE CAS CSCD, 2017, Issue 1, pp. 61–74 (14 pages)
Simultaneous-source acquisition has been recognized as an economic and efficient acquisition method, but direct imaging of simultaneous-source data produces migration artifacts because of the interference of adjacent sources. To overcome this problem, we propose a regularized least-squares reverse time migration method (RLSRTM) using the singular spectrum analysis technique, which imposes sparseness constraints on the inverted model. Additionally, the difference spectrum theory of singular values is presented so that RLSRTM can be implemented adaptively to eliminate the migration artifacts. With numerical tests on a flat layer model and a Marmousi model, we validate the superior imaging quality, efficiency, and convergence of RLSRTM compared with LSRTM when dealing with simultaneous-source data, incomplete data, and noisy data.
Keywords: least-squares migration; adaptive singular spectrum analysis; regularization; blended data
7. Graph Regularized L_p Smooth Non-negative Matrix Factorization for Data Representation (Cited: 10)
Authors: Chengcai Leng, Hai Zhang, Guorong Cai, Irene Cheng, Anup Basu. IEEE/CAA Journal of Automatica Sinica, EI CSCD, 2019, Issue 2, pp. 584–595 (12 pages)
This paper proposes a graph regularized L_p smooth non-negative matrix factorization (GSNMF) method by incorporating graph regularization and an L_p smoothing constraint, which considers the intrinsic geometric information of a data set and produces smooth and stable solutions. The main contributions are as follows: first, graph regularization is added into NMF to discover the hidden semantics and simultaneously respect the intrinsic geometric structure information of a data set. Second, the L_p smoothing constraint is incorporated into NMF to combine the merits of isotropic (L_2-norm) and anisotropic (L_1-norm) diffusion smoothing, and produces a smooth and more accurate solution to the optimization problem. Finally, the update rules and proof of convergence of GSNMF are given. Experiments on several data sets show that the proposed method outperforms related state-of-the-art methods.
Keywords: data clustering; dimensionality reduction; graph regularization; L_p smooth non-negative matrix factorization (SNMF)
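For orientation, the baseline NMF that such methods build on uses the Lee–Seung multiplicative updates below; the graph-regularization and L_p-smoothing terms that the paper adds to these update rules are omitted here (a minimal pure-Python sketch, not the paper's algorithm):

```python
import random

def matmul(A, B):
    """Dense matrix product for lists of lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def frob2(V, W, H):
    """Squared Frobenius reconstruction error ||V - WH||_F^2."""
    R = matmul(W, H)
    return sum((V[i][j] - R[i][j]) ** 2
               for i in range(len(V)) for j in range(len(V[0])))

def nmf(V, r, iters=500, seed=0):
    """Baseline Lee-Seung multiplicative updates for V ~ W @ H, W,H >= 0.
    The updates never change sign, so non-negativity is preserved."""
    rng = random.Random(seed)
    m, n = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(r)] for _ in range(m)]
    H = [[rng.random() + 0.1 for _ in range(n)] for _ in range(r)]
    eps = 1e-9
    err0 = frob2(V, W, H)
    for _ in range(iters):
        WtV = matmul(transpose(W), V)
        WtWH = matmul(matmul(transpose(W), W), H)
        H = [[H[i][j] * WtV[i][j] / (WtWH[i][j] + eps) for j in range(n)]
             for i in range(r)]
        VHt = matmul(V, transpose(H))
        WHHt = matmul(W, matmul(H, transpose(H)))
        W = [[W[i][j] * VHt[i][j] / (WHHt[i][j] + eps) for j in range(r)]
             for i in range(m)]
    return W, H, err0, frob2(V, W, H)

# Exactly rank-2 block matrix: NMF should recover the two blocks.
V = [[1.0, 0.0, 0.0, 1.0],
     [1.0, 0.0, 0.0, 1.0],
     [0.0, 1.0, 1.0, 0.0],
     [0.0, 1.0, 1.0, 0.0]]
W, H, err0, err = nmf(V, 2)
```

The regularized variants modify the numerator and denominator of these multiplicative updates with extra graph-Laplacian and smoothing terms while keeping the same non-negativity-preserving structure.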
8. An enrichment model using regular health examination data for early detection of colorectal cancer (Cited: 3)
Authors: Qiang Shi, Zhaoya Gao, Pengze Wu, Fanxiu Heng, Fuming Lei, Yanzhao Wang, Qingkun Gao, Qingmin Zeng, Pengfei Niu, Cheng Li, Jin Gu. Chinese Journal of Cancer Research, SCIE CAS CSCD, 2019, Issue 4, pp. 686–698 (13 pages)
Objective: Challenges remain in current practices of colorectal cancer (CRC) screening, such as low compliance, low specificity, and expensive cost. This study aimed to identify high-risk groups for CRC from the general population using regular health examination data. Methods: The study population consists of more than 7,000 CRC cases and more than 140,000 controls. Using regular health examination data, a model detecting CRC cases was derived by the classification and regression trees (CART) algorithm. The receiver operating characteristic (ROC) curve was applied to evaluate the performance of models. The robustness and generalization of the CART model were validated by independent datasets. In addition, the effectiveness of CART-based screening was compared with stool-based screening. Results: After data quality control, 4,647 CRC cases and 133,898 controls free of colorectal neoplasms were used for downstream analysis. The final CART model based on four biomarkers (age, albumin, hematocrit, and percent lymphocytes) was constructed. In the test set, the area under the ROC curve (AUC) of the CART model was 0.88 [95% confidence interval (95% CI), 0.87–0.90] for detecting CRC. At the cutoff yielding 99.0% specificity, the model's sensitivity was 62.2% (95% CI, 58.1%–66.2%), thereby achieving a 63-fold enrichment of CRC cases. We validated the robustness of the method across subsets of the test set with diverse CRC incidences, aging rates, gender ratios, distributions of tumor stages and locations, and data sources. Importantly, CART-based screening had a higher positive predictive value (1.6%) than the fecal immunochemical test (0.3%). Conclusions: As an alternative approach for the early detection of CRC, this study provides a low-cost method using regular health examination data to identify high-risk individuals for CRC for further examinations. The approach can promote early detection of CRC, especially in developing countries such as China, where annual health examination is popular but regular CRC-specific screening is rare.
Keywords: classification and regression trees; colorectal cancer; regular health examination data; routine lab test biomarkers
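The fixed-specificity cutoff logic described in the abstract can be illustrated on synthetic scores (a hypothetical one-dimensional risk score stands in for the CART model's output; the distributions and counts below are invented for illustration, not the study's data):

```python
import random

random.seed(0)
# Hypothetical risk scores: higher in cases than in controls.
controls = [random.gauss(0.0, 1.0) for _ in range(10000)]
cases = [random.gauss(2.0, 1.0) for _ in range(500)]

def cutoff_at_specificity(control_scores, spec=0.99):
    """Smallest score threshold classifying >= spec of controls as negative."""
    s = sorted(control_scores)
    return s[int(spec * len(s))]

t = cutoff_at_specificity(controls, spec=0.99)
specificity = sum(x <= t for x in controls) / len(controls)
sensitivity = sum(x > t for x in cases) / len(cases)

# Enrichment: prevalence among screen-positives vs. baseline prevalence.
prev = len(cases) / (len(cases) + len(controls))
ppv = (sensitivity * len(cases)) / (
    sensitivity * len(cases) + (1 - specificity) * len(controls))
enrichment = ppv / prev
```

Fixing a very high specificity first and then reading off sensitivity is what makes the approach usable as a population-level enrichment filter: almost no controls are flagged, so the flagged group is heavily enriched for true cases.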
9. Data Gathering in Wireless Sensor Networks Via Regular Low Density Parity Check Matrix (Cited: 1)
Authors: Xiaoxia Song, Yong Li. IEEE/CAA Journal of Automatica Sinica, SCIE EI CSCD, 2018, Issue 1, pp. 83–91 (9 pages)
A great challenge faced by wireless sensor networks (WSNs) is to reduce the energy consumption of sensor nodes. Fortunately, data gathering via random sensing can save energy of sensor nodes. Nevertheless, its randomness and density usually result in difficult implementations, high computation complexity, and large storage spaces in practical settings. So deterministic sparse sensing matrices are desired in some situations. However, it is difficult to guarantee the performance of a deterministic sensing matrix by the acknowledged metrics. In this paper, we construct a class of deterministic sparse sensing matrices with the statistical version of the restricted isometry property (StRIP) via regular low density parity check (RLDPC) matrices. The key idea of our construction is to achieve small mutual coherence of the matrices by confining the column weights of RLDPC matrices such that StRIP is satisfied. Besides, we prove that the constructed sensing matrices have the same scale of measurement numbers as the dense measurements. We also propose a data gathering method based on the RLDPC matrix. Experimental results verify that the constructed sensing matrices have better reconstruction performance compared to the Gaussian, Bernoulli, and CSLDPC matrices. We also verify that data gathering via the RLDPC matrix can reduce the energy consumption of WSNs.
Keywords: data gathering; regular low density parity check (RLDPC) matrix; sensing matrix; signal reconstruction; wireless sensor networks (WSNs)
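The energy argument rests on sparsity: a fixed-column-weight binary sensing matrix means each sensor reading contributes to only a few measurements. The sketch below builds such a matrix and applies it to a sparse signal (a loose illustration only; real RLDPC constructions also control row weights and mutual coherence, which are omitted here):

```python
import random

def rldpc_like_matrix(m, n, col_weight=3, seed=0):
    """Sparse binary m-by-n sensing matrix with a fixed column weight,
    stored column-wise as sets of row indices. Loosely mimics a
    regular-LDPC-style construction; no coherence/girth checks."""
    rng = random.Random(seed)
    return [set(rng.sample(range(m), col_weight)) for _ in range(n)]

def measure(cols, x, m):
    """Compute y = Phi @ x for the column-set representation above."""
    y = [0.0] * m
    for j, rows in enumerate(cols):
        if x[j]:
            for i in rows:
                y[i] += x[j]
    return y

cols = rldpc_like_matrix(20, 50, col_weight=3)
x = [0.0] * 50
x[7] = 1.5          # 1-sparse signal: a single active sensor
y = measure(cols, x, 20)
```

Each column touching only `col_weight` rows is exactly what keeps per-node transmission (and hence energy) low compared with dense Gaussian or Bernoulli measurement matrices.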
10. Regularized canonical correlation analysis with unlabeled data (Cited: 1)
Authors: Xi-chuan ZHOU, Hai-bin SHEN. Journal of Zhejiang University-Science A (Applied Physics & Engineering), SCIE EI CAS CSCD, 2009, Issue 4, pp. 504–511 (8 pages)
In standard canonical correlation analysis (CCA), the data from definite datasets are used to estimate their canonical correlation. In real applications, for example in bilingual text retrieval, a great portion of the data may not be labeled as to which set it belongs. This part of the data is called unlabeled data, while the rest, from definite datasets, is called labeled data. We propose a novel method called regularized canonical correlation analysis (RCCA), which makes use of both labeled and unlabeled samples. Specifically, we learn to approximate the canonical correlation as if all data were labeled. Then, we describe a generalization of RCCA for the multi-set situation. Experiments on four real-world datasets, Yeast, Cloud, Iris, and Haberman, demonstrate that, by incorporating the unlabeled data points, the accuracy of correlation coefficients can be improved by over 30%.
Keywords: canonical correlation analysis (CCA); regularization; unlabeled data; generalized canonical correlation analysis (GCCA)
11. Effect of regularization parameters on geophysical reconstruction
Authors: Zhou Hui, Wang Zhaolei, Qiu Dongling, Li Guofa, Shen Jinsong. Petroleum Science, SCIE CAS CSCD, 2009, Issue 2, pp. 119–126 (8 pages)
In this paper we discuss the edge-preserving regularization method in the reconstruction of physical parameters from geophysical data such as seismic and ground-penetrating radar data. In the regularization method a potential function of model parameters and its corresponding functions are introduced. This method is stable and able to preserve boundaries and protect resolution. The effect of regularization depends to a great extent on the suitable choice of regularization parameters. The influence of the edge-preserving parameters on the reconstruction results is investigated, and the relationship between the regularization parameters and the error of data is described.
Keywords: geophysical data; inversion; error of data; regularization; regularization parameters
12. THE REGULARITY AND UNIQUENESS OF A GLOBAL SOLUTION TO THE ISENTROPIC NAVIER-STOKES EQUATION WITH ROUGH INITIAL DATA
Authors: 王海涛, 张雄韬. Acta Mathematica Scientia, SCIE CSCD, 2023, Issue 4, pp. 1675–1716 (42 pages)
A global weak solution to the isentropic Navier-Stokes equation with initial data around a constant state in the L^1 ∩ BV class was constructed in [1]. In the current paper, we continue to study the uniqueness and regularity of the constructed solution. The key ingredients are the Hölder continuity estimates of the heat kernel in both spatial and time variables. With these finer estimates, we obtain higher order regularity of the constructed solution to the Navier-Stokes equation, so that all of the derivatives in the equation of conservative form hold in the strong sense. Moreover, this regularity also allows us to identify a function space in which the stability of the solutions can be established, which eventually implies the uniqueness.
Keywords: compressible Navier-Stokes equation; BV initial data; regularity; uniqueness
13. Correlations and prevalence patterns among thyroid nodules, hyperplasia of the mammary gland, and uterine leiomyomas: a real-world data study (Cited: 1)
Authors: 李春晓, 张莹莹, 凌霄, 杨玉晴, 张业, 金艳涛. 中华中医药学刊, PKU Core, 2026, Issue 1, pp. 7–12 (6 pages)
Objective: To explore the correlations and prevalence patterns among thyroid nodules (TN), hyperplasia of mammary gland (HMG), and uterine leiomyomas (UL), providing a scientific basis for clinical diagnosis, treatment, and rational drug use. Methods: Electronic health records of female examinees who underwent thyroid, breast, and uterine ultrasound examinations at the physical examination center of the First Affiliated Hospital of Henan University of Chinese Medicine from October 19, 2011 to December 19, 2018 were retrospectively analyzed. According to the findings, cases with any single disease, any two diseases co-occurring, or all three diseases were assigned to the case group, and individuals with none of the diseases to the control group. The Cochran-Mantel-Haenszel test and descriptive statistics (SPSS Statistics 22.0) were used to analyze the correlations and prevalence patterns among the three diseases. Results: A total of 5,252 eligible cases were included, of which 2,902 formed the case group. Correlation analysis showed significant positive correlations for the co-occurrence of any two diseases, most notably TN with HMG in the 45–59 age group (r=0.106, P<0.001), TN with UL in the 18–44 age group (r=0.122, P<0.001), and HMG with UL in the 45–59 age group (r=0.157, P<0.001). Descriptive statistics indicated that co-occurrence of any two or all three diseases was concentrated in the 45–59 age group. The predominant traditional Chinese medicine constitutions of the patients were phlegm-dampness and yang-deficiency; blood pressure, blood lipid, and blood glucose levels were mostly normal but differed significantly from the control group (P<0.001). Laboratory results (blood routine, liver function, and kidney function) were within normal ranges but also differed significantly from the control group (P<0.001). Conclusion: Based on real-world data, thyroid nodules, hyperplasia of the mammary gland, and uterine leiomyomas show significant positive pairwise correlations, with significant differences from the normal group in age distribution, TCM constitution, blood pressure, blood glucose, blood lipids, and laboratory indices, providing an important basis for the early detection, prevention, and treatment of these three diseases.
Keywords: thyroid nodules; hyperplasia of mammary gland; uterine leiomyomas; data mining; real-world study; prevalence patterns
14. Multiobjective particle swarm inversion algorithm for two-dimensional magnetic data (Cited: 8)
Authors: 熊杰, 张涛. Applied Geophysics, SCIE CSCD, 2015, Issue 2, pp. 127–136, 273 (11 pages)
Regularization inversion uses constraints and a regularization factor to solve ill-posed inversion problems in geophysics. The choice of the regularization factor and of the initial model is critical in regularization inversion. To deal with these problems, we propose a multiobjective particle swarm inversion (MOPSOI) algorithm to simultaneously minimize the data misfit and model constraints, and obtain a multiobjective inversion solution set without the gradient information of the objective function and the regularization factor. We then choose the optimum solution from the solution set based on the trade-off between data misfit and constraints, which substitutes for the regularization factor. The inversion of synthetic two-dimensional magnetic data suggests that the MOPSOI algorithm can obtain as many feasible solutions as possible; thus, deeper insights into the inversion process can be gained and more reasonable solutions can be obtained by balancing the data misfit and constraints. The proposed MOPSOI algorithm can deal with the problems of choosing the right regularization factor and the initial model.
Keywords: multiobjective inversion; particle swarm optimization; regularization factor; global search; magnetic data
15. Extrapolated Tikhonov method and inversion of 3D density images of gravity data
Authors: 王祝文, 许石, 刘银萍, 刘菁华. Applied Geophysics, SCIE CSCD, 2014, Issue 2, pp. 139–148, 252 (11 pages)
The Tikhonov regularization (TR) method has played a very important role in gravity and magnetic data processing. In this paper, the Tikhonov regularization method with respect to the inversion of gravity data is discussed, and the extrapolated TR method (EXTR) is introduced to improve the fitting error. Furthermore, the effects of the parameters in the EXTR method on the fitting error, number of iterations, and inversion results are discussed in detail. The computation results using a synthetic model with the same and different densities indicated that, compared with the TR method, the EXTR method not only achieves the a priori fitting error level set by the interpreter but also increases the fitting precision, although it increases the computation time and number of iterations. The EXTR inversion results are also more compact than the TR inversion results, which are more divergent. The range of the inversion data is closer to the default range of the model parameters, and the model features and default model density distribution agree well.
Keywords: gravity data inversion; 3D inversion; extrapolated Tikhonov regularization method; extrapolated Tikhonov parameter selection
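As background for the extrapolated variant above, plain Tikhonov regularization has a closed form via the normal equations; the toy 2x2 system below (an illustrative sketch, not the paper's gravity inversion) shows the stability/bias trade-off as the regularization parameter grows:

```python
def tikhonov_2x2(A, d, lam):
    """Solve min ||A m - d||^2 + lam * ||m||^2 via the normal equations
    (A^T A + lam I) m = A^T d, inverted in closed form for 2x2 A."""
    (a, b), (c, e) = A
    p = a * a + c * c + lam      # (A^T A + lam I)[0][0]
    q = a * b + c * e            # off-diagonal entry
    r = b * b + e * e + lam      # (A^T A + lam I)[1][1]
    det = p * r - q * q
    rhs0 = a * d[0] + c * d[1]   # (A^T d)[0]
    rhs1 = b * d[0] + e * d[1]   # (A^T d)[1]
    return ((r * rhs0 - q * rhs1) / det, (p * rhs1 - q * rhs0) / det)

# Nearly rank-deficient operator: a small lam stabilizes the solution,
# a large lam damps it toward zero (bias).
A = [[1.0, 1.0], [1.0, 1.0001]]
d = [2.0, 2.0001]                # consistent with the model m = (1, 1)
m_small = tikhonov_2x2(A, d, 1e-10)
m_large = tikhonov_2x2(A, d, 1.0)
```

Extrapolation methods such as EXTR aim to reduce exactly the bias visible in `m_large` while keeping the stabilizing effect of the penalty.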
16. Medication patterns in Chinese herbal compound patents for palpitations based on data mining
Authors: 王雅婧, 张晋豫, 韩雨桐, 崔坤, 何海龙, 谢连娣. 中西医结合心脑血管病杂志, 2026, Issue 4, pp. 508–514 (7 pages)
Objective: To explore the prescription patterns of Chinese medicine in treating palpitations, based on herbal compound patents in the patent database of the China National Intellectual Property Administration. Methods: Compound patent data on Chinese medicine for palpitations meeting the inclusion criteria from September 10, 1985 to March 5, 2024 were collected. Excel 2022, R (v4.2.0), and VOSviewer (1.6.20) were used for data cleaning, standardization, modeling, medication frequency analysis, association rule analysis, cluster analysis, co-occurrence counting, and visualization, to explore the prescription patterns for palpitations. Results: A total of 350 eligible palpitation patents involving 736 herbs were included. The ten most frequently used herbs were Danshen, Maidong, Danggui, Suanzaoren, Wuweizi, Guizhi, Huangqi, Zhigancao, Renshen, and Dangshen. Association rule analysis yielded 63 rules; the most strongly associated combinations were Danshen, Honghua → Taoren; Muli → Longgu; Longgu → Muli; Danshen, Taoren → Honghua; and Taoren → Honghua. Cluster analysis produced six herb groups. Co-occurrence analysis showed that the most frequent herb pairs were Danshen-Danggui and Danshen-Maidong. Conclusion: The basic pathogenesis of palpitations is deficiency of both qi and yin with internally generated stagnant heat, treated by boosting qi and nourishing yin, diffusing stagnation and clearing heat, with core herbs Maidong, Wuweizi, Dihuang, Guizhi, Zhigancao, Yujin, Huanglian, and Kushen. For deficiency of heart and spleen with insufficient generation of qi and blood, treatment strengthens the spleen, nourishes blood, transforms phlegm, and activates blood, with core herbs Dazao, Longyanrou, Baishao, Shudi, Dangshen, Changpu, Yuanzhi, Baizhu, Chenpi, Fuling, and Gancao. Blood stasis is a core element, treated by activating blood and dispelling stasis, with Danshen often used as an adjunct. For restlessness with a disquieted heart-spirit, sedative Longgu and Muli or heart-nourishing Suanzaoren and Baiziren may be selected according to the pattern. The core prescription comprises Danshen, Zhigancao, Wuweizi, Suanzaoren, Renshen, Maidong, Fuling, Dangshen, Danggui, and Chuanxiong. The new prescriptions and compatibility patterns obtained through data mining may serve as a reference for the clinical diagnosis and treatment of palpitations and for new drug development.
Keywords: palpitations; arrhythmia; data mining; national patents; medication patterns
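The support and confidence measures behind such association rules can be computed directly; the toy prescription "transactions" below are invented for illustration (they are not the 350 patents analyzed in the study; the herb names are used only for flavor):

```python
# Toy prescription transactions: each set lists the herbs in one formula.
rxs = [
    {"Danshen", "Honghua", "Taoren"},
    {"Danshen", "Honghua", "Taoren", "Danggui"},
    {"Longgu", "Muli"},
    {"Longgu", "Muli", "Suanzaoren"},
    {"Danshen", "Maidong"},
    {"Danshen", "Danggui", "Maidong"},
]

def support(itemset):
    """Fraction of transactions containing the whole itemset."""
    return sum(itemset <= rx for rx in rxs) / len(rxs)

def confidence(lhs, rhs):
    """confidence(LHS -> RHS) = support(LHS | RHS) / support(LHS)."""
    return support(lhs | rhs) / support(lhs)

conf = confidence({"Longgu"}, {"Muli"})     # Longgu -> Muli
sup = support({"Danshen", "Honghua"})
```

Rules such as "Longgu → Muli" with confidence 1.0 mean that every formula containing the antecedent also contains the consequent; tools like SPSS Modeler report exactly these two statistics for each mined rule.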
17. Medication patterns of Wang Ji for brain-system disorders based on Yixue Yuanli (Principles of Medicine)
Authors: 远志, 胡庆龄, 葛淑凡, 吴榕柠, 郭锦晨, 刘柳青, 黄辉. 陕西中医药大学学报, 2026, Issue 1, pp. 87–96 (10 pages)
Objective: Based on Yixue Yuanli, to explore via data mining the medication patterns of Wang Ji's core prescriptions for treating brain-system disorders. Methods: Prescriptions for brain-system disorders in Yixue Yuanli were retrieved, and a medication database was built with Microsoft Excel 2021; analyses of frequency, four qi and five flavors, meridian tropism, and efficacy categories were performed on the recorded prescriptions. Association rule analysis, cluster analysis of high-frequency herbs, and factor analysis were carried out with SPSS Modeler 18.0 and SPSS Statistics 26.0, and literature mining was performed on the characteristic active components and mechanisms of action of the high-frequency herbs. Results: A total of 109 prescriptions involving 136 herbs with 776 total uses were included; the seven most frequent herbs were Gancao, Renshen, Chuanxiong, Banxia, Chenpi, Fuling, and Baishao. Cold and warm herbs were used together; the dominant flavor was acrid, supplemented by sweet and bitter; meridian tropism was mainly to the spleen meridian; and most herbs were deficiency-tonifying, exterior-releasing, or heat-clearing. Association rule analysis yielded the high-support combination Baishao-Danggui-Chuanxiong and the high-confidence combination Renshen-Huangqi (combinations with 100% confidence excluded), and complex network analysis showed the strongest association between Gancao and Renshen. Cluster analysis gave seven herb groups, and factor analysis extracted nine common factors. Conclusion: Brain-system disorders should first be differentiated as deficiency or excess: deficiency patterns are first treated by tonification, such as boosting qi, nourishing blood, enriching yin, and warming yang; excess patterns are first treated by draining, such as rectifying qi, resolving stasis, expelling water, purgation, sweeping phlegm, and freeing the collaterals. Wang Ji advocated tonifying qi as the mainstay, supplemented by nourishing blood and yin, and skillfully used Renshen and Huangqi to secure the origin. He often used exterior-releasing herbs such as Chaihu, Shengma, and Xixin, and tonifying herbs such as Shudihuang, Baizhu, and Danggui, with clinical prescriptions frequently modified from Siwu Tang, Buzhong Yiqi Tang, and Erchen Tang. The mechanisms of the herbs commonly used in Wang Ji's core prescriptions for brain-system disorders mostly relate to the inhibition of neuroinflammation and oxidative stress, providing a reference for the clinical treatment of brain-system diseases and the development of targeted drugs.
Keywords: Yixue Yuanli; data mining; Chinese materia medica; brain-system disorders; medication patterns
18. Building outline extraction from LiDAR point clouds based on vertical density
Authors: 蔡训峰, 徐卓揆, 袁齐, 朱彬. 工程勘察, 2026, Issue 2, pp. 70–75 (6 pages)
Extracting building outlines from point cloud data is a current research hotspot, but most existing algorithms either require suitable seed points to be selected first or adapt poorly to point clouds with uneven density. This paper proposes a method for rapidly extracting vectorized building outlines from point cloud data based on vertical density: first, building points are separated from the filtered non-ground points using elevation and area thresholds; then initial building polylines are extracted based on vertical density; finally, the initial polylines are fitted with weighting to extract regularized building outlines. The results show that the vertical-density-based method requires no auxiliary data and adapts well to complex terrain. Comparison of the experimentally extracted data with field measurements gives an outline extraction accuracy of 90.98%, an area accuracy of 94.32%, a perimeter accuracy of 95.72%, and a positional mean error of 0.036 m, demonstrating good extraction performance and providing a new method for building outline extraction from point cloud data.
Keywords: LiDAR point cloud data; vectorization; building outline; vertical density; weighted polyline regularization
19. FedReg*: Addressing Non-Independent and Identically Distributed Challenges in Federated Learning
Authors: SHI Xiujin, ZHU Xiaolong, XIAO Wentao. Journal of Donghua University (English Edition), 2026, Issue 1, pp. 41–49 (9 pages)
In non-independent and identically distributed (non-IID) data environments, model performance often degrades significantly. To address this issue, two improvement methods are proposed: FedReg and FedReg*. FedReg is a method based on hybrid regularization aimed at enhancing federated learning in non-IID scenarios. It introduces hybrid regularization to replace traditional L2 regularization, combining the advantages of L1 and L2 regularization to enable feature selection while preventing overfitting. This method better adapts to the diverse data distributions of different clients, improving overall model performance. FedReg* combines hybrid regularization with weighted model aggregation. In addition to the benefits of hybrid regularization, FedReg* applies a weighted averaging method in the model aggregation process, calculating weights based on the cosine similarity between each client gradient and the global gradient to more reasonably distribute client contributions. By considering variations in data quality and quantity among clients, FedReg* highlights the importance of key clients and enhances the model's generalization performance. These improvement methods enhance model accuracy and communication efficiency.
Keywords: federated learning; non-independent and identically distributed (non-IID) data; hybrid regularization; cosine similarity
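The two ingredients described in the abstract can be sketched as follows. This is one plausible reading, not the paper's exact formulas: the elastic-net-style penalty stands in for the "hybrid regularization", the plain average stands in for the global gradient, and clipping negative similarities to zero is an assumption of this sketch:

```python
import math

def hybrid_penalty(params, alpha, l1_ratio=0.5):
    """Elastic-net-style mix of L1 (feature selection) and L2 (overfitting
    control), standing in for the paper's hybrid regularization term."""
    l1 = sum(abs(p) for p in params)
    l2 = sum(p * p for p in params)
    return alpha * (l1_ratio * l1 + (1 - l1_ratio) * l2)

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) *
                  math.sqrt(sum(b * b for b in v)))

def aggregate(client_grads):
    """Weight each client gradient by its cosine similarity to the average
    ("global") gradient, clip negatives to zero, renormalize, and combine."""
    n, dim = len(client_grads), len(client_grads[0])
    global_grad = [sum(g[i] for g in client_grads) / n for i in range(dim)]
    sims = [max(cosine(g, global_grad), 0.0) for g in client_grads]
    total = sum(sims) or 1.0
    weights = [s / total for s in sims]
    agg = [sum(w * g[i] for w, g in zip(weights, client_grads))
           for i in range(dim)]
    return agg, weights

# Third client's gradient points away from the consensus direction.
grads = [[1.0, 0.0], [0.9, 0.1], [-1.0, 0.0]]
agg, w = aggregate(grads)
```

Under this scheme the dissimilar client receives zero weight, so a client whose update conflicts with the consensus direction cannot drag the aggregated model away from it.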
20. Analysis of the medication patterns of TCM physician Li Wen in treating urticaria based on data mining
Authors: 叶锦雄, 陈文婷, 何骋, 郑利平, 陈建新. 中国民族民间医药, 2026, Issue 4, pp. 110–115 (6 pages)
Objective: To explore the prescription patterns of TCM physician Li Wen in treating urticaria. Methods: Based on the outpatient medical record system, prescriptions for urticaria diagnosed by Li Wen at Guangzhou Dongsheng Hospital from November 2020 to May 2021 were collected and a database was established. SPSS Statistics 26 and SPSS Modeler 18.0 were used for frequency statistics, classification, and association rule analysis of the herb data. Results: A total of 83 prescriptions involving 95 herbs, with a total frequency of 1,175, were included. The three most frequent herbs were Fangfeng, Jingjiesui, and Shetui; exterior-releasing, tonifying, and liver-pacifying wind-extinguishing herbs were used most; the herb properties were mainly warm and neutral, the flavors mainly sweet and acrid, and the meridian tropism mainly to the lung and liver meridians. At 80% support and 90% confidence, the most frequent combination was Jingjiesui-Fangfeng. Cluster analysis yielded five cluster prescriptions. Conclusion: Li Wen mainly uses exterior-releasing and tonifying herbs in treating urticaria, with relatively concentrated herb selection and fixed patterns, providing a useful reference for the clinical treatment of urticaria.
Keywords: urticaria; Li Wen; data mining; medication patterns