Journal Articles
3,587 articles found
Real-time rock mass condition prediction with TBM tunneling big data using a novel rock-machine mutual feedback perception method (Cited: 23)
Authors: Zhijun Wu, Rulei Wei, Zhaofei Chu, Quansheng Liu. Journal of Rock Mechanics and Geotechnical Engineering, SCIE, CSCD, 2021, No. 6, pp. 1311-1325 (15 pages)
Real-time perception of rock mass information is of great importance to efficient tunneling and hazard prevention in tunnel boring machines (TBMs). In this study, a TBM-rock mutual feedback perception method based on data mining (DM) is proposed, which takes 10 tunneling parameters related to surrounding rock conditions as input features. For implementation, first, a database of TBM tunneling parameters was established, accommodating 10,807 tunneling cycles from the Songhua River water conveyance tunnel. Then, the spectral clustering (SC) algorithm based on graph theory was introduced to cluster the TBM tunneling data. According to the clustering results and a rock mass boreability index, the rock mass conditions were classified into four classes, and reasonable distribution intervals of the main tunneling parameters corresponding to each class were presented. Meanwhile, based on a deep neural network (DNN), a real-time prediction model for the different rock conditions was established. Finally, the rationality and adaptability of the proposed method were validated by analyzing the tunneling specific energy, feature importance, and training dataset size. The proposed TBM-rock mutual feedback perception method enables automatic identification of rock mass conditions and dynamic adjustment of tunneling parameters during TBM driving. Furthermore, in terms of prediction performance, the method can predict the rock mass conditions ahead of the tunnel face in real time more accurately than traditional machine learning prediction methods.
Keywords: tunnel boring machine (TBM), data mining (DM), spectral clustering (SC), deep neural network (DNN), rock mass condition perception
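The graph-based clustering step this abstract describes can be sketched in miniature. Below is a minimal, numpy-only spectral clustering demo on synthetic stand-ins for the 10 tunneling parameters; the group means, kernel width, and sample sizes are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Two synthetic "rock condition" groups in a 10-dimensional parameter space.
rng = np.random.default_rng(0)
soft_rock = rng.normal(0.0, 0.3, size=(50, 10))
hard_rock = rng.normal(1.0, 0.3, size=(50, 10))
X = np.vstack([soft_rock, hard_rock])

# RBF affinity matrix (the similarity graph), then the normalized Laplacian.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-0.5 * d2)                 # assumed kernel width
D = W.sum(axis=1)
L = np.eye(len(X)) - W / np.sqrt(np.outer(D, D))

# The sign pattern of the second-smallest eigenvector (the Fiedler vector)
# bipartitions the graph into the two clusters.
eigvals, eigvecs = np.linalg.eigh(L)  # eigh returns ascending eigenvalues
labels = (eigvecs[:, 1] > 0).astype(int)
print(np.unique(labels, return_counts=True))
```

With more than two classes (the paper uses four), one would embed with the first k eigenvectors and run k-means instead of a sign split.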
Effect of non-vacuum storage condition on minimum explosible concentration of aluminum dust (Cited: 3)
Authors: ZHANG Shi-an, ZHANG Lu-gang. Journal of Measurement Science and Instrumentation, CAS, CSCD, 2019, No. 3, pp. 219-222 (4 pages)
Non-vacuum storage conditions have a great impact on the explosion characteristics of aluminum powders. In this paper, vacuum-packed flake and globular aluminum powders stored in a dryer after opening the vacuum package are selected as the experimental samples, and a 20 L spherical explosion device is used to test the minimum explosible concentration (MEC) of the aluminum dusts for different storage times. The results show that the MEC values of both types of unoxidized aluminum powders are 30 g/m^3. The MEC values of flake and globular aluminum powders first rise with increasing storage time in the dryer, reach maximum values of 50 g/m^3 and 60 g/m^3 at their respective storage times, and finally stabilize. The main reason is that the oxidation rate is faster owing to the larger specific surface area of globular aluminum powders; hence, storage time has a more significant effect on the MEC of globular aluminum powder than on that of flake aluminum powder. After a period of time, the outer surface oxidizes to form a film, which prevents further oxidation of the aluminum powder, resulting in the temporary stability of the MEC.
Keywords: storage condition, oxidation rate, aluminum powders, minimum explosible concentration (MEC)
A PRELIMINARY STUDY ON COMBINING TWO KINDS OF PROXY DATA USING THE CONDITIONAL QUANTILE ADJUSTMENT METHOD (Cited: 1)
Authors: Wu Xiangding, Liu Hongbin (Institute of Geography, CAS, Beijing 100101, People's Republic of China); Pan Yimin (Institute of Applied Mathematics, CAS, Beijing 100080, People's Republic of China). Journal of Geographical Sciences, SCIE, CSCD, 1995, No. 1, pp. 52-62 (11 pages)
Based on two kinds of proxy data, a tree-ring width chronology at Huashan and the wetness/dryness grade series around Xi'an in north-central China, the present study demonstrates how different types of proxy climate records can be combined to give a more reliable estimate of past climate than either record can provide individually. With comparison and correction of the two data sets, various statistical models can be developed from the individual and combined series. Among them, the best combined model, produced by the conditional quantile adjustment method, can be selected for reconstruction of April-July rainfall at Huashan back to 1600 A.D.
Keywords: conditional quantile, climate, proxy data
Effective data sampling strategies and boundary condition constraints of physics-informed neural networks for identifying material properties in solid mechanics (Cited: 4)
Authors: W. WU, M. DANEKER, M. A. JOLLEY, K. T. TURNER, L. LU. Applied Mathematics and Mechanics (English Edition), SCIE, EI, CSCD, 2023, No. 7, pp. 1039-1068 (30 pages)
Material identification is critical for understanding the relationship between mechanical properties and the associated mechanical functions. However, material identification is a challenging task, especially when the characteristic of the material is highly nonlinear in nature, as is common in biological tissue. In this work, we identify unknown material properties in continuum solid mechanics via physics-informed neural networks (PINNs). To improve the accuracy and efficiency of PINNs, we develop efficient strategies to nonuniformly sample observational data. We also investigate different approaches to enforce Dirichlet-type boundary conditions (BCs) as soft or hard constraints. Finally, we apply the proposed methods to a diverse set of time-dependent and time-independent solid mechanics examples that span linear elastic and hyperelastic material space. The estimated material parameters achieve relative errors of less than 1%. As such, this work is relevant to diverse applications, including optimizing structural integrity and developing novel materials.
Keywords: solid mechanics, material identification, physics-informed neural network (PINN), data sampling, boundary condition (BC) constraint
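The hard-constraint option for Dirichlet BCs mentioned in this abstract can be illustrated without any neural-network library: compose the model output with a transform that satisfies the BCs identically, so no boundary penalty is needed in the loss. A minimal sketch (the interval, boundary values, and the stand-in "network" are assumptions for illustration, not the paper's setup):

```python
import numpy as np

# Hard Dirichlet constraints on [0, 1]: enforce u(0) = a and u(1) = b exactly.
a, b = 1.0, 3.0

def network(x):
    # Stand-in for an untrained neural network: any smooth function works,
    # because the transform below guarantees the BCs regardless.
    return np.sin(5.0 * x) + x**2

def u_hard(x):
    # The linear interpolant matches the BCs; the factor x*(1-x) vanishes at
    # both endpoints, so the network only shapes the interior solution.
    return a * (1.0 - x) + b * x + x * (1.0 - x) * network(x)

x = np.linspace(0.0, 1.0, 5)
print(u_hard(x)[0], u_hard(x)[-1])  # exactly a and b
```

A soft constraint would instead add a term like `w * ((u(0)-a)**2 + (u(1)-b)**2)` to the training loss, satisfying the BCs only approximately.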
On the 4D Variational Data Assimilation with Constraint Conditions (Cited: 1)
Authors: Zhu Keyun (朱克云). Advances in Atmospheric Sciences, SCIE, CAS, CSCD, 2001, No. 6, pp. 1131-1145 (15 pages)
An investigation is carried out on the problem involved in 4D variational data assimilation (VDA) with constraint conditions based on a finite-element shallow-water equation model. In the investigation, the adjoint technique, the penalty method, and the augmented Lagrangian method from the constrained optimization field are used to minimize the defined constraint objective functions. The results of the numerical experiments show that the optimal solutions are obtained when the functions reach their minima. VDA with constraint conditions controlling the growth of gravity oscillations is efficient at eliminating perturbations and produces an optimal initial field. It appears that this method can also be applied to problems in numerical weather prediction. (This research is supported by the National Natural Science Foundation of China (Grant No. 49575269) and by the National Key Basic Research on the Formation Mechanism and Prediction Theory of Severe Synoptic Disasters (Grant No. G1998040910).)
Keywords: variational data assimilation, constraint conditions, penalty methods, finite-element model
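The penalty method named in this abstract can be shown on a toy constrained minimization (the objective, constraint, step size, and penalty schedule below are illustrative assumptions, not the paper's shallow-water problem):

```python
import numpy as np

# Quadratic-penalty minimization, mimicking the penalty-method idea:
# minimize f(x, y) = (x-2)^2 + (y-1)^2  subject to  x + y = 1.
# The exact constrained minimizer is (1, 0).

def penalized_grad(z, mu):
    x, y = z
    c = x + y - 1.0                      # constraint residual
    return np.array([2*(x - 2) + 2*mu*c,  # d/dx [f + mu*c^2]
                     2*(y - 1) + 2*mu*c]) # d/dy [f + mu*c^2]

z = np.zeros(2)
for mu in [1.0, 10.0, 100.0, 1000.0]:     # gradually stiffen the penalty
    for _ in range(20000):
        z -= 1e-4 * penalized_grad(z, mu)

print(z)  # approaches (1, 0) as mu grows
```

The augmented Lagrangian method also mentioned in the abstract adds an explicit multiplier term to this penalty, which avoids driving mu to infinity.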
Quantitative Integration of High-Resolution Hydrogeophysical Data: A Novel Approach to Monte-Carlo-Type Conditional Stochastic Simulations and Implications for Hydrological Predictions
Authors: Baptiste Dafflon, James Irving, Klaus Holliger. Journal of China University of Geosciences, SCIE, CSCD, 2009, No. 3, pp. 580-591 (12 pages)
Geophysical techniques can help to bridge the inherent gap that exists with regard to spatial resolution and coverage for classical hydrological methods. This has led to the emergence of a new and rapidly growing research domain generally referred to as hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters, their inherent trade-off between resolution and range, as well as the notoriously site-specific nature of petrophysical parameter relations, the fundamental usefulness of multi-method surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database into a unified model of the probed subsurface region that is consistent with all available measurements. To this end, we present a novel approach toward hydrogeophysical data integration based on a Monte-Carlo-type conditional stochastic simulation method that we consider to be particularly suitable for high-resolution local-scale studies. Monte Carlo techniques are flexible and versatile, allowing a wide variety of data and constraints of differing resolution and hardness to be accounted for, and thus have the potential of providing, in a geostatistical sense, realistic models of the pertinent target parameter distributions. Compared to more conventional approaches, such as co-kriging or cluster analysis, our approach provides significant advancements in the way that larger-scale structural information contained in the hydrogeophysical data can be accounted for.
After outlining the methodological background of our algorithm, we present the results of its application to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the detailed local-scale porosity structure. Our procedure is first tested on pertinent synthetic data and then applied to a field dataset collected at the Boise Hydrogeophysical Research Site. Finally, we compare the performance of our data integration approach to that of more conventional methods with regard to the prediction of flow and transport phenomena in highly heterogeneous media and discuss the implications arising.
Keywords: aquifer characterization, conditional simulation, geostatistics, hydrogeophysics, hydrology, quantitative data integration, Monte Carlo methods, simulated annealing, flow and transport modeling
A Geometric Approach to Conditioning and the Search for Minimum Variance Unbiased Estimators
Authors: James E. Marengo, David L. Farnsworth. Open Journal of Statistics, 2021, No. 3, pp. 437-442 (6 pages)
Our purpose is twofold: to present a prototypical example of the conditioning technique to obtain the best estimator of a parameter, and to show that this technique resides in the structure of an inner product space. The technique uses conditioning of an unbiased estimator on a sufficient statistic. This procedure is founded upon the conditional variance formula, which leads to an inner product space and a geometric interpretation. The example clearly illustrates the dependence on the sampling methodology. These advantages show the power and centrality of this process.
Keywords: conditional variance formula, conditioning, geometric representation, minimum variance estimator, Rao-Blackwell theorem, sufficient statistic, unbiased estimator
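The Rao-Blackwell conditioning the abstract describes can be demonstrated by simulation: condition an unbiased estimator on a sufficient statistic and watch the variance drop. A minimal sketch with a Bernoulli sample (the parameter, sample size, and replication count are illustrative assumptions):

```python
import numpy as np

# For an i.i.d. Bernoulli(p) sample, X1 alone is unbiased for p; conditioning
# it on the sufficient statistic T = sum(X) gives E[X1 | T] = T/n (the sample
# mean), which is still unbiased but has variance smaller by a factor of n.
rng = np.random.default_rng(1)
p, n, reps = 0.3, 20, 200_000

samples = rng.binomial(1, p, size=(reps, n))
crude = samples[:, 0].astype(float)   # unbiased but noisy: Var = p(1-p)
improved = samples.mean(axis=1)       # Rao-Blackwellized: Var = p(1-p)/n

print(crude.mean(), improved.mean())  # both near p = 0.3
print(crude.var(), improved.var())    # variance shrinks by roughly n = 20
```

The conditional variance formula Var(X1) = Var(E[X1|T]) + E[Var(X1|T)] is exactly the Pythagorean decomposition in the inner product space the paper builds.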
Some Additional Moment Conditions for a Dynamic Count Panel Data Model with Predetermined Explanatory Variables
Authors: Yoshitsugu Kitazawa. Open Journal of Statistics, 2013, No. 5, pp. 319-333 (15 pages)
This paper proposes some additional moment conditions for the linear feedback model with explanatory variables being predetermined, which is proposed by [1] for the purpose of dealing with count panel data. The newly proposed moment conditions include those associated with the equidispersion, the Negbin I-type model and the stationarity. The GMM estimators are constructed incorporating the additional moment conditions. Some Monte Carlo experiments indicate that the GMM estimators incorporating the additional moment conditions perform well, compared to that using only the conventional moment conditions proposed by [2,3].
Keywords: count panel data, linear feedback model, moment conditions, GMM, Monte Carlo experiments
Zero Truncated Bivariate Poisson Model: Marginal-Conditional Modeling Approach with an Application to Traffic Accident Data
Authors: Rafiqul I. Chowdhury, M. Ataharul Islam. Applied Mathematics, 2016, No. 14, pp. 1589-1598 (10 pages)
A new covariate-dependent zero-truncated bivariate Poisson model is proposed in this paper, employing a generalized linear model. A marginal-conditional approach is used to formulate the bivariate model. The proposed model, with its estimation procedure and tests for goodness-of-fit and under- (or over-) dispersion, is presented and applied to road safety data. The two correlated outcome variables considered in this study are the number of cars involved in an accident and the number of casualties for a given number of cars.
Keywords: bivariate Poisson, conditional model, generalized linear model, marginal model, road safety data, zero-truncated
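A building block of the model described here is the zero-truncated Poisson distribution: conditioning a Poisson(lam) count on being positive rescales the pmf by 1/(1 - exp(-lam)). A small sketch verifying the pmf and its mean lam/(1 - exp(-lam)) (the rate value is an illustrative assumption; the paper's bivariate model stacks such marginal/conditional pieces with covariates):

```python
import math

def zt_poisson_pmf(k, lam):
    # P(Y = k | Y > 0) for Y ~ Poisson(lam); zero is excluded by truncation.
    if k < 1:
        return 0.0
    return math.exp(-lam) * lam**k / (math.factorial(k) * (1.0 - math.exp(-lam)))

lam = 2.0
probs = [zt_poisson_pmf(k, lam) for k in range(1, 60)]
mean = sum(k * zt_poisson_pmf(k, lam) for k in range(1, 60))

print(round(sum(probs), 6))  # ≈ 1: a proper pmf after truncation
print(round(mean, 4), round(lam / (1 - math.exp(-lam)), 4))  # both ≈ 2.313
```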
A Novel Computationally Efficient Approach to Identify Visually Interpretable Medical Conditions from 2D Skeletal Data
Authors: Praveen Jesudhas, T. Raghuveera. Computer Systems Science & Engineering, SCIE, EI, 2023, No. 9, pp. 2995-3015 (21 pages)
Timely identification and treatment of medical conditions could facilitate faster recovery and better health. Existing systems address this issue using custom-built sensors, which are invasive and difficult to generalize. A low-complexity, scalable process is proposed to detect and identify medical conditions from 2D skeletal movements in video feed data. A minimal set of features relevant to distinguishing medical conditions, AMF, PVF, and GDF, is derived from skeletal data on frames sampled across the entire action. The AMF (angular motion features) capture the angular motion of limbs during a specific action. The relative position of joints is represented by the PVF (positional variation features). The GDF (global displacement features) identify the direction of overall skeletal movement. The discriminative capability of these features is illustrated by their variance across time for different actions. The classification of medical conditions is approached in two stages. In the first stage, a low-complexity binary LSTM classifier is trained to distinguish visual medical conditions from general human actions. In stage 2, a multi-class LSTM classifier is trained to identify the exact medical condition from a given set of visually interpretable medical conditions. The proposed features are extracted from the 2D skeletal data of NTU RGB+D and then used to train the binary and multi-class LSTM classifiers. The binary and multi-class classifiers achieved average F1 scores of 77% and 73%, respectively, while the overall system produced an average F1 score of 69% and a weighted average F1 score of 80%. The multi-class classifier is found to use 10 to 100 times fewer parameters than existing 2D CNN-based models while producing similar levels of accuracy.
Keywords: action recognition, 2D skeletal data, medical condition, computer vision, deep learning
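One of the feature families named here, the angular motion features (AMF), reduces to joint angles computed from 2D coordinates. A minimal sketch of one such angle (the joint triple and coordinates are hypothetical; the paper's exact feature definition may differ):

```python
import numpy as np

# Joint angle at the elbow from 2D skeletal coordinates; tracking this angle
# across sampled frames yields a limb-motion signal of the AMF kind.
def joint_angle(shoulder, elbow, wrist):
    u = np.asarray(shoulder, float) - np.asarray(elbow, float)
    v = np.asarray(wrist, float) - np.asarray(elbow, float)
    cosine = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosine, -1.0, 1.0)))

# A fully extended arm gives 180 degrees; a right-angle bend gives 90.
print(joint_angle((0, 0), (1, 0), (2, 0)))  # 180.0
print(joint_angle((0, 0), (1, 0), (1, 1)))  # 90.0
```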
Censored Composite Conditional Quantile Screening for High-Dimensional Survival Data
Authors: LIU Wei, LI Yingqiu. 《应用概率统计》 (Chinese Journal of Applied Probability and Statistics), CSCD, PKU Core, 2024, No. 5, pp. 783-799 (17 pages)
In this paper, we introduce the censored composite conditional quantile coefficient (cCCQC) to rank the relative importance of each predictor in high-dimensional censored regression. The cCCQC takes advantage of all useful information across quantiles and can effectively detect nonlinear effects, including interactions and heterogeneity. Furthermore, the proposed screening method based on the cCCQC is robust to the existence of outliers and enjoys the sure screening property. Simulation results demonstrate that the proposed method performs competitively on survival datasets with high-dimensional predictors, particularly when the variables are highly correlated.
Keywords: high-dimensional survival data, censored composite conditional quantile coefficient, sure screening property, rank consistency property
Minimum MSE Weighted Estimator to Make Inferences for a Common Risk Ratio across Sparse Meta-Analysis Data
Authors: Chukiat Viwatwongkasem, Sutthisak Srisawad, Pichitpong Soontornpipit, Jutatip Sillabutra, Pratana Satitvipawee, Prasong Kitidamrongsuk, Hathaikan Chootrakool. Open Journal of Statistics, 2022, No. 1, pp. 49-69 (21 pages)
The paper aims to discuss three interesting issues of statistical inference for a common risk ratio (RR) in sparse meta-analysis data. Firstly, the conventional log-risk ratio estimator encounters a number of problems when the number of events in the experimental or control group is zero in a sparse 2 × 2 table. An adjusted log-risk ratio estimator with continuity correction points, based upon the minimum Bayes risk with respect to the uniform prior density over (0, 1) and the Euclidean loss function, is proposed. Secondly, the interest is to find the optimal weights of the pooled estimate that minimize the mean square error (MSE) subject to a constraint on the weights. Finally, the performance of this minimum-MSE weighted estimator, adjusted with various values of the correction points, is investigated in comparison with other popular estimators, such as the Mantel-Haenszel (MH) estimator and the weighted least squares (WLS) estimator (equivalently known as the inverse-variance weighted estimator), in terms of point estimation and hypothesis testing via simulation studies. The estimation results illustrate that, regardless of the true value of the RR, the MH estimator achieves the best performance with the smallest MSE when the study size is rather large and the sample sizes within each study are small. The MSEs of the WLS estimator and the proposed weighted estimator are close together and are the best when the sample sizes are moderate to large while the study size is rather small.
Keywords: minimum MSE weights, adjusted log-risk ratio estimator, sparse meta-analysis data, continuity correction
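The sparse-data problem motivating this paper is easy to reproduce: with a zero cell, the raw log risk ratio is undefined, while a continuity-corrected estimator or Mantel-Haenszel pooling stays finite. A minimal sketch (the strata counts and the correction value 0.5 are placeholders, not the paper's optimized correction points):

```python
import math

# Sparse 2x2 strata as (events_t, size_t, events_c, size_c). One stratum has
# a zero cell, so its raw log risk ratio would be -infinity; a continuity
# correction c keeps it finite.
strata = [(2, 10, 1, 10), (0, 12, 2, 12), (3, 15, 1, 15)]
c = 0.5  # placeholder correction, not the paper's minimum-Bayes-risk points

adj_log_rr = [
    math.log((x1 + c) / (n1 + c)) - math.log((x2 + c) / (n2 + c))
    for x1, n1, x2, n2 in strata
]

# Mantel-Haenszel pooled risk ratio needs no correction for zero cells.
num = sum(x1 * n2 / (n1 + n2) for x1, n1, x2, n2 in strata)
den = sum(x2 * n1 / (n1 + n2) for x1, n1, x2, n2 in strata)
mh_rr = num / den

print([round(v, 3) for v in adj_log_rr])  # finite even with the zero cell
print(round(mh_rr, 3))                    # pooled MH risk ratio
```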
WATERiD's Novel Methodology for Condition Assessment Cost Data Collection and Visualization
Authors: Stephen M. Welling, Sunil K. Sinha. Journal of Civil Engineering and Architecture, 2015, No. 4, pp. 419-428 (10 pages)
A profound understanding of the costs to perform condition assessment on buried drinking water pipeline infrastructure is required for enhanced asset management. Toward this end, an automated and uniform method of collecting cost data can provide water utilities a means for viewing, understanding, interpreting and visualizing complex geographically referenced cost information to reveal data relationships, patterns and trends. However, there has been no standard data model that allows automated data collection and interoperability across platforms. The primary objective of this research is to develop a standard cost data model for drinking water pipeline condition assessment projects and to conflate disparate datasets from differing utilities. The capabilities of this model will be further demonstrated through performing trend analyses. Field mapping files will be generated from the standard data model and demonstrated in an interactive web map created using Google Maps API (application programming interface) for JavaScript that allows the user to toggle project examples and to perform regional comparisons. The aggregation of standardized data and further use in mapping applications will help in providing timely access to condition assessment cost information and resources that will lead to enhanced asset management and resource allocation for drinking water utilities.
Keywords: drinking water, pipeline condition assessment, water pipeline, cost data
Optimal Receiver Operating Characteristic Curve of Classical Conditional Power under Normal Models
Authors: ZHANG Ying-Ying. 《应用概率统计》 (Chinese Journal of Applied Probability and Statistics), PKU Core, 2025, No. 2, pp. 277-304 (28 pages)
A Receiver Operating Characteristic (ROC) analysis of a power is important and useful in clinical trials. A Classical Conditional Power (CCP) is the probability of a classical rejection region given values of the true treatment effect and the interim result. For hypotheses and reversed hypotheses under normal models, we obtain analytical expressions for the ROC curves of the CCP, find optimal ROC curves of the CCP, investigate the superiority of the ROC curves of the CCP, calculate critical values of the False Positive Rate (FPR), the True Positive Rate (TPR), and the cutoff of the optimal CCP, and give go/no-go decisions at the interim based on the optimal CCP. In addition, extensive numerical experiments are carried out to exemplify our theoretical results. Finally, a real data example is presented to illustrate the go/no-go decisions of the optimal CCP.
Keywords: area under the curve (AUC), classical conditional power (CCP), go/no-go decisions, historical and interim data, receiver operating characteristic (ROC) curve
Graph Based Two-Phase Procedure for Phasor Data Concentrator Planning in Wide Area Measurement System of Smart Grid
Authors: Ma Hailong, Duan Tong, Yi Peng, Jiang Yiming, Zhang Jin. China Communications, 2025, No. 11, pp. 291-304 (14 pages)
The phasor data concentrator placement (PDCP) problem in wide area measurement systems (WAMS) is an optimization problem in communication network planning for the power grid. Instead of using traditional integer linear programming (ILP) based modeling and solution schemes that ignore the graph-related features of WAMS, in this work the PDCP problem is solved through a heuristic graph-based two-phase procedure (TPP): topology partitioning, and phasor data concentrator (PDC) provisioning. Based on existing minimum k-section algorithms in graph theory, a k-base topology partitioning algorithm is proposed. To improve performance, a "center-node-last" pre-partitioning algorithm is proposed to give an initial partition before the k-base partitioning algorithm is applied. Then, a PDC provisioning algorithm is proposed to locate PDCs in the decomposed sub-graphs. The proposed TPP was evaluated on five different IEEE benchmark test power systems, and the achieved overall communication performance compared to the ILP-based schemes shows the validity and efficiency of the proposed method.
Keywords: industrial Internet, minimum k-section, phasor data concentrator placement, phasor measurement unit, smart grid, topology partitioning, wide area measurement system
Identification of working conditions and prediction of NOx emissions in iron ore fines sintering process
Authors: Bao-rong Wang, Xiao-ming Li, Zhi-heng Yu, Xu-hui Lin, Yi-ze Ren, Xiang-dong Xing. Journal of Iron and Steel Research International, 2025, No. 8, pp. 2277-2285 (9 pages)
Predicting NOx in the sintering process of iron ore powder in advance is helpful for adjusting the denitrification process in time. Taking NOx in the sintering process of iron ore powder as the object, the boxplot, the empirical mode decomposition algorithm, the Pearson correlation coefficient, the maximum information coefficient, and other methods were used to preprocess the sintering data, and a naive Bayes classification algorithm was used to identify the sintering conditions. Regression prediction models with high accuracy and good stability were selected as sub-models for the different sintering conditions, and the sub-models were combined into an integrated prediction model. Based on actual operational data, the approach proved the superiority and effectiveness of the developed model in predicting NOx, yielding an accuracy of 96.17% and an absolute error of 5.56, thereby providing valuable foresight for on-site sintering operations.
Keywords: iron ore fines sintering, operating condition recognition, NOx emission, data preprocessing, integrated prediction model
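The condition-identification step here uses naive Bayes, an idea that fits in a few lines. A minimal Gaussian naive Bayes sketch on synthetic two-feature "operating conditions" (all numbers are illustrative assumptions, not sintering data):

```python
import numpy as np

# Two synthetic operating conditions with different feature means.
rng = np.random.default_rng(2)
cond_a = rng.normal([1.0, 5.0], 0.2, size=(200, 2))
cond_b = rng.normal([2.0, 3.0], 0.2, size=(200, 2))
X = np.vstack([cond_a, cond_b])
y = np.array([0] * 200 + [1] * 200)

# Fit: per-class, per-feature mean and variance (the "naive" independence).
means = np.array([X[y == k].mean(axis=0) for k in (0, 1)])
vars_ = np.array([X[y == k].var(axis=0) for k in (0, 1)])

def predict(x):
    # Log-likelihood under per-class independent Gaussians, equal priors.
    ll = -0.5 * (((x - means) ** 2) / vars_ + np.log(2 * np.pi * vars_)).sum(axis=1)
    return int(np.argmax(ll))

print(predict(np.array([1.1, 4.9])), predict(np.array([1.9, 3.1])))  # 0 1
```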
Wavelet Transform-Based Bayesian Inference Learning with Conditional Variational Autoencoder for Mitigating Injection Attack in 6G Edge Network
Authors: Binu Sudhakaran Pillai, Raghavendra Kulkarni, Venkata Satya Suresh kumar Kondeti, Surendran Rajendran. Computer Modeling in Engineering & Sciences, 2025, No. 10, pp. 1141-1166 (26 pages)
Future 6G communications will open up opportunities for innovative applications, including cyber-physical systems, edge computing, support for Industry 5.0, and digital agriculture. While automation is creating efficiencies, it can also create new cyber threats, such as vulnerabilities in trust and malicious node injection. Denial-of-Service (DoS) attacks can halt many forms of operations by overwhelming networks and systems with data noise. Current anomaly detection methods require extensive software changes and only detect static threats. Data collection is important for accuracy, but it is often a slow, tedious, and sometimes inefficient process. This paper proposes a new wavelet transform-assisted Bayesian deep learning based probabilistic (WT-BDLP) approach to mitigate malicious data injection attacks in 6G edge networks. The proposed approach combines outlier detection based on a Bayesian learning conditional variational autoencoder (Bay-LCVariAE) and traffic pattern analysis based on the continuous wavelet transform (CWT). The Bay-LCVariAE framework allows probabilistic modelling of generative features, facilitating the capture of how features of interest change over time and space and the recognition of anomalies. Similarly, the CWT emphasizes multi-resolution spectral analysis and permits temporally relevant frequency pattern recognition. Experimental testing showed that the flexibility of the Bayesian probabilistic framework offers a vast improvement in anomaly detection accuracy over existing methods, with a maximum accuracy of 98.21% in recognizing anomalies.
Keywords: Bayesian inference, learning automaton, convolutional wavelet transform, conditional variational autoencoder, malicious data injection attack, edge environment, 6G communication
A Domain-Generalization Method for Remaining Useful Life Prediction of Rolling Bearings Based on Diversity Data Generation
Authors: Song Renwang, Yao Chenghao, Shi Hui, Chen Linying. 《振动与冲击》 (Journal of Vibration and Shock), PKU Core, 2026, No. 1, pp. 290-301 (12 pages)
In remaining useful life (RUL) prediction of rolling bearings, the data available for training are scarce, and the actual operating data of bearings under unknown working conditions differ markedly in distribution from the training data, which sharply degrades the generalization performance of prediction models. To address this problem, a domain-generalization RUL prediction method for rolling bearings based on diversity data generation is studied. First, a phase-space data generation module is constructed to capture the slowly varying information in the degradation process while enhancing sample diversity. On this basis, an adversarial out-of-domain data generation module is built and trained adversarially; pseudo-domains obtained from the adversarial data generation framework further expand the diversity of available samples. A domain-difference function is added to the framework to enhance the diversity of the pseudo-domains, and manifold and semantic regularization terms are added to ensure their consistency. Finally, based on the augmented, diverse samples, a dual-channel deep convolutional regression network is built for domain-generalization training of RUL prediction. RUL prediction for rolling bearings under unknown working conditions is carried out on the PHM2012 dataset, and the experimental results verify the effectiveness and superiority of the proposed method.
Keywords: domain generalization, remaining useful life (RUL) prediction, unknown working conditions, diversity data generation, adversarial data generation
Research on Simulation Methods for Hydrogen-Blended Natural Gas Pipelines
Authors: Tian Xiaolong, Yang Luo, Wu Xiaoping, Dai Mingliang, Jiang Ming. 《北京化工大学学报(自然科学版)》 (Journal of Beijing University of Chemical Technology, Natural Science Edition), PKU Core, 2026, No. 1, pp. 40-50 (11 pages)
To date, simulation of hydrogen-blended natural gas has been unable to effectively balance two parameters, computational accuracy and simulation speed, which hinders rapid decision-making for field operation plans. To address this problem, the component tracking, hydraulic, and thermal equations were decoupled for simulation. The differences between the coupled model and the transient gas network simulation software TGNET, and between the coupled and decoupled models, were analyzed; the optimal time step and spatial step were determined; single-instruction multiple-data (SIMD) parallel computing based on the AVX2 instruction set was introduced to accelerate the solution of intermediate parameters; and a real case was computed and analyzed. The results show that the coupled model agrees closely with TGNET, with computation times of 8237 ms and 8359 ms respectively, demonstrating the consistency and compatibility of the two models. For the decoupled model, a time step of 90 s combined with a spatial step of 1 km is suitable, yielding a speedup of 2.89 over the coupled model. Compared with the maximum pressure-maximum flow boundary condition, the maximum flow-maximum flow boundary condition increases the errors in pressure, temperature, and hydrogen mole fraction; it is recommended that this boundary condition be preferred under working conditions such as network planning and design, peak-shaving operation, sudden gas demand, and supply-security redundancy. Compared with the decoupled model, introducing SIMD instructions gives a further speedup of 2.68, below the theoretical upper bound. The results provide a theoretical basis and practical reference for the simulation of large hydrogen-blended natural gas networks.
Keywords: hydrogen-blended natural gas, simulation, coupled model, decoupled model, boundary conditions, SIMD
RELIABILITY EVALUATION MODEL BASED ON DATA FUSION FOR AIRCRAFT ENGINES (Cited: 3)
Authors: Wang Huawei (王华伟), Wu Haiqiao (吴海桥). Transactions of Nanjing University of Aeronautics and Astronautics, EI, 2012, No. 4, pp. 318-324 (7 pages)
Reliability evaluation for aircraft engines is difficult because of the scarcity of failure data, but aircraft engine data are available from a variety of sources. Data fusion serves to maximize the amount of valuable information extracted from disparate data sources to obtain comprehensive reliability knowledge. Considering the degradation failure and the catastrophic failure simultaneously, which are competing risks that both affect reliability, a reliability evaluation model based on data fusion for aircraft engines is developed. Owing to these characteristics, reliability evaluation with the proposed model is more feasible than evaluation using failure data alone, and more accurate than evaluation considering a single failure mode. An example shows the effectiveness of the proposed model.
Keywords: aircraft engine, reliability evaluation, data fusion, competing failure, condition monitoring
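The competing-risks structure described here, degradation and catastrophic failure acting simultaneously, implies (under an independence assumption) that system reliability is the product of the per-mode reliabilities. A sketch with assumed exponential and Weibull failure modes (all rates and shape parameters are illustrative, not from engine data):

```python
import math

# The engine survives to time t only if neither failure mode has occurred.
def r_catastrophic(t, rate=0.002):
    return math.exp(-rate * t)                # exponential shock failures

def r_degradation(t, scale=800.0, shape=2.5):
    return math.exp(-((t / scale) ** shape))  # Weibull wear-out

def r_system(t):
    # Independent competing risks: reliabilities multiply.
    return r_catastrophic(t) * r_degradation(t)

for t in (100.0, 500.0, 1000.0):
    print(t, round(r_system(t), 4))
```

A fusion model like the paper's would estimate each mode's parameters from its own data sources (condition monitoring for degradation, failure records for shocks) before combining them.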