Journal Articles
3,626 articles found
1. Real-time rock mass condition prediction with TBM tunneling big data using a novel rock-machine mutual feedback perception method (Cited: 24)
Authors: Zhijun Wu, Rulei Wei, Zhaofei Chu, Quansheng Liu
Journal: Journal of Rock Mechanics and Geotechnical Engineering, SCIE CSCD, 2021, Issue 6, pp. 1311-1325 (15 pages)
Abstract: Real-time perception of rock mass information is of great importance to efficient tunneling and hazard prevention in tunnel boring machines (TBMs). In this study, a TBM-rock mutual feedback perception method based on data mining (DM) is proposed, which takes 10 tunneling parameters related to surrounding rock conditions as input features. For implementation, a database of TBM tunneling parameters was first established, accommodating 10,807 tunneling cycles from the Songhua River water conveyance tunnel. Then, the spectral clustering (SC) algorithm based on graph theory was introduced to cluster the TBM tunneling data. According to the clustering results and a rock mass boreability index, the rock mass conditions were classified into four classes, and reasonable distribution intervals of the main tunneling parameters for each class were presented. Meanwhile, a real-time prediction model for the different rock conditions was established based on a deep neural network (DNN). Finally, the rationality and adaptability of the proposed method were validated by analyzing the tunneling specific energy, feature importance, and training dataset size. The proposed TBM-rock mutual feedback perception method enables automatic identification of rock mass conditions and dynamic adjustment of tunneling parameters during TBM driving. Furthermore, in terms of prediction performance, the method predicts the rock mass conditions ahead of the tunnel face in real time more accurately than traditional machine learning methods.
Keywords: tunnel boring machine (TBM); data mining (DM); spectral clustering (SC); deep neural network (DNN); rock mass condition perception
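The graph-theoretic clustering step this abstract describes can be sketched in a few lines of NumPy. The data below are synthetic stand-ins for tunneling-cycle features (not the Songhua River dataset), and the RBF bandwidth and two-regime setup are illustration-only assumptions:

```python
import numpy as np

# Spectral bisection sketch: RBF affinity -> normalized Laplacian ->
# Fiedler vector -> sign split. Two synthetic "operating regimes"
# stand in for tunneling-cycle feature vectors.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, size=(20, 2)),
               rng.normal(3.0, 0.3, size=(20, 2))])

# RBF affinity matrix (bandwidth 0.5 is an arbitrary choice).
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-d2 / (2 * 0.5 ** 2))
np.fill_diagonal(W, 0.0)

# Symmetric normalized Laplacian L = I - D^{-1/2} W D^{-1/2}.
d = W.sum(1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L = np.eye(len(X)) - D_inv_sqrt @ W @ D_inv_sqrt

# Eigenvector of the second-smallest eigenvalue (Fiedler vector);
# its sign pattern separates the two regimes.
vals, vecs = np.linalg.eigh(L)
labels = (vecs[:, 1] > 0).astype(int)
print(labels)
```

Extending this to the paper's four rock classes would mean running k-means on the top few eigenvectors rather than taking a single sign split.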
2. Effect of non-vacuum storage condition on minimum explosible concentration of aluminum dust (Cited: 3)
Authors: ZHANG Shi-an, ZHANG Lu-gang
Journal: Journal of Measurement Science and Instrumentation, CAS CSCD, 2019, Issue 3, pp. 219-222 (4 pages)
Abstract: Non-vacuum storage conditions have a great impact on the explosion characteristics of aluminum powders. In this paper, vacuum-packed flake and globular aluminum powders stored in a dryer after opening the vacuum package are selected as the experimental samples, and a 20 L spherical explosion device is used to test the minimum explosible concentration (MEC) of the aluminum dusts for different storage times. The results show that the MEC of both types of unoxidized aluminum powder is 30 g/m^3. The MECs of the flake and globular aluminum powders first rise with increasing storage time in the dryer, reach maximum values of 50 g/m^3 and 60 g/m^3 at their respective storage times, and finally stabilize. The main reason is that the oxidation rate is faster owing to the larger specific surface area of the globular aluminum powder; hence, storage time has a more significant effect on the MEC of the globular powder than on that of the flake powder. After a period of time, the outer surface is oxidized to form a film that prevents further oxidation of the aluminum powder, resulting in the eventual stabilization of the MEC.
Keywords: storage condition; oxidation rate; aluminum powders; minimum explosible concentration (MEC)
3. A Preliminary Study on Combining Two Kinds of Proxy Data Using the Conditional Quantile Adjustment Method (Cited: 1)
Authors: Wu Xiangding, Liu Hongbin (Institute of Geography, CAS, Beijing 100101, People's Republic of China), Pan Yimin (Institute of Applied Mathematics, CAS, Beijing 100080, People's Republic of China)
Journal: Journal of Geographical Sciences, SCIE CSCD, 1995, Issue 1, pp. 52-62 (11 pages)
Abstract: Based on two kinds of proxy data, a tree-ring width chronology at Huashan and the wetness/dryness grade series around Xi'an in north-central China, the present study demonstrates how different types of proxy climate records can be combined to give a more reliable estimate of past climate than either record can provide individually. With comparison and correction of the two data sets, various statistical models can be developed from the individual and combined series. Among them, the best combined model, produced by the conditional quantile adjustment method, can be selected for reconstruction of April-July rainfall at Huashan back to 1600 A.D.
Keywords: conditional quantile; climate; proxy data
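The general idea of adjusting one proxy series toward another by matching quantiles can be illustrated with plain quantile mapping. This is only a sketch of the family of techniques the abstract refers to, not the authors' exact conditional quantile adjustment; the two synthetic series and the 99-quantile grid are assumptions:

```python
import numpy as np

# Quantile mapping: send each value of `src` to the value of `ref`
# at the same empirical quantile, via a piecewise-linear transfer
# function between the two quantile sets.
rng = np.random.default_rng(1)
ref = rng.normal(100.0, 25.0, 500)   # stand-in: wetness/dryness-derived rainfall
src = rng.gamma(2.0, 30.0, 500)      # stand-in: tree-ring-derived estimate

qs = np.linspace(0.01, 0.99, 99)
src_q = np.quantile(src, qs)
ref_q = np.quantile(ref, qs)

# np.interp clamps values beyond the outer quantiles to the endpoints.
adjusted = np.interp(src, src_q, ref_q)

print(round(float(np.median(adjusted)), 1), round(float(np.median(ref)), 1))
```

After mapping, the adjusted series shares the reference distribution's central tendency, which is what makes the two records directly combinable.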
4. Effective data sampling strategies and boundary condition constraints of physics-informed neural networks for identifying material properties in solid mechanics (Cited: 4)
Authors: W. WU, M. DANEKER, M. A. JOLLEY, K. T. TURNER, L. LU
Journal: Applied Mathematics and Mechanics (English Edition), SCIE EI CSCD, 2023, Issue 7, pp. 1039-1068 (30 pages)
Abstract: Material identification is critical for understanding the relationship between mechanical properties and the associated mechanical functions. However, material identification is a challenging task, especially when the characteristic of the material is highly nonlinear in nature, as is common in biological tissue. In this work, we identify unknown material properties in continuum solid mechanics via physics-informed neural networks (PINNs). To improve the accuracy and efficiency of PINNs, we develop efficient strategies to nonuniformly sample observational data. We also investigate different approaches to enforce Dirichlet-type boundary conditions (BCs) as soft or hard constraints. Finally, we apply the proposed methods to a diverse set of time-dependent and time-independent solid mechanics examples that span linear elastic and hyperelastic material space. The estimated material parameters achieve relative errors of less than 1%. As such, this work is relevant to diverse applications, including optimizing structural integrity and developing novel materials.
Keywords: solid mechanics; material identification; physics-informed neural network (PINN); data sampling; boundary condition (BC) constraint
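The "hard constraint" option for Dirichlet BCs mentioned in the abstract is usually an output transform: compose the network so the BC holds by construction instead of penalizing violations in the loss. A minimal sketch, with an arbitrary function standing in for the network:

```python
import numpy as np

# Hard Dirichlet BC via an output transform:
#   u(x) = g(x) + dist(x) * N(x),
# where g satisfies the BC and dist vanishes on the boundary,
# so u meets the BC exactly no matter what N outputs.

def N(x):                      # placeholder for a trained network
    return np.sin(3 * x) + 2.0

def u(x, g0=5.0):              # enforce u(0) = g0 at the boundary x = 0
    return g0 + x * N(x)       # dist(x) = x vanishes at x = 0

x = np.linspace(0.0, 1.0, 11)
print(u(x)[0])                 # exactly 5.0, regardless of N
```

A soft constraint would instead add a term like `w * (u(0) - g0)**2` to the training loss, which only satisfies the BC approximately; the hard form removes that error source entirely.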
5. On the 4D Variational Data Assimilation with Constraint Conditions (Cited: 1)
Authors: 朱克云
Journal: Advances in Atmospheric Sciences, SCIE CAS CSCD, 2001, Issue 6, pp. 1131-1145 (15 pages)
Abstract: An investigation is carried out on the problem involved in 4D variational data assimilation (VDA) with constraint conditions, based on a finite-element shallow-water equation model. In the investigation, the adjoint technique, the penalty method, and the augmented Lagrangian method from the constrained optimization field are used to minimize the defined constrained objective functions. The results of the numerical experiments show that the optimal solutions are obtained when the functions reach their minima. VDA with constraint conditions controlling the growth of gravity oscillations is efficient at eliminating perturbations and produces an optimal initial field. It seems that this method can also be applied to problems in numerical weather prediction. This research is supported by the National Natural Science Foundation of China (Grant No. 49575269) and by the National Key Basic Research on the Formation Mechanism and Prediction Theory of Severe Synoptic Disasters (Grant No. G1998040910).
Keywords: variational data assimilation; constraint conditions; penalty methods; finite-element model
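The penalty method named in the abstract can be shown on a toy problem: add a weighted penalty for constraint violation to the objective and drive the weight upward. The quadratic objective and constraint below are stand-ins for the 4D-VAR cost function, not the paper's model:

```python
# Penalty method sketch: minimize f(x) = (x - 2)^2 subject to x <= 1
# by minimizing f(x) + mu * max(0, x - 1)^2 for increasing mu.

def grad(x, mu):
    g = 2.0 * (x - 2.0)                      # gradient of f
    if x > 1.0:
        g += 2.0 * mu * (x - 1.0)            # gradient of the penalty term
    return g

x = 0.0
for mu in (1.0, 10.0, 100.0, 1000.0):        # penalty continuation
    for _ in range(2000):                    # plain gradient descent
        x -= 0.4 / (1.0 + mu) * grad(x, mu)  # step shrinks as mu grows
    # each stage's unconstrained minimizer is x* = (2 + mu) / (1 + mu)

print(round(x, 3))                           # approaches the constrained optimum 1.0
```

As mu grows, the minimizer of the penalized objective converges to the constrained optimum; the augmented Lagrangian method the paper also uses avoids having to push mu to infinity.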
6. Quantitative Integration of High-Resolution Hydrogeophysical Data: A Novel Approach to Monte-Carlo-Type Conditional Stochastic Simulations and Implications for Hydrological Predictions
Authors: Baptiste Dafflon, James Irving, Klaus Holliger
Journal: Journal of China University of Geosciences, SCIE CSCD, 2009, Issue 3, pp. 580-591 (12 pages)
Abstract: Geophysical techniques can help to bridge the inherent gap that exists with regard to spatial resolution and coverage for classical hydrological methods. This has led to the emergence of a new and rapidly growing research domain generally referred to as hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters, their inherent trade-off between resolution and range, as well as the notoriously site-specific nature of petrophysical parameter relations, the fundamental usefulness of multi-method surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database into a unified model of the probed subsurface region that is consistent with all available measurements. To this end, we present a novel approach to hydrogeophysical data integration based on a Monte-Carlo-type conditional stochastic simulation method that we consider to be particularly suitable for high-resolution local-scale studies. Monte Carlo techniques are flexible and versatile, allowing a wide variety of data and constraints of differing resolution and hardness to be accounted for, and thus have the potential of providing, in a geostatistical sense, realistic models of the pertinent target parameter distributions. Compared to more conventional approaches, such as co-kriging or cluster analysis, our approach provides significant advancements in the way that larger-scale structural information contained in the hydrogeophysical data can be accounted for.
After outlining the methodological background of our algorithm, we present the results of its application to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the detailed local-scale porosity structure. Our procedure is first tested on pertinent synthetic data and then applied to a field dataset collected at the Boise Hydrogeophysical Research Site. Finally, we compare the performance of our data integration approach to that of more conventional methods with regard to the prediction of flow and transport phenomena in highly heterogeneous media and discuss the implications arising.
Keywords: aquifer characterization; conditional simulation; geostatistics; hydrogeophysics; hydrology; quantitative data integration; Monte Carlo methods; simulated annealing; flow and transport modeling
7. A Geometric Approach to Conditioning and the Search for Minimum Variance Unbiased Estimators
Authors: James E. Marengo, David L. Farnsworth
Journal: Open Journal of Statistics, 2021, Issue 3, pp. 437-442 (6 pages)
Abstract: Our purpose is twofold: to present a prototypical example of the conditioning technique to obtain the best estimator of a parameter, and to show that this technique resides in the structure of an inner product space. The technique uses conditioning of an unbiased estimator on a sufficient statistic. This procedure is founded upon the conditional variance formula, which leads to an inner product space and a geometric interpretation. The example clearly illustrates the dependence on the sampling methodology. These advantages show the power and centrality of this process.
Keywords: conditional variance formula; conditioning; geometric representation; minimum variance estimator; Rao-Blackwell theorem; sufficient statistic; unbiased estimator
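The conditioning technique the abstract describes (Rao-Blackwellization) is easy to see numerically. A hedged sketch with a Bernoulli example chosen for illustration, not taken from the paper:

```python
import numpy as np

# Rao-Blackwell demo: start from a crude unbiased estimator of a
# Bernoulli mean p, condition on the sufficient statistic (the sample
# sum), and the variance can only decrease while unbiasedness is kept.
rng = np.random.default_rng(2)
p, n, reps = 0.3, 10, 20000
samples = rng.binomial(1, p, size=(reps, n))

crude = samples[:, 0].astype(float)   # X1 alone: unbiased but noisy
# E[X1 | sum] = sum / n, i.e. the sample mean: the Rao-Blackwellized
# estimator obtained by conditioning on the sufficient statistic.
rb = samples.mean(axis=1)

print(crude.var(), rb.var())          # conditioning shrinks the variance
```

The conditional variance formula Var(X1) = E[Var(X1|S)] + Var(E[X1|S]) is exactly the Pythagorean identity of the inner product space the paper builds, which is why the improvement is geometric in nature.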
8. Some Additional Moment Conditions for a Dynamic Count Panel Data Model with Predetermined Explanatory Variables
Authors: Yoshitsugu Kitazawa
Journal: Open Journal of Statistics, 2013, Issue 5, pp. 319-333 (15 pages)
Abstract: This paper proposes some additional moment conditions for the linear feedback model with predetermined explanatory variables, which was proposed by [1] for dealing with count panel data. The newly proposed moment conditions include those associated with equidispersion, the Negbin I-type model, and stationarity. GMM estimators are constructed incorporating the additional moment conditions. Some Monte Carlo experiments indicate that the GMM estimators incorporating the additional moment conditions perform well, compared to those using only the conventional moment conditions proposed by [2,3].
Keywords: count panel data; linear feedback model; moment conditions; GMM; Monte Carlo experiments
9. Zero Truncated Bivariate Poisson Model: Marginal-Conditional Modeling Approach with an Application to Traffic Accident Data
Authors: Rafiqul I. Chowdhury, M. Ataharul Islam
Journal: Applied Mathematics, 2016, Issue 14, pp. 1589-1598 (10 pages)
Abstract: A new covariate-dependent zero-truncated bivariate Poisson model is proposed in this paper, employing a generalized linear model. A marginal-conditional approach is used to specify the bivariate model. The proposed model, together with its estimation procedure and tests for goodness-of-fit and under- (or over-) dispersion, is presented and applied to road safety data. The two correlated outcome variables considered in this study are the number of cars involved in an accident and the number of casualties for a given number of cars.
Keywords: bivariate Poisson; conditional model; generalized linear model; marginal model; road safety data; zero-truncated
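The zero truncation at the heart of this model is just a renormalization of the Poisson pmf over k >= 1. A minimal sketch of the truncated marginal (the marginal-conditional pairing with a second count is not reproduced here):

```python
import math

# Zero-truncated Poisson pmf:
#   P(K = k) = e^{-lam} lam^k / (k! (1 - e^{-lam})),  k = 1, 2, ...
# The 1 - e^{-lam} factor redistributes the mass that a plain Poisson
# would put on k = 0 (an accident record always involves >= 1 car).

def zt_poisson_pmf(k, lam):
    if k < 1:
        return 0.0
    return math.exp(-lam) * lam**k / (math.factorial(k) * (1.0 - math.exp(-lam)))

lam = 2.0
total = sum(zt_poisson_pmf(k, lam) for k in range(1, 60))
mean = sum(k * zt_poisson_pmf(k, lam) for k in range(1, 60))

print(round(total, 6), round(mean, 4))
# the mean of a zero-truncated Poisson is lam / (1 - e^{-lam})
```

In the covariate-dependent version, lam would be linked to regressors through a log link, as in a standard Poisson GLM.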
10. A Novel Computationally Efficient Approach to Identify Visually Interpretable Medical Conditions from 2D Skeletal Data
Authors: Praveen Jesudhas, T. Raghuveera
Journal: Computer Systems Science & Engineering, SCIE EI, 2023, Issue 9, pp. 2995-3015 (21 pages)
Abstract: Timely identification and treatment of medical conditions could facilitate faster recovery and better health. Existing systems address this issue using custom-built sensors, which are invasive and difficult to generalize. A low-complexity, scalable process is proposed to detect and identify medical conditions from 2D skeletal movements in video feed data. A minimal set of features relevant to distinguishing medical conditions, AMF, PVF, and GDF, is derived from skeletal data on sampled frames across the entire action. The AMF (angular motion features) capture the angular motion of limbs during a specific action. The relative position of joints is represented by the PVF (positional variation features). The GDF (global displacement features) identify the direction of overall skeletal movement. The discriminative capability of these features is illustrated by their variance across time for different actions. The classification of medical conditions is approached in two stages. In the first stage, a low-complexity binary LSTM classifier is trained to distinguish visual medical conditions from general human actions. In stage 2, a multi-class LSTM classifier is trained to identify the exact medical condition from a given set of visually interpretable medical conditions. The proposed features are extracted from the 2D skeletal data of NTU RGB+D and then used to train the binary and multi-class LSTM classifiers. The binary and multi-class classifiers achieved average F1 scores of 77% and 73%, respectively, while the overall system produced an average F1 score of 69% and a weighted average F1 score of 80%. The multi-class classifier is found to use 10 to 100 times fewer parameters than existing 2D CNN-based models while producing similar levels of accuracy.
Keywords: action recognition; 2D skeletal data; medical condition; computer vision; deep learning
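An angular-motion-style feature from 2D joints can be sketched with `arctan2`. The joint names and coordinates below are made up for illustration, and the paper's exact AMF definition may differ:

```python
import numpy as np

# Angle at a joint from three 2D keypoints, e.g. the elbow angle from
# shoulder/elbow/wrist. Tracking this angle across sampled frames gives
# an angular-motion feature for the limb.

def joint_angle(a, b, c):
    """Angle at joint b (radians) formed by segments b->a and b->c."""
    b = np.asarray(b, dtype=float)
    v1, v2 = np.asarray(a, dtype=float) - b, np.asarray(c, dtype=float) - b
    ang = np.arctan2(v2[1], v2[0]) - np.arctan2(v1[1], v1[0])
    return abs((ang + np.pi) % (2 * np.pi) - np.pi)   # wrap into [0, pi]

shoulder, elbow, wrist = (0.0, 1.0), (0.0, 0.0), (1.0, 0.0)
print(round(float(np.degrees(joint_angle(shoulder, elbow, wrist))), 1))  # 90.0
```

Per-frame angles like this, stacked over time, are the kind of low-dimensional sequence a small LSTM can classify far more cheaply than a 2D CNN over raw frames.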
11. Censored Composite Conditional Quantile Screening for High-Dimensional Survival Data
Authors: LIU Wei, LI Yingqiu
Journal: Chinese Journal of Applied Probability and Statistics (应用概率统计), CSCD, Peking University Core Journal, 2024, Issue 5, pp. 783-799 (17 pages)
Abstract: In this paper, we introduce the censored composite conditional quantile coefficient (cCCQC) to rank the relative importance of each predictor in high-dimensional censored regression. The cCCQC takes advantage of all useful information across quantiles and can effectively detect nonlinear effects, including interactions and heterogeneity. Furthermore, the proposed screening method based on the cCCQC is robust to the existence of outliers and enjoys the sure screening property. Simulation results demonstrate that the proposed method performs competitively on survival datasets with high-dimensional predictors, particularly when the variables are highly correlated.
Keywords: high-dimensional survival data; censored composite conditional quantile coefficient; sure screening property; rank consistency property
12. Minimum MSE Weighted Estimator to Make Inferences for a Common Risk Ratio across Sparse Meta-Analysis Data
Authors: Chukiat Viwatwongkasem, Sutthisak Srisawad, Pichitpong Soontornpipit, Jutatip Sillabutra, Pratana Satitvipawee, Prasong Kitidamrongsuk, Hathaikan Chootrakool
Journal: Open Journal of Statistics, 2022, Issue 1, pp. 49-69 (21 pages)
Abstract: The paper discusses three issues of statistical inference for a common risk ratio (RR) in sparse meta-analysis data. First, the conventional log-risk ratio estimator encounters a number of problems when the number of events in the experimental or control group is zero in a sparse 2 × 2 table. An adjusted log-risk ratio estimator with continuity correction points, based upon the minimum Bayes risk with respect to the uniform prior density over (0, 1) and the Euclidean loss function, is proposed. Second, the optimal weights of the pooled estimate that minimize its mean square error (MSE) subject to a constraint on the weights are derived. Finally, the performance of this minimum MSE weighted estimator, adjusted with various correction points, is compared with other popular estimators, such as the Mantel-Haenszel (MH) estimator and the weighted least squares (WLS) estimator (equivalently known as the inverse-variance weighted estimator), in terms of point estimation and hypothesis testing via simulation studies. The estimation results illustrate that regardless of the true value of RR, the MH estimator achieves the best performance with the smallest MSE when the study size is rather large and the sample sizes within each study are small. The MSEs of the WLS estimator and the proposed weighted estimator are close together and are the best when the sample sizes are moderate to large while the study size is rather small.
Keywords: minimum MSE weights; adjusted log-risk ratio estimator; sparse meta-analysis data; continuity correction
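The abstract's specific correction points did not survive extraction, so as a generic illustration the sketch below uses the common 0.5 (Haldane-Anscombe) continuity correction for the log risk ratio of a sparse 2 × 2 table. This is not necessarily the paper's Bayes-risk-optimal choice:

```python
import math

# Adjusted log risk ratio for a 2x2 table with correction c added to
# each cell, making the estimator finite even with a zero-event arm.
# c = 0.5 is the conventional Haldane-Anscombe choice, used here only
# as a stand-in for the paper's optimal correction points.

def log_rr_adjusted(a, n1, b, n2, c=0.5):
    """log[(a+c)/(n1+2c)] - log[(b+c)/(n2+2c)]."""
    p1 = (a + c) / (n1 + 2 * c)
    p2 = (b + c) / (n2 + 2 * c)
    return math.log(p1 / p2)

# Zero events in the control arm: the raw log RR is undefined,
# while the corrected version stays finite.
print(round(log_rr_adjusted(4, 20, 0, 20), 3))
```

A pooled estimate would then combine such per-study log RRs with weights, which is where the paper's minimum-MSE weighting enters.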
13. WATERiD's Novel Methodology for Condition Assessment Cost Data Collection and Visualization
Authors: Stephen M. Welling, Sunil K. Sinha
Journal: Journal of Civil Engineering and Architecture, 2015, Issue 4, pp. 419-428 (10 pages)
Abstract: A profound understanding of the costs of performing condition assessment on buried drinking water pipeline infrastructure is required for enhanced asset management. Toward this end, an automated and uniform method of collecting cost data can provide water utilities a means of viewing, understanding, interpreting, and visualizing complex geographically referenced cost information to reveal data relationships, patterns, and trends. However, there has been no standard data model that allows automated data collection and interoperability across platforms. The primary objective of this research is to develop a standard cost data model for drinking water pipeline condition assessment projects and to conflate disparate datasets from differing utilities. The capabilities of this model are further demonstrated through trend analyses. Field mapping files are generated from the standard data model and demonstrated in an interactive web map created using the Google Maps API (application programming interface) for JavaScript, which allows the user to toggle project examples and perform regional comparisons. The aggregation of standardized data and its further use in mapping applications will help provide timely access to condition assessment cost information and resources, leading to enhanced asset management and resource allocation for drinking water utilities.
Keywords: drinking water pipeline; condition assessment; water pipeline cost data
14. Optimal Receiver Operating Characteristic Curve of Classical Conditional Power under Normal Models
Authors: ZHANG Ying-Ying
Journal: Chinese Journal of Applied Probability and Statistics (应用概率统计), Peking University Core Journal, 2025, Issue 2, pp. 277-304 (28 pages)
Abstract: A Receiver Operating Characteristic (ROC) analysis of a power is important and useful in clinical trials. A Classical Conditional Power (CCP) is the probability of a classical rejection region given values of the true treatment effect and the interim result. For hypotheses and reversed hypotheses under normal models, we obtain analytical expressions for the ROC curves of the CCP, find optimal ROC curves of the CCP, investigate the superiority of the ROC curves of the CCP, calculate critical values of the False Positive Rate (FPR), True Positive Rate (TPR), and cutoff of the optimal CCP, and give go/no-go decisions at the interim based on the optimal CCP. In addition, extensive numerical experiments are carried out to exemplify our theoretical results. Finally, a real data example illustrates the go/no-go decisions of the optimal CCP.
Keywords: area under the curve (AUC); classical conditional power (CCP); go/no-go decisions; historical and interim data; receiver operating characteristic (ROC) curve
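A conditional power of the kind defined in the abstract can be approximated by simulation: fix the interim result, simulate the remaining data under an assumed true effect, and count final rejections. This generic one-sample z-test setup (known sigma = 1, one-sided alpha = 0.025) is an illustration only, not the paper's analytical ROC construction:

```python
import numpy as np

# Conditional power by simulation: probability that the final z-test
# rejects, given the interim mean from n1 of n observations, under an
# assumed true effect theta.

def conditional_power(interim_mean, n1, n, theta, reps=100_000):
    rng = np.random.default_rng(3)
    z_crit = 1.959963984540054                      # Phi^{-1}(0.975)
    n2 = n - n1
    # simulate the mean of the remaining n2 observations under theta
    rest = rng.normal(theta, 1.0, size=(reps, n2)).mean(axis=1)
    final_mean = (n1 * interim_mean + n2 * rest) / n
    z = final_mean * np.sqrt(n)
    return float((z > z_crit).mean())

cp_null = conditional_power(0.5, 50, 100, theta=0.0)
cp_alt = conditional_power(0.5, 50, 100, theta=0.5)
print(cp_null, cp_alt)          # a larger assumed effect gives larger CP
```

Sweeping a cutoff on this CP while labeling trials by the true effect is what traces out the ROC curves the paper analyzes in closed form.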
15. Graph Based Two-Phase Procedure for Phasor Data Concentrator Planning in Wide Area Measurement System of Smart Grid
Authors: Ma Hailong, Duan Tong, Yi Peng, Jiang Yiming, Zhang Jin
Journal: China Communications, 2025, Issue 11, pp. 291-304 (14 pages)
Abstract: Phasor data concentrator placement (PDCP) in wide area measurement systems (WAMS) is an optimization problem in communication network planning for the power grid. Instead of using the traditional integer linear programming (ILP) based modeling and solution schemes that ignore the graph-related features of WAMS, in this work the PDCP problem is solved through a heuristic graph-based two-phase procedure (TPP): topology partitioning, and phasor data concentrator (PDC) provisioning. Based on existing minimum k-section algorithms in graph theory, a k-base topology partitioning algorithm is proposed. To improve performance, a "center-node-last" pre-partitioning algorithm is proposed to give an initial partition before the k-base partitioning algorithm is applied. Then, a PDC provisioning algorithm is proposed to locate PDCs within the decomposed sub-graphs. The proposed TPP was evaluated on five different IEEE benchmark test power systems, and the overall communication performance achieved relative to the ILP-based schemes shows the validity and efficiency of the proposed method.
Keywords: industrial Internet; minimum k-section; phasor data concentrator placement; phasor measurement unit; smart grid; topology partitioning; wide area measurement system
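The topology-partitioning phase rests on minimum k-section: split the graph into equal-size parts while cutting as few edges as possible. A toy exact version for k = 2 on a six-node graph, by brute force (the paper uses heuristics precisely because exhaustive search is exponential):

```python
from itertools import combinations

# Exact minimum bisection of a tiny graph: two dense triangles joined
# by one bridge edge. The optimal balanced split cuts only the bridge.
edges = {(0, 1), (0, 2), (1, 2),      # triangle A
         (3, 4), (3, 5), (4, 5),      # triangle B
         (2, 3)}                      # bridge

nodes = range(6)
best_cut, best_part = None, None
for half in combinations(nodes, 3):   # all balanced 3/3 splits
    side = set(half)
    cut = sum((u in side) != (v in side) for u, v in edges)
    if best_cut is None or cut < best_cut:
        best_cut, best_part = cut, side

print(best_cut, sorted(best_part))    # the bridge is the only cut edge
```

In the PDCP setting, each sub-graph then receives its own PDC, so fewer cut edges translate directly into less cross-partition measurement traffic.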
16. Identification of working conditions and prediction of NO_(x) emissions in iron ore fines sintering process
Authors: Bao-rong Wang, Xiao-ming Li, Zhi-heng Yu, Xu-hui Lin, Yi-ze Ren, Xiang-dong Xing
Journal: Journal of Iron and Steel Research International, 2025, Issue 8, pp. 2277-2285 (9 pages)
Abstract: Predicting NO_(x) in the sintering process of iron ore powder in advance is helpful for adjusting the denitrification process in time. Taking NO_(x) in the sintering process of iron ore powder as the object, methods including boxplots, the empirical mode decomposition algorithm, the Pearson correlation coefficient, and the maximum information coefficient were used to preprocess the sintering data, and a naive Bayes classification algorithm was used to identify the sintering conditions. Regression prediction models with high accuracy and good stability were selected as sub-models for the different sintering conditions, and the sub-models were combined into an integrated prediction model. Based on actual operational data, the approach proved superior and effective in predicting NO_(x), yielding an accuracy of 96.17% and an absolute error of 5.56, thereby providing valuable foresight for on-site sintering operations.
Keywords: iron ore fines sintering; operating condition recognition; NO_(x) emission; data preprocessing; integrated prediction model
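The condition-recognition step named in the abstract, naive Bayes classification, can be sketched by hand in NumPy. The two synthetic "sintering conditions" and their feature means below are made-up illustrations, not the paper's preprocessed plant data:

```python
import numpy as np

# Hand-rolled Gaussian naive Bayes: per-class feature means/variances
# plus class priors are all the "training" the model needs.
rng = np.random.default_rng(4)
X0 = rng.normal([1.0, 5.0], 0.5, size=(100, 2))   # condition 0 features
X1 = rng.normal([3.0, 2.0], 0.5, size=(100, 2))   # condition 1 features
X, y = np.vstack([X0, X1]), np.array([0] * 100 + [1] * 100)

stats = {c: (X[y == c].mean(0), X[y == c].var(0) + 1e-9, (y == c).mean())
         for c in (0, 1)}

def predict(x):
    def log_post(c):
        mu, var, prior = stats[c]
        # independent-Gaussian log-likelihood plus log prior
        ll = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        return ll + np.log(prior)
    return max((0, 1), key=log_post)

acc = float(np.mean([predict(x) == yi for x, yi in zip(X, y)]))
print(acc)   # near-perfect on this easy synthetic problem
```

Routing each identified condition to its own regression sub-model, as the paper does, is then a dictionary lookup keyed on the predicted class.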
17. Wavelet Transform-Based Bayesian Inference Learning with Conditional Variational Autoencoder for Mitigating Injection Attack in 6G Edge Network
Authors: Binu Sudhakaran Pillai, Raghavendra Kulkarni, Venkata Satya Suresh kumar Kondeti, Surendran Rajendran
Journal: Computer Modeling in Engineering & Sciences, 2025, Issue 10, pp. 1141-1166 (26 pages)
Abstract: Future 6G communications will open up opportunities for innovative applications, including cyber-physical systems, edge computing, support for Industry 5.0, and digital agriculture. While automation creates efficiencies, it can also create new cyber threats, such as vulnerabilities in trust and malicious node injection. Denial-of-Service (DoS) attacks can halt many forms of operations by overwhelming networks and systems with data noise. Current anomaly detection methods require extensive software changes and only detect static threats. Data collection is important for accuracy, but it is often a slow, tedious, and sometimes inefficient process. This paper proposes a new wavelet transform-assisted Bayesian deep learning based probabilistic (WT-BDLP) approach to mitigate malicious data injection attacks in 6G edge networks. The proposed approach combines outlier detection based on a Bayesian learning conditional variational autoencoder (Bay-LCVariAE) and traffic pattern analysis based on the continuous wavelet transform (CWT). The Bay-LCVariAE framework allows probabilistic modeling of generative features, capturing how features of interest change over time and space and enabling the recognition of anomalies. Similarly, the CWT supports multi-resolution spectral analysis and permits temporally relevant frequency pattern recognition. Experimental testing showed that the flexibility of the Bayesian probabilistic framework offers a large improvement in anomaly detection accuracy over existing methods, with a maximum accuracy of 98.21% in recognizing anomalies.
Keywords: Bayesian inference learning automaton; convolutional wavelet transform; conditional variational autoencoder; malicious data injection attack; edge environment; 6G communication
18. Farmland Soil Health Evaluation of the Yinchuan Plain Based on a Minimum Data Set
Authors: 吴霞, 蔡进军, 王长军, 郭鑫年, 李维倩, 陈刚
Journal: Soils (土壤), Peking University Core Journal, 2026, Issue 1, pp. 225-234 (10 pages)
Abstract: Soil health evaluation must account for regional natural endowments and environmental characteristics, and evaluation indicators should be selected according to the features of each region. To ensure that the selected indicators represent the key attributes and functions of farmland soils in the Yinchuan Plain, this study collected 147 farmland soil samples across the Yinchuan Plain and measured 16 physical, chemical, and biological indicators. Principal component analysis was used to screen for and construct a minimum data set (MDS) for farmland soil health evaluation, and the soil health index method was applied to assess the health status of farmland soils in the Yinchuan Plain. The results show that the MDS comprises eight indicators: soil water content, water-stable macroaggregates, pH, total salt, total nitrogen, available phosphorus, organic matter, and microbial biomass carbon. The health index calculated from the MDS by the weighted comprehensive method fit the total-data-set health index well in a linear regression (R^(2) = 0.846, P < 0.001), indicating that the constructed MDS can represent the total data set for farmland soil health evaluation in the Yinchuan Plain. The MDS-based soil health index ranged from 0.18 to 0.78 with a mean of 0.52; spatially, it was generally higher in the south and lower in the north, with uneven local distribution and substantial spatial heterogeneity. These results provide a theoretical basis for farmland soil health evaluation and green agricultural production in the Yinchuan Plain.
Keywords: minimum data set; principal component analysis; farmland soil; health evaluation; Yinchuan Plain
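The PCA-based screening behind a minimum data set can be sketched as: standardize the indicator matrix, take the leading principal components, and keep the indicator with the largest absolute loading on each. The four synthetic indicators and two latent "soil factors" below are illustration-only; the study's 16 measured indicators and its grouping rules are not reproduced:

```python
import numpy as np

# Minimum-data-set selection sketch: one representative indicator per
# retained principal component.
rng = np.random.default_rng(5)
n = 147                                   # same sample count as the study
latent = rng.normal(size=(n, 2))          # two underlying soil factors
noise = 0.1 * rng.normal(size=(n, 4))
# indicators 0 and 1 track factor A; indicators 2 and 3 track factor B
X = np.column_stack([latent[:, 0], 0.9 * latent[:, 0],
                     latent[:, 1], 0.8 * latent[:, 1]]) + noise

Z = (X - X.mean(0)) / X.std(0)            # standardize
evals, evecs = np.linalg.eigh(np.cov(Z.T))
order = np.argsort(evals)[::-1]           # components by explained variance
selected = {int(np.abs(evecs[:, k]).argmax()) for k in order[:2]}
print(selected)                           # one indicator per latent factor
```

The retained indicators would then be scored and combined with weights into a health index, mirroring the study's weighted comprehensive method.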
19. A Domain Generalization Method for Remaining Useful Life Prediction of Rolling Bearings Based on Diverse Data Generation
Authors: 宋仁旺, 姚程昊, 石慧, 陈琳英
Journal: Journal of Vibration and Shock (振动与冲击), Peking University Core Journal, 2026, Issue 1, pp. 290-301 (12 pages)
Abstract: In rolling bearing remaining useful life (RUL) prediction, the data available for training are scarce, and the actual operating data of bearings under unknown working conditions differ markedly in distribution from the training data, which sharply degrades the generalization performance of life prediction models. To address this, a domain generalization RUL prediction method for rolling bearings based on diverse data generation is studied. First, a phase-space data generation module is constructed to capture slowly varying information in the degradation process while enhancing sample diversity. On this basis, an adversarial out-of-domain data generation module is built and trained adversarially; pseudo-domains obtained from this adversarial data generation framework further expand the diversity of available samples. A domain-difference function is added to the framework to increase pseudo-domain diversity, and manifold and semantic regularization terms are added to ensure pseudo-domain consistency. Finally, based on the augmented diverse samples, a two-channel deep convolutional regression network is built for domain-generalized RUL training, and rolling bearing RUL prediction under unknown working conditions is carried out on the PHM2012 dataset. The experimental results verify the effectiveness and superiority of the proposed method.
Keywords: domain generalization; remaining useful life (RUL) prediction; unknown working conditions; diverse data generation; adversarial data generation
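Phase-space reconstruction, the idea behind the phase-space data generation module, turns a 1D degradation signal into delay-embedded vectors. A minimal sketch; the embedding dimension m and delay tau below are arbitrary illustration values, not the paper's settings:

```python
import numpy as np

# Delay embedding: map x(t) to vectors
#   [x(t), x(t + tau), ..., x(t + (m-1) tau)],
# which expose slowly varying structure a pointwise view misses.

def delay_embed(x, m=3, tau=2):
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(m)])

t = np.linspace(0, 10, 200)
signal = np.sin(2 * np.pi * t) + 0.05 * t      # oscillation + slow drift
emb = delay_embed(signal)
print(emb.shape)                               # (196, 3)
```

Each embedded row is a richer training sample than a raw scalar reading, which is one way such a module enhances sample diversity before adversarial pseudo-domain generation.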
20. Research on a Condition Assessment Method for Lightweight Monitoring of Simply Supported Beam Bridges
Authors: 田石柱, 陆一帆, 刘晨光
Journal: Journal of Guangxi University (Natural Science Edition), Peking University Core Journal, 2026, Issue 1, pp. 14-25 (12 pages)
Abstract: To improve the efficiency and intelligence of condition assessment for simply supported beam bridges, a rapid intelligent method for assessing the serviceability state of bridges based on quasi-static responses is proposed. First, a lightweight monitoring system is established based on vehicle information monitoring and deflection monitoring of key cross-sections. Second, three machine learning models taking vehicle information and cross-section responses as input and quasi-static deflection responses as output are built to construct finite element surrogate models of the bridge response. Third, an adaptive extraction method for quasi-static deflection curves based on successive variational mode decomposition (SVMD) is proposed, which automatically determines the optimal number of decomposition modes through iterative decomposition and residual evaluation. Finally, an assessment coefficient β and a bridge condition grading table are constructed from the area ratio of the quasi-static response curves of damaged and undamaged bridges, and the method is verified with a numerical example of a simply supported box girder bridge. The results show that the surrogate model built with a back-propagation (BP) neural network trains best, with a coefficient of determination (R^(2)) of 98% and a root mean square error (RMSE) of 0.0069; the overall relative errors of the SVMD-extracted quasi-static deflection responses are all below 2%, and the peak relative errors are all below 1%. When the model stiffness is reduced to 88% of the design value, the computed β of the box girder bridge is 0.88, and the bridge condition is assessed as "fair" from the grading table, consistent with the imposed damage, allowing a preliminary judgment of the bridge's serviceability state.
Keywords: lightweight monitoring; data processing; machine learning; BP neural network; quasi-static response; condition assessment