Journal Articles — 2 results found
1. Reducing parameter space for neural network training (Cited by: 1)
Authors: Tong Qin, Ling Zhou, Dongbin Xiu. Theoretical & Applied Mechanics Letters (CAS, CSCD), 2020, No. 3, pp. 170-181 (12 pages).
Abstract: For neural networks (NNs) with rectified linear unit (ReLU) or binary activation functions, we show that their training can be accomplished in a reduced parameter space. Specifically, the weights in each neuron can be trained on the unit sphere, as opposed to the entire space, and the threshold can be trained in a bounded interval, as opposed to the real line. We show that the NNs in the reduced parameter space are mathematically equivalent to the standard NNs with parameters in the whole space. The reduced parameter space shall facilitate the optimization procedure for the network training, as the search space becomes (much) smaller. We demonstrate the improved training performance using numerical examples.
Keywords: rectified linear unit network; universal approximator; reduced space
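The equivalence claimed in this abstract rests on the positive homogeneity of ReLU: scaling a neuron's weight vector by s > 0 scales its output by s, so the scale can be factored out and absorbed into the next layer, leaving a unit-norm weight. The paper's actual training procedure is not reproduced here; the following minimal NumPy sketch (all variable names are illustrative) only verifies that identity for a single neuron.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

# An arbitrary ReLU neuron: weight vector w (any norm) and threshold b.
w = rng.normal(size=5)
b = rng.normal()
x = rng.normal(size=5)

# Positive homogeneity: relu(w @ x + b) = s * relu(u @ x + c),
# where s = ||w||, u = w / s lies on the unit sphere, and c = b / s.
s = np.linalg.norm(w)
u, c = w / s, b / s

assert np.isclose(relu(w @ x + b), s * relu(u @ x + c))
# The positive factor s can be absorbed into the next layer's weights,
# so constraining u to the unit sphere loses no expressiveness.
print("unit-sphere reparameterization matches:", relu(w @ x + b))
```

The bounded-interval claim for the threshold follows the same logic: once u is on the unit sphere and inputs come from a bounded domain, u @ x is bounded, so any threshold outside that interval makes the neuron constant or purely affine on the domain.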
2. Verifying ReLU Neural Networks from a Model Checking Perspective (Cited by: 3)
Authors: Wan-Wei Liu, Fu Song, Tang-Hao-Ran Zhang, Ji Wang. Journal of Computer Science & Technology (SCIE, EI, CSCD), 2020, No. 6, pp. 1365-1381 (17 pages).
Abstract: Neural networks, as an important computing model, have wide application in the artificial intelligence (AI) domain. From the perspective of computer science, such a computing model requires a formal description of its behaviors, particularly the relation between input and output. In addition, such specifications ought to be verified automatically. ReLU (rectified linear unit) neural networks are intensively used in practice. In this paper, we present ReLU Temporal Logic (ReTL), whose semantics is defined with respect to ReLU neural networks and which can specify value-related properties about the network. We show that the model checking problem for the Σ₂ ∪ Π₂ fragment of ReTL, which can express properties such as output reachability, is decidable in EXPSPACE. We have also implemented our algorithm in a prototype tool, and experimental results demonstrate the feasibility of the presented model checking approach.
Keywords: model checking; rectified linear unit (ReLU) neural network; temporal logic
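ReTL and its model checking algorithm are not spelled out in the abstract, so the sketch below does not implement them. As a rough illustration of the kind of question such verification answers, it decides output reachability for a tiny one-hidden-layer ReLU network by enumerating activation patterns and solving one linear program per pattern, which is a standard technique rather than the paper's method; all weights and thresholds are made up for the example.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Tiny ReLU network: y = w2 @ relu(W1 @ x + b1) + b2, inputs in a box.
W1 = np.array([[1.0, -1.0], [0.5, 2.0]])
b1 = np.array([0.0, -1.0])
w2 = np.array([1.0, -1.0])
b2 = 0.5
box = [(-1.0, 1.0), (-1.0, 1.0)]   # input bounds
t = 1.0                            # query: can the output reach >= t?

reachable = False
for pattern in itertools.product([0, 1], repeat=2):
    d = np.array(pattern, dtype=float)
    # On this activation region the network is affine: y = c @ x + c0.
    c = (w2 * d) @ W1
    c0 = (w2 * d) @ b1 + b2
    # Region constraints: (W1 @ x + b1)_i >= 0 if d_i = 1, else <= 0,
    # rewritten as A_ub @ x <= b_ub for linprog.
    sign = np.where(d == 1.0, -1.0, 1.0)
    A_ub = sign[:, None] * W1
    b_ub = -sign * b1
    # Maximize c @ x over the region == minimize -c @ x.
    res = linprog(-c, A_ub=A_ub, b_ub=b_ub, bounds=box)
    if res.success and (c @ res.x + c0) >= t:
        reachable = True
        break

print("output >= %.2f reachable:" % t, reachable)
```

Each activation pattern fixes which ReLU units are active, making the network affine on a polyhedral region, so reachability reduces to 2^n small LPs; this exponential blow-up is precisely why dedicated logics and decision procedures such as the one the paper proposes are of interest.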