For neural networks (NNs) with rectified linear unit (ReLU) or binary activation functions, we show that their training can be accomplished in a reduced parameter space. Specifically, the weights in each neuron can be trained on the unit sphere, as opposed to the entire space, and the threshold can be trained in a bounded interval, as opposed to the real line. We show that the NNs in the reduced parameter space are mathematically equivalent to the standard NNs with parameters in the whole space. The reduced parameter space shall facilitate the optimization procedure for the network training, as the search space becomes (much) smaller. We demonstrate the improved training performance using numerical examples.
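As a rough illustration of the idea, the following is a minimal sketch (not the paper's code) of training a one-hidden-layer ReLU network in such a reduced parameter space via projected gradient descent: after each gradient step, every hidden neuron's weight vector is projected back onto the unit sphere and its threshold is clipped to a bounded interval. The network size, the threshold bound B, and the learning rate are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x), x in [-3, 3].
X = rng.uniform(-3.0, 3.0, size=(256, 1))
y = np.sin(X)

n_in, n_hidden = 1, 32
B = 3.0    # illustrative bound for the thresholds (hypothetical choice)
lr = 1e-2

# Hidden-layer parameters: each row of W is one neuron's weight vector.
W = rng.standard_normal((n_hidden, n_in))
b = rng.uniform(-B, B, size=(n_hidden,))
v = rng.standard_normal((n_hidden, 1)) / np.sqrt(n_hidden)  # output weights

def project(W, b):
    """Project each neuron's weights onto the unit sphere and its
    threshold into the bounded interval [-B, B]."""
    W = W / np.linalg.norm(W, axis=1, keepdims=True)
    b = np.clip(b, -B, B)
    return W, b

W, b = project(W, b)

for step in range(2000):
    # Forward pass with ReLU activations.
    Z = X @ W.T + b          # pre-activations, shape (256, n_hidden)
    H = np.maximum(Z, 0.0)   # ReLU
    pred = H @ v
    err = pred - y

    # Backward pass for the mean-squared-error loss.
    grad_v = H.T @ err / len(X)
    dH = (err @ v.T) * (Z > 0)   # gradient through ReLU
    grad_W = dH.T @ X / len(X)
    grad_b = dH.mean(axis=0)

    # Plain gradient step, then project back onto the reduced
    # parameter space (unit sphere x bounded interval).
    v -= lr * grad_v
    W -= lr * grad_W
    b -= lr * grad_b
    W, b = project(W, b)

print("final MSE:", float(np.mean((np.maximum(X @ W.T + b, 0.0) @ v - y) ** 2)))
```

Projection is only one simple way to realize the constrained training; a reparameterization such as w = v/||v|| trained in the ambient space would serve equally well for a sketch.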
Neural networks, as an important computing model, have wide application in the artificial intelligence (AI) domain. From the perspective of computer science, such a computing model requires a formal description of its behaviors, particularly the relation between input and output. In addition, such specifications ought to be verified automatically. ReLU (rectified linear unit) neural networks are intensively used in practice. In this paper, we present ReLU Temporal Logic (ReTL), whose semantics is defined with respect to ReLU neural networks and which can specify value-related properties of a network. We show that model checking the Σ2 ∪ Π2 fragment of ReTL, which can express properties such as output reachability, is decidable in EXPSPACE. We have also implemented our algorithm in a prototype tool, and experimental results demonstrate the feasibility of the presented model checking approach.
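The paper's ReTL model checking algorithm is not reproduced here; as a hedged illustration of why value-related queries over ReLU networks reduce to linear arithmetic, the sketch below decides an output-reachability query for a toy two-neuron network by enumerating ReLU activation patterns and solving one feasibility LP per pattern (requires SciPy). The network weights, the input box, and the helper output_reachable are hypothetical, chosen only for the example.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# A tiny fixed ReLU network: hidden layer (W, b), linear output layer (v, c).
W = np.array([[1.0, -1.0],
              [0.5,  2.0]])
b = np.array([0.0, -1.0])
v = np.array([1.0, 1.0])
c = 0.0

def output_reachable(t, lo=-1.0, hi=1.0):
    """Decide whether some input x in the box [lo, hi]^2 drives the
    network output to at least t, by enumerating ReLU activation
    patterns and solving one feasibility LP per pattern."""
    n_hidden, n_in = W.shape
    for s in itertools.product([0, 1], repeat=n_hidden):
        s = np.array(s, dtype=float)
        A_ub, b_ub = [], []
        # Fix the activation pattern: active neurons need W_i x + b_i >= 0,
        # inactive ones need W_i x + b_i <= 0.
        for i in range(n_hidden):
            if s[i] == 1:
                A_ub.append(-W[i]); b_ub.append(b[i])
            else:
                A_ub.append(W[i]);  b_ub.append(-b[i])
        # Under a fixed pattern the network is affine; require output >= t,
        # i.e. -(v*s)W x <= (v*s).b + c - t.
        w_eff = (v * s) @ W
        bias_eff = (v * s) @ b + c
        A_ub.append(-w_eff); b_ub.append(bias_eff - t)
        res = linprog(np.zeros(n_in), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(lo, hi)] * n_in)
        if res.status == 0:   # feasible: a witness input exists
            return True, res.x
    return False, None

print(output_reachable(2.0))    # reachable, e.g. at x = (1, -1)
print(output_reachable(10.0))   # not reachable on this input box
```

Each activation pattern fixes the network to an affine map, so the existential query becomes a finite disjunction of linear feasibility problems; the exponential number of patterns hints at why the general model checking problem carries a high complexity bound.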
Funding: This work is supported by the National Natural Science Foundation of China under Grant No. 61872371, the Open Fund from the State Key Laboratory of High Performance Computing of China (HPCL) under Grant No. 202001-07, and the National Key Research and Development Program of China under Grant No. 2018YFB0204301.