Abstract: A novel activation function, Lfun (logarithmic series function), is constructed for backpropagation (BP) neural networks, and a BP neural network based on it is used to predict the energy-consumption state of machine tools. First, the characteristics and shortcomings of the Sigmoid-family and ReLU-family activation functions are analyzed, and a nonlinear, piecewise, parameterized activation function is constructed by incorporating a logarithmic function. The function is differentiable and smooth, has a derivative of simple form, is monotonically increasing, produces zero-mean outputs, and gains flexibility from a tunable parameter. Second, numerical simulation experiments on public datasets compare the performance of Lfun against the Sigmoid, ReLU, tanh, Leaky_ReLU, and ELU functions. Finally, a BP neural network based on Lfun is applied to predict the energy-consumption state of machine tools. Experimental results show that the BP neural network using Lfun outperforms networks using the other common activation functions.
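The abstract does not give Lfun's closed form, but the properties it lists (built from a logarithm, differentiable, simple derivative, monotonically increasing, zero-centered, one shape parameter) admit a simple illustrative candidate. The Python sketch below is a hypothetical construction under those constraints, not the paper's actual definition; the names lfun and a are assumptions.

```python
import numpy as np

def lfun(x, a=1.0):
    """Hypothetical log-based activation: sign(x) * ln(1 + a|x|) / a.
    Odd (zero-centered), monotonically increasing, differentiable at 0,
    with the simple derivative 1 / (1 + a|x|); `a` tunes the saturation.
    Illustrative only -- the paper's actual Lfun formula is not given
    in the abstract."""
    return np.sign(x) * np.log1p(a * np.abs(x)) / a

def lfun_grad(x, a=1.0):
    """Derivative of the sketch above, as used in the BP backward pass."""
    return 1.0 / (1.0 + a * np.abs(x))

# Chain rule through one hidden layer of a BP network:
z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])   # pre-activations
h = lfun(z, a=1.5)                           # forward pass
dh_dz = lfun_grad(z, a=1.5)                  # local gradient
```

A candidate of this shape keeps the gradient in (0, 1] and lets it decay only polynomially in |x| rather than exponentially, which is one way to soften the vanishing-gradient tails of Sigmoid-family functions that the paper analyzes.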
Abstract: This paper describes our implementation of several neural networks built on a field-programmable gate array (FPGA) and used to recognize a handwritten digit dataset, the Modified National Institute of Standards and Technology (MNIST) database. We also propose a novel hardware-friendly activation function, the dynamic Rectified Linear Unit (D-ReLU), that achieves higher performance than traditional activation functions at no cost to accuracy. We built a 2-layer online-training multilayer perceptron (MLP) neural network on an FPGA with varying data width. Reducing the data width from 8 to 4 bits reduces prediction accuracy by only 11%, while FPGA area decreases by 41%. Compared with networks that use the sigmoid function, our proposed D-ReLU function uses 24%-41% less area with no loss in prediction accuracy. Further reducing the data width of the 3-layer networks from 8 to 4 bits decreases prediction accuracy by only 3%-5%, with area reduced by 9%-28%. Moreover, the FPGA solutions execute 29 times faster despite running at a 60× lower clock rate. Thus, FPGA implementations of neural networks offer a high-performance, low-power alternative to traditional software methods, and our novel D-ReLU activation function offers additional performance and power savings.
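The abstract defines neither D-ReLU's exact form nor the fixed-point number format, so the sketch below only illustrates the data-width trade-off the paper measures: evaluating a ReLU on signed fixed-point values, where total bit width trades accuracy against FPGA area. The function names and the split between integer and fractional bits are assumptions.

```python
import numpy as np

def quantize_fixed(x, width=8, frac_bits=4):
    """Round to a signed fixed-point grid with `width` total bits and
    `frac_bits` fractional bits (a common FPGA number format; the
    paper's actual format is not stated in the abstract)."""
    scale = float(1 << frac_bits)
    lo, hi = -(1 << (width - 1)), (1 << (width - 1)) - 1
    return np.clip(np.round(x * scale), lo, hi) / scale

def relu_fixed(x, width=8, frac_bits=4):
    """ReLU on fixed-point data: narrower widths cost some accuracy but
    shrink FPGA area, the 8-bit vs 4-bit trade-off reported above."""
    return np.maximum(quantize_fixed(x, width, frac_bits), 0.0)

x = np.linspace(-2.0, 2.0, 9)
print(relu_fixed(x, width=8, frac_bits=4))  # fine grid
print(relu_fixed(x, width=4, frac_bits=2))  # coarse grid, less area
```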
Funding: The authors gratefully acknowledge financial support from the National Key Research and Development Program of China (No. 2019YFA0708300), the Strategic Cooperation Technology Projects of CNPC and CUPB (No. ZLZX2020-03), the National Science Fund for Distinguished Young Scholars (No. 52125401), and the Science Foundation of China University of Petroleum, Beijing (No. 2462022SZBH002).
Abstract: Accurate prediction of the rate of penetration (ROP) is significant for drilling optimization. While an intelligent ROP prediction model based on fully connected neural networks (FNN) outperforms traditional ROP equations and machine-learning algorithms, its lack of interpretability undermines its credibility. This study proposes a novel interpretation and characterization method for the FNN ROP prediction model using the Rectified Linear Unit (ReLU) activation function. By leveraging the derivative of the ReLU function, the FNN's computation is transformed into vector operations; further simplification yields a linear characterization of the FNN model, enabling its interpretation and analysis. The proposed method is applied to ROP prediction using drilling data from three vertical wells in the Tarim Oilfield. The results demonstrate that the FNN ROP prediction model with ReLU as the activation function performs exceptionally well. The relative activation-frequency curve of hidden-layer neurons aids in analyzing overfitting of the FNN ROP model and in determining the similarity of drilling data. In well sections with similar drilling data, averaging the weight parameters enables a linear characterization of the FNN ROP prediction model, leading to a corresponding linear representation equation. Furthermore, quantitative analysis of each feature's influence on ROP supports drilling-parameter optimization schemes for the current well section. The established linear characterization equation exhibits high precision, strong stability, and good adaptability, as validated across multiple well sections.
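The linear characterization rests on a standard fact: ReLU's derivative is 0 or 1, so for a fixed input the network is an affine map within that input's activation region. The Python sketch below folds the per-layer 0/1 masks into the weights to recover that local affine form. It illustrates the ReLU-based linearization idea only; the paper's full procedure (e.g. averaging weights over well sections with similar drilling data) is not reproduced, and the names here are assumptions.

```python
import numpy as np

def local_linear_form(weights, biases, x):
    """Collapse a ReLU FNN into its local affine map at input x, so that
    network(x) == A @ x + c within x's activation region.
    Each hidden layer contributes a 0/1 diagonal mask on its
    pre-activations (ReLU's derivative); folding the masks into the
    weights leaves a single linear map plus a bias."""
    A = np.eye(len(x))
    c = np.zeros(len(x))
    for i, (W, b) in enumerate(zip(weights, biases)):
        A = W @ A                  # pre-activation is now z = A @ x + c
        c = W @ c + b
        if i < len(weights) - 1:   # hidden layer: apply the ReLU mask
            d = (A @ x + c > 0).astype(float)   # activation pattern at x
            A = d[:, None] * A
            c = d * c
    return A, c

# Sanity check: the affine form reproduces the network's output at x.
rng = np.random.default_rng(0)
weights = [rng.normal(size=(8, 5)), rng.normal(size=(1, 8))]
biases = [rng.normal(size=8), rng.normal(size=1)]
x = rng.normal(size=5)
A, c = local_linear_form(weights, biases, x)

a = x
for i, (W, b) in enumerate(zip(weights, biases)):
    a = W @ a + b
    if i < len(weights) - 1:
        a = np.maximum(a, 0.0)     # ReLU forward pass
assert np.allclose(A @ x + c, a)
```

With A in hand, each coefficient A[0, j] quantifies feature j's local influence on the predicted ROP, which is the kind of per-feature influence analysis the abstract says underpins the drilling-parameter optimization schemes.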