Funding: Supported by the National Natural Science Foundation of China (32102513), the National Key Scientific Research Project (2023YFF1001100), the Shenzhen Innovation and Entrepreneurship Plan Major Special Project of Science and Technology, China (KJZD20230923115003006), and the Innovation Project of Chinese Academy of Agricultural Sciences (CAAS-ZDRW202006).
Abstract: Deep learning (DL) methods such as multilayer perceptrons (MLPs) and convolutional neural networks (CNNs) have been applied to predict complex traits in animal and plant breeding. However, improving genomic prediction accuracy still presents significant challenges. In this study, we applied CNNs to predict swine traits using previously published data. Specifically, we extensively evaluated the CNN model's performance across various sets of single nucleotide polymorphisms (SNPs) and concluded that the model achieved optimal performance with SNP sets comprising 1,000 SNPs. Furthermore, we adopted a novel one-hot encoding method that transforms the 16 different genotypes into sets of eight binary variables. This encoding significantly enhanced the CNN's prediction accuracy for swine traits, outperforming traditional one-hot encoding techniques. Our findings suggest that the expanded one-hot encoding method can improve the accuracy of DL methods in the genomic prediction of economically important swine traits. This has significant implications for swine breeding programs, where genomic prediction is pivotal to improving breeding strategies. Future research can explore further enhancements to DL methods by incorporating advanced data pre-processing techniques.
Abstract: This article proposes a model of a compositional microprogram control unit (CMCU) with output identification, adapted for implementation in complex programmable logic devices (CPLDs) equipped with integrated memory modules [1]. An approach applying two sources of code and one-hot encoding is used in a base CMCU model with output identification [2] [3]. The article presents a complete worked example for the proposed CMCU model, discusses the advantages and disadvantages of the approach, and reports the results of experiments conducted on a real CPLD system.
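The one-hot encoding the CMCU abstract refers to is the classic one-hot state assignment for finite state machines: each state owns exactly one flip-flop, so exactly one bit of the state register is set at any time. A minimal behavioral sketch (the state names are illustrative, not taken from the article):

```python
# Minimal sketch of one-hot state encoding for a small FSM: each state
# maps to a vector with exactly one bit set. In a CPLD this trades more
# flip-flops for simpler next-state logic per state.
STATES = ["FETCH", "DECODE", "EXECUTE"]

def one_hot_state(state: str) -> list[int]:
    """Return the one-hot code for a named state."""
    if state not in STATES:
        raise ValueError(f"unknown state: {state!r}")
    return [int(state == s) for s in STATES]

print(one_hot_state("DECODE"))  # [0, 1, 0]
```

The design trade-off is the one the article weighs: one-hot uses more register bits than binary encoding (n bits for n states vs. ceil(log2 n)) but typically shortens and flattens the combinational next-state logic.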
Abstract: Software defects are the root cause of software errors and failures. They arise from factors such as flawed requirements analysis, imprecise programming languages, and inexperienced developers. Since defects are unavoidable, submitting defect reports is an important way to discover and fix them: a defect report is the carrier that describes a defect, and fixing reported defects is a necessary means of improving software. Because maintainers and users repeatedly submit reports for the same defect, defect report repositories contain large numbers of redundant reports, and manual triage can no longer keep up with increasingly complex software systems. Duplicate defect report detection can filter out the redundant duplicates and redirect human effort and time toward genuinely new reports. The prediction accuracy of current methods remains low; the difficulty lies in finding a suitable and comprehensive way to measure the similarity between defect reports. Drawing on the idea of ensemble methods, this paper proposes BSO (a combination of BM25F, LSI and One-Hot), a duplicate defect report detection method that fuses textual and categorical information. After data pre-processing, each defect report is split into a textual information field and a categorical information field. BM25F and LSI are applied to the textual field to obtain two similarity scores, which are integrated by a similarity-fusion method; the One-Hot algorithm is applied to the categorical field to obtain a third similarity score. The fusion method then combines the scores of the two fields, producing a recommendation list of candidate duplicates for each defect report, from which detection accuracy is computed. The method was implemented in Python and compared on the public OpenOffice dataset against baseline methods and the more recent state-of-the-art methods REP and DBTM. Experimental results show that the proposed method improves accuracy by 4.7% on average over DBTM and by 6.3% on average over REP, with a larger improvement over the baselines, fully demonstrating the effectiveness of the BSO method.
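The abstract does not give the exact fusion formula, but a common way to combine heterogeneous similarity scores is to normalize each score vector and take a weighted sum. This is a hedged sketch of that idea; the weights and min-max normalization are assumptions, not the BSO paper's published parameters:

```python
import numpy as np

def fuse_scores(bm25f_scores, lsi_scores, onehot_scores,
                weights=(0.4, 0.3, 0.3)):
    """Fuse three per-candidate similarity score vectors into one.

    Each vector holds one score per candidate duplicate report.
    Scores are min-max normalized to [0, 1] before the weighted sum,
    so scales of BM25F, LSI and One-Hot similarity stay comparable.
    """
    def norm(s):
        s = np.asarray(s, dtype=float)
        rng = s.max() - s.min()
        return (s - s.min()) / rng if rng > 0 else np.zeros_like(s)

    w1, w2, w3 = weights
    return w1 * norm(bm25f_scores) + w2 * norm(lsi_scores) + w3 * norm(onehot_scores)

# Rank candidates for one query report by fused score, best first.
fused = fuse_scores([0.0, 2.0], [0.0, 4.0], [0.0, 1.0])
print(np.argsort(-fused))  # candidate 1 ranks above candidate 0
```

Sorting candidates by the fused score yields the per-report recommendation list the abstract describes.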
Funding: The authors are grateful to the Raytheon Chair for Systems Engineering for funding.
Abstract: The Internet of Things (IoT) and related applications have witnessed enormous growth since their inception. The diversity of connected devices and relevant applications has enabled the use of IoT devices in every domain. Although these applications are widely applicable, battery life remains a major challenge for IoT devices: unreliable or shortened battery life can render an IoT application completely useless. In this work, an optimized deep neural network (DNN) based model is used to predict the battery life of IoT systems. The study uses the Chicago Park Beach dataset, collected from a publicly available data repository, for the experimentation of the proposed methodology. The dataset is pre-processed with the attribute-mean technique to eliminate missing values, and one-hot encoding is then applied to convert it to numerical format. The processed data is normalized with the standard scaler technique. The Moth Flame Optimization (MFO) algorithm is then applied to select the optimal features in the dataset. These optimal features are finally fed into the DNN model, and the generated results are evaluated against state-of-the-art models, justifying the superiority of the proposed MFO-DNN model.
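The three pre-processing steps the abstract names (attribute-mean imputation, one-hot encoding, standard scaling) can be sketched on a toy table. The column names and values here are illustrative, not taken from the Chicago Park Beach dataset:

```python
import numpy as np

# Toy data: one numeric column (np.nan marks a missing value) and one
# categorical column, standing in for the real dataset's attributes.
numeric = np.array([1.0, np.nan, 3.0])
category = np.array(["sunny", "cloudy", "sunny"])

# 1. Attribute-mean imputation: replace missing values with the column mean.
mean = np.nanmean(numeric)
numeric = np.where(np.isnan(numeric), mean, numeric)

# 2. One-hot encoding: one binary column per category level.
levels = sorted(set(category))
onehot = np.array([[int(c == lvl) for lvl in levels] for c in category])

# 3. Standard scaling: zero mean, unit variance for the numeric column.
scaled = (numeric - numeric.mean()) / numeric.std()

# Assemble the processed feature matrix fed to feature selection / the DNN.
X = np.column_stack([scaled, onehot])
print(X.shape)  # (3, 3)
```

In practice these steps are usually delegated to a library pipeline (e.g. imputer, encoder, and scaler objects fitted on training data only), but the arithmetic is exactly what is shown here.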