Funding: National Natural Science Foundation of China, Grant/Award Numbers: 52002246, 52192614, U22A2077, U20A20166, 52125205, 52372154; Natural Science Foundation of Beijing Municipality, Grant/Award Numbers: 2222088, Z180011; Shenzhen Fundamental Research Project, Grant/Award Number: JCYJ20190808170601664; Shenzhen Science and Technology Program, Grant/Award Number: KQTD20170810105439418; Science and Technology Innovation Project of Shenzhen Excellent Talents, Grant/Award Number: RCBS20200714114919006; National Key R&D Program of China, Grant/Award Numbers: 2021YFB3200304, 2021YFB3200302; Fundamental Research Funds for the Central Universities.
Abstract: The emulation of human multisensory functions to construct artificial perception systems is an intriguing challenge for developing humanoid robotics and cross-modal human–machine interfaces. Inspired by human multisensory signal generation and neuroplasticity-based signal processing, an artificial perceptual neuron array with visual–tactile sensing, processing, learning, and memory is demonstrated here. The neuromorphic bimodal perception array compactly combines an artificial photoelectric synapse network with an integrated mechanoluminescent layer, enabling individual and synergistic plastic modulation of optical and mechanical information, including short-term memory, long-term memory, paired-pulse facilitation, and "learning-experience" behavior. Sequential or superimposed visual and tactile stimuli can efficiently simulate the associative learning process of Pavlov's dog. The fusion of visual and tactile modulation enhances memory of the stimulation image during the learning process. A machine-learning algorithm is coupled with an artificial neural network for pattern recognition, achieving a recognition accuracy of 70% with bimodal training, higher than that obtained with unimodal training. In addition, the artificial perceptual neuron has a low energy consumption of ~20 pJ. With its mechanical compliance and simple architecture, the neuromorphic bimodal perception array holds promise for large-scale cross-modal interactions and high-throughput intelligent perception.
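The bimodal-over-unimodal advantage reported above can be illustrated in miniature. The sketch below is a hypothetical toy example, not the paper's actual pipeline: it fuses synthetic "visual" and "tactile" feature vectors by simple concatenation and classifies them with a nearest-centroid rule, so that the fused representation carries more discriminative information than either modality alone. All dimensions, noise levels, and the classifier choice are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_features(n_per_class, centers, noise):
    """Synthetic features: one Gaussian cluster per class. Returns (X, y)."""
    X = np.vstack([c + noise * rng.standard_normal((n_per_class, len(c)))
                   for c in centers])
    y = np.repeat(np.arange(len(centers)), n_per_class)
    return X, y

def nearest_centroid_accuracy(X_train, y_train, X_test, y_test):
    """Classify each test point by its nearest class centroid (training mean)."""
    centroids = np.array([X_train[y_train == k].mean(axis=0)
                          for k in np.unique(y_train)])
    d2 = ((X_test[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    return float((d2.argmin(axis=1) == y_test).mean())

# Assumed sizes: 8-dim "visual" and 4-dim "tactile" features, two classes.
vis_centers = [np.zeros(8), np.ones(8)]
tac_centers = [np.zeros(4), np.ones(4)]

Xv_tr, y_tr = make_features(100, vis_centers, 1.0)
Xt_tr, _    = make_features(100, tac_centers, 1.0)
Xv_te, y_te = make_features(100, vis_centers, 1.0)
Xt_te, _    = make_features(100, tac_centers, 1.0)

# Bimodal fusion here is plain feature concatenation.
Xf_tr = np.hstack([Xv_tr, Xt_tr])
Xf_te = np.hstack([Xv_te, Xt_te])

acc_vis   = nearest_centroid_accuracy(Xv_tr, y_tr, Xv_te, y_te)
acc_tac   = nearest_centroid_accuracy(Xt_tr, y_tr, Xt_te, y_te)
acc_fused = nearest_centroid_accuracy(Xf_tr, y_tr, Xf_te, y_te)
print(f"visual: {acc_vis:.2f}  tactile: {acc_tac:.2f}  fused: {acc_fused:.2f}")
```

Because the fused vector combines the class separation of both modalities, its effective signal-to-noise ratio is higher, which is the same intuition behind the paper's improved recognition accuracy under bimodal training.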