Funding: This work was supported by the National Natural Science Foundation of China (Grant Nos. 61571226 and 61701229).
Abstract: Grover's search algorithm is one of the most significant quantum algorithms, providing a quadratic speedup for exhaustive search problems. Since Grover's search algorithm cannot yet be implemented on a real quantum computer, classical simulation is regarded as an effective way to study its search performance. When simulating Grover's algorithm, however, the required storage space grows exponentially with the number of qubits, which makes it difficult to simulate high-qubit instances. To this end, we study in depth the storage of the probability amplitudes, which lies at the core of the Grover simulation. We propose a novel memory-efficient method based on amplitude compression, and validate its effectiveness through theoretical analysis and simulation experiments. The results demonstrate that our compressed simulation of the search algorithm saves nearly 87.5% of the storage space compared with the uncompressed one. Thus, under the same hardware conditions, our method can dramatically reduce the number of computing nodes required and, at the same time, simulate at least 3 more qubits than the uncompressed approach. Moreover, our memory-efficient simulation method can also be applied to other quantum algorithms to reduce their storage costs.
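The abstract's compression scheme is not reproduced here, but a minimal sketch can illustrate why memory-efficient simulation of Grover's search is possible at all: the Grover iterate preserves the symmetry between all marked basis states (sharing one amplitude) and all unmarked states (sharing another), so for analysis purposes only two real numbers need to be tracked instead of a 2^n-entry state vector. All names below are illustrative, not the paper's implementation.

```python
import math

def grover_two_amplitudes(n_qubits, n_marked=1):
    """Track Grover's search with two amplitude values instead of 2^n.

    a = amplitude of each marked state, b = amplitude of each unmarked
    state; the oracle flips the sign of a, and the diffusion operator
    reflects both values about the mean amplitude.
    """
    N = 2 ** n_qubits
    M = n_marked
    a = b = 1.0 / math.sqrt(N)  # uniform superposition |s>
    iterations = int(round((math.pi / 4) * math.sqrt(N / M)))
    for _ in range(iterations):
        a = -a                                  # oracle: phase-flip marked states
        mean = (M * a + (N - M) * b) / N        # average amplitude
        a, b = 2 * mean - a, 2 * mean - b       # inversion about the mean
    return M * a * a                            # success probability

print(grover_two_amplitudes(10))  # close to 1 after ~25 iterations
```

This two-value reduction works only because of Grover's special symmetry; a general-purpose simulator must store (possibly compressed) amplitudes for all basis states, which is the setting the paper's 87.5% saving addresses.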
Funding: This work was funded by the National Key Research and Development Programs of China [Grant Nos. 2018YFB2202603 and 2020AAA0104602].
Abstract: Owing to its bionic features, neuromorphic computing has achieved higher energy efficiency than deep learning in many fields in recent years. As in the biological brain, the memory for synapses and weights occupies a large area in a neuromorphic processor, and prior neuromorphic processors face the challenge of this large memory organization. In this work, based on the characteristics of the brain and of spiking neural networks (SNNs), we propose a set-associative memory organization for loosely coupled structures and a compressed SRAM memory organization with an adjacency matrix of synapses for tightly coupled structures in SNNs, in order to construct an area-efficient memory organization for generalized neuromorphic architectures. A ping-pong memory is also proposed to expand the number of logical neurons. Experiments show that our methods use 23.4–75.8% less chip area and consume 21.2–75.7% less power than the CAM implementation in related work, while incurring only minor processor performance overhead.
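To make the set-associative idea concrete, the toy model below sketches how synaptic weight rows could be stored in a set-associative structure: a presynaptic neuron id is split into a set index and a tag, and each set holds a fixed number of ways, so sparse connectivity does not require a full crossbar of storage. The class, field names, and splitting scheme are assumptions for illustration, not the paper's hardware design.

```python
class SetAssociativeWeightMemory:
    """Toy software model of a set-associative memory for SNN weights."""

    def __init__(self, num_sets, ways):
        self.num_sets = num_sets
        self.ways = ways
        # Each set is a small list of (tag, weight_row) entries.
        self.sets = [[] for _ in range(num_sets)]

    def _split(self, neuron_id):
        # Low bits select the set; remaining bits form the tag.
        return neuron_id % self.num_sets, neuron_id // self.num_sets

    def store(self, neuron_id, weights):
        idx, tag = self._split(neuron_id)
        entries = self.sets[idx]
        for i, (t, _) in enumerate(entries):
            if t == tag:                  # update an existing entry
                entries[i] = (tag, weights)
                return True
        if len(entries) < self.ways:      # a free way is available
            entries.append((tag, weights))
            return True
        return False                      # set conflict: all ways occupied

    def lookup(self, neuron_id):
        idx, tag = self._split(neuron_id)
        for t, w in self.sets[idx]:
            if t == tag:
                return w
        return None                       # no synapse row stored for this id
```

Compared with a fully associative CAM, this restricts each id to one set of a few ways, trading a small conflict-miss risk for much cheaper tag matching, which is consistent with the area and power savings the abstract reports.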