Funding: supported by the Start-Up Fund from Tsinghua University (Grant No. 53330300320), the National Natural Science Foundation of China (Grant No. 12075128), and the Shanghai Qi Zhi Institute.
Abstract: Catastrophic forgetting describes the fact that machine learning models tend to forget the knowledge of previously learned tasks after learning a new one. It is a vital problem in the continual learning scenario and has recently attracted tremendous attention across different communities. We explore the catastrophic forgetting phenomenon in the context of quantum machine learning. We find that, similar to classical learning models based on neural networks, quantum learning systems likewise suffer from this forgetting problem in classification tasks emerging from various application scenarios. We show that, based on the local geometrical information in the loss-function landscape of the trained model, a uniform strategy can be adopted to overcome the forgetting problem in the incremental learning setting. Our results uncover the catastrophic forgetting phenomenon in quantum machine learning and offer a practical method to overcome it, opening a new avenue for exploring potential quantum advantages in continual learning.
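The abstract does not spell out the strategy, but regularizers built from local loss-landscape geometry are commonly implemented as a curvature-weighted quadratic anchor to the previous task's optimum (as in elastic weight consolidation). The following is a minimal, hypothetical sketch of that idea; the function names, the diagonal Fisher approximation, and the penalty weight `lam` are all assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def importance(per_sample_grads):
    """Diagonal Fisher-information estimate: mean squared gradient per
    parameter, a cheap proxy for local loss-landscape curvature (assumed)."""
    return np.mean(np.square(per_sample_grads), axis=0)

def penalized_loss(loss_new_task, params, params_old, fisher, lam=1.0):
    """New-task loss plus a quadratic anchor that penalizes drifting the
    parameters deemed important for the previously learned task."""
    return loss_new_task + 0.5 * lam * np.sum(fisher * (params - params_old) ** 2)

# Toy usage: staying at the old-task optimum incurs zero penalty,
# while moving important parameters away from it is penalized.
theta_old = np.array([0.3, -1.2])                       # optimum after task A
fisher = importance(np.array([[0.5, 0.1], [0.7, 0.2]])) # from task-A gradients
at_optimum = penalized_loss(0.0, theta_old, theta_old, fisher)
drifted = penalized_loss(0.0, theta_old + 1.0, theta_old, fisher)
print(at_optimum, drifted)
```

In a quantum learning setting, the parameters would be the variational circuit angles and the gradients could be obtained via parameter-shift rules; the penalty structure itself is unchanged.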
Funding: supported by the National Natural Science Foundation of China (Grants Nos. 11925404, 92165209, 92265210, 92365301, T2225008, 12075128, and 62173201), the Innovation Program for Quantum Science and Technology (Grants Nos. 2021ZD0300203, 2021ZD0302203, and 2021ZD0300201), the National Key Research and Development Program of China (Grant No. 2017YFA0304303), the Tsinghua University Dushi Program, and the Shanghai Qi Zhi Institute Innovation Program (SQZ202318).
Abstract: With the rapid development of quantum devices across various platforms [1–4], reconstructing quantum many-body states from experimentally measured data poses a crucial challenge. Straightforward quantum state tomography (QST) is only applicable to small systems [5], since the required classical computing resources, such as the number of measurements and the memory size, grow exponentially as the system size increases.