Lifelong Learning
Definition
Continual Learning (= Lifelong Learning): AI that learns new knowledge without forgetting old knowledge
- Incremental Training: retrain the existing model using only the new data
  - Catastrophic Forgetting occurs: the model forgets what it learned from the previous data
- Inclusive Training: train a new model from scratch on the entire dataset
  - Training on the entire dataset has a scalability issue
Lifelong learning can be viewed as a special case of online/incremental learning in deep neural networks (DNNs).
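The incremental-training failure mode above is easy to reproduce. A minimal sketch in plain NumPy (the toy linear model and the deliberately conflicting task targets are invented for illustration): after fitting task A and then continuing training on task B's data only, the solution for task A is destroyed.

```python
import numpy as np

def sgd_fit(w, X, y, lr=0.1, epochs=100):
    """Plain gradient descent on mean-squared error for a linear model y = X @ w."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(X)
        w = w - lr * grad
    return w

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

# Task A: y = 2x;  Task B: y = -2x  (deliberately conflicting)
X_a = np.array([[1.0], [2.0], [3.0]]); y_a = 2 * X_a[:, 0]
X_b = np.array([[1.0], [2.0], [3.0]]); y_b = -2 * X_b[:, 0]

w = np.zeros(1)
w = sgd_fit(w, X_a, y_a)          # learn task A
err_a_before = mse(w, X_a, y_a)   # near zero: task A is solved

w = sgd_fit(w, X_b, y_b)          # incremental training: task B data only
err_a_after = mse(w, X_a, y_a)    # task A error blows up (catastrophic forgetting)
```

Inclusive training would instead refit on the concatenation of both datasets, which avoids the forgetting at the cost of keeping (and reprocessing) all old data.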
Lifelong Machine Learning focuses on developing versatile systems that accumulate and refine their knowledge over time.
This research area integrates techniques from multiple subfields of Machine Learning and Artificial Intelligence, including
- transfer learning,
- multi-task learning,
- online learning
- and knowledge representation and maintenance.
Objective
- Same performance as offline learning
The main objective of an online machine learning algorithm is to perform as closely to the corresponding offline algorithm as possible.
Terminology
Fine-Tuning: the most naive approach
Transfer Learning
Sequential/Online/Incremental Learning: the model learns one observation at a time
- sequential vs. incremental = the data has an inherent order (processed in order) vs. the data has no inherent order (processed in random order)
- online vs. incremental = no label information, may forget previous content (Catastrophic Interference) vs. label information available, does not forget previous content
- online vs. incremental = runs faster than the sampling rate vs. runs slower than the sampling rate (e.g., updating every 1000 samples)
Multi-center Learning: data is scattered across multiple centers and cannot be gathered in one place due to privacy issues and the like, so training proceeds by moving from center to center (similar to Incremental Learning?)
- Incremental Learning: transferring knowledge acquired on old tasks to the new ones; new classes are learned continually (e.g., face recognition -> face identification)
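The "one observation at a time" definition above can be sketched with a perceptron whose `partial_fit` consumes a single labeled sample per call (a toy, pure-NumPy illustration; the class name and the sample stream are invented for this example):

```python
import numpy as np

class OnlinePerceptron:
    """Minimal online learner: updates on one observation at a time."""
    def __init__(self, n_features, lr=1.0):
        self.w = np.zeros(n_features)
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        return 1 if x @ self.w + self.b >= 0 else -1

    def partial_fit(self, x, y):
        # perceptron rule: update only when the current sample is misclassified
        if y * (x @ self.w + self.b) <= 0:
            self.w += self.lr * y * x
            self.b += self.lr * y

# a stream of observations (linearly separable: label = sign of x1)
stream = [(np.array([2.0, 1.0]), 1),
          (np.array([-1.5, 0.5]), -1),
          (np.array([3.0, -1.0]), 1),
          (np.array([-2.0, -1.0]), -1)]

model = OnlinePerceptron(n_features=2)
for _ in range(5):                 # a few passes over the stream
    for x, y in stream:
        model.partial_fit(x, y)    # one observation per update
```

The same calling pattern appears in real libraries, e.g. scikit-learn's `partial_fit` on out-of-core estimators.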
Lifelong learning is akin to transferring knowledge acquired on old tasks to the new ones.
Never-ending learning, on the other hand, focuses on continuously acquiring data to improve existing classifiers or to learn new ones.
Online Learning: PPT, 60 pages
DEEP ONLINE LEARNING VIA META-LEARNING:CONTINUAL ADAPTATION FOR MODEL-BASED RL
- Sequential Labeling with Online Deep Learning: paper, code (Matlab), 2014
- Online/Incremental Learning with Keras and Creme: pyimagesearch; Creme appears to be aimed at classical machine learning
- Keras: Feature extraction on large datasets with Deep Learning
- Transfer Learning with Keras and Deep Learning
OperationalAI: Building an Anomaly Detection system that learns continuously: DEVIEW 2019, 김기현
[Coursera] Online Learning: 13 min
Catastrophic Forgetting
2019-OVERCOMING CATASTROPHIC FORGETTING FOR CONTINUAL LEARNING VIA MODEL ADAPTATION: ICLR 2019
2017-Overcoming catastrophic forgetting in neural networks: 2017, [code]
2017-Overcoming Catastrophic Forgetting by Incremental Moment Matching: NIPS2017, SNU, NAVER
2017-Measuring Catastrophic Forgetting in Neural Networks: 2017
2018-Keep and Learn: Knowledge Preservation in Neural Networks: 2018, [summary in Korean]
2017-Incremental On-line Learning: A Review and Comparison of State of the Art Algorithms
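The 2017 "Overcoming catastrophic forgetting in neural networks" paper (EWC) counters forgetting with a quadratic penalty that anchors parameters important to the old task. A minimal NumPy sketch of that penalty on a toy two-parameter problem (the Fisher values and the toy new-task loss are invented for illustration):

```python
import numpy as np

def ewc_penalty(theta, theta_old, fisher, lam=1.0):
    """EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta_old_i)^2.
    fisher estimates how important each parameter was for the old task."""
    return 0.5 * lam * float(np.sum(fisher * (theta - theta_old) ** 2))

# Toy setup: the old task's solution is theta_old; the new task's loss
# sum(theta^2) pulls every parameter toward 0.
theta_old = np.array([1.0, 1.0])
fisher = np.array([10.0, 0.1])  # parameter 0 mattered far more for the old task

theta, lr = theta_old.copy(), 0.01
for _ in range(2000):
    grad_new = 2 * theta                     # gradient of the new-task loss
    grad_ewc = fisher * (theta - theta_old)  # gradient of the EWC penalty (lam=1)
    theta -= lr * (grad_new + grad_ewc)
# Result: the important parameter stays near its old value (10/12 ~ 0.83),
# while the unimportant one is free to move toward the new task's optimum.
```

In the actual method the Fisher information is estimated from gradients of the old task's log-likelihood; here it is hard-coded to keep the sketch self-contained.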
Continual Learning Methods
https://www.slideshare.net/JinwonLee9/deep-learning-seminarsnu161031 : ppt
- Less forgetting learning in Deep Neural Networks, Heechul Jung
- [LwF] Li, Z. and Hoiem, D., Learning without forgetting, In: ECCV (2016)
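LwF (Learning without Forgetting) preserves old-task behavior without old-task data: it records the old model's outputs on the new-task inputs and distills them into the new model while it trains on the new task. A sketch of the distillation term only, using the temperature-softened cross-entropy as in knowledge distillation (the example logits are made up):

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def lwf_distillation_loss(new_logits, old_logits, T=2.0):
    """Cross-entropy between the old model's softened outputs (targets,
    recorded before training on the new task) and the new model's outputs."""
    targets = softmax(old_logits, T)
    preds = softmax(new_logits, T)
    return float(-np.mean(np.sum(targets * np.log(preds + 1e-12), axis=-1)))

old_logits = np.array([[2.0, 0.5, -1.0]])   # recorded old-task responses
matched = lwf_distillation_loss(old_logits, old_logits)               # minimal
drifted = lwf_distillation_loss(np.array([[-1.0, 0.5, 2.0]]), old_logits)
# drifted > matched: the loss grows as the new model drifts from old behavior
```

The full LwF objective adds this term to the ordinary cross-entropy on the new task's labels, so the network fits the new classes while its old-task outputs stay pinned.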