Continual Lifelong Learning with Neural Networks: A Review


English Summary

The goal of Continual Learning is to overcome “catastrophic forgetting”, so that the architecture can smoothly update its prediction model across several tasks and data distributions.

catastrophic forgetting: a phenomenon that occurs when a deep neural network is trained sequentially on multiple tasks and loses the knowledge acquired on previous tasks, because weights that are important for an earlier task are overwritten while learning the next one.

There are several strategies to address this problem; three were covered in the talk (hedged code sketches of the last two follow the list):

- Naïve Strategy: simply fine-tune the network on each new task, with no protection against forgetting (a baseline);

- Rehearsal Strategy: store a small subset of examples from previous tasks and replay them while training on the new one;

- Elastic Weight Consolidation (EWC) Strategy: add a quadratic penalty that slows down changes to weights that are important for previous tasks (Kirkpatrick et al., 2017).
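As a rough illustration of the rehearsal strategy, the sketch below (PyTorch) mixes a small random memory of each previous task into the training data for the new task; the function name `make_rehearsal_loader` and the `memory_size` parameter are my own illustrative choices, not from the talk:

```python
import random
from torch.utils.data import DataLoader, ConcatDataset, Subset

def make_rehearsal_loader(new_task_dataset, memory_datasets,
                          memory_size=200, batch_size=32):
    """Mix the new task's data with a small random sample ("memory")
    of each previous task, so gradient updates keep rehearsing old
    tasks while learning the new one. (Training on new_task_dataset
    alone would be the naïve strategy.)"""
    replayed = []
    for old in memory_datasets:
        # Keep at most memory_size randomly chosen examples per old task.
        idx = random.sample(range(len(old)), min(memory_size, len(old)))
        replayed.append(Subset(old, idx))
    mixed = ConcatDataset([new_task_dataset] + replayed)
    return DataLoader(mixed, batch_size=batch_size, shuffle=True)
```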
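And a minimal sketch of the EWC penalty, again in PyTorch: after finishing a task, the class below snapshots the parameters and estimates a diagonal Fisher information matrix; while training the next task, it exposes a quadratic penalty that anchors the important weights. The class name `EWC` and the default `lam` value are illustrative, not from the paper or the talk:

```python
import torch

class EWC:
    """Elastic Weight Consolidation penalty (Kirkpatrick et al., 2017)."""

    def __init__(self, model, data_loader, loss_fn, device="cpu"):
        self.model = model
        # Snapshot of the parameters at the end of the previous task.
        self.star_params = {n: p.detach().clone()
                            for n, p in model.named_parameters()
                            if p.requires_grad}
        self.fisher = self._estimate_fisher(data_loader, loss_fn, device)

    def _estimate_fisher(self, data_loader, loss_fn, device):
        # Diagonal Fisher approximation: average squared gradients of the
        # loss over batches drawn from the previous task.
        fisher = {n: torch.zeros_like(p)
                  for n, p in self.model.named_parameters()
                  if p.requires_grad}
        self.model.eval()
        for x, y in data_loader:
            x, y = x.to(device), y.to(device)
            self.model.zero_grad()
            loss_fn(self.model(x), y).backward()
            for n, p in self.model.named_parameters():
                if p.grad is not None:
                    fisher[n] += p.grad.detach() ** 2
        for n in fisher:
            fisher[n] /= max(len(data_loader), 1)
        return fisher

    def penalty(self, lam=1000.0):
        # lam weighs how strongly old-task weights are protected.
        loss = 0.0
        for n, p in self.model.named_parameters():
            if n in self.fisher:
                loss = loss + (self.fisher[n]
                               * (p - self.star_params[n]) ** 2).sum()
        return (lam / 2.0) * loss
```

During training on the next task, the penalty is simply added to the task loss before backpropagation, e.g. `total_loss = loss_fn(model(x), y) + ewc.penalty()`.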
