Transferring Knowledge across Learning Processes

Sebastian Flennerhag, Pablo Garcia Moreno, Neil D. Lawrence, Andreas Damianou
International Conference on Learning Representations, 2019.

Abstract

In complex transfer learning scenarios new tasks might not be tightly linked to previous tasks. Approaches that transfer information contained only in the final parameters of a source model will therefore struggle. Instead, transfer learning at a higher level of abstraction is needed. We propose Leap, a framework that achieves this by transferring knowledge across learning processes. We associate each task with a manifold on which the training process travels from initialization to final parameters and construct a meta-learning objective that minimizes the expected length of this path. Our framework leverages only information obtained during training and can be computed on the fly at negligible cost. We demonstrate that our framework outperforms competing methods, both in meta-learning and transfer learning, on a set of computer vision tasks. Finally, we demonstrate that Leap can transfer knowledge across learning processes in demanding reinforcement learning environments (Atari) that involve millions of gradient steps.
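
The path-length objective can be sketched in a few lines of code. Below is a minimal numpy illustration on toy quadratic tasks; the tasks, learning rates, and step counts are illustrative assumptions, and the update shown is a first-order ("frozen trajectory") gradient of the squared path length on the loss surface, a simplification of the full formulation in the paper.

```python
import numpy as np

# Minimal sketch of the idea behind Leap on toy quadratic tasks.
# NOTE: the tasks, step counts, and learning rates are illustrative
# assumptions; the update is a first-order ("frozen trajectory")
# gradient of the squared path length, not the paper's full formulation.

rng = np.random.default_rng(0)

def make_task():
    """Hypothetical task: a random 2-D quadratic loss 0.5*(th-c)^T A (th-c)."""
    c = rng.normal(size=2)
    A = np.diag(rng.uniform(0.5, 2.0, size=2))
    loss = lambda th: 0.5 * (th - c) @ A @ (th - c)
    grad = lambda th: A @ (th - c)
    return loss, grad

theta0 = rng.normal(size=2)      # shared initialization (the meta-parameters)
meta_lr, task_lr, n_tasks, K = 0.1, 0.1, 5, 20

for _ in range(100):             # meta-training loop
    meta_grad = np.zeros_like(theta0)
    for _ in range(n_tasks):
        loss, grad = make_task()
        # Inner training run from the shared initialization; record the
        # path (theta_i, L(theta_i)) traced on the loss surface.
        path = [(theta0.copy(), loss(theta0))]
        th = theta0.copy()
        for _ in range(K):
            th = th - task_lr * grad(th)
            path.append((th.copy(), loss(th)))
        # Squared length of the path:
        #   sum_i ||theta_{i+1} - theta_i||^2 + (L(theta_{i+1}) - L(theta_i))^2.
        # Holding the trajectory fixed, its gradient decomposes into
        # per-segment terms that pull each step's start toward its end:
        for (th_i, L_i), (th_j, L_j) in zip(path[:-1], path[1:]):
            meta_grad += (th_i - th_j) + (L_i - L_j) * grad(th_i)
    theta0 = theta0 - meta_lr * meta_grad / n_tasks
```

Because the meta-gradient is assembled from the iterates a task produces anyway, it illustrates the abstract's point that Leap uses only information obtained during training and can be computed on the fly.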

Cite this Paper


BibTeX
@InProceedings{Flennerhag-transferring19,
  title = {Transferring Knowledge across Learning Processes},
  author = {Flennerhag, Sebastian and Garcia Moreno, Pablo and Lawrence, Neil D. and Damianou, Andreas},
  booktitle = {International Conference on Learning Representations},
  year = {2019},
  pdf = {https://openreview.net/attachment?id=Hkg-xgrYvH&name=original_pdf},
  url = {http://inverseprobability.com/publications/transferring-knowledge-across-learning-processes.html},
  abstract = {In complex transfer learning scenarios new tasks might not be tightly linked to previous tasks. Approaches that transfer information contained only in the final parameters of a source model will therefore struggle. Instead, transfer learning at a higher level of abstraction is needed. We propose Leap, a framework that achieves this by transferring knowledge across learning processes. We associate each task with a manifold on which the training process travels from initialization to final parameters and construct a meta-learning objective that minimizes the expected length of this path. Our framework leverages only information obtained during training and can be computed on the fly at negligible cost. We demonstrate that our framework outperforms competing methods, both in meta-learning and transfer learning, on a set of computer vision tasks. Finally, we demonstrate that Leap can transfer knowledge across learning processes in demanding reinforcement learning environments (Atari) that involve millions of gradient steps.}
}
Endnote
%0 Conference Paper
%T Transferring Knowledge across Learning Processes
%A Sebastian Flennerhag
%A Pablo Garcia Moreno
%A Neil D. Lawrence
%A Andreas Damianou
%B International Conference on Learning Representations
%D 2019
%F Flennerhag-transferring19
%U http://inverseprobability.com/publications/transferring-knowledge-across-learning-processes.html
%X In complex transfer learning scenarios new tasks might not be tightly linked to previous tasks. Approaches that transfer information contained only in the final parameters of a source model will therefore struggle. Instead, transfer learning at a higher level of abstraction is needed. We propose Leap, a framework that achieves this by transferring knowledge across learning processes. We associate each task with a manifold on which the training process travels from initialization to final parameters and construct a meta-learning objective that minimizes the expected length of this path. Our framework leverages only information obtained during training and can be computed on the fly at negligible cost. We demonstrate that our framework outperforms competing methods, both in meta-learning and transfer learning, on a set of computer vision tasks. Finally, we demonstrate that Leap can transfer knowledge across learning processes in demanding reinforcement learning environments (Atari) that involve millions of gradient steps.
RIS
TY - CPAPER
TI - Transferring Knowledge across Learning Processes
AU - Sebastian Flennerhag
AU - Pablo Garcia Moreno
AU - Neil D. Lawrence
AU - Andreas Damianou
BT - International Conference on Learning Representations
DA - 2019/04/26
ID - Flennerhag-transferring19
L1 - https://openreview.net/attachment?id=Hkg-xgrYvH&name=original_pdf
UR - http://inverseprobability.com/publications/transferring-knowledge-across-learning-processes.html
AB - In complex transfer learning scenarios new tasks might not be tightly linked to previous tasks. Approaches that transfer information contained only in the final parameters of a source model will therefore struggle. Instead, transfer learning at a higher level of abstraction is needed. We propose Leap, a framework that achieves this by transferring knowledge across learning processes. We associate each task with a manifold on which the training process travels from initialization to final parameters and construct a meta-learning objective that minimizes the expected length of this path. Our framework leverages only information obtained during training and can be computed on the fly at negligible cost. We demonstrate that our framework outperforms competing methods, both in meta-learning and transfer learning, on a set of computer vision tasks. Finally, we demonstrate that Leap can transfer knowledge across learning processes in demanding reinforcement learning environments (Atari) that involve millions of gradient steps.
ER -
APA
Flennerhag, S., Garcia Moreno, P., Lawrence, N.D. & Damianou, A. (2019). Transferring Knowledge across Learning Processes. International Conference on Learning Representations. Available from http://inverseprobability.com/publications/transferring-knowledge-across-learning-processes.html.
