A4 Peer-reviewed article in conference proceedings

Plastic and stable gated classifiers for continual learning




Authors: Kuo Nicholas I-Hsien, Harandi Mehrtash, Fourrier Nicolas, Walder Christian, Ferraro Gabriela, Suominen Hanna

Established conference name: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops

Publisher: IEEE Computer Society

Publication year: 2021

Journal: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops

Proceedings title: 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)

Journal name in the database: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops

First page: 3548

Last page: 3553

ISBN: 978-1-6654-4900-7

eISBN: 978-1-6654-4899-4

ISSN: 2160-7516

DOI: https://doi.org/10.1109/CVPRW53098.2021.00394


Abstract

Conventional neural networks are mostly high in plasticity but low in stability. Hence, catastrophic forgetting tends to occur over the sequential training of multiple tasks, and a backbone learner loses its ability to solve previously learnt tasks. Several studies have shown that catastrophic forgetting can be partially mitigated by freezing the feature extractor weights and sequentially training only the classifier network. Though these methods are effective at retaining knowledge, forgetting can still become severe if the classifier network is over-parameterised over many tasks. As a remedy, this paper presents a novel classifier design with high stability. Highway-Connection Classifier Networks (HCNs) leverage gated units to alleviate forgetting. When employed alone, they exhibit strong robustness against forgetting. In addition, they synergise well with many existing and popular continual learning archetypes. We release our codes at https://github.com/Nic5472K/CLVISION2021_CVPR_HCN
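The gated units the abstract refers to follow the highway-connection formulation, y = g * h(x) + (1 - g) * x, where the transform gate g controls how much of a new transformation is mixed in and the carry path (1 - g) * x preserves earlier behaviour. The following is a minimal NumPy sketch of one such gated layer; the class name, single-layer structure, and initialisation choices are illustrative assumptions, not the authors' released implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class HighwayLayer:
    """Illustrative highway-gated layer: y = g * h(x) + (1 - g) * x.

    The transform gate g in (0, 1) decides how much of the candidate
    transformation h(x) to mix in; the carry path (1 - g) * x leaves
    the input largely intact, which is the intuition behind the
    stability of gated classifiers. (Sketch only, not the paper's code.)
    """

    def __init__(self, dim, rng=None):
        rng = np.random.default_rng(rng)
        scale = 1.0 / np.sqrt(dim)
        self.W_h = rng.normal(0.0, scale, (dim, dim))
        self.b_h = np.zeros(dim)
        self.W_g = rng.normal(0.0, scale, (dim, dim))
        # Negative gate bias biases the layer towards carrying the input
        # through unchanged at the start of training.
        self.b_g = np.full(dim, -2.0)

    def forward(self, x):
        h = np.tanh(x @ self.W_h + self.b_h)   # candidate transformation
        g = sigmoid(x @ self.W_g + self.b_g)   # transform gate in (0, 1)
        return g * h + (1.0 - g) * x           # gated mixture

layer = HighwayLayer(dim=8, rng=0)
x = np.ones((4, 8))
y = layer.forward(x)
print(y.shape)  # (4, 8)
```

Because the gate starts biased towards carrying, the layer initially behaves close to the identity map, so later tasks can only gradually overwrite what earlier tasks wrote into the classifier.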



Last updated on 2024-11-26 at 16:03