Publication

BOWLL: A Deceptively Simple Open World Lifelong Learner

Roshni Kamath; Rupert Mitchell; Subarnaduti Paul; Kristian Kersting; Martin Mundt
In: Computing Research Repository eprint Journal (CoRR), Vol. abs/2402.04814, Pages 1-16, arXiv, 2024.

Abstract

Traditional machine learning excels on static benchmarks, but the real world is dynamic and seldom as carefully curated as test sets. Practical applications generally encounter undesired inputs, are required to deal with novel information, and need to ensure operation throughout their full lifetime, aspects where standard deep models struggle. These three elements may have been researched individually, but their practical conjunction, i.e., open world learning, is much less consolidated. In this paper, we posit that neural networks already contain a powerful catalyst to turn them into open world learners: the batch normalization layer. Leveraging its tracked statistics, we derive effective strategies to detect in- and out-of-distribution samples, select informative data points, and update the model continuously. This, in turn, allows us to demonstrate that existing batch-normalized models can be made more robust, less prone to forgetting over time, and be trained efficiently with less data.
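The core idea the abstract alludes to, that a batch normalization layer's tracked running mean and variance characterize the training distribution, can be sketched in a few lines. The following is a minimal illustration, not the paper's actual method: a hypothetical deviation score flags inputs whose activations do not fit a layer's tracked statistics.

```python
import numpy as np

def bn_ood_score(x, running_mean, running_var, eps=1e-5):
    """Score deviation of activations x from batch-norm tracked statistics.

    A hypothetical score for illustration: normalize x by the tracked
    running mean/variance (as a BN layer would at inference time) and
    return the mean squared normalized deviation. In-distribution inputs
    score near 1; larger values suggest out-of-distribution inputs.
    """
    z = (x - running_mean) / np.sqrt(running_var + eps)
    return float(np.mean(z ** 2))

# Toy check: tracked statistics are standard normal per feature.
rng = np.random.default_rng(0)
mean, var = np.zeros(8), np.ones(8)
in_dist = rng.normal(0.0, 1.0, size=8)    # matches tracked statistics
out_dist = rng.normal(5.0, 1.0, size=8)   # shifted distribution
assert bn_ood_score(in_dist, mean, var) < bn_ood_score(out_dist, mean, var)
```

Because these statistics are already maintained by every batch-normalized model, such a score comes essentially for free, which is the sense in which the abstract calls the approach "deceptively simple".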
