Severity of Catastrophic Forgetting in Object Detection for Autonomous Driving

Christian Witte; René Schuster; Syed Saqib Bukhari; Patrick Trampert; Didier Stricker; Georg Schneider

In: International Conference on Pattern Recognition Applications and Methods (ICPRAM-2023), February 22-24, Lisbon, Portugal, SciTePress, 2023.


Incorporating unseen data into pre-trained neural networks remains a challenging endeavor, as complete retraining is often impractical. Yet, training a network sequentially on data with different distributions can degrade performance on previously learned data, a phenomenon known as catastrophic forgetting. The sequential training paradigm and the mitigation of catastrophic forgetting are the subject of Continual Learning (CL). Forgetting poses a challenge for applications with changing distributions and prediction objectives, including Autonomous Driving (AD). Our work illustrates the severity of catastrophic forgetting in object detection for class- and domain-incremental learning. We formulate four hypotheses and investigate the impact of the ordering of sequential increments and of the underlying data distribution of AD datasets. Further, we examine the influence of different object detection architectures. The results of our empirical study highlight the major effects of forgetting in class-incremental learning. Moreover, we show that domain-incremental learning suffers less from forgetting but is highly dependent on the design of the experiments and the choice of architecture.
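As a minimal, hypothetical illustration (not the paper's experimental setup), catastrophic forgetting can be reproduced even in a one-parameter model: training sequentially on a second task with a conflicting objective overwrites what was learned for the first task, so the task-1 error rises sharply.

```python
import numpy as np

def sgd_train(w, xs, ys, lr=0.1, epochs=50):
    """Per-sample gradient descent on squared error for the model y = w * x."""
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            grad = 2.0 * (w * x - y) * x
            w -= lr * grad
    return w

def mse(w, xs, ys):
    return float(np.mean((w * xs - ys) ** 2))

xs = np.array([0.5, 1.0, 1.5])
task1_ys = 2.0 * xs    # task 1: targets follow y = 2x
task2_ys = -2.0 * xs   # task 2: conflicting targets, y = -2x

w = 0.0
w = sgd_train(w, xs, task1_ys)
err_after_task1 = mse(w, xs, task1_ys)   # near zero: task 1 is learned

w = sgd_train(w, xs, task2_ys)           # sequential training on task 2
err_after_task2 = mse(w, xs, task1_ys)   # task-1 error explodes: forgetting

print(err_after_task1, err_after_task2)
```

After the first phase the weight converges to roughly 2; after the second phase it converges to roughly -2, so the task-1 error jumps from near zero to about 18.7. Continual-learning methods aim to avoid exactly this overwrite without storing or retraining on all past data.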

witte2023severity.pdf (PDF, 1 MB)

Deutsches Forschungszentrum für Künstliche Intelligenz (German Research Center for Artificial Intelligence)