T2NER: Transformers based Transfer Learning Framework for Named Entity Recognition

Saadullah Amin, Günter Neumann

In: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations (EACL 2021), April 19-23, 2021, Kyiv, Ukraine (virtual), pages 212-220. ACL, April 2021.


Recent advances in deep transformer models have achieved state-of-the-art results in several natural language processing (NLP) tasks, whereas named entity recognition (NER) has traditionally benefited from long short-term memory (LSTM) networks. In this work, we present T2NER, a Transformers based Transfer Learning framework for Named Entity Recognition, implemented in PyTorch for the task of NER with deep transformer models. The framework is built upon the Transformers library as the core modeling engine and supports several transfer learning scenarios, from sequential transfer to domain adaptation, multi-task learning, and semi-supervised learning. It aims to bridge the gap between the algorithmic advances in these areas by combining them with the state-of-the-art in transformer models, providing a unified platform that is readily extensible and can be used both for transfer learning research in NER and for real-world applications. The framework is available at: https://github.com/suamin/t2ner.
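For context, NER systems such as the one described here typically emit per-token BIO tags that are then grouped into entity spans for evaluation. A minimal sketch of that decoding step in plain Python (illustrative only; this helper is not part of T2NER's actual API):

```python
def decode_bio(tags):
    """Decode a BIO tag sequence into (entity_type, start, end) spans.

    Illustrative helper, not part of T2NER: NER models produce per-token
    tags like "B-PER" (begin), "I-PER" (inside), and "O" (outside), which
    are grouped into half-open token spans for scoring.
    """
    spans = []
    start, etype = None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            # A new entity begins; close any span still open.
            if start is not None:
                spans.append((etype, start, i))
            start, etype = i, tag[2:]
        elif tag.startswith("I-") and etype == tag[2:]:
            # Continuation of the current entity.
            continue
        else:
            # "O" or an inconsistent I- tag closes any open span.
            if start is not None:
                spans.append((etype, start, i))
            start, etype = None, None
    if start is not None:
        spans.append((etype, start, len(tags)))
    return spans

# Example: "Saadullah Amin works at DFKI"
tags = ["B-PER", "I-PER", "O", "O", "B-ORG"]
print(decode_bio(tags))  # [('PER', 0, 2), ('ORG', 4, 5)]
```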


Further Links

2021.eacl-demos.25.pdf (PDF, 658 KB)

German Research Center for Artificial Intelligence