Publication
Evaluation of Transfer Learning Approaches for Cross-Lingual Question Answering
Christine Schäfer
Master's thesis, Deutsches Forschungszentrum für Künstliche Intelligenz, 2020.
Abstract
In cross-lingual question answering, systems try to find answers to natural language questions
in languages they were not (mainly) trained on. This thesis examines different approaches to
cross-lingual transfer on the XQA corpus [Liu et al., 2019a]. It first investigates the corpus
and compares it to other cross-lingual question answering datasets. The following chapters explore
several potential enhancements to the XQA baselines. The first investigates whether cross-lingual
word embeddings can be used for cross-lingual transfer in a QA model. The next part asks
whether small amounts of target-language training data can improve a model that was trained
in the source language. Another section explores how well training on one cross-lingual dataset
transfers to others. The final questions investigated are whether shallow input features that
proved helpful in non-neural baselines can enhance mBERT, and whether the paragraph selection
features in the baselines are suitable for the XQA dataset.