
Publication

Recursive tree grammar autoencoders

Benjamin Paaßen; Irena Koprinska; Kalina Yacef
In: Peggy Cellier; Krzysztof Dembczynski; Albrecht Zimmermann; Emilie Devijver (eds.). Machine Learning, Special Issue of the ECML PKDD 2022 Journal Track, pp. 1-31, Springer, 2022.

Abstract

Machine learning on trees has mostly focused on trees as input. Much less research has investigated trees as output, which has many applications, such as molecule optimization for drug discovery or hint generation for intelligent tutoring systems. In this work, we propose a novel autoencoder approach, called recursive tree grammar autoencoder (RTG-AE), which encodes trees via a bottom-up parser and decodes trees via a tree grammar, both learned via recursive neural networks that minimize the variational autoencoder loss. The resulting encoder and decoder can then be utilized in subsequent tasks, such as optimization and time series prediction. RTG-AE is the first model to combine three features: recursive processing, grammatical knowledge, and deep learning. Our key message is that this unique combination of all three features outperforms models which combine any two of the three. Experimentally, we show that RTG-AE improves autoencoding error, training time, and optimization score on synthetic as well as real datasets compared to four baselines. We further prove that RTG-AE parses and generates trees in linear time and is expressive enough to handle all regular tree grammars.
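To illustrate the recursive, grammar-driven encoding idea, here is a minimal sketch (not the authors' code): each symbol of a toy regular tree grammar gets its own small network, and a tree is encoded bottom-up by recursively encoding the children of each node and combining their codes with the network for that node's symbol. All names (`RULES`, `RuleEncoders`, `DIM`) and the grammar itself are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of grammar-driven bottom-up tree encoding (illustrative only).
import torch
import torch.nn as nn

DIM = 16  # latent dimensionality (assumed for this sketch)

# A toy regular tree grammar over Boolean formulas:
#   S -> and(S, S) | not(S) | x | y
# Each symbol is mapped to its arity (number of children).
RULES = {"and": 2, "not": 1, "x": 0, "y": 0}

class RuleEncoders(nn.Module):
    """One small network per grammar symbol, mapping child codes to a node code."""
    def __init__(self, rules, dim):
        super().__init__()
        self.layers = nn.ModuleDict({
            sym: nn.Linear(max(arity, 1) * dim, dim)
            for sym, arity in rules.items()
        })
        self.arities = dict(rules)
        self.dim = dim

    def encode(self, tree):
        """Recursively encode a tree given as (symbol, children), bottom-up."""
        sym, children = tree
        assert len(children) == self.arities[sym], "tree violates the grammar"
        if children:
            # concatenate the recursively computed child codes
            child_codes = torch.cat([self.encode(c) for c in children], dim=-1)
        else:
            # leaves start from a zero vector
            child_codes = torch.zeros(self.dim)
        return torch.tanh(self.layers[sym](child_codes))

# encode the formula and(not(x), y)
enc = RuleEncoders(RULES, DIM)
tree = ("and", [("not", [("x", [])]), ("y", [])])
print(enc.encode(tree).shape)  # torch.Size([16])
```

Because each node is visited exactly once and combined by a fixed-size per-rule network, this style of encoding runs in time linear in the number of tree nodes, consistent with the linear-time parsing and generation result stated in the abstract.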

Further Links