Variational Bayesian Sequence-to-Sequence Networks for Memory-Efficient Sign Language Translation

Title: Variational Bayesian Sequence-to-Sequence Networks for Memory-Efficient Sign Language Translation
Publication Type: Conference Proceedings
Year of Conference: 2020
Authors: Partaourides, H., Voskou, A., Kosmopoulos, D., Chatzis, S., Metaxas, D. N.
Conference Name: International Symposium on Visual Computing
Pagination: 251-262
Publisher: Springer International Publishing
Conference Location: Cham
ISBN Number: 978-3-030-64559-5
Abstract

Memory-efficient continuous Sign Language Translation is a significant challenge for the development of assisted technologies with real-time applicability for the deaf. In this work, we introduce a paradigm of designing recurrent deep networks whereby the output of the recurrent layer is derived from appropriate arguments from nonparametric statistics. A novel variational Bayesian sequence-to-sequence network architecture is proposed that consists of a) a full Gaussian posterior distribution for data-driven memory compression and b) a nonparametric Indian Buffet Process prior for regularization applied on the Gated Recurrent Unit non-gate weights. We dub our approach Stick-Breaking Recurrent network and show that it can achieve a substantial weight compression without diminishing modeling performance.