Publication Details
Soft Language Prompts for Language Transfer
cross-lingual transfer, multilinguality, less-resourced languages, language representations
Cross-lingual knowledge transfer, especially between high- and low-resource languages, remains challenging in natural language processing (NLP). This study offers insights for improving cross-lingual NLP applications through combinations of parameter-efficient fine-tuning methods. We systematically explore strategies for enhancing cross-lingual transfer by incorporating language-specific and task-specific adapters and soft prompts. We present a detailed investigation of various combinations of these methods, exploring their efficiency across 16 languages, with a focus on 10 mid- and low-resource languages. We further present, to our knowledge, the first use of soft prompts for language transfer, a technique we call soft language prompts. Our findings demonstrate that, contrary to claims in previous work, a combination of language and task adapters does not always work best; instead, combining a soft language prompt with a task adapter outperforms most configurations in many cases.
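The core idea behind soft prompts can be sketched in a few lines: a small set of trainable continuous vectors is prepended to the input embeddings, while the backbone model stays frozen. The sketch below is a minimal illustration under assumed toy dimensions; it is not the authors' implementation, and all names (`soft_language_prompt`, `embed_with_soft_prompt`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions for illustration, not from the paper)
vocab_size, d_model, prompt_len = 100, 16, 4

# Frozen token-embedding table of the backbone model
token_embeddings = rng.normal(size=(vocab_size, d_model))

# Trainable soft language prompt: prompt_len free vectors in embedding space
soft_language_prompt = rng.normal(size=(prompt_len, d_model))

def embed_with_soft_prompt(token_ids):
    """Prepend the soft language prompt to the input token embeddings.

    During training, only `soft_language_prompt` would receive gradients;
    the backbone (represented here by `token_embeddings`) stays frozen.
    """
    input_embeds = token_embeddings[token_ids]  # (seq_len, d_model)
    return np.concatenate([soft_language_prompt, input_embeds], axis=0)

seq = embed_with_soft_prompt(np.array([5, 17, 42]))
print(seq.shape)  # (7, 16): 4 prompt vectors + 3 token embeddings
```

In the paper's setting, such a prompt trained on a language-modeling objective for a given language (a soft language prompt) would then be combined with a task-specific module, such as a task adapter, at inference time.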
@INPROCEEDINGS{FITPUB13353,
  author    = "Ivan Vykopal and Simon Ostermann and Mari\'{a}n \v{S}imko",
  title     = "Soft Language Prompts for Language Transfer",
  pages     = "10294--10313",
  booktitle = "Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)",
  year      = 2025,
  location  = "Albuquerque, US",
  publisher = "Association for Computational Linguistics",
  ISBN      = "979-8-8917-6189-6",
  language  = "english",
  url       = "https://www.fit.vut.cz/research/publication/13353"
}