Conference dates: May 2-6, 2023
SCOPE
Based on the success of past low-resource machine translation (MT) workshops at AMTA 2018 (https://amtaweb.org/), MT Summit 2019 (https://www.mtsummit2019.com), AACL-IJCNLP 2020 (http://aacl2020.org/), AMTA 2021, and COLING 2022, we introduce the sixth LoResMT workshop. The workshop
provides a discussion forum for researchers working on MT systems and
methods for low-resource and under-represented languages. We aim to
review the state of MT for low-resource languages and to define the most
important research directions. We also solicit papers on supplementary
NLP tools that are used in any language, and especially in low-resource
languages; overview papers on such tools are very welcome. It will be
beneficial if the evaluations of these tools in research papers include
their impact on the quality of MT output.
TOPICS
We are highly
interested in (1) original research papers, (2) review/opinion papers,
and (3) online systems on the topics below; however, we welcome all
novel ideas that cover research on low-resource languages.
- COVID-related corpora, their translations and corresponding NLP/MT systems
- Neural machine translation for low-resource languages
- Work that presents online systems for practical use by native speakers
- Word tokenizers/de-tokenizers for specific languages
- Word/morpheme segmenters for specific languages
- Alignment/Re-ordering tools for specific language pairs
- Use of morphology analyzers and/or morpheme segmenters in MT
- Multilingual/cross-lingual NLP tools for MT
- Corpora creation and curation technologies for low-resource languages
- Review of available parallel corpora for low-resource languages
- Research and review papers on MT methods for low-resource languages
- MT systems/methods (e.g. rule-based, SMT, NMT) for low-resource languages
- Pivot MT for low-resource languages
- Zero-shot MT for low-resource languages
- Fast building of MT systems for low-resource languages
- Re-usability of existing MT systems for low-resource languages
- Machine translation for language preservation
SUBMISSION INFORMATION
We
are soliciting two types of submissions: (1) research, review, and
position papers and (2) system demonstration papers. For research,
review and position papers, the length of each paper should be at least
four (4) and not exceed eight (8) pages, plus unlimited pages for
references. For system demonstration papers, the limit is four (4)
pages. Submissions should be formatted according to the official EACL
2023 style templates (LaTeX, Word, Overleaf). Accepted papers will be
published online in the EACL 2023 proceedings and will be presented at
the conference.
Submissions must be anonymized and made through
the official conference management system (which will be
available in the coming weeks). Papers that have been or
will be submitted to other venues must be declared as such and must be
withdrawn from those venues if accepted and published at LoResMT.
Reviewing will be double-blind.
We encourage authors to cite related papers written in ANY
language, as long as both the original bibliographic entries and their
corresponding English translations are provided.
ORGANIZING COMMITTEE (LISTED ALPHABETICALLY)
Atul Kr. Ojha, University of Galway & Panlingua Language Processing LLP
Chao-Hong Liu, Potamu Research Ltd
Ekaterina Vylomova, University of Melbourne, Australia
Jade Abbott, Retro Rabbit
Jonathan Washington, Swarthmore College
Nathaniel Oco, National University (Philippines)
Tommi A Pirinen, UiT The Arctic University of Norway, Tromsø
Valentin Malykh, Huawei Noah’s Ark lab and Kazan Federal University
Varvara Logacheva, Skolkovo Institute of Science and Technology
Xiaobing Zhao, Minzu University of China
PROGRAM COMMITTEE (LISTED ALPHABETICALLY)
Alberto Poncelas, Rakuten, Singapore
Alina Karakanta, Fondazione Bruno Kessler
Amirhossein Tebbifakhr, Fondazione Bruno Kessler
Anna Currey, Amazon Web Services
Arturo Oncevay, University of Edinburgh
Aswarth Abhilash Dara, Amazon
Beatrice Savoldi, University of Trento
Bharathi Raja Chakravarthi, University of Galway
Bogdan Babych, Heidelberg University
Constantine Lignos, Brandeis University, USA
Daan van Esch, Google
Diptesh Kanojia, University of Surrey, UK
Duygu Ataman, University of Zurich
Eleni Metheniti, CLLE-CNRS and IRIT-CNRS
Francis Tyers, Indiana University
Jade Abbott, Retro Rabbit
Jasper Kyle Catapang, University of the Philippines
John P. McCrae, DSI, University of Galway
Kalika Bali, MSRI Bangalore, India
Kevin Patrick Scannell, Saint Louis University
Koel Dutta Chowdhury, Saarland University (Germany)
Liangyou Li, Noah’s Ark Lab, Huawei Technologies
Majid Latifi, University of York, York, UK
Maria Art Antonette Clariño, University of the Philippines Los Baños
Mathias Müller, University of Zurich
Monojit Choudhury, Microsoft Turing
Rajdeep Sarkar, University of Galway
Rico Sennrich, University of Zurich
Sangjee Dondrub, Qinghai Normal University
Santanu Pal, WIPRO AI
Sardana Ivanova, University of Helsinki
Shantipriya Parida, Silo AI
Sunit Bhattacharya, Charles University
Surafel Melaku Lakew, Amazon AI
CONTACT
Please email
loresmt@googlegroups.com if you have any questions/comments/suggestions.