Poster in Affinity Event: Muslims in ML
Neural Machine Translators (NMTs) as Efficient Forward and Backward Arabic Transliterators
Toyib Ogunremi · Anthony Soronnadi · Olamide Shogbamu · Olubayo Adekanmbi
Keywords: [ Translation ] [ Non-Latin Languages ] [ Transliteration ] [ Romanization ] [ Seq2Seq models ] [ Transformer ]
This study addresses the challenge of converting Romanized Arabic text back to its original Arabic script, a capability that remains largely unsupported by existing transliteration tools. We propose that both forward and backward transliteration can be effectively framed as machine translation problems. To test this hypothesis, we fine-tune three transformer-based Neural Machine Translation (NMT) pretrained language models (PLMs) from HuggingFace on parallel Arabic-script and Romanized-script datasets. Experimental results show that these models perform well, achieving ROUGE scores of approximately 99 and BLEU scores of approximately 95. Our findings underscore the potential of NMT models to handle transliteration accurately, offering a valuable resource for improving Arabic language accessibility and communication.
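To make the translation framing concrete, the sketch below shows how backward transliteration (Romanized Arabic to Arabic script) can be set up as a standard seq2seq fine-tuning job with the HuggingFace transformers library. This is a minimal illustration under stated assumptions, not the authors' actual pipeline: the checkpoint Helsinki-NLP/opus-mt-en-ar, the toy parallel pairs, and all hyperparameters are placeholders.

```python
# Minimal sketch: backward transliteration (Romanized Arabic -> Arabic script)
# treated as a translation task. The checkpoint, data, and hyperparameters are
# illustrative assumptions, not the configuration reported in the poster.
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)

checkpoint = "Helsinki-NLP/opus-mt-en-ar"  # placeholder NMT PLM
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Toy parallel pairs: Romanized source, Arabic-script target.
pairs = Dataset.from_dict({
    "src": ["marhaban bikum", "kayfa haluka"],
    "tgt": ["مرحبا بكم", "كيف حالك"],
})

def preprocess(batch):
    # Tokenize sources; target token ids become the labels for the
    # cross-entropy loss, exactly as in ordinary NMT fine-tuning.
    enc = tokenizer(batch["src"], truncation=True, max_length=64)
    enc["labels"] = tokenizer(
        text_target=batch["tgt"], truncation=True, max_length=64
    )["input_ids"]
    return enc

tokenized = pairs.map(preprocess, batched=True, remove_columns=["src", "tgt"])

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="romanized-ar-backward",
        learning_rate=2e-5,
        num_train_epochs=3,
        per_device_train_batch_size=8,
        predict_with_generate=True,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```

Forward transliteration is the same recipe with source and target columns swapped; at inference time, model.generate on a tokenized Romanized input yields the Arabic-script output, which can then be scored with BLEU and ROUGE as in the study.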