Jimma University Open Access Institutional Repository

Attention-based Amharic-to-Afaan Oromo Neural Machine Translation


dc.contributor.author Gashaw, Meron
dc.date.accessioned 2022-02-03T07:02:37Z
dc.date.available 2022-02-03T07:02:37Z
dc.date.issued 2021-01-23
dc.identifier.uri https://repository.ju.edu.et//handle/123456789/6168
dc.description.abstract Humans use natural language (NL) to convey meaning from one person or group to another. This language must be mutually understood by both communicating parties, which cannot be taken for granted in a world of more than 7 billion people. Consequently, a translator is needed between the two, and most of the time that translator is a human. Human translation, however, is expensive and inconvenient. The emergence of Natural Language Processing (NLP) and Machine Translation (MT) addresses this problem. MT is the automatic translation of a source language into a target language. Neural Machine Translation (NMT) uses neural network models to learn a statistical model for MT and aims at building a single neural network that can be jointly tuned to maximize translation performance. In this work, an attention-based Amharic-to-Afaan Oromo NMT system has been developed. The system is built on an encoder-decoder model using a Bidirectional Gated Recurrent Unit (BGRU). To compare performance, we also implemented the system without the attention mechanism. The non-attention-based system with the basic encoder-decoder architecture has some limitations: as sentence length increases, the interdependency of words is captured more and more loosely, so the architecture works well on shorter sentences but struggles to translate longer ones. Moreover, each word must be assigned a unique index when it is first encountered in the data, and as the dictionary grows, the dimension of the required word vectors grows with it. We observed that applying an attention mechanism to the system alleviates these problems. Prior to this work, there was no NMT system translating Amharic to Afaan Oromo. Therefore, an attention-based Amharic-to-Afaan Oromo neural machine translation system was developed and its performance was compared with that of the non-attention-based system. The BLEU score was used for evaluation; we recorded a BLEU score of 61.49 for the non-attention-based system and 67.82 for the attention-based Amharic-to-Afaan Oromo neural machine translation system (an illustrative sketch of the attention-based encoder-decoder follows this record). en_US
dc.language.iso en_US en_US
dc.subject Natural Language Processing en_US
dc.subject Machine Translation en_US
dc.subject Neural Machine Translation en_US
dc.subject Recurrent Neural Network en_US
dc.subject Local Attention en_US
dc.title Attention-based Amharic-to-Afaan Oromo Neural Machine Translation en_US
dc.type Thesis en_US
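
The following is a minimal, illustrative sketch of an attention-based encoder-decoder with a bidirectional GRU encoder, written in PyTorch as a companion to the abstract above. It is not the thesis's actual implementation: the framework choice, layer sizes, vocabulary sizes, and the general (Luong-style) attention scoring used here are assumptions made for illustration only; the thesis itself lists local attention among its subjects.

# Illustrative sketch only: a minimal attention-based encoder-decoder with a
# bidirectional GRU encoder, in the spirit of the architecture described in the
# abstract. All sizes and the attention variant are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    def __init__(self, src_vocab, emb_dim=256, hid_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(src_vocab, emb_dim)
        # Bidirectional GRU reads the Amharic source sentence in both directions.
        self.gru = nn.GRU(emb_dim, hid_dim, bidirectional=True, batch_first=True)
        self.fc = nn.Linear(2 * hid_dim, hid_dim)  # merge forward/backward final states

    def forward(self, src):
        embedded = self.embedding(src)                           # (B, S, E)
        outputs, hidden = self.gru(embedded)                     # outputs: (B, S, 2H)
        # Combine the last forward and backward hidden states for the decoder.
        hidden = torch.tanh(self.fc(torch.cat((hidden[0], hidden[1]), dim=1)))
        return outputs, hidden.unsqueeze(0)                      # (B, S, 2H), (1, B, H)


class AttnDecoder(nn.Module):
    def __init__(self, tgt_vocab, emb_dim=256, hid_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(tgt_vocab, emb_dim)
        self.attn = nn.Linear(hid_dim, 2 * hid_dim)              # scores decoder state vs. encoder outputs
        self.gru = nn.GRU(emb_dim + 2 * hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, tgt_tok, hidden, enc_outputs):
        embedded = self.embedding(tgt_tok)                       # (B, 1, E)
        # General (Luong-style) attention: score every encoder position,
        # then build a context vector as the weighted sum of encoder outputs.
        query = self.attn(hidden[-1]).unsqueeze(1)               # (B, 1, 2H)
        scores = torch.bmm(query, enc_outputs.transpose(1, 2))   # (B, 1, S)
        weights = F.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_outputs)                # (B, 1, 2H)
        output, hidden = self.gru(torch.cat((embedded, context), dim=2), hidden)
        return self.out(output.squeeze(1)), hidden, weights


# Tiny smoke test with hypothetical vocabulary sizes.
if __name__ == "__main__":
    enc, dec = Encoder(src_vocab=8000), AttnDecoder(tgt_vocab=8000)
    src = torch.randint(0, 8000, (2, 12))                        # batch of 2 source sentences
    tgt_tok = torch.randint(0, 8000, (2, 1))                     # previous target token
    enc_out, hidden = enc(src)
    logits, hidden, attn_weights = dec(tgt_tok, hidden, enc_out)
    print(logits.shape, attn_weights.shape)                      # (2, 8000) and (2, 1, 12)

Running the smoke test prints a logits shape of (2, 8000) and an attention-weight shape of (2, 1, 12), showing that each decoding step of the target sentence attends over all twelve source positions rather than relying on a single fixed-length sentence vector, which is the property the abstract credits for the improved handling of longer sentences.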

