
DEEP NEURAL NETWORK BASED NATURAL LANGUAGE INFERENCE MODEL

dc.contributor.author BEKELE, TESFAYE
dc.date.accessioned 2020-10-06T12:15:50Z
dc.date.available 2020-10-06T12:15:50Z
dc.date.issued 2020
dc.identifier.uri http://hdl.handle.net/123456789/11274
dc.description.abstract Addressing the natural language understanding (NLU) problem essentially means enabling a computer to replace a human in tasks that require semantic understanding. Natural language inference (NLI) can be applied to paired-document applications by reducing them to a textual entailment task: the source document is taken as the premise (a coherent text) and the resulting new document as the hypothesis (a language expression hypothesized to be drawn from the premise). NLU is challenging because of the variability and ambiguity of language expressions, and most DNN-based models reach state-of-the-art results on these tasks only with very large datasets. We propose a deep neural network based natural language inference model that uses a combined sentence encoder in its sentence embedding layer. Most researchers recommend RNNs for NLP tasks in which the current component depends on the previous history, while the current state-of-the-art language modeling approach is the Transformer; we combine these two approaches at the sentence embedding layer. A bidirectional LSTM with average pooling is implemented alongside a Transformer sub-model consisting of a blocked multi-head attention mechanism with positional embedding (a sketch of the combined encoder follows this record). We also propose and implement dynamic dropout throughout the model layers and two training controllers, namely early stopping patience and learning rate patience (also sketched below), and we train the model for 10 epochs. We further implement each encoder separately and compare its performance against the proposed model. We prepared an Amharic language inference dataset and additionally used the SNLI dataset. The proposed model correctly predicts 88.91% of the Amharic test set, outperforming the BiLSTM-only, Transformer-only, and averaged models by 5.95%, 6.5%, and 7.13% respectively. It also correctly predicts 87.06% of the SNLI test set, outperforming the LSTM-based model by 2.01%, the Transformer sub-model by 3.31%, and the averaged model by 3.16%. en_US
dc.language.iso en en_US
dc.subject Computer Science en_US
dc.title DEEP NEURAL NETWORK BASED NATURAL LANGUAGE INFERENCE MODEL en_US
dc.type Thesis en_US
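
The combined sentence encoder described in the abstract can be illustrated with a short sketch. The following is a minimal Keras sketch, not the thesis code: the layer sizes (VOCAB, MAX_LEN, EMB_DIM, UNITS, HEADS) and the concatenation used to join the two branches are assumptions. It shows a BiLSTM branch summarized by average pooling next to a Transformer-style branch (learned positional embedding plus one multi-head self-attention block), whose outputs are combined into a single sentence vector.

# Minimal sketch of a combined sentence encoder (assumed sizes, not the
# thesis configuration): a BiLSTM-with-average-pooling branch alongside a
# Transformer sub-model branch, concatenated into one sentence vector.
import tensorflow as tf
from tensorflow.keras import layers

VOCAB, MAX_LEN, EMB_DIM, UNITS, HEADS = 20000, 40, 300, 256, 4  # assumptions


class PositionalEmbedding(layers.Layer):
    """Token embedding plus a learned positional embedding."""

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.tok = layers.Embedding(VOCAB, EMB_DIM)
        self.pos = layers.Embedding(MAX_LEN, EMB_DIM)

    def call(self, x):
        positions = tf.range(start=0, limit=tf.shape(x)[-1], delta=1)
        return self.tok(x) + self.pos(positions)


def combined_sentence_encoder() -> tf.keras.Model:
    tokens = layers.Input(shape=(MAX_LEN,), dtype="int32")

    # Branch 1: bidirectional LSTM over plain token embeddings,
    # summarized by average pooling into one vector.
    tok_emb = layers.Embedding(VOCAB, EMB_DIM)(tokens)
    lstm_seq = layers.Bidirectional(layers.LSTM(UNITS, return_sequences=True))(tok_emb)
    lstm_vec = layers.GlobalAveragePooling1D()(lstm_seq)

    # Branch 2: Transformer sub-model: positional embedding, then one
    # multi-head self-attention block with a residual connection and
    # layer normalization, then average pooling.
    pos_emb = PositionalEmbedding()(tokens)
    attn = layers.MultiHeadAttention(num_heads=HEADS, key_dim=EMB_DIM // HEADS)(
        pos_emb, pos_emb
    )
    block = layers.LayerNormalization()(pos_emb + attn)
    trans_vec = layers.GlobalAveragePooling1D()(block)

    # The two encodings are combined into a single sentence representation.
    sentence_vec = layers.Concatenate()([lstm_vec, trans_vec])
    return tf.keras.Model(tokens, sentence_vec)

A full NLI classifier would encode the premise and the hypothesis with this encoder and feed the pair of sentence vectors to a softmax over the three entailment labels; the dynamic dropout the abstract mentions would be applied between layers, with rates the abstract does not specify.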
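
The two training controllers can likewise be sketched with standard Keras callbacks. This is an assumption about the mechanism, not the thesis code: ReduceLROnPlateau implements a learning rate patience and EarlyStopping an early stopping patience; the monitored metric, reduction factor, and patience values below are illustrative, and only the 10-epoch budget comes from the abstract.

# Sketch of the two training controllers using standard Keras callbacks.
# Patience values, the factor, and the monitored metric are assumptions.
from tensorflow.keras.callbacks import EarlyStopping, ReduceLROnPlateau

controllers = [
    # Learning rate patience: halve the LR when validation loss plateaus.
    ReduceLROnPlateau(monitor="val_loss", factor=0.5, patience=1),
    # Early stopping patience: stop once validation loss stops improving.
    EarlyStopping(monitor="val_loss", patience=3, restore_best_weights=True),
]

# Hypothetical usage with a compiled model and prepared data:
# model.fit(train_pairs, train_labels, validation_data=(val_pairs, val_labels),
#           epochs=10, callbacks=controllers)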

