Abstract:
Currently, scholars are conducting research on a variety of natural language processing (NLP) tasks, such as machine translation, question answering, information extraction, and text summarization, for languages such as Arabic, English, and Chinese. Semantic role labeling (SRL) has become a hot research issue and one of the main areas of focus, since it is a crucial sentence-level semantic task that specifies the role of each argument with respect to a predicate in a given text and serves as input to other NLP tasks. Unfortunately, previous researchers have not addressed the semantic relationships between constituents and predicates in Amharic sentences. To fill this gap, we developed a context-based semantic role labeling model for the Amharic language using a deep learning approach, Bidirectional Long Short-Term Memory (BiLSTM) networks, taking into account the different senses of predicates in simple Amharic sentences during dataset annotation. The data were collected from social media platforms and Amharic student textbooks and were annotated semantically according to the PropBank annotation guidelines with the help of linguistic experts from Wollo University. The dataset contains 40 predicates that each have more than one contextual meaning; each occurrence was annotated according to its sense, and different role labels were assigned to multi-sense predicate data for training and testing the model. MLP classifiers assign each argument to its associated role label based on the scores predicted by a biaffine attentional scorer. The proposed model achieved 95.9% training accuracy and 84.9% testing accuracy.
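To illustrate the scoring step mentioned above, the following is a minimal sketch of a biaffine attentional scorer for (predicate, argument) pairs. All dimensions, variable names, and the use of NumPy here are illustrative assumptions; the abstract does not specify the actual implementation details of the model.

```python
import numpy as np

# Hypothetical sketch: a biaffine scorer assigns one score per role label
# to a (predicate, argument) pair of BiLSTM hidden states h_p and h_a:
#   score_r = h_p^T U_r h_a + w_r^T [h_p; h_a] + b_r
rng = np.random.default_rng(0)
d = 8          # BiLSTM hidden-state size (assumed)
n_roles = 5    # number of semantic role labels (assumed)

U = rng.normal(size=(n_roles, d, d))   # bilinear term, one matrix per role
W = rng.normal(size=(n_roles, 2 * d))  # linear term over the concatenation
b = rng.normal(size=n_roles)           # per-role bias

def biaffine_scores(h_p, h_a):
    """Return one score per role label for a (predicate, argument) pair."""
    bilinear = np.einsum('i,rij,j->r', h_p, U, h_a)
    linear = W @ np.concatenate([h_p, h_a])
    return bilinear + linear + b

h_p = rng.normal(size=d)  # predicate representation
h_a = rng.normal(size=d)  # candidate-argument representation
scores = biaffine_scores(h_p, h_a)
# A classifier would then select the highest-scoring role label:
predicted_role = int(np.argmax(scores))
```

In the paper's described pipeline, such scores would be passed to MLP classifiers that select the final role label for each argument; here a simple argmax stands in for that classification step.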