Abstract
The purpose of this study was to evaluate the practice of testing writing skills. The participants were the grade ten English language teachers at Damot General Secondary School in the 2016-2017 academic year, along with nine grade ten students at the school in the same year. To achieve this purpose, four data-gathering instruments were employed: document analysis, observation, in-depth interviews, and a focus group discussion. Through document analysis, data were gathered from three years of English language final examinations (2014-2016). Through observation, data were collected from three sections, focusing on the administration of writing tests. In-depth interviews were then conducted with three teachers who taught English at grade ten. Finally, a focus group discussion was held with nine grade ten students in the 2016-2017 academic year. Except for the data on content validity, which were analyzed quantitatively, all data collected from the different sources were analyzed qualitatively. The major findings were that the procedures employed to develop writing skill tests (design, operationalization, and administration) were not properly applied, and the procedures used at these stages were not systematic enough to increase test usefulness. The findings also revealed that the practice of testing writing skills had a negative impact on the content and construct validity of the tests. Furthermore, the study showed that students were not assessed frequently and did not receive corrective feedback from which to learn from their errors. Finally, recommendations were provided based on the major findings to minimize the problems identified in the practice of testing writing skills. For instance, teachers should use a table of specifications when preparing tests, so that the topics listed under the contents heading of the specification are covered by the items in the test.