DEVELOPMENT AND VALIDATION OF ESSAY TEST ASSESSOR FOR SENIOR SCHOOL CERTIFICATE EXAMINATION IN NIGERIA

Date

2021-03

Publisher

UNIVERSITY OF ILORIN

Abstract

Objective and essay tests are commonly used measurement instruments for assessing learning outcomes. The advent of computers for scoring tests has improved the speed and accuracy of assessment, even for large-scale examinations. Computer scoring of objective tests is accurate, but the same is not true of essay tests. Moreover, research on the assessment of essay tests using automated grading systems is scarce in Nigeria. Studies have shown that automated scoring of essay tests has depended on training computers with as many as 100 or more model answers. There is a need to improve on this; therefore, this study was designed to use a single model answer for scoring essay tests. The objectives of the study were to: (a) develop the Essay Test Assessor (ETA); (b) examine the scores from the ETA for grading Economics essay test items; (c) examine the scores from human raters; (d) determine the validity of students’ scores from the ETA and human raters; and (e) determine the efficiency and speed with which the ETA operates in assessing essay test items. The study adopted a correlational research design for the development and validation of the Essay Test Assessor. A sample of 1,200 Senior Secondary II students was drawn from a population of 11,932 students offering Economics in public schools in South-west Nigeria. A multi-stage sampling procedure was used to select three of the six states and 55 of the 1,102 public secondary schools across South-west Nigeria. All responses were scored by 32 human raters and by the ETA. The data collected were analysed using descriptive and inferential statistics. Respondents’ scores that fell between 1–6, 7–12 and 13–18 were categorised as low, average and high respectively, based on their general performance. The findings of the study were that: i. the developed ETA was validated by human raters (experts) with a grand mean of 9.57; ii. analysis of scores from the ETA showed that 11.7%, 57.9% and 30.5% of the sampled students were in the low, average and high score categories respectively; iii. analysis of scores from the human raters showed that 40.97%, 46.8% and 12.23% of the sampled students were in the low, average and high score categories respectively; iv. the level of agreement between the ETA and human raters was 0.79 using linear Kappa and 0.69 using Spearman’s rho; and v. the mean time taken by the ETA to score across all items and persons was 0.00002 seconds, compared with 5.59 minutes for human raters. The study concluded that the ETA yielded valid scores, comparable to those of human raters, for scoring short-response essay items. The ETA confirmed the possibility of using a single model answer for scoring short-response essay test items, implying that training grading systems on multiple model answers can be dispensed with: a single model answer is feasible and effective. The study recommended that the ETA be trial-tested and ultimately adopted by senior school examination bodies for scoring short-response essay items in Economics.
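This record does not include the ETA's implementation; the sketch below, in Python, only illustrates the general idea described in the abstract: scoring a response against a single model answer, banding totals into the study's 1–6 / 7–12 / 13–18 categories, and checking agreement with human-rater marks using Spearman's rho. The token-overlap similarity, function names and sample data are illustrative assumptions, not the ETA's actual method.

```python
# Minimal sketch of single-model-answer scoring -- NOT the ETA's published
# algorithm. The similarity measure, maximum mark and data are hypothetical.
import re
from scipy.stats import spearmanr

def tokens(text):
    """Lower-cased word tokens of a response."""
    return set(re.findall(r"[a-z']+", text.lower()))

def score_item(response, model_answer, max_mark):
    """Scale token overlap (Jaccard) with the one model answer to the item's mark."""
    r, m = tokens(response), tokens(model_answer)
    overlap = len(r & m) / len(r | m) if (r | m) else 0.0
    return round(overlap * max_mark)

def categorise(total):
    """Band a total out of 18 as in the study: 1-6 low, 7-12 average, 13-18 high."""
    return "low" if total <= 6 else "average" if total <= 12 else "high"

if __name__ == "__main__":
    model = "Inflation is a sustained rise in the general price level of goods and services."
    responses = [
        "Inflation means a continuous rise in the general price level.",
        "It is when prices of goods and services keep going up over time.",
        "Inflation is about money.",
    ]
    eta_scores = [score_item(r, model, 18) for r in responses]
    human_scores = [14, 11, 4]  # hypothetical human-rater marks
    print([categorise(s) for s in eta_scores])
    rho, _ = spearmanr(eta_scores, human_scores)
    print(rho)  # rank agreement between machine and human marks
```

The final agreement check mirrors the abstract's reported Spearman's rho of 0.69 between the ETA and human raters; a linear weighted Kappa on the banded categories could be computed analogously with a library that provides it.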

Keywords

DEVELOPMENT, VALIDATION, ESSAY TEST ASSESSOR, SENIOR SCHOOL CERTIFICATE EXAMINATION, NIGERIA
