ITEM PARAMETER ESTIMATES OF UNIFIED TERTIARY MATRICULATION MATHEMATICS EXAMINATION USING ITEM RESPONSE THEORY 3-PARAMETER LOGISTIC MODEL
CHRISTIANA AMAECHI UGODULUNWA
Evaluation, Research and Statistics Unit, Department of Educational Foundations, Faculty of Education, Nnamdi Azikiwe University, Awka, Anambra State, Nigeria.
NJIDEKA GERTRUDE MBELEDE *
Evaluation, Research and Statistics Unit, Department of Educational Foundations, Faculty of Education, Nnamdi Azikiwe University, Awka, Anambra State, Nigeria.
NNEKA CHINYERE EZEUGO
Evaluation, Research and Statistics Unit, Department of Educational Foundations, Faculty of Education, Nnamdi Azikiwe University, Awka, Anambra State, Nigeria.
LYDIA IJEOMA ELEJE
Evaluation, Research and Statistics Unit, Department of Educational Foundations, Faculty of Education, Nnamdi Azikiwe University, Awka, Anambra State, Nigeria.
CLEMENTINA METU IFEOMA
Evaluation, Research and Statistics Unit, Department of Educational Foundations, Faculty of Education, Nnamdi Azikiwe University, Awka, Anambra State, Nigeria.
*Author to whom correspondence should be addressed.
Abstract
Item response theory (IRT) has become a distinctive methodological framework for analysing response data from assessments in education and other fields of study. This study assessed the item parameter estimates of Nigeria's Unified Tertiary Matriculation Examination (UTME) mathematics instrument using the IRT 3-parameter logistic (3PL) model and determined the extent to which the items of the instrument conformed to the revised Bloom's taxonomy of educational objectives. A descriptive validation design was adopted. The population of the study comprised the 3,320 SS 3 students in public secondary schools in Abia State who registered for the 2020 UTME and opted for mathematics, from which 40 students were selected. Simple random and purposive sampling techniques were applied in a multistage procedure. The mathematics instruments used in the 2017, 2018 and 2019 UTME were the instruments for data collection. They comprised 120 multiple-choice items (40 per year), each with a single stem and four options: one correct answer and three distractors. The b- (difficulty), a- (discrimination) and c- (pseudo-guessing) parameters of the individual items were estimated. The findings revealed that a very high percentage of the items (95% in 2017, 95% in 2018 and 85% in 2019) were of moderate difficulty and discriminated adequately between high- and low-performing students. Moreover, the pseudo-guessing parameter estimates indicated a low level of guessing, since a high percentage of the items (90% in 2017, 92.5% in 2018 and 90% in 2019) survived. The researchers recommended, among other things, that examination bodies in Nigeria should strengthen the quality of their test items by conducting item analysis within an IRT framework.
Keywords: Item parameter, tertiary matriculation, mathematics examination, 3-parameter logistic model
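For readers unfamiliar with the 3PL model referenced in the abstract, the minimal Python sketch below shows the item characteristic curve, P(θ) = c + (1 − c) / (1 + exp(−a(θ − b))), and an illustrative item-screening rule of the kind used to decide whether an item "survives". The cut-off values for b, a and c, the example parameter triples, and the helper names p_3pl and item_survives are assumptions for illustration only, not the study's published criteria or estimates.

```python
import numpy as np

# Item characteristic curve under the 3PL model:
#   P(theta) = c + (1 - c) / (1 + exp(-a * (theta - b)))
def p_3pl(theta, a, b, c):
    """Probability that an examinee of ability theta answers the item correctly."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

def item_survives(a, b, c):
    """Apply the screening rules assumed here (hypothetical cut-offs,
    common rules of thumb rather than the study's criteria):
      moderate difficulty:     -2.0 <= b <= 2.0
      adequate discrimination:  a >= 0.65
      acceptable guessing:      c <= 0.35
    """
    return (-2.0 <= b <= 2.0) and (a >= 0.65) and (c <= 0.35)

if __name__ == "__main__":
    # Hypothetical (a, b, c) estimates for three items
    items = [(1.20, 0.40, 0.18),   # survives all three checks
             (0.30, 0.10, 0.15),   # fails: discrimination too low
             (1.05, 2.80, 0.22)]   # fails: too difficult
    survivors = [p for p in items if item_survives(*p)]
    print(f"{100 * len(survivors) / len(items):.1f}% of items survived")

    # Probability of success on the first item across the ability scale
    theta = np.linspace(-3.0, 3.0, 7)
    print(np.round(p_3pl(theta, *items[0]), 3))
```

Note the role of c in the curve: even examinees of very low ability answer correctly with probability near c, which is why the pseudo-guessing parameter is screened separately from difficulty and discrimination.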