The study employed an instrumentation research design and applied item response theory (IRT) to develop and validate a Basic Science multiple-choice test. A sample of 600 junior secondary school II students was randomly selected from 20 government co-educational secondary schools in the Udi education zone of Enugu State, Nigeria. The study was guided by six research questions. A 40-item Basic Science multiple-choice test constructed by the researchers was used to collect data. Three experts, two of whom were drawn from the Departments of Science Education and Educational Foundations respectively, subjected the instrument to face and content validation. A reliability index of 0.85 was obtained. The data were analyzed using the maximum likelihood estimation technique of the BILOG-MG computer program. The analysis showed that the final instrument consisted of 40 items with appropriate indices, and it was used to assess students' ability in Basic Science. The results confirmed the reliability of the Basic Science multiple-choice items under the three-parameter logistic (3PL) model. The findings further revealed that the items were difficult overall and that differential item functioning existed between male and female learners. Recommendations in line with the findings were made, including that teachers and examination bodies should adopt and encourage IRT in developing the test instruments used to measure students' ability in Basic Science and other subjects.
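For readers unfamiliar with the three-parameter logistic (3PL) model used in the study, the sketch below shows how it expresses the probability of a correct response as a function of examinee ability and three item parameters: discrimination, difficulty, and pseudo-guessing. The parameter values are purely illustrative assumptions, not estimates from the study (which were obtained with the BILOG-MG program).

```python
import math

def p_correct_3pl(theta: float, a: float, b: float, c: float) -> float:
    """3PL item characteristic curve: probability that an examinee
    with ability theta answers the item correctly.
    a = discrimination, b = difficulty, c = pseudo-guessing lower asymptote."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# Illustrative (hypothetical) parameters: a moderately discriminating,
# fairly difficult item with a 20% guessing floor.
prob_average_student = p_correct_3pl(theta=0.0, a=1.2, b=1.0, c=0.2)
# At theta == b, the probability is exactly c + (1 - c) / 2.
prob_at_difficulty = p_correct_3pl(theta=1.0, a=1.2, b=1.0, c=0.2)  # 0.6
```

Under this model, a "difficult" item (as the study found the test items to be) is one with a high b value, so that only examinees of above-average ability answer it with probability well above the guessing floor c.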
In submitting the manuscript to the International Journal on Integrated Education (IJIE), the authors certify that:
- They are authorized by their co-authors to enter into these arrangements.
- The work described has not been formally published before, except in the form of an abstract or as part of a published lecture, review, thesis, or overlay journal.
- The work is not under consideration for publication elsewhere.
- The publication has been approved by the author(s) and by responsible authorities – tacitly or explicitly – of the institutes where the work has been carried out.
- They secure the right to reproduce any material that has already been published or copyrighted elsewhere.
- They agree to the following license and copyright agreement.
License and Copyright Agreement
Authors who publish with International Journal on Integrated Education (IJIE) agree to the following terms:
- Authors retain copyright and grant the International Journal on Integrated Education (IJIE) right of first publication, with the work simultaneously licensed under the Creative Commons Attribution License (CC BY 4.0), which allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.
- Authors can enter into separate, additional contractual arrangements for the non-exclusive distribution of the International Journal on Integrated Education (IJIE) published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) before and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.