Review Article | Peer-Reviewed

English Language Assessment in Nepal: Policies, Practices and Problems

Received: 1 November 2023    Accepted: 17 November 2023    Published: 22 December 2023
Abstract

The study investigates the high-stakes English test of the SEE (Secondary Education Examination), emphasizing its mismatch with the language skills set out in the curriculum and prompting an exploration of test design processes and the factors that influence them. The research employs a qualitative case study approach. The findings reveal a gap between regulations that value language skills and their practical implementation in assessments. The tests rest on a traditional testing philosophy, with inadequate standardization of both test items and test administration. Moreover, the evaluations pointed to factors relating to the teacher, the institution, and the students themselves that affect student learning. The four language skills (listening, speaking, reading, and writing) are valued by assessment regulations, but the classroom environment and evaluation system seldom put this into practice. Summative public exams, administered externally, provide no feedback on teaching and primarily serve student progression. Although the efficacy of communicative language instruction is recognized, the prevalence of non-communicative approaches in exams raises questions about the development of students' communicative skills. While the testing literature is replete with theoretical discussions of test design, review, and validation, little attention has been paid to how high-stakes language exams are actually constructed, particularly in developing societies. Tests used in external examinations at various stages of schooling in these societies are of special importance. Although public examinations in English and other subjects have been used in Nepal for decades, there has been little research on how these tests are created, what learning or achievement they target for evaluation, and what repercussions they may have for students and their families, the education system, and society at large.
The researcher concludes by advocating the adoption of theoretical advances in testing within the Nepalese education system and globally, emphasizing the importance of critically examining discrepancies between regulation and practice and inconsistencies between testing and the curriculum.

Published in International Journal of English Teaching and Learning (Volume 1, Issue 1)
DOI 10.11648/j.ijetl.20230101.14
Page(s) 22-33
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2024. Published by Science Publishing Group

Keywords

Curriculum, Language Teaching, Language Test Design, High-Stakes Testing, School Leaving Examinations, Testing Across Societies, Nepal

References
[1] Alderson, J. C., Clapham, C. and Wall, D. (1995). Language Test Construction and Evaluation. Cambridge: Cambridge University Press. http://library.uc.edu.kh/userfiles/pdf/10.Language%20Test%20construction%20and%20evaluation.pdf
[2] Ali, M., & Walker, A. L. (2014). ‘Bogged down’ ELT in Bangladesh: Problems and policy. English Today, 30 (2), 33–38. https://doi.org/10.1017/s0266078414000108
[3] Allen, D. (2016). Japanese Cram Schools and Entrance Exam Washback. The Asian Journal of Applied Linguistics, 3 (1): 54–67. https://caes.hku.hk/ajal/index.php/ajal/article/view/338/412
[4] Ashadi, A., & Rice, S. (2016). High stakes testing and teacher access to professional opportunities: lessons from Indonesia. Journal of Education Policy, 31 (6), 727–741. https://doi.org/10.1080/02680939.2016.1193901
[5] Asian Development Bank. (2017). Innovative Strategies for Accelerated Human Resources Development in South Asia: Asian Development Bank. https://doi.org/10.22617/tcs179079
[6] Au, W. (2009). Unequal by Design: High-Stakes Testing and the Standardization of Inequality. New York, NY and London: Routledge.
[7] Bachman, L. F., & Palmer, A. S. (1996). Language Testing in Practice: Designing and Developing Useful Language Tests. Oxford: Oxford University Press.
[8] Bachman, L. F. (1990). Fundamental Considerations in Language Testing. Oxford: Oxford University Press.
[9] Balwanz, D. (2016). The Discursive Hold of the Matric: Is There Space for New Vision for Secondary Education in South Africa? In W. C. Smith (Ed.), The Global Testing Culture: Shaping Education Policy, Perceptions, and Practice (pp. 261–278). Oxford: Symposium Books. https://www.researchgate.net/publication/287646307_The_Global_Testing_Culture_Shaping_Education_Policy_Perceptions_and_Practice
[10] Bajracharya, N. (2016, June 14). Big reforms ahead in Nepal’s education. Himalayantimes.Com. https://thehimalayantimes.com/opinion/dynamism-education-system-new-policy-needed
[11] Bhattarai, G. R. (2006). English teaching situation in Nepal: Elaboration of the theme for panel discussion in the 40th TESOL conference. Journal of NELTA, 11. Kathmandu: NELTA. https://www.nepjol.info/index.php/NELTA
[12] Bloem, S. (2015). PISA for low- and middle-income countries. Compare: A Journal of Comparative and International Education, 45 (3), 481–486. https://doi.org/10.1080/03057925.2015.1027513
[13] Brown, G. T., & Hirschfeld, G. H. (2008). Students’ conceptions of assessment: Links to outcomes. Assessment in Education: Principles, Policy & Practice, 15 (1), 3–17. https://doi.org/10.1080/09695940701876003
[14] Brown, H. D., & Lee, H. (2015). Teaching by principles: An interactive approach to language pedagogy, (4th ed.,). New York: Pearson.
[15] Caddell, M. (2007). Education and change: A historical perspective on schooling, development and the Nepali nation-state. In K. Kumar, & J. Oesterheld (Eds.), Education and social change in South Asia (pp. 251-284). New Delhi: Orient Longman.
[16] Cheng, L. (2008). The Key to Success: English Language Testing in China. Language Testing 25 (1), 15–37. https://doi.org/10.1177/0265532207083743
[17] Cheng, L., & Curtis, A. (2004). Washback or Backwash: A Review of the Impact of Testing on Teaching and Learning. In L. Cheng, Y. Watanabe, & A. Curtis (Eds.), Washback in Language Testing: Research Contexts and Methods (pp. 3–18). London: Lawrence Erlbaum.
[18] Curriculum Development Center. (2009). Primary education grade 4-5. Sanothimi, Bhaktapur: Curriculum Development Centre, Nepal. Retrieved from http://nepaknol.org.np/cdc/elibrary/pages/view.php?ref=64&k=#
[19] Curriculum Development Center. (2012). Basic curriculum Class 6-8. Sanothimi, Bhaktapur: Curriculum Development Centre, Nepal. Retrieved from http://nepaknol.org.np/cdc/elibrary/pages/view.php?ref=382&k=#
[20] Curriculum Development Center. (2019). Basic level (Grade 1-3) curriculum. Sanothimi, Bhaktapur: Curriculum Development Centre, Nepal. Retrieved from http://nepaknol.org.np/cdc/elibrary/pages/view.php?ref=2457&k=#
[21] Curriculum Development Center. (2020). National curriculum framework of school education-2076. Sanothimi, Bhaktapur: Curriculum Development Centre, Nepal. Retrieved from http://nepaknol.org.np/cdc/elibrary/pages/view.php?ref=2451&k=#
[22] Creswell, J. W., & Clark, V. L. P. (2011). Designing and Conducting Mixed Methods Research. London: Sage.
[23] Davies, D. (2015). The ‘iron gate’: high-stakes assessment at age 16 in Nepal and England. Compare: A Journal of Comparative and International Education, 46 (4), 582–602. https://doi.org/10.1080/03057925.2015.1030591
[24] Downing, S. M., & Haladyna, T. M. (2006). Handbook of Test Development. Mahwah, NJ: Erlbaum.
[25] Education Review Office. (2020). National Assessment of Student Achievement. Sanothimi, Bhaktapur. https://www.ero.gov.np/post/6_60410dcdd2cc3
[26] Fulcher, G. (2010). Practical Language Testing. London: Hodder Education.
[27] Ghimire, B. (2016, May 26). House panel endorses bill on education act amendment. Kathmandu Post.Com. https://kathmandupost.com/national/2016/05/26/house-panel-endorses-bill-on-education-act-amendment
[28] Giraldo, F. (2018). Language Assessment Literacy: Implications for Language Teachers. Profile: Issues in Teachers´ Professional Development, 20 (1), 179–195. https://doi.org/10.15446/profile.v20n1.62089
[29] Giri, R. A. (2011). Languages and language politics. Language Problems and Language Planning, 35 (3), 197–221. https://doi.org/10.1075/lplp.35.3.01gir
[30] Gu, P. Y. (2013). The unbearable lightness of the curriculum: what drives the assessment practices of a teacher of English as a Foreign Language in a Chinese secondary school? Assessment in Education: Principles, Policy & Practice, 21 (3), 286–305. https://doi.org/10.1080/0969594x.2013.836076
[31] Hamid, M. O., & Baldauf, R. B. (2008). Will CLT bail out the bogged down ELT in Bangladesh? English Today, 24 (3), 16–24. https://doi.org/10.1017/s0266078408000254
[32] Hamid, M. O., Sussex, R., & Khan, A. (2009). Private Tutoring in English for Secondary School Students in Bangladesh. TESOL Quarterly, 43 (2), 281–308. https://doi.org/10.1002/j.1545-7249.2009.tb00168.x
[33] Hamilton, M., Maddox, B., & Addey, C. (2015). Introduction. In M. Hamilton, B. Maddox, & C. Addey (Eds.), Literacy as Numbers: Researching the Politics and Practices of International Literacy Assessment (pp. xiii–xxxiv). Cambridge: Cambridge University Press.
[34] Hardy, I. (2013). Education as a ‘risky business’: Theorising student and teacher learning in complex times. British Journal of Sociology of Education, 36 (3), 375–394. https://doi.org/10.1080/01425692.2013.829746
[35] Harlen, W. (2009). Improving assessment of learning and for learning. Education 3-13, 37 (3), 247–257. https://doi.org/10.1080/03004270802442334
[36] Harris, L. R., & Brown, G. T. L. (2016). The Human and Social Experience of Assessment: Valuing the Person and Context. In G. T. L. Brown & L. R. Harris (Eds.), Handbook of Human and Social Conditions in Assessment (pp. 1–17). London and New York, NY: Routledge.
[37] Himalayan News Service. (2018, November 3). National curriculum framework revised. The Himalayan Times.Com. https://thehimalayantimes.com/kathmandu/national-curriculum-framework-revised
[38] Hughes, A. (2003). Testing for Language Teachers. Cambridge: Cambridge University Press.
[39] Hursh, D. (2007). Assessing No Child Left Behind and the Rise of Neoliberal Education Policies. American Educational Research Journal, 44 (3), 493–518. https://doi.org/10.3102/0002831207306764
[40] Hyland, K. (2006). English for academic purposes. An advanced resource book. London: Routledge.
[41] Islam, J., Majid, I. A. N., Shahidullah, M., & Shams, N. (2001). Teacher’s Guide for English for Today: For Classes XI-XII. Dhaka: NCTB and British Council.
[42] Jilani, R. (2009). Problematizing High School Certificate Exam in Pakistan: A Washback Perspective. The Reading Matrix 9 (2): 175–183. http://www.readingmatrix.com/articles/sept_2009/jilani.pdf
[43] Jones, B. D. (2007). The Unintended Outcomes of High-Stakes Testing. Journal of Applied School Psychology, 23 (2), 65–86. https://doi.org/10.1300/j370v23n02_05
[44] Jones, M. G., Jones, B., & Hargrove, T. (2003). The unintended consequences of high-stakes testing. Lanham, MD: Rowman and Littlefield.
[45] Kabir, M. H. (2008). How Validity Is Ensured in Our Language Test: A Case Study. IIUC Studies 5: 37–52.
[46] Kachru, B., Kachru, Y., & Nelson, C. L. (Eds.). (2006). The handbook of World Englishes. Malden, MA, and Oxford, England: Blackwell.
[47] Khaniya, T. R. (2005). Examination for enhanced learning. Lalitpur: Millennium Publication.
[48] Khan, R. (2010). English language assessment in Bangladesh: Developments and challenges. In Y. Moon, & B. Spolsky (Eds.), Language assessment in Asia: Local, regional or global? (pp. 121–157). South Korea: Asia TEFL.
[49] Kamens, D. H. (2013). Globalization and the Emergence of an Audit Culture: PISA and the Search for ‘Best Practices’ and Magic Bullets. In H. Meyer & A. Benavot (Eds.), PISA, Power and Policy: The Emergence of Global Educational Governance (pp. 117–140). Oxford: Symposium Books.
[50] Klenowski, V., & Wyatt-Smith, C. (2012). The impact of high stakes testing: the Australian story. Assessment in Education: Principles, Policy & Practice, 19 (1), 65–79. https://doi.org/10.1080/0969594x.2011.592972
[51] Kucuk, F., & Walters, J. (2009). How good is your test? ELT Journal, 63 (4), 332–341. https://doi.org/10.1093/elt/ccp001
[52] Kwon, S. K., Lee, M., & Shin, D. (2015). Educational assessment in the Republic of Korea: lights and shadows of high-stake exam-based education system. Assessment in Education: Principles, Policy & Practice, 24 (1), 60–77. https://doi.org/10.1080/0969594x.2015.1074540
[53] Leung, C. Y., & Andrews, S. (2012). The mediating role of textbooks in high-stakes assessment reform. ELT Journal, 66 (3), 356–365. https://doi.org/10.1093/elt/ccs018
[54] Lingard, B. (2011). Policy as numbers: ac/counting for educational research. The Australian Educational Researcher, 38 (4), 355–382. https://doi.org/10.1007/s13384-011-0041-9
[55] Maniruzzaman, M., & Hoque, M. E. (2010). How Does Washback Work on the EFL Syllabus and Curriculum? A Case Study at the HSC Level in Bangladesh. Language in India 10 (12): 49–88. http://www.languageinindia.com/dec2010/hoquewashback.pdf.
[56] McNamara, T. (2000). Language Testing. Oxford: Oxford University Press.
[57] Menken, K. (2008). High-Stakes Tests as de facto Language Education Policies. SpringerLink. https://link.springer.com/referenceworkentry/10.1007/978-0-387-30424-3_189
[58] Meyer, H., & Benavot, A. (2013). PISA and the Globalisation of Education Governance: Some Puzzles and Problems. In H. Meyer & A. Benavot (Eds.), PISA, Power and Policy: The Emergence of Global Educational Governance (pp. 9–26). Oxford: Symposium Books.
[59] Ministry of Education. (2019). National Education Policy (in Nepali). Kathmandu: Ministry of Education, Government of Nepal. https://moest.gov.np/post/1_62b7036a58f86
[60] Ministry of Education & UNESCO Office in Kathmandu. (2015). Education for all national review report (Nepal). In Education for All 2015 national review: Nepal (Vol. 114p). UNESCO. https://unesdoc.unesco.org/ark:/48223/pf0000232769
[61] Ministry of Education. (2016). Flash report. Ministry of Education: Kathmandu https://moest.gov.np/post/1_632a9b6716aa5
[62] Ministry Of Education. (2016). School sector development plan 2016-2023. Kathmandu Government of Nepal, Ministry of Education Retrieved from http://doe.gov.np/assets/uploads/files/3bee63bb9c50761bb8c97e2cc75b85b2.pdf
[63] Ministry of Education. (1971). National education system plan 1971-76. Kathmandu: Ministry of Education. https://www.martinchautari.org.np/storage/files/thenationaleducationsystemplanfor-1971-english.pdf
[64] Mulmi, A. R. (2017, October 1). Why did the British not colonize Nepal? - The Record. Recordnepal.Com. https://www.recordnepal.com/why-did-the-british-not-colonize-nepal
[65] Curriculum Development Center (CDC), (2019). National Curriculum English: Classes 9 & 10. Curriculum Development Center. Ministry of Education. Bhaktapur. https://bagisworischool.edu.np/wp-content/uploads/2020/09/2.-ENGLISH.pdf
[66] National Education Commission. (1992). Report of the national education commission, 1992 (executive summary). Kathmandu: National Education Commission, Kesharmahal. https://nepalindata.com/media/resources/items/20/bReport_of_the_National_Education_Commission_1992.pdf
[67] NNEPC. (1956). Education in Nepal. Kathmandu: Bureau of Publications, College of Education. https://himalaya.socanth.cam.ac.uk/collections/rarebooks/downloads/Education_in_Nepal.pdf
[68] Neupane, D. (2019, May 2). Continuous assessment system: From paper to pedagogy. Thehimalayantimes.Com. https://thehimalayantimes.com/opinion/continuous-assessment-system-from-paper-to-pedagogy
[69] Education Review Office. (n.d.). Education Review Office, Nepal. https://www.ero.gov.np/post/6_5f16e3298b914
[70] Organisation for Economic Co-operation and Development. (2016). PISA for Development Brief 1. https://www.oecd.org/pisa/pisa-for-development/pisafordevelopment-documentation-briefs.htm
[71] Shore, A., Pedulla, J., & Clarke, M. (2001). The building blocks of state testing programs. Chestnut Hill, MA: National Board on Educational Testing and Public Policy. https://www.bc.edu/research/nbetpp/statements/V2N4.pdf
[72] Poole, A. (2016). ‘Complex teaching realities’ and ‘deep rooted cultural traditions’: Barriers to the implementation and internalisation of formative assessment in China. Cogent Education, 3 (1), 1156242. https://doi.org/10.1080/2331186x.2016.1156242
[73] Poudel, P. P., & Choi, T. H. (2021). Discourses shaping the language-in-education policy and foreign language education in Nepal: an intersectional perspective. Current Issues in Language Planning, 23 (5), 488–506. https://doi.org/10.1080/14664208.2021.2013063
[74] Poudel, L. N. (2016). Reviewing the practice of national assessment of student achievement in Nepal. Nepalese Journal of Educational Assessment, 1 (1), 1-16. https://www.ero.gov.np/upload_file/files/post/1595313482_598772199_1587622596_1795486007_Journal_NJEA_2(1)_2017(1).pdf
[75] Rahman, M., Pandian, A., & Kaur, M. (2018). Factors Affecting Teachers’ Implementation of Communicative Language Teaching Curriculum in Secondary Schools in Bangladesh. The Qualitative Report. https://doi.org/10.46743/2160-3715/2018.3220
[76] Rahman, M. M., & Pandian, A. (2018). A Critical Investigation of English Language Teaching in Bangladesh. English Today, 34 (3), 43–49. https://doi.org/10.1017/s026607841700061x
[77] Ramanathan, H. (2008). Testing of English in India: A developing concept. Language Testing, 25 (1), 111–126. https://doi.org/10.1177/0265532207083747
[78] Richards, K. (2003). Qualitative Inquiry in TESOL. Basingstoke: Palgrave Macmillan.
[79] Ritt. M. (2016). The impact of high-stakes testing on the learning environment. Retrieved from Sophia, the St. Catherine University repository website: https://tinyurl.com/38dzuype
[80] Rose, P. (2015). Is a global system of international large-scale assessments necessary for tracking progress of a post-2015 learning target? Compare: A Journal of Comparative and International Education, 45 (3), 486–490. https://doi.org/10.1080/03057925.2015.1027514
[81] Ross, S. J. (2008). Language testing in Asia: Evolution, innovation, and policy challenges. Language Testing, 25 (1), 5–13. https://doi.org/10.1177/0265532207083741
[82] Roach, E. (2022, March 30). Education in Nepal. WENR. https://wenr.wes.org/2018/04/education-in-nepal
[83] Sasaki, M. (2008). The 150-year history of English language assessment in Japanese education. Language Testing, 25 (1), 63–83. https://doi.org/10.1177/0265532207083745
[84] Shin, S. Y., & Lidster, R. (2016). Evaluating different standard-setting methods in an ESL placement testing context. Language Testing, 34 (3), 357–381. https://doi.org/10.1177/0265532216646605
[85] Seargeant, P., & Erling, E. J. (2011). The Discourse of ‘English as a Language for International Development’: Policy Assumptions and Practical Challenges. In H. Coleman (Ed.), Dreams and Realities: Developing Countries and the English Language (pp. 248–267). London: British Council.
[86] Shank, G. (2006). Qualitative Research Methods: A Personal Skills Approach. Upper Saddle River, NJ: Pearson.
[87] Shepard, L. A. & Dougherty, K. C. (1991, April). Effect of high-stakes testing on instruction. Unpublished paper presented at the American Education Research Association and the National Council on Measurement in Education, Chicago. https://www.colorado.edu/education/sites/default/files/attached-files/Effects%20of%20High-Stakes%20Testing.pdf
[88] Shohamy, E. (2007). Language tests as language policy tools. Assessment in Education: Principles, Policy & Practice, 14 (1), 117–130. https://doi.org/10.1080/09695940701272948
[89] Shohamy, E. (2016). Critical Language Testing. Language Testing and Assessment, 1–15. https://doi.org/10.1007/978-3-319-02326-7_26-1
[90] Shrestha. (2014). Perception of Teachers towards Continuous Assessment System. M.Ed. thesis, Tribhuvan University, Kirtipur. https://elibrary.tucl.edu.np/bitstream/123456789/1948/3/12540.pdf
[91] Silverman, D. (2016). Qualitative Research. Los Angeles, CA: Sage.
[92] Smith, W. C. (2016). An Introduction to the Global Testing Culture. In W. C. Smith (Ed.), The Global Testing Culture: Shaping Education Policy, Perceptions, and Practice (pp. 7–23). Oxford: Symposium Books.
[93] Sultana, R. (2015). Reliability of the currently administered language tests in Bangladesh: a case study. Journal of Literature, Languages and Linguistics, 15, 76–85. https://www.iiste.org/Journals/index.php/JLLL/article/view/27665
[94] Stiggins, R. J. (2002). Assessment Crisis: The Absence of Assessment for Learning. Phi Delta Kappan, 83 (10), 758–765. https://doi.org/10.1177/003172170208301010
[95] THT Online. (2021, June 16). Editorial: Grading SEE students. The Himalayan Times.Com. https://thehimalayantimes.com/opinion/editorial-grading-see-students
[96] Wu, M. (2015). What National Testing Data Can Tell Us. In B. Lingard, G. Thompson, & S. Sellar (Ed.), National Testing in Schools: An Australian Assessment (pp. 39–51). New York, NY: Routledge.
[97] Yin, R. (2014). Case Study Research: Design and Methods (5th ed.). Los Angeles, CA: Sage.
Cite This Article
  • APA Style

    Ranjit, R. (2023). English Language Assessment in Nepal: Policies, Practices and Problems. International Journal of English Teaching and Learning, 1(1), 22-33. https://doi.org/10.11648/j.ijetl.20230101.14


    ACS Style

    Ranjit, R. English Language Assessment in Nepal: Policies, Practices and Problems. Int. J. Engl. Teach. Learn. 2023, 1(1), 22-33. doi: 10.11648/j.ijetl.20230101.14


    AMA Style

    Ranjit R. English Language Assessment in Nepal: Policies, Practices and Problems. Int J Engl Teach Learn. 2023;1(1):22-33. doi: 10.11648/j.ijetl.20230101.14




Author Information
  • Brac Institute of Languages, Brac University, Dhaka, Bangladesh
