Abstract:
The Centre for Language in Education (CLE) was granted funding in 2008 to expand its battery of items for the Tertiary English Language Test (TELT), used in the screening and recruitment of non-JUPAS/non-local students. The test was also used as a placement test to identify students’ appropriate proficiency levels. It comprised reading, listening and vocabulary tests for non-English major students, with additional speaking and writing tests for English major students.
Over the past 12 months, this first test has been trialed and validated using Rasch analysis. The test was used extensively to place both JUPAS and non-JUPAS students into the three levels of CLE English enhancement courses, namely Foundation, Bridge and Access. The success of the placement tool was evidenced by the results in 2007/08: the test sorted students effectively across the three proficiency levels, particularly between the Bridge and Foundation levels, and a good spread of results was found.
This project aims to expand the number of texts and test items (reading, listening, vocabulary, and grammar) in the TELT placement databank from 200 to 300, and to develop the speaking and writing components of the test. This will enable CLE to vary the test from time to time and thereby improve its security. Heavier usage of the placement test is also anticipated in the coming years. Furthermore, in line with the new language policy, monitoring and exit benchmarks are becoming increasingly important for determining proficiency levels on graduation with validity and reliability. By expanding this test, CLE would like to be able to provide mid-degree and exit-level data for students on graduation.
Code:
3335
Principal Project Supervisors:
Subjects:
Start Date:
30 Jun 2009
End Date:
29 Jun 2010
Status:
Completed
Result:
The project team designed 4 speaking and 4 writing tasks to pilot with the 2010 cohort. They also carried out standardization sessions for speaking assessors, writing assessors, and CLE English teaching staff. The analysis found the instrument to be reliable and valid, with a good spread of student ability in both the writing and speaking tests. English major students were at the top of the scale in both tests. Good inter-rater reliability was also confirmed in the analysis.
Impact:
The project significantly improved the reliability of the speaking and writing tests. It provided insight into instructors’ rating behavior as well as more reliable information for placing students into classes at different proficiency levels. Students and teachers were able to use diagnostic speaking and writing profiles to understand students’ areas of strength and weakness in speaking and writing, which supported effective teaching and learning.
Deliverables:
Raquel, M. R. & Lockwood, J. (2010, March). Measuring Hong Kong students’ English proficiency: Development of the Tertiary English Language Test. Poster presentation at the 31st Annual Language Testing Research Colloquium (LTRC), Cambridge, United Kingdom.
Lockwood, J. & Raquel, M. R. (2010). Language policy and language proficiency assessment in English and Putonghua: Is what is proposed, possible? Poster presentation at the 31st Annual Language Testing Research Colloquium (LTRC), Cambridge, United Kingdom.
Financial Year:
2008-09
Type:
TDG