Setting Language Assessments: Face Validity and Authenticity in Conflict

Ng Chi Wui

As an English language teacher, have you ever struggled with setting examination papers for students? Have you ever found setting a paper from scratch extremely cumbersome and therefore resorted to online question banks? It is not uncommon to see English teachers in Hong Kong relying on questions developed by textbook publishers when asked to set assessments at school. Those prefabricated papers, while simulating papers of high-stakes public examinations, are highly contrived.

The aforementioned phenomenon reflects a dilemma between face validity and authenticity that English teachers confront when devising language assessments. Face validity denotes the extent to which a test “looks as if it measures what it is supposed to measure” (Hughes & Hughes, 2020, p. 36), whilst authenticity entails “the degree of correspondence of the characteristics of a given test task to the features of a target language task” (Bachman & Palmer, 1996, p. 23). To put it simply, a test exhibits high face validity if test takers consider it familiar, useful, and relevant, whereas a test is considered authentic if it simulates real-life tasks in genuine contexts. These two important constructs in language assessment are not inherently in opposition to each other, but conflicts may arise when test developers emphasize one at the expense of the other.

Despite their high face validity, examination papers adapted from question banks provided by textbook publishers lack authenticity. To boost sales, textbook publishers provide schoolteachers with multifarious supplementary resources, including online question banks comprising pre-made, ready-to-use examination papers. The format and language input of those papers closely resemble those of public examinations. For instance, the reading materials and audio recordings in publisher-developed test papers are contrived rather than drawn from genuine contexts, and the question formats mirror those of public examinations rather than those encountered in everyday life. Still, because students and teachers perceive such papers as relevant and useful preparation for the public examinations, the papers possess high face validity. Two major factors, however, deter teachers from creating authentic language assessments that reflect daily language use.

The first obstacle to the use of authentic language sources is their high level of difficulty. Produced for the general public rather than specifically for learners, the language input found in everyday life, such as newspaper articles and radio programmes, is likely to pose a significant challenge to learners in both cognitive and linguistic respects. Some features of authentic speech, such as fillers, environmental noise, and speakers’ fast pace, add a further layer of difficulty to comprehension. If such materials are used as language input in reading or listening assessments, learners’ performance may be disrupted. For instance, even if a learner knows a certain lexical item in a dialogue between two speakers, overlapping voices, which are common in natural conversations, may prevent the learner from accurately identifying the word required to answer a question on the test paper. In other words, such assessments might fail to truly test learners’ language ability. Some may argue that teachers should exercise their expertise to modify the language input for use in assessments, yet this is easier said than done given the heavy workload brought by the “reform syndrome” (Cheng, 2009), which I briefly discussed in an earlier blog entry on formative assessment.

Another hindrance to authentic language assessments is the examination-oriented education culture in Hong Kong. Influenced by the Confucian Heritage Culture, the Chinese education system is characterized as examination-oriented, with a heavy emphasis on achieving upward social mobility through success in high-stakes public examinations (Tang, 2009). The backwash of these high-stakes examinations, that is, “the effect that tests have on learning and teaching” (Hughes & Hughes, 2020, p. 57), is viewed as negative because teachers tend to prioritize examination skills over subject knowledge in their teaching and model internal assessments on the syllabus and format of public examinations to prime students for the actual exams (Bachman & Palmer, 1996). In such a context, a test with high face validity, which is more examination-oriented, appears preferable to an authentic one from teachers’ perspective.

Does this mean that English teachers in Hong Kong have no choice but to adhere to the “traditional” practice of using online question banks while forsaking authentic language assessments? As Hughes and Hughes (2020) suggested, one tenet of language test development is to adopt direct testing whenever feasible. Teachers ought not to be overly concerned about the conflict between face validity and authenticity, or that between question banks and authentic materials. Instead, they need to ensure that the test directly assesses what it is intended to measure, even if the tasks do not exactly replicate what learners do in everyday life. Language tests fulfilling this criterion will suffice.

References

Bachman, L. F., & Palmer, A. S. (1996). Language testing in practice: Designing and developing useful language tests. Oxford: Oxford University Press.

Cheng, Y. C. (2009). Hong Kong educational reforms in the last decade: Reform syndrome and new developments. International Journal of Educational Management, 23(1), 65-86. http://www.emeraldinsight.com/doi/pdfplus/10.1108/09513540910926439

Hughes, A., & Hughes, J. (2020). Testing for language teachers (3rd ed.). Cambridge: Cambridge University Press.

Tang, E. (2009). A cultural framework of “Chinese learn English”: A critical review of and reflections on research. English as International Language Journal, 4, 7-43. https://www.eilj.com/wp-content/uploads/2013/12/4-august_2009.pdf
