Zheng Wenxin and David Carless
I am Zheng Wenxin, a second-year HKU student participating in the Eureka Undergraduate Research Programme, conducting a research project with Professor David Carless. Our research topic is "Assessment re-design and GenAI: What do undergraduate students want from assessment?"
The rapid advancement of generative artificial intelligence (GenAI) is transforming educational practices and reinforcing the importance of well-designed assessment. Our research uses qualitative methods to explore students’ views on assessment and GenAI’s role, providing insights for educators seeking to redesign assessments for meaningful learning.
I conducted fifteen in-depth semi-structured interviews with undergraduates from a variety of disciplinary backgrounds. In particular, I asked them about assignments they found interesting and valuable. The analysis revealed a number of recurring themes; in this post, we focus mainly on creativity as a key principle of assessment design. This struck us as an interesting theme, perhaps somewhat underplayed in the assessment literature.
One example of creative assessment highlighted by students was debate and role-play. These fall under the umbrella of interactive oral assessment, an increasingly pertinent option given doubts about the integrity of conventional written assignments in the GenAI era (Sotiriadou et al., 2020).
Our first example comes from the Common Core, a set of interdisciplinary courses required for all HKU undergraduates, covering four Areas of Inquiry: Science, Technology and Big Data; Arts and Humanities; Global Issues; and China Culture, State, and Society.
In a course on the Chinese Cultural Revolution, students learn to apply ideas drawn from a range of disciplines: history, political and social science, literature, and film. They are assessed on their participation in a debate and a role-play. In the debate, students form two teams representing Red Guard characters from a film that is part of the course content. They discuss who is entitled to join the revolution, with one team presenting its arguments first, followed by a response from the opposing team. In the role-plays, students represent different groups of people (workers, students, party leaders) responding to historical scenarios.
Students found the interactive activities interesting and meaningful. Role-plays and debates allow space for creativity because they enable imaginative responses through preparing lines, actions and anecdotes. One student stated:
“We need to really get into the shoes of the historical characters, experience their thoughts, and find out why they think the way they do. Then we express it through a debate, a high-intensity format requiring quick responses. To perform well, we must internalize and skillfully apply relevant historical knowledge.”
The ability to respond quickly and appropriately in an interactive format is the kind of skill highly valued in the workplace. Interactive oral assessments do not appeal to all students, however, and this represents a dilemma. A student majoring in mathematics found the readings challenging, felt stressed by the time required to prepare for the debate, and did not enjoy speaking in public. This reminds us that assessments need to fit diverse learning preferences and academic backgrounds. Relevant literature suggests that a well-designed authentic assessment should offer an appropriate degree of cognitive challenge without becoming overwhelming (Villarroel et al., 2018).
Another interesting and creative assessment involved students being tasked with enhancing AI-generated content. In an art history class, students were provided with two 650-word AI-generated essays. They had to choose one of them, then expand and improve it into a 1,500-word piece. A student noted that this process presents a specific challenge, since they must refine and extend the AI's work rather than starting from scratch. The exercise provides valuable insights into the differences between AI and human writing: by enhancing the AI output, students can appreciate its strengths while recognizing areas where human creativity and critical thinking can significantly improve the final product. The guidelines and rubric also encourage students to contribute their own perspectives and offer depth of analysis.
A similar approach for students majoring in Translation involves modifying AI-translated content. In a Chinese-to-English translation course, students are required to use AI to translate a particular literary work and then evaluate and modify the translations. This method allows students to build on existing AI translations, learn from them, and incorporate positive human elements. The teaching staff in the Translation major acknowledge AI’s capabilities but believe there is still a need for human involvement. This assessment can help students learn to work with AI and adapt to the future direction of the translation profession through hybrid human-AI collaboration.
Our ongoing research reinforces some principles guiding assessment design:
- relationships to future careers or real-life applications of the discipline;
- stimulating critical thinking and creativity;
- offering choice and flexibility;
- embedding AI within assessment design.
Future assessment design might embrace or adapt these principles and continue exploring innovative ways of facilitating authentic student learning experiences.
References
Sotiriadou, P., Logan, D., Daly, A., & Guest, R. (2020). The role of authentic assessment to preserve academic integrity and promote skill development and employability. Studies in Higher Education, 45(11), 2132-2148. doi:10.1080/03075079.2019.1582015
Villarroel, V., Bloxham, S., Bruna, D., Bruna, C., & Herrera-Seda, C. (2018). Authentic assessment: Creating a blueprint for course design. Assessment & Evaluation in Higher Education, 43(5), 840-854. doi:10.1080/02602938.2017.1412396