Learning-Oriented Assessment Enhanced by Technology: Effects on Language Proficiency Among Chinese TVET Students
This mixed-methods study investigates the effect of a technology-enhanced, learning-oriented assessment (LOA) intervention on the language proficiency of Technical and Vocational Education and Training (TVET) students in China. It seeks to identify LOA strategies that effectively enhance students’ language learning outcomes in flipped classroom instruction, with particular attention to the role of technology in mediating LOA practices. By synthesizing quantitative and qualitative data, this study aims to provide substantial empirical evidence validating LOA practices in the specific context of TVET language teaching and to offer insights into optimizing technology-enhanced assessment practices within flipped classroom contexts.
Learning-oriented assessment prioritizes learning and demonstrates great potential for enhancing learning outcomes and fostering self-directedness. However, concerns have been raised about the practicality of LOA and its claimed benefits across contexts, owing to reported implementation challenges and inconsistent evidence of its effectiveness in improving learning outcomes. These challenges can be addressed by improving the design of LOA interventions and strengthening their technological support.
A pretest–posttest between-group quasi-experimental design, complemented by focus group discussions and classroom observations, was employed to examine the effects of LOA on language proficiency and to identify effective implementation strategies. The LOA intervention was developed based on Carless’s (2007) LOA principles and Jones et al.’s (2016) LOA cycle. Four intact classes were randomly assigned to either the experimental or control group. The modified PRETCO-A test, which had been verified for reliability prior to the intervention, was administered to both groups before and after the 12-week treatment period. Paired-sample and independent-sample t-tests were conducted using the Statistical Package for the Social Sciences (SPSS) to compare the mean scores of overall language proficiency and individual sub-skills within and between groups. Data from the focus groups and classroom observations were analyzed using content and descriptive analyses.
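The statistical procedure described above can be sketched in Python for illustration. The study itself used SPSS; the sketch below reproduces the same paired-samples (within-group) and independent-samples (between-group) t-tests with SciPy on synthetic scores, since the actual data are not available. Group sizes and score distributions are assumptions, not the study’s data.

```python
# Illustrative sketch only: the study used SPSS; this reproduces the same
# paired- and independent-samples t-tests in Python on synthetic scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical PRETCO-A-style scores (0-100) for 50 students per group;
# the modest post-test gains are assumptions for demonstration purposes.
exp_pre = rng.normal(60, 8, 50)
exp_post = exp_pre + rng.normal(2, 5, 50)
ctl_pre = rng.normal(60, 8, 50)
ctl_post = ctl_pre + rng.normal(1, 5, 50)

# Within-group gains: paired-samples t-test (post vs. pre).
t_within, p_within = stats.ttest_rel(exp_post, exp_pre)

# Between-group comparison: independent-samples t-test on post-test scores.
t_between, p_between = stats.ttest_ind(exp_post, ctl_post)

print(f"paired: t = {t_within:.2f}, p = {p_within:.3f}")
print(f"independent: t = {t_between:.2f}, p = {p_between:.3f}")
```

In practice, the same pair of tests would be run separately for overall proficiency and for each sub-skill (e.g., writing), as the study reports.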
This study contributes empirical evidence on the validity of LOA practice by revealing its positive effects on specific language skills of TVET students in an exam-oriented educational setting. By identifying the language skills most strongly enhanced by the LOA intervention and elucidating the challenges encountered and the practical strategies adopted in a flipped classroom, the findings offer valuable insights for researchers and educators seeking to optimize LOA implementation and technology-mediated assessment practices within a flipped classroom context. In particular, this study advances the understanding of technology’s role in the LOA process by illustrating how digital tools can integrate assessment, instruction, and learning; enable an action-oriented feedback loop; support more effective assessment task design; and promote rubric-driven peer assessment. Furthermore, it expands the discussion on how the affordances of technology can be leveraged to bolster LOA without compromising academic integrity or fostering student disengagement.
Although the independent-samples t-tests demonstrated no statistically significant overall gains in TVET students’ language learning outcomes following the LOA intervention, a statistically significant post-test difference emerged in writing (p = 0.039), with a small effect size (η² = 0.0363) and no corresponding pre-test difference. Qualitative evidence from focus group discussions and classroom observations indicated that effective LOA strategies prioritized rubric-driven peer assessment, an action-oriented feedback loop integrating individualized e-feedback and teacher guidance, and protocol-based self-assessment and self-reflection. While technology enhanced the effectiveness and efficiency of the LOA procedure, teachers should remain alert to the risks of academic dishonesty and disengagement associated with technology use.
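For an independent-samples t-test, the eta-squared effect size can be derived directly from the t statistic and its degrees of freedom as η² = t² / (t² + df). A minimal sketch follows; the t and df values used are illustrative assumptions, since the study reports only p = 0.039 and η² = 0.0363.

```python
def eta_squared(t: float, df: int) -> float:
    """Eta-squared effect size from a t statistic: t^2 / (t^2 + df)."""
    return (t * t) / (t * t + df)

# Illustrative values, not the study's actual statistics.
print(round(eta_squared(2.08, 115), 4))
```

By conventional benchmarks, η² around 0.01 is small and around 0.06 medium, which is why the reported 0.0363 is described as a small effect.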
When designing LOA tasks, teachers should align them with TVET students’ prior knowledge, desired learning outcomes, and future professional needs, ensuring appropriate challenge and relevance. Prior to peer or self-assessment, students should receive training on how to provide effective feedback, and this practice should be supported by incentives to foster engagement. Well-structured rubrics for peer assessment and protocols for self-assessment and reflection are recommended to reduce cognitive load and enhance participation. Additionally, instant diagnostic feedback and scaffolding should be provided to foster a growth mindset and diminish a grade-oriented mindset. Moreover, despite technological affordances, consistent teacher guidance remains crucial throughout the LOA process.
The research suggests that the impact of LOA may be limited in scope and may vary across dimensions of language proficiency and among diverse learner types. Future research should investigate the impact of prolonged interventions across different learner types and diverse pedagogical settings. Although the current study provides valuable quantitative insights, its discussion remains limited in pedagogical scope. More studies are needed to broaden the understanding of technology-enhanced LOA strategies by incorporating learners’ perspectives, providing a more comprehensive view of the mediating effects of students’ LOA literacy and prior learning experiences on learning outcomes. Moreover, given the concerns over academic integrity and student disengagement in technology-mediated assessment observed in this study, researchers are encouraged to explore strategies that leverage technology to support LOA while addressing these issues.
The findings provide empirical support for the validity and practicality of LOA, particularly the effectiveness of rubric-driven peer assessment, feedback loops, and protocol-based self-assessment and reflection in enhancing language proficiency. This study offers educators valuable insights for optimizing LOA implementation, fostering pedagogical innovation, and contributing to the broader assessment reform promoted by national educational authorities. It also highlights potential tensions in technology-mediated assessment, emphasizing the need to balance its benefits with safeguards that ensure academic integrity.
Future research should focus on developing systematic strategies for implementing LOA by further testing its validity and effectiveness across different learner groups and in diverse pedagogical contexts. In addition, the integration of emerging technologies in LOA warrants thorough investigation, particularly regarding their pedagogical affordances and associated ethical considerations.


