While the dimensions of English presentation competence assessment have been well clarified (Schreiber, Paul, & Shibley, 2012), studies probing presentation in academic contexts remain limited, and the field still lacks a widely acknowledged scoring rubric (Barrett & Liu, 2016). To address this gap, the researchers conducted a case study of Chinese university students, using a data-driven approach to develop a scoring rubric for assessing academic presentation competence. The case study comprised four stages. The first stage was a pilot study in which students' in-class English academic presentation performance was observed to identify the dimensions that distinguished presentation quality. In the second stage, building on the pilot study, the distinctive levels of each dimension were explored, and a five-level rating scale was developed to assess Chinese college students' academic presentation competence, taking English-as-a-foreign-language factors into account (Joe, Kitchen, Chen, & Feng, 2015). In the third stage, students were informed of the scoring rubric and asked to assess their peers' performance with it; their feedback on the scoring standards was elicited to refine the rubric. Finally, a group of expert raters was invited to score the students' academic presentations using the refined scale. The confirmation of interrater reliability suggests that the rubric is justifiably reliable, and that in the assessment of academic presentations, the importance of content and structure cannot be overemphasized. Overall, this case study provides empirical evidence for the development of a scoring rubric for assessing English academic presentation competence. Pedagogical implications are also offered for the instruction of English academic presentation in the Chinese college context.
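The interrater reliability check in the final stage can be illustrated computationally. The abstract does not specify which reliability statistic was used; the sketch below assumes a common choice for an ordinal five-level scale, quadratic-weighted Cohen's kappa between two raters, and all rating data are hypothetical.

```python
from collections import Counter

def quadratic_weighted_kappa(rater_a, rater_b, levels=5):
    """Quadratic-weighted Cohen's kappa for two raters on a 1..levels ordinal scale.

    Returns 1.0 for perfect agreement, 0.0 for chance-level agreement.
    """
    assert len(rater_a) == len(rater_b) and len(rater_a) > 0
    n = len(rater_a)

    # Observed joint frequencies of the two raters' scores
    obs = [[0.0] * levels for _ in range(levels)]
    for a, b in zip(rater_a, rater_b):
        obs[a - 1][b - 1] += 1

    # Marginal frequencies for each rater (chance-agreement baseline)
    hist_a = Counter(rater_a)
    hist_b = Counter(rater_b)

    num = 0.0  # weighted observed disagreement
    den = 0.0  # weighted chance disagreement
    for i in range(levels):
        for j in range(levels):
            w = (i - j) ** 2 / (levels - 1) ** 2  # quadratic distance weight
            num += w * obs[i][j] / n
            den += w * (hist_a[i + 1] / n) * (hist_b[j + 1] / n)
    return 1.0 - num / den

# Hypothetical scores from two expert raters on ten presentations
expert_1 = [5, 4, 3, 4, 2, 5, 3, 1, 4, 2]
expert_2 = [5, 4, 4, 4, 2, 4, 3, 2, 4, 2]
print(round(quadratic_weighted_kappa(expert_1, expert_2), 3))
```

Quadratic weighting penalizes large score discrepancies more than adjacent-level disagreements, which suits a five-level rubric where a one-level gap between raters is ordinarily tolerated.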
References
Barrett, N. E., & Liu, G.-Z. (2016). Global trends and research aims for English academic oral
presentations: Changes, challenges, and opportunities for learning technology. Review of
Educational Research, 86, 997-1012.
https://doi.org/10.3102/0034654316628296
Joe, J., Kitchen, C., Chen, L., & Feng, G. (2015). A prototype public speaking skills assessment: An
evaluation of human-scoring quality. ETS Research Report Series, 1-21.
https://doi.org/10.1002/ets2.12083
Schreiber, L. M., Paul, G. D., & Shibley, L. R. (2012). The development and test of the public speaking
competence rubric. Communication Education, 61, 205-233.
https://doi.org/10.1080/03634523.2012.670709