
Academic Writing Analytics (AWA)

Students learn that critical, reflective writing must make their thinking visible. This is conveyed through particular linguistic forms. Can computers learn to identify these patterns, and feed them back to students to help improve their writing?


The ability to communicate and debate ideas coherently and critically is a core graduate attribute. In many disciplines, writing provides a significant window into the mind of the student, evidencing mastery of the curriculum and the ability to reflect on one’s own learning. Arguably, in the humanities and social sciences, writing is the primary source of evidence. Moreover, as dialogue and debate move from face-to-face settings to online exchanges across a variety of genres and digital channels, discourse shifts from being ephemeral to persistent, providing a new evidence base. Yet while the evidence consistently shows that timely, personalised feedback is one of the key factors influencing learning, and students repeatedly ask for quicker, better feedback, assessing writing is extremely time-consuming, whether the text is a brief first assignment, a draft essay, a thesis chapter, or a research article in preparation for peer review.

This is the focus of the Academic Writing Analytics (AWA) project. CIC is prototyping a formative feedback app for student writing, which we call AcaWriter, working in close partnership with academics from diverse faculties, HELPS and IML. This Natural Language Processing (NLP) tool identifies concepts, people, places and, distinctively, the metadiscourse corresponding to rhetorical moves. These moves are important ways of using language to signal to the reader that a scholarly, knowledge-level claim is being made, yet UTS practice and the wider research literature show how difficult this is for students to learn and, indeed, for some educators to teach and grade with confidence.
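To give a concrete flavour of what detecting metadiscourse can involve, here is a minimal, rule-based sketch. It is purely illustrative: the move labels, marker phrases and function names are invented for this example, and AcaWriter's actual NLP pipeline is considerably more sophisticated than keyword matching.

```python
import re

# Illustrative metadiscourse markers, grouped under assumed rhetorical-move labels.
# These labels and patterns are examples only, not AcaWriter's rule set.
MOVE_MARKERS = {
    "Emphasis": [r"\bit is (important|crucial|essential) to\b", r"\bnotably\b"],
    "Contrast": [r"\bhowever\b", r"\bin contrast\b", r"\bon the other hand\b"],
    "Novelty":  [r"\bfor the first time\b", r"\blittle is known about\b"],
}

def tag_sentences(text):
    """Split the text into rough sentences and attach any matching move labels."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    tagged = []
    for sentence in sentences:
        moves = [
            move
            for move, patterns in MOVE_MARKERS.items()
            if any(re.search(p, sentence, re.IGNORECASE) for p in patterns)
        ]
        tagged.append((sentence, moves))
    return tagged

if __name__ == "__main__":
    sample = ("Little is known about how students revise their drafts. "
              "However, timely feedback is known to help. "
              "It is important to signal claims explicitly.")
    for sentence, moves in tag_sentences(sample):
        print(moves or ["no move detected"], "-", sentence)
```

Even a toy detector like this makes the underlying idea visible: particular phrasings act as signals of a rhetorical move, and those signals can be surfaced back to the writer.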

This is not automated grading, but rapid formative feedback on draft texts. AcaWriter is designed to make visible to learners the ways in which they are using (or failing to use) language to ‘make their thinking visible’ — i.e. construct claims and argumentative reasoning for academic writing. A series of pilots is now under way with staff and students.
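As a hypothetical sketch of how detected moves could be turned into formative (rather than summative) feedback, the snippet below maps the moves a draft has not yet signalled onto short suggestions. The expected moves and the messages are invented for illustration and are not AcaWriter's actual feedback rules.

```python
# Hypothetical mapping from expected rhetorical moves to formative suggestions.
# Both the expected moves and the wording are invented for this sketch.
EXPECTED_MOVES = {"Contrast", "Novelty", "Emphasis"}

FEEDBACK = {
    "Contrast": "Consider signalling where your position differs from prior work.",
    "Novelty":  "Consider stating explicitly what is new in your contribution.",
    "Emphasis": "Consider flagging which claims you regard as most significant.",
}

def formative_feedback(detected_moves):
    """Return one suggestion for each expected move the draft has not yet signalled."""
    missing = EXPECTED_MOVES - set(detected_moves)
    return [FEEDBACK[move] for move in sorted(missing)]

if __name__ == "__main__":
    # Suppose the detector only found a Contrast move in the draft.
    for suggestion in formative_feedback({"Contrast"}):
        print("-", suggestion)
```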

Demo: there’s a simple demo version for you to test AcaWriter on different genres of writing.

Codebase: read the announcement of the open source release of the codebase plus developer training and academic educational resources.

Learn more from the publications and replays below, by following our Writing Analytics blog, and by browsing these workshops (2016/2017/2018), where some of the world’s leading researchers meet to reflect on the state of the art and the future of automated writing analysis.

Key references:

Lucas, C., Gibson, A. and Buckingham Shum, S. (In Press). Utilization of a novel online reflective learning tool for immediate formative feedback to assist pharmacy students’ reflective writing skills. American Journal of Pharmaceutical Education. https://doi.org/10.5688/ajpe6800

Knight, S., Buckingham Shum, S., Ryan, P., Sándor, Á. and Wang, X. (2018). Designing Academic Writing Analytics for Civil Law Student Self-Assessment. International Journal of Artificial Intelligence in Education, 28(1), 1-28. https://doi.org/10.1007/s40593-016-0121-0. (Part of a Special Issue on Multidisciplinary Approaches to AI and Education for Reading and Writing – Parts 1 & 2. Guest Editors: Rebecca J. Passonneau, Danielle McNamara, Smaranda Muresan, and Dolores Perin.)

Shibani, A., Knight, S., Buckingham Shum, S. and Ryan, P. (2017). Design and Implementation of a Pedagogic Intervention Using Writing Analytics. In Proceedings of the 25th International Conference on Computers in Education. New Zealand: Asia-Pacific Society for Computers in Education.

Gibson, A., Aitken, A., Sándor, Á., Buckingham Shum, S., Tsingos-Lucas, C. and Knight, S. (2017). Reflective Writing Analytics for Actionable Feedback. Proceedings of LAK17: 7th International Conference on Learning Analytics & Knowledge, March 13-17, 2017, Vancouver, BC, Canada. (ACM Press). http://dx.doi.org/10.1145/3027385.3027436 [Preprint] [Replay]

Buckingham Shum, S., Sándor, Á., Goldsmith, R., Bass, R. and McWilliams, M. (2017). Towards Reflective Writing Analytics: Rationale, Methodology and Preliminary Results. Journal of Learning Analytics, 4(1), 58-84. http://dx.doi.org/10.18608/jla.2017.41.5

This is an extended version of: Buckingham Shum, S., Sándor, Á., Goldsmith, R., Wang, X., Bass, R. and McWilliams, M. (2016). Reflecting on Reflective Writing Analytics: Assessment Challenges and Iterative Evaluation of a Prototype Tool. 6th International Learning Analytics & Knowledge Conference (LAK16), Edinburgh, UK, April 25-29, 2016. ACM, New York, NY. http://dx.doi.org/10.1145/2883851.2883955 [Preprint] [Replay]

Critical Perspective on Writing Analytics. Workshop, 6th International Learning Analytics & Knowledge Conference (LAK16), Edinburgh, UK, April 25, 2016. http://wa.utscic.edu.au/events/lak16wa

Simsek, D., Sándor, Á., Buckingham Shum, S., Ferguson, R., De Liddo, A. and Whitelock, D. (2015). Correlations between automated rhetorical analysis and tutors’ grades on student essays. Proceedings of the Fifth International Conference on Learning Analytics And Knowledge, Poughkeepsie, New York, ACM. http://dx.doi.org/10.1145/2723576.2723603
