
OnTask: Scalable Student Feedback

Scaling the Provision of Personalised Learning Support Actions to Large Student Cohorts

Abelardo Pardo (PI), Kathryn Bartimote-Aufflick, Simon McIntyre, Danny Liu (U. Sydney) • Jurgen Schulte (UTS-PI), Simon Buckingham Shum, Roberto Martinez‐Maldonado (UTS) • Shane Dawson, George Siemens, Dragan Gašević (U. South Australia) • Lorenzo Vigentini, Negin Mirriahi (UNSW)

Project website


(Extract from project proposal funded by the Australian Government Office for Learning & Teaching)


Over the last decade, the number of students in Australian higher education (HE) institutions has risen steadily at an average of 4% per year, reaching almost 1,400,000 full-time students in 2014. This dramatic increase has forced universities to scale their operations while maintaining or improving the quality of the student experience. Enlarging lecture theatres or accommodating more students in tutorials is already a challenge in itself, and institutions are grappling with this trend by exploring more flexible delivery options. However, crucial aspects of the student experience, such as assessment and the provision of personalised feedback, are either abandoned because they require prohibitive amounts of resources or addressed with superficial techniques. In the case of assessment, only a small number of fully automatic techniques can be scaled to accommodate large student cohorts. As a consequence, the overall quality of a course is significantly diminished by the restricted range of available assessment types: labour costs and workload constraints effectively rule out assessment types that require a high level of instructor presence and feedback. The situation is further complicated when feedback is expected to be personalised and nuanced. Instructors delivering a course to a large student cohort have no support from technology when they try to scale a one-to-one conversation with students, and their capacity to affect student learning is reduced as a result.

Although Australian HE institutions have an abundance of technologies to support teaching and learning practice, few offer systematised assistance to instructors for personalising student learning on a large scale. By personalising learning we refer to the set of actions, resources and support aimed at maximising the learning opportunities for each student. Learning technologies have focused either on administrative aspects or on automating low-level support actions. For instance, Learning Management Systems (LMSs) provide a cost-effective method to deliver course content and engage students in social learning practice, yet they cannot easily help instructors adapt or alter the curriculum design to cater to a student’s prior experience or knowledge. One set of technologies that can provide automated and personalised feedback is Intelligent Tutoring Systems (ITSs). However, establishing an ITS is not a simple or readily scalable process: they are resource-intensive to create, require complex representations of both the knowledge domain and the student state, and do not easily integrate the expertise of the instructor into the design. Simply put, to date there are limited options for combining automation with instructor intervention to personalise instructional support and learning resources for each student, or at least for clusters of like students. This challenge is exacerbated as class sizes increase, and it prompts the question of how HE institutions can leverage their current technical infrastructure, combine it with instructor expertise, and move towards scalable personalised learning.


This project has two aims. The first is to improve the quality of student learning in large cohorts by scaling the deployment of Personal Learning Support Actions (PLSAs) within Australian HE institutions. We define PLSAs as any instructor-led intervention designed to help students in their learning journey by recognising and acknowledging their strengths and weaknesses, and by suggesting steps or mentorship interventions relevant to their particular situation. This term encompasses conventional actions such as the provision of feedback as well as content personalisation, advice on learning strategies, content recommendations, and visualisations. This project proposes a methodological shift in which students in large cohorts will receive frequent, personalised and relevant learning support actions derived from the combination of state-of-the-art learning analytics approaches and the expertise of the instructors. The second aim of the project is to increase the maturity of learning analytics deployments in educational institutions by providing evidence-based guidelines for moving from small-scale experimentation to organisational transformation.


In order to achieve these aims, the project proposes the following outcomes:

  1. A list of technical requirements to help Australian HE institutions facilitate the data management procedures required to support academics’ use of PLSAs.
  2. A set of guidelines to help academics design their own PLSAs by deriving indicators of student learning or performance relevant to their curriculum, and accompanying intervention strategies.
  3. A suite of case study examples from the participating institutions of how PLSAs have been employed in large student cohorts across different disciplines and teaching strategies. Each case study will exemplify the use of the guidelines and data management procedures, and will provide evidence about effectiveness and impact on student learning.
  4. A software tool for academics to deploy and manage the implementation of PLSAs in their teaching. This tool will integrate with common LMSs used in Australian HE institutions such as Blackboard and Moodle.
  5. A framework to help academics undertake and share evaluations of the impact of their own use of PLSAs with large student cohorts.

The project approaches from a practical point of view the problem of how instructors can leverage their expertise, together with technology, to reach each student with effective support actions. The existing barriers that militate against, or at least impede, the deployment of frequent and effective PLSAs for large or massive student cohorts are addressed by a set of requirements designed to foster change in educational institutions. The data required to support the design and deployment of PLSAs is usually either not available, accessible only through procedures too convoluted for frequent use, or not appropriately related to personal indicators within the learning design. Furthermore, deciding on the most appropriate support action usually requires the intervention of the instructor, and the combination of technological support and human know-how remains unsolved in learning contexts. Finally, although technology offers a large variety of channels to communicate with students in learning contexts, this communication is reduced to conventional channels (email), or in some cases its moderation becomes unfeasible when aiming at personalisation at scale.

The project proposes a solution based on analytics methods that combine data collected while students participate in technology-mediated learning experiences with the expertise of instructors to scale the provision of this support. We will provide and disseminate guidelines and exemplars showing how to make data about learning environments easily available to instructors, how to capture instructor expertise to select and personalise a PLSA, and finally how to deploy these actions in large student cohorts. Overcoming the aforementioned barriers will allow Australian educational institutions to fully embrace a change towards comprehensive personalisation in which the quality of courses with large student cohorts is not compromised.
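To make the core idea concrete, the combination of LMS-derived indicators with instructor expertise can be thought of as a set of instructor-authored rules, each pairing a condition over a student's activity indicators with a personalised message fragment. The sketch below is a minimal, hypothetical illustration of that rule-based approach; the indicator names, thresholds and messages are illustrative assumptions, not the project's actual design or the OnTask tool's API.

```python
# Hypothetical sketch of a rule-based Personal Learning Support Action (PLSA).
# Indicator names and thresholds are invented for illustration only.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """An instructor-authored rule: a condition over activity indicators
    paired with a personalised message fragment."""
    condition: Callable[[dict], bool]
    message: str

# Instructor expertise encoded as simple conditions over LMS-derived indicators.
RULES = [
    Rule(lambda s: s["videos_watched"] < 3,
         "You have watched few of this week's videos; try to catch up before the tutorial."),
    Rule(lambda s: s["quiz_score"] < 50,
         "Your quiz score suggests revisiting this week's core concepts."),
    Rule(lambda s: s["quiz_score"] >= 80,
         "Great quiz result; consider attempting the extension exercises."),
]

def compose_plsa(name: str, indicators: dict) -> str:
    """Assemble a personalised support message for one student by applying
    every rule whose condition matches that student's indicators."""
    fragments = [r.message for r in RULES if r.condition(indicators)]
    if not fragments:
        fragments = ["You are on track this week; keep it up."]
    return f"Hi {name},\n" + "\n".join(f"- {f}" for f in fragments)

# One call per student scales the instructor's expertise across the cohort.
print(compose_plsa("Alex", {"videos_watched": 1, "quiz_score": 45}))
```

Because the rules are authored once but evaluated per student, the same instructor effort reaches an arbitrarily large cohort, which is the scaling property the project seeks.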