
PhD in Learning Analytics


Welcome to the UTS:CIC Doctoral Program


We are delighted to announce that CIC’s doctoral program in Learning Analytics is offering new UTS Scholarships for 2017.

CIC’s primary mission is to maximise the benefits of analytics for UTS teaching and learning. The Learning Analytics Doctoral Program is part of our strategy to cultivate transdisciplinary innovation to tackle challenges at UTS, through rigorous methodologies, arguments and evidence. A core focus is the personalisation of the learning experience, especially through improved feedback to learners and educators.

As you will see from our work, and the PhD topics advertised, we have a particular interest in analytics techniques to nurture in learners the creative, critical, sensemaking qualities needed for lifelong learning, employment and citizenship in a complex, data-saturated society.

We invite you to apply for a place if you are committed to working in a transdisciplinary team to invent user-centered analytics tools in close partnership with the UTS staff and students who are our ‘clients’.

Please explore this website so you understand the context in which we work, and the research topics we are supervising. We look forward to hearing why you wish to join CIC, and how your background, skills and aspirations could advance this program.

UTS has been awarded a five-star rating for excellence in all eight categories of higher education assessed by QS™ for 2014-2016. UTS is ranked 1st in Australia and 21st globally in the Times Higher Education top 100 universities under 50 years of age, and is in the top 250 universities of the Times Higher Education World University Rankings, which judge world-class universities across all of their core missions: teaching, research, knowledge transfer and international outlook.

CIC sits in UTS within the portfolio of Prof. Shirley Alexander, Deputy Vice-Chancellor and Vice-President, Education & Students — whose learning and teaching strategy, through a $1.2B investment in new learning spaces, is ranked as world leading in a recent analysis. Data and analytics are a core enabler of the UTS vision for learning.futures, while other initiatives such as Assessment Futures pose new challenges for authentic assessment that analytics should help to tackle.

Our primary audience is UTS, working closely with faculties, information technology and student support units to prototype new analytics applications. Since we are breaking new ground, developing approaches that have wide applicability, we disseminate this for research impact. As you can see from our projects, we conduct both internally and externally-funded work.

CIC works closely with key teams in UTS who support the improvement of teaching and learning, including the Institute for Interactive Media & Learning (IML), Higher Education Language & Presentation Support (HELPS), and the Information & Technology Division to ensure that our tools can operate and scale in UTS as required. The annual Learning & Teaching Awards showcase leading educator practice, while the Assessment Futures program is defining the contours of assessment regimes relevant to the real world.

While you are expected to take charge of your own tool development, CIC’s application developer may well be able to support you with some web, mobile or script development to enable your research.

While CIC is inventing new analytics tools, we are also interested in evaluating open source and commercial learner-facing applications that have interesting potential for analytics.

PhD projects often add to and learn from ongoing projects, so think about whether your work connects to mainstream tools used in UTS such as Blackboard, SPARK, ReView and A.nnotate, as well as more experimental products such as Glyma and Declara, and prototypes like AWA, CLA and Compendium. You may bring expertise in particular data analysis tools; those already in use in CIC include R, Weka, RapidMiner, ProM and Tableau.

Topic-specific technical skills and academic grounding that you will need for your PhD are specified in the PhD project descriptions, but there are some common skills and dispositions that we are seeking, given the way that we work.

  • CIC is committed to multidisciplinarity, which we hope will become transdisciplinary as we build enough common ground for the disciplines to inform or even transform perspectives. Thinking outside your ‘home turf’ is not easy or comfortable, but we are seeking people with an appetite to stretch themselves with new worldviews.
  • CIC is committed to user-centered participatory design of learning analytics tools, so you will need passion for, and commitment to, working with non-technical users as you prototype new tools. We are seeking excellent interpersonal and communication skills in order to translate between the technical and educational worlds, and creative design thinking to help users engage with new kinds of tools. Ideally, you will already have had some design experience, but this can also be an area you want to learn.


Three successful candidates will be eligible for a 3-year Scholarship of $35,000/pa (a substantial increase on the standard Australian PhD stipend of $25,849). To this you may be able to add teaching income from the many opportunities to work with our Master of Data Science & Innovation students. In addition, as far as possible, CIC will fund you to present peer-reviewed papers at approved, high-quality conferences.

Domestic students have their fees covered by the Australian government, and international students will receive a UTS International Research Scholarship covering tuition fees. Please note, all scholarships at UTS are dependent upon satisfactory progress throughout the three years.

We are also open to applications from self-funded full-time and part-time candidates, in which case you may propose other topics that fit CIC’s priorities.


To be eligible for a scholarship, a student must minimally:

  • have completed a Bachelor Degree with First Class Honours or Second Class Honours (Division 1), or be regarded by the University as having an equivalent level of attainment;
  • have been accepted for a higher degree by research at UTS in the year of the scholarship;
  • have completed enrolment as a full-time student.

Additional requirements for applicants are specified on the CIC PhD website.

Selection Criteria

Appointments will be made based on the quality of the candidates, their proposals, the overall coherence of the team, the potential contribution to UTS student and educator experience, and the research advances that will result.

The criteria, both generic and specific to the advertised projects, are specified on the CIC PhD website. Evidence will be taken from an applicant’s written application, face-to-face/video interview, multimedia research presentation at interview, and references.


Applicants for a Studentship should submit:

  • Covering letter
  • Curriculum Vitae
  • Research Proposal, maximum 4 pages, applying for one of the advertised PhD topics

Please email your scholarship application as a PDF, with PhD 2017 Application in the subject line, to:

Georgia Markakis <Georgia.Markakis@uts.edu.au>

Following discussion with the relevant potential supervisors, you will be required to complete the formal UTS application process.

To begin this formal application process, click here and complete the following steps:

  1. Scroll down to Point 4 (“Lodge your application”)
  2. Click on the blue “Register and Apply” button
  3. When you reach the section asking for a course code, do not type anything in; instead, search for the course name and then select it.


We aim to appoint for August 2017, the start of the second semester. We invite applications by close of Monday 3 April 2017, with shortlisting for interview shortly after. However, you are strongly encouraged to contact project leads informally well before then to open discussions: if we like your application, we will offer you a place as soon as we can.

So please get in touch with the Director if you have queries about CIC in general, and with the relevant supervisors about the topic of interest to you.

The UTS application form, and further guidance on preparation and submission of your research proposal, are on the UTS Research Degrees website. The 30 October deadline does not apply to these Scholarships.

PhD Topics

We invite scholarship applications to investigate the following topics, which are broad enough for you to bring your own perspective. If you have your own funding, then you may propose another topic which fits with CIC’s priorities.

  1. LA Across Spaces
  2. LA for Writing Practices
  3. Human-Centred Design for LA
  4. Debate Visualisation & Analytics
  5. Data interoperability and analytics for lifelong personalised learning

Multimodal Learning Analytics Across Spaces


Roberto Martinez-Maldonado and Simon Buckingham Shum

The Challenge

In blended learning we deploy a variety of educational technologies and pedagogical resources across online and face-to-face settings. Yet as we build the qualities needed for lifelong learning, and increasingly authentic assessment within formal education, learner activities must necessarily span spaces and moments beyond formal educational contexts and tools (Delgado Kloos et al., 2012; Sharples & Roschelle, 2010).

The learning analytics challenge for this PhD is to research, prototype and evaluate approaches to automatically capture traces of students’ activity, using analytics techniques to make sense of data from heterogeneous contexts. Depending on the trajectory that you take, examples of the questions that such a project could investigate include:

  • How can insights into students’ activity across spaces be connected with higher-level pedagogies?
  • How can these insights promote productive behavioural change?
  • How can the teacher be supported with this information to provide informed feedback?
  • How can this information support more authentic and holistic assessment?
  • What are the technical challenges that need to be overcome?
  • How can multimodal analytics approaches be applied to gain a holistic understanding of students’ activity?
  • How do learning theories and learning design patterns map to the orchestration of such analytics tools?

Analytic Approaches

We are seeking a PhD candidate interested in designing and connecting learning analytics solutions according to the pedagogical needs and contextual constraints of learning that occurs across physical and digital spaces. Continued support in the classroom, in mobile experiences and in web-based systems has been explored to different extents, and each setting poses its own challenges. An overarching concern is how to integrate and coordinate learning analytics in a coherent way. Synergies with educational frameworks and theories already drawn on by the CIC team will be encouraged, such as Learning Power (Deakin Crick et al., 2015; the CLARA tool), epistemic cognition, and science and technology studies of information infrastructure. The Connected Learning Analytics toolkit (Kitto et al., 2015) is another candidate infrastructure.

Addressing these questions should lead to educationally grounded machine learning techniques that give insight into heterogeneous activity traces (e.g. Martinez-Maldonado et al., 2013), and to the design and evaluation of teacher- and/or student-facing dashboards that provoke productive sensemaking and inform action (e.g. Martinez-Maldonado et al., 2012). We invite your proposals as to which techniques might be best suited to this challenge.
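To make the idea of analysing heterogeneous activity traces concrete, here is a minimal, hypothetical sketch: events from two imagined sources (a classroom tabletop and an online system) are merged into one time-ordered trace per student, and consecutive action pairs are counted as a toy stand-in for sequential pattern mining. All field names, sources and actions are invented for illustration; they do not describe any specific CIC tool.

```python
from collections import Counter
from itertools import chain

# Hypothetical event logs from two heterogeneous sources. The field
# names ("student", "time", "action") are illustrative assumptions.
tabletop_events = [
    {"student": "s1", "time": 10, "action": "add_item"},
    {"student": "s1", "time": 25, "action": "discuss"},
    {"student": "s2", "time": 12, "action": "add_item"},
]
lms_events = [
    {"student": "s1", "time": 18, "action": "read_resource"},
    {"student": "s2", "time": 30, "action": "post_forum"},
]

def unify_traces(*logs):
    """Merge per-source logs into one time-ordered action trace per student."""
    traces = {}
    for event in sorted(chain(*logs), key=lambda e: e["time"]):
        traces.setdefault(event["student"], []).append(event["action"])
    return traces

def frequent_bigrams(traces):
    """Count consecutive action pairs across all students' traces."""
    counts = Counter()
    for actions in traces.values():
        counts.update(zip(actions, actions[1:]))
    return counts

traces = unify_traces(tabletop_events, lms_events)
patterns = frequent_bigrams(traces)
```

In practice one would substitute real sequence- or process-mining techniques (e.g. those supported by ProM or RapidMiner, mentioned above) for the bigram count, but the core step of aligning heterogeneous logs onto a common timeline remains.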

You will work in close collaboration with ‘clients’ from other faculties/units in UTS, and potentially industry partners, with opportunities for synergy with existing projects and tools as described on the CIC website. For more information about ongoing research in this area, please visit the CrossLAK website.


In addition to the skills and dispositions that we are seeking in all candidates, you should have:

  • A Masters degree, Honours distinction or equivalent with at least above-average grades in computer science, mathematics, statistics, or equivalent
  • Analytical, creative and innovative approach to solving problems
  • Strong interest in designing and conducting quantitative, qualitative or mixed-method studies
  • Strong programming skills in at least one relevant language (e.g. C/C++, .NET, Java, Python, R, etc.)
  • Experience with data mining, data analytics or business intelligence tools (e.g. Weka, ProM, RapidMiner). Visualisation tools are a bonus.

It is advantageous if you can evidence:

  • Experience in designing and conducting quantitative, qualitative or mixed-method studies
  • Familiarity with educational theory, instructional design, learning sciences
  • Peer reviewed publications
  • A digital scholarship profile
  • Design of user-centered software

Interested candidates should contact Roberto.Martinez-Maldonado@uts.edu.au and Simon.BuckinghamShum@uts.edu.au with informal queries. Please follow the application procedure for the submission of your proposal.


Aljohani, Naif R. and Davis, Hugh C. (2012) Learning analytics in mobile and ubiquitous learning environments. In Proceedings of the 11th World Conference on Mobile and Contextual Learning: mLearn 2012, Helsinki, Finland.

Deakin Crick, R., S. Huang, A. Ahmed-Shafi and C. Goldspink (2015). Developing Resilient Agency in Learning: The Internal Structure of Learning Power. British Journal of Educational Studies 63(2): 121- 160.

Delgado Kloos, Carlos, D. Hernández-Leo, and J. I. Asensio-Pérez. (2012). Technology for Learning across Physical and Virtual Spaces: Special Issue. Journal of Universal Computer Science, 18(15), pp. 2093-2096.

Kitto, Kirsty, Sebastian Cross, Zak Waters, and Mandy Lupton. (2015). Learning analytics beyond the LMS: the connected learning analytics toolkit. In Proceedings of the 5th International Conference on Learning Analytics And Knowledge, Poughkeepsie, New York: ACM, pp. 11-15

Martinez-Maldonado, R., Clayphan, A., Yacef, K. and Kay, J. (2015) MTFeedback: providing notifications to enhance teacher awareness of small group work in the classroom. IEEE Transactions on Learning Technologies, TLT, 8(2): 187-200

Martinez-Maldonado, R., Kay, J. and Yacef, K. (2013) An automatic approach for mining patterns of collaboration around an interactive tabletop.  International Conference on Artificial Intelligence in Education, AIED 2013, pages 101-110.

Martinez-Maldonado, R., Yacef, K., Kay, J., and Schwendimann, B. (2012) An interactive teacher’s dashboard for monitoring multiple groups in a multi-tabletop learning environment.  International Conference on Intelligent Tutoring Systems, pages 482-492.

Sharples, Mike, and Jeremy Roschelle. (2010). Guest editorial: Special section on mobile and ubiquitous technologies for learning. IEEE Transactions on Learning Technologies, (1), pp. 4-6.

Learning Analytics for Writing Practices


Simon Knight and Simon Buckingham Shum

The Challenge

Literacy, including the abilities to comprehend rich multimedia and to communicate effectively through written texts, is key to learning and to full participation in society (OECD, 2013; OECD & Statistics Canada, 2010). In everyday contexts, for example, parents need to understand their child’s vaccination needs; voters want to weigh up the claims of politicians on climate change; and students want to assess the relative weight of argument around the causes of some historic event.

In each case, information seekers require more than just the ability to read content; they must make decisions about where to look for information, which sources to select (and corroborate), and how to synthesise (sometimes competing) claims from diverse sources. In an educational, business or other work context, they must typically produce a high-quality written synthesis making an argument for a point of view, or decision.

The learning analytics challenge is to research, design and evaluate techniques to make sense of the data traces produced from search, discourse and writing. These should illuminate the relationships between the above activities, and provide feedback for learners and educators that can inform productive reflection and action.

Analytic Approaches

One class of research into the sorts of “literacy practices” introduced above has studied multiple document processing (MDP), the ability to read, comprehend and integrate information from across sources (see, for examples, Bråten, 2008; Bråten, Britt, Strømsø, & Rouet, 2011; Ferguson, 2014; Foltz, Britt, & Perfetti, 1996; Goldman et al., 2011; Hastings, Hughes, Magliano, Goldman, & Lawless, 2012; Kobayashi, 2014; Rouet & Britt, 2011). We are particularly interested in research viewing these behaviours through the lens of epistemic cognition – beliefs about the certainty, simplicity, source, and justification of knowledge (see, for examples, Bråten, 2008; Bråten et al., 2011; Ferguson, 2014).

Across this work, analyses of both the processes and the products of writing have emerged. Emerging language technologies make it possible to detect features in output texts that relate to literacy and to high-level epistemic cognition. For example, written outputs can be analysed for: typical scholarly moves, via rhetorical parsing (see, for example, de Waard, Buitelaar, & Eigner, 2009; Groza, Handschuh, & Bordea, 2010; Simsek, Buckingham Shum, Sándor, De Liddo, & Ferguson, 2013); text cohesion (McNamara, Louwerse, McCarthy, & Graesser, 2010); and topic coverage (see, for example, Hastings et al., 2012).
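To illustrate the flavour of such text-feature detection, here is a deliberately simplified sketch of one cohesion measure: the average word overlap between adjacent sentences, a toy analogue of the referential-cohesion indices computed by tools like Coh-Metrix. The stopword list and scoring are invented simplifications, not the method of any cited system.

```python
import re

# A tiny illustrative stopword list; real systems use far richer resources.
STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "are", "these"}

def content_words(sentence):
    """Return the set of lowercased content words in a sentence."""
    return {w for w in re.findall(r"[a-z']+", sentence.lower())
            if w not in STOPWORDS}

def local_cohesion(text):
    """Mean Jaccard overlap of content words between adjacent sentences."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if len(sentences) < 2:
        return 0.0
    scores = []
    for a, b in zip(sentences, sentences[1:]):
        wa, wb = content_words(a), content_words(b)
        scores.append(len(wa & wb) / len(wa | wb) if wa | wb else 0.0)
    return sum(scores) / len(scores)

cohesive = "The essay cites sources. These sources support the essay."
choppy = "The essay cites sources. Bananas are yellow."
# A text whose sentences share referents should score higher than a
# disjointed one: local_cohesion(cohesive) > local_cohesion(choppy).
```

Real cohesion analysis also tracks pronouns, synonyms and deeper semantic links, but this sketch shows the basic shape: extract linguistic features from the text, then aggregate them into indicators that can feed learner- or educator-facing feedback.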

Proposals are welcomed that address the use of writing-practice-based learning analytics to support student learning. Writing practices are broadly construed here to include activities such as information seeking; reading; annotation; writing itself (both the process and the output); and self- and peer-assessment. We invite proposals that address: experimental paradigms, such as MDP tasks, to investigate student writing; analytic techniques to explore semantic and meta-discourse properties of written outputs, and their relation to source documents (see discussion above); analysis of writing processes, including temporal analyses (further resources); and assessment tools to explore the best methods for feedback and constructive peer- and self-assessment, or calibrated peer review (Balfour, 2013). We also welcome proposals with a focus on collaborative knowledge practices, including co-writing; the giving of constructive formative feedback; and joint enterprise on the writing practices described above.

This PhD will contribute intellectually and technically to the ongoing research program being developed around the Academic Writing Analytics platform. You will work in close collaboration with ‘clients’ from other faculties/units in UTS, and potentially industry partners, with opportunities for synergy with existing projects and tools as described on the CIC website.


Proposals are welcomed from candidates with a range of backgrounds and skills. We welcome proposals from language technologists, computational linguists and other computer or information science backgrounds, and from those with backgrounds in education, psychology, or related social science disciplines. All proposals should be transdisciplinary in nature; orient your proposal to your particular strengths and interests, within the CIC context as a technology- and innovation-directed centre aiming at impact on student learning.

In addition to the skills and dispositions that we are seeking in all candidates, you should have:

  • A higher degree in a computer, information, or learning science discipline, other relevant social science subject, or computational linguistics
  • Demonstrated knowledge of educational contexts (e.g. through applied work in education)
  • Previous experience using computational language technologies

It is advantageous if you can evidence:

  • Previous research experience e.g. at Masters level
  • Knowledge of, or willingness to learn, a relevant programming language (e.g. R, Python)
  • Experience in designing and conducting quantitative, qualitative or mixed-method studies
  • Peer reviewed publications
  • A digital scholarship profile
  • Design of user-centered software

Interested candidates should contact Simon.Knight@uts.edu.au and Simon.BuckinghamShum@uts.edu.au with informal queries. Please follow the application procedure for the submission of your proposal.

Example projects

We welcome applications addressing the theme described above. Project proposals might address one of the following examples:

  • Writing Analytics Evidence Hub: We need to build our collective intelligence around the key issues in student writing, interventions to target support, and measuring impact. An evidence hub (e.g. here) is one means through which to do this. A ‘collective intelligence’ PhD would catalyse our work in this area, to support us in developing writing analytics, build organisational or social infrastructure, and build ‘writing analytics literacy’ across stakeholders at UTS.
  • Source-based summary writing: Source-based summary writing is a key academic practice (incorporating academic integrity), involving evaluation and synthesis of multiple (sometimes competing) claims. There are limited methods to analyse such writing, but doing so could give insight into epistemic cognition and other features of student cognition. A PhD in this area would develop approaches to the design and implementation of tasks to assess students’ source-based summary writing, with novel implementations in partnership with UTS academics.
  • Virtual internships: Developing professional practice is a key for successful graduates, and a core component of the UTS model of learning. We are investigating the potential of ‘virtual internships’ (e.g. here), to support learners in developing their professional identity. A PhD on virtual internships would design and deploy a VI in partnership with UTS academics, investigating the potential of analytics (including writing analytics, and analytics across space) to support learning.
  • Feedback feedback: Building evaluative judgement is an important part of learning to distinguish between high- and low-quality work, including in self-assessment. Evaluative judgement and the giving of high-quality feedback have been associated with positive learning outcomes in student assessors. Some work has been conducted to develop ‘feedback feedback’ systems – systems that support feedback-givers in providing high-quality feedback. A PhD in this area would develop pedagogic models and analytic tools to deploy and evaluate the impact of ‘feedback feedback’.


Balfour, S. P. (2013). Assessing writing in MOOCs: Automated essay scoring and calibrated peer review. Research & Practice in Assessment, 8(1), 40–48.

Bråten, I. (2008). Personal Epistemology, Understanding of Multiple Texts, and Learning Within Internet Technologies. In M. S. Khine (Ed.), Knowing, Knowledge and Beliefs (pp. 351–376). Dordrecht: Springer Netherlands. Retrieved from http://www.springerlink.com/content/j664674514614405/

Bråten, I., Britt, M. A., Strømsø, H. I., & Rouet, J.-F. (2011). The role of epistemic beliefs in the comprehension of multiple expository texts: Toward an integrated model. Educational Psychologist, 46(1), 48–70. http://doi.org/10.1080/00461520.2011.538647

de Waard, A., Buitelaar, P., & Eigner, T. (2009). Identifying the epistemic value of discourse segments in biology texts. In Proceedings of the Eighth International Conference on Computational Semantics (pp. 351–354). Stroudsburg, PA, USA: Association for Computational Linguistics. Retrieved from http://dl.acm.org/citation.cfm?id=1693756.1693802

Ferguson, L. E. (2014). Epistemic Beliefs and Their Relation to Multiple-Text Comprehension: A Norwegian Program of Research. Scandinavian Journal of Educational Research, 0(0), 1–22. http://doi.org/10.1080/00313831.2014.971863

Foltz, P. W., Britt, M. A., & Perfetti, C. A. (1996). Reasoning from multiple texts: An automatic analysis of readers’ situation models. In G. W. Cottrell (Ed.), Proceedings of the 18th Annual Cognitive Science Conference (pp. 110–115). Lawrence Erlbaum, NJ. Retrieved from http://www-psych.nmsu.edu/~pfoltz/reprints/cogsci96.html

Goldman, S. R., Ozuru, Y., Braasch, J. L. G., Manning, F. H., Lawless, K. A., Gomez, K. W., & Slanovits, M. (2011). Literacies for learning: A multiple source comprehension illustration. In N. Stein L. & S. Raudenbush (Eds.), Developmental science goes to school: Implications for policy and practice (pp. 30–44). Abingdon, Oxon: Routledge.

Groza, T., Handschuh, S., & Bordea, G. (2010). Towards automatic extraction of epistemic items from scientific publications. In Proceedings of the 2010 ACM Symposium on Applied Computing (pp. 1341–1348). New York, NY, USA: ACM. http://doi.org/10.1145/1774088.1774377

Hastings, P., Hughes, S., Magliano, J. P., Goldman, S. R., & Lawless, K. (2012). Assessing the use of multiple sources in student essays. Behavior Research Methods, 44(3), 622–633. http://doi.org/10.3758/s13428-012-0214-0

Kobayashi, K. (2014). Students’ consideration of source information during the reading of multiple texts and its effect on intertextual conflict resolution. Instructional Science, 42(2), 183–205.

McNamara, D. S., Louwerse, M. M., McCarthy, P. M., & Graesser, A. C. (2010). Coh-Metrix: Capturing Linguistic Features of Cohesion. Discourse Processes, 47(4), 292–330. http://doi.org/10.1080/01638530902959943

OECD. (2013). PISA 2015: Draft reading literacy framework. OECD Publishing. Retrieved from http://www.oecd.org/pisa/pisaproducts/Draft PISA 2015 Reading Framework .pdf

OECD, & Statistics Canada. (2010). Literacy in the Information Age – Final Report of the International Adult Literacy Survey. OECD. Retrieved from http://www.oecd.org/edu/skills-beyond-school/41529765.pdf

Rouet, J.-F., & Britt, M. A. (2011). Relevance processes in multiple document comprehension. In M. T. McCrudden, J. P. Magliano, & G. Schraw (Eds.), Text relevance and learning from text (pp. 19–52). Information Age Publishing (IAP). Retrieved from http://www.niu.edu/britt/recent_papers/pdfs/RouetBritt_chapter_for_McCrudden.pdf

Simsek, D., Buckingham Shum, S., Sándor, Á., De Liddo, A., & Ferguson, R. (2013). XIP Dashboard: Visual Analytics from Automated Rhetorical Parsing of Scientific Metadiscourse. 1st International Workshop on Discourse-Centric Learning Analytics, 3rd International Conference on Learning Analytics & Knowledge, Leuven, BE (Apr. 8-12, 2013). Open Access Eprint: http://oro.open.ac.uk/37391


Human-Centred Design for Learning Analytics


Theresa Anderson, Simon Buckingham Shum, Ruth Crick & Simon Knight (depending on topic)

The Challenge

We are in the midst of a profound shift to a data-infused, algorithmically pervaded society, with learning analytics (as a research field and marketplace) offering a stage for how this will play out in education. In this PhD, we seek a candidate who wants to engage with the debate on how socio-technical infrastructures can deliver computational intelligence in society in accountable, ethical ways, in order to enhance rather than erase human agency.

This debate needs to be contextualised to education, and specifically learning analytics design and research — not only because algorithms are central to analytics, but because agency is central to deep learning. For example, learning analytics technologies provoke concerns on the part of some educators, who fear that an algorithmic mindset is incompatible with one that values the qualities and processes associated with agency: creativity, critical thinking, community, deep learning. For others, analytics provide the exact opposite: the chance to make such learning processes (not just products) a quality which can be evidenced rigorously for the first time.

This PhD will engage with this debate, exploring to what extent the concerns are justified, whether the critiques can be addressed through better design processes and software, and the state of the art in using analytics to enhance rather than reduce learner agency and mindfulness.

Analytic Approaches

Knight and Buckingham Shum (In Press) note some of the discourse now emerging around algorithms and agency in society at large. We can begin to contextualise these to learning analytics.

Data, especially “Big Data”, has a misleading aura of completeness. It must be subjected to critical scepticism and questioning if it is to serve as a societal good (boyd & Crawford, 2011). The curation of large datasets invariably requires human effort and, when examined, is replete with compromises and limitations at odds with the dominant rhetoric of objectivity (Leonelli, 2014).

Learning analytics as scientific infrastructures. The design of an analytics lifecycle, from data capture to analysis, rendering, interpretation and action, is pervaded with human judgements and intentions. In the historical study of scientific infrastructures, we recognise that, “Classification systems provide both a warrant and a tool for forgetting… what to forget and how to forget it… The argument comes down to asking not only what gets coded in but what gets coded out of a given scheme” (Bowker & Star, 1999, pp. 277–279). Thus: “raw data is both an oxymoron and a bad idea; to the contrary, data should be cooked with care” (Bowker, 2006, p. 185). Data cannot therefore ‘speak for itself’ — it must be given voice and action through human sensemaking, or by a computational agent which nonetheless embodies assumptions.

Algorithmic accountability and stakeholder trust. Algorithms are opaque for a variety of reasons (Burrell, 2016), and both shape and are shaped by cultural practices (Dourish, 2016). Algorithms sit at the heart of analytics, but their design and debugging is the preserve of the few. In an era when software is embedded in appliances and online everywhere, one might ask whether learners or educators should be troubling themselves with why the system is behaving as it does. For others, however, there is an ethical and pedagogical need to articulate the possible definitions and meanings of “algorithmic accountability” (Diakopoulos, 2014), such that learning analytics have appropriate transparency to different stakeholders.

Learning analytics design processes that give the end-user voice. Academic and design fields such as Human-Computer Interaction, Social Informatics and Values-Sensitive Design provide ways to design learning analytics that could reduce the risk of such tools fostering student and faculty suspicion, and build a greater sense of agency, and hence trust, in such tools. Research in these fields recognises that design is not a neutral activity, which is why sensitivity to all stakeholder perspectives and their values is foregrounded, along the lines discussed in Jaifari, Nathan and Hargraves (2015), for instance. For this reason, the approach taken in this project is most appropriately framed using a participatory, value-sensitive methodology.

Agency: from clicks to constructs. Aspects of “learner agency” that are of particular interest to us are qualities such as resilience in the face of challenge and uncertainty, creativity and playfulness in problem solving, one’s ability to engage in social learning, and one’s ability to make connections across contexts (Anderson, 2012; 2013; 2014; Crick et al. 2015). These are among the highest order competencies that humans display, and which we seek to nurture in learners. While these can be assessed through direct observation and self-report, can these be meaningfully identified in data traces from student activity, or is attempting to quantify such qualities one step too far in algorithmic intelligence hubris?

A key outcome from this PhD is an account of how the wider critical discourse around algorithms in society informs, and is informed by, the design of learning analytics. A second outcome is an analysis of whether it is reasonable to design analytics as proxy indicators for ‘agency’ — that is, to quantify what some consider unquantifiable.

Depending on the candidate’s interests and expertise, the argument might be grounded in theory, empirical evidence and design prototypes. Thus, the PhD might:

  • translate theoretical implications into practical design guidance for learning analytics designers, researchers and educators, to help them understand and engage with the issues of values and ethics that lie at the heart of any computational model
  • implement prototype analytics tools embodying those principles, or make modifications to existing tools and how they are deployed, in the light of the principles
  • devise and validate participatory design processes which empower learners and educators in ways that address concerns about privacy violations, algorithmic opacity, or inappropriate educational interventions from limited data
  • engage with the legal and ethical ramifications of a growing dependency on algorithms in educational systems.


In addition to the skills and dispositions that we are seeking in all candidates, you should have:

  • A Masters degree, Honours distinction or equivalent in a relevant discipline, e.g. Science & Technology Studies, Education, Design, Information Sciences, Human-Computer Interaction, Ethics of IT
  • An analytical, creative and innovative approach to solving problems
  • Experience in designing and conducting quantitative, qualitative or mixed-method studies

It is advantageous if you can evidence:

  • Experience with educational theory, instructional design, learning sciences
  • Coding ability
  • Peer reviewed publications
  • A digital scholarship profile
  • Design of user-centered software


Anderson, T.K. 2012, ‘Information Science and 21st Century Information Practices: creatively engaging with information‘ in Bawden, D. & Robinson, L. (eds), Introduction to Information Science, Facet Publishing, UK, pp. 15-17.

Anderson, T.K. 2013, ‘The 4Ps of innovation culture: conceptions of creatively engaging with information‘, Information Research – Proceedings of the Eighth International Conference on Conceptions of Library and Information Science, Copenhagen, Denmark, pp. 1-15.

Anderson, T.K. 2014, ‘Making the 4Ps as important as the 4Rs’, Knowledge Quest, vol. 42, no. 5, pp. 42-47.

Bowker, G. C. (2006). Memory Practices in the Sciences. Cambridge, MA: MIT Press.

Bowker, G. C., & Star, L. S. (1999). Sorting Things Out: Classification and Its Consequences. Cambridge, MA: MIT Press.

boyd, d. and K. Crawford (2012). Critical Questions for Big Data: Provocations for a Cultural, Technological, and Scholarly Phenomenon. Information, Communication & Society 15(5): 662-679.

Burrell, J. (2016). How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society 3(1). http://doi.org/10.1177/2053951715622512

Deakin Crick, R., S. Huang, A. Ahmed-Shafi and C. Goldspink (2015). Developing Resilient Agency in Learning: The Internal Structure of Learning Power. British Journal of Educational Studies 63(2): 121- 160.

Dourish, P. (2016). Algorithms and their others: Algorithmic culture in context. Big Data & Society, July–December 2016: 1–11. http://doi.org/10.1177/2053951716665128

Diakopoulos, N. (2014). Algorithmic Accountability. Digital Journalism, 3(3), 398–415.

Jafari, N., Nathan, L., and Hargraves, I. (2015) Values as hypotheses: design, inquiry, and the service of values. Design Issues 31(4): 91-104.

Knight, S. and Buckingham Shum, S. (In Press). Theory and Learning Analytics. Handbook of Learning Analytics and Educational Data Mining.

Leonelli, S. (2014). What difference does quantity make? On the epistemology of Big Data in biology. Big Data & Society, Apr-June 2014, pp.1-11. [Supplementary Media]

Debate Visualisation & Analytics

Simon Buckingham Shum (UTS:CIC) & Anna De Liddo (Knowledge Media Institute, The Open University UK)

The Challenge

There has been longstanding concern in many quarters of society about the deteriorating quality of civic and political discourse (e.g. Sunstein, 2017). This concern has been accentuated most recently by political events such as the UK “Brexit” referendum and the U.S. presidential election, in which the public debate is widely regarded to have been particularly poor. In parallel, an established body of work in educational research and practice has focused on students’ ability to think critically about their studies and about contemporary issues, and to develop the sensemaking ability needed to assess the quality of sources in an information-saturated society (e.g. Davies & Barnett, 2015).

Evidently, there are many reasons behind the poor quality of public discourse, and multiple, often contrasting drivers shape how someone casts their vote. No PhD project can possibly address all of these. However, the ability (and disposition) to think critically when confronted by truth claims is an important factor that can legitimately be laid at the door of the education system. In the era of fake news and cognitive warfare (Bennett, 2016; Ferrara et al., 2014), how can we equip a viewer with the skills needed to make sense of complex debates by critically assessing claims, recognising efforts at social and rhetorical manipulation, and questioning assumptions? How can media websites and social media help to elevate the quality of public discourse? A timely and substantive challenge for universities is to devise and validate practical, scalable approaches that could rapidly improve such qualities and skills in students — our future citizens.

Analytics Approaches

The supervisors have a long track record of working together at the UK Open University’s Knowledge Media Institute (KMi), one of Europe’s leading labs developing future knowledge technologies. Our work focuses on the design and evaluation of Collective Intelligence and visualisation tools for personal and collaborative sensemaking (Buckingham Shum, 2003; Okada et al., 2014). The modelling and visualisation approach at the core of this work centres on semiformal representations of discourse moves visualised as networks. The enabling platforms started with a powerful visual hypermedia desktop tool (Compendium), transitioned to collaborative web platforms for pooling community claims and evidence (Evidence Hub) and for structured online deliberation (e.g. LiteMap and DebateHub), and have gained increasing capability for automated analytics services (CIdashboard) to help make sense of large online conversations (De Liddo et al., 2014, 2012a,b,c). Research into public deliberation platforms intersects in interesting ways with research into argumentation analytics for Computer-Supported Collaborative Learning (Buckingham Shum et al., 2014).
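To illustrate the kind of semiformal discourse network these platforms model, here is a minimal Python sketch: contributions become typed nodes, and typed links connect them. The node types, link labels and example texts are illustrative assumptions, not the actual Compendium or LiteMap schemas.

```python
# Nodes are contributions typed by discourse move; edges are typed links.
# Types and labels here are illustrative, not a real platform schema.
nodes = {
    "c1": {"type": "claim", "text": "Televised debates improve voter knowledge"},
    "e1": {"type": "evidence", "text": "Post-debate viewer survey"},
    "q1": {"type": "question", "text": "Does this generalise beyond the UK?"},
}
edges = [
    ("e1", "supports", "c1"),
    ("q1", "challenges", "c1"),
]

def links_to(node_id, link_type):
    """Return ids of nodes connected to node_id by the given link type."""
    return [src for src, kind, dst in edges if dst == node_id and kind == link_type]

print(links_to("c1", "supports"))    # ['e1']
print(links_to("c1", "challenges"))  # ['q1']
```

Once discourse is structured this way, analytics such as spotting unsupported claims or unanswered challenges reduce to simple queries over the network.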

This PhD project builds on the most recent conceptual and technical outcomes of a three-year project entitled Election Debate Visualisation (EDV: Plüss and De Liddo, 2015). In conjunction with the last UK General Election, tools were developed to enable viewers to provide real-time feedback on the televised Prime Ministerial debates, with this data subsequently integrated into a Debate Replay tool showing the aggregated reactions. In addition, other prototype analytics were provided, creating a new kind of tool for reflecting on exactly what was unfolding during a debate.

This Civic Hall Rethinking Debates news story provides a good introduction to the EDV project’s goals and technical capabilities. After reading this, and Plüss and De Liddo (2015), view this movie for a quick and rather raw demo of some of the tool’s current capabilities.


This exciting project offers you the opportunity to be based in Sydney at UTS:CIC, while working as an affiliate KMi research student. This PhD project will investigate the potential of using a combination of interactive multimedia, semantic video annotation, and learning analytics, embedded in authentic learning tasks, to build students’ capacity to critique videos. Of particular interest are videos of political debates (but in principle, the topic could be open-ended).

The current tool could be tuned to the needs of diverse user groups — students, journalists, political analysts, advocacy groups and, of course, curious citizens. We’re looking for a technically competent candidate who can adapt the EDV platform as an educational tool, and evaluate its potential in partnership with academics who wish to pilot it with our students. Key technical skills (most important in bold):

  • Web technologies based on the MEAN stack (MongoDB, Express, AngularJS, Node.js)
  • Knowledge of server-side web technologies: Node.js, Apache
  • Databases: MongoDB, SQL
  • Knowledge of client-side web technologies: AngularJS (version 1), the c3.js and d3.js visualisation libraries, HTML5, JavaScript, and CSS3
  • Knowledge of mobile web technologies: the Ionic development framework and the TypeScript programming language

In addition to the EDV platform’s current capabilities, you may see a sound pedagogical rationale to introduce others, e.g. more advanced text analytics.

A PhD is far more than a development project of course. EDV is a vehicle to explore a challenging research question — but what would that be? Ideally, you already bring some research experience at Masters level, in government/industry, or a very strong Honours undergraduate research project. We want to hear your views on how this project might shape up, driven by your own interests. What are the key challenges — conceptual, empirical, technical? What might be the contributions? Do you bring a distinctive perspective?

Please contact us to express interest and to discuss your ideas informally. We will work with you to shape up a research proposal that will help us select the strongest candidate.


Bennett, W. L. (2016). News: The politics of illusion. University of Chicago Press.

Buckingham Shum, S. (2003). The Roots of Computer-Supported Argument Visualization. In: Paul A. Kirschner, Simon J. Buckingham Shum and Chad S. Carr (Eds.), Visualizing Argumentation: Software Tools for Collaborative and Educational Sense-Making.  Springer-Verlag: London, 2003 ISBN 1-85233-6641-1

Buckingham Shum, S., De Liddo, A. and Klein, M. (2014). DCLA Meet CIDA: Collective Intelligence Deliberation Analytics. Working Paper: 2nd International Workshop on Discourse-Centric Learning Analytics, LAK14: 4th International Conference on Learning Analytics & Knowledge, Indianapolis.

Davies, M. and Barnett, R. (2015). The Palgrave Handbook of Critical Thinking in Higher Education. ISBN-13: 978-1349478125

De Liddo A., Buckingham Shum S., Klein, M. “Arguing on the Web for Social Innovation: Lightweight Tools and Analytics for Civic Engagement”. In the Workshop: Arguing the Web: 2.0, 8th ISSA Conference on Argumentation. (Amsterdam, July 1-4, 2014) [video]

De Liddo, A., Sándor, Á., and Buckingham Shum, S. (2012 a). Contested Collective Intelligence: Rationale, Technologies, and a Human-Machine Annotation Study. Computer Supported CooperativeWork Journal (CSCW), 21(4-5), pp. 417–448.

De Liddo A., Sándor, A., Buckingham Shum, S. (2012 b). Cohere and XIP: human annotation harnessing machine annotation power. In the Proceedings of the Computer Supported Cooperative Work Conference, CSCW2012 Companion, pp.: 49-50 [video]

De Liddo, A., Buckingham Shum, S., Convertino, G., Sándor, A., Klein M. (2012 c). Workshop: Collective intelligence as community discourse and action. In the Proceedings of the Computer Supported Cooperative Work Conference, CSCW2012 Companion, pp.:5-6

Ferrara, E., Varol, O., Davis, C., Menczer, F. and Flammini, A. (2014). The rise of social bots. arXiv preprint arXiv:1407.5225.

Okada, A., Buckingham Shum, S. and Sherborne, T. (Eds.) (2008; 2014). Knowledge Cartography: Software Tools and Mapping Techniques. London, UK: Springer. (Second Edition)

Plüss, Brian and De Liddo, Anna (2015). Engaging Citizens with Televised Election Debates through Online Interactive Replays. In: Proceedings of the ACM International Conference on Interactive Experiences for TV and Online Video, 3-5 June 2015, Brussels, Belgium, pp. 179–184.

Sunstein, C. (2017). #Republic: Divided Democracy in the Age of Social Media. Princeton University Press.

Data interoperability and analytics for lifelong personalised learning


Kirsty Kitto, Roberto Martinez-Maldonado and Simon Buckingham Shum

The Challenge

It is likely that people entering the workforce today will need to change jobs multiple times throughout their lifetime (CEDA, 2015). Many existing job roles are likely to be automated, but new roles in the workforce of the future are emerging all the time. Higher education is likely to be just the start of a person’s learning journey; many people will need to up-skill, re-skill and retrain throughout their careers. This means that already thorny problems like the recognition of prior learning are going to become key; how can we recognise existing skills, knowledge and competency when they come from a wide array of domains and environments?

Increasingly we see claims that technology will help to personalise learning, building upon the existing strengths of a learner and helping them to bolster their weaknesses. Many adaptive learning systems already exist, building upon a long line of work in intelligent tutoring (Nye, Graesser, & Hu, 2014; Ma, Adesope, et al., 2014) and recommendation systems built for EdTech (Manouselis, Drachsler, et al., 2011). These systems claim to identify weaknesses in learners and then personalise the learning experience, providing an individual journey adapted specifically to them. But as Caulfield (2016) correctly argues, “we have personalisation backwards” if we are attempting to provide the same remedy for students who come from very different backgrounds. Many others have called attention to the long history of attempts to “optimize” learning (e.g. Watters, 2015; Kohn, 2016), pointing out that it does nothing to innovate on an “old-school model that consists of pouring a bunch o’ facts into empty receptacles” (Kohn, 2016). Also worrying is the loss of autonomy associated with a tool that tells students precisely what to do next: it reduces serendipity and may discourage the development of growth mindsets and the ability to thrive in situations of ambiguity and uncertainty (Deakin Crick et al., 2015). This PhD project will tackle these problems head on, by investigating technology solutions that support the construction of personal learning journeys: helping learners build upon their existing knowledge, skills and backgrounds, and then demonstrate capabilities and competencies that map onto both formal educational systems and work-based selection criteria.

Underlying such a project, we require a way of providing the learner of the future with a Personal Learning Record Store (PLRS) that they can access and make use of for life. This project will seek to develop early prototypes of a PLRS that satisfy core use cases identified by you. It will be important to keep in mind the long-term legal, ethical and social implications of a technology of this form, so your project will be about more than just developing technology: you will need to keep the learner firmly in mind while solving core technical problems concerning interoperability and learner-facing learning analytics. Depending on the trajectory that you take, examples of the questions that such a project could investigate include:

  • What data needs to be stored in a PLRS in order to facilitate lifelong personalised learning pathways?
  • What form should a PLRS take to facilitate lifelong learning?
  • How could high-level, educationally relevant constructs be discovered from low-level clickstream data and then mapped to the attainment of key skills and competencies?
  • What new learning designs and patterns can be created to take advantage of the large amount of learning data stored in a PLRS?
  • How can xAPI profiles and recipes be used to ensure that learning data collected from multiple educational systems and workplaces is both syntactically and semantically interoperable in a PLRS?
  • What analytics would enable a learner to understand key weaknesses (and strengths) that are evidenced by the low-level data contained in their PLRS?
  • How can we map between identified curriculum documentation and the data stored in a PLRS?
  • What analytics will help lifelong learners to understand the data in their PLRS, and to order it appropriately?
  • How can selected data from a PLRS be pulled into e-Portfolios and curriculum vitae?
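To ground these questions, here is a minimal sketch of the kind of xAPI statement a PLRS would accumulate. The actor / verb / object field names follow the xAPI 1.0.3 specification; the learner email, course IRI and course name are placeholders invented for illustration.

```python
import json
import uuid
from datetime import datetime, timezone

def make_statement(learner_email, verb_iri, verb_name, activity_iri, activity_name):
    """Build a minimal xAPI statement (actor / verb / object), the unit of
    data a PLRS would store. Field names follow xAPI 1.0.3; the email and
    activity IRIs passed in are placeholders."""
    return {
        "id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": {"objectType": "Agent", "mbox": f"mailto:{learner_email}"},
        "verb": {"id": verb_iri, "display": {"en-US": verb_name}},
        "object": {
            "objectType": "Activity",
            "id": activity_iri,
            "definition": {"name": {"en-US": activity_name}},
        },
    }

stmt = make_statement(
    "learner@example.com",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "https://example.edu/courses/data-101", "Introduction to Data Science",
)
print(json.dumps(stmt, indent=2))
```

Because every system emits the same statement shape, data from an LMS, a workplace training tool and a MOOC can coexist in one store — the interoperability questions above then concern which verbs and activity definitions different systems agree on.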

Analytic Approaches

The challenge of developing learning analytics for lifelong learning competencies is at a relatively early stage of development (Buckingham Shum & Deakin Crick, 2016). Early work with the Connected Learning Analytics (CLA) toolkit (Kitto, Cross, et al., 2015; Bakharia, Kitto, et al., 2016) has demonstrated that, with careful consideration, it is possible to create interoperable data structures from apparently disparate data sources. This project will seek to extend those results by developing frameworks and use cases for personalised lifelong learning that take full advantage of the fact that learning can happen anywhere, at any time, and in many different places.

Depending upon the emphasis that your research project develops, you will need to make use of emerging educational data standards such as xAPI (ADL, 2013) and IMS Caliper (IMS, 2015), coupling them with existing frameworks to ensure that PLRS data is interoperable despite being collected across many different places and contexts. A good place to start might be to investigate how the xAPI concept of a Learning Record Store can be extended to enable individual learners to link it with existing organisational information systems (e.g. Student Information Systems and Learning Management Systems). The World Wide Web Consortium’s (W3C) Resource Description Framework (RDF) Linked Data (LD) technology stack could also be used to ensure that concepts such as “course”, “award” and “badge” map between educational domains (e.g. Europe and Australia), enabling data to travel with a learner as they move between institutions (e.g. from UTS to Oxford) and into an increasingly globalised workforce.
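The cross-domain concept mapping can be sketched without a full RDF toolchain: the triples below assert that a hypothetical Australian vocabulary term is a skos:exactMatch for a European one, so a concept can be translated in either direction. The SKOS property URI is real; the two vocabularies are invented for illustration.

```python
# Plain-Python triples standing in for an RDF graph. Only the SKOS
# exactMatch URI is real; both vocabularies are invented placeholders.
SKOS_EXACT = "http://www.w3.org/2004/02/skos/core#exactMatch"

triples = [
    ("https://example.edu.au/terms/award", SKOS_EXACT,
     "https://example.eu/terms/qualification"),
    ("https://example.edu.au/terms/course", SKOS_EXACT,
     "https://example.eu/terms/module"),
]

def translate(concept_uri):
    """Follow exactMatch links in either direction; None if unmapped."""
    for s, p, o in triples:
        if p == SKOS_EXACT:
            if s == concept_uri:
                return o
            if o == concept_uri:
                return s
    return None

print(translate("https://example.edu.au/terms/course"))
```

A real implementation would load published vocabularies into an RDF store and reason over broader/narrower matches as well, but the principle — data travels because the concepts are explicitly linked — is the same.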

This project will help to progress our understanding of how we might create an open source Learning Analytics API for any data stored in a PLRS that meets the requirements of specific identified xAPI recipes and profiles (ADL, 2016). This will help us to understand how learners might provide evidence of competency in 21st-century skills like “creativity” and “communication”, and other core graduate employability skills (Bridgstock, 2017), by pulling data from their PLRS. The project also offers an opportunity to rethink the way in which people might use extra-curricular activities to add further weight to their claims of competency and achievement.
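A first step toward such an API might be a simple aggregation over stored statements. The sketch below tallies how many completed activities evidence each skill, assuming a hypothetical activity-to-skill mapping of the kind an xAPI profile could supply; the verb IRI is the standard ADL "completed" verb, while the activity URIs and skill names are invented for illustration.

```python
from collections import Counter

def competency_evidence(statements, activity_to_skill):
    """Tally how many completed activities evidence each skill.
    The activity-to-skill mapping is a hypothetical input that a real
    Learning Analytics API would derive from xAPI profiles."""
    counts = Counter()
    for s in statements:
        if s["verb"].endswith("/completed"):
            skill = activity_to_skill.get(s["object"])
            if skill:
                counts[skill] += 1
    return counts

statements = [
    {"verb": "http://adlnet.gov/expapi/verbs/completed",
     "object": "https://example.edu/debate-1"},
    {"verb": "http://adlnet.gov/expapi/verbs/attempted",
     "object": "https://example.edu/debate-1"},
    {"verb": "http://adlnet.gov/expapi/verbs/completed",
     "object": "https://example.edu/team-project"},
]
activity_to_skill = {
    "https://example.edu/debate-1": "communication",
    "https://example.edu/team-project": "collaboration",
}
print(competency_evidence(statements, activity_to_skill))
```

Counting completions is of course a crude proxy for competency; the research question is what richer evidence (quality, recency, difficulty) such an API should surface alongside the raw tallies.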


In addition to the skills and dispositions that we are seeking in all candidates, you should have:

  • A Masters degree, Honours distinction or equivalent, with above-average grades, in computer science or a closely related discipline
  • An analytical, creative and innovative approach to solving problems
  • A strong interest in data interoperability, linked data, SKOS, RDF, etc.
  • Strong programming skills in at least one relevant language (e.g. Python, C/C++, Java) and associated programming frameworks

It is advantageous if you can evidence:

  • Experience in designing APIs
  • Familiarity with at least one of Experience API (xAPI) and/or IMS Caliper
  • Experience with systems architecture and design
  • Peer reviewed publications
  • A digital scholarship profile
  • Design of user-centered software

Interested candidates should contact Kirsty.Kitto@uts.edu.au and Roberto.Martinez-Maldonado@uts.edu.au with informal queries. Please follow the application procedure for the submission of your proposal.


ADL. (2013). xAPI-Spec. https://github.com/adlnet/xAPI-Spec, version 1.0.3 accessed 11/4/2017

ADL. (2016). Companion Specification for xAPI Vocabularies, https://adl.gitbooks.io/companion-specification-for-xapi-vocabularies/content/ , version 1.0 accessed 11/4/2017

Bakharia, A., Kitto, K., Pardo, A., Gašević, D., & Dawson, S. (2016, April). Recipe for success: lessons learnt from using xAPI within the connected learning analytics toolkit. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 378-382). ACM.

Bridgstock, R. (2017). Graduate employability 2.0: Social networks for learning, career development and innovation in the digital age. Available at: http://www.graduateemployability2-0.com/

Buckingham Shum, S., & Deakin Crick, R. (2016). Learning analytics for 21st century competencies. Journal of Learning Analytics, 3(2), 6–21.  http://dx.doi.org/10.18608/jla.2016.32.2

Caulfield, Mike (2016) We have personalization backwards,  http://mfeldstein.com/we-have-personalization-backwards/

CEDA. (2015). Australia’s future workforce? Technical report, Committee for Economic Development of Australia (CEDA). http://www.ceda.com.au/research-and-policy/policy-priorities/workforce

IMS. (2015). Caliper Analytics, http://www.imsglobal.org/activity/caliperram

Kitto, K., Cross, S., Waters, Z., & Lupton, M. (2015). Learning analytics beyond the LMS: the Connected Learning Analytics toolkit. In Proceedings of the 5th International Conference on Learning Analytics and Knowledge, Poughkeepsie, New York (pp. 11-15). ACM.

Kitto, K., Lupton, M., Davis, K & Waters, Z.(2016). Incorporating student-facing learning analytics into pedagogical practice. In S. Barker, S. Dawson, A. Pardo, & C. Colvin (Eds.), Show Me The Learning. Proceedings ASCILITE 2016 Adelaide (pp. 338-347)

Kohn, Alfie (2016). The overselling of Education Technology, Edsurge: https://www.edsurge.com/news/2016-03-16-the-overselling-of-education-technology

Ma, W., Adesope, O. O., Nesbit, J. C., & Liu, Q. (2014). Intelligent tutoring systems and learning outcomes: A meta-analysis. Journal of Educational Psychology, 106(4), 901-918.

Manouselis, N., Drachsler, H., Vuorikari, R., Hummel, H., & Koper, R. (2011). Recommender systems in technology enhanced learning. In Recommender systems handbook (pp. 387-415). Springer US.

Nye, B. D., Graesser, A. C., & Hu, X. (2014). AutoTutor and family: A review of 17 years of natural language tutoring. International Journal of Artificial Intelligence in Education, 24(4), 427-469.

Sharples, Mike, and Jeremy Roschelle. (2010). Guest editorial: Special section on mobile and ubiquitous technologies for learning. IEEE Transactions on Learning Technologies, (1), pp. 4-6.

Watters, Audrey (2015). The algorithmic future of education. http://hackeducation.com/2015/10/22/robot-tutors