
Upcoming Events

Nov 27 (Wed)
PhD Seminar: Gloria Fernandez Nieto – Providing guidance in Multimodal Learning Analytics visual feedback interfaces for Collaborative Classrooms @ Connected Intelligence Centre
Nov 27 @ 10:30 am – 1:00 pm

Gloria Milena Fernandez Nieto

Abstract:

Feedback is a crucial aspect of classroom-based learning. Delivering high-quality feedback can help students to make well-informed decisions by understanding their learning goals, their teacher’s pedagogical intentions, and their actual performance. However, providing actionable feedback is challenging, especially in large classes with many students or groups to follow up. One way to provide automated feedback is through learning analytics (LA) visual interfaces, in which digital traces and analytics outputs are shown to teachers and students. Yet, concerns about LA visual interfaces have been highlighted in recent research: their complexity, their lack of guidance in communicating insights, and their lack of educationally meaningful impact. Moreover, capturing, rendering visible and making sense of data about collocated activity in the classroom is even more complex and challenging than in fully computer-mediated settings. The present doctoral thesis addresses the following research question: how can educationally meaningful insights be communicated effectively to teachers and students to provoke reflection on teaching and learning in the collaborative classroom? This document presents the thesis proposal, which includes: (i) a literature review, (ii) motivation and identified gaps, (iii) the thesis proposal, (iv) the methodology to be followed, (v) the current state of the project with a detailed plan to complete this research, and (vi) preliminary conclusions.

This is a Stage 1 PhD seminar to mark the end of Year 1. All are welcome, and invited to share constructive feedback.

Bio:

Gloria Fernandez Nieto is a first-year PhD student at the University of Technology Sydney in the Connected Intelligence Centre. She is supervised by Professor Simon Buckingham Shum, Dr Kirsty Kitto and Dr Roberto Martinez. Her current research focuses on exploring alternative forms of feedback that use traces from data collected in the CSCL classroom to prompt reflection on teaching and learning practices. Her previous research focused on Learning Analytics, Technology Enhanced Learning and Knowledge Management.

Nov 28 (Thu)
ALASI2019: #C21LA: Tracking & Assessing 21st Century Competencies with Learning Analytics @ University of Wollongong
Nov 28 @ 9:00 am – 9:30 am

Simon Buckingham Shum, Darrall Thompson, Maimuna Musarrat (CIC, UTS)
Zhonghua Zhang (Assessment Research Centre, University of Melbourne)
Srecko Joksimovic (Teaching Innovation Unit, University of South Australia)

Abstract

In response to the changing demands on citizens and the workforce, educational institutions are starting to shift their teaching and learning towards equipping students with knowledge, skills and dispositions that prepare them for lifelong learning. These have been termed 21st Century skills/competencies, Core/Soft Skills, General Capabilities, Graduate Attributes, etc. There is now a lot of activity in the school and higher education sectors tackling the challenge of tracking and assessing these competencies in practical ways. Learning Analytics should in principle have important contributions to make, providing computational support for tracking learner processes (not just products), beyond the classroom in more authentic settings, visualizing patterns, and providing rapid feedback to educators and learners (Buckingham Shum & Crick, 2016).  This workshop provides the chance to learn about ongoing efforts to develop and validate “C21LA”, and the nature of the challenges if these are to make a systemic impact, including the pedagogical, assessment, technological and political factors that together define educational infrastructures.

1. Focus of the workshop

21st century skills include the “Four Cs” (cf. Jefferson & Anderson, 2017), regularly listed as creativity, critical reflection, communication and collaboration, and Gardner’s “Five Minds”, which also map to Thompson’s (2016) CAPRI model. However, there are many other lists that include other qualities such as learning dispositions, ethics and citizenship (e.g. Care, et al. 2018). While pedagogical shifts to equip students with these skills are certainly needed, that alone will not effect systemic change. A critical challenge is how these competencies can be tracked and assessed in meaningful ways, because assessment regimes drive educator and student behaviour. But since these skills are not easily quantifiable, need to be assessed over a period of time, and need to be displayed in interpersonal, societal and culturally valid contexts, traditional methods such as observational or interview techniques are hard to apply at scale. Student self-report has an important place, but comes with obvious limitations. This has triggered significant educational research in the school and higher education sectors, but the potential of Learning Analytics is often not harnessed.

Learning Analytics should in principle have important contributions to make (cf. Buckingham Shum & Crick, 2016), for instance: providing computational support for tracking learner processes (not just products); tracking activity not only inside the classroom but also outside, in more authentic settings; tracking activity not only online but also face-to-face (via multimodal sensors/analytics); and providing rapid feedback to educators and learners to build metacognitive capabilities.

In recent years, learning analytics has been applied to develop more objective assessments for measuring some of the essential 21st century skills (e.g., ICT literacy – Learning in digital networks, Wilson, Gochyyev, & Scalise, 2016; Collaborative problem solving, Griffin & Care, 2015; Learning in online environments, Milligan & Griffin, 2016) which could not be assessed objectively, reliably and validly with traditional approaches. Researchers advocate that learning analytics and measurement science should be synthesized to face the challenges of assessing these hard-to-measure 21st century skills (Wilson & Scalise, 2016).

This workshop provides the chance for participants to share, and learn about, ongoing efforts to develop “C21LA” tools, and critically, how we validate them (e.g. Milligan, 2018; Milligan & Griffin, 2016). The workshop will include some ‘show and tell’, but speakers will be asked to reflect critically on the challenges that remain for these to make a systemic impact, including the pedagogical, assessment, technological and political factors that together define educational infrastructures.

2. Proposed workshop structure

The workshop will run for three hours in 30-minute segments, each segment focusing on one tool. Each segment will have a presentation (15-20 minutes), followed by discussion (10-15 minutes). There will be a plenary discussion at the end.

3. Workshop Presenter Credentials

Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he is inaugural Director of the Connected Intelligence Centre. He has been active in shaping the field of Learning Analytics, and co-founded the Society for Learning Analytics Research.

Darrall Thompson is a Senior Lecturer and Learning Futures Fellow in the UTS Faculty of Design, Architecture and Building. His award-winning research and design thinking are embodied in the REVIEW platform, a criteria-based system used for enhancing assessment and  evaluation capabilities among staff and students in universities and schools.

Zhonghua Zhang is a Research Fellow at the Melbourne Graduate School of Education at The University of Melbourne. His research interests include assessment, educational measurement, and psychometrics. He has been leading a project that focuses on developing behavioral indicators from log stream data to measure students’ collaborative problem solving skill, which has been identified as one of the essential skills in the 21st century workplace.

Srecko Joksimovic is a Research Fellow at the School of Education and a Data Scientist in the Teaching Innovation Unit, University of South Australia. His research interests focus on exploring the symbiosis of human and artificial cognition to understand knowledge processes and their impact on society.

Maimuna (Muna) Musarrat is a Postdoctoral Research Associate at the UTS Connected Intelligence Centre, where she is working closely with the U@Uni Academy, researching the assessment of transferable skills in high school students using different tools. 

References

Buckingham Shum, S. & Crick, R. D. (2016). Learning analytics for 21st century competencies. Journal of Learning Analytics, 3 (2), 6–21. 

Care, E., Griffin, P., & Wilson, M. (Eds.). (2018). Assessment and teaching of 21st century skills: Research and applications. Springer.

Jefferson, M. & Anderson, M. (2017). Transforming schools: Creativity, critical reflection, communication, collaboration. London: Bloomsbury.

Griffin, P., & Care, E. (Eds.). (2015). Assessment and teaching of 21st century skills: Methods and approach. Dordrecht: Springer.

Milligan, S. (2018). Methodological foundations for the measurement of learning in learning analytics. Proceedings of the 8th International Conference on Learning Analytics and Knowledge (LAK ’18). ACM, New York, NY, USA, pp. 466-470.

Milligan, S. and Griffin, P. (2016). Understanding learning and learning design in MOOCs: A measurement-based interpretation. Journal of Learning Analytics, 3(2), 88–115. 

Thompson, D. (2016). Marks should not be the focus of assessment — but how can change be achieved? Journal of Learning Analytics, 3 (2), 193–212. 

Wilson, M. & Scalise, K. (2016). Learning analytics: Negotiating the intersection of measurement technology and information technology. In J. M. Spector, B. B. Lockee, & M. D. Childress (Eds.), Learning, Design, and Technology: An International Compendium of Theory, Research, Practice, and Policy (Published in cooperation with AECT). New York: Springer.

Wilson M., Scalise, K., & Gochyyev, P. (2016). Assessment of learning in digital interactive social networks: A learning analytics approach. Online Learning, 20 (2), 97–119.

ALASI2019: Demonstration: Ethical edge cases – a middle space bringing system builders into contact with ethicists @ University of Wollongong
Nov 28 @ 9:00 am – 9:30 am

Kirsty Kitto1, Simon Knight1, Linda Corrin2

Abstract

This demonstration will run as a workshop exploring the issues that arise from relying purely upon ethical frameworks and checklists to influence the behaviour of LA practitioners. It will introduce a newly proposed conception of “practical LA ethics”, which places the burden of ethical behaviour upon practitioners. Participants will use an enabling database of ethical edge cases to bring system builders into dialogue with legal and ethics experts, adding to the sophistication of discussions of this important topic in the Australian context.

Keywords

Learning Analytics, ethics, edge cases, scaling adoption

Corresponding author 1 Email: {Kirsty.Kitto, Simon.Knight}@uts.edu.au Address: Connected Intelligence Centre, University of Technology Sydney. PO Box 123 Broadway NSW 2007 Australia

2 Email: lcorrin@swin.edu.au Address:  Education and Quality Services, Level 2, SPS Building, Swinburne University of Technology, Australia

1. Introduction and Focus

Learning Analytics (LA) has, since its inception, had a strong emphasis upon ethics, with numerous checklists and frameworks proposed to ensure that student privacy is respected and potential harms avoided. However, they often contain contradictory instructions, and few practitioners appear to be following them when building LA solutions. Indeed, McNamara, Smith, and Murphy-Hill (2018) recently demonstrated that the ACM code of ethics (https://www.acm.org/code-of-ethics) had no discernible impact upon the decisions made by 63 software engineering students and 105 professional software developers in responding to a set of 11 ethical vignettes. It is likely that similar results would be found for the many checklists and best practice approaches that have been proposed in LA, although this is an area where well-grounded research is desperately required. This does not imply that practitioners do not want to be ethical. Indeed, Johanes and Thille (2019) recently demonstrated that practitioners often have a strong desire to “do the right thing” when building LA solutions.

It seems that an approach to ethics that is grounded in frameworks and checklists alone is not sufficient. One possibility is to provide a “middle space” (Knight, Buckingham Shum and Littleton, 2014), where LA practitioners can work with ethicists, legal experts, and other stakeholders to deliver solutions that meet the needs of society. A new approach (Kitto and Knight, 2019) argues that we should adopt approaches grounded in practical ethics, and presents a database of “ethical edge cases” which holds potential to provide this middle space.

This workshop will introduce the ethical edge case database, and offer participants a forum to give feedback on its format and to enhance and extend it. The publicly served database can be accessed at www.ethicalEdges.com, and an open source instance is available for modification (at https://github.com/uts-cic/EdgeCaseDB). All input is welcome!

2. Workshop description

This demonstration will take the format of a short (1.5-hour) workshop that brings participants together to work interactively on the ethical edge case database, adding new edge cases and extending existing ones. Participants will be introduced to a number of key ethical and legal frameworks that could impact upon LA, and asked to consider their influence on LA practitioners to date. They will then be introduced to the conception of an edge case and shown how edge cases can drive the development of LA tools, before being guided through the construction of new edge cases and their entry into the LA edge case database.

2.1 Planned workshop schedule (1.5 hours)

  • 15 min: Introduction
  • 15 min: Ethical frameworks and checklists – an introduction
  • 45 min: Building ethical edge cases
  • 15 min: Discussion and wrap up

2.2 Upon completing this workshop, participants will be:

  • Familiar with some of the major ethics frameworks that have been developed in LA
  • Aware of some of the tensions that exist in these frameworks when applied by practitioners
  • Familiar with the ethical edge case database and how it can be used to bring LA practitioners into contact with those who are working on the ethical and legal aspects of LA solutions.

2.3 You will need to bring

  • An interest in building LA solutions, the ethical/legal aspects of those solutions, or both!
  • A desire to participate in formulating ethical edge cases that can be used to seed the next generation of LA ethical practice.

3. Credentials of team

Dr Kirsty Kitto is a Senior Lecturer in Data Science at UTS’s Connected Intelligence Centre (CIC). She is working with the postgraduate.futures team at UTS to extract Canvas data using the Live API and pull it into student- and staff-facing LA dashboards.
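
As a rough illustration of this kind of pipeline (a minimal sketch only, not the team’s actual code: the host, course id and token below are placeholders, and the endpoint shown is the standard Canvas REST Submissions API):

```python
import requests

BASE = "https://canvas.example.edu"   # placeholder Canvas host
TOKEN = "YOUR_API_TOKEN"              # placeholder user-generated access token

def get_paginated(url, params=None):
    """Collect every page of a Canvas REST response by following Link headers."""
    results, headers = [], {"Authorization": f"Bearer {TOKEN}"}
    while url:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        results.extend(resp.json())
        # Canvas paginates via RFC-5988 Link headers; params only apply to page 1
        url, params = resp.links.get("next", {}).get("url"), None
    return results

# e.g. all submissions for one (hypothetical) course id, ready to reshape
# into the rows a student- or staff-facing dashboard would plot
submissions = get_paginated(
    f"{BASE}/api/v1/courses/1234/students/submissions",
    params={"student_ids[]": "all", "per_page": 100},
)
```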

Dr Simon Knight is a Lecturer in the Faculty of Transdisciplinary Innovation. His research interests include learning design and educational technology, educators’ use of evidence in their practice, and learning analytics (particularly writing analytics).

Associate Professor Linda Corrin is Academic Director, Transforming Learning at Swinburne University of Technology. Her interests in learning analytics range from how students and teachers interpret learning analytics data/visualisations to the ethical implications of the use of data in higher education. She is a co-ordinator of the ASCILITE Learning Analytics Special Interest Group and co-founder of the Victorian/Tasmanian Learning Analytics Network.

References

Johanes, P. and Thille, C. (2019). The heart of educational data infrastructures = conscious humanity and scientific responsibility, not infinite data and limitless experimentation. British Journal of Educational Technology (50th Anniversary Special Issue: Learning Analytics and AI: Politics, Pedagogy and Practices), XXXX

Kitto, K. and Knight, S. (2019). Practical ethics for building learning analytics. British Journal of Educational Technology (50th Anniversary Special Issue: Learning Analytics and AI: Politics, Pedagogy and Practices), XXXX

Knight, S., Buckingham Shum, S., & Littleton, K. (2014). Epistemology, assessment, pedagogy: Where learning meets analytics in the middle space. Journal of Learning Analytics, 1(2), 23-47.

McNamara, A., Smith, J., & Murphy-Hill, E. (2018). Does ACM’s code of ethics change ethical decision making in software development? In Proceedings of the 2018 26th ACM Joint Meeting on European Software Engineering Conference and Symposium on the Foundations of Software Engineering (pp. 729–733). New York, NY: ACM. Retrieved from https://people.engr.ncsu.edu/ermurph3/papers/fse18nier.pdf

ALASI2019: Learning Analytics Growing Pains – Sociotechnical Infrastructure Changes as LA Tools Mature @ University of Wollongong
Nov 28 @ 9:00 am – 9:30 am

Simon Buckingham Shum1, Antonette Shibani2

Abstract

As Learning Analytics tools mature, there are often ‘growing pains’ in how the infrastructure adapts to the social and technical requirements of scaling up. Across institutions in Australia, there is increasing work being done in this transitional space, including moving from prototypes to products, institutional adoption of LA, engaging stakeholders, organisational leadership, long term impact, and invisible work in keeping everything going. This workshop aims to build common ground across institutional contexts by sharing stories and identifying insights, to inform the design of better sociotechnical infrastructures supporting this critical phase.

Keywords

learning analytics, infrastructure, tools, scaling up, adoption

Corresponding author 1 Email: Simon.BuckinghamShum@uts.edu.au Address: Connected Intelligence Centre, University of Technology Sydney, PO Box 123, Broadway NSW 2007, Australia

2 Email: antonette.shibani@uts.edu.au Address: Faculty of Transdisciplinary Innovation, University of Technology Sydney, PO Box 123, Broadway NSW 2007, Australia

1. Workshop Focus

Learning Analytics (LA) as a field is reaching a new stage of maturity, as a growing number of tools transition from small scale pilots that have demonstrated promise, to larger scale services within an institution. Those pilots may have been a small scale deployment of a commercial product, or a prototype developed by in-house/external teams. This workshop aims to deepen the conversation between LA researchers and practitioners who are working in this transitional space, since as the scale of the system grows, technologies, roles and stakeholders change — and there are often “growing pains” as the infrastructure, social and technical, adjusts. By sharing our stories, it is hoped that the workshop will build common ground across institutional and LA contexts, and identify insights to inform the design of better sociotechnical infrastructures to support this critical phase.

Topics include but are not limited to:

  • From prototypes to products. Can we expect the same platform to serve both rapid prototyping and production services? How do we design with future evolution in mind?
  • Scaling up for institutional adoption. For adoption and embedding of LA at scale, a transition from technical to social systems is required (Gasevic, Tsai, Dawson & Pardo, 2019). What strategies are useful, and why? How do we handle resource management?
  • Stakeholder engagement. Who needs to be in the loop, and when? What obstacles are there to effective communication between specific stakeholder groups? Who purchases, invents, develops, maintains, and evaluates LA tools? What design processes assist this process? Who supports educators and students once deployed?
  • Invisible work. No matter how good the technology, embedding it into daily practice invariably brings “invisible work” that’s required to oil the wheels and keep everything going. What examples/stories do you have about what this looked like in your case study?
  • Organisational leadership. Institutions need strategy to build mindsets, capabilities, and capacity for LA, and this requires an alignment to their institutional vision and goals (Tsai, Moreno-Marcos, Jivet, Scheffel, Tammets, Kollom & Gaševic, 2018). How can the organisation facilitate or obstruct this process? Who are the key stakeholders in this scaling up process?
  • Long term impact. For sustainable use and implementation of LA, the development and evaluation of tools can no longer be supported by short term goals or one-off studies. What pedagogical grounding is required for long term impact? How do we balance scalability against catering for specific contexts (contextualization) for maximum impact (Shibani, Knight & Buckingham Shum, 2019)? How do we map supply and demand to truly embed LA in classrooms?

2. Confirmed Speakers

The following confirmed speakers will share their institutional insights during the workshop:

  • Promoting institutional adoption of a personalised feedback tool. The OnTask experience

Abelardo Pardo, Professor and Dean Academic at the Division of Information Technology, University of South Australia

  • Co-creation – How human-centred LA at Sydney has co-evolved with 2 to 1100 educators

Natasha Arthars, Postgraduate Research Fellow, DVC Education Portfolio, The University of Sydney

Danny Liu, Senior Lecturer, DVC Education Portfolio, The University of Sydney

  • Development and Dissemination of an Adaptive Learning System: Reflections and Lessons Learned

Hassan Khosravi, Senior Lecturer in Learning Analytics, The University of Queensland

  • A sustainable deployment of textbook smart e-resources in University courses: building a partnership with a publisher for effective learning design

Dr Lorenzo Vigentini, Academic Lead, Educational Intelligence & Analytics, UNSW Sydney

Dr Happy Novanda, Learning Design Manager, McGraw Hill International

Simon Banks, National Enterprise Manager, McGraw Hill International

  • How research and practice in LA co-evolve: Insights from the Writing Analytics tool AcaWriter

Antonette Shibani, Lecturer at the Faculty of Transdisciplinary Innovation, University of Technology Sydney

Simon Buckingham Shum, Professor of Learning Informatics and Director of the Connected Intelligence Centre, University of Technology Sydney

  • Stakeholder engagement in scaling up LA

Bruce McLaren, Associate Research Professor at the Human-Computer Interaction Institute, Carnegie Mellon University, USA

3. Workshop Presenter Credentials

Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, which he joined in August 2014 as inaugural Director of the Connected Intelligence Centre. Developing approaches that help UTS both innovate and achieve impact with LA infrastructure is central to CIC’s mission. Simon has been active in shaping the field of Learning Analytics since the inaugural LAK 2011 conference, serving as Program Chair (2012/2018), convening many workshops, and speaking regularly as a keynote. He co-founded the Society for Learning Analytics Research, serving as a Vice-President and continuing on the Executive.

Antonette Shibani is a Lecturer in the Faculty of Transdisciplinary Innovation at the University of Technology Sydney, Australia. In her doctoral research, she explored the co-design and implementation of a writing analytics tool called ‘AcaWriter’ in higher education, enabling its move from research to classroom practice. Shibani has been involved in the international Learning Analytics community, presenting her work at a number of LAK conferences and Australian Learning Analytics Summer Institutes (A-LASI). She has chaired/co-chaired five workshops at LAK and A-LASI to build writing analytics literacy within the LA community. She is currently an executive member of the Society for Learning Analytics Research.

References

Gasevic, D., Tsai, Y. S., Dawson, S., & Pardo, A. (2019). How do we start? An approach to learning analytics adoption in higher education. The International Journal of Information and Learning Technology, 36(4), 342-353. https://doi.org/10.1108/IJILT-02-2019-0024

Shibani, A., Knight, S., & Buckingham Shum, S. (2019). Contextualizable Learning Analytics Design: A Generic Model and Writing Analytics Evaluations. In Proceedings of the 9th International Conference on Learning Analytics & Knowledge (pp. 210-219). ACM.

Tsai, Y. S., Moreno-Marcos, P. M., Jivet, I., Scheffel, M., Tammets, K., Kollom, K., & Gaševic, D. (2018). The SHEILA Framework: Informing Institutional Strategies and Policy Processes of Learning Analytics. Journal of Learning Analytics, 5(3), 5-20.

ALASI2019: Multimodal Analytics for Classroom Proxemics @ University of Wollongong
Nov 28 @ 9:00 am – 9:30 am

Roberto Martinez-Maldonado1 and Gloria Fernandez Nieto2

Abstract

We use the term Classroom Proxemics to refer to how teachers and students use the classroom space, and the impact of this use and of the spatial design on learning and teaching. Continuing progress in ubiquitous technology makes it easier and cheaper to track students’ and teachers’ physical actions unobtrusively, making it possible to use such data to support research, educator interventions, and the provision of feedback regarding the use of the classroom space. This workshop is aimed at provoking reflection on potential ways in which teachers can effectively use positioning traces to gain insight into their classroom practice. The workshop will include hands-on ideation activities to explore potential ways in which positioning and other sources of proxemics data can support professional development and research in learning spaces. Indoor positioning sensors, alongside other multimodal learning analytics technologies, will be demonstrated during the workshop to illustrate the broader opportunities such technologies offer for learning analytics.

Keywords

multimodal learning analytics, sensors, visualisation, positioning, teaching

Corresponding author 1 Email: Roberto.MartinezMaldonado@monash.edu Address: Faculty of Information Technologies, Monash University, VIC, Australia.

2 Email: Gloria.m.Fernandez-Nieto@student.uts.edu.au Address: Connected Intelligence Centre, University of Technology Sydney, 2007, NSW Australia.

1. Focus of the workshop

Previous research has found that teachers’ positioning and mobility strategies in the classroom can strongly influence students’ engagement, motivation, disruptive behaviour and self-efficacy (see the review by O’Neill & Stephenson, 2014). Inspired by work on instructional proxemics (Chin et al., 2017; McArthur, 2015), the term Classroom Proxemics is proposed to refer to the research space targeted in this workshop. First, this term points to foundational work by Hall (1966), who defined proxemics as the study of culturally dependent ways in which people use interpersonal distance to mediate their interactions. This work has been widely used in architecture and interior design, including the design of learning spaces (Thompson, 2012). Using proxemics as a theoretical lens is highly relevant, because teachers and students make use of the space, furniture, objects and various kinds of technology to interact among themselves.

Second, inspired by work on orchestration (Dillenbourg et al., 2011), the classroom can be considered as the ecological unit of analysis. The classroom includes social, epistemic and physical aspects that are deeply intertwined (Goodyear et al., 2018), and teachers may have varying degrees of control over these aspects according to their pedagogical approach and the tasks unfolding in the classroom.

Feedback on, and visual representation of, movement and positioning traces captured in physical spaces has been studied in previous work. For instance, Chin, Mei and Taib (2017) present the approach of instructional proxemics, using human observations, video and audio to generate personal and pedagogical understanding of how teachers of a second language use their spaces, body movement and positioning, and of the impact on learning and teaching. Other researchers have used indoor localisation (Bdiwi et al., 2019) and real case studies (Martinez-Maldonado, 2019) to explore visual representations of positioning data and their potential for spatial pedagogy.

The focus of this workshop is at the intersection between work that has used classroom observations to generate understanding of classroom dynamics (McArthur, 2015) and emerging work focused on creating interfaces to enhance teachers’ awareness, using automatic position tracking (An et al., 2018; Martinez-Maldonado, 2019). Much work remains to be done to identify the kinds of reflection that teachers’ positioning data can provoke, and the metrics that may be useful for sensemaking.
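
To make the notion of such metrics concrete, the sketch below (purely illustrative; the data layout and the 1.5 m threshold are assumptions, not a tool presented in this workshop) derives one candidate metric, time spent near each student group, from timestamped (x, y) positioning fixes:

```python
import numpy as np
import pandas as pd

# Hypothetical indoor-positioning log: one (x, y) fix per second per tracked
# entity ("teacher" or a student group tag).
fixes = pd.DataFrame({
    "t":   [0, 1, 2, 0, 1, 2, 0, 1, 2],                    # seconds
    "who": ["teacher"] * 3 + ["groupA"] * 3 + ["groupB"] * 3,
    "x":   [0.0, 1.0, 4.0, 0.5, 0.5, 0.5, 4.0, 4.0, 4.0],  # metres
    "y":   [0.0] * 9,
})

PROXIMITY_M = 1.5  # assumed "near" threshold, echoing Hall's interpersonal zones

teacher = fixes[fixes.who == "teacher"].set_index("t")[["x", "y"]]
time_near = {}
for group, pos in fixes[fixes.who != "teacher"].groupby("who"):
    pos = pos.set_index("t")[["x", "y"]]
    # align teacher and group fixes on shared timestamps, then measure distance
    joined = teacher.join(pos, lsuffix="_t", rsuffix="_g", how="inner")
    dist = np.hypot(joined.x_t - joined.x_g, joined.y_t - joined.y_g)
    time_near[group] = int((dist < PROXIMITY_M).sum())  # seconds spent nearby

print(time_near)  # {'groupA': 2, 'groupB': 1}
```

Metrics of this kind could then be visualised over a lesson, or compared across lessons, as one starting point for the reflective discussions the workshop targets.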

2. Participants

The intended audience includes participants interested in developing adaptive and flexible ways to investigate how learners and teachers use learning spaces. We expect to conduct a 3-hour workshop with at least 10 participants representing different research communities, including the learning sciences (LS)/education, technology-enhanced learning (TEL), and also more data-intensive communities such as learning analytics and artificial intelligence in education (AIED). The participation of practitioners and educators will also be encouraged.

Participants will gain first-hand experience in using wearable sensors to track their positioning and in interacting with the data generated from these sensors.

3. Workshop activities

The workshop will follow a JIGSAW collaboration pattern according to the following schedule:

  • Introduction: the workshop will start with short introductions from all participants and a brief summary of the focus and scope of the workshop (20 minutes);
  • Community Groups: groups will be formed with the aim of scoping the problem and identifying gaps, which will be shared with all the workshop participants (60 minutes);
  • Design Groups: members of the community groups will be re-grouped into design groups to define future scenarios (60 minutes);
  • Consolidation and Reflection: a facilitated debrief with the goal of consolidating a group of researchers and practitioners interested in the topic (20 minutes).

Breaks and transitions are already accounted for in this plan.

4. Organisers

Roberto Martinez-Maldonado is a Senior Lecturer at Monash University, Melbourne. His areas of research include Human-Computer Interaction, Learning Analytics, Artificial Intelligence in Education, and Computer-Supported Collaborative Learning (CSCL). In recent years, his research has focused on applying artificial intelligence and visualisation techniques to help understand how people learn and collaborate in collocated environments. He is currently co-director of CrossMMLA, the special interest group on Multimodal Learning Analytics Across Spaces.

Gloria Fernandez Nieto is a second-year PhD student at the University of Technology Sydney in the Connected Intelligence Centre. She is supervised by Professor Simon Buckingham Shum, Dr Kirsty Kitto and Dr Roberto Martinez. Her current research focuses on exploring alternative forms of feedback that use traces from data collected in the CSCL classroom to prompt reflection on teaching and learning practices. Her previous research focused on Learning Analytics, Technology Enhanced Learning and Knowledge Management.

References

An, P., Bakker, S., Ordanovski, S., Taconis, R., & Eggen, B. (2018). ClassBeacons: Designing Distributed Visualization of Teachers’ Physical Proximity in the Classroom. In Proceedings of the International Conference on Tangible, Embedded, and Embodied Interaction, TEI’18, (pp. 357-367). Stockholm, Sweden. 3173243: ACM.

Bdiwi, R., de Runz, C., Faiz, S., & Cherif, A. A. (2019). Smart learning environment: Teacher’s role in assessing classroom attention. Research in Learning Technology, 27. https://journal.alt.ac.uk/index.php/rlt/article/view/2072

Chin, H. B., Mei, C. C. Y., & Taib, F. (2017). Instructional Proxemics and Its Impact on Classroom Teaching and Learning. International Journal of Modern Languages and Applied Linguistics, 1(1), 1-20.

Dillenbourg, P., Zufferey, G., Alavi, H., Jermann, P., Do-Lenh, S., Bonnard, Q., Cuendet, S., & Kaplan, F. (2011). Classroom orchestration: The third circle of usability. In Proceedings of the International Conference on Computer Supported Collaborative Learning, CSCL’11, (pp. 510-517). Hong Kong. New York: Springer

Goodyear, P., Ellis, R. A., & Marmot, A. (2018). Learning spaces research: Framing actionable knowledge. In R. A. Ellis & P. Goodyear (Eds.), Spaces of Teaching and Learning, (pp. 221-238). Singapore: Springer.

Hall, E. T. (1966). The hidden dimension (Vol. 609). Garden City, NY, United States: Doubleday.

Martinez-Maldonado, R. (2019). I Spent More Time with that Team: Making Spatial Pedagogy Visible Using Positioning Sensors. In Proceedings of the International Conference on Learning Analytics & Knowledge, LAK’19, (pp. 21-25). ACM.

McArthur, J. A. (2015). Matching Instructors and Spaces of Learning: The Impact of Space on Behavioral, Affective and Cognitive Learning. Journal of Learning Spaces, 4(1), 1-16.

Thompson, S. (2012). The applications of proxemics and territoriality in designing efficient layouts for interior design studios and a prototype design studio. Masters dissertation. California State University, Northridge, United States

ALASI2019: Panel debate – the validity of using student evaluation surveys for performance based funding at Australian universities @ University of Wollongong
Nov 28 @ 9:00 am – 9:30 am

Leonie Payne1, Kirsty Kitto1, Michael Pracy1, Jason Lodge2, Abelardo Pardo3

Abstract

The Australian Federal Government announced in August 2019 that aspects of the Quality Indicators for Learning and Teaching (QILT) Student Experience and Graduate Outcomes surveys will form two of the four key metrics for performance-based funding of Australian universities from 2020. Given the lack of consensus on the validity and appropriate use of student evaluations of teaching, it is time to explore the ramifications of this decision. And who better than the Learning Analytics community to do so? We propose a plenary panel debate on the provocation “Student evaluations of teaching are the worst form of evaluation, except for all of the others”.

Keywords

Performance based funding, QILT, Student evaluation surveys, teaching quality, higher education

1 Email: {Leonie.E.Payne,Kirsty.Kitto,Michael.Pracy}@student.uts.edu.au, Connected Intelligence Centre, University of Technology Sydney. PO Box 123 Broadway NSW 2007 Australia

2 Email: jason.lodge@uq.edu.au, University of Queensland

3 Email: abelardo.pardo@unisa.edu.au, University of South Australia

1. Panel Debate Background – Performance Based Funding

The Performance-Based Funding for the Commonwealth Grant Scheme: Report for the Minister for Education was released in August 2019. This report outlines the proposed measures for performance-based university funding to be implemented in 2020, including the Student Experience and Graduate Outcomes QILT (Quality Indicators for Learning and Teaching) surveys. The metrics to be included are student satisfaction with teaching quality (Student Experience Survey) and graduate employment rates (Graduate Outcomes Survey) for domestic bachelor students. The stated aims of the scheme include creating more “accountability” for public investment in higher education priorities and providing financial incentives to encourage improved university performance, with the identified key principles of “fitness-for-purpose, fairness, robustness and feasibility” (Commonwealth of Australia, 2019).

Given the implications that these decisions will have for university funding, and the diverse conflicting perspectives on the validity of student surveys as a form of teaching quality evaluation, we propose a panel debate for ALASI 2019. The topic will be: “Student evaluations of teaching are the worst form of evaluation, except for all of the others”. We envisage the debate would provide a highly interactive, entertaining and potentially controversial event, that would help to advance discussion about this important topic which will have high impact upon the Australian university sector.

1.1 Quality Indicators for Learning and Teaching (QILT)

The Quality Indicators for Learning and Teaching (QILT) are a suite of annually published surveys that allow comparison of Australian higher education institutions and study areas on measures of student experience (QILT 2018). They are an example of Student Evaluations of Teaching (SET) (Marsh 2007), and give students an opportunity to compare universities based on surveyed student experience and graduate employment outcomes (QILT 2015).

1.2 Arguments for and against the Validity of Student Evaluations of Teaching

The Marsh (2007) review discusses a wide range of research demonstrating that there is validity in using Student Evaluations of Teaching as a measure of teaching performance. For example, there is a well-established relationship between student ratings and learning, with SETs having good internal consistency and stability (Abrami 2001). In addition, Spooren, Brockx, & Mortelmans (2013) found that SETs are also correlated with teachers’ self-evaluations, alumni ratings and evaluations by trained observers. Aleamoni (1999) dispels the myth that student ratings are merely a ‘popularity contest’, finding students rated educators on their preparation and organisation, stimulation of interest, motivation, answering of questions, and courteous treatment of students. In contrast, a wide body of research questions the validity and appropriate application of student surveys for evaluating teaching quality. For example, Johnson (2000) cautions against the use of student evaluation questionnaires as a bureaucratic tool driven by market ideologies, arguing that while student evaluations may be useful as formative diagnostics, they are not appropriate tools for summative judgement in employment and tenure decisions. Similarly, a study by Boysen et al. (2014) demonstrates that teaching evaluations are interpreted by administrators and teaching faculty as possessing higher levels of precision than are warranted given the statistical methodologies used. Shevlin et al. (2000) argue that a ‘halo’ effect limits their ability to measure the multi-faceted, multi-dimensional nature of teaching effectiveness, and Macfadyen et al. (2015) posit bias in which students even respond to SETs, arguing that this sample does not reflect the individual and course characteristics of the total student population. Thus, the cases for and against SETs are extensive, and conflicting, which sets the scene for a lively forum.

2. Panel Format

This panel would take the format of a plenary session debate, with six speakers allocated to one of two teams, for and against the proposition. Team captains will be Jason Lodge and Kirsty Kitto, with the remainder of the participants to be determined once ALASI attendees are known. We will endeavour to bring a QILT member, and the Vice-Chancellor of the University of Wollongong (who chaired the performance-based funding review), onto the panel as team participants. Abelardo Pardo will play the role of MC.

3. Organiser Credentials

  • Leonie Payne is a PhD Student at the Connected Intelligence Centre, UTS, where she is working on a thesis that aims to bring rigour to the evaluation of quality in Higher Education by accounting for bias in student response rates.
  • Kirsty Kitto is a Senior Lecturer at the Connected Intelligence Centre, UTS, where she leads a number of LA projects. She was formerly seconded to the QUT Quality and Evaluation unit, where she worked on analysing 4 years of SET data to derive performance metrics for teaching.
  • Michael Pracy works as a data scientist at the Connected Intelligence Centre, UTS. His background is in astrophysics, where he performed extensive work on controlling for bias in data obtained from astrophysical phenomena.
  • Jason Lodge is an Associate Professor at the University of Queensland where he concentrates on the application of the learning sciences to higher education. Specifically, he is interested in the cognitive and emotional factors that influence learning and behaviour and how research findings from the learning sciences can be better used to enhance design for learning, teaching practice and education policy.
  • Abelardo Pardo is Professor and Dean (Academic) of the Division of Information Technology, Engineering and the Environment at the University of South Australia. His research interests include the design and deployment of technology to increase understanding of, and improve, digital learning experiences. He is the current president of SoLAR.

References

Abrami, P. (2001). Improving judgments about teaching effectiveness using teacher rating forms. New Directions for Institutional Research, (109), 59–87.

Aleamoni, L. (1999). Student rating myths versus research facts from 1924 to 1998. Journal of Personnel Evaluation in Education, 13(2), 153–166.

Boysen, G. A., Kelly, T. J., Raesly, H. N., & Casner, R. W. (2014). The (mis)interpretation of teaching evaluations by college faculty and administrators. Assessment & Evaluation in Higher Education, 39(6), 641–656.

Commonwealth of Australia (2019). Performance-Based Funding for the Commonwealth Grant Scheme, Report for the Minister for Education – June 2019, viewed 9th August 2019, https://www.education.gov.au/performance-based-funding-commonwealth-grant-scheme

Johnson, R. (2000). The authority of the student evaluation questionnaire. Teaching in Higher Education, 5(4), 419–434.

Macfadyen, L., Dawson, S., Prest, S., & Gasevic, D. (2015). Whose feedback? A multilevel analysis of student completion of end-of-term teaching evaluations. Assessment & Evaluation in Higher Education, 1–19.

Marsh, H. (2007). Students’ evaluations of university teaching: Dimensionality, reliability, validity, potential biases and usefulness. In The scholarship of teaching and learning in higher education: An evidence-based perspective (pp. 319–383). Springer.

QILT 2015, Department of Education and Training, QILT website launched, M2 Presswire – September 17, 2015.

QILT 2018, “Quality Indicators for Learning and Teaching”, viewed 9th August 2019, www.qilt.edu.au

Shevlin, M., Banyard, P., Davies, M., & Griffiths, M. (2000). The validity of student evaluation of teaching in higher education: Love me, love my lectures? Assessment & Evaluation in Higher Education, 25(4), 397–405.

Spooren, P., Brockx, B., & Mortelmans, D. (2013). On the validity of student evaluation of teaching: The state of the art. Review of Educational Research, 83(4), 598–642.

ALASI2019: The Fifth Writing Analytics Workshop: Linking Reflective Writing Analytics to Learning Design @ University of Wollongong
Nov 28 @ 9:00 am – 9:30 am

Ming Liu1, Rosalie Goldsmith2, Sumati Ahuja3, Xiaodi Huang4

Abstract

Reflective writing is a fundamental learning activity across learning contexts. With recent advances in natural language processing techniques, text analytics can identify salient textual features of students’ written assignments, such as academic reflective essays and reflective statements, and generate actionable feedback. However, how best to use reflective writing analytics tools such as AcaWriter in different learning contexts is a challenging and important issue. This workshop seeks to connect writing analytics researchers and educators who work on linking the writing tool into learning design. We will demonstrate how AcaWriter can analyse writing from different learning contexts, and how to integrate the tool into various learning designs. Participants will gain hands-on experience of using AcaWriter, be shown case studies of using AcaWriter in different subjects at the University of Technology Sydney, and finally discuss the challenges of linking writing analytics tools to learning design.

Keywords

Writing analytics, AcaWriter, Automated Writing Feedback, Learning Design

Corresponding author 1 Email: ming.liu@uts.edu.au Address: Connected Intelligence Centre, University of Technology Sydney, NSW, Australia

2 Email: Rosalie.Goldsmith@uts.edu.au Address: Institute for Interactive Media and Learning, University of Technology Sydney, NSW, Australia

3 Email: Sumati.Ahuja@uts.edu.au Address: Business School, University of Technology Sydney, NSW, Australia

4 Email: xhuang@csu.edu.au Address: School of Computing and Mathematics, Charles Sturt University, Albury-Wodonga, VIC, Australia

1. Workshop Focus

In recent years, research on integrating learning analytics with learning design has attracted great attention in the learning analytics (LA) community (Lockyer & Dawson, 2011; Macfadyen, Lockyer and Rienties, 2019). With advances in natural language processing and machine learning, writing analytics has become an emerging field of LA and has shown its potential for helping students revise their written assignments (Gibson et al., 2017), and for helping teachers improve learning design with evidence (Lucas, Gibson and Buckingham Shum, in press). The proposed fifth workshop in the series will build on the previous ALASI writing analytics workshops to develop writing analytics literacy and learning design skills. The focus will be on the integration of AcaWriter with learning design by linking theory, pedagogy and assessment to close the feedback loop (Corrin et al., 2018; Knight, Buckingham Shum, & Littleton, 2014; Shibani, Knight, Buckingham Shum, & Ryan, 2017).
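
To give a flavour of what identifying salient textual features can mean, here is a deliberately naive sketch (the patterns and move labels are invented for illustration; AcaWriter’s actual analysis uses a much richer rhetorical parser, so treat this only as the general idea):

```python
import re

# Toy lexicon of reflective "moves": hypothetical patterns, nowhere near
# AcaWriter's real pipeline, but enough to show how salient sentences
# can be surfaced as candidates for actionable feedback.
MOVES = {
    "feeling":    re.compile(r"\bI (felt|feel|was surprised|struggled)\b", re.I),
    "evaluation": re.compile(r"\b(went well|did not work|was challenging)\b", re.I),
    "future":     re.compile(r"\b(next time|in future) I (will|plan to)\b", re.I),
}

def annotate(text):
    """Return (sentence, [matched move labels]) pairs for a reflective draft."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [(s, [m for m, pat in MOVES.items() if pat.search(s)])
            for s in sentences]

draft = ("I was surprised by the client interview. "
         "The survey design did not work. "
         "Next time I will pilot the questions first.")
for sentence, moves in annotate(draft):
    print(moves or ["no move detected"], "->", sentence)
```

In a learning design, feedback rules would then decide what to say to the student when an expected move is missing, which is exactly where the workshop’s link between analytics, pedagogy and assessment comes in.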

This workshop will provide participants with hands-on experience in using AcaWriter (in the first session), and detailed use cases in authentic learning environments along with discussion of how best to use AcaWriter in different learning contexts (in the second session). Thus, the fifth workshop is intended to:

  • increase participants’ knowledge of reflective writing analytics using the AcaWriter system;
  • enhance the conversation on writing analytics literacy development and learning design integration by bringing together text analytics researchers and educators;
  • move the field forward by creating a writing analytics research and application community in Australia.

2. Workshop Presenter Credentials

Dr. Ming Liu is a research fellow in text analytics at the Connected Intelligence Centre, UTS. His work focuses on researching and developing automated feedback tools that support writing, reading and peer reviewing in the context of individual and collaborative learning, using learning analytics and artificial intelligence. He has initiated and participated in several Australian and Chinese government funded writing analytics research projects, including AcaWriter (https://acawriter.uts.edu.au) funded by UTS, Glosser (comprehensive support for collaborative writing) funded by the ARC, iWrite (http://iwrite.sydney.edu.au/iwrite.html) funded by the OLT, and Cooperpad (a collaborative writing analytics tool) and VisualPeer (a formative peer review analytics tool) funded by the NSFC. He co-chaired the LAK19 and ALASI2018 writing analytics workshops.

Dr Rosalie Goldsmith is an applied linguist and a member of the Academic Language and Learning Team, University of Technology Sydney, where she works with the Faculty of Engineering & IT. Her main research areas are engineering education, writing practices, reflective writing and reflective writing analytics, practice architectures theory, peer learning, WIL, and developing professional identity.

Dr. Sumati Ahuja is an academic with over 25 years of industry experience and 10 years of teaching in both undergraduate and postgraduate programs. She has held senior positions at internationally renowned architectural firms. Sumati moved to a full-time academic role in 2018 after completing her PhD in Management at UTS Business School. Her PhD focused on the changing nature of professional work and how professionals respond to changes in the way their services are procured and delivered. The future of work continues to be Sumati’s research interest, with a focus on how technologies are transforming the work of human experts.

Dr. Xiaodi Huang received his Bachelor of Science in Physics in 1989, an M.Phil. in Computers in Education in 1992, and a PhD in 2004. He is a senior lecturer in the School of Computing and Mathematics at Charles Sturt University. His research areas include visualization, data mining, and web services. He has published over 100 scholarly papers in international journals and conferences. Dr Huang is a regular reviewer for several international journals and serves on the committees of a number of international conferences. He is a senior member of the IEEE Computer Society and a member of the ACM.

References

Macfadyen, L., Lockyer, L., & Rienties, B. (2019). Special issue on learning design and learning analytics. Journal of Learning Analytics, in press.

Lockyer, L., & Dawson, S. (2011). Learning designs and learning analytics. In P. Long, G. Siemens, G. Conole & D. Gasevic (Eds.), Proceedings of the 1st International Conference on Learning Analytics and Knowledge (pp. 153-156). New York, NY, USA: ACM.

Gibson, A., Aitken, A., Sándor, Á., Buckingham Shum, S., Tsingos-Lucas, C., & Knight, S. (2017). Reflective writing analytics for actionable feedback. In Proceedings of LAK17: 7th International Conference on Learning Analytics & Knowledge, March 13-17, 2017, Vancouver, BC, Canada. ACM. https://doi.org/10.1145/3027385.3027436

Lucas, C., Gibson, A., & Buckingham Shum, S. (in press). Utilization of a novel online reflective learning tool for immediate formative feedback to assist pharmacy students’ reflective writing skills. American Journal of Pharmaceutical Education. https://doi.org/10.5688/ajpe6800

Shibani, A., Knight, S., Buckingham Shum, S., & Ryan, P. (2017). Design and implementation of a pedagogic intervention using writing analytics. In Proceedings of the 25th International Conference on Computers in Education. New Zealand: Asia-Pacific Society for Computers in Education.

Knight, S., Buckingham Shum, S., Ryan, P., Sándor, Á., & Wang, X. (2017). Academic writing analytics for civil law: Participatory design through academic and student engagement. International Journal of Artificial Intelligence in Education, 28(1), 1-28.
