Learning analytics entails the application of data science techniques, such as probability modelling and data visualization, to educational data in order to generate actionable knowledge that supports teaching and learning (Duval, 2011; Siemens, 2012). Because the field originated in online courseware environments, which typically embraced knowledge-transmission modes of pedagogy, a large proportion of learning analytics research focuses on assessment at the level of individual learners, emphasizing individual achievement and accountability.
One aspect of my research concerns the development of learning analytics for learning communities. Learning communities are characterized by a culture of learning in which all participants are engaged in a collective effort of understanding. One of the major barriers teachers face in adopting learning community approaches is assessment. In contrast to traditional forms of instruction, in which the teacher has sole authority over the assessment of students’ work, learning communities grant students a greater degree of agency, allowing them to “develop ways to assess their own progress and work with others to assess the community’s progress” (Bielaczyc & Collins, 1999, p. 272). Activity designs in a learning community must therefore clearly articulate the underlying learning processes, making them visible and accessible for assessment. Furthermore, because learning communities attend to both individual and collective aspects of knowledge production, assessment in these contexts must serve the dual function of measuring and scaffolding learning, producing a “feedforward effect” that catalyzes the development of new knowledge.
Throughout my research, I have designed and implemented a variety of student- and teacher-facing learning analytics for learning communities. These analytics have captured the real-time progress of the community at three levels of granularity: individual students, small groups, and the whole class. The goals of my work are closely aligned with those identified by Buckingham Shum and Crick (2016) concerning learning analytics for formative assessment of 21st century competencies:
…to forge new links from the body of educational/learning sciences research—which typically clarifies the nature of the phenomena under question using representations and language for researchers—to documenting how data, algorithms, code, and user interfaces come together through coherent design in order to automate such analyses, providing actionable insight for the educators, students, and other stakeholders who constitute the learning system. (p. 8)
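To make the multi-level analytics concrete, the sketch below illustrates one way progress measures might be aggregated at the three levels of granularity described above. It is a minimal example rather than the actual implementation: the event log, its column names, and the data are all invented for illustration, and the aggregation uses pandas.

```python
import pandas as pd

# Hypothetical event log of student contributions; in a deployed system this
# would be streamed from the learning environment. Columns and values are
# invented for illustration.
events = pd.DataFrame({
    "student": ["ana", "ben", "ana", "cho", "ben", "cho"],
    "group":   ["g1",  "g1",  "g1",  "g2",  "g1",  "g2"],
    "notes_contributed": [3, 1, 2, 4, 2, 1],
})

# Aggregate the same contribution measure at three levels of granularity.
by_student = events.groupby("student")["notes_contributed"].sum()
by_group = events.groupby("group")["notes_contributed"].sum()
whole_class = events["notes_contributed"].sum()

print(by_student)
print(by_group)
print(f"Whole class: {whole_class}")
```

In a running system, aggregates like these would be recomputed as new events arrive and rendered in the student- and teacher-facing displays, supporting formative assessment at each level.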
References:
Bielaczyc, K., & Collins, A. (1999). Learning communities in classrooms: A reconceptualization of educational practice. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 269–292). Mahwah, NJ: Lawrence Erlbaum Associates.
Buckingham Shum, S., & Crick, R. D. (2016). Learning analytics for 21st century competencies. Journal of Learning Analytics, 3(2), 6–21.
Duval, E. (2011). Attention please! Learning analytics for visualization and recommendation. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge (pp. 9–17). New York, NY: ACM.
Ferguson, R. (2012). Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5/6), 304–317.
Siemens, G. (2012). Learning analytics: Envisioning a research discipline and a domain of practice. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 4–8). New York, NY: ACM.