Learning Analytics: Learning and Growing Pains


After a jaunt through some analytical tools and methods, my course on learning analytics has returned to the path of overview and reflection. The subjects of this post are two different methods or approaches that share common criticisms and challenges. As a very young and inherently interdisciplinary field, learning analytics is still maturing, much like young apprentices who have been introduced to the tools of the trade but have not yet developed the expertise necessary to deliver quality work. The use of network analysis in learning environments (including social network analysis methods) and the development of multimodal learning analytics systems are good examples of such limitations. Most studies utilizing these methods lack theoretical foundations or contexts, as well as standards in definitions and methods. Because the methods were originally developed outside the field of educational research, they often require technological or methodological expertise not normally available to those with an interest in studying learning. Finally, these methods illustrate the need for multiple approaches to adequately examine the complexities of learning and cognitive processes.

The readings about these methods (chapters 4 and 6 from The Handbook of Learning Analytics; see Lang, et al., 2022) were particularly critical of the state of research put forth by those in the field. Specific criticisms include studies that provide “limited insight” and “disjointed empirical findings” that are “difficult to synthesize” (for social network analyses, Lang, et al., pg. 38), as well as “the lack of homogeneity of methodological approaches,” with “each study (using) different approaches,” resulting in “complete diversity” of research (regarding multimodal learning analytics, Lang, et al., pg. 60). The key problem identified by Oleksandra Poquet and Srećko Joksimović, authors of the critique of learning analytics studies utilizing social network analysis, is “the naive adoption of network analysis” without making the network methodologies explicit and without careful consideration of the assumptions of these methods. This has resulted in a “cacophony of network approaches” that are indecipherable and provide disjointed results. Similar criticism, albeit not quite as harsh, comes from Xavier Ochoa, author of the chapter on multimodal learning analytics (MmLA), who characterized this specialty as lacking standards and best practices due in no small part to the youth and interdisciplinarity of the field (Lang, et al., pg. 61).

Both specialties, like the entire field of learning analytics, require expertise outside the traditional field of education, particularly in technology and data analytics (e.g., machine learning and artificial intelligence). The methods of multimodal learning analytics inherently require substantial knowledge and technical capability to integrate a multitude of sensors into a network that provides simultaneous capture of trace data. Then there are the technologies needed to assimilate or “fuse” the disparate data so that it can be processed and analyzed using artificial intelligence methods (Lang, et al., pg. 62). These methods rest on assumptions and parameters that, when not fully understood, could lead to the “limited insight” and “disjointed empirical findings” that Poquet and Joksimović warned about. The authors noted that network analysis as a methodology shifts the resulting networks from “relationships of data” to “relationships of constructs” (Lang, et al., pg. 40). In these theoretical models, attention needs to be paid to proper construction of the network model through selection of ties or nodes based on theory, and to using the theoretical framework to interpret the model, rather than explaining the model through the data relationships alone (Lang, et al., pg. 41).

These two methods, multimodal learning analytics, of course, but also network analysis for learning analytics, demonstrate the need for multiple methods to effectively discern cognitive and learning processes. This is inherent in multimodal LA, as the method is based on capturing data from an array of modes of communication through a variety of media or channels. It also requires the data to be gathered in mixed-media learning environments and then analyzed simultaneously in order to provide near real-time feedback to students or teachers. Similarly, network analysis alone is rarely enough to explain cognitive processes or learning behaviors, nor are single dimensions of communication enough for network analyses to discern valid ties. The field is ripe for exploration using more complex network models that incorporate time, space, and multiple dimensions of communication (Lang, et al., pg. 43).
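As a rough illustration of what a more multidimensional network representation might look like, here is a minimal Python sketch. The students, channels, and interactions are entirely hypothetical (not from the readings); it simply treats each communication channel as a separate layer and, as one possible rule, counts a tie as "valid" only when it recurs across layers or weeks:

```python
from collections import defaultdict

# Hypothetical multiplex edge list: (student_a, student_b, channel, week)
edges = [
    ("A", "B", "forum", 1), ("A", "C", "chat", 1),
    ("B", "C", "forum", 2), ("A", "B", "co-edit", 2),
    ("C", "D", "chat", 3),  ("A", "B", "forum", 3),
]

# Per-layer degree: how many interactions each student has in each channel
degree = defaultdict(lambda: defaultdict(int))
for a, b, channel, _ in edges:
    degree[channel][a] += 1
    degree[channel][b] += 1

# Count repeated interactions per pair, across all channels and weeks
pair_counts = defaultdict(int)
for a, b, _, _ in edges:
    pair_counts[frozenset((a, b))] += 1

# Keep only ties that recur (appear two or more times)
valid_ties = {tuple(sorted(p)) for p, n in pair_counts.items() if n >= 2}
print(valid_ties)  # only the A-B tie recurs across channels/weeks
```

The recurrence threshold here is an arbitrary stand-in for a theory-driven rule; the point is that deciding which interactions count as a tie is a modeling choice, not something the raw data decides on its own.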

I was most impressed by the sheer amount of technology that could be (and has been) used for multimodal learning analytics. The example provided, a presentation rehearsal system that gives students feedback on their presentation skills, was impressive for the extent of its audio and visual data capture, and even more so for its analysis of the captured data (see figure below). The system included not only cameras and microphones, but also software to track eye gaze and posture, analyze speech volume, detect filled pauses (“um”s and “ah”s), and process all the data simultaneously to provide the student presenter with near real-time feedback.

Figure: Diagram of the layout of the multimodal system for oral presentation feedback, showing the pico projector (front and back), Raspberry Pi sensor, microphone, camera, Intel NUC, and acoustic foams (Ochoa and Dominguez, 2020).

Finally, one of the many new points I learned from these readings was the distinction between network analysis as a method and network analysis as a methodology. Network analysis as a method produces a network model based solely on the relationships in the data provided. While such models can be used to select features for more complex models and to categorize ties or nodes, they are inherently naïve or arbitrary and lack theoretical context; they cannot, then, represent a theoretical construct. Network analysis as a methodology requires such a theoretical framework and, as such, care and attention paid to the selection of model parameters, descriptive metrics, and statistical techniques. The concept of comparing network models with the use of null models was especially intriguing. Much like traditional null hypothesis testing, comparing network models requires “random networks simulated using hypothesized generative rules” – that is, models of a specific learning behavior under normal conditions. Models of observed behavior can then be compared against these null models to identify deviations.
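To make the null-model idea concrete, here is a small Python sketch under simplified assumptions: the six-student interaction network is invented for illustration, and the "generative rule" is simply a random graph with the same number of students and ties. It compares an observed clustering feature (triangle count) against the null distribution:

```python
import random
from itertools import combinations

def triangle_count(nodes, edges):
    """Count closed triads (triangles) in an undirected edge list."""
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return sum(1 for a, b, c in combinations(nodes, 3)
               if b in adj[a] and c in adj[a] and c in adj[b])

def random_graph(nodes, n_edges, rng):
    """Null model: same nodes and tie count, ties placed at random."""
    return rng.sample(list(combinations(nodes, 2)), n_edges)

# Hypothetical observed interaction network among six students
nodes = list("ABCDEF")
observed = [("A", "B"), ("A", "C"), ("B", "C"),
            ("C", "D"), ("D", "E"), ("E", "F"), ("D", "F")]

obs = triangle_count(nodes, observed)  # 2 triangles: A-B-C and D-E-F

# Simulate the null distribution and estimate how unusual the observation is
rng = random.Random(42)
null = [triangle_count(nodes, random_graph(nodes, len(observed), rng))
        for _ in range(1000)]
p = sum(1 for t in null if t >= obs) / len(null)
print(f"observed triangles: {obs}, p ≈ {p:.3f}")
```

In practice the generative rules would encode a hypothesized learning behavior rather than pure randomness, but the logic is the same: the observed model is judged against what the hypothesized process would produce by chance.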

References

Lang, C., et al. (2022). Handbook of Learning Analytics. Vancouver, BC: Society for Learning Analytics Research (SoLAR).

Ochoa, X. and F. Dominguez (2020). “Controlled evaluation of a multimodal system to improve oral presentation skills in a real learning setting.” British Journal of Educational Technology 51(5): 1615-1630.
