
The challenge for Learning Analytics: Sense making

Image: Educause (https://sylviamoessinger.files.wordpress.com/2012/06/learninganalytics_chalkboard.jpg)

It's true that Twitter can be a distraction, but it is an unparalleled resource for new ideas and for learning about things you didn't know you wanted to learn about. This morning my attention was drawn by a tweet linking to an interview in Times Higher Education with Todd Rose entitled "Taking on the 'averagarians'". Todd Rose believes that "more sophisticated examples of 'averagarian' fallacies – making decisions about individuals on the basis of what an idealised average person would do – are causing havoc all round." The article suggests that this applies to higher education, giving the example that "Universities assume that an average student should learn a certain amount of information in a certain amount of time. Those who are much quicker than average on 95 per cent of their modules and slower than average on 5 per cent may struggle to get a degree."
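To make the fallacy concrete, here is a minimal sketch with made-up numbers: a hypothetical student who is quicker than the cohort average on 19 of 20 modules still falls foul of a fixed per-module deadline set at the average pace.

```python
# A minimal sketch (made-up numbers) of the "averagarian" fallacy:
# judging an individual against a single average pace hides the fact
# that their speed varies across modules.

modules = 20
cohort_avg_weeks = 10            # assumed average time to complete a module

# A hypothetical student: quicker than average on 95% of modules,
# slower on the remaining 5%.
student_weeks = [8] * 19 + [16]  # 19 quick modules, 1 slow one

mean_pace = sum(student_weeks) / modules
print(f"Student's mean pace: {mean_pace:.1f} weeks "
      f"(cohort average: {cohort_avg_weeks})")

# Judged on the mean, the student looks comfortably "above average" ...
print("Quicker than average overall:", mean_pace < cohort_avg_weeks)

# ... yet a fixed per-module deadline set at the average pace still
# fails them on the one module where they need more time.
over_deadline = [w for w in student_weeks if w > cohort_avg_weeks]
print(f"Modules over the average-pace deadline: {len(over_deadline)}")
```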

It seems to me that this is one of the problems with data analytics. It may or may not matter that an individual is doing better or worse than the class average, or that they spend more or less time reading or, even worse, logged on to the campus VLE. It's not that this data isn't potentially useful; the question is what sense to make of it. I'm currently editing a paper for submission to the workshop on Learning Analytics for Workplace and Professional Learning (LA for Work) at the Learning Analytics and Knowledge Conference (LAK 2016) in April (I will post a copy of the paper here on Sunday). My colleague Andreas Schmidt has contributed what I think is an important paragraph:

Supporting the learning of individuals with learning analytics is not just a question of how designers of learning solutions present dashboards, visualizations and other forms of data representation. The biggest challenge of workplace learning analytics (but also of learning analytics in general) is to support learners in making sense of the data analysis:

  • What does an indicator or a visualization tell about how to improve learning?
  • What are the limitations of such indicators?
  • How can we move further towards evidence-based interventions?

And this is not just an individual task; it requires collaborative reflection and learning processes. The knowledge of how to use learning analytics results to improve learning also needs to evolve through a knowledge maturing process. This corresponds to Argyris & Schön's double loop learning. Otherwise, if learning analytics is perceived as a top-down approach pushed towards the learner, it will suffer from the same problems as performance management: pre-defined indicators (through their selection, computation, and visualization) implement a certain preconception that is not evaluated on a continuous basis by those involved in the process. Misinterpretations and a misled confidence in numbers can disempower learners and lead to an overall rejection of analytics-driven approaches.
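As an illustration of how a pre-defined indicator bakes in a preconception, here is a minimal sketch with hypothetical data and an assumed threshold: if "engagement" is defined as minutes logged on to the VLE, an efficient learner who works mostly offline is flagged as at risk regardless of outcomes.

```python
# A minimal sketch (hypothetical data and threshold) of a pre-defined
# indicator embedding a preconception: "engagement" is measured as
# minutes logged on to the VLE, so an efficient learner who studies
# offline looks "at risk" even with strong assessment results.

learners = [
    {"name": "A", "vle_minutes": 600, "assessment_score": 55},
    {"name": "B", "vle_minutes": 90,  "assessment_score": 88},  # works mostly offline
]

AT_RISK_THRESHOLD = 120  # assumed cut-off chosen by the indicator's designer

for learner in learners:
    flagged = learner["vle_minutes"] < AT_RISK_THRESHOLD
    print(f"{learner['name']}: engagement={learner['vle_minutes']} min, "
          f"score={learner['assessment_score']}, at_risk={flagged}")

# Unless the threshold and the choice of "minutes online" as a proxy
# are themselves revisited (the second, double loop), the indicator
# keeps misclassifying learner B.
```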
