Levels of Analytics

Learning analytics, and its related but distinct discipline of Educational Data Mining (EDM), can be seen to operate at different levels, or scales. Following the nomenclature of Ferguson and Buckingham Shum (2012), analytics operate at three broad levels:

Macro Level Analytics exist at national and/or sector level. In the crudest sense these could be represented by league tables of institutions based on various performance metrics. More nuanced uses might examine sector-level trends, such as participation rates compared with socio-economic data, or drop-out rates.

Meso Level Analytics typically exist at the level of an individual institution. An example from a technology-enhanced learning context might be a benchmarking exercise looking at the number of departments and modules using electronic management of assessment approaches.

Micro Level Analytics are those most likely to apply to individual learners, and it is perhaps these that we think of more specifically when we think of “learner analytics” proper. These might include information that helps an individual learner assess their own performance, either over time and/or in comparison to peers. The term may also refer to processes that help to identify students who might be struggling in some way, so that interventions can be planned to support them. Another familiar example is giving collective feedback on an assessment task, where a lecturer identifies mistakes (and good points!) commonly made by groups of students and shares these with them to improve future performance. Whilst this may be something that lecturers have done for years, increased use of electronic assessment methods might provide a more detailed analysis, should this be useful.
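To make the idea concrete, here is a minimal sketch of one way a “struggling student” flag might work, comparing each learner’s average assessment mark against the cohort. The marks, the threshold and the field names are illustrative assumptions, not the output of any particular VLE.

    from statistics import mean, stdev

    # Hypothetical assessment marks, as might be exported from a VLE.
    marks = {
        "alice": [72, 68, 75],
        "bhavna": [45, 38, 51],
        "carlos": [60, 62, 58],
        "dina": [30, 41, 35],
    }

    # Average mark per student, then the cohort-wide mean and spread.
    averages = {name: mean(ms) for name, ms in marks.items()}
    cohort_mean = mean(averages.values())
    cohort_sd = stdev(averages.values())

    # Flag anyone more than one standard deviation below the cohort mean;
    # a deliberately simple threshold, assumed purely for illustration.
    threshold = cohort_mean - cohort_sd
    for name, avg in sorted(averages.items(), key=lambda item: item[1]):
        status = "review" if avg < threshold else "ok"
        print(f"{name:8s} average={avg:5.1f}  {status}")

In practice any such threshold would need careful validation against local data before being used to trigger interventions.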

The use of analytics can also be extended beyond formal assessment data. Other forms of activity, such as engagement with discussion boards within the VLE, can also be interrogated. MOLE provides a number of reporting tools that do this (available under Evaluation, Course Reports in the Control Panel for any given course). More informal learning activities can also be analysed, such as engagement with social networking tools. Some of these, such as Facebook, already have their own analytics tools, and third-party tools exist for others. An excellent example is TAGS, developed by Martin Hawksey, for analysing Twitter usage (https://tags.hawksey.info; see also the blog posts and newsletter articles below on Martin’s equally excellent recent presentation to the White Rose Learning Technologists’ Forum on analytics).
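As a simple illustration of interrogating engagement data of this kind, the sketch below tallies discussion-board posts per student from a hypothetical CSV export; the file name and column heading are assumptions, since export formats vary between VLEs and reporting tools.

    import csv
    from collections import Counter

    # Hypothetical CSV export of discussion-board activity; the file
    # name and the "student" column are illustrative assumptions.
    posts = Counter()
    with open("discussion_posts.csv", newline="") as f:
        for row in csv.DictReader(f):
            posts[row["student"]] += 1

    # Students with little or no forum activity may merit a closer look.
    for student, count in posts.most_common():
        print(f"{student}: {count} posts")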

Another form of analytics addresses the use of educational resources; the usage data involved is often referred to as “paradata”. This again is not necessarily new: librarians, for example, have known for years that certain key texts are more popular than others, and have devised means such as short-loan collections and multiple copies of these texts to deal with this. Such data can now include how learners access resources within a VLE, or how specific questions within online tests are answered. VLEs and library systems collect this data automatically, and their vendors are realising the sales potential of developing additional tools such as Blackboard Analytics. Resources can also be subjected to more qualitative measures of usefulness by learners, similar to “likes” in social networking, or customer reviews in online shopping environments.
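As a rough sketch of what working with paradata might look like, the snippet below aggregates hypothetical access logs and learner ratings for a set of resources; the log format and the 1–5 rating scale are assumptions made for illustration, not features of any particular VLE or library system.

    from collections import Counter
    from statistics import mean

    # Hypothetical access log: (student, resource) pairs, loosely
    # modelled on what a VLE might record automatically.
    accesses = [
        ("alice", "week1_lecture.pdf"), ("bhavna", "week1_lecture.pdf"),
        ("carlos", "week1_lecture.pdf"), ("alice", "key_reading.pdf"),
        ("dina", "key_reading.pdf"),
    ]

    # Hypothetical learner ratings, akin to "likes" or review scores.
    ratings = {
        "week1_lecture.pdf": [4, 5, 4],
        "key_reading.pdf": [3, 4],
    }

    # Rank resources by how often they are accessed, adding the mean
    # rating where one exists.
    hits = Counter(resource for _, resource in accesses)
    for resource, count in hits.most_common():
        scores = ratings.get(resource)
        note = f", mean rating {mean(scores):.1f}" if scores else ""
        print(f"{resource}: {count} accesses{note}")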

See also:

About Learning Analytics

Learning Analytics: Critical Factors

Learning Analytics: Useful Resources and References