Thursday, 15 July 2010

Learning analytics / teaching analytics

My current focus is on learning analytics. How can we tell from site analytics whether someone is learning, engaging in activities that have been shown to support learning, or exhibiting behaviours that are associated with learning? And, rather than develop Wheel 2.0, I'm looking at available analytics and whether they can be harnessed to do this. Hence the current focus on Google Analytics.

A problem is that identifying learning means I need to be able to associate activity with specific individuals or groups of individuals. As I discussed below, I can do that to a limited extent with Google Analytics, but it's not really set up for that and, more to the point, focusing on individuals in this way feels intrusive and, I think, would need informed consent from those concerned if I pursued it to any extent.

So Google Analytics can give me some pointers as to whether learning activities and behaviours are taking place, but linking this to individual learners or groups of learners would involve another set of analytics, and those learners would have to be aware of what was taking place.

Coming at this from another perspective - how about teaching analytics? I'm not thinking here of time-and-motion studies of activity levels and output - I'm more interested in helping teachers and educators judge the value their output has for others. Google Analytics is potentially more helpful here, because the authors of online resources and the creators of online discussions have publicly identified themselves, so resources can be tied to individuals.

So, if I examine an online resource, I could look at how many visits it receives, how long those visits last and whether people move on to use linked resources. I can examine the effects of sharing a link to that resource on Twitter. How effective are my Twitter links compared with those of people widely known for their expertise in the field?

More broadly, if I look at an educational resource (I'm currently looking at Cloudworks) I can begin to identify the most effective resources and behaviours. In my next post, I'll describe some preliminary work I have done on this.


  1. When you unpick a lot of formal HE qualifications, you find the notion of "in partial fulfilment of..." cropping up. One of the partials is time spent engaged in the study of a topic, the idea being, I suppose, that if you spend enough time immersed in a topic, some of it must rub off. This sort of maps onto things like CAT points too, e.g. where 1 point is approximately 10 hours of related study.

    So, one possible route, and one that was idled around in early SocialLearn discussions, was to monitor the time folk spent apparently consuming resources on a particular topic in the SocialLearn environment.

    When I was running Google Analytics on T184 and TU120, I also wondered how we might use apparent time spent on a page as a signal about that page, where time on page might have contributory factors: difficulty, engagement with an embedded activity, etc.

  2. I was looking at 'time on page' earlier in the week. However, it does seem to involve a fair amount of drilling down to be a useful analytic - the /register/interest/confirmed page was averaging over 3 minutes of user attention, and I'm pretty sure it contains no interesting or useful content. On the day that Simon presented about SocialLearn to the OU online conference, people spent over 16 minutes on average on the 'confirmed interest' page (OK - the small sample size obviously distorted this figure).
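The distortion described in the comment above is easy to reproduce. A minimal sketch, using made-up dwell times rather than any real SocialLearn data, of how a couple of open-but-idle tabs can swamp a small sample's average time on page while the median stays close to typical behaviour:

```python
from statistics import mean, median

# Hypothetical per-visit dwell times (in seconds) on a 'confirmed interest'
# style page: mostly brief glances, plus two tabs left open during a talk.
dwell_times = [5, 8, 12, 6, 10, 900, 1100]

# The two outliers dominate the mean; the median ignores them.
print(f"mean:   {mean(dwell_times):.0f}s")    # prints "mean:   292s"
print(f"median: {median(dwell_times):.0f}s")  # prints "median: 10s"
```

On seven visits, two long outliers are enough to push the average from around 10 seconds to nearly 5 minutes, which is one reason a per-page average only becomes a useful analytic after some drilling down into the underlying visit distribution.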