Clawing my way up through the trough of disillusionment with learning analytics

(image: Gartner hype cycle, by Jeremykemp at English Wikipedia [GFDL (http://www.gnu.org/copyleft/fdl.html) or CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons)

Warning: this is a bit of a moan post.

Last week I attended the Jisc Learning Analytics Network meeting. It was a really good day: lots of people there, lots of good sharing, moaning, asking where next-ing. One of the reasons I find these events useful is that they help focus my mind and give me a sense of relief that some of the challenges I face are similar to, if not exactly the same as, those faced by many others in the sector.

In terms of learning analytics, my experiences to date have been metaphor-tastic: (ever decreasing) circles, slopes, dead ends, stop-starts . . . I feel that it’s appropriate to reflect on my journey via the well-trodden Gartner hype cycle.

I’m the first to admit I enjoyed being swept up to the peak of inflated expectations. Exploring the potential of data and learning analytics was probably the last piece of innovation work I was involved in when I worked with Cetis. I really enjoyed trying to figure out the practical applications and meanings for mainstream learning and teaching of the swirly twirly graphs at early LAK conferences. It was great to support the emerging UK community via early SoLAR meetings. I learnt a huge amount being involved in the Cetis Analytics Series. I always think I brought a healthy degree of scepticism to some of the hype of learning analytics, but I could (and still can) see the benefits of extracting, exploring and understanding data around learning and teaching.

From the giddy heights of the peak of inflated expectations, I knew when I moved to a “proper job” within a university I would have a bit of a slide down the slope to the trough of disillusionment. It’s getting out of the trough that I’m finding real difficulty with. Changes in senior management have meant going through a bit of a treadmill in terms of gaining institutional support and understanding. That’s before even accessing any data.

The Jisc Effective Learning Analytics programme has been a bit of a ray of light and hope for me. Towards the end of last year we took part in the Discovery phase of the programme. This involved a consultancy exercise, onsite for three days with a cross-section of institutional stakeholders, to assess our “readiness” for analytics. At the end of the exercise we got a report with our readiness matrix and some recommendations. You can view our report here.

At the meeting last week a number of institutions who have gone through the Discovery phase took part in a panel discussion about the experience. One common thread was the reassurance that the exercise gave to everyone in terms of being “on the right track” with things. I was pleasantly surprised that we got such a good score in terms of our cultural readiness. The validation of having an external report from a nationally recognised agency such as Jisc is also incredibly useful for those of us on the ground to remind/cajole (hit people over the head – oh wait, that’s only in my dreams) colleagues with, in terms of what we should be doing next.

I think one of the main problems with analytics is finding a starting point. Going through the Discovery phase does give a number of starting points. My frustration just now is that my institution is going through a major rethink of our overall data architecture. So on the one hand I think “hurrah”, because that does need to be done. On the other, I feel that I am almost back to square one: in terms of “business needs”, anything to do with learning and teaching seems to fall off the list of things that need to be done pretty quickly. It’s difficult to juggle priorities. What is more important: getting our admissions process working more efficiently, or developing ways to understand what happens when students are engaging (or not) with modules and the rest of the “stuff” that happens at university? Or updating our student record system, or updating our finance systems?

Amidst all this it was good to get a day out to find out what others are up to in the sector. Thanks, Jisc, for providing these networking events. They really are so useful for the sector, and long may they continue. UEL, who hosted the event, have been doing some great work over the past four years around learning analytics, which has emerged from their original BI work with Jisc. The work they have been doing around module attendance (via their swipe-card system and VLE data) and performance is something I hope we can do here at GCU sometime soon.
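For what it’s worth, the shape of that attendance-and-performance analysis is conceptually simple, even if the data plumbing never is. Here is a minimal sketch in Python, assuming hypothetical CSV exports (swipes.csv, vle_logs.csv, marks.csv) with made-up column names – the general idea, not UEL’s actual method:

```python
# Minimal sketch: relate swipe-card attendance and VLE activity to marks.
# All file and column names are hypothetical placeholders.
import pandas as pd

swipes = pd.read_csv("swipes.csv")    # columns: student_id, swipe_time
vle = pd.read_csv("vle_logs.csv")     # columns: student_id, event_time
marks = pd.read_csv("marks.csv")      # columns: student_id, module_mark

# Count events per student as crude engagement proxies.
attendance = swipes.groupby("student_id").size().rename("swipe_count")
vle_events = vle.groupby("student_id").size().rename("vle_count")

# Join the proxies onto marks; students with no events get a count of 0.
df = marks.set_index("student_id").join([attendance, vle_events]).fillna(0)

# First look: how do the two proxies co-vary with module marks?
print(df[["swipe_count", "vle_count", "module_mark"]].corr())
```

A correlation matrix is obviously only a first look: raw counts are crude proxies for engagement, and nothing here says anything about causation.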

In the morning we got updates from three mini projects Jisc has just funded, starting with the University of Greenwich and their investigations into module survey results and learning outcomes. The team explain more in this blog post. I was also very interested in the student workload model mini project being developed at the OU. You can read more about it here.

The other mini project, from the University of Edinburgh, was interesting too, but in a different way. It is more what I would term a pure LA research project, with lots of text data mining and regression modelling of (MOOC) discussion forums. Part of me is fascinated by all of this “clever stuff”, but equally part of me just thinks that I will never be able to use any of it in my day job. We don’t have huge discussion forums; in fact we are seeing (and in many ways encouraging) less use of them (even with our limited data views I know that) and more use of wikis and blogs for reflection and discussion. Maybe these techniques will work on these areas too. I hope so, but sometimes thinking about that really does make my head hurt.
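Out of curiosity I tried to pin down what that “clever stuff” actually amounts to: roughly, turn the forum text into numeric features and regress an outcome on them. A minimal sketch with made-up posts and marks – no claim that this is what the Edinburgh team did:

```python
# Minimal sketch of text mining + regression on discussion-forum posts.
# The posts and outcome scores are made-up placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge

posts = [
    "I found the week 2 material really challenging",
    "Great discussion, here is a summary of my notes",
    "Can anyone explain the second assignment?",
    "Sharing some extra reading on regression models",
]
outcome = [55.0, 72.0, 48.0, 81.0]  # e.g. final module mark per poster

# Turn free text into TF-IDF features, then fit a regularised regression.
vectoriser = TfidfVectorizer(stop_words="english")
X = vectoriser.fit_transform(posts)
model = Ridge(alpha=1.0).fit(X, outcome)

# Predict for a new post (illustration only).
new = vectoriser.transform(["struggling with the assignment this week"])
print(model.predict(new))
```

With four posts this is statistically meaningless, of course; the research projects work at MOOC scale with far richer models.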

I hope that we can start moving on our pilot work around learning analytics soon. ’Til then, I will hang on in there and continue my slow climb up the slope, and maybe one day arrive at the plateau.


About sheilmcn

I am a Senior Lecturer in Blended Learning at Glasgow Caledonian University.

10 thoughts on “Clawing my way up through the trough of disillusionment with learning analytics”

  1. By a remarkable coincidence, Sheila, I read your post immediately after posting a message about learning analytics to a FutureLearn course I am following. I understand and share your scepticism. Sometimes I even question whether there is actually a plateau of productivity in this instance :-) If we are honest we must surely accept that many developments in educational technology have had a peak of inflated expectations and a trough of disillusionment, and then they have quietly disappeared (or quietly led a modest life in some niche backwater).

    Below is an edited version of my FutureLearn message:

    “… there is a risk that basing management on analysis of data can lead to a complete misunderstanding of what is actually going on, as it tends to ignore human factors.

    Consider this story (which I believe is true) from the health sector. A hospital decided to use post-operative death rates to compare the performance of anaesthetists. After they had done the data crunching they found that all the anaesthetists had remarkably similar post-operative death rates – apart from one, Dr X. His post-operative death rates were drastically worse than all the others. Clearly there was something very wrong with Dr X, so management decided to investigate. But all the surgeons said “Dr X is the best anaesthetist we have ever met. That’s why we always ask him to work with us when we have a really difficult operation or a really ill patient.” So Dr X’s low score was actually an indication of his superior competence, not his incompetence. He worked on the difficult cases.

    What those who put their faith in ‘big data’ seem to forget is that there is a two-way relationship between behaviour and data. Data does not just reflect behaviour, it also influences behaviour. To go back to the anaesthetist story, one can imagine Dr X saying to a surgeon, “No I won’t help with your tricky operation, I will do a few straightforward ones to get my average up.” So the concern for data can actually harm patient outcomes. The same can happen in education and training: learners can suffer because individual teachers and institutions are ‘gaming’ in order to improve their performance data. This can also discourage creative risk taking and result in conformist teaching/management.

    Now the big data/learning analytics enthusiasts might claim that the answer is to have ever more sophisticated data. But this will surely just result in ever more sophisticated gaming behaviour, with even more energy going into making the figures look good rather than into promoting real learning.

    So I was interested to read on page 9 of the JISC ‘Learner Analytics’ publication that “most interviewees are reluctant to claim any significant outcomes from their learning analytics activities to date”.
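    The selection effect in the anaesthetist story is easy to reproduce in miniature. A small simulation sketch with entirely made-up numbers: the most skilled anaesthetist is given the hardest cases, and so shows the worst raw death rate.

    ```python
    # Made-up simulation of the anaesthetist story: skill halves the risk,
    # but Dr X is assigned the hardest cases precisely because of that skill.
    import random

    random.seed(1)

    def death_rate(n_cases, base_risk, skill_reduction):
        deaths = sum(random.random() < base_risk * (1 - skill_reduction)
                     for _ in range(n_cases))
        return deaths / n_cases

    # Five ordinary anaesthetists: routine cases (2% base risk), average skill.
    ordinary = [death_rate(500, 0.02, 0.0) for _ in range(5)]

    # Dr X: halves the risk on any case, but works the 10%-risk cases.
    dr_x = death_rate(500, 0.10, 0.5)

    print("ordinary:", [round(r, 3) for r in ordinary])
    print("Dr X:", round(dr_x, 3))  # worst raw rate despite the highest skill
    ```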

    • Hi Terry

      Thanks for your comment. I totally agree with you: it’s not just about the data, and we need to be wary of what metrics we are using, to avoid gaming the system, as you rightly point out. However, I think there are some things that it would be really useful to have more insight into, so we can have more focused discussions with colleagues, students, and the rest of the world. Data is also not objective, so we need to be aware of the subjective lenses (algorithms) we use.

  2. It’s a good post, and I followed up by reading the JISC blog post that tells us that “group activity is good for learners”. I get worried that things are being counted, evaluated, compared, just because they can be now, and that turns into an industry – rather than the focus being on the relationship between the learner and what needs to be learned.

    I remember an early VLE salesman being frustrated by my non-excitement about a system’s ability to record every keystroke and generate reports on this.

    I still quote the extensive study that showed learners who borrowed the most books from a university library got the best degrees. Perhaps this was not self-evident, but I don’t think longer reading lists were going to be the panacea.

    It is not the sophisticated stuff: teachers need supportive information that tells them that a learner is potentially at risk, and some helpful, supportive suggestions that will help them and the learner. Analytics can’t replace personal relationships.

  3. I’m not sure if I am thrilled or just even more frustrated by this article. Part of me takes great solace in knowing I’m not alone: not alone in the belief that learning analytics can truly transform the way we assess students, and not alone in feeling like every time I get a glimpse of insight it is squashed by another roadblock in my understanding. It is like this vapor I can see and even touch but cannot hold. Worse, for me, is that I am an elementary teacher with absolutely NO background in analytics!!!

    What I do have is several years’ experience across grades PK-2 using several adaptive assessments and adaptive curricula: some truly horrific in design, some “almost there.” I’ve even conducted research on an adaptive app, looking for convergent validity. I’ve cold-called research statisticians and they have all been extremely supportive. But the bottom line is, I believe we lack a clear understanding of the limits of these types of data. As one of your commenters said, “It’s not the sophisticated stuff…relationships matter.” I do not believe the type of data available through learning analytics can ever tell me that the reason my student struggled in math for a whole week was because his favorite chicken died (true story, by the way!).

    Administrators, policy makers, teachers and parents need to UNDERSTAND the limitations AND how to use the data. Currently it is used in a punitive manner that impacts everyone! If we can use “big data” more like Russell Almond (and colleagues) do through Evidence Centered Design – that is, as a part of the whole – we will have truly transformed assessment. If data is used as ONE source of feedback that is equal to, not greater or less than, other elements of feedback, then we can really differentiate learning. At the end of the day I just want to know how each individual child is progressing, compared only to their own individual growth, and what I need to do to support that growth.

    Keep clawing: there are those of us who believe it is worth the process. And… if you are short of programs to research, I can certainly point you in the direction of some I think have it right! Ha!

    • Thanks for your comment, Rene. It is good to know I’m not alone, and yes, I think there are many of us who just want a bit more information to add to all the factors that make up a successful learning and teaching experience. I know exactly what you mean about the chicken!

      Sheila

  4. Pingback: Notes and presentations from 5th UK Learning Analytics Network event in London | Effective Learning Analytics

  5. Pingback: A Tale of Two Conferences: #oer16 and #LAK16 | howsheilaseesIT

  6. Pingback: The day after the night before: thinking about data #codesign16 | howsheilaseesIT

  7. Pingback: Joint Jisc ALT #Codesign16 Data informed webinar | Jisc student experience blog
