Clawing my way up through the trough of disillusionment with learning analytics

Gartner hype cycle diagram

(image: Jeremykemp at English Wikipedia [GFDL (http://www.gnu.org/copyleft/fdl.html) or CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons)

Warning - this is a bit of a moan post.

Last week I attended the Jisc Learning Analytics Network meeting. It was a really good day, lots of people there, lots of good sharing, moaning, asking where next-ing.  One of the reasons I find these events useful is that they help focus my mind and give me a sense of relief that some of the challenges that I face are similar, if not exactly the same, as many others in the sector.

In terms of learning analytics, my experiences to date have been metaphor-tastic: (ever decreasing) circles, slopes, dead ends, stop-starts . . . I feel that it’s appropriate to reflect on my journey via the well-trodden Gartner hype cycle.

I’m the first to admit I enjoyed being swept up to the peak of inflated expectations. Exploring the potential of data and learning analytics was probably the last piece of innovation work I was involved in when I worked with Cetis. I really enjoyed trying to figure out the practical applications and meanings for mainstream learning and teaching of the swirly twirly graphs at early LAK conferences. It was great to support the emerging UK community via early SoLAR meetings. I learnt a huge amount being involved in the Cetis Analytics Series. I always think I brought a healthy degree of scepticism to some of the hype of learning analytics, but I could (and still can) see the benefits of extracting, exploring and understanding data around learning and teaching.

From the giddy heights of the peak of inflated expectations, I knew when I moved to a “proper job” within a university I would have a bit of a slide down the slope to the trough of disillusionment. It’s getting out of the trough that I’m finding real difficulty with. Changes in senior management have meant going through a bit of a treadmill in terms of gaining institutional support and understanding. And that’s before even accessing any data.

The Jisc Effective Analytics Programme has been a bit of a ray of light and hope for me. Towards the end of last year we took part in the Discovery phase of the programme. This involved a consultancy exercise, onsite for 3 days with a cross-section of institutional stakeholders, to assess our “readiness” for analytics. At the end of the exercise we got a report with our readiness matrix and some recommendations. You can view our report here.

At the meeting last week a number of institutions who have gone through the Discovery phase took part in a panel discussion about the experience. One common thread was the reassurance that the exercise gave to everyone in terms of being “on the right track” with things. I was pleasantly surprised that we got such a good score in terms of our cultural readiness. The validation of having an external report from a nationally recognised agency such as Jisc is also incredibly useful for those of us on the ground, as something to remind/cajole colleagues with (hit people over the head with – oh wait, that’s only in my dreams) in terms of what we should be doing next.

I think one of the main problems with analytics is finding a starting point. Going through the Discovery phase does give a number of starting points. My frustration just now is that my institution is going through a major rethink of our overall data architecture. So on the one hand I think “hurrah”, because that does need to be done. On the other, I feel that I am almost back to square one: in terms of “business needs”, anything to do with learning and teaching seems to fall off the list of things that need to be done pretty quickly. It’s difficult to juggle priorities. What is more important: getting our admissions process working more efficiently, or developing ways to understand what happens when students are engaging (or not) with modules and the rest of the “stuff” that happens at university? Or updating our student record system, or our finance systems?

Amidst all this it was good to get a day out to find out what others are up to in the sector. Thanks Jisc for providing these networking events. They really are so useful for the sector and long may they continue. UEL, who hosted the event, have been doing some great work over the past four years around learning analytics, which has emerged from their original BI work with Jisc. The work they have been doing around module attendance (via their swipe card system and VLE data) and performance is something I hope we can do here at GCU sometime soon.

In the morning we got updates from three mini projects that have just been funded, starting with the University of Greenwich and their investigations into module survey results and learning outcomes. The team explain more in this blog post. I was also very interested in the Student workload model mini project being developed at the OU. You can read more about it here.

The other mini project, from the University of Edinburgh, was interesting too, but in a different way. It is more what I would term a pure LA research project, with lots of text data mining and regression modelling of (MOOC) discussion forums. Part of me is fascinated by all of this “clever stuff”, but equally part of me just thinks that I will never be able to use any of that in my day job. We don’t have huge discussion forums; in fact we are seeing (and in many ways encouraging) less use of them (even with our limited data views I know that) and more use of wikis and blogs for reflection and discussion. Maybe these techniques will work on those areas too. I hope so, but sometimes thinking about that really does make my head hurt.

I hope that we can start moving on our pilot work around learning analytics soon. ’Til then, I will hang on in there and continue my slow climb up the slope, and maybe one day arrive at the plateau.

Looking in the mirror to discover our institutional capability for learning analytics

picture of a mirror

(image CC Share Alike https://commons.wikimedia.org/wiki/File:Mirror_fretwork_english_looking-glass.png)

It’s been a busy week here at GCU Blended Learning Towers. We’ve just finished the onsite part of the Jisc Effective Analytics Programme, so this week has been a flurry of workshops and interviews led by the consulting team of Andy Ramsden and Steve Bailey. Although Andy and Steve work for Blackboard, the discovery phase is “platform agnostic” and is as much about culture and people as technology; indeed, the evaluation rubric used had more to say about culture and people than technology. Having a team who really understand the UK HE sector was very reassuring. Sadly, it’s not often that you can say that about consultants and HE.

I think GCU is the second institution to go through the discovery process, and I know there are quite a few others who will be doing the same over the next six months. The process is pretty straightforward and outlined in the diagram below.

discovery process diagram

A core team from the institution has two online meetings with the consulting team, and relevant institutional policy/strategy documentation is reviewed before the onsite visit. At the end of the onsite visit an overall recommendation is shared with early findings, before a final report is given to the institution.

I was pleased (probably slightly relieved too) that we got a “ready with recommendations”.  That’s what we were hoping for.

Although we are still awaiting the final report, the process has already been incredibly useful. It has allowed us to bring together some of our key stakeholders; (re)start conversations about the potential and importance of learning analytics; and highlight the need to develop our infrastructure, people and processes to allow us to use our data more effectively. The final report will also be really helpful in terms of focusing our next steps.

Andy described the process as a bit like “holding a mirror to ourselves”, which is pretty accurate. The process hasn’t brought up issues we weren’t aware of. We know our underlying IT infrastructure needs “sorting”, and we’re starting to do that. What it has done is illustrate some potential areas to help us focus our next steps. In a sense it hasn’t so much helped us to see the forest for the trees, but rather shown us some twinkling lights and pathways through the forest.

All dashboards but no (meaningful) data – more on our #learninganalytics journey

Back in March I blogged about the start of our journey here at GCU into learning analytics. We had just produced our annual blended learning report, which had some headline stats around learning and teaching activity. As I said in that post, the figures we are getting are not that accurate, and extracting and making sense of the data from numerous sources has been no mean feat for those involved. Since then we have been making good progress in moving things forward internally, so this post is really an update on where we are.

When I was working on the Cetis Analytics Series, I remember Jean Mutton, University of Derby, telling me about the power of “data driven conversations”. I now have a far greater understanding of exactly what she meant by that. Since instigating initial discussions about the where, what, why, when, who and how of our data, we’ve been having some really productive discussions, mainly with our IS department and most importantly with one of our Business Analysts, Ken Fraser, who is now my new BFF 🙂

Ken has totally risen to our data challenge and has been exploring our data sets and sprinkling a bit of BI magic over things. Like many institutions, we populate our VLE automagically via our student record system. It is a key source of data, and really our primary data source for student information. However, actual student activity is recorded in other systems, primarily our VLE. We haven’t quite cracked the automagic feedback of assessments from the VLE back into our SRS – but again, I don’t think we’re alone there. So any meaningful analytics process needs to draw on both of these data sources (as well as a number of others, but that’s for another post).

We also take a snapshot of our VLE activity every night, which Ken has been churning into a datastore (which has been quickly filling up) to see what he can extract. Using Oracle BI systems he has been able to develop a number of dashboards far quicker than I expected. But, and there’s always a but, they are next to meaningless, as the data we are extracting in our snapshot is pretty crude: e.g. we can get the total number of users, but it looks like the total number of users we’ve had on the system since it was installed. It is also not a real-time process. That’s not a huge issue just now, but we know we have the tools to allow real-time reporting and ideally that’s what we are aiming for.

So we are now exploring the tables in our snapshot from the VLE to see if we can get a more useful data extraction, and thinking about how/if we can normalise the data and make more robust connections to/from our primary data source, the student record system. This is also raising a number of wider issues about our data/information management processes. The cycle of data driven conversations is well and truly in motion.
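To make that concrete, here’s a minimal sketch (in Python with pandas; the file, table and column names are all hypothetical, not our actual schema) of the kind of extraction we’re aiming for: counting users who were active within a teaching month, rather than cumulative totals, and joining activity back to the student record system on a shared ID.

```python
import pandas as pd

# Hypothetical extracts: activity rows from the nightly VLE snapshot
# and student records from the SRS. File and column names are
# illustrative only, not our actual schema.
vle_activity = pd.read_csv("vle_activity_snapshot.csv",
                           parse_dates=["event_timestamp"])
srs_students = pd.read_csv("srs_students.csv")

# Count users active within a teaching month, rather than the
# cumulative "everyone who has ever logged in" figure.
november = vle_activity[
    (vle_activity["event_timestamp"] >= "2013-11-01")
    & (vle_activity["event_timestamp"] < "2013-12-01")
]
print("Active users in November:", november["student_id"].nunique())

# Join activity back to the student record system on a shared ID,
# so counts can be broken down by school or programme.
merged = november.merge(srs_students, on="student_id", how="left")
print(merged.groupby("school")["student_id"].nunique())
```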

In terms of learning analytics we are really at the exploratory meso stage just now. We are also getting lots of support from Bb, which is very encouraging. It may well be that using their system will be the most useful and cost effective solution in the long run in terms of learning analytics. However, I don’t think we can make that decision until we really understand ourselves what data we have access to, what we can do with it given our resources, and what we want to do with it. Then I can get back to ranting about big data and thinking about the really important stuff, like just what is learning analytics anyway?

Are we just all data end points?

I’ve had two very contrasting data experiences this week which are both clarifying and confusing my views on data and learning analytics. Firstly there was the LACE (learning analytics community exchange) project webinar titled: Big Picture of Learning Analytics Interoperability. Brian Kelly has written up the event and his blog post contains a link to the recording.

If you think about it, interoperability is key to any kind of data and analytical work. However, as the webinar explained, learning analytics has the added complication of the numerous levels and models it can work in and across. The project team are very keen to engage stakeholders around concepts, but I think they are suffering from a classic chicken and egg scenario just now. They want to engage with the community, but some of the abstract terms do make it difficult for the community (and I include myself here) to engage, so they need real examples. However, I’m not sure right now how I can engage with these large concepts. In my next post, where I’ll update on the work we’re doing here at GCU, it might become clearer. I am very keen to be part of/track this community, so I guess I need to try harder to engage with the higher level concepts.

Anyway, as you’ll know, dear reader, I have been experimenting with visual note taking, so I used the webinar yesterday to do just that. It’s an interesting experience as it does make you listen in a different way. Asking questions is also kind of hard when you are trying to capture the wider conversation. This is my naive view of the webinar.

Visual notes from LACE webinar

In contrast, the University of Edinburgh’s “Digital Scholarship Day of Ideas: Data” had a line-up of speakers looking at data in quite a different way. Luckily for me, and others, the event was live streamed and the recordings will be available over the next few days on the website. Also Nicola Osborne was in attendance and live blogging – well worth a read whilst waiting for the videos to be uploaded.

A common theme for most of the speakers was exploration of the assumption that data is neutral. Being a digital humanities conference that’s hardly surprising, but there were key messages coming through that I wish every wannabe and self-proclaimed “big data guru” could be exposed to and take heed of. Data isn’t neutral, and just because you put “big” in front of it doesn’t change that. It is always filtered, and not always in a good way. I loved how Annette Markham described how advertisers can use data to flatten and equalise human experience, and her point that not all human experiences can be reduced to data end points, however much advertisers selling an increasingly homogenised, consumerist view of the world want them to be.

This resonated in particular with me as I continue to develop my thoughts around learning analytics. I don’t want to (or believe that you can) reduce learning to data end points with a set of algorithms which can “fix” things, i.e. learner behaviour. But at the same time I do believe that we can make more use of the data we do collect to help us understand what is going on, what works, what doesn’t, and to allow us to ask more questions around our learning environments. And by that I mean a holistic view of the learning environment that the individual develops themselves, as much as the physical and digital environments they find themselves in. I don’t want a homogenised education system, but at the same time I want to believe that using data more effectively could allow our heterogeneity to flourish. Or am I just kidding myself? I think I need to have a nice cup of tea and think about this more. In the meantime I’d love to hear any views you may have.


Exploring the digital university – next steps, digital university ecosystems?

Regular readers of this (and my previous) blog will know that exploring the notion of just what a digital university is, c/should be, is an ongoing interest of mine. Over the past couple of years now my colleague Bill Johnston and I have shared our thinking around the development of a model to explore notions of the digital university. The original series of blog posts got very high viewing figures and generated quite a bit of discussion via comments. We’ve developed the posts into a number of conference presentations and papers. But the most exciting and rewarding development was when Keith Smyth from Edinburgh Napier University contacted us about the posts in relation to their strategic thinking and development around their digital future, which in turn will help them figure out what their vision of a digital university will look like.

For the past year Bill and I have been critical friends to Napier’s Digital Futures Working Group. This cross institutional group was tasked with reviewing current practice and areas of activity relating to digital engagement, innovation and digital skills development, and with identifying short term initiatives to build on current practice as well as proposing possible future developments and opportunities. These will be shared by Napier over the coming months. Being part of the Napier initiative has encouraged me to try and develop a similar approach here at GCU.  I’m delighted that we have got senior management backing and later this month we’ll be running a one day consultation event here.

Earlier this week Bill, Keith and I had a catch-up where we spent quite a bit of time reflecting on “our journey” so far. Partly this was because we have another couple of conference paper submissions we want to prepare. Also, as we now have a very rich set of findings from the Napier experience, we needed to think about our next steps. What can we at GCU learn from the Napier consultation experience? What are the next steps for both institutions? What common issues will emerge? What common solutions/decision points will emerge? What are the best ways to share our findings internally and externally?

As we reflected on where we started, we (well, to be precise, Bill) began to sketch out a kind of process map from where we started (which was a number of lengthy conversations in the staff kitchen between Bill and me) to where we might be this time next year, when hopefully we will have a set of actions from GCU.

The diagram below is an attempt to replicate Bill’s diagram and outline the phases we have gone through so far: starting with conversations, which evolved into a series of blog posts, which evolved into conference papers/presentations; the blog posts were spotted by Keith and used as a basis for the development of Napier’s Digital Futures Working Group, which is now being used as an exemplar for work beginning here at GCU.

Stages of the Digital University Conversation

I am more and more convinced that one of the key distinguishing features of a digital university is the ability of staff and students to have a commonly shared articulation and experience of the digitally enabled processes they engage with on a daily basis, and equally a shared understanding of what would be missing if these processes weren’t being digitally enabled. You know, the “digital day of a student/lecturer/admin person” type of thing, but not visions written by “futurologists”; ones written by our staff and students. Alongside this we could have the daily life of the physical spaces that we are using. So for example we could have overlays of buildings showing not only the footfall of people but also where and when they were accessing our wifi networks etc.

Now, I know we can/could do this already (for example we already show access/availability of computers in our labs via our website) and/or make pretty good educated guesses about what is happening in general terms. However, it is becoming easier to get more data and, more importantly, to visualise it in ways that encourage questions around “actionable insights”, not only for our digital spaces and infrastructure but for our physical ones too. Knowing and sharing the institutional digital footprint is again central to the notion of a digital university.

Alongside this, by using learning analytics techniques, can we start to see any correlations around where and why students are online? Can we understand and learn from patterns around access and engagement with learning activities? Are students using our uni-provided spaces and wifi to do the majority of their uni work, or to download “stuff” to listen to/watch/read on the bus? Are they just accessing specialist software/kit? Does it matter if they all have Facebook/YouTube/WhatsApp open all the time if we are confident (through our enhanced data driven insights) that they are successfully engaging with our programmes and that they have the digital literacy skills to connect and collaborate with the right people in the right spaces (both on and offline)?

As we were talking, one word kept coming up. It’s maybe a bit old fashioned, and I know ecosystems were all the rage a few years ago particularly in the repository sphere, but we did think that mapping the ecosystem of a digital university could be the next logical step. The ecosystem wouldn’t just be about the technology, infrastructure and data, but the people and processes too. Via the SoLAR discussion list I discovered the Critical Questions for Big Data article by Danah Boyd and Kate Crawford. As part of their conclusions they write:

“Manovich (2011) writes of three classes of people in the realm of Big Data: ‘those who create data (both consciously and by leaving digital footprints), those who have the means to collect it, and those who have expertise to analyze it’. We know that the last group is the smallest, and the most privileged: they are also the ones who get to determine the rules about how Big Data will be used, and who gets to participate.”

In terms of a digital university, I think we need to be doing our utmost to ensure we are extending membership of that third group, but just now there is a need to raise everyone’s awareness of how and where their data is being collected, and to give them a voice in terms of what they think is the best use of it.

What a digital university will actually look like will probably not differ that much from what a university looks like today; what will distinguish it will be what happens within it, and how everyone in that university interacts and shares through a myriad of digitally enabled processes.

How Sheila’s been seen this week – network visualisations and am I really a techie? (a touch of #lak14)

Like many of my peers, my working life is a bit of a hybrid. Part of my invited speaker session at last year’s ALT-C conference involved me trying to deconstruct what I actually did. Since then I have moved to a job with a more recognisable and commonly understood title, “Senior Lecturer”. However, I don’t actually do much lecturing, so it’s still all a bit complicated. I’m part of the Blended Learning Team within our Learning Enhancement and Academic Development unit. The three of us are technically literate, but I don’t think any of us would identify ourselves as being technical, or indeed techies. So I still find it a bit odd when the rest of our colleagues refer to us as technical. This week I’ve been thinking a lot about identity and networks, and how I am perceived both internally and externally.

Now, I know I am more technically and digitally literate, and crucially more technically confident, than many of my colleagues. Working with Cetis for so long, it would have been kinda hard not to be. But I have always seen myself as fulfilling a bridge or hybrid-type role between the totally IT/technically focused people and those on the user/teaching and learning side of things. I think this is becoming increasingly commonplace, and it needs to be so. As technology becomes easier to use and more embedded into all aspects of our lives, we need to encourage people to have a “let’s have a go” mindset rather than a “let’s ask the techies” one – or in my case, the pseudo techie. Developing that aspect of digital literacy and confidence in our staff and students is, imho, crucial in terms of any institutional ambitions we at GCU (and anywhere else for that matter) may have of becoming a digital university.

That said, I’m not above donning the technical genius hat as I amaze colleagues with my skills and knowledge when they ask “have you heard of animoto?” The hat was firmly removed when, two minutes after I demo’d it, they had rumbled how easy it was to use, and that all those links I sent were actually automagically created in the cloud.

The annual learning analytics conference, LAK14, is taking place this week, and I’ve been dipping in and out of the twitter backchannel over the past couple of days. Thanks to the live blogging genius of Doug Clow and others, I feel like I’ve almost been there in person. One of the sessions on Thursday was looking at networks and network visualisations. These fascinate me, but like many I’m still trying to figure out what they actually mean in terms of learning and teaching. I’ve had some thoughts in relation to my experiences as a learner in MOOCs, but there’s lots more head scratching and experimentation to be done. One of the tools being demo’d was Netlytic,

“a cloud-based text and social networks analyzer that can automatically summarize large volumes of text and discover social networks from online conversations on social media sites such as Twitter, Youtube, blogs, online forums and chats. “

I had a bit of a play and within minutes had an analysis and visualisation of text from the #lak14 hashtag – thanks to twitter it was almost like I was there!

Netlytic text analysis of the #lak14 hashtag

and a visualisation of my twitter network 
Netlytic visualisation of my twitter network

Now, I just need to figure out if this is more useful than Martin Hawksey’s quite brilliant TAGSExplorer . . .
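Out of curiosity about what tools like Netlytic are doing under the hood, here’s a minimal sketch of how a mention network can be built from hashtag conversations. This is my own illustration (in Python with networkx), not Netlytic’s actual method, and the tweets are made up:

```python
import re

import networkx as nx

# Made-up tweets standing in for a hashtag archive. In practice these
# would come from the Twitter API or an export from a tool like TAGS.
tweets = [
    ("alice", "Great keynote! @bob you'd love this #lak14"),
    ("bob", "@alice agreed - see also @carol's paper #lak14"),
    ("carol", "Slides from my #lak14 talk are up now"),
    ("dave", "@carol thanks for sharing! #lak14"),
]

# Build a directed "mention" network: an edge from each tweeter to
# every account they @-mention.
G = nx.DiGraph()
for author, text in tweets:
    for mention in re.findall(r"@(\w+)", text):
        G.add_edge(author, mention)

# Degree centrality gives a rough sense of who sits at the centre
# of the conversation.
for name, score in sorted(nx.degree_centrality(G).items(),
                          key=lambda x: -x[1]):
    print(f"{name}: {score:.2f}")
```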

Getting ready for learning analytics at GCU (not quite #lak14)

This week I’m going to try and keep up with the twitter backchannel from #lak14 in Indianapolis; already it looks like some really interesting and innovative work is being presented. However, back in my world our learning analytics journey is really just beginning.

Over the past couple of weeks I’ve been trying to do some basic investigation, introductions and explorations of learning analytics, initially with colleagues from IT and the Library. We are very much at the who, where, why, when and how stage. So it’s been really useful to look back at the Cetis Analytics Series and also at the presentations from the UK SoLAR Flare events. As ever, the generosity of the community in sharing experiences is invaluable. This presentation from Mark Stubbs at MMU helped to clarify a few things for our IT department in terms of the data sources we need alongside data from the VLE. This slide was particularly useful.

Slide from Mark Stubbs’ presentation

BTW we need another one of those SoLAR events soon . . .

However, we do have access to some data, particularly from our VLE, GCU Learn. Every year we produce a Blended Learning report which gives a snapshot overview of activity in GCU Learn across the University. Getting and cleansing the data is always a bit of a chore, and we are aware that we can only provide a superficial view of activity. I won’t go into the ins and outs of our data access and data gate-keeping issues, but I suspect that you, dear reader, will understand some of our “challenges”.

In broad visual terms we have broken our blended learning activity into four main areas (click on the image to see it in more detail; btw the tools/activities are just samples, not a definitive list for each area).

Blended Learning areas of activity at GCU

We can get data at school level (we have three large academic schools) but not at department or module level. Given the dates of our semesters, annual stats are not much use either, as they include weeks when there is no teaching, which can skew the data. This year we decided to take one month, November 2013, and base the report on that. So although what we have is a very high level overview, there are some clear trends coming through. To quote the Cetis definition of analytics, these trends are indeed giving us some “actionable insights”, not only in terms of blended learning activity but also in terms of our wider IT and support provision.

So, get ready, here are our headline figures:

• 18% decrease in average student accesses to GCULearn via the web
• 420% increase in average student accesses to GCULearn via the mobile app
• 25% increase in number of GCULearn Communities
• 82% increase in use of CampusPack blogs
• 134% increase in use of wikis
• 232% increase in use of journals
• 222% increase in online feedback via GradeMark in Nov 13 compared to Nov 12
• 167% increase in online graded papers in Nov 13 compared to Nov 12
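(For anyone wondering about the arithmetic: these are plain year-on-year percentage changes, comparing November 2013 counts against November 2012. A quick sketch, with made-up numbers rather than our actual counts:)

```python
def percent_change(old: float, new: float) -> float:
    """Percentage change from one period's count to the next."""
    return (new - old) / old * 100

# Illustrative numbers only, not our actual figures: a jump from
# 500 to 2,600 average accesses is a 420% increase.
print(f"{percent_change(500, 2600):+.0f}%")  # -> +420%
```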

We don’t have a mobile or BYOD strategy, and it looks like we might not need one. It’s happening: our users are talking with their mobile devices, and 80% of those devices are iOS. What we need to ensure is that our content is web enabled and that students can interact fully with activities via mobile devices. A “switch on” policy and, probably more importantly, culture for learning and teaching is something we need to work with staff and students to develop. Ubiquitous and stable wifi across the institution is key to this. Improvements to Bb’s mobile app would help too, and we can’t wait for the roll out of their new web enabled design to be in place.

Staff and students are using the more interactive and student-centred functionality of the VLE, such as wikis and journals. And the use of assessment and feedback functionality is increasing dramatically. We estimate that 41% of our modules are making active use of GCU Learn, as opposed to just having a course shell and some powerpoint slides. Now we need to drill down into that school-level data to get more module-level detail on the types of assignments/activities being used, and in tandem develop staff confidence in using, developing and sharing assessment rubrics and their overarching learning designs.
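As a rough illustration of how an estimate like that 41% can be produced (a sketch only; our actual criterion for “active use” is still being refined, and the column names and numbers below are invented), one plausible approach is to flag a module as active when it records any use of interactive or assessment tools beyond content hosting:

```python
import pandas as pd

# Hypothetical per-module usage extract; names and numbers invented.
modules = pd.DataFrame({
    "module_id": ["M1", "M2", "M3", "M4"],
    "content_items": [40, 12, 55, 8],
    "wiki_edits": [120, 0, 3, 0],
    "journal_entries": [60, 0, 0, 0],
    "online_submissions": [200, 0, 90, 0],
})

# A module counts as "active" if it uses any interactive or assessment
# tool, rather than just hosting files and slides.
interactive = ["wiki_edits", "journal_entries", "online_submissions"]
modules["active"] = modules[interactive].sum(axis=1) > 0

print(f"{modules['active'].mean() * 100:.0f}% of modules in active use")
# -> 50% with these made-up numbers
```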

We are only starting to scratch the surface of learning analytics in our context, but the data we are getting is leading us to ask more detailed questions and demand more nuanced data collection and sense making. We are starting to bring people together to have data driven conversations, and to share just exactly where our data is, who has access to it, when they have access to it, what format it is in, and how they access it. We have had initial discussions with Bb about their analytics package; however, we need to have more internal discussions about what we can and want to do ourselves before making any decisions about that. I’m hoping that I’ll be able to share the next part of our journey very soon.