I’ve been thinking about learning analytics and, while some Unis have managed to solve parts of the problem, I think we need to confront the complexity of the whole to explain why it’s so challenging. I break it into five key problems.
- Data. We don’t currently collect enough of it to analyse; what we do collect is of questionable value and isn’t clearly tied to mechanisms; and we have not confronted the spectre of what we do with this data once we have it.
- Mechanisms linking learning and what is produced. The mechanisms are complex. Students could be failing for any number of reasons, not the least of which is crap staff. Trying to work out what has happened by looking at outputs is unlikely to help.
- Focus. Generally, we measure things to evaluate people. This means that students do tests to get marked and, even where we mix this up with formative work, they tend to focus on the things that get them marks. That’s because it’s how we’ve trained them. This focus warps measurement into an enforcement and judgment mechanism, rather than a supportive and constructive mechanism.
- Community. We often mandate or apply analytics as an extension of the evaluation focus above. This means that we don’t have a community who are supported by analytics, we have a community of evaluators and the evaluated. This is what we would usually label as a Panopticon, because of the asymmetrical application of this kind of visibility. And it’s not a great environment for education. Without a strong community, why should staff go to the extra effort to produce the things required to generate more data if they can’t see a need for it? This is a terribly destructive loop as it requires learning analytics to work and be seen as effective before you have the data to make learning analytics work!
- Support. When we actually have the data, understand the mechanism, have the right focus and are linked in to the community, we still need the money, time and other resources to provide remediation, to encourage development, to pay for the technology, to send people to places where they can learn. For students and staff. We just don’t have that.
I think almost all Unis are suffering from the same problems. This is a terribly complex problem and it cannot be solved by technology alone.
It’s certainly not as easy as driving a car. You know that you make the car go faster by pushing on one pedal and you make it go slower by pushing on another. You look at your speedometer. This measures how often your wheels are rotating and, by simple arithmetic, gives you your speed across the road. Now you can work out the speed you want to travel at, taking into account signs, conditions and things like that. Simple. But this simple, everyday action and its outcomes are the result of many, many technological, social and personal systems interacting.
The speedometer in the car is giving you continuously available, and reasonably reliable, data on your performance. You know how to influence that performance through the use of simple and direct controls (mechanism). There exists a culture of driver training, road signage and engineering, and car design that provides you with information that ties your personal performance to external achievement (These are all part of support, focus and community). Finally, there are extrinsic mechanisms that function as checks and balances but, importantly, they are not directly tied to what you are doing in the car, although there are strong causative connections to certain outcomes (And we can see elements of support and community in this as we all want to drive on safe roads, hence state support for this is essential).
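To make the mechanism half of the analogy concrete, here is a rough sketch of the arithmetic a speedometer performs. The wheel diameter is an assumed illustrative figure, not anything from the post:

```python
import math

def speed_kmh(wheel_rpm: float, wheel_diameter_m: float = 0.65) -> float:
    """Approximate road speed from wheel rotation rate.

    The 0.65 m diameter is an illustrative assumption; a real
    speedometer does essentially this calculation continuously.
    """
    circumference_m = math.pi * wheel_diameter_m      # distance covered per rotation
    metres_per_minute = wheel_rpm * circumference_m
    return metres_per_minute * 60 / 1000              # m/min converted to km/h

# A wheel turning roughly 490 times a minute gives about 60 km/h.
```

The point is how direct and continuous the loop is: one measurable input, simple arithmetic, instant feedback.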
We are nowhere near the car scenario with learning analytics right now. We have some measurements of learning in the classroom because we grade assignments and mark exams. But this is not continuous feedback that can be consulted whenever needed, and the mechanisms to cause positive change in these are not necessarily clear and direct. I would argue that most of what we currently do is much closer to police enforcement of speed. We ask students to drive a track and, periodically, we check to see if they’re doing the correct speed. We then, often irrevocably in a grading sense, assign a mark to how well they are driving the track and settle back to measure them again later.
Learning analytics faces huge problems before it reaches this stage. We need vast quantities of data that we are not currently generating. Many University courses lack opportunities to demonstrate prowess early on. Many courses offer only two or three measurements of performance to determine the final grade. This is like trying to guess our speed when the speedo only lights up every three to four weeks, after we have pressed some combination of pedals.
The mechanisms for improvement and performance control in University education are not just murky, they’re opaque. If we identify a problem, what happens? In the case of detecting that we are speeding, most of us will slow down. If the police detect you are speeding, they may stop you or (more likely) issue you a fine, and eventually you’ll use up your licence and have to stop driving. We just give people low marks or fail them. But combine this with the mechanism issues above, and suddenly we have to ask whether we would even be ready to act if we had the analytics.
Let’s say we get all the data and it’s reliable and pedagogically sensible. We work out how to link things together. We build community support and we focus it correctly. You run analytics over your data. After some digging, you discover that 70% of your teaching staff simply don’t know how to do their jobs. And, as far as you can see, have been performing at this standard for 20 years.
What do you do?
Until we are ready to listen to what analytics tell us, until we have had the discussion of how we deal with students (and staff) who may wish to opt out, and until we have looked at this as the monstrous, resource-hungry, incredibly complex problem that it is, we really have to ask if we’re ready to take learning analytics seriously. And, given how much money can be spent on this, it’s probably better to work out if we’re going to listen before we invest money into a solution that won’t work because it cannot work.
EduTECH AU 2015, Day 1, Higher Ed Leaders, “Revolutionising the Student Experience: Thinking Boldly” #edutechau
Posted: June 2, 2015
Lucy Schulz, Deakin University, came to speak about initiatives in place at Deakin, including the IBM Watson initiative, which is currently a world-first for a University. How can a University collaborate to achieve success on a project in a short time? (Lucy thinks that this is the more interesting question. It’s not about the tool, it’s how they got there.)
Some brief facts on Deakin: 50,000 students, 11,000 of whom are on-line. Deakin’s question: how can we make the on-line experience as good as, if not better than, face-to-face, and how can on-line make face-to-face better?
Part of Deakin’s Student Experience focus was on delighting the student. I really like this. I made a comment recently that our learning technology design should be “Everything we do is valuable” and I realise now I should have added “and delightful!” The second part of the student strategy is for Deakin to be at the digital frontier, pushing on the leading edge. This includes understanding the drivers of change in the digital sphere: cultural, technological and social.
(An aside: I’m not a big fan of the term disruption. Disruption makes room for something but I’d rather talk about the something than the clearing. Personal bug, feel free to ignore.)
The Deakin Student Journey has a vision to bring students into the centre of Uni thinking, at every level and facet – students can be successful and feel supported in everything that they do at Deakin. There is a Deakin personality, an aspirational set of “Brave, Stylish, Accessible, Inspiring and Savvy”.
Not feeling this as much but it’s hard to get a feel for something like this in 30 seconds so moving on.
What do students want in their learning? Easy to find and to use, it works and it’s personalised.
So, on to IBM’s Watson, the machine that won Jeopardy, thus reducing the set of games that humans can win against machines to Thumb Wars and Go. We then saw a video on Watson featuring a lot of keen students who coincidentally had a lot of nice things to say about Deakin and Watson. (Remember, I warned you earlier, I have a bit of a thing about shiny videos but ignore me, I’m a curmudgeon.)
The Watson software is embedded in a student portal that all students can access, which has required a great deal of investigation into how students communicate, structurally and semantically. This forms the questions and guides the answer. I was waiting to see how Watson was being used and it appears to be acting as a student advisor to improve student experience. (Need to look into this more once day is over.)
Ah, yes, it’s on a student home page where they can ask Watson questions about things of importance to students. It doesn’t appear that they are actually programming the underlying system. (I’m a Computer Scientist in a faculty of Engineering, I always want to get my hands metaphorically dirty, or as dirty as you can get with 0s and 1s.) From looking at the demoed screens, one of the shiny student descriptions of Watson as “Siri plus Google” looks very apt.
Oh, it has cheekiness built in. How delightful. (I have a boundless capacity for whimsy and play but an inbuilt resistance to forced humour and mugging, which is regrettably all that the machines are capable of at the moment. I should confess Siri also rubs me the wrong way when it tries to be funny as I have a good memory and the patterns are obvious after a while. I grew up making ELIZA say stupid things – don’t judge me! 🙂 )
Watson has answered 26,000 questions since February, with an 80% accuracy for answers. The most common questions change according to time of semester, which is a nice confirmation of existing data. Watson is still being trained, with two more releases planned for this year and then another project launched around course and career advisors.
What they’ve learned – three things!
- Student voice is essential and you have to understand it.
- Have to take advantage of collaboration and interdependencies with other Deakin initiatives.
- Gained a new perspective on developing and publishing content for students. Short. Clear. Concise.
The challenges of revolution? (Oh, they’re always there.) Trying to prevent students falling through the cracks and making sure that this tool helps students feel valued and stay in contact. The introduction of new technologies has to be recognised in terms of what it changes and what it improves.
Collaboration and engagement with your University and student community are essential!
Thanks for a great talk, Lucy. Be interesting to see what happens with Watson in the next generations.
EduTECH AU 2015, Day 1, Higher Ed Leaders, Panel Discussion “Leveraging data for strategic advantage” #edutechau
Posted: June 2, 2015
A most distinguished panel today. It can be hard to capture panel discussions so I will do what I can to get the pertinent points down. However, the fact that we are having this panel gives you some indication of the importance of this issue. Getting to know your data will make it easier for you to work out what to do in the future.
University of Wollongong (UoW) have set up a University-wide approach to Learning Analytics, with 30 courses in an early adopter program, scaling up over the next two years. Five things that they have learned:
- You need to have a very clear strategic approach for learning analytics. Learning analytics are built into key strategies. This ties in the key governing bodies and gives you the resources.
- Learning analytics need to be tied into IT and data management strategies – separating infrastructure and academics won’t work.
- The only driver for UoW is the academic driver, not data and not technology. All decisions are academic. “What is the value that this adds to maximise student learning, provide personalised learning and give early identification of students at risk?”
- Governance is essential. UoW have a two-tier structure, a strategic group and an ethical use of data group. Both essential but separate.
- With data, and learning analytics, comes a responsibility for action. Actions by whom and, then, what action? What are the roles of the student, staff and support services? Once you have seen a problem that requires intervention, you are obliged to act.
I totally agree with this. I have had similar arguments on the importance of point 5.
The next speaker is from University of Melbourne (UoM), who wanted to discuss a high-level conceptual model. At the top of the model is the term ‘success’, a term that is not really understood or widely used, at national or local level. He introduced the term ‘education analytics’, where we look at the overall identity of the student and interactions with the institution. We’re not having great conversations with students through written surveys, so analytics can provide this information (a controversial approach). UoM want a new way, a decent way, to understand the student, rather than taking a simplistic approach. I think he mentioned intersectionality but not in a way that I really understood it.
Most of what determines student success in Australia isn’t academic, it’s personal, and we have to understand that. We also can’t depend on governments to move this, it will have to come out of the universities.
The next speaker is from University of Sydney, who had four points he wanted to make.
He started by talking about the potential of data. Data is there but it’s time to leverage it. Why are institutions not adopting LA as fast as they could? We understand the importance of data-backed decision making.
Working with LA requires a very broad slice across the University – IT, BI, Academics, all could own it and they all want to control it. We want to collaborate so we need clear guidance and clear governance. Need to identify who is doing what without letting any one area steal it.
Over the last years, we have forgotten about the proximity of data. It’s all around us but many people think it’s not accessible. How do we get our hands on all of this data to make information-backed decisions in the right timeframe? This proximity applies to students as well, they should be able to see what’s going on as a day-by-day activity.
The final panellist is from Curtin University. Analytics have to be embedded into daily life and available with little effort if they’re going to be effective. At Curtin, analytics have a role in all places in the Uni, library, learning, life-long learning, you name it. Data has to be unified and available on demand. What do users want?
Curtin focused on creating demand – can they now meet that demand with training and staffing, to move to the next phase of attraction?
Need to be in a position of assisting everyone. This is a new world, so we have to be ready to help people quite a lot in the earlier stages. Is Higher Ed ready for the type of change that Amazon caused in the book market? Higher Ed can still have a role as validator of education but we have to learn to work with new approaches before our old market is torn out from underneath us.
We need to disentangle what the learner does from what the machine does.
That finished off the initial panel statements and then the chair moved to ask questions to the panel. I’ll try and summarise that.
One question was about the issue of security and privacy of student information. Can we take data that we used to help a student to complete their studies and then use that to try and recruit a new student, even anonymised? UoW mentioned having a separate data ethics group for exactly this reason. UoW started this with a student survey, one question of which was “do you feel like this is Big Brother?”. Fortunately, most felt that it wasn’t, but they wanted to know what was going to happen with the data, and the underlying driver had to be to help them to succeed.
Issuing a clear policy and embracing transparency is crucial here.
UoM made the point that much work is not built on a strong theoretical basis and a great deal of it is measuring what we already think we care about. There is a lot of value in clearly identifying what works and what doesn’t.
That’s about it for this session. Again, so much to think about.
I like words a lot but I also love words that introduce me to whole new ways of thinking. I remember first learning the word synecdoche (most usually pronounced si-NEK-de-kee), where you use the word for part of something to refer to that something as a whole (or the other way around). Calling a car ‘wheels’ or champagne ‘bubbles’ are good examples of this. It’s generally interesting which parts people pick for synecdoche, because it emphasises what is important about something. Cars have many parts but we refer to them in parts as wheels and motor. I could bore you to tears with the components of champagne but we talk about the bubbles. In these cases, placing emphasis upon one part does not diminish the physical necessity of the remaining components in the object, but it does tell us what the defining aspect of each of them is often considered to be.
There are many ways to extract a defining characteristic and, rather than selecting an individual aspect for relatively simple structures (and it is terrifying that a car is simple in this discussion), we use descriptive statistics to allow us to summarise large volumes of data to produce measures such as mean, variance and other useful things. In this case, the characteristic we obtain is not actually part of the data that we’re looking at. This is no longer synecdoche, this is statistics, and while we can use these measures to arrive at an understanding (and potentially move to the amazing world of inferential statistics), we run the risk of talking about groups and their measurements as if the measurements had as much importance as the members of the group.
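As a tiny illustration of that gap between the measure and the members, here is the point in code. The marks are invented for the example:

```python
import statistics

# Hypothetical class marks, invented purely to illustrate the point.
marks = [45, 52, 58, 61, 64, 67, 71, 74, 78, 85]

mean = statistics.mean(marks)        # central tendency of the group
var = statistics.pvariance(marks)    # population variance: spread around the mean

# The mean is 65.5, yet no student in the list actually scored 65.5.
# The summary statistic is a property of the group, not of any member.
print(mean, var)
```

The mean describes the class while belonging to no one in it, which is exactly why it is statistics rather than synecdoche.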
I have been looking a lot at learning analytics recently and George Siemens makes a very useful distinction between learning analytics, academic analytics and data mining. When we analyse the various data and measures that come out of learning, we want to use this to inform human decision making to improve the learning environment, the quality of teaching and the student experience. When we look at the performance of the academy, we worry about things like overall pass rates, recruitment from external bodies and where our students go on to in their careers. Again, however, this is to assist humans in making better decisions. Finally, and not pejoratively but distinctly, data mining delves deep into everything that we have collected, looking for useful correlations that may or may not translate into human decision making. By separating our analysis of the teaching environment from our analysis of the academic support environment, we can focus on the key aspects in the specific area rather than doing strange things that try to drag change across two disparate areas.
When we start analysis, we start to see a lot of numbers: acceptable failure rates, predicted pass rates, retention figures, ATARs, GPAs. The reason that I talk about data analytics as a guide to human decision making is that the human factor reminds us to focus on the students who are part of the figures. It’s no secret that I’m opposed to curve grading because it uses a clear statement of numbers (70% of students will pass) to hide the fact that a group of real students could fail because they didn’t perform at the same level as their peers in the same class. I know more than enough about the ways that a student’s performance can be negatively affected by upbringing and prior education to know that this is not just weak sauce, but a poisonous and vicious broth to be serving to students under the guise of education.
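To make that objection concrete, here is a minimal sketch of curve grading as I’m describing it (my own construction for illustration, not any institution’s actual policy): pass exactly the top 70% of the class, no matter how well everyone performed in absolute terms.

```python
def curve_grade(marks, pass_fraction=0.7):
    """Return a pass/fail flag per mark, passing roughly the top
    pass_fraction of the class regardless of absolute performance.

    This is a simplified illustration of curve grading, not a real
    grading policy from any university.
    """
    ranked = sorted(marks, reverse=True)
    cutoff_index = int(len(ranked) * pass_fraction)
    cutoff = ranked[cutoff_index - 1]          # lowest mark that still passes
    return [m >= cutoff for m in marks]

# Every student here has scored 79 or better...
strong_class = [88, 85, 91, 79, 83, 90, 86, 80, 92, 84]
results = curve_grade(strong_class)
# ...yet the curve still fails three of them, purely by rank.
print(sum(results))  # 7 of 10 pass
```

The fixed pass rate looks like a clean, defensible number, but it manufactures failure out of relative position rather than measuring whether anyone actually learned the material.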
I can completely understand that some employers want to employ people who are able to assimilate information quickly and put it into practice. However, let’s be honest, an ability to excel at University is not necessarily an indication of that. They might coincide, certainly, but it’s no guarantee. When I applied for Officer Training in the Army, they presented me with a speed and accuracy test, as part of the battery of psychological tests, to see if I could do decision making accurately at speed while under no more stress than sitting in a little room being tested. Later on, I was tested during field training, over and over again, to see what would happen. The reason? The Army knows that the skills they need in certain areas need specific testing.
Do you want detailed knowledge? Well, the numbers conspire again to undermine you because a focus on numerical grade measures to arrive at a single characteristic value for a student’s performance (GPA) makes students focus on getting high marks rather than learning. The GPA is not the same as the wheels of the car – it has no relationship to the applicable ability of the student to arbitrary tasks nor, if I may wax poetic, does it give you a sense of the soul of the student.
We have some very exciting tools at our disposal and, with careful thought and the right attitude, there is no doubt that analytics will become a valuable way to develop learning environments, improve our academies and find new ways to do things well. But we have to remember that these aggregate measures are not people, that “10% of students” represents real, living human beings who need to be counted, and that we have a long way to go before we have an analytical approach that has a fraction of the strength of synecdoche.