John Henry Died
Posted: December 23, 2012 Filed under: Education

Every culture has its myths and legends, especially surrounding those incredible individuals who stand out or tower over the rest of society. The Ancient Greeks and Romans had their gods, demigods, heroes and, many times, cautionary tales of the mortals who got caught in the middle. Australia has the stories of pre- and post-federation mateship, often anti-authoritarian or highlighting the role of the larrikin. We have a lot of bushrangers (with suspiciously good hearts or reacting against terrible police oppression), Simpson and his donkey (a First World War hero who transported men to an aid station using his donkey, ultimately dying on the battlefield) and a Prime Minister who goes on YouTube to announce that she’s now convinced that the Mayans were right and we’re all doomed – tongue firmly in cheek. Is this the totality of the real Australia? No, but the stylised notion of ‘mateship’, the gentle knock and the “come off the grass, you officious … person” attitude are as much a part of how many Australians see themselves as shrimp on a barbie is to many US folk looking at us. In any Australian war story, you are probably more likely to hear about the terrible hangover Gunner Suggs had and how he dragged his friend a kilometre over rough stones to keep him safe than you are to hear about how many people he killed. (I note that this mateship is often strongly delineated along gender and racial lines, but it’s still a big part of the Australian story.)
The stories that we tell, and those that we pass on, strongly shape our culture. Look at Greek mythology and you see stern warnings against hubris: don’t rate yourself too highly or the gods will cut you down. Set yourself up too high in Australian culture and you’re going to get knocked down as well: a ‘tall poppy’ syndrome that is part cultural cringe, inherited from colonial attitudes to the Antipodes, part hubris and part cultural confusion as Anglo, Euro, Asian, African and… well, everyone, come to terms with a country that newcomers took quite a while to adapt to, a country whose original inhabitants are the Australian Aboriginal and Torres Strait Islander peoples. As someone who wasn’t born in Australia, like so many others who live here and now call themselves Australian, I’ve spent a long time looking at my adopted homeland’s stories to see how to fit in. Along the way, because of travel, I’ve had the opportunity to look at other cultures as well: the UK, obviously, as it’s drummed into you at school, and the US, because it interests me.
The stories of Horatio Alger, from the US, fascinate me because of their repeated telling of the rags-to-riches story. While most of Alger’s protagonists never become amazingly wealthy, they rise, through their own merits, to take the opportunities presented to them: a good man will always rise. This is, fundamentally, the American Dream – that any person can become President, effectively, through the skills that they have and through rolling up their sleeves. We see this Dream become ugly when any of three principles no longer holds, in a framing I first read from Professor Harlon Dalton:
- The notion that we are judged solely on our merits: For this to be true, we must not have any bias, whether racist, gendered, religious, ageist or other. Given the recent ruling that an attractive person can be sacked purely for being attractive, and for providing an irresistible attraction for their boss, we have evidence that not only is this point not holding in many places, it’s failing in ways that beggar belief.
- We will each have a fair opportunity to develop these merits: This assumes equal opportunity in education and in jobs, which promptly ignores things like school districts, differing property tax levels and teacher training approaches. Because of the way that school districts work, simply living in a given state or country because your parents live there (and can’t move) can make the difference between a great education and a sub-standard child-minding service. So this doesn’t hold either.
- Merit will out: Look around. Is the best, smartest, most talented person running your organisation or making up all of the key positions? Can you locate anyone in the “important people above me” who is holding that job for reasons other than true, relevant merit?
Australia’s myths are beneficial in some ways and destructive in others. For my students, the notion that we help each other, we question but we try to get things done is a positive interpretation of the mild anti-authoritarian mateship focus. The downside is drinking buddies going on a rampage and covering up for each other, fighting the police when the police are actually acting reasonably and public vandalism because of a desire to act up. The mateship myth hides a lot of racism, especially towards our indigenous community, and we can probably salvage a notion of community and collaboration from mateship, while losing some of the ugly and dumb things.
Horatio Alger myths would give hope, except for the bleak reality that many people face: three giant pieces of baloney that people get hit about the head with. If you’re not succeeding, then Horatio Alger reasoning lets us call you lazy, or stupid, or just not taking the opportunities. You’re not trying to pull yourself up by your bootstraps hard enough. Worse still, trying to live up to this sometimes impossible guideline leads us into John Henryism. John Henry was a steel driver, who hammered and chiselled the rock through the mountains to build tunnels for the railroad. One day the boss brought in a steam-driven hammer, and John Henry bet that he could beat it, to show that he and his crew should not be replaced. After a mammoth battle between man and machine, John Henry won, only to die with the hammer in his hand.
Let me recap: John Henry died – and the boss still got a full day’s work that was equal to two steam-hammers. (One of my objections to “It’s a Wonderful Life” is that the rich man gets away with stealing the money – that’s not a fairy tale, it’s a nightmare!) John Henryism occurs when people work so hard to lift themselves up by their bootstraps that they nearly (or do) kill themselves. Men in their 50s with incredibly high blood pressure, ulcers and arthritis know what I’m talking about here. The mantra of the John Henryist is:
“When things don’t go the way that I want them to, that just makes me work even harder.”
There’s nothing intrinsically wrong with this when your goal is actually achievable and you apply this maxim in moderation. At its extreme, and for those people who have people standing on their boot caps, this is a recipe to achieve a great deal for whoever is benefiting from your labour.
And then dying.
As John Henry observes in the ballad (in the Springsteen version), “I’ll hammer my fool self to death”. The ballad of John Henry is actually a cautionary tale about setting your pace carefully: if you’re going to swing a hammer all day, every day, then you have to do it at a pace that won’t kill you. This is the natural constraint on Horatio Alger, and it balances all of the issues with merit and access to opportunity: don’t kill your “fool self” striving for something that you can’t achieve. It’s a shame, however, that the stories line up like this, because there’s a lot of hopelessness sitting in that junction.
Dealing with students always makes me think very carefully about the stories I tell and the stories I live. Over the next few days, I hope to put together some thoughts on a 21st century myth form that inspires without demanding this level of sacrifice, and that encourages without forcing people into despair if existing obstacles block them – and it’s beyond their current control to shift. However, on that last point, what I’d really like to come up with is a story that encourages people to talk about obstacles and then work together to lift them out of the way. I do like a challenge, after all. 🙂
Vitamin Ed: Can It Be Extracted?
Posted: December 22, 2012 Filed under: Education

There are a couple of ways to enjoy a healthy, balanced diet. The first is to actually eat a healthy, balanced diet made up from fresh produce across the range of sources, which requires you to prepare and cook foods, often changing how you eat depending on the season to maximise the benefit. The second is to eat whatever you dang well like and then use an array of supplements, vitamins, treatments and snake oil to try and beat your diet of monster burgers and gorilla dogs into something that will not kill you in 20 years. If you’ve ever bothered to look on the side of those supplements, vitamins, minerals or whatever that most people have in their ‘medicine’ cabinets, you might see statements like “does not substitute for a balanced diet” or nice disclaimers like that. There is, of course, a reason for that. While we can be fairly certain about a range of deficiency disorders in humans, and we can prevent these problems with selective replacement, many other conditions are not as clear cut – if you eat a range of produce which contains the things that we know we need, you’re probably getting a slew of things that we also need but which don’t make themselves as prominent.
In terms of our diet, while the debate rages about precisely which diet humans should be eating, we can have a fairly good stab at a sound basis, from a dietician’s perspective, built out of actual food. Recreating that from raw sugars, protein, vitamin and mineral supplements is technically possible but (a) much harder to manage and (b) nowhere near as satisfying as eating the real food, in most cases. Let’s not forget that very few of us in the western world are so distant from our food that we regard it purely as fuel, with no regard for its presentation, flavour or appeal. In fact, most of us could muster a grimace at the thought of someone telling us to eat something because it was good for us, or for some real or imagined medical benefit. In terms of human nutrition, we have the known components that we have to eat (sugars, proteins, fats…) and we can identify specific vitamins and minerals that we need to balance to enjoy good health, yet there is no shortage of additional supplements, with little or no demonstrated benefit, that we still take out of concern for our health.
There’s been a lot of work done in trying to establish an evidence base for medical supplements and far more of the supplements fail than pass this test. Willow bark, an old remedy for pain relief, has been found to have a reliable effect because it has a chemical basis for working – evidence demonstrated that and now we have aspirin. Homeopathic memory water? There’s no reliable evidence for this working. Does this mean it won’t work? Well, here we get into the placebo effect and this is where things get really complicated because we now have the notion that we have a set of replacements that will work for our diet or health because they contain useful chemicals, and a set of solutions that work because we believe in them.
When we look at education, where it’s successful, we see a lot of techniques being mixed in together in a ‘natural’ diet of knowledge construction and learning. Face-to-face and teamwork, sitting side-by-side with formative and summative assessment, as part of discussions or ongoing dialogues, whether physical or on-line. Exactly which parts of these constitute the “balanced” educational diet? We already know that a lecture, by itself, is not a complete educational experience, in the same way that a stand-alone multiple-choice question test will not make you a scholar. There is a great deal of work being done to establish an evidence basis for exactly which bits work but, as MIT said in the OCW release, these components do not make up a course. In dietary terms, it might be raw fuel but is it a desirable meal? Not yet, most likely.
Now let’s get into the placebo side of the equation, where students may react positively to something just because it’s a change, not because it’s necessarily a good change. We can control for these effects, if we’re cautious, and we can do it with full knowledge of the students, but I’m very wary of any dependency upon the placebo effect, especially when it’s prefaced with “and the students loved it”. Sorry, students, but I don’t only (or even predominantly) care if you loved it; I care if you performed significantly better, attended more, engaged more, retained the information for longer and achieved more. All of these things can only be measured when we take the trouble to establish baselines, construct experiments, measure things, analyse with care and then think about the outcomes.
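To make that concrete, here is a minimal sketch of the kind of baseline comparison I mean, with entirely hypothetical cohort scores (the numbers, the cohorts and the helper function are mine for illustration, not drawn from any real study). It compares a baseline cohort against a post-change cohort using a simple standardised effect size (Cohen’s d), which is one of the more common ways to ask “did the change matter?” rather than “did they love it?”:

```python
import statistics

def cohens_d(baseline, treatment):
    """Effect size between two cohorts: the difference in means
    divided by the pooled sample standard deviation."""
    n1, n2 = len(baseline), len(treatment)
    s1 = statistics.stdev(baseline)   # sample (n-1) standard deviation
    s2 = statistics.stdev(treatment)
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(baseline)) / pooled

# Hypothetical exam scores: last year's cohort (baseline) versus
# this year's cohort after a teaching change (treatment).
baseline = [62, 58, 71, 65, 60, 68, 63, 59]
treatment = [66, 64, 75, 70, 61, 72, 69, 65]

d = cohens_d(baseline, treatment)
print(f"effect size d = {d:.2f}")  # → effect size d = 0.99
```

Of course, a real evaluation would also need a significance test, controls for cohort differences, and measures of attendance and retention, not just one number; the point is simply that the comparison is made against a measured baseline, not against applause.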
My major concern about the whole MOOC discussion is not whether MOOCs are good or bad, it’s more to do with:
- What does everyone mean when they say MOOC? (Because there’s variation in what people identify as the components)
- Are we building a balanced diet or are we constructing a sustenance program with carefully balanced supplements that might miss something we don’t yet value?
- Have we extracted the essential Vitamin Ed from the ‘real’ experience?
- Can we synthesise Vitamin Ed outside of the ‘real’ educational experience?
I’ve been searching for a terminological separation that allows me to separate ‘real’/’conventional’ learning experiences from ‘virtual’/’new generation’/’MOOC’ experiences and none of those distinctions are satisfying – one says “Restaurant meal” and the other says “Army ration pack” to me, emphasising the separation. Worse, my fear is that a lot of people don’t regard MOOC as ever really having Vitamin Ed inside, as the MIT President clearly believed back in 2001.
I suspect that my search for Vitamin Ed starts from a flawed basis, because it assumes a single silver bullet if we take a literal meaning of the term, so let me spread the concept out a bit and label Vitamin Ed as the essential educational components that define a good learning and teaching experience. Calling it Vitamin Ed gives me a flag to wave and an analogue to use, to explain why we should be seeking a balanced diet for all of our students, rather than a banquet for one and dog food for the other.
“We are not providing an MIT education on the web…”
Posted: December 21, 2012 Filed under: Education

I’ve been re-watching some older announcements that describe open courseware initiatives, starting from one of the biggest: the MIT announcement of their OpenCourseWare (OCW) initiative in April, 2001. The title of this post actually comes from the video, around the 5:20 mark. (Video quoted under a CC-BY-NC-SA licence; more information available at: http://ocw.mit.edu/terms)
“Let me be very clear, we are not providing an MIT education on the Web. We are, however, providing core materials that are the infrastructure that undergirds that information. Real education, in our view, involves interaction between people. It’s the interaction between faculty and students, in our classrooms and our living group, in our laboratories that are the heart, the real essence, of an MIT education. “
While the OCW was going to be produced and used on campus, the development of OCW was seen as something that would make more time available for student interaction, not less. President Vest then goes on to confidently predict that OCW will not make any difference to enrolment, which is hardly surprising given that he has categorically excluded anyone from achieving an MIT education unless they enrol. We see here exactly the same discussion that keeps coming up: these materials can be used as augmenting materials in these conventional universities but can never, in the view of the President or Vice Chancellor, replace the actual experience of obtaining a degree from that institution.
Now, don’t get me wrong. I still think that the OCW initiative was excellent, generous and visionary, but we are still looking at two fundamentally different use cases: the use of OCW to augment an existing experience and the use of OCW to bootstrap a completely new experience, which is not of the same order. It’s a discussion that we keep having: what happens to my Uni if I use EdX courses from another institution? Well, OK, let’s ask that question differently. I will look at this from two sides, using the introduction of a new skill and knowledge area that becomes ubiquitous; in my sphere, that’s Computer Science and programming. Let’s look at this in terms of growth and success.
What happens if schools start teaching programming to first year level?
Let’s say that we get programming into every single national curriculum for secondary school and we can guarantee that students come in knowing how to program to freshman level. There are two ways of looking at this. The first, which we have probably all seen to some degree, is to regard the school teaching as inferior and re-teach it; the net result will be bored students, low engagement and wasted time. The second, far more productive, approach is to say “Great! You can program. Now let’s do some Computer Science.” and use that extra year or so to increase our discipline knowledge, or to put breadth courses back in so that our students come out a little more well-rounded. What’s the difference between students learning it at school before they come to us, or through an EdX course on fundamental programming after they come to us?
Not much, really, as long as we make sure that the course meets our requirements – and, in fact, it gives us bricks-and-mortar-bound entities more time to do all that face-to-face interactive University stuff that we know students love and from which they derive great benefit. University stops being semi-vocational in some aspects and we leap into knowledge construction, idea generation, big projects and the grand dreams that we always talk about, yet often don’t get to because we have to train people in basic programming, drafting, and so on. Do we give them course credit? No, because they’re assumed knowledge, or barrier tested, and they’re not necessarily part of our structure anymore.
What happens if no-one wants to take my course anymore?
Now, we know that we can change our courses, because we’ve done it so many times before over the history of the Academy: Latin (with Greek, the language of scholarship) was used in only half of the University publications of 1800. Let me wander through a classical garden for a moment to discuss the nature of change from a different angle, that of decline. Languages had a special place in the degrees of my University, with Latin and Greek dominating, and then, from 1938, the daring possibility of allowing the substitution of French or German for Latin or Greek. It was as recently as 1958 that Latin stopped being compulsory for high school graduation in Adelaide, although it was still required for the study of Law; student demand for Latin at school therefore plummeted and Latin courses started being dropped from the school curriculum. The Law Latin requirement was removed around 1969-1970, which dropped demand for Latin even further. The reduction in the number of school teachers who could teach Latin required the introduction of courses at the University for students who had studied no Latin at all: Latin IA entered the syllabus. However, given that in 2007 only one student at all of the schools across the state of South Australia (roughly 1.2-1.4 million people) studied Latin in the final year of school, it is apparent that if this University wishes to teach Latin, it has to start by teaching all of Latin. This is a course, and a discipline, that is currently in decline. My fear is that, one day, someone will make the mistake of thinking that we no longer need scholars of this language. And that worries me, because I don’t know what people 30 years from now will actually want, or what they could add to the knowledge that we already have of one of our most influential civilisations.
This decline is not unique to Latin (or Greek, or classics in general). A truly on-line course experience would allow us to pool those scholars we have left and offer scaled resources for much longer than isolated pockets in real offices can manage; but, as President Vest notes, a storehouse of Latin texts does not a course make. What reduced the demand for Latin? Possibly the ubiquity of the Latin-derived language that we use, combined with a change of focus away from a classical education towards a more job- and achievement-oriented (semi-vocational) style of education. If you ask me, programming could as easily go this way in about 20 years, once we have ways to let machines solve problems for us. A move towards a less go-go-go culture, smarter machines and a resurgence of the long leisure cycles associated with Science Fiction visions of the future, and suddenly it is the engineers and the computer scientists who are looking at shrinking departments and no support in the schools. Let me be blunt: course popularity and desirability rises, stabilises and falls, and it’s very hard to tell if we are looking at a parabola or a pendulum. With that in mind, we should be very careful about how we define our traditions and our conventions, especially as our cunning tools for supporting on-line learning and teaching get better and better. Yes, interaction is an essential part of a good education, no argument at all, but there is an implicit assumption of the critical mass needed to support this interaction in a face-to-face environment, and that critical mass is as much a function of popularity and traditionally-associated prestige as it is of excellence.
What are MIT doing now?
I look at the original OCW release and I agree that, at the time of production, you could not reproduce the interaction between people that would give you an MIT education. But our tools are better now. They are, quite probably, not yet close enough to give you an “MIT of the Internet”, but should this be our goal? Not the production of a facsimile of the core materials that might, with MIT instructors, turn into a course, but the commitment to developing the tools that actually reproduce the successful components of the learning experience, with group and personal interaction, allowing the formation of what we used to call a physical interactive experience in a virtual space. That’s where I think the new MIT initiatives are showing us how these things can work now, starting from their original idealistic roots and adding the technology of the 21st Century. I hope that other, equally prestigious, institutions are watching this carefully.
Two Tier: Already Here
Posted: December 20, 2012 Filed under: Education

I was reading a Chronicle of Higher Ed article, “For Whom Is College Being Reinvented?”, and it was sobering reading. While I was writing yesterday about Oxford and Cambridge wanting to maintain their conventional University stance, Robert Archibald, an Economics Professor from the College of William and Mary, points out that the two-tier system is already here, in terms of good conventional and bad conventional, so we would see an even larger disparity between luxury and economy courses. Getting into the “good” colleges will be a matter of money and prior preparation, much as it is in many areas where the choice of school available to parents is increasingly driving residential moves in the early years of a child’s life. But it doesn’t end there, because the ‘quality’ measure may be as much about the employability of the students after they’ve completed their studies. As the article says, we now have to start thinking about whether a “low-level” degree is preferable to an “industry recognised” apprenticeship or trade training program. Now our two tiers are as separate as radiographer and radiologist but, as Robert Reich also observes in the same article, this is completely against what we should be doing: how can we do all this and maintain real equality between degrees and programs?
Of course, if you didn’t go to a great elementary and senior school, then you are on the path to the ‘second-tier’ school, which might be one that naturally migrates to fully electronic delivery for a number of perfectly reasonable economic reasons, and you are probably someone who needs a more customised experience than a ‘boilerplate’ MOOC can offer: you actually need face-to-face. When we talk about disruption of the existing college system, we always assume that this is a positive thing, something that will lead to a better result for our students, so these potential issues with where the new technologies get focused start to become very important.
For whom will these new systems work? Everyone, or just the people to whom we’re happy to expose them?
It’s perhaps the best question we have to frame the discussion. It’s not about whether the technology works; we know that it works well for certain things, and it’s now a matter of making sure that our pedagogical systems are correctly married to our computer systems to make the educational experience work. But, obviously, and as many much better writers than I have been saying, it has to work and be at least as good as the systems that it’s replacing. Only now we realise that existing systems are not the same for everyone, and that one person’s working system is someone else’s diabolically bad teaching experience. So the entire discussion about whether MOOCs work now has to be framed in the context of ‘compared to what?’
It’s an interesting article that poses more questions than it answers, but it’s certainly part of the overall area we have to think about.
Legitimisation and Agency: I Believe That’s My Ox on Your Bridge
Posted: December 19, 2012 Filed under: Education

There’s an infamous newspaper advertisement that never ran, which reflected the entry of IBM into the minicomputer market. A number of companies, Data General principal among them, but including such (historically) powerful players as Digital Equipment Corporation, Prime and Hewlett-Packard, were quite successful in the minicomputer market, growing rapidly and stealing market share from IBM’s mainframe business. (For an excellent account of these times, I recommend “The Soul of a New Machine” by Tracy Kidder.) IBM finally decided to enter the minicomputer market and, as analysts remarked at the time, IBM’s move into minicomputers legitimised the market.
Ed DeCastro, CEO of Data General, had a full-page newspaper advertisement prepared, which I reproduce (mildly bowdlerised to keep my all-ages posting status):
“They Say IBM’s Entry Into the Minicomputer Market Will Legitimize the Industry. The B***ards Say, Welcome.”
The ad never actually ran but was framed and put on Ed’s wall. The point, however, was well and precisely made: IBM’s approval was neither required nor desired, and nobody had set a goal of being legitimised.
Over on Mark’s blog, we see that a large number of UK universities are banding together to launch an on-line project, including the highly successful existing player in the analogous space, the Open University, but also some high power players such as Southampton and the disturbingly successful St Andrews. As Mark notes in the title, this is a serious change in terms of allying a UK effort that will produce a competitor (or competitors) to the existing US dominance. As Mark also notes:
Hmm — OxBridge isn’t throwing hats into the rings yet.
And this is a very thoughtful Hmm, because the Universities of Oxford and Cambridge are the impossible-to-ignore legitimising agencies, because of their sheer weight on the rubber sheet of UK Academy Spacetime. When it comes to talking about groups of Universities in the UK, and believe me there are quite a few, the Russell Group’s 24 Universities award the lion’s share of PhDs and hold 78% of the most highly graded research staff. One of its stated goals is to lead the research efforts of the UK; another is to attract the best staff and students to its member institutions. However, the group of participants in the new on-line project involves both Russell Group Universities and those outside it, which makes the non-participation of Oxford and Cambridge even more interesting. How can a trans-group on-line proposal bring the best students in? Or is this why we aren’t seeing involvement from Oxbridge, because of the two-tier perception between traditional and on-line? One can easily argue that Oxford and Cambridge have no need to participate because they are so entrenched in their roles and their success that, as I’ve noted in a different post, any ranking system that rates them out of, say, the top 5 in the UK has made itself suspect as a ranking, rather than providing a reflection of dropping quality. Oxbridge is at the heart of the UK’s tertiary system, and competition to gain entry will continue to be fierce for the foreseeable future. They have no need to get together with the others in their group or beyond, although it’s not a matter of protecting themselves from competitors: they are not really in competition with most of the other Russell Group members, because they are Oxford and Cambridge.
It’s worth noting that Cambridge’s vice-chancellor Leszek Borysiewicz did think that this consortium was exciting, and I quote from the THE article:
“Online education is becoming an important approach which may open substantial opportunities to those without access to conventional universities,” he said.
And that pretty much confirms why Cambridge is happy to stand back: they are almost the definition of a conventional university, catering to a well-established market for whom attending a bricks-and-mortar University is as important as (if not more important than) the course content or delivery mechanisms. The “Gentleman’s Third”, receiving the lowest possible passing grade for your degree examinations, indicates a dedication to many things at the University that are, most likely, of a less-than-scholarly nature, but it is precisely for these activities that some people go to Oxford and Cambridge, and it is also precisely these non-scholarly activities that we will have great difficulty transferring into a MOOC. There will be no Oxford-Cambridge boat race carried out in a browser-based Flash game, with distributed participants hooked up to rowing machines across the globe, nor will the Footlights be conducted as a Google Hangout (except, of course, highly ironically).
Over time, we’ll find out more about the role of tradition and convention in the consortium’s composition and participation, but let me return to my opening anecdote. We are already dealing with issues of legitimacy in the on-line learning space, whether from pedagogical fatigue, academic cultural inertia, xenophobia, or the fact that some highly vaunted previous efforts have not been very good. The absence of two of the top three Universities in the UK from this fascinating and potentially quite fruitful collaboration makes me think a lot about IBM. I think of someone sitting back, watching things happen, certain in the knowledge that what the market needs is, oh happy day, what they are currently doing. When Oxford and Cambridge come in and anoint the MOOC, if they ever do or if we ever can, then we have the same antique avuncular approach of patting an entire sector on the head and saying “oh, well done, but the grownups are here now”, and this is unlikely to result in anything good in terms of fellow feeling, or in terms of the transferability and accreditation of students, both key challenges if MOOCs are to be taken more seriously. Right now, Oxford and Cambridge are choosing not to step in, and there is no doubt that they will continue to be excellent Universities for their traditional attendees – but is this a sensible long term survival strategy? Could they be contributing to the exploration of the space in a productive manner by putting their legitimising weight in sooner rather than later, at a time when they could be saying “Let’s all look at this to see if it’s any good”, rather than “Oh, hell. Now we have to do something”? Would there be much greater benefit in bringing in their considerable expertise, teaching and research excellence, and resources now, when there is so much room for ground level innovation?
This is certainly something I’m fearful of in my own system, where the Group of 8 Universities has most of the research funding, grants most of the higher degrees and, as a goal at least, targets the best staff and students. Our size and tradition can be barriers to agility and innovation, although our recent strategy is obviously trying to set our University on a more innovative and more agile course. A number of recent local projects are embracing the legitimacy of new learning and teaching approaches. It is, however, very important to remember the example of IBM, and how the holders of tradition may not necessarily be welcomed as a legitimising influence when others have been highly successful at innovating in a new space that the tradition holder deemed beneath them until reality finally intruded.
It’s easy to stand back and say “Well, that’s fine for people who can’t afford mainframes” but such a stance must be balanced with looking to see whether people still need or want to afford mainframes. I think the future of education is heavily blended – MOOC + face-to-face is somewhere where I think we can do great things – but for now it’s very interesting to see how we develop as we start to take more and more steps down this path.
Education is not Music: A Long Winded Agreement with Aaron Bady
Posted: December 18, 2012 Filed under: Education | Tags: advocacy, blogging, community, curriculum, design, education, educational research, ethics, Generation Why, higher education, moocs, teaching, teaching approaches, thinking, universal principles of design Leave a commentMark Guzdial has been posting a great deal on MOOCs, as have we all although Mark is much easier to read than I am, and his recent comment on Aaron Bady’s response to Clay Shirky’s “Udacity is Napster” drew me to the great article by Bady and the following key quote inside Bady’s article:
“I think teaching is very different from music”
and I couldn’t agree more. Let me briefly list the reasons why I feel that a comparison to Napster has no real validity, agreeing with Aaron that Clay Shirky’s argument is not well grounded when applied to education. What’s interesting is that I believe Shirky identifies this point in his own essay, but doesn’t quite realise the full implications of what he’s saying:
Starting with Edison’s wax cylinders, and continuing through to Pandora and the iPod, the biggest change in musical consumption has come not from production but playback.
…
Those earlier inventions started out markedly inferior to the high-cost alternative: records were scratchy, PCs were crashy. But first they got better, then they got better than that, and finally, they got so good, for so cheap, that they changed people’s sense of what was possible.
The first thing we need to remember about music is that music is inherently fungible: when viewed as a piece of work, you can replace it with another effectively identical item. Of course, here we need to be careful and define what we mean by identical, because music, as it turns out, is almost never identical but it gets treated that way. If you doubt this, then go and review how much it costs to insert the song “Happy Birthday to You” into a movie or TV show. It doesn’t matter if it’s Homer Simpson yelling it drunkenly, or the Three Tenors singing it sotto voce as part of an Ally McBeal shower hallucination flashback, you will still be liable to fork out dollars to the company who claims to hold the copyright. Consider also the history of how we made music small enough to send across the (much, much slower back then) Internet: we had to start with the MP3 format, which threw away enough ‘unneeded’ data from the original CD files to shrink them to a little less than 10% of their original size. This is the technology that we needed before we could even get around to the idea of Napster, because enough people had enough music on their hard drives (because we’d already dropped the size) to make file sharing useful. However, as Shirky also notes in his article, this lossy compression technique changes the way that music sounds, and you can tell the difference if you listen carefully and know what to listen for. Yet this is the same song, and Napster got into trouble for sharing compressed artefacts of lower quality and perceptible difference from the CD originals, because music, as this kind of artefact, is fungible despite very different levels of quality.
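That ‘a little less than 10%’ figure is easy to sanity-check with back-of-the-envelope arithmetic, assuming a typical Napster-era MP3 bitrate of 128 kbps (a representative value of my choosing, not one from the sources above) against CD audio’s fixed rate:

```python
# CD audio: 44,100 samples per second, 16 bits per sample, 2 channels.
cd_kbps = 44_100 * 16 * 2 / 1000    # 1411.2 kbps, the uncompressed CD rate

# A typical (assumed) Napster-era MP3 encoding rate.
mp3_kbps = 128

# Size ratio for the same duration of audio.
ratio = mp3_kbps / cd_kbps
print(f"A 128 kbps MP3 is about {ratio:.1%} of the CD original")  # about 9.1%
```

For a four-minute song, that is roughly 40 MB of CD audio squeezed into under 4 MB, which is what made sharing over the connections of the day plausible at all.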
Identical, to an audiophile, means sounding precisely the same (or true to the source, really), but identical to the copyright owner means a representation that clearly indicates unauthorised use of copyright material – which is why George Harrison’s “My Sweet Lord” ended up being described as sufficiently similar to “He’s So Fine”, despite being a brand new recording and not just a compressed copy.
So, yes, Shirky’s original quotes are both true – we have improved playback and while MP3 is still very common, lossless and much higher quality reproductions are now available. However, the point that has been missed is that the vast majority of people do not care in the slightest. The average person will only notice a shift from MP3 to lossless if they suddenly discover that their iPod has dropped in capacity, when measured in number of songs, by a significant margin. If I listen to “Viva La Vida” by Coldplay, and yes, Joe Satriani fans, I picked that deliberately, then the effective difference in my enjoyment of the song, my ability to sing along tunelessly in the shower and the ability to recite the words if asked, has nothing to do with the quality. This is not true of certain pieces of classical music, where the compression artefacts start to have more of an effect, but these are not the core business of file sharers and those who trade in compressed artefacts. However, MP3 artefacts rarely sound like long scratches, dust on the record or a bad needle – yes, they can be irritating, but the electronic form, pre and post-compression, is generally protected from such things unless you get some serious cosmic ray action in your storage media and even then, you have to be very unlucky.
The Napster music argument, for me, falls down because the increase in quality does not have a direct connection to what the majority of the user base would have considered an acceptable product. Yes, it’s better now but, for most people, so what? Music sharing services are considered useful and valuable because they share songs that people want; most people don’t think about the quality, accepting the name and the recognisable nature of the song as enough.
This is not at all true for education, because educational experiences vary wildly between lecturers, courses, institutions and eras to an extent that it is impossible to consider them in any way to be interchangeable – quality, here, is everything. If you have an international articulation program, you know that the first thing you have to do is to work out what has been taught, and how it has been taught, inside a course of the same name as one of yours. Even ‘name equivalence’ doesn’t mean anything here and we do not, or we should not, grant standing based on a coincidence of name for a course. There is no parallel guarantee that my low quality version of a course will give me the same ability to “sing in the shower” as the high quality course will – and this is, for me, an unassailable difference.
There is no doubt that the opportunities that might be offered by blended learning, full electronic offerings, and, yes, MOOCs (however they end up being defined) are something that we have to consider because, if they work, they allow us to educate the world, but claiming that this must occur because Udacity is like Napster completely ignores the core difference between education and music in terms of the consumer base and their focus on what it means for a service to meet their requirements. If students didn’t care about the perceived quality, then we wouldn’t have the notion of the ‘top schools’ or ‘low end schools’, so we know that this thinking exists. A student will happily put an MP3 on at a party, but it remains to be seen if they will constantly and out of design, not desperation, put a MOOC course on a job application, and expect a good result from it.
When Does Failing Turn You Into a Failure?
Posted: December 17, 2012 Filed under: Education, Opinion | Tags: advocacy, authenticity, blogging, ci2012, community, conventicle, curriculum, design, education, educational research, ethics, feedback, Generation Why, grand challenge, higher education, icer2012, in the student's head, learning, principles of design, reflection, research, resources, student perspective, teaching, teaching approaches, thinking, tools, workload Leave a commentThe threat of failure is very different from the threat of being a failure. At the Creative Innovations conference I was just at, one of the strongest messages was that we learn more from failure than we do from success, and that failure is inevitable if you are actually trying to be innovative. If you learn from your failures, and your failure is the genuine result of trying something that didn’t work rather than of sitting around and watching it burn, then this is just something that happens; that was the message from CI, and any other culture makes us overly cautious and risk averse. As most of us know, however, we are more strongly encouraged to cover up our failures than to celebrate them – and we are frequently better off not trying in certain circumstances than failing.
At the recent Adelaide Conventicle, which I promise to write up very, very soon, Dr Raymond Lister presented an excellent talk on applying Neo-Piagetian concepts and framing to the challenges students face in learning programming. This is a great talk (which I’ve had the good fortune to see twice, and it’s a mark of the work that I enjoyed it as much the second time) because it allows us to talk about failure to comprehend, or failure to put into practice, in terms of a lack of the underlying mechanism required to comprehend – at this point in the student’s development. As part of the steps of development, we would expect students to have these head-scratching moments where they are currently incapable of making any progress, but framing this within developmental stages allows us to talk about moving students to the next stage, getting them out of this current failure mode and into something where they will achieve more. Once again, failure in this case is inevitable for most people until we and they manage to achieve the level of conceptual understanding on which we can build and develop. More importantly, if we track how they fail, then we start to get an insight into which developmental stage they’re at.
One thing that struck me with Raymond’s talk was that he starts off talking about “what ruined Raymond”, discussing the dire outcomes promised to him if he watched too much television, as they were to me for playing too many games, and as they are to our children for whatever high tech diversion is the current ‘finger wagging’ harbinger of doom. In this case, ruination is quite clearly the threat of becoming a failure. However, this puts us in a strange position: if failure is almost inevitable, but highly valuable if managed properly and understood, what is it about being a failure that is so terrible? It’s like threatening someone that they’ll become too enthusiastic and unrestrained in their innovation!
I am, quelle surprise, playing with words here, because to be a failure is to be classed as someone for whom success is no longer an option. If we were being precise, then we would class someone as a perpetual failure or, more simply, unsuccessful. This is usually the point at which it is acceptable to give up on someone – after all, goes the reasoning, we’re just throwing good money after bad, wasting our time, possibly even rearranging the deck chairs on the Titanic, and all those other expressions that allow us to draw that good old categorical line between us and others, putting our failures into the “Hey, I was trying something new” basket and their failures into the “Well, he’s just so dumb he’d try something like that” basket. The only problem with this is that I’m really not sure that a lifetime of failure is a guaranteed predictor of future failure. Likely? Yeah, probably. So likely that we can gamble someone’s life on it? No, I don’t believe so.
When I was failing courses in my first degree, it took me a surprisingly long time to work out how to fix it, most of which was down to the fact that (a) I had no idea how to study but (b) no-one around me was vaguely interested in the fact that I was failing. I was well on my way to becoming a perpetual failure, someone who had no chance of holding down a job let alone having a career, and it was a kind and fortuitous intervention that helped me. Now, with a degree of experience and knowledge, I can look back into my own patterns and see pretty much what was wrong with me – although, boy, would I have been a difficult cuss to work with. However, failing, which I have done since then and I will (no doubt) do again, has not appeared to have turned me into a failure. I have more failings than I care to count but my wife still loves me, my friends are happy to be seen with me and no-one sticks threats on my door at work so these are obviously in the manageable range. However, managing failure has been a challenging thing for me and I was pondering this recently – how people deal with being told that they’re wrong is very important to how they deal with failing to achieve something.
I’m reading a rather interesting, challenging and confronting article on, and I cannot believe there’s a phrase for this, rage murders in American schools and workplaces, which claims that these horrifying acts are, effectively, failed revolts. This is the argument of Mark Ames, the author of “Going Postal” (2005). Ames seems to believe that everything stems from Ronald Reagan (and I offer no opinion either way, I hasten to add), but he identifies repeated humiliation, bullying and inhumane conditions as taking ordinary people, who would not usually have committed such actions, and turning them into monstrous killing machines. Ames’ thesis is that this is not the rise of psychopathy but a rebellion against the breaking of spirits and the metaphorical enslavement of much of the working and middle class that leads to such a dire outcome. If the dominant fable of life is that success is all, failure is bad, and that you are entitled to success, then it should be, as Ames says in the article, exactly those people who are most invested in these cultural fables who would be the most likely to break when the lies become untenable. In the language that I used earlier, this is the most awful way to handle the failure of the fabric of your world – a cold and rational journey that looks like madness but is far worse for being a pre-meditated attempt to destroy the things that lied to you. However, this is only one type of person who commits these acts. The Monash University gunman, for example, was obviously delusional and, while he carried out a rational set of steps to eliminate his main rival, his thinking as to why this needed to happen makes very little sense. The truth is, as always, difficult and muddy, and my first impression is that Ames may be oversimplifying in order to advance a relatively narrow and politicised view.
But his language strikes me: the notion of the “repeated humiliation, bullying and inhumane conditions”, which appears to be a common language among the older, workplace-focused, and otherwise apparently sane humans who carry out such terrible acts.
One of the complaints made against the radio network at the heart of the recent Royal Hoax, 2DayFM, is that they are serial humiliators of human beings and show no regard for the general well-being of the people involved in their pranks – humiliation, inhumanity and bullying. Sound familiar? Here I am, as an educator, knowing that failure is going to happen for my students and working out how to bring them up into success and achievement when, on one hand, I have a possible set of triggers where beating down people leads to apparent madness, and, on the other, at least part of our entertainment culture appears to delight in finding the lowest bar and crawling through the filth underneath it. Is telling someone that they’re a failure, and rubbing it in for public enjoyment, of any benefit to anyone, or is it really, as I firmly believe, the best way to start someone down a genuinely dark path to ruination and resentment?
Returning to my point at the start of this (rather long) piece, I have met Raymond several times and he doesn’t appear even vaguely ruined to me, despite all of the radio, television and Neo-Piagetian contextual framing he employs. The message from Raymond and CI paints failure as something to be monitored and something that is often just a part of life – a stepping stone to future success – but this is most definitely not the message that generally comes down from our society and, for some people, it’s becoming increasingly obvious that their inability to handle the crushing burden of permanent classification as a failure is something that can have catastrophic results. I think we need to get better at genuinely accepting failure as part of trying, and to really, seriously, try to lose the classification of people as failures just because they haven’t yet succeeded at some arbitrary thing that we’ve defined to be important.
Taught for a Result or Developing a Passion
Posted: December 13, 2012 Filed under: Education | Tags: advocacy, authenticity, blogging, Bloom, community, curriculum, design, education, educational problem, educational research, ethics, feedback, Generation Why, grand challenge, higher education, learning, measurement, MIKE, PIRLS, PISA, principles of design, reflection, research, resources, student perspective, teaching, teaching approaches, thinking, tools, universal principles of design, workload Leave a commentAccording to a story on the website of the Australian national broadcaster, the ABC, Australian school children are now ranked 27th out of 48 countries in reading, according to the Progress in International Reading Literacy Study (PIRLS), and a quarter of Australia’s year 4 students have failed to meet the minimum standard defined for reading at their age. As expected, the Australian government has said “something must be done” and the Australian Federal Opposition has said “you did the wrong thing”. Ho hum. Reading the document itself is fascinating because our fourth graders apparently struggle once we move into the area of interpretation and integration of ideas and information, but do quite well on simple inference. There is a lot of scope for thought about how we are teaching, given that we appear to have a reasonably Bloom-like breakdown in the data, but I’ll leave that to the (other) professionals. Another international test, the Programme for International Student Assessment (PISA), which measures reading, mathematics and science in 15-year-olds, is one in which we rank relatively highly. (And, for the record, we’re top 10 on the PISA rankings after a Year 4 ranking of 27th on PIRLS. Either something has gone dramatically wrong in the last 7 years of Australian education, or Year 4 results on PIRLS don’t have as much influence as we might have expected on the PISA.) We don’t yet have the results for the latest PISA round, but we expect them out soon.
What is of greatest interest to me in the linked ABC article is the Oslo University professor, Svein Sjoberg, who points out that comparing educational systems around the globe is potentially too difficult to be meaningful – which is a refreshingly honest assessment in these performance-ridden and leaderboard-focused days. As he says:
“I think that is a trap. The PISA test does not address the curricular test or the syllabus that is set in each country.”
Like all of these tests, PIRLS and PISA measure a student’s ability to perform on a particular test and, regrettably, we’re all pretty much aware, or should be by now, that using a test like this will give you the results that you built the test to give you. But one thing that really struck me from his analysis of the PISA was that the countries who perform better on the PISA Science ranking generally had a lower measure of interest in science. Professor Sjoberg noted that this might be because the students had been encouraged to become result-focused rather than encouraging them to develop a passion.
If Professor Sjoberg is right, then this is not just a tragedy, it’s an educational catastrophe – we have now started optimising our students to do well in tests while making them less likely to go on and pursue the subjects in which they can get these ‘good’ marks. If this nasty little correlation holds, then we will have an educational system that dominates in the performance of science in the classroom, but turns out fewer actual scientists – our assessment is no longer aligned to our desired outcomes. Of course, it is important to remember that the vast majority of these rankings are relative rather than absolute. We are not saying that one group is competent or incompetent; we are saying that one group performs better or worse on a given test.
Like anything, to excel at a particular task, you need to focus on it, practise it, and (most likely) prioritise it above something else. What Professor Sjoberg’s analysis might indicate, and I realise that I am making some pretty wild conjecture on shaky evidence, is that certain schools have focused their effort on test taking, rather than on actual science. (I know, I know, shock, horror.) Science is not always going to fit into neat multiple choice questions or simple automatically marked answers. Science is one of the areas where the viva comes into its own, because we wish to explore someone’s answer to determine exactly how much they understand. The questions in PISA theoretically fall into roughly the same categories (MCQ, short answer) as those in PIRLS, so we would expect to see similar problems in dealing with these questions if students were actually having a fundamental problem with the question format. But, despite this, the questions in PISA are never going to be capable of gauging the depth of scientific knowledge, the passion for science or the degree to which a student already thinks within the discipline. A bigger problem is the one which always dogs standardised testing of any sort: the risk that answering the question correctly and getting the question right may actually be two different things.
Years ago, I looked at the examination for a large company’s offering in a certain area, I have no wish to get sued so I’m being deliberately vague, and it became rapidly apparent that on occasion there was a company answer that was not the same as the technically correct answer. The best way to prepare for the test was not to study the established base of the discipline but it was to read the corporate tracts and practise the skills on the approved training platforms, which often involved a non-trivial fee for training attendance. This was something that was tangential to my role and I was neither of a sufficiently impressionable age nor strongly bothered enough by it for it to affect me. Time was a factor and incorrect answers cost you marks – so I sat down and learned the ‘right way’ so that I could achieve the correct results in the right time and then go on to do the work using the actual knowledge in my head.
However, let us imagine someone who is 14 or 15 and who, on doing the practice tests for ‘test X’, discovers that what is important is hitting precisely the right answer in the shortest time – thinking about the problem in depth is not really on the table for a two-hour exam, unless it’s highly constrained and students are very well prepared. How does this hypothetical student retain respect for teachers who talk about what science is, the purity of mathematics, or the importance of scholarship, when the correct optimising behaviour is to rote-learn the right answers, or the safe and acceptable answers, and reproduce them on demand? (Looking at some of the tables in the PISA document, we see that the best performing nations in the top band of mathematical thinking are those with amazing educational systems – the desired range – and those who reputedly place great value on high power-distance classrooms with large volumes of memorisation and received wisdom – which is probably not the desired range.)
Professor Sjoberg makes an excellent point, which is that trying to work out what is in need of fixing, and what is good, about the Australian education system is not going to be solved by looking at single figure representations of our international rankings, especially when the rankings contradict each other on occasion! Not all countries are the same, pedagogically, in terms of their educational processes or their power distances, and adjacency of rank is no guarantee that the two educational systems are the same (Finland, next to Shanghai-China, for instance). What is needed is reflection upon what we think constitutes a good education and then we provide meaningful local measures that allow us to work out how we are doing with our educational system. If we get the educational system right then, if we keep a bleary eye on the tests we use, we should then test well. Optimising for the tests takes the effort off the education and puts it all onto the implementation of the test – if that is the case, then no wonder people are less interested in a career of learning the right phrase for a short answer or the correct multiple-choice answer.
“You Will Never Amount to Anything!”
Posted: December 10, 2012 Filed under: Education | Tags: advocacy, authenticity, blogging, community, curriculum, education, educational research, ethics, feedback, Generation Why, grand challenge, higher education, in the student's head, learning, led zeppelin, measurement, principal skinner, principles of design, reflection, resources, simpsons, student perspective, teaching, teaching approaches, thinking, tools, universal principles of design, work/life balance, workload Leave a commentI am currently reading “When Giants Walked the Earth: A Biography of Led Zeppelin” by Mick Wall. I won’t go into much of the detail of the book, but the message presented around the four members of the group is that most of them did not have the best experiences in school and that, in at least two cases, the statements written on their reports by their teachers were profoundly dismissive. Now, it is of course entirely possible that the Led Zep lads were, at the time of leaving school, incapable of achieving anything – except that this is total nonsense, as it is quite obvious that they achieved a degree of musical and professional success that few contemplate, let alone reach.
You’ll often read this kind of line in celebrity biographies – that semi-mythical reforging of the self after having been judged and found wanting. (From a narrative perspective, it’s not all that surprising, as it’s an easy way to increase the tension.) But one of the reasons that it pops up is that such a statement is so damning that it is not surprising that a successful person might want to wander back to the person who said it and say “Really?” But to claim that such a statement is a challenge (as famously mocked in the Simpsons, where Principal Skinner says that these children have no future and is forced to mutter, with false bonhomie, ‘Prove me wrong, kids, prove me wrong.’) is confused at best, disingenuous and misdirecting at worst. If you want someone to achieve something, provide a clear description of the task and the means to achieve it, and then set about educating and training. No-one has ever learned brain surgery by someone yelling “Don’t open that skull”, so pretending that an entire life’s worth of motivation can be achieved by telling someone that they have no worth is piffle. Possibly even balderdash.
The phrase “You Will Never Amount To Anything” is, in whatever form it is uttered, a truly useless sentiment. It barely has any meaning (isn’t just being alive being something, and hence amounting to a small sort of anything?) but, of course, it is not stated in order to achieve any outcome other than to place the blame for the lack of engagement with a given system squarely at the feet of the accused. You have failed to take advantage of the educational opportunities that we have provided, and this is such a terminal fault that the remaining 90% of your life will be spent in a mobile block of amber, where you will be unable to effect any worthwhile interaction with the universe.
I note that, with some near misses, I have been spared this kind of statement, but I do feel very strongly that it is not anything that you can say with any credibility or useful purpose. If you happen to be Death, the Grim Reaper, then you can stand at the end of someone’s life and say “Gosh, you didn’t do a great deal, did you?” (although, again, what does it mean to do anything anyway?), but saying it when someone is between the ages of 16 and 20? You might be able to depend upon the statistical reliability that, if rampant success in our society is only granted to 1%, then 99% of the time anyone to whom you say “You will not be a success” will accidentally fall into that category. It’s quite obvious that any number of the characteristics that are worthy of praise in school contribute nothing to the spectacular success enjoyed by some people, where these characteristics are “sitting quietly”, “wearing the correct uniform” or “not chewing gum”. These are excellent facets of compliance and will make for citizens who may be of great utility to the successful, but it’s hard to see many business leaders whose first piece of advice to desperate imitators is “always wear shiny shoes”.
If we are talking about perceived academic ability then we run into another problem, in that there is a great deal of difference between school and University, let alone school and work. There is no doubt that the preparation offered by a good schooling system is invaluable. Reading, writing, general knowledge, science, mathematics, biology, the classics… all of these parts of our knowledge and our society can be introduced to students very usefully. But to say that your ability to focus on long division problems when you are 14 is actually going to be the grand limiting factor on your future contribution to the world? Nonsense.
Were you to look at my original degree, you might think “How on Earth did this man end up with a PhD? He appears to have no real grasp of study, or of a pathway through his learning” and, at the time of the degree, you’d have been right. But I thought about what had happened, learned from it, and decided to go back and study again in order to improve my level of knowledge and my academic record. I then went back and did this again. And again. Because I persevered, because I received good advice on how to improve and, most importantly, because a lot of people took the time to help me, I learned a great deal and I became a better student. I developed my knowledge. I learned how to learn and, because of that, I started to learn how to think about teaching, as well.
If you were to look at Nick Falkner at 14, you may have seen some potential but a worrying lack of diligence and effort. At 16, you would have seen him blow an entire year of school exams because he didn’t pay attention. At 17 he made it into Uni, just, but it wasn’t until the wheels really started to fall off that he realised that being loquacious and friendly wasn’t enough. Scurrying out of Uni with a third-grade degree into a workforce that looked at the evidence of his learning drove home that improvements were to be made. Being unemployed for most of a year cemented it – I had set myself up for a difficult life and had squandered a lot of opportunities. And that is when serendipity intervened, because the man who has the office next to me now, and with whom I have coffee almost every morning, suggested that I could come back and pursue a Masters degree to make up for the poor original degree, and that I would not have to pay for it upfront because it was available as a government deferred-payment option. (Thank you, again, Kevin!)
That simple piece of advice changed my life completely. Instead of not saying anything or being dismissive of a poor student, someone actually took the time to say “Well, here’s something you could do and here’s how you do it.” And now, nearly 20 years down the track, I have a PhD, a solid career in which I am respected as an educator and as a researcher and I get to inspire and help other students. There’s no guarantee that good advice will always lead to good outcomes (and we all know about the paving on the road to Hell) but it’s increasingly obvious to me that dismissive statements, unpleasant utterances and “cut you loose” curtness are far more likely to do nothing positive at all.
If the most that you can say to a student is “You’re never going to amount to anything”, it might be worth looking in a mirror to see exactly what you’ve amounted to yourself…