The Emperor’s New Clothes Redux: The Sokal Hoax

Making your way in a new area of scholarship can be challenging for many reasons, no matter how welcoming the community. One reason is that there are points in our lives where we are allowed to make larger mistakes, or to be ignorant, but it is rarer for adults, especially those already established in a job, to be allowed the latitude to say “I have no idea”. As I discussed yesterday, the fable of the Emperor’s New Clothes captures this dilemma well: children have the licence to be honest to the point of tactlessness, whereas an adult is always weighing up the implications of admitting that they cannot quite see what everyone else is talking about.

Some of you will be familiar with the Sokal Hoax, in which Professor Alan Sokal, a physicist at NYU, submitted an article to a journal of postmodern cultural studies. The work, “Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity”, was accepted by the journal Social Text, which was (at the time) not practising academic peer review. Sokal did not intend for this article to be taken seriously, or even expect it to be published, although he did produce an article that he described (in a follow-up article) as:

“a pastiche of Left-wing cant, fawning references, grandiose quotations, and outright nonsense . . . structured around the silliest quotations … he could find about mathematics and physics”

The entire affair is worth reading and you can find the Wikipedia summary here and a good critique of some of Sokal’s less intended consequences here (transcript of a New York Review of Books article). Regrettably, what is less clear is whether Sokal actually achieved very much, in real terms, by carrying out this action. Yes, Social Text moved to an academic peer review system and that’s generally better for all concerned. For those who don’t know, peer review is the process by which submitted articles go to a number of other people in the field, who review the work to see if it is fit to publish. This reduces the load on the editors and allows more, and more specific, areas of expertise to be involved. It is not, however, faultless, as a poor combination of peers can still let substandard, or plain wrong, work through, especially if reviewers farm the work out to their grad students or review under time constraints. It is, therefore, not all that surprising that Sokal’s deliberately targeted paper, which worked out how to get a paper published by these editors in this journal, succeeded, and it is even less surprising when you hear the editors’ account: they thought the paper needed revisions (removing much of the handwaving and the contradictory footnotes) and were concerned about the article but, as the journal at the time was one of opinion, they published it anyway.

Such generosity on the part of the editors does not excuse the publication of the deliberate misuse of terminology and physics that Sokal employed to highlight the lack of rigour in the journal and its editorial review process. However, one of the problems I have with this is that, as a Computer Scientist speaking to educational researchers, I find that people often take what I say as a true account of my field, especially given that they do not have the expertise in my discipline to know (or care) about things like computability or algorithmic performance. If I were to submit a scholarly paper to a journal of education, am I doing anyone any favours by deliberately misrepresenting aspects of my field, given that I am identified by discipline and school on submission?

Yes, people should use terms correctly, and there is a great deal of misuse of science for uninformed or nefarious purposes, with some of the writing from postmodernist-inspired authors being completely wrong. However, when one is not a physicist, one depends upon the knowledge gained from other people as to what physics is. There is a part of me that thinks that Sokal wasted an opportunity to actually fix a number of misunderstandings – for example, by making a clear distinction between “linear” in strict mathematical and physical terms and “linear” in post-Derridean terms, where the meaning is (quite deliberately) less well-defined and often pejorative. Words change. Terms change. Knowledge can still exist and continue to connect terms if we make the effort to bridge, rather than to mock or deride.

The post-modernists, especially Derrida, have attracted a great deal of negative interest, often for what appear to be semi-religious objections to their approach, although I would be the first to say that Derrida’s obsession with repurposing words, his redefinition of concepts when it suits him, and his grammatical constructions that further, rather than reduce, ambiguity make him a valid target for at least a raised eyebrow on many occasions. I do not have a strong opinion as to whether the Emperor, in this case, is clothed or not, but I must be honest and say that I do not believe that the outputs and constants of science are a purely cultural construction, although I do agree that the mechanism of the scientific academy is very much a cultural artefact and, if anything deserves to be reduced to its components for inspection, it is an institution that almost systematically seems to avoid recognising the contribution of women and non-western people except where unavoidable. I mention Derrida here mostly because Derrida was the first point of media attack when Sokal’s hoax was revealed. This speaks volumes about the bravery of Sokal’s attack – when the media will leap up and put a face on a stick to wave it about because “philosophy X is all mumbo-jumbo and here is the head witch doctor”, you really have to wonder what a non-peer-reviewed opinion piece in a journal dedicated to the same is actually achieving. Derrida thought that the major problem with the piece was that it would make a later, serious, attempt to discuss such issues impossible to achieve.

Of course, although Sokal’s Hoax is a triumph of exposing the publication of works based on their source, authority and obscurity, this is most certainly not restricted to post-modernist journals of opinion. A friend of mine once called me in to read through a paper that used such unusual terminology, for him, that he was unsure whether it was good or bad. Fortunately, it was in my discipline and, because I know and can use the word ontology without dying, I was able to identify it as a low-level rehash of some basic work in the field. It was sound work, using the correct terminology, but it certainly wasn’t at the level of the conference it had been sent to – to my friend, however, it was as meaningless as anything that Sokal mocked from Derrida. I am well aware that some of my areas, including knowledge management and educational research, are seen by others to be exactly the same as the post-modernist repurposing of scientific terminology that Sokal attacks.

The point is not who is lying to whom, or whether there is anything behind some of the more obscure utterings of the Post-Modernists, but whether deliberately winding people up with a hoax achieves more than a genuine attempt to reach out to and correct a community, using your expertise and developing a voice in the other discipline to provide a sound translation. Epistemology, the theory of knowledge, is important and I’m really not sure that hoaxing and mockery achieve all that much, especially as, like any extrinsic punishment approach, they tell you not to do something but not how not to do it.


The Emperor’s New (Insert Noun Here)

I’ve always enjoyed the story of the Emperor’s New Clothes, because it has a number of different readings. We can speak of the tactless honesty of the innocent, the child who sees the emperor as he is, or of the willingness to uphold the status quo when it is imposed from a sufficiently high point, in the people who pretend that the emperor is clothed. We can also look at the villains of the piece, who weave a suit that is invisible to those who are stupid, incompetent or unfit to hold a position. This is, of course, genius, because it forces the viewer of the suit into that most difficult of decisions: do I speak up (and force someone to explicitly work out whether I have sufficient worth to counter the prevailing interpretation) or do I stay silent (and avoid being seen to be a fool)?

There are some quite entertaining logical issues to wrestle with, starting from some fairly reasonable assumptions. Imagine that you are Courtier X, arriving in the room after Courtier 1, and you observe the Emperor. Now you know, full well, that the Emperor has always been clothed up to this point, and in the finest clothes of the land, and he is not known for his propensity for streaking. Walking into the room, you would expect the Emperor to be clothed. Let us assume that, out of a sense of survival and fellow-feeling, Courtier X-1, the one who arrived before you, hisses “He’s wearing a suit that is only invisible to idiots.” Surviving in the Royal Court would have prepared you for a life of rapid adjustment to changes of circumstance brought about by pique and the accidental collision of coronial concerns, so this information would immediately have shot through your mind and, whether you believed that the suit was there or not, behaving otherwise has some quite obvious downsides. Firstly, the Emperor obviously believes that he is wearing this suit. Secondly, there are X-1 other courtiers in the room who have now gone along with it. Thirdly, you have a family to feed and it’s not as if you could go off to another court.

The child’s voice is unaffected by such concerns. The child sees, he thinks, he speaks. Children are very frank when they deal with difficult matters such as the apparent ugliness or facial eructations of an aged relative, the apparent size or adiposity of strangers, or the details with which bodily functions are announced. (I can, however, see a Romulan reading of the tale where the child is sent into battle for his outspokenness and fails to achieve victory – but cultures always vary in these matters.) However, everyone is now embarrassed – doubly so because not only is the Emperor nude, but everyone around him has lied to him. With any luck the Imperial Executioner was in on the lie as well, so that he can run off ashamed before he has to behead everyone else.

Speaking truth to power is a difficult matter and we often seem to confuse it with “saying any old thing because it’s our opinion”, and the two are really not the same at all. I have previously referred to the “just saying” mentality, where offensive or bigoted commentary is presented on the grounds that it is truthful, when it is quite obvious that it is designed to be hurtful and the words are hiding behind a pretence of honesty. Telling the Emperor that he is naked is the duty of the Emperor’s staff, because it allows us to deal with the real villains of the piece, rather than the difficult (and more likely) outcome that a small child went to bed that night with no supper. Telling the Emperor that he is fat really doesn’t serve any purpose unless you are genuinely concerned for his health and attempting to reduce his adiposity.

The Emperor’s New Clothes is often used to refer to other situations of social hypocrisy or the collective agreement on something that is not true and, as such, it is so heavily used in some areas that its coinage is seriously debased. One reading that I find fascinating is that we can regard the suit as the “words we may use to cloak our fears” (Naomi Wood, KSU) but these words do not protect us from the reality of the situation. The child is free of adult corruption, certainly, but this is also a colder and harsher world, a situation at odds with our normal thoughts on childhood.

I strongly believe that one of the key problems some of my colleagues have with educational research, and its associated vocabulary, is that some of them are convinced that we are somehow playing the Emperor’s New Clothes with them. After all, we are asking them to look at the old fabric, find it wanting, and then we are talking of a new one, describing it in terms that may not be used that often in the standard discipline. Worse, every so often, I bet we make it sound as if any sensible person would be able to understand that this was a better approach – and this is quite damning of whoever says it, whether they are talking to students or staff. Speaking truth to power is as important peer-to-peer as it is student-to-teacher or peasant-to-king, but we must distinguish between being rude and dismissive and genuinely seeking answers. I may not always succeed but I do try to use evidence, published work and, of course, the far more influential work of the real leaders in this field! I am nowhere near attaining expertise here but at least I now know where to look and where to start the discussions. I do not yet have a suit of knowledge, but I have a pair of shorts that I can wear in the company of the besuited, so that we can have some discussions without me exposing myself too badly! 🙂

The antithesis of the New Clothes phenomenon also occurs frequently: people are looking at a fully-clothed person and pretending that they cannot see the clothes. Obviously, neither approach is sensible when pushed to the extreme. Sometimes we just have to use our eyes and our brains and tell people what we see. And that can be one of the hardest things to do – as well as the most valuable.


Pressganging Story into Service: The Dickens, you say?


“Marley was dead”, and so begins Charles Dickens’ “A Christmas Carol”, which has been reprinted and remade so many times that it is near impossible to avoid the cultural impact of this work in English-speaking areas. For those who have avoided it, for whatever reason, it is a simple story. An unpleasant miser, Ebenezer Scrooge, believes Christmas to be nothing but humbug, a waste of time, a period for the stupid to amuse themselves, and a way for those who work insufficiently hard to deprive him (Scrooge) of his hard-won money. Scrooge’s transformation within the book is the core of the story, initiated by the visit of his (long dead) business partner, Marley, who warns him that only a bleak and unpleasant afterlife awaits him after death. Marley tells Scrooge that three ghosts will visit him, and to change while he still can.

The first ghost, Christmas Past, shows Scrooge a younger version of himself, when he was innocent, and the obstacles he faced that put him onto his current (unpleasant, unloving and unloved) trajectory. The second ghost, Christmas Present, shows him the London he is in now: the joy of family and of reuniting with old friends. The ghost takes Scrooge to visit the house of Bob Cratchit, Scrooge’s underpaid and overworked clerk, who lives with a large family and a seriously ill child, Tiny Tim, for whom no medical treatment is forthcoming because Scrooge pays Cratchit so little. Finally, Christmas Yet To Come arrives, and takes Scrooge on a dark journey to the death of Tiny Tim while still a young boy, Scrooge’s own death and the human vultures who pick over his belongings, and his untended grave in a dark corner of a forgotten cemetery.

Scrooge, reminded of his humanity, surrounded by humans and warned of the outcomes to others and himself of his perilous course, awakens on Christmas morning a changed man. His entire demeanour is permanently changed, not just for Christmas Day, but because he now seeks to be not just a better man, but the best man.

I have several film versions of this that I like: the Patrick Stewart version is good and the Bill Murray comedic version, “Scrooged”, is slightly more delightful because Scrooge (Cross, in this version) is redeemed well before his course is as firmly set. (And I like a happy ending.)

Yesterday I spoke about finding stories and myths that I could use and, even stripped of any religious overtones associated with the word Christmas, there’s still a lot to think about in the framing of A Christmas Carol. Dickens had suffered deep and lasting humiliation as a child, and the engines of the Industrial Revolution had, by this time, ground up many older traditions and families along the way. Dickens’ appeal to the charity of those who can afford it is a core part of the work, as is his drawing back to pre-Cromwellian Christmas traditions that had been stamped out under the dour, washed-out grey heel of the Puritans. But, back to the framing.

The story starts with the description of Scrooge as someone who is happy with their lot, but shouldn’t be. His negative interpretation of the world is as much at odds with reality as his positive perception of the many flaws of his partner, Marley. Marley’s visit forces Scrooge to listen to the one person who could start him on his journey – because no-one else would have the authenticity to speak to him.

The journey begins with advice from a mentor who wishes you to avoid making their mistakes.

The three ghosts appear to force Scrooge to identify how he has changed, how flawed his perceptions are and that his actions, or inactions, will most likely have consequences that extend beyond his lifetime. 

In order to understand why (or if we need to change), we need to understand:

  • How we have already changed to this point
  • What our environment really looks like
  • Why change might be necessary

And none of this is any surprise to anyone who has read one, two or many self-help or realisation books – except that Dickens’ story is full of emotion and a reason for changing. In all of its forms, I have found the thread of the Cratchits to be one of the most moving. Scrooge’s loss and decline one could almost (well, I can’t, but some could) write off as the unfortunate actions of a man who attained what he thought he wanted: wealth, and thus a derived happiness. Scrooge is obviously not happy, but there are far too many who would ponder ‘why’ when he was so rich! (For every aphorism regarding “money not buying happiness”, there are many examples apparently to the contrary, and David Gilmour’s famous riposte “… but it will let you park your yacht right next to it.”)

Tiny Tim, for me, is the core of this myth, because Tim is ill through no fault of his own but because of the time, the body and the family that he was born into. It’s not Tim’s fault, but that simple fact is not enough to save him from dying – he needs other people to realise that he deserves better just because of what he is (a child) rather than who he is (a child of poor parents). Scrooge is not an evil man, although he is most certainly not a good man at the start, and the death of the child is never what he intended, because it would never have occurred to him that Cratchit would have that much of a life outside of the office. Scrooge’s indifference to the world, to the city of London, to Cratchit and to his own humanity is part of the initial transformation that he undertook, to become the Scrooge that we saw. That is the essence of Scrooge – he can change because he changed before. When Scrooge changes, he finally starts down the path to happiness, which appears to hold him in this enlightened and positively changed state for the rest of his long (and happy) life.

I enjoy the story and it’s something I always revisit leading up to Christmas because it is very easy to start getting all ‘bah, humbug’ in the face of commercialism, over expectation and the sheer hype of the holiday season. However, looking at it as a story about change, I’m forced to think about who could come to me and say “Don’t be like me”. How have I changed from where I was 20, 10 or even 5 years ago? What am I ignoring around me that I could be appreciating more?

Where will this path take me?

What would you expect to see, if the mentor and the three ghosts came to see you?


John Henry Died

Every culture has its myths and legends, especially surrounding those incredible individuals who stand out or tower over the rest of the society. The Ancient Greeks and Romans had their gods, demigods, heroes and, many times, cautionary tales of the mortals who got caught in the middle. Australia has the stories of pre- and post-federation mateship, often anti-authoritarian or highlighting the role of the larrikin. We have a lot of bushrangers (with suspiciously good hearts or reacting against terrible police oppression), Simpson and his donkey (a First World War hero who transported men to an aid station using his donkey, ultimately dying on the battlefield) and a Prime Minister who goes on YouTube to announce that she’s now convinced that the Mayans were right and we’re all doomed – tongue firmly in cheek. Is this the totality of the real Australia? No, but the stylised notion of ‘mateship’, the gentle knock and the “come off the grass, you officious … person” attitude are as much a part of how many Australians see themselves as shrimp on a barbie is to many US folk looking at us. In any Australian war story, you are probably more likely to hear about the terrible hangover that Gunner Suggs had and how he dragged his friend a kilometre over rough stones to keep him safe than you are to hear about how many people he killed. (I note that this mateship is often strongly delineated along gender and racial lines, but it’s still a big part of the Australian story.)

The stories that we tell, and those that we pass on as part of our culture, strongly shape our culture. Look at Greek mythology and you see stern warnings against hubris – don’t rate yourself too highly or the gods will cut you down. Set yourself up too high in Australian culture and you’re going to get knocked down as well: a ‘tall poppies’ syndrome that is part cultural cringe, inherited from colonial attitudes to the Antipodes, part hubris and part cultural confusion as Anglo, Euro, Asian, African and… well, everyone, come to terms with a country that even the original inhabitants, the Australian Aboriginal and Torres Strait Islander peoples, took quite a while to adapt to. As someone who wasn’t born in Australia, like so many others who live here and now call themselves Australian, I’ve spent a long time looking at my adopted homeland’s stories to see how to fit. Along the way, because of travel, I’ve had the opportunity to look at other cultures as well: the UK, obviously, as it’s drummed into you at school, and the US, because it interests me.

The stories of Horatio Alger, from the US, fascinate me because of their repeated restatement of the rags-to-riches story. While most of Alger’s protagonists never become amazingly wealthy, they rise, through their own merits, to take the opportunities presented to them and, because of this, a good man will always rise. This is, fundamentally, the American Dream – that any person can become President, effectively, through the skills that they have and through rolling up their sleeves. We see this Dream become ugly when any of the three principles no longer holds, in a framing I first read from Professor Harlon Dalton:

  1. The notion that we are judged solely on our merits: for this to be true, we must not have any bias – racist, gendered, religious, ageist or other. Given the recent ruling that an attractive person can be sacked, purely for being attractive and for providing an irresistible attraction for their boss, we have evidence that not only is this point not holding in many places, it’s not holding in ways that beggar belief.
  2. We will each have a fair opportunity to develop these merits: this assumes equal opportunity in terms of education and in terms of jobs, which promptly ignores things like school districts, differing property tax levels and teacher training approaches. Because of the way that school districts work, just living in a given state or county because your parents live there (and can’t move) can make the difference between a great education and a sub-standard child-minding service. So this doesn’t hold either.
  3. Merit will out: look around. Is the best, smartest, most talented person running your organisation or making up all of the key positions? Can you locate anyone in the “important people above me” who is holding that job for reasons other than true, relevant merit?

Australia’s myths are beneficial in some ways and destructive in others. For my students, the notion that we help each other, we question but we try to get things done is a positive interpretation of the mild anti-authoritarian mateship focus. The downside is drinking buddies going on a rampage and covering up for each other, fighting the police when the police are actually acting reasonably and public vandalism because of a desire to act up. The mateship myth hides a lot of racism, especially towards our indigenous community, and we can probably salvage a notion of community and collaboration from mateship, while losing some of the ugly and dumb things.

The tunnel went through.

Horatio Alger myths would give hope, except for the bleak reality that many people face, which is that those three principles are giant pieces of baloney that people get hit about the head with. If you’re not succeeding, then Horatio Alger reasoning lets us call you lazy or stupid or just not taking the opportunities. You’re not trying to pull yourself up by your bootstraps hard enough. Worse still, trying to live up to this, sometimes impossible, guideline leads us into John Henryism. John Henry was a steel driver, who hammered and chiselled the rock through the mountains to build tunnels for the railroad. One day the boss brought in a steam-driven hammer and John Henry bet that he could beat it, to show that he and his crew should not be replaced. After a mammoth battle between man and machine, John Henry won, only to die with the hammer in his hand.

Let me recap: John Henry died – and the boss still got a full day’s work that was equal to two steam-hammers. (One of my objections to “It’s a Wonderful Life” is that the rich man gets away with stealing the money – that’s not a fairy tale, it’s a nightmare!) John Henryism occurs when people work so hard to lift themselves up by their bootstraps that they nearly (or do) kill themselves. Men in their 50s with incredibly high blood pressure, ulcers and arthritis know what I’m talking about here. The mantra of the John Henryist is:

“When things don’t go the way that I want them to, that just makes me work even harder.”

There’s nothing intrinsically wrong with this when your goal is actually achievable and you apply this maxim in moderation. At its extreme, and for those people who have people standing on their boot caps, this is a recipe to achieve a great deal for whoever is benefiting from your labour.

And then dying.

As John Henry observes in the ballad (Springsteen version), “I’ll hammer my fool self to death”. The ballad of John Henry is actually a cautionary tale about setting your pace carefully: if you’re going to swing a hammer all day, every day, then you have to do it at a pace that won’t kill you. This is the natural constraint on Horatio Alger and it balances all of the issues with merit and access to opportunity: don’t kill your “fool self” striving for something that you can’t achieve. It’s a shame, however, that the stories line up like this, because there’s a lot of hopelessness sitting in that junction.

Dealing with students always makes me think very carefully about the stories I tell and the stories I live. Over the next few days, I hope to put together some thoughts on a 21st century myth form that inspires without demanding this level of sacrifice, and that encourages without forcing people into despair if existing obstacles block them – and it’s beyond their current control to shift. However, on that last point, what I’d really like to come up with is a story that encourages people to talk about obstacles and then work together to lift them out of the way. I do like a challenge, after all. 🙂


Vitamin Ed: Can It Be Extracted?

Mmm. Taste the learnination.

There are a couple of ways to enjoy a healthy, balanced diet. The first is to actually eat a healthy, balanced diet made up from fresh produce across the range of sources, which requires you to prepare and cook foods, often changing how you eat depending on the season to maximise the benefit. The second is to eat whatever you dang well like and then use an array of supplements, vitamins, treatments and snake oil to try and beat your diet of monster burgers and gorilla dogs into something that will not kill you in 20 years. If you’ve ever bothered to look on the side of those supplements, vitamins, minerals or whatever, that most people have in their ‘medicine’ cabinets, you might see statements like “does not substitute for a balanced diet” or nice disclaimers like that. There is, of course, a reason for that. While we can be fairly certain about a range of deficiency disorders in humans, and we can prevent these problems with selective replacement, many other conditions are not as clear cut – if you eat a range of produce which contains the things that we know we need, you’re probably getting a slew of things that we also need but don’t make themselves as prominent.

In terms of our diet, while the debate rages about precisely which diet humans should be eating, we can have a fairly good stab at a sound basis, from a dietician’s perspective, built out of actual food. Recreating that from raw sugars, proteins, vitamin and mineral supplements is technically possible but (a) much harder to manage and (b) nowhere near as satisfying as eating the real food, in most cases. Let’s not forget that very few of us in the western world are so distant from our food that we regard it purely as fuel, with no regard for its presentation, flavour or appeal. In fact, most of us could muster a grimace at the thought of someone telling us to eat something because it was good for us or for some real or imagined medical benefit. In terms of human nutrition, we have the known components that we have to eat (sugars, proteins, fats…) and we can identify specific vitamins and minerals that we need to balance to enjoy good health, yet there is no shortage of additional supplements that we take out of concern for our health that may have little or no demonstrated benefit – yet still we take them.

There’s been a lot of work done in trying to establish an evidence base for medical supplements and far more of the supplements fail than pass this test. Willow bark, an old remedy for pain relief, has been found to have a reliable effect because it has a chemical basis for working – evidence demonstrated that and now we have aspirin. Homeopathic memory water? There’s no reliable evidence for this working. Does this mean it won’t work? Well, here we get into the placebo effect and this is where things get really complicated because we now have the notion that we have a set of replacements that will work for our diet or health because they contain useful chemicals, and a set of solutions that work because we believe in them.

When we look at education, where it’s successful, we see a lot of techniques being mixed in together in a ‘natural’ diet of knowledge construction and learning. Face-to-face and teamwork, sitting side-by-side with formative and summative assessment, as part of discussions or ongoing dialogues, whether physical or on-line. Exactly which parts of these constitute the “balanced” educational diet? We already know that a lecture, by itself, is not a complete educational experience, in the same way that a stand-alone multiple-choice question test will not make you a scholar. There is a great deal of work being done to establish an evidence basis for exactly which bits work but, as MIT said in the OCW release, these components do not make up a course. In dietary terms, it might be raw fuel but is it a desirable meal? Not yet, most likely.

Now let’s get into the placebo side of the equation, where students may react positively to something just because it’s a change, not because it’s necessarily a good change. We can control for these effects, if we’re cautious, and we can do it with full knowledge of the students, but I’m very wary of any dependency upon the placebo effect, especially when it’s prefaced with “and the students loved it”. Sorry, students, but I don’t only (or even predominantly) care if you loved it; I care if you performed significantly better, attended more, engaged more, retained the information for longer and could achieve more, and all of these things can only be measured when we take the trouble to establish baselines, construct experiments, measure things, analyse with care and then think about the outcomes.
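
As a minimal sketch of what “establish baselines and measure” might look like in practice (the scores and cohort sizes below are entirely invented for illustration, and a t-test is only one of many possible analyses):

```python
# Toy comparison: exam scores for a baseline cohort vs. a cohort taught
# with some new technique. All numbers are invented; a real study would
# also need matched cohorts and controls for novelty (placebo) effects.
from scipy import stats

baseline_scores = [62, 71, 58, 65, 74, 69, 60, 66, 63, 70]
new_method_scores = [68, 75, 64, 72, 80, 71, 67, 74, 69, 77]

# Independent-samples t-test: is the difference in means plausibly chance?
t_stat, p_value = stats.ttest_ind(new_method_scores, baseline_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value suggests the difference is unlikely to be chance alone,
# but it says nothing about *why* the scores moved, which is exactly the
# placebo trap described above.
```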

My major concern about the whole MOOC discussion is not whether MOOCs are good or bad, it’s more to do with:

  • What does everyone mean when they say MOOC? (Because there’s variation in what people identify as the components)
  • Are we building a balanced diet or are we constructing a sustenance program with carefully balanced supplements that might miss something we don’t yet value?
  • Have we extracted the essential Vitamin Ed from the ‘real’ experience?
  • Can we synthesise Vitamin Ed outside of the ‘real’ educational experience?

I’ve been searching for a terminological separation that allows me to separate ‘real’/’conventional’ learning experiences from ‘virtual’/’new generation’/’MOOC’ experiences and none of those distinctions are satisfying – one says “Restaurant meal” and the other says “Army ration pack” to me, emphasising the separation. Worse, my fear is that a lot of people don’t regard MOOC as ever really having Vitamin Ed inside, as the MIT President clearly believed back in 2001.

I suspect that my search for Vitamin Ed starts from a flawed basis, because it assumes a single silver bullet if we take a literal meaning of the term, so let me spread the concept out a bit and label Vitamin Ed as the essential educational components that define a good learning and teaching experience. Calling it Vitamin Ed gives me a flag to wave and an analogue to use, to explain why we should be seeking a balanced diet for all of our students, rather than a banquet for one and dog food for the other.


We’re Still Here

Or not.

I realise that the ‘appointed’ date of doom hasn’t come yet but we see no evidence of the Apocalypse here in Australia.

We now resume our normal blog production.

(My apologies to any Mayans for the sheer irritation that this cultural misappropriation may have caused.)


“We are not providing an MIT education on the web…”

I’ve been re-watching some older announcements that describe open courseware initiatives, starting from one of the biggest: the MIT announcement of their OpenCourseWare (OCW) initiative in April 2001. The title of this post actually comes from the video, around the 5:20 mark. (Video quoted under a CC-BY-NC-SA licence; more information available at: http://ocw.mit.edu/terms)

“Let me be very clear, we are not providing an MIT education on the Web. We are, however, providing core materials that are the infrastructure that undergirds that information. Real education, in our view, involves interaction between people. It’s the interaction between faculty and students, in our classrooms and our living group, in our laboratories that are the heart, the real essence, of an MIT education. “

While the OCW was going to be produced and used on campus, the development of OCW was seen as something that would make more time available for student interaction, not less. President Vest then goes on to confidently predict that OCW will not make any difference to enrolment, which is hardly surprising given that he has categorically excluded anyone from achieving an MIT education unless they enrol. We see here exactly the same discussion that keeps coming up: these materials can be used as augmenting materials in these conventional universities but can never, in the view of the President or Vice Chancellor, replace the actual experience of obtaining a degree from that institution.

Now, don’t get me wrong. I still think that the OCW initiative was excellent, generous and visionary, but we are still looking at two fundamentally different use cases: the use of OCW to augment an existing experience and the use of OCW to bootstrap a completely new experience, which is not of the same order. It’s a discussion that we keep having – what happens to my Uni if I use EdX courses from another institution? Well, ok, let’s ask that question differently. I will look at this from two sides, using the introduction of a new skill and knowledge area that becomes ubiquitous – in my sphere, Computer Science and programming. Let’s look at this in terms of growth and success.

What happens if schools start teaching programming to first year level? 

Let’s say that we get programming into every single national curriculum for secondary school and we can guarantee that students come in knowing how to program to freshman level. There are two ways of looking at this. The first, which we have probably all seen to some degree, is to regard the school teaching as inferior and re-teach it. The net result of this will be bored students, low engagement and wasted time. The second, far more productive, approach is to say “Great! You can program. Now let’s do some Computer Science.” and use that extra year or so to increase our discipline knowledge or put breadth courses back in so that our students come out a little more well-rounded. What’s the difference between students learning it at school before they come to us, or through an EdX course on fundamental programming after they come to us?

Not much, really, as long as we make sure that the course meets our requirements – and, in fact, it gives us bricks-and-mortar-bound entities more time to do all that face-to-face interactive University stuff that we know students love and from which they derive great benefit. University stops being semi-vocational in some aspects and we leap into knowledge construction, idea generation, big projects and the grand dreams that we always talk about, yet often don’t get to because we have to train people in basic programming, drafting, and so on. Do we give them course credit? No, because they’re assumed knowledge, or barrier tested, and they’re not necessarily part of our structure anymore.

What happens if no-one wants to take my course anymore?

Now, we know that we can change our courses because we’ve done it so many times before over the history of the Academy – Latin (along with Greek, the language of scholarship) was used in only half of the University publications of 1800. Let me wander through a classical garden for a moment to discuss the nature of change from a different angle, that of decline. Languages had a special place in the degrees of my University, with Latin and Greek dominating, and then with the daring possibility, from 1938, of allowing the substitution of French or German for Latin or Greek. It was as recently as 1958 that Latin stopped being compulsory for high school graduation in Adelaide, although it was still required for the study of Law – student demand for Latin at school therefore plummeted and Latin courses started being dropped from the school curriculum. The Law Latin requirement was removed around 1969-1970, which dropped demand for Latin even further. The reduction in the number of school teachers who could teach Latin required the introduction of courses at the University for students who had studied no Latin at all – Latin IA entered the syllabus. However, given that in 2007 only one student across all of the schools in the state of South Australia (population roughly 1.2-1.4 million people) studied Latin in the final year of school, it is apparent that if this University wishes to teach Latin, it has to start by teaching all of Latin. This is a course, and a discipline, that is currently in decline. My fear is that, one day, someone will make the mistake of thinking that we no longer need scholars of this language. And that worries me, because I don’t know what people 30 years from now will actually want, or what they could add to the knowledge that we already have of one of our most influential civilisations.

This decline is not unique to Latin (or Greek, or classics in general), but a truly on-line course experience would allow us to pool those scholars we have left and offer scaled resources for much longer than isolated pockets in real offices can manage – although, as President Vest notes, a storehouse of Latin texts does not a course make. What reduced the demand for Latin? Possibly the ubiquity of the Latin-derived language that we use, combined with a change of focus away from a classical education towards a more job- and achievement-oriented (semi-vocational) style of education. If you ask me, programming could as easily go this way in about 20 years, once we have ways to let machines solve problems for us. A move towards a less go-go-go culture, smarter machines and a resurgence of the long leisure cycles associated with Science Fiction visions of the future, and suddenly it is the engineers and the computer scientists who are looking at shrinking departments and no support in the schools. Let me be blunt: course popularity and desirability rises, stabilises and falls, and it’s very hard to tell if we are looking at a parabola or a pendulum. With that in mind, we should be very careful about how we define our traditions and our conventions, especially as our cunning tools for supporting on-line learning and teaching get better and better. Yes, interaction is an essential part of a good education, no argument at all, but the face-to-face environment that we assume will support this interaction depends on an implicit critical mass that, as we have seen time and again, is as much a function of popularity and traditionally-associated prestige as it is of excellence.

What are MIT doing now?

I look at the original OCW release and I agree that, at the time of production, you could not reproduce the interaction between people that would give you an MIT education. But our tools are better now. They are, quite probably, not close enough yet to give you an “MIT of the Internet”, but should this be our goal? Not the production of a facsimile of the core materials that might, with MIT instructors, turn into a course, but the commitment to developing the tools that actually reproduce the successful components of the learning experience, with group and personal interaction, allowing the formation of what we used to call a physical interactive experience in a virtual space. That’s where I think the new MIT initiatives are showing us how these things can work now, starting from their original idealistic roots and adding the technology of the 21st Century. I hope that other, equally prestigious, institutions are watching this, carefully.


Two Tier: Already Here

Hah! I look down on you, you apples!

I was reading a Chronicle of Higher Ed article, “For Whom Is College Being Reinvented?”, and it was sobering reading. While I was writing yesterday about Oxford and Cambridge wanting to maintain their conventional University stance, Robert Archibald, an Economics Professor from the College of William and Mary, points out that the two-tier system is already here in terms of good conventional and bad conventional – so we would see an even larger disparity between luxury and economy courses. Getting into the “good” colleges will be a matter of money and prior preparation, much as it is in many areas where the choice of school available to parents is increasingly driving residential moves in the early years of a child’s life. But it doesn’t end there, because the ‘quality’ measure may be as much about the employability of the students after they’ve completed their studies – and, as the article says, now we have to start thinking about whether a “low-level” degree is preferable to an “industry recognised” apprenticeship or trade training program. Now our two tiers are as separate as radiographer and radiologist but, as Robert Reich also observes in the same article, this is completely against what we should be doing: how can we do all this and maintain real equality between degrees and programs?

Of course, if you didn’t go to a great elementary and senior school, then you are probably on the path to the ‘second-tier’ school, which might be one that naturally migrates to a full electronic delivery for a number of perfectly reasonable economic reasons, and you are probably also someone who needs a more customised experience than a ‘boilerplate’ MOOC could offer: you actually need face-to-face. When we talk about disruption of the existing college system, we always assume that this is a positive thing, something that will lead to a better result for our students, so these potential issues with where these new technologies may get focused start to become very important.

For whom will these new systems work? Everyone or just the people that we’re happy to expose them to?

It’s perhaps the best question we have to frame the discussion – it’s not about whether the technology works; we know that it works well for certain things and it’s now a matter of making sure that our pedagogical systems are correctly married to our computer systems to make the educational experience work. But, obviously, and as many much better writers than I have been saying, it has to work and be at least as good as the systems that it’s replacing – only now we realise that existing systems are not the same for everyone and that one person’s working system is someone else’s diabolically bad teaching experience. So the entire discussion about whether MOOCs work now has to be framed in the context of ‘compared to what‘?

It’s an interesting article that poses more questions than it answers, but it’s certainly part of the overall area we have to think about.


Legitimisation and Agency: I Believe That’s My Ox on Your Bridge

There’s an infamous newspaper advertisement that never ran, which reflected the entry of IBM into the minicomputer market. A number of companies, Data General principal among them, but including such (historically) powerful players as Digital Equipment Corporation, Prime and Hewlett Packard, were quite successful in the minicomputer market, growing rapidly and stealing market share from IBM’s mainframe market. (For an excellent account of these times, I recommend “The Soul of a New Machine” by Tracy Kidder.) IBM finally decided to enter the minicomputer market and, as analysts remarked at the time, IBM’s move into minicomputers legitimised the market.

Ed de Castro, CEO of Data General, had a full-page newspaper advertisement prepared, which I reproduce (mildly bowdlerised to keep my all-ages posting status):

“They Say IBM’s Entry Into the Minicomputer Market Will Legitimize the Industry. The B***ards Say, Welcome.”

The ad never actually ran but was framed and put on Ed’s wall. The point, however, was well and precisely made: IBM’s approval was neither required nor desired, and nobody had set a goal of being legitimised.

The Nova, the first Data General minicomputer, with Ed de Castro in the background.

Over on Mark’s blog, we see that a large number of UK universities are banding together to launch an on-line project, including the highly successful existing player in the analogous space, the Open University, but also some high-powered players such as Southampton and the disturbingly successful St Andrews. As Mark notes in the title, this is a serious change in terms of allying a UK effort that will produce a competitor (or competitors) to the existing US dominance. As Mark also notes:

Hmm — OxBridge isn’t throwing hats into the rings yet.

And this is a very thoughtful Hmm, because the Universities of Oxford and Cambridge are the impossible-to-ignore legitimising agencies, because of their sheer weight on the rubber sheet of UK Academy Spacetime. When it comes to talking about groups of Universities in the UK, and believe me there are quite a few, the Russell Group awards the lion’s share of PhDs, and holds 78% of the most highly graded research staff as well, across its 24 Universities. One of its stated goals is to lead the research efforts of the UK, with another being to attract the best staff and students to its member institutions. However, the group of participants in the new on-line project involves Russell Group Universities and those outside it, which makes the non-participation of Oxford and Cambridge even more interesting. How can a trans-group on-line proposal bring the best students in – or is this why we aren’t seeing involvement from Oxbridge, because of the two-tier perception between traditional and on-line? One can easily argue that Oxford and Cambridge have no need to participate because they are so entrenched in their roles and their success that, as I’ve noted in a different post, any ranking system that rates them out of, say, the top 5 in the UK has made itself suspect as a ranking, rather than serving as a reflection of dropping quality. Oxbridge is at the heart of the UK’s tertiary system and competition to gain entry will continue to be fierce for the foreseeable future. They have no need to get together with the others in their group or beyond, although it’s not about protecting themselves from competitors, as they are not really in competition with most of the other Russell Group members – because they are Oxford and Cambridge.

It’s worth noting that Cambridge’s vice-chancellor Leszek Borysiewicz did think that this consortium was exciting, and I quote from the THE article:

“Online education is becoming an important approach which may open substantial opportunities to those without access to conventional universities,” he said.

And that pretty much confirms why Cambridge is happy to stand back – because they are almost the definition of a conventional university, catering to a well-established market for whom attending a bricks-and-mortar University is as important as (if not more important than) the course content or delivery mechanisms. The “Gentleman’s Third”, receiving the lowest possible passing grade for your degree examinations, indicates a dedication to many things at the University that are, most likely, of a less-than-scholarly nature, but it is precisely for these activities that some people go to Oxford and Cambridge and it is also precisely these non-scholarly activities that we will have great difficulty transferring into a MOOC. There will be no Oxford-Cambridge boat race carried out on a browser-based Flash game, with distributed participants hooked up to rowing machines across the globe, nor will the Footlights be conducted as a Google Hangout (except, of course, highly ironically).

Over time, we’ll find out more about the role of tradition and convention in the composition and participation, but let me return to my opening anecdote. We are already dealing with issues of legitimacy in the on-line learning space, whether from pedagogical fatigue, academic cultural inertia, xenophobia, or the fact that some highly vaunted previous efforts have not been very good. The absence of two of the top three Universities in the UK from this fascinating and potentially quite fruitful collaboration makes me think a lot about IBM. I think of someone sitting back, watching things happen, certain in the knowledge that what they do is what the market needs and that it is, oh happy day, what they are currently doing. When Oxford and Cambridge come in and anoint the MOOC, if they ever do or if we ever can, then we have the same antique avuncular approach of patting an entire sector on the head and saying “oh, well done, but the grownups are here now”, and this is unlikely to result in anything good in terms of fellow feeling or the transferability and accreditation of students, key challenges in MOOCs being taken more seriously. Right now, Oxford and Cambridge are choosing not to step in, and there is no doubt that they will continue to be excellent Universities for their traditional attendees – but is this a sensible long-term survival strategy? Could they be contributing to the exploration of the space in a productive manner by putting their legitimising weight in sooner rather than later, at a time when they are saying “Let’s all look at this to see if it’s any good”, rather than going “Oh, hell. Now we have to do something”? Would there be much greater benefit in bringing in their considerable expertise, teaching and research excellence, and resources now, when there is so much room for ground-level innovation?

This is certainly something I’m fearful of in my own system, where the Group of 8 Universities has most of the research funding, most of the higher degree granting and, as a goal at least, targets the best staff and students. Our size and tradition can be barriers to agility and innovation, although our recent strategy is obviously trying to set our University on a more innovative and more agile course. A number of recent local projects are embracing the legitimacy of new learning and teaching approaches. It is, however, very important to remember the example of IBM and how the holders of tradition may not necessarily be welcomed as a legitimising influence when others have been highly successful innovating in a new space that the tradition holder deemed beneath them until reality finally intruded.

It’s easy to stand back and say “Well, that’s fine for people who can’t afford mainframes” but such a stance must be balanced with looking to see whether people still need or want to afford mainframes. I think the future of education is heavily blended – MOOC + face-to-face is somewhere where I think we can do great things – but for now it’s very interesting to see how we develop as we start to take more and more steps down this path.


Education is not Music: A Long Winded Agreement with Aaron Bady

Mark Guzdial has been posting a great deal on MOOCs, as have we all, although Mark is much easier to read than I am, and his recent comment on Aaron Bady’s response to Clay Shirky’s “Udacity is Napster” drew me to Bady’s great article and the following key quote inside it:

“I think teaching is very different from music”

and I couldn’t agree more. Let me briefly list why I feel that a comparison to Napster has no real validity, to agree with Aaron that Clay Shirky’s argument is not well grounded for the discussion of education. What’s interesting is that I believe that Shirky identifies this point in his own essay, but doesn’t quite realise the full implications of what he’s saying:

Starting with Edison’s wax cylinders, and continuing through to Pandora and the iPod, the biggest change in musical consumption has come not from production but playback.

Those earlier systems started out markedly inferior to the high-cost alternative: records were scratchy, PCs were crashy. But first they got better, then they got better than that, and finally, they got so good, for so cheap, that they changed people’s sense of what was possible.

The first thing we need to remember about music is that music is inherently fungible: when viewed as a piece of work, you can replace it with another effectively identical item. Of course, here we need to be careful and define what we mean by identical, because music, as it turns out, is almost never identical but it gets treated that way. If you doubt this, then go and review how much it costs to insert the song “Happy Birthday to You” into a movie or TV show. It doesn’t matter if it’s Homer Simpson yelling it drunkenly, or the Three Tenors singing it sotto voce as part of an Ally McBeal shower hallucination flashback, you will still be liable to fork out dollars to the company who claims to hold the copyright. To understand the history of how we even made music small enough to send across the (much, much slower back then) Internet, we have to start with the MP3 format, which threw away enough ‘unneeded’ data from the original CD files to shrink them to a little less than 10% of their original size. This is the technology that we needed before we could even get around to the idea of Napster, because enough people had enough music on their hard drives (because we’d already dropped the size) to make file sharing useful. However, as Shirky also notes in his article, this lossy compression technique changes the way that music sounds, and you can tell the difference if you listen carefully and know what to listen for. Yet this is the same song, and Napster got into trouble for sharing compressed artefacts of lower quality and perceptible difference from the CD originals, because music, as this kind of artefact, is fungible despite very different levels of quality. Identical, to an audiophile, means sounding precisely the same (or true to the source, really), but identical, to the copyright owner, is a representation that clearly indicates unauthorised use of copyright material – which is why George Harrison’s “My Sweet Lord” ended up being described as sufficiently similar to “He’s So Fine”, despite it being a brand new recording and not just a compressed copy.
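
As a rough back-of-the-envelope check on that “less than 10%” figure (the bitrates here are standard reference values of mine, not numbers from Shirky’s essay): CD audio is 16-bit stereo sampled at 44.1 kHz, while a typical Napster-era MP3 was encoded at around 128 kbps.

```python
# Back-of-the-envelope: how much smaller is a 128 kbps MP3 than CD audio?
CD_BITRATE = 44_100 * 16 * 2   # samples/s * bits/sample * channels = 1,411,200 bits/s
MP3_BITRATE = 128_000          # a typical Napster-era MP3, in bits/s

def size_mb(bitrate_bps: int, seconds: int) -> float:
    """Size in megabytes of a constant-bitrate audio stream."""
    return bitrate_bps * seconds / 8 / 1_000_000

song = 4 * 60  # a four-minute song, in seconds
print(f"CD:    {size_mb(CD_BITRATE, song):.1f} MB")   # ~42.3 MB
print(f"MP3:   {size_mb(MP3_BITRATE, song):.1f} MB")  # ~3.8 MB
print(f"Ratio: {MP3_BITRATE / CD_BITRATE:.1%}")       # ~9.1%: "a little less than 10%"
```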

So, yes, Shirky’s original quotes are both true – we have improved playback and while MP3 is still very common, lossless and much higher quality reproductions are now available. However, the point that has been missed is that the vast majority of people do not care in the slightest. The average person will only notice a shift from MP3 to lossless if they suddenly discover that their iPod has dropped in capacity, when measured in number of songs, by a significant margin. If I listen to “Viva La Vida” by Coldplay, and yes, Joe Satriani fans, I picked that deliberately, then the effective difference in my enjoyment of the song, my ability to sing along tunelessly in the shower and the ability to recite the words if asked, has nothing to do with the quality. This is not true of certain pieces of classical music, where the compression artefacts start to have more of an effect, but these are not the core business of file sharers and those who trade in compressed artefacts. However, MP3 artefacts rarely sound like long scratches, dust on the record or a bad needle – yes, they can be irritating, but the electronic form, pre and post-compression, is generally protected from such things unless you get some serious cosmic ray action in your storage media and even then, you have to be very unlucky.

The Napster music argument, for me, falls down because the increase in quality does not have a direct connection to what the majority of the user base would have considered an acceptable product. Yes, it’s better now but, for most people, so what? Music sharing services are considered useful and valuable because they share songs that people want; most people don’t think about the quality, they accept the name and the recognisable nature of the song as enough.

This is not at all true for education, because educational experiences vary wildly between lecturers, courses, institutions and eras to an extent that it is impossible to consider them in any way to be interchangeable – quality, here, is everything. If you have an international articulation program, you know that the first thing you have to do is to work out what has been taught, and how it has been taught, inside a course of the same name as one of yours. Even ‘name equivalence’ doesn’t mean anything here and we do not, or we should not, grant standing based on a coincidence of name for a course. There is no parallel guarantee that my low quality version of a course will give me the same ability to “sing in the shower” as the high quality course will – and this is, for me, an unassailable difference.

There is no doubt that the opportunities that might be offered by blended learning, full electronic offerings, and, yes, MOOCs (however they end up being defined) are something that we have to consider because, if they work, they allow us to educate the world, but claiming that this must occur because Udacity is like Napster completely ignores the core difference between education and music in terms of the consumer base and their focus on what it means for a service to meet their requirements. If students didn’t care about the perceived quality, then we wouldn’t have the notion of the ‘top schools’ or ‘low end schools’, so we know that this thinking exists. A student will happily put an MP3 on at a party, but it remains to be seen if they will constantly and out of design, not desperation, put a MOOC course on a job application, and expect a good result from it.