“You Will Never Amount to Anything!”
Posted: December 10, 2012 Filed under: Education | Tags: advocacy, authenticity, blogging, community, curriculum, education, educational research, ethics, feedback, Generation Why, grand challenge, higher education, in the student's head, learning, led zeppelin, measurement, principal skinner, principles of design, reflection, resources, simpsons, student perspective, teaching, teaching approaches, thinking, tools, universal principles of design, work/life balance, workload

I am currently reading “When Giants Walked the Earth: A Biography of Led Zeppelin” by Mick Wall. I won’t go into much of the detail of the book but the message presented around the four members of the group is that most of them did not have the best experiences in school and that, in at least two cases, the statements written on their reports by their teachers were profoundly dismissive. Now, it is of course entirely possible that the Led Zep lads were, at the time of leaving school, incapable of achieving anything – except that this is total nonsense, as it is quite obvious that they achieved a degree of musical and professional success that few contemplate, let alone reach.
You’ll often read this kind of line in celebrity biographies – that semi-mythical reforging of the self after having been judged and found wanting. (From a narrative perspective, it’s not all that surprising, as it’s an easy way to increase the tension.) But one of the reasons that it pops up is that such a statement is so damning that it is not surprising that a successful person might want to wander back to the person who said it and say “Really?” But to claim that such a statement is a challenge (as famously mocked in the Simpsons, where Principal Skinner says that these children have no future and is forced to mutter, with false bonhomie, ‘Prove me wrong, kids, prove me wrong.’) is confused at best, disingenuous and misdirecting at worst. If you want someone to achieve something, provide a clear description of the task and the means to achieve it, and then set about educating and training. No-one has ever learned brain surgery by someone yelling “Don’t open that skull”, so pretending that an entire life’s worth of motivation can be achieved by telling someone that they have no worth is piffle. Possibly even balderdash.
The phrase “You Will Never Amount To Anything” is, in whatever form it is uttered, a truly useless sentiment. It barely has any meaning (isn’t just being alive being something and hence amounting to a small sort of anything?) but, of course, it is not stated in order to achieve an outcome other than to place the blame for the lack of engagement with a given system squarely at the feet of the accused. You have failed to take advantage of the educational opportunities that we have provided and this is such a terminal fault, that the remaining 90% of your life will be spent in a mobile block of amber, where you will be unable to effect any worthwhile interaction with the universe.
I note that, with some near misses, I have been spared this kind of statement but I do feel very strongly that it is really not anything that you can say with any credibility or useful purpose. If you happen to be Death, the Grim Reaper, then you can stand at the end of someone’s life and say “Gosh, you didn’t do a great deal, did you?” (although, again, what does it mean to do anything anyway?) but saying it when someone is between the ages of 16 and 20? You might be able to depend upon the statistical reliability that, if rampant success in our society is only given to 1%, then 99% of the time anyone to whom you say “You will not be a success” will accidentally fall into that category. It’s quite obvious that any number of the characteristics that are worthy of praise in school contribute nothing to the spectacular success enjoyed by some people, where these characteristics are “sitting quietly”, “wearing the correct uniform” or “not chewing gum”. These are excellent facets of compliance and will make for citizens who may be of great utility to the successful, but it’s hard to see many business leaders whose first piece of advice to desperate imitators is “always wear shiny shoes”.
If we are talking about perceived academic ability then we run into another problem, in that there is a great deal of difference between school and University, let alone school and work. There is no doubt that the preparation offered by a good schooling system is invaluable. Reading, writing, general knowledge, science, mathematics, biology, the classics… all of these parts of our knowledge and our society can be introduced to students very usefully. But to say that your ability to focus on long division problems when you are 14 is actually going to be the grand limiting factor on your future contribution to the world? Nonsense.
Were you to look at my original degree, you might think “How on Earth did this man end up with a PhD? He appears to have no real grasp of study, or pathway through his learning.” and, at the time of the degree, you’d be right. But I thought about what had happened, learned from it, and decided to go back and study again in order to improve my level of knowledge and my academic record. I then went back and did this again. And again. Because I persevered, because I received good advice on how to improve and, most importantly, because a lot of people took the time to help me, I learned a great deal and I became a better student. I developed my knowledge. I learned how to learn and, because of that, I started to learn how to think about teaching, as well.
If you were to look at Nick Falkner at 14, you may have seen some potential but a worrying lack of diligence and effort. At 16, you would have seen him blow an entire year of school exams because he didn’t pay attention. At 17 he made it into Uni, just, but it wasn’t until the wheels really started to fall off that he realised that being loquacious and friendly wasn’t enough. Scurrying out of Uni with a third-grade degree into a workforce that looked at the evidence of my learning drove home that improvements were to be made. Being unemployed for most of a year cemented it – I had set myself up for a difficult life and had squandered a lot of opportunities. And that is when serendipity intervened, because the man who has the office next to me now, and with whom I have coffee almost every morning, suggested that I could come back and pursue a Masters degree to make up for the poor original degree, and that I would not have to pay for it upfront because it was available as a government deferred-payment option. (Thank you, again, Kevin!)
That simple piece of advice changed my life completely. Instead of not saying anything or being dismissive of a poor student, someone actually took the time to say “Well, here’s something you could do and here’s how you do it.” And now, nearly 20 years down the track, I have a PhD, a solid career in which I am respected as an educator and as a researcher and I get to inspire and help other students. There’s no guarantee that good advice will always lead to good outcomes (and we all know about the paving on the road to Hell) but it’s increasingly obvious to me that dismissive statements, unpleasant utterances and “cut you loose” curtness are far more likely to do nothing positive at all.
If the most that you can say to a student is “You’re never going to amount to anything”, it might be worth looking in a mirror to see exactly what you’ve amounted to yourself…
A tragic and unintended outcome of an act with no benefit
Posted: December 9, 2012 Filed under: Education | Tags: advocacy, authenticity, blogging, community, education, ethics, Generation Why, higher education, in the student's head, reflection, student perspective, teaching approaches, thinking

Recently, a pair of radio hosts from the Sydney 2Day FM station prank-called the hospital in which the Duchess of Cambridge was receiving treatment for medical issues associated with her pregnancy. Pretending to be the Queen, at 5:30am UK time, they managed to fool the nurse who was staffing reception (as the normal reception staff were not on duty) and got put through to the ward, where they managed to extract some information. Exceedingly sadly, after the hoax became apparent, this rather thoughtless and unfunny invasion of privacy has now had a tragic final act, in that the nurse who was believed to have passed the call through, Jacintha Saldanha, has been found dead, apparently by her own hand. You can read about this in a reasonable summary from the Sydney Morning Herald.
There is (currently) no direct connection between the prank event and the death of Ms Saldanha but, given the people and the profile that we are talking about, one can easily imagine the pressure (real or imaginary) that someone would be under if they had failed to protect any patient, let alone the one that we are discussing. Of course, the radio show hosts did not intend for this outcome and, before there are any more calls for their heads, let us remember the notion of moral accident and the fact that, while their action was an inexplicable invasion of privacy, foolish, unfeeling and in poor taste, it was never intended to be lethal. Should they face questions? Yes.
Why?
Because it is not hard to summon the modicum of empathy required to understand why a woman who is experiencing any difficulties at all during pregnancy might have the reasonable expectation to be left alone and not be picked on for the delight of two radio hosts and their audience. Regardless of which family the Duke of Cambridge was born into and into which the Duchess of Cambridge has married, they are people and, by all accounts, live a surprisingly normal life for the couple who will (most likely) one day rule as the King and Queen of the United Kingdom. It is none of my business as to the details of the Duchess’ illness or condition, unless she wishes to release it, any more than it is the Queen’s business to prank call me into revealing the mark I received for Numerical Analysis I the first time I sat it, in the hopes of embarrassing me.
(With the greatest respect, Your Majesty, it was a 23 Fail because I did not attend lectures or do enough of the preparatory work. I would be grateful if you would consider using that knowledge wisely, Ma’am.)
As it stands there is the usual angry media reaction (and popular backlash) one sees when a stupid prank goes horribly wrong, but what is never truly questioned is why on earth we persist with this nonsense in the first place. I often ask my students very direct questions when they tell me things. “Why did you do this?” is, apparently, a startling question to some of my students because it seems to stun them with its simplicity.
“You performed this action that had no positive value or it had a negative and unpleasant impact on the world. Why did you do this?” is the simplest, sanest question that should be asked whenever anybody does something like that. No doubt all of my poor Grand Challenge students are waiting for me to type Cui Bono? so I’ll get that out of the way but, in reality, cui bono (who benefits) seeks to locate the benefiting party to assign malign intent, rather than quisquam bono?, which is what I’m asking here: does anyone actually benefit? (My Latin is very rusty so I welcome corrections from classical scholars and revenant Romans.)
I often mutter things along the lines of “Just because you can, doesn’t mean you should”, mainly because I’m now middle-aged and it’s somewhat expected, but also because I strongly believe that we are moving into an age where the ability to do stupid things on the global scale is now within the reach of anyone with a telephone, a web browser and a general lack of empathy or kindness.
It is because I understand people, and I do have empathy, that I have the deepest sympathies for the family of Ms Saldanha, a husband and two teenagers, who must be suffering through a terrible and public loss, but are doing so with a great deal of dignity as I understand it. However, it would be wrong not to have some feeling for the radio hosts themselves because it would be the most egregious error to assign intent to their thoughtlessness. They did not set out to create this situation. However, and let me be clear, any situation that they did set out to create was almost completely without benefit to anyone, lacked respect, lacked empathy, was invasive, was unpleasant and should never have been attempted. Their lack of genuine apology could be seen in the Tweeted advertisements carried in one of the hosts’ feeds until it was suspended. (For me, it is the lack of empathy that is sadly unsurprising. Why should Michael Christian be doing anything other than his job in this situation: producing high impact media buzz and then tapping it to drive up ratings? Of course, if he had a real sense of what he was doing, he would have pulled the prank either before it started or once they got past the reception, because they were about to violate someone’s privacy. Are we at fault because of who we select to hold the broadcast roles? Can you blame the gladiators for being bloodthirsty when we’re screaming around the circus?)
My next question to my students would normally be “So what now?” What is it that the student is planning to change in order for this situation to not occur again? In the case of my students, they are juggling work, family and being young. However, almost all of the things that my students do have some benefit (pub crawls notwithstanding). In this case, the CEO of the radio station has offered that, while no-one could have foreseen this, prank calls had been going on for years… Yes. And? We died of cholera for years, too. Let’s not argue tradition for something that has as its prime fruits the embarrassment and humiliation of another person, where we play with people without knowing how robust they are for this game.
Jacintha Saldanha is, tragically, dead and it does appear that this questionable act of entertainment may have been associated with her death. Perhaps now is not a bad time to put the prank call into the same giant old wardrobe where we put all of the behaviours that never really made any sense and certainly make no sense when we should know so much better – and let’s stop the practice.
Why are we doing something? What is the benefit? Is our enjoyment really worth humiliating or embarrassing someone else on public radio? Where is the benefit in this, for anyone? If my students can drag together sensible and coherent answers to this when asked, so can our broadcast institutions and our journalists.
Brief Stats Update: Penultimate Word Count Notes
Posted: December 8, 2012 Filed under: Education | Tags: advocacy, authenticity, blogging, community, data visualisation, design, education, educational problem, educational research, higher education, student perspective, teaching, teaching approaches, thinking, tools, work/life balance, workload, writing

I occasionally dump the blog and run it through some Python script deliciousness to find out how many words I’ve written. This is no measure of worth or quality, more a metric of my mania. As I noted in October, I was going to hit what I thought was my year target much earlier. Well, yes, it came and it went and, sure enough, I ploughed through it. At time of writing, on published posts alone, we’re holding at around 1.2 posts/day, 834 words/post and a smidgen over 340,000 words, which puts me (in word count) just after Ayn Rand’s “The Fountainhead” (311,596) but well behind her opus “Atlas Shrugged” (561,996). In terms of Objectivism? Let’s just say that I won’t be putting any kind of animal into that particular fight at the moment.
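For the curious, the “Python script deliciousness” boils down to stripping markup and counting what is left. Here is a minimal sketch of that kind of counting – the regexes, the sample posts and the function names are my own invention for illustration, and the naivety of the tag-stripping is exactly why a count like this deserves an accuracy caveat:

```python
import re

def count_words(html):
    """Crude word count: strip HTML tags and simple entities, then split on whitespace."""
    text = re.sub(r"<[^>]+>", " ", html)   # drop tags (naive: will misfire on '>' inside attributes)
    text = re.sub(r"&\w+;", " ", text)     # drop simple character entities such as &amp;
    return len(text.split())

def blog_stats(post_bodies, days):
    """Return total words, mean words per post, and posts per day for a list of HTML bodies."""
    total = sum(count_words(body) for body in post_bodies)
    return total, total / len(post_bodies), len(post_bodies) / days

# Two toy posts over two days:
posts = ["<p>Hello <a href='#'>dear</a> reader</p>", "<p>One &amp; two three four</p>"]
total, words_per_post, posts_per_day = blog_stats(posts, days=2)
```

Anything this regex-based approach swallows or misses (embedded widgets, odd entities, quoted text) feeds straight into the error margin, which is why a figure quoted to six significant figures is not accurate to six significant figures.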
Now, of course, I can plug in the numbers and see that this puts my final 2012 word count somewhere in the region of 362,000 words. I must admit, there is a part of me that sees that number and thinks “Well, we could make it an even 365,000 and that’s a neat 1000 words/day” but, of course, that’s dumb for several reasons:
- I have not checked in detail exactly how well my extraction software is grabbing the right bits of the text. There are hyperlinks and embellishments that appear to be taken care of, but we are probably only on the order of 95% accuracy here. Yes, I’ve inspected it and I haven’t noticed anything too bad, but there could be things slipping through. After all of this is over, I am going to drag it all together and analyse it properly but, let me be clear, just because I can give you a word count to 6 significant figures, doesn’t mean that it is accurate to 6 significant figures.
- Should I even be counting those sections of text that are quoted? I do like to put quotes in, sometimes from my own work, and this now means I’m either counting something that I didn’t write or I’m counting something that I did write twice!
- Should I be counting the stats posts themselves as they are, effectively, metacontent? This line item is almost above that again! This way madness lies!
- It was never about the numbers in the first place, it was about thinking about my job, my students, my community and learning and teaching. That goal will have been achieved whether I write one word/day from now on or ten thousand!
But, oh, the temptation to aim for that ridiculous and ultimately deceptive number. How silly but, of course, how human to look at the measurable goal rather than the inner achievement or intrinsic reward that I have gained from the thinking process, the writing, the refining of the text, the assembly of knowledge and the discussion.
Sometime after January the 1st, I will go back and set the record straight. I shall dump the blog and analyse it from here to breakfast time. I will release the data to interested (and apparently slightly odd) people if they wish. But, for now, this is not the meter that I should be watching because it is not measuring the progress that I am making, nor is it a good compass that I should follow.
I Can’t Find My Paperless Office For All The Books
Posted: December 8, 2012 Filed under: Education | Tags: authenticity, blogging, book, data visualisation, design, education, Generation Why, higher education, in the student's head, literature, measurement, principles of design, resources, student perspective, teaching approaches, thinking, tools, writing

I tidied up my office recently and managed to clear out a couple of boxes full of old paper. Some of these were working drafts of research papers, covered in scrawl (usually red because it shows up more), some were book chapter mark-ups, and some were things like project meeting plans that I could scribble on as people spoke. All of this went into either the secure waste bins (sekrit stuff) or the general recycling because I do try to keep the paper footprint down. However, my question to myself is two-fold:
- Why do I still have an office full of paper when I have a desktop, (two) laptops, an iPad and an iPhone, and I happily take notes and annotate documents on them?
- Why am I surrounded by so many books, still?
I don’t think I’ve ever bought as many books as I have bought this year. By default, if I can, I buy them in both electronic and paper form so that I can read them when I travel or when I’m in the office. There are books on graphic design, books on semiotics, books on data visualisation and analysis, and now, somewhat recursively, books on the end of books. My wife found me a book called “This is not the end of the book”, which is a printed conversation between Umberto Eco and Jean-Claude Carrière, curated by Jean-Philippe de Tonnac. I am looking forward to reading it but it has to wait until some of the other books are done. I have just finished Iain M. Banks’ latest “The Hydrogen Sonata”, am swimming through an unauthorised biography of Led Zeppelin and am still trying to finish off the Derren Brown book that I have been reading on aeroplanes for the past month or so. Sitting behind all this are “Cloud Atlas” and “1Q84”, both of which are officially waiting until I have finished my PhD application portfolio for creative writing. (Yes, dear reader, I’m nervous because they could as easily say ‘No’ as ‘Yes’ but then I will learn how to improve and, if I can’t take that, I shouldn’t be teaching. To thine own dogfood, be as a consumer.)
Why do I still write on paper? Because it feels good. I select pens that feel good to write with, or pencils soft enough to give me a good relationship to the paper. The colour of ink changes as it hits the paper and dries and I am slightly notorious for using inks that do not dry immediately. When I was a winemaker, I used black Bic fine pens, when many other people used wet ink or even fountain pens, because the pen could write on damp paper and, even when you saturated the note, the ink didn’t run. These days, I work in an office and I have the luxury of using a fountain pen to scrawl in red or blue across documents, and I can enjoy the sensation.
Why do I still read on paper? Because it is enjoyable and I have a long relationship with the book, which began from a very early age. The book is also, nontrivially, one of the few information storage devices that can be carried on to a plane without having to be taken from one’s bag or shut down for the periods of take off and landing. I am well aware of the most dangerous points in an aircraft’s cycle and I strongly prefer to be distracted by, if not in-flight entertainment, then a good solid book. But it is also the pleasure of being able to separate the book from the devices that link me into my working world, yet without adding a new data storage management issue. Yes, I could buy a Kindle and not have to check my e-mail, but then I have to buy books from this store and I have to carry that charger or fit it next to my iPad, laptop and phone when travelling. Books, once read, can either be donated to your hosts in another place or can be tossed into the suitcase, making room for yet more books – but of course a device may carry many books. If I have no room in my bag for a book, then I don’t have to worry about the fragility of making space in my carry-on by putting it into the suitcase.
And, where necessary, the book/spider interaction causes more damage to the spider than the contents of the book. My thesis was sufficiently large to stun a small mammal, but you would not believe how hard it was to get ethical approval for that!
The short answer to both questions is that I enjoy using the physical forms although I delight in the practicality, the convenience and the principle of the electronic forms. I am a happy hybridiser who wishes only to enjoy the experience of reading and writing in a way that appeals to me. In a way, the electronic format makes it easier for me to share my physical books. I have a large library from when I was younger that, to my knowledge, contains books that are almost impossible to find in print or in libraries any more. Yet, I am in that uncomfortable position of being a selfish steward, in that I cannot release some of these books for people to read because I hold the only copy that I know of. As I discover more books in electronic or re-print format (the works of E. Nesbit, Susan Cooper in the children’s collection of my library, for example) then I am free to use the books as they were intended, as books.
What we have now, what is emerging, certainly need not be the end of the book but it will be interesting to look back, in fifty years or so, to find out what we did. Perhaps the book has become the analogue watch of information: it moved from status symbol for its worth, to status symbol for its value, to affectation and, now, to many of my students, an anachronism for anyone with a perfectly good time signal on their phone. I suspect that a watch does not have the sheer enjoyability of the book or the pen on paper, but, if you will excuse me, time will tell.
Information and Education: Other Cultures, Other Views
Posted: December 8, 2012 Filed under: Education | Tags: advocacy, authenticity, blogging, community, education, higher education, in the student's head, Kaurna, student perspective, teaching, teaching approaches, thinking, tools, universal principles of design

I’ve had the good fortune to be able to start finding out about how other cultures deal with information and education. This is important for several reasons. Firstly, it helps to remind me that the perception of the dominant monoculture is both primarily a perception and an accident of history, geography and timing. Secondly, it reminds me how easy it is to slip into the monocultural assumption. Finally, it helps me to prepare my students for a world that could be very different from this one.
I’m not a true relativist; I think that some cultural practices (including but not limited to formalised child abuse and female circumcision) are indefensible because they are far too great an imposition on the individual. So let me get that cultural bias onto the table to allow you to frame what I say next. Some ideas, especially when we start dealing with the value of wisdom, and the specific roles of the knowledge keepers in the dispensation and passage of that wisdom, fascinate me, but I am still not sufficiently versed to be able to discuss them with any authority or detail. I can, however, discuss ideas with my students such as secret knowledge, without being a Mason, or gendered knowledge, without being of a practising culture, because to do so allows them to realise that there is more to the world than European-derived cultural norms. We don’t have to necessarily agree with all of these other ideas, especially where gender discrimination is preventing access to essential knowledge or limiting advancement, but it is important to understand that it exists.
The role of the knowledge keeper varies with culture and it can be quite confronting for my students to encounter a situation where a single person has the knowledge and may not be available all the time. At the recent Adelaide Computing Education Conventicle that I ran, two presenters from the University of South Australia presented work on integrating Australian Indigenous Culture into ICT project work and discussed the way that it changed the projects. The person needed is in hospital for treatment? Then you’ll have to wait until they get back because they are the person that you have to talk to. A friend has told me about this before in the context of geological information in the Australian Outback. You want to know about this section of the land? Well, you can’t ask the men about it, it’s not their land. If you want to ask the women, then you’re going to have to work out who can ask it and what can be told in a way that can be viewed by people outside (and by men).
Just because we want to know something from a specific culture does not give us the right to demand it and getting this across to students is, I think, one of the most important steps in establishing a mutual respect between cultures and a way of avoiding misunderstandings in the future. It’s easy to start jumping up and down in that tiresome Western manner about this kind of information management but I think we can be pretty sure that the majority of the indigenous population of Australia would have quite a lot to say about having to conform to our cultural norms, so we should think pretty carefully before we start placing our rule sets over their knowledge.
Uncle Lewis O’Brien, Elder of the Kaurna people, noted once that it was common to welcome newcomers to your land, to show them around so that they could see how good the land was and how much care was being taken of it, but it was always done in the understanding that, one day, the visitor would go home. As he noted, wryly, perhaps his people should have been clearer on that last bit with the original white settlers. But here we are now.
Cultural issues are important to the people in that culture and working out how we can marry these requirements allows us to demonstrate our maturity as people and our level of comfort with our own beliefs. If, one day, somebody shows me something so amazing and truthful that I start believing in a new belief system or an entirely new way of living, then I hope that I would be able to cope with it and make sense of it. In New Zealand, Maori medical researchers are working through the cultural taboo of handling the dead in order to meet the educational requirements of working with tissue samples. If we can work with closing shops on Saturday or Sunday for Synagogue or Church (as we did for centuries), then we can put some thought into incorporating the living beliefs of other cultures without dying of shock or making racist statements about ‘backwards cultures’. You go and thrive in the middle of Australia for a while and tell me how much knowledge it required to avoid dying of thirst on the third day.
I’m always worried when we start rejecting other cultures because monocultures are not strong, they’re weak. By definition, they are static and immutable – the rock, not the water. They’re prone to a single attack vector and, if they fail, they fail on a massive scale. I’m not talking just about our unnatural dependency on a single banana cultivar or wheat variety, I’m talking about real disasters that have occurred because of a lack of resistance to animal-borne diseases. The current thinking is that both North America and Australia were far more heavily populated than the original European explorers thought, but that earlier contact had introduced devastating levels of disease that almost wiped out the populations – making the subsequent colonisation and seizure of land easier. These were accidental resistance monocultures, caused by geographical isolation. Now we are connected and we have no excuse for this.
What my students have to understand is that the world of three hundred years ago was not the world of two hundred or one hundred years ago. Empires rise and fall. Cultures come and go. Today’s leader is tomorrow’s footnote. Learning how to work with other cultures and how to reduce the dependency on a single strand may be what changes the way that our history unfolds. I’m not naive enough to believe that we’re at the end of history (the end of conflict) but I think that we’re sufficiently well connected and well informed that we can tell our students that not everything different is wrong and scary, and that not everything familiar is right and just.
I wonder what they’ll be saying about us, in 2112?
Data Visualisation: Strong Messages Educate Better
Posted: December 7, 2012 Filed under: Education | Tags: advocacy, authenticity, data visualisation, design, education, higher education, principles of design, reflection, student perspective, teaching, teaching approaches, thinking, tools

Know what this is?
Rather pretty, isn’t it? But it has a definite direction, like someone has thrown something from the right and it has hit the ground and scattered.
This image is from the Bomb Sight website, and shows all of the bombs that fell on London (and surrounds) from the 7th of October, 1940, to the 6th of June, 1941. The Bomb Sight team have been working from a variety of data sources to put together a reasonably reliable picture of the recorded bombs on London over that 242 day period. If you zoom in (and it starts zoomed in), you start to see how many sites took 2, 3, 4 or more bombs (10, 11, plus) over that time.
If I were to put together a number of bombs and a number of days and say “X bombs fell in London over Y days”, you could divide X by Y and say “Gosh.” Or I can show you a picture like the one above and tell you that each of those dots represents at least one bomb, possibly as many as 10 or so, and watch your jaw drop.
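Underneath, a map like Bomb Sight’s is an aggregation of strike coordinates into locations, which is what lets the repeat hits on a single site show through where a bare X-divided-by-Y average would hide them. A toy sketch of that aggregation step – the coordinates, the cell size and the function name here are invented for illustration, not Bomb Sight’s actual data or code:

```python
from collections import Counter

def strikes_per_cell(points, cell=0.01):
    """Bin (lat, lon) strike records into grid cells and count strikes per cell."""
    return Counter((round(lat / cell), round(lon / cell)) for lat, lon in points)

# Four hypothetical strikes: three land in the same cell, one elsewhere.
strikes = [(51.500, -0.100), (51.501, -0.099), (51.499, -0.101), (51.600, -0.200)]
density = strikes_per_cell(strikes)
```

Plotting each cell with an intensity proportional to its count is what produces the scattered, directional picture; averaging the same records into “X bombs over Y days” throws all of that spatial structure away.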
Seen this way, the Blitz becomes closer to those of us who were fortunate enough not to live through that terrible period. We realise any number of things, not least of which is that close proximity to a force that wishes you ill is going to result in destruction and devastation of a level that we might not be able to get our heads around, unless we see it.
Seen this way, it’s a very strong message of what actually happened. It has more power. In a world of big numbers and enormous data, it’s important to remember how we can show things so that we tell their stories in the right way. Numbers can be ignored. Pictures tell better stories, as long as we are honest and truthful in the way that we use them.
The Invisible Fragility of our World of Knowledge
Posted: December 7, 2012 Filed under: Education | Tags: advocacy, authenticity, blogging, boustrophedon, community, education, feedback, higher education, internet, reflection, resources, rongorongo, thinking, tools, universal principles of design 1 CommentIf I were to mention that I was currently researching Rongorongo, as background for a story in which the protagonists communicated in a range of reverse boustrophedonic texts, there are three likely outcomes.
- You would roll your eyes and close the browser, or,
- You would think “Aha, that’s what I was talking about last night at the Friends of Rapanui Quiz Night. How apt!”, or,
- You would go and look up Rongorongo and boustrophedon in Wikipedia.
What I am fairly sure that most of you will not do is go and look up the information in a book, go to a library or even ask another human. (Some of you will have used physical means such as books or libraries because you are deliberately physical users. I am after the usage patterns that you adopt unconsciously, or as a matter of actual habit, rather than those employed as a deliberate endeavour to use another source.) There is no doubt that we live in an amazing world of immediately available information, and that it has changed the way that we use, store and retrieve information, but this immediacy has come at a cost: we tend not to use or consult physical media as much. As a result, there is less of the physical to hand, most of the time. I have noticed a major change in the way that I use information and, while I tend to read and annotate material on printed paper (using a fountain pen, no less, so I am not judging anyone for their affectations), I search and edit in the digital form. Why? Each form has its own efficiencies.
The absence of the physical artefact is often not noticeable unless we are cut off from the Internet or from our stored versions of the material. Last week, my laptop decided that it would no longer boot and I realised, with mounting horror, that my only copies of certain works in progress were sitting on this ‘dead’ machine. Why weren’t they backed up? Because I was not connected to the Internet for a few hours and I had left my actual backup device at home, to reduce the risk of losing both laptop and backup in the same localised catastrophe.
The majority of the on-line information repositories are remarkable in their ease of use and sheer utility – as long as you can connect to them. We, however, have an illusion of availability and cohesion that is deceptive, and it is the comfortable analogue of the printed page that lulls us into it. Wikipedia, for example, presents a single page full of text, just like a book does. It is only when you look at the History and the Discussion that it dawns on you that each character on the page could have been contributed by a different source. The printed page is the final statement of a set of arguments between the authors, the editors and their mutual perceptions of reality, but it is static once printed. Wikipedia’s strength and its weakness is that the argument never ends. Anything on a publicly editable page is inherently fragile and ephemeral. What is there today may not be there tomorrow and there is no guarantee that what appears sound now will be anything other than horrible and deliberately broken in a second.
The fragility doesn’t stop there, however, because we don’t actually have any part of Wikipedia inside our offices, unless you happen to be Jimmy Wales. (Hi!) Wikipedia.org, the domain name of Wikipedia, is registered in California, but the server I was connected to (for the queries I put above) was in Washington State, and there were some 17 active network devices involved in routing traffic from me (in Adelaide) to the server (in Washington) and then getting the information back. This doesn’t count the active electronic devices that I can’t see in this path and, believe me, there will be a lot of them. Now, we build a lot of redundancy into the global network that we call the Internet (the network of networks of networks), but a major catastrophe on the West Coast will quickly force so much traffic onto those backup links that information flow will slow and, for some good technical reasons, it will then start to fall over.
So the underlying physical pathways that actually shunt the network information from point to point could fall over. At that point, if I had a book on the linguistics of Easter Island, I could read it by torchlight even if I had no local power. A severe power failure here or in enough places along the way, or at Wikipedia’s data centres? Suddenly, my ability to find out anything is blocked.
But let’s look at the information itself. People have been editing the Rongorongo page for over 10 years. The first version (that we can see, Wikipedia can invisibly delete revisions) is recorded for the 25th of November, 2002. Happy double digits, Rongorongo page! Since then there have been roughly 3000 edits. Are all of them the same quality? Hmm. Here are some comments:
14 April 2006, “reinstate link to disambiguate Rongorongo, wife of Turi, NZ”
18 May 2006, “If I want to be blocked, why do I improve these pages? REMEMBER LIUVIGILD! TRY BLOCKING ME!!! BWAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA” (sic)
18 May 2006, “Excuse my insolence. This is not vandalism, as it is all true. Why do you insist on reverting it? Please send a PERSONAL message of explanation. Sincerely, 64.107.172.130”
28 April 2007, “Inhabitants of Easter Island have many names for it.”
7 April 2011, “A picture of a banana leaf is not helpful here. I looked on this banana leaf for scribblings. I know what one looks like, and if someone doesn’t, they can read about it at banana.”
12 November 2012, “What’s wrong, Kwamikagami? It is what it is, isn’t it? Just a straight up comparison of rongorongo and Indus Valley glyphs, nothing more. I’d love to know which ones are ‘not true’ according to you”
There are periods when this page changes every few minutes and other times when it stays the same for days or even months. But most people don’t know this because they never think to look in the history or talk sections. Right now, it appears that someone is disputing the authority of Kwamikagami, an editor who has carried out a lot of edits on this page. This is important because if you say to someone “Hey, look at this page”, then 3000 edits over 10 years averages out at roughly 0.8 edits per day, which suggests the chances of the page changing on any given day are something like 80%. The burstiness of editing would lower this figure, but the general idea stands: this simple page on a dead text(?) is more likely to change on a daily basis than not.
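The back-of-envelope figure above can be sketched out. As a refinement, if we assume (purely for illustration – real edit histories are bursty, which this ignores) that edits arrive independently at a constant rate, a Poisson model gives a more conservative daily probability:

```python
import math

EDITS = 3000        # approximate edit count from the page history
DAYS = 10 * 365     # roughly ten years of history

# Naive reading: treat the average rate itself as a daily "chance of change".
rate = EDITS / DAYS
print(f"average rate: {rate:.2f} edits/day")

# Slightly more careful: if edits arrived independently at this rate
# (a Poisson process), the probability of at least one edit in a given
# day is 1 - P(zero edits) = 1 - e^(-rate).
p_change = 1 - math.exp(-rate)
print(f"P(at least one edit today) = {p_change:.0%}")
```

Either way the conclusion holds: on any given day, the page is more likely to have changed than not, and the true figure sits somewhere between the Poisson estimate and the naive 80%.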
Does this make Wikipedia any better or any worse than the printed page? I think it makes it different because we have to treat it as an evolving discussion that we have walked in on, because of its inherent fragility and ephemeral nature.
We live in amazing times, where I can use a small hand-held device to access almost everything that our species has created. And yet, when I go to look at how robust this knowledge source is and how vulnerable we are to losing our connection to that knowledge, I am reminded that we are going to have to work out how to do this properly. If we give up the fixed physical forms (books, CDs, DVDs), then so be it, but we must make sure that we deal with this fragility before we become too seduced by the immediacy. We have to think about this for our students too. How do we provide them with artefacts that they can consult down the line, when they need to look something up? Books have no licensing agreements, never expire and do not have to be abandoned when a digital format changes. Yet they have none of the digital form’s advantages, either.
I mention this because I am really looking forward to seeing how people address and solve this challenge – how can we have the best of the immediate and convenient, while having the enduring presence and guarantee of future access? Rongorongo itself is a physical artefact for which we have lost the knowledge of reading, or if it is even a text at all. It’s a reminder that we have faced this problem before and we have not solved it sufficiently well. Perhaps this time.
What if we are wrong? Musings on the way home
Posted: December 6, 2012 Filed under: Education | Tags: apocalypse, education, higher education, teaching approaches, thinking Leave a commentWhen I was a physics student, many years ago, we would sometimes entertain the notion of what would happen if something wasn’t the way that it was. What would be the impact of changing the Planck constant – could we diffract through doorways (ignoring how much else would immediately break)? What would happen if the speed of light were much slower – or much faster? What would happen if there were no static coefficient of friction? (The short answer to that last one is Wheeeeeeeeeeeee and the penguins dominate the Earth.)
These thought experiments constitute a principal aspect of physics, specifically, and science, in general. What if?
Now we are approaching another date of a so-called apocalypse and, as I have already posted, both I and the Mayans agree: being scared of the end of this ba’ak’tun makes as much sense as being worried about Sunday night.
But what if? What if, after everything, the world ends on the 21st of December, 2012?
Let’s start this by working out which 21st of December we are talking about. Is it at a particular GMT offset, or when the first country officially reaches the date? Is it even 00:00:01 on that date or something convenient like midday? Do we all have to be in the correct day, or will the world end in neat hourly blocks (half-hourly for difficult time zones like Adelaide)? Spain is GMT+1 but sits under England. Will there be an embarrassing absence of ocean as the seas pour into the hole vacated by the destruction of the Iberian Peninsula?
What do we even mean by the end of the world, anyway? The destruction of all life, all human life, most human life, the flooding of the land, fire, famine, pestilence or the complete obliteration of the planet itself? Is this a grand Universal extinction event or localised to our galaxy?
These are important questions! If we are talking about the wiping out of only some life forms on the planet, with an otherwise intact biosphere, then we have a small but fit-for-purpose International Space Station. Once the disaster is over, the crew can descend and repopulate the Earth.
All guys?
Really?
We are really not taking this apocalypse seriously, are we? We have one opportunity for an isolated spot that could theoretically jump start our race – and looking at the pictures it’s a zero gravity moustache growing competition.
I’m being facetious, obviously, but it is amazing how far the apocalypse idea spreads without any of the questions of any detail being answered. The eschatological aspects of the Bible have been fleshed out in the most amazing detail but this current Mayan apocalypse? Meh.
We are currently seeing another, far more serious, threat manifesting in the steadily unfolding issues caused by climate change, and what scares me is that people have been postulating the What If scenarios on that for decades. We are no longer talking about What If for this; we are talking about What Now. Yet we still argue as if the real and demonstrable changes are as mythical as the Mayan scenario.
It would be darkly amusing if December the 21st, 2012, is revealed, decades hence, to have been the tipping point between salvageable and irretrievable. I sometimes refer to this as the Atlantis moment, the point at which your civilisation is doomed to extinction and myth.
“What if?” is not just a good scientific tool for my students, it’s an ethical and philosophical tool as well. What if we don’t tell these men that we can cure their syphilis? What if I argue in a way that suggests a reversed order of priority for key tasks? What if I take money to stay silent? What would happen if I did nothing? What if? WHAT IF?
What if all of us are wrong about apocalypses because we don’t see well enough through longer periods of time to see what a true disaster looks like?
I’m not expecting the world to end before Christmas (I have flights booked and would hate to miss the party) but it’s not a bad time to step back and think about what would happen, if we were in such dire straits. What if we are?
AAEE 2012 – Yes, Another Conference
Posted: December 5, 2012 Filed under: Education | Tags: aaee2012, advocacy, ci2012, conventicle, education, educational research, feedback, Generation Why, higher education, in the student's head, learning, reflection, research, student perspective, teaching, teaching approaches, thinking, tools, universal principles of design 3 CommentsIn between writing up the conventicle (which I’m not doing yet), the CI Conference (which I’m doing slowly) and sleep (infrequent), I’m attending the Australasian Association for Engineering Education 2012 conference. Today, I presented a paper on e-Enhancing existing courses and, through a co-author, another paper on authentic teaching tool creation experiences.
My first paper gave me a chance to look at the Google analytics and tracking data for the on-line material I created in 2009. Since then, there have been:
- 11,118 page views
- 2.99 pages viewed/visit
- 1,721 unique visitors
- 3,715 visits overall
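Figures like these are worth a quick sanity check, and the derived metric is consistent with the raw counts. A minimal sketch (the numbers are the ones reported above):

```python
# Raw counts from the Google Analytics report quoted above.
PAGE_VIEWS = 11118
VISITS = 3715
UNIQUE_VISITORS = 1721

# The reported "pages viewed/visit" should be total views over total visits.
pages_per_visit = PAGE_VIEWS / VISITS
print(f"pages/visit: {pages_per_visit:.2f}")  # matches the reported 2.99

# A metric the report doesn't state directly: visits per unique visitor,
# i.e. how often the average visitor came back.
visits_per_visitor = VISITS / UNIQUE_VISITORS
print(f"visits/visitor: {visits_per_visitor:.2f}")
```

The second figure suggests the average visitor made a bit over two visits, which sits comfortably alongside the observation below that roughly 60% of viewers return.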
The other thing that is interesting is that roughly 60% of the viewers return to view the podcasts again. The theme of my talk was “Is E-Enhancement Worth It?” and I had the pleasure of pointing out that I felt that it was because, as I was presenting, I was simultaneously being streamed giving my thoughts on computer networks to students in Singapore and (strangely enough) Germany. As I said in the talk and in the following discussion, the podcasts are far from perfect and, to increase their longevity, I need to make them shorter and more aligned to a single concept.
Why?
Because, while the way I present a course as a whole may change as sequencing and scaffolding change, the way that I present an individual concept is more likely to remain the same over time. My next step is to make up a series of conceptual podcasts that are maybe 3-5 minutes in duration. Then the challenge is how to assemble these – I have ideas but not enough time.
One of the ideas raised today is that we are seeing the rise of the digital native, a new type of human acclimatised to a short gratification loop, multi-tasking, and a non-linear mode of learning. I must be honest and say that everything I’ve read on the multi-tasking aspect, at least, leads me to believe that this new generation don’t multi-task any better than anyone else did. If they do two things, then they do them more slowly and don’t achieve the same depth: there’s no shortage of research work on this and, given the limits of working memory and cognition, this makes a great deal of sense. Please note, I’m not saying that Homo Multiplexor can’t emerge; it’s just that I have not yet seen any strong scientific evidence to back up the anecdotes. I’m perfectly willing to believe that default searching activities have changed (storing ways of searching rather than the information) because that is a logical way to reduce cognitive load, but I am yet to see strong evidence that my students can do two things at once well and without any loss of time. Either working memory has completely changed, which we should be able to test, or we risk confusing the appearance of doing two things at once with actually doing two things at once.
This is one of those situations that, as one of my colleagues observed, leaves us in that difficult position of being told, with great certainty, about a given student (often someone’s child) who can achieve great things while simultaneously watching TV and playing WoW. Again, I do not rule out the possibility of a significant change in humanity (we’re good at it) but I have often seen that familiar tight smile and the noncommittal nod as someone doesn’t quite acknowledge that your child is somehow the spearhead of a new parallelised human genus.
It’s difficult sometimes to express ideas like this. Compare this to the numbers I cited above. Everyone who reads this will look at those numbers and, while they will think many things, they are unlikely to think “I don’t believe that”. Yet I know that there are people who have read this and immediately snorted (or the equivalent) because they frankly disbelieve me on the multi-tasking, with no more or less hard evidence than that supporting the numbers. I’m actually expecting some comments on this one because the notion of the increasing ability of young people to multitask is so entrenched. If there is a definitive set of work supporting this, then I welcome it. The only problem is that all I can find supports the original work on working memory and associated concepts – there are only so many things you can focus on and beyond that you might be able to function but not at much depth. (There are exceptions, of course, but the 0.1% of society do not define the rule.)
The numbers are pasted straight out of my Google analytics for the learning materials I put up – yet you have no more reason to believe them than if I said “83% of internet statistics are made up”, which is a made-up statistic. (If it is true, it is accidentally true.) We see again one of the great challenges in education: numbers are convincing, evidence that contradicts anecdote is often seen as wrong, and finding evidence in the first place can be hard.
One more day of conference tomorrow! I can only wonder what we’ll be exposed to.
Killing Your Darlings: The Cost of Innovation (CI 2012)
Posted: December 3, 2012 Filed under: Education | Tags: advocacy, ci2012, education, ethics, feedback, higher education, principles of design, reflection, resources, thinking, tools, universal principles of design Leave a commentI’m going to take a little more informal approach to some of the themes expressed at CI 2012, because I have a lot of things to do, and you have a lot of things to do, so we can’t sit here waiting for me to write everything up and you most certainly don’t want to read 100,000 words about What Nick Did In Late Spring In Melbourne. So let’s go forward.
Innovation is the introduction of the new, whether product, service or idea, but we know what this really means – it means that we have to let go of something old. Letting go of something old is not going to be easy, and how difficult it is can be a very complicated and emotional calculus, so innovation, which can already be hard, is made harder because change can hurt.
If you’re a writer, you may have heard the term “Kill your darlings”, which is attributed to Faulkner (the other one) and is a recasting of the following quote from Sir Arthur Quiller-Couch:
“Whenever you feel an impulse to perpetrate a piece of exceptionally fine writing, obey it – whole-heartedly – and delete it before sending your manuscripts to press. Murder your darlings.”
On shallow reading, it appears that any attachment to something makes it eligible for extinction when what is really meant is that sentimentality is the enemy of objectivity. Innovative change is full of situations where your attachment to elements of your existing situation, or an entrenched commitment to the status quo (no, not the band), will compromise your ability to objectively assess whether you are making a correct decision.
There is a statement that every industry will go away at some stage – we’ve seen the rise and fall of so many that such a statement appears to have some credibility. But what about education? We have changed a great deal but will the industry of education ever truly disappear? I honestly can’t say but I can talk about a simpler problem, which is what the “darlings” are in the traditional Higher Education system. And, sure enough, when we start talking about innovation and the threat of the new, we see these darlings protected in a way that doesn’t necessarily always seem objective. Now, we don’t have to kill any of them but change is inevitable and, if change is to come in, something has to go out. I have a starting list, which I’m planning to work on over time.
- Darling #1, The Lecture:
We know that the traditional 1-to-many broadcast lecture is a successful way to occupy the time of everyone in the room but it is most certainly not the best way to get certain types of information across. There are many different aspects to this but conference talks and seminars are a world away from the traditional “today I will talk slowly about differential equations while I flash hundreds of slides past you at a speed that you can’t record and no you can’t have any notes or recording”.
Yes, some lecturers are better than others but when information transfer and retention is important, the lecture is not the right delivery mechanism. Yet, it’s almost unassailable in its ubiquity. It’s a darling.
- Darling #2, The Exam:
I was looking back at my Grand Challenges course, which had a 20% final examination of some of the core topics, and thought about what it had achieved. From my marking of the exam and review of how students prepared, my goal for the exam worked for most of the class. Most had reviewed all of the core material and organised it in a useful way to be able to summarise the core content of the course.
But did it have to be assigned as a 1 hour exam in a giant examination hall? Did it add anything to the course?
You know, I’m not sure that it did. Next time, I might just assign an exercise to provide a portfolio of work from the course in an organised form, and then run what is effectively a viva voce examination to check that students had done enough work to produce a useful index and had sufficient familiarity to rapidly contextualise problems and knowledge. But, and this is important, far more conversationally.
The examination can be made highly objective and has the advantage that you are really pretty sure that the student is doing the work – but we’re already seeing cheating technology that we will have more and more trouble dealing with. If the only supporting argument for the exam is that it’s harder to cheat, we need a better reason. If the argument is that it will force the student to learn the work, then we’ve got that around the wrong way. We need to bring motivation back into the rest of the course. Right now, the vast majority of learning happens 2-3 days before the exam and is forgotten by the following weekend.
And yet, exams are everywhere. They’re entrenched institutional artefacts. Hello, darling.
- Darling #3, Me and my University:
Oh no! Apostasy! But let’s be honest, the primary question around MOOCs is whether we need the Universities that we’ve had for so many hundreds of years. If we’re questioning the University, then we’re starting to question the role and future of the teaching academic. Teacherless education was a theme that popped up occasionally at CI 2012 and, while I instinctively react to this in terms of ‘well, who builds these experiences?’, we can still learn a lot by looking at what we actually need to make things work.
I have a small office in a big and old University, with my academic robes hanging on the door for when I walk into the graduation ceremony in the giant old sandstone building once or so every year to farewell and congratulate my graduating students. How much of this is necessary recognition of achievement and how much is a darling?
Let’s face it – we’re darlings ourselves.
Let me stress that I am not saying that everything must go, but innovation needs space and that means something else has to go. Rather than saying that everything is sacrosanct, we should really be looking at what can and should go, which will drive a search for the new and innovative. My hope would be that by looking at these things, we find the reasons why some of these could stay and belong in the future, rather than propping them up with sentimentality and an ultimately weak approach to necessary change and reinvigoration.
What are your darlings?




