Thinking about teaching spaces: if you’re a lecturer, shouldn’t you be lecturing?
Posted: December 30, 2012 Filed under: Education | Tags: blogging, collaboration, community, curriculum, design, education, educational problem, feedback, Generation Why, higher education, in the student's head, learning, measurement, principles of design, reflection, resources, student perspective, teaching, teaching approaches, thinking, tools, universal principles of design, vygotsky

I was reading a comment on a philosophical post the other day and someone wrote this rather snarky line:
He’s is a philosopher in the same way that (celebrity historian) is a historian – he’s somehow got the job description and uses it to repeat the prejudices of his paymasters, flattering them into thinking that what they believe isn’t, somehow, ludicrous. (Grangousier, Metafilter article 123174)
Rather harsh words in many respects and it's my alteration of the (celebrity historian)'s name, not his, as I feel that his comments are mildly unfair. However, the point is interesting as a reflection upon the importance of job title in our society, especially when it comes to the weighted authority of your words. From January the 1st, I will be a senior lecturer at an Australian University and that is perceived differently where I am. If I am in the US, I reinterpret this title into their system, namely as a tenured Associate Professor, because that's the equivalent of what I am – the term 'lecturer' doesn't clearly translate without causing problems, not even dealing with the fact that more lecturers in Australia have PhDs, whereas many lecturers in the US do not. But this post isn't about how people necessarily see our job descriptions; it's very much about how we use them.
In many respects, the title ‘lecturer’ is rather confusing because it appears, like builder, nurse or pilot, to contain the verb of one’s practice. One of the big changes in education has been the steady acceptance of constructivism, where the learners have an active role in the construction of knowledge and we are facilitating learning, in many ways, to a greater extent than we are teaching. This does not mean that teachers shouldn’t teach, because this is far more generic than the binding of lecturers to lecturing, but it does challenge the mental image that pops up when we think about teaching.
If I asked you to visualise a classroom situation, what would you think of? What facilities are there? Where are the students? Where is the teacher? What resources are around the room, on the desks, on the walls? How big is it?
Take a minute to do just this and make some brief notes as to what was in there. Then come back here.
It’s okay, I’ll still be here!
Adelaide Computing Education Conventicle 2012: “It’s all about the people”
Posted: December 27, 2012 Filed under: Education | Tags: acec2012, advocacy, authenticity, blogging, collaboration, community, conventicle, curriculum, design, education, educational problem, educational research, ethics, feedback, Generation Why, grand challenge, higher education, in the student's head, learning, principles of design, reconciliation, reflection, resources, student perspective, teaching, teaching approaches, thinking, universal principles of design

ACEC 2012 was designed to be a cross-University event (that's the whole point of the conventicles, they bring together people from a region) and we had a paper from the University of South Australia: '"It's all about the people"; building cultural competence in IT graduates' by Andrew Duff, Kathy Darzanos and Mark Osborne. Andrew and Kathy came along to present and the paper was very well received, because it dealt with an important need and a solid solution to address that need, which was inclusive, insightful and respectful.
For those who are not Australians, it is very important to remember that the original inhabitants of Australia have not fared very well since white settlement and that the apology for what happened under many white governments, up until very recently, was only given in the past decade. There is still a distance between the communities and the overall process of bringing our communities together is referred to as reconciliation. Our University has a reconciliation statement and certain goals in terms of representation in our staff and student bodies that reflect percentages in the community, to reduce the underrepresentation of indigenous Australians and to offer them the same opportunities. There are many challenges facing Australia, and the health and social issues in our indigenous communities are often exacerbated by years of poverty and a range of other issues, but some of the communities have a highly vested interest in some large-scale technical, ICT and engineering solutions, areas where indigenous Australians are generally not students. Professor Lester Irabinna Rigney, the Dean of Aboriginal Education, identified the problem succinctly at a recent meeting: when your people live on land that is 0.7m above sea level, a 0.9m sea-level rise starts to become of concern and he would really like students from his community to be involved in building the sea walls that address this, while we look for other solutions!
Andrew, Kathy and Mark's aim was to share out the commitment to reconciliation across the student body, making this a whole of community participation rather than a heavy burden for a few, under the guiding statement that they wanted to be doing things with the indigenous community, rather than doing things to them. There's always a risk of premature claiming of expertise, where instead of working with a group to find out what they want, you walk in and tell them what they need. For a whole range of very good and often heartbreaking reasons, the Australian indigenous communities are exceedingly wary when people start ordering them about. This was the first thing I liked about this approach: let's not make the same mistakes again. The authors were looking for a way to embed cultural awareness and the process of reconciliation into the curriculum as part of an IT program, sharing it so that other people could do it and making it practical.
Their key tenets were:
- It's all about the diverse people. They developed a program to introduce students to culture, to give them more than the single world view of the dominant culture and to introduce knowledge of the original Australians. It's important to note that many Australians have no idea how to use certain terms or cultural items from indigenous culture, which of course hampers communication and interaction.
For the students, they were required to put together an IT proposal, working with the indigenous community, that they would implement in the later years of their degree. Thus, it became part of the backbone of their entire program.
- Doing with [people], not to [people]. As discussed, there are many good reasons for this. Reduce the urge to be the expert and, instead, look at existing statements of rights and how to work with other peoples, such as the UN Declaration on the Rights of Indigenous Peoples and the UniSA graduate attributes. This all comes together in the ICUP – Indigenous Content in Undergraduate Program.
How do we deal with information management in another culture? I’ve discussed before the (to many) quite alien idea that knowledge can reside with one person and, until that person chooses or needs to hand on that knowledge, that is the person that you need. Now, instead of demanding knowledge and conformity to some documentary standard, you have to work with people. Talking rather than imposing, getting the client’s genuine understanding of the project and their need – how does the client feel about this?
Not only were students working with indigenous people in developing their IT projects, they were learning how to work with other peoples, not just other people, and were required to come up with technologically appropriate solutions that met the client need. Not everyone has infinite power and 4G LTE to run their systems, nor can everyone stump up the cash to buy an iPhone or download apps. Much as programming in embedded systems shakes students out of the ‘infinite memory, disk and power’ illusion, working with other communities in Australia shakes them out of the single worldview and from the, often disrespectful, way that we deal with each other. The core here is thinking about different communities and the fact that different people have different requirements. Sometimes you have to wait to speak to the right person, rather than the available person.
The online forum has four questions that students have to find a solution to, where the forum is overseen by an indigenous tutor. The four questions are:
- What does culture mean to you?
- Post a cultural artefact that describes your culture?
- I came here to study Computer Science – not Aboriginal Australians?
- What are some of the differences between Aboriginal and non-Aboriginal Australians?
The first two are amazing questions – what is your answer to question number 2? The second pair are more challenging and illustrate the bold, head-on nature of this participative approach to reconciliation. Reconciliation between all of the Australian communities requires everyone to be involved and, being honest, questions 3 and 4 are going to open up some wounds and drag some silly thinking out into the open but, most importantly, they will allow us to talk through issues of concern and confusion.
I suspect that many people can’t really answer question 4 without referring back to mid-50s archetypal depictions of Australian Aborigines standing on one leg, looking out over cliffs, and there’s an excellent ACMI (Australian Centre for the Moving Image) exhibit in Melbourne that discusses this cultural misappropriation and stereotyping. One of the things that resonated with me is that asking these questions forces people to think about these things, rather than repeating old mind grooves and received nonsense overheard in pubs, seen on TV and heard in racist jokes.
I was delighted that this paper was able to be presented, not least because the goal of the team is to share this approach in the hope of achieving even greater strides in the reconciliation process. I hope to be able to bring some of it to my Uni over the next couple of years.
John Henry Died
Posted: December 23, 2012 Filed under: Education | Tags: advocacy, authenticity, blogging, collaboration, community, curriculum, design, education, educational problem, ethics, feedback, Generation Why, grand challenge, higher education, in the student's head, john henry, learning, meritocracy, principles of design, reflection, student perspective, teaching, teaching approaches, thinking, workload

Every culture has its myths and legends, especially surrounding those incredible individuals who stand out or tower over the rest of society. The Ancient Greeks and Romans had their gods, demigods, heroes and, many times, cautionary tales of the mortals who got caught in the middle. Australia has the stories of pre- and post-federation mateship, often anti-authoritarian or highlighting the role of the larrikin. We have a lot of bushrangers (with suspiciously good hearts or reacting against terrible police oppression), Simpson and his donkey (a First World War hero who transported men to an aid station using his donkey, ultimately dying on the battlefield) and a Prime Minister who goes on YouTube to announce that she's now convinced that the Mayans were right and we're all doomed – tongue firmly in cheek. Is this the totality of the real Australia? No, but the stylised notion of 'mateship', the gentle knock and the "come off the grass, you officious … person" attitude are as much a part of how many Australians see themselves as shrimp on a barbie is to many US folk looking at us. In any Australian war story, you are probably more likely to hear about the terrible hangover that Gunner Suggs had and how he dragged his friend a kilometre over rough stones to keep him safe, than you are to hear about how many people he killed. (I note that this mateship is often strongly delineated over gender and racial lines, but it's still a big part of the Australian story.)
The stories that we tell and those that we pass on strongly shape our culture. Look at Greek mythology and you see stern warnings against hubris – don't rate yourself too highly or the gods will cut you down. Set yourself up too high in Australian culture and you're going to get knocked down as well: a 'tall poppies' syndrome that is part cultural cringe, inherited from colonial attitudes to the Antipodes, part hubris and part cultural confusion as Anglo, European, Asian, African and… well, everyone, comes to terms with a country that even the original inhabitants, the Australian Aboriginal and Torres Strait Islander peoples, took quite a while to adapt to. As someone who wasn't born in Australia, like so many others who live here and now call themselves Australian, I've spent a long time looking at my adopted homeland's stories to see how to fit in. Along the way, because of travel, I've had the opportunity to look at other cultures as well: the UK, obviously, as it's drummed into you at school, and the US, because it interests me.
The stories of Horatio Alger, from the US, fascinate me, because of their repeated statement of the rags to riches story. While most of Alger’s protagonists never become amazingly wealthy, they rise, through their own merits, to take the opportunities presented to them and, because of this, a good man will always rise. This is, fundamentally, the American Dream – that any person can become President, effectively, through the skills that they have and through rolling up their sleeves. We see this Dream become ugly when any of the three principles no longer hold, in a framing I first read from Professor Harlon Dalton:
- The notion that we are judged solely on our merits: For this to be true, we must not have any bias – racist, gendered, religious, ageist or other. Given the recent ruling that an attractive person can be sacked, purely for being attractive and for providing an irresistible attraction for their boss, we have evidence that not only is this point not holding in many places, it's not holding in ways that beggar belief.
- We will each have a fair opportunity to develop these merits: This assumes equal opportunity in terms of education and in terms of jobs, which promptly ignores things like school districts, differing property tax levels and teacher training approaches. Because of the way that school districts work, just living in a given state or county because your parents live there (and can't move) can make the difference between a great education and a sub-standard child-minding service. So this doesn't hold either.
- Merit will out: Look around. Is the best, smartest, most talented person running your organisation or making up all of the key positions? Can you locate anyone in the "important people above me" who is holding that job for reasons other than true, relevant merit?
Australia’s myths are beneficial in some ways and destructive in others. For my students, the notion that we help each other, we question but we try to get things done is a positive interpretation of the mild anti-authoritarian mateship focus. The downside is drinking buddies going on a rampage and covering up for each other, fighting the police when the police are actually acting reasonably and public vandalism because of a desire to act up. The mateship myth hides a lot of racism, especially towards our indigenous community, and we can probably salvage a notion of community and collaboration from mateship, while losing some of the ugly and dumb things.
Horatio Alger myths would give hope, except for the bleak reality that many people face, which is that these are three giant pieces of boloney that people get hit about the head with. If you're not succeeding, then Horatio Alger reasoning lets us call you lazy or stupid or just not taking the opportunities. You're not trying to pull yourself up by your bootstraps hard enough. Worse still, trying to live up to this, sometimes impossible, guideline leads us into John Henryism. John Henry was a steel driver, who hammered and chiselled the rock through the mountains to build tunnels for the railroad. One day the boss brought in a steam-driven hammer and John Henry bet that he could beat it, to show that he and his crew should not be replaced. After a mammoth battle between man and machine, John Henry won, only to die with the hammer in his hand.
Let me recap: John Henry died – and the boss still got a full day’s work that was equal to two steam-hammers. (One of my objections to “It’s a Wonderful Life” is that the rich man gets away with stealing the money – that’s not a fairy tale, it’s a nightmare!) John Henryism occurs when people work so hard to lift themselves up by their bootstraps that they nearly (or do) kill themselves. Men in their 50s with incredibly high blood pressure, ulcers and arthritis know what I’m talking about here. The mantra of the John Henryist is:
“When things don’t go the way that I want them to, that just makes me work even harder.”
There’s nothing intrinsically wrong with this when your goal is actually achievable and you apply this maxim in moderation. At its extreme, and for those people who have people standing on their boot caps, this is a recipe to achieve a great deal for whoever is benefiting from your labour.
And then dying.
As John Henry observes in the ballad (Springsteen version), “I’ll hammer my fool self to death”, and the ballad of John Henry is actually a cautionary tale to set your pace carefully because if you’re going to swing a hammer all day, every day, then you have to do it at a pace that won’t kill you. This is the natural constraint on Horatio Alger and balances all of the issues with merit and access to opportunity: don’t kill your “fool self” striving for something that you can’t achieve. It’s a shame, however, that the stories line up like this because there’s a lot of hopelessness sitting in that junction.
Dealing with students always makes me think very carefully about the stories I tell and the stories I live. Over the next few days, I hope to put together some thoughts on a 21st century myth form that inspires without demanding this level of sacrifice, and that encourages without forcing people into despair if existing obstacles block them – and it’s beyond their current control to shift. However, on that last point, what I’d really like to come up with is a story that encourages people to talk about obstacles and then work together to lift them out of the way. I do like a challenge, after all. 🙂
Vitamin Ed: Can It Be Extracted?
Posted: December 22, 2012 Filed under: Education | Tags: advocacy, blogging, community, curriculum, design, education, educational problem, educational research, ethics, Generation Why, higher education, in the student's head, learning, measurement, principles of design, reflection, resources, student perspective, teaching, teaching approaches, thinking, tools, vygotsky, workload

There are a couple of ways to enjoy a healthy, balanced diet. The first is to actually eat a healthy, balanced diet made up from fresh produce across the range of sources, which requires you to prepare and cook foods, often changing how you eat depending on the season to maximise the benefit. The second is to eat whatever you dang well like and then use an array of supplements, vitamins, treatments and snake oil to try and beat your diet of monster burgers and gorilla dogs into something that will not kill you in 20 years. If you've ever bothered to look on the side of those supplements, vitamins, minerals or whatever that most people have in their 'medicine' cabinets, you might see statements like "does not substitute for a balanced diet" or nice disclaimers like that. There is, of course, a reason for that. While we can be fairly certain about a range of deficiency disorders in humans, and we can prevent these problems with selective replacement, many other conditions are not as clear cut – if you eat a range of produce which contains the things that we know we need, you're probably getting a slew of things that we also need but don't make themselves as prominent.
In terms of our diet, while the debate rages about precisely which diet humans should be eating, we can have a fairly good stab at a sound basis from a dietician's perspective built out of actual food. Recreating that from raw sugars, protein, vitamin and mineral supplements is technically possible but (a) much harder to manage and (b) nowhere near as satisfying as eating the real food, in most cases. Let's not forget that very few of us in the western world are so distant from our food that we regard it purely as fuel, with no regard for its presentation, flavour or appeal. In fact, most of us could muster a grimace at the thought of someone telling us to eat something because it was good for us or for some real or imagined medical benefit. In terms of human nutrition, we have the known components that we have to eat (sugars, proteins, fats…) and we can identify specific vitamins and minerals that we need to balance to enjoy good health, yet there is no shortage of additional supplements, with little or no demonstrated benefit, that we still take out of concern for our health.
There’s been a lot of work done in trying to establish an evidence base for medical supplements and far more of the supplements fail than pass this test. Willow bark, an old remedy for pain relief, has been found to have a reliable effect because it has a chemical basis for working – evidence demonstrated that and now we have aspirin. Homeopathic memory water? There’s no reliable evidence for this working. Does this mean it won’t work? Well, here we get into the placebo effect and this is where things get really complicated because we now have the notion that we have a set of replacements that will work for our diet or health because they contain useful chemicals, and a set of solutions that work because we believe in them.
When we look at education, where it’s successful, we see a lot of techniques being mixed in together in a ‘natural’ diet of knowledge construction and learning. Face-to-face and teamwork, sitting side-by-side with formative and summative assessment, as part of discussions or ongoing dialogues, whether physical or on-line. Exactly which parts of these constitute the “balanced” educational diet? We already know that a lecture, by itself, is not a complete educational experience, in the same way that a stand-alone multiple-choice question test will not make you a scholar. There is a great deal of work being done to establish an evidence basis for exactly which bits work but, as MIT said in the OCW release, these components do not make up a course. In dietary terms, it might be raw fuel but is it a desirable meal? Not yet, most likely.
Now let's get into the placebo side of the equation, where students may react positively to something just because it's a change, not because it's necessarily a good change. We can control for these effects, if we're cautious, and we can do it with full knowledge of the students, but I'm very wary of any dependency upon the placebo effect, especially when it's prefaced with "and the students loved it". Sorry, students, but I don't only (or even predominantly) care if you loved it; I care if you performed significantly better, attended more, engaged more, retained the information for longer and could achieve more – and all of these things can only be measured when we take the trouble to establish baselines, construct experiments, measure things, analyse with care and then think about the outcomes.
My major concern about the whole MOOC discussion is not whether MOOCs are good or bad, it’s more to do with:
- What does everyone mean when they say MOOC? (Because there’s variation in what people identify as the components)
- Are we building a balanced diet or are we constructing a sustenance program with carefully balanced supplements that might miss something we don’t yet value?
- Have we extracted the essential Vitamin Ed from the ‘real’ experience?
- Can we synthesise Vitamin Ed outside of the ‘real’ educational experience?
I’ve been searching for a terminological separation that allows me to separate ‘real’/’conventional’ learning experiences from ‘virtual’/’new generation’/’MOOC’ experiences and none of those distinctions are satisfying – one says “Restaurant meal” and the other says “Army ration pack” to me, emphasising the separation. Worse, my fear is that a lot of people don’t regard MOOC as ever really having Vitamin Ed inside, as the MIT President clearly believed back in 2001.
I suspect that my search for Vitamin Ed starts from a flawed basis, because it assumes a single silver bullet if we take a literal meaning of the term, so let me spread the concept out a bit and label Vitamin Ed as the essential educational components that define a good learning and teaching experience. Calling it Vitamin Ed gives me a flag to wave and an analogue to use, to explain why we should be seeking a balanced diet for all of our students, rather than a banquet for one and dog food for the other.
“We are not providing an MIT education on the web…”
Posted: December 21, 2012 Filed under: Education | Tags: advocacy, blogging, collaboration, community, curriculum, design, education, educational research, ethics, Generation Why, higher education, in the student's head, learning, measurement, moocs, principles of design, reflection, resources, student perspective, teaching, teaching approaches, thinking, vygotsky

I've been re-watching some older announcements that describe open courseware initiatives, starting from one of the biggest, the MIT announcement of their OpenCourseWare (OCW) initiative in April, 2001. The title of this post actually comes from the video, around the 5:20 mark (video quoted under a CC BY-NC-SA licence; more information available at http://ocw.mit.edu/terms):
“Let me be very clear, we are not providing an MIT education on the Web. We are, however, providing core materials that are the infrastructure that undergirds that information. Real education, in our view, involves interaction between people. It’s the interaction between faculty and students, in our classrooms and our living group, in our laboratories that are the heart, the real essence, of an MIT education. “
While the OCW was going to be produced and used on campus, the development of OCW was seen as something that would make more time available for student interaction, not less. President Vest then goes on to confidently predict that OCW will not make any difference to enrolment, which is hardly surprising given that he has categorically excluded anyone from achieving an MIT education unless they enrol. We see here exactly the same discussion that keeps coming up: these materials can be used as augmenting materials in these conventional universities but can never, in the view of the President or Vice Chancellor, replace the actual experience of obtaining a degree from that institution.
Now, don't get me wrong. I still think that the OCW initiative was excellent, generous and visionary but we are still looking at two fundamentally different use cases: the use of OCW to augment an existing experience and the use of OCW to bootstrap a completely new experience, which is not of the same order. It's a discussion that we keep having – what happens to my Uni if I use EdX courses from another institution? Well, ok, let's ask that question differently. I will look at this from two sides, using the introduction of a new skill and knowledge area that becomes ubiquitous – in my sphere, Computer Science and programming. Let's look at this in terms of growth and success.
What happens if schools start teaching programming to first year level?
Let’s say that we get programming into every single national curriculum for secondary school and we can guarantee that students come in knowing how to program to freshman level. There are two ways of looking at this and the first, which we have probably all seen to some degree, is to regard the school teaching as inferior and re-teach it. The net result of this will be bored students, low engagement and we will be wasting our time. The second, far more productive, approach is to say “Great! You can program. Now let’s do some Computer Science.” and we use that extra year or so to increase our discipline knowledge or put breadth courses back in so our students come out a little more well-rounded. What’s the difference between students learning it from school before they come to us, or through an EdX course on fundamental programming after they come to us?
Not much, really, as long as we make sure that the course meets our requirements – and, in fact, it gives us bricks-and-mortar-bound entities more time to do all that face-to-face interactive University stuff that we know students love and from which they derive great benefit. University stops being semi-vocational in some aspects and we leap into knowledge construction, idea generation, big projects and the grand dreams that we always talk about, yet often don’t get to because we have to train people in basic programming, drafting, and so on. Do we give them course credit? No, because they’re assumed knowledge, or barrier tested, and they’re not necessarily part of our structure anymore.
What happens if no-one wants to take my course anymore?
Now, we know that we can change our courses because we've done it so many times before over the history of the Academy – Latin, which along with Greek was the language of scholarship, was used in only half of the University publications of 1800. Let me wander through a classical garden for a moment to discuss the nature of change from a different angle, that of decline. Languages had a special place in the degrees of my University, with Latin and Greek dominating and then with the daring possibility of allowing substitution of French or German for Latin or Greek from 1938. It was as recently as 1958 that Latin stopped being compulsory for high school graduation in Adelaide, although it was still required for the study of Law – student demand for Latin at school therefore plummeted and Latin courses started being dropped from the school curriculum. The Law Latin requirement was removed around 1969–1970, which then dropped any demand for Latin even further. The reduction in the number of school teachers who could teach Latin required the introduction of courses at the University for students who had studied no Latin at all – Latin IA entered the syllabus. However, given that in 2007 only one student across all of the schools in the state of South Australia (roughly 1.2–1.4 million people) studied Latin in the final year of school, it is apparent that if this University wishes to teach Latin, it has to start by teaching all of Latin. This is a course, and a discipline, that is currently in decline. My fear is that, one day, someone will make the mistake of thinking that we no longer need scholars of this language. And that worries me, because I don't know what people 30 years from now will actually want, or what they could add to the knowledge that we already have of one of our most influential civilisations.
This decline is not unique to Latin (or Greek, or classics in general) but a truly on-line course experience would allow us to pool those scholars we have left and offer scaled resources for much longer than isolated pockets in real offices could manage. But, as President Vest notes, a storehouse of Latin texts does not a course make. What reduced the demand for Latin? Possibly the ubiquity of the language that we use which is derived from Latin, combined with a change of focus away from a classical education towards a more job- and achievement-oriented (semi-vocational) style of education. If you ask me, programming could as easily go this way in about 20 years, once we have ways to let machines solve problems for us. A move towards a less go-go-go culture, smarter machines and a resurgence of the long leisure cycles associated with Science Fiction visions of the future, and suddenly it is the engineers and the computer scientists who are looking at shrinking departments and no support in the schools. Let me be blunt: course popularity and desirability rises, stabilises and falls, and it's very hard to tell if we are looking at a parabola or a pendulum. With that in mind, we should be very careful about how we define our traditions and our conventions, especially as our cunning tools for supporting on-line learning and teaching get better and better. Yes, interaction is an essential part of a good education, no argument at all, but there is an implicit assumption of the critical mass that we have seen, time and again, supporting this interaction in a face-to-face environment – and that critical mass is as much a function of popularity and traditionally-associated prestige as it is of excellence.
What are MIT doing now?
I look at the original OCW release and I agree that, at the time of production, you could not reproduce the interaction between people that would give you an MIT education. But our tools are better now. They are, quite probably, not yet close enough to give you an “MIT of the Internet” – but should this be our goal? Not the production of a facsimile of the core materials that might, with MIT instructors, turn into a course, but the commitment to developing the tools that actually reproduce the successful components of the learning experience, with group and personal interaction, allowing the formation of what we used to call a physical interactive experience in a virtual space. That’s where I think the new MIT initiatives are showing us how these things can work now, starting from their original idealistic roots and adding the technology of the 21st Century. I hope that other, equally prestigious, institutions are watching this, carefully.
When Does Failing Turn You Into a Failure?
Posted: December 17, 2012 Filed under: Education, Opinion | Tags: advocacy, authenticity, blogging, ci2012, community, conventicle, curriculum, design, education, educational research, ethics, feedback, Generation Why, grand challenge, higher education, icer2012, in the student's head, learning, principles of design, reflection, research, resources, student perspective, teaching, teaching approaches, thinking, tools, workload Leave a commentThe threat of failure is very different from the threat of being a failure. At the Creative Innovations conference I was just at, one of the strongest messages was that we learn more from failure than we do from success, and that failure is inevitable if you are actually trying to be innovative. If you learn from your failures, and your failure is the genuine result of something that didn’t work rather than of sitting around and watching it burn, then this is just something that happens – that was the message from CI – and any other culture makes us overly cautious and risk averse. As most of us know, however, we are more strongly encouraged to cover up our failures than to celebrate them – and we are frequently better off not trying in certain circumstances than failing.
At the recent Adelaide Conventicle, which I promise to write up very, very soon, Dr Raymond Lister presented an excellent talk on applying Neo-Piagetian concepts and framing to the challenges students face in learning programming. This is a great talk (which I’ve had the good fortune to see twice, and it’s a mark of the work that I enjoyed it as much the second time) because it allows us to talk about failure to comprehend, or failure to put into practice, in terms of a lack of the underlying mechanism required to comprehend – at this point in the student’s development. As part of the steps of development, we would expect students to have these head-scratching moments where they are currently incapable of making any progress, but framing it within developmental stages allows us to talk about moving students to the next stage, getting them out of this current failure mode and into something where they will achieve more. Once again, failure in this case is inevitable for most people until we, and they, manage to achieve the level of conceptual understanding where we can build and develop. More importantly, if we track how they fail, then we start to get an insight into which developmental stage they’re at.
One thing that struck me with Raymond’s talk, was that he starts off talking about “what ruined Raymond” and discussing the dire outcomes promised to him if he watched too much television, as it was to me for playing too many games, and it is to our children for whatever high tech diversion is the current ‘finger wagging’ harbinger of doom. In this case, ruination is quite clearly the threat of becoming a failure. However, this puts us in a strange position, because if failure is almost inevitable but highly valuable if managed properly and understood, what is it about being a failure that is so terrible? It’s like threatening someone that they’ll become too enthusiastic and unrestrained in their innovation!
I am, quelle surprise, playing with words here because to be a failure is to be classed as someone for whom success is no longer an option. If we were being precise, then we would class someone as a perpetual failure or, more simply, unsuccessful. This is, quite usually, the point at which it is acceptable to give up on someone – after all, goes the reasoning, we’re just pouring good money after bad, wasting our time, possibly even moving the deck chairs on the Titanic, and all those other expressions that allow us to draw that good old categorical line between us and others and put our failures into the “Hey, I was trying something new” basket and their failures into the “Well, he’s just so dumb he’d try something like that” basket. The only problem with this is that I’m really not sure that a lifetime of failure is a guaranteed predictor of future failure. Likely? Yeah, probably. So likely we can gamble someone’s life on it? No, I don’t believe so.
When I was failing courses in my first degree, it took me a surprisingly long time to work out how to fix it, most of which was down to the fact that (a) I had no idea how to study and (b) no-one around me was even vaguely interested in the fact that I was failing. I was well on my way to becoming a perpetual failure, someone who had no chance of holding down a job let alone having a career, and it was a kind and fortuitous intervention that helped me. Now, with a degree of experience and knowledge, I can look back into my own patterns and see pretty much what was wrong with me – although, boy, would I have been a difficult cuss to work with. However, failing, which I have done since then and will (no doubt) do again, does not appear to have turned me into a failure. I have more failings than I care to count but my wife still loves me, my friends are happy to be seen with me and no-one sticks threats on my door at work, so these are obviously in the manageable range. However, managing failure has been a challenging thing for me and I was pondering this recently – how people deal with being told that they’re wrong is very important to how they deal with failing to achieve something.
I’m reading a rather interesting, challenging and confronting article on – and I cannot believe there’s a phrase for this – rage murders in American schools and workplaces: an interview with Mark Ames, the author of “Going Postal” (2005), who claims that these horrifying acts are, effectively, failed revolts. Ames seems to believe that everything stems from Ronald Reagan (and I offer no opinion either way, I hasten to add) but he identifies repeated humiliation, bullying and inhumane conditions as taking ordinary people, who would not usually have committed such actions, and turning them into monstrous killing machines. Ames’ thesis is that this is not the rise of psychopathy but a rebellion against the breaking of spirit and the metaphorical enslavement of much of the working and middle class that leads to such a dire outcome. If the dominant fable of life is that success is all, failure is bad, and that you are entitled to success, then it should be, as Ames says in the article, exactly those people who are most invested in these cultural fables who would be the most likely to break when the lies become untenable. In the language that I used earlier, this is the most awful way to handle the failure of the fabric of your world – a cold and rational journey that looks like madness but is far worse for being a pre-meditated attempt to destroy the things that lied to you. However, this is only one type of person who commits these acts. The Monash University gunman, for example, was obviously delusional and, while he carried out a rational set of steps to eliminate his main rival, his thinking as to why this needed to happen makes very little sense. The truth is, as always, difficult and muddy, and my first impression is that Ames may be oversimplifying in order to advance a relatively narrow and politicised view.
But his language strikes me: the notion of the “repeated humiliation, bullying and inhumane conditions”, which appears to be a common language among the older, workplace-focused, and otherwise apparently sane humans who carry out such terrible acts.
One of the complaints made against the radio network at the heart of the recent Royal Hoax, 2DayFM, is that they are serial humiliators of human beings and show no regard for the general well-being of the people involved in their pranks – humiliation, inhumanity and bullying. Sound familiar? Here I am, as an educator, knowing that failure is going to happen for my students and working out how to bring them up into success and achievement when, on one hand, I have a possible set of triggers where beating down people leads to apparent madness and, on the other, at least part of our entertainment culture appears to delight in finding the lowest bar and crawling through the filth underneath it. Is telling someone that they’re a failure, and rubbing it in for public enjoyment, of any vague benefit to anyone or is it really, as I firmly believe, the best way to start someone down a genuinely dark path to ruination and resentment?
Returning to my point at the start of this (rather long) piece, I have met Raymond several times and he doesn’t appear even vaguely ruined to me, despite all of the radio, television and Neo-Piagetian contextual framing he employs. The message from Raymond and CI paints failure as something to be monitored and something that is often just a part of life – a stepping stone to future success – but this is most definitely not the message that generally comes down from our society and, for some people, it’s becoming increasingly obvious that their inability to handle the crushing burden of permanent classification as a failure is something that can have catastrophic results. I think we need to get better at genuinely accepting failure as part of trying, and to really, seriously, try to lose the classification of people as failures just because they haven’t yet succeeded at some arbitrary thing that we’ve defined to be important.
Core Values of Education and Why We Have To Oppose “Pranking”
Posted: December 11, 2012 Filed under: Education, Opinion | Tags: advocacy, authenticity, blogging, community, education, educational problem, ethics, feedback, Generation Why, higher education, in the student's head, learning, principles of design, reflection, resources, student perspective, teaching, teaching approaches, thinking, tools, workload 2 CommentsI’ve had a lot of time to think about education this year (roughly 400 hours at current reckoning) so it’s not surprising that I have some opinions on what constitutes the key values of education. Of course, as was noted at the Creative Innovations conference I went to, a corporate values statement is a wish list that doesn’t necessarily mean much so I’m going to talk about what I see when education is being performed well. After I’ve discussed these, I’m then going to briefly argue for why these values mean that stupid stunts (such as the Royal prank where some thoughtless DJs called up a hospital) should be actions that we identify as cruel and unnecessary interpretations of the term ‘entertainment’.
- Truth.
We start from the assumption that we only educate or train our students in what we, reasonably, assume to be the truth. We give the right answers when we know them and we admit it when we don’t. Where we have facts, we use them. When we are standing on opinion, we identify it. When we are telling a story, where the narrative matters more than the contents, we are careful to identify what we are doing and why we are doing it. We try not to deceive, even accidentally, and we do not make a practice of lying, even to spare someone’s feelings. In order to know the truth, we have to know our subject and we try to avoid blustering, derision and appealing to authority when we feel that we are being challenged.
There is no doubt that this can be hard in contentious and emerging areas but, as a primary value, it’s at the core of our educational system. Training someone to recite something that is not true, while still popular in many parts of the world, is indoctrination, not education.
- Respect.
We respect the students that we teach and, in doing this, we prepare them to respect us. We don’t assume that they are all the same, that they all learn at the same rate, that they have had all the preparation that they need for our courses or experiences, nor do we assume that they can take anything that we feel inclined to fling at them. We respect them by treating them as people, as individuals, as vulnerable, emotional and potentially flawed humans. We evaluate their abilities before we test their mettle. We give them space to try again. We do all this because it then allows them, without hypocrisy or obligation, to treat us the same way. Respect of effort and of application does not demand perfection or obsession from either party.
- Fairness. We are objective in our assignment and assessment of work and generous in our interpretations when such generosity does not compromise truth or respect. We do not give false praise but we do give all praise that is due, at the same time giving all of the notes for improvement. We strive to ensure that every student has the same high-quality and fair experience, regardless of who they are and what they do. When we define the rules, we stick to them – unless we have erred in their construction, in which case, having fixed the rules, we then offer the best interpretation to every student. Our students acting in error or unfairly does not allow us to reciprocate in kind. The fairness of our system is not conditional upon a student being a perfect person, and its strength lies in the fact that it is fair for all, regardless. What we say, we mean and what we mean, we say. A student’s results are ultimately the reflection of their own application to the course, relative to their opportunities to excel. Students are not unfairly punished because we have not bothered to work out if they are prepared for the course (which is very different from their own application of effort inside the course, which is ultimately their responsibility, moderated by the unforeseen and the vagaries of life), nor does the action of one student unduly influence the results of another, except where this is clearly identified and students have sufficient autonomy to control the outcome of this situation.
These stupid pranking stunts on the radio are usually considered acceptable because the person being pranked is contacted after the fact to ask if it can be broadcast. Frankly, I think this is bordering on coercive (because you risk being a bad sport if you don’t participate and I suspect that the radio stations don’t accept a simple first ‘no’) but some may disagree. (It’s worth noting that while the radio station tried to contact the nurses, they failed to get approval to broadcast.)
These pranks are, at heart, valueless lies, usually calculated to embarrass someone or expose them undertaking a given behaviour. They are neither truthful nor respectful. While this objection often looks like the high horse of pomposity (“haven’t you got a sense of humour?”), it is important to realise that truly funny things can usually be enjoyed by everyone and that there is a world of difference between a joke that involves old friends and one that exploits strangers. The second situation just isn’t fair. The radio station is setting up a situation that is designed to elicit a response that everyone other than the victim will find amusing, because the victim is somehow funny or vulnerable. Basically, it’s unfair. You don’t get to laugh at or humiliate someone in a public forum just because you think it’s funny – didn’t we get over this in primary school? A lack of fairness often leads to situations that are coercive, because we impose cultural norms, or peer pressure, to force people to ‘go along with the joke’.
I had a student in my office recently, while another academic who happened to be my wife was helping me clear a backlog of paper, and before I discussed his final mark, I asked my wife if she would mind leaving the room. This was because there was no way I could ask the student if he minded discussing his mark with my wife in the room and not risk the situation being coercive. It’s a really simple thing to fix if you think about it. In order to respect the student’s privacy, I needed to be fair in the way that I controlled his ability to make decisions. Now I’m not worried that this student is easily coerced but that’s not my call to make – it’s not up to me to tell a student if they are going to be comfortable or not.
The Royal prank has clearly identified that we can easily go down very dark and unexpected roads when we start to treat people as props, without sticking to the truth or respecting them enough to think about how they might feel about our actions, and that’s patently unfair. If these are our core values – and again, many would disagree – then we have to stand up and object when we see them being mucked around with by our society. As educators, we have to draw a line and say “just because you think it’s funny, doesn’t mean that you were right to do it”, and we can do that and not be humourless or party-poopers. We do it because we want to allow people to still be funny, and have fun, muck around and have a joke with people that they know – because we’ve successfully trained them to know when they should stop, because we’ve correctly instilled the values of truth, respect and fairness.
“You Will Never Amount to Anything!”
Posted: December 10, 2012 Filed under: Education | Tags: advocacy, authenticity, blogging, community, curriculum, education, educational research, ethics, feedback, Generation Why, grand challenge, higher education, in the student's head, learning, led zeppelin, measurement, principal skinner, principles of design, reflection, resources, simpsons, student perspective, teaching, teaching approaches, thinking, tools, universal principles of design, work/life balance, workload Leave a commentI am currently reading “When Giants Walked the Earth: A Biography of Led Zeppelin” by Mick Wall. I won’t go into much of the detail of the book but the message presented around the four members of the group is that most of them did not have the best experiences in school and that, in at least two cases, the statements written on their reports by their teachers were profoundly dismissive. Now, it is of course entirely possible that the Led Zep lads were, at the time of leaving school, incapable of achieving anything – except that this is total nonsense, as it is quite obvious that they achieved a degree of musical and professional success that few contemplate, let alone reach.
You’ll often read this kind of line in celebrity biographies – that semi-mythical reforging of the self after having been judged and found wanting. (From a narrative perspective, it’s not all that surprising, as it’s an easy way to increase the tension.) But one of the reasons that it pops up is that such a statement is so damning that it is not surprising that a successful person might want to wander back to the person who said it and say “Really?” But to claim that such a statement is a challenge (as famously mocked in The Simpsons, where Principal Skinner declares that these children have no future and is forced to mutter, with false bonhomie, ‘Prove me wrong, kids, prove me wrong.’) is confused at best, disingenuous and misdirecting at worst. If you want someone to achieve something, provide a clear description of the task and the means to achieve it, and then set about educating and training. No-one has ever learned brain surgery by someone yelling “Don’t open that skull”, so pretending that an entire life’s worth of motivation can be achieved by telling someone that they have no worth is piffle. Possibly even balderdash.
The phrase “You Will Never Amount To Anything” is, in whatever form it is uttered, a truly useless sentiment. It barely has any meaning (isn’t just being alive being something, and hence amounting to a small sort of anything?) but, of course, it is not stated in order to achieve any outcome other than to place the blame for the lack of engagement with a given system squarely at the feet of the accused. You have failed to take advantage of the educational opportunities that we have provided and this is such a terminal fault that the remaining 90% of your life will be spent in a mobile block of amber, where you will be unable to effect any worthwhile interaction with the universe.
I note that, with some near misses, I have been spared this kind of statement, but I do feel very strongly that it is really not anything that you can say with any credibility or useful purpose. If you happen to be Death, the Grim Reaper, then you can stand at the end of someone’s life and say “Gosh, you didn’t do a great deal, did you?” (although, again, what does it mean to do anything anyway?) but saying it when someone is between the ages of 16 and 20? You might be able to depend upon the statistical reliability that, if rampant success in our society is only given to 1%, then 99% of the people to whom you say “You will not be a success” will accidentally fall into that category. It’s quite obvious that any number of the characteristics that are worthy of praise in school – “sitting quietly”, “wearing the correct uniform” or “not chewing gum” – contribute nothing to the spectacular success enjoyed by some people. These are excellent facets of compliance and will make for citizens who may be of great utility to the successful, but it’s hard to see many business leaders whose first piece of advice to desperate imitators is “always wear shiny shoes”.
If we are talking about perceived academic ability then we run into another problem, in that there is a great deal of difference between school and University, let alone between school and work. There is no doubt that the preparation offered by a good schooling system is invaluable. Reading, writing, general knowledge, science, mathematics, biology, the classics… all of these parts of our knowledge and our society can be introduced to students very usefully. But to say that your ability to focus on long division problems when you are 14 is actually going to be the grand limiting factor on your future contribution to the world? Nonsense.
Were you to look at my original degree, you might think “How on Earth did this man end up with a PhD? He appears to have no real grasp of study, or pathway through his learning.” and, at the time of the degree, you’d be right. But I thought about what had happened, learned from it, and decided to go back and study again in order to improve my level of knowledge and my academic record. I then went back and did this again. And again. Because I persevered, because I received good advice on how to improve and, most importantly, because a lot of people took the time to help me, I learned a great deal and I became a better student. I developed my knowledge. I learned how to learn and, because of that, I started to learn how to think about teaching, as well.
If you were to look at Nick Falkner at 14, you may have seen some potential but a worrying lack of diligence and effort. At 16, you would have seen him blow an entire year of school exams because he didn’t pay attention. At 17 he made it into Uni, just, but it wasn’t until the wheels really started to fall off that he realised that being loquacious and friendly wasn’t enough. Scurrying out of Uni with a third-grade degree into a workforce that looked at the evidence of my learning drove home that improvements were to be made. Being unemployed for most of a year cemented it – I had set myself up for a difficult life and had squandered a lot of opportunities. And that is when serendipity intervened, because the man who has the office next to me now, and with whom I have coffee almost every morning, suggested that I could come back and pursue a Masters degree to make up for the poor original degree, and that I would not have to pay for it upfront because it was available as a government deferred-payment option. (Thank you, again, Kevin!)
That simple piece of advice changed my life completely. Instead of not saying anything or being dismissive of a poor student, someone actually took the time to say “Well, here’s something you could do and here’s how you do it.” And now, nearly 20 years down the track, I have a PhD, a solid career in which I am respected as an educator and as a researcher and I get to inspire and help other students. There’s no guarantee that good advice will always lead to good outcomes (and we all know about the paving on the road to Hell) but it’s increasingly obvious to me that dismissive statements, unpleasant utterances and “cut you loose” curtness are far more likely to do nothing positive at all.
If the most that you can say to a student is “You’re never going to amount to anything”, it might be worth looking in a mirror to see exactly what you’ve amounted to yourself…
A tragic and unintended outcome of an act with no benefit
Posted: December 9, 2012 Filed under: Education | Tags: advocacy, authenticity, blogging, community, education, ethics, Generation Why, higher education, in the student's head, reflection, student perspective, teaching approaches, thinking Leave a commentRecently, a pair of radio hosts from the Sydney 2Day FM station prank-called the hospital in which the Duchess of Cambridge was receiving treatment for medical issues associated with her pregnancy. Pretending to be the Queen, at 5:30am UK time, they managed to fool the nurse who was staffing reception (as the normal reception staff were not on duty) and got put through to the ward, where they managed to extract some information. Exceedingly sadly, after the hoax became apparent, this rather thoughtless and unfunny invasion of privacy has now had a tragic final act, in that the nurse who was believed to have passed the call through, Jacintha Saldanha, has been found dead, apparently by her own hand. You can read about this in a reasonable summary from the Sydney Morning Herald.
There is (currently) no direct connection between the prank event and the death of Ms Saldanha but, given the people and the profile that we are talking about, one can easily imagine the pressure (real or imagined) that someone would be under if they had failed to protect any patient, let alone the one that we are discussing. Of course, the radio show hosts did not intend this outcome and, before there are any more calls for their heads, let us remember moral accident and the fact that, while their action was an inexplicable invasion of privacy – foolish, unfeeling and in poor taste – it was never intended to be lethal. Should they face questions? Yes.
Why?
Because it is not hard to summon the modicum of empathy required to understand why a woman who is experiencing any difficulties at all during pregnancy might have the reasonable expectation to be left alone and not be picked on for the delight of two radio hosts and their audience. Regardless of which family the Duke of Cambridge was born into and into which the Duchess of Cambridge has married, they are people and, by all accounts, live a surprisingly normal life for the couple who will (most likely) one day rule as the King and Queen of the United Kingdom. It is none of my business as to the details of the Duchess’ illness or condition, unless she wishes to release it, any more than it is the Queen’s business to prank call me into revealing the mark I received for Numerical Analysis I the first time I sat it, in the hopes of embarrassing me.
(With the greatest respect, Your Majesty, it was a 23 Fail because I did not attend lectures or do enough of the preparatory work. I would be grateful if you would consider using that knowledge wisely, Ma’am.)
As it stands, there is the usual angry media reaction (and popular backlash) one sees when a stupid prank goes horribly wrong, but what is never truly questioned is why on earth we persist with this nonsense in the first place. I often ask my students very direct questions when they tell me things. “Why did you do this?” is, apparently, a startling question to some of my students because it seems to stun them with its simplicity.
“You performed this action that had no positive value, or it had a negative and unpleasant impact on the world. Why did you do this?” is the simplest, sanest question that should be asked whenever anybody does something like that. No doubt all of my poor Grand Challenge students are waiting for me to type Cui Bono? so I’ll get that out of the way but, in reality, cui bono (who benefits?) seeks to locate the benefiting party in order to assign malign intent, rather than quisquam bono?, which is what I’m asking here: does anyone actually benefit? (My Latin is very rusty so I welcome corrections from classical scholars and revenant Romans.)
I often mutter things along the lines of “Just because you can, doesn’t mean you should”, mainly because I’m now middle-aged and it’s somewhat expected, but also because I strongly believe that we are moving into an age where the ability to do stupid things on the global scale is now within the reach of anyone with a telephone, a web browser and a general lack of empathy or kindness.
It is because I understand people, and I do have empathy, that I have the deepest sympathies for the family of Ms Saldanha – a husband and two teenagers – who must be suffering through a terrible and public loss, but are doing so, as I understand it, with a great deal of dignity. However, it would be wrong not to have some feeling for the radio hosts themselves, because it would be the most egregious error to assign intent to their thoughtlessness. They did not set out to create this situation. However, and let me be clear, any situation that they did set out to create was almost completely without benefit to anyone, lacked respect, lacked empathy, was invasive, was unpleasant and should never have been attempted. Their lack of genuine apology could be seen, until recently, in the Tweeted advertisements carried in one of the hosts’ feeds until it was suspended. (For me, it is the lack of empathy that is sadly unsurprising. Why should Michael Christian be doing anything other than his job in this situation: producing high-impact media buzz and then tapping it to drive up ratings? Of course, if he had a real sense of what he was doing, he would have pulled the prank either before it started or once they got past reception, because they were about to violate someone’s privacy. Are we at fault because of who we select to hold the broadcast roles? Can you blame the gladiators for being bloodthirsty when we’re screaming around the circus?)
My next question to my students would normally be “So what now?” What is it that the student is planning to change in order for this situation to not occur again? In the case of my students, they are juggling work, family and being young. However, almost all of the things that my students do have some benefit (pub crawls notwithstanding). In this case, the CEO of the radio station has offered that, while no-one could have foreseen this, prank calls had been going on for years… Yes. And? We died of cholera for years, too. Let’s not argue tradition for something that has as its prime fruits the embarrassment and humiliation of another person, where we play with people without knowing how robust they are for this game.
Jacintha Saldanha is, tragically, dead and it does appear that this questionable act of entertainment may have been associated with her death. Perhaps, now is not a bad time to put the prank call into the same giant old wardrobe where we put all of the behaviours that never really made any sense and certainly make no sense when we should know so much better – and let’s stop the practice.
Why are we doing something? What is the benefit? Is our enjoyment really worth humiliating or embarrassing someone else on public radio? Where is the benefit in this, for anyone? If my students can drag together sensible and coherent answers to this when asked, so can our broadcast institutions and our journalists.