Three Stories: #3 Taking Time for Cats

There are a number of draft posts sitting on this blog. Posts which, for one reason or another, I’ve either never finished because the inspiration ran out, or never published because I decided not to share them. Most of them were written when I was trying to make sense of being too busy, while at the same time taking on more work and feeling bad about not being able to commit properly to everything. I probably won’t ever share many of these posts, but I still want to talk about some of the themes.

So, let me tell you a story about cats.

One of the things about cats is that they can be mercurial, creatures of fancy and rapid mood changes. You can spend all day trying to get a cat to sit on your lap and, once you’ve given up and sat back down, 5 minutes later you find a cat on your lap. That’s just the way of cats.

When I was very busy last year, and the year before, I started to see feedback comments from my students that said things like “Nick is great but I feel like I’m interrupting him”, or I’d try to squeeze them into the 5 minutes I had between other things. Now, students are not cats, but they do have times when they feel they need to come and see you and, sometimes, when that time passes, the opportunity is lost. This isn’t just students, of course, this is people. That’s just the way of people, too. No matter how much you want them to be well organised, predictable and well behaved, sometimes they’re just big, bipedal, mostly hairless cats.

One day, I decided that the best way to change my frantic behaviour was to set a small goal that would make me take the time I needed for the surprising opportunities that occurred in a day.

I decided that every time I was walking around the house, even if I was walking out to go to work and thought I was in a hurry, if one of the cats came up to me, I would pay attention to it: scratch it, maybe pick it up, talk to it, and basically interact with the cat.

Over time, of course, what this meant was that I saw more of my cats and I spent more time with them (cats are mercurial but predictable about some things). The funny thing was that the 5 minutes or so I spent doing this made no measurable difference to my day. And making more time for students at work started to have the same effect. Students were happier to drop in to see if I could spend some time with them and were better about making appointments for longer things.

Now, if someone comes to my office and I’m not actually about to rush out, I can spend that small amount of time with them, possibly longer. When I thought I was too busy to see people, I was. When I thought I had time to spend with people, I could.

Yes, this means that I have to be a little more efficient and know when I need to set aside time and do things in a different way, but the rewards are enormous.

I only realised the true benefit of this recently. I flew home from a work trip to Melbourne to discover that my wife and one of our cats, Quincy, were at the Animal Emergency Hospital, because Quincy couldn’t use his back legs. There was a lot of uncertainty about what was wrong and what could be done and, at one point, he stopped eating entirely and it was… not good there for a while.

The one thing that made it even vaguely less awful in that difficult time was that I had absolutely no regrets about the time that we’d spent together over the past 6 months. Every time Quincy had come up to say ‘hello’, I’d stopped to take some time with him. We’d lounged on the couch. He’d napped with me on lazy Sunday afternoons. We had a good bond and, even when the vets were doing things to him, he trusted us and that counted for a lot.

Quincy is now almost 100% and is even more of a softie than before, because we all got even closer while we were looking after him. By spending (probably at most) another five minutes a day, I was able to be happier about some of the more important things in my life and still get my “real” work done.

Fortunately, none of my students are very sick at the moment, but I am pretty confident that I talk to them when they need me to (most of the time – there’s still room for improvement) and that they will let me know if things are going badly – with any luck, at a point when I can help.

Your time is rarely your own but at least some of it is. Spending it wisely is sometimes not the same thing as spending it carefully. You never actually know when you won’t get the chance again to spend it on something that you value.


Three Stories: #2 Why I Don’t Make New Year’s Resolutions

This is a story I’ve never told anyone before, but I hope that it will help to explain why I think many students struggle to make solid changes in their academic and life practices: they focus on endpoints and set deadlines reactively, rather than focusing on process and finding a good time to change. Let me explain this through the story.

When I was younger, I was quite a bit heavier than I am now – by about 30% of my body mass. As I got older, this became more of a problem and my weight went up and down quite a lot as I tried to get a regular regime of exercise into my life and cut back on my eating. Unfortunately, when I get stressed, I tend to eat, and one of the things I used to get stressed about was … losing weight. It’s a common, vicious circle. Anyway, one year, after a Christmas where I had found it difficult to fit into my ‘good’ clothes and just felt overstuffed and too hot most of the time, I decided that enough was enough. I would make a New Year’s Resolution to lose weight. Seriously. (As background, Christmas in Australia is in Summer, so we sing songs about snow and eat roast turkey while sitting around in 90-100F/32-38C heat – so if your clothes are squeezy, boy, are you going to feel it.)

I can’t remember the details of the New Year’s Eve party but I do remember waking up the next day and thinking “Ok, so now I lose weight”. But there were some problems.

  1. It was still very hot.
  2. Everything was closed because it was a public holiday.
  3. I was still stuffed from Christmas/NY indulgence.
  4. I was hungover.
  5. I had no actual plan.
  6. I hadn’t actually taken any steps towards either dietary change or exercise that I could implement.

So, instead of getting out of bed and doing anything healthy, I thought “Oh, ok, I’ll start tomorrow”, because it was just about impossible, to my mind, to get things started on that day. I made some plans as to what I’d do the next day and thought “Ok, that’s what I’ll do tomorrow.”

But then a friend called on the 2nd and they were in town so we caught up. Then I was back at work and it was really busy.

And… and… and…

When I finally did lose the weight, many years later, and get to a more stable point, it wasn’t through making a resolution – it was through developing a clear plan to achieve a goal. I set out to work up to walking 10 miles in loops around my block. Then, when I achieved that, I assessed myself and realised that I could replace some of that walking with running. So then, every time I went out, I ran a little at the start and walked the rest. Finally, I was (slowly) running the whole distance. Years later, a couple of bad falls have stopped me from long-distance running, but I have three marathons and numerous halves under my belt.

Why didn’t it work before? Lack of preparation is always bad, of course, but New Year’s is also one of the worst possible times to try to make a massive change, unless you’ve actually prepared for it and the timing works for you. Think about it:

  1. New Year’s Eve is a highly social activity for many people, as are the days after – any resolutions involving food, alcohol, sex or tobacco are going to be much harder to keep.
  2. It’s high stakes, especially if you make your resolution public. Suddenly, failure is looming over you; some people may be trying to force you into keeping your resolution, while others will actively be trying to tempt you out of it.
  3. There’s just a lot going on around this time for most people and it’s not a time when you have lots of extra headspace. If your brain is already buzzing, making a big change will be harder.
  4. Setting your resolution as a goal is not the same as setting a strategy. This is really important if you fall off the wagon, so to speak. If you are trying to give up smoking but grab a quick cigarette on the 3rd, then your resolution is shot. If you have a plan to cut down, allowing for the occasional divergence, then you can be human without thinking “Oh, now I have to abandon the whole project.”
  5. New Year’s Resolutions tend to be top-of-mind things – if something had really been bothering you for months, why wait until NYE to act on it? This means that you’re far less likely to have thought everything through.

After thinking this over for quite a long time, I’ve learned a great deal about setting goals for important changes. You have to try to make these changes:

  1. When you have a good plan as to what you’re trying to achieve or what you’re just trying to do as a regular practice.
  2. When you have everything you need to make it work.
  3. When you have enough headspace to think it through.
  4. When you won’t beat yourself up too badly if it goes wrong.

So have a Happy New Year and be gentle on yourself for a few days. If you really want to change something in your life, plan for it properly and you stand a much better chance of success. Don’t wait until a high stakes deadline to try and force change on yourself – it probably won’t work.

HNY2014


Matt Damon: Computer Science Superstar?

There was a recent article in Salon regarding the possible use of celebrity presenters, professional actors and the more photogenic to present course material in on-line courses. While Coursera believes that, in the words of Daphne Koller, “education is not a performance”, Udacity, as voiced by Sebastian Thrun, believes that we can model on-line education more in the style of a newscast. In the Udacity model, there is a knowledgeable team and the content producer (primary instructor) is not necessarily going to be the presenter. Daphne Koller’s belief is that the connection between student and teacher would diminish if actors were reading scripts that had content they didn’t deeply understand.

My take on this is fairly simple. I never want to give students the idea that the appearance of knowledge is an achievement in the same league as actually developing and being able to apply that knowledge. I regularly give talks about some of the learning and teaching techniques we use and I have to be very careful to explain that everything good we do is based on solid learning design and knowledge of the subject, which can be enhanced by good graphic design and presentation but cannot be replaced by these. While I have no doubt that Matt Damon could become a good lecturer in Computer Science, should he wish to, having him stand around and pretend to be one sends the wrong message.

Matt Damon demonstrating an extrinsic motivational technique called “fear of noisy death”.

(And, from the collaborative perspective, if we start to value pleasant appearance over knowledge, do we start to sort our students into groups by appearance and voice timbre? This is probably not the path we want to go down. For now, anyway.)


Three Stories: #1 What I Learned from Failure

It’s considered bad form to start ‘business stories’ with “Once upon a time” but there’s a strong edge of bard to my nature and it’s the end of a long year. (Let’s be generous.) So, are you sitting comfortably? (Ok, I’ll spare you ‘Once…’)

Many years ago, I went to university after a relatively undistinguished career at school. I got into a course that was not my first preference but, rather than wonder why I had not set the world on fire academically, I assumed that it was because I hadn’t really tried. The companion to this sentiment was that I could achieve whatever I wanted academically, as long as I really wanted it and actually tried. This concept got a fairly good workout over the next few years, despite evidence that I was heading in a downward spiral academically. What I became good at was barely avoiding failure, rather than excelling, and while this is a skill, it’s a dangerous line to try to walk. If you’re genuinely aiming to excel, which includes taking the requisite planning steps and making the time commitment you need, and you fall short, then you will probably still do quite well and pass. If you are aiming lower down, then missing that bar means failure.

What I didn’t realise at the time was that I was almost doomed to fail when I tried to set my own interpretation of what constituted the right level of effort and participation. If you are a student who has a good knowledge of the whole course, then you will have a pretty good idea of how you have answered questions in exams and what is required for assignments and, if you want to, you can choose to answer part of a question with a fair idea of how many marks are involved. If you don’t know the material in detail, then your perception of your own performance is going to be heavily filtered by your own lack of knowledge. (A reminder of a previous post on this for those who are new here or are vague post-Christmas.)

After some years out in the workforce, and coming back to do postgraduate study, I finally learned something from what should have been quite clear to me, if it hadn’t been hidden by two things: my firm conviction that I could change things immediately if I wished to, and my completely incorrect assumption that my own performance in a subject could be assessed by someone with my level of knowledge!

I became a good student because I finally worked out three key things (with a lot of help and support from my teachers and my friends):

  1. There is no “lower threshold” of knowledge that allows you to predict if you’re going to pass. If you have enough grasp of the course to know how much you need to do to pass, then you probably know enough to do much better than that! (Terry Pratchett covers this beautifully in a book called “Moving Pictures”, where a student has to know the course better than the teachers to maintain a very specific grade over the years.)
  2. Telling yourself that you “could have done better” is almost completely useless unless you decide to do better and put a plan in place to achieve that. This excuse gets you off the hook but, unless it’s teamed with remedial action, it’s just an excuse.
  3. Setting yourself up for failure is just as effective as setting yourself up for success, but it can be far subtler and comprised of many small actions that you don’t take, rather than a few actions that you do take.

Knowing what is going wrong (or thinking you do) doesn’t change anything unless you actively try to change it. It’s a simple truth that, I hope, is a useful and interesting story.


A Break in the Silence: Time to Tell a Story

It has been a while since I last posted here but that is a natural outcome of focusing my efforts elsewhere – at some stage I had to work out what I had time to do and do it. I always tell my students to cut down to what they need to do and, once I realised that the time I was spending on the blog was having one of the most significant impacts on my ability to juggle everything else, I had to eat my own dogfood and cut back on the blog.

Of course, I didn’t do it correctly because, instead of cutting back, I cut it out completely. Not quite what I intended, but here’s another really useful piece of information: if you decide to change something, then clearly work out how you are going to change it to achieve your goal. Which means, ahem, working out what your goals are first.

I’ve done a lot of interesting stuff over the last 6 months, and there is more to come, which means that I do have things to write about. I shall try to write about one a week as a minimum, rather than one per day. This is a pace that I hope to keep up and one that will mean that more of you will read more of what I write, rather than dreading the daily kiloword delivery.

I’ll briefly reflect here on some interesting work and seminars I’ve been looking at on business storytelling – taking a personal story, something authentic, and using it to emphasise a change in business behaviour or to emphasise a characteristic. I recently attended one of the (now defunct) One Thousand and One’s short seminars on engaging people with storytelling. (I’m reading their book “Hooked” at the moment. It’s quite interesting and refers to other interesting concepts as well.) I realise that such ideas, along with many of my notions of design paired with content, will have a number of readers peering at the screen and preparing a retort along the lines of “Storytelling? STORYTELLING??? Whatever happened to facts?”

Why storytelling? Because bald facts sometimes just don’t work. Without context, without a way to integrate information into existing knowledge and, more importantly, without some sort of established informational relationship, many people will ignore facts unless we do more work than just present them.

How many examples do you want? Climate change, vaccination, 9/11. All of these have heavily weighted bodies of scientific evidence that state what the answer should be, and yet there is powerful and persistent opposition based, largely, on myth and storytelling.

Education has moved beyond the rationing out of approved knowledge from the knowledge rich to those who have less. The tyrannical informational asymmetry of the single text book, doled out in dribs and drabs through recitation and slow scrawling at the front of the classroom, looks faintly ludicrous when anyone can download most of the resources immediately. And yet, as always, owning the book doesn’t necessarily teach you anything and it is the educator’s role as contextualiser, framer, deliverer, sounding board and value enhancer that survives the death of the drip-feed and the opening of the flood gates of knowledge. To think that storytelling is the delivery of fairytales, and that is all it can be, is to sell such a useful technique short.

To use storytelling educationally, however, we need to be focused on being more than just entertaining or engaging. Borrowing heavily from “Hooked”, we need to have a purpose in telling the story, it needs to be supported by data and it needs to be authentic. In my case, I have often shared stories of my time working with computer networks, in short bursts, to emphasise why certain parts of computer networking are interesting or essential (purpose); I provide enough information to show this is generally the case (data); and, because I’m talking about my own experiences, they ring true (authenticity).

If facts alone could sway humanity, we would have adopted Dewey’s ideas in the 1930s, instead of rediscovering the same truths decade after decade. If only the unembellished truth mattered, then our legal system would look very, very different. Our students are surrounded by talented storytellers and, where appropriate, I think those ranks should include us.

Now, I have to keep to the commitment I made 8 months ago, that I would never turn down the chance to have one of my cats on my lap when they wanted to jump up, and I wish you a very happy new year if I don’t post beforehand.


Skill Games versus Money Games: Disguising One Game As Another

I recently ran across a very interesting article on Gamasutra on the top tips for turning a Free To Play (F2P) game into a paying game by taking advantage of the way that humans think and act. F2P games are quite common but, obviously, it costs money to make a game, so there has to be some sort of associated revenue stream. In some cases, the F2P is a Lite version of the pay version, so after being hooked you go and buy the real thing. Sometimes there is an associated advertising stream, where the ads you view earn the producer enough money to cover costs. However, these simple approaches pale into insignificance when compared with the top tips in the link.

Ramin identifies two types of games for this discussion: games of skill, where it is your ability to make sound decisions that determines the outcome, and money games, where your success is determined by the amount of money you can spend. Games of chance aren’t covered here because, given that we’re talking about motivation and agency, a game of chance depends upon one specific blind spot (the human inability to deal sensibly with probability) rather than the range of issues identified in the article.

I don’t want to rehash the entire article, but the key points that I want to discuss are the notions of manipulating difficulty and ‘fun pain’. A game of skill is effectively fun until it becomes too hard. If you want people to keep playing, then you have to juggle the difficulty enough to make it challenging but not so hard that they stop playing. Even where you pay for a game up front, a single payment to play, you still want to get enough value out of it – too easy and you finish too quickly and feel that you’ve wasted your money; too hard and you give up in disgust, again convinced that you’ve wasted your money. Ultimately, in a pure game of skill, difficulty manipulation must be carefully considered. As the difficulty ramps up, the player is made uncomfortable (this is where the delightful term ‘fun pain’ applies), and resolving the difficulty removes the discomfort.

Or, you can just pay to make the problem go away. Suddenly your game of skill has two possible modes of resolution: play through increasing difficulty, at some level of discomfort or personal inconvenience, or, when things get hard enough, pump in a deceptively small amount of money to remove the obstacle. The secret of the F2P game that becomes successfully monetised is that it was always about the money in the first place; the initial rounds of the game were just enough to get you engaged to a point where you now have to pay in order to go further.
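To make the two modes of resolution concrete, here’s a toy sketch of the mechanic in Python. Every number in it – the skill level, the difficulty ramp, the frustration limit – is invented for illustration rather than taken from Ramin’s article; it just shows how a ramping difficulty curve converts either persistence or payment into progress.

    import random

    def play(skill=0.5, budget=0, frustration_limit=3):
        """Simulate one player; returns (levels_cleared, money_spent).

        All parameters are invented for illustration."""
        level, spent, frustration = 0, 0, 0
        while True:
            clear_chance = skill + 0.5 - 0.1 * level   # difficulty ramps with level
            if random.random() < clear_chance:
                level, frustration = level + 1, 0      # cleared on skill alone
            elif spent < budget:
                spent, level, frustration = spent + 1, level + 1, 0  # paid the obstacle away
            else:
                frustration += 1
                if frustration >= frustration_limit:
                    return level, spent                # gave up in disgust

    # With no budget, the player stalls where the difficulty outruns their
    # skill; give the same player a budget and the wall converts into spending.
    for budget in (0, 5):
        print(budget, play(budget=budget))

The point of the sketch is the middle branch: the moment a payment option exists, the difficulty curve stops being a skill dial and becomes a price dial.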

You can probably see where I’m going with this. While it would be trite to describe education as a game of skill, that is most definitely the most apt of the different games on offer. Progress in your studies should be a reflection of time invested in study, application and the development of ideas – not of being ‘lucky’, so the game of chance isn’t a fit. The entire notion of public education is founded on the principle that educational opportunities are open to all. So why do some parts of this ‘game’ feel like we’ve snuck in some covert monetisation?

I’m not talking about fees here, because the fee holds the place of the up-front price you pay to buy a game. You all pay the same fee and you then get the same opportunities – in theory, the only variable in what comes out is what the student puts in.

But what about textbooks? Unless the fee we charge automatically, and unavoidably, includes the cost of the textbook, we have now broken the game into two pieces: the entry fee and an ‘upgrade’. What about photocopying costs? Field trips? A laptop computer? An iPad? Home internet? Bus fare?

It would be disingenuous to place all of this at the feet of public education – it’s not actually the fault of Universities that financial disparity exists in the world. It is, however, food for thought about the things we put into our courses that are useful to our students but only available for a price – paid extras that allow improvement and progress in our courses. If someone with the textbook is better off than someone without the textbook, because we don’t provide a valid free alternative, then we have provided two-tiered difficulty. This is not the fun pain of playing a game; we are now talking about genuine student stress, a two-speed system and a very high risk that stressed students will disengage and leave.

From my earlier discussions on plagiarism, we can easily tie in Ramin’s notion of the driver of reward removal, where players have made so much progress that, on facing defeat, they will pay a fee to reduce the impact of failure; or, in some cases, to remove it completely. As Ramin notes:

“This technique alone is effective enough to make consumers of any developmental level spend.”

It’s not just lost time people are trying to get back, it’s the things that have been achieved in that time. Combine that with, in our case, the future employability and perception of that piece of paper, and we have a very strong behavioural driver. A number of the tricks Ramin describes don’t work as well on mature and aware thinkers, but this one is pretty reliable. If it’s enough to make people pay money, regardless of their developmental level, then there are lots of good design decisions we can draw from it: lower-risk assessment, more checkpointing, steady progress towards achievement. We know lots of good ways to avoid this, if we consider it to be a problem and want to take the time to design around it.

This is one of the greatest lessons I’ve learned about studying behaviour, even as a rank amateur. Observing what people do and trying to build systems that will work despite that makes a lot more sense than building a system that works to some ideal and trying to jam people into it. The linked article shows us how people are making really big piles of money by knowing how people work. It’s worth looking at to make sure that we aren’t, accidentally, manipulating students in the same way.


The defining question.

There has been a lot going on for me recently. A lot of thinking, a lot of work and a fair amount of getting involved in things because my students trust me and will come to me to ask questions. This sometimes puts me in the uncomfortable position of having to juggle accommodating my colleagues’ different approaches with my own beliefs, while acting in everyone’s best interests. I’m not going to go into details, but I think that I can summarise my position on everything, as an educator, in one question.

Is this course of action to the student’s benefit?

I mean, that’s it, isn’t it? If the job is educating students and developing the citizens of tomorrow, then everything that we do should be to the benefit of the student and/or future graduate. But it’s never simple, is it, because the utilitarian calculus to derive benefit quickly becomes complicated when we consider the effect of institutional reputation or perception on the future benefit to the student. But maybe that’s overthinking things (gasp, I hear regular readers cry). I’m not sure I know how to guide student behaviour to raise my University’s ranking in various measures – but I do know how to guide student behaviour to reduce the number of silly or thoughtless things they do, to enhance their learning and to help them engage. Maybe the simple question is the best? Will the actions I take today improve my students’ knowledge or enhance their capacity to learn? Have I avoided wasting their time on something that we do because we have always done it, rather than giving them something to do because it is what we should be doing? Am I always considering the benefit to the largest group of students, while considering the needs of the individual?

Every time I see a system that has a fixed measure of success, people optimise for it. If it’s maximum profit, people maximise profit. If it’s minimum space, people cut their space. Guidelines help a lot in working out which course of action to take: when faced with a choice between A and B, choose the option that maximises your objective. This even works without a strong vision of the future, which is good because I’m not sure we have a clear enough view of the long path to graduation to really be specific about this. There is always a risk that people will get the assessment of benefit wrong, which can lead to soft marking or lax standards, but I’m not a believer that post hoc harshness is the solution to inherited laxity from another system (especially where that may be a perception that’s not grounded in reality). Looking at all of my actions in terms of a real benefit, to the student, to their community, to our equality standards, to our society – that shines a bright light on what we do so we can clearly see what we’re doing and, if it requires change, illuminates the path to change.


Another semester, more lessons learned (mostly by me).

I’ve just finished the lecturing component for my first year course on programming, algorithms and data structures. As always, the learning has been mutual. I’ve got some longer posts to write on this at some time in the future but the biggest change for this year was dropping the written examination component down and bringing in supervised practical examinations in programming and code reading. This has given us some interesting results that we look forward to going through, once all of the exams are done and the marks are locked down sometime in late July.

Whenever I put in practical examinations, we encounter the strange phenomenon of students who can mysteriously write code in very short periods of time in a practical situation very similar to the practical examination, but suddenly lose the ability to write good code when they are isolated from the Internet, e-mail and other people’s code repositories. This is, thank goodness, not a large group (seriously, it shrinks the more prac exams I put in), but it does illustrate why we do it. If someone has a genuine problem with exam pressure, and that does occur, then of course we set things up so that they have more time and a different environment, as we do to support all of our students with special circumstances. But to be fair to everyone, and because this can be confronting, we pitch the problems at a level where early achievement is possible, and they are usually simpler versions of the types of programs that have already been set as assignment work. I’m not trying to trip people up here; I’m trying to develop the understanding that it’s not the marks for their programming assignments that are important, it’s the development of the skills.

I need the people who have not done their own work to realise that it probably didn’t lead to a good level of understanding or the ability to apply the skill as they would in the workforce. However, I need to do so in a way that isn’t unfair, so a lot of careful learning design goes in, even to the selection of how much each component is worth. The reminder that you should be doing your own work is not high stakes – 5-10% of the final mark at most – and builds up to a larger practical examination component, worth 30%, that comes after a total of nine practical programming assignments and a previous prac exam.

This year, I’m happy with the marks design because it takes fairly consistent failure to drop a student to the point where they are no longer eligible for redemption through additional work. The scope for achievement spans knowledge of course materials (on-line quizzes, in-class scratch-card quizzes and the written exam), programming with reference materials (programming assignments over 12 weeks), programming under more restricted conditions (the prac exams) and even group formation and open problem handling (with a team-based report on the use of queues in the real world). To pass, a student needs to do enough in all of these. To excel, they have to have a good, broad grasp of both the theoretical and the practical.

This is what I’ve been heading towards for this first-year course: a course that I am confident turns out students who are programmers and have enough knowledge of core computer science. Yes, students can (and will) fail – but only if they really don’t do enough in more than one of the target areas and then don’t focus on improving their results there. I will fail anyone who doesn’t meet the standard, but I have no wish to do any more of that than I need to. If people can come up to standard within the time and resource constraints we have, then they should pass. The trick is holding the standard at the right level while you bring the people up – and that takes a lot of help from my colleagues and my mentors, and from me constantly learning from my students and being open to changing the learning design until we get it right.
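As a back-of-the-envelope illustration of why consistent failure is required, here’s a minimal sketch of a weighted marks scheme in Python. Apart from the 30% prac exam figure mentioned above, every component name and weight is hypothetical – a shape like the course’s scheme, not the actual scheme.

    # Hypothetical weights: only the 30% final prac exam figure comes from the text.
    WEIGHTS = {
        "quizzes": 0.10,          # on-line and in-class quizzes
        "assignments": 0.30,      # nine practical programming assignments
        "early_prac_exam": 0.05,  # the low-stakes "do your own work" reminder
        "final_prac_exam": 0.30,  # the larger practical examination
        "written_exam": 0.20,
        "team_report": 0.05,
    }

    def final_mark(marks):
        """Weighted final mark; each component mark is a percentage (0-100)."""
        return sum(WEIGHTS[c] * marks.get(c, 0.0) for c in WEIGHTS)

    # Bombing one component while doing solid work elsewhere still passes:
    print(final_mark({"quizzes": 70, "assignments": 75, "early_prac_exam": 0,
                      "final_prac_exam": 60, "written_exam": 55,
                      "team_report": 70}))  # about 62 - a pass

Under a scheme shaped like this, no single disaster is fatal; a student only drops below a notional 50% pass mark by underperforming across several areas at once.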

Of course, there is always room for improvement, which means that the course goes back up on blocks while I analyse it. Again. Is this the best way to teach this course? Well, of course, what we will do now is to look at results across the course. We’ll track Prac Exam performance across all practicals, across the two different types of quizzes, across the reports and across the final written exam. We’ll go back into detail on the written answers to the code reading question to see if there’s a match for articulation and comprehension. We’ll assess the quality of response to the exam, as well as the final marked outcome, to tie this back to developmental level, if possible. We’ll look at previous results, entry points, pre-University marks…

And then we’ll teach it again!


The Continuum of Ethical Challenge: Why the Devil Isn’t Waiting in the Alleyway and The World is Harder than Bioshock.

This must be a record for a post title but I hope to keep the post itself shortish. Years ago, when I was still at school, a life counsellor (who was also a pastor) came to talk to us about life choices and ethics. He was talking about the usual teen cocktail: sex, drugs and rebellion. However, he made an impression on me by talking about his early idea of temptation. Because of the fire and brimstone preaching he’d grown up with, he half expected temptation to take the form of the Devil, beckoning him into an alleyway to take an illicit drag on a cigarette. As he grew up, and grew wiser, he realised that living ethically was really a constant set of choices, interlocking or somewhat dependent, rather than an easy life periodically interrupted by strictly defined challenges that could be overcome with a quick burst of willpower.

(Image: an alley with green eyes floating, superimposed over it.)

It’s still a creepy mental image, of course.

I recently started replaying the game Bioshock, which I have previously criticised elsewhere, and was struck by the facile nature of the much-vaunted ethical aspect of the game play. For those who haven’t played it, you basically have a choice between slaughtering or saving little girls – apart from that, you have very little agency or ability to change the path you’re on. In fact, rather than providing you with the continual dilemma of whether you should observe, ignore or attack the inhabitants of the game world, the game very quickly makes you realise that there are no ‘good’ people in the world (or none that you are actually allowed to attack; they are all carefully shielded from you), so you can reduce your ‘choices’ when encountering a figure crouching over a pram to “should I bludgeon her to death, or set her on fire and shoot her in the head?” (It’s ok: if you try anything approaching engagement, she will try to kill you.) In fact, one of the few ‘innocents’ in the game is slaughtered in front of you while you watch impotently. So your ethical engagement is restricted, at very distinctly defined intervals, to either harvesting or rescuing the little girls who have been stolen from orphanages and turned into corpse-scavenging monsters. This is as ridiculous as the intermittent Devil in the alleyway – in fact, probably more so!

I completely agree with that counsellor from (goodness) 30 years ago – it would be nonsense to assume that tests of our ethics can be conveniently compartmentalised to a time when our resolve is strong and can be so easily predicted. The Bioshock model (or models like it, such as Call of Duty 4, where everyone is an enemy or can’t be shot in a way that affects our game beyond a waggled finger and being taken back to a previous save) is flawed because of the limited extent of the impact of the choices you make – in fact, Bioshock is particularly egregious because the ‘outcome’ of your moral choice has no serious game impact except to show you a different movie at the end. Before anyone says “it’s only a game”, I agree, but the developers were the ones who imposed the notion that this ethical choice made a difference. Games such as Deus Ex gave you largely un-cued opportunities to intervene or not – with changes to the game world depending on what happened. As a result, people playing Deus Ex had far more moral engagement with the game, and everyone I’ve spoken to felt as if they were making the choices that led to the outcome: autonomy, mastery and purpose, anyone? That was in 2000 – very few games actually see the world as one that you can influence (although some games are now coming up to par on this).

I think about this a lot in my learning design. While my students may recognise ethical choices in the real world, I am always concerned that a learning design that reduces their activities to high-stakes hurdle challenges will mimic the situation where we have, effectively, put the Devil in the alleyway: you only need to switch on your ‘ethical’ brain at that point. I posed a question to my students in their sample exam where I proposed that they had commissioned someone to write their software for an assignment – and then asked them to think about the effect that this decision would have on their future self in terms of knowledge development, if we assumed that they would always be better prepared if they did the work themselves. This takes the focus away from the day or so leading up to an individual assignment and starts to encourage continuum thinking, where every action is taken as part of a whole life of ethical actions. I’m a great believer that skills only develop with practice and knowledge only stays in your head when you reinforce it, so any opportunity to encourage further development of ethical thinking is to be encouraged!


“Hi, my name is Nick and I specialise in failure.”

I recently read an article on survivorship bias on the “You Are Not So Smart” website, via Metafilter. While the whole story addressed the World War II Statistical Research Group, it focused on the insight contributed by Abraham Wald, a statistician. Allied bomber losses in World War II were large, very large, and any chance of reducing these losses was incredibly valuable. The question was “How could the US improve their chances of bringing their bombers back intact?” Bombers landing back after missions were full of holes, but armour just can’t be strapped willy-nilly onto a plane without it becoming land-locked. (There’s a reason that birds are so light!) The answer, initially, was obvious: find the places where the most holes were, by surveying the fleet, and patch them. Put armour on the colander sections and, voila, increased survival rate.

No, said Wald. That wouldn’t help.

Wald’s logic is both simple and convincing. If a plane was coming back with those holes in place, then the holes in the skin were not leading to catastrophic failure – they couldn’t have been, if the planes were returning! The survivors were not showing the damage that would have led to them becoming lost aircraft. Wald used the already collected information on the damage patterns to work out how much damage each component could take, and the likelihood of this occurring during a bombing run, based on the kind of fire the aircraft encountered.
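Wald’s argument is easy to reproduce in a toy simulation. In the sketch below, every section of a hypothetical bomber is hit equally often, but the sections differ in how lethal a hit is; the section names and all the probabilities are invented for illustration, not taken from the article.

    import random

    SECTIONS = ["engine", "cockpit", "fuselage", "wings", "tail"]
    # Invented probability that a single hit to this section downs the plane.
    LETHALITY = {"engine": 0.8, "cockpit": 0.7, "fuselage": 0.1,
                 "wings": 0.15, "tail": 0.2}

    def observed_damage(n_planes=10_000, hits_per_plane=3):
        """Tally hits per section, counting only planes that made it home."""
        counts = {s: 0 for s in SECTIONS}
        for _ in range(n_planes):
            hits = [random.choice(SECTIONS) for _ in range(hits_per_plane)]
            survived = all(random.random() > LETHALITY[s] for s in hits)
            if survived:               # we can only inspect the survivors
                for s in hits:
                    counts[s] += 1
        return counts

    print(observed_damage())

Run it and the fuselage and wings dominate the observed damage, even though every section is hit equally often: the planes hit in the engine or cockpit simply never came back to be counted. Armouring the hole-riddled sections would armour exactly the places a plane can afford to be hit.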

It’s worth reading the entire article because it’s a simple and powerful idea – attributing magical properties to the set of steps taken by people who have become ultra-successful is not going to be as useful as looking at what happened to take people out of the pathway to success. If you’ve read Steve Jobs’ biography then you’re aware that he had a number of interesting traits, only some of which may have led to him becoming as successful as he did. Of course, if you’ve been reading a lot, you’ll be aware of the importance of Paul Jobs, Steve Wozniak, Robert Noyce, Bill Gates, Jony Ive, John Lasseter and, of course, his wife, Laurene Powell Jobs. So the whole “only eating fruit” thing, the “reality distortion field” thing and the “not showering” thing (some of which he changed, some he didn’t) – which of these are the important things? Jobs, like many successful people, failed at some of his endeavours, but never in a way that completely wiped him out. Obviously. When he wasn’t succeeding, he was interesting, because we could look at the steps that took him down and say “Oh, don’t do that”, assuming it was something that could be changed or avoided. When he was succeeding, there were so many other things in play – what had happened to him so far, who his friends were, how many resources he had – that it’s hard to give good advice on what to do.

I have been studying failure for some time: firstly in myself, and now in my students. I look for the decisions, or behaviours, that lead to students struggling in their academic achievement or, in some cases, falling away completely. The majority of the students who come to me with a high level of cultural, financial and social resources are far less likely to struggle because, even when faced with a set-back, they rarely hit the point where they can’t bounce back – although, sadly, it does happen, just in far fewer numbers. When they do fall over, it is for the same reasons as my less-advantaged students, who simply do so in far greater numbers because they have less resilience to the set-backs. By studying failure, and the lessons learned and the things to be avoided, I can help all of my students, and this does not depend upon their starting level. If I were studying the top 5% of students, especially those who had never received a mark less than an A+, I would be surprised if I could learn much that I could usefully apply to those in the C- bracket. The reverse, however? There’s gold to be mined there.

By studying the borderlines and by looking for patterns in the swirling dust left by those departing, I hope that I can find things which reduce failure everywhere – because every time someone fails, we run the risk of not getting them back simply because failure is disheartening. Better yet, I hope to get something that is immediately usable, defensible and successful. Probably rather a big ask for a protracted study of failure!