Time Banking and Plagiarism: Does “Soul Destroying” Have An Ethical Interpretation?
Posted: June 25, 2012 Filed under: Education | Tags: advocacy, blogging, design, education, educational problem, feedback, higher education, in the student's head, learning, plagiarism, resources, student perspective, teaching, teaching approaches, time banking, tools, work/life balance, workload | 4 Comments

Yesterday, I wrote a post on the 40 hour week, to give an industrial basis for the notion of time banking, and I talked about the impact of overwork. One of the things I said was:
The crunch is a common feature in many software production facilities and the ability to work such back-breaking and soul-destroying shifts is often seen as a badge of honour or mark of toughness. (Emphasis mine.)
Back-breaking is me being rather overly emphatic regarding the impact of work, although in manual industries workplace accidents caused by fatigue and overwork can and do break backs – and worse – on a regular basis.
But soul-destroying? Am I just saying that someone will perform their tasks as an automaton or zombie, or am I saying something more about the benefit of full cognitive function – the soul as an amalgam of empathy, conscience, consideration and social factors? Well, the answer is that, when I wrote it, I was talking about mindlessness and the removal of the ability to take joy in work, which is on the zombie scale, but as I’ve reflected on the readings more, I am now convinced that there is an ethical dimension to fatigue-related cognitive impairment that is important to talk about. Basically, the more tired you get, the more your focus narrows to the task itself, and this can have some serious professional and ethical consequences. I’ll provide a basis for this throughout the rest of this post.
The paper I was discussing, on why Crunch Mode doesn’t work, listed many examples from industry and one very interesting paper from the military. The paper, which had a broken link in the Crunch mode paper, may be found here and is called “Sleep, Sleep Deprivation, and Human Performance in Continuous Operations” by Colonel Gregory Belenky. Now, for those who don’t know, in 1997 I was a commissioned Captain in the Royal Australian Armoured Corps (Reserve), on detachment to the Training Group to set up and pretty much implement a new form of Officer Training for Army Reserve officers in South Australia. Officer training is a very arduous process and places candidates, the few who make it in, under a lot of stress and does so quite deliberately. We have to have some idea that, if terrible things happen and we have to deploy a human being to a war zone, they have at least some chance of being able to function. I had been briefed on most of the issues discussed in Colonel Belenky’s paper but it was only recently that I read through the whole thing.
And, to me today as an educator (I resigned my commission years ago), there are still some very important lessons, guidelines and warnings for all of us involved in the education sector. So stay with me while I discuss some of Belenky’s terminology and background. The first term I want to introduce is droning: the loss of cognitive ability through lack of useful sleep. As Belenky puts it, in the context of US Army Ranger training:
…the candidates can put one foot in front of another and respond if challenged, but have difficulty grasping their situation or acting on their own initiative.
What was most interesting, and may surprise people who have never served with the military, is that the higher the rank, the less sleep people got – and the higher level the formation, the less sleep people got. A Brigadier in charge of a Brigade is going to, on average, get less sleep than the more junior officers in the Brigade and a lot less sleep than a private soldier in a squad. As an officer, my soldiers were fed before me, rested before me and a large part of my day-to-day concern was making sure that they were kept functioning. This keeps on going up the chain and, as you go further up, things get more complex. Sadly, the people shouldering the most complex cognitive functions with the most impact on the overall battlefield are also the people getting the least fuel for their continued cognitive endeavours. They are the most likely to be droning: going about their work in an uninspired way and not really understanding their situation. So here is more evidence from yet another place: lack of sleep and fatigue lead to bad outcomes.
One of the key issues Belenky talks about is the loss of situational awareness caused by the accumulated sleep debt, fatigue and overwork suffered by military personnel. He gives an example of an Artillery Fire Direction Centre – this is where requests for fire support (big guns firing large shells at locations some distance away) come to and the human plotters take your requests, transform them into instructions that can be given to the gunners and then firing starts. Let me give you a (to me) chilling extract from the report, which the Crunch Mode paper also quoted:
Throughout the 36 hours, their ability to accurately derive range, bearing, elevation, and charge was unimpaired. However, after circa 24 hours they stopped keeping up their situation map and stopped computing their pre-planned targets immediately upon receipt. They lost situational awareness; they lost their grasp of their place in the operation. They no longer knew where they were relative to friendly and enemy units. They no longer knew what they were firing at. Early in the simulation, when we called for simulated fire on a hospital, etc., the team would check the situation map, appreciate the nature of the target, and refuse the request. Later on in the simulation, without a current situation map, they would fire without hesitation regardless of the nature of the target. (All emphasis mine.)
Here, perhaps, is the first inkling of what I realised I meant by soul destroying. Yes, these soldiers are overworked to the point of droning and are now shuffling towards zombiedom. But, worse, they have no real idea of their place in the world and, perhaps most frighteningly, despite knowing that accidents happen when fire missions are requested and having direct experience of rejecting what would have resulted in accidental hospital strikes, these soldiers have moved to a point of function where the only thing that matters is doing the work and calling the task done. This is an ethical aspect because, from their previous actions, it is quite obvious that there was both a professional and ethical dimension to their job as the custodians of this incredibly destructive weaponry – deprive them of enough sleep and they calculate and fire, no longer having the cognitive ability (or perhaps the will) to be ethical in their delivery. (I realise a number of you will have choked on your coffee slightly at the discussion of military ethics but, in the majority of cases, modern military units have a strong ethical code, even to the point of providing a means for soldiers to refuse to obey illegal orders. Most failures of this system in the military can be traced to failures in a unit’s ethical climate or to undetected instability in the soldiers: much as in the rest of the world.)
The message, once again, is clear. Overwork, fatigue and sleeplessness reduce the ability to perform as you should. Belenky even notes that the ability to benefit from training quite clearly deteriorates as the fatigue levels increase. Work someone hard enough, or let them work themselves hard enough, and not only aren’t they productive, they can’t learn to do anything else.
The notion of situational awareness is important because it’s a measure of your sense of place, in an organisational sense, in a geographical sense, in a relative sense to the people around you and also in a social sense. Get tired enough and you might swear in front of your grandma because your social situational awareness is off. But it’s not just fatigue over time that can do this: overloading someone with enough complex tasks can stress cognitive ability to the point where similar losses of situational awareness can occur.
Helmet fire is a vivid description of what happens when you have too many tasks to do, under highly stressful situations, and you lose your situational awareness. If you are a military pilot flying on instruments alone, especially with low or zero visibility, then you have to follow a set of procedures, while regularly checking the instruments, in order to keep the plane flying correctly. If the number of tasks that you have to carry out gets too high, and you are facing the stress of effectively flying the plane visually blind, then your cognitive load limits will be exceeded and you are now experiencing helmet fire. You are now very unlikely to be making any competent contributions at all at this stage but, worse, you may lose your sense of what you were doing, where you are, what your intentions are, which other aircraft are around you: in other words, you lose situational awareness. At this point, you are now at a greatly increased risk of catastrophic accident.
To summarise, if someone gets tired, stressed or overworked enough, whether acutely or over time, their performance goes downhill, they lose their sense of place and they can’t learn. But what does this have to do with our students?
A while ago I posted thoughts on a triage system for plagiarists – allocating our resources to those students we have the most chance of bringing back to legitimate activity. I identified the three groups as: sloppy (unintentional) plagiarism, deliberate (but desperate and opportunistic) plagiarism and systematic cheating. I think that, from the framework above, we can now see exactly where the majority of my ‘opportunistic’ plagiarists are coming from: sleep-deprived, fatigued and (by their own hands or not) over-worked students losing their sense of place within the course and becoming focused only on the outcome. Here, the sense of place is not just geographical: it is their role in the social and formal contracts that they have entered into with lecturers, other students and their institution – their place in the agreements for ethical behaviour, which require them to do the work themselves and to submit only that.
If professional soldiers who have received very large amounts of training can forget where their own forces are, sometimes to the tragic extent that they fire upon and destroy them, or become so cognitively impaired that they carry out the mission, and only the mission, with little of their usual professionalism or ethical concern, then it is easy to see how a student can become so task-focused that they start to think only about ending the task, by any means, to reduce the cognitive load and to allow themselves to get the sleep that their body desperately needs.
As always, this does not excuse their actions if they resort to plagiarism and cheating – it explains them. It also provides yet more incentive for us to try and find ways to reach our students and help them form systems for planning and time management that bring them closer to the 40 hour ideal, that reduce the all-nighters and the caffeine binges, and that allow them to maintain full cognitive function as ethical, knowledgeable and professional skill practitioners.
If we want our students to learn, it appears that (for at least some of them) we first have to help them to marshal their resources more wisely and keep their awareness of exactly where they are, what they are doing and, in a very meaningful sense, who they are.
Time Banking: Aiming for the 40 hour week.
Posted: June 24, 2012 Filed under: Education | Tags: education, educational problem, higher education, in the student's head, learning, measurement, MIKE, principles of design, resources, student perspective, teaching, teaching approaches, time banking, tools, universal principles of design, work/life balance | 5 Comments

I was reading an article on metafilter on the perception of future leisure from earlier last century and one of the commenters linked to a great article on “Why Crunch Mode Doesn’t Work: Six Lessons” via the International Game Designers Association. This article was partially in response to the quality of life discussions that ensued after ea_spouse outed the lifestyle (LiveJournal link) caused by her spouse’s ludicrous hours working for Electronic Arts, a game company. One of the key quotes from ea_spouse was this:
Now, it seems, is the “real” crunch, the one that the producers of this title so wisely prepared their team for by running them into the ground ahead of time. The current mandatory hours are 9am to 10pm — seven days a week — with the occasional Saturday evening off for good behavior (at 6:30pm). This averages out to an eighty-five hour work week. Complaints that these once more extended hours combined with the team’s existing fatigue would result in a greater number of mistakes made and an even greater amount of wasted energy were ignored.
This is an incredible workload and, as Evan Robinson notes in the “Crunch Mode” article, it is not only incredible but downright stupid, because every serious investigation into the effects of working more than 40 hours a week for extended periods, of reduced sleep, and of accumulated sleep deficit has come to the same conclusion: hours worked after a certain point are not just worthless, they reduce worth from hours already worked.
Robinson cites studies and practices from industrialists such as Henry Ford, who reduced shift length to a 40-hour work week in 1926, attracting huge criticism, because 12 years of research had shown that the shorter work week meant more output, not less. These studies have been going on since the 18th century and well into the 1960s at least, and they all show the same thing: working eight hours a day, five days a week gives you more productivity because you make fewer mistakes, you accumulate less fatigue, and your workers are producing during their optimal production times (the first 4-6 hours of work) without sliding into their negatively productive zones.
As Robinson notes, the games industry doesn’t seem to have got the memo. The crunch is a common feature in many software production facilities and the ability to work such back-breaking and soul-destroying shifts is often seen as a badge of honour or mark of toughness. The fact that you can get fired for having the audacity to try and work otherwise also helps a great deal in motivating people to adopt the strategy.
Why spend so many hours in the office? Remember when I said that it’s sometimes hard for people to see what I’m doing because, when I’m thinking or planning, I can look like I’m sitting in the office doing nothing? Imagine what it looks like if, two weeks before a big deadline, someone walks into the office at 5:30pm and everyone’s gone home. What does this look like? Because of our conditioning, which I’ll talk about shortly, it looks like we’ve all decided to put our lives before the work – it looks like less than total commitment.
As a manager, if you can tell everyone above you that you have people at their desks 80+ hours a week and will have for the next three months, then you’re saying that “this work is important and we can’t do any more.” The fact that people were probably only useful for the first 6 hours of every day, and even then only for the first couple of months, doesn’t matter because it’s hard to see what someone is doing if all you focus on is the output. Those 80+ hour weeks are probably only now necessary because everyone is so tired, so overworked and so cognitively impaired, that they are taking 4 times as long to achieve anything.
Yes, that’s right. All the evidence says that more than 2 months of overtime and you would have been better off staying at 40 hours/week in terms of measurable output and quality of productivity.
Robinson lists six lessons, which I’ll summarise here because I want to talk about them in terms of students, and about why forward planning for assignments is good practice for smoother time management in the future. Here are the six lessons:
- Productivity varies over the course of the workday, with greatest productivity in the first 4-6 hours. After enough hours, you become unproductive and, eventually, destructive in terms of your output.
- Productivity is hard to quantify for knowledge workers.
- Five day weeks of eight-hour days maximise long-term output in every industry that has been studied in the past century.
- At 60 hours per week, the loss of productivity caused by working longer hours overwhelms the extra hours worked within a couple of months.
- Continuous work reduces cognitive function 25% for every 24 hours. Multiple consecutive overnighters have a severe cumulative effect.
- Error rates climb with hours worked and especially with loss of sleep.
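The fourth lesson can be made concrete with a toy model. This is a minimal sketch, not Robinson’s own calculation: the 10%-per-crunch-week productivity decay is an invented, illustrative parameter, chosen only to show the shape of the effect, not a measured value.

```python
def cumulative_output(hours_per_week, weeks, fatigue_decay=0.0):
    """Sum effective output over `weeks`, with per-week productivity
    decaying by the fraction `fatigue_decay` whenever hours exceed 40."""
    productivity = 1.0
    total = 0.0
    for _ in range(weeks):
        total += hours_per_week * productivity
        if hours_per_week > 40:
            productivity *= (1 - fatigue_decay)
    return total

# Crunch pulls ahead early, but the gap closes and then reverses.
for week in range(1, 13):
    steady = cumulative_output(40, week)
    crunch = cumulative_output(60, week, fatigue_decay=0.10)
    marker = "  <-- crunch now behind" if crunch < steady else ""
    print(f"week {week:2d}: steady {steady:6.1f}, crunch {crunch:6.1f}{marker}")
```

With these (assumed) numbers, the 60-hour team falls behind the 40-hour team at around week 10 – roughly the “couple of months” the studies report – and the gap only widens from there.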
My students have approximately 40 hours of assigned work a week, consisting of contact time and assignments, but many of them never really think about that. Most plan other things around their ‘free time’ (they may need to work, they may play in a band, they may be looking after families or they may have an active social life) and they fit the assignment work and other study into the gaps that are left. Immediately, they will be over the 40 hour mark for work. If they have a part-time job, the three months of one of my semesters will, if not managed correctly, give them a lumpy time schedule alternating between some work and far too much work.
Many of my students don’t know how they are spending their time. They switch on the computer, look at the assignment, Skype, browse, try something, compile, walk away, grab a bite, web surf, try something else – wow, three hours of programming! This assignment is really hard! That’s not all of them but it’s enough of them that we spend time on process awareness: working out what you do so you know how to improve it.
Many of my students see sports drinks, energy drinks and caffeine as a licence to not sleep. It doesn’t work long term as most of us know, for exactly the reasons that long term overwork and sleeplessness don’t work. Stimulants can keep you awake but you will still be carrying most if not all of your cognitive impairment.
Finally, and most importantly, enough of my students don’t realise that everything I’ve said up until now means that they are trying to sit my course with half a brain after about the halfway point, if not sooner if they didn’t rest much between semesters.
I’ve talked about the theoretical basis for time banking and the pedagogical basis for time banking: this is the industrial basis for time banking. One day I hope that at least some of my students will be running parts of their industries and that we have taught them enough about sensible time management and work/life balance that, as people in control of a company, they look at real measures of productivity, they look at all of the masses of data supporting sensible ongoing work rates and that they champion and adopt these practices.
As Robinson says towards the end of the article:
Managers decide to crunch because they want to be able to tell their bosses “I did everything I could.” They crunch because they value the butts in the chairs more than the brains creating games. They crunch because they haven’t really thought about the job being done or the people doing it. They crunch because they have learned only the importance of appearing to do their best instead of really doing their best. And they crunch because, back when they were programmers or artists or testers or assistant producers or associate producers, that was the way they were taught to get things done. (Emphasis mine.)
If my students can see all of their requirements ahead of time, know what is expected, have been given enough process awareness, and have the will and the skill to undertake the activities, then we can potentially teach them a better way to get things done if we focus on time management in a self-regulated framework, rather than imposed deadlines in a rigid authority-based framework. Of course, I still have a lot of work to do to demonstrate that this will work but, from industrial experience, we have yet another very good reason to try.
Flow, Happiness and the Pursuit of Significance
Posted: June 22, 2012 Filed under: Education | Tags: Csíkszentmihályi, curriculum, education, educational research, flow, higher education, learning, measurement, MIKE, reflection, resources, student perspective, teaching, teaching approaches, time banking, tools, universal principles of design, vygotsky, Zone of proximal development | Leave a comment

I’ve just been reading Deirdre McCloskey’s article on “Happyism” in The New Republic. While there are a number of points I could pick at in the article (I question her specific example of statistical significance, and I think she’s oversimplified a number of the philosophical points), there are a lot of interesting thoughts and arguments within it.
One of my challenges in connecting with my students is that of making them understand what the benefit is to them of adopting, or accepting, suggestions from me as to how to become better as discipline practitioners, as students and, to some extent, as people. It would be nice if doing the right thing in this regard could give the students a tangible and measurable benefit that they could accumulate on some sort of meter – I have performed well, my “success” meter has gone up by three units. As McCloskey points out, this effectively requires us to have a meter for something that we could call happiness, but it is then tied directly to events that give us pleasure, rather than a sequence of events that could give us happiness. Workflows (chains of actions that lead to an eventual outcome) can be assessed for accuracy and then the outcome measured, but it is only when the workflow is complete that we can assess the ‘success’ of the workflow and then derive pleasure, and hence happiness, from the completion of the workflow. Yes, we can compose a workflow from sub-workflows but we will hit the same problem if we focus on an outcome-based model – at some stage, we are likely to be carrying out an action that can lead to an event from which we can derive a notion of success, but this requires us to be foresighted and see the events as a chain that results in this outcome.
And this is very hard to meter and display in a way that says anything other than “Keep going!” Unsurprisingly, this is not really the best way to provide useful feedback, reward or fodder for self-actualisation.
I have a standing joke that, as a runner, I go to a sports doctor because if I go to a General Practitioner and say “My leg hurts after I run”, the GP will just say “Stop running.” I am enough of a doctor to say that to myself – so I seek someone who is trained to deal with my specific problems and who can give me a range of feedback that may include “stop running” because my injuries are serious or chronic, but can provide me with far more useful information from which I can make an informed choice. The happiness meter must be able to work with workflow in some way that is useful – keep going is not enough. We therefore need to look at the happiness meter.
McCloskey identifies Bentham, founder of utilitarianism, as the original “pleasure meter” proponent, and addresses how his felicific calculus subverts our assessment of “happiness units” (utils) into a form that assumes that we can reasonably compare utils between different people and that we can assemble all of our life’s experiences in a meaningful way in terms of utils in the first place!
To address the issue of workflow itself, McCloskey refers to the work of Mihály Csíkszentmihályi on flow: “the absorption in a task just within our competence”. I have talked about this before, in terms of Vygotsky’s zone of proximal development and the use of a group to assist people who are just outside of the zone of flow. The string of activities can now be measured in terms of satisfaction or immersion, as well as the outcomes of this process. Of course, we have the outcomes of the process in terms of direct products and we have outcomes in terms of personal achievement at producing those products. Which of these go onto the util meter, given that they are utterly self-assessed, subjective and, arguably, orthogonal in some cases? (If you have ever done your best, been proud of what you did, but failed in your objective, you know what I’m talking about.)
My reading of McCloskey is probably a little generous because I find her overall argument appealing. I believe that her argument may be distilled as:
- If we are going to measure, we must measure sensibly and be very clear in our context and the interpretation of significance.
- If we are going to base any activity on our measurement, then the activity we create or change must be related to the field of measurement.
Looking at the student experience in this light, asking students if they are happy with something is, ultimately, a pointless activity unless I either provide well-defined training in my measurement system and scale, or I am looking for a measurement of better or worse. This is confounded by simple cognitive biases including, but not limited to, the Hawthorne Effect and confirmation bias. However, measuring what my students are doing, as Csíkszentmihályi did in the flow experiments, will show me if they are so engaged with their activities that they are staying in the flow zone. Similarly, looking at participation and measuring outputs in collaborative activities where I would expect the zone of proximal development to be in effect is going to be far more revealing than asking students if they liked something or not.
As McCloskey discusses, there is a point at which we don’t seem to get any happier, but it is very hard to tell whether this is a fault in our measurement and our presumption of a three-point non-interval scale; the discussion then often degenerates into a form of intellectual snobbery that, unsurprisingly, favours the elites who will be studying the non-elites. (As an aside, I learnt a new word. Clerisy: “A distinct class of learned or literary people”. If you’re going to talk about the literate elites, it’s nice to have a single word to do so!) In student terms, does this mean that there is a point at which even the most keen of our best and brightest will not try some of our new approaches? The question, of course, is whether the pursuit of happiness is paralleling the quest for knowledge, or whether this is all one long endured workflow that results in a pleasure quantum labelled ‘graduation’.
As I said, I found it to be an interesting and thoughtful piece, despite some problems, and I recommend it to you, even if we must then start a large debate in the comments on how much I misled you!
The Many Types of Failure: What Does Zero Mean When Nothing Is Handed Up?
Posted: June 18, 2012 Filed under: Education | Tags: advocacy, blogging, curriculum, design, education, educational problem, educational research, higher education, in the student's head, learning, measurement, MIKE, principles of design, reflection, resources, student perspective, teaching, teaching approaches, thinking, time banking, tools, workload | 3 Comments

You may have read about the Edmonton, Canada, teacher who expected to be sacked for handing out zeros. It’s been linked to sites as diverse as Metafilter, where a long and interesting debate ensued, and Cracked, where it was labelled one of the ongoing ‘pussifications’ of schools. (Seriously? I know you’re a humour site but was there some other way you could have put that? Very disappointed.)
Basically, the Edmonton Public School Board decided that, rather than just give a zero for a missed assignment, the miss would be used as a cue for follow-up work and additional classes at school or home. Their argument: you can’t mark work that hasn’t been submitted, so let’s use this as a trigger to try and get submission, in case the source of the problem is external or behavioural. This, of course, puts the onus on the school to track the students, get the additional work completed, and then mark out of sequence. Lynden Dorval, the high school teacher who is at the centre of this, believes that there is too much manpower involved in doing this and that giving the student a zero forces them to come to you instead.

Some of you may never have seen one of these before. This is a zero, which is the lowest mark you can be awarded for any activity. (I hope!)
Now, of course, this has split people into two fairly neat camps – those who believe that Dorval is the “hero of zero” and those who can see the benefit of the approach, including taking into account that students still can fail if they don’t do enough work. (Where do I stand? I’d like to know a lot more than one news story before I ‘pick a side’.) I would note that a lot of tired argument and pejorative terminology has also come to the fore – you can read most of the buzzwords used against ‘progressives’ in this article, if you really want to. (I can probably summarise it for you but I wouldn’t do it objectively. This is just one example of those who are feting Dorval.)
Of course, rather than get into a heated debate where I really don’t have enough information to contribute, I’d rather talk about the basic concept – what exactly does a zero mean? If you hand something in and it meets none of my requirements, then a zero is the correct and obvious mark. But what happens if you don’t hand anything in?
With the marking approach that I practice and advertise, which uses time-based mark penalties for late submission, students are awarded marks for what they get right, rather than have marks deducted for what they do wrong. Under this scheme, “no submission” gives me nothing to mark, which means that I cannot give you any marks legitimately – so is this a straightforward zero situation? The time penalties are in place as part of the professional skill requirements and are clearly advertised, and consistently policed. I note that I am still happy to give students the same level of feedback on late work, including their final mark without penalty, which meets all of the pedagogical requirements, but the time management issues can cost a student some, most or all of their marks. (Obviously, I’m actively working on improving engagement with time management through mechanisms that are not penalty based but that’s for other posts.)
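The marks-then-penalty approach can be sketched in a few lines. To be clear, I haven’t stated my actual penalty schedule here, so the 25%-per-day figure in this sketch is a hypothetical placeholder, not my real policy; the shape of the scheme is what matters.

```python
def final_mark(raw_mark, days_late, penalty_per_day=0.25):
    """Return (recorded mark, unpenalised mark).

    The unpenalised mark is always reported back as feedback; the recorded
    mark decays with lateness, floored at zero. A raw_mark of None means
    nothing was handed in, so there is nothing to mark.
    """
    if raw_mark is None:
        return 0.0, None
    penalty = min(1.0, days_late * penalty_per_day)
    return raw_mark * (1 - penalty), raw_mark

print(final_mark(80, 0))   # on time: (80.0, 80)
print(final_mark(80, 2))   # two days late: (40.0, 80)
print(final_mark(80, 5))   # penalty capped: (0.0, 80)
print(final_mark(None, 0)) # no submission: (0.0, None)
```

Note the distinction the scheme preserves: a very late submission and no submission can both record zero, but only one of them produces feedback and an unpenalised mark, which is the pedagogically important part.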
As an aside, we have three distinct fail grades for courses at my University:
- Withdraw Fail (WF), where a student has dropped the course but after the census date. They pay the money, it stays on their record, but as a WF.
- Fail (F), student did something but not enough to pass.
- Fail No Submission (FNS), student submitted no work for assessment throughout the course.
Interestingly, for my Uni, FNS has a numerical grade of 0, although this is not shown on the transcript. Zero, in the course sense, means that you did absolutely nothing. In many senses, this represents the nadir of student engagement, given that many courses have somewhere from 1-5%, maybe even 10%, of the marks available for very simple activities that require very little effort.
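The three fail grades reduce to a small decision rule. A sketch, assuming (as in the list above) that the student has already failed the course; the function and parameter names are mine, not my University’s:

```python
def fail_grade(submitted_any_work, withdrew_after_census=False):
    """Map a failed course outcome to one of the three fail grades."""
    if withdrew_after_census:
        return "WF"   # dropped after census: paid, and it stays on the record
    if not submitted_any_work:
        return "FNS"  # no work at all: numerical grade 0, hidden on transcript
    return "F"        # did something, but not enough to pass

print(fail_grade(True))                              # F
print(fail_grade(False))                             # FNS
print(fail_grade(False, withdrew_after_census=True)) # WF
```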
My biggest problem with late work, or no submission, is that one of the strongest messages from that enormous data corpus of student submissions that I keep talking about is that starting a pattern of late or no submission is an excellent indicator of reduced overall performance and, with recent analysis, a sharply decreased likelihood of making it to third year (final year) in your college studies. So I really want students to hand something in – which brings me to the crux of the way that we deal with poor submission patterns.
Whichever approach I take should be the one that is most likely to bring students back into a regular submission pattern.
If the Public School Board’s approach is increasing completion rates now, and this has a knock-on effect that increases completion in the future? Maybe it’s time to look at that resourcing profile and put the required money into this project. If it’s a transient peak that falls off because we’re just passing people who should be failing? Fuhgeddaboutit.
To quote Sherlock Holmes (Conan Doyle, naturally):
It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts. (A Scandal in Bohemia)
“Data! Data! Data!” he cried impatiently. “I can’t make bricks without clay.” (The Adventure of the Copper Beeches)
It is very easy to take a side on this and it is very easy to see how both sides could have merit. The issue, however, is what each of these approaches actually does to encourage students to submit their assignment work in a more timely fashion. Experiments, experimental design, surveys, longitudinal analysis, data, data, data!
If I may end by waxing lyrical for a moment (and you will see why I stick to technical writing):
If zeroes make Heroes, then zeroes they must have! If nulls make for dulls, then we must seek other ways!
The Internet is Forever
Posted: June 17, 2012 Filed under: Education | Tags: advocacy, blogging, education, ethics, feedback, in the student's head, martha payne, reflection, student perspective, teaching approaches, thinking, tools 3 Comments

I realise that, between this blog and my other blog, I have a lot of ‘Nick’ out there and there is always a chance that this may come back to haunt me. Well, given that I’m blogging under my own name and I have a vague idea of how this whole Internet thing works, I was ready for this possibility. What always amazes me, however, is when people don’t realise that the Internet is neither memoryless nor able to be reformatted through fiat, no matter how much you want it to be so. Anything that goes out onto the Internet is, for most reasonable definitions, going to be there forever. Trying to act against the Internet… ooh… look up the Streisand Effect (Wikipedia link), if you don’t know what that is.
You may have read about the 9-year-old Scottish school girl, Martha Payne, who was a bit disappointed about the range and quantity of school lunches she was receiving so, with her dad’s help and with her teachers’ knowledge, started a blog about it. You can read the whole story here (Wired link), with lots of tasty links, but the upshot is this:
- Martha wasn’t happy with her lunches because she wanted a bit more salad, to go along with the fried food, pizza and croquettes that made up her lunch.
- Very politely, and without a huge axe to grind, she started putting up pictures of her lunch.
- Within two weeks, unlimited salads had been added for children at her school. (This is just one of the improvements that took place over time.)
- To make better use of the positive feedback and publicity, after about 20 posts, she asked people who liked and followed her blog to donate money to a group that funds school meals in Africa.
- People started following her in greater numbers. Other students started sending in pictures of their lunches.
- People started writing about her.
- Martha was pulled out of class to be told that she could no longer photograph her school meals because of something that showed up in a newspaper.

This was one of the first school lunches that Martha posted about (picture from her blog). Yes, that’s the lot. The rabid sausage looking thing is potato covered in stuff. That is also MAXIMUM ALLOWABLE CORN.
At this point, the people who were directing the school, the Argyll and Bute Council, went ever so slightly mad and forgot everything I just told you about the Internet. Firstly, because it was now obvious to hundreds of thousands, if not millions, of people that the A&B Council had censored a little girl from publishing pictures of her lunch. Secondly, because they posted an inaccurate and rather unpleasant statement about it, seeming to forget that everyone else could see what Martha had said and what the newspaper had said. This, of course, led to far more people knowing about the original blog than any other action that they could have taken. (I’m jealous, here, because Katrina had been following the blog before the shutdown!)
Thirdly, they forgot that the Internet is forever – that their statements, their actions to try and stop the tide from rolling, their questionable interpretation of events that might, if I were less generous, look both disingenuous and condescending (although I would never accuse the Argyll and Bute Council of such actions, obviously), these actions, and everyone’s reactions to them, are now out there. Archived. Indexed. Contextualised. Remembered.
Of course, the outcomes are unsurprising. After the Scottish Education Minister’s jaw was retrieved from the carpet, I can only imagine the speed with which the council was rung and asked exactly why they thought it a good idea to carry out their actions against a polite 9 year old girl. I note that the ban has now been lifted, the charity that Martha was working with now has so much money from donations that they can now build four kitchens to feed African school children, and some councillors have had a rather quick lesson in what globally instantaneous persistent communication means in the 21st century.
The issue here is that one girl looked at her plate, thought about it, spoke to some people and then, very politely, said “Please, may I have some more?” More salad then ensued! Food got healthier! The people at the school responded sensibly. Children in Africa were getting more food! This was a giant win-win for the school and A&B Council – but somebody in the council couldn’t resist the urge to take a silly action in response to something that was no more Martha’s fault than the reporting of the Titanic caused the iceberg to drift into the sea lane.
Well done, Martha! Good luck with your continued photography of your increasingly pleasant, nutritious and delightful Scottish school lunches.
Time Banking IV: The Role of the Oracle
Posted: June 14, 2012 Filed under: Education | Tags: education, educational problem, educational research, feedback, Generation Why, higher education, learning, measurement, principles of design, resources, student perspective, teaching, teaching approaches, time banking 1 Comment

I’ve never really gone into much detail on how I would make a system like Time Banking work. If a student can meet my requirements and submit their work early then, obviously, I have to provide some sort of mechanism that allows the students to know that my requirements have been met. The first option is that I mark everything as it comes in and then give the student their mark, allowing them to resubmit until they get 100%.
That’s not going to work, unfortunately, as, like so many people, I don’t have the time to mark every student’s assignment over and over again. I wait until all assignments have been submitted, review them as a group, mark them as a group and get the best use out of staying in the same contextual framework and working on the same assignments. If I took a piecemeal approach to marking, it would take me longer and, especially if the student still had some work to do, I could end up marking the same assignment 3, 4, however many times and multiplying my load in an unsupportable way.
Now, of course I can come up with simple measures that the students can check for themselves. The problem we have here, though, is setting something that a student can mis-measure as easily as they measure it. If I say “You must have at least three pages for an essay” I risk getting three pages of rubbish or triple-spaced 18-point print. It’s the same for any measure of quantity (number of words, number of citations, length of comments and so on) instead of quality. The problem is, once again, that if the students were capable of determining the quality of their own work and determining the effort and quality required to pass, they wouldn’t need time banking because their processes are already mature!
So I’m looking for an indicator of quality that a student can use to check their work and that costs me only (at most) a small amount of effort. In Computer Science, I can ask the students to test their work by running their program against a set of known inputs and seeing what outputs we get. There is then the immediate problem of students hacking their code and just throwing it against the testing suite to see if they can fluke their way to a solution. So, even when I have an idea of how my oracle, my measure of meeting requirements, is going to work, there are still many implementation details to sort out.
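A minimal sketch of what a “limited” oracle of this kind might look like: the student’s code is run against a set of known inputs and only a pass count comes back, not the expected outputs themselves, which makes blind trial-and-error against the suite less rewarding. The function name, the test cases and the “double the input” assignment are all hypothetical, purely for illustration.

```python
def limited_oracle(student_fn, cases):
    """cases: list of (input, expected_output) pairs.

    Runs the student's function against every case and returns how many
    passed, without revealing the expected values themselves.
    """
    passed = 0
    for given, expected in cases:
        try:
            if student_fn(given) == expected:
                passed += 1
        except Exception:
            pass  # a crashing submission simply fails that case
    return passed

# Example: checking submissions for a (hypothetical) "double the input" task.
cases = [(1, 2), (2, 4), (10, 20)]
print(limited_oracle(lambda x: x * 2, cases))  # 3 -- all cases pass
print(limited_oracle(lambda x: x + 2, cases))  # 1 -- only x=2 happens to pass
```

Even this toy version shows the design tension: report too much and students game the suite; report too little and the oracle stops being a useful indicator of progress.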
Fortunately, to help me, I have over five years’ worth of student data through our automated assignment submission gateway, where some assignments have an oracle, some have a detailed oracle, some have a limited oracle and some just say “Thanks for your submission.” The next stage in the design of the oracle is to go back and see what impact the indications of progress and completeness had on the students. Most important, for me, is the indication of how many marks a student had to achieve before they stopped making fresh submissions. Before the due date, did they always strive for 100%? If late, did they tend to stop at more than 50% of the achievable marks, or more than 40% when trying to avoid a failing grade based on low assignment submission?
Are there significant and measurable differences between assignments with an oracle and those that have none (or a ‘stub’, so to speak)? I know what many people expect to find in the data, but now I have the data and I can go and interrogate that!
Every time that I have questions like this about the implementation, I have a large advantage in that I already have a large control body of data, before any attempts were made to introduce time banking. I can look at this to see what student behaviour is like and try to extract these elements and use them to assist students in smoothing out their application of effort and develop more mature time management approaches.
Now to see what the data actually says – I hope to post more on this particular aspect in the next week or so.
Let’s Turn Down the Stupid (Ignorance is Our Enemy)
Posted: June 11, 2012 Filed under: Education, Opinion | Tags: education, educational problem, Generation Why, in the student's head, learning, student perspective, thinking 1 Comment

(This is a longish opinion piece that has very little educational discussion. I leave it to you as to whether you wish to read it or not.)
I realise that a number of you may read my blog posts and think “Well, how nice for him. He has tenure in a ‘good’ University, has none of his own kids to worry about and is obviously socially mobile and affluent.” Some of you may even have looked up my public record salary when I talk about underpaying teachers and wondered why I don’t just shut up and enjoy my life, rather than blathering on here. It would be easy to cast me as some kind of Mr Happy/Pollyanna figure, always seeing the positive and rushing out onto the sports field with a rousing “We’re all winners, children” attitude.
Nothing could be further from the truth. I get up every day knowing that the chances are that I will not make a difference, that all of my work will be undone by a scare campaign in a newspaper, that I may catch a completely preventable disease because too few people got vaccinated, that I and my family may not have enough food or lose my house because people ignore science, that anti-scientific behaviour is clawing back many of the victories that we have already achieved.
I’m no Pollyanna. I get up every day ready to fight ignorance and try to bring knowledge to places where ignorance reigns. Sometimes I manage it – those are good days. But I can’t just talk to my own students, I have to reach out into the community because I see such a small percentage of a small percentage as my students. If I want lasting change, and I believe that most educators are trying to change the world for the better, then I have to deal with the fact that my message, and my students, have to be visible outside of our very small and isolated community.
This morning, while out running, we had gone a bit over 14 kilometres (about 9 miles) when I saw a cyclist up ahead of us, stopped on a little wooden ramp that went under one of the bridges. He heard us coming and waved us down, very quickly.
Someone had strung fishing line across the path, carefully anchored on both sides, at around mid-chest height for adult runners and walkers, or neck/head height for children.
Of course, the moment we realised this we looked around for the utter idiots who were no doubt waiting to film this or watch it, but they showed a modicum of sense in that we couldn’t see them. (Of course, what could we have done even if we had seen them? They were most likely children and the police aren’t likely to get involved for a ‘fishing line’ related incident.) What irritated me most about this was that I was running with someone who was worried about the future and I was solemnly telling her that I had great hope for the future, that the problems could be solved if we worked at it and this is what I always tried to get across to my students.
And then we nearly got garrotted by an utterly thoughtless act of stupidity. Even a second’s thought would lead you to the understanding that this was more than a joke, it was potentially deadly. And yet, the people who put this up, who I have no doubt waited to watch or film it, were incapable of even that. I can only hope that they were too young, or too mentally incapacitated, to know better. Because when someone knowingly does this, it takes them from ignorance to evil. Fortunately, the number of truly evil people, people who do these things in full knowledge and delight, is small. At least, that’s what I tell myself to get myself to sleep at night. We must always be watchful for evil, but in the same way that we watch for the infrequently bad storm – when we see the signs, we batten down, but we don’t live our lives in the storm cellar. Ignorance, for me, is far more prevalent and influential than evil – and often has very similar effects, as it can take people from us, by killing or injuring them or by placing them into so much mental or physical pain that they can no longer do what they could have done with their lives.
The biggest obstacle we face is ignorance and acts taken in ignorance, whether accidentally or wilfully so. There’s no point me training up the greatest mind in the history of the world, only for that person to be killed by someone throwing a rock off a bridge for fun. Today, I could easily have been seriously injured because someone thought it was funny to watch people run into an obstacle at speed. Yes, the line probably would have broken and I was unlikely to have suffered too much harm. Unless it didn’t. Unless it took out an eye.
But I’m not giving up. When I run across things like this I say, mostly joking, “This is why we fight” – and I mean it. This is exactly why education is important. This is why teachers are important. This is why knowledge is important. Because, without all of these, ignorance will win and it will eventually kill us.
I am sick of stupid, ignorant and evil people. I’m sick of grown men getting away with disgraceful behaviour because “boys will be boys”. I’m sick of any ignorant or thoughtless act being tolerated with “Oh well, these things happen”. However, me being sick of this does nothing unless I act to stop it. Me acting to stop it may do nothing. Me doing nothing to stop it definitely does nothing.
What are the Fiction and Non-Fiction Equivalents of Computer Science?
Posted: June 9, 2012 Filed under: Education, Opinion | Tags: data visualisation, design, education, educational problem, herdsa, higher education, icer, learning, principles of design, reflection, student perspective, teaching, teaching approaches, thinking, universal principles of design 2 Comments

I commented yesterday that I wanted to talk about something covered in Mark’s blog, namely whether it was possible to create an analogy between Common Core standards in different disciplines, with English Language Arts and CS as the two exemplars. In particular, Mark pondered, and I quote him verbatim:
“Students should read as much nonfiction as fiction.” What does that mean in terms of the notations of computing? Students should read as many program proofs as programs? Students should read as much code as comments?
This is a great question and I’m not sure that I have much of an answer, but I’ve been enjoying thinking about it. We bandy the terms syntax and semantics around in Computer Science a lot: the legal structures of the programs we write and the meanings of the components and the programs. Is it even meaningful to talk about fiction and non-fiction in these terms, and where do these fit? I’ve gone in a slightly different direction from Mark but I hope to bring it back to his suggestions later on.
I’m not an English specialist, so please forgive me or provide constructive guidance as you need to, but both fiction and non-fiction rely upon the same syntactic elements and the same semantic elements in linguistic terms – so the fact that we must have legal programs with well-defined syntax and semantics poses no obstacle to a fictional/non-fictional interpretation.
Forgive me as I go to Wikipedia for definitions for fiction and non-fiction for a moment:
“Non-fiction (or nonfiction) is the form of any narrative, account, or other communicative work whose assertions and descriptions are understood to be factual.” (Warning, embedded Wikipedia links)
“Fiction is the form of any narrative or informative work that deals, in part or in whole, with information or events that are not factual, but rather, imaginary—that is, invented by the author” (Again, beware Wikipedia).
Now here we can start to see something that we can get our teeth into. Many computer programs model reality and are computerised representations of concrete systems, while others may have no physical analogue at all or model a system that has never existed or may never exist. Are our simulations and emulations of large-scale systems non-fiction? If so, is a virtual reality fictional because it has never existed, or non-fictional because we are simulating realistic gravity? (But, of course, fiction is often written in a real world setting but with imaginary elements.)
From a software engineering perspective, I can see an advantage to making statements regarding abstract representations and concrete analogues, much as I can see a separation in graphics and game design between narrative/event engine construction and the physics engine underneath.
Is this enough of a separation? Mark’s comment on proof versus program is an interesting one: if we have an idea (an author’s creation) then it is a fiction until we can determine that it exists, and a proof or implementation provides this proof of existence. In my mind, a proof and a program are both non-fiction in terms of their reification, but the idea that they capture may still be fictional. Comments versus code is also very interesting – comments do not change the behaviour of code but explain, from the author’s mind, what has happened. (Given some student code and comment combinations, I can happily see a code as non-fiction, comment as fiction modality – or even comment as magical realism!)
Of course, this is all an enjoyable mental exercise, but what can I take from this and use in my teaching? Is there a particular set of code or comments that students should read for maximum benefit, and can we make a separation that, even if not partitioned so neatly across two sets, gives us the idea of what constitutes a balanced diet of the products of our discipline?
I’d love to see some discussion on this but, if nothing here, then I’m happy to buy the first round of drinks at HERDSA or ICER to start a really good conversation going!
What’s the Big Idea?
Posted: June 8, 2012 Filed under: Education | Tags: awesome sandwich, big ideas, curriculum, data visualisation, education, educational problem, grand challenge, higher education, principles of design, student perspective, teaching, teaching approaches Leave a comment

I was reading Mark Guzdial’s blog just before sitting down to write tonight and came across this post. Mark was musing about the parallels between the Common Core standards of English Language Arts and those of Computing Literacy. He also mentioned the CS:Principles program – an AP course designed to give an understanding of fundamental principles, the breadth of application and the way that computing can change the world.
I want to talk more about the parallels that Mark mentioned but I’ll do that in another post because I read through the CS:Principles Big Ideas and wanted to share them with you. There are seven big ideas:
- Creativity, recognising the innately creative nature of computing;
- Abstraction, where we rise above detail to allow us to focus on the right things;
- Data, where data is the foundation of the creation of knowledge;
- Algorithms, to develop solutions to computational problems;
- Programming, the enabler of our dreams of solutions and the way that we turn algorithms into solutions – the basis of our expression;
- Internet, the ties that bind all modern computing together; and
- Impact, the fact that Computing can, and regularly does, change the world.
I think that I’m going to refer to these alongside the NSF Grand Challenges as part of my new Grand Challenges course, because there is a lot of similarity. I’ve nearly got the design finished, so it’s not too late to incorporate new material. (I don’t like trying to rearrange courses too late into the process because I use a lot of linked assessment and scaffolding; it gets very tricky, and it is easy to make mistakes if I try to insert a late design change.)
For me, the first and the last ideas are among the most important. Yes, you may be able to plod your way through simple work in computing, but really good solutions require skill, practice, and creativity. When you get a really good solution or approach to a problem, you are going to change things – possibly even the world. It looks like someone took the fundamentals of computing and jammed them between two pieces of amazing stuff, framing the discipline inside the right context for a change. Instead of putting computing in a nerd sandwich, it’s in an awesome sandwich. I like that a lot.
Allowing yourself to be creative, understanding abstraction, knowing how to put data together, working out how to move the data around in the right ways and then coding it correctly, using all of the resources that you have to hand and that you can reach out and touch through the Internet – that’s how to change the world.
Last Lecture Blues
Posted: June 7, 2012 Filed under: Education | Tags: authenticity, education, educational problem, feedback, games, Generation Why, higher education, in the student's head, reflection, resources, student perspective, teaching, teaching approaches Leave a comment

I delivered the last lecture, technically the review and revision lecture, for my first year course today. As usual, when I’ve had a good group of students or a course I enjoy, the relief in having reduced my workload is minor compared to the realisation that another semester has come to an end and that this particular party is over.
Today’s lecture was optional but we still managed to get roughly 30% of the active class along. Questions were asked, questions were discussed, outline answers were given and then, although they all sat and listened until I’d finished a few minutes late, they were all up and gone. The next time I’ll see most of them is at the exam, a few weeks from now. After that? It depends on what I teach. Some of these students I’ll run into over the years and we’ll actually get to know each other. Some may end up as my honours or post-graduate students. Some will walk out of the gates this semester and never return.
Now, hear me out, because I’m not complaining about it, but this is not the easiest job in the world. Done properly, education requires constant questioning, study, planning, implementing, listening, talking and, above all, dealing with the fact that you may see the best student you ever have for a maximum of 6 months. It is, however, a job that I love, a job that I have a passion for and, of course, in many ways it’s a calling more than a job.
One of the things I’ve had a chance to reflect on in this blog is how much I enjoy my job, while at the same time recognising how hard it is to do it well. Many times, the students I need to speak to most are those who contact me least, who up and fade away one day, leaving me wondering what happened to them.
At the end of the semester, it’s a good time to ask myself some core questions and see if I can give some good answers:
- Did I do the best job that I could do, given the resources (structures, curriculum, computers etc) that I had to work with?
- Did I actively seek out the students who needed help, rather than just waiting for people to contact me?
- Did I look for pitfalls before I ran into them?
- Did I look after the staff who were working with or for me, and mentor them correctly?
- Did I try to make everything that I worked with an awesome experience for my students?
This has been the busiest six months of my life and one of my joys has been walking into a lecture theatre, having written the course, knowing the material and losing myself in an hour of interactive, collaborative and enjoyable knowledge exchange with my students. Despite that, being so busy, sometimes I didn’t quite have the foresight that I should have had and my radar range was measured in days rather than weeks. Don’t get me wrong, everything got done, but I could have tried to locate troubled students more actively, and some minor pitfalls nearly got me.
However, I think that we still delivered a great course and I’m happy with 1, 4 and 5. I aimed for awesome and I think we hit it fairly often. 2 and 3 needed work but I’ve already started making the required changes to make this better.
On reflection, I’d give myself an 8ish/10 but, of course, that’s not enough. Overall, in the course, because of the excellent support from my co-lecturer and my teaching staff, the course itself I’d push up into the 9-pluses. I, however, should be up there as well and right now, I’m too busy.
So, it’s time for some rebalancing into the new semester. Some more structure for identifying problem students. Looking at things a little earlier. And aiming for an awesome 10/10 for my own performance next semester.
To all my students, past and present, it’s been fantastic. Best of luck with your exams!