What are we assessing? How?

How can we create a better assessment system, without penalties, that works in a grade-free environment? Let’s provide a foundation for this discussion by looking at assessment today.


Bloom’s Revised Taxonomy

We have many different ways of understanding exactly how we are assessing knowledge. Bloom’s taxonomy allows us to classify the objectives that we set for students, in that we can determine if we’re just asking them to remember something, explain it, apply it, analyse it, evaluate it or, having mastered all of those other aspects, create a new example of it. We’ve also got Biggs’ SOLO taxonomy to classify levels of increasing complexity in a student’s understanding of subjects. Now let’s add in threshold concepts, learning edge momentum, neo-Piagetian theory and …

Let’s summarise and just say that we know that students take a while to learn things, can demonstrate some convincing illusions of progress that quickly fall apart, and that we can design our activities and assessment in a way that acknowledges this.

I attended a talk by Eric Mazur, of Peer Instruction fame, and he said a lot of what I’ve already said about assessment not working with how we know we should be teaching. His belief is that, when it comes to testing, we rarely rise above remembering and understanding, and he’s at Harvard, where everyone would easily accept their practices as, in theory, being top-notch. Eric proposed a number of approaches but his focus on outcomes was one that I really liked. He wanted to keep the coaching role he could provide separate from his evaluator role: another thing I think we should be doing more.

Eric is in Physics but all of these ideas have been extensively explored in my own field, especially where we start to look at which levels we teach students to and then what we assess. We do a lot of work on this in Australia and here is some work by our groups and by others I have learned from:

  • Szabo, C., Falkner, K. & Falkner, N. 2014, ‘Experiences in Course Design using Neo-Piagetian Theory’
  • Falkner, K., Vivian, R., Falkner, N., 2013, ‘Neo-Piagetian Forms of Reasoning in Software Development Process Construction’
  • Whalley, J., Lister, R.F., Thompson, E., Clear, T., Robbins, P., Kumar, P. & Prasad, C. 2006, ‘An Australasian study of reading and comprehension skills in novice programmers, using Bloom and SOLO taxonomies’
  • Gluga, R., Kay, J., Lister, R.F. & Teague, D. 2012, ‘On the reliability of classifying programming tasks using a neo-Piagetian theory of cognitive development’

I would be remiss not to mention Anna Eckerdal’s work, and collaborations, in the area of threshold concepts. Her many papers explore which concepts are going to challenge students the most, and how we could deal with this.

Let me summarise all of this:

  • There are different levels at which students will perform as they learn.
  • Careful evaluation is needed to separate students who appear to have learned something from students who have actually learned something.
  • We often focus too much on memorisation and simple explanation, without going to more advanced levels.
  • If we want to assess advanced levels, we may have to give up the idea of trying to grade these additional steps, because objectivity is almost impossible, as is task equivalence.
  • We should teach in a way that supports the assessment we wish to carry out, and the assessment we wish to carry out should be the one that demonstrates true mastery of knowledge and skills.

If we are not designing for our learning outcomes, we’re unlikely to create courses to achieve those outcomes. If we don’t take into account the realities of student behaviour, we will also fail.

We can break our assessment tasks down by one of the taxonomies or learning theories and, from my own work and that of others, we know that we will get better results if we provide a learning environment that supports assessment at the desired taxonomic level.

But there is a problem. The most descriptive, authentic and open-ended assessments incur the most load in terms of expert human marking. We don’t have a lot of expert human markers, and overloading them is not good; we cannot pretend to mark an infinite number of assignments. Our evaluation aesthetics are objectivity, fairness, effectiveness, timeliness and depth of feedback. Assignment evaluation should be useful to the students, to show progress, and useful to us, to show the health of the learning environment. Overloading the marker will compromise the aesthetics.

Our beauty lens tells us very clearly that we need to be careful about how we deal with our finite resources. As Eric notes, and we all know, if we test simpler aspects of student learning, we can throw machines at them, and we have a near-infinite supply of machines. I cannot easily produce more experts like me. (Snickers from the audience) I can recruit human evaluators from my casual pool and train them to mark to something like my standard, using a rubric or an approximation of my approach.

Thus I have a framework of assignments, divided by level, and I appear to have assignment evaluation resources. The more expert and human the marker, the more … for want of a better word … valuable the resource, and the better the feedback it can produce. Yet the more valuable the resource, the less of it I have, because it takes time to develop evaluation skills in humans.
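
To make that trade-off concrete, here is a minimal sketch, in Python, of matching tasks to the cheapest evaluation resource that can mark at the required taxonomic level. The evaluator names, level assignments and capacities are entirely illustrative assumptions, not a real allocation system.

```python
# A sketch of matching assessment tasks to evaluation resources.
# Names, levels and capacities are illustrative assumptions only.

# Evaluators, ordered from cheapest/most plentiful to most expert/scarce.
EVALUATORS = [
    ("machine", {"remember", "understand"}, float("inf")),  # near-infinite supply
    ("trained casual", {"apply", "analyse"}, 200),          # marks to a rubric
    ("expert human", {"evaluate", "create"}, 40),           # deep feedback, scarce
]

def assign_marker(task_level, workload):
    """Pick the cheapest evaluator able to mark at this taxonomic level."""
    for name, levels, capacity in EVALUATORS:
        if task_level in levels and workload.get(name, 0) < capacity:
            workload[name] = workload.get(name, 0) + 1
            return name
    return None  # no capacity left: redesign the assessment, don't overload markers

workload = {}
for level in ["remember", "apply", "create", "understand", "evaluate"]:
    print(level, "->", assign_marker(level, workload))
```

The point of the sketch is the ordering: the scarce, expensive resource is only spent where the taxonomic level actually demands it.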

Tune in tomorrow for the penalty free evaluation and feedback that ties all of this together.


Assessment is (often) neither good nor true.

If you’ve been reading my blog over the past years, you’ll know that I have a lot of time for thinking about assessment systems that encourage and develop students, with an emphasis on intrinsic motivation. I’m strongly influenced by the work of Alfie Kohn, unsurprisingly, given I’ve already shown my hand on Foucault! But there are many other writers who are… reassessing assessment: why we do it, why we think we are doing it, how we do it, what actually happens and what we achieve.


In my framing, I want assessment to be like all other aspects of education: aesthetically satisfying, leading to good outcomes and clear about what it is and what it is not. Beautiful. Good. True. There are some better and worse assessment approaches out there and there are many papers discussing this. One that I have found really useful is Rapaport’s paper on a simplified assessment process for consistent, fair and efficient grading. Although I disagree with some aspects, I consider it to be both good, as it is designed to clearly address a certain problem to achieve good outcomes, and true, because it is very honest about providing guidance to the student as to how well they have met the challenge. It is also highly illustrative and honest in representing the struggle of the author in dealing with the collision of novel and traditional assessment systems. However, further discussion of Rapaport is for the near future. Let me start by demonstrating how broken things often are in assessment, by taking you through a hypothetical situation.

Thought Experiment 1

Two students, A and B, are taking the same course. There are a number of assignments in the course and two exams. A and B, by sheer luck, end up doing no overlapping work. They complete different assignments, half each, and achieve the same (cumulative bare pass overall) marks. They then manage to score bare pass marks in both exams, but one answers only the even questions and the other only the odd. (And, yes, there are an even number of questions.) Because of the way the assessment was constructed, they have managed to avoid any common answers in the same area of course knowledge. Yet both end up scoring 50%, a passing grade in the Australian system.

Which of these students has the correct half of the knowledge?
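
If you would like the arithmetic spelled out, here is a tiny sketch in Python of the exam half of the experiment, with purely illustrative numbers: ten equally weighted questions, one student answering only the evens, the other only the odds.

```python
# The exam half of the thought experiment, with illustrative numbers:
# ten equally weighted questions, 10 marks each, 100 marks in total.

questions = range(1, 11)
MARKS_EACH = 10

evens = {q for q in questions if q % 2 == 0}   # student A's answers
odds = {q for q in questions if q % 2 == 1}    # student B's answers

a_score = MARKS_EACH * len(evens)
b_score = MARKS_EACH * len(odds)

print(a_score, b_score)   # 50 50: two bare passes
print(evens & odds)       # set(): not a single question in common
```

Two identical grades, zero common knowledge.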

I had planned to build up to Rapaport but, if you’re reading the blog comments, he’s already been mentioned, so I’ll summarise his 2011 paper before I get to my main point. In 2011, William J. Rapaport, SUNY Buffalo, published a paper entitled “A Triage Theory of Grading: The Good, The Bad and the Middling” in Teaching Philosophy. This paper summarised a number of thoughtful and important authors, among them Perry, Wolff, and Kohn. Rapaport starts by asking why we grade, moving through Wolff’s taxonomic classification of assessment into criticism, evaluation, and ranking. Students are trained, by our world and our education systems, to treat grades as a measure of progress and, in many ways, a proxy for knowledge. But this brings us into conflict with Perry’s developmental stages, where students start with a deep need for authority and the safety of a single right answer. It is only when students are capable of understanding that there are, in many cases, multiple right answers that we can expect them to understand that grades can have multiple meanings. As Rapaport notes, grades are inherently dual: a representative symbol attached to a quality measure and then, in his words, “ethical and aesthetic values are attached” (emphasis mine). In other words, a B is a measure of progress (not quite there) that also has a value of being … second-tier, if an A is our measure of excellence. A is not A, as it must be contextualised. Sorry, Ayn.

When we start to examine why we are grading, Kohn tells us that the carrot and stick is never as effective as the motivation that someone has intrinsically. So we look to Wolff: are we critiquing for feedback, are we evaluating learning, or are we providing handy value measures for sorting our product for some consumer or market? Returning to my thought experiment above, we cannot provide feedback on assignments that students don’t do, our evaluation of learning says that both students are acceptable despite holding complementary knowledge, and our students cannot be distinguished by their graded rank, despite the fact that they have nothing in common!

Yes, it’s an artificial example but, without attention to the design of our courses and in particular the design of our assessment, it is entirely possible to achieve this result to some degree. This is where I wish to refer to Rapaport as an example of thoughtful design, with a clear assessment goal in mind. To step away from measures that provide an (effectively) arbitrary distinction, Rapaport proposes a tiered system for grading that simplifies the overall system with an emphasis on identifying whether a piece of assessment work is demonstrating clear knowledge, a partial solution, an incorrect solution or no work at all.

This, for me, is an example of assessment that is pretty close to true. The difference between a 74 and a 75 is, in most cases, not very defensible (after Haladyna) unless you are applying some kind of ‘quality gate’ that really reduces a percentile scale to, at most, 13 different outcomes. Rapaport’s argument is that we can reduce this further, and that this will reduce grade clawing, identify clear levels of achievement and reduce marking load on the assessor. That last point is important. A system that buries the marker under load is not sustainable. It cannot be beautiful.
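
As a sketch of the idea, here is roughly what triage marking looks like in Python. The tier labels follow the description above; the decision tests and data shapes are my own illustrative assumptions, not Rapaport’s published scheme.

```python
# A minimal sketch of triage-style grading: four defensible categories
# instead of a 0-100 scale. The tests used to classify a submission are
# illustrative assumptions, not Rapaport's exact criteria.

def triage(work):
    """Classify one piece of assessment work into a triage tier."""
    if work is None:
        return "no work"
    if work["correct"]:
        return "clear knowledge"
    if work["substantial_attempt"]:
        return "partial solution"
    return "incorrect solution"

submissions = [
    {"correct": True, "substantial_attempt": True},
    {"correct": False, "substantial_attempt": True},
    {"correct": False, "substantial_attempt": False},
    None,
]
for s in submissions:
    print(triage(s))
```

Note what the marker is no longer asked to do: defend the difference between a 74 and a 75.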

There are issues in taking this approach and turning it back into the grades that our institutions generally require. Rapaport is very open about the difficulties that he has turning his triage system into an acceptable letter grade and it’s worth reading the paper to see that discussion alone, because it quite clearly shows what happens when a novel assessment system collides with a traditional one.

Rapaport’s scheme clearly defines which of Wolff’s criteria he wishes his assessment to achieve. The scheme, for individual assessments, is no good for ranking (although we can fashion a ranking from it) but it is good for identifying weak areas of knowledge (as transmitted or received), for evaluating progress and for providing elementary critique. It says what it is and it pretty much does it. It sets out to achieve a clear goal.

The paper ends with a summary of the key points of Haladyna’s 1999 book “A Complete Guide to Student Grading”, which brings all of this together.

Haladyna says that “Before we assign a grade to any students, we need:

  1. an idea about what a grade means,
  2. an understanding of the purposes of grading,
  3. a set of personal beliefs and proven principles that we will use in teaching and grading,
  4. a set of criteria on which the grade is based, and, finally,
  5. a grading method, which is a set of procedures that we consistently follow in arriving at each student’s grade.” (Haladyna 1999: ix)

There is no doubt that Rapaport’s scheme meets all of these criteria and yet, for me, we have not gone far enough in search of the most beautiful, most good and most true form of this idea. Is point 3, which could be summarised as aesthetics, not enough for me? Apparently not.

Tomorrow I will return to Rapaport to discuss those aspects I disagree with and, later on, discuss both an even more trimmed-down model and some more controversial aspects.


Musing on Industrial Time

Now Print, Black, Linocut, (C) Nick Falkner, 2013

I caught up with a good friend recently and we were discussing the nature of time. She had stepped back from her job and was now spending a lot of her time with her new-born son. I have gone to working three days a week, hence have also stepped back from the five-day grind. It was interesting to talk about how this change to our routines had changed the way that we thought of and used time. She used a term that I wanted to discuss here, industrial time, to describe the clock-watching time of the full-time worker. This is part of the larger area of time discipline, how our society reacts to and uses time, and is really quite interesting. Both of us had stopped worrying about the flow of time in measurable hours on certain days and we just did things until we ran out of day. This is a very different activity from the usual “do X now, do Y in 15 minutes time” that often consumes us. In my case, it took me about three months of considered thought and re-training to break the time discipline habits of thirty years. In her case, she has a small child to help her to refocus her time sense on the now.

Modern time-sense is so pervasive that we often don’t think about some of the underpinnings of our society. It is easy to understand why we have years and, although they don’t line up properly, months, given that these can be matched to astronomical phenomena that have an effect on our world (seasons and tides, length of day and moonlight, to list a few). Days are simple because that’s one light/dark cycle. But why are there 52 weeks in a year? Why are there 7 days in a week? Why did the 5-day week emerge as a contiguous block of 5 days? What is so special about working 9am to 5pm?

A lot of modern time descends from the struggle of radicals and unionists to protect workers from the excesses of labour, to stop people being worked to death, and the notion of the 8-hour day is an understandable division of a 24-hour day into three even chunks for work, rest and leisure. (Goodness, I sound like I’m trying to sell you chocolate!)

If we start to look, it turns out that the 7 day week is there because it’s there, based on religion and tradition. Interestingly enough, there have been experiments with other week lengths but it appears hard to shift people who are used to a certain routine and, tellingly, making people wait longer for days off appears to be detrimental to adoption.

If we look at seasons and agriculture, then there is a time to sow, to grow, to harvest and to clear, much as there is a time for livestock to breed and to be raised for purpose. If we look to the changing time of sunrise and sunset, there is a time at which natural light is available and when it is not. But, from a time discipline perspective, these time systems are not enough to be able to build a large-scale, industrial and synchronised society upon – we must replace a distributed, loose and collective notion of what time is with one that is centralised, authoritarian and singular. While religious ceremonies linked to seasonal and astronomical events did provide time-keeping on a large scale prior to the industrial revolution, precise time, accurate to hours and minutes, was neither possible nor, generally, required beyond those cues given by nature such as dawn, noon, dusk and so on.

After the industrial revolution, industries and work developed that were heavily separated from any natural linkage – there are no seasons for a coal mine or a steam engine – and the development of the clock and the reinforcement of the calendar of work allowed both the measurement of working hours (for payment) and the determination of deadlines, given that natural forces did not have to be considered to the same degree. Steam engines are completed; they have no need to ripen.

With the notion of fixed and named hours, we can very easily determine if someone is late when we have enough tools for measuring the flow of time. But this is, very much, the notion of the time that we use in order to determine when a task must be completed, rather than taking an approach that accepts that the task will be completed at some point within a more general span of time.

We still have confusion where our understanding of “real measures”, such as days, interacts with time discipline. Is midnight on the 3rd of April the second after the last moment of April the 2nd or the second before the first moment of April the 4th? Is midnight 12:00pm or 12:00am? (There are well-defined answers to this but the nature of the intersection is such that definitions have to be made.)
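
Software has to pick one of those well-defined answers. Python’s datetime module, for example, treats “12:00 AM” as 00:00, the first moment of the named day, and will not accept an hour of 24 at all:

```python
from datetime import datetime

# The usual convention: midnight belongs to the *start* of the named day.
m = datetime.strptime("3 April 2016 12:00 AM", "%d %B %Y %I:%M %p")
print(m)  # 2016-04-03 00:00:00 -- the first moment of the 3rd

# An hour of 24 (midnight as the *end* of a day) is not representable:
try:
    datetime(2016, 4, 3, 24, 0)
except ValueError as e:
    print(e)  # hour must be in 0..23
```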

But let’s look at teaching for a moment. One of the great criticisms of educational assessment is that we confuse timeliness, and in this case we specifically mean an adherence to meeting time discipline deadlines, with achievement. Completing the work a crucial hour after it is due can lead to that work not being marked at all, or being rejected. We usually have over-riding reasons for doing this but, sadly, these reasons are as artificial as the deadlines we impose. Why is an Engineering Degree a four-year degree? If we changed it to six, would we get better engineers? If we switched to competency-based training, modular learning and life-long learning, would we get more people who were qualified or experienced with engineering? Would we get fewer? What would happen if we switched to a 3/1/2/1 working week? Would things be better or worse? It’s hard to evaluate because the week, and the contiguous working week, are so much a part of our world that I imagine that today is the first day that some of you have thought about it.

Back to education and, right now, we count time for our students because we have to work out bills and close off accounts at the end of the financial year, which means we have to meet marking and award deadlines; then we have to project our budget, which is yearly, and fit that into accredited degree structures, which have year guidelines…

But I cannot give you a sound, scientific justification for any of what I just wrote. We do all of that because we are caught up in industrial time first and we convince ourselves that building things into that makes sense. Students do have ebb and flow. Students are happier on certain days than others. Transition issues on entry to University are another indicator that students develop and mature at different rates – why are we still applying industrial time from top to bottom when everything we see here says that it’s going to cause issues?

Oh, yes, the “real world” uses it. Except that regular studies of industrial practice show that 40-hour weeks, regular days off, working from home and so on are more productive than the burn-out, everything-late rush that we consider to be the signs of drive. (If Henry Ford thought that making people work more than 40 hours a week was bad for business, he’s worth listening to.) And that’s before we factor in the development of machines that will replace vast numbers of human jobs in the next 20 years.

I have a different approach. Why aren’t we looking at students more like we regard our grape vines? We plan, we nurture, we develop, we test, we slowly build them to the point where they can produce great things and then we sustain them for a fruitful and long life. When you plant grape vines, you expect a first reasonable crop level in three years, and commercial levels at five. Tellingly, the investment pattern for grapes is that it takes you 10 years to break even and then you start making money back. I can’t tell you how some of my students will turn out until 15-25 years down the track and it’s insanity to think you can base retrospective funding on that timeframe.

You can’t make your grapes better by telling them to be fruitful in two years. Some vines take longer than others. You can’t even tell them when to fruit (although you can trick them a little). Yet, somehow, we’ve managed to work around this to produce a local wine industry worth around $5 billion. We can work with variation and seasonal issues.

One of the reasons I’m so keen on MOOCs is that these can fit in with the routines of people who can’t dedicate themselves to full-time study at the moment. By placing well-presented, pedagogically-sound materials on-line, we break through the tyranny of the 9-5, 5 day work week and let people study when they are ready to, where they are ready to, for as long as they’re ready to. Like to watch lectures at 1am, hanging upside down? Go for it – as long as you’re learning and not just running the video in the background while you do crunches, of course!

Once you start to question why we have so many days in a week, you quickly start to wonder why we get so caught up on something so artificial. The simple answer is that, much like money, we have it because we have it. Perhaps it’s time to look at our educational system to see if we can do something that would be better suited to developing really good knowledge in our students, instead of making them adept at sliding work under our noses a second before it’s due. We are developing systems and technologies that can allow us to step outside of these structures and this is, I believe, going to be better for everyone in the process.

Conformity isn’t knowledge, and conformity to time just because we’ve always done that is something we should really stop and have a look at.


On being the right choice.

I write fiction in my (increasing amounts of) free time and I submit my short stories to a variety of magazines, all of whom have rejected me recently. I also applied to take part in a six-week writing workshop called Clarion West this year, because this year’s instructors were too good not to apply! I also got turned down for Clarion West.

Only one of these actually stung and it was the one where, rather than thinking hey, that story wasn’t right for that venue, I had to accept that my writing hadn’t been up to the level of the 16 very talented writers who did get in. I’m an academic, so being rejected from conferences is part of my job (as is being told that I’m wrong and, occasionally, being told that I’m right in a way that makes it sound like I stumbled over it).

And there is a difference because one of these is about the story itself and the other is about my writing, although many will recognise that this is a tenuous and artificial separation, probably to keep my self-image up. But this is a setback and I haven’t written much (anything) since the last rejection but that’s ok, I’ll start writing again and I’ll work on it and, maybe, one day I’ll get something published and people will like it and that will be that dealt with.

It always stings, at least a little, to be runner-up or not selected when you had your heart set on something. But it’s interesting how poisonous it can be to you and the people around you when you try and push through a situation where you are not the first choice, yet you end up with the role anyway.

For the next few paragraphs, I’m talking about selecting what to do, assuming that you have the choice and freedom to make that choice. For those who are struggling to stay alive, choice is often not an option. I understand that, so please read on knowing that I’m talking about making the best of the situations where your own choices can be used against you.

There’s a position going at my Uni, it doesn’t matter what, and I was really quite interested in it, although I knew that people were really looking around outside the Uni for someone to fill it. It’s been a while and it hasn’t been filled so, when the opportunity came up, I asked about it and noted my interest.

But then, I got a follow-up e-mail which said that their first priority was still an external candidate and that they were pushing out the application period even further to try and do that.

Now, here’s the thing. This means that they don’t want me to do it and, so you know, that is absolutely fine with me. I know what I can do and I’m very happy with that but I’m not someone with a lot of external Uni experience. (Soldier, winemaker, sysadmin, international man of mystery? Yes. Other Unis? Not a great deal.) So I thanked them for the info, wished them luck and withdrew my interest. I really want them to find someone good, and quickly, but they know what they want and I don’t want to hang around, to be kicked into action when no-one better comes along.

I’m good enough at what I do to be a first choice and I need to remember that. All the time.

It’s really important to realise when you’d be doing a job where you and the person who appoints you know that you are “second-best”. You’re only in the position because they couldn’t find who they wanted. It’s corrosive to the spirit and it can produce a treacherous working relationship if you are the person that was “settled” on. The role was defined for a certain someone – that’s what the person in charge wants and that is what they are going to be thinking the whole time someone is in that role. How can you measure up to the standards of a better person who is never around to make mistakes? How much will that wear you down as a person?

As academics, and in many professions, there are so many things that we can do that it doesn’t make much sense to take second-hand opportunities after the A players have chosen not to show up. If you’re doing your job well and you go for something where that’s relevant, you should be someone’s first choice, or at least in the first sweep. If not, then it’s not something that they actually need you for. You need to save your time and resources for those things where people actually want you – not just a warm body that sort of approximates you. You’re not at the top level yet? Then it’s something to aim for, but you won’t be able to do the best projects and undertake the best tasks to get you into that position if you’re always standing in and doing the clean-up work because you’re “always there”.

I love my friends and family because they don’t want a Nick-ish person in their life, they want me. When I’m up, when I’m down, when I’m on, when I’m off – they want me. And that’s the way to bolster a strong self-image and make sure that you understand how important you can be.

If you keep doing stuff where you could be anyone, you won’t have the time to find, pursue or accept those things that really need you and this is going to wear away at you. Years ago, I stopped responding when someone sent out an e-mail that said “Can anyone do this?” because I was always one of the people who responded but this never turned into specific requests to me. Since I stopped doing it, people have to contact me and they value me far more realistically because of it.

I don’t believe I’m on the Clarion West reserve list (no doubt they would have told me), which is great because I wouldn’t go now. If my writing wasn’t good enough then, someone getting sick doesn’t magically make my writing better and, in the back of my head and in the back of the readers’, we’ll all know that I’m not up to standard. And I know enough about cognitive biases to know that it would get in the way of the whole exercise.

Never give up anything out of pique, especially where it’s not your essence that is being evaluated, but feel free to politely say No to things where they’ve made it clear that they don’t really want you but they’re comfortable with settling.

If you’re doing things well, no-one should be settling for you – you should always be that first choice.

Anything else? It will drive you crazy and wear away your soul. Trust me on this.

A picture of a tree standing in a field.

You, too, can be outstanding in your field.


That’s not the smell of success, your brain is on fire.

Would you mind putting out the hippocampus when you have a chance?

I’ve written before about the issues of prolonged human workload leading to ethical problems and the fact that working more than 40 hours a week on a regular basis is downright unproductive because you get less efficient and more error-prone. This is not some 1968 French student revolutionary musing on what benefits the soul of a true human, this is industrial research by Henry Ford and the U.S. Army, neither of whom could be classified as Foucault-worshipping Situationist yurt-dwelling flower children, that shows that there are limits to how long you can work in a sustained weekly pattern and get useful things done, while maintaining your awareness of the world around you.

The myth won’t die, sadly, because physical presence and hours attending work are very easy to measure, while productive outputs and their origins in a useful process on a personal or group basis are much harder to measure. A cynic might note that the people who are around when there is credit to take may end up being the people who (reluctantly, of course) take the credit. But we know that it’s rubbish. And the people who’ve confirmed this are both philosophers and the commercial sector. One day, perhaps.

But anyone who has studied cognitive load issues, the way that the human thinking processes perform as they work and are stressed, will be aware that we have a finite amount of working memory. We can really only track so many things at one time and when we exceed that, we get issues like the helmet fire that I refer to in the first linked piece, where you can’t perform any task efficiently and you lose track of where you are.

So what about multi-tasking?

Ready for this?

We don’t.

There’s a ton of research on this but I’m going to link you to a recent article by Daniel Levitin in the Guardian Q&A. The article covers the fact that what we are really doing is switching quickly from one task to another, dumping one set of information from working memory and loading in another, which of course means that working on two things at once is less efficient than doing two things one after the other.

But it’s more poisonous than that. The sensation of multi-tasking is actually quite rewarding as we get a regular burst of the “oooh, shiny” rewards our brain gives us for finding something new and we enter a heightened state of task readiness (fight or flight) that also can make us feel, for want of a better word, more alive. But we’re burning up the brain’s fuel at a fearsome rate to be less efficient so we’re going to tire more quickly.

Get the idea? Multi-tasking is horribly inefficient task switching that feels good but makes us tired faster and does things less well. But when we achieve tiny tasks in this death spiral of activity, like replying to an e-mail, we get a burst of reward hormones. So if your multi-tasking includes something like checking e-mails when they come in, you’re going to get more and more distracted by that, to the detriment of every other task. But you’re going to keep doing them because multi-tasking.
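
If you like back-of-the-envelope models, here is a toy sketch in Python of why the interleaved version always loses: every switch pays a reload cost on top of the work itself. All the durations are invented for illustration.

```python
# Toy model of task switching: each switch costs time to dump and reload
# working memory. All durations are invented, for illustration only.

WORK_PER_TASK = 30   # minutes of actual work per task
SWITCH_COST = 5      # minutes lost re-loading context on each switch
TASKS = 5

def sequential():
    """Do each task to completion, one after the other."""
    return TASKS * WORK_PER_TASK

def interleaved(slice_minutes=5):
    """Round-robin the tasks in short slices; every slice boundary is a switch."""
    slices_per_task = WORK_PER_TASK // slice_minutes
    total_slices = TASKS * slices_per_task
    return total_slices * slice_minutes + (total_slices - 1) * SWITCH_COST

print(sequential())    # 150 minutes
print(interleaved())   # 295 minutes: nearly double, and nothing done better
```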

I regularly get told, by parents, that their children are able to multi-task really well. They can do X, watch TV, do Y and it’s amazing. Well, your children are my students and everything I’ve seen confirms what the research tells me – no, they can’t, but they can give a convincing impression when asked. When you dig into what gets produced, it’s a different story. If someone sits down and does the work as a single task, it will take them a shorter time and they will do a better job than if they juggle five things. The five things will take more than five times as long (up to 10, which really blows out time estimation) and will not be done as well, nor will the students learn about the work in the right way. (You can actually sabotage long-term storage by multi-tasking in the wrong way.) The most successful study groups around the Uni are small, focused groups that stay on one task until it’s done and then move on. The ones with music and no focus will be sitting there for hours after the others are gone. Fun? Yes. Efficient? No. And most of my students need to be at least reasonably efficient to get everything done. Have some fun but try to get all the work done too – it’s educational, I hear. 🙂

It’s really not a surprise that we haven’t changed humanity in one or two generations. Our brains are just not built in a way that can (yet) provide assistance with the quite large amount of work required to perform multi-tasking.

We can handle multiple tasks, no doubt at all, but we’ve just got to make sure, for our own well-being and overall ability to complete the task, that we don’t fall into the attractive, but deceptive, trap that we are some sort of parallel supercomputer.


Publish and be damned, be silent and be ignored.

I’m working on a longer piece on how student interaction on electronic discussion forums suffers from the same problems of tone as any on-line forum. Once people decide that how they wish to communicate is the de facto standard for all discussion, then non-conformity is somehow weakness and indicative of bad faith or poor argument. But tone is a difficult thing to discuss because the perceived tone of a piece is in the hands of the reader and the writer.

A friend and colleague recently asked me for some advice about blogging and I think I’ve now done enough of it that I can offer some reasonable advice. I think the most important thing that I said at the time was that it was important to get stuff out there. You can write into a blog and keep it private but then no-one reads it. You can tweak away at it until it’s perfect but, much like a PhD thesis, perfect is the enemy of done. Instead of setting a lower bound on your word count, set an upper bound at which point you say “Ok, done, publish” to get your work out there. If your words are informed, authentic and as honest as you can make them then you’ll probably get some interesting and useful feedback.

But…

But there’s that tone argument again. The first thing you have to accept is that making any public statement has always attracted the attention of people, it’s the point really, and that the nature of the Internet means that you don’t need to walk into a park and stand at Speakers’ Corner to find hecklers. The hecklers will find you. So if you publish, you risk being damned. If you’re silent, you have no voice. If you’re feeling nervous about publishing in the first place, how do you deal with this?

Let me first expose my thinking process. This is not an easy week for me as I think about what I do next, having deliberately stepped back to think and plan for the next decade or so. At the same time, I’m sick (our whole household is sick at the moment), very tired and have come off some travel. And I have hit a coincidental barrage of on-line criticism, some of which is useful and developing critique that I welcome and some of which is people just being… people. So this is very dear to my heart right now – why should I keep writing stuff if the outcome risks being unpleasant? I have other ways to make change.

Well, you should publish but you just need to accept that people will react to you publishing – sometimes well, sometimes badly. That’s why you publish, after all, isn’t it?

Let’s establish the ground truth – there is no statement you can make on the Internet that is immune to criticism, but not all criticism is valid or useful. Let’s go through what can happen, although this is only a subset.

  1. “I like sprouts”

    Facebook is the land of simple statements and many people talk about things that they like. “I like sprouts” critics find statements like this in order to express their incredulity that anyone could possibly enjoy Brussels Sprouts and “ugh, they’re disgusting”. The opposite is of course the people who show up on the “I hate sprouts” discussions to say “WHY DON’T YOU LOVE SPROUTS”? (For the record, I love Brussels sprouts.)

    A statement of personal preference for something as banal as food is not actually a question but it’s amazing how challenging such a statement can be. If you mention animals of any kind, there’s always the risk of animal production/consumption coming up because no opinion on the Internet is seen outside of the intersection of the perception of reader and writer. A statement about fluffy bunnies can lead to arguments about the cosmetics industry. Goodness help you if you try something that is actually controversial. Wherever you write it, if someone has an opinion that contradicts yours, discussion of both good and questionable worth can ensue.

    (Like the fact that Jon Pertwee is the best Doctor.)

    It’s worth noting that there are now people who are itching to go to the comments to discuss either Brussels Sprouts or Tom Baker/David Tennant or “Tom Baker/David Tennant”. This is why our species is doomed and I am the herald of the machine God. 01010010010010010101001101000101

  2. “I support/am opposed to racism/sexism/religious discrimination”

    It doesn’t matter which way around you make these statements, if a reader perceives it as a challenge (due to its visibility or because they’ve stumbled across it), then you will get critical, and potentially offensive, comment. I am on the “opposed to” side, as regular readers will know, but have been astounded by the number of times I’ve had people argue things about this. Nothing is ever settled on the Internet because sound evidence often doesn’t propagate as well as anecdote and drama.

    Our readership bubbles are often wider than we think. If you’re publishing on WP then pretty much anyone can read it. If you’re publishing on Facebook then you may get Friends and their Friends and the Friends of people you link… and so on. There are many fringe Friends on Facebook that will leap into the fray here because they are heavily invested in maintaining what they see as the status quo.

    In short, there is never a ‘safe’ answer when you come down on either side of a controversial argument but neutrality conveys very little. (There’s also the fact that there is no excluded middle for some issues – you can’t be slightly in favour of universal equality.)

    We also sail into “that’s not the real issue, THIS is the real issue” with great ease in this area of argument. You do not know the people who read your stuff until you have posted something that hits all of the buttons on their agenda elevators. (And, yes, we all have them. Mine has many buttons.)

  3. Here is my amazingly pithy argument in support of something important.

    And here is the comment that:
    Takes something out of context.
    Misinterprets the thrust.
    Trivialises the issue.
    Makes a pedantic correction.
    Makes an unnecessary (and/or unpleasant) joke.
    Clearly indicates that the critic stopped reading after two lines.
    Picks a fight (possibly because of a lingering sprouts issue).

    When you publish with comments on, and I strongly suggest that you do, you are asking people to engage with you but you are not asking them to bully you, harass you or hijack your thread. Misinterpretation, and the correction thereof, can be a powerful tool to drive understanding. Bad jokes offer an opportunity to talk about the jokes and why they’re still being made. But a lot of what is here is tone policing, trying to make you regret posting. If you posted something that’s plain wrong or hurtful, or your thrust was off (see later), then correction is good but, most of the time, this is tone policing and you will often know this better as bullying. Comments to improve understanding are good, comments to make people feel bad for being so stupid/cruel/whatever are bullying, even if the target is an execrable human being. And, yes, it’s a very easy trap to fall into, especially when buoyed up by self-righteousness. I’ve certainly done it, although I deeply regret the times that I did it, and I try to keep an eye out for it now.

    People love making jokes, especially on FB, and it can be hard for them to realise that this knee-jerk can be quite hurtful to some posters. I’m a gruff middle-aged man so my filter for this is good (and I just mentally tune people out or block them if that’s their major contribution) but I’ve been regularly stunned by people who think that posting something that is not supportive but jokey in response to someone sharing a thought or vulnerability is the best thing to do. If it derails the comments then, hooray, the commenter has undermined the entire point of making the post.

    Many sites have now automatically blocked or warped comments that rush in to be the “First” to post because it’s dumb. And now, even more tragically, at least one person is fighting the urge to prove my point by writing “First” underneath here as a joke. Because that’s the most important thing to take away from this.

  4. Here is a slightly silly article using humour to make a point or using analogy to illustrate an argument.

    And here are the comments about this article failing because of some explicit extension of the analogy that is obviously not what was intended, or here is the comment that interprets the humour as trivialising the issue at hand or, worse, as indicating that the writer has secret ulterior motives.

    Writers communicate. If dry facts, by themselves, aligned one after the other in books, educated people, then humanity would have taken the great leap forward after the first set of clay tablets dried. Instead, we need frameworks for communication and mechanisms to facilitate understanding. Some things are probably beyond humorous intervention. I tried recently to write a comedic piece on current affairs and realised I couldn’t satirise a known racist without repeating at least some racial slurs – so I chose not to. But a piece like this, where I want to talk about some serious things without being too didactic? I think humour is fine.

    The problem is whether people think that you’re laughing at someone, especially them. Everyone personalises what they read – I imagine half of the people reading this think I’m talking directly to them, when I’m not. I’m condensing a billion rain drops to show you what can break a dam.

    Analogies are always tricky but they’re not supposed to be 1-1 matches for reality. Like all models, they are incomplete and fail outside of the points of matching. Combining humour and analogy is a really good way to lose some readers so you’ll get a lot of comments on this.

  5. Here is the piece where I got it totally and utterly wrong.

    You are going to get it wrong sometime. You’ll post while angry or not have thought of something or use a bad source or just have a bad day and you will post something that you will ultimately regret. This is the point at which it’s hardest to wade through the comments because, in between the tone policers, the literalists, the sproutists, the pedants, the racists, TIMECUBE, and spammers, you’re going to have to read comments from people who delicately but effectively tell you that you’ve made a mistake.

    But that is why we publish. Because we want people to engage with our writing and thoughtful criticism tells us that people are thinking about what we write.

The curse of the Internet is that people tend only to invest real energy in comment when they’re upset. Facebook have captured this with the Like button, where ‘yay’ is a click and “OH MY GOD, YOU FILTHY SOMETHINGIST” requires typing. Similarly, once you start writing and publishing, have a look at those people who are also creating and contributing, and those people who only pop up to make comments along the lines I’ve outlined. There are many powerful and effective critics in the world (and I like to discuss things as much as the next person) but the reach and power of the Internet means that there are also a lot of people who derive pleasure from sailing in to make comment when they have no intention of stating their own views or purpose in any way that exposes them.

Some pieces are written in a way that no discussion can be entered into safely, leaving commenters no room to actually have a discussion around them. That’s always your choice but, if you do it, why not turn the comments off? There’s no problem with having a clearly stated manifesto that succinctly captures your beliefs – people who disagree can write their own – but it’s best to clearly advertise that something is beyond casual “comment-based” discussion to avoid any confusion that you might be open to it.

I’ve left the comments open, let’s see what happens!


I have a new book out: A Guide to Teaching Puzzle-based Learning. #puzzlebasedlearning #education

Time for some pretty shameless self-promotion. Feel free to stop reading if that will bother you.

My colleagues, Ed Meyer from BWU, Raja Sooriamurthi from CMU and Zbyszek Michalewicz (emeritus from my own institution), and I have just released a new book, called “A Guide to Teaching Puzzle-based Learning”. What a labour of love this has been and, better yet, we are still talking to each other. In fact, we’re planning some follow-up events next year to do some workshops around the book, so it’ll be nice to work with the team again.

(How to get it? Springer has it in paperback and as an e-Book; Amazon has the paperback only, I believe.)

Here’s a slightly sleep-deprived and jet-lagged picture of me holding the book as part of my “wow, it got published” euphoria!

See how happy I am? And also so out of it.

The book is a resource for the teacher, written for teachers from primary to tertiary, and it should be quite approachable for the home-school environment as well. We spent a lot of time making it approachable, sharing tips for students and teachers alike, and trying to get all of our knowledge about how to teach well with puzzles down into the one volume. I think we pretty much succeeded. I’ve field-tested the material here at Universities, schools and businesses, with very good results across the board. We build on a good basis and we love sound practical advice. This is, very much, a book for the teaching coalface.

It’s great to finally have it all done and printed. The Springer team were really helpful and we’ve had a lot of patience from our commissioning editors as we discussed, argued and discussed again some of the best ways to put things into the written form. I can’t quite believe that we managed to get 350 pages down and done, even with all of the time that we had.

If you or your institution has a connection to SpringerLink then you can read it online as part of your subscription. Otherwise, if you’re keen, feel free to check out the preview on the home page and then you may find that there are a variety of prices available on the Web. I know how tight budgets are at the moment so, if you do feel like buying, please buy it at the best price for you. I’ve already had friends and colleagues ask what benefits me the most and the simple answer is “if people read it and find it useful”.

To end this disgraceful sales pitch, we’re actually quite happy to run workshops and the like, although we are currently split over two countries (sometimes three or even four), so some notice is always welcome.

That’s it, no more self-promotion to this extent until the next book!



You want thinkers. Let us produce them.

I was at a conference recently where the room (about 1000 people from across the business and educational world) were asked what they would like to say to everyone in the room, if they had a few minutes. I thought about this a lot because, at the time, I had half an idea but it wasn’t in a form that would work on that day. A few weeks later, in a group of 100 or so, I was asked a similar question and I managed to come up with something coherent. What follows here is a more extended version of what I said, with relevant context.

If I could say anything to the parents and future employers of my students, it would be to STOP LOOKING AT GRADES as some meaningful predictor of the future ability of the student. While measures of true competency are useful, the current fine-grained but mostly arbitrary measurements of students, with rabid competitiveness and the artificial divisions between grade bands, do not fulfil this purpose. When an employer demands a GPA of X, there is no guaranteed true measure of depth of understanding, quality of learning or anything real that you can use, except for conformity and an ability to colour inside the lines. Yes, there will be exceptional people with a GPA of X, but there will also be people whose true abilities languished as they focused their energies on achieving that false grail. The best person for your job may be the person who got slightly (or much) lower marks because they were out doing additional tasks that made them the best person.

Please. I waste a lot of my time giving marks when I could be giving far more useful feedback, in an environment where that feedback could be accepted and actual positive change could take place. Instead, if I hand back a 74 with comments, I’ll get arguments about the extra mark to get to 75 rather than discussions of the comments – but don’t blame the student for that attitude. We have created a world in which that kind of behaviour is both encouraged and sensible. It’s because people keep demanding As and Cs to somehow grade and separate people that we still use them. I couldn’t switch my degree over to “Competent/Not Yet Competent” tomorrow because, being frank, we’re not MIT or Stanford and people would assume that all of my students had just scraped by – because that’s how we’re all trained.

If you’re an employer then I realise that it’s very demanding but please, where you can, look at the person, and ask your industrial bodies that feed back to education to focus on ensuring that we develop competent, thinking individuals who can practise in your profession, without forcing them to become grade-haggling bean counters who would cut a group member’s throat for an A.

If you’re a parent, then I would like to ask you to think about joining that group of parents who don’t ask what happened to that extra 1% when a student brings home a 74 or 84. I’m not going to tell you how to raise your children, it’s none of my business, but I can tell you, from my professional and personal perspective, that it probably won’t achieve what you want. Is your student enjoying the course, getting decent marks and showing a passion and understanding? That’s pretty good and, hopefully, if the educators, the parents and the employers all get it right, then that student can become a happy and fulfilled human being.

Do we want thinkers? Then we have to develop the learning environments in which we have the freedom and capability to let them think. But this means that this nonsense that there is any real difference between a mark of 84 and a mark of 85 has to stop and we need to think about how we develop and recognise true measures of competence and suitability that go beyond a GPA, a percentage or a single letter grade.

You cannot contain the whole of a person in a single number. You shouldn’t write the future of a student on such a flimsy structure.


Three Stories: #3 Taking Time for Cats

There are a number of draft posts sitting on this blog. Posts which, for one reason or another, I’ve either never finished, because the inspiration ran out, or never published, because I decided not to share them. Most of them were written when I was trying to make sense of being too busy, while at the same time I was taking on more work and feeling bad about not being able to commit properly to everything. I probably won’t ever share many of these posts but I still want to talk about some of the themes.

So, let me tell you a story about cats.

One of the things about cats is that they can be mercurial, creatures of fancy and rapid mood changes. You can spend all day trying to get a cat to sit on your lap and, once you’ve given up and sat back down, 5 minutes later you find a cat on your lap. That’s just the way of cats.

When I was very busy last year, and the year before, I started to see feedback comments from my students that said things like “Nick is great but I feel bad interrupting him”, or I’d try and squeeze them into the 5 minutes I had between other things. Now, students are not cats, but they do have times when they feel they need to come and see you and, sometimes, when that time passes, the opportunity is lost. This isn’t just students, of course, this is people. That’s just the way of people, too. No matter how much you want them to be well organised, predictable and well behaved, sometimes they’re just big, bipedal, mostly hairless cats.

One day, I decided that the best way to change my frantic behaviour was to set a small goal, to make me take the time I needed for the surprising opportunities that occurred in a day.

I decided that every time I was walking around the house, even if I was walking out to go to work and thought I was in a hurry, if one of the cats came up to me, I would pay attention to it: scratch it, maybe pick it up, talk to it, and basically interact with the cat.

Over time, of course, what this meant was that I saw more of my cats and I spent more time with them (cats are mercurial but predictable about some things). The funny thing was that the 5 minutes or so I spent doing this made no measurable difference to my day. And making more time for students at work started to have the same effect. Students were happier to drop in to see if I could spend some time with them and were better about making appointments for longer things.

Now, if someone comes to my office and I’m not actually about to rush out, I can spend that small amount of time with them, possibly longer. When I thought I was too busy to see people, I was. When I thought I had time to spend with people, I could.

Yes, this means that I have to be a little more efficient and know when I need to set aside time and do things in a different way, but the rewards are enormous.

I only realised the true benefit of this recently. I flew home from a work trip to Melbourne to discover that my wife and one of our cats, Quincy, were at the Animal Emergency Hospital, because Quincy couldn’t use his back legs. There was a lot of uncertainty about what was wrong and what could be done and, at one point, he stopped eating entirely and it was… not good there for a while.

The one thing that made it even vaguely less awful in that difficult time was that I had absolutely no regrets about the time that we’d spent together over the past 6 months. Every time Quincy had come up to say ‘hello’, I’d stopped to take some time with him. We’d lounged on the couch. He’d napped with me on lazy Sunday afternoons. We had a good bond and, even when the vets were doing things to him, he trusted us and that counted for a lot.

Quincy is now almost 100% and is even more of a softie than before, because we all got even closer while we were looking after him. By spending (probably at most) another five minutes a day, I was able to be happier about some of the more important things in my life and still get my “real” work done.

Fortunately, none of my students are very sick at the moment, but I am pretty confident that I talk to them when they need me to (most of the time; there’s still room for improvement) and that they will let me know if things are going badly – with any luck, at a point when I can help.

Your time is rarely your own but at least some of it is. Spending it wisely is sometimes not the same thing as spending it carefully. You never actually know when you won’t get the chance again to spend it on something that you value.


A Break in the Silence: Time to Tell a Story

It has been a while since I last posted here but that is a natural outcome of focusing my efforts elsewhere – at some stage I had to work out what I had time to do and do it. I always tell my students to cut down to what they need to do and, once I realised that the time I was spending on the blog was having one of the most significant impacts on my ability to juggle everything else, I had to eat my own dogfood and cut back on the blog.

Of course, I didn’t do it correctly because instead of cutting back, I completely cut it out. Not quite what I intended but here’s another really useful piece of information: if you decide to change something then clearly work out how you are going to change things to achieve your goal. Which means, ahem, working out what your goals are first.

I’ve done a lot of interesting stuff over the last 6 months, and there is more to come, which means that I do have things to write about. I shall try to write one post a week as a minimum, rather than one per day. This is a pace that I hope to keep up and one that will mean that more of you will read more of what I write, rather than dreading the daily kiloword delivery.

I’ll briefly reflect here on some interesting work and seminars I’ve been looking at on business storytelling – taking a personal story, something authentic, and using it to emphasise a change in business behaviour or to highlight a characteristic. I recently attended one of the short seminars on engaging people with storytelling run by the (now defunct) One Thousand and One. (I’m reading their book “Hooked” at the moment. It’s quite interesting and refers to other interesting concepts as well.) I realise that such ideas, along with many of my notions of design paired with content, will have a number of readers peering at the screen and preparing a retort along the lines of “Storytelling? STORYTELLING??? Whatever happened to facts?”

Why storytelling? Because bald facts sometimes just don’t work. Without context, without a way to integrate information into existing knowledge and, more importantly, without some sort of established informational relationship, many people will ignore facts unless we do more work than just present them.

How many examples do you want? Climate Change, Vaccination, 9/11. All of these have heavily weighted bodies of scientific evidence that state what the answer should be, and yet there is powerful and persistent opposition based, largely, on myth and storytelling.

Education has moved beyond the rationing out of approved knowledge from the knowledge rich to those who have less. The tyrannical informational asymmetry of the single text book, doled out in dribs and drabs through recitation and slow scrawling at the front of the classroom, looks faintly ludicrous when anyone can download most of the resources immediately. And yet, as always, owning the book doesn’t necessarily teach you anything and it is the educator’s role as contextualiser, framer, deliverer, sounding board and value enhancer that survives the death of the drip-feed and the opening of the flood gates of knowledge. To think that storytelling is the delivery of fairytales, and that is all it can be, is to sell such a useful technique short.

To use storytelling educationally, however, we need to be focused on being more than just entertaining or engaging. Borrowing heavily from “Hooked”, we need to have a purpose in telling the story, it needs to be supported by data and it needs to be authentic. In my case, I have often shared stories of my time working with computer networks, in short bursts, to emphasise why certain parts of computer networking are interesting or essential (purpose), I provide enough information to show this is generally the case (data) and, because I’m talking about my own experiences, they ring true (authenticity).

If facts alone could sway humanity, we would have adopted Dewey’s ideas in the 1930s, instead of rediscovering the same truths decade after decade. If only the unembellished truth mattered, then our legal system would look very, very different. Our students are surrounded by talented storytellers and, where appropriate, I think those ranks should include us.

Now, I have to keep to the commitment I made 8 months ago, that I would never turn down the chance to have one of my cats on my lap when they wanted to jump up, and I wish you a very happy new year if I don’t post beforehand.