HERDSA Keynote: “Cultivating Connections Throughout the Academe: Learning to Teach by Learning to Learn” Dr Kathy Takayama.

A very interesting keynote today and I took a lot of notes. Anyone who has read my SIGCSE blogs knows that I’m prone to being verbose so I hope that this is useful if wordy. (Any mistakes are, of course, mine.)

Dr Takayama started her talk with an art image: “Venn diagrams (under the spotlight)” by Amalia Pica. She stressed how the Venn diagram is simple and versatile, representing interaction, connection and commonality, and that it also transcends boundaries – the overlap of two colours produces a new colour. We can also consider this as an absence of difference (sharing) or as new knowledge (creation). However, the background of the artwork lies in the artist’s experience of the suppression of group theory and Venn diagrams in Argentina – as both of these were seen to encourage subversive forms of group activity and critical theory. Dr Takayama then followed this thread into ideas of inclusion and exclusion. What are the group dynamics that arise from our structures? How do we group students and academics into exclusive and inclusive domains? What does this mean for our future?
How does this limit learning?
She then talked about our future professoriate – those students who will go on to join us in the professorial ranks. She broke this into three aspects: disciplinary identity, dispositions for engagement and integrative communities. Our disciplinary identity reflects our acculturation to disciplinary practices and habits of mind, while our dispositions identify how we find ourselves in the learning – rather than focusing on what to learn.
“In the face of today’s hyper-accelerated ultra-competitive global society, the preservation of opportunities for self-development and autonomous reflection is a value we underestimate at our peril.” (Richard Wolin)
When we discuss the emergence of disciplinary identity, we are talking about expert thinking – the scholarly habits of a discipline that allow someone to identify themselves as a member of the discipline. What do we do that allows us to say “I am a microbiologist” or “I am an engineer”?
The development of expertise comes through iterative authentic experiences – truly appropriate activities carried out inside the discipline – where we have discipline-centred practices, including signature pedagogies (Shulman). The signature pedagogies of a discipline have four features: they must be pervasive, routine, habitual and deeply engaging.
Dr Takayama then discussed, at some length, a study in placing students into unfamiliar territory, where they were required to take on scholarly habits from another discipline. In this case, Dr Takayama (a microbiologist) exchanged scholarly habits with students of David Reichart, a historian. Academics conform to the standard practices of their disciplines, and students acculturate quickly. Takayama and Reichart sought to take pedagogies from other areas in order to lead students into new thinking processes.
Students from the History course were required to present their work as a science-style research poster instead of a traditional report. Students reacted badly to the word “poster” – they felt it cheapened a year’s worth of effort. They had to think outside their norms and discovered new aspects of communication, voice and interpretation in the unfamiliar territory. This also added a challenge component and allowed a multi-dimensional exploration of the area.
The microbiology students had to document their research in a completely blank book and were allowed to create a narrative in that blank book. This was at odds with the usual structure for science: accurate, reproducible, adhering to convention, no narrative, no first person, dates, signatures. While accuracy and reproducibility were still enforced, students were encouraged to explore much more widely in their blank books.
Student work started to resemble commonplace books (loci communes) – a compiled work with annotations and narrative from the compiler. The new student books contained personalisation, reflection, narrative, collage, and moments of exhilaration and discovery – but they maintained fidelity and scientific accuracy.
This then led to the core idea from the work: (An) engagement with the unfamiliar as a means for further development of expertise.
Students’ understandings are deeply tied to existing and established practices – to the point that students feared that outside conventions would render their work invalid. Working in unfamiliar territory allows the students to refine their understanding of their discipline and push its boundaries, as well as their own understanding. Lecturers had to take risks as well to make this realisation possible.
In our traditional dispositions for engagement, we have tended to create a learning culture that is less interested in the unfamiliar, and we have implicitly driven a focus on understanding a discipline rather than developing an understanding of oneself. The nature of learning as situated in institutional cultures is something that we can see from the inside, but the student perspective is vital: we want to know what the students think we look like. From the students’ perspectives, they see learning in terms of specialisation, globalisation, technology and collaboration. This is a critical forum through which students make sense of their own place in relation to the discipline.
Students identified two over-arching goals:
  • Routine Expertise: the habits of mind and skills associated with efficiency and performance in familiar domains, and
  • Adaptive Expertise (after Bransford): applying knowledge effectively to novel situations or unique problems.
Students discover themselves in the material – finding connection and allowing deep enquiry into their own nature. (Students’ awareness of themselves in the course or the curriculum (Barbezat, Amherst))
Looking from our perspective, based on what our students want and how they succeed, Barnett (U London) identified the dispositions for learning as Venturing Forward:
  • A will to learn
  • A will to encounter the unfamiliar
  • A will to engage
  • A preparedness to listen
  • A willingness to be changed
  • A determination to keep going.
Dr Takayama then went on to talk about developing a strong learning and teaching community through courses such as Brown’s certification program, which has many benefits in enhancing the perception of the value and practice of learning and teaching, as well as the overall enhancement of new post-graduates. One of the core points identified was that many of the PhD students we produce will go on to teach in liberal arts colleges, institutions with an undergraduate teaching focus, and two-year colleges. If we don’t teach them how to teach, then they will be woefully underprepared for the future that lies before them – being good at research doesn’t translate into skill at teaching, so teaching must be fostered, and well-organised certification programs are a good way to do this.
I hope to comment more on the certification program shortly, but this was a very interesting talk with lots of ideas for me to take home and think about.

HERDSA 2012 – Conference Blogging

Hello from Hobart, Tasmania! I’ve switched over from my usual automated 4am delivery to ‘semi-live’ blogging of the talks and events taking place at the conference of the Higher Education Research and Development Society of Australasia (inc.) – or HERDSA. Last night was the conference reception and welcome, including a couple of talks about the University of Tasmania and the (rather bleak) future of the Tasmanian Devil.

This morning we’re all gathered for the official opening, before the keynote on “Cultivating Connections through the Academe”. The theme of this year’s conference is “Connections” so everything is being framed along these lines.

Oh, that was interesting. His Excellency, the Honourable Peter Underwood AC, Governor of Tasmania, just arrived, so we all had to stand up, somewhat awkwardly, while we waited for the Governor’s party to arrive. (If you want to see something amusing, ask a large group of academics to all do something at the same time that they may not have expected to do and then make them wait. It is, in the words of the fable of the Scorpion and the Frog, not in our nature.) Once the Governor had arrived, and Advance Australia Fair had played, we had a Welcome to Country by a member of the local indigenous community. I have noticed that there is a great deal of meaning and respect attached to the traditional greetings and acknowledgements of traditional ownership here – I have a basic requirement that, if you are going to make a statement regarding the indigenous community and their relationship with the land, it must at least be genuine and preferably heartfelt. So far, all of these statements have been sincere, and I also noticed that the Governor bowed his head to the person who was about to give the Welcome to Country. That was, I thought, a nice example of how mutual respect doesn’t have to be arduous or obsequious.

The Governor’s speech did touch on some key points regarding the value of education, in its role of passing on knowledge and forming communities to engage people in learning, identifying educational scholarship and the teaching community as a possible strategic priority. Perceptions are important – but if perception doesn’t coincide with reality then we have a problem. The Governor highlighted the problem we have where people see a perception as being implicitly separated from the reality – drawing on his experience with a school board. In order to identify ourselves as good performers, it is not enough to perform well; we must be seen and perceived to perform well. Whether or not perception and reality coincide, perception drives how we are treated and, therefore, takes the priority.

The Governor mused on his time on the Supreme Court and thought about how much power he actually had: what if the prisoner rejected his sentence, if the bailiffs refused to carry away the prisoner, if the police refused to arrest the bailiffs for contempt? He summarised his thoughts on this (and I paraphrase here) as:

The only power given to me was that given by public perception and public support.

He then identified us, the tertiary educator, as being in the same situation. It is our reach into the community and the community’s connection to us that gives us our ability to work, our authority to educate and our role within the greater community. As I mentioned earlier, this entire conference is about connections, hence community and, from the Governor’s perspective, our ability to carry out our job.

The University of Tasmania has a very special position within the life of Tasmania, as this is a small state and UTAS is the only University, to the extent that many tasks that would be assigned to government in Tasmania are performed by groups within UTAS, with community support. Because of this, they have to engage heavily with the people around them and form strong and well-perceived connections.

Even people who haven’t been to UTAS are, apparently, aware of the role of UTAS and what use it can be to them. That’s a very powerful connection – both mental and community. The Governor then again stressed the importance of identifying all of the components of a strategy for a successful University, among them a prioritisation of Learning and Teaching, alongside Research and the usual key aspects.

An interesting talk from a unique part of Australia. We have a reception at Government House tonight where I will not have any opportunity at all to continue the discussion. 🙂


When the Stakes are High, the Tests Had Better Be Up to It.

(This is on the stronger opinion side but, in the case of standardised testing as it is currently practised, this will be a polarising issue. Please feel free to read the next article and not this one.)

If you make a mistake, please erase everything from the worksheet, and then leave the room, as you have just wasted 12 years of education.

A friend on FB (thanks, Julie!) linked me to an article in the Washington Post that some of you may have seen. The article is called “The Complete List of Problems with High-Stakes Standardised Tests” by Marion Brady, in the words of the article, a “teacher, administrator, curriculum designer and author”. (That’s attribution, not scare quotes.)

Brady provides a (rather long but highly interesting) list of problems with the now very widespread standardised testing regime that is an integral part of student assessment in some countries. Here, Brady focuses on the US, but there is little doubt that the same problems would exist in other areas. From my readings and discussions with US teachers, he is discussing issues that are well-known problems in the area, but they are slightly intimidating when presented as a block.

So many problems are covered here, from an incorrect focus on simplistic repetition of knowledge because it’s easier to assess, to the way that it encourages extrinsic motivations (bribery or punishment in the simplest form), to the focus on test providers as the stewards and guides of knowledge rather than the teachers. There are some key problems, and phrases, that I found most disturbing, and I quote some of them here:

[Teachers oppose the tests because they]

“unfairly advantage those who can afford test prep; hide problems created by margin-of-error computations in scoring; penalize test-takers who think in non-standard ways”

“wrongly assume that what the young will need to know in the future is already known; emphasize minimum achievement to the neglect of maximum performance; create unreasonable pressures to cheat.”

“are open to massive scoring errors with life-changing consequences”

“because they provide minimal to no useful feedback”

This is completely at odds with what we would consider to be reasonable education practice in any other area. If I had comments from students that identified that I was practising 10% of this, I would be having a most interesting discussion with my Head of School concerning what I was doing – and a carpeting would be completely fair! This isn’t how we should teach and we know it.

I spoke yesterday about an assault on critical thinking as being an assault on our civilisation, short-sightedly stabbing away at helping people to think as if it will really achieve what (those trying to undermine critical thinking) actually wanted. I don’t think that anyone can actually permanently stop information spreading, when that information can be observed in the natural world, but short-sightedness, malign manipulation of the truth and ignorance can certainly prevent individuals from gaining access to information – especially if we are peddling the lie that “everything which needs to be discovered is already known.”

We can, we have and we probably (I hope) always will work around these obstacles in information, these dark ages as I referred to them yesterday, but at what cost of the great minds who cannot be applied to important problems because they were born to poor families, in the ‘wrong’ state, in a district with no budget for schools, or had to compete against a system that never encouraged them to actually think?

The child who would have developed free safe power, starship drives, applicable zero-inflation stable economic models, or the “cure for cancer” may be sitting at the back of a poorly maintained, un-airconditioned, classroom somewhere, doodling away, and slowly drifting from us. When he or she encounters the standardised test, unprepared, untrained, and tries to answer it to the extent of his or her prodigious intellect, what will happen? Are you sufficiently happy with the system that you think that this child will receive a fair hearing?

We know that students learn from us, in every way. If we teach something in one way but we reward them for doing something else in a test, is it any surprise that they learn for the test and come to distrust what we talk about outside of these tests? I loathe the question “will this be in the exam” as much as the next teacher but, of course, if that is how we have prioritised learning and rewarded the student, then they would be foolish not to ask this question. If the standardised test is the one that decides your future, then, without doubt, this is the one that you must set as your goal, whether student, teacher, district or state!

Of course, it is the future of the child that is most threatened by all of this, as well as the future of the teaching profession. Poor results on a standardised test for a student may mean significantly reduced opportunity, and reduced opportunity, unless your redemptive mechanisms are first class, means limited pathways into the future. The most insidious thread through all of this is the idea that a standardised test can be easily manipulated through a strategy of learning what the answer should be, to a test question, rather than what it is, within the body of knowledge. We now combine the disadvantaged student having their future restricted, competing against the privileged student who has been heavily channeled into a mode that allows them to artificially excel, with no guarantee that they have the requisite aptitude to enjoy or take advantage of the increased opportunities. This means that both groups are equally in trouble, as far as realising their ambitions, because one cannot even see the opportunity while the other may have no real means for transforming opportunity into achievement.

The desire to control the world, to change the perception of inconvenient facts, to avoid hard questions, to never be challenged – all of these desires appear to be on the rise. This is the desire to make the world bend to our will, the real world’s actual composition and nature apparently not mattering much. It always helps me to remember that Cnut stood in the waves and commanded them not to come in order to prove that he could not control the waves – many people think that Cnut was defeated in his arrogance, when he was attempting to demonstrate his mortality and humility, in the face of his courtiers telling him that he had power above that of mortal men.

How unsurprising that so many people misrepresent this.


Actually, Now You’re On My Turf

This Diagram Officially Not Recommended By The Texas GOP 2012

I don’t normally dabble in politics on this blog, quite deliberately, because I don’t want people to stop reading things that might be of use because of partisan issues. However, with the release of the 2012 Texas Republican platform, and its section on Education (page 12), I don’t feel that I’m dabbling in politics to address this – because with the following statement, the Texas GOP has very firmly put their feet into my area, and I feel that a response is required.

Knowledge-Based Education – We oppose the teaching of Higher Order Thinking Skills (HOTS) (values clarification), critical thinking skills and similar programs that are simply a relabeling of Outcome-Based Education (OBE) (mastery learning) which focus on behavior modification and have the purpose of challenging the student’s fixed beliefs and undermining parental authority.

Now, I have tried to go the Texas GOP website to see if there have been any developments on this but, for some reason, I can’t seem to be able to get there at the moment. (This is often the Internet’s way of saying “You have become interesting to a great many people. All at once.”, where congestion is caused by fascination.)

I am hoping that this turns out to be some kind of Internet hoax, or the actions of one person, rather than the genuine statement of a major political party for a large US state. As an educator, as a University lecturer, as a scientist, as a thinker, as a human being I am terrified that critical thinking skills, the foundation of our civilisation, are being singled out as being something undesirable – because it will challenge the students’ fixed beliefs.

We have had long periods where beliefs could not be challenged, where critical thinking was either suppressed or ignored, and we generally refer to them historically as dark ages. What really confuses me is that, somehow, critical thinking is going to immediately lead to the collapse of parental authority – as if critical thinking is guaranteed to be obstructive or contrary thinking. Critical thinking is the consideration of claims to decide if they are always true, sometimes true, partly true, or false. There is no guarantee that parental values need to be classified as claims that are always false and, in many ways, it is a sign of concern about the veracity of one’s own beliefs if you assume that any critical assessment is going to lead to an immediate rejection!

The critical thinking that we teach, and consider vital, is a respectful criticism of ideas, rather than people. One of the strengths of a good academic is that they can be critical of an idea, without needing to belittle the thinker (the person behind the idea). I’ve talked about this at length with movement from dualism to relativism and then commitment, under the Perry developmental classifications.

To identify that we should keep children as authority-dependent drones, never allowing them to question anything? That is to keep them as children for all of their lives. But this would also lead us to a far darker future than just permanent childhood. Our civilisation is based on thinking, on reaching further, on questioning, on asking “What if?” and then finding answers. What is covered in the section on Knowledge-Based Education is a threat to all education at the higher level and, ultimately, something that every educator has to worry about.

This is not a political issue – this is, and always will be, an educational issue. A societal issue. A civilisation issue.

Again, please let this be a joke or a hoax. If this is what a large group of 21st Century Americans can believe is the right way to proceed, then we have a great deal of work to do in informing people of why critical thinking is desirable, rather than some terrible threat to their own authority. But this feels as if it is based in fear, and fear is always very hard to deal with.


Dewey Defeats Truman – again!

The US Presidential race in 1948 was apparently decided when the Chicago Tribune chose to publish their now infamous headline “Dewey Defeats Truman” (Wikipedia link). As it happened, Truman had defeated Dewey in an upset victory. The rather embarrassing mistake was a combination of an early press deadline, early polls and dependence upon someone who had previously been reliable in their predictions. What was worse was that the early editions had predicted a significantly reversed result, with a sweeping victory for Dewey. Even as other results came in indicating that this wasn’t so, the paper stuck to the headline, while watering down the story.

Ultimately, roughly 150,000 papers were printed that were, effectively, utter and total nonsense.

Because he’s a President, I doubt that Truman actually used the phrase “neener, neener”. (Associated Press, photo by Byron Rollins, via Wikipedia)

This is a famous story in media reporting and, in many ways, it gives us a pretty simple lesson: Don’t Run The Story Until You Have the Facts. Which brings me to the reporting on the US Supreme Court regarding the constitutionality of the controversial health care bill.

Students have to understand how knowledge is constructed if they are to assist in their own development, and the construction of what is accepted as fact is strongly influenced by the media, both traditional and new. We’ve moved to a highly dynamic form of media that can have a direct influence on events as they unfold. Forty years ago, you’d read, days later, about an earthquake that killed hundreds. Today, dynamic reporting of earthquakes on social media saves lives, because people can actually get out of the way, or help can reach them faster.

I’m a great fan of watching new media reporting, because the way that it is reported is so fluid and dynamic. An earthquake happens somewhere and the twitter reporting of it shows up as a corresponding twitter quake. People then react and spread the news, editing starts to happen and, boom, you have an emergent news feed built by hundreds of thousands of people. However, traditional media, which has a higher level of information access and paid staff to support it, does not necessarily work the same way. Trad media builds stories on facts, produces them, has time to edit, commits them to press or air, and has a well-paid set of information providers and presenters to make it all happen. (Yes, I know there are degrees in here and there are ‘Indy’ trad media groups, but I think you get my point.)

It was very interesting, therefore, to see a number of trad news sources get the decision on the health care bill completely and utterly wrong. When the court’s decision was being read out – an event that I watched through many eyes, as I was monitoring feeds and reactions to feeds – CNN threw up a headline, before the decision had been announced, saying that the bill had been defeated.

And FOX news reported the same thing.

Only one problem. It wasn’t true.

As this fact became apparent, first of all, the main stories changed, then the feeds published from the main stories changed and then, because nobody had printed a paper yet, some of the more egregious errors disappeared from web sites and feeds – never to be seen again.

Oh wait, the Internet is Forever, so those ‘disappeared’ feeds had already been copied, pictured and cached.

Now, because of the way that the presenting Justice was actually speaking, you could be forgiven for thinking that he was about to say that the bill had been defeated. Except for the fact that there were no actual print deadlines in play here – what tripped up CNN and FOX appears to have been their desire to report a certain type of story first. In the case of FOX, had the bill been defeated, it’s not hard to imagine them actually ringing up President Obama to say “neener, neener”. (FOX news is not the President, so is not held to the same standards of decorum.)

The final comment on this story, and one which should tell you volumes about traditional news-gathering mechanisms in the 21st century, is that there was an error in a twitter/blog feed reporting on the decision, which made an erroneous claim about the tax liability of citizens who wished to opt out of the program. So, just to be clear, we’re talking about a non-fact-checked social media side feed, and there’s a mistake in it. A very large number of traditional news sources then presented that mistake as fact, because it appears that a large amount of their expensive resource gathering and fact checking amounts to “Get Brad and Janet to check out what’s happening on Twitter”. Then they all had to fix and edit (AGAIN) once they discovered that they had effectively reported an error made by someone sitting in the room, typing onto a social media feed, as if it had gone through any kind of informational hygiene process.

Here are my final thoughts. As an experiment, for about a week, read Fark, Metafilter and The Register. Then see how many days it is before the same stories show up on your television, radio and print news. See how much change the stories have gone through, if any. Then look for stories that go the other way around. You may find it interesting when you work out which sources you trust as authorities, especially those that appear more trustworthy because they are traditional.

(Note: Apologies for the delay in posting. As part of my new work routine, I rearranged some time and I realised that posting 6 hours late wouldn’t hurt anyone.)


The Invisible War – How Do You Find What You Don’t Know You’re Missing?

Photo: jasonEscapist, CC licence, click for details.

[T]here are known knowns; there are things we know that we know.
There are known unknowns; that is to say, there are things that we now know we don’t know.
But there are also unknown unknowns – there are things we do not know we don’t know.

Donald Rumsfeld, when United States Secretary of Defense

I realise that this quote has been mocked before, but I have always found it to be both clear and interesting, mainly because accepting that there are things that you don’t know that you don’t know is important. Because of the way our world works now, where most information is heavily filtered in one form or another, it is becoming more a world of unknown unknowns (things that are so filtered that you didn’t even know that you could have known about them) than a world of known unknowns (things that you have yet to look into but know exist).

I have a student who is undertaking a project exploring ways of exposing the revision history of Wikipedia in a way that makes it immediately obvious if you’re reading something that is generally agreed upon or in massive dispute. The History and Discussion tabs in Wikipedia are, for most people, equivalent to unknown unknowns – not only do they not even realise what they are there for, they don’t think to look. This illustrates one of the most insidious forms of filter, one where the information is presented in a way that appears static and reliable, relying upon the mechanism that you use to give that impression.
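The project itself isn’t described in detail here, but to illustrate the kind of signal such a tool might surface, here is a minimal sketch of my own (the function names and the revert-marker heuristic are entirely my assumptions, not the student’s actual approach): scoring how “contested” a page looks from the revert-like edits in its recent history.

```python
# Toy heuristic: estimate how "contested" a wiki page is from its
# recent edit summaries. This is a sketch of the general idea, not
# any real Wikipedia tooling.

def contested_score(revision_comments):
    """Return the fraction of recent revisions that look like reverts.

    revision_comments: list of edit-summary strings, newest first.
    A high fraction of revert-style summaries suggests active dispute.
    """
    if not revision_comments:
        return 0.0
    # Common revert-style markers seen in edit summaries (an assumption).
    revert_markers = ("revert", "rv ", "rvv", "undid", "undo")
    reverts = sum(
        1 for comment in revision_comments
        if any(marker in comment.lower() for marker in revert_markers)
    )
    return reverts / len(revision_comments)

def label(score, threshold=0.2):
    """Crude label: flag a page once more than 20% of edits are reverts."""
    return "disputed" if score > threshold else "stable"
```

A real version would pull the summaries from the page history rather than taking them as input, and would need a far subtler notion of dispute than revert-counting, but even this crude fraction makes the difference between a settled page and a battleground visible at a glance.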

How, for example, can a person inside the Chinese web search zone find pictures of Tank Man at Tiananmen Square, if all legitimate searches that might turn up anything to do with it have been altered? If no picture of Tiananmen shows protests or tanks, how do you even know to search for Tank Man? Even if you find a picture of a man standing there, in front of tanks, how do you then discover the meaning of the picture?

I was reminded of the impact of filtering while I was reading Metafilter the other day. One of the Front Page Posts (FPPs) dealt with the call to boycott a Fantasy writer/game article contributor who had advocated the use of rape in fantasy literature as an awesome way to make the story better (in a variety of ways). I started reading the article, because I assumed that I would take issue with this Fantasy author but wanted to read the whole story, and left the page up to see what sort of comments unfolded. Because this could take time, I ignored the page for about 30 minutes.

Then, when I reloaded later, the post had been deleted. Now, because of the way that Metafilter works, a deleted FPP still exists and can be located in the database, but it is no longer linked to the front page and can no longer be modified. So, suddenly, I had an island of effectively hidden and frozen information. Having read the contents, the comments so far, and the write-up, I was still quite interested to follow the story but the unfolding and contribution of other people in the comments thread, which is the greatest strength of Metafilter, was no longer going to happen.

Now there are many of these deleted FPPs in Metafilter, easily accessible if you search for them by number, but they are closed to comment. They are fragments of conversations, hanging in space, incomplete, cast in amber. You can see them but you can’t see the final comments that would have closed the debate, the petering out as the arguments faded, the additional links that would have been added to this shard of the data corpus by the 12,000 active account holders of Metafilter.

Now, of course, whenever you look at Metafilter, you’ll know that for every few stories that you see on the front page, there’s probably at least one deleted one. Whenever you look at Wikipedia’s illusion of a clean white page where everything looks like it’s just been printed, you may realise that this could hide hundreds or millions of updates and corrections behind the scenes.

How does this change your perception of the information that is contained in there?

While it is easy to look at traditional publishing, especially for text and reference books, and point out the elitist cabals and intellectual thuggery that permeated some of these avenues, we must accept that the printed book never changed once it had been printed. To change a printed book, you must excise, burn, overprint, paint, physically retrieve and then re-insert. There is no remote update. There is no way that an invisible war can be waged against the contents of your copy of uncorrected Biggles, or that someone thousands of kilometres away can stop you from opening the pages of your history text that describe the Tiananmen Square protests.

We have always had filter bubbles but, at the same time, we had history and the ability to compare fixed and concrete entities with each other. Torn out pages left holes, holes gave us questions, unknowns were discovered. I try very hard to read across and out of my filter bubble, and I strongly encourage my students to do the same, but at the same time I have to remind both myself and them that we are doing what we can within an implicit filter bubble of known knowns and known unknowns.

By definition, even though I’m aware of the possible existence of things that have already been so well hidden from me that I will never find them in my lifetime, I have no idea where to look to find these unknown unknowns. Maybe that’s why I’m buying more books and magazines at the moment, reading so very widely across the written and the electronic, and trying to commit as much as possible here?

Do we need to know what we don’t know? How will we achieve this? Is this just another twinge as we move towards a different way of managing information?

What will this post say tomorrow?


You’re Welcome On My Lawn But Leaf Blowers Are Not

I was looking at a piece of software the other day and, despite it being a well-used piece of code with a large user base, I was musing that I had never found it to be particularly fit for purpose. (No, I won’t tell you what it is – I’m allergic to defamation suits.) However, my real objections to it, in simple terms, sound a bit trivial to my own ears and I’ve never really had the words or metaphors to describe it to other people.

Until today.

My wife and I were walking in to work today and saw, in the distance, a haze of yellow dust rising up in front of three men who were walking towards us, line abreast, as a street-sweeping unit slowly accompanied them along the road. Each of the men had a leaf blower that they were swinging around, kicking up all of the plane tree pollen/dust (which is highly irritating) and pushing it towards us in a cloud. They did stop when they saw us coming but, given how much dust was in the air, it’s 8 hours later and I’m still getting grit out of my eyes.

Weirdly enough, this image comes from a gaming site, discussing mecha formations. The Internet constantly amazes me.

Now, I have no problem with streets being kept clean and free of debris and I have a lot of respect for the sweepers, cleaners and garbage removal people who stop us from dying in a MegaCholera outbreak from living in cities – but I really don’t like leaf blowers. On reflection, there are a number of things that I don’t like for similar reasons so let me refer back to the piece of software I was complaining about and call it a leaf blower.

Why? Well, primarily, it’s because leaf blowers are a noisy and inefficient way to not actually solve the problem. Leaf blowers move the problem to someone else. Leaf blowers are the socially acceptable face of picking up a bag of garbage and throwing it on your neighbour’s front porch. Today was a great example – all of the dust and street debris was being blown out of the city towards the Park lands where, presumably, this would become someone else’s problem. The fact that a public thoroughfare was a pollen-ridden nightmare for 30 minutes or so was also, apparently, collateral damage.

Now, of course, there are people who use leaf blowers to push leaves into big piles that they then pick up, but there are leaf vacuums and brooms and things like that which will do a more effective job with either less noise or more efficiently. (And a lot of people just blow it off their property as if it will magically disappear.) The catch is, of course, better solutions generally require more effort.

The problem with a broom is that pushing a broom is a laborious and tiring task, and it’s quite reasonable for large-scale tasks like this that we have mechanical alternatives. For brief tidy-ups and small spaces, however, the broom is king. The problem with the leaf vacuum is that it has to be emptied and they are, because of their size and nature, often more expensive than the leaf blower. You probably couldn’t afford to have as many of these on your cleanup crew’s equipment roster. So brooms are cheap but hard manual labour, compared to expensive leaf vacuums which fulfil the social contract but require regular emptying.

Enter the leaf blower – low effort, relatively low cost, no need to empty the bag, just blow it off the property. It is, however, an easy way to not actually solve the problem.

And this, funnily enough, describes the software that I didn’t like (and many other things in a similar vein). Cost-wise it’s a sensible decision, both compared to building it yourself and in terms of ongoing maintenance. It’s pretty easy to use. There’s no need to worry about being sensible or parsimonious with resources. You just do stuff in it with a small amount of time and you’re done.

The only problem is that what you are encouraged to produce by default, the affordance of the software, is not actually the solution to the problem that the software theoretically solves. It is an approximation to the answer but, in effect, you’ve handed the real problem to someone else – in my case, the student, because it’s software of an educational nature. This then feeds load straight back to you, your teaching assistants and support staff. Any effort you’ve expended is wasted and you didn’t even solve the problem.

I’ve talked before about trying to assess what knowledge workers are doing, rather than concentrating on the number of hours that they are spending at their desk, and the ‘desk hours’ metric is yet another example of leaf blowing. Cheap and easy metric, neither effective nor useful, and realistically any sensible interpretation requires you to go back and work out what people are actually doing during those hours – problem not solved, just shunted along, with a bit of wasted effort and a false sense of achievement.

Solving problems is sometimes difficult and it regularly requires careful thought and effort. There may be a cost involved. If we try to come up with something that looks like a solution, but all it does is blow the leaves around, then we probably haven’t actually solved anything.


Student Reflections – The End of Semester Process Report

I’ve mentioned before that I have two process awareness reports in one of my first-year courses. One comes just after the monster “Library” prac, and one is right at the end of the course. These encourage the students to reflect on their assignment work and think about their software development process. I’ve just finished marking the final one and, as last year, it’s a predominantly positive and rewarding experience.

When faced with 2-4 pages of text to produce, most of my students sit down and write several, fairly densely packed pages telling me about the things that they’ve discovered along the way: lessons learned, pit traps avoided and (interestingly) the holes that they did fall into. It’s rare that I get cynical replies and for this course, from over 100 responses, I think that I had about 5 disappointing ones.

The disappointing ones included some that argued that I had to give them marks for something that was rubbish (uh, no I didn’t – read the assignment spec and the forum carefully), some that were scrawled together in about a minute and said nothing, and some that were the outpourings of someone who wasn’t really happy with where they were, rather than something I could easily fix. Let’s move on from these.

I want to talk about the ones who had crafted beautiful diagrams where they proudly displayed their software process. The ones who shared great ideas about how to help students in the next offering. The ones who shared the links that they found useful with me, in case other students would like them. The ones who were quietly proud of mastering their areas of difficulty and welcomed the opportunity to tell someone about it. The one who used this quote from Confucius:

“A man without distant care must have near sorrow”

(人无远虑 必有近忧)

to explain why you have to look into the future when you do software design – don’t leave your assignments to the last minute, he was saying; look ahead! (I am, obviously, going to use that for teaching next semester!)

The Confucian Symbol. Something else to put in my lecture slides for Semester 2, 2012.

Overall, I find these reports to be a resolutely uplifting experience. The vast majority of my students have learnt what I wanted them to learn and have improved their professional skills but, as well, a large number of them have realised that the assignments, together with the lectures, develop their knowledge. Here is one of my favourite student quotes about the assignments themselves, which tells me that we’re starting to get the design right:

The real payoff was towards the end of the assignment. Often it would be possible to “just type code” and earn at least half the marks fairly easily. However there was always a more complex final part to the assignment, one that I could not complete unless I approached it in a systematic, well thought out way. The assignments made it easy to see that a program of any real complexity would be nearly impossible to build without a well-defined design.

But students were also thinking about how they were going to take more general lessons out of this. Here’s another quote I like:

Three improvements that I am aiming to take on board for future subjects are: putting together a study timetable early on in the game; taking the time to read and understand the problem I’ve been given; and put enough time aside to produce a concise design which includes testing strategies.

The exam for this course has just been held and we’re assembling the final marks for inspection on Friday, which will tell us how this new offering has gone. But, at this stage, I have an incredibly valuable resource of student feedback to draw on when I have to do any minor adjustments to make this course better for the next offering.

From a load perspective, yes, having two essays in an otherwise computationally based course does put load on the lecturer/marker, but I am very happy to pay that price. It’s such a good way to find out what my students are thinking and, from a personal perspective, to be a little more confident that my co-teaching staff and I are making a positive change in these students’ lives. Better still, by sharing comments from cohort to cohort, we give the advice an authenticity that I would be hard pressed to achieve on my own.

I think that this course – the first one I’ve really designed from the ground up, and I’m aware of how rare that opportunity is – is actually turning into something good. And that, unsurprisingly, makes me very happy.


Time Banking and Plagiarism: Does “Soul Destroying” Have An Ethical Interpretation?

Yesterday, I wrote a post on the 40 hour week, to give an industrial basis for the notion of time banking, and I talked about the impact of overwork. One of the things I said was:

The crunch is a common feature in many software production facilities and the ability to work such back-breaking and soul-destroying shifts is often seen as a badge of honour or mark of toughness. (Emphasis mine.)

Back-breaking is me being rather overly emphatic regarding the impact of work, although in manual industries workplace accidents caused by fatigue and overwork can and do break backs – and worse – on a regular basis.

Is it Monday morning already?

But soul-destroying? Am I just saying that someone will perform their tasks as an automaton or zombie, or am I saying something more about the benefit of full cognitive function – the soul as an amalgam of empathy, conscience, consideration and social factors? Well, the answer is that, when I wrote it, I was talking about mindlessness and the removal of the ability to take joy in work, which is on the zombie scale, but as I’ve reflected on the readings more, I am now convinced that there is an ethical dimension to fatigue-related cognitive impairment that is important to talk about. Basically, the more tired you get, the more narrowly you focus on the task itself, and this can have some serious professional and ethical consequences. I’ll provide a basis for this throughout the rest of this post.

The paper I was discussing, on why Crunch Mode doesn’t work, listed many examples from industry and one very interesting paper from the military. That paper, whose link in the Crunch Mode article is broken, may be found here: “Sleep, Sleep Deprivation, and Human Performance in Continuous Operations” by Colonel Gregory Belenky. Now, for those who don’t know, in 1997 I was a commissioned Captain in the Royal Australian Armoured Corps (Reserve), on detachment to the Training Group to set up and pretty much implement a new form of Officer Training for Army Reserve officers in South Australia. Officer training is a very arduous process and places candidates, the few who make it in, under a lot of stress, and does so quite deliberately. We have to have some idea that, if terrible things happen and we have to deploy a human being to a war zone, they have at least some chance of being able to function. I had been briefed on most of the issues discussed in Colonel Belenky’s paper but it was only recently that I read through the whole thing.

And, to me today as an educator (I resigned my commission years ago), there are still some very important lessons, guidelines and warnings for all of us involved in the education sector. So stay with me while I discuss some of Belenky’s terminology and background. The first term I want to introduce is droning: the loss of cognitive ability through lack of useful sleep. As Belenky puts it, in the context of US Army Ranger training:

…the candidates can put one foot in front of another and respond if challenged, but have difficulty grasping their situation or acting on their own initiative.

What was most interesting, and may surprise people who have never served with the military, is that the higher the rank, the less sleep people got – and the higher the level of the formation, the less sleep people got. A Brigadier in charge of a Brigade is going to, on average, get less sleep than the more junior officers in the Brigade and a lot less sleep than a private soldier in a squad. As an officer, my soldiers were fed before me and rested before me, and a large part of my day-to-day concern was making sure that they were kept functioning. This keeps on going up the chain and, as you go further up, things get more complex. Sadly, the people shouldering the most complex cognitive functions, with the most impact on the overall battlefield, are also the people getting the least fuel for their continued cognitive endeavours. They are the most likely to be droning: going about their work in an uninspired way and not really understanding their situation. So here is more evidence from yet another place: lack of sleep and fatigue lead to bad outcomes.

One of the key issues Belenky talks about is the loss of situational awareness caused by the accumulated sleep debt, fatigue and overwork suffered by military personnel. He gives an example of an Artillery Fire Direction Centre – this is where requests for fire support (big guns firing large shells at locations some distance away) arrive, and the human plotters take your requests, transform them into instructions that can be given to the gunners, and then firing starts. Let me give you a (to me) chilling extract from the report, which the Crunch Mode paper also quoted:

Throughout the 36 hours, their ability to accurately derive range, bearing, elevation, and charge was unimpaired. However, after circa 24 hours they stopped keeping up their situation map and stopped computing their pre-planned targets immediately upon receipt. They lost situational awareness; they lost their grasp of their place in the operation. They no longer knew where they were relative to friendly and enemy units. They no longer knew what they were firing at. Early in the simulation, when we called for simulated fire on a hospital, etc., the team would check the situation map, appreciate the nature of the target, and refuse the request. Later on in the simulation, without a current situation map, they would fire without hesitation regardless of the nature of the target. (All emphasis mine.)

Here, perhaps, is the first inkling of what I realised I meant by soul destroying. Yes, these soldiers are overworked to the point of droning and are now shuffling towards zombiedom. But, worse, they have no real idea of their place in the world and, perhaps most frighteningly, despite knowing that accidents happen when fire missions are requested and having direct experience of rejecting what would have resulted in accidental hospital strikes, these soldiers have moved to a point of function where the only thing that matters is doing the work and calling the task done. This is an ethical aspect because, from their previous actions, it is quite obvious that there was both a professional and ethical dimension to their job as the custodians of this incredibly destructive weaponry – deprive them of enough sleep and they calculate and fire, no longer having the cognitive ability (or perhaps the will) to be ethical in their delivery. (I realise a number of you will have choked on your coffee slightly at the discussion of military ethics but, in the majority of cases, modern military units have a strong ethical code, even to the point of providing a means for soldiers to refuse to obey illegal orders. Most failures of this system in the military can be traced to failures in a unit’s ethical climate or to undetected instability in the soldiers: much as in the rest of the world.)

The message, once again, is clear. Overwork, fatigue and sleeplessness reduce the ability to perform as you should. Belenky even notes that the ability to benefit from training quite clearly deteriorates as the fatigue levels increase. Work someone hard enough, or let them work themselves hard enough, and not only aren’t they productive, they can’t learn to do anything else.

The notion of situational awareness is important because it’s a measure of your sense of place, in an organisational sense, in a geographical sense, in a relative sense to the people around you and also in a social sense. Get tired enough and you might swear in front of your grandma because your social situational awareness is off. But it’s not just fatigue over time that can do this: overloading someone with enough complex tasks can stress cognitive ability to the point where similar losses of situational awareness can occur.

Helmet fire is a vivid description of what happens when you have too many tasks to do, under highly stressful situations, and you lose your situational awareness. If you are a military pilot flying on instruments alone, especially with low or zero visibility, then you have to follow a set of procedures, while regularly checking the instruments, in order to keep the plane flying correctly. If the number of tasks that you have to carry out gets too high, and you are facing the stress of effectively flying the plane visually blind, then your cognitive load limits will be exceeded and you are now experiencing helmet fire. You are now very unlikely to be making any competent contributions at all at this stage but, worse, you may lose your sense of what you were doing, where you are, what your intentions are, which other aircraft are around you: in other words, you lose situational awareness. At this point, you are now at a greatly increased risk of catastrophic accident.

To summarise, if someone gets tired, stressed or overworked enough, whether acutely or over time, their performance goes downhill, they lose their sense of place and they can’t learn. But what does this have to do with our students?

A while ago I posted thoughts on a triage system for plagiarists – allocating our resources to those students we have the most chance of bringing back to legitimate activity. I identified the three groups as: sloppy (unintentional) plagiarism, deliberate (but desperate and opportunistic) plagiarism, and systematic cheating. I think that, from the framework above, we can now see exactly where the majority of my ‘opportunistic’ plagiarists are coming from: sleep-deprived, fatigued and (by their own hands or not) overworked students losing their sense of place within the course and becoming focused only on the outcome. Here, the sense of place is not just geographical: it is their role in the social and formal contracts that they have entered into with lecturers, other students and their institution – their place in the agreements for ethical behaviour, which come down to doing the work yourself and submitting only that.

If professional soldiers who have received very large amounts of training can forget where their own forces are, sometimes to the tragic extent that they fire upon and destroy them, or become so cognitively impaired that they carry out the mission, and only the mission, with little of their usual professionalism or ethical concern, then it is easy to see how a student can become so task-focussed that they start to think only about ending the task, by any means, to reduce the cognitive load and to allow themselves to get the sleep that their body desperately needs.

As always, this does not excuse their actions if they resort to plagiarism and cheating – it explains them. It also provides yet more incentive for us to try and find ways to reach our students and help them form systems for planning and time management that bring them closer to the 40-hour ideal, that reduce the all-nighters and the caffeine binges, and that allow them to maintain full cognitive function as ethical, knowledgeable and professional practitioners.

If we want our students to learn, it appears that (for at least some of them) we first have to help them to marshal their resources more wisely and keep their awareness of exactly where they are, what they are doing and, in a very meaningful sense, who they are.


Time Banking: Aiming for the 40 hour week.

I was reading an article on Metafilter about predictions of future leisure from earlier last century, and one of the commenters linked to a great article, “Why Crunch Mode Doesn’t Work: Six Lessons”, via the International Game Developers Association. This article was partially in response to the quality-of-life discussions that ensued after ea_spouse outed the lifestyle (LiveJournal link) caused by her spouse’s ludicrous hours working for Electronic Arts, a game company. One of the key quotes from ea_spouse was this:

Now, it seems, is the “real” crunch, the one that the producers of this title so wisely prepared their team for by running them into the ground ahead of time. The current mandatory hours are 9am to 10pm — seven days a week — with the occasional Saturday evening off for good behavior (at 6:30pm). This averages out to an eighty-five hour work week. Complaints that these once more extended hours combined with the team’s existing fatigue would result in a greater number of mistakes made and an even greater amount of wasted energy were ignored.

The badge is fastened with two pins that go straight into your chest.

This is an incredible workload and, as Evan Robinson notes in the “Crunch Mode” article, it is not only incredible but downright stupid, because every serious investigation into the effects of working more than 40 hours a week for extended periods, and of reducing sleep and accumulating sleep deficit, has come to the same conclusion: hours worked after a certain point are not just worthless, they reduce the worth of hours already worked.

Robinson cites studies and practices from industrialists such as Henry Ford, who reduced shift length to create a 40-hour work week in 1926, attracting huge criticism, because 12 years of research had shown that the shorter work week meant more output, not less. These studies have been going on since the 18th century and well into the 1960s at least, and they all show the same thing: working eight hours a day, five days a week gives you more productivity, because you get fewer mistakes, less fatigue accumulation, and workers producing during their optimal production times (the first 4–6 hours of work) without sliding into their negatively productive zones.

As Robinson notes, the games industry doesn’t seem to have got the memo. The crunch is a common feature in many software production facilities and the ability to work such back-breaking and soul-destroying shifts is often seen as a badge of honour or mark of toughness. The fact that you can get fired for having the audacity to try and work otherwise also helps a great deal in motivating people to adopt the strategy.

Why spend so many hours in the office? Remember when I said that it’s sometimes hard for people to see what I’m doing because, when I’m thinking or planning, I can look like I’m sitting in the office doing nothing? Imagine what it looks like if, two weeks before a big deadline, someone walks into the office at 5:30pm and everyone’s gone home. What does this look like? Because of our conditioning, which I’ll talk about shortly, it looks like we’ve all decided to put our lives before the work – it looks like less than total commitment.

As a manager, if you can tell everyone above you that you have people at their desks 80+ hours a week, and will have for the next three months, then you’re saying “this work is important and we can’t do any more.” The fact that people were probably only useful for the first 6 hours of every day, and even then only for the first couple of months, doesn’t matter, because it’s hard to see what someone is actually achieving if all you focus on is the hours at the desk. Those 80+ hour weeks are probably only now necessary because everyone is so tired, so overworked and so cognitively impaired that they are taking 4 times as long to achieve anything.

Yes, that’s right. All the evidence says that more than 2 months of overtime and you would have been better off staying at 40 hours/week in terms of measurable output and quality of productivity.

Robinson lists six lessons, which I’ll summarise here because I want to talk about them in terms of students, and about why forward planning for assignments is good practice for smoother time management in the future. Here are the six lessons:

  1. Productivity varies over the course of the workday, with greatest productivity in the first 4-6 hours. After enough hours, you become unproductive and, eventually, destructive in terms of your output.
  2. Productivity is hard to quantify for knowledge workers.
  3. Five-day weeks of eight-hour days maximise long-term output in every industry that has been studied in the past century.
  4. At 60 hours per week, the loss of productivity caused by working longer hours overwhelms the extra hours worked within a couple of months.
  5. Continuous work reduces cognitive function 25% for every 24 hours. Multiple consecutive overnighters have a severe cumulative effect.
  6. Error rates climb with hours worked and especially with loss of sleep.
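Lessons 4 and 5 can be sanity-checked with a toy model. Nothing below comes from Robinson’s article except the broad shape of the claim: the 10%-per-week productivity decay is my own illustrative assumption, chosen only to show how quickly compounding fatigue can swallow the extra twenty hours.

```python
# Toy model of lesson 4: a 60-hour crunch week initially produces more
# than a steady 40-hour week, but if each consecutive crunch week erodes
# hourly productivity (here by an assumed 10%), the steady team's
# cumulative output eventually pulls ahead.

def cumulative_output(hours_per_week, weeks, decay_per_week=0.0):
    """Total output after `weeks`, with multiplicative per-week decay."""
    total, efficiency = 0.0, 1.0
    for _ in range(weeks):
        total += hours_per_week * efficiency
        efficiency *= 1.0 - decay_per_week
    return total

# Find the first week in which the 40-hour team's cumulative output
# exceeds the crunching team's.
week = 1
while cumulative_output(60, week, decay_per_week=0.10) >= cumulative_output(40, week):
    week += 1
print(week)  # 10 - with these assumed numbers, about "a couple of months"
```

With a 10% weekly decay the crossover lands around week ten; a steeper or gentler decay moves it, but any sustained decay produces a crossover eventually, which is the whole point of lesson 4.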

My students have approximately 40 hours of assigned work a week, consisting of contact time and assignments, but many of them never really think about that. Most plan other things around their ‘free time’ (they may need to work, they may play in a band, they may be looking after families or they may have an active social life) and they fit the assignment work and other study into the gaps that are left. Immediately, they will be over the 40-hour mark. If they have a part-time job, the three months of one of my semesters will, if not managed correctly, give them a lumpy time schedule alternating between some work and far too much work.

Many of my students don’t know how they are spending their time. They switch on the computer, look at the assignment, Skype, browse, try something, compile, walk away, grab a bite, web surf, try something else – wow, three hours of programming! This assignment is really hard! That’s not all of them but it’s enough of them that we spend time on process awareness: working out what you do so you know how to improve it.

Many of my students see sports drinks, energy drinks and caffeine as a licence not to sleep. It doesn’t work long term, as most of us know, for exactly the reasons that long-term overwork and sleeplessness don’t work. Stimulants can keep you awake but you will still be carrying most, if not all, of your cognitive impairment.

Finally, and most importantly, enough of my students don’t realise that everything I’ve said up until now means that they are trying to sit my course with half a brain after about the halfway point, if not sooner if they didn’t rest much between semesters.
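The 25%-per-24-hours figure from lesson 5 makes the “half a brain” arithmetic easy to check. Whether the loss compounds or simply subtracts is my assumption, not Belenky’s or Robinson’s; either reading lands in much the same place after two all-nighters:

```python
# Lesson 5's figure: roughly 25% cognitive loss per 24 hours awake.

def capacity_compounding(all_nighters, loss_per_day=0.25):
    """Capacity remaining if each day's loss compounds on the last."""
    return (1.0 - loss_per_day) ** all_nighters

def capacity_linear(all_nighters, loss_per_day=0.25):
    """Capacity remaining if the loss is a flat 25 points per day."""
    return max(0.0, 1.0 - loss_per_day * all_nighters)

print(capacity_compounding(2))  # 0.5625
print(capacity_linear(2))       # 0.5
```

Either way, a student who has pulled two consecutive all-nighters before a deadline is attempting the hardest part of the assignment with roughly half of their usual cognitive capacity.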

I’ve talked about the theoretical basis for time banking and the pedagogical basis for time banking: this is the industrial basis for time banking. One day I hope that at least some of my students will be running parts of their industries and that we have taught them enough about sensible time management and work/life balance that, as people in control of a company, they look at real measures of productivity, they look at all of the masses of data supporting sensible ongoing work rates and that they champion and adopt these practices.

As Robinson says towards the end of the article:

Managers decide to crunch because they want to be able to tell their bosses “I did everything I could.” They crunch because they value the butts in the chairs more than the brains creating games. They crunch because they haven’t really thought about the job being done or the people doing it. They crunch because they have learned only the importance of appearing to do their best instead of really doing their best. And they crunch because, back when they were programmers or artists or testers or assistant producers or associate producers, that was the way they were taught to get things done. (Emphasis mine.)

If my students can see all of their requirements ahead of time, know what is expected, have been given enough process awareness, and have the will and the skill to undertake the activities, then we can potentially teach them a better way to get things done if we focus on time management in a self-regulated framework, rather than imposed deadlines in a rigid authority-based framework. Of course, I still have a lot of work to do to demonstrate that this will work but, from industrial experience, we have yet another very good reason to try.