HERDSA 2012: Connecting with VET
Posted: July 6, 2012 Filed under: Education | Tags: blogging, community, education, educational problem, educational research, ethics, Generation Why, herdsa, higher education, learning, resources, student perspective, teaching, teaching approaches, thinking

As part of the final session of the general conference, I attended a talk co-presented by A/Prof Anne Langworthy and Dr Susan Johns on “Why is it important for Higher Education to connect with the VET sector?” As a point of clarification, VET stands for Vocational Education and Training, as I’ve previously mentioned, and is generally concerned with trade qualifications, with some students going on to diplomas and advanced diplomas that may be recognised as equivalent to some or all of a first-year University course. If a student starts in a trade and progresses to a certain point, we can then easily accept them into an articulation program.
The Tasmanian context, a small state that is still relatively highly ruralised, provides a challenging backdrop to this work, as less than 18% of school leavers who could go on to University actually do so. Combine this with the highest proportion of low socio-economic status (SES) residents of any Australian state, and many Tasmanians can be considered disadvantaged in both access to and participation in University-level studies. Family influence plays a strong part here – many families have no-one in their immediate view who has been to University at all, although VET is slightly more visible.
Among the sobering statistics presented: in a 12-year schooling system, where Year 12 is the usual prerequisite for University entry, 54% of Tasmanians aged 15 or older had Year 10 schooling or less. Over half the adult population had not completed secondary schooling, the usual stepping stone to higher education.
The core drivers for this research were the following:
- VET pathways are more visible and accessible to low SES/rural students because the entry requirements aren’t necessarily as high and someone in their family might be able to extol the benefits.
- There are very low levels of people articulating from VET to Higher Ed – so, few people are going on to the diploma.
- There is an overall decline in VET and HE participation.
- Even where skills shortages are identified, students aren’t studying in the areas of regional need.
- UTAS is partnering with the Tasmanian VET provider Tasmanian Polytechnic and Skills Institute.
An interesting component of this background data is that, while completion rates are dropping for VET qualifications, individual module completion rates are still higher than those of the courses in which the modules sit. In other words, people are likely to finish a module that is of use to them, or an employer, but don’t necessarily see the need to complete the whole course. However, across the board, the real problem is that VET, so often a pathway to higher education for people who didn’t quite finish school, is dropping in percentage terms as a pathway to Uni. There has been a steady decline in VET to HE articulation in Tasmania across the last 6 years.
The researchers opted for an evidence-based approach to examine those students who had succeeded in articulating from VET to HE, investigating their perceptions and then mapping existing pathways to discover what could be learned. The profile of VET and HE students from the low SES/rural areas of Tasmania is pretty similar, although the VET students who did articulate into Uni were less likely to seek pathways that rewarded them with specific credits and were more likely to seek general admission. Given that these targeted articulations, with credit transfer, are supposed to reflect and reward student desire, or encourage participation in a specific discipline, it appears that these pathways aren’t working as well as they could.
So what are the motivators for those students who do go from VET to Uni? Career and vocational aspirations, increased confidence from doing VET, building on their practical VET basis, the quality and enthusiasm of their VET teachers, the need to bridge over an existing educational hurdle, and satisfaction with their progress to date. While participation in VET generally increased a student’s positive attitude to study, the decision to articulate (or not) often came down to money, time, a perceived lack of industry connection and the need for more transitional assistance.
It’s quite obvious that our students, and industry, can become fixated on the notion of industrial utility – job readiness – and this may be to the detriment of Universities if we are perceived as ivory towers. More Higher Ed participation is excellent for breaking the poverty cycle and developing social inclusion, and the VET to Higher Ed nexus offers great benefits in terms of good student outcomes, such as progression and retention, but it’s obvious that people coming from a practice-based area, especially in terms of competency training, are going to need bridging and some understanding to adapt to the University teaching model. Or, of course, our model has to change. (Don’t think I was going to sneak that one past you.)
The authors concluded that bridging was essential, that articulation and credit transfer arrangements should be reviewed, and that better articulation agreements should be offered in areas of national and regional priority. The cost of going to University, which may appear very small to students who are happy to defer payment, can be an impediment to lower SES participants because, on average, people in these groups can be highly debt averse. The relocation and support costs of moving away from a rural community to an urban centre for education are also significant. It’s easy sometimes to forget how much it costs to live in a city, especially when you deprive someone of their traditional support models.
Of course, that connection to industry is one where students can feel closer when they undertake VET and Universities can earn some disrespect, fairly or not, for being seen to be ivory towers, too far away from industry. If you have no-one in your family who has been to Uni, there’s no advocate for the utility of ‘wasting’ three years of your life not earning money in order to get a better job or future. However, this is yet another driver for good industry partnerships and meaningful relationships between industry, VET and Higher Education.
It’s great to see so much work being done in both understanding and then acting to fix some of the more persistent problems for those people who may never even see a University, unless we’re dynamic and thoughtful in our outreach programs. On a personal note, while Tasmania has the lowest VET to HE conversion, I noticed that South Australia (my home state) has the second lowest and a similar decline. Time, I think, to take some of the lessons learned and apply them in our back yard!
HERDSA 2012: What is the New Academy?
Posted: July 4, 2012 Filed under: Education, Opinion | Tags: advocacy, authenticity, blogging, collaboration, education, educational research, ethics, herdsa, higher education, reflection, resources, teaching approaches, thinking, vygotsky, workload

I attended some (more) interesting talks today on building research capacity, how we build the connection between education and research (the dreaded research-teaching nexus) and how we identify ourselves as academics. If I were to summarise all three of these talks, it would be as:
How are we defining the Academy of the 21st Century?
There is no doubt that research is a crucial component of what we do – you can’t even be registered as a University in Australia unless you pursue research – but it often seems to be the favoured child in any discussion of importance for promotion and allocation of serious resources. Now I realise that a lot of work is going into fixing this but research has, for many years, counted for more.
So it’s interesting that, as Winthrop Professor Shelda Debowski, UWA, observed after returning from her Churchill Fellowship, we don’t really bother to do as much training as we should for research. Research success doesn’t automatically flow from finishing a PhD, any more than a PhD is an indication of readiness or aptitude to teach – yet many early researchers don’t get a great deal of development assistance. This leads, in some cases, to what Debowski refers to as middlescence: a great PhD but after 5+ years it all dies.
Successful research requires many capabilities and ongoing learning and, while our universities try to support this, we’re often not sure of the best way to do so. Staff are seeking guidance – research leaders are keen to help. How can we connect them usefully and efficiently? For me, I rephrase the question as:
How are we defining the Research Academy of the 21st Century?
Research is a simple word with a complex set of concepts behind it. Are we looking at it on the basis of inputs, outputs, strategy and impact? Are we looking at industrial interaction with collaboration, engagement and support? Are we being productive and effective, innovative and creative? There is, for many people’s careers, not much room for failure.
The PhD used to be all that was needed, in theory, because we had the time to make some mistakes, to find our feet, and to iterate towards a better model. Not any more.
My take on this, to go on from what I was saying in the last post, is that we can define the New Research Academy in terms of its environment. Like any species, the New Research Academic must adapt to the environment that they are in or they will perish. Climate change is a threat to the world, similarly Academy Change is a threat to the old inhabitants. The New Academy is fast, hungry, competitive, resource starved, commoditised, industry linked and, above all, heavily dependent on the perception of our efforts. The speed of change makes a difference here because if you were raised in the gentler environment of the Old Academy, but have been around for 20 years, then you have probably achieved enough success to survive. If I may take another biological example, you have accumulated enough resources that you can survive the lean years or the harsher years. The New Academy has frosts and only so many places available for the tribe. You build your resources quickly or it’s over.
Unless, of course, you can find a group to support you. Returning to Debowski’s material, she points out why development of researchers is so critical:
- Start with PhD – used to be the only thing that you needed to do.
- Now you have to understand how it fits into strategic research areas and areas of strength (broader sphere of understanding)
- Need to hook in with a research community (this is your resource sharing group)
- ECRs need to develop: communication skills, team and collaborative skills, project management, track record/profile, time, priority, career management, and grant seeking behaviour
- Research managers and leaders need to take a professional stance to support this: induction, culture setting, human resource management practices, strategic management, financial management, relationship building, mentoring and sponsorship, project management, risk management, media/promotion.
But, looking at that final list, do some of those look like the behaviours of a professional research academic? I’ll come back to this.
Debowski finished by emphasising the role of mentors and, in the Old/New Academy framework, this makes even more sense. A new PhD student has only a limited amount of time before poor performance effectively removes them from the appointment and job pool – they don’t have time to waste taking false paths. A mid-career researcher needs to work out which path to take and then has to optimise for it – do I continue teaching, do I focus on research, should I take that Associate Deans position? This is where a mentor is vital because the New Academy has a cold wind blowing through it. Huddled together, we’ll see Summer again – but, of course, you have to huddle with the right people.
This brings me to the next talk, on How Universities Connect Education and Research, presented by Professor Lawrence Cram. This was a very interesting talk, dealing with complexity theory to explain the small-scale chaotic relationships in trying to explain which actions get people promoted these days. This is a very mechanistic approach to life in the New Academy. Which X do I need to maximise to achieve Y? Cram, however, very nicely identifies that X is in fact a set of things, Y is a different set of things, and the connections between them operate at different levels at different times.
Cram identified the outputs of Universities as experience goods, where the product is hard to observe in advance, in terms of characteristics such as quality or price, but you’re quickly aware of how good they are once consumed. This generally requires you to sell your product on reputation but once this reputation is established, your pricing model (market position) tends to stay fairly stable. (Amusingly, dropping the price of experience goods, because we’re unsure of how the goods are created, may result in uncertainty because people will make up reasons for the price drop that generally include drop in quality, rather than efficiency of delivery or something positive.)
This makes mapping inputs to outputs difficult and explains why such measurable outputs as number of students, pass rates and research publications are far more likely to form the basis of any funding. Cram is looking across a very large area with a very large number of questions: does research success generate a corresponding success ‘buzz’ in the student body? Does research discovery parallel or assist the student with their own voyage of discovery through their courses?
Ultimately, directives from senior management drive a functional and idealistic approach that produces graduates and intellectual property, but most universities are struggling to unify this with directives and government funding, compared to what students want. Linking this back to the roles that we are expecting research managers to take, we start to see a managerial focus that is starting to dominate our professional academic staff. I rephrase this, and segue to the next talk, as:
How are we defining the Professional Academic in the 21st Century?
The final talk used identity theory to examine the different work ideologies that academics espouse. Wayne O’Donohue presented his and Richard Winter’s paper on “Understanding academic identity conflicts in the public university: Importance of work ideologies” and it was both an interesting presentation, as well as being a full paper that I hope to finish reading this evening.
Fundamentally, managerial and professional ideological beliefs differ on how academic work should be organised. As I have mentioned throughout this post, we are seeing more and more evidence of a creeping requirement to become managers. Managerialism, according to Winter and O’Donohue, has moved us into market-driven entities that regard students as commodities. Consumers need to be swayed by branding and pandering to preferences – we risk basing the reputation of our experience good upon a good marketing campaign rather than a solid academic reputation.
The conceptual framework for this work is that the two identities are, effectively, at odds with each other. Academics who are forced to be managerial find themselves at odds with their idea of what it means to be an academic – they are not being who they want to be and are at odds with what their University wants them to be. If we are to be good managerial entities then we focus on competition and consumer preferences for allocating resources. If we are to be good academics, then we focus on economic and social welfare of all members, stressing normative goals and beliefs. It is hard to think of two more opposing points within this sphere and it is no wonder that the people surveyed by Winter and O’Donohue had to be censored to remove obscene language that reflected their frustration at their own perception of their role.
We know that the market is not all that good at managing public good items. We know the benefits of the educational system in breaking the poverty cycle, reducing crime and violence, improving families, but the market would have to change its short-term benefit model in order to factor this in. We are looking at the substantial differences of short term economic focus versus long term social welfare focus.
Ultimately, the dissonance generated by people doing things that they were asked to do, but didn’t want to do, causes dissatisfaction and cynicism. Dispirited academics leave. Leaving, of course, those who are willing to adapt to the more managerial focus to then rise through the ranks, take positions of power and then impose more managerial focuses.
So what is the New Academy? Is it really a world of bottoms on seats, feudalist in its enforced fealty to existing barons to see you through the lean years, unconnected to funding models and overly metricated in strange ways?
If you want my honest answer, I would say “Not yet.”
Yes, we are heavily measured, but we still have the freedom to challenge and correct those measurements. A great deal of work is being done to produce instruments that give us useful and applicable information, as well as ‘handy’ numbers.
Yes, it helps to be in a research group, but informal communities of practice, faculty and university initiatives, external funding sources such as OLT, ALTA and the ARC do not require you to sign your swords over to a baron or a King.
Yes, we are measured as to our student intakes but we are still, in many important ways, academically free. We can still maintain quality and be true to our academic heritage.
You don’t have to take my word for it. Read everything that I (and katrinafalkner) have been blogging about. You can see all of the work being done, that we have seen at this conference, to draw us all together, to make us remember that we are strong as a group, to provide useful metrics, to collaborate, to mentor out of the desire to help rather than the desire to control, and the work being done to find and advertise our identity and the way that we can achieve our goals.
Yes, the idea of the New Academy is intimidating, and I write as one who was lucky enough to ride the wave of the new expectations, but in the same way that we bring our students together to learn and explore the benefits of collaboration and social interaction, I am convinced that the best rebirthing of the Academy will occur as we continue to share our work, and meet to discuss it, and go back home and be active and build upon everything that we’ve discussed.
And, being honest, sometimes it just takes sticking to our point, when we’re right, and not doing something that we know is wrong. I know that these are times when people are scared for their jobs, and I’m certainly not immune to that either, but the question comes down to “how much will you put up with?” Let me finish with two final questions, which are also, I’m afraid, a call-to-arms:
What have you done today to define the Academy of the 21st Century in a way that matches your ideals and intentions?
What will you do tomorrow?
HERDSA 2012: Integrating concepts across common engineering first year courses
Posted: July 4, 2012 Filed under: Education | Tags: blogging, curriculum, design, education, educational problem, educational research, herdsa, higher education, reflection, resources, student perspective, teaching, teaching approaches

I attended a talk by Dr Andrew Guzzomi on “Interdisciplinary threshold concepts in engineering”, where he talked about the University of Western Australia’s reconfiguration of their first-year common engineering program in the face of their new 3+2 course roll-out across the University. Most Unis have a common engineering first year that is the basis for all disciplines. This is usually a collection of individual units, each focusing on one discipline, developed and taught by academics from that discipline. For example, civil engineers teach statics, mechs teach dynamics, but there is no guaranteed connection or conceptual linkage between the two areas. This is despite the fact that statics is effectively dynamics with some key variables set to zero. (Engineers, you may now all howl in dismay!)
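To make that last claim concrete (my gloss, not Dr Guzzomi’s), the link is visible in Newton’s second law itself:

```latex
% Dynamics: Newton's second law for a body of mass m
\sum \vec{F} = m\vec{a}
% Statics: set the acceleration (and angular acceleration) to zero,
% and the same law collapses to the equilibrium conditions
\sum \vec{F} = \vec{0}, \qquad \sum \vec{M} = \vec{0}
```

The statics course's equilibrium equations are just the dynamics equations with the acceleration terms zeroed out, which is exactly why teaching the two as unconnected units misses an integrating concept.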
This work looked at what the threshold concepts were for engineering. These threshold concepts are transformative, in that if you understand them it will change the way that you think about the discipline, but they are also troublesome, they need work to teach and effort to learn. But, in theory, if we identify the right threshold concepts then we:
- Focus teaching, learning and assessment activities
- Renew otherwise crowded curricula
This is a big issue as we balance the requirements of our students, our discipline, our professional bodies and industry – we have to make sure that whatever teach is appropriate and the most useful (in all of the objective spaces) thing that we can be teaching.
Dr Guzzomi then discussed the ALTC (Australian Learning and Teaching Council) project that supported the basic investigation: to conduct an inventory of what all groups considered to be the core threshold concepts. UWA was the case study, with an aim of producing a guide for other educators and adding back to threshold concept theory. This is one of the main contributions of the large-scale Australia-wide educational research support bodies: they can give enough money and influence to a project to allow change to occur.
(I picked up from the talk that, effectively, it helped to have a Chair of Engineering Education on board to get an initiative like this through. Even then, there was still resistance from some quarters. This isn’t surprising. If we all agreed with each other, I’d be shocked.)
The threshold concept identification required a very large set of workshops and consultative activities, across students and staff both within and without the discipline, starting with a diversification phase as concepts were added, and then moving to an integration phase that rationalised these concepts down into the set that really expressed the key threshold concepts of engineering for first year.
The implementation in Syllabus terms required the implementors to:
- Focus teaching and learning on TCs
- Address troublesome features
- Provide opportunities to experience variation (the motion unit was taught using variation theory, where students work at individual tables, doing different problems at different tables, but pool similar answers for comparison to show the difference in approach and answer)
- System identification: where you work out which system the problem fits into, to allow you to simplify analysis
- Modelling and abstraction: where quantitative analysis is facilitated through translation to mathematical language and students use judgement to break system into salient components for modelling
- Dimensional reasoning: Identifying the variables needed to describe a complex system – making sure that equations balance.
The key conclusions:
- Rather than a traditional and relatively unlinked common foundation, teaching integrating concepts is showing promise
- Threshold concepts provided the lens and developed approach to integrated disciplines
- Teaching through variation supports student diversity in solutions
- This approach reveals connections across engineering disciplines beyond those in which students later choose to specialise
HERDSA 2012: Session 1 notes – Student Wellbeing
Posted: July 4, 2012 Filed under: Education | Tags: blogging, education, educational problem, herdsa, higher education, in the student's head, measurement, principles of design, reflection, resources, student perspective, teaching, teaching approaches, thinking, time banking

I won’t be giving detailed comments on all sessions – firstly, I can’t attend everything and, secondly, I don’t want you all to die of word poisoning – but I’ve been to a number of talks and thought I’d discuss those here that really made me think. (My apologies for the delay. I seem to be coming down with a cold/flu and it’s slowing me down.)
In Session 1, I went to a talk entitled “Integrating teaching, learning, support and wellbeing in Universities”, presented by Dr Helen Stallman from the University of Queensland. The core of this talk was that, if we want to support our students academically, we have to support them in every other way as well. The more distressed students are, the less well they do academically. If we want good outcomes, we have to be able to support students’ wellbeing and mental health. We already provide counselling and support skill workshops, but very few students will go and access these resources until they actually need them.
This is a problem. Tell a student at the start of the course, when they are fine, where they can find help, and they won’t remember it when they actually need to know where that resource is. We have low participation in many of the counselling and support skill workshop activities – it is not on the student’s agenda to go to one of these courses; their agenda is to get a good mark. Pressured for time, with competing demands, anything ‘optional’ is not a priority.
The student needs to identify that they have a problem, and then they have to be able to find the solution! Many University webpages are not actually useful in this regard, although they contain a lot of marketing information on the front page.
What if we have an at-risk profile that we can use to identify students? It’s not 100% accurate. Students who fit the ‘at risk’ profile may not have problems, but students who don’t fit the profile may still have problems! We don’t necessarily know what’s going on with our students. Where we have hundreds of students, how can we know all of them? (This is one of the big drivers for my work in submission management and elastic time – identifying students who are at risk as soon as they may be at risk.)
So let me reiterate the problem with the timing of information: we tend to mention support services once, at the start. People don’t access resources unless they’re relevant and useful at the particular time. Talking to people when they don’t have a problem – they’ll forget it.
So what are the characteristics of interventions that promote student success:
- Inclusive of all students (and you can find it)
- Encourages self-management skills (Don’t smother them! Our goal is not dependency, it’s self-regulation)
- Promotes academic achievement (highest potential for each of our students)
- Promotes wellbeing (not just professional capabilities but personal capabilities and competencies)
- Minimally sufficient (students/academics/unis are not doing more work than they need to, and only providing the level of input that is required to achieve this goal.)
- Sustainable (easy for students and academics)
Dr Stallman then talked about two tools – the Learning Thermometer and The Desk. Student reflection, combined with the system interface, gives us the Learning Thermometer; automated and personalised student feedback, put in by the academic, is then added. Web-based support and intervention forms a loop around the student feedback. Student privacy is maintained and the student gets to choose the intervention that is appropriate. Effectively, the Learning Thermometer tells the student which services are available, as and when they are needed, based on their results, their feedback and the lecturer’s input.
This is designed to promote self-management skills and makes the student think “What can I do? What are the things that I can do?” It gives students knowledge of which resources they can access (this resource is called “The Desk”) and of who the people are who can help them.
What is being asked is: What are the issues that get in the way of achieving academic success?
About “The Desk”: it contains quizzes, related to all parts of the site, that give students personalised feedback and suggest modules as appropriate. There is a summary sheet of what you’ve done so you can always remember it, a Tools section with short tips on how to fix things, and a Coffee House social media centre to share information and pictures (recipes and anything, really).
To allow teachers to work out what is going on, an addition to the Learning Thermometer can give the teacher feedback based on student reflection and the interface. Early feedback to academics allows us to improve learning outcomes, and these improvements flow through into teaching practices. (Student satisfaction correlates poorly with final mark; this is more than satisfaction.)
The final items in the talk focussed on:
- A universal model of prevention
- All students can be resilient
- Resources need to be timely, relevant and useful
- Multiple access points
- Integrated within the learning environment
What are the implications?
- Focus on prevention
- Close the loop between learning, teaching, wellbeing and support
- More resilient students
- Better student graduate outcomes.
Overall a very interesting talk, with a lot of things to think about. How can I position my support resources so that students know where to go as and when they need them? Is ‘resiliency’ an implicit or explicit goal inside my outcomes and syllabus structure? Do the mechanisms that I provide for assessment work within this framework?
With my Time Banking hat on, I am always thinking about how I can be fair but flexible, consistent but compassionate, and maintain quality while maintaining humanity. This talk is yet more information to consider as I look at alternative ways to work with students for their own benefit, while improving their performance at the same time.
Contact details and information on tools discussed:
h.stallman@uq.edu.au
http://www.thelearningthermometer.org.au
http://www.thedesk.org.au
thedesk@uq.edu.au
When the Stakes are High, the Tests Had Better Be Up to It.
Posted: July 2, 2012 Filed under: Education, Opinion | Tags: advocacy, authenticity, blogging, curriculum, design, education, educational problem, ethics, feedback, Generation Why, higher education, identity, plagiarism, reflection, resources, student perspective, teaching, teaching approaches, testing, thinking, universal principles of design, work/life balance

(This is on the stronger opinion side but, in the case of standardised testing as it is currently practised, this will be a polarising issue. Please feel free to read the next article and not this one.)

If you make a mistake, please erase everything from the worksheet, and then leave the room, as you have just wasted 12 years of education.
A friend on FB (thanks, Julie!) linked me to an article in the Washington Post that some of you may have seen. The article is called “The Complete List of Problems with High-Stakes Standardised Tests” by Marion Brady, in the words of the article, a “teacher, administrator, curriculum designer and author”. (That’s attribution, not scare quotes.)
Brady provides a (rather long but highly interesting) list of problems with the now very widespread standardised testing regime that is an integral part of student assessment in some countries. Here, Brady focuses on the US, but there is little doubt that the same problems exist in other areas. From my readings and discussions with US teachers, he is discussing issues that are well-known problems in the area, but they are slightly intimidating when presented as a block.
So many problems are covered here, from an incorrect focus on simplistic repetition of knowledge because it’s easier to assess, to the way that it encourages extrinsic motivations (bribery or punishment in the simplest form), to the focus on test providers as the stewards and guides of knowledge rather than the teachers. There are some key problems, and phrases, that I found most disturbing, and I quote some of them here:
[Teachers oppose the tests because they]
“unfairly advantage those who can afford test prep; hide problems created by margin-of-error computations in scoring; penalize test-takers who think in non-standard ways”
“wrongly assume that what the young will need to know in the future is already known; emphasize minimum achievement to the neglect of maximum performance; create unreasonable pressures to cheat.”
“are open to massive scoring errors with life-changing consequences”
“because they provide minimal to no useful feedback”
This is completely at odds with what we would consider to be reasonable education practice in any other area. If I had comments from students that identified that I was practising 10% of this, I would be having a most interesting discussion with my Head of School concerning what I was doing – and a carpeting would be completely fair! This isn’t how we should teach and we know it.
I spoke yesterday about an assault on critical thinking as being an assault on our civilisation, short-sightedly stabbing away at helping people to think as if it will really achieve what (those trying to undermine critical thinking) actually wanted. I don’t think that anyone can actually permanently stop information spreading, when that information can be observed in the natural world, but short-sightedness, malign manipulation of the truth and ignorance can certainly prevent individuals from gaining access to information – especially if we are peddling the lie that “everything which needs to be discovered is already known.”
We can, we have and we probably (I hope) always will work around these obstacles in information, these dark ages as I referred to them yesterday, but at what cost of the great minds who cannot be applied to important problems because they were born to poor families, in the ‘wrong’ state, in a district with no budget for schools, or had to compete against a system that never encouraged them to actually think?
The child who would have developed free safe power, starship drives, applicable zero-inflation stable economic models, or the “cure for cancer” may be sitting at the back of a poorly maintained, un-airconditioned classroom somewhere, doodling away, and slowly drifting from us. When he or she encounters the standardised test, unprepared, untrained, and tries to answer it to the extent of his or her prodigious intellect, what will happen? Are you sufficiently happy with the system that you think that this child will receive a fair hearing?
We know that students learn from us, in every way. If we teach something in one way but we reward them for doing something else in a test, is it any surprise that they learn for the test and come to distrust what we talk about outside of these tests? I loathe the question “will this be in the exam” as much as the next teacher but, of course, if that is how we have prioritised learning and rewarded the student, then they would be foolish not to ask this question. If the standardised test is the one that decides your future, then, without doubt, this is the one that you must set as your goal, whether student, teacher, district or state!
Of course, it is the future of the child that is most threatened by all of this, as well as the future of the teaching profession. Poor results on a standardised test for a student may mean significantly reduced opportunity, and reduced opportunity, unless your redemptive mechanisms are first class, means limited pathways into the future. The most insidious thread through all of this is the idea that a standardised test can be easily manipulated through a strategy of learning what the answer to a test question should be, rather than what it is within the body of knowledge. We now combine the disadvantaged student having their future restricted, competing against the privileged student who has been heavily channelled into a mode that allows them to artificially excel, with no guarantee that they have the requisite aptitude to enjoy or take advantage of the increased opportunities. This means that both groups are equally in trouble, as far as realising their ambitions, because one cannot even see the opportunity while the other may have no real means for transforming opportunity into achievement.
The desire to control the world, to change the perception of inconvenient facts, to avoid hard questions, to never be challenged – all of these desires appear to be on the rise. This is the desire to make the world bend to our will, the real world’s actual composition and nature apparently not mattering much. It always helps me to remember that Cnut stood in the waves and commanded them not to come in order to prove that he could not control the waves – many people think that Cnut was defeated in his arrogance, when he was attempting to demonstrate his mortality and humility, in the face of his courtiers telling him that he had power above that of mortal men.
How unsurprising that so many people misrepresent this.
You’re Welcome On My Lawn But Leaf Blowers Are Not
Posted: June 28, 2012 Filed under: Education | Tags: advocacy, design, education, higher education, in the student's head, learning, measurement, principles of design, reflection, resources, student perspective, teaching approaches, thinking, tools Leave a commentI was looking at a piece of software the other day and, despite it being a well-used and large-userbase piece of code, I was musing that I had never found it to be particularly fit for purpose. (No, I won’t tell you what it is – I’m allergic to defamation suits.) However, my real objections to it, in simple terms, sound a bit trivial to my own ears and I’ve never really had the words or metaphors to describe it to other people.
Until today.
My wife and I were walking in to work today and saw, in the distance, a haze of yellow dust, rising up in front of three men who were walking towards us, line abreast, as a street sweeping unit slowly accompanied them along the road. Each of the men had a leaf blower that they were swinging around, kicking up all of the plane tree pollen/dust (which is highly irritating) and pushing it towards us in a cloud. They did stop when they saw us coming but, given how much dust was in the air, it’s 8 hours later and I’m still getting grit out of my eyes.

Weirdly enough, this image comes from a gaming site, discussing mecha formations. The Internet constantly amazes me.
Now, I have no problem with streets being kept clean and free of debris and I have a lot of respect for the sweepers, cleaners and garbage removal people who stop us from dying in a MegaCholera outbreak from living in cities – but I really don’t like leaf blowers. On reflection, there are a number of things that I don’t like for similar reasons so let me refer back to the piece of software I was complaining about and call it a leaf blower.
Why? Well, primarily, it’s because leaf blowers are a noisy and inefficient way to not actually solve the problem. Leaf blowers move the problem to someone else. Leaf blowers are the socially acceptable face of picking up a bag of garbage and throwing it on your neighbour’s front porch. Today was a great example – all of the dust and street debris was being blown out of the city towards the Park lands where, presumably, this would become someone else’s problem. The fact that a public thoroughfare was a pollen-ridden nightmare for 30 minutes or so was also, apparently, collateral damage.
Now, of course, there are people who use leaf blowers to push leaves into big piles that they then pick up, but there are leaf vacuums and brooms and things like that which will do a more effective job with either less noise or more efficiently. (And a lot of people just blow it off their property as if it will magically disappear.) The catch is, of course, better solutions generally require more effort.
The problem with a broom is that pushing a broom is a laborious and tiring task, and it’s quite reasonable for large-scale tasks like this that we have mechanical alternatives. For brief tidy-ups and small spaces, however, the broom is king. The problem with the leaf vacuum is that it has to be emptied and they are, because of their size and nature, often more expensive than the leaf blower. You probably couldn’t afford to have as many of these on your cleanup crew’s equipment roster. So brooms are cheap but hard manual labour compared to expensive leaf vacuums which fulfil the social contract but require regular emptying.
Enter the leaf blower – low effort, relatively low cost, no need to empty the bag, just blow it off the property. It is, however, an easy way to not actually solve the problem.
And this, funnily enough, describes the software that I didn’t like (and many other things in a similar vein). Cost-wise, compared to building it yourself and in terms of maintenance, it’s a sensible decision. It’s pretty easy to use. There’s no need to worry about being sensible or parsimonious with resources. You just do stuff in it with a small amount of time and you’re done.
The only problem is that what you are encouraged to produce by default, the affordance of the software, is not actually the solution to the problem that the software theoretically solves. It is an approximation to the answer but, in effect, you’ve handed the real problem to someone else – in my case, the student, because it’s software of an educational nature. This then feeds load straight back to you, your teaching assistants and support staff. Any effort you’ve expended is wasted and you didn’t even solve the problem.
I’ve talked before about trying to assess what knowledge workers are doing, rather than concentrating on the number of hours that they are spending at their desk, and the ‘desk hours’ metric is yet another example of leaf blowing. Cheap and easy metric, neither effective nor useful, and realistically any sensible interpretation requires you to go back and work out what people are actually doing during those hours – problem not solved, just shunted along, with a bit of wasted effort and a false sense of achievement.
Solving problems is sometimes difficult and it regularly requires careful thought and effort. There may be a cost involved. If we try to come up with something that looks like a solution, but all it does is blow the leaves around, then we probably haven’t actually solved anything.
Student Reflections – The End of Semester Process Report
Posted: June 27, 2012 Filed under: Education | Tags: authenticity, education, feedback, Generation Why, higher education, in the student's head, learning, measurement, principles of design, reflection, resources, student perspective, teaching, teaching approaches, thinking, tools Leave a commentI’ve mentioned before that I have two process awareness reports in one of my first-year courses. One comes just after the monster “Library” prac, and one is right at the end of the course. These encourage the students to reflect on their assignment work and think about their software development process. I’ve just finished marking the final one and, as last year, it’s a predominantly positive and rewarding experience.
When faced with 2-4 pages of text to produce, most of my students sit down and write several, fairly densely packed pages telling me about the things that they’ve discovered along the way: lessons learned, pit traps avoided and (interestingly) the holes that they did fall into. It’s rare that I get cynical replies and for this course, from over 100 responses, I think that I had about 5 disappointing ones.
The disappointing ones included ones that protested about how I had to give them marks for something that was rubbish (uh, no I didn’t, read the assignment spec and the forum carefully), ones that were scrawled together in about a minute and said nothing, and the ones that were the outpourings of someone who wasn’t really happy with where they were, rather than something I could easily fix. Let’s move on from these.
I want to talk about the ones who had crafted beautiful diagrams where they proudly displayed their software process. The ones who shared great ideas about how to help students in the next offering. The ones who shared the links that they found useful with me, in case other students would like them. The ones who were quietly proud of mastering their areas of difficulty and welcomed the opportunity to tell someone about it. The one who used this quote from Confucius:
“A man without distant care must have near sorrow”
(人无远虑 必有近忧)
to explain why you had to look into the future when you did software design – don’t leave your assignments to the last minute, he was saying, look ahead! (I am, obviously, going to use that for teaching next semester!)
Overall, I find these reports to be a resolutely uplifting experience. The vast majority of my students have learnt what I wanted them to learn and have improved their professional skills but, as well, a large number of them have realised that the assignments, together with the lectures, develop their knowledge. Here is one of my favourite student quotes about the assignments themselves, which tells me that we’re starting to get the design right:
The real payoff was towards the end of the assignment. Often it would be possible to “just type code” and earn at least half the marks fairly easily. However there was always a more complex final-part to the assignment, one that I could not complete unless I approached it in a systematic, well thought out way. The assignments made it easy to see that a program of any real complexity would be nearly impossible to build without a well-defined design.
But students were also thinking about how they were going to take more general lessons out of this. Here’s another quote I like:
Three improvements that I am aiming to take on board for future subjects are: putting together a study timetable early on in the game; taking the time to read and understand the problem I’ve been given; and put enough time aside to produce a concise design which includes testing strategies.
The exam for this course has just been held and we’re assembling the final marks for inspection on Friday, which will tell us how this new offering has gone. But, at this stage, I have an incredibly valuable resource of student feedback to draw on when I have to do any minor adjustments to make this course better for the next offering.
From a load perspective, yes, having two essays in an otherwise computationally based course does put load on the lecturer/marker but I am very happy to pay that price. It’s such a good way to find out what my students are thinking and, from a personal perspective, be a little more confident that my co-teaching staff and I are making a positive change in these students’ lives. Better still, by sharing comments from cohort to cohort, we provide an authenticity to the advice that I would be hard pressed to achieve.
I think that this course, the first one I’ve really designed from the ground up (and I’m aware of how rare that opportunity is), is actually turning into something good. And that, unsurprisingly, makes me very happy.
Who Knew That the Slippery Slope Was Real?
Posted: June 26, 2012 Filed under: Education, Opinion | Tags: advocacy, blogging, curriculum, design, education, educational problem, feedback, higher education, in the student's head, motivation, plagiarism, resources, student perspective, thinking, time banking, tools, universal principles of design 1 CommentTake a look at this picture.
One thing you might have noticed, if you’ve looked carefully, is that this man appears to have had some reconstructive surgery on the right side of his face and there is a colour difference, which is slightly accentuated by the lack of beard stubble. What if I were to tell you that this man was offered the chance to have fake stubble tattooed onto that section and, when he declined because he felt strange about it, received a higher level of pressure and, in his words, guilt trip than for any other procedure during the extensive time he spent in hospital receiving skin grafts and burn treatments. Why was the doctor pressuring him?
Because he had already performed the tattooing remediation on two people and needed a third for the paper. In Dan’s words, again, the doctor was a fantastic physician, thoughtful, and he cared but he had a conflict of interest that meant that he moved to a different mode of behaviour. For me, I had to look a couple of times because the asymmetry that the doctor referred to is not that apparent at first glance. Yet the doctor felt compelled, by interests that were not Dan’s, to make Dan self-conscious about the perceived problem.
A friend on Facebook (thanks, Bill!) posted a link to an excellent article in Wired, entitled “Why We Lie, Cheat, Go to Prison and Eat Chocolate Cake” by Dan Ariely, the man pictured above. Dan is a professor of behavioural economics and psychology at Duke and his new book explores the reasons that we lie to each other. I was interested in this because I’m always looking for explanations of student behaviour and I want to understand their motivations. I know that my students will rationalise and do some strange things but, if I’m forewarned, maybe I can construct activities and courses in a way that heads this off at the pass.
There were several points of interest to me. The first was the question of whether a cost/benefit analysis of dishonesty – do something bad, go to prison – actually has the effect that we intend. As Ariely points out, if you talk to the people who got caught, the long-term outcome of their actions was never something that they thought about. He also discusses the notion of someone taking small steps, a little each time, that move them from law abiding, for want of a better word, to dishonest. Rather than set out to do bad things in one giant leap, people tend to take small steps, rationalising each one, and after each step opening up a range of darker and darker options.
Welcome to the slippery slope – beloved argument of rubicund conservative politicians since time immemorial. Except that, in this case, it appears that the slope is piecewise composed of tiny little steps. Yes, each step requires a decision, so there isn’t the momentum that we commonly associate with the slope, but each step, in some sense, leads to larger and larger steps away from the honest place from which you started.
Ariely discusses an experiment where he gave two groups designer sunglasses and told one group that they had the real thing, and the other that they had fakes, and then asked them to complete a test and then gave them a chance to cheat. The people who had been randomly assigned into the ‘fake sunglasses’ group cheated more than the others. Now there are many possible reasons for this. One of them is the idea that if you know that you are signalling your status deceptively to the world, which is Ariely’s argument, you are in a mindset where you have taken a step towards dishonesty. Cheating a little more is an easier step. I can see many interpretations of this, because of the nature of the cheating which is in reporting how many questions you completed on the test, where self-esteem issues caused by being in the ‘fake’ group may lead to you over-promoting yourself in the reporting of your success on the quiz – but it’s still cheating. Ultimately, whatever is motivating people to take that step, the step appears to be easier if you are already inside the dishonest space, even to a degree.
[Note: Previous paragraph was edited slightly after initial publication due to terrible auto-correcting slipping by me. Thanks, Gary!]
Where does something like copying software or illicitly downloading music come into this? Does this constant reminder of your small, well-rationalised, step into low-level lawlessness have any impact on the other decisions that you make? It’s an interesting question because, according to the outline in Ariely’s sunglasses experiment, we would expect it to be more of a problem if the products became part of your projected image. We know that having developed a systematic technological solution for downloading is the first hurdle in terms of achieving downloads but is it also the first hurdle in making steadily less legitimate decisions? I actually have no idea but would be very interested to see some research in this area. I feel it’s too glib to assume a relationship, because it is such a ‘slippery slope’ argument, but Ariely’s work now makes me wonder. Is it possible that, after downloading enough music or software, you could actually rationalise the theft of a car? Especially if you were only ‘borrowing’ it? (Personally, I doubt it because I think that there are several steps in between.) I don’t have a stake in this fight – I have a personal code for behaviour in this sphere that I can live with but I see some benefits in asking and trying to answer these questions from something other than personal experience.
Returning to the article, of particular interest to me was the discussion of an honour code, such as Princeton’s, where students sign a pledge. Ariely sees its benefit as a reminder that is active for some time but, ultimately, it would have little value over several years because, as we’ve already discussed, people rationalise in small increments over the short term rather than constructing long-term models where the pledge would make a difference. Sign a pledge in 2012 and it may just not have any impact on you by the middle of 2012, let alone at the end of 2015 when you’re trying to graduate. Potentially, at almost any cost.
In terms of ongoing reminders, and a signature on a piece of work saying (in effect) “I didn’t cheat”, Ariely asks what happens if you have to sign the honour clause after you’ve finished a test – well, if you’ve finished then any cheating has already occurred, so the honour clause is useless by that point. If you remind people at the start of every assignment, every test, and get them to pledge at the beginning then this should have an impact – a halo effect to an extent, or a reminder of expectation that will make it harder for you to rationalise your dishonesty.
In our school we have an electronic submission system that students are required to use to submit their assignments. It has boilerplate ‘anti-plagiarism’ text and you must accept the conditions to submit. However, this is your final act before submission and you have already finished the code, which falls immediately into the trap mentioned in the previous paragraph. Dan Ariely’s answers have made me think about how we can change this to make it more of an upfront reminder, rather than an ‘after the fact – oh it may be too late now’ auto-accept at the end of the activity. And, yes, reminder structures and behaviour modifiers in time banking are also being reviewed and added in the light of these new ideas.
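As a toy sketch only (the class and method names here are my own invention, not our actual submission system), the change amounts to presenting the pledge when the student first opens the assignment, and refusing to accept work unless that upfront acknowledgement happened – rather than burying the pledge in an auto-accept at the final submit step:

```python
from dataclasses import dataclass, field

PLEDGE = "I will complete this work myself and submit only my own work."


@dataclass
class Assignment:
    """A hypothetical assignment in an upfront-pledge submission workflow."""
    name: str
    pledge_acknowledged: bool = False
    submissions: list = field(default_factory=list)

    def start(self, student: str) -> str:
        # The pledge is shown *before* any work begins, when (per Ariely's
        # argument) it can still influence behaviour, instead of as
        # boilerplate clicked through at the very end.
        self.pledge_acknowledged = True
        return f"{student}, before you begin '{self.name}': {PLEDGE}"

    def submit(self, student: str, work: str) -> bool:
        # Submission is refused unless the pledge was acknowledged up front.
        if not self.pledge_acknowledged:
            raise RuntimeError("Pledge must be acknowledged before starting work.")
        self.submissions.append((student, work))
        return True
```

The design point is simply ordering: the same pledge text, moved from the last interaction to the first, where the reminder-of-expectation effect has a chance to operate.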
The Wired Q&A is very interesting and covers a lot of ground but, realistically, I think I have to go and buy Dan Ariely’s book(s), prepare myself for some harsh reflection and thought, and plan for a long weekend of reading.
Time Banking and Plagiarism: Does “Soul Destroying” Have An Ethical Interpretation?
Posted: June 25, 2012 Filed under: Education | Tags: advocacy, blogging, design, education, educational problem, feedback, higher education, in the student's head, learning, plagiarism, resources, student perspective, teaching, teaching approaches, time banking, tools, work/life balance, workload 4 CommentsYesterday, I wrote a post on the 40 hour week, to give an industrial basis for the notion of time banking, and I talked about the impact of overwork. One of the things I said was:
The crunch is a common feature in many software production facilities and the ability to work such back-breaking and soul-destroying shifts is often seen as a badge of honour or mark of toughness. (Emphasis mine.)
Back-breaking is me being rather overly emphatic regarding the impact of work, although in manual industries workplace accidents caused by fatigue and overwork can and do break backs – and worse – on a regular basis.
But soul-destroying? Am I just saying that someone will perform their tasks as an automaton or zombie, or am I saying something more about the benefit of full cognitive function – the soul as an amalgam of empathy, conscience, consideration and social factors? Well, the answer is that, when I wrote it, I was talking about mindlessness and the removal of the ability to take joy in work, which is on the zombie scale, but as I’ve reflected on the readings more, I am now convinced that there is an ethical dimension to fatigue-related cognitive impairment that is important to talk about. Basically, the more tired you get, the more likely you are to focus narrowly on the task itself, and this can have some serious professional and ethical considerations. I’ll provide a basis for this throughout the rest of this post.
The paper I was discussing, on why Crunch Mode doesn’t work, listed many examples from industry and one very interesting paper from the military. The paper, which had a broken link in the Crunch mode paper, may be found here and is called “Sleep, Sleep Deprivation, and Human Performance in Continuous Operations” by Colonel Gregory Belenky. Now, for those who don’t know, in 1997 I was a commissioned Captain in the Royal Australian Armoured Corps (Reserve), on detachment to the Training Group to set up and pretty much implement a new form of Officer Training for Army Reserve officers in South Australia. Officer training is a very arduous process and places candidates, the few who make it in, under a lot of stress and does so quite deliberately. We have to have some idea that, if terrible things happen and we have to deploy a human being to a war zone, they have at least some chance of being able to function. I had been briefed on most of the issues discussed in Colonel Belenky’s paper but it was only recently that I read through the whole thing.
And, to me today as an educator (I resigned my commission years ago), there are still some very important lessons, guidelines and warnings for all of us involved in the education sector. So stay with me while I discuss some of Belenky’s terminology and background. The first term I want to introduce is droning: the loss of cognitive ability through lack of useful sleep. As Belenky puts it, in the context of US Army Ranger training:
…the candidates can put one foot in front of another and respond if challenged, but have difficulty grasping their situation or acting on their own initiative.
What was most interesting, and may surprise people who have never served with the military, is that the higher the rank, the less sleep people got – and the higher level the formation, the less sleep people got. A Brigadier in charge of a Brigade is going to, on average, get less sleep than the more junior officers in the Brigade and a lot less sleep than a private soldier in a squad. As an officer, my soldiers were fed before me, rested before me and a large part of my day-to-day concern was making sure that they were kept functioning. This keeps on going up the chain and, as you go further up, things get more complex. Sadly, the people shouldering the most complex cognitive functions with the most impact on the overall battlefield are also the people getting the least fuel for their continued cognitive endeavours. They are the most likely to be droning: going about their work in an uninspired way and not really understanding their situation. So here is more evidence from yet another place: lack of sleep and fatigue lead to bad outcomes.
One of the key issues Belenky talks about is the loss of situational awareness caused by the accumulated sleep debt, fatigue and overwork suffered by military personnel. He gives an example of an Artillery Fire Direction Centre – this is where requests for fire support (big guns firing large shells at locations some distance away) come in, and the human plotters take your requests, transform them into instructions that can be given to the gunners, and then firing starts. Let me give you a (to me) chilling extract from the report, which the Crunch Mode paper also quoted:
Throughout the 36 hours, their ability to accurately derive range, bearing, elevation, and charge was unimpaired. However, after circa 24 hours they stopped keeping up their situation map and stopped computing their pre-planned targets immediately upon receipt. They lost situational awareness; they lost their grasp of their place in the operation. They no longer knew where they were relative to friendly and enemy units. They no longer knew what they were firing at. Early in the simulation, when we called for simulated fire on a hospital, etc., the team would check the situation map, appreciate the nature of the target, and refuse the request. Later on in the simulation, without a current situation map, they would fire without hesitation regardless of the nature of the target. (All emphasis mine.)
Here, perhaps, is the first inkling of what I realised I meant by soul destroying. Yes, these soldiers are overworked to the point of droning and are now shuffling towards zombiedom. But, worse, they have no real idea of their place in the world and, perhaps most frighteningly, despite knowing that accidents happen when fire missions are requested and having direct experience of rejecting what would have resulted in accidental hospital strikes, these soldiers have moved to a point of function where the only thing that matters is doing the work and calling the task done. This is an ethical aspect because, from their previous actions, it is quite obvious that there was both a professional and ethical dimension to their job as the custodians of this incredibly destructive weaponry – deprive them of enough sleep and they calculate and fire, no longer having the cognitive ability (or perhaps the will) to be ethical in their delivery. (I realise a number of you will have choked on your coffee slightly at the discussion of military ethics but, in the majority of cases, modern military units have a strong ethical code, even to the point of providing a means for soldiers to refuse to obey illegal orders. Most failures of this system in the military can be traced to failures in a unit’s ethical climate or to undetected instability in the soldiers: much as in the rest of the world.)
The message, once again, is clear. Overwork, fatigue and sleeplessness reduce the ability to perform as you should. Belenky even notes that the ability to benefit from training quite clearly deteriorates as the fatigue levels increase. Work someone hard enough, or let them work themselves hard enough, and not only aren’t they productive, they can’t learn to do anything else.
The notion of situational awareness is important because it’s a measure of your sense of place, in an organisational sense, in a geographical sense, in a relative sense to the people around you and also in a social sense. Get tired enough and you might swear in front of your grandma because your social situational awareness is off. But it’s not just fatigue over time that can do this: overloading someone with enough complex tasks can stress cognitive ability to the point where similar losses of situational awareness can occur.
Helmet fire is a vivid description of what happens when you have too many tasks to do, under highly stressful situations, and you lose your situational awareness. If you are a military pilot flying on instruments alone, especially with low or zero visibility, then you have to follow a set of procedures, while regularly checking the instruments, in order to keep the plane flying correctly. If the number of tasks that you have to carry out gets too high, and you are facing the stress of effectively flying the plane visually blind, then your cognitive load limits will be exceeded and you are now experiencing helmet fire. You are now very unlikely to be making any competent contributions at all at this stage but, worse, you may lose your sense of what you were doing, where you are, what your intentions are, which other aircraft are around you: in other words, you lose situational awareness. At this point, you are now at a greatly increased risk of catastrophic accident.
To summarise, if someone gets tired, stressed or overworked enough, whether acutely or over time, their performance goes downhill, they lose their sense of place and they can’t learn. But what does this have to do with our students?
A while ago I posted thoughts on a triage system for plagiarists – allocating our resources to those students we have the most chance of bringing back to legitimate activity. I identified the three groups as: sloppy (unintentional) plagiarism, deliberate (but desperate and opportunistic) plagiarism and systematic cheating. I think that, from the framework above, we can now see exactly where the majority of my ‘opportunistic’ plagiarists are coming from: sleep-deprived, fatigued and (by their own hands or not) over-worked students losing their sense of place within the course and becoming focused only on the outcome. Here, the sense of place is not just geographical – it is their role in the social and formal contracts that they have entered into with lecturers, other students and their institution, and their place in the agreements for ethical behaviour: doing the work yourself and submitting only that.
If professional soldiers who have received very large amounts of training can forget where their own forces are, sometimes to the tragic extent that they fire upon and destroy them, or become so cognitively impaired that they carry out the mission, and only the mission, with little of their usual professionalism or ethical concern, then it is easy to see how a student can become so task focussed that they start to think only about ending the task, by any means, to reduce the cognitive load and to allow themselves to get the sleep that their body desperately needs.
As always, this does not excuse their actions if they resort to plagiarism and cheating – it explains them. It also provides yet more incentive for us to try and find ways to reach our students and help them form systems for planning and time management that bring them closer to the 40-hour ideal, that reduce the all-nighters and the caffeine binges, and that allow them to maintain full cognitive function as ethical, knowledgeable and professional practitioners.
If we want our students to learn, it appears that (for at least some of them) we first have to help them to marshal their resources more wisely and keep their awareness of exactly where they are, what they are doing and, in a very meaningful sense, who they are.
Time Banking: Aiming for the 40 hour week.
Posted: June 24, 2012 Filed under: Education | Tags: education, educational problem, higher education, in the student's head, learning, measurement, MIKE, principles of design, resources, student perspective, teaching, teaching approaches, time banking, tools, universal principles of design, work/life balance 5 CommentsI was reading an article on MetaFilter on the perception of future leisure from earlier last century and one of the commenters linked to a great article on “Why Crunch Mode Doesn’t Work: Six Lessons” via the International Game Developers Association. This article was partially in response to the quality of life discussions that ensued after ea_spouse outed the lifestyle (LiveJournal link) caused by her spouse’s ludicrous hours working for Electronic Arts, a game company. One of the key quotes from ea_spouse was this:
Now, it seems, is the “real” crunch, the one that the producers of this title so wisely prepared their team for by running them into the ground ahead of time. The current mandatory hours are 9am to 10pm — seven days a week — with the occasional Saturday evening off for good behavior (at 6:30pm). This averages out to an eighty-five hour work week. Complaints that these once more extended hours combined with the team’s existing fatigue would result in a greater number of mistakes made and an even greater amount of wasted energy were ignored.
This is an incredible workload and, as Evan Robinson notes in the “Crunch Mode” article, it is not only incredible but downright stupid, because every serious investigation into the effects of working more than 40 hours a week for extended periods, and of reducing sleep and accumulating a sleep deficit, has come to the same conclusion: hours worked after a certain point are not just worthless, they reduce the worth of hours already worked.
Robinson cites studies and practices from industrialists such as Henry Ford, who reduced shift length to give a 40-hour work week in 1926, attracting huge criticism, because 12 years of research had shown that the shorter work week meant more output, not less. These studies have been going on since the 18th century and well into the 1960s at least, and they all show the same thing: working eight hours a day, five days a week gives you more productivity because you make fewer mistakes, you accumulate less fatigue and you have workers producing during their optimal production times (the first 4-6 hours of work) without sliding into their negatively productive zones.
As Robinson notes, the games industry doesn’t seem to have got the memo. The crunch is a common feature in many software production facilities and the ability to work such back-breaking and soul-destroying shifts is often seen as a badge of honour or mark of toughness. The fact that you can get fired for having the audacity to try and work otherwise also helps a great deal in motivating people to adopt the strategy.
Why spend so many hours in the office? Remember when I said that it’s sometimes hard for people to see what I’m doing because, when I’m thinking or planning, I can look like I’m sitting in the office doing nothing? Imagine what it looks like if, two weeks before a big deadline, someone walks into the office at 5:30pm and everyone’s gone home. What does this look like? Because of our conditioning, which I’ll talk about shortly, it looks like we’ve all decided to put our lives before the work – it looks like less than total commitment.
As a manager, if you can tell everyone above you that you have people at their desks 80+ hours a week and will have for the next three months, then you’re saying that “this work is important and we can’t do any more.” The fact that people were probably only useful for the first 6 hours of every day, and even then only for the first couple of months, doesn’t matter because it’s hard to see what someone is doing if all you focus on is the output. Those 80+ hour weeks are probably only now necessary because everyone is so tired, so overworked and so cognitively impaired, that they are taking 4 times as long to achieve anything.
Yes, that’s right. All the evidence says that more than 2 months of overtime and you would have been better off staying at 40 hours/week in terms of measurable output and quality of productivity.
Robinson lists six lessons, which I’ll summarise here because I want to talk about them in terms of students and why forward planning for assignments is good practice for better smoothing of time management in the future. Here are the six lessons:
- Productivity varies over the course of the workday, with greatest productivity in the first 4-6 hours. After enough hours, you become unproductive and, eventually, destructive in terms of your output.
- Productivity is hard to quantify for knowledge workers.
- Five-day weeks of eight-hour days maximise long-term output in every industry that has been studied in the past century.
- At 60 hours per week, the loss of productivity caused by working longer hours overwhelms the extra hours worked within a couple of months.
- Continuous work reduces cognitive function 25% for every 24 hours. Multiple consecutive overnighters have a severe cumulative effect.
- Error rates climb with hours worked and especially with loss of sleep.
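To see how lessons 4 and 5 interact, here is a toy model of cumulative output – entirely my own sketch, not from Robinson's article, with an assumed 20% loss of efficiency per crunch week chosen purely for illustration – comparing a steady 40-hour week against an 80-hour crunch:

```python
# Toy model of lesson 4: cumulative effective output of an 80-hour crunch
# week versus a steady 40-hour week. The 20% weekly efficiency loss is an
# assumed figure for illustration, not a number from the cited studies.

def cumulative_output(hours_per_week, weekly_efficiency_loss, weeks):
    """Return a list of cumulative effective hours, week by week."""
    totals, total, efficiency = [], 0.0, 1.0
    for _ in range(weeks):
        efficiency *= (1.0 - weekly_efficiency_loss)  # fatigue compounds
        total += hours_per_week * efficiency
        totals.append(total)
    return totals

steady = cumulative_output(40, 0.00, 12)   # rested team: no decay
crunch = cumulative_output(80, 0.20, 12)   # crunching team: 20%/week decay

for week, (s, c) in enumerate(zip(steady, crunch), start=1):
    marker = "  <- crunch now behind" if c < s else ""
    print(f"week {week:2d}: steady {s:6.1f}h  crunch {c:6.1f}h{marker}")
```

Under these invented numbers the crunching team is ahead early but falls behind the steady team within a couple of months, which is exactly the shape of lesson 4; changing the assumed decay rate moves the crossover point but not the conclusion.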
My students have approximately 40 hours of assigned work a week, consisting of contact time and assignments, but many of them never really think about that. Most plan other things around their ‘free time’ (they may need to work, they may play in a band, they may be looking after families or they may have an active social life) and they fit the assignment work and other study into the gaps that are left. Immediately, they will be over the 40-hour mark for work. If they have a part-time job, the three months of one of my semesters will, if not managed correctly, give them a lumpy time schedule alternating between some work and far too much work.
Many of my students don’t know how they are spending their time. They switch on the computer, look at the assignment, Skype, browse, try something, compile, walk away, grab a bite, web surf, try something else – wow, three hours of programming! This assignment is really hard! That’s not all of them but it’s enough of them that we spend time on process awareness: working out what you do so you know how to improve it.
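That kind of process awareness can start with something as simple as tallying a session log. The entries below are invented for illustration: a ‘three-hour programming session’ that, once totalled, turns out to contain only one hour of actual coding.

```python
# A minimal "process awareness" tally: record what you actually did
# during a study session, then total it by activity. The log entries
# (activity, minutes) are hypothetical figures for illustration.
from collections import defaultdict

session_log = [
    ("coding",   25), ("skype",  15), ("browsing", 30),
    ("coding",   20), ("snack",  20), ("browsing", 40),
    ("coding",   15), ("compile-and-stare", 15),
]

totals = defaultdict(int)
for activity, minutes in session_log:
    totals[activity] += minutes

session_length = sum(minutes for _, minutes in session_log)
print(f"session length: {session_length} minutes")
for activity, minutes in sorted(totals.items(), key=lambda t: -t[1]):
    print(f"  {activity:18s} {minutes:3d} min ({100 * minutes // session_length}%)")
```

Even this crude breakdown makes the point: the session was three hours long, but only a third of it was spent programming, which is a far more useful starting point for improvement than “this assignment is really hard.”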
Many of my students see sports drinks, energy drinks and caffeine as a licence to not sleep. It doesn’t work long term as most of us know, for exactly the reasons that long term overwork and sleeplessness don’t work. Stimulants can keep you awake but you will still be carrying most if not all of your cognitive impairment.
Finally, and most importantly, enough of my students don’t realise that everything I’ve said up until now means that they are trying to sit my course with half a brain after about the halfway point, if not sooner if they didn’t rest much between semesters.
I’ve talked about the theoretical basis for time banking and the pedagogical basis for time banking: this is the industrial basis for time banking. One day I hope that at least some of my students will be running parts of their industries and that we have taught them enough about sensible time management and work/life balance that, as people in control of a company, they look at real measures of productivity, they look at all of the masses of data supporting sensible ongoing work rates and that they champion and adopt these practices.
As Robinson says towards the end of the article:
Managers decide to crunch because they want to be able to tell their bosses “I did everything I could.” They crunch because they value the butts in the chairs more than the brains creating games. They crunch because they haven’t really thought about the job being done or the people doing it. They crunch because they have learned only the importance of appearing to do their best instead of really doing their best. And they crunch because, back when they were programmers or artists or testers or assistant producers or associate producers, that was the way they were taught to get things done. (Emphasis mine.)
If my students can see all of their requirements ahead of time, know what is expected, have been given enough process awareness, and have the will and the skill to undertake the activities, then we can potentially teach them a better way to get things done if we focus on time management in a self-regulated framework, rather than imposed deadlines in a rigid authority-based framework. Of course, I still have a lot of work to do to demonstrate that this will work but, from industrial experience, we have yet another very good reason to try.




