Musing on Industrial Time

Now Print, Black, Linocut, (C) Nick Falkner, 2013

I caught up with a good friend recently and we were discussing the nature of time. She had stepped back from her job and was now spending a lot of her time with her new-born son. I have gone to working three days a week, hence have also stepped back from the five-day grind. It was interesting to talk about how this change to our routines had changed the way that we thought of and used time. She used a term that I wanted to discuss here, which was industrial time, to describe the clock-watching time of the full-time worker. This is part of the larger area of time discipline, how our society reacts to and uses time, and is really quite interesting. Both of us had stopped worrying about the flow of time in measurable hours on certain days and we just did things until we ran out of day. This is a very different activity from the usual “do X now, do Y in 15 minutes’ time” that often consumes us. In my case, it took me about three months of considered thought and re-training to break the time discipline habits of thirty years. In her case, she has a small child to help her to refocus her time sense on the now.

Modern time-sense is so pervasive that we often don’t think about some of the underpinnings of our society. It is easy to understand why we have years and, although they don’t line up properly, months, given that these can be matched to astronomical phenomena that have an effect on our world (seasons and tides, length of day and moonlight, to list a few). Days are simple because that’s one light/dark cycle. But why are there 52 weeks in a year? Why are there 7 days in a week? Why did the 5-day week emerge as a contiguous block of 5 days? What is so special about working 9am to 5pm?

A lot of modern time descends from the struggle of radicals and unionists to protect workers from the excesses of labour, to stop people being worked to death, and the notion of the 8-hour day is an understandable division of a 24-hour day into three even chunks for work, rest and leisure. (Goodness, I sound like I’m trying to sell you chocolate!)

If we start to look, it turns out that the 7-day week is there because it’s there, based on religion and tradition. Interestingly enough, there have been experiments with other week lengths but it appears hard to shift people who are used to a certain routine and, tellingly, making people wait longer for days off appears to be detrimental to adoption.

If we look at seasons and agriculture, then there is a time to sow, to grow, to harvest and to clear, much as there is a time for livestock to breed and to be raised for purpose. If we look to the changing time of sunrise and sunset, there is a time at which natural light is available and when it is not. But, from a time discipline perspective, these time systems are not enough to be able to build a large-scale, industrial and synchronised society upon – we must replace a distributed, loose and collective notion of what time is with one that is centralised, authoritarian and singular. While religious ceremonies linked to seasonal and astronomical events did provide time-keeping on a large scale prior to the industrial revolution, precise time, accurate to hours and minutes, was not possible and, generally, not required beyond those cues given by nature such as dawn, noon, dusk and so on.

After the industrial revolution, industries and forms of work developed that were heavily separated from any natural linkage – there are no seasons for a coal mine or a steam engine – and the development of the clock and reinforcement of the calendar of work allowed both the measurement of working hours (for payment) and the determination of deadlines, given that natural forces did not have to be considered to the same degree. Steam engines are completed; they have no need to ripen.

With the notion of fixed and named hours, and enough tools for measuring the flow of time, we can very easily determine if someone is late. But this is, very much, the notion of time that we use to determine when a task must be completed, rather than an approach that accepts that the task will be completed at some point within a more general span of time.

We still have confusion where our understanding of “real measures” such as days, interact with time discipline. Is midnight on the 3rd of April the second after the last moment of April the 2nd or the second before the first moment of April the 4th? Is midnight 12:00pm or 12:00am? (There are well-defined answers to this but the nature of the intersection is such that definitions have to be made.)

But let’s look at teaching for a moment. One of the great criticisms of educational assessment is that we confuse timeliness, and in this case we specifically mean an adherence to meeting time discipline deadlines, with achievement. Completing the work a crucial hour after it is due can lead to that work potentially not being marked at all, or being rejected. We do usually have over-riding reasons for doing this but, sadly, these reasons are as artificial as the deadlines we impose. Why is an Engineering Degree a four-year degree? If we changed it to six, would we get better engineers? If we switched to competency-based training, modular learning and life-long learning, would we get more people who were qualified or experienced with engineering? Would we get fewer? What would happen if we switched to a 3/1/2/1 working week? Would things be better or worse? It’s hard to evaluate because the week, and the contiguous working week, are so much a part of our world that I imagine that today is the first day that some of you have thought about it.

Back to education and, right now, we count time for our students because we have to work out bills and close off accounts at the end of the financial year, which means we have to meet marking and award deadlines, then we have to project our budget, which is yearly, and fit that into accredited degree structures, which have year guidelines…

But I cannot give you a sound, scientific justification for any of what I just wrote. We do all of that because we are caught up in industrial time first and we convince ourselves that building things into that makes sense. Students do have ebb and flow. Students are happier on certain days than others. Transition issues on entry to University are another indicator that students develop and mature at different rates – why are we still applying industrial time from top to bottom when everything we see here says that it’s going to cause issues?

Oh, yes, the “real world” uses it. Except that regular studies of industrial practice show that 40-hour weeks, regular days off, working from home and so on are more productive than the burn-out, everything-late rush that we consider to be the signs of drive. (If Henry Ford thought that making people work more than 40 hours a week was bad for business, he’s worth listening to.) And that’s before we factor in the development of machines that will replace vast numbers of human jobs in the next 20 years.

I have a different approach. Why aren’t we looking at students more like we regard our grape vines? We plan, we nurture, we develop, we test, we slowly build them to the point where they can produce great things and then we sustain them for a fruitful and long life. When you plant grape vines, you expect a first reasonable crop level in three years, and commercial levels at five. Tellingly, the investment pattern for grapes is that it takes you 10 years to break even and then you start making money back. I can’t tell you how some of my students will turn out until 15-25 years down the track and it’s insanity to think you can base retrospective funding on that timeframe.

You can’t make your grapes better by telling them to be fruitful in two years. Some vines take longer than others. You can’t even tell them when to fruit (although you can trick them a little). Yet, somehow, we’ve managed to work around this to produce a local wine industry worth around $5 billion. We can work with variation and seasonal issues.

One of the reasons I’m so keen on MOOCs is that these can fit in with the routines of people who can’t dedicate themselves to full-time study at the moment. By placing well-presented, pedagogically-sound materials on-line, we break through the tyranny of the 9-5, 5-day work week and let people study when they are ready to, where they are ready to, for as long as they’re ready to. Like to watch lectures at 1am, hanging upside down? Go for it – as long as you’re learning and not just running the video in the background while you do crunches, of course!

Once you start to question why we have so many days in a week, you quickly start to wonder why we get so caught up on something so artificial. The simple answer is that, much like money, we have it because we have it. Perhaps it’s time to look at our educational system to see if we can do something that would be better suited to developing really good knowledge in our students, instead of making them adept at sliding work under our noses a second before it’s due. We are developing systems and technologies that can allow us to step outside of these structures and this is, I believe, going to be better for everyone in the process.

Conformity isn’t knowledge, and conformity to time just because we’ve always done that is something we should really stop and have a look at.


That’s not the smell of success, your brain is on fire.

Would you mind putting out the hippocampus when you have a chance?

I’ve written before about the issues of prolonged human workload leading to ethical problems and the fact that working more than 40 hours a week on a regular basis is downright unproductive because you get less efficient and error-prone. This is not some 1968 French student revolutionary musing on what benefits the soul of a true human, this is industrial research by Henry Ford and the U.S. Army, neither of whom could be classified as Foucault-worshipping Situationist yurt-dwelling flower children, that shows that there are limits to how long you can work in a sustained weekly pattern and get useful things done, while maintaining your awareness of the world around you.

The myth won’t die, sadly, because physical presence and hours attending work are very easy to measure, while productive outputs and their origins in a useful process on a personal or group basis are much harder to measure. A cynic might note that the people who are around when there is credit to take may end up being the people who (reluctantly, of course) take the credit. But we know that it’s rubbish. And the people who’ve confirmed this are both philosophers and the commercial sector. One day, perhaps.

But anyone who has studied cognitive load issues, the way that the human thinking processes perform as they work and are stressed, will be aware that we have a finite amount of working memory. We can really only track so many things at one time and when we exceed that, we get issues like the helmet fire that I refer to in the first linked piece, where you can’t perform any task efficiently and you lose track of where you are.

So what about multi-tasking?

Ready for this?

We don’t.

There’s a ton of research on this but I’m going to link you to a recent article by Daniel Levitin in the Guardian Q&A. The article covers the fact that what we are really doing is switching quickly from one task to another, dumping one set of information from working memory and loading in another, which of course means that working on two things at once is less efficient than doing two things one after the other.

But it’s more poisonous than that. The sensation of multi-tasking is actually quite rewarding as we get a regular burst of the “oooh, shiny” rewards our brain gives us for finding something new and we enter a heightened state of task readiness (fight or flight) that also can make us feel, for want of a better word, more alive. But we’re burning up the brain’s fuel at a fearsome rate to be less efficient so we’re going to tire more quickly.

Get the idea? Multi-tasking is horribly inefficient task switching that feels good but makes us tired faster and does things less well. But when we achieve tiny tasks in this death spiral of activity, like replying to an e-mail, we get a burst of reward hormones. So if your multi-tasking includes something like checking e-mails when they come in, you’re going to get more and more distracted by that, to the detriment of every other task. But you’re going to keep doing them because multi-tasking.
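If you like to see this as a toy model, here is a minimal sketch in Python. The task lengths, slice sizes and switch costs are invented for illustration (they are not taken from Levitin’s research); the point is only that every swap pays a working-memory reload cost, so the interleaved version always finishes later than doing one thing at a time.

```python
# Toy model of task switching, purely illustrative (the numbers are made up):
# each switch between tasks costs extra time to reload working memory,
# so interleaving finishes later than doing the tasks one after the other.

def sequential(tasks):
    """Total time if each task is done to completion before the next starts."""
    return sum(tasks)

def interleaved(tasks, slice_size=5, switch_cost=2):
    """Total time if we round-robin in small slices, paying a reload cost
    every time we swap to a different task."""
    remaining = list(tasks)
    total = 0
    current = None
    while any(r > 0 for r in remaining):
        for i, r in enumerate(remaining):
            if r <= 0:
                continue
            if current is not None and current != i:
                total += switch_cost        # dump and reload working memory
            work = min(slice_size, r)
            remaining[i] -= work
            total += work
            current = i
    return total

tasks = [30, 30]                            # two 30-minute tasks
print(sequential(tasks))                    # 60: one after the other
print(interleaved(tasks))                   # 82: the same work plus switch costs
```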

I regularly get told, by parents, that their children are able to multi-task really well. They can do X, watch TV, do Y and it’s amazing. Well, your children are my students and everything I’ve seen confirms what the research tells me – no, they can’t but they can give a convincing impression when asked. When you dig into what gets produced, it’s a different story. If someone sits down and does the work as a single task, it will take them a shorter time and they will do a better job than if they juggle five things. The five things will take more than five times as long (up to 10, which really blows out time estimation) and will not be done as well, nor will the students learn about the work in the right way. (You can actually sabotage long term storage by multi-tasking in the wrong way.) The most successful study groups around the Uni are small, focused groups that stay on one task until it’s done and then move on. The ones with music and no focus will be sitting there for hours after the others are gone. Fun? Yes. Efficient? No. And most of my students need to be at least reasonably efficient to get everything done. Have some fun but try to get all the work done too – it’s educational, I hear. 🙂

It’s really not a surprise that we haven’t changed humanity in one or two generations. Our brains are just not built in a way that can (yet) provide assistance with the quite large amount of work required to perform multi-tasking.

We can handle multiple tasks, no doubt at all, but we’ve just got to make sure, for our own well-being and overall ability to complete the task, that we don’t fall into the attractive, but deceptive, trap that we are some sort of parallel supercomputer.


5 Things I Would Like My Students to Be Able to Perceive

Our students will go out into the world and will be exposed to many things but, if we have done our job well, then they will not just be pushed around by the pressure of the events that they witness, but they will be able to hold their ground and perceive what is really going on, to place their own stamp on the world.

Balance is one of the most useful outcomes of valid perception.

I don’t tell my students how to think, although I know that it’s a commonly held belief that everyone at a Uni tries to shape the political and developmental thought of their students; I just try to get them to think. This is probably going to have the side effect of making them thoughtful, potentially even critical of things that don’t make sense, and I realise that this is something that not everybody wants from junior citizens. But that’s my job.

Here is a list of five things that I think I’d like a thoughtful person to be able to perceive. It’s not the definitive five or the perfect five but these are the ones that I have today.

  1. It would be nice if people were able to reliably tell the difference between 1/3 and 1/4 and understand that 1/3 is larger than 1/4. Being able to work out the odds of things (how likely they are) requires you to be able to look at two things that are smaller than one and get them in the right order so you can say “this is more likely than that”. Working on percentages can make it easier but this requires people to do division, rather than just counting things and showing the fraction. (There’s a small sketch of this comparison after the list.) But I’d like my students to be able to perceive how this can be a fundamental misunderstanding that means that some people can genuinely look at comparative probabilities and not be able to work out that this simple mathematical comparison is valid. And I’d like them to be able to think about how to communicate this to help people understand.
  2. A perceptive person would be able to spot when something isn’t free. There are many people who go into casinos and have a lot of fun gambling, eating very cheap or unlimited food, staying in cheap hotels and think about what a great deal it is. However, every game you play in a casino is designed so that casinos do not make a loss – but rather than just saying “of course”, we need to realise that casinos can offer “unlimited buffet shrimp” and “cheap luxury rooms” and “free luxury for whales” because they are making so much money. Nothing in a casino is free. It is paid for by the people who lose money there. This is not, of course, to say that you shouldn’t go and gamble if you’re an adult and you want to, but it’s to be able to see and clearly understand that everything around you is being paid for, if not in a way that is transparently direct. There are enough people who suffer from the gambler’s fallacy to put this item on the list.
  3. A perceptive person would have a sense of proportion. They would not start issuing death threats in an argument over operating systems (or ever, preferably) and they would not consign discussions of human rights to amusing after-dinner conversation, as if this was something to be played with.
  4. A perceptive person would understand the need to temper the message to suit the environment, while still maintaining their own ethical code regarding truth and speaking up. You don’t need to tell a 3-year-old that their painting is awful any more than you need to humiliate a colleague in public for not knowing something that you know. If anything, it makes the time when you do deliver the message bluntly much more powerful.
  5. Finally, a perceptive person would be able to at least try to look at life through someone else’s eyes and understand that perception shapes our reality. How we appear to other people is far more likely to dictate their reaction than who we really are. If you can’t change the way you look at the world then you risk getting caught up on your own presumptions and you can make a real fool of yourself by saying things that everyone else knows aren’t true.
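As promised in item 1, here is a minimal sketch of that comparison in Python. Exact fractions keep the “4 is bigger than 3, so 1/4 must be bigger” trap from ever getting a look in:

```python
# Illustrative only: the comparison described in item 1, done with exact
# fractions rather than intuition about which denominator looks bigger.
from fractions import Fraction

a = Fraction(1, 3)
b = Fraction(1, 4)

print(a > b)               # True: a 1-in-3 chance is more likely than 1-in-4
print(float(a), float(b))  # 0.333... vs 0.25 – the same fact as percentages
```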

There’s so much more and I’m sure everyone has their own list but it’s, as always, something to think about.


5 Things: Necessary Assumptions of Truth

I’m (still) in the middle of writing a large summary of my thoughts on education and how we can develop a better way to provide education to as many students as possible. Unsurprisingly, this is a large undertaking and I’m expecting that the final document will be interesting and fairly controversial. I suspect that one of the major problems will stem from things that I believe we have to assume are true. Now this is always challenging, especially where evidence is lacking, but the reason that I present for some of these things to be held as true is that, if we hold them as false, then we make them false as a self-fulfilling prophecy. This may not be purely because of our theoretical framework but it may be because of what we do in implementation when we implicitly declare that something no longer needs to be worried about.

I am looking to build a better Machine for Education but such a thing is always built on the assumption that better is something that you can achieve.

"Machine". Mono print on lino with wooden tools. (C) Nick Falkner, 2014

“Machine”. Mono print on lino with wooden tools. (C) Nick Falkner, 2014

The reason for making these assumptions of truth is very simple. When I speak of a “Machine for Education”, I am not moving towards some cyberpunk dystopian future, I am recognising that we are already all embedded inside a framework that turns human energy into educational activity, it’s just that the current machine places stress upon its human components, rather than taking the strain in its mechanical/procedural/technological elements. An aeroplane is a machine for flying and it works because it does not require constant human physical effort simply to keep it in the air. We have replaced the flapping wings of early designs with engines, hydraulics, computers and metal. The reason an aeroplane is a good machine is because the stress is taken on the machine itself, which can take it, with sensible constructions of human elements around it that make it a manageable occupation. (When we place airline workers under undue stress, we see the effect on the machine through reduced efficiency in maintenance and decision making, so this isn’t a perfect system.) Similarly, the development of the driverless car is a recognition of two key facts: firstly, that most cars spend most of their time not being driven and, secondly, that the activity of driving for many people is a chore that is neither enjoyable nor efficiently productive. The car is a good machine where most of the wear happens in the machine but we can make it better as a transport device by further removing the human being as a weak point, as a stress accumulator and as a part of the machine that gets worn down but is not easy to repair or rebuild. We also make the machine more efficient by potentially reducing the number required, given the known usage patterns. (Ultimately, the driverless car is the ultimate micro-light urban transit system.)

So what are these assumptions of truth?

  1. That our educational system can always be improved and, hence, is ready for improvement now.

    It has always surprised me when some people look at dull and lifeless chalk-and-talk, based on notes from 20 years ago, and see no need for improvement, instead suggesting punitive measures to force students to sit and pretend to listen. We have more evidence from research as to what works than we have ever had before and, in conjunction with centuries of careful thought, have a great opportunity to make change.

  2. That everyone on the planet can benefit from an improved educational system.

    Yes, this means that you have to assume that, one day, we could reach everyone on the planet. We cannot assume that a certain group can be ignored and then move on. This, of course, doesn’t mean that it all has to happen tomorrow but it does mean that any planning for extending our systems must have the potential to reach everyone in the country of origin and, by extension, when we have every country, we have the world.

  3. That an educational system can develop students in terms of depth of knowledge and skills but also in terms of their scholarship, breadth of knowledge, and range of skills.

    We currently focus heavily on training for quite narrowly specified professions in the general case and we do this to the detriment of developing the student as a scholar, as a designer, as a thinker, as a philosopher, as an artist and as a citizen. This will vary from person to person but a rich educational grounding is the foundation for better things in later life, more flexibility in work and the potential for more creativity and autonomy in leisure. Ultimately, we want our graduates to be as free to create as they are to consume, rather than consigning them to work in tight constraint.

  4. That we can construct environments where all students can legitimately demonstrate that they have achieved the goals of the course.

    This is a very challenging one so I’ve worded it carefully. I have a problem with curve grading, as everyone probably knows, and it really bothers me that someone can fail because someone else passed. I also think that most of our constraints are highly artificial and they are in place because this is what we did before. If we start from the assumption that we can construct a system where everyone can legitimately pass then we change the nature of the system we build.

  5. That all outcomes in an educational system can be the combination of personal actions and systemic actions, thus all outcomes must be perceived and solutions developed through both lenses.

    So students are handing in their work late? This assumption requires us to look across all of their activity to work out why this is happening. This behaviour may have been set in place earlier on in their educational career so this is a combination of the student activity triggers of value, motivation and instrumentality and a feedback system that is part of an earlier component of the educational system. This does not absolve the student of questionable practices or ‘anti-educational’ behaviour but it requires us to not immediately assume that they are a ‘bad student’ as an easy out.

Ultimately, these are just some of the things I’m looking at and I’m sure that there will be discussion in the comments, but I have set these out to stop the shortcut thinking that does not lead to a solution because it pushes the problem to a space where it does not have to be solved. If we start from the assumption of no bad students then we have to collect actual evidence to the contrary, evidence that survives analysis and peer review, to locate where the help needs to be given. And this is very much my focus – support and help to bring people back to a positive educational experience. It’s too easy to assume things are false when it makes the job easier, albeit a very human response from an over-worked sector. I think it’s time to plant some flags of assumed truths to change the way we talk and think about these things.


Ending the Milling Mindset

This is the second in a set of posts that are critical of current approaches to education. In this post, I’m going to extend the idea of rejecting an industrial revolutionary model of student production and match our new model for manufacturing, additive processes, to a new way to produce students. (I note that this is already happening in a number of places, so I’m not claiming some sort of amazing vision here, but I wanted to share the idea more widely.)

Traditional statistics is often taught with an example where you try to estimate how well a manufacturing machine is performing by measuring its outputs. You determine the mean and variation of the output and then use some solid calculations to determine whether the machine is going to produce a sufficient number of accurately made widgets to keep your employers at WidgetCo happy. This is an important measure for things such as getting the weight right across a number of bags of rice or correctly producing bottles that hold the correct volume of wine. (Consumers get cranky if some bags are relatively empty or they have lost a glass of wine due to fill variations.)

If we are measuring this ‘fill’ variation, then we are going to expect deviation from the mean in two directions: too empty and too full. Very few customers are going to complain about too much but the size of the variation can rarely be constrained in just one direction, so we need to limit how widely that fill needle swings. Obviously, it is better to be slightly too full (on average) than too empty (on average) although if we are too generous then the producer loses money. Oh, money, how you make us think in such scrubby, little ways.
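For the curious, here is a minimal sketch of that fill check in Python. The target, limits and sample fills are invented for illustration; the point is only that the mean, the spread and the out-of-tolerance count are exactly what the single-pass line measures.

```python
# A minimal sketch of the textbook 'fill' check described above. The target,
# tolerance and sample values are invented for illustration.
import statistics

target = 750.0                      # nominal fill volume (mL)
lower, upper = 745.0, 760.0         # acceptable range: slightly generous above

fills = [749.2, 751.0, 748.7, 743.5, 752.3, 750.4, 747.9, 753.1]

mean = statistics.mean(fills)
stdev = statistics.stdev(fills)

print(f"mean fill   = {mean:.2f} mL (target {target} mL)")
print(f"stdev       = {stdev:.2f} mL")

# Units outside the limits are the 'rejects' of the single-pass line.
rejects = [f for f in fills if not (lower <= f <= upper)]
print(f"reject rate = {len(rejects) / len(fills):.0%}")
```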

When it comes to producing items, rather than filling, we often use a machine milling approach, where a block of something is etched away through mechanical or chemical processes until we are left with what we want. Here, our tolerance for variation will be set based on the accuracy of our mill to reproduce the template.

In both the fill and the mill cases, imagine a production line that travels on a single pass through loading, activity (fill/mill) and then measurement to determine how well this unit conforms to the desired level. What happens to those items that don’t meet requirements? Well, if we catch them early enough then, if it’s cost effective, we can empty the filled items back into a central store and pass them through again – but this is wasteful in terms of cost and energy, not to mention that contents may not be able to be removed and then put back in again. In the milling case, the most likely deviance is that we’ve got the milling process wrong and taken away things in the wrong place or to the wrong extent. Realistically, while some cases of recycling the rejects can occur, a lot of rejected product is thrown away.

If we run our students as if they are on a production line along these lines then, totally unsurprisingly, we start to set up a nice little reject pile of our own. The students have a single pass through a set of assignments, often without the ability to go and retake a particular learning activity. If they fail enough of these tests, then they don’t meet our requirements and they are rejected from that course. Now some students will over-perform against our expectations and, as one small positive, they will then be recognised as students of distinction and not rejected. However, if we consider our student failure rate to reflect our production wastage, then failure rates of 20% or higher start to look a little… inefficient. These failure rates are only economically manageable (let us switch off our ethical brains for a moment) if we have enough students or they are considered sufficiently cheap that we can produce at 80% and still make money. (While some production lines would be crippled by a 10% failure rate, for something like electric drive trains for cars, there are some small and cheap items where there is a high failure rate but the costing model allows the business to stay economical.) Let us be honest – every University in the world is now concerned with their retention and progression rates, which is the official way of saying that we want students to stay in our degrees and pass our courses. Maybe the single-pass industrial line model is not the best one.
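A back-of-the-envelope calculation, with invented prices and volumes, shows why that only works for cheap items: the same 20% reject rate that leaves a cheap widget line comfortably profitable pushes an expensive, low-volume line into a loss.

```python
# Invented numbers, purely illustrative: profit when rejects cost full
# production price but earn nothing.

def profit(units, unit_cost, sale_price, reject_rate):
    good = units * (1 - reject_rate)
    return good * sale_price - units * unit_cost

for reject_rate in (0.0, 0.2):
    cheap = profit(units=100_000, unit_cost=1.0, sale_price=2.0,
                   reject_rate=reject_rate)
    pricey = profit(units=1_000, unit_cost=5_000.0, sale_price=6_000.0,
                    reject_rate=reject_rate)
    print(f"reject rate {reject_rate:.0%}: cheap line {cheap:+,.0f}, "
          f"expensive line {pricey:+,.0f}")
# reject rate 0%:  cheap line +100,000, expensive line +1,000,000
# reject rate 20%: cheap line +60,000,  expensive line -200,000
```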

Why carve back to try to reveal people, when we could build people up instead?

Enter the additive model, via the world of 3D printing. 3D printing works by laying down the material from scratch and producing something where there is no wastage of material. Each item is produced as a single item, from the ground up. In this case, problems can still occur. The initial track of plastic/metal/material may not adhere to the plate and this means that the item doesn’t have a solid base. However, we can observe this and stop printing as soon as we realise this is occurring. Then we try again, perhaps using a slightly different approach to get the base to stick. In student terms, this is poor transition from the school environment, because nothing is sticking to the established base! Perhaps the most important idea, especially as we develop 3D printing techniques that don’t require us to deposit in sequential layers but instead allow us to create points in space, is that we can identify those areas where a student is incomplete and then build up that area.

In an additive model, we identify a deficiency in order to correct rather than to reject. The growing area of learning analytics gives us the ability to more closely monitor where a student has a deficiency of knowledge or practice. However, such identification is useless unless we then act to address it. Here, a small failure has become something that we use to make things better, rather than a small indicator of the inescapable fate of failure later on. We can still identify those students who are excelling but, now, instead of just patting them on the back, we can build them up in additional interesting ways, should they wish to engage. We can stop them getting bored by altering the challenge as, if we can target knowledge deficiency and address that, then we must be able to identify extension areas as well – using the same analytics and response techniques.
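To make that concrete, here is a small sketch of an additive-style response, with invented topics and thresholds. It is not a real learning analytics system, just the shape of the decision: per-topic scores become remediation or extension actions rather than a single pass/fail cut.

```python
# A sketch of the 'additive' response described above, with invented thresholds:
# instead of a single pass/fail cut, per-topic scores are turned into
# remediation and extension actions.

def plan_response(scores, deficit_below=0.6, extend_above=0.85):
    """Map topic -> action for one student's mastery scores (0.0 to 1.0)."""
    plan = {}
    for topic, score in scores.items():
        if score < deficit_below:
            plan[topic] = "remediate"       # build up the weak area
        elif score > extend_above:
            plan[topic] = "extend"          # offer extension work, avoid boredom
        else:
            plan[topic] = "continue"
    return plan

student = {"recursion": 0.45, "loops": 0.9, "testing": 0.7}
print(plan_response(student))
# {'recursion': 'remediate', 'loops': 'extend', 'testing': 'continue'}
```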

Additive manufacturing is going to change the way the world works because we no longer need to carve out what we want, we can build what we want, on demand, and stop when it’s done, rather than lamenting a big pile of wood shavings that never amounted to a table leg. A constructive educational focus rejects high failure rates as being indicative of missed opportunities to address knowledge deficiencies and focuses on a deep knowledge of the student to help the student to build themselves up. This does not make a course simpler or drop the quality, it merely reduces unnecessary (and uneconomical) wastage. There is as much room for excellence in an additive educational framework – if anything, you should get more out of your high achievers.

We stand at a very interesting point in history. It is time to revisit what we are doing and think about what we can learn from the other changes going on in the world, especially if it is going to lead to better educational results.


Thoughts on the colonising effect of education.

This is going to be longer than usual but these thoughts have been running around in my mind for a while and, rather than break them up, I thought I’d put them all together here. My apologies for the long read but, to help you, here’s the executive summary. Firstly, we’re not going to get anywhere until all of us truly accept that University students are not some sort of different species but that they are actually junior versions of ourselves – not inferior, just less advanced. Secondly, education is heavily colonising but what we often tend to pass on to our students are mechanisms for conformity rather than the important aspects of knowledge, creativity and confidence.

Let me start with some background and look at the primary and secondary schooling system. There is what we often refer to as traditional education: a classroom full of students sitting in rows, writing down the words spoken by the person at the front. Assignments test your ability to learn and repeat the words and apply them in well-defined ways to a set of problems. Then we have progressive education that, depending upon your socio-political alignment and philosophical bent, is either a way of engaging students and teachers in the process for better outcomes, more critical thought and a higher degree of creativity; or it is cats and dogs lying down together, panic in the streets, a descent into radicalism and anarchy. (There is, of course, a middle ground, where the cats and dogs sleep in different spots, in rows, but engage in discussions of Foucault.) Dewey wrote on the tension between these two approaches (seriously, is there anything he didn’t write on?) but, as we know, he was highly opposed to the lining up of students in ranks, like some sort of prison, so let’s examine why.

Simply put, the traditional model is an excellent way to prepare students for factory work but it’s not a great way to prepare them for a job that requires independence or creativity. You sit at your desk, the teacher reads out the instructions, you copy down the instructions, you are assigned piece work to do, you follow the instructions, your work is assessed to determine if it is acceptable, if not, you may have to redo it or it is just rejected. If enough of your work is deemed acceptable, then you are now a successful widget and may take your place in the community. Of course, it will help if your job is very similar to this. However, if your deviation from the norm is towards the unacceptable side then you may not be able to graduate until you conform.

Now, you might be able to argue this on accuracy, were it not for the constraining behavioural overtones in all of this. It’s not about doing the work, it’s about doing the work, quietly, while sitting for long stretches, without complaint and then handing back work that you had no part in defining for someone else to tell you what is acceptable. A pure model of this form cripples independence because there is no scope for independent creation as it must, by definition, deviate and thus be unacceptable.

Progressive models change this. They break up the structure of the classroom, change the way that work is assigned and, in many cases, change the power relationship between student and teacher. The teacher is still authoritative in terms of information but can potentially handle some (controlled for societal reasons) deviation and creativity from their student groups.

The great sad truth of University is that we have a lot more ability to be progressive because we don’t have to worry about severe behavioural issues: there is enough traditional education going on below these levels (or too few management resources for children in need) that it is highly unlikely that students with such issues will graduate from high school, let alone make it to University with the requisite grades.

But let’s return to the term ‘colonising’, because it is a loaded term. We colonise when we send a group of settlers to a new place and attempt to assert control over it; often implicit in this is the notion that the place we have colonised is now for our own use. Ultimately, those being colonised can fight or they can assimilate. The most likely outcome if the original inhabitants fight is that they are destroyed, if those colonising are technologically superior or greatly outnumber them. Far more likely, and as seen all around the world, is the requirement for the original inhabitants to be assimilated into the now dominant colonist culture. Under assimilation, original cultures shrink to accommodate new rules, requirements, and taboos from the colonists.

In the case of education, students come to a University in order to obtain the benefits of the University culture so they are seeking to be colonised by the rules and values of the University. But it’s very important to realise that any positive colonisation value (and this is a very rare case, it’s worth noting) comes with a large number of negatives. If students come from a non-Western pedagogical tradition, then many requirements at Universities in Australia, the UK and America will be at odds with the way that they have learned previously, whether it’s power distances, collectivism/individualism issues or even in the way that work is going to be assigned and assessed. If students come from a highly traditional educational background, then they will struggle if we break up the desks and expect them to be independent and creative. Their previous experiences define their educational culture and we would expect the same tensions between colonist and coloniser as we would see in any encounter in the past.

I recently purchased a game called “Dog Eat Dog”, which is a game designed to allow you to explore the difficult power dynamics of the colonist/colonised relationship in the Pacific. Liam Burke, the author, is a second-generation half-Filipino who grew up in Hawaii and he developed the game while thinking about his experiences growing up and drawing on other resources from the local Filipino community.

The game is very simple. You have a number of players. One will play the colonist forces (all of them). Each other player will play a native. How do you select the colonist? Well, it’s a simple question: Which player at the table is the richest?

As you can tell, the game starts in uncomfortable territory and, from that point on, it can be very challenging as the native players will try to run small scenarios that the colonist will continually interrupt, redirect and adjudicate to see how well the natives are playing by the colonist’s rules. And the first rule is:

The (Native people) are inferior to the (Occupation people).

After every scenario, more rules are added and the native population can either conform (for which they are rewarded) or deviate (for which they are punished). The colonist actually has the ability to kill all the natives in the first turn, should they wish to do so; this happened often enough in history that Burke left it in the rules. At the end of the game, the colonists may be rebuffed but, in order to do that, the natives have become adept at following the rules and this is, of course, at the expense of their own culture.

This is a difficult game to explain in the short form but the PDF is only $10 and I think it’s an important read for just about anyone. It’s a short rule book, with a quick history of Pacific settlement and exemplars, produced from a successful Kickstarter.

Let’s move this into the educational sphere. It would be delightful if I couldn’t say this but, let’s be honest, our entire system is often built upon the premise that:

The students are inferior to the teachers.

Let’s play this out in a traditional model. Every time the students get together in order to do anything, we are there to assess how well they are following the rules. If they behave, they get grades (progress towards graduation). If they don’t conform, then they don’t progress and, because everyone has finite resources, eventually they will drop out, possibly doing something disastrous in the process. (In the original game, the native population can run amok if they are punished too much, which has far too many unpleasant historical precedents.) Every time that we have an encounter with the students, they have to come up with a rule to make sure that they can’t make the same mistake again. This new rule is one that they’re judged against.

When I realised how close a parallel this was, a very cold shiver went down my spine. But I also realised how much I’d been doing to break out of this system, by treating students as equals with mutual respect, by listening and trying to be more flexible, by interpreting a more rigid pedagogical structure through filters that met everyone’s requirements. But unless I change the system, I am merely one of the “good” overseers on a penal plantation. When the students leave my care, if I know they are being treated badly, I am still culpable.

As I started with, valuing knowledge, accuracy,  being productive (in an academic sense), being curious and being creative are all things that we should be passing on from our culture but these are very hard things to pass on with a punishment/reward modality as they are all cognitive in aspect. What is far easier to do is to pass on culture such as sitting silently, being bound by late penalties, conformity to the rules and the worst excesses of the Banking model of education (after Freire) where students are empty receiving objects that we, as teachers, fill up. There is no agency in such a model, nor room for creativity. The jug does not choose the liquid that fills it.

It is easy to see examples all around us of the level of disrespect levelled at colonised peoples, from the mindless (and well-repudiated) nonsense spouted in Australian newspapers about Aboriginal people to the racist stereotyping that persists despite the overwhelming evidence of equality between races and genders. It is also as easy to see how badly students can be treated by some staff. When we write off a group of students because they are ‘bad students’ then we have made them part of a group that we don’t respect – and this empowers us to not have to treat them as well as we treat ourselves.

We have to start from the basic premise that our students are at University because they want to be like us, but like the admirable parts of us, not the conformist, factory model, industrial revolution prison aspects. They are junior lawyers, young engineers, apprentice architects when they come to us – they do not have to prove their humanity in order to be treated with respect. However, this does have to be mutual and it’s important to reflect upon the role that we have as a mentor, someone who has greater knowledge in an area and can share it with a more junior associate to bring them up to the same level one day.

If we regard students as being worthy of respect, as being potential peers, then we are more likely to treat them with a respect that engenders a reciprocal relationship. Treat your students like idiots and we all know how that goes.

The colonial mindset is poisonous because of the inherent superiority and because of the value of conformity to imposed rules above the potential to be gained from incorporating new and useful aspects of other cultures. There are many positive aspects of University culture but they can happily coexist with other educational traditions and cultures – the New Zealand higher educational system is making great steps in this direction to be able to respect both Maori tradition and the desire of young people to work in a westernised society without compromising their traditions.

We have to start from the premise that all people are equal, because to do otherwise is to make people unequal. We then must regard our students as ourselves, just younger, less experienced and only slightly less occasionally confused than we were at that age. We must carefully examine how we expose students to our important cultural aspects and decide what is and what is not important. However, if all we turn out at the end of a 3-4 year degree is someone who can perform a better model of piece work and is so heavily intimidated into conformity that they cannot do anything else – then we have failed our students and ourselves.

The game I mentioned, “Dog Eat Dog”, starts with a quote by R. Zamora Linmark from his poem “They Like You Because You Eat Dog”. Linmark is a Filipino American poet, novelist, and playwright, who was educated in Honolulu. His challenging poem talks about the ways that a second-class citizenry are racially classified with positive and negative aspects (the exoticism is balanced against a ‘brutish’ sexuality, for example) but finishes with something that is even more challenging. Even when a native population fully assimilates, it is never enough for the coloniser, because they are still not quite them.

“They like you because you’re a copycat, want to be just like them. They like you because—give it a few more years—you’ll be just like them.
And when that time comes, will they like you more?”

R. Zamora Linmark, “They Like You Because You Eat Dog”, from “Rolling the R’s”

I had a discussion once with a remote colleague who said that he was worried the graduates of his own institution weren’t his first choice to supervise for PhDs as they weren’t good enough. I wonder whose fault he thought that was?


Data: Harder to Anonymise Yourself Than You Might Think

There’s a lot of discussion around a government’s use of metadata at the moment, where instead of looking at the details of your personal data, government surveillance is limited to looking at the data associated with your personal data. In the world of phone calls, instead of taping the actual call, they can see the number you dialled, the call time and its duration, for example. CBS have done a fairly high-level (weekend-suitable) coverage of a Stanford study that quickly revealed a lot more about participants than they would have thought possible from just phone numbers and call times.

But how much can you tell about a person or an organisation without knowing the details? I’d like to show you a brief, but interesting, example. I write fiction and I’ve recently signed up to “The Submission Grinder”, which allows you to track your own submissions and, by crowdsourcing everyone’s successes and failures, to also track how certain markets are performing in terms of acceptance, rejection and overall timeliness.

Now, I have access to no-one else’s data but my own (which is all of 5 data points) but I’ll show you how assembling these anonymous data results together allows me to have a fairly good stab at determining organisational structure and, in one case, a serious organisational transformation.

Let’s start by looking at a fairly quick turnover semi-pro magazine, Black Static. It’s a short fiction market with horror theming. Here’s their crowd-sourced submission graph for response times, where rejections are red and acceptances are green. (Sorry, Damien.)

Black Static – Response Time Graph

Black Static has a web submission system and, as you can see, most rejections happen in the first 2-3 weeks. There is then a period where further work goes on. (It’s very important to note that this is a sample generated by those people who are using Submission Grinder, which is a subset of all people submitting to Black Static.) What this looks like, given that it is unlikely that anyone could read a lot of 4,000-7,000 word manuscripts in detail at a time, is that the editor is skimming the electronic slush pile to determine if it’s worth going to other readers. After this initial 2-week culling, what we are seeing is the result of further reading, so we’d probably guess that the readers’ reviews are being handled as they come in, with some indication that this happens roughly weekly – maybe as a weekend job? It’s hard to say because there’s not much data beyond 21 days so we’re guessing.
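To be clear about the method: the graphs come from The Submission Grinder itself; the sketch below, with invented response times, just shows the sort of weekly binning that makes an early culling wave visible.

```python
# Invented response times (in days); the real graphs come from The Submission
# Grinder. Binning by week is what makes the early slush-pile cull stand out.
from collections import Counter

rejection_days = [2, 3, 4, 5, 6, 8, 9, 10, 11, 12, 13, 16, 20, 27, 34, 41]

weekly = Counter(days // 7 for days in rejection_days)
for week in sorted(weekly):
    print(f"week {week + 1}: {'#' * weekly[week]}")
# A tall bar in the first two weeks, then a thin tail, is the pattern that
# suggests an initial skim of the slush pile followed by slower reader reviews.
```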

Let’s look at Black Static’s sister SF magazine, Interzone, now semi-pro but still very highly regarded.

Interzone – Response Time Graph

Lots more data here! Again, there appears to be a fairly fast initial cut-off mechanism from skimming the web submission slush pile. (And I can back this up with actual data, as Interzone rejected one of my stories in 24 hours.) Then there appears to be a two-week period where some thinking or reading takes place and then there’s a second round of culling, which may be an editorial meeting or a fast reader assignment. Finally we see two more fortnightly culls as the readers bring back their reviews. I think there’s enough data here to indicate that Interzone’s editorial group consider materials most often every fortnight. Also, the acceptances generated by positive reviews appear to be the same quantity as those from the editors – although there’s so little data here we’re really grabbing at tempting-looking straws.

Now let’s look at two pro markets, starting with the Magazine of Fantasy & Science Fiction.

Fantasy & Science Fiction – Response Time Graph

This doesn’t have the same initial culling process that the other two had, although it appears that there is a period of 7-14 days when a lot of work has been reviewed and then rejected – we don’t see as much work rejected again until the 35-day mark, when it looks like all reader reviews are back. Notably, there is a large gap between the initial bunch of acceptances (editor says ‘yes’) and then acceptances supported by reviewers. I’m speculating now but I wonder if what we’re seeing between that first and second group of acceptances are reviewers who write back in and say “Don’t bother” quickly, rather than assembling personalised feedback for something that could be salvaged. Either way, the message here is simple. If you survive the first four weeks in the F&SF system, then you are much less likely to be rejected and, with any luck, this may translate (worst case) into personal suggestions for improvement.

F&SF has a postal submission system, which makes it far more likely that the underlying work is going to be batched in some way, as responses have to go out via mail and doing this in a more organised fashion makes sense. This may explain why there is such a high level of response overall for the first 35 days, as you can’t easily click a button to send a response electronically and there are a finite number of envelopes any one person wants to prepare on any given day. (I have no idea how right I am but this is what I’m limited to by only observing the metadata.)

Tor.com has a very interesting graph, which I’ll show below.

Tor.com – Response Time Graph

Tor.com pays very well and has an on-line submission system via e-mail. As a result, it is positively besieged with submissions and their editorial team recently shut down new submissions for two months while they cleared the backlog. What interested me in this data was the fact that the 150-day spike was roughly twice as high as the 90 and 120. Hmm – 90, 120, 150 as dominant spikes. Does that sound like a monthly editors’ meeting to anyone else? By looking at the recency graph (which shows activity relative to today) we can see that there has been an amazing flurry of activity at Tor.com in the past month. Tor.com has a five-person editorial team (from their website) with reading and support from two people (plus occasional others). It’s hard for five people to reach consensus without discussion so that monthly cycle looks about right. But it will take time for 7 people to read all of that workload, which explains the relative silence until 3 months have elapsed.
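That monthly guess is nothing more sophisticated than looking at the spacing between the dominant spikes (the day values here are read off the graph, so approximate):

```python
# Day positions of the dominant spikes, read approximately off the graph.
spike_days = [90, 120, 150]
gaps = [later - earlier for earlier, later in zip(spike_days, spike_days[1:])]
print(gaps)  # [30, 30] – a roughly monthly decision cycle
```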

What about that spike at 150? It could be the end of the initial decisions and the start of the “worth another look” pile, so let’s see if their web page sheds any light on it. Aha!

Have you read my story? We reply to everything we’ve finished evaluating, so if you haven’t heard from us, the answer is “probably not.” At this point the vast majority of stories greater than four months old are in our second-look pile, and we respond to almost everything within seven months.

I also wonder if we are seeing previous data where it was taking longer to get decisions made – whether we are seeing two different time management strategies of Tor.com at the same time, being the 90+120 version as well as the 150 version. Looking at the website again:

Response times have improved quite a bit with the expansion of our first reader team (emphasis mine), and we now respond to the vast majority of stories within three months. But all of the stories they like must then be read by the senior editorial staff, who are all full-time editors with a lot on our plates.

So, yes, the size of Tor.com’s slush pile and the number of editors that must agree basically mean that people are putting time aside to make these decisions, now aiming at 90 days, with a bit of spillover. It looks like we are seeing two regimes at once.

All of this information is completely anonymous in terms of the stories, the authors and any actual submission or acceptance patterns that could relate data together. But, by looking at this metadata on the actual submissions, we can now start to get an understanding of the internal operations of an organisation, which in some cases we can then verify with publicly held information.

Now think about all the people you’ve phoned, the length of time that you called them and what could be inferred about your personal organisation from those facts alone. Have a good night’s sleep!