Tukhta: the tyranny of inflated performance figures.

I’m sketching out a book on the early Soviet Union and artistic movements (don’t ask), so I’ve been rereading every Russian author I can get my hands on. I read a lot of these works when I was (probably too) young, starting from the very easy and shallow slopes of “Ivan Denisovich” and then plunging down into “Gulag Archipelago”. One of the things that comes out starkly from Solzhenitsyn’s account of the forced labour camps in “Gulag Archipelago” is the way that unrealistic expectations from an overbearing superior organisation can easily force an artificial conformity to productivity requirements: people cheat to achieve their overly ambitious quotas. In Solzhenitsyn’s words, the many thieves in the camps (he is less than complimentary about non-political prisoners) coined the word tufta, which he rendered into better Russian as tukhta: the practice of meeting your quotas through devious means and fabricated outputs. This could be as simple as writing down a figure that didn’t reflect your actual labour, or picking up a pile of timber that had already been counted, moving it somewhere else, and counting it again.

The biggest problem with achieving an unreasonable goal, especially one defined by ideology rather than reality, is that it is easy for those in power to raise the expectation: if you can achieve that goal, then no doubt you can achieve this one. This led to such excesses as the Stakhanovite movement, where patently impossible levels of human endeavour were achieved as evidence of commitment to Stalinist ideology and good standing in the state. The darker side to all this – and this will be a word very familiar to those used to Soviet history – is that anyone who doesn’t attain such lofty goals or doesn’t sign up to be a noble Stakhanovite is labelled a wrecker. Wreckers were a very common obstacle in the early development of the new Soviet state, pointing out things like “you can’t build that without concrete” or “water flows downhill”. It should be noted that the original directives of the movement were quite noble, as represented in this extract from a conference in 1935:

The Stakhanovite movement means organizing labor in a new fashion, rationalizing technologic processes, correct division of labor, liberating qualified workers from secondary spadework, improving work place, providing rapid growth for labor productivity and securing significant increase of workers’ salaries.

Pretty good, right? Now consider that the movement was named for “Aleksei Stakhanov, who had mined 102 tons of coal in less than 6 hours (14 times his quota)”. This astounding feat of human endeavour was surpassed a year later, when Nikita Izotov mined 607 tons of coal in a single shift! It’s worth noting that fully-mechanised and highly industrialised contemporary Australian coal mines can produce round about 3,800 tonnes every 6 hours. What a paltry achievement, when all you need is six Nikita Izotovs. So this seemingly well-focused initiative, structured as a benefit to state and worker, is disingenuous for the state and dangerous for the worker.

""Stakhanovite model soviet worker guarantees the continuing peace!"" You'll note the anti-intelligensia and racist targets of the worker - ideologically these were all wreckers.

“”Stakhanovite model soviet worker guarantees the continuing peace!””
You’ll note the anti-intelligensia and racist imagery on the poster as well – ideologically these were all wreckers.

Imagine that you are a worker trying to keep yourself and your family alive in the middle of famine after famine – of course you want to meet the requirements as well as you can, potentially even exceeding them so that you don’t get sent to a camp, locked up, or demoted and diminished in your role. While some people might be practising tukhta out of laziness, you are practising it because it is the way that things are. You need to nod in agreement with ridiculous requirements and then write up your results in a way that exceeds them, if you want to survive. Your reward? Even more ridiculous requirements, determined not by capacity and available inputs but by required output. Tukhta is your curse and your only means of survival. Unsurprisingly, the Stakhanovite movement was later denounced as part of Stalinism in the emerging and mutating Soviet Union.

Now imagine that you are a student. You have been given a pile of reading to do, a large collection of assignments across a variety of subjects that are not really linked to each other, and you are told that you need to do all of this to succeed. Are you going to deeply apply yourself to everything, to form your own conceptual framework and illuminate it through careful study? Well, perhaps you would, except that you have quotas to achieve and deadlines to meet and, around you, other students are doing better, pressing further and are being actively rewarded and encouraged for it. Will you be at least tempted to move things around to achieve your quota? Will you prioritise some labour over other labour that could be more useful in the long term? Will you hide your questions in the hope of not being seen as a bad student?

Now imagine that you are a young academic, perhaps one with a young family, and you are going to enter the job market. You know that your publications, research funding and overall contributions will be compared to other stand-outs in the field, to overall averages and to defined requirements for the institution. Will you sit and mull contemplatively over an important point of science or will you crank out yet another journal article at a prestigious, but not overly useful, target venue, working into the night and across the weekend? Will you look at the exalted “Research Stars” who have very high publication and citation rates and who attract salary loadings up to a level that could pay for 2-3 times the number of positions they hold? Will you be compared to these people and found wanting? Will you write papers with anyone prestigious? Will you do what you need to do to move from promising to reliable to a leader in the field, regardless of whether it’s actually something you should be doing? (Do you secretly wonder whether you can even get there from where you started and lie awake at night thinking about it?)

Measurements that pit us against almost impossible standards and stars so high that we probably cannot reach them grind down the souls of the majority of the population and lead them into the dark pathways of tukhta. It is easy to say “Don’t cheat” or “Don’t work all weekend” when you are on top of the pile. As the workers in the Gulag and many Soviet citizens found out, doing that just lets the people setting the quotas keep setting them as they wish, with no concern for the people who are grist to the mill.

Tukhta should not be part of an educational system and we should be very wary of the creeping mensuration of the academy. You don’t have to look far to see highly celebrated academics and researchers who were caught cheating and punished hard. Yet a part of me knows that the averages are set as much by the tukhtaviks we have not yet detected and, given how comparative we have made our systems, that is monstrously unfair.

Assessing how well someone is performing needs to move beyond systems that are so pitifully easy to game and so terribly awful to their victims when they are so gamed.


The Fragile Student Relationship (working from #Unstuck #by Julie Felner @felner)

I was referred some time ago to a great site called “Unstuck”, which has some accompanying iPad software, that helps you to think about how to move past those stuck moments in your life and career to get things going. They recently posted an interesting item on “How to work like a human” and I thought that a lot of what they talked about had direct relevance to how we treat students and how we work with them to achieve things. The article is by Julie Felner and I strongly suggest that you read it, but here are my thoughts on her headings, as they apply to education and students.

Ultimately, if we all work together like human beings, we’re going to get on better than if we treat our students as answer machines and they treat us as certification machines. Here’s what optimising for one thing, mechanistically, can get you:

This robot is the business at climbing hills. Dancing like a fool, not so much. It's not human.


But if we’re going to be human, we need to be connected. Here are some signs that you’re not really connected to your students.

  1. Anything that’s not work you treat with a one-word response. A student comes to see you and you don’t have time to talk about anything but assignment X or project Y. I realise time is scarce but, if we’re trying to build people, we have to talk to people, like people.
  2. You’re impatient when they take time to learn or adjust. Oh yeah, we’ve all done this. How can they not pick it up immediately? What’s wrong with them? Don’t they know I’m busy?
  3. Sleep and food are for the weak – and don’t get sick. There are no human-centred reasons for not getting something done. I’m scheduling all of these activities back-to-back for two months. If you want it, you’ll work for it.
  4. We never ask how the students are doing. By which I mean, asking genuinely and drawing out a genuine response, if some prodding is required. Not intrusively but out of genuine interest. How are they doing with this course?
  5. We shut them down. Here’s the criticism. No, I don’t care about the response. No, that’s it. We’re done. End of discussion. There are times when we do have to draw an end to a discussion but there’s a big difference between closing off something that’s going nowhere and delivering everything as if no discussion is possible.

Here is my take on Julie’s suggestions for how we can be more human at work, which works for the Higher Ed community just as well.

  1. Treat every relationship as one that matters. The squeaky wheels and the high achievers get a lot of our time but all of our students are actually entitled to have the same level of relationship with us. Is it easy to get that balance? No. Is it a worthwhile goal? Yes.
  2. Generously and regularly express your gratitude. When students do something well, we should let them know – as soon as possible. I regularly thank my students for good attendance, handing things in on time, making good contributions and doing the prep work. Yes, they should be doing it but let’s not get into how many things that should be done aren’t done. I believe in this strongly and it’s one of the easiest things to start doing straight away.
  3. Don’t be too rigid about your interactions. We all have time issues but maybe you can see students and talk to them when you pass them in the corridor, if both of you have time. If someone’s been trying to see you, can you grab them from a work area or make a few minutes before or after a lecture? Can you talk with them over lunch if you’re both really pressed for time? It’s one thing to have consulting hours but it’s another to make yourself totally unavailable outside of that time. When students are seeking help, it’s when they need help the most. Always convenient? No. Always impossible to manage? No. Probably useful? Yes.
  4. Don’t pretend to be perfect. Firstly, students generally know when you’re lying to them and especially when you’re fudging your answers. Don’t know the answer? Let them know, look it up and respond when you do. Don’t know much about the course itself? Well, finding out before you start teaching is a really good idea because otherwise you’re going to be saying “I don’t know a lot” and there’s a big, big gap between showing your humanity and obviously not caring about your teaching. Fix problems when they arise and don’t try to make it appear that it wasn’t a problem. Be as honest as you can about that in your particular circumstances (some teaching environments have more disciplinary implications than others and I do get that).
  5. Make fewer assumptions about your students and ask more questions. The demographics of our student body have shifted. More of my students are in part-time or full-time work. More are older. More are married. Not all of them have gone through a particular elective path. Not every previous course contains the same materials it did 10 years ago. Every time a colleague starts a sentence with “I would have thought” or “Surely”, they are (almost always) projecting their assumptions on to the student body, rather than asking “Have you”, “Did you” or “Do you know”?

Julie made the final point that sometimes we can’t get things done to the deadline. In her words:

You sometimes have to sacrifice a deadline in order to preserve something far more important — a relationship, a person’s well-being, the quality of the work

I completely agree because deadlines are a tool but, particularly in academia, the deadline is actually rarely as important as people. If our goal is to provide a good learning environment, working our students to zombie status because “that’s what happened to us” is bordering on a cycle of abuse, rather than a commitment to quality of education.

We all want to be human with our students because that’s how we’re most likely to get them to engage with us as humans too! I liked this article and I hope you enjoyed my take on it. Thank you, Julie Felner!


5 Things: Computers

In the interests of blogging more usefully, I’m trying some “5 point posts” in areas where I have some reasonable knowledge. Hope they’re useful!

  1. Computers neither like you nor hate you.

    If I had a dollar for every time I saw someone go through some sort of ritual like pleading with, patting or hitting a computer, I’d be a very rich man. We often talk about computers as if they understand what we’re talking about (a fallacy that can trip up novice programmers, thanks for the reminder, Mark!) and this assumes that there’s some kind of mind in there. I know that you all know that it’s not actually true but we have to stop acting like it’s true as well.

    If you have important documents on your computer – then back them up, somewhere. If you are writing large documents, save them every 5 minutes or so. And check regularly to make sure that they’re actually being saved. The amount of preparation you put into making sure that the computer doing something ‘bad’ won’t actually affect you will directly reduce the amount of stress that you feel when it does go wrong. The computer is neither your friend nor your enemy and it will do what the programs tell it to do – not what you want it to do or what any reasonable person would do. This is pretty much true across every computer and operating system. The computer can’t tell your vital photos from an old recipe copy you don’t need and it most certainly has no idea that you have a deadline – you’re just more likely to make mistakes because you’re under pressure.

    One of the best things you can ever install to stop your computer “behaving badly” is anti-virus software that you keep updated. Yes, it costs money (sorry) but how much is your time worth? If you can say “Yes, I lost the last 12 months’ work and it will take me a week to get my computer working again and I don’t care” then you can skip anti-virus. Everyone else – please install supported anti-virus software (look online for customer reviews and recommendations; I make none here). Having your computer hacked isn’t some jolly pirate image that pops up and goes “ho ho ho”. Modern attacks can wait, encrypt your backups and then charge you money to get at your own data – deleting it if you don’t pay. Computers don’t hate you but there are a lot of haters out there. One of the biggest threats is becoming part of a botnet, a collection of computers that are being used to conduct unauthorised or criminal activities without the knowledge of their owners. Not that worried? Botnets can be used to host all sorts of things, including child pornography chat servers and files. But don’t be worried! Install good anti-virus software instead and keep it up to date!

    Now, very, very few people are “bad with computers” but a lot of people have had unfortunate first encounters (and that is far more likely to have to do with the computer than with what you are doing) and have retreated to what is, essentially, a position of superstition. This wouldn’t be a problem, except that…

  2. Computers are everywhere.

    And this is why not being comfortable with computers is going to be more of a problem. I have now learned to program (in simplistic form, but still) everything from cars to video recorders, including my vacuum cleaner, because all of them have little computers inside them. It will, sadly, get harder and harder to stay away from them. I’m not advocating some Butlerian jihad against the machines – they’re just spreading anyway. Do they work exactly as we wish? No, but I’ll get back to that later, because they are close enough most of the time.

    So you probably already have one at home, in some form, which brings me to… 

  3. Computers need to be replaced and upgraded.

    This is a bit of a pain, particularly for those who don’t like change (or would prefer to avoid it) or who have no cash (or anything else that says “I don’t want to upgrade”). The computing hardware will eventually break down and the more active the life of the computer, the more likely it is for something to go wrong. Laptops tend to die before desktops because of vibration, dust and heat, and tablets and phones are easy to drop. That’s why the point I made about backing up is really important anyway and triply important for anything vaguely mobile.

    Companies regularly declare an end-of-life period for their software and hardware and you have to keep an eye out for this as, after this time, you will no longer get updates for the software and the hardware. An unsupported computer is a crash waiting to happen and a security hole that you could drive a truck through. So not only do you need to keep using something supported, you have to keep patching it (downloading updates from the company and installing them) to keep it safe. In 2008, an unpatched Windows XP box would be hacked within, on average, 4 minutes of connection time. XP itself was released in 2001 and it was officially declared end-of-life on April 8, 2014. That’s over 12 years, compared to the usual product cycle of 10 years. But now, unless something big happens or you happen to be running ATMs, you will not get any more support from Microsoft for this operating system. Which means that, soon enough, your machine will take but minutes to infect and become part of someone else’s network of compromised machines – if it hasn’t happened already.

    Hardware does change and removing old machines can be painful when you have a trusted companion that is still working. However, these sorts of changes (like Apple’s removal of support for the PowerPC chip) are advertised well in advance (it took 7 years for Apple to stop supporting the PowerPC) and there is at least one silver lining to the creep in hardware and system specifications. If you buy 12 months behind the release of new technology, you should still get 5-ish good years out of your machines and avoid paying full price – plus you can buy refurbished models from early adopters with more money than sense. However, be careful and don’t buy something from a discontinued line just because it is cheap – it will reach end-of-life much sooner than low-end hardware from the new line.

    Yes, forced obsolescence sucks but we actually don’t have to buy the new shiny every time (not that many of us can afford to) and knowledge of the refresh/end-of-life cycle will help you to make a good decision. Those of you who are supporting older family members: I know it sucks, but you’re going to have to broach the issue of operating system changeovers before they become part of a distributed denial-of-service attack on some government department or have all of their e-mails encrypted for a $500 decrypt fee.

  4. It doesn’t really matter which computer you use, if it works for you.

    I’ve used pretty much everything in the way of computers and I use what works for me, when I need to. Right now, I’m using a lot of Apple gear because I’m not doing as much gaming and it all does what I need. If I were working more in different areas, I might be doing a lot more in Linux. I’ve worked with Windows before and I’ll probably work with it again. In 10 years’ time, who knows?

    I have no strong opinions as to what is best and I’m certainly not going to lecture someone on their choice. If they’re obviously unhappy, then we might chat, but don’t let anyone tell you that you’re right or wrong just because you have this system or that. (Unless it’s horribly out of date or not backed up, in which case, please look into updating/upgrading/fixing!)

  5. Computers are here to stay and the computing profession has some work to do

    And that’s the truth of it. We have a long way to go in making computers work better with people, that’s for sure. It would be great if we could be more ambiguous and hand-wavy with a machine and get it to do what we want but there are a lot of things to get working before that happens. However, hand on my heart, it is so much easier to use computers now than it was 10 years ago, let alone 20 or 30. I genuinely think that we are going to see better and better ways to work with them as time goes on so, please, hang in there if you’re having trouble. That next upgrade might be just what you were looking for, even if it seems like a pain at the time.


Knowing the Tricks Helps You To Deal With Assumptions

I teach a variety of courses, including one called Puzzle-Based Learning, where we try to teach thinking and problem-solving techniques through the use of simple puzzles that don’t depend on too much external information. These domain-free problems have most of the characteristics of more complicated problems but you don’t have to be an expert in the specific area of knowledge to attempt them. The other thing that we’ve noticed over time is that a good puzzle is fun to solve, fun to teach and gets passed on to other people – a form of infectious knowledge.

Some of the most challenging areas to try and teach into are those that deal with probability and statistics, as I’ve touched on before in this post. As always, when an area is harder to understand, it actually requires us to teach better, but I do draw the line at trying to coerce students into believing me through the power of my mind alone. But there are some very handy ways to show students the flaws in their assumptions about the nature of probability (and randomness), so that they are receptive to the idea that their models could need improvement (allowing us to work in that uncertainty) and can also start to understand probability correctly.

We are ferociously good pattern matchers and this means that we have some quite interesting biases in the way that we think about the world, especially when we try to think about random numbers, or random selections of things.

So, please humour me for a moment. I have flipped a coin five times and recorded the outcome here. But I have also made up three other sequences. Look at the four sequences for a moment and pick which one is most likely to be the one I generated at random – don’t think too much, use your gut:

  1. Tails Tails Tails Heads Tails
  2. Tails Heads Tails Heads Heads
  3. Heads Heads Tails Heads Tails
  4. Heads Heads Heads Heads Heads

Have you done it?

I’m just going to put a bit more working in here to make sure that you’ve written down your number…

I’ve run this with students and I’ve asked them to produce a sequence by flipping coins and then produce a false sequence by making subtle changes to the generated one (turning heads into tails, but only changing a couple along the way). They then write the two together on a board and people have to vote on which one is which. As it turns out, the chances of someone picking the right sequence are about 50/50, but I engineered that by starting from a generated sequence.

This is a fascinating article that looks at the overall behaviour of people. If you ask people to write down a five coin sequence that is random, 78% of them will start with heads. So, chances are, you’ve picked 3 or 4 as your starting sequence. When it comes to random sequences, most of us equate random with well-shuffled, and, on the large scale, 30 times as many people would prefer option 3 to option 4. (This is where someone leaps into the comments to say “A-ha” but, it’s ok, we’re talking about overall behavioural trends. Your individual experience and approach may not be the dominant behaviour.)

From a teaching point of view, this is a great way to break up the concepts of random sequences and some inherent notion that such sequences must be disordered. There are 32 different ways of flipping 5 coins in a strict sequence like this and all of them are equally likely. It’s only when we stop looking at specific sequences and start talking about the likelihood of getting all heads versus not getting all heads that the aggregated event of “at least one tail” starts to be more likely.
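If you want to convince a sceptical class of this, a quick sanity check helps (a sketch of mine, not from the original post): enumerate every 5-flip sequence and count.

```python
from itertools import product

# Enumerate every possible strict sequence of 5 coin flips.
sequences = list(product("HT", repeat=5))
print(len(sequences))  # 32 sequences, each with probability 1/32

# Any single named sequence - HHHHH or THTHH - is equally likely.
p_all_heads = 1 / len(sequences)

# The aggregated event "at least one tail" covers 31 of the 32 sequences,
# which is why "not all heads" feels so much more likely than all heads.
at_least_one_tail = [s for s in sequences if "T" in s]
print(len(at_least_one_tail) / len(sequences))  # 31/32 = 0.96875
```

The point the enumeration makes concrete: HHHHH is exactly as likely as any “shuffled-looking” sequence; it is only the aggregation that tilts the odds.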

How can we use this? One way is getting students to write down their sequences and then asking them to stand up, then sit down when your ‘call’ (from a script) goes the other way. If almost everyone is still standing at heads then you’ve illustrated that you know something about how their “randomisers” work. A lot of people (if your class is big enough) should still be standing when the final coin is revealed, and this is something we can address. Why do so many people think about it this way? Are we confusing random with chaotic?

The Law of Small Numbers (Tversky and Kahneman), also mentioned in the post, is basically the observation that people generalise too much from small samples and expect small samples to act like big ones. In your head, if the grand pattern over time could be resorted into “heads, tails, heads, tails,…” then small sequences must match that or they just don’t look right. This is an example of the logical fallacy called a “hasty generalisation”, but with a mathematical flavour. We are strongly biased towards the validity of our experiences, so when we generate a random sequence (or pick a lucky door or win the first time at poker machines) we generalise from this small sample and can become quite resistant to other discussions of possible outcomes.

If you have really big classes (367 or more) then you can start a discussion on random numbers by asking people what the chances are that any two people in the room share a birthday. Given that there are only 366 possible birthdays, the Pigeonhole principle states that two people must share a birthday as, in a class of 367, there are only 366 birthdays to go around so one must be repeated! (Note for future readers: don’t try this in a class of clones.) There are lots of other interesting thinking examples in the link to Wikipedia that help you to frame randomness in a way that your students might be able to understand better.
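The pigeonhole argument gives certainty at 367, but the probabilistic version surprises students far earlier. A small sketch (mine, using the usual simplifying assumption of 365 equally likely birthdays, ignoring leap days) computes the chance of a shared birthday for any class size:

```python
def p_shared_birthday(n):
    """Probability that at least two of n people share a birthday,
    assuming 365 equally likely birthdays."""
    if n > 365:
        return 1.0  # pigeonhole principle: a collision is guaranteed
    # P(all distinct) = 365/365 * 364/365 * ... * (365-n+1)/365
    p_all_distinct = 1.0
    for k in range(n):
        p_all_distinct *= (365 - k) / 365
    return 1 - p_all_distinct

print(round(p_shared_birthday(23), 3))  # ~0.507 - better than even at just 23 people
print(p_shared_birthday(367))           # 1.0 - the pigeonhole case
```

So even a modest tutorial group of 23 is more likely than not to contain a shared birthday, which makes a nice bridge from the guaranteed case to the probabilistic one.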

10 pigeons into 9 boxes? Someone has a roommate.


I’ve used a lot of techniques before, including the infamous card shouting, but the new approach from the podcast is a nice and novel angle to add some interest to a class where randomness can show up.


MOOCs and the on-line Masters Degree

There’s been a lot of interest in Georgia Tech’s new on-line masters degree in Computer Science, offered jointly with Udacity and AT&T. The first offering ran with 375 students, and there are 500 in the pipeline, but admissions opened again two days ago so this number has probably gone up. PBS published an article recently, written up on the ACM blog.

I think we’re all watching this with interest as, while it’s neither Massive at this scale nor Open (fee-paying and admission checked), if this works reasonably, let alone well, then we have something new to offer at the tertiary scale but without many of the problems that we’ve traditionally seen with existing MOOCs (retention, engagement, completion and accreditation).

Right now, there are some early observations: the students are older (11 years older on average) and most are working. In this way, we’re much closer to the standard MOOC demographic for success: existing degree, older and practised in work. We would expect this course to do relatively well, much as our own experiences with on-line learning at the 100s scale worked well for that demographic. This is, unlike ours, more tightly bound into Georgia Tech’s learning framework and their progress pathways, so we are very keen to see how their success will translate to other areas.

We are still learning about where MOOCs (and their children, SPOCs and the Georgia Tech program) will end up in the overall scheme of education. With this program, we stand a very good chance of working out exactly what it means to us in the traditional higher education sector.

An inappropriate picture of a bricks-and-mortar campus for an article on on-line learning.



CodeSpells! A Kickstarter to make a difference. @sesperu @codespells #codespells

I first met Sarah Esper a few years ago when she was demonstrating the earlier work in her PhD project with Stephen Foster on CodeSpells, a game-based project to start kids coding. In a pretty enjoyable fantasy game environment, you’d code up spells to make things happen and, along the way, learn a lot about coding. Their team has grown and things have come a long way since then for CodeSpells, and they’re trying to take it from its research roots into something that can be used to teach coding on a much larger scale. They now have a Kickstarter out, which I’m backing (full disclosure), to get the funds they need to take things to that next level.

Teaching kids to code is hard. Teaching adults to code can be harder. There’s a big divide these days between the role of user and creator in the computing world and, while we have growing literacy in use, we still have a long way to go to get more and more people creating. The future will be programmed and it is, honestly, a new form of literacy that our children will benefit from.

If you’re one of my readers who likes the idea of new approaches to education, check this out. If you’re an old-timey Multi-User Dungeon/Shared Hallucination person like me, this is the creative stuff we used to be able to do on-line, but for everyone and with cool graphics in a multi-player setting. If you have kids, and you like the idea of them participating fully in the digital future, please check this out.

To borrow heavily from their page, 60% of jobs in science, technology, engineering and maths are computing jobs but AP Computer Science is only taught at 5% of schools. We have a giant shortfall of software people coming up and this will be an ugly crash when it comes, because all of the nice things we have become used to on the computing side will slow down and, in some cases, pretty much stop. Invest in the future!

I have no connection to the project apart from backing it and being a huge supporter of Sarah’s drive and vision, and someone who would really like to see this project succeed. Please go and check it out!

The Earth Magic Sphere image, from the Kickstarter page.



Funding Education: Trust me, you want to. #stem #education #csed

Some very serious changes to the Higher Education system of Australia are going to be discussed starting from October 28th – deregulating the University fee structure, which will most likely lead to increasing fees and interest rates, leading to much greater student debt. (Yes, there are some positives in there but it’s hard to get away from a massive increase in student debt.) While some university representative organisations are in favour of this, with amendments and protections for some students, I am yet to be convinced that deregulating the Universities is going to do much while we labour under the idea that students will move around based on selected specialisations, the amount of “life lessons” they will accumulate or their perception of value for money. We have no idea what price sensitivity is going to do to the Australian market. We do know what happened in the UK when they deregulated fees:

‘Professor Byrne agreed, but said fee deregulation would have to be “carefully thought through so as to avoid what happened in the UK when they did it there – initially, when the fees were uncapped, all the universities just charged the maximum amount. It’s been corrected now, but that was a complete waste of time because all it did was transfer university costing from the public to the private sphere.”’

But, don’t worry, Professor Byrne doesn’t think this will lead to a two-tier system, split between wealthy universities and less-well-off regionals:

“I’d call it an appropriately differentiated system, with any number of levels within it.”

We have four classes! That must be better than have/have not. That’s… wait…

The core of this argument is that, somehow, it is not the role of Universities to provide the same thing as every other university, which is a slashing of services more usually (coyly) referred to as “playing to your strengths”. What this really is, however, is geographical and social entrapment. You weren’t born in a city, you don’t want to be saddled with huge debt or your school wasn’t great so you didn’t get the marks to go to a “full” University? Well, you can go to a regional University, which is playing to its strengths, to offer you a range of courses that have been market-determined to be suitable.  But it will be price competitive! This is great, because after 2-3 generations of this, the people near the regional University will not have the degree access to make the money to work anywhere other than their region or to go to a different University. And, of course, we have never seen a monopolised, deregulated market charging excessive fees when their consumer suffers from a lack of mobility…

There are some quite valid questions as to why we need to duplicate teaching capabilities in the same state – until we look at the Australian student, who tends to go to University near where they live rather than moving into residential accommodation on campus. When you live in a city that spans 70km from North to South, as Adelaide does, it suddenly becomes more evident why there might be repeated schools in the Universities that span this geographical divide. When you live in Sydney, where the commute can be diabolical and the city is highly divided by socioeconomic grouping, it becomes even more important. Duplication in Australian Universities is not redundancy; it’s equality.

The other minor thing to remember is that the word University comes from the Latin word for whole. The entire thing about a University is that it is most definitely not a vocational training college, focussed on one or two things. It is defined by, and gains strength from, its diversity and the nature of study and research that comes together in a place that isn’t quite like any other. We are at a point in history when the world is changing so quickly that predicting the jobs of the next 20 years is much harder, especially if we solve some key problems in robotics. Entire jobs, and types of job, will disappear almost overnight – if we have optimised our Universities to play to their strengths rather than keeping their ability to be agile and forward-looking, we will pay for it tomorrow. And we will pay dearly for it.

Education can be a challenging thing for some people to justify funding because you measure the money going in and you can’t easily measure the money that comes back to you. But we get so much back from an educated populace. Safety on the road: education. Safety in the skies: education. Art, literature, music, film: a lot of education. The Internet, your phone, your computer: education, Universities, progressive research funding and CSIRO.

Did you like a book recently? That was edited by someone who most likely had a degree that many wouldn’t consider worth funding. Just because it’s not obvious what people do with their degrees, and just because some jobs demand degrees when they don’t need them, it doesn’t mean that we need to cut down on the number of degrees or treat people who do degrees with a less directly vocational pathway as if they are parasites (bad) or mad (worse). Do we need to change some things about our society in terms of perceptions of worth and value? Yes – absolutely, yes. But let’s not blame education for how it gets mutated and used. And, please, just because we don’t understand someone’s job, let us never fall into the trap of thinking it’s easy or trivial.

The people who developed the first plane had never flown. The people who developed WiFi had never used a laptop. The people who developed the iPhone had never used one before. But they were educated and able to solve challenges using a combination of technical and non-technical knowledge. Steve Jobs may never have finished college (although he attributed the Mac’s type handling to time he spent in courses there) but he employed thousands of people who did – as did Bill Gates. As do all of the mining companies if they actually want to find ore bodies and attack them properly.

Education will define what Australia is for the rest of this century and for every century afterwards. To argue that we have to cut funding and force more debt on to students is to deny education to more Australians and, ultimately, to head towards a permanently divided Australia.

You might think, well, I’m ok, why should I worry? Ignoring any altruistic issues, what do you think an undereducated, effectively underclass, labour force is going to do when all of their jobs disappear? If there are still any History departments left, then you might want to look into the Luddites and the French Revolution. You can choose to do this for higher purposes, or you can do it for yourself, because education will help us all to adjust to an uncertain future and, whether you think so or not, we probably need the Universities running at full speed as cradles of research and ideas, working with industry to be as creative as possible to solve the problems that you will only read about in tomorrow’s paper.


Rage Against the Machine



I have a new book out: A Guide to Teaching Puzzle-based learning. #puzzlebasedlearning #education

Time for some pretty shameless self-promotion. Feel free to stop reading if that will bother you.

My colleagues, Ed Meyer from BWU, Raja Sooriamurthi from CMU and Zbyszek Michalewicz (emeritus from my own institution) and I have just released a new book, called “A Guide to Teaching Puzzle-based learning.” What a labour of love this has been and, better yet, we are still talking to each other. In fact, we’re planning some follow-up events next year to do some workshops around the book, so it’ll be nice to work with the team again.

(How to get it? This is the link to Springer, paperback and e-Book. This is the link to Amazon, paperback only I believe.)

Here’s a slightly sleep-deprived and jet-lagged picture of me holding the book as part of my “wow, it got published” euphoria!


See how happy I am? And also so out of it.

The book is a resource for the teacher, although it’s written for teachers from primary to tertiary and it should be quite approachable for the home school environment as well. We spent a lot of time making it approachable, sharing tips for students and teachers alike, and trying to get all of our knowledge about how to teach well with puzzles down into the one volume. I think we pretty much succeeded. I’ve field-tested the material here at Universities, schools and businesses, with very good results across the board. We build on a good basis and we love sound practical advice. This is, very much, a book for the teaching coalface.

It’s great to finally have it all done and printed. The Springer team were really helpful and we’ve had a lot of patience from our commissioning editors as we discussed, argued and discussed again some of the best ways to put things into the written form. I can’t quite believe that we managed to get 350 pages down and done, even with all of the time that we had.

If you or your institution has a connection to SpringerLink then you can read it online as part of your subscription. Otherwise, if you’re keen, feel free to check out the preview on the home page and then you may find that there are a variety of prices available on the Web. I know how tight budgets are at the moment so, if you do feel like buying, please buy it at the best price for you. I’ve already had friends and colleagues ask what benefits me the most and the simple answer is “if people read it and find it useful”.

To end this disgraceful sales pitch, we’re actually quite happy to run workshops and the like, although we are currently split over two countries (sometimes three or even four), so some notice is always welcome.

That’s it, no more self-promotion to this extent until the next book!



Talking Ethics with the Terminator: Using Existing Student Experience to Drive Discussion

One of the big focuses at our University is the Small-Group Discovery Experience, an initiative from our overall strategy document, the Beacon of Enlightenment. You can read all of the details here, but the essence is that a small group of students and an experienced research academic meet regularly to start the students down the path of research, picking up skills in an active learning environment. In our school, I’ve run it twice as part of the professional ethics program. This second time around, I think it’s worth sharing what we did, as it seems to be working well.

Why ethics? Well, this is first year and it’s not all that easy to do research into Computing if you don’t have much foundation, but professional skills are part of our degree program, so we looked at an exploration of ethics to build a foundation. We cover ethics in more detail in second and third year, but in first year it’s basically a quick “and this is ethics” lecture that doesn’t give our students much room to explore the detail. Like many of the more intellectual topics we deal with, ethical understanding comes from contemplation and discussion – unless we just want to try to jam a badly fitting moral compass on to everyone and be done.

Ethical issues are the best way into the area as an introduction, as much of the formal terminology can be quite intimidating for students who regard themselves as CS majors or Engineers first, and may not even contemplate their role as moral philosophers. But the real-world situations that best illuminate ethical practice are often quite depressing and, from experience, sessions in medical ethics and the like rapidly close down discussion because they can be very upsetting. We took a different approach.

The essence of any good narrative is the tension that is generated from the conflict it contains and, in stories that revolve around artificial intelligence, robots and computers, this tension often comes from what are fundamentally ethical issues: the machine kills, the computer goes mad, the AI takes over the world. We decided to ask the students to find two works of fiction, from movies, TV shows, books and games, to look into the ethical situations contained in anything involving computers, AI and robots. Then we provided them with a short suggested list of 20 books and 20 movies to start from and let them go. Further guidance asked them to look into the active ethical agents in the story – who was doing what and what were the ethical issues?

I saw the students after they had submitted their two short paragraphs on this and I was absolutely blown out of the water by their informed, passionate and, above all, thoughtful answers to the questions. Debate kept breaking out on subtle points. The potted summary of ethics that I had given them (follow the rules, aim for good outcomes or be a good person – sorry, ethicists) provided enough detail for the students to identify issues in rule-based approaches, utilitarianism and virtue ethics, but I could then introduce terms to label what they had already done, as they were thinking about them.

I had 13 sessions with a total of 250 students and it was the most enjoyable teaching experience I’ve had all year. As follow-up, I asked the students to enter all of their thoughts on their entities of choice by rating their autonomy (freedom to act), responsibility (how much we could hold them to account) and perceived humanity, using a couple of examples to motivate a ranking system of 0-5. A toddler is completely free to act (5) and completely human (5) but can’t really be held responsible for much (0-1 depending on the toddler). An aircraft autopilot has no humanity or responsibility but it is completely autonomous when actually flying the plane – although it will disengage when things get too hard. A soldier obeying orders has an autonomy around 5. Keanu Reeves in the Matrix has a humanity of 4. At best.
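The three-axis rating scheme can be sketched in code. This is purely a hypothetical illustration of the 0–5 autonomy/responsibility/humanity scales described above – the entity names, field names and ranking logic are my assumptions, not the actual database the students used:

```python
from dataclasses import dataclass

@dataclass
class EntityRating:
    """One student rating of a fictional entity, on three 0-5 axes."""
    name: str
    autonomy: int        # 0-5: freedom to act
    responsibility: int  # 0-5: how much we can hold it to account
    humanity: int        # 0-5: perceived humanity

    def __post_init__(self):
        # Enforce the 0-5 scale on every axis.
        for value in (self.autonomy, self.responsibility, self.humanity):
            if not 0 <= value <= 5:
                raise ValueError("ratings must be in the range 0-5")

# Example entries, matching the motivating examples in the text.
ratings = [
    EntityRating("toddler", autonomy=5, responsibility=0, humanity=5),
    EntityRating("autopilot", autonomy=5, responsibility=0, humanity=0),
]

# Rank entities by perceived humanity, most human first,
# e.g. as a starting point for the small-group discussion.
by_humanity = sorted(ratings, key=lambda r: r.humanity, reverse=True)
```

Even a toy structure like this makes the discussion concrete: students argue about where an entity sits on each axis before a number ever goes into the database.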

They’ve now filled the database up with their thoughts and next week we’re going to discuss all of their 0-5 ratings as small groups, then place them on a giant timeline of achievements in literature, technology and AI, also listing major events such as wars, to see if we can explain why authors presented the work that they did. When did we start to regard machines as potentially human, and what did the world seem like to the people who were there?

This was a lot of fun and, while it’s taken a little bit of setting up, this framework works well because students have already seen quite a lot – the trick is just getting them to think about it with our ethical lens. Highly recommended.

What do you think, Arnold? (Image from moviequotes.me)



CSEDU Wrap-up (#csedu14 #AdelEd)

Well, it’s the day after CSEDU and the remaining attendees are all checking out and leaving. All that remains now is lunch (which is not a minor thing in Spain) and heading to the airport. In this increasingly on-line age, the question is often asked “Why do you still go to conferences?”, meaning “Why do you still transport yourself to conferences rather than participating on-line?” It’s a pretty simple reason and it comes down to how well we can be somewhere using telepresence or electronic representations of ourselves in other places.

Over the time of this conference, I’ve listened to a number of talks and spoken to a number of people, as you can see from my blog and (if you could see my wallet) the number of business cards I’ve collected. However, some of the most fruitful discussions took place over simple human rituals such as coffee, lunch, drinks and dinner. Some might think that a travelling academic’s life is some non-stop whirl of dining and fun, but what is actually happening is a pretty constant round of discussion, academic argument and networking. When we are on the road, we are generally doing a fair portion of our job back home, going to talks and, in between all of this, taking advantage of being surrounded by like-minded people to run into each other and build up our knowledge networks, in the hope of being able to do more and to talk with people who understand what we’re doing. Right now, telepresence can let me view lectures and even participate to an extent, but it cannot give me those accidental meetings with people where we can chat for 5 minutes and work out if we should be trying to work together.
Let’s face it, if we could efficiently send all of the signals that we need to know if another human is someone we want to work with or associate with, we’d have solved this problem for computer dating and, as I understand it, people are still meeting for dinners and lunch to see if what was represented on line had any basis in reality. (I don’t know about modern computer dating – I’ve been married for over 15 years – so please correct me if I’m wrong.)

Of course, for dating, most people choose to associate with someone who is already in their geographical locale, but academics don’t have that luxury, because we don’t tend to have incredible concentrations of similar universities and research groups in one place (although some concentrations do exist), and a conference provides us with a valuable opportunity to walk our raw ideas out into company and see what happens. There is also a lot to be said for the “defusing” nature of a face-to-face meeting, when e-mail can be so abrupt and video conferencing can provide quite jagged and harsh interactions, made more difficult by network issues and timezone problems. That is another good reason for conferences: everyone is away and everyone is in the same timezone. The worst conference to attend is one that is in your home town, because you will probably not take time off work, you’ll duck into the conference when you have a chance – and this reduces the chances of all of the good things we’ve talked about. It’s because you’re separated from your routine that you can have dinner with academic strangers or hang around after coffee to spend the time to talk about academic ideas. Being in the same timezone also makes it a lot easier, as multi-continent video conferences often select times based on what is least awful for everyone, so the Americans are up too early, the Australians are up too late, and the Europeans are missing their lunches. (Again, don’t mess with lunch.)

It’s funny that the longer I stay an academic, the harder I work at conferences but it’s such a good type of hard work. It’s productive, it’s exciting, it’s engaging and it allows us to all make more progress together. I’ve met some great people here and run into some friends, both of which make me very happy. It’s almost time to jump back on a plane and head home (where I turn around in less than 14 hours to go and run another conference) but I feel that we’ve done some good things here and that will lead to better things in the future.

A place for meeting people and taking the time for academic thought.


It’s been a blast, CSEDU, let’s do it again. Buenos días!