The Value of Investment… in People

I apologise in advance because this post is less about learning and teaching as practice and more about protecting the people who provide the learning and teaching. It’s probably more political than usual and, although I’m trying to stay neutral, some of that is inevitable, so please feel free to skip it.

As many of you will know, a number of Australian Universities have had to slash their budgets in the wake of the global financial crisis, often because a large portion of their operating budgets comes from investments, rather than Commonwealth funds for student places or localised research and consulting income. As part of this, rounds of staff retrenchments, targeted redundancies and the general theme of ‘reducing sail’ seem to show up on the news sites with increasing regularity.

Now, you might not be a huge fan of the American airline Southwest, but their continued profitability and growth after the horror of 9/11 was attributed to a number of key strategies that the airline adopted during that difficult time. Most other major airlines cut their routes and their staff to reduce their expenditure and, as we all know, staff are expensive. Southwest carried out no layoffs, instead treating the situation as an opportunity for expansion. If everyone else was reducing presence then, soon enough, people would want to fly again. (I note that this was only really possible because Southwest had committed to keeping their debt low and their cash on hand high, which meant that they didn’t have to service dead debt or carry out a fire sale to strip out the debt or make their interest payments.) From what I’ve read, Southwest still pays some of the highest salaries, is still profitable, has surprisingly good ongoing relationships between management and labour, and, despite some hiccups, is proceeding pretty well.

Pilots take years to train. Crew take years to reach recognised higher levels of competency. A Captain requires somewhere between 10 and 20 years of experience, thousands of hours of flying, skill tests and a commitment to physical fitness. Good cabin crew are also hard to find and take time to train, to give your airline the consistency and excellence of experience that keep people coming back. Who do you fire when the money starts getting squeezed? Senior people (and lose their expertise) or your junior people (and artificially age your workforce)? Worse still, when people start getting fired, what behaviour will you get from the rest? Solidarity, where you drive a wedge between management and worker because the workers unite for good or ill, or treachery, where workers turn on each other to scrabble up the side of the ship to get out of the water? A climate of fear takes focus away from your core business.

Academics take years to train. Senior educators, researchers and administrators take years to reach recognised higher levels of competency. A Professor takes 10 to 20 years of experience, thousands of hours of reading and writing, millions of dollars in grants, PhDs and other skill tests, commitment to … ok, well, there the analogy stops, but a lot of us run or work out because we’re desk-bound. Good administrative support people and professional staff members are hard to find and take time to train, to give our academy the consistency and excellence of experience that keep people coming back.

Do I really need to go on? At a time when the rest of the world is reeling from the GFC, Australia has had some interaction with it but, from most accounts, nothing like the impact elsewhere. When the world recovers, we want to be able to take as many students as possible, into well-established, well-staffed and actively growing programs. If we don’t do that, then somebody else will. There are new Universities going up all over the regions that we have traditionally seen as our student recruiting grounds. (Some Unis here are already changing their acceptance policies to address this but, with reduced staff, you have to wonder how extra intake is going to be balanced.)

This is the moment where we could go in many different directions – and only some of them are good.

We have an amazing opportunity to take what I believe to be a generally excellent educational basis at the tertiary level and make ourselves more available to the domestic and international markets. Shrinking a school, or an area, to half of its staff is not a 1- or even 5-year decision, it’s a 10–20 year decision. Could we be more efficient? Yes, I think we could. Do we need to keep quality levels high? Yes, but then let’s have that discussion and tell people what we want to achieve. Is there a finite amount of extra load a given academic can take? Yes. There’s only so far we can squeeze before we risk compromising long-term sustainability, quality and excellence. It’s pretty obvious that there’s a lot of house cleaning going on at the moment, for a range of reasons, and I can’t help thinking that a lot of that could be handled through good management, rather than broad-brush activities like this. But I’m a junior woodchuck, so my view may be heavily compromised and rose-tinted.

I’m always scared that these staff reduction exercises take out core aspects of our elders, remove the unlucky and encourage those who are capable in many fields to go elsewhere – at least as much as they remove people that we may actually want to “get rid of”. Worse, the climate of fear, of losing your job or having to shoulder the load as being one of the ‘lucky’ ones left behind, takes focus away from our core business – excellence in learning, teaching and research.

I look at Southwest and, yes, it’s a bit of a stretch, but I wonder what would happen if we committed to riding this out and seeing what opportunities opened up – with no need for division, enforced solidarity or encouraged treachery.


Teaching CS in the 21st Century: CS as a fundamental skill.

Today’s Guardian has a feature in their Computer Science and IT section that includes a lot of very interesting pieces, ranging from what’s scaring girls away from coding, to why we need to be able to program, to John Naughton’s proposal for rebooting the computing curriculum – an open letter to the Education Minister for the UK. Feel free not to read the rest of my piece if you’re pressed for time – the links on the first page will keep you busy for quite a while.

For those who are still reading, here’s a picture of ubiquitous access to computers in the developing world – giving people the possibility of doing anything with their lives. (Image is from this World Food Programme page, the food aid branch of the UN, showing the Nepalese deployment of the XO Laptop, with a programme focused on bringing young people into education, combined with a cooking oil-based incentive scheme if daughters attend at least 80% of the time.)

Children using a cheap high accessibility laptop.

What I took away from reading the Guardian feature is the overwhelming message that we should teach programming and computer awareness for the same reason that we teach maths and science to all students, regardless of where they’ll end up – because that’s the world in which they live. To quote Naughton’s article:

We teach elementary physics to every child, not primarily to train physicists but because each of them lives in a world governed by physical systems. In the same way, every child should learn some computer science from an early age because they live in a world in which computation is ubiquitous. (Item 3, A Manifesto for Teaching CS in the 21st Century.)

I’ve read too many articles about various government programs that try to raise standards but do so in a way that concentrates effort on some areas while starving, or at least sidelining, all of the others. If we don’t see Information and Communication Technology (ICT) skills as vital, then we won’t assign priority to them. They’ll get shunted out of the way for other topics, like Maths, Language skills and Science. ICT is not more important than these but, in the world that our students will have to occupy, ICT needs a seat at the table. As many other, and better, commentators have noted, the transformation of the workforce continues apace and programming and computer use are now vital skills in many jobs.

We need the focus in schools, because then we can hire the teachers, which drives the job market, which causes the teacher training, which improves the quality, which improves the number of competent graduates, and ultimately leads to knowledgeable and fully-participating members of our civilised democracies where those little boxes on desks aren’t a mystery or intimidating. I can’t take more people into my Uni-level courses than are being produced by schools – and, sadly, not everyone who has the skill or training at school goes on to use it. I can’t wave a wand and turn the “less than 20%” of women who start my degree into 50% by the end. (Well, yes, I can, but I can’t do it fairly or ethically.) I can do the best I can with the people I get but I’d really love to get a lot more people with the skills!

We all know this is a challenge because we have so many acronyms that might mean ‘Computer training’ – are we teaching ICT, IS, IT, CS, CSE? To step back from the acronyms, and their deliberate placement for emphasis, are we teaching computational or algorithmic thinking (problem solving and solution design), are we teaching computer usage at a fundamental level, are we teaching people how to use certain packages, certain techniques – where does programming fit into all of this?
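To make that distinction concrete, here is a toy sketch (my own example, not drawn from the Guardian feature) of the kind of algorithmic thinking I mean. The insight worth teaching is the strategy – halve the search space on every guess – not the Python syntax it happens to be written in:

```python
# Algorithmic thinking in miniature: the "guess my number" game.
# Instead of guessing 1, 2, 3, ... always guess the middle of the
# remaining range, so 100 possibilities need at most 7 guesses.

def guess_count(secret, low=1, high=100):
    """Count how many halving guesses it takes to find `secret`."""
    guesses = 0
    while low <= high:
        guesses += 1
        mid = (low + high) // 2  # guess the middle of what's left
        if mid == secret:
            return guesses
        elif mid < secret:
            low = mid + 1        # secret is higher: discard lower half
        else:
            high = mid - 1       # secret is lower: discard upper half
    return guesses

# Every secret from 1 to 100 is found in at most 7 guesses.
print(max(guess_count(n) for n in range(1, 101)))  # → 7
```

Whether you call that ICT, CS or problem solving, the transferable idea – decompose a problem, pick a strategy, reason about why it works – is what I’d want every student to meet, long before they meet any particular programming language.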

All of us need at least a subset of these skills now, in the 21st century. On a daily basis, I download more software updates and modifications, and program more items around my house, than I ever did in the years before 1995.

As always, time and resource budgets are tight and, because of this, this is not a problem we can solve at one college, one school or even in one state. This is why governments have to make this a national priority if initiatives like this are to succeed. This doesn’t have to mean standardised testing or fixed curricula – it means incentives to provide quality education in certain areas, with supportive high-level goals and curriculum consideration, as well as allocated money for training and community building. Of course, there are many existing initiatives, like the UK revamp of the high school curriculum and available on-line resources but, here in Australia, we still don’t seem to have a strong linkage between a senior school course and University entry, and it must be hard to direct students into a certain path if there is no benefit for them. There are some excellent starting points, however, such as the Australian Government’s Digital Education Revolution, so there is certainly some hope for the future, but we need long-term vision and bipartisan support for these initiatives if they’re going to continue and make real change over time.



Post #100: Why I Haven’t Left My University

In light of all of the posts from people telling us why they have left their jobs (Goldman Sachs, Google and the Empire, with the meme still rising), I wanted to spend my 100th post telling you why I’m not leaving my job.

  1. I’m not disillusioned. A lot of the “Why I Left” (WIL) posts talk about the authors discovering that their job wasn’t what it seemed, or that it had changed and the culture was gone, or that terrible things had happened and either evil Ring Lords had taken over their world or, in some cases, Evil Hobbits had killed the Benevolent Dictator. (Perspective is important.) Yes, University culture is changing but, firstly, not all change is bad and, secondly, a lot of positive change is taking place. Is this the job I thought it was when I started? Well, no, but that’s because I didn’t really understand what the job was. Education, knowledge, learning, teaching, research, integrity, persistence, excellence. Sometimes the framework it comes in can be irritating (matrix management, I’m looking at you) but the core is solid and, because of that, the house stands. I’m now spending effort to get into positions where I can help that change occur in a good way and with a good goal.
  2. I don’t work for shareholders. Or, if I do, I work for 22 million of them. This is a big one. Most Universities in Australia are public Universities – government money, i.e. taxes, goes to the universities to pay about half of their bills. Everyone who pays tax invests in the Universities that educate them and their children. Because we live in Australia, even if you can’t pay tax at the moment then, while it is not as equitable and accessible as it used to be (we could fix that, you know), it is still possible for people to go to college. Yes, it would be nice if it were free again but that certainly wouldn’t happen under a profit-driven, shareholder-vested model. I work for the people and, because of that, I have to be ready to educate anyone, anywhere, anytime. I don’t get to fail a group of people because I’ve decided that they’re not smart enough for me – I need to look at what I need them to do and what they can do and get them from one place to the other. Maybe they need more help to get to that stage? That’s my job to work out as well, at my level. Some of them won’t make it, sure, but I never want it to be due to anything that I didn’t do.
  3. My job is fantastic. On a given day I can be discussing new developments in technology, encouraging a group of students to code, writing applications for my own research or getting time to stare at a wall and think about how to make the world a better place. Better yet, I have AMAZING ROBES OF POWER in which to do this in times of high celebration. Yes, every so often someone says “Those who can, do; those who can’t, teach” but I have been and I have done, and I continue to do, and now I also teach (I’ve posted in the past about authenticity). The most useful thing about that phrase is that, when it’s said seriously, you’ve just been saved a lot of effort in character assessment. 🙂
  4. I am a small part of a large community doing the most important job of all. From kindergarten to PhD, the preparation and training of the next generation is one of the most important things that will ever get done. Since we developed writing, we’ve been able to scale up our expert numbers to match the number of trainees with increasing ability – first we had to copy by hand, then print, and now we have electronic distribution. But we still need educators to complete the process of developing knowledge and enabling people to receive and develop knowledge themselves. What we do is important because, without it, society goes away. Knowledge erodes. Things fall down. The machine stops.
  5. Every so often, someone says thank you. Every so often, one of my students comes back, covered in the dust of the real world and thanks me for what I’ve done. Yes, they often say things like “Wow, that thing you told me – did you know it was right?” but I know what they mean. All that sitting in lecture theatres and working on assignments – it had a purpose. That purpose was the right one. Thank you.

And that’s five good reasons why I’m still here.


Nice Suit! Why My Improved Taste In Clothes Helps Me Teach.

A picture of Barney from How I Met Your Mother

Graduation day can be one of the really big days for my students, as is the first day that they go off for job interviews, or placement interviews – the first day that they have some skills, a matching qualification and have put on the clothes and trappings of business. As Barney would say, “Suit up!”

I’m not intending to start a discussion here on the utility of the suit (because for anyone who has to do tech support, there is none), or on the assumption that the suit is practical wear in all climates (because in Australia in high summer it most certainly is not), but I do want to talk about the comfort of the suit.

Now, one of the weirdest things about suits is the number of people who wear uncomfortable or even dangerously constricting business attire. It would be hard to imagine a more consistently uncomfortable group of people than a large group of graduating students, sitting in a packed, hot hall, waiting to graduate, necks chafing if they’re wearing ties, sweating because of the layers, possibly risking ankle damage or a fall if they’re in unfamiliar heels and, overall, being ultimately miserable while waiting for the moment when we give them the big piece of paper and say “Go off and be legen…”

Wait for it.

“…dary”.

These days, I have very simple requirements of my clothes. Everything I wear has to be as comfortable as my long-distance running gear. When you run over 20 miles (32 km), you don’t have the ability to carry much spare clothing. What you wear has to be comfortable, suitable and, above all, must not chafe, regardless of sun, wind and rain. This is clothing to achieve things in – and all of my clothing should do this!

People told me that suits meant business. But suits only mean business because business people wear suits. This kind of dogma is subtly and explicitly divisive – explicitly because if you can’t afford a suit, then you’re on the back foot; subtly because if you can’t afford a good suit, you’re sending a message of either impecunity or ignorance. Now, yes, for special presentations, funerals and where everyone else will be wearing a suit, I will still suit up. But, whenever possible, I wear a nice shirt and trousers – or good jeans. Or shorts, in summer. This is far more practical for what I do and allows me to still walk the 3 miles/5 km from home to work and get my thinking time in. There’s neat, there’s well-dressed and then there’s some of the nightmares passed off as business attire. There is a wealth of secret knowledge, affluence barriers, expectations and, above all, hidden pitfalls in this whole business attire thing that really makes me wonder whether we’re focusing on the right things. I can’t tell my students not to wear business clothing, because the reality is that some people just won’t hire them, but I should be able to help them to develop a mental framework where they can analyse what is being asked of them and then work out if they are happy to pay the price to achieve a goal.

I don’t pretend to be wise but I can now appreciate that I have done enough things, and failed at a sufficient number, that I’ve learned right and wrong ways to approach problems and find solutions. My students need me to share this with them because, although some of the lessons won’t sink in until they do it themselves, any proto-wisdom that I can pass on may save them time. If I tell them what dogma looks like, and get them focused on the right things, then I help them to identify some of the things that they will hit once they leave us. I don’t feel more or less of a teacher if I wear shorts or a suit, but, in so many ways, the way that I expose my students to knowledge, discuss it with them and reinforce it will determine how their brain is dressed when they step out into the world. It will also strongly affect how they will improve upon what we’ve taught them and how they accumulate more information into the future. Basically, if I get across to my students the idea that we are giving them a foundation, which will be solid, and show them how to build – then sometime down the line, they’re on the way to something special and rewarding.

And being confident, skilled and competent at what you do, that’s probably the best thing that you can ever wear.


The Binary World of Steve Jobs

I’ve commented before on Steve Jobs but, having just finished Walter Isaacson’s fascinating biography, I’ve had some other thoughts that I wanted to talk about here.

I stand by my previous post: regardless of the success of Apple or Steve Jobs’ achievements, I still wouldn’t let him near my classes. But there are still many things that my students can learn from his ideas, his example, his life and, of course, his death. It’s just important to separate some of the innate Steveness from the ideas. His desire for the right solution, his attention to design and his drive for perfection are all things that I can use in my teaching. The amount of time spent trying to make every piece of something functional and beautiful – I couldn’t find better exemplars of the design principles I’ve been talking about, and you can find them in most homes and in most people’s hands.

But one thing that was thrown into sharp relief for me throughout the biography was the strictly dichotomous nature of his world view. A dichotomy is the splitting of something into two, non-overlapping parts. An often heard dichotomy is “if you’re not with us, you’re against us.” (This is usually a false dichotomy, implying that there are only two choices when there are probably more. If you’re curious, the “Saw” movie franchise exercises the false dichotomy for most of its running – pretending that the protagonists only have two options and that the choice that they make inside that morally and physically restrictive space is somehow a reflection of their ethics.)

Steve Jobs’ world was full of dichotomies. Things were either excellent or they were terrible. Sometimes this switched, very rapidly, depending on the day or who was being spoken to. People were heroes or… well, let’s say villains because I’m trying to keep this clean. There is no doubt that this contributed to the pursuit of excellence in many ways, but my reading of the biography rather obliquely suggests that it was the sheer brilliance and excellence of the people around him at Apple that made this happen, to some extent despite this stark view.

A diagram of the hate/like dichotomy

This is pretty much what Isaacson reports as Steve Jobs’ world view and, while it’s quite clear and clean in many regards, its simplicity is undermined by the fact that the things in either set could cross that yellow line in unpredictable ways. Now, once again, yes, Apple are hugely successful and there is no doubt that this binary approach had a lot to do with a great deal of that success – but this is not a view that naturally generates discussion. Once again, this is an important part of my job: I need to get students talking.

It would be trivial for me to walk out, ask a question, mock people who give me a weak or incorrect answer, write ‘idiot’ on their assignments and never give them strong guidance as to how to fix it other than “It’s not right” – but that’s not what I’m getting paid for. I will happily talk to my students about purity of vision and strong design principles, and try to give them feedback that they recognise as feedback to reinforce this (trickier than it looks) but, at the end of the day, me lecturing at people doesn’t get as much information across as me getting them involved in a broader discussion of issues and principles. It’s very easy to say “this sucks”. It’s much harder to say why this sucks and, in discussing why, we naturally start to head towards how we can fix it, because we can see the reasons that it’s terrible.

Now, I’m going to move away from Steve’s heroes/villains, great/terrible dichotomies to some of those I see from students while I teach. I have to be able to handle a far less dichotomous view of the world and I have to draw the students away from this as well. Hardware and OS dichotomies abound: PCs don’t suck, Macs don’t rule. Macs aren’t for grandmas and noobs, PCs aren’t the only true programming platform. There’s the regrettable and seemingly entrenched gender dichotomy in STEM – men and women are far more individually distinctive than any mindless and echolalic gender stereotypes that try to give a falsely dichotomous split. (And, of course, this doesn’t even begin to address the discussion on the number of gender identities being greater than two!)

I don’t have a fundamental problem with people being able to identify things that they like or don’t like, I just need to exercise this as a matter of degree in my teaching and I have to pass on to my students that even if they want to draw a line in the sand to separate their world, having only two categories imposes a very hard structure on a much more complicated world. I also need to be able to explain why a categorisation has been made or all I’m going to pass on is dogma – something indisputable that has to be specifically learned in order to be known, versus something that is a matter for discussion. I teach Computer Science – a discipline based heavily on mathematics, usually implemented in artificially-created, short-term universes with arbitrary physical rules inside the system. I’m not sure that I have enough hard ground to stand on to be dogmatic!

At the end of all this, there’s no doubt I would have found Steve Jobs charismatic, fascinating and terrifying, probably in equal parts, and I suspect that he would have had little time for my somewhat woolly, generous and contemplative approach. I certainly could never have achieved what he achieved and I don’t seek to criticise him for what he did because, frankly, I don’t really know enough about him and who am I to judge? But I can look at this example and think about it, in order to work out how I can improve the way that my students think, work and interact with other people. And, bottom line, I don’t think false dichotomies are the way to go forward.


I Am Thinking, HE/SHE Is Procrastinating, THEY Are Daydreaming

This is a follow-up thought to my recent post on laziness. I spend a lot of time thinking and, sometimes, it would be easy to look at me and think “Wow, he’s not doing anything.” Sometimes, in my office, I stare at a wall, doodle, pace the corridor, sketch on the whiteboard or, if I’m really stuck, go for a walk down by the river. All of this helps me to clear and organise my thoughts. I use tools to manage what I have to do and to get it done in time but the cognitive work of thinking things through sometimes takes time. The less I sleep, the longer it takes. That’s why, while I’m jet lagged, I will do mostly catch-up and organisational work rather than thinking. Right now I can barely do a crossword, which is an excellent indicator that my brain is fried for anything much more complex than blogging. Given that I last slept in a bed over 30 hours ago, this isn’t surprising.

Now it’s easy to accept that I stumble around, somewhat absent-mindedly, because I’m an academic and you can all understand that my job requires me to do a lot of thinking…

But so many jobs require a lot of thinking to be done well – or, at least, the component tasks that go to make up modern jobs do.

It’s a shame, then, that most people focus on activity rather than quality. If I were to sit in my office and type furiously but randomly, answer mails curtly, never leave for coffee or cake, and have to schedule meetings three weeks in advance – what a powerhouse I would appear! Except, of course, that I wouldn’t really appear that way to people who knew what I was supposed to do. I don’t do the kind of job where I can move from task to task without, in most cases, detailed research including a search for new material, construction, creation, design, analysis, building, testing and executing. As always, this doesn’t make my job better or worse than anyone else’s, but I don’t carry out the same action repeatedly, an action that can be reduced in cognitive load with familiarity – I tend to do something at least slightly different each time. Boilerplate repetition is more likely to indicate that I am not doing my job correctly, given the roles that I hold.

So, if there are no points in a week where I sit there with books or papers or doodles or sketches of ideas and I think about them – then I’m really running the risk of not doing my job. I need to produce work of high quality and, because there’s a lot of new content creation, there’s creation/editing/testing… load throughout. Some of which, to an external viewer, looks like sitting around throwing paper into the bin while I hunt for solutions.

I think about this a lot for my students. I expect them, in a lecture, to not sit and think so much that they don’t communicate. I will try and bring them back from mental flights of fancy rather than let them fly off because I’ve only got an hour or two with them and need to try to get certain concepts across. And then what? Sometime in 4th year, or PhD, I expect them to flip a switch and realise that the apparent inactivity of quiet, contemplative thought is one of the most productive activities? That a day where you write eight pages, and on review only salvage half of one page, could be the most important and useful day in your PhD?

This is why I tend not to give out marks for ‘just anything’ – two pages of nonsense gets zero, and there are no marks for effort, because that would be rewarding the wrong activity, especially where we haven’t achieved quality. Similarly, I don’t give out marks for attendance but for the collaboration – if you are running an activity, getting the students to do something, I think it’s always best to reward them for doing the activity, not just attending the framing session! But this, of course, comes hand-in-hand with the requirement to give them enough timely feedback that they can improve their mark – by improving the quality of what they produce.

Electronic learning systems could be really handy here. Self-paced learning, with controlled remote assessment mechanisms, allows this thinking time and the ability to sit, privately, and mull over the problems. Without anyone harassing them.

Years ago, when I was still in the Army Reserve, we were on exercise for a couple of weeks and my soldiers were getting pretty tired because we’d been running 4-hour shifts to staff the radios. You sat on the radios for 4 hours, then you were off for 4. Every so often you might get 6 hours off but it was unlikely. This meant that my soldiers were often sleeping in the middle of the day, desperately trying to make up lost sleep as well as periodically showering, shaving and eating. 4 hours goes really quickly when you’re not on duty. People in our base area who WEREN’T doing these shifts thought that my soldiers were lazy and, on at least two occasions, tried to wake them up to use them on work parties – digging holes, carrying things, doing soldier stuff. My soldiers needed their sleep and I was their commander, so I told the other people, politely, to leave them alone. My operators had a job to do and maintained the quality of their work by following a very prescribed activity pattern – but the people around them could only see inactivity because of their perspective.

Maybe it’s time to look at my students again, look at what I’m asking them to do and make sure that what I’m asking and that the environment I’m giving them is the right one. I don’t think we’re doing too badly, because of previous reviews, but it’s probably never too soon to check things out again.


Soft Power follow-up

The magazine “Monocle” has covered soft power in previous issues and, amusingly enough, about 24 hours after I put the previous post into the queue, they ran another article featuring a hard/soft comparison that was very similar to mine – I hadn’t seen it and, obviously, they neither saw nor cared about mine, but the coincidence amused me. However, other discussions of soft power in the media include what will happen to the Cato Institute, which has had a significant cultural influence (whether for good or ill I leave to the reader) and now appears to be moving towards a less diverse controlling board. I’m not advocating for Cato (most certainly not) but this is a salient reminder that soft power is used by many different people to attempt non-military, non-confrontational change towards whatever they consider to be the correct way to live or to carry out activity x.

Putting Education into this sphere of “things that you should really think about” seems even more appropriate in this context. But, and it’s a big but (and I cannot lie), it is as easy to place material into the public eye that attacks teaching as it is to defend it. Regrettably, enough people are influenced by the first argument they see that even vaguely aligns with their beliefs – it becomes a fact, and attempts to argue against it just reinforce it. What this means to me is that positive, constructive examples should be seen everywhere.

Which comes back to us. I’m still a bit jet lagged but it’s right on top of my to do list. “Be educationally excellent – frequently.” 🙂


Impact and Legacy. A Memoriam For a Man I Never Met.

Paul Haines is dead. I never met him. Part of his legacy, however, is that you can read what I’m writing now.

Paul was a writer, and a very good one, who I got to know, to an extent, through LiveJournal. Regrettably, it was after he was diagnosed with the cancer that went on to kill him, on March 5th, 2012, but his account of his striving to survive and his continuing desire to write and be a father and a family man had a great effect upon me. Sadly, it didn’t remove my love of subordinate clauses but my own fiction is now a far more Australian fiction – a more authentic expression of myself. I told him that it was an embarrassment that it took a New Zealander to show me how to be Australian. I’m happy in that I was able to tell him this while he was still alive and awake, before he slipped deeper down and went. I’m sad in that we agreed to share a beer one day, me hoping that it would come to pass and him knowing that it was a ghost’s promise. I’m sad that he leaves behind a wife and young daughter. And I’m angry at cancer but, then, I’m always angry at cancer.

I have always considered my legacy to be my students and my friends. I have no children of my own and cats don’t last forever. The extent to which I now feel the loss of a man I never met reminds me, not that I should need it, that honest writing, regular writing, naked writing is a legacy of its own. Part of Paul’s legacy is here, on this page as you read it, as well as in his books and on-line writing.

If you’re reading this, then you know I write – but do you? What do you want to say to the world? They say that “The first Velvet Underground album only sold 10,000 copies, but everyone who bought it formed a band.” I wonder how many people we can get to write? Paul’s struggle, his account of his life, his works and his untimely death, foreshadowed as it was for so long, touched me and led me to write. To finally start putting things out there. Not because I have anything that amazing to say but because I have anything to say at all.

You don’t know when and you don’t know where. Do you have something to write? Do you have anything to say? Share it with us all, please. We may hate it, it may scare us – or it might inspire another person. More impact. A legacy.

Tomorrow I will fly home. As I approach New Zealand, Paul’s birthplace, and as I leave it and head towards his adopted home, I’ll raise a glass for that beer we never had and toast his memory. And when I land, I look forward to reading your writings.

RIP: Paul Richard Haines 8 June 1970 – 5 March 2012


SIGCSE, Keynote #2, Hal Abelson, “The midwife doesn’t get to keep the baby.”

Well, another fantastic keynote and, for the record, that’s not the real title. The title of the talk was From Computational Thinking to Computational Values. For those who don’t know who Hal Abelson is, he’s a Professor of EE/CS at MIT who has made staggering contributions to pedagogy and the teaching of Computer Science over the years. He’s been involved with the first implementations of Logo, changed the way we think about using computer languages, has been a cornerstone of the Free Software Movement (including the Foundation), led the charge of the OpenCourseWare (OCW) at MIT, published many things that other people would have been scared to publish and, basically, has spent a long time trying to make the world a better place.

It went without saying that, today, we were in for some inspiration and, no doubt, some sort of call to arms. We weren’t disappointed. What follows is as accurate a record as I could make, typing furiously. I took a vast quantity of notes over what was a really interesting talk and I’ll try to get the main points down here. Any mistakes are mine and I have tried to represent the talk without editorialising, although I have adjusted some of the phrasing slightly in places, so the words are, pretty much, Professor Abelson’s.

Professor Abelson started from a basic introduction of Computational Thinking (CT) but quickly moved on to how he thought that we’d not quite captured it properly in modern practice: it’s how we look in this digital world and see it as a source of empowerment for everybody, as a life changing view. Not just CT, but computational values.

What do we mean? We’re not only talking about cool ideas but that these ideas should be empowering and people should be able to exercise great things and have an impact on the world.

He then went on to talk about Google’s Ngram viewer, which allows you to search all of the books that Google has scanned in and find patterns. You can use this to see how certain terms, ideas and names come and go over time. What’s interesting here is that (1) ascent to and descent from fame appear to be getting faster and (2) you can visualise all of this and get an idea of the half-life of fame (which was nearly the title of this post).
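The core idea behind this kind of trend analysis is simple enough to sketch in a few lines of Python: for each year, count how often a term occurs relative to all words published that year. This is a toy illustration of the concept, not Google’s actual pipeline; the function name and the miniature corpus below are invented for the example.

```python
from collections import Counter

def term_frequency_by_year(corpus_by_year, term):
    """For each year, return the relative frequency of `term`
    among all words in that year's texts (Ngram-viewer style)."""
    term = term.lower()
    freqs = {}
    for year, texts in corpus_by_year.items():
        counts = Counter()
        total = 0
        for text in texts:
            words = text.lower().split()
            counts.update(words)
            total += len(words)
        freqs[year] = counts[term] / total if total else 0.0
    return freqs

# A toy corpus: a handful of "books" per year.
corpus = {
    1990: ["the telegraph is old news", "news travels by telegraph"],
    2000: ["the internet carries the news", "news on the internet"],
}

trend = term_frequency_by_year(corpus, "telegraph")
```

Plot those per-year frequencies for a real corpus and you get exactly the rise-and-fall curves the Ngram viewer shows; the “half-life of fame” is just how quickly a name’s curve decays after its peak.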

Abelson describes this as a generative platform: one which can be used for things that were not thought of when it was built, one we can build upon ourselves and change over time. Generating new things for an unseen future. (The paper reference here was Nature, with a covering article from another magazine entitled “Researchers Aim to chart intellectual trends in Arxiv”.)

Then the talk took a turn. Professor Abelson took us back eight years, to when Duke’s “Give everyone an iPod” project gave every student (eventually) a free iPod and encouraged them to record, share and mix up what they were working with.

Enter the Intellectual Property Lawyer. Do the students have permission to share the lecturer-created creative elements of the lectures?

Professor Abelson’s point is that we are becoming more concerned with locking up our content in proprietary Content Management Systems (CMS), and this risks turning the academy into a marketplace for packaged ideas and content, rather than a place of open enquiry and academic freedom. This was the main theme of the talk and we’ve got a lot of ground left to cover here! This talk was for those who loved computational values, rather than property creation.

We visited the early, ham-fisted attempts to grant limited licences for simple activities like recording lectures and the immediately farcical notion that I could take notes of a lecture and be in breach of copyright if I then discussed it with a classmate who didn’t attend. Ngrams shows what happens when you have a system where you can do what you like with the data – what if the person holding that data for you, which you created, starts telling you what to do? Where does this leave our Universities?

Are we producing education or property? Professor Abelson sees this as a battle for the soul of the Universities. We should be generative.

We can take computational actions, actions that we will take to reinforce the sense that we have that people ought to be able to relish the power that they get from our computational thinking and computational ideas. This includes providing open courseware (like MIT’s OCW and Stanford’s AI) and open access to research, especially (but not only) when funded by the public purse.

As a teaser, at this point, Abelson introduced MITx, an online intensive learning system that opens up on MONDAY. No other real details – put it in your calendar to check out on Monday! MIT want their material and their content engines to be open source and generative – that word again! Put it into your own context or framework and do great things!

The companion visions to all of this are this:

  1. Great learning institutions provide universal access to course content. (OpenCourseWare)
  2. Great research institutions provide universal access to their collective intellectual resources. (DSpace)

What are the two reasons that we should all support these open initiatives? Why should we fill in the moat and open the drawbridge?

  1. Without initiatives to maintain them, we risk marginalising our academic values and stressing our university communities.
  2. To keep a seat at the table in decisions about the disposition of knowledge in the information age.

Abelson introduced an interesting report, “Who Owns Academic Work? Battling for Control of Intellectual Property”, which discusses the conflation of property and academic rights.

Basically, scientific literature has become property. We, academia, produce it and then give away our rights to journal publishers, who give us limited rights in exchange on a personal level and then hold onto it forever. Neither our institution nor the public has any right to this material anymore. We looked at some examples of rights. Sign up to certain publishers and, from that point on, you can use only up to 250 words of any of the transferred publications in a new work. The number of publishers is shrinking and the cost of subscription is rising.

Professor Abelson asked how it is that, in this sphere alone, the midwife gets to keep the baby? We all have to publish if we act individually, as promotions and tenure depend upon publication in prominent journals – but there is hope (and here he referred to the mathematicians’ boycott of the Elsevier publishing group). HR 3699 (the Research Works Act) could have challenged any federal law that mandated open access to federally funded research. Lobbied for by the journal publishing group, it lost support, firstly from Elsevier, and then from the two members of Congress who proposed it.

Even those institutions that have instituted an open access policy are finding it hard – some publishers have made specific amendments to the clause that allows pre-print drafts to be displayed locally, to say “except where someone has an institutionally mandated open access policy”.

BUT. HR3699 has gone away for now. Abelson’s message is that there is hope!

We have allowed a lot of walled gardens to spring up. Places where data is curated and applications made available, but only under the permission of the gardener. Despite our libraries paying up to hundreds of thousands of dollars for access to the on-line journal stores, we are severely limited in what we can do with them. Your library cannot search it, index it, scrape it, or many other things. You can, of course, buy a service that provides some of these possibilities from the publisher. A walled garden is not a generative environment.

Writing in 2008, Jonathan Zittrain listed two important generative technologies: the internet and the PC, because you didn’t need anyone’s permission to link or to run software. Now, in Technology Review, Zittrain thinks the PC is dead because of the number of walled gardens that have sprung up.

In Professor Abelson’s words:

Network Effects
lead to
Monopoly Positions
lead to
Concentration of Channels
lead to
Decline of Generativity.

What about tomorrow? Will our students have the same tinkering possibilities that we had? Will any of our old open software still run? Will mobile computing be tinkerable? Open source allows for small tinkering steps, and reduces our reliance on monolithic, approved releases.

The talk then concluded with some more of Professor Abelson’s words, which I reproduce here because they are far better than mine:
We have the spark of inspiration about how one should relate to their information environment and the belief that that kind of inspiration, power and generativity should be available to everybody.

These beliefs are powerful and have powerful enemies. Draw on your own inspiration and power to make sure that what inspired us is going to be available to our students.

Another month, another milestone!

That’s another month of blogging down. At some stage, I plan to measure what my output has been and try to come up with some indication of how I can improve my content. I’ll probably try to make things tighter, add some pictures, and occasionally write separate, longer essays.

Only 10 more months of 1 post / day to keep to my original goal!

Thanks for reading – if you’re new, you can start at Jan 1 and work forward; if you’re a long-time reader, thanks for sticking around.

I wanted to put a picture of success or winning here but, frankly, there are only so many pictures of grumpy babies and Charlie Sheen that anybody needs. So enjoy the rapturous and simplistic text. I’ll see you tomorrow.