The Limits of Expressiveness: If Compilers Are Smart, Why Are We Doing the Work?

I am currently on holiday, which is “Nick shorthand” for catching up on my reading, painting and cat time. Recently, my interests in my own discipline have widened and I am precariously close to that terrible state that academics sometimes reach when they suddenly start uttering words like “interdisciplinary” or “big tent approach”. Quite often, around this time, the professoriate will look at each other, nod, and send for the nice people with the butterfly nets. Before they arrive and cart me away, I thought I’d share some of the reading and thinking I’ve been doing lately.

My reading is a little eclectic, right now. Next to Hooky’s account of the band “Joy Division” sits Dennis Wheatley’s “They Used Dark Forces” and next to that are four other books, which are a little more academic. “Reading Machines: Towards an Algorithmic Criticism” by Stephen Ramsay; “Debates in the Digital Humanities” edited by Matthew Gold; “10 PRINT CHR$(205.5+RND(1)); : GOTO 10” by Montfort et al; and “‘Pataphysics: A Useless Guide” by Andrew Hugill. All of these are fascinating books and, right now, I am thinking through all of these in order to place a new glass over some of my assumptions from within my own discipline.

“10 PRINT CHR$…” is an account of a simple line of code in Commodore 64 BASIC, which draws diagonal mazes on the screen. In unpacking this one line, the authors explore fundamental aspects of computing and, in particular, creative computing and how programs exist in culture. Everything in the line says something about programming back when the C-64 was popular, from the use of line numbers (required because you had to establish an execution order without necessarily being able to arrange elements in one document) to the $ after CHR, which tells both the programmer and the machine that what results from this operation is a string, rather than a number. In many ways, this is a book about my own journey through Computer Science: growing up with BASIC programming and accepting its conventions as the norm, only to have new and strange conventions pop out at me once I started using other programming languages.
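For the curious, the effect is easy to re-create today. Here is a minimal sketch in C – my own illustration, not the book’s code – with “/” and “\” standing in for the C64’s diagonal graphics characters:

    /* A sketch of the 10 PRINT effect in C (an illustration, not the
       book's code): "/" and "\" stand in for the C64's diagonal
       graphics characters. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    int main(void)
    {
        srand((unsigned) time(NULL));          /* seed the random choice  */
        for (;;)                               /* loop forever: "GOTO 10" */
            putchar(rand() % 2 ? '/' : '\\');  /* pick one and display it */
    }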

Rather than discuss the other books in detail, although I recommend all of them, I wanted to talk about specific aspects of expressiveness and comprehension because, if there is one thing I am thinking after all of this reading, it is “why aren’t we doing this better?” The line “10 PRINT CHR$…” is effectively incomprehensible to the casual reader, yet if I wrote something like this:

do this forever
pick one of “/” or “\” and display it on the screen

then anyone who spoke English (which used to be a larger number than those who could read programming languages but, honestly, today I’m not sure about that) could understand what was going to happen and, more than that, could create something themselves without having to work out how to make it happen. You can see phrasing like this in languages such as Scratch, which is intended to teach programming by providing an easier bridge between everyday language and programming, using pre-constructed blocks and far more approachable terms. Why is it so important to create? One of the debates raging in Digital Humanities at the moment, at least according to my reading, is “who is in” and “who is out” – what does it take to make one a digital humanist? While this used to involve “being a programmer”, it is now considered reasonable to “create something”. For anyone who is notionally a programmer, the two are indivisible. Programs are how we create things and programming languages are the form that we use to communicate with the machines, to solve the problems that we need solved.

When we first started writing programs, we instructed the machines in simple arithmetic sequences that matched the bit patterns required to ensure that certain memory locations were processed in a certain way. We then provided human-readable shorthand, assembly language, where mnemonics replaced numbers, to make it easier for humans to write code without error. “$20” became “JSR” in 6502 assembly code, for example, yet “JSR” is as impenetrably occulted as “$20” unless you learn a language that is not actually a language but a compressed set of acronyms. Roll on some more years and we have added pseudo-English over the top: GOSUB in BASIC and the use of parentheses to indicate function calls in other languages.

However, all I actually wanted to do was to make the same thing happen again, maybe with some minor changes to what it was working on. Think of a sub-routine (method, procedure or function, if we’re being relaxed in our terminology) and you may as well think of a washing machine. It takes in something and combines it with a determined process, a machine setting, powders and liquids to give you the result you wanted, in this case taking in dirty clothes and giving back clean ones. The execution of a sub-routine is identical to this, but can you see the predictable familiarity of the washing machine in JSR FE FF?
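To make the contrast concrete, here is the washing machine as a C function – a sketch, with names invented for illustration – where at least the label on the front tells you what the machine does:

    /* The subroutine as a washing machine (a sketch; the names are
       invented). Something goes in, a fixed process runs, the result
       comes back. Underneath, the call is still the same
       jump-to-subroutine that JSR performs on a 6502. */
    typedef int Clothes;              /* hypothetical type */

    Clothes wash(Clothes dirty)
    {
        Clothes clean = dirty;
        /* ... water, detergent, spin: the predictable process ... */
        return clean;                 /* hand back the result */
    }

A call like wash(muddy_shirt) at least advertises its purpose in a way that JSR FE FF never can.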

If you are familiar with ‘Pataphysics, or even “Ubu Roi”, the best-known of Jarry’s works, you may be aware of the pataphysician’s fascination with the spiral – le Grand Gidouille. The spiral, once drawn, defines not only itself but another spiral in the negative space that it contains. The spiral is also a natural way to think about programming because a very well-used programming language construct, the for loop, often counts up to a value or down from one. It is not uncommon for this kind of counting loop to allow us to advance from one character to the next in a text of some sort. When we define a loop as a spiral, we clearly state what it is and what it is not – it is not retreading old ground, although it may always spiral out towards infinity.
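In C, one of those spirals over a text might look like this – a sketch of mine, not the book’s:

    /* A counting loop as a spiral (an illustrative sketch): each pass
       advances one character through the text, never retreading
       old ground. */
    #include <stdio.h>

    int main(void)
    {
        const char *text = "10 PRINT";
        for (int i = 0; text[i] != '\0'; i++)   /* spiral along the text */
            printf("%d: %c\n", i, text[i]);
    }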

However, for maximum confusion, the for loop may iterate a fixed number of times but never use the changing value that is driving it – it is no longer a spiral in terms of its effect on its contents. We can even write a for loop that goes around in a circle indefinitely, executing the code within it until it is interrupted. Yet, we use the same keyword for all of these.
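Three sketches in C make the point – the bodies are invented, but the keyword never changes:

    /* One keyword, three shapes (a sketch). */
    #include <stdio.h>

    int main(void)
    {
        int data[] = {1, 2, 3}, total = 0;

        for (int i = 0; i < 3; i++)        /* a spiral: i drives the work  */
            total += data[i];
        printf("total = %d\n", total);

        for (int i = 0; i < 10; i++)       /* counts ten times, ignores i  */
            printf("ding\n");

        for (;;)                           /* a circle: goes round forever */
            ;                              /* until something interrupts   */
    }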

In English, the word “get” is incredibly overused. There are very few situations when another verb couldn’t add more meaning, even in terms of shade, to the situation. Using “get” forces us, quite frequently, to do more hard work to achieve comprehension. Using the same words for many different types of loop pushes load back on to us.

What happens is that, when we write our loop, we are required to do the thinking as to how we want the loop to work – although Scratch provides a forever, very few other languages provide anything like it. To loop endlessly in C, we would use while (true) or for (;;), but to tell the difference between a loop that is functioning as a spiral and one that is merely counting, we have to read the body of the loop to see what is going on. If you aren’t a programmer, does for(;;) give you any inkling at all as to what is happening? Some might think “Aha, but programming is for programmers” and I would respond with “Aha, yes, but becoming a programmer requires a great deal of learning, so why don’t we make it simpler?” To which the obvious riposte is “But we have special languages which will do all that!” and I then strike back with “Well, if that is such a good feature, why isn’t it in all languages, given how good modern language compilers are?” (A compiler is a program that turns programming languages into something that computers can execute – English-like words into byte patterns, effectively.)

In thinking about language origins, and what we are capable of with modern compilers, we have to accept that a lot of the heavy lifting in programming is already being done by modern, optimising compilers. Years ago, the compiler would just turn your instructions into a form that machines could execute – with no improvement. These days, put something daft in (like a loop that does nothing for a million iterations), and the compiler will quietly edit it out. The compiler will worry about optimising your storage of information and, sometimes, even help you to reduce wasted use of memory (no, Java, I’m most definitely not looking at you).
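For example, a daft loop like the one below is typically deleted outright by gcc or clang at -O2, since nothing observable depends on it (a sketch; exact behaviour depends on the compiler and flags):

    /* A daft loop (sketch): it does nothing a million times, so an
       optimising compiler is free to remove it entirely; waste_time
       then typically compiles down to a bare return. */
    void waste_time(void)
    {
        for (long i = 0; i < 1000000; i++)
            ;   /* no observable effect */
    }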

So why is it that C++ doesn’t have a forever, a do 10 times, or a spiral to 10 equivalent in there? The answer is complex but is, most likely, a combination of standards issues (changing a language standard is relatively difficult and requires a lot of effort), the fact that other languages already do things like this, the burden of increasing compiler complexity to handle synonyms like this (although this need not be too arduous) and, above all, the fact that I doubt many people would see a need for it.
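As a hint that the machinery need not be arduous, the C preprocessor can fake the synonyms in two lines – a sketch, certainly not a proposal for the standard:

    /* Faking the synonyms in C (a sketch; not standard, not robust). */
    #include <stdio.h>

    #define forever      for (;;)
    #define do_times(n)  for (int _i = 0; _i < (n); _i++)

    int main(void)
    {
        do_times(10)
            printf("ten of these\n");

        forever
            printf("and these, until interrupted\n");
    }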

In reading all of these books, and I’ll write more on this shortly, I am becoming increasingly aware that I tolerate a great deal of limitation in my ability to solve problems using programming languages. I put up with having my expressiveness reduced, with taking care of some unnecessary heavy lifting in making things clear to the compiler, and I occasionally even allow the programming language to dictate how I write the words on the page itself – not just syntax and semantics (which are at least understandable, socially and technically) but the use of blank lines, white space and line endings.

How are we expected to be truly creative if conformity and constraint are the underpinnings of programming? Tomorrow, I shall write on the use of constraint as a means of encouraging creativity and why I feel that what we see in programming is actually limitation, rather than a useful constraint.


Doo de doo dooooo, doo de doo doo dooooo.

"What did you do in the 80s, Daddy?""I don't want to talk about it."

“What did you do in the 80s, Daddy?”
“I don’t want to talk about it.”

Some of you will recognise the title of this post as the opening ‘music’ of the Europe song, “The Final Countdown”. I wasn’t sure what to call this post because it was the final component of a year-long cycle that began with some sketchy diagrams and a sketchier plan and has seen several different types of development over time. It is not, however, the final post on this blog, as I intend to keep blogging, but, from this post forwards, I will no longer require myself to provide at least one new post every day.

This is, perhaps, just as well, because I am already looking over 2013 and realising that my ‘free project’ space is now completely occupied until July. Despite my intentions to travel less, I am in the US twice before the middle of March and have several domestic trips planned as well. And this is a reminder of everything that I’ve been trying to come to terms with in writing this blog and talking about my students, myself, and our community: I can talk about things and deal with them rationally in my head, but that doesn’t mean that I always act on them.

In retrospect, it has been a successful year and I have been able to produce more positive change in 2012 than probably in the sum of my working contributions up until that point. However, I am not in as good a shape as I was at the start of the year, for a variety of reasons, so when I say that my ‘free project’ space is full, I mean that I have fewer additional things to do but I am deliberately allocating less of my personal time to do them. In 2013, family and friends come first, then my projects, then my required work. Why? Because I will always find a way to do the work that I’m supposed to do, but if I start with that I can use all of my time to do that, whereas if I invert it, I have to be more efficient and I’m pretty confident that I can still get it done. After all, next year I’ll have at least an extra hour or two a day from not blogging.

Let’s not forget that this blogging project has consumed somewhere in the region of 350-400 hours of my time over the year, and that’s probably an underestimate. 400 hours is ten working weeks or just under 17 days of contiguous hours. Was my blog any better for being daily? Probably not. Could I be far more flexible and agile with my time if I removed the daily posting requirement? Of course – and so, away it goes. (So it goes, Mr Vonnegut.) The value to me of this activity has been immense – it has changed the way that I think about things and I have a far greater basis of knowledge from which I can discuss important aspects of learning and teaching. I have also discovered how little I know about some things but at least I know that they exist now! The value to other people is more debatable but, given that I know at least some people have found use in it, it’s non-zero and I can live with that. Recalling Kurt Vonnegut again, and his book “Timequake”, I always saw this blog as a place where people could think “Oh, me too!” as I stumble my way through complicated ideas and try to comprehend the developed notions of clever people.

“Many people need desperately to receive this message: ‘I feel and think much as you do, care about many of the things you care about, although most people do not care about them. You are not alone.'” (Vonnegut, Timequake, 1997)

I never really thought much about the quality of this blog, but I was always concerned about the qualities of it. I wanted it to be inclusive, reliable, honest, humble, knowledgeable, useful and welcoming. Looking back, I achieved some of that some of the time and, at other times, well, I’m a human. Some days I was angrier than others but I like to think it was about important things. Sexism makes me angry. Racism makes me angry. The corruption of science for political ends makes me angry. Deliberate ignorance makes me angry. Inequity and elitism make me angry. I hope, however, the anger was a fuel for something better, burning to lift something up that carried a message that wasn’t just pure anger. If, at any stage, all I did was combine oxygen and kerosene on the launch pad and burn the rocket, then I apologise, because I always wanted to be more useful than that.

This is not the end of the blog, but it’s the end of one cycle. It’s like a long day at the beach. You leap out of bed as the sun is coming up, grab some fruit and run down to the water, still warm from the late summer currents and the hot wind that blows across it, diving in to swim out and look back at the sand as it lights up. Maybe you grab your fishing rod and spend an hour or two watching the float bob along the surface, more concerned with talking to your friend or drinking a beer than actually catching a fish, because it’s just such a nice day to be with people. Lunch is sandy sandwiches, eaten between laughs in the gusty breeze that lifts up the beach and tries to jam a big handful of grains into every bite, so you juggle it and the tomato slides out, landing on your lap. That’s ok, because all you have to do is to dive back into the water and you’re clean again. The afternoon is beach cricket, squinting even through sunglasses as some enthusiastic adult hits the ball for a massive 6 that requires everyone to search for it for about 15 minutes, then it’s some cold water and ice creams. Heading back that night, and it’s a long day in an Australian summer, you’re exhausted, you’re spent. You couldn’t swim another stroke, eat another chip or run for another ball if you tried. You’ll eat something for dinner and everyone will mumble about staying up but the day is over and, in an hour or so, everyone will be asleep. You might try and stay up because there’s so much to do but the new day starts tomorrow. Or, worst case, next summer. It’s not the end of the beach. It’s just the end of one day.

Firstly, of course, I want to thank my wife who has helped me to find the time I needed to actually do this and who has provided a very patient ear when I am moaning about that most first world of problems: what is my blog theme for today. The blog has been a part of our lives every day for 1-2 hours for an entire year and that requires everyone in the household to put in the effort – so, my most sincere gratitude to the amazing Dr K. There’s no way I could have done any of this without you.

For everyone who is not my wife, thank you for reading and being part of what has been a fascinating journey. Thank you for all of your comments, your patience, your kindness and your willingness to listen. I hope that you have a very happy and prosperous New Year. Remember what Vonnegut said: people need to know, sometimes, that they are not alone.

I’ll see you tomorrow.


And this is the real me! Yes, it was me ALL ALONG!
Happy New Year!


Thanks for the exam – now I can’t help you.

I have just finished marking a pile of examinations from a course that I co-taught recently. I haven’t finalised the marks but, overall, I’m not unhappy with the majority of the results. Interestingly, and not overly surprisingly, one of the best-answered sections of the exam was based on a challenging essay question I set as an assignment. The question spans many aspects of the course and requires the student to think about their answer and link the knowledge – which most did very well. As I said, not a surprise, but a good reinforcement that you don’t have to drill students in what to say in the exam: covering the requisite knowledge and practising the right skills is often enough.

However, I don’t much like marking exams and it doesn’t come down to the time involved, the generally dull nature of the task or the repetitive strain injury from wielding a red pen in anger; it comes down to the fact that, most of the time, I am marking the student’s work at a time when I can no longer help him or her. Like most exams at my Uni, this was the terminal examination for the course, worth a substantial amount of the final marks, and was taken some weeks after teaching finished. So what this means is that any areas I identify for a given student cannot now be corrected, unless the student chooses to read my notes in the exam paper or come to see me. (Given that this campus is international, that’s trickier but not impossible, thanks to the Wonders of Skypenology.) It took me a long time to work out exactly why I didn’t like marking but, when I did, the answer was obvious.

I was frustrated that I couldn’t actually do my job at one of the most important points: when lack of comprehension is clearly identified. If I ask someone a question in the classroom, on-line or wherever, and they give me an answer that’s not quite right, or right off base, then we can talk about it and I can correct the misunderstanding. My job, after all, is not actually passing or failing students – it’s about knowledge, the conveyance, construction and quality management thereof. My frustration during exam marking increases with every incomplete or incorrect answer I read, which illustrates that there is a section of the course that someone didn’t get. I get up in the morning with the clear intention of being helpful towards students and, when it really matters, all I can do is mark up bits of paper in red ink.


Quickly, Jones! Construct a valid knowledge framework! You’re in a group environment! Vygotsky, man, Vygotsky!

A student who, despite my sweeping, and seeping, liquid red ink of doom, manages to get a 50 Passing grade will not do the course again – yet this mark pretty clearly indicates that roughly half of the comprehension or participation required was not carried out to the required standard. Miraculously, it doesn’t matter which half of the course the student ‘gets’, they are still deemed to have attained the knowledge. (An interesting point to ponder, especially when you consider that my colleagues in Medicine define a Pass at a much higher level and in far more complicated ways than a numerical 50%, to my eternal peace of mind when I visit a doctor!) Yet their exam will still probably have caused me at least some gnashing of teeth because of points missed, pointless misstatement of the question text, obscure song lyrics, apologies for lack of preparation and the occasional actual fact that has peregrinated from the place where it could have attained marks to a place where it will be left out in the desert to die, bereft of the life-giving context that would save it from such an awful fate.

Should we move the exams earlier and then use the results to guide where we focus our teaching, developing knowledge in the areas of most need? Should we abandon exams entirely and move to a continuous-assessment, competency-based system, where there are skills and knowledge that must be demonstrated correctly and are practised until this is achieved? We are suffering, as so many people have observed before, from overloading the requirement to grade and classify our students into neatly discretised performance boxes onto a system that ultimately seeks to identify whether these students have achieved the knowledge levels necessary to meet the course objectives. Should we separate competency and performance completely? I have sketchy ideas as to how this might work but none that survive under the blow-torches of GPA requirements and resource constraints.

Obviously, continuous assessment (practicals, reports, quizzes and so on) throughout the semester provides a very valuable way to identify problems, but this requires good, and thorough, course design and an awareness that this is your intent. Are we premature in treating the exam as a closing-off line on the course? Could we work on it the same way that we do any assignment: you get feedback, a mark and then more work to follow up? If we threw resourcing to the wind, could we have a 1-2 week intensive pre-semester program that specifically addressed those issues that students failed to grasp on their first pass? Congratulations, you got 80%, but that means that there’s 20% of the course that we need to clarify. (Those who got 100% I’ll pay to come back and tutor, because I like to keep cohorts together and I doubt I’ll need to do that very often.)

There are no easy answers here and shooting down these situations is very much in the fish/barrel plane, I realise, but it is a very deeply felt form of frustration that I am seeing the most work that any student is likely to put in, yet I cannot now fix the problems that I see. All I can do is mark it in red ink with an annotation that the vast majority will never see (unless they receive the grade of 44, 49, 64, 74 or 84, which are all threshold-1 markers for us).

Ah well, I hope to have more time in 2013 so maybe I can mull on this some more and come up with something that is better but still workable.


Thinking about teaching spaces: if you’re a lecturer, shouldn’t you be lecturing?

I was reading a comment on a philosophical post the other day and someone wrote this rather snarky line:

He is a philosopher in the same way that (celebrity historian) is a historian – he’s somehow got the job description and uses it to repeat the prejudices of his paymasters, flattering them into thinking that what they believe isn’t, somehow, ludicrous. (Grangousier, Metafilter article 123174)

Rather harsh words in many respects, and it’s my alteration of the (celebrity historian)’s name, not his, as I feel that his comments are mildly unfair. However, the point is interesting as a reflection upon the importance of job title in our society, especially when it comes to the weighted authority of your words. From January the 1st, I will be a senior lecturer at an Australian University and that is perceived differently where I am. If I am in the US, I reinterpret this title into their system, namely as a tenured Associate Professor, because that’s the equivalent of what I am – the term ‘lecturer’ doesn’t clearly translate without causing problems, not even dealing with the fact that more lecturers in Australia have PhDs, whereas many lecturers in the US do not. But this post isn’t about how people necessarily see our job descriptions, it’s very much about how we use them.

In many respects, the title ‘lecturer’ is rather confusing because it appears, like builder, nurse or pilot, to contain the verb of one’s practice. One of the big changes in education has been the steady acceptance of constructivism, where the learners have an active role in the construction of knowledge and we are facilitating learning, in many ways, to a greater extent than we are teaching. This does not mean that teachers shouldn’t teach, because this is far more generic than the binding of lecturers to lecturing, but it does challenge the mental image that pops up when we think about teaching.

If I asked you to visualise a classroom situation, what would you think of? What facilities are there? Where are the students? Where is the teacher? What resources are around the room, on the desks, on the walls? How big is it?

Take a minute to do just this and make some brief notes as to what was in there. Then come back here.

It’s okay, I’ll still be here!



False Dichotomy: If I don’t understand it, then either I am worthless or it is!

I’ve been reading an interesting post on Metafilter about “Minima Moralia: Reflections from Damaged Life”, by Theodor Adorno. While the book itself is very interesting, two of the comments on the article caught my eye. An earlier commenter had mentioned that they neither understood nor appreciated this kind of thing, and made the usual throwaway remark about postmodernism being “a scam to funnel money from the productive classes to the parasitical academy” (dydecker). Further down, another commenter, Frowner, gently took this statement to task, starting by noting that Adorno would have been appalled by being labelled a post-modernist, and then discussing why dydecker might have felt the need to attack things in this way. It’s very much worth reading Frowner’s comments on this post, but I shall distil the first one here:

  1. Just because a text is difficult or obscure does not mean that it is postmodern. Also, “post-modernist” is not actually an insult, and this may be a politically motivated stance to attack a group of people who are also likely to identify as critical of the status quo or (gasp) Marxist.

  2. Not all texts need to be accessible to all audiences, nor is something worthless, fake or elitist if it requires pre-readings or some effort to get into. Advanced physics texts can be very difficult to comprehend for the layperson. This does not make Quantum Field Theory wrong or a leftist conspiracy.
  3. You don’t need to read books that you don’t want to read.
  4. You don’t need to be angry at difficult books for being difficult. To exactly quote Frowner,

    Difficult books only threaten us if we decide to feel guilty and ashamed for not reading them.

    If you’re actually studying an area, and read the books that the work relies upon, difficult books can become much clearer, illustrating that it was perhaps not the book that was causing the difficulty.

  5. Sometimes you won’t like something and this has nothing to do with its quality or worth – you just don’t like it.
  6. Don’t picture a perfect reader in your head who understands everything and hold yourself to that standard. If you’re reading a hard book then keep plugging away and accept your humanity.

Frowner then goes on to beautifully summarise all of this in a later comment, noting that we seem to learn to be angry at, or uncomfortable with, difficult texts because we are under pressure to be capable of understanding everything of worth. This is an argument of legitimacy: if the work is legitimate and I don’t understand it, then I am stupid; however, if I can argue that the work is illegitimate, then this is a terrible con job, I am not stupid for not understanding it and we should attack this work! Frowner wonders about how we are prepared for the world and believes that we are encouraged to see ourselves as inadequate if we do not understand everything for ourselves, hence the forced separation of work into legitimate and illegitimate, with an immediate, and often vicious, attack on those things we define as illegitimate in order to protect our image of ourselves.

I spend a reasonable amount of time in art galleries and I wish I had a dollar for everyone who stood in front of a piece of modern art (anything from the neo-impressionists on, basically) and felt the need to loudly state that they “didn’t get it” or that they could “have painted it themselves”. (I like Rothko, Mondrian and Klee, among others, so I am often in that part of the gallery.) It is quite strange when you come to think about it – why on earth are people actually vocalising this? Looking more closely, it is (less surprisingly) people in groups of two or more who seem to do this: I don’t understand this so, before you ask me about it, I will declare it to be without worth. I didn’t get it, therefore this art has failed me. We go back to Frowner’s list and look at point 2: not all art (in this case) is for everyone and that’s ok. I can admire Grant Wood’s skill and his painting “American Gothic”, but the painting doesn’t appeal as much to me as does the work of Schiele, for example. That’s ok, and that doesn’t make Schiele better than Wood in some Universal Absolute Fantasy League of Painters (although the Schiele/Klimt tag-team wrestling duo, with their infamous Golden Coat Move, would be fun to watch) – it’s a matter of preference. I regularly look at things that I don’t quite understand but I don’t regard it as a challenge or an indication that it or I are at fault, although I do see things that I understand completely and can quite happily identify reasons that I don’t like them!

Klee's "The Goldfish". Some will see this as art, others will say "my kids could do that". Unless you are Hans Wilhelm Klee, no, probably not.

Klee’s “The Goldfish”. Some will see this as art, others will say “my kids could do that”. Unless you are Hans Wilhelm Klee, no, probably not.

I am, however, very lucky, because I have a job and lifestyle where my ability to think about things is a core component: falsely dichotomous thinking is not actually what I’m paid to do. However, I do have influence over students and I need to be very careful in how I present information to them. In my last course, I deliberately referred to Wikipedia among other documents because it is designed to be understood and is usually shaped by many hands until it reaches an acceptable standard of readability. I could have pointed my students at ethics texts but these texts often require more preparation and a different course structure, which may have put students off actually reading and understanding them. If my students go into ethics, or whatever other area they deem interesting, then point 4 becomes valid and their interest, and contextual framing, can turn what would have been a difficult book into a useful book.

I agree with this (effectively) anonymous poster and his or her summary of an ongoing issue: we make it hard for people to admit that they are learning, that they haven’t quite worked something out yet, because we make “not getting something immediately” a sign of slowness (informally) and often attach negative outcomes to it (in assessment or course and career progression). We do not have to be experts at everything, nor should we pretend to be. We risk not actually learning some important and beautiful things because we feel obliged to reject them before they reject us – and some things of great worth, that will be long appreciated, take longer to ‘get’ than just the minute or two that we feel we can allocate.


Adelaide Computing Education Conventicle 2012: “It’s all about the people”

ACEC 2012 was designed to be a cross-University event (that’s the whole point of the conventicles: they bring together people from a region) and we had a paper from the University of South Australia: ‘”It’s all about the people”; building cultural competence in IT graduates’ by Andrew Duff, Kathy Darzanos and Mark Osborne. Andrew and Kathy came along to present and the paper was very well received, because it dealt with an important need and a solid solution to address that need, which was inclusive, insightful and respectful.

For those who are not Australians, it is very important to remember that the original inhabitants of Australia have not fared very well since white settlement and that the apology for what happened under many white governments, up until very recently, was only given in the past decade. There is still a distance between the communities and the overall process of bringing our communities together is referred to as reconciliation. Our University has a reconciliation statement and certain goals in terms of representation in our staff and student bodies that reflect percentages in the community, to reduce the underrepresentation of indigenous Australians and to offer them the same opportunities. There are many challenges facing Australia, and the health and social issues in our indigenous communities are often exacerbated by years of poverty and a range of other issues, but some of the communities have a highly vested interest in some large-scale technical, ICT and engineering solutions, areas where indigenous Australians are generally not students. Professor Lester Irabinna Rigney, the Dean of Aboriginal Education, identified the problem succinctly at a recent meeting: when your people live on land that is 0.7m above sea level, a 0.9m sea-level rise starts to become of concern and he would really like students from his community to be involved in building the sea walls that address this, while we look for other solutions!

Andrew, Kathy and Mark’s aim was to share out the commitment to reconciliation across the student body, making this a whole-of-community participation rather than a heavy burden for a few, under the guiding statement that they wanted to be doing things with the indigenous community, rather than doing things to them. There’s always a risk of premature claiming of expertise, where instead of working with a group to find out what they want, you walk in and tell them what they need. For a whole range of very good and often heartbreaking reasons, the Australian indigenous communities are exceedingly wary when people start ordering them about. This was the first thing I liked about this approach: let’s not make the same mistakes again. The authors were looking for a way to embed cultural awareness and the process of reconciliation into the curriculum as part of an IT program, sharing it so that other people could do it and making it practical.

Their key tenets were:

  1. It’s all about the diverse people. They developed a program to introduce students to culture, to give them more than the single world view of the dominant culture and to introduce knowledge of the original Australians. It’s an important note that many Australians have no idea how to use certain terms or cultural items from indigenous culture, which of course hampers communication and interaction.

    For the students, they were required to put together an IT proposal, working with the indigenous community, that they would implement in the later years of their degree. Thus, it became part of the backbone of their entire program.

  2. Doing with [people], not to [people]. As discussed, there are many good reasons for this. Reduce the urge to be the expert and, instead, look at existing statements of rights and how to work with other peoples, such as the UN Declaration on the Rights of Indigenous Peoples and the UniSA graduate attributes. This all comes together in the ICUP – Indigenous Content in Undergraduate Program.

How do we deal with information management in another culture? I’ve discussed before the (to many) quite alien idea that knowledge can reside with one person and, until that person chooses or needs to hand on that knowledge, that is the person that you need. Now, instead of demanding knowledge and conformity to some documentary standard, you have to work with people. Talking rather than imposing, getting the client’s genuine understanding of the project and their need – how does the client feel about this?

Not only were students working with indigenous people in developing their IT projects, they were learning how to work with other peoples, not just other people, and were required to come up with technologically appropriate solutions that met the client need. Not everyone has infinite power and 4G LTE to run their systems, nor can everyone stump up the cash to buy an iPhone or download apps. Much as programming in embedded systems shakes students out of the ‘infinite memory, disk and power’ illusion, working with other communities in Australia shakes them out of the single worldview and from the, often disrespectful, way that we deal with each other. The core here is thinking about different communities and the fact that different people have different requirements. Sometimes you have to wait to speak to the right person, rather than the available person.

The online forum has four questions that students have to work through, where the forum is overseen by an indigenous tutor. The four questions are:

  1. What does culture mean to you?
  2. Post a cultural artefact that describes your culture?
  3. I came here to study Computer Science – not Aboriginal Australians?
  4. What are some of the differences between Aboriginal and non-Aboriginal Australians?

The first two are amazing questions – what is your answer to question number 2? The second pair of questions is more challenging and illustrates the bold, head-on nature of this participative approach to reconciliation. Reconciliation between all of the Australian communities requires everyone to be involved and, being honest, questions 3 and 4 are going to open up some wounds and drag some silly thinking out into the open but, most importantly, they allow us to talk through issues of concern and confusion.

I suspect that many people can’t really answer question 4 without referring back to mid-50s archetypal depictions of Australian Aborigines standing on one leg, looking out over cliffs, and there’s an excellent ACMI (Australian Centre for the Moving Image) exhibit in Melbourne that discusses this cultural misappropriation and stereotyping. One of the things that resonated with me is that asking these questions forces people to think about these things, rather than repeating old mind grooves and received nonsense overheard in pubs, seen on TV and heard in racist jokes.

I was delighted that this paper was able to be presented, not least because the goal of the team is to share this approach in the hope of achieving even greater strides in the reconciliation process. I hope to be able to bring some of it to my Uni over the next couple of years.



The Emperor’s New Clothes Redux: The Sokal Hoax

Making your way in a new area of scholarship can be challenging for many reasons, no matter how welcoming the community. One of the reasons for this is that there are points in our life where we are allowed to make larger mistakes, or be ignorant, but it is rarer for adults, especially those already established in a job, to be allowed the latitude to say “I have no idea”. As I discussed yesterday, the fable of the Emperor’s New Clothes explains this dilemma well, because children have more licence to be honest to the point of tactlessness, whereas an adult is always weighing up the implications of admitting that they cannot quite see what everyone else is talking about.

Some of you will be familiar with the Sokal Hoax, where Professor Alan Sokal, a physicist at NYU, submitted an article to a journal of postmodern cultural studies. The work, “Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity”, was accepted by the journal Social Text, which was (at the time) not practising academic peer review. Sokal did not intend for this article to be taken seriously, or even expect it to be published, although he did produce an article that he described (in a follow-up article) as:

“a pastiche of Left-wing cant, fawning references, grandiose quotations, and outright nonsense … structured around the silliest quotations … he could find about mathematics and physics”

The entire affair is worth reading and you can find the Wikipedia summary here and a good critique about some of Sokal’s less intended consequences here (transcript of a New York Review of Books article). Regrettably, what is less clear is whether Sokal actually achieved very much, in real terms, by carrying out this action. Yes, Social Text moved to an academic peer review system and that’s generally better for all concerned. For those who don’t know, peer review is the process by which submitted articles go to a number of other people in the field, who review the work to see if it is fit to publish. This reduces the load on the editors and allows for more, and more specific, areas of expertise to be involved. It is not, however, faultless, as a poor combination of peers can still lead to substandard, or plain wrong, work getting through, especially if reviewers farm the work out to their grad students or review under time constraints. It is, therefore, not all that surprising that Sokal’s deliberately targeted paper, which identified how to get a paper published by these editors in this journal, succeeded, and it is even less surprising when you hear the editors’ account that they thought the paper needed revisions (removing much of the handwaving and contradictory footnotes) and were concerned about the article but, as the journal at the time was one of opinion, they published it anyway.

Such generosity on the part of the editors does not forgive the publication of the deliberate misuse of terminology and physics that Sokal used to highlight the lack of rigour in the journal and the editorial review process. However, one of the problems I have with this is that, as a Computer Scientist speaking to Educational researchers, I find that people often take what I say as a true account of my field, especially given that they do not have the expertise in my discipline to know (or care) about things like computability or algorithmic performance. If I were to submit a scholarly paper to a journal of education, am I doing anyone any favours by deliberately misrepresenting aspects of my field, given that I am identified by discipline and school on submission?

Yes, people should use terms correctly and there is a great deal of misuse of science for uninformed or nefarious purposes, with some of the writings coming from post-modernist-inspired writers being completely wrong. However, when one is not a physicist, one depends upon the knowledge gained from other people as to what physics is. There is a part of me that thinks that Sokal wasted an opportunity to actually fix a number of misunderstandings – for example, making a clear distinction between linear in strict mathematical and physical terms and linear in post-Derridean terms, where the meaning is (quite deliberately) less well-defined and often pejorative. Words change. Terms change. Knowledge can still exist and continue to connect terms if we make the effort to bridge, rather than to mock or deride.

The post-modernists, especially Derrida, have attracted a great deal of negative interest, often for what appear to be semi-religious objections to their approach, although I would be the first to say that Derrida’s obsession with repurposing words, redefining concepts when it suits him, and providing grammatical constructions that further, rather than reduce, ambiguity do make him a valid target for at least a raised eyebrow on many occasions. I do not have a strong opinion as to whether the Emperor, in this case, is clothed or not, but I must be honest and say that I do not believe that the outputs and constants of science are a purely cultural construction, although I do agree that the mechanism of the scientific academy is very much a cultural artefact and, if anything deserves to be reduced to its components for inspection, it is an institution that almost systematically seems to avoid recognising the contribution of women and non-western people except where unavoidable. I mention Derrida here mostly because Derrida was the first point of media attack when Sokal’s hoax was revealed. This speaks volumes for the bravery of Sokal’s attack – when the media will leap up and put a face on a stick to wave it about because “philosophy X is all mumbo-jumbo and here is the head witch doctor”, you really have to wonder what a non-peer-reviewed opinion piece in a journal dedicated to same is actually achieving. Derrida thought that the major problem with the piece was that it would make a later, serious, attempt to discuss such issues impossible to achieve.

Of course, although Sokal’s Hoax is a triumph of exposing the publication of works based on their source, authority and obscurity, this is most certainly not restricted to post-modernist journals of opinion. A friend of mine called me in once to read through a paper that used such unusual terminology, for him, that he was unsure as to whether it was good or bad. Fortunately, it was in my discipline and, because I know and can use the word ontology without dying, I was able to identify it as a low-level rehash of some basic work in the field. It was sound work, using the correct terminology, but it certainly wasn’t at the level of the conference it had been sent to – to my friend, however, it was as meaningless as anything that Sokal mocked from Derrida. I am well aware that some of my areas, including knowledge management and educational research, are seen by others to be exactly the same as the post-modernist repurposing of scientific terminology that Sokal attacks.

The point is not who is lying to whom, or whether there is anything behind some of the more obscure utterings of the Post-Modernists, but it is whether deliberately winding people up with a hoax would achieve more than a genuine attempt to reach out to and correct a community, using your expertise and developing a voice in the other discipline to provide a sound translation. Epistemology, theory of knowledge, is important and I’m really not sure that hoaxing and mockery really achieves all that much, especially as, like any extrinsic punishment approach, it tells you not to do something but not how not to do it.


The Emperor’s New (Insert Noun Here)

I’ve always enjoyed the story of the Emperor’s New Clothes, because it has a number of different readings. We can speak of the tactless honesty of the innocent, the child who sees the emperor as he is, or of the willingness to uphold the status quo when it is imposed from a sufficiently high point, in the people who pretend that the emperor is clothed. We can also look at the villains of the piece, who weave a suit that is invisible to those who are stupid, incompetent or unfit to hold a position. This is, of course, genius because it forces the viewer of the suit into that most difficult of decisions: do I speak up (and force someone to explicitly work out if I have sufficient worth to counter the prevailing interpretation) or do I stay silent (so as not to be seen to be a fool)?

There are some quite entertaining logical issues to wrestle with, starting from some fairly reasonable assumptions. Imagine that you are Courtier X, arriving in the room after Courtier 1, and you observe the Emperor. Now you know, full well, that the Emperor has always been clothed up to this point, and in the finest clothes of the land, and he is not known for his propensity for streaking. Walking into the room, you would expect the Emperor to be clothed. Let us assume that, out of a sense of survival and fellow-feeling, Courtier X-1, the one who arrived before you, hisses “He’s wearing a suit that is only invisible to idiots.” Surviving in the Royal Court would have prepared you for a life of rapid adjustment to changes of circumstance brought about by pique and the accidental collision of coronial concerns, so this information would immediately have shot through your mind and, whether you believed that the suit was there or not, behaving as if it were not has some quite obvious downsides. Firstly, the Emperor obviously believes that he is wearing this suit. Secondly, there are X-1 other courtiers in the room who have now gone along with it. Thirdly, you have a family to feed and it’s not as if you could go off to another court.

The child’s voice is unaffected by such concerns. The child sees, he thinks, he speaks. Children are very frank when they deal with difficult matters such as the apparent ugliness or facial eructations of an aged relative, the apparent size or adiposity of strangers, or the details with which bodily functions are announced. (I can, however, see a Romulan reading of the tale where the child is sent into battle for his outspokenness and fails to achieve victory – but cultures always vary in these matters.) However, everyone is now embarrassed – doubly so because not only is the Emperor nude, but everyone around him has lied to him. With any luck, the Imperial Executioner was in on the lie as well, so that he could run off ashamed before he had to behead everyone else.

Speaking truth to power is a difficult matter and we often seem to confuse it with “saying any old thing because it’s our opinion”, and the two are really not the same at all. I have previously referred to the “just sayin’” mentality, where offensive or bigoted commentary is presented because it is truthful, when it is quite obvious that it is designed to be hurtful and the words are hiding behind a pretence of honesty. Telling the Emperor that he is naked is the duty of the Emperor’s staff, because it allows us to deal with the real villains of the piece, rather than the difficult (and more likely) outcome that a small child went to bed that night with no supper. Telling the Emperor that he is fat really doesn’t serve any purpose unless you are genuinely concerned for his health and attempting to reduce his adiposity.

The Emperor’s New Clothes is often used to refer to other situations of social hypocrisy or the collective agreement on something that is not true and, as such, it is so heavily used in some areas that its coinage is seriously debased. One reading that I find fascinating is that we can regard the suit as the “words we may use to cloak our fears” (Naomi Wood, KSU) but these words do not protect us from the reality of the situation. The child is free of adult corruption, certainly, but this is also a colder and harsher world, a situation at odds with our normal thoughts on childhood.

I strongly believe that one of the key problems some of my colleagues have with educational research, and its associated vocabulary, is that some of them are convinced that we are somehow playing the Emperor’s New Clothes with them. After all, we are asking them to look at the old fabric, find it wanting, and then we are talking of a new one, describing it in terms that may not be used that often in the standard discipline. Worse, every so often I bet we make it look like any sensible person would be able to understand that this was a better approach – and this is quite damning of whoever says it, whether they are talking to students or staff. Speaking truth to power is as important peer-to-peer as it is student-to-teacher or peasant-to-king but we must distinguish between being rude and dismissive and genuinely seeking answers. I may not always succeed but I do try to use evidence, published work and, of course, the far more influential work of the real leaders in this field! I am nowhere near attaining expertise here but at least I now know where to look and where to start the discussions. I do not yet have a suit of knowledge, but I have a pair of shorts that I can wear in the company of the besuited so that we can have some discussions without me exposing myself too badly! 🙂

The antithesis of the New Clothes phenomenon also occurs frequently: people are looking at a fully-clothed person and pretending that they cannot see the clothes. Obviously, neither approach is sensible when pushed to the extreme. Sometimes we just have to use our eyes and our brains and tell people what we see. And that can be one of the hardest things to do – as well as the most valuable.


Pressganging Story into Service: The Dickens, you say?


“Marley was dead” and so begins Charles Dickens’ “A Christmas Carol” which has been reprinted and remade so many times it is near impossible to avoid the cultural impact of this work in English-speaking areas. For those who have avoided it, for whatever reason, it is a simple story. An unpleasant miser, Ebenezer Scrooge, believes Christmas to be nothing but humbug, a waste of time, a period for the stupid to amuse themselves, and a way for those who work insufficiently hard to deprive him (Scrooge) of his hard-won money. Scrooge’s transformation within the book is the core of the story, initiated by the visit of his (long dead) business partner, Marley, who warns him that only a bleak and unpleasant afterlife awaits him after death. Marley tells Scrooge that three ghosts will visit him and to change while he still can.

The first ghost, Christmas Past, shows Scrooge a younger version of himself, when he was innocent, and those obstacles he faced that put him onto his current (unpleasant, unloving and unloved) trajectory. The second ghost, Christmas Present, shows him the London he is in now: the joy of family and reuniting with old friends. The ghost takes Scrooge to visit the house of Bob Cratchit, Scrooge’s underpaid and overworked clerk, who lives with a large family and a seriously ill child, Tiny Tim, for whom no medical treatment is forthcoming because Scrooge pays Cratchit so little. Finally, Christmas Yet To Come arrives, and takes Scrooge on a dark journey to the death of Tiny Tim while still a young boy, Scrooge’s own death and the human vultures who pick over his belongings, and his untended grave in a dark corner of a forgotten cemetery.

Scrooge, reminded of his humanity, surrounded by humans and warned of the outcomes to others and himself of his perilous course, awakens on Christmas morning a changed man. His entire demeanour is permanently changed, not just for Christmas Day, but because he now seeks to be not just a better man, but the best man.

I have several film versions of this that I like: the Patrick Stewart version is good and the Bill Murray comedic version, “Scrooged”, is slightly more delightful because Scrooge (Cross, in this version) is redeemed well before his course is so firmly set. (And I like a happy ending.)

Yesterday I spoke about finding stories and myths that I could use and, even stripped of any religious overtones associated with the word Christmas, there’s still a lot to think about in the framing of A Christmas Carol. Dickens had suffered deep and lasting humiliation as a child and the engines of the Industrial Revolution had, by this time, ground up many older traditions and families along the way. Dickens’ appeal to the charity of those who can afford it is a core part of the work, as well as drawing back to pre-Cromwellian Christmas traditions that had been stamped out under the dour, washed-out grey heel of the puritans. But, back to the framing.

The story starts with the description of Scrooge as someone who is happy with their lot, but shouldn’t be. His negative interpretation of the world is as much at odds with reality as his positive perception of the many flaws of his partner, Marley. Marley’s visit forces Scrooge to listen to the one person who could start him on his journey – because no-one else would have the authenticity to speak to him.

The journey begins with advice from a mentor who wishes you to avoid making their mistakes.

The three ghosts appear to force Scrooge to identify how he has changed, how flawed his perceptions are and that his actions, or inactions, will most likely have consequences that extend beyond his lifetime. 

In order to understand why (or if we need to change), we need to understand:

  • How we have already changed to this point
  • What our environment really looks like
  • Why change might be necessary

And none of this is any surprise for anyone who has read one, two or many self-help or realisation books – except that Dickens’ story is full of emotion and a reason for changing. In all of its forms, I have found the thread of the Cratchits to be one of the most moving. Scrooge’s loss and decline one could almost (well, I can’t but some could) write off as the unfortunate actions of a man who attained what he thought he wanted: wealth, and thus a derived happiness. Scrooge is obviously not happy but there are far too many who would ponder ‘why’ when he was so rich! (For every aphorism regarding “money not buying happiness”, there are many examples apparently to the contrary and David Gilmour’s famous riposte “… but it will let you park your yacht right next to it.”)

Tiny Tim, for me, is the core of this myth, because Tim is ill through no fault of his own, but because of the time, the body and the family that he was born into. It’s not Tim’s fault, but that simple fact is not enough to save him from dying – he needs other people to realise that he deserves better because of what he is (a child) rather than who he is (a child of poor parents). Scrooge is not an evil man, although he is most certainly not a good man at the start, and the death of the child is never what he intended, because it would never have occurred to him that Cratchit would have that much of a life outside of the office. Scrooge’s indifference to the world, to the city of London, to Cratchit and to his own humanity is part of the initial transformation that he underwent to become the Scrooge we first see. That is the essence of Scrooge – he can change because he has changed before. When Scrooge changes, he finally starts down the path to happiness, which appears to hold him in this enlightened and positively changed state for the rest of his long (and happy) life.

I enjoy the story and it’s something I always revisit in the lead-up to Christmas, because it is very easy to start getting all ‘bah, humbug’ in the face of the commercialism, over-expectation and sheer hype of the holiday season. However, looking at it as a story about change, I’m forced to think about who could come to me and say “Don’t be like me”. How have I changed from where I was 20, 10 or even 5 years ago? What am I ignoring around me that I could be appreciating more?

Where will this path take me?

What would you expect to see, if the mentor and the three ghosts came to see you?


John Henry Died

Every culture has its myths and legends, especially surrounding those incredible individuals who stand out from or tower over the rest of society. The Ancient Greeks and Romans had their gods, demigods, heroes and, many times, cautionary tales of the mortals who got caught in the middle. Australia has the stories of pre- and post-Federation mateship, often anti-authoritarian or highlighting the role of the larrikin. We have a lot of bushrangers (with suspiciously good hearts, or reacting against terrible police oppression), Simpson and his donkey (a First World War hero who transported men to an aid station using his donkey, ultimately dying on the battlefield) and a Prime Minister who goes on YouTube to announce that she’s now convinced that the Mayans were right and we’re all doomed – tongue firmly in cheek. Is this the totality of the real Australia? No, but the stylised notion of ‘mateship’, the gentle knock and the “come off the grass, you officious … person” attitude are as much a part of how many Australians see themselves as shrimp on a barbie is to many US folk looking at us. In any Australian war story, you are more likely to hear about the terrible hangover that Gunner Suggs had, and how he dragged his friend a kilometre over rough stones to keep him safe, than about how many people he killed. (I note that this mateship is often strongly delineated along gender and racial lines, but it’s still a big part of the Australian story.)

The stories that we tell, and those that we pass on as part of our culture, strongly shape that culture. Look at Greek mythology and you see stern warnings against hubris – don’t rate yourself too highly or the gods will cut you down. Set yourself up too high in Australian culture and you’re going to get knocked down as well: a ‘tall poppy’ syndrome that is part cultural cringe inherited from colonial attitudes to the Antipodes, part hubris and part cultural confusion, as Anglo, Euro, Asian, African and… well, everyone, come to terms with a country that even the original inhabitants, the Australian Aboriginal and Torres Strait Islander peoples, took quite a while to adapt to. As someone who wasn’t born in Australia, like so many others who live here and now call themselves Australian, I’ve spent a long time looking at my adopted homeland’s stories to see how I fit. Along the way, because of travel, I’ve had the opportunity to look at other cultures as well: the UK, obviously, as it’s drummed into you at school, and the US, because it interests me.

The stories of Horatio Alger, from the US, fascinate me because of their repeated telling of the rags-to-riches story. While most of Alger’s protagonists never become amazingly wealthy, they rise, through their own merits, to take the opportunities presented to them and, because of this, a good man will always rise. This is, fundamentally, the American Dream: that any person can, effectively, become President through the skills that they have and through rolling up their sleeves. We see this Dream become ugly when any of its three principles no longer holds, in a framing I first read from Professor Harlon Dalton:

  1. The notion that we are judged solely on our merits: For this to be true, we must not have any bias – racist, gendered, religious, ageist or other. Given the recent ruling that an attractive person can be sacked purely for being attractive and for providing an irresistible attraction for their boss, we have evidence that not only is this point not holding in many places, it’s failing in ways that beggar belief.
  2. We will each have a fair opportunity to develop these merits: This assumes equal opportunity in terms of education and in terms of jobs, which promptly ignores things like school districts, differing property tax levels and teacher training approaches. Because of the way that school districts work, simply living in a given state or county because your parents live there (and can’t move) can make the difference between a great education and a sub-standard child-minding service. So this doesn’t hold either.
  3. Merit will out: Look around. Is the best, smartest, most talented person running your organisation or filling all of the key positions? Can you locate anyone in the “important people above me” who is holding that job for reasons other than true, relevant merit?

Australia’s myths are beneficial in some ways and destructive in others. For my students, the notion that we help each other, that we question but still try to get things done, is a positive interpretation of the mildly anti-authoritarian mateship focus. The downside is drinking buddies going on a rampage and covering up for each other, fighting the police when the police are actually acting reasonably, and public vandalism out of a desire to act up. The mateship myth hides a lot of racism, especially towards our Indigenous community, but we can probably salvage a notion of community and collaboration from mateship, while losing some of the ugly and dumb things.

The tunnel went through.

Horatio Alger myths would give hope, except for the bleak reality that many people face: that the Dream is three giant pieces of baloney that people get hit about the head with. If you’re not succeeding, then Horatio Alger reasoning lets us call you lazy, or stupid, or someone who just isn’t taking their opportunities. You’re not trying to pull yourself up by your bootstraps hard enough. Worse still, trying to live up to this sometimes impossible standard leads us into John Henryism. John Henry was a steel driver, who hammered and chiselled the rock through the mountains to build tunnels for the railroad. One day the boss brought in a steam-driven hammer, and John Henry bet that he could beat it, to show that he and his crew should not be replaced. After a mammoth battle between man and machine, John Henry won, only to die with the hammer in his hand.

Let me recap: John Henry died – and the boss still got a full day’s work that was equal to two steam-hammers. (One of my objections to “It’s a Wonderful Life” is that the rich man gets away with stealing the money – that’s not a fairy tale, it’s a nightmare!) John Henryism occurs when people work so hard to lift themselves up by their bootstraps that they nearly (or actually do) kill themselves. Men in their 50s with incredibly high blood pressure, ulcers and arthritis know what I’m talking about here. The mantra of the John Henryist is:

“When things don’t go the way that I want them to, that just makes me work even harder.”

There’s nothing intrinsically wrong with this when your goal is actually achievable and you apply the maxim in moderation. At its extreme, though, and for those people who have someone standing on their boot caps, it is a recipe for achieving a great deal for whoever is benefiting from your labour.

And then dying.

As John Henry observes in the ballad (in the Springsteen version), “I’ll hammer my fool self to death”. The ballad of John Henry is actually a cautionary tale about setting your pace carefully: if you’re going to swing a hammer all day, every day, then you have to do it at a pace that won’t kill you. This is the natural constraint on Horatio Alger, and it balances all of the issues with merit and access to opportunity: don’t kill your “fool self” striving for something that you can’t achieve. It’s a shame, however, that the stories line up like this, because there’s a lot of hopelessness sitting at that junction.

Dealing with students always makes me think very carefully about the stories I tell and the stories I live. Over the next few days, I hope to put together some thoughts on a 21st-century myth form that inspires without demanding this level of sacrifice, and that encourages without forcing people into despair when existing obstacles block them and it’s beyond their current control to shift those obstacles. However, on that last point, what I’d really like to come up with is a story that encourages people to talk about obstacles and then work together to lift them out of the way. I do like a challenge, after all. 🙂