SIGCSE 2013: Special Session on Designing and Supporting Collaborative Learning Activities

Katrina and I delivered a special session on collaborative learning activities, focused on undergraduates because that’s our area of expertise. You can read the outline document here. We worked together on the underlying classroom activities and have both implemented these techniques but, in this session, Katrina did most of the presenting and I presented the collaborative assessment task examples, with some facilitation.

The trick here is, of course, to find examples that are both effective as teaching tools and are effective as examples. The approach I chose to take was to remind everyone in the room of what the most important aspects were to making this work with students and I did this by deliberately starting with a bad example. This can be a difficult road to walk because, when presenting a bad example, you need to convince everyone that your choice was deliberate and that you actually didn’t just stuff things up.

My approach was fairly simple: break people into groups, based on where they were currently sitting, and then go immediately into the question, which had been tailored for the crowd and for my purposes:

“I want you to talk about the 10 things that you’re going to do in the next 5 years to make progress in your career and improve your job performance.”

And why not? Everyone in the room was interested in education and, most likely, had a job at a time when it’s highly competitive and hard to find or retain work – so everyone has probably thought about this. It’s a fair question for this crowd.

Well, it would be, if it wasn’t so anxiety inducing. Katrina and I both observed a sea of frozen faces as we asked a question that put a large number of participants on the spot. And the reason I did this was to remind everyone that anxiety impairs genuine participation and willingness to engage. There were a large number of frozen grins with darting eyes, some nervous mumbles and a whole lot of purposeless noise, with the few people who were actually primed to answer that question starting to lead off.

I then stopped the discussion immediately. “What was wrong with that?” I asked the group.

Well, where do we start? Firstly, it’s an individual activity, not a collaborative activity – there’s no incentive or requirement for discussion, groupwork or anything like that. Secondly, while we might expect people to be able to answer this, it is a highly charged and personal area, and you may not feel comfortable discussing your five year plan with people that you don’t know. Thirdly, some people know that they should be able to answer this (or at least some supervisors will expect that they can) but they have no real answer and their anxiety will not only limit their participation but it will probably stop them from listening at all while they sweat their turn. Finally, there is no point to this activity – why are we doing this? What are we producing? What is the end point?

My approach to collaborative activity is pretty simple and you can read any amount of Perry, Dickinson, Hamer et al (and now us as well) to look at relevant areas such as Contributing Student Pedagogy, where students have a reason to collaborate and we manage their developmental maturity and their roles in the activity to get them really engaged. Everyone can have difficulties with authority and with recognising whether someone is making enough of a contribution to a discussion to be worth their time – this is not limited to students. People, therefore, have to believe that the group they are in is of some benefit to them.

So we stepped back. I asked everyone to introduce themselves, where they came from and give a fact about their current home that people might not know. Simple task, everyone can do it and the purpose was to tell your group something interesting about your home – clear purpose, as well. This activity launched immediately and was going so well that, when I tried to move it on because the sound levels were dropping (generally a good sign that we’re reaching a transition), some groups asked if they could keep going as they weren’t quite finished. (Monitoring groups spread over a large space can be tricky but, where the activity is working, people will happily let you know when they need more time.) I was able to completely stop the first activity and nobody wanted me to continue. The second one, where people felt that they could participate and wanted to say something, needed to keep going.

Having now put some faces to names, we then moved to a simple exercise of sharing an interesting teaching approach that you’d tried recently or seen at the conference and it’s important to note the different comfort levels we can accommodate with this – we are sharing knowledge but we give participants the opportunity to share something of themselves or something that interests them, without the burden of ownership. Everyone had already discovered that everyone in the group had some areas of knowledge, albeit small, that taught them something new. We had started to build a group where participants valued each other’s contribution.

I carried out some roaming facilitation where I said very little, unless it was needed. I sat down with some groups, said ‘hi’ and then just sat back while they talked. I occasionally gave a nod or some attentive feedback to people who looked like they wanted to speak and this often cued them into the discussion. Facilitation doesn’t have to be intrusive and I’m a much bigger fan of inclusiveness, where everyone gets a turn but we do it through non-verbal encouragement (where that’s possible; different techniques are required in a mixed-ability group) to stay out of the main corridor of communication and reduce confrontation. However, by setting up the requirement that everyone share and by providing a task that everyone could participate in, my need to prod was greatly reduced and the groups mostly ran themselves, with the roles shifting around as different people made different points.

We covered a lot of the underlying theory in the talk itself, discussing why people have difficulty accepting other views and clarifying why role management is a critical part of giving people a reason to get involved and something to do in the conversation. The notion that a valid discursive role is that of the supporter, who reinforces ideas from the proposer, allows someone to develop their confidence and critically assess the idea, without the burden of having to provide a complex criticism straight away.

At the end, I asked for a show of hands. Who had met someone new? Everyone. Who had found out something they didn’t know about other places? Everyone. Who had learned about a new teaching technique that they hadn’t known before? Everyone.

My one regret is that we didn’t do this sooner because the conversation was obviously continuing for some groups and our session was, sadly, on the last day. I don’t pretend to be the best at this but I can assure you that any capability I have in this kind of activity comes from understanding the theory, putting it into practice, trying it, trying it again, and reflecting on what did and didn’t work.

I sometimes come out of a lecture or a collaborative activity and I’m really not happy. It didn’t gel or I didn’t quite get the group going as I wanted it to – but this is where you have to be gentle on yourself because, if you’re planning to succeed and reflecting on the problems, then steady improvement is completely possible and you can get more comfortable with passing your room control over to the groups, while you move to the facilitation role. The more you do it, the more you realise that training your students in role fluidity also assists them in understanding when you have to be in control of the room. I regularly pass control back and forward and it took me a long time to really feel that I wasn’t losing my grip. It’s a practice thing.

It was a lot of fun to give the session and we spent some time crafting the ‘bad example’, but let me summarise what the good activities should really look like. They must be collaborative, inclusive, achievable and obviously beneficial. Like all good guidelines there are times and places where you would change this set of characteristics, but you have to know your group well to know what challenges they can tolerate. If your students are more mature, then you push out into open-ended tasks which are far harder to make progress in – but this would be completely inappropriate for first years. Even in later years, being able to make some progress is more likely to keep the group going than a brick wall that stops you at step 1. But, let’s face it, your students need to know that working in that group is not only not to their detriment, but it’s beneficial. And the more you do this, the better their groupwork and collaboration will get – and that’s a big overall positive for the graduates of the future.

To everyone who attended the session, thank you for the generosity and enthusiasm of your participation and I’m catching up on my business cards over the next few weeks. If I promised you an e-mail, it will be coming shortly.


Humans: We Appear To Be Stuck With Them

I’ve just presented a paper with the ‘lofty’ title of “Computer Science Education: The First Threshold Concept” and the fundamental question I ask is “Why are certain ideas in learning and teaching in Computer Science just not getting any traction?” I frame this in the language of Threshold Concepts, which allows us to talk about certain concepts as being far more threatening than others but far more useful when we accept them. It doesn’t really matter why we say that people aren’t accepting these things, the fact is that they aren’t. Is it because of authority issues, from Perry’s work, where people aren’t ready to accept more than one source of truth? Is it because of poor role management, which leads us to the work of Dickinson? Is it because many people struggle in the pre-operational stages of Neo-Piagetian theory and, even if they can realise some concrete goals, they can’t apply things to the abstract?

It doesn’t matter, really, because we all have colleagues who, on reading the above, would roll their eyes and reject the notion that this is even a valid language of discourse. Why, some will wonder, are we making it so hard when we talk about teaching – “I know how to teach, it’s just sometimes that the students aren’t working hard enough or smart enough”.  When I mentioned to a colleague that I was giving this paper, he said “Feeling sensitive, are you?” and what he meant was, possibly with a slightly malign edge, that I was taking all of this criticism personally.

Yes, well, probably I am, but let’s talk about why. It’s because it’s important that students are taught well. It’s because it’s important that students get the best opportunities. It’s important that my assumptions about the world, my presumptions of my own ability and that of my students, do not have a detrimental effect on the way that I do my job. I’m taking money to be a teacher, a researcher and an academic administrator – I should be providing real value for that money.

But I am not, by any stretch, the best ‘anything’ in the world. I am not the best teacher. I am not the best researcher. I am not the best speaker. If you are looking for an expert in this area, look elsewhere, because I am a tolerable channel for the works of much better scholars. And, yes, I’m sensitive about some of this because, like many people I speak to in this community, I’m getting tired of having good, solid, scientific work rejected because people feel threatened by it or are dismissive of it. I’m sick of rubbish statements like “we can’t tell people how to teach” because, well, yes, actually we can but it requires us to define what teaching quality is and what our learning environments should look like – what we are trying to do, what we actually do and what we should be doing. Lots of work has been done here, lots of work is yet to occur, and, let me be clear, I am not now, or ever, saying that the “Nick way” is the only way  or the desired way – I’m saying that the discussion is important and that we should be able to say what good teaching is and then we must require this.

In my talk, I mentioned the use of social capital – the investment into our social networks that leads to real and future benefits – and how we spend a lot of time on bonding but too little time on bridging. In other words, we don’t have great ways to reach out and we miss opportunities but, a lot of the time, once we bring someone into the educational community, we can build those relationships. Unfortunately, this is not always true and politics, the curse of academia, too often raises its ugly head and provides too many possible venues, or excludes people, or drives wedges into the community when we should be bonding. I was saddened to discover that politics was traipsing around my current activity, as I was hoping that this would be a launchpad for more and more collaborative work – now we are in the middle of a field of politics.

*sigh*

So much energy – so much lost opportunity unless we use that energy to connect, build and work together. It’s not as if we don’t have enough people saying “Why are you bothering with that? I don’t see the need therefore it’s not important.” But this is humans, after all. My paper opened with a quote from Terence in 163 BC:

“Homo sum, humani nihil a me alienum puto” (I am a [human], nothing human is foreign to me)

and I then proceeded to shoot this down because threshold concept theory says that one of our key problems is that so much is foreign to us that, unless we recognise this, we are in trouble. However, some things are horribly familiar to us and the unpleasantries of academic politics are not foreign to anyone who has spent more than a couple of years post-PhD.

When I looked at the recent ACM/IEEE Curriculum, the obvious omission was any real attempt to provide a grounding for pedagogy in the document. Hundreds, if not thousands, of concepts were presented with hours attached to them as if this were a formal scientific statement of the actual time required to achieve the task. I see this as a wasted bridging opportunity to share, with everyone who reads that document, the notion that certain ideas are trickier than others, however we frame that statement. If we say “You might have some trouble with this”, we give agency to teachers to think about how they prepare and we also give them a licence to struggle with it, without being worried that they are fundamentally flawed as teachers. If we say “Students may find this challenging”, then the teachers can understand that they do not have a class of bad or lazy students, they have a class of humans, because some things are harder to learn than others.

My point from the talk was that, however we slice it, we are fighting an uphill battle and need to focus on bringing in more and more people, which means focusing on bridging rather than division and, where possible, bridging with the same vigour as we bond with our current friends and colleagues. As for politics, it will always be with us, so I suppose the question now is how much energy we give to that, when we could be giving it to bridging in new people and consolidating our bridges with other people? Bridges are fundamentally hard to build, because it’s so easy for them to fall down, and that’s why the maintenance, the bonding energy, is so important.

I don’t have a solid answer to this but I hope that someone else has some good ideas and feels like sharing them.


SIGCSE 2013: The Revolution Will Be Televised, Perspectives on MOOC Education

Long time between posts, I realise, but I got really, really unwell in Colorado and am still recovering from it. I attended a lot of interesting sessions at SIGCSE 2013, and hopefully gave at least one of them, but the first I wanted to comment on was a panel with Mehran Sahami, Nick Parlante, Fred Martin and Mark Guzdial, entitled “The Revolution Will Be Televised, Perspectives on MOOC Education”. This is, obviously, a very open area for debate and the panelists provided a range of views and a lot of information.

Mehran started by reminding the audience that we’ve had on-line and correspondence courses for some time, with MIT’s OpenCourseWare (OCW) streaming video from the 1990s and Stanford Engineering Everywhere (SEE) starting in 2008. The SEE lectures were interesting because viewership follows a power law relationship: the final lecture has only 5-10% of the views of the first lecture. These video lectures were being used well beyond Stanford, augmenting AP courses in the US and providing entire lecture series in other countries. The videos also increased engagement and the requests that came in weren’t just about the course but were more general – having a face and a name on the screen gave people someone to interact with. From Mehran’s perspective, the challenges were: certification and credit, increasing the richness of automated evaluation, validated peer evaluation, and personalisation (or, as he put it, in reality mass customisation).

Nick Parlante spoke next, as an unashamed optimist for MOOCs, who has the opinion that all the best world-changing inventions are cheap, like the printing press, Arabic numerals and high quality digital music. These great ideas spread and change the world. However, he did state that he considered artisanal and MOOC education to be very different: artisanal education is bespoke, high quality and high cost, where MOOCs are interesting for their massive scale and, while they could never replace artisanal education, they could provide education to those who could not get access to it.

It was at this point that I started to twitch, because I have heard and seen this argument before – the notion that a MOOC is better than nothing, if you can’t get artisanal education. The subtext that I, fairly or not, hear at this point is the implicit statement that we will never be able to give high quality education to everybody. By having a MOOC, we no longer have to say “you will not be educated”, we can say “you will receive some form of education”. What I rarely hear at this point is a well-structured and quantified argument on exactly how much quality slippage we’re tolerating here – how educational is the alternative education?

Nick also raised the well-known problems of cheating (which is rampant in MOOCs already, before large-scale fee paying has been introduced) and credentialling. His section of the talk was long on optimism and positivity but rather light on statistics, completion rates, and the kind of evidence that we’re all waiting to see. Nick was quite optimistic about our future employment prospects but I suspect he was speaking on behalf of those of us in “high-end” old-school schools.

I had a lot of issues with what Nick said but a fair bit of it stemmed from his examples: the printing press and digital music. The printing press is an amazing piece of technology for replicating a written text and, as replication and distribution goes, there’s no doubt that it changed the world – but does it guarantee quality? No. The top 10 books sold in 2012 were either Twilight-derived sadomasochism (Fifty Shades of Unnecessary) or related to The Hunger Games. Most of the work the printing presses were doing in 2012 was not for Thoreau, Atwood, Byatt, Dickens, Borges or even Cormac McCarthy. No, the amazing distribution mechanism was turning out copy after copy of what could be, generously, called popular fiction. But even that’s not my point. Even if the printing presses turned out only “the great writers”, it would be no guarantee of an increase in the ability to write quality works in the reading populace, because reading and writing are different things. You don’t have to read much into constructivism to realise how much difference it makes when someone puts things together for themselves, actively, rather than passively sitting through a non-interactive presentation. Some of us can learn purely from books but, obviously, not all of us and, more importantly, most of us don’t find it trivial. So, not only does the printing press not guarantee that everything that gets printed is good, even where something good does get printed, it does not intrinsically demonstrate how you can take the goodness and then apply it to your own works. (Why else would there be books on how to write?) If we could do that, reliably and spontaneously, then a library of great writers would be all you needed to replace every English writing course and editor in the world. A similar argument exists for the digital reproduction of music. Yes, it’s cheap and, yes, it’s easy. However, listening to music does not teach you how to write music or perform on a given instrument, unless you happen to be one of the few people who can pick up music and instrumentation with little guidance. There are so few of the latter that we call them prodigies – it’s not a stable model for even the majority of our gifted students, let alone the main body.

Fred Martin spoke next and reminded us all that weaker learners just don’t do well in the less-scaffolded MOOC environment. He had used MOOCs in a flipped classroom, with small class sizes, supervision and lots of individual discussion. As part of this blended experience, it worked. Fred really wanted some honest figures on who was starting and completing MOOCs and was really keen that, if we were to do this, we strive for the same quality, rather than accepting that MOOCs weren’t as good and it was ok to offer this second-tier solution to certain groups.

Mark Guzdial then rounded out the panel and stressed the role of MOOCs as part of a diverse set of resources, but if we were going to do that then we had to measure and report on how things had gone. MOOC results, right now, are interesting but fundamentally anecdotal and unverified. Therefore, it is too soon to jump into MOOCs because we don’t yet know if they will work. Mark also noted that MOOCs are not supporting diversity yet and, from any number of sources, we know that many-to-one (the MOOC model) is just not as good as one-to-one. We’re really not clear if and how MOOCs are working, given how many of the people who do complete them are already degree holders and, even then, actual participation in on-line discussion is so low that these experienced learners aren’t even talking to each other very much.

It was an interesting discussion and conducted with a great deal of mutual respect and humour, but I couldn’t agree more with Fred and Mark – we haven’t measured things enough and, despite Nick’s optimism, there are too many unanswered questions to leap in, especially if we’re going to make hard-to-reverse changes to staffing and infrastructure. It takes 20 years to train a Professor and, if you have one that can teach, they can be expensive and hard to maintain (with tongue firmly lodged in cheek, here). Getting rid of one because we have a promising new technology that is untested may save us money in the short term but, if we haven’t validated the educational value or confirmed that we have set up the right level of quality, a few years from now we might discover that we got rid of the wrong people at the wrong time. What happens then? I can turn off a MOOC with a few keystrokes but I can’t bring back all of my seasoned teachers in a timeframe less than years, if not decades.

I’m with Mark – the resource promise of MOOCs is enormous and they are part of our future. Are they actually full educational resources or courses yet? Will they be able to bring education to people that is a first-tier, high quality experience or are we trapped in the same old educational class divisions with a new name for an old separation? I think it’s too soon to tell but I’m watching all of the new studies with a great deal of interest. I, too, am an optimist but let’s call me a cautious one!


Grace.

A friend sent me a link to this excellent piece on the importance of grace, in terms of your own appreciation of yourself and in your role as a teacher. Thank you, A! Here is the link:

The Lesson of Grace in Teaching

“…to hear from my own professor, whom I really love and admire, at a time when I felt ashamed of my intelligence and thus unworthy of his friendship, that I wasn’t just a student in a seat, not just a letter grade or a number on my transcript, but a valuable person who he wants to know on a personal level, was perhaps the most incredible moment of my college career.”



Expressiveness and Ambiguity: Learning to Program Can Be Unnecessarily Hard

One of the most important things to be able to do in any profession is to think as a professional. This is certainly true of Computer Science, because we have to spend so much time thinking, as a Computer Scientist does, about how the machine will interpret our instructions. For those who don’t program, a brief quiz. What is the value of the next statement?

What is 3/4?

No doubt, you answered something like 0.75 or maybe 75% or possibly even “three quarters”? (And some of you would have said “but this statement has no intrinsic value” and my heartiest congratulations to you. Now go off and contemplate the Universe while the rest of us toil along on the material plane.) And, not being programmers, you would give me the same answer if I wrote:

What is 3.0/4.0?

Depending on the programming language we use, you can actually get two completely different answers to this apparently simple question. 3/4 is often interpreted by the computer to mean “What is the result if I carry out integer division, where I will only tell you how many times the denominator will go into the numerator as a whole number, for 3 and 4?” The answer will not be the expected 0.75, it will be 0, because 4 does not go into 3 – it’s too big. So, again depending on programming language, it is completely possible to ask the computer “is 3/4 equivalent to 3.0/4.0?” and get the answer ‘No’.
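This behaviour can be seen in any C-family language; here is a minimal sketch in Java, where the class and variable names are mine rather than from any particular course:

```java
public class DivisionDemo {
    public static void main(String[] args) {
        int intResult = 3 / 4;         // integer division: how many whole times does 4 go into 3?
        double realResult = 3.0 / 4.0; // floating-point division

        System.out.println(intResult);               // prints 0
        System.out.println(realResult);              // prints 0.75
        System.out.println(intResult == realResult); // prints false: 0 is not 0.75
    }
}
```

The same character, ‘/’, performs two different operations depending entirely on the types of its operands – exactly the invisible change in meaning that trips up the novice.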

This is something that we have to highlight to students when we are teaching programming, because very few people use integer division when they divide one thing by another – they automatically start using decimal points. Now, in this case, the different behaviour of the ‘/’ is actually exceedingly well-defined and is not at all ambiguous to the computer or to the seasoned programmer. It is, however, nowhere near as clear to the novice or casual observer.

I am currently reading Stephen Ramsay’s excellent “Reading Machines: Toward an Algorithmic Criticism” and it is taking me a very long time to read an 80 page book. Why? Because, to avoid ambiguity and to be as expressive and precise as possible, he has used a number of words and concepts with which I am unfamiliar or that I have not seen before. I am currently reading his book with a web browser and a dictionary because I do not have a background in literary criticism but, once I have the building blocks, I can understand his argument. In other words, I am having to learn a new language in order to read a book written for that new language community. However, rather than being irked that “/” changes meaning depending on the company it keeps, I am happy to learn the new terms and concepts in the space that Ramsay describes, because it is adding to my ability to express key concepts, without introducing ambiguous shadings of language over things that I already know. Ramsay is not, for example, telling me that “book” no longer means “book” when you place it inside parentheses. (It is worth noting that Ramsay discusses the use of constraint as a creative enhancer, a la Oulipo, early on in the book and this is a theme for another post.)

The usual insult at this point is to trot out the accusation of jargon, which is as often a statement that “I can’t be bothered learning this” as it is a genuine complaint about impenetrable prose. In this case, the offender in my opinion is the person who decided to provide an invisible overloading of the “/” operator to mean both “division” and “integer division”, as they have required us to be aware of a change in meaning that is not accompanied by a change in syntax. While this isn’t usually a problem (spoken and written languages are full of these things, after all), in the computing world it forces the programmer to remember that “/” doesn’t always mean “/” and then to get it the right way around. (A number of languages solve this problem by providing a distinct operator – this, however, then adds to linguistic complexity and rather than learning two meanings, you have to learn two ‘words’. Ah, no free lunch.) We have no tone or colour in mainstream programming languages, for a whole range of good computer grammar reasons, but the absence of the rising tone or raised eyebrow is sorely felt when we encounter something that means two different things. The net result is that we tend to use the same constructs to do the same thing because we have severe limitations upon our expressivity. That’s why there are boilerplate programmers, who can stitch together a solution from things they have already seen, and people who have learned how to be as expressive as possible, despite most of these restrictions. Regrettably, expressive and innovative code can often be unreadable by other people because of the gymnastics required to reach these heights of expressiveness, which is often at odds with what the language designers assumed someone might do.

We have spent a great deal of effort making computers better at handling abstract representations, things that stand in for other (real) things. I can use a name instead of a number and the computer will keep track of it for me. It’s important to note that writing int i=0; is infinitely preferable to typing “0000000000000000000000000000000000000000000000000000000000000000” into the correct memory location and then keeping that address (itself a rather large number) written on a scrap of paper. Abstraction is one of the fundamental tools of modern programming, yet we greatly limit expressiveness in sometimes artificial ways to reduce ambiguity when, really, the ambiguity does seem a little artificial.

One of the nastiest potential ambiguities that shows up a lot is “what do we mean by ‘equals’?”. As above, we already know that many languages would not tell you that “3/4 equals 3.0/4.0” because both mathematical operations would be executed and 0 is not the same as 0.75. However, the equivalence operator is often used to ask so many different questions: “Do these two things contain the same thing?”, “Are these two things considered to be the same according to the programmer?” and “Are these two things actually the same thing and stored in the same place in memory?”

Generally, however, to all of these questions, we return a simple “True” or “False”, which in reality reflects neither the truth nor the falsity of the situation. What we are asking, respectively, is “Are the contents of these the same?” to which the answer is “Same” or “Different”. To the second, we are asking if the programmer considers them to be the same, in which case the answer is really “Yes” or “No” because they could actually be different, yet not so different that the programmer needs to make a big deal about it. Finally, when we are asking if two references to an object actually point to the same thing, we are asking if they are in the same location or not.
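Java, for one, gives these questions distinct spellings: .equals() asks about contents (or whatever sameness the programmer has chosen to define), while == on two object references asks whether they point at the same thing in memory. A small sketch, with values I have invented for illustration:

```java
public class EqualityDemo {
    public static void main(String[] args) {
        // Two distinct String objects that happen to hold the same characters.
        String a = new String("3/4");
        String b = new String("3/4");
        String c = a; // another name for the very same object

        System.out.println(a.equals(b)); // true: same contents
        System.out.println(a == b);      // false: different objects in memory
        System.out.println(a == c);      // true: the same object, in the same place
    }
}
```

All three answers still come back as a bare true or false, so the reader must already know which of the three questions was being asked.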

There are many languages that use truth values, some of them do it far better than others, but unless we are speaking and writing in logical terms, the apparent precision of the True/False dichotomy is inherently deceptive and, once again, it is only as precise as it has been programmed to be and then interpreted, based on the knowledge of programmer and reader. (The programming language Haskell allows a value to be left undefined and, thanks to lazy evaluation, will continue working on the problem as long as that value is never actually needed, which is an obvious, and welcome, exception here, yet this is not a widespread approach.) It is an inherent limitation on our ability to express what is really happening in the system when we artificially constrain ourselves in order to (apparently) reduce ambiguity. It seems to me that we have reduced programmatic ambiguity, but we have not necessarily actually addressed the real or philosophical ambiguity inherent in many of these programs.

More holiday musings on the “Python way” and why this is actually an unreasonable demand, rather than a positive feature, shortly.


The Limits of Expressiveness: If Compilers Are Smart, Why Are We Doing the Work?

I am currently on holiday, which is “Nick shorthand” for catching up on my reading, painting and cat time. Recently, my interests in my own discipline have widened and I am precariously close to that terrible state that academics sometimes reach when they suddenly start uttering words like “interdisciplinary” or “big tent approach”. Quite often, around this time, the professoriate will look at each other, nod, and send for the nice people with the butterfly nets. Before they arrive and cart me away, I thought I’d share some of the reading and thinking I’ve been doing lately.

My reading is a little eclectic, right now. Next to Hooky’s account of the band “Joy Division” sits Dennis Wheatley’s “They Used Dark Forces” and next to that are four other books, which are a little more academic: “Reading Machines: Toward an Algorithmic Criticism” by Stephen Ramsay; “Debates in the Digital Humanities” edited by Matthew Gold; “10 PRINT CHR$(205.5+RND(1)); : GOTO 10” by Montfort et al.; and “‘Pataphysics: A Useless Guide” by Andrew Hugill. All of these are fascinating books and, right now, I am thinking through them in order to place a new glass over some of my assumptions from within my own discipline.

“10 PRINT CHR$…” is an account of a simple line of code from the Commodore 64 BASIC language, which draws diagonal mazes on the screen. In exploring this, the authors explore fundamental aspects of computing and, in particular, creative computing and how programs exist in culture. Everything in the line says something about programming back when the C-64 was popular, from the use of line numbers (required because they established the order of execution in a program entered line by line, rather than by arranging elements freely in a single document) to the use of the $ after CHR, which tells both the programmer and the machine that what results from this operation is a string, rather than a number. In many ways, this is a book about my own journey through Computer Science, growing up with BASIC programming and accepting its conventions as the norm, only to have new and strange conventions pop out at me once I started using other programming languages.

Rather than discuss the other books in detail, although I recommend all of them, I wanted to talk about specific aspects of expressiveness and comprehension because, if there is one thing I am thinking after all of this reading, it is “why aren’t we doing this better?” The line “10 PRINT CHR$…” is effectively incomprehensible to the casual reader, yet if I wrote something like this:

do this forever
    pick one of “/” or “\” and display it on the screen

then anyone who spoke English (which used to be a larger number than those who could read programming languages but, honestly, today I’m not sure about that) could understand what was going to happen. Not only could they understand it, they could create something like it themselves without having to work out how to make it happen. You can see language like this in Scratch, which is intended to teach programming by providing an easier bridge between everyday language and programming, using pre-constructed blocks and far more approachable terms. Why is it so important to create? One of the debates raging in Digital Humanities at the moment, at least according to my reading, is “who is in” and “who is out” – what does it take to make one a digital humanist? While this used to involve “being a programmer”, it is now considered reasonable to “create something”. For anyone who is notionally a programmer, the two are indivisible. Programs are how we create things, and programming languages are the form that we use to communicate with the machines, to solve the problems that we need solved.
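For the curious, the same idea fits into a few lines of Python – a sketch only, and where the original loops forever, I have bounded it so that it terminates:

```python
import random

def ten_print(cells=400):
    """A bounded re-creation of 10 PRINT CHR$(205.5+RND(1)); : GOTO 10.

    Each cell is a randomly chosen diagonal; printed as a block of text,
    the diagonals join up into a maze, just as they did on the C-64.
    """
    return "".join(random.choice("/\\") for _ in range(cells))

maze = ten_print()
# Print it in rows of 40 characters, roughly the C-64 screen width.
for row in range(0, len(maze), 40):
    print(maze[row:row + 40])
```

Run it a few times and you get a different maze each time, which is much of the charm of the original one-liner.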

When we first started writing programs, we instructed the machines in simple arithmetic sequences that matched the bit patterns required to ensure that certain memory locations were processed in a certain way. We then provided a human-readable shorthand, assembly language, where mnemonics replaced numbers, to make it easier for humans to write code without error. “20” (the opcode, in hexadecimal) became “JSR” in 6502 assembly code, for example, yet “JSR” is as impenetrably occulted as “20” unless you learn a language that is not actually a language but a compressed set of acronyms. Roll on some more years and we have added pseudo-English over the top: GOSUB in Basic and the use of parentheses to indicate function calls in other languages.

However, all I actually wanted to do was to make the same thing happen again, maybe with some minor changes to what it was working on. Think of a sub-routine (method, procedure or function, if we’re being relaxed in our terminology) and you may as well think of a washing machine. It takes in something and combines it with a determined process, a machine setting, powders and liquids to give you the result you wanted, in this case taking in dirty clothes and giving back clean ones. The execution of a sub-routine is identical to this but can you see the predictable familiarity of the washing machine in JSR FE FF?
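In a modern high-level language, at least, the washing machine is visible in the code. A hypothetical Python sketch (the names are my own invention, chosen to match the analogy, not anyone’s real API):

```python
def wash(load, cycle="cotton", detergent="standard"):
    """A sub-routine as washing machine: something goes in, a fixed
    process with some settings is applied, and the transformed thing
    comes back out."""
    return f"clean {load} ({cycle} cycle, {detergent} detergent)"

# The caller neither knows nor cares how the drum spins internally.
result = wash("shirts")
```

Something in, a determined process, something out – exactly the structure that JSR FE FF performs while telling you none of it.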

If you are familiar with ‘Pataphysics, or even “Ubu Roi”, the most well-known of Jarry’s works, you may be aware of the pataphysician’s fascination with the spiral – le Grand Gidouille. The spiral, once drawn, defines not only itself but another spiral in the negative space that it contains. The spiral is also a natural way to think about programming because a very well-used programming language construct, the for loop, often either counts up to a value or counts down. It is not uncommon for this kind of counting loop to allow us to advance from one character to the next in a text of some sort. When we define a loop as a spiral, we clearly state what it is and what it is not – it is not retreading old ground, although it may always spiral out towards infinity.

However, for maximum confusion, the for loop may iterate a fixed number of times but never use the changing value that is driving it – it is no longer a spiral in terms of its effect on its contents. We can even write a for loop that goes around in a circle indefinitely, executing the code within it until it is interrupted. Yet, we use the same keyword for all of these.

In English, the word “get” is incredibly overused. There are very few situations in which another verb couldn’t add more meaning, or at least more shade. Using “get” forces us, quite frequently, to do more hard work to achieve comprehension. Using the same keyword for many different types of loop pushes the same load back on to us.

What happens is that, when we write our loop, we are required to do the thinking as to how we want the loop to work – although Scratch provides a forever, very few other languages provide anything like it. To loop endlessly in C, we would use while (true) or for (;;), but to tell the difference between a loop that is functioning as a spiral and one that is merely counting, we have to read the body of the loop to see what is going on. If you aren’t a programmer, does for (;;) give you any inkling at all as to what is going on? Some might think “Aha, but programming is for programmers” and I would respond with “Aha, yes, but becoming a programmer requires a great deal of learning, so why don’t we make it simpler?” To which the obvious riposte is “But we have special languages which will do all that!” and I then strike back with “Well, if that is such a good feature, why isn’t it in all languages, given how good modern language compilers are?” (A compiler is a program that turns programming languages into something that computers can execute – English-like words to byte patterns, effectively.)
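To see how much intent hides behind one keyword, consider this small Python sketch (Python’s for is an iterator loop rather than C’s counting loop, but the ambiguity is exactly the same):

```python
import itertools

# A true "spiral": the driving value is used by the body.
doubled = [2 * i for i in range(1, 4)]    # [2, 4, 6]

# The same keyword, but the counter is ignored -- really "do this 3 times".
beeps = ["beep" for _ in range(3)]

# A loop that circles indefinitely, until interrupted from within.
total = 0
for n in itertools.count(1):    # yields 1, 2, 3, ... with no upper bound
    total += n
    if total >= 10:
        break                   # total is now 10 (1 + 2 + 3 + 4)
```

Three quite different intentions, and in most languages you only discover which one you are reading by studying the body of the loop.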

In thinking about language origins, and what we are capable of with modern compilers, we have to accept that a lot of the heavy lifting in programming is already being done by modern, optimising, compilers. Years ago, the compiler would just turn your instructions into a form that machines could execute – with no improvement. These days, put something daft in (like a loop that does nothing for a million iterations), and the compiler will quietly edit it out. The compiler will worry about optimising your storage of information and, sometimes, even help you to reduce wasted use of memory (no, Java, I’m most definitely not looking at you.)
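Even CPython’s comparatively modest bytecode compiler does a little of this. A small check you can run yourself – constant folding has been in CPython for many years, although a serious optimising C compiler goes much further and will delete that do-nothing loop entirely:

```python
def seconds_per_day():
    # The compiler folds 60 * 60 * 24 to 86400 before the code ever runs.
    return 60 * 60 * 24

# The arithmetic is gone from the compiled function: only the folded
# result survives in the function's constant table.
folded = 86400 in seconds_per_day.__code__.co_consts
print(folded)
```

The multiplications were done once, at compile time, rather than on every call – a tiny example of the heavy lifting quietly moving from us to the compiler.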

So why is it that C++ doesn’t have a forever, a do 10 times, or a spiral to 10 equivalent in there? The answer is complex but is, most likely, a combination of standards issues (changing a language standard is relatively difficult and requires a lot of effort), the fact that other languages already do things like this, the burden of increasing compiler complexity to handle synonyms like this (although this need not be too arduous) and the fact that I doubt many people would see a need for it.
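To show that the burden really need not be arduous, here is a hypothetical sketch of such synonyms layered onto Python in a handful of lines (forever, do_times and spiral_to are my own invented names, not part of any standard library):

```python
import itertools

def do_times(n):
    """'Do this n times': the counter exists but is deliberately hidden."""
    return (None for _ in range(n))

def spiral_to(n):
    """A counting loop whose driving value is meant to be used: 1, 2, ..., n."""
    return range(1, n + 1)

# An endless source of iterations -- the moral equivalent of C's for (;;).
forever = itertools.count

ticks = sum(1 for _ in do_times(10))     # ticks == 10
squares = [i * i for i in spiral_to(3)]  # [1, 4, 9]
```

No change to the language standard required – which rather suggests the obstacle is convention, not compiler complexity.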

In reading all of these books, and I’ll write more on this shortly, I am becoming increasingly aware that I tolerate a great deal of limitation in my ability to solve problems using programming languages. I put up with having my expressiveness reduced, with taking care of some unnecessary heavy lifting in making things clear to the compiler, and I occasionally even allow the programming language to dictate how I write the words on the page itself – not just syntax and semantics (which are at least understandable, socially and technically) but the use of blank lines, white space and ends of lines.

How are we expected to be truly creative if conformity and constraint are the underpinnings of programming? Tomorrow, I shall write on the use of constraint as a means of encouraging creativity and why I feel that what we see in programming is actually limitation, rather than a useful constraint.


Doo de doo dooooo, doo de doo doo dooooo.


“What did you do in the 80s, Daddy?”
“I don’t want to talk about it.”

Some of you will recognise the title of this post as the opening ‘music’ of the Europe song, “The Final Countdown”. I wasn’t sure what to call this post because it was the final component of a year-long cycle that began with some sketchy diagrams and a sketchier plan and has seen several different types of development over time. It is not, however, the final post on this blog, as I intend to keep blogging but, from this post forwards, I will no longer require myself to provide at least one new post every day.

This is, perhaps, just as well, because I am already looking over 2013 and realising that my ‘free project’ space is now completely occupied until July. Despite my intentions to travel less, I am in the US twice before the middle of March and have several domestic trips planned as well. And this is a reminder of everything that I’ve been trying to come to terms with in writing this blog and talking about my students, myself, and our community: I can talk about things and deal with them rationally in my head, but that doesn’t mean that I always act on them.

In retrospect, it has been a successful year and I have been able to produce more positive change in 2012 than probably in the sum of my working contributions up until that point. However, I am not in as good a shape as I was at the start of the year, for a variety of reasons, so when I say that my ‘free project’ space is full, I mean that I have fewer additional things to do but I am deliberately allocating less of my personal time to do them. In 2013, family and friends come first, then my projects, then my required work. Why? Because I will always find a way to do the work that I’m supposed to do, but if I start with that I can use all of my time to do it, whereas if I invert the order, I have to be more efficient and I’m pretty confident that I can still get everything done. After all, next year I’ll have at least an extra hour or two a day from not blogging.

Let’s not forget that this blogging project has consumed somewhere in the region of 350-400 hours of my time over the year, and that’s probably an underestimate. 400 hours is ten working weeks or just under 17 days of contiguous hours. Was my blog any better for being daily? Probably not. Could I be far more flexible and agile with my time if I removed the daily posting requirement? Of course – and so, away it goes. (So it goes, Mr Vonnegut.) The value to me of this activity has been immense – it has changed the way that I think about things and I have a far greater basis of knowledge from which I can discuss important aspects of learning and teaching. I have also discovered how little I know about some things but at least I know that they exist now! The value to other people is more debatable but given that I know that at least some people have found use in it, then it’s non-zero and I can live with that. Recalling Kurt Vonnegut again, and his book “Timequake”, I always saw this blog as a place where people could think “Oh, me too!” as I stumble my way through complicated ideas and try to comprehend the developed notions of clever people.

“Many people need desperately to receive this message: ‘I feel and think much as you do, care about many of the things you care about, although most people do not care about them. You are not alone.'” (Vonnegut, Timequake, 1997)

I never really thought much about the quality of this blog, but I was always concerned about the qualities of it. I wanted it to be inclusive, reliable, honest, humble, knowledgeable, useful and welcoming. Looking back, I achieved some of that some of the time and, at other times, well, I’m a human. Some days I was angrier than others but I like to think it was about important things. Sexism makes me angry. Racism makes me angry. The corruption of science for political ends makes me angry. Deliberate ignorance makes me angry. Inequity and elitism make me angry. I hope, however, the anger was a fuel for something better, burning to lift something up that carried a message that wasn’t just pure anger. If, at any stage, all I did was combine oxygen and kerosene on the launch pad and burn the rocket, then I apologise, because I always wanted to be more useful than that.

This is not the end of the blog, but it’s the end of one cycle. It’s like a long day at the beach. You leap out of bed as the sun is coming up, grab some fruit and run down to the water, still warm from the late summer currents and the hot wind that blows across it, diving in to swim out and look back at the sand as it lights up. Maybe you grab your fishing rod and spend an hour or two watching the float bob along the surface, more concerned with talking to your friend or drinking a beer than actually catching a fish, because it’s just such a nice day to be with people. Lunch is sandy sandwiches, eaten between laughs in the gusty breeze that lifts up the beach and tries to jam a big handful of grains into every bite, so you juggle it and the tomato slides out, landing on your lap. That’s ok, because all you have to do is to dive back into the water and you’re clean again. The afternoon is beach cricket, squinting even through sunglasses as some enthusiastic adult hits the ball for a massive 6 that requires everyone to search for it for about 15 minutes, then it’s some cold water and ice creams. Heading back that night, and it’s a long day in an Australian summer, you’re exhausted, you’re spent. You couldn’t swim another stroke, eat another chip or run for another ball if you tried. You’ll eat something for dinner and everyone will mumble about staying up but the day is over and, in an hour or so, everyone will be asleep. You might try and stay up because there’s so much to do but the new day starts tomorrow. Or, worst case, next summer. It’s not the end of the beach. It’s just the end of one day.

Firstly, of course, I want to thank my wife, who has helped me to find the time I needed to actually do this and who has provided a very patient ear when I am moaning about that most first world of problems: what is my blog theme for today. The blog has been a part of our lives every day for 1-2 hours for an entire year and that requires everyone in the household to put in the effort – so, my most sincere gratitude to the amazing Dr K. There’s no way I could have done any of this without you.

For everyone who is not my wife, thank you for reading and being part of what has been a fascinating journey. Thank you for all of your comments, your patience, your kindness and your willingness to listen. I hope that you have a very happy and prosperous New Year. Remember what Vonnegut said; that people need to know, sometimes, that they are not alone.

I’ll see you tomorrow.


And this is the real me! Yes, it was me ALL ALONG!
Happy New Year!


Thanks for the exam – now I can’t help you.

I have just finished marking a pile of examinations from a course that I co-taught recently. I haven’t finalised the marks but, overall, I’m not unhappy with the majority of the results. Interestingly, and not overly surprisingly, one of the best answered sections of the exam was based on a challenging essay question I set as an assignment. The question spans many aspects of the course and requires the student to think about their answer and link the knowledge – which most did very well. As I said, not a surprise but a good reinforcement that you don’t have to drill students in what to say in the exam, but covering the requisite knowledge and practising the right skills is often helpful.

However, I don’t much like marking exams and it doesn’t come down to the time involved, the generally dull nature of the task or the repetitive strain injury from wielding a red pen in anger, it comes down to the fact that, most of the time, I am marking the student’s work at a time when I can no longer help him or her. Like most exams at my Uni, this was the terminal examination for the course, worth a substantial amount of the final marks, and was taken some weeks after teaching finished. So what this means is that any areas I identify for a given student cannot now be corrected, unless the student chooses to read my notes in the exam paper or come to see me. (Given that this campus is international, that’s trickier but not impossible thanks to the Wonders of Skypenology.) It took me a long time to work out exactly why I didn’t like marking, but when I did, the answer was obvious.

I was frustrated that I couldn’t actually do my job at one of the most important points: when lack of comprehension is clearly identified. If I ask someone a question in the classroom, on-line or wherever, and they give me an answer that’s not quite right, or right off base, then we can talk about it and I can correct the misunderstanding. My job, after all, is not actually passing or failing students – it’s about knowledge, the conveyance, construction and quality management thereof. My frustration during exam marking increases with every incomplete or incorrect answer I read, which illustrates that there is a section of the course that someone didn’t get. I get up in the morning with the clear intention of being helpful towards students and, when it really matters, all I can do is mark up bits of paper in red ink.


Quickly, Jones! Construct a valid knowledge framework! You’re in a group environment! Vygotsky, man, Vygotsky!

A student who, despite my sweeping, and seeping, liquid red ink of doom, manages to get a 50 Passing grade will not do the course again – yet this mark pretty clearly indicates that roughly half of the comprehension or participation required was not carried out to the required standard. Miraculously, it doesn’t matter which half of the course the student ‘gets’, they are still deemed to have attained the knowledge. (An interesting point to ponder, especially when you consider that my colleagues in Medicine define a Pass at a much higher level and in far more complicated ways than a numerical 50%, to my eternal peace of mind when I visit a doctor!) Yet their exam will still probably have caused me at least some gnashing of teeth because of points missed, pointless misstatement of the question text, obscure song lyrics, apologies for lack of preparation and the occasional actual fact that has peregrinated from the place where it could have attained marks to a place where it will be left out in the desert to die, bereft of the life-giving context that would save it from such an awful fate.

Should we move the exams earlier and then use the results to guide the focus of later assessment, targeting the areas in most need of improvement and development? Should we abandon exams entirely and move to a continuous-assessment, competency-based system, where there are skills and knowledge that must be demonstrated correctly and are practised until this is achieved? We are suffering, as so many people have observed before, from overloading the requirement to grade and classify our students into neatly discretised performance boxes onto a system that ultimately seeks to identify whether these students have achieved the knowledge levels necessary to be deemed to have met the course objectives. Should we separate competency and performance completely? I have sketchy ideas as to how this might work but none that survive under the blow-torches of GPA requirements and resource constraints.

Obviously, continuous assessment (practicals, reports, quizzes and so on) throughout the semester provides a very valuable way to identify problems, but this requires good, and thorough, course design and an awareness that this is your intent. Are we premature in treating the exam as a closing-off line on the course? Could we work on it the same way that we do any assignment: you get feedback, a mark and then more work to follow up? If we threw resourcing to the wind, could we have a 1-2 week intensive pre-semester program that specifically addressed those issues that students failed to grasp on their first pass? Congratulations, you got 80%, but that means that there’s 20% of the course that we need to clarify. (Those who got 100% I’ll pay to come back and tutor, because I like to keep cohorts together and I doubt I’ll need to do that very often.)

There are no easy answers here and shooting down these situations is very much in the fish/barrel plane, I realise, but it is a very deeply felt form of frustration that I am seeing the most work that any student is likely to put in but I cannot now fix the problems that I see. All I can do is mark it in red ink with an annotation that the vast majority will never see (unless they receive the grade of 44, 49, 64, 74 or 84, which are all threshold-1 markers for us).

Ah well, I hope to have more time in 2013 so maybe I can mull on this some more and come up with something that is better but still workable.


Thinking about teaching spaces: if you’re a lecturer, shouldn’t you be lecturing?

I was reading a comment on a philosophical post the other day and someone wrote this rather snarky line:

He’s a philosopher in the same way that (celebrity historian) is a historian – he’s somehow got the job description and uses it to repeat the prejudices of his paymasters, flattering them into thinking that what they believe isn’t, somehow, ludicrous. (Grangousier, Metafilter article 123174)

Rather harsh words in many respects and it’s my alteration of the (celebrity historian)’s name, not his, as I feel that his comments are mildly unfair. However, the point is interesting as a reflection upon the importance of job title in our society, especially when it comes to the weighted authority of your words. From January the 1st, I will be a senior lecturer at an Australian university and that is perceived differently where I am. If I am in the US, I reinterpret this title into their system, namely as a tenured Associate Professor, because that’s the equivalent of what I am – the term ‘lecturer’ doesn’t clearly translate without causing problems, to say nothing of the fact that most lecturers in Australia have PhDs, whereas many lecturers in the US do not. But this post isn’t about how people necessarily see our job descriptions, it’s very much about how we use them.

In many respects, the title ‘lecturer’ is rather confusing because it appears, like builder, nurse or pilot, to contain the verb of one’s practice. One of the big changes in education has been the steady acceptance of constructivism, where the learners have an active role in the construction of knowledge and we are facilitating learning, in many ways, to a greater extent than we are teaching. This does not mean that teachers shouldn’t teach, because this is far more generic than the binding of lecturers to lecturing, but it does challenge the mental image that pops up when we think about teaching.

If I asked you to visualise a classroom situation, what would you think of? What facilities are there? Where are the students? Where is the teacher? What resources are around the room, on the desks, on the walls? How big is it?

Take a minute to do just this and make some brief notes as to what was in there. Then come back here.

It’s okay, I’ll still be here!



False Dichotomy: If I don’t understand it, then either I am worthless or it is!

I’ve been reading an interesting post on Metafilter about “Minima Moralia: Reflections from Damaged Life”, by Theodor Adorno. While the book itself is very interesting, two of the comments on the article caught my eye. An earlier commenter had mentioned that they neither understood nor appreciated this kind of thing, and made the usual throwaway remark about postmodernism being “a scam to funnel money from the productive classes to the parasitical academy” (dydecker). Further down, another commenter, Frowner, gently took this statement to task, starting by noting that Adorno would have been appalled at being labelled a post-modernist, and then discussing why dydecker might have felt the need to attack things in this way. It’s very much worth reading Frowner’s comments on this post, but I shall distil the first one here:

  1. Just because a text is difficult or obscure does not mean that it is postmodern. Also, “post-modernist” is not actually an insult, and this may be a politically motivated stance used to attack a group of people who are also likely to identify as critical of the status quo or (gasp) Marxist.

  2. Not all texts need to be accessible to all audiences, nor is something worthless, fake or elitist if it requires pre-reading or some effort to get into. Advanced physics texts can be very difficult to comprehend for the layperson. This does not make Quantum Field Theory wrong or a leftist conspiracy.
  3. You don’t need to read books that you don’t want to read.
  4. You don’t need to be angry at difficult books for being difficult. To exactly quote Frowner,

    Difficult books only threaten us if we decide to feel guilty and ashamed for not reading them.

    If you’re actually studying an area, and read the books that the work relies upon, difficult books can become much clearer, illustrating that it was perhaps not the book that was causing the difficulty.

  5. Sometimes you won’t like something and this has nothing to do with its quality or worth – you just don’t like it.
  6. Don’t picture a perfect reader in your head who understands everything and hold yourself to that standard. If you’re reading a hard book then keep plugging away and accept your humanity.

Frowner then goes on to beautifully summarise all of this in a later comment, where he notes that we seem to learn to be angry at, or uncomfortable with, difficult texts, because we are under pressure to be capable of understanding everything of worth. This is an argument of legitimacy: if the work is legitimate and I don’t understand it, then I am stupid; however, if I can argue that the work is illegitimate, then this is a terrible con job, I am not stupid for not understanding it and we should attack this work! Frowner wonders about how we are prepared for the world and believes that we are encouraged to see ourselves as inadequate if we do not understand everything for ourselves, hence the forced separation of work into legitimate and illegitimate, with an immediate, and often vicious, attack on those things we define as illegitimate in order to protect our image of ourselves.

I spend a reasonable amount of time in art galleries and I wish I had a dollar for everyone who stood in front of a piece of modern art (anything from the neo-impressionists on, basically) and felt the need to loudly state that they “didn’t get it” or that they could “have painted it themselves.” (I like Rothko, Mondrian and Klee, among others, so I am often in that part of the gallery.) It is quite strange when you come to think about it – why on earth are people actually vocalising this? Looking more closely, it is (less surprisingly) people in groups of two or more who seem to do this: I don’t understand this so, before you ask me about it, I will declare it to be without worth. I didn’t get it, therefore this art has failed me. We go back to Frowner’s list and look at point 2: not all art (in this case) is for everyone and that’s ok. I can admire Grant Wood’s skill and his painting “American Gothic” but the painting doesn’t appeal as much to me as does the work of Schiele, for example. That’s ok, and that doesn’t make Schiele better than Wood in some Universal Absolute Fantasy League of Painters (although the Schiele/Klimt tag team wrestling duo, with their infamous Golden Coat Move, would be fun to watch) – it’s a matter of preference. I regularly look at things that I don’t quite understand but I don’t regard it as a challenge or an indication that it or I are at fault, although I do see things that I understand completely and can quite happily identify reasons that I don’t like them!


Klee’s “The Goldfish”. Some will see this as art, others will say “my kids could do that”. Unless you are Hans Wilhelm Klee, no, probably not.

I am, however, very lucky, because I have a job and lifestyle where my ability to think about things is a core component: falsely dichotomous thinking is not actually what I’m paid to do. However, I do have influence over students and I need to be very careful in how I present information to them. In my last course, I deliberately referred to Wikipedia among other documents because it is designed to be understood and is usually shaped by many hands until it reaches an acceptable standard of readability. I could have pointed my students at ethics texts but these texts often require more preparation and a different course structure, which may have put students off actually reading and understanding them. If my students go into ethics, or whatever other area they deem interesting, then point 4 becomes valid and their interest, and contextual framing, can turn what would have been a difficult book into a useful book.

I agree with this (effectively) anonymous poster and his or her summary of an ongoing issue: we make it hard for people to admit that they are learning, that they haven’t quite worked something out yet, because we make “not getting something immediately” a sign of slowness (informally) and often attach negative outcomes (in assessment or course and career progression). We do not have to be experts at everything, nor should we pretend to be. We risk not actually learning some important and beautiful things because we feel obliged to reject them before they reject us – and some things of great worth, that will be long appreciated, take longer to ‘get’ than just the minute or two that we feel we can allocate.