Data Visualisation: Strong Messages Educate Better

Know what this is?

Blitz

Rather pretty, isn’t it – but it has a definite direction, like someone has thrown something from the right and it has hit the ground and scattered.

This image is from the Bomb Sight website, and shows all of the bombs that fell on London (and surrounds) from the 7th of October, 1940, to the 6th of June, 1941. The Bomb Sight team have been working from a variety of data sources to put together a reasonably reliable picture of the recorded bombs on London over that 242 day period. If you zoom in (and it starts zoomed in), you start to see how many sites took 2, 3, 4 or more bombs (10, 11, plus) over that time.

If I were to put together a number of bombs and a number of days and say “X bombs fell in London over Y days”, you could divide X by Y and say “Gosh.” Or I can show you a picture like the one above and tell you that each of those dots represents at least one bomb, possibly as many as 10 or so, and watch your jaw drop.

Seen this way, the Blitz becomes closer to those of us who were fortunate enough not to live through that terrible period. We realise any number of things, chief among them that close proximity to a force that wishes you ill is going to result in destruction and devastation of a level that we might not be able to get our heads around, unless we see it.

Seen this way, it’s a very strong message of what actually happened. It has more power. In a world of big numbers and enormous data, it’s important to remember how we can show things so that we tell their stories in the right way. Numbers can be ignored. Pictures tell better stories, as long as we are honest and truthful in the way that we use them.


The Invisible Fragility of our World of Knowledge

If I were to mention that I was currently researching Rongorongo, as background for a story in which the protagonists communicated in a range of reverse boustrophedonic texts, there are three likely outcomes.

  1. You would roll your eyes and close the browser, or,
  2. You would think “Aha, that’s what I was talking about last night at the Friends of Rapanui Quiz Night. How apt!”, or,
  3. You would go and look up Rongorongo and boustrophedon in Wikipedia.

What I am fairly sure that most of you will not do is go and look up the information in a book, go to a library or even ask another human. (Some of you will have used physical means such as books or libraries because you are deliberately physical users. I am after the usage patterns that you adopt unconsciously, or as a matter of actual habit, rather than those that are employed because of a deliberate endeavour to use another source.) There is no doubt that we live in an amazing world of immediately available information, and that it has changed the way that we use, store and retrieve information, but this immediacy has come at a cost: we tend not to use or consult physical media as much. As a result, there is less of the physical to hand, most of the time. I have noticed a major change in the way that I use information and, while I tend to read and annotate material on printed paper (using a fountain pen, no less, so I am not judging anyone for their affectations), I search and edit in the digital form. Why? Each form has its own efficiencies.

A physical artefact that we can no longer read.

The absence of the physical artefact is often not noticeable unless we are cut off from the Internet or from our stored versions of the material. Last week, my laptop decided that it would no longer boot and I realised, with mounting horror, that my only copies of certain works in progress were sitting on this ‘dead’ machine. Why weren’t they backed up? Because I was not connected to the Internet for a few hours and I had left my actual backup device at home, to reduce the risk of losing both laptop and backup in the same localised catastrophe.

The majority of the on-line information repositories are remarkable in their ease of use and sheer utility – as long as you can connect to them. We, however, have an illusion of availability and cohesion that is deceptive, and it is the comfortable analogue of the printed page that lulls us into this. Wikipedia, for example, presents a single page full of text, just like a book does. It is only when you look at the History and the Discussion that it dawns on you that each character on the page could have been contributed by a different source. While the printed page is the final statement of a set of arguments between the authors, the editors and their mutual perceptions of reality, it is static once printed. Wikipedia’s strength and its weakness is that the argument never ends. Anything on a publicly editable page is inherently fragile and ephemeral. What is there today may not be there tomorrow, and there is no guarantee that what appears sound now will be anything other than horrible and deliberately broken in a second.

The fragility doesn’t stop there, however, because we don’t actually have any part of Wikipedia inside our offices, unless you happen to be Jimmy Wales. (Hi!) Wikipedia.org, the domain name of Wikipedia, is registered in California, but the server I was connected to (for the queries I put above) was in Washington State, and there were some 17 active network devices involved in routing traffic from me (in Adelaide) to the server (in Washington) and then getting the information back. This doesn’t count the active electronic devices that I can’t see in this path and, believe me, there will be a lot of them. Now, we build a lot of redundancy into the global network that we call the Internet (the network of networks of networks), but a major catastrophe on the West Coast would quickly force so much traffic onto those backup links that information flow would slow and, for some good technical reasons, then start to fall over.

So the underlying physical pathways that actually shunt the network information from point to point could fall over. At that point, if I had a book on the linguistics of Easter Island, I could read it by torchlight even if I had no local power. A severe power failure here or in enough places along the way, or at Wikipedia’s data centres? Suddenly, my ability to find out anything is blocked.

But let’s look at the information itself. People have been editing the Rongorongo page for over 10 years. The first version (that we can see; Wikipedia can invisibly delete revisions) is recorded for the 25th of November, 2002. Happy double digits, Rongorongo page! Since then there have been roughly 3000 edits. Are all of them the same quality? Hmm. Here are some comments:

14 April 2006, “reinstate link to disambiguate Rongorongo, wife of Turi, NZ”

18 May 2006, “If I want to be blocked, why do I improve these pages? REMEMBER LIUVIGILD! TRY BLOCKING ME!!! BWAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA” (sic)

18 May 2006, “Excuse my insolence. This is not vandalism, as it is all true. Why do you insist on reverting it? Please send a PERSONAL message of explanation. Sincerely, 64.107.172.130”

28 April 2007, “Inhabitants of Easter Island have many names for it.”

7 April 2011, “A picture of a banana leaf is not helpful here. I looked on this banana leaf for scribblings. I know what one looks like, and if someone doesn’t, they can read about it at banana.”

12 November, 2012, “What’s wrong, Kwamikagami? It is what it is, isn’t it? Just a straight up comparison of rongorongo and Indus Valley glyphs, nothing more. I’d love to know which ones are ‘not true’ according to you”

There are periods when this page changes every few minutes and sometimes the content stays the same for days or even months. But most people don’t know this because they never think to look in the history or talk sections. Right now, it appears that someone is disputing the authority of Kwamikagami, a person who has carried out a lot of edits on this page. This is important because, if you say to someone “Hey, look at this page”, then 3000 edits over 10 years works out to an average of roughly 0.8 edits per day. The burstiness of editing would have an impact on this but the general idea is that this simple page on a (possibly) dead text is more likely to change on any given day than not.
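That back-of-envelope figure can be sketched in a few lines, under the deliberately simplistic (and hypothetical) assumption that edits arrive at a constant average rate, i.e. as a Poisson process – the real history is bursty, as noted above:

```python
import math

# Rough numbers from the article's edit history, as quoted above
edits = 3000            # approximate total edits
days = 10 * 365.25      # roughly ten years (Nov 2002 to Nov 2012)

rate = edits / days                 # average edits per day
p_change = 1 - math.exp(-rate)      # P(at least one edit on a given day), Poisson

print(f"{rate:.2f} edits/day on average")
print(f"P(page changes on a given day) ~ {p_change:.0%}")
```

Even with the burstiness stripped out, the probability of at least one edit on a given day comes out comfortably above 50%, which supports the “more likely to change than not” reading.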

Does this make Wikipedia any better or any worse than the printed page? I think it makes it different because we have to treat it as an evolving discussion that we have walked in on, because of its inherent fragility and ephemeral nature.

We live in amazing times, where I can use a small hand-held device to access almost everything that our species has created. And yet, when I go to look at how robust this knowledge source is and how vulnerable we are to losing our connection to that knowledge, I am reminded that we are going to have to work out how to do this properly. If we give up the fixed physical forms (books, CDs, DVDs), then so be it, but we must make sure that we deal with this fragility before we become too seduced by the immediacy. We have to think about this for our students too. How do we provide them with artefacts that they can consult down the line, when they need to look something up? Books have no licensing agreements, never expire and do not have to be abandoned when a digital format changes. Yet they have none of the advantages of the digital form.

I mention this because I am really looking forward to seeing how people address and solve this challenge – how can we have the best of the immediate and convenient, while having the enduring presence and guarantee of future access? Rongorongo itself is a physical artefact for which we have lost the knowledge of how to read it, or even of whether it is a text at all. It’s a reminder that we have faced this problem before and have not solved it sufficiently well. Perhaps this time.


AAEE 2012 – Yes, Another Conference

In between writing up the conventicle (which I’m not doing yet), the CI Conference (which I’m doing slowly) and sleep (infrequent), I’m attending the Australasian Association for Engineering Education 2012 conference. Today, I presented a paper on e-Enhancing existing courses and, through a co-author, another paper on authentic teaching tool creation experiences.

My first paper gave me a chance to look at the Google analytics and tracking data for the on-line material I created in 2009. Since then, there have been:

  • 11,118 page views
  • 2.99 pages viewed/visit
  • 1,721 unique visitors
  • 3,715 visits overall

The other thing that is interesting is that roughly 60% of the viewers return to view the podcasts again. The theme of my talk was “Is E-Enhancement Worth It?” and I had the pleasure of pointing out that I felt that it was because, as I was presenting, I was simultaneously being streamed giving my thoughts on computer networks to students in Singapore and (strangely enough) Germany. As I said in the talk and in the following discussion, the podcasts are far from perfect and, to increase their longevity, I need to make them shorter and more aligned to a single concept.
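The figures above hang together; a two-line sanity check (using only the numbers quoted, nothing more) recovers the reported pages-per-visit value:

```python
# Sanity check on the analytics figures quoted above
page_views = 11118
visits = 3715
unique_visitors = 1721

print(f"{page_views / visits:.2f} pages/visit")          # recovers the reported 2.99
print(f"{visits / unique_visitors:.1f} visits/visitor")  # each visitor returned roughly twice
```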

Why?

Because while the way I present concepts may change, because of sequencing and scaffolding changes, the way that I present an individual concept is more likely to remain the same over time. My next step is to make up a series of conceptual podcasts that are maybe 3-5 minutes in duration. Then the challenge is how to assemble these – I have ideas but not enough time.

One of the ideas raised today is that we are seeing the rise of the digital native, a new type of human acclimatised to a short gratification loop, multi-tasking, and a non-linear mode of learning. I must be honest and say that everything I’ve read on the multi-tasking aspect, at least, leads me to believe that this new generation don’t multi-task any better than anyone else did. If they do two things, then they do them more slowly and don’t achieve the same depth: there’s no shortage of research work on this and, given the limits of working memory and cognition, this makes a great deal of sense. Please note, I’m not saying that Homo Multiplexor can’t emerge; it’s just that I have not yet seen any strong scientific evidence to back up the anecdotes. I’m perfectly willing to believe that default searching activities have changed (storing ways of searching rather than the information itself) because that is a logical way to reduce cognitive load, but I am yet to see strong evidence that my students can do two things at once well and without any loss of time. Either working memory has completely changed, which we should be able to test, or we risk confusing the appearance of doing two things at once with actually doing two things at once.

This is one of those situations that, as one of my colleagues observed, leaves us in that difficult position of being told, with great certainty, about a given student (often someone’s child) who can achieve great things while simultaneously watching TV and playing WoW. Again, I do not rule out the possibility of a significant change in humanity (we’re good at it) but I have often seen that familiar tight smile and the noncommittal nod as someone doesn’t quite acknowledge that your child is somehow the spearhead of a new parallelised human genus.

It’s difficult sometimes to express ideas like this. Compare this to the numbers I cited above. Everyone who reads this will look at those numbers and, while they will think many things, they are unlikely to think “I don’t believe that”. Yet I know that there are people who have read this and immediately snorted (or the equivalent) because they frankly disbelieve me on the multi-tasking, with no more or less hard evidence than that supporting the numbers. I’m actually expecting some comments on this one because the notion of the increasing ability of young people to multitask is so entrenched. If there is a definitive set of work supporting this, then I welcome it. The only problem is that all I can find supports the original work on working memory and associated concepts – there are only so many things you can focus on and, beyond that, you might be able to function but not at much depth. (There are exceptions, of course, but the 0.1% of society do not define the rule.)

The numbers are pasted straight out of my Google analytics for the learning materials I put up – yet you have no more reason to believe them than if I said “83% of internet statistics are made up”, which is a made up statistic. (If it is true, it is accidentally true.) We see again one of the great challenges in education: numbers are convincing, evidence that contradicts anecdote is often seen as wrong, and finding evidence in the first place can be hard.

One more day of conference tomorrow! I can only wonder what we’ll be exposed to.


David and Goliath: Who Needs The Strategy? (CI 2012 Masterclass)

I attended a second masterclass on the first day of Creative Innovations, this one entitled “Strategic Diagnosis and Action”, given by Richard Rumelt from UCLA. Richard gave a fascinating and well-polished presentation on how you can actually get a strategy that you can use, as opposed to something that you can print up, put on the wall, and completely ignore. Richard’s definition of strategy is pretty straightforward:

A strategy is a coherent mix of policy and action designed to surmount a high-stakes challenge.

As I believe I’ve noted before, having a strategy is not just useful in terms of knowing where you’re going, it also allows you to make a choice between two (apparently) equal choices. Richard’s question is “What are we going to do in order to meet a challenge?” and, in my application, this makes any choice a matter of “which of these choices will give me the greatest assistance in meeting the challenge?”

As Richard said in his talk, when David met Goliath, if you’re Goliath you don’t think you need a strategy. David, however, has a high-stakes challenge and better be as fast with his mind as he is with his feet. Goliath winning? That’s not a strategy story; a strategy story is about the discovery or creation of strength (where we’re surprised to see it).

David has a post-negotiation debrief with the head of Goliath industries.

Another point that was clearly emphasised is that if you take a challenge focus to your strategy, your strategy can be specific to that challenge and, as a result, clearer and more goal focused. The example of Apple was given. When Steve Jobs returned in 1997, he had to implement a new strategy but it wasn’t one of growth or market domination, it was a survival strategy. Cutting 15 desktops to 1? Survival. Cutting 5 of the 6 national retailers? Survival. Off-shoring everything possible to reduce expenditure? Survival. This is a coherent strategy with clear and sharp action – this is a survival strategy. The strategy is all about addressing the most important challenge that you have. If it’s survival, fix that first.

The speaker had apparently spoken to Jobs in 1998 and Jobs had said that he was going to “wait for the next big thing”. Well, in 1998 that made sense. Rather than being a second-rate (or small share) PC producer, Apple’s approach was to find a new market where they could dominate. The survival strategy kept the company going for long enough that they could switch to a new strategy of dominating the new music and mobile markets. And, of course, by doing this, Jobs got to set the new rules for that area. There’s a reason that iPods, by default, only work with iTunes and that Apple has complete vertical control. That reason is predominantly because it allows Apple to totally control that market, to avoid having to go through the hard lessons of 1997 and 1998 again.

So, taking this into my Educational Sector setting, what is the strategy that Universities should be employing? Well, first of all, the global tertiary sector is not one business, so we’re restricted to individual institution decision making, even where state and federal guidelines are in play. The survival strategy is, to me, effectively off the table. If global education is under an extinction threat, then we are facing a catastrophe of such proportion that human survival is probably the requisite strategy. If the MOOC is so successful, and of the required quality, that it can replace the University, then a survival strategy for the Unis is ethically questionable, as we would be spending more money to achieve the same result (assuming that MOOCs, when fully costed, end up being cheaper).

So, either way we slice it, a survival strategy for Universities doesn’t actually look like a valid one. But what does a strategy for a University look like anyway? Let’s step back and ask what Richard thinks a general strategy is and isn’t.

Firstly, strategy is not a set of goals and, according to Richard, you know it’s a bad strategy when it’s all performance goals and no diagnosis and analysis. “We will increase revenue by 20%.” Great. How?

You know it’s a bad strategy when it’s all fluff. “Our fundamental strategy is one of customer-centric intermediation” from a bank. Good, you’re a retail bank, now what? How do you apply a values statement meaningfully to 30,000 people? Richard sees this as a childish approach – a third grade recitation of “I will not chew gum in school” and not productive when contaminating a strategy.

If no-one has bothered to diagnose what the problem is – bad strategy. To act with intelligence and to get a good strategy, you need to define the nature of the problem. (One of the most refreshing things about the new strategy that is about to be released for my Uni is that I know that a great deal of problem definition underpins it – so I’m quite looking forward to reading it when it’s released shortly. I am quite hopeful that little if any of the critique here will apply.)

What about if you have 47 strategies, 178 action items and Action Item #122 is “Develop a strategic plan”? It’s a dog’s dinner into which everything that could be found in the fridge has been thrown, with no discipline, diagnosis, analysis or thought.

So how do we make a good strategy? Diagnose the challenge. Provide guiding policy. Build a set of coherent actions into the strategy and don’t just provide goals as if they are self-solving problem elements.

In terms of Universities, and the whole higher education sector, this means that we not only have to work out what our challenges are, but we have to pick challenges that we can solve. (A previous Prime Minister of Australia famously declared that “by 1990, no Australian child will be living in poverty.” Given that the definition of poverty in Australia is relative to the affluence enjoyed by other sectors, rather than the ‘true’ international definition of subsistence, this is declaring war on an unwinnable challenge. The goal “Stop the drug trade” is equally fraught as it requires legal powers that do not, and may not, exist.)

What are the key challenges facing Universities? Well, if we take survival off the table for the institutions, we change the challenge focus to “what are the key challenges facing the post-school education market in Australia?” and that gives us an entirely new lens on the problem.

I have a lot of thinking to do but, as I said, I’m looking forward to what our new strategic plan will look like for the next 5 years, because I hope that it will help me to identify a subset of challenges that I can look at. Having done that, then I can ask “which are the ones I can help to solve?”


Killing Your Darlings: The Cost of Innovation (CI 2012)

I’m going to take a little more informal approach to some of the themes expressed at CI 2012, because I have a lot of things to do, and you have a lot of things to do, so we can’t sit here waiting for me to write everything up, and you most certainly don’t want to read 100,000 words about What Nick Did In Late Spring In Melbourne. So let’s go forward.

Innovation is the introduction of the new, whether product, service or idea, but we know what this really means – it means that we have to let go of something old. Letting go of something old is not going to be easy, and how difficult it is can be a very complicated and emotional calculus, so innovation, which can already be hard, is made harder because change can hurt.

If you’re a writer, you may have heard the term “Kill your darlings”, which is attributed to Faulkner (the other one) and is a recasting of the following quote from Sir Arthur Quiller-Couch:

“Whenever you feel an impulse to perpetrate a piece of exceptionally fine writing, obey it – whole-heartedly – and delete it before sending your manuscripts to press. Murder your darlings”

On shallow reading, it appears that any attachment to something makes it eligible for extinction when what is really meant is that sentimentality is the enemy of objectivity. Innovative change is full of situations where your attachment to elements of your existing situation, or an entrenched commitment to the status quo (no, not the band), will compromise your ability to objectively assess whether you are making a correct decision.

There is a statement that every industry will go away at some stage – we’ve seen the rise and fall of so many that such a statement appears to have some credibility. But what about education? We have changed a great deal but will the industry of education ever truly disappear? I honestly can’t say but I can talk about a simpler problem, which is what the “darlings” are in the traditional Higher Education system. And, sure enough, when we start talking about innovation and the threat of the new, we see these darlings protected in a way that doesn’t necessarily always seem objective. Now, we don’t have to kill any of them but change is inevitable and, if change is to come in, something has to go out. I have a starting list, which I’m planning to work on over time.

  1. Darling #1, The Lecture:

    We know that the traditional 1-to-many broadcast lecture is a successful way to occupy the time of everyone in the room but it is most certainly not the best way to get certain types of information across. There are many different aspects to this but conference talks and seminars are a world away from the traditional “today I will talk slowly about differential equations while I flash hundreds of slides past you at a speed that you can’t record and no you can’t have any notes or recording”.

    Yes, some lecturers are better than others but when information transfer and retention is important, the lecture is not the right delivery mechanism. Yet, it’s almost unassailable in its ubiquity. It’s a darling.

  2. Darling #2, The Exam:

    I was looking back at my Grand Challenges course, which had a 20% final examination of some of the core topics, and thought about what it had achieved. From my marking of the exam and review of how students prepared, my goal for the exam worked for most of the class. Most had reviewed all of the core material and organised it in a useful way to be able to summarise the core content of the course.

    But did it have to be assigned as a 1 hour exam in a giant examination hall? Did it add anything to the course?

    You know, I’m not sure that it did. Next time, I might just assign an exercise to produce a portfolio of work from the course in an organised form, and then assess that with what is effectively a viva voce examination, to check that students had done enough work to produce a useful index and had sufficient familiarity to rapidly contextualise problems and knowledge. But, and this is important, far more conversationally.

    The examination can be made highly objective and has the advantage that you are really pretty sure that the student is doing the work – but we’re already seeing cheating technology that we will have more and more trouble dealing with. If the only supporting argument for the exam is that it’s harder to cheat, we need a better reason. If the argument is that it will force the student to learn the work, then we’ve got that around the wrong way. We need to bring motivation back into the rest of the course. Right now, the vast majority of learning happens 2-3 days before the exam and is forgotten by the following weekend.

    And yet, exams are everywhere. They’re entrenched institutional artefacts. Hello, darling.

  3. Darling #3, Me and my University:

    Oh no! Apostasy! But let’s be honest, the primary question around MOOCs is whether we need the Universities that we’ve had for so many hundreds of years. If we’re questioning the University, then we’re starting to question the role and future of the teaching academic. Teacherless education was a theme that popped up occasionally at CI 2012 and, while I instinctively react to this in terms of ‘well, who builds these experiences?’, we can still learn a lot by looking at what we actually need to make things work.

    I have a small office in a big and old University, with my academic robes hanging on the door for when I walk into the graduation ceremony in the giant old sandstone building once or so every year to farewell and congratulate my graduating students. How much of this is necessary recognition of achievement and how much is a darling?

    Let’s face it – we’re darlings ourselves.

Let me stress that I am not saying that everything must go, but innovation needs space and that means something else has to go. Rather than saying that everything is sacrosanct, we should really be looking at what can and should go, which will drive a search for the new and innovative. My hope would be that by looking at these things, we find the reasons why some of these could stay and belong in the future, rather than propping them up with sentimentality and an ultimately weak approach to necessary change and reinvigoration.

What are your darlings?


More on the Change Lab: Creative Innovations 2012 Day 1 (still)

(Yeah, I’m slowly adding content. I just came from a dinner that pretty much defies description so you’ll have to just give me love for actually taking the time to write this at midnight instead of going to bed. 🙂 )

I spoke before about the Change Lab but here are the key steps.

1. An innovative approach that is systemic, participative and creative
2. Collective effort to address a vital, complex challenge in a given system
3. A committed alliance of political, economic and cultural leaders in the system
4. A rhythmic process of acting and reflecting
5. A structured container for building capacities for co-initiating, co-sensing, co-presencing, co-creating and co-evolving
6. A safe space for practising how to exercise both power and love

Whoah – what? Power and love? This is a form of framing to show how two very different camps think about the work.

POWER: One camp says that the only thing that matters is individual interests, ambitions and the capacity to act.
LOVE: The other camp focuses on what is good for the whole, the best solution; that is the only thing that matters.

Here’s the quote from the Reverend Martin Luther King, Junior:

“Power without love is reckless and abusive. Love without power is sentimental and anaemic.”

So we must attend to both the power and the love as part of the whole.

Ok, this kind of thing is easy to say, but the guy who was saying it was Dr Adam Kahane – he’s gone to places to look at difficult situations and, if he thinks this works, then I’m willing to listen. Adam was in Bogota with politicians and militia in one room, including people who had made death threats against each other, and had guerrillas calling in on the phone to be part of the scenario generation.

“Do I have to agree to a ceasefire to take part in the scenario?” (Random guerrilla)

(The answer was no – no preconditions to the scenario because you just wanted people in the same place)

What we have to face is that some problems are so big that it will take more than our friends and family to solve them. We may have to work with strangers – or our enemies.

Think about that.

You’ve been fighting someone for so long that you don’t really remember all the details – but you know that you hate each other and that you have both done bad things to each other recently. Suddenly, something comes up and it’s huge. It’s a wicked problem, one that is complex and hard to deal with or even understand. You can’t solve it alone. Your enemy can’t solve it alone.

Can you solve the initial problem of getting these two people into the room just to even talk about things? Then, having done that, can you work out how to work together on the thing that threatens you both and, somehow, act in concert to deal with it?

What if it’s so big that it’s bigger than both of you? Now, not only do you have to work together on something, you have to find someone else who will work with your semi-dysfunctional mutual hatred society. Maybe the only person who can help you is the person that you both hate second to each other? Point 2 of the approach talks about a collective effort, and point 3 demands that the leaders, the people with agency who can change things, are the people who should be at the table.

Does it have to be Kings, warlords or CEOs? It depends on how entrenched they are in the status quo. If all the CEO is going to say is “Hey, we’re great”, then send someone who is nearly as powerful but actually has their eyes open.

I spoke to Adam tonight at the dinner and our exchange went like this:

Nick: “Thanks for a great talk, Adam. Listening to you talk about Bogota gave me hope. I don’t have to deal with warlords and guerrillas, I just have to get some academics around a table.”

Adam: “You’re welcome, but I was trying to change academics for years and I just gave up. Remember Kissinger’s quote about academics? (Kissinger, who was apparently quoting Sayre on Issawi)”

Issawi (from the grave): “In any dispute the intensity of feeling is inversely proportional to the value of the issues at stake.”

Sayre (from the grave): “That is why academic politics are so bitter.”

Now I realise that Adam was being facetious, he’s a highly amusing man, but it is slightly scary to hear this from a man who was willing to tell people to stop complaining about sitting next to someone who had tried to kill them five times, because he was trying to stop the sixth attempt.

I like the Power and Love framing – I think I’m far too prone to that sentimental ‘love’ approach, without giving enough attention to the requirement of people to be people! I think I’ll have to buy Adam’s book tomorrow!


Creative Innovations 2012

So much to blog about from the conventicle but, surprise!, I’m not at home, I’m in a hotel preparing to attend the first day of the Creative Innovations 2012 conference. I have a ‘wild card’ entry (sponsored ticket) courtesy of the Vice President of Services and Resources of my University and I’m really looking forward to it.

This is not a free lunch. (Readers of fine literature will know that there ain’t no such thing as a free lunch.) I need to look at the activities of the next three days through a lens that could bring five concrete proposals back to the University. I must be honest – I had been expecting something like this because it’s too good an opportunity to go to waste. Take a group of people from the Uni and throw them into a giant melting pot of entrepreneurs and creative thinkers… well, you’d hope to get at least five ideas!

Our Uni is a big place, with many complex systems, so I’ll definitely have my thinking cap on for the next few days!

This entry is short because I suspect I’ll be live blogging quite extensively tomorrow.

And I have my conventicle notes to write up as well.

Expect a lot from me over the next few days!


First Adelaide Computing Education Conventicle

Well, my hosting duties are done and I’m relaxing at home, having hosted the first successful Adelaide Computing Education Conventicle! I’m absolutely exhausted and I have to jump on a plane very soon and so I crave your indulgence because today’s post is going to be a reposting of my welcoming speech to the Conventicle. My thanks to all of the guests, presenters and attendees – we started a new tradition well. I look forward to filling in the details over the next few days. Without any further ado, here is my speech:

“Welcome to the first Adelaide Computing Education Conventicle.

I would first like to acknowledge that we are meeting on the traditional country of the Kaurna people of the Adelaide plains, the original inhabitants of the land upon which the University of Adelaide was built, and who have shared with us a name for this building. Ingkarni Wardli means ‘place of learning’ or, my favourite, ‘the house of inquiry’ and is the first building in the University’s history to have a Kaurna name. I recognise and respect their cultural heritage, beliefs and relationship with the land, and I acknowledge that they are of continuing importance to the Kaurna people living today.

In the spirit of today’s events, I would like to share with you the history of the name of this building, to emphasise the importance of today’s meeting – a meeting of people who are dedicated to learning, to knowledge and to sharing what they know with other people. This building had a working name of “Innova21” but a new name was always sought and, after a great deal of discussion, the then-Dean, Professor Peter Dowd, decided to seek advice on a name from the Kaurna people.

It would have been very easy to look at what we, as outsiders, know of the Kaurna language and pick a name that seemed right – especially when the word for knowledge “Ingkarni” was so close to the word “Innova”. However, the Kaurna language is protected by its custodians, because of people with less than perfect understanding or, in some extreme cases, a desire to exploit by association, so we needed to seek approval before the naming. As it turns out, calling the building “Ingkarni” by itself would have been nonsensical and would have undone the intent of the namers, which was to recognise and respect the cultural traditions of the Kaurna, in their role as educators.

If you have ever had the good fortune to hear the Kaurna Elder, Uncle Lewis O’Brien, you will know that the Kaurna placed great value on education and were respected among the neighbouring communities as educators and conference leaders. When big decisions were being made, when important knowledge had to be shared, the Kaurna were generally to be consulted and would have an instrumental role in the process. What better name for a building that contains science and education than the name “House of Enquiry” from a people who were known for their knowledge and their importance in the sharing of wisdom?

Today, we gather to discuss our knowledge of education, to share our successes and to understand and to seek to address those areas where we are yet to succeed. I would like to thank the Australian Council of Deans of ICT’s Learning and Teaching Academy, for funding both me and Simon under the Fellows program. I would like to thank the inimitable Simon for his encouragement to run this, and to thank our other interstate guest, Dr Raymond Lister, for being here today to share his research. I would also like to thank you all for agreeing to present, or to just show up and listen. It is far easier to ignore alternative approaches to learning and teaching than it is to sit in a room and prepare to discover that you might be able to do things differently, with greater effect. I welcome you all and I hope that this is the first of a long and fruitful cycle of Conventicles. It is now my pleasure to introduce Simon!”

 


Game Design and Boredom: Learning From What I Like

For those of you poor deluded souls who are long term readers (or long term “receivers of e-mail that you file under the ‘read while anaesthetised’ folder”) you will remember that I talked about producing a zombie game some time ago and was crawling around the house to work out how fast you could travel as a legless zombie. Some of you (well, one of you – thanks, Mark) have even sent me appropriately English pictures to put into my London-based game. Yet, as you can see, there is not yet a game.

What happened?

The first thing I wanted to do was to go through the design process and work out if I could produce a playable game that worked well. Along the way, however, I’ve discovered a lot about games because I have been thinking in far more detail about games and about why I like to play the games that I enjoy. To quote my previous post:

I play a number of board games but, before you think “Oh no, not Monopoly!”, these are along the lines of the German-style board games, games that place some emphasis on strategy, don’t depend too heavily on luck, may have collaborative elements (or an entirely collaborative theme), tend not to be straight war games and manage to keep all the players in the game until the end.

What I failed to mention, you might notice, is that I expect these games to be fun. As it turns out, the first design for the game actually managed to meet all of the above requirements and, yet, was not fun in any way at all. I realised that I had fallen into a trap that I am often prone to, which is that I was trying to impose a narrative over a set of events that could actually occur in any order or any way.

Ever prepared for a class, with lots of materials for one specific area, and then the class takes a sudden shift in direction (it turns out that the class haven’t assimilated a certain foundation concept) and all of that careful work has to be put away for later? Sometimes it doesn’t matter how much you prepare – life happens and your carefully planned activities get derailed. Even if you don’t get any content surprises, it doesn’t take much to upset the applecart (a fire alarm goes off, for example) and one of the signs of the good educator is the ability to adapt to continue to bring the important points to the learner, no matter what happens. Walking in with a fixed narrative of how the semester is going to roll out is unlikely to meet the requirements of all of your students and if something goes wrong, you’re stuffed (to use the delightful Australian vernacular, which seems oddly appropriate around Thanksgiving).

In my head, while putting my game together, I had thought of a set of exciting stories, rather than a possible set of goals, events and rules that could apply to any combination of players and situations. When people have the opportunity to explore, they become more engaged and they tend to own the experience more. This is what I loved about the game Deus Ex, the illusion of free will, and I felt that I constructed my own narrative in there, despite actually choosing from one of the three that were on offer on carefully hidden rails that you didn’t see until you’d played it through a few times.

Still my favourite computer game!

Apart from anything else, I had made the game design dull. There is nothing exciting about laying out hexagonal tiles to some algorithm, unless you are getting to pick the strategy, so my ‘random starting map’ was one of the first things to go. London has a number of areas and, by choosing a fixed board layout that increased or decreased based on player numbers, I got enough variation by randomising placement on a fixed map.

I love the game Arkham Horror but I don’t play it very often, despite owning all of the expansions. Why? The set-up and pack-up time take ages. Deck after deck of cards, some hundreds high, some 2-3, have to be placed out onto a steadily shrinking playing area and, on occasion, a player getting a certain reward will stop the game for 5-10 minutes as we desperately search for the appropriate sub-pack and specific card that they have earned. The game company that released Arkham has now released iPhone apps that allow you to monitor cards on your phone but, given that each expansion management app is an additional fee and that I have already paid money for the expansions themselves, this has actually added an additional layer of irritation. The game company recognises that their system is painful but now wish to charge me more money to reduce the problem! I realised that my ‘lay out the hexes’ for the game was boring set-up and a barrier to fun.

The other thing I had to realise is that nobody really cares about realism or, at least, there is only so much realism people need. I had originally allowed for players to be soldiers, scientists, police, medical people, spies and administrators. Who really wants to be the player responsible for the budgetary allocation of a large covert government facility? Just because the administrator has narrative value doesn’t mean that the character will be fun to play! Similarly, why the separation between scientists and doctors? All that means is I have the unpleasant situation where the doctors can’t research the cure and the scientists can’t go into the field because they have no bandaging skill. If I’m writing a scenario as a novel or short story, I can control the level of engagement for each character because I’m writing the script. In a randomised series of events, no-one is quite sure who will be needed where and the cardinal rule of a game is that it should be fun. In fact, that final goal of keeping all players in the game until the end should be an explicit statement that all players are useful in the game until the end.

The games I like are varied but the games that I play have several characteristics in common. They do not take a long time to set up or pack away. They allow every player to matter, up until the end. Whether working together or working against each other, everyone feels useful. There is not so much randomness that you can be destroyed by a bad roll but there is not so much predictability that you can coast after the second round. The games I really like to play are also forgiving. I am playing some strategy games at the moment and, for at least two of them, decisions made in the first two rounds will affect the entire game. I must say that I’m playing them to see if that is my lack of ability or a facet of the game. If it turns out to be the game, I’ll stop playing because I don’t need to have a game berating me for making a mistake 10 rounds previously. It’s not what I call fun.

I hope to have some more time to work on this over the summer but, as a design exercise, it has been really rewarding for me to think about. I understand myself more and I understand games more – and this means that I am enjoying the games that I do play more as well!


Waiting for Another Apocalypse

Many of you will know that the 21st of December is a date that has great significance to eschatologists. Now, while you might think “Why do I care about people who sing ‘bee boop doodly oop’ to Jazz?” I’m actually talking about people who are interested in studying the eschaton (Wikipedia), the end times or the last days of humanity. Most traditions have an end of the world event contained within but what happens after that moment varies far more widely than the ‘world will end’ event that most have ancient writings for. Some believe that great transformation will occur to unite all of humanity, some believe that there will be some sort of giant shift in consciousness and some believe that it will mark the end – as was so amusingly illustrated in that recent scientific tract from John Cusack, “2012”.

There is a fundamental point here, of course: when you have two or more conflicting claims, and the claims are mutually exclusive, then in the absence of any other evidence you can say that at least one of them must be wrong. It’s also worth noting that even where you have agreement on something, if it isn’t supported by evidence, there is no guarantee that anyone is actually right. Can we say that everyone is wrong? No, of course we can’t, and this is where situations like this provide an excellent way to talk about controversial but important facets of human thought with students, without having to actually try to control or undermine their existing sets of faith and belief. But, given that most of our students are now aware that they have lived through at least two predicted calamitous eschaton events prior to now, the next one provides an opportunity to look at how information, belief and culture interact.

The December 21st date is mostly related to the Mayan long count and the end of this particular b’ak’tun, a period of 144,000 days. However, Mayanist scholars note that interpreting this event as the ‘end of the calendar’ is not accurate and, apart from anything else, the Mayans referred to events beyond this date. (If you’re convinced that the world is ending on Tuesday, then making a note to pick up your shirts on Thursday is either absent-mindedness or a lack of conviction.) Basically, yes, the end of the 13th b’ak’tun may have been noteworthy, a matter for celebration, but far more along the lines of the Fresh Prince and Jazzy Jeff re-uniting for a Y2K music video than the harbinger of the eschaton. The problem is that, much like pyramid power and magic water, it doesn’t take much fuel to get certain engines running and it would be fair to say that a group of people have run a very long way with the idea that the end of the world is coming in less than a month.

We all handle the eschaton in our own ways. Will Smith apparently stands above it.

However, looking around, it’s pretty obvious that this is not a mainstream belief although it is widespread information. We have had a number of these dates come and go, Y2K itself was one of them if you happened to be a millennialist, and some groups have had more of them than others. It’s understandable that there is now some accumulated cynicism about this particular date, although we do have enough perceived knowledge penetration into the mainstream community that Hollywood was willing to bankroll cinema that combines conspiracy theory, Mayan calendars, eschatology, and some particularly bizarre politics and ethics in the movie “2012”.

Breaking this down, we have the Mayan Calendar, which provides a date that we can map into our calendar and it appears that this date was noteworthy, in some way, but we don’t get anything else because the Mayans tended to record historically, rather than prophetically. That the 13th b’ak’tun is going to end soon is a fact, and let’s assume that everyone has done the correct adjustments for calendar shenanigans in the Western/Christian calendar. Where we have records of Mayan prophecy, they don’t make a big deal about this change. In fact, if anything, they were completely aware that a cycle had preceded this one and they certainly hoped that their world would continue into the next one.

This is where I like to start a discussion about data, information and knowledge, after a rather contentious hierarchy that could be claimed by many (Ackoff springs to mind although Wisdom is, unnecessarily and confusingly in my opinion, added on top. I cite Quigley most often, but Sowa also discusses it well.) The data of the Mayan 13th b’ak’tun is the total number of days from the start of that cycle, which is 13*144,000: 1,872,000 days. By itself, if you gave someone that, they have a value but no structure, no context and no way to use it. Putting it into the Mayan Long Count format gives us 13.0.0.0.0. Now, with structure, we can see that we have 13 b’ak’tuns, 0 k’atuns, 0 tuns, 0 uinals and 0 days. Much like turning 730,500 days into 2000 years (I didn’t do this precisely, I just multiplied by 365.25, before anyone checks), we now see structure and we have some context for the value.
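For the curious, that raw-days-to-Long-Count conversion is just repeated integer division by the standard Mayan place values (144,000 days to a b’ak’tun, 7,200 to a k’atun, 360 to a tun, 20 to a uinal, 1 to a k’in). A minimal sketch in Python, with the function name being my own invention:

```python
# Mayan Long Count place values in days, largest to smallest:
# b'ak'tun, k'atun, tun, uinal, k'in
UNITS = [144_000, 7_200, 360, 20, 1]

def to_long_count(days):
    """Convert a raw day count into dotted Long Count notation."""
    parts = []
    for unit in UNITS:
        parts.append(days // unit)  # how many of this unit fit
        days %= unit                # carry the remainder down
    return ".".join(str(p) for p in parts)

print(to_long_count(13 * 144_000))  # the 13th b'ak'tun: 13.0.0.0.0
print(to_long_count(1))             # one day in: 0.0.0.0.1
```

The same five numbers are just the data restructured: the value 1,872,000 becomes 13.0.0.0.0, which is the “information” step of the hierarchy discussed above.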

What we do not yet have is any understanding of how we would use this in order to make significant decisions and, as such, there is no knowledge implicitly associated with this that could tell us anything other than “this is a date with a lot of zeros”. After all, if your car odometer flips over to 20,000 miles/kilometres, that is merely a figure with a lot of zeros, unless you can associate this with a servicing schedule that says “come in when you hit 20,000”. Once we have correctly contextualised the information in a way that we can make decisions, we have knowledge. This is a great opportunity to talk with students about things like occult or secret knowledge, where great weight is placed upon the hidden or ritual knowledge of lost or ancient cultures, because of a perceived significance of a greater wisdom from these older cultures. (And this is the foundation of conspiracy theory, where wisdom is associated with occult knowledge of what the faceless they are up to. Not knowing these secret facts makes you a rube, or someone whose opinion may be discounted. Wake up, sheeple!) Without having to say whether anything is right or wrong, because it is impossible to make strong statements either way in most of these areas, we can look at how numbers (or facts) are placed into structures and how these structures can then be drawn upon and extended in ways that we would interpret as concrete or rational, and in ways where we see any number of reasoning or philosophical fallacies. We can also talk about cultural misappropriation and how the transporting of ideas from one culture to another sometimes just doesn’t work, because we don’t really have enough information or a correct cultural context to make any sense out of it.

Of course, the fallacy fallacy is the great out for everyone here because a fallacious argument does not mean that the idea is, itself, wrong. Thinking about all of this is important because it can help to identify where our facts have been taken up and used in ways that are, ultimately, not really well grounded in terms of their interpretation as knowledge. Certainly in neo-Piagetian terms, students are very prone to magical thinking when they start to learn in a new area (pre-operational) and being able to discuss magical thinking in other areas, even down to the notion of mimicry and cargo-cultism, can help to broach the idea that, somewhere in the reasoning process, a leap has been made that is not necessarily supported.

Having said all this, I shall be highly surprised if the end of the world does occur on December the 21st, but I hope that you will understand why I do not publish an apology on the 22nd.