Thinking about teaching spaces: if you’re a lecturer, shouldn’t you be lecturing?
Posted: December 30, 2012 Filed under: Education | Tags: blogging, collaboration, community, curriculum, design, education, educational problem, feedback, Generation Why, higher education, in the student's head, learning, measurement, principles of design, reflection, resources, student perspective, teaching, teaching approaches, thinking, tools, universal principles of design, vygotsky

I was reading a comment on a philosophical post the other day and someone wrote this rather snarky line:
He is a philosopher in the same way that (celebrity historian) is a historian – he's somehow got the job description and uses it to repeat the prejudices of his paymasters, flattering them into thinking that what they believe isn't, somehow, ludicrous. (Grangousier, Metafilter article 123174)
Rather harsh words in many respects, and it's my alteration of the (celebrity historian)'s name, not his, as I feel that his comments are mildly unfair. However, the point is interesting, as a reflection upon the importance of job title in our society, especially when it comes to the weighted authority of your words. From January the 1st, I will be a senior lecturer at an Australian University and that is perceived differently where I am. If I am in the US, I reinterpret this title into their system, namely as a tenured Associate Professor, because that's the equivalent of what I am – the term 'lecturer' doesn't clearly translate without causing problems, not even dealing with the fact that most lecturers in Australia have PhDs, whereas many lecturers in the US do not. But this post isn't about how people necessarily see our job descriptions, it's very much about how we use them.
In many respects, the title ‘lecturer’ is rather confusing because it appears, like builder, nurse or pilot, to contain the verb of one’s practice. One of the big changes in education has been the steady acceptance of constructivism, where the learners have an active role in the construction of knowledge and we are facilitating learning, in many ways, to a greater extent than we are teaching. This does not mean that teachers shouldn’t teach, because this is far more generic than the binding of lecturers to lecturing, but it does challenge the mental image that pops up when we think about teaching.
If I asked you to visualise a classroom situation, what would you think of? What facilities are there? Where are the students? Where is the teacher? What resources are around the room, on the desks, on the walls? How big is it?
Take a minute to do just this and make some brief notes as to what was in there. Then come back here.
It’s okay, I’ll still be here!
False Dichotomy: If I don’t understand it, then either I am worthless or it is!
Posted: December 29, 2012 Filed under: Education | Tags: authenticity, collaboration, community, curriculum, education, educational research, ethics, higher education, reflection, resources, student perspective, teaching, teaching approaches, thinking, tools

I've been reading an interesting post on Metafilter about "Minima Moralia: Reflections from Damaged Life" by Theodor Adorno. While the book itself is very interesting, two of the comments on the article caught my eye. An earlier commenter had mentioned that they neither understood nor appreciated this kind of thing, and made the usual throwaway remark about postmodernism being "a scam to funnel money from the productive classes to the parasitical academy" (dydecker). Further down, another commenter, Frowner, gently took this statement to task, starting by noting that Adorno would have been appalled by being labelled a post-modernist, and then discussing why dydecker might have felt the need to attack things in this way. It's very much worth reading Frowner's comments on this post, but I shall distil the first one here:
- Just because a text is difficult or obscure does not mean that it is postmodern. Also, post-modernist is not actually an insult, and this may be a politically motivated stance to attack a group of people who are also likely to identify as status-quo critical or (gasp) Marxist.
- Not all texts need to be accessible to all audiences, nor is something worthless, fake or elitist if it requires pre-readings or some effort to get into. Advanced physics texts can be very difficult to comprehend for the layperson. This does not make Quantum Field Theory wrong or a leftist conspiracy.
- You don’t need to read books that you don’t want to read.
- You don’t need to be angry at difficult books for being difficult. To exactly quote Frowner,
Difficult books only threaten us if we decide to feel guilty and ashamed for not reading them.
If you’re actually studying an area, and read the books that the work relies upon, difficult books can become much clearer, illustrating that it was perhaps not the book that was causing the difficulty.
- Sometimes you won’t like something and this has nothing to do with its quality or worth – you just don’t like it.
- Don’t picture a perfect reader in your head who understands everything and hold yourself to that standard. If you’re reading a hard book then keep plugging away and accept your humanity.
Frowner then goes on to beautifully summarise all of this in a later comment, where he notes that we seem to learn to be angry at, or uncomfortable with, difficult texts, because we are under pressure to be capable of understanding everything of worth. This is an argument of legitimacy: if the work is legitimate and I don't understand it, then I am stupid; however, if I can argue that the work is illegitimate, then this is a terrible con job, I am not stupid for not understanding it, and we should attack this work! Frowner wonders about how we are prepared for the world and believes that we are encouraged to see ourselves as inadequate if we do not understand everything for ourselves, hence the forced separation of work into legitimate and illegitimate, with an immediate, and often vicious, attack on those things we define as illegitimate in order to protect our image of ourselves.
I spend a reasonable amount of time in art galleries and I wish I had a dollar for everyone who stood in front of a piece of modern art (anything from the neo-impressionists on, basically) and felt the need to loudly state that they "didn't get it" or that they could "have painted it themselves." (I like Rothko, Mondrian and Klee, among others, so I am often in that part of the gallery.) It is quite strange when you come to think about it – why on earth are people actually vocalising this? Looking more closely, it is (less surprisingly) people in groups of two or more who seem to do this: I don't understand this so, before you ask me about it, I will declare it to be without worth. I didn't get it, therefore this art has failed me. We go back to Frowner's list and look at point 2: Not all art (in this case) is for everyone and that's ok. I can admire Grant Wood's skill and his painting "American Gothic" but the painting doesn't appeal as much to me as does the work of Schiele, for example. That's ok, that doesn't make Schiele better than Wood in some Universal Absolute Fantasy League of Painters (although the Schiele/Klimt tag team wrestling duo, with their infamous Golden Coat Move, would be fun to watch) – it's a matter of preference. I regularly look at things that I don't quite understand but I don't regard it as a challenge or an indication that either it or I am at fault, although I do see things that I understand completely and can quite happily identify reasons that I don't like them!

Klee’s “The Goldfish”. Some will see this as art, others will say “my kids could do that”. Unless you are Hans Wilhelm Klee, no, probably not.
I am, however, very lucky, because I have a job and lifestyle where my ability to think about things is a core component: falsely dichotomous thinking is not actually what I’m paid to do. However, I do have influence over students and I need to be very careful in how I present information to them. In my last course, I deliberately referred to Wikipedia among other documents because it is designed to be understood and is usually shaped by many hands until it reaches an acceptable standard of readability. I could have pointed my students at ethics texts but these texts often require more preparation and a different course structure, which may have put students off actually reading and understanding them. If my students go into ethics, or whatever other area they deem interesting, then point 4 becomes valid and their interest, and contextual framing, can turn what would have been a difficult book into a useful book.
I agree with this (effectively) anonymous poster and his or her summary of an ongoing issue: we make it hard for people to admit that they are learning, that they haven't quite worked something out yet, because we make "not getting something immediately" a sign of slowness (informally) and often with negative outcomes (in assessment or course and career progression). We do not have to be experts at everything, nor should we pretend to be. We risk not actually learning some important and beautiful things because we feel obliged to reject them before they reject us – and some things, of great worth that will be long appreciated, take longer to 'get' than just the minute or two that we feel we can allocate.
Adelaide Computing Education Conventicle 2012: “It’s all about the people”
Posted: December 27, 2012 Filed under: Education | Tags: acec2012, advocacy, authenticity, blogging, collaboration, community, conventicle, curriculum, design, education, educational problem, educational research, ethics, feedback, Generation Why, grand challenge, higher education, in the student's head, learning, principles of design, reconciliation, reflection, resources, student perspective, teaching, teaching approaches, thinking, universal principles of design

ACEC 2012 was designed to be a cross-University event (that's the whole point of the conventicles, they bring together people from a region) and we had a paper from the University of South Australia: '"It's all about the people"; building cultural competence in IT graduates' by Andrew Duff, Kathy Darzanos and Mark Osborne. Andrew and Kathy came along to present and the paper was very well received, because it dealt with an important need and a solid solution to address that need, which was inclusive, insightful and respectful.
For those who are not Australians, it is very important to remember that the original inhabitants of Australia have not fared very well since white settlement and that the apology for what happened under many white governments, up until very recently, was only given in the past decade. There is still a distance between the communities and the overall process of bringing our communities together is referred to as reconciliation. Our University has a reconciliation statement and certain goals in terms of representation in our staff and student bodies that reflect percentages in the community, to reduce the underrepresentation of indigenous Australians and to offer them the same opportunities. There are many challenges facing Australia, and the health and social issues in our indigenous communities are often exacerbated by years of poverty and a range of other issues, but some of the communities have a highly vested interest in some large-scale technical, ICT and engineering solutions, areas where indigenous Australians are generally not students. Professor Lester Irabinna Rigney, the Dean of Aboriginal Education, identified the problem succinctly at a recent meeting: when your people live on land that is 0.7m above sea level, a 0.9m sea-level rise starts to become of concern and he would really like students from his community to be involved in building the sea walls that address this, while we look for other solutions!
Andrew, Kathy and Mark's aim was to share out the commitment to reconciliation across the student body, making this a whole-of-community participation rather than a heavy burden for a few, under the guiding statement that they wanted to be doing things with the indigenous community, rather than doing things to them. There's always a risk of premature claiming of expertise, where instead of working with a group to find out what they want, you walk in and tell them what they need. For a whole range of very good and often heartbreaking reasons, the Australian indigenous communities are exceedingly wary when people start ordering them about. This was the first thing I liked about this approach: let's not make the same mistakes again. The authors were looking for a way to embed cultural awareness and the process of reconciliation into the curriculum as part of an IT program, sharing it so that other people could do it and making it practical.
Their key tenets were:
- It's all about the diverse people. They developed a program to introduce students to culture, to give them more than the single world view of the dominant culture and to introduce knowledge of the original Australians. It's important to note that many Australians have no idea how to use certain terms or cultural items from indigenous culture, which of course hampers communication and interaction.
The students were required to put together an IT proposal, working with the indigenous community, that they would implement in the later years of their degree. Thus, it became part of the backbone of their entire program.
- Doing with [people], not to [people]. As discussed, there are many good reasons for this. Reduce the urge to be the expert and, instead, look at existing statements of rights and how to work with other peoples, such as the UN rights of indigenous peoples and the UniSA graduate attributes. This all comes together in the ICUP – Indigenous Content in Undergraduate Program.
How do we deal with information management in another culture? I’ve discussed before the (to many) quite alien idea that knowledge can reside with one person and, until that person chooses or needs to hand on that knowledge, that is the person that you need. Now, instead of demanding knowledge and conformity to some documentary standard, you have to work with people. Talking rather than imposing, getting the client’s genuine understanding of the project and their need – how does the client feel about this?
Not only were students working with indigenous people in developing their IT projects, they were learning how to work with other peoples, not just other people, and were required to come up with technologically appropriate solutions that met the client need. Not everyone has infinite power and 4G LTE to run their systems, nor can everyone stump up the cash to buy an iPhone or download apps. Much as programming in embedded systems shakes students out of the ‘infinite memory, disk and power’ illusion, working with other communities in Australia shakes them out of the single worldview and from the, often disrespectful, way that we deal with each other. The core here is thinking about different communities and the fact that different people have different requirements. Sometimes you have to wait to speak to the right person, rather than the available person.
The online forum, which is overseen by an indigenous tutor, poses four questions that students have to work through. The four questions are:
- What does culture mean to you?
- Post a cultural artefact that describes your culture?
- I came here to study Computer Science – not Aboriginal Australians?
- What are some of the differences between Aboriginal and non-Aboriginal Australians?
The first two are amazing questions – what is your answer to question number 2? The second pair of questions are more challenging and illustrate the bold, head-on nature of this participative approach to reconciliation. Reconciliation between all of the Australian communities requires everyone to be involved and, being honest, questions 3 and 4 are going to open up some wounds and drag some silly thinking out into the open but, most importantly, they allow us to talk through issues of concern and confusion.
I suspect that many people can’t really answer question 4 without referring back to mid-50s archetypal depictions of Australian Aborigines standing on one leg, looking out over cliffs, and there’s an excellent ACMI (Australian Centre for the Moving Image) exhibit in Melbourne that discusses this cultural misappropriation and stereotyping. One of the things that resonated with me is that asking these questions forces people to think about these things, rather than repeating old mind grooves and received nonsense overheard in pubs, seen on TV and heard in racist jokes.
I was delighted that this paper was able to be presented, not least because the goal of the team is to share this approach in the hope of achieving even greater strides in the reconciliation process. I hope to be able to bring some of it to my Uni over the next couple of years.
John Henry Died
Posted: December 23, 2012 Filed under: Education | Tags: advocacy, authenticity, blogging, collaboration, community, curriculum, design, education, educational problem, ethics, feedback, Generation Why, grand challenge, higher education, in the student's head, john henry, learning, meritocracy, principles of design, reflection, student perspective, teaching, teaching approaches, thinking, workload

Every culture has its myths and legends, especially surrounding those incredible individuals who stand out or tower over the rest of the society. The Ancient Greeks and Romans had their gods, demigods, heroes and, many times, cautionary tales of the mortals who got caught in the middle. Australia has the stories of pre- and post-federation mateship, often anti-authoritarian or highlighting the role of the larrikin. We have a lot of bushrangers (with suspiciously good hearts or reacting against terrible police oppression), Simpson and his donkey (a First World War hero who transported men to an aid station using his donkey, ultimately dying on the battlefield) and a Prime Minister who goes on YouTube to announce that she's now convinced that the Mayans were right and we're all doomed – tongue firmly in cheek. Is this the totality of the real Australia? No, but the stylised notion of 'mateship', the gentle knock and the "come off the grass, you officious … person" attitude are as much a part of how many Australians see themselves as shrimp on a barbie is to many US folk looking at us. In any Australian war story, you are probably more likely to hear about the terrible hangover that Gunner Suggs had and how he dragged his friend a kilometre over rough stones to keep him safe, than you are to hear about how many people he killed. (I note that this mateship is often strongly delineated over gender and racial lines, but it's still a big part of the Australian story.)
The stories that we tell, and those that we pass on as part of our culture, strongly shape that culture. Look at Greek mythology and you see stern warnings against hubris – don't rate yourself too highly or the gods will cut you down. Set yourself up too high in Australian culture and you're going to get knocked down as well: a 'tall poppies' syndrome that is part cultural cringe, inherited from colonial attitudes to the Antipodes, part hubris and part cultural confusion as Anglo, Euro, Asian, African and… well, everyone, come to terms with a country that took even the original inhabitants, the Australian Aboriginal and Torres Strait Islander peoples, quite a while to adapt to. As someone who wasn't born in Australia, like so many others who live here and now call themselves Australian, I've spent a long time looking at my adopted homeland's stories to see how to fit. Along the way, because of travel, I've had the opportunity to look at other cultures as well: the UK, obviously, as it's drummed into you at school, and the US, because it interests me.
The stories of Horatio Alger, from the US, fascinate me, because of their repeated statement of the rags to riches story. While most of Alger’s protagonists never become amazingly wealthy, they rise, through their own merits, to take the opportunities presented to them and, because of this, a good man will always rise. This is, fundamentally, the American Dream – that any person can become President, effectively, through the skills that they have and through rolling up their sleeves. We see this Dream become ugly when any of the three principles no longer hold, in a framing I first read from Professor Harlon Dalton:
- The notion that we are judged solely on our merits: For this to be true, we must not have any bias, racist, gendered, religious, ageist or other. Given the recent ruling that an attractive person can be sacked, purely for being attractive and for providing an irresistible attraction for their boss, we have evidence that not only is this point not holding in many places, it's not holding in ways that beggar belief.
- We will each have a fair opportunity to develop these merits: This assumes equal opportunity in terms of education and in terms of jobs, which promptly ignores things like school districts, differing property tax levels and teacher training approaches; because of the way that school districts work, just living in a given state or county because your parents live there (and can't move) can make the difference between a great education and a sub-standard child-minding service. So this doesn't hold either.
- Merit will out: Look around. Is the best, smartest, most talented person running your organisation or making up all of the key positions? Can you locate anyone in the "important people above me" who is holding that job for reasons other than true, relevant merit?
Australia's myths are beneficial in some ways and destructive in others. For my students, the notion that we help each other, that we question but still try to get things done, is a positive interpretation of the mild anti-authoritarian mateship focus. The downside is drinking buddies going on a rampage and covering up for each other, fighting the police when the police are actually acting reasonably, and public vandalism because of a desire to act up. The mateship myth hides a lot of racism, especially towards our indigenous community, and we can probably salvage a notion of community and collaboration from mateship, while losing some of the ugly and dumb things.
Horatio Alger myths would give hope, except for the bleak reality that many people face: the three principles are giant pieces of baloney that people get hit about the head with. If you're not succeeding, then Horatio Alger reasoning lets us call you lazy or stupid, or say that you're just not taking the opportunities. You're not trying to pull yourself up by your bootstraps hard enough. Worse still, trying to live up to this, sometimes impossible, guideline leads us into John Henryism. John Henry was a steel driver, who hammered and chiselled the rock through the mountains to build tunnels for the railroad. One day the boss brought in a steam-driven hammer and John Henry bet that he could beat it, to show that he and his crew should not be replaced. After a mammoth battle between man and machine, John Henry won, only to die with the hammer in his hand.
Let me recap: John Henry died – and the boss still got a full day’s work that was equal to two steam-hammers. (One of my objections to “It’s a Wonderful Life” is that the rich man gets away with stealing the money – that’s not a fairy tale, it’s a nightmare!) John Henryism occurs when people work so hard to lift themselves up by their bootstraps that they nearly (or do) kill themselves. Men in their 50s with incredibly high blood pressure, ulcers and arthritis know what I’m talking about here. The mantra of the John Henryist is:
“When things don’t go the way that I want them to, that just makes me work even harder.”
There’s nothing intrinsically wrong with this when your goal is actually achievable and you apply this maxim in moderation. At its extreme, and for those people who have people standing on their boot caps, this is a recipe to achieve a great deal for whoever is benefiting from your labour.
And then dying.
As John Henry observes in the ballad (Springsteen version), "I'll hammer my fool self to death". The ballad of John Henry is actually a cautionary tale about setting your pace carefully: if you're going to swing a hammer all day, every day, then you have to do it at a pace that won't kill you. This is the natural constraint on Horatio Alger and it balances all of the issues with merit and access to opportunity: don't kill your "fool self" striving for something that you can't achieve. It's a shame, however, that the stories line up like this, because there's a lot of hopelessness sitting in that junction.
Dealing with students always makes me think very carefully about the stories I tell and the stories I live. Over the next few days, I hope to put together some thoughts on a 21st century myth form that inspires without demanding this level of sacrifice, and that encourages without forcing people into despair if existing obstacles block them – and it’s beyond their current control to shift. However, on that last point, what I’d really like to come up with is a story that encourages people to talk about obstacles and then work together to lift them out of the way. I do like a challenge, after all. 🙂
“We are not providing an MIT education on the web…”
Posted: December 21, 2012 Filed under: Education | Tags: advocacy, blogging, collaboration, community, curriculum, design, education, educational research, ethics, Generation Why, higher education, in the student's head, learning, measurement, moocs, principles of design, reflection, resources, student perspective, teaching, teaching approaches, thinking, vygotsky

I've been re-watching some older announcements that describe open courseware initiatives, starting from one of the biggest, the MIT announcement of their OpenCourseWare (OCW) initiative in April 2001. The title of this post actually comes from the video, around the 5:20 mark. (Video quoted under a CC-BY-NC-SA licence; more information available at: http://ocw.mit.edu/terms)
"Let me be very clear, we are not providing an MIT education on the Web. We are, however, providing core materials that are the infrastructure that undergirds that information. Real education, in our view, involves interaction between people. It's the interaction between faculty and students, in our classrooms and our living group, in our laboratories that are the heart, the real essence, of an MIT education."
While the OCW was going to be produced and used on campus, the development of OCW was seen as something that would make more time available for student interaction, not less. President Vest then goes on to confidently predict that OCW will not make any difference to enrolment, which is hardly surprising given that he has categorically excluded anyone from achieving an MIT education unless they enrol. We see here exactly the same discussion that keeps coming up: these materials can be used as augmenting materials in these conventional universities but can never, in the view of the President or Vice Chancellor, replace the actual experience of obtaining a degree from that institution.
Now, don’t get me wrong. I still think that the OCW initiative was excellent, generous and visionary but we are still looking at two fundamentally different use cases: the use of OCW to augment an existing experience and the use of OCW to bootstrap a completely new experience, which is not of the same order. It’s a discussion that we keep having – what happens to my Uni if I use EdX courses from another institution? Well, ok, let’s ask that question differently. I will look at this from two sides with the introduction of a new skill and knowledge area that becomes ubiquitous, in my sphere, Computer Science and programming. Let’s look at this in terms of growth and success.
What happens if schools start teaching programming to first year level?
Let’s say that we get programming into every single national curriculum for secondary school and we can guarantee that students come in knowing how to program to freshman level. There are two ways of looking at this and the first, which we have probably all seen to some degree, is to regard the school teaching as inferior and re-teach it. The net result of this will be bored students, low engagement and we will be wasting our time. The second, far more productive, approach is to say “Great! You can program. Now let’s do some Computer Science.” and we use that extra year or so to increase our discipline knowledge or put breadth courses back in so our students come out a little more well-rounded. What’s the difference between students learning it from school before they come to us, or through an EdX course on fundamental programming after they come to us?
Not much, really, as long as we make sure that the course meets our requirements – and, in fact, it gives us bricks-and-mortar-bound entities more time to do all that face-to-face interactive University stuff that we know students love and from which they derive great benefit. University stops being semi-vocational in some aspects and we leap into knowledge construction, idea generation, big projects and the grand dreams that we always talk about, yet often don’t get to because we have to train people in basic programming, drafting, and so on. Do we give them course credit? No, because they’re assumed knowledge, or barrier tested, and they’re not necessarily part of our structure anymore.
What happens if no-one wants to take my course anymore?
Now, we know that we can change our courses because we've done it so many times before over the history of the Academy – Latin, which along with Greek had been the language of scholarship, was used in only half of the University publications of 1800. Let me wander through a classical garden for a moment to discuss the nature of change from a different angle, that of decline. Languages had a special place in the degrees of my University, with Latin and Greek dominating and then with the daring possibility of allowing substitution of French or German for Latin or Greek from 1938. It was as recently as 1958 that Latin stopped being compulsory for high school graduation in Adelaide, although it was still required for the study of Law – student demand for Latin at school therefore plummeted and Latin courses started being dropped from the school curriculum. The Law Latin requirement was removed around 1969-1970, which then dropped any demand for Latin even further. The reduction in the number of school teachers who could teach Latin required the introduction of courses at the University for students who had studied no Latin at all – Latin IA entered the syllabus. However, given that in 2007 only one student at all of the schools across the state of South Australia (roughly 1.2-1.4 million people) studied Latin in the final year of school, it is apparent that if this University wishes to teach Latin, it has to start by teaching all of Latin. This is a course, and a discipline, that is currently in decline. My fear is that, one day, someone will make the mistake of thinking that we no longer need scholars of this language. And that worries me, because I don't know what people 30 years from now will actually want, or what they could add to the knowledge that we already have of one of our most influential civilisations.
This decline is not unique to Latin (or Greek, or classics in general), but a truly on-line course experience would allow us to pool those scholars we have left and offer scaled resources for much longer than isolated pockets in real offices can manage – although, as President Vest notes, a storehouse of Latin texts does not a course make. What reduced the demand for Latin? Possibly the ubiquity of the Latin-derived language that we use, combined with a change of focus away from a classical education towards a more job- and achievement-oriented (semi-vocational) style of education. If you ask me, programming could as easily go this way in about 20 years, once we have ways to let machines solve problems for us. A move towards a less go-go-go culture, smarter machines and a resurgence of the long leisure cycles associated with Science Fiction visions of the future, and suddenly it is the engineers and the computer scientists who are looking at shrinking departments and no support in the schools. Let me be blunt: course popularity and desirability rises, stabilises and falls, and it's very hard to tell if we are looking at a parabola or a pendulum. With that in mind, we should be very careful about how we define our traditions and our conventions, especially as our cunning tools for supporting on-line learning and teaching get better and better. Yes, interaction is an essential part of a good education, no argument at all, but the critical mass that we have seen, time and again, implicitly supporting this interaction in a face-to-face environment is as much a function of popularity and traditionally-associated prestige as it is of excellence.
What are MIT doing now?
I look at the original OCW release and I agree that, at time of production, you could not reproduce the interaction between people that would give you an MIT education. But our tools are better now. They are, quite probably, not close enough yet to give you an "MIT of the Internet", but should this be our goal? Not the production of a facsimile of the core materials that might, with MIT instructors, turn into a course, but the commitment to developing the tools that actually reproduce the successful components of the learning experience, with group and personal interaction, allowing the formation of what we used to call a physical interactive experience in a virtual space? That's where I think the new MIT initiatives are showing us how these things can work now, starting from their original idealistic roots and adding the technology of the 21st Century. I hope that other, equally prestigious, institutions are watching this, carefully.
Legitimisation and Agency: I Believe That’s My Ox on Your Bridge
Posted: December 19, 2012 Filed under: Education | Tags: advocacy, blogging, collaboration, community, curriculum, design, education, educational problem, Generation Why, grand challenge, higher education, learning, principles of design, reflection, resources, student perspective, teaching, teaching approaches, thinking, tools, workload

There's an infamous newspaper advertisement that never ran, which reflected the entry of IBM into the minicomputer market. A number of companies, Data General principal among them, but including such (historically) powerful players as Digital Equipment Corporation, Prime and Hewlett Packard, were quite successful in the minicomputer market, growing rapidly and stealing market share from IBM's mainframe market. (For an excellent account of these times, I recommend "The Soul of a New Machine" by Tracy Kidder.) IBM finally decided to enter the minicomputer market and, as analysts remarked at the time, IBM's move into minicomputers legitimised the market.
Ed DeCastro, CEO of Data General, had a full-page newspaper advertisement prepared, which I reproduce (mildly bowdlerised to keep my all-ages posting status):
“They Say IBM’s Entry Into the Minicomputer Market Will Legitimize the Industry. The B***ards Say, Welcome.”
The ad never actually ran but was framed and put on Ed’s wall. The point, however, was well and precisely made: IBM’s approval was neither required nor desired, and nobody had set a goal of being legitimised.
Over on Mark's blog, we see that a large number of UK universities are banding together to launch an on-line project, including the highly successful existing player in the analogous space, the Open University, but also some high-powered players such as Southampton and the disturbingly successful St Andrews. As Mark notes in the title, this is a serious change in terms of allying a UK effort that will produce a competitor (or competitors) to the existing US dominance. As Mark also notes:
Hmm — OxBridge isn’t throwing hats into the rings yet.
And this is a very thoughtful Hmm, because the Universities of Oxford and Cambridge are the impossible-to-ignore legitimising agencies, given their sheer weight on the rubber sheet of UK Academy Spacetime. When it comes to talking about groups of Universities in the UK, and believe me there are quite a few, the Russell Group awards the lion's share of PhDs, and holds 78% of the most highly graded research staff as well, across its 24 Universities. One of its stated goals is to lead the research efforts of the UK, with another being to attract the best staff and students to its member institutions. However, the group of participants in the new on-line project involves Russell Group Universities and those outside, which makes the non-participation of Oxford and Cambridge even more interesting. How can a trans-group on-line proposal bring the best students in – or is this why we aren't seeing involvement from Oxbridge, because of the two-tier perception between traditional and on-line? One can easily argue that Oxford and Cambridge have no need to participate because they are so entrenched in their roles and their success that, as I've noted in a different post, any ranking system that rates them out of, say, the top 5 in the UK has made itself suspect as a ranking, rather than a reflection of dropping quality. Oxbridge is at the heart of the UK's tertiary system and competition to gain entry will continue to be fierce for the foreseeable future. They have no need to get together with the others in their group or beyond, although it's not about protecting themselves from competitors, as they are not really in competition with most of the other Russell Group members – because they are Oxford and Cambridge.
It’s worth noting that Cambridge’s vice-chancellor Leszek Borysiewicz did think that this consortium was exciting, and I quote from the THE article:
“Online education is becoming an important approach which may open substantial opportunities to those without access to conventional universities,” he said.
And that pretty much confirms why Cambridge is happy to stand back – because they are almost the definition of a conventional university, catering to a well-established market for whom attending a bricks-and-mortar University is as important as (if not more important than) the course content or delivery mechanisms. The "Gentleman's Third", receiving the lowest possible passing grade for your degree examinations, indicates a dedication to many things at the University that are, most likely, of a less-than-scholarly nature, but it is precisely for these activities that some people go to Oxford and Cambridge and it is also precisely these non-scholarly activities that we will have great difficulty transferring into a MOOC. There will be no Oxford-Cambridge boat race carried out on a browser-based Flash game, with distributed participants hooked up to rowing machines across the globe, nor will the Footlights be conducted as a Google Hangout (except, of course, highly ironically).
Over time, we'll find out more about the role of tradition and convention in the composition and participation, but let me return to my opening anecdote. We are already dealing with issues of legitimacy in the on-line learning space, whether from pedagogical fatigue, academic cultural inertia, xenophobia, or the fact that some highly vaunted previous efforts have not been very good. The absence of two of the top three Universities in the UK in this fascinating and potentially quite fruitful collaboration makes me think a lot about IBM. I think of someone sitting back, watching things happen, certain in the knowledge that what they do is what the market needs and it is, oh happy day, what they are currently doing. When Oxford and Cambridge come in and anoint the MOOC, if they ever do or if we ever can, then we have the same antique avuncular approach to patting an entire sector on the head and saying "oh, well done, but the grownups are here now", and this is unlikely to result in anything good in terms of fellow feeling or transferability and accreditation of students, key challenges in MOOCs being taken more seriously. Right now, Oxford and Cambridge are choosing not to step in, and there is no doubt that they will continue to be excellent Universities for their traditional attendees – but is this a sensible long-term survival strategy? Could they be contributing to the exploration of the space in a productive manner by putting their legitimising weight in sooner rather than later, at a time when they are saying "Let's all look at this to see if it's any good", rather than going "Oh, hell. Now we have to do something"? Would there be much greater benefit in bringing in their considerable expertise, teaching and research excellence, and resources now, when there is so much room for ground-level innovation?
This is certainly something I'm fearful of in my own system, where the Group of 8 Universities has most of the research funding, most of the higher degree granting and, as a goal at least, targets the best staff and students. Our size and tradition can be barriers to agility and innovation, although our recent strategy is obviously trying to set our University on a more innovative and more agile course. A number of recent local projects are embracing the legitimacy of new learning and teaching approaches. It is, however, very important to remember the example of IBM and how the holders of tradition may not necessarily be welcomed as a legitimising influence when others have been highly successful innovating in a new space, which the tradition holder had deemed beneath them until reality finally intruded.
It’s easy to stand back and say “Well, that’s fine for people who can’t afford mainframes” but such a stance must be balanced with looking to see whether people still need or want to afford mainframes. I think the future of education is heavily blended – MOOC + face-to-face is somewhere where I think we can do great things – but for now it’s very interesting to see how we develop as we start to take more and more steps down this path.
Game Design and Boredom: Learning From What I Like
Posted: November 25, 2012 Filed under: Education, Opinion | Tags: authenticity, blogging, collaboration, community, curriculum, data visualisation, design, education, games, Generation Why, higher education, in the student's head, learning, principles of design, reflection, resources, teaching, teaching approaches, thinking, tools, zombies

For those of you poor deluded souls who are long-term readers (or long-term "receivers of e-mail that you file under the 'read while anaesthetised' folder"), you will remember that I talked about producing a zombie game some time ago and was crawling around the house to work out how fast you could travel as a legless zombie. Some of you (well, one of you – thanks, Mark) have even sent me appropriately English pictures to put into my London-based game. Yet, as you can see, there is not yet a game.
What happened?
The first thing I wanted to do was to go through the design process and work out if I could produce a playable game that worked well. Along the way, however, I've discovered a lot about games because I have been thinking in far more detail about games and about why I like to play the games that I enjoy. To quote my previous post:
I play a number of board games but, before you think “Oh no, not Monopoly!”, these are along the lines of the German-style board games, games that place some emphasis on strategy, don’t depend too heavily on luck, may have collaborative elements (or an entirely collaborative theme), tend not to be straight war games and manage to keep all the players in the game until the end.
What I failed to mention, you might notice, is that I expect these games to be fun. As it turns out, the first design for the game actually managed to meet all of the above requirements and, yet, was not fun in any way at all. I realised that I had fallen into a trap that I am often prone to, which is that I was trying to impose a narrative over a set of events that could actually occur in any order or any way.
Ever prepared for a class, with lots of materials for one specific area, and then the class takes a sudden shift in direction (it turns out that the class haven’t assimilated a certain foundation concept) and all of that careful work has to be put away for later? Sometimes it doesn’t matter how much you prepare – life happens and your carefully planned activities get derailed. Even if you don’t get any content surprises, it doesn’t take much to upset the applecart (a fire alarm goes off, for example) and one of the signs of the good educator is the ability to adapt to continue to bring the important points to the learner, no matter what happens. Walking in with a fixed narrative of how the semester is going to roll out is unlikely to meet the requirements of all of your students and if something goes wrong, you’re stuffed (to use the delightful Australian vernacular, which seems oddly appropriate around Thanksgiving).
In my head, while putting my game together, I had thought of a set of exciting stories, rather than a possible set of goals, events and rules that could apply to any combination of players and situations. When people have the opportunity to explore, they become more engaged and they tend to own the experience more. This is what I loved about the game Deus Ex, the illusion of free will, and I felt that I constructed my own narrative in there, despite actually choosing from one of the three that was on offer on carefully hidden rails that you didn’t see until you’d played it through a few times.
Apart from anything else, I had made the game design dull. There is nothing exciting about laying out hexagonal tiles to some algorithm, unless you are getting to pick the strategy, so my ‘random starting map’ was one of the first things to go. London has a number of areas and, by choosing a fixed board layout that increased or decreased based on player numbers, I got enough variation by randomising placement on a fixed map.
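For what it's worth, here is a minimal sketch of that second design, in Python, with every area and feature name invented for illustration (this is not the actual game's data): the board layout is fixed and simply grows with the player count, and only the placement of features on it is randomised.

```python
import random

# All names below are hypothetical placeholders, not the real game content.
BASE_AREAS = ["Westminster", "Camden", "Hackney", "Southwark"]
EXTRA_AREAS = ["Islington", "Greenwich", "Lambeth", "Brent"]
FEATURES = ["hospital", "army depot", "lab", "safehouse",
            "horde", "supply cache", "radio mast", "barricade"]

def set_up_board(num_players, seed=None):
    """Fixed layout that grows or shrinks with player count; only the
    placement of features on that layout is randomised."""
    rng = random.Random(seed)
    areas = (BASE_AREAS + EXTRA_AREAS)[: 2 * num_players]
    features = FEATURES[: len(areas)]
    rng.shuffle(features)
    return dict(zip(areas, features))

if __name__ == "__main__":
    for area, feature in set_up_board(num_players=3, seed=1).items():
        print(f"{area}: {feature}")
```

The point of the design choice is that set-up stays trivial (no tile-laying algorithm to execute by hand) while each game still opens differently.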
I love the game Arkham Horror but I don’t play it very often, despite owning all of the expansions. Why? The set-up and pack-up time take ages. Deck after deck of cards, some hundreds high, some 2-3, have to be placed out onto a steadily shrinking playing area and, on occasion, a player getting a certain reward will stop the game for 5-10 minutes as we desperately search for the appropriate sub-pack and specific card that they have earned. The game company that released Arkham has now released iPhone apps that allow you to monitor cards on your phone but, given that each expansion management app is an additional fee and that I have already paid money for the expansions themselves, this has actually added an additional layer of irritation. The game company recognises that their system is painful but now wish to charge me more money to reduce the problem! I realised that my ‘lay out the hexes’ for the game was boring set-up and a barrier to fun.
The other thing I had to realise is that nobody really cares about realism or, at least, there is only so much realism people need. I had originally allowed for players to be soldiers, scientists, police, medical people, spies and administrators. Who really wants to be the player responsible for the budgetary allocation of a large covert government facility? Just because the administrator has narrative value doesn't mean that the character will be fun to play! Similarly, why the separation between scientists and doctors? All that means is I have the unpleasant situation where the doctors can't research the cure and the scientists can't go into the field because they have no bandaging skill. If I'm writing a scenario as a novel or short story, I can control the level of engagement for each character because I'm writing the script. In a randomised series of events, no-one is quite sure who will be needed where, and the cardinal rule of a game is that it should be fun. In fact, that final goal of keeping all players in the game until the end should be an explicit statement that all players are useful in the game until the end.
The games I like are varied but the games that I play have several characteristics in common. They do not take a long time to set up or pack away. They allow every player to matter, up until the end. Whether working together or working against each other, everyone feels useful. There is not so much randomness that you can be destroyed by a bad roll, nor so much predictability that you can coast after the second round. The games I really like to play are also forgiving. I am playing some strategy games at the moment and, for at least two of them, decisions made in the first two rounds will affect the entire game. I must say that I'm playing them to see if that is my lack of ability or a facet of the game. If it turns out to be the game, I'll stop playing because I don't need a game berating me for making a mistake 10 rounds previously. It's not what I call fun.
I hope to have some more time to work on this over the summer but, as a design exercise, it has been really rewarding for me to think about. I understand myself more and I understand games more – and this means that I am enjoying the games that I do play more as well!
Ebb and Flow – Monitoring Systems Without Intrusion
Posted: November 23, 2012 Filed under: Education | Tags: collaboration, community, curriculum, data visualisation, education, educational problem, educational research, ethics, feedback, Generation Why, higher education, in the student's head, learning, measurement, MIKE, principles of design, reflection, resources, student perspective, SWEDE, teaching, teaching approaches, thinking, tools

I've been wishing a lot of people "Happy Thanksgiving" today because, despite being frightfully Antipodean, I have a lot of friends and family who are Thanksgiving observers in the US. However, I would know that something was up in the US anyway because I am missing about 40% of my standard viewers on my blog. Today is an honorary Sunday – hooray, sleep-ins all round! More seriously, this illustrates one of the most interesting things about measurement, which is measuring long enough to be able to determine when something out of the ordinary occurs. As I've already discussed, I can tell when I've been linked to a higher profile blog because my read count surges. I also can tell when I haven't been using attractive pictures because the count drops by about 30%.

A fruit bat, in recovery, about to drink its special fruit smoothie. (Yes, this is shameless manipulation.)
This is because I know what the day-to-day operation of the blog looks like and I can spot anomalies. When I was a network admin, I could often tell when something was going wrong on the network just because of the way that certain network operations started to feel, and often well before these problems reached the level where they would trigger any sort of alarm. It’s the same for people who’ve lived by the same patch of sea for thirty years. They’ll look at what appears to be a flat sea on a calm day and tell you not to go out – because they can read a number of things from the system and those things mean ‘danger’.
One of the reasons that the network example is useful is because any time you send data through the network to see what happens, you're actually using the network to do it. So network probes will actually consume network bandwidth and this may either mask or exacerbate your problems, depending on how unlucky you are. However, using the network for day-to-day operations, and sensing that something is off, then gives you a reason to run those probes or to check the counters on your networking gear to find out exactly why the hair on the back of your neck is going up.
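As a rough sketch of that kind of passive baseline check (the numbers here are made up, and this is just one simple way to do it, not a claim about how I actually monitor anything), you can learn what an ordinary day looks like from counts you already collect and flag a day that falls well outside it, without sending a single extra probe:

```python
from statistics import mean, stdev

def is_anomalous(history, today, threshold=2.0):
    """Flag `today` if it sits more than `threshold` standard deviations
    from the recent baseline. `history` is any daily count you already
    collect (blog views, packets, submissions) - no extra probes needed."""
    baseline = mean(history)
    spread = stdev(history)
    if spread == 0:
        return today != baseline
    return abs(today - baseline) / spread > threshold

# Made-up daily view counts for a fortnight, then a Thanksgiving-sized dip.
recent_views = [410, 395, 402, 388, 420, 415, 399, 407, 412, 390, 405, 418, 400, 396]
print(is_anomalous(recent_views, today=240))  # True: roughly 40% of readers missing
print(is_anomalous(recent_views, today=405))  # False: an ordinary day
```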
I observe the behaviour of my students a lot and I try to gain as much information as I can from what they already give me. That’s one of the reasons that I’m so interested in assignment submissions, because students are going to submit assignments anyway and any extra information I can get from this is a giant bonus! I am running a follow-up Piazza activity on our remote campus and I’m fascinated to be able to watch the developing activity because it tells me who is participating and how they are participating. For those who haven’t heard about Piazza, it’s like a Wiki but instead of the Wiki model of “edit first, then argue into shape”, Piazza encourages a “discuss first and write after consensus” model. I put up the Piazza assignment for the class, with a mid-December deadline, and I’ve already had tens of registered discussions, some of which are leading to edits. Of course, not all groups are active yet and, come Monday, I’ll send out a reminder e-mail and chat to them privately. Instead of sending a blanket mail to everyone saying “HAVE YOU STARTED PIAZZA”, I can refine my contact based on passive observation.
The other thing about Piazza is that, once all of the assignment is over, I can still see all of their discussions, because that's where I've told them to have the discussion! As a result, we can code their answers and track the development of their answers, classifying them in terms of their group role, their level of function and so on. For an open-ended team-based problem, this allows me a great deal of insight into how much understanding my students have of the area and allows me to fine-tune my teaching. Being me, I'm really looking for ways to improve self-regulation mechanisms, as well as uncovering any new threshold concepts, but this nonintrusive monitoring has more advantages than this. I can measure participation by briefly looking at my mailbox to see how many mail messages are foldered under a particular group's ID, from anywhere, or I can go to Piazza and see it unfolding there. I can step in where I have to, but only when I have to, to get things back on track, and I don't have to probe or deconstruct a team-formed artefact to see what is going on.
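A small, hypothetical sketch of that passive participation count: the group IDs and the notification log below are invented, but the idea is simply to tally the messages that arrive anyway and see which groups I should quietly follow up with.

```python
from collections import Counter

# Invented notification log: (group_id, author) pairs harvested from the
# mail folder / Piazza feed that already exists for the assignment.
notifications = [
    ("group03", "alice"), ("group03", "bob"), ("group01", "carol"),
    ("group03", "alice"), ("group01", "dave"), ("group07", "erin"),
]
all_groups = ["group01", "group02", "group03", "group07"]

posts_per_group = Counter(group for group, _ in notifications)

# Complete silence is the one pattern I can already call undesirable.
quiet_groups = [g for g in all_groups if posts_per_group[g] == 0]

print("Posts per group:", dict(posts_per_group))
print("Groups to follow up with privately:", quiet_groups)
```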
In terms of ebb and flow, the Piazza groups are still unpredictable because I don't have enough data to be able to tell you what the working pattern is for a successful group. I can tell you that a complete lack of activity is undesirable but, even early on, I could tell you some interesting things about the people who post the most! (There are some upcoming publications that will deal with things along these lines and I will post more on these later.) We've been lucky enough to secure some Summer students and I'm hoping that at least some of their work will involve looking at dependencies in communication and ebb and flow across these systems.
As you may have guessed, I like simple. I like the idea of a single dashboard that has a green light (healthy course), an orange light (sick course) and a red light (time to go back to playing guitar on the street corner) although I know it will never be that easy. However, anything that brings me closer to that is doing me a huge favour, because the less time I have to spend actively probing in the course, the less of my students’ time I take up with probes and the less of my own time I spend not knowing what is going on!
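To show how little that dashboard needs to be, here is a toy version: the metrics and thresholds are entirely invented placeholders, not anything I would claim actually predicts course health.

```python
def course_light(participation_rate, days_since_last_post, unanswered_questions):
    """Collapse a few passively collected metrics into the green/orange/red
    light described above. Every threshold here is a placeholder."""
    if participation_rate < 0.5 or unanswered_questions > 10:
        return "red"     # time for the street-corner guitar
    if participation_rate < 0.8 or days_since_last_post > 7:
        return "orange"  # sick course: worth a closer look
    return "green"       # healthy course: leave well alone

print(course_light(0.9, 2, 1))    # green
print(course_light(0.6, 9, 3))    # orange
print(course_light(0.3, 14, 12))  # red
```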
Oh well, the good news is that I think that there are only three more papers to write before the Mayan Apocalypse occurs and at least one of them will be on this. I’ll see if I can sneak in a picture of a fruit bat. 🙂
Learner Pull and Educator Push
Posted: November 21, 2012 Filed under: Education | Tags: collaboration, community, curriculum, education, educational problem, feedback, higher education, moocs, resources, teaching, teaching approaches, thinking

We were discussing some of the strategic investments that might underpin my University's progress for the next 5 years (all very hand-wavy as we don't yet have the confirmed strategy for the next 5 years) and we ended up discussing Learner Pull and Educator Push – in the context of MOOCs, unsurprisingly.
We know that if all we do is push content to people then we haven’t really undertaken any of the learning experience construction that we’re supposed to. If we expect students to mysteriously know what they need and then pull it all towards them, then we’re assuming that students are automatically self-educating and this is, fairly obviously, not universally true or there would have been no need for educational institutions for… hundreds of thousands of years.
What we actually have is a combination of push and pull from both sides, maintaining the right tension if you will, and it's something that we have to think about the moment that we talk about any kind of information storage system. A library is full of information but you have to know what you're looking for, where to find it, and you have to want to find it! I've discussed on other blogs my concerns about the disconnected nature of MOOCs and the possibility of students "cherry picking" courses that are of interest to them but lead nowhere in terms of the construction of a professional level of knowledge.
Mark Guzdial recently responded to a comment of mine to remind me of the Gates Foundation initiative to set up eight foundation courses based on MOOCs but that’s a foundation level focus – how do we get from there to fourth year engineers or computer scientists? Part of the job of the educator is to construct an environment where the students not only want the knowledge but they want, and here’s the tricky bit, the right knowledge. So rather than forcing content down the student’s throat (the incorrect assumption of educator push, in my opinion) we are creating an environment that inspires, guides and excites – and pushing that.
I know that my students have vast amounts of passion and energy – the problem is getting it directed in the right way!
It’s great to be talking about some of these philosophical issues as we look forward over the next 5-10 years because, of course, by itself the IT won’t fix any of our problems unless we use it correctly. As an Associate Dean (IT) and a former systems administrator, I know that spending money on IT is easy but it’s always very easy to spend a lot of money and make no progress. Good, solid, principles help a lot and, while we have a lot of things to sort out, it’s going to be interesting to see how things develop, especially with the concept of the MOOC floating above us.




