The Part and the Whole

I like words a lot but I especially love words that introduce me to whole new ways of thinking. I remember first learning the word synecdoche (most usually pronounced si-NEK-de-kee), where you use the word for part of something to refer to that something as a whole (or the other way around). Calling a car ‘wheels’ or champagne ‘bubbles’ are good examples of this. It’s generally interesting which parts people pick for synecdoche, because the choice emphasises what is important about something. Cars have many parts, but we refer to them as wheels and motor. I could bore you to tears with the components of champagne, but we talk about the bubbles. In these cases, placing emphasis upon one part does not diminish the physical necessity of the remaining components in the object, but it does tell us what the defining aspect of each of them is often considered to be.

Bubbles!

There are many ways to extract a defining characteristic and, rather than selecting an individual aspect for relatively simple structures (and it is terrifying that a car is simple in this discussion), we use descriptive statistics to summarise large volumes of data into measures such as mean, variance and other useful things. In this case, the characteristic we obtain is not actually part of the data that we’re looking at. This is no longer synecdoche, this is statistics, and while we can use these measures to arrive at an understanding (and potentially move to the amazing world of inferential statistics), we run the risk of talking about groups and their measurements as if the measurements had as much importance as the members of the group.
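To make the contrast concrete, here is a tiny Python sketch (with invented marks) showing that a summary statistic like the mean is a property of the group rather than a member of it:

```python
# Invented marks, purely for illustration.
marks = [48, 62, 71, 55, 90, 67, 73, 81]

mean = sum(marks) / len(marks)
variance = sum((m - mean) ** 2 for m in marks) / len(marks)

print(f"mean = {mean:.2f}, variance = {variance:.2f}")
print("Is the mean itself one of the marks?", mean in marks)  # almost always False
```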

I have been looking a lot at learning analytics recently and George Siemens makes a very useful distinction between learning analytics, academic analytics and data mining. When we analyse the various data and measures that come out of learning, we want to use this to inform human decision making to improve the learning environment, the quality of teaching and the student experience. When we look at the performance of the academy, we worry about things like overall pass rates, recruitment from external bodies and where our students go on to in their careers. Again, however, this is to assist humans in making better decisions. Finally, and not pejoratively but distinctly, data mining delves deep into everything that we have collected, looking for useful correlations that may or may not translate into human decision making. By separating our analysis of the teaching environment from our analysis of the academic support environment, we can focus on the key aspects in the specific area rather than doing strange things that try to drag change across two disparate areas.

When we start analysis, we start to see a lot of numbers: acceptable failure rates, predicted pass rates, retention figures, ATARs, GPAs. The reason that I talk about data analytics as a guide to human decision making is that the human factor reminds us to focus on the students who are part of the figures. It’s no secret that I’m opposed to curve grading because it uses a clear statement of numbers (70% of students will pass) to hide the fact that a group of real students could fail because they didn’t perform at the same level as their peers in the same class. I know more than enough about the ways that a student’s performance can be negatively affected by upbringing and prior education to know that this is not just weak sauce, but a poisonous and vicious broth to be serving to students under the guise of education.
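To see how the arithmetic of a curve plays out, here is a small sketch with invented marks, assuming a policy that passes exactly the top 70% of the class (an illustration of the idea, not any institution’s actual rule):

```python
# Invented marks for a class where everyone has performed strongly.
marks = [92, 88, 85, 83, 81, 79, 78, 76, 75, 74]

pass_fraction = 0.7                       # "70% of students will pass"
ranked = sorted(marks, reverse=True)
cutoff = int(len(ranked) * pass_fraction)
passed, failed = ranked[:cutoff], ranked[cutoff:]

print("passed:", passed)
print("failed:", failed)  # students on 74-76 fail despite strong absolute marks
```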

I can completely understand that some employers want to employ people who are able to assimilate information quickly and put it into practice. However, let’s be honest, an ability to excel at University is not necessarily an indication of that. They might coincide, certainly, but it’s no guarantee. When I applied for Officer Training in the Army, they presented me with a speed and accuracy test, as part of the battery of psychological tests, to see if I could do decision making accurately at speed while under no more stress than sitting in a little room being tested. Later on, I was tested during field training, over and over again, to see what would happen. The reason? The Army knows that the skills they need in certain areas need specific testing.

Do you want detailed knowledge? Well, the numbers conspire again to undermine you because a focus on numerical grade measures to arrive at a single characteristic value for a student’s performance (GPA) makes students focus on getting high marks rather than learning. The GPA is not the same as the wheels of the car – it has no relationship to the applicable ability of the student to arbitrary tasks nor, if I may wax poetic, does it give you a sense of the soul of the student.

We have some very exciting tools at our disposal and, with careful thought and the right attitude, there is no doubt that analytics will become a valuable way to develop learning environments, improve our academies and find new ways to do things well. But we have to remember that these aggregate measures are not people, that “10% of students” represents real, living human beings who need to be counted, and that we have a long way to go before we have an analytical approach that has a fraction of the strength of synecdoche.


The Fragile Student Relationship (working from #Unstuck by Julie Felner @felner)

I was referred some time ago to a great site called “Unstuck”, which has some accompanying iPad software, that helps you to think about how to move past those stuck moments in your life and career to get things going. They recently posted an interesting item on “How to work like a human” and I thought that a lot of what they talked about had direct relevance to how we treat students and how we work with them to achieve things. The article is by Julie Felner and I strongly suggest that you read it, but here are my thoughts on her headings, as they apply to education and students.

Ultimately, if we all work together like human beings, we’re going to get on better than if we treat our students as answer machines and they treat us as certification machines. Here’s what optimising for one thing, mechanistically, can get you:

This robot is the business at climbing hills. Dancing like a fool, not so much. It’s not human.

But if we’re going to be human, we need to be connected. Here are some signs that you’re not really connected to your students.

  1. Anything that’s not work you treat with a one word response. A student comes to see you and you don’t have time to talk about anything but assignment X or project Y. I realise time is scarce but, if we’re trying to build people, we have to talk to people, like people.
  2. You’re impatient when they take time to learn or adjust. Oh yeah, we’ve all done this. How can they not pick it up immediately? What’s wrong with them? Don’t they know I’m busy?
  3. Sleep and food are for the weak – and don’t get sick. There are no human-centred reasons for not getting something done. I’m scheduling all of these activities back-to-back for two months. If you want it, you’ll work for it.
  4. We never ask how the students are doing. By which I mean, asking genuinely and eking out a genuine response, if some prodding is required. Not intrusively but out of genuine interest. How are they doing with this course?
  5. We shut them down. Here’s the criticism. No, I don’t care about the response. No, that’s it. We’re done. End of discussion. There are times when we do have to draw an end to a discussion but there’s a big difference between closing off something that’s going nowhere and delivering everything as if no discussion is possible.

Here is my take on Julie’s suggestions for how we can be more human at work, which works for the Higher Ed community just as well.

  1. Treat every relationship as one that matters. The squeaky wheels and the high achievers get a lot of our time but all of our students are actually entitled to have the same level of relationship with us. Is it easy to get that balance? No. Is it a worthwhile goal? Yes.
  2. Generously and regularly express your gratitude. When students do something well, we should let them know, as soon as possible. I regularly thank my students for good attendance, handing things in on time, making good contributions and doing the prep work. Yes, they should be doing it but let’s not get into how many things that should be done aren’t done. I believe in this strongly and it’s one of the easiest things to start doing straight away.
  3. Don’t be too rigid about your interactions. We all have time issues but maybe you can see students and talk to them when you pass them in the corridor, if both of you have time. If someone’s been trying to see you, can you grab them from a work area or make a few minutes before or after a lecture? Can you talk with them over lunch if you’re both really pressed for time? It’s one thing to have consulting hours but it’s another to make yourself totally unavailable outside of that time. When students are seeking help, it’s when they need help the most. Always convenient? No. Always impossible to manage? No. Probably useful? Yes.
  4. Don’t pretend to be perfect. Firstly, students generally know when you’re lying to them and especially when you’re fudging your answers. Don’t know the answer? Let them know, look it up and respond when you do. Don’t know much about the course itself? Well, finding out before you start teaching is a really good idea because otherwise you’re going to be saying “I don’t know a lot” and there’s a big, big gap between showing your humanity and obviously not caring about your teaching. Fix problems when they arise and don’t try to make it appear that it wasn’t a problem. Be as honest as you can about that in your particular circumstances (some teaching environments have more disciplinary implications than others and I do get that).
  5. Make fewer assumptions about your students and ask more questions. The demographics of our student body have shifted. More of my students are in part-time or full-time work. More are older. More are married. Not all of them have gone through a particular elective path. Not every previous course contains the same materials it did 10 years ago. Every time a colleague starts a sentence with “I would have thought” or “Surely”, they are (almost always) projecting their assumptions on to the student body, rather than asking “Have you”, “Did you” or “Do you know”?

Julie made the final point that sometimes we can’t get things done to the deadline. In her words:

You sometimes have to sacrifice a deadline in order to preserve something far more important — a relationship, a person’s well-being, the quality of the work

I completely agree because deadlines are a tool but, particularly in academia, the deadline is actually rarely as important as people. If our goal is to provide a good learning environment, working our students to zombie status because “that’s what happened to us” is bordering on a cycle of abuse, rather than a commitment to quality of education.

We all want to be human with our students because that’s how we’re most likely to get them to engage with us as a human too! I liked this article and I hope you enjoyed my take on it. Thank you, Julie Felner!


5 Things: Scientists

Another 5-pointer, inspired by a post I read about the stereotypes of scientists. (I know there are just as many about other professions but scientist is one of my current ones.)

  1. We’re not all “bushy-haired” confused old white dudes.

    It’s amazing that pictures of 19th Century scientists and Einstein have had such an influence on how people portray scientists. This link shows you how academics (researchers in general, but a lot of scientists are in here) are shown to children. I wouldn’t have as much of a problem with this if it weren’t reinforcing a really negative stereotype about the potential uselessness of science (Professors who are not connected to the real world and who do foolish things) and a narrow demography (it’s almost all men, and white ones at that), both of which are more than likely having a significant impact on how kids feel about going into science.

    It’s getting better, as we can see from a Google image search for scientists, which shows a very obvious “odd man out”, but that image search actually throws up our next problem. Can you see what it is?

    Sorry, Albert.

  2. We don’t all wear white coats!

    So we may have accepted that there is demographic diversity in science (but it still has to make it through to kids’ books) but that whole white coat thing is reinforced way too frequently. Those white coats are not a uniform, they’re protective clothing. When I was a winemaker, I wore heavy duty dark-coloured cotton clothing for work because I was expecting to get sprayed with wine, cleaning products and water on a regular basis. (Winemaking is like training an alcoholic elephant with a mean sense of humour.) When I was in the lab, if I was handling certain chemicals, I threw on a white coat as part of my protective gear but also to stop it getting on my clothes, because it would permanently stain or bleach them. Now I’m a computer scientist, I’ve hung up my white coat.

    Biological scientists, scientists who work with chemicals or pharmaceuticals – any scientists who work in labs – will wear white coats. Everyone else (and there are a lot of them) tends not to. Think of it like surgical scrubs – if your GP showed up wearing them in her office then you’d think “what?” and you’d be right.

  3. Science can be a job, a profession, a calling and a hobby – but this varies from person to person.

    There’s the perception of scientist as a job so all-consuming that it robs scientists of the ability to interact with ‘normal’ people, hence stereotypes like the absent-minded Professor or the inhuman, toxic personality of the Cold Scientific Genius. Let’s tear that apart a bit because the vast majority of people in science are just not like that.

    Some jobs can only be done when you are at work. You do the work, in the work environment, then you go home and you do something else. Some jobs can be taken home. The amount of work that you do on your job, outside of your actual required working time (including overtime), is usually an indicator of how much you find it interesting. I didn’t have the facilities to make wine at home but I read a lot about it and tasted a lot of wine as part of my training and my job. (See how much cooler it sounds to say that you are ‘tasting wine’ rather than ‘I drink a lot’?) Some mechanics leave work and relax. Some work on stock cars. It doesn’t have to be any particular kind of job because people all have different interests and different hobbies, which will affect how they separate work and leisure – or blend them.

    Some scientists leave work and don’t do any thinking on things after hours. Some can think on things but not do anything because they don’t have the facilities at home. (The Large Hadron Collider cost close to USD 7 Billion, so no-one has one in their shed.) Some can think and do work at home, including Mathematicians, Computer Scientists, Engineers, Physicists, Chemists (to an extent) and others who will no doubt show up angrily in the comments. Yes, when I’m consumed with a problem, I’m thinking hard and I’m away with the pixies – but that’s because, as a Computer Scientist, I can build an entire universe to work with on my laptop and then test out interesting theories and approaches. But I have many other hobbies and, as anyone who has worked with me on art knows, I can go as deeply down the rabbit hole on selecting typefaces or colours.

    Everyone can appear absent-minded when they’re thinking about something deeply. Scientists are generally employed to think deeply about things but it’s rare that they stay in that state permanently. There are, of course, some exceptions which leads me to…

  4. Not every scientist is some sort of genius.

    Sorry, scientific community, but we all know it’s true. You have to be well-prepared, dedicated and relatively mentally agile to get a PhD but you don’t have to be crazy smart. I raise this because, all too often, I see people backing away from science and scientific books because “they wouldn’t understand it” or “they’re not smart enough for it”. Richard Feynman, an actual genius and great physicist, used to say that if he couldn’t explain it to freshmen at college then the scientific community didn’t understand it well enough. Think about that – he’s basically saying that he expects to be able to explain every well-understood scientific principle to kids fresh out of school.

    The genius stereotype is not just a problem because it prevents people coming into the field but also because it puts so much demand on people already in the field. You could probably name three physicists, at a push, and you’d be talking about some of the ground-shaking members of the field. Involved in the work leading up to those discoveries, and beyond, are hundreds of thousands of scientists, going about their jobs, doing things that are valuable, interesting and useful, but perhaps not earth-shattering. Do you expect every soldier to be a general? Every bank clerk to become the general manager? Not every scientist will visibly change the world, although many (if not most) will make contributions that build together to change the world.

    Sir Isaac Newton, another famous physicist, referred to the words of Bernard of Chartres when he famously wrote:

    “If I have seen further it is by standing on the sholders [sic] of Giants”

    making the point even more clearly by referring to a previous person’s great statement to then make it himself! But there’s one thing about standing on the shoulders of giants…

  5. There’s often a lot of wrong to get to right.

    Science is evidence-based, which means that it’s what you observe occurring that validates your theories and allows you to develop further ideas about how things work. The problem is that you start from a position of not knowing much, make some suggestions, see if they work, find out where they don’t and then fix up your ideas. This has one difficult side-effect for non-scientists in that scientists can very rarely state certainty (because there may be something that they just haven’t seen yet) and they can’t prove a negative, as you just can’t say something won’t happen because it hasn’t happened yet. (Absence of evidence is not evidence of absence.) This can be perceived as weakness but it’s one of the great strengths of science. We work with evidence that contradicts our theories to develop our theories and extend our understanding. Some things happen rarely and under only very specific circumstances. The Large Hadron Collider was built to find evidence to confirm a theory and, because the correct tool was built, physicists now better understand how our universe works. This is a Good Thing as the last thing we want to do is void the warranty through incorrect usage.

    The more complicated the problem, the more likelihood that it will take some time to get it right. We’re very certain about gravity, in most practical senses, and we’re also very confident about evolution. And climate change, for that matter, which will no doubt get me some hate in the comments, but the scientific consensus is settled. It’s happening. Can we say absolutely for certain? No, because we’re scientists. Again – strength, not weakness.

    When someone gets it wrong deliberately, and that sadly does happen occasionally, we take it very seriously because that whole “standing on shoulders of giants” is so key to our approach. A disingenuous scientist, like Andrew Wakefield and his shamefully bad and manipulated study on vaccination that has caused so much damage, will take a while to be detected and then we have to deal with the repercussions. The good news is that most of the time we find these people and limit their impact. The bad news is that this can be spun in many ways, especially by compromised scientists, and humans can be swayed by argument rather than fact quite easily.

    The takeaway from this is that admitting that we need to review a model is something you should regard in the same light as your plane being delayed because of a technical issue. You’d rather we fixed it, immediately and openly, than tried to fly on something we knew might fail.


Proud to be a #PreciousPetal, built on a strong #STEM, @PennyWrites @SenatorMilne @adambandt

I am proud to be a Precious Petal. Let me explain why I think we should reclaim this term for ourselves.

Australia, apparently, does not have a need for a dedicated Science Minister, for the first time since the 1930s. Instead, it is a subordinate portfolio for our Minister for Industry, the Hon Ian Macfarlane, MP. Today, he was quoted in the Guardian, hitting out at “precious petals in the science industry” who are criticising the lack of a dedicated Science Minister. Macfarlane, whose Industry portfolio includes Energy, Skills and Science, went on to say:

“I’m just not going to accept that crap,” he said. “It really does annoy me. There’s no one more passionate about science than me, I’m the son and the grandson of a scientist. I hear this whinge constantly from the precious petals in the science industry.”

So I’m not putting words in his mouth – that’s a pretty directed attack on the sector that happens to underpin Energy and Industry because, while Macfarlane’s genetic advantage in his commitment to science may or may not be scientifically valid, the fact of the matter is that science, and innovation in science, have created pretty much all of what is referred to as industry in Australia. I’m not so one-eyed as to say that science is everything, because I recognise and respect the role of the arts and humanities in a well-constructed and balanced society, but if we’re going to talk about everything after the Industrial (there’s that word again) Revolution in terms of production industries – take away the science and we’re not far away from poking things with sticks to work out which of the four elements (fire, air, earth, water) they belong to. Scientists of today stand on a tradition of thousands of years of accumulated knowledge that has survived many, many regimes and political systems. We tell people what the world is like, rather than what people want it to be, and that often puts us at odds with politicians, for some reason. (I feel for the ethicists and philosophers who have to do the same thing but can’t get industry implementation partnerships as easily and are thus, unfairly, regularly accused of not being ‘useful’ enough.)

I had the opportunity to be addressed by the Minister at Science Meets Parliament where, like something out of a David Williamson play, the genial ageing bloke stood up and, in real Strine, declaimed “No Minister for Science? I’m your Minister for Science!” as if this was enough for a room full of people who were dedicated to real evidence. But he obviously thought it was enough as he threw a few bones to the crowd. On the back of the cuts to CSIRO and many other useful scientific endeavours, these words ring even more hollow than they did at the time.

But rather than take offence at the Minister’s more recent deliberately inflammatory and pejorative words, let me take them and illustrate his own lack of grasp of his portfolio.

My discipline falls into STEM – Science, Technology, Engineering and Mathematics – and I am a scientist in that field. Personally, I like to add an A for Arts, as I am rather cross-disciplinary, and make it STEAM, because that conveys the amazing potential and energy in the area when we integrate across the disciplines. So, if Science is a flower, then we have a strong STEM in Australia, although it is currently under threat from a number of initiatives put in place by this very government.

But what of petals? If the Minister knew much botany, he’d know that petals are modified leaves that protect parts of the flower and attract (or deliberately drive away) certain pollinators, building relationships with the pollinating community to create a strong ecosystem. When flowers have no petals, they are subject to the whim of the winds for pollination, and this means that you have to be very wasteful with your resources to try and reach any other plants. When the petals are strong and well-defined, you can draw in the assistance of other creatures to help you use your resources more wisely and achieve the goal of the flower – to produce more flowers over time.

At a time when bee colony collapse is threatening agriculture across the globe, you would think that a Minister for Industry (and Science) would have actually bothered to pick up some of the facts about this very basic role of the mechanism he is using to deride, and attempt to humiliate, a community for having the audacity to complain about a bad decision. Scientists have been speaking truth to power since the beginning, Minister, and we’re not going to stop now.

If the Minister understood his portfolio, then he would realise that calling Australia’s scientific community “precious petals” is actually a reflection of their vital role in making science work for all Australians and the world. It is through these petals, protecting and guiding the resources in their area, that we can take the promise of STEM and share it with the world.

But let’s not pretend that’s what he meant. Much like the staggering Uncle at a Williamson Wedding, these words were meant to sting and diminish – to make us appear hysterical and, somehow, less valid. In this anachronistic, and ignorant, attack, we have never seen a better argument as to why Australia should have a dedicated Science Minister, who actually understands science.

I’m proud to be a Precious Petal, Minister.

An open Nelumbo nucifera flower, from the Botanic Gardens in Adelaide. Via Wikipedia.


Knowing the Tricks Helps You To Deal With Assumptions

I teach a variety of courses, including one called Puzzle-Based Learning, where we try to teach thinking and problem-solving techniques through the use of simple puzzles that don’t depend on too much external information. These domain-free problems have most of the characteristics of more complicated problems but you don’t have to be an expert in the specific area of knowledge to attempt them. The other thing that we’ve noticed over time is that a good puzzle is fun to solve, fun to teach and gets passed on to other people – a form of infectious knowledge.

Some of the most challenging areas to try and teach into are those that deal with probability and statistics, as I’ve touched on before in this post. As always, when an area is harder to understand, it actually requires us to teach better, but I do draw the line at trying to coerce students into believing me through the power of my mind alone. But there are some very handy ways to show students where their assumptions about the nature of probability (and randomness) break down, so that they are receptive to the idea that their models could need improvement (allowing us to work in that uncertainty) and can also start to understand probability correctly.

We are ferociously good pattern matchers and this means that we have some quite interesting biases in the way that we think about the world, especially when we try to think about random numbers, or random selections of things.

So, please humour me for a moment. I have flipped a coin five times and recorded the outcome here. But I have also made up three other sequences. Look at the four sequences for a moment and pick which one is most likely to be the one I generated at random – don’t think too much, use your gut:

  1. Tails Tails Tails Heads Tails
  2. Tails Heads Tails Heads Heads
  3. Heads Heads Tails Heads Tails
  4. Heads Heads Heads Heads Heads

Have you done it?

I’m just going to put a bit more working in here to make sure that you’ve written down your number…

I’ve run this with students and I’ve asked them to produce a sequence by flipping coins and then produce a false sequence by making subtle changes to the generated one (turning heads into tails, but only changing a couple along the way). They then write the two together on a board and people have to vote on which one is which. As it turns out, the chances of someone picking the right sequence are about 50/50, but I engineered that by starting from a generated sequence.

This is a fascinating article that looks at the overall behaviour of people. If you ask people to write down a five-coin sequence that is random, 78% of them will start with heads. So, chances are, you’ve picked 3 or 4 as your starting sequence. When it comes to random sequences, most of us equate random with well-shuffled and, on the large scale, 30 times as many people would prefer option 3 to option 4. (This is where someone leaps into the comments to say “A-ha” but, it’s ok, we’re talking about overall behavioural trends. Your individual experience and approach may not be the dominant behaviour.)

From a teaching point of view, this is a great way to break up the concepts of random sequences and some inherent notion that such sequences must be disordered. There are 32 different ways of flipping 5 coins in a strict sequence like this and all of them are equally likely. It’s only when we start talking about the likelihood of getting all heads versus not getting all heads that the aggregated event of “at least one tail” starts to be more likely.
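If you want to check that arithmetic, here is a short Python sketch (my own illustration) that enumerates every possible five-flip sequence:

```python
from itertools import product

# All 2^5 = 32 possible five-flip sequences, each equally likely.
sequences = list(product("HT", repeat=5))
p_each = 1 / len(sequences)

p_all_heads = sum(p_each for s in sequences if "T" not in s)
p_at_least_one_tail = sum(p_each for s in sequences if "T" in s)

print(len(sequences), "sequences, each with probability", p_each)  # 32, 0.03125
print("P(all heads)         =", p_all_heads)                        # 1/32
print("P(at least one tail) =", p_at_least_one_tail)                # 31/32
```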

How can we use this? One way is getting students to write down their sequences and then asking them to stand up, sitting down when your ‘call’ (from a script) goes the other way to theirs. If almost everyone is still standing at heads then you’ve illustrated that you know something about how their “randomisers” work. A lot of people (if your class is big enough) should still be standing when the final coin is revealed, and that is something we can then address. Why do so many people think about it this way? Are we confusing random with chaotic?

The Law of Small Numbers (Tversky and Kahneman), also mentioned in the post, is basically the observation that people generalise too much from small samples and expect small samples to act like big ones. In your head, if the grand pattern over time could be resorted into “heads, tails, heads, tails,…” then small sequences must match that or they just don’t look right. This is an example of the logical fallacy called a “hasty generalisation”, but with a mathematical flavour. We are strongly biased towards the validity of our experiences, so when we generate a random sequence (or pick a lucky door or win the first time at poker machines) we generalise from this small sample and can become quite resistant to other discussions of possible outcomes.
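A quick simulation sketch (again, just an illustration of the idea) makes the point that small samples simply don’t behave like large ones:

```python
import random

random.seed(1)  # fixed seed so the illustration is repeatable

def heads_proportion(n_flips):
    return sum(random.choice("HT") == "H" for _ in range(n_flips)) / n_flips

# Repeat each experiment 1000 times and look at the spread of outcomes.
for n in (5, 20, 100, 10000):
    proportions = [heads_proportion(n) for _ in range(1000)]
    print(f"{n:5d} flips: heads proportion ranged from "
          f"{min(proportions):.2f} to {max(proportions):.2f}")
```

Small runs regularly come out 0% or 100% heads, while the ten-thousand-flip runs stay close to 0.5.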

If you have really big classes (367 or more) then you can start a discussion on random numbers by asking people what the chances are that any two people in the room share a birthday. Given that there are only 366 possible birthdays, the Pigeonhole principle states that two people must share a birthday as, in a class of 367, there are only 366 birthdays to go around, so one must be repeated! (Note for future readers: don’t try this in a class of clones.) There are lots of other interesting thinking examples in the link to Wikipedia that help you to frame randomness in a way that your students might be able to understand better.
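For classes that aren’t quite that big, the classic birthday calculation still makes a good discussion starter. A minimal sketch of the standard computation (assuming 365 equally likely birthdays and ignoring 29 February, which is why the guarantee above needs 367 people rather than 366):

```python
def p_shared_birthday(class_size):
    """Probability that at least two people share a birthday (365-day year)."""
    p_all_distinct = 1.0
    for i in range(class_size):
        p_all_distinct *= (365 - i) / 365
    return 1 - p_all_distinct

for n in (10, 23, 50, 100, 366):
    print(f"{n:3d} people: P(shared birthday) = {p_shared_birthday(n):.4f}")
# A class of just 23 already has a better than even chance of a shared birthday.
```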

10 pigeons into 9 boxes? Someone has a roommate.

I’ve used a lot of techniques before, including the infamous card shouting, but the new approach from the podcast is a nice and novel angle to add some interest to a class where randomness can show up.


Talking Ethics with the Terminator: Using Existing Student Experience to Drive Discussion

One of the big focuses at our University is the Small-Group Discovery Experience, an initiative from our overall strategy document, the Beacon of Enlightenment. You can read all of the details here, but the essence is that a small group of students and an experienced research academic meet regularly to start the students down the path of research, picking up skills in an active learning environment. In our school, I’ve run it twice as part of the professional ethics program. This second time around, I think it’s worth sharing what we did, as it seems to be working well.

Why ethics? Well, this is first year and it’s not all that easy to do research into Computing if you don’t have much foundation, but professional skills are part of our degree program so we looked at an exploration of ethics to build a foundation. We cover ethics in more detail in second and third year but it’s basically a quick “and this is ethics” lecture in first year that doesn’t give our students much room to explore the detail and, like many of the more intellectual topics we deal with, ethical understanding comes from contemplation and discussion – unless we just want to try to jam a badly fitting moral compass on to everyone and be done.

Ethical issues present the best way to talk about the area as an introduction, as much of the formal terminology can be quite intimidating for students who regard themselves as CS majors or Engineers first and may not even contemplate their role as moral philosophers. But the real-world situations where ethical practice is most illuminating are often quite depressing and, from experience, sessions in medical ethics and the like rapidly close down discussion because the material can be very upsetting. We took a different approach.

The essence of any good narrative is the tension that is generated from the conflict it contains and, in stories that revolve around artificial intelligence, robots and computers, this tension often comes from what are fundamentally ethical issues: the machine kills, the computer goes mad, the AI takes over the world. We decided to ask the students to find two works of fiction, from movies, TV shows, books and games, to look into the ethical situations contained in anything involving computers, AI and robots. Then we provided them with a short suggested list of 20 books and 20 movies to start from and let them go. Further guidance asked them to look into the active ethical agents in the story – who was doing what and what were the ethical issues?

I saw the students after they had submitted their two short paragraphs on this and I was absolutely blown out of the water by their informed, passionate and, above all, thoughtful answers to the questions. Debate kept breaking out on subtle points. The potted summary of ethics that I had given them (follow the rules, aim for good outcomes or be a good person – sorry, ethicists) provided enough detail for the students to identify issues in rule-based approaches, utilitarianism and virtue ethics, but I could then introduce terms to label what they had already done, as they were thinking about them.

I had 13 sessions with a total of 250 students and it was the most enjoyable teaching experience I’ve had all year. As follow-up, I asked the students to enter all of their thoughts on their entities of choice by rating their autonomy (freedom to act), responsibility (how much we could hold them to account) and perceived humanity, using a couple of examples to motivate a ranking system of 0-5. A toddler is completely free to act (5) and completely human (5) but can’t really be held responsible for much (0-1 depending on the toddler). An aircraft autopilot has no humanity or responsibility but it is completely autonomous when actually flying the plane – although it will disengage when things get too hard. A soldier obeying orders has an autonomy around 5. Keanu Reeves in the Matrix has a humanity of 4. At best.
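For the record, here is a hypothetical sketch of how one of these ratings might be captured; the field names and structure are my invention for illustration, not the actual database schema the class used:

```python
from dataclasses import dataclass

@dataclass
class EntityRating:
    entity: str          # the character or machine being rated
    work: str            # the film, book, game or TV show it appears in
    autonomy: int        # 0-5: freedom to act
    responsibility: int  # 0-5: how much we can hold it to account
    humanity: int        # 0-5: perceived humanity

# The two baseline examples from the text, expressed as records.
baselines = [
    EntityRating("Toddler", "everyday life", autonomy=5, responsibility=1, humanity=5),
    EntityRating("Aircraft autopilot", "everyday life", autonomy=5, responsibility=0, humanity=0),
]

for rating in baselines:
    print(rating)
```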

They’ve now filled the database up with their thoughts and next week we’re going to discuss all of their 0-5 ratings as small groups, then place them on a giant timeline of achievements in literature, technology and AI, also listing major events such as wars, to see if we can explain why authors presented the work that they did. When did we start to regard machines as potentially human, and what did the world seem like then to people who were there?

This was a lot of fun and, while it’s taken a little bit of setting up, this framework works well because students have already seen quite a lot of this material; the trick is just getting them to think about it with our ethical lens. Highly recommended.

What do you think, Arnold? (Image from moviequotes.me)


Swearing with @cadigan, @gavingsmith & @cstross #worldcon #loncon3 Rat’s Monkey’s Ahem

One of the other more interesting panels I went to at WorldCon was “Rat’s Monkey’s Ass”, a panel with Pat Cadigan, Gavin Smith, Mihaela Marija Perkovic, and Charles Stross on the use of swear words in genre fiction. Many pieces of work feature constructed swearing, such as frak in Battlestar Galactica and some of the more farcical attempts at science-oriented swearing in earlier science fiction. (Let’s not even start on Harry Harrison’s bowbidy-bowb.)

Image from tshirtbordello.com. It is a strangely satisfying word, though.

I’ve met Mihaela before, when she visited Australia, and she did a great job on keeping the panel going, as well as contributing some excellent swear words of her own. Of course, the authors present did a great job of swearing like a variety of troopers from a range of different timezones and militaries, but there are important aspects to this, which were also excellently covered.

The blurb for the panel reads:

 Swearing in science fiction and fantasy is occasionally a minefield of anachronism, but then, there’s often nothing weirder than hearing someone yell “frak”. Or even worse, a teenage character that refuses to curse at all. This panel will explore swear words in the genres. What purpose does swearing have within a society? What purpose does it serve in fiction, and how important, or not, are profanities to the narrative? When are invented curses more (or less) effective than real (contemporary or historical) examples, and why?

The general feeling was that conveying emotion is important and that swearing is an important part of this. It feels really hollow when a hardened space pirate says something like “Oh, dash” and this matters when you’re trying to convey the sense of reality required to hold up the parachute silk of disbelief.

There is one issue, which I raised in question time. Given that many young people do not have the delightfully proper middle and upper-middle class upbringing we see so often in Young Adult fiction, it’s positively disingenuous to remove swearing from certain works because that is the world those kids are growing up in. When people have fewer words at their disposal, they make use of the ones that they have. We know that children in the US from non-educationally successful backgrounds, with few books, can have a vocabulary deficit measured in the thousands of words and, probably, a lot of their emotional conveyance is going to come from the use of swearwords, whether we like it or not.

When someone picks up a book, they have to have a reason to keep reading, either by seeing themselves in there or just being really interested. When YA is a sterile “Boy’s Own” adventure of “Gosh” and “Golly”, this would seem farcical to a teen who is told to take out the f-ing garbage at night or they’d be in the s*. (Bowdlerised to keep my blog’s general rating, embarrassingly enough.) There’s an important issue in reaching the reluctant reader and we’re already aware of how much certain areas of education, such as Computer Science, have to be hidden from peer groups because they’re not perceived as “cool” enough.

I’m not recommending that Harry Potter has to start calling Ron an *#&*&#$@ piece of #(*#$ that wouldn’t *&#($ in a (()#$# )()#$, but there is a wider world that swearing can constructively reach, if we’re going to try and engage some of these borderline readers. (Of course, the frequency of pseudo-racist slurs between pure bloods and non- in the Potter world is astoundingly awful when you come to think about it, but I’m not actually as positive on that. There’s a big difference between giving people a voice that sounds like theirs and having a large number of cheerful racists mostly getting away with constant, casual racism.)

Panellists may have a completely different opinion on this so I welcome followups! Thank you!


Being Honest About Stress, Challenge and Humanity: R U OK? Day #ruok

 

The RUOK™ logo from https://www.ruok.org.au

R U Ok? Day (September the 11th) is coming up soon, with its focus on reaching out and starting conversations with people that you think might not be ok, or might benefit from a friendly conversation. It’s a great initiative and, as someone who has struggled with mental illness, I’m so happy to see us talking openly about this. For me to out myself as having suffered with depression is no big thing, as I discuss it in other parts of the ‘net, but I realise that some of you might now look at what I do and what I say in a different light.

And, if you do, I have to tell you that you need to change the way that you think about these things. A very large number of humans will go through some form of mental issue in their lives, unsurprisingly given the levels of stress that we put ourselves under, the struggle some people have just to survive and the challenges that lie ahead of us as a rather greedy species on a finite globe. So, yes, I’ve suffered from depression but it is an illness. It is treatable and, when it is treated and managed, then you can’t tell that I have problems. In fact, like many people with the problem, even when I’m suffering, you wouldn’t really know. Nobody asks to get mentally ill so stigmatising, isolating and discriminating against people with a treatable mental condition is not just wrong, it’s pretty stupid. So let’s get beyond this and start talking, openly.

That’s where RUOK? is great because it gives you a day and some agency to reach out to someone who seems a little … off and ask them if they’re ok. Trust me when I say that 99% of them will appreciate it. Yes, 1% might give you some grief but if I knew a bet would pay off 99% of the time, I’d take it. The web site has some great tips for starting conversations so please read them if you’re thinking about doing this. (Pro tip: starting a conversation with “You should just cheer up” is not a great way to start. Or finish. In fact, just scratch that and try again.)

I am very open with my students, which I know some people think is potentially unprofessional, and I am a strong believer in cognitive apprenticeship. We are, pretty much, all the same in many respects and me pretending that everything I do comes fully formed and perfect from my amazing brain is a lie. My wisdom, such as it is, is the accumulated memory of the mistakes I’ve made that haven’t killed me yet. My students need to know that the people around them struggle, wonder, stress out and, quite frequently, rise above it all to keep on doing wonderful and beautiful things. I am still professional but I am honest and I am human.

I want to share with all of you something that I wrote on the death of Robin Williams, which I’ve edited slightly for language, but it’s been shared a lot over my other social feeds so it obviously resonates with people. However, many of my students won’t have seen it because I keep my private social life and ‘work’ social media separated. So here it is. I hope that you find it useful and, if you need help, maybe nudges you to help, and if you know someone you’re worried about, it inspires you to ask them “R U OK?”

Mental illness is a poisonous and weird thing. If your eyes changed function, you’d see things differently. When your brain changes function, everything gets weird – and the only impression you have of the perceptual world is suddenly flawed and untrustworthy. But it’s a biochemical issue like diabetes – regulatory systems that aren’t working properly and cannot just be “got over” by thinking happily. Ask a diabetic whether they’ve “really tried” to handle their sugar and see how far that gets you. 🙂

I wrote something, years ago, that I’ve never posted, to try and explain why some people just can’t stay. The nastiest thing about mental illness is that it can show you a world and a way of thinking that makes suicide apparently logical and, even more sadly, necessary. If you saw that world, then maybe you wouldn’t stay either. This doesn’t make it easier on the survivors but it’s important to recognise the role that an actual illness plays here. That f***ing ba***rd, cancer, takes people from us all the time but it at least has the decency to wield the knife itself. Depression puts the knife in the hands of its victim and makes it look like calculated agency, which hurts the people left behind even more.

There is no magic bullet for helping people with mental illness. Some need visible support. Some need solitude. Some need to work. Some drown in it. That’s because mental illness affects people, in all of their variety and their glorious irrationality, and I am no more a poster child for depression than anyone else. I can’t even tell you how to help me and, given how much I communicate, that’s the most irritating thing of all. But I do know that the ongoing support of caring people who are watching and listening makes a big difference and those of you who are aware and supporting, you keep up that good work! (And thank you, on behalf of the people who are still here because other people helped.)

It’s a sad day with Robin Williams passing but this is only a part of him. It’s a sad and mad part of him and I wish it hadn’t happened but I won’t let it define him, because his struggles were a part of him and his contribution to laughter and joy was so much greater. The least I can do is to see past his ‘mental diabetes’ to celebrate his actual talent and contribution. And offer my deepest sympathies and condolences to his family and friends.

Rest well, Robin.

 


ITiCSE 2014, Session 3C: Gender and Diversity, #ITiCSE2014 #ITiCSE @patitsel

This session was dedicated to the very important issues of gender and diversity. The opening talk in this session was “A Historical Examination of the Social Factors Affecting Female Participation in Computing”, presented by Elizabeth Patitsas (@patitsel). This paper was a literature review of the history of the social factors affecting female participation, from the old professional association of the word “computer” with female arithmeticians to today’s very male computing culture. The review spanned 73 papers, 5 books, 2 PhD theses and a Computing Educators Oral History project. The mix of sources was pretty diverse. The two big caveats were that it only looked at North America (which means that the sources tend to focus on research-intensive universities and white people) and that this was a big-picture talk, looking at social forces rather than individual experiences. This means that, of course, individuals may have had different experiences.

The story begins in the 19th Century, when ‘computer’ was a job: someone who did computations for scientists, labs or government. Even after first-wave feminism, female education wasn’t universally available and the women in education tended to be women of privilege. After the end of the 19th century, women started to enter traditional universities to attempt to study PhDs (although often receiving a Bachelors for this work) but had few job opportunities on graduation, except teaching or being a computer. Whatever work was undertaken was inherently short-term, as women were expected to leave the work force on marriage to focus on motherhood.

During the early 20th Century, quantitative work was seen to be feminine and qualitative work required the rigour of a man – perceptions have changed, haven’t they! The women’s work was grunt work: calculating, microscopy. Then there’s men’s work: designing and analysing. The Wars of the 20th Century changed this by removing men from the workforce, with women stepping into their roles. Notably, women were stereotyped as being better coders in this role because of their background as computers. Coding was clerical, performed by a woman under the direction of a male supervisor. This became male-typed over time. As programming became more developed over the 50s and 60s, the perception of it as a dark art started to form a culture of asociality. Random hiring processes started to hurt female participation because, if you can hire anyone, then (quoting the speaker) if you could hire a man, why hire a woman? (Sound of grinding teeth from across the auditorium as we’re all being reminded of stupid thinking, presented very well for our examination by Elizabeth.)

CS itself started off being taught in other departments but became its own discipline in the 60s and 70s, with the enrolment and graduation of women matching that of physics very closely. The development of the PC and its adoption in the 80s changed CS enrolments, and CS1 became a weeder course to keep the ‘under-qualified’ from going on to further studies in Computer Science. This then led to fewer non-traditional CS students, especially women, as simple changes like requiring mathematics immediately restricted people without full access to high-quality education at school level.

In the 90s, we all went mad and developed a hacker culture based around gamer culture, which we already know has had a strongly negative impact on female participation – let’s face it, you don’t want to be considered part of a club that you don’t like and that goes to some effort to say it doesn’t welcome you. This led to some serious organisation of women’s groups in CS: the Anita Borg Institute, CRA-W and the Grace Hopper Celebration.

Enrolments kept cycling. We saw an enrolment boom and bust (including a greater percentage of women) that matched the dot-com bubble. At the peak, female enrolment got as high as 30% and female faculty also increased. More women in academia corresponded to more investigation of the representation of women in Computer Science. It took quite a long time to get serious discussions and evidence identifying how systematic the under-representation is.

Over these different decades, women had very different experiences. The first generation had a perception that they had to give up family and be tough cookies, and they had a pretty horrible experience. The second generation of STEM, in the 80s/90s, had female classmates and wanted to be in science AND to have families. However, first-generation advisers were often very harsh on their second-generation mentees as their experiences were so dissimilar. The second generation in CS doesn’t neatly match that of science and biology, due to the cycles, and the negative nerd perception is far, far stronger for CS than for other disciplines.

Now to the third generation, starting in the 00s, outperforming their male peers in many cases and entering a University with female role models. They also share household duties with their partners, even when both are working and family are involved, which is a pretty radical change in the right direction.

If you’re running a mentoring program for incoming women, their experiences may be very, very different from those of the staff who mentor them. Finally, learning from history is essential. We are seeing more students coming in than, for a number of reasons, we may be able to teach. How will we handle increasing enrolments without putting on restrictions that disproportionately hurt our under-represented groups? We have to accept that most of our restrictions don’t actually apply uniformly and that this cannot be allowed to continue. It’s wrong to impose enrolment restrictions that come at a greater expense to one group when there’s no good reason to single out one group over another.

One of the things mentioned is that if you ask people to do something because they are from group X, and make this clear, then they are less likely to get involved. Important note: don’t ask women to do something because they’re women, even if you have the intention to address under-representation.

The second paper, “Cultural Appropriation of Computational Thinking Acquisition Research: Seeding Fields of Diversity”, was presented by Martha Serra, who is from Brazil – good luck to them in the World Cup tonight! Brazil adapted scalable game design to local educational needs, with the development of a web-based system, “PoliFacets”, seeding the reflection of IT and Educational researchers.

Brazil is the B in BRICS, with nearly 200 million people, and is the 5th largest country in the world. Bigger than Australia! (But we try harder.) Brazil is very regionally diverse: rain forest, wetlands, drought, poverty, megacities, industry, agriculture and, unsurprisingly, it’s very hard to deal with such diversity. 80% of the youth population failed to complete basic education. Only 26% of the adult population reach full functional literacy. (My jaw just dropped.)

Scalable Game Design (SGD) is a program from the University of Colorado in Boulder, designed to motivate all students in Computer Science through game design. The approach uses AgentSheets and AgentCubes as visual programming environments. (The image shown was of a very visual programming language that seemed reminiscent of Scratch, which is not surprising as it is accepted that Scratch picked up some characteristics from AgentSheets.)

The SGD program started as an after-school program in 2010 with a public middle school, using a Geography teacher as the program leader. In the following year, with the same school, a 12-week program ran with a Biology teacher in charge. Some of the students who had done it before had, unfortunately, forgotten things by the next year. The year after that, a workshop for teachers was introduced, along with the PoliFacets site. The next year introduced more schools, with the first school now considered autonomous, and the teacher workshops continued. Overall, a very positive development of sustainable change.

Learners need stimulation but teachers need training if we’re going to introduce technology – very similar to what we learned in our experience with digital technologies.

The PoliFacets system is a live, web-based documentation system used to assist with the process. A live demo was not available as the Brazilian corner of the internet seems to be full of football. It’s always interesting to look at a system that was developed in a different era – it makes you aware of how much refactoring goes into the IDEs of modern systems to stop them looking like refugees from a previous decade. (Perhaps the less said about the “Mexican Frogger” game the better…)

The final talk (for both this session and the day) was “Apps for Social Justice: Motivating Computer Science Learning with Design and Real-World Problem Solving”, presented by Sarah Van Wart. Starting with motivation: tech has diversity issues, with differential access and exposure to CS across race and gender lines. The tech industry has similar problems with recruiting and retaining more diverse candidates, but there are also some really large structural issues that shadow the whole area.

Structurally, white families have 18-20 times the wealth of Latino and African-American families, while the jail population is skewed the opposite way. Schools start with the composition of the community and are supposed to solve these distribution issues but, instead, they continue to reflect the composition that they inherited. US schools are highly tracked: White and Asian students tend to track into Advanced Placement, while Black and Latino students track into different (and possibly remedial) programs.

Some people are categorically under-represented and this means that certain perspectives are being categorically excluded – this is to our detriment.

The first aspect of the theoretical perspective is Conceptions of Equity, looking at Jaime Escalante and his work with students to do better at the AP Calculus exam. His idea of equity was access: access to a high-value test that could facilitate college access and thus more highly paid careers. The next aspect was Funds of Knowledge (Gonzalez et al), where focusing on a white context diminishes the knowledge of other communities and reinforces one community’s privilege. The third part, Relational Equity (Jo Boaler), reduced streaming and tracking by focusing on group work, where each student was responsible for every other student’s success. Finally, Rico Gutstein takes a socio-political approach with Social Justice Pedagogy, providing authentic learning frameworks and using statistics to show up the problems.

The next parts of the theoretical perspective were Computer Science Education and the Learning Sciences (a socio-cultural perspective on learning: who you are and what it means to be ‘smart’).

In terms of the learning sciences, Nasir and Hand (2006) discussed Practice-linked Identities, with access to the domain (students know what CS people do), integral roles (there are many ways to contribute to a CS project), and self-expression and feeling competent (students can bring themselves to their CS practice).

The authors produced a short course in which a small group of students developed a small application. The outcome was BAYP (Bay Area Youth Programme), an App Inventor application that queried a remote database to answer user queries about local after-school program services.
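To give a feel for the kind of lookup BAYP performs (the real app was built in App Inventor against a remote database, so this is only a loose sketch in Python with invented field names and sample data), the core query might look something like this:

```python
# Loose sketch only: the real BAYP app was an App Inventor front end over a
# remote database. Field names and sample records here are invented.

PROGRAMS = [
    {"name": "Coding Club", "neighbourhood": "Fruitvale", "min_age": 11, "max_age": 14},
    {"name": "Youth Art Space", "neighbourhood": "Fruitvale", "min_age": 8, "max_age": 18},
    {"name": "Science Lab", "neighbourhood": "Mission", "min_age": 12, "max_age": 16},
]

def find_programs(neighbourhood: str, age: int) -> list[dict]:
    """Return after-school programs in a neighbourhood that accept a given age."""
    return [
        p for p in PROGRAMS
        if p["neighbourhood"].lower() == neighbourhood.lower()
        and p["min_age"] <= age <= p["max_age"]
    ]

if __name__ == "__main__":
    for program in find_programs("Fruitvale", 13):
        print(program["name"])
```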

How do we understand this in terms of an equity intervention? Let’s go back to Nasir and Hand.

  1. Access to the domain: Design and data used together are part of what CS people do, bridging students’ concepts and providing an intuitive way of connecting design to the world. When we have data, we can derive categories, then schemas and so on – see the toy sketch after this list. (This matters to CS people, if you’re not one. 🙂 )
  2. Integral Roles: Students got to see the importance of design, sketching things out, planning, coding, and seeing a segue from non-technical approaches to technical ones. One other very important aspect, however, is that the oft-derided “liberal arts” skills may actually be useful, or may be a good foundation on which to build coding, as long as you understand what programming is and how you can get access to it.
  3. Making a unique contribution: The students felt that what they were doing was valuable and let them see what they could do.
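As a toy illustration of the data-to-categories-to-schemas progression mentioned in point 1 (this is entirely my own sketch, not the authors’ material), here is how a handful of raw records might yield categories and a simple schema:

```python
# My own toy sketch, not from the paper: start with raw records, pull out the
# categories present in a field, and infer a simple schema from the values.

raw_records = [
    {"program": "Coding Club", "area": "Fruitvale", "cost": 0},
    {"program": "Art Space", "area": "Mission", "cost": 25},
    {"program": "Science Lab", "area": "Fruitvale", "cost": 10},
]

# Categories: the distinct values observed for a field.
areas = sorted({record["area"] for record in raw_records})
print("Categories for 'area':", areas)    # ['Fruitvale', 'Mission']

# Schema: the fields and the types of their values.
schema = {field: type(value).__name__ for field, value in raw_records[0].items()}
print("Inferred schema:", schema)         # {'program': 'str', 'area': 'str', 'cost': 'int'}
```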

Take-aways? CS can appeal to so many people if we think about how to do it. There are many pathways to help people, and we have to think about what we can be doing to support them. Designing for their own community is going to be empowering for people.

Sarah finished on some great questions. How will they handle scaling it up? Apprenticeship is really hard to scale up but we can think about it. Does this make students want to take CS? Will this lead to AP? Can it be inter-leaved with a project course? Could this be integrated into a humanities or social science context? Lots to think about but it’s obvious that there’s been a lot of good work that has gone into this.

What a great session! Really thought-provoking and, while it was a reminder for many of us how far we have left to go, there were probably people present who had heard things like this for the first time.


ITiCSE 2014: Monday, Keynote 1, “New Technology, New Learning?” #ITiCSE2014 #ITiCSE

This keynote was presented by Professor Yvonne Rogers, from University College London. The talk discussed how we could make learning more accessible and exciting for everyone and how we could encourage students to think, to create and to share their views. Professor Rogers started by sharing a tweet by Conor Gearty on a guerrilla lecture for LSE students, with tickets to be issued at 6:45pm. (You can read about what happened here.) They went to the crypt of Westminster Cathedral and the group, split into three smaller groups, ended up discussing the nature of Hell and what it entailed. This was a discussion on religion but, because of the way that it was put together, it was more successful than a standard approach – context shift, suspense driving excitement and engagement. (I wonder how much suspense I could get with a guerrilla lecture on polymorphism… )

Professor Rogers says that suspense matters, as the students will be wondering what is coming next, and this will hopefully make them more inquisitive and thus drive them along the path to scientific enquiry. The Ambient Wood was a woodland full of various technologies and probes for student pairs to use in an explorative activity. You can read about the Ambient Wood here. The periscope idea ties videos into the direction that you are looking – a bit like Google Glass without the surveillance-society aspect (a Woodopticon?). (We worked on similar ideas at Adelaide for an early project in the Arts Precinct, to allow student exploration to drive the experience in arts, culture and botanical science areas.) All of the probes were recorded in a virtual spatial environment matching the wood so that, after the activity, the students could look back at what they did. Thus, a group of 10-12 year olds had an amazing day exploring and discovering, but in a way that was strongly personalised, with the ability to see it from the bird’s-eye view above them.

And, unsurprisingly, we moved on to MOOCs, with an excellent slide on MOOC HYSTERIA. Can we make these as engaging as the guerrilla lecture or the ambient wood?

[Image: the MOOC hysteria slide]

MOOCs, as we know, are supposed to increase our reach and access to education but, as Professor Rogers noted, it is also a technology that can make the lecturer a “bit of a star”. This is one of the most honest assessments of some of the cachet that I’ve heard – bravo, Professor Rogers. What’s involved in a MOOC? Well, watching things, doing quizzes, and probably a lot of passive, rather than active, learning. Over 60% of the people who sign up to do a MOOC, from the Stanford experience, already have a degree – doing Stanford for free is a draw for the already-degreed. How can we make MOOCs fulfil their promise, give us good learning, give us active learning and so on? Learning analytics give us some ideas, and we can data mine to try to personalise the course to the student. But this has shifted what our learning experience is, and do we have any research to show the learning value of MOOCs?

In 2014, 400 students taking a Harvard course:

  1. Learned in a passive way
  2. Just wanted to complete the course
  3. Took the easy option
  4. Were unable to apply what they learned
  5. Didn’t reflect on the material or talk to their colleagues about it.

Which is not what we want. What about the Flipped Classroom? Professor Rogers attributed this to Khan but I’m not sure I agree, as there were people, Mazur for example, who were doing this with Peer Instruction well before Khan – or at least I thought so. Corrections in the questions please! The idea of the flip is that we don’t have content delivery in lectures with the odd question – we have content beforehand and questions in class. What is the reality?

  1. Still based on chalk and talk.
  2. Is it simply a better version of a bad thing?
  3. Are students more motivated and more active?
  4. Very labour-intensive for the teacher.

So where’s the evidence? Well, it does increase interaction in class between instructors and students, and it does allow for earlier identification of misconceptions. Pierce and Fox (2012) found that it increased exam results for pharmacology students and fostered critical thinking in case scenarios. Maybe this will work for tens to hundreds of students – but what about classes of thousands? Can we flip at that scale? (Whether we should even have classes of this size is another good question.)

Then there’s PeerWise (Paul Denny, NZ), where there is active learning in which students create questions, answer them and get feedback. Students create the questions, then get to try other students’ questions, and can then rate the question and rate the answer. (We see approaches like this, although not as advanced, in other technologies such as Piazza.)
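PeerWise itself is a mature web system, but the workflow described above is easy to picture as a tiny data model. This is a minimal sketch with invented class and field names, not PeerWise’s actual implementation:

```python
# Minimal sketch (invented names, not PeerWise's actual data model): students
# author questions, answer each other's questions, and rate the questions.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Question:
    author: str
    text: str
    options: list[str]
    correct: int                                            # index into options
    answers: dict[str, int] = field(default_factory=dict)   # student -> chosen index
    ratings: list[int] = field(default_factory=list)        # 1-5 quality ratings

    def answer(self, student: str, choice: int) -> bool:
        """Record a student's answer and report whether it was correct."""
        self.answers[student] = choice
        return choice == self.correct

    def rate(self, stars: int) -> None:
        self.ratings.append(stars)

    @property
    def average_rating(self) -> float:
        return mean(self.ratings) if self.ratings else 0.0

q = Question("alice", "Which traversal visits the root first?",
             ["In-order", "Pre-order", "Post-order"], correct=1)
print(q.answer("bob", 1))      # True: Bob picked the right option
q.rate(5)
print(q.average_rating)        # 5.0
```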

How effective is this? Performance in PeerWise correlated with exam marks (Anyadi, Green and Tang, 2013), with active student engagement. It’s used for revision before exams, and you get high-quality questions and answers, while supporting peer interaction. Professor Rogers then showed the Learning Pyramid, from the National Training Laboratories, Bethel, Maine. The PeerWise system plays into the very high retention area.

[Image: the Learning Pyramid]

Professor Rogers then moved on to her own work, showing us a picture of the serried-rank nightmare of a computer-based classroom: students in rows, isolated and focused on their screens. Instead of ‘designing for one’, why don’t we design to orchestrate shared activities, with devices that link to public displays and can actively foster collaboration? One of Professor Rogers’ students is looking at ways to share simulations across tablets and screens. This included “4Decades”, a simulation of climate management, with groups representing the different stakeholders to look at global climate economics. We then saw a video that I won’t transcribe. The idea is that group work encourages discussion, however we facilitate it, and this tends to lead to teaching others through the sharing of ideas. Another technology that Professor Rogers’ group has developed in this space is UniPad: orchestrating collaborative activities across multiple types of devices, with one device per 6-7 students, used in classes without many researchers present. Applications of this technology include budgeting for students (MyBank), with groups interacting and seeing the results on a public display. Given how many students operate in share houses collaboratively, this is quite an interesting approach to the problem. From studies on this, all group members participated and used the tablet as a token for discussion, taking ownership of a part of the problem. This also extended to reflection on others’ activities, including identifying selfish behaviour on the part of other people. (Everyone who has had flatmates is probably groaning at the moment. Curse you, Love Tarot Pay-By-The-Minute Telephone Number, which cost me and my flatmates a lot of dollars after a flatmate skipped out on us.)

The next aspect Professor Rogers discussed was physical creation toolkits, such as MaKey MaKey, where you can build alternative input for a computer, based on a simple printed circuit board with alligator clips and USB cables. The idea is simple: you can turn anything you like into a keyboard key. Demonstrations included a banana space bar, a play dough MarioKart gamepad, and many other things (a water bowl in front of the machine became a cat-triggered photo booth). This highlights one of the most important aspects of thinking about learning: learning for life. How can we keep people interested in learning in the face of busy, often overfull, lives when many people still think about learning as something that had to be endured on their pathway into the workforce? (Paging my climbing friends with their own climbing wall: you could make the wall play music if you wanted to. Just saying.)
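Because a MaKey MaKey presents itself to the computer as an ordinary USB keyboard, the software side needs nothing special: any program that listens for key presses will respond to the banana exactly as it would to the real space bar. A minimal sketch, assuming the third-party pynput library is installed, might look like this:

```python
# Minimal sketch: react to a MaKey MaKey "banana space bar". The board shows
# up as a normal USB keyboard, so we simply listen for the space key.
# Assumes the third-party pynput library (pip install pynput).
from pynput import keyboard

def on_press(key):
    if key == keyboard.Key.space:
        # Fires when the banana (wired as the space key) is touched.
        print("Space pressed - the banana says hello!")
    if key == keyboard.Key.esc:
        return False  # stop listening

with keyboard.Listener(on_press=on_press) as listener:
    listener.join()
```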

During a trial of the MaKey MaKey system with adult learners, one of the computers stopped working; the collaboration that ensued changed the direction of the work, and more people were assigned to a single kit. Professor Rogers showed a small video of a four-person fruit orchestra of older people playing Twinkle Twinkle Little Star. (MORE KIWI!) This elicited a lot of ideas, including ideas for their grandchildren and their own parents: transforming exercise to be more fun, helping people learn fundamental knowledge and skills, and giving good feedback. We often intervene heavily in the learning experience, and the reflection from the Fruit Orchestra was that intervening less in self-driven activities such as MaKey MaKey might be a better way to go, to increase autonomy and thus drive engagement.

Next was the important question: how can we get people to create and code, where coding is just part of the creating? Can we learn to code differently, beyond just choosing a particular language? We have many fascinating technologies, but what is the suite of tools over the top that will drive creativity and engagement in this area to produce effective learning? The short video shown demonstrated a pop-out prefabricated system, where physical interfaces, and gestures across them, represented coding instructions: coding without any typing at all. (Previous readers will remember my fascination with pre-literate programming.) This early work, electronics on a sheet, is designed to be given away, because the production cost is less than 3 Euros. The project, called “code me”, is from University College London and is designed to teach logic without people realising it: the fundamental building block of computational thinking. Future work includes larger blocks with Bluetooth input and sensors. (I can’t find a web page for this.)
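To make the ‘coding without typing’ idea concrete (this is purely my own illustration, not the design of the “code me” project), you can think of each physical block as a token in a tiny sequence language that the toolkit interprets:

```python
# Purely illustrative, not the "code me" project's design: each physical block
# becomes a token, and the sequence of blocks is interpreted as a program, so
# children compose logic by arranging blocks rather than typing.

def run_blocks(blocks: list[str]) -> tuple[int, int]:
    """Interpret a sequence of movement 'blocks' on a grid, starting at (0, 0)."""
    moves = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}
    x, y = 0, 0
    for block in blocks:
        dx, dy = moves[block]
        x, y = x + dx, y + dy
    return x, y

# Arranging the physical blocks "right, right, up" corresponds to:
print(run_blocks(["right", "right", "up"]))   # (2, 1)
```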

What role should technology play in learning? Professor Rogers mentioned thinking about this in two ways. Inside learning uses technology to think about the levels we want students to reach to foster attainment: personalise, monitor, motivate, flexible, adaptive. The outside learning approach is to work with other people away from the screen: collaborate, create, connect, reflect and play. Professor Rogers believes that the choice is ours, but that technology should transform learning to make it active, creative, collaborative and exciting (plus some other things I didn’t catch), and should recognise the role of suspense in making people think.

An interesting and thought-provoking keynote.