The Fragile Student Relationship (working from #Unstuck #by Julie Felner @felner)

I was referred some time ago to a great site called “Unstuck” (which has some accompanying iPad software) that helps you to think about how to move past those stuck moments in your life and career and get things going. They recently posted an interesting item on “How to work like a human” and I thought that a lot of what they talked about had direct relevance to how we treat students and how we work with them to achieve things. The article is by Julie Felner and I strongly suggest that you read it, but here are my thoughts on her headings, as they apply to education and students.

Ultimately, if we all work together like human beings, we’re going to get on better than if we treat our students as answer machines and they treat us as certification machines. Here’s what optimising for one thing, mechanistically, can get you:

This robot is the business at climbing hills. Dancing like a fool, not so much. It’s not human.

But if we’re going to be human, we need to be connected. Here are some signs that you’re not really connected to your students.

  1. Anything that’s not work, you treat with a one-word response. A student comes to see you and you don’t have time to talk about anything but assignment X or project Y. I realise time is scarce but, if we’re trying to build people, we have to talk to people, like people.
  2. You’re impatient when they take time to learn or adjust. Oh yeah, we’ve all done this. How can they not pick it up immediately? What’s wrong with them? Don’t they know I’m busy?
  3. Sleep and food are for the weak – and don’t get sick. There are no human-centred reasons for not getting something done. I’m scheduling all of these activities back-to-back for two months. If you want it, you’ll work for it.
  4. We never ask how the students are doing. By which I mean, asking genuinely and drawing out a genuine response, even if some prodding is required. Not intrusively but out of genuine interest. How are they doing with this course?
  5. We shut them down. Here’s the criticism. No, I don’t care about the response. No, that’s it. We’re done. End of discussion. There are times when we do have to draw an end to a discussion but there’s a big difference between closing off something that’s going nowhere and delivering everything as if no discussion is possible.

Here is my take on Julie’s suggestions for how we can be more human at work, which works for the Higher Ed community just as well.

  1. Treat every relationship as one that matters. The squeaky wheels and the high achievers get a lot of our time but all of our students are actually entitled to have the same level of relationship with us. Is it easy to get that balance? No. Is it a worthwhile goal? Yes.
  2. Generously and regularly express your gratitude. When students do something well, we should let them know – as soon as possible. I regularly thank my students for good attendance, handing things in on time, making good contributions and doing the prep work. Yes, they should be doing it but let’s not get into how many things that should be done aren’t done. I believe in this strongly and it’s one of the easiest things to start doing straight away.
  3. Don’t be too rigid about your interactions. We all have time issues but maybe you can see students and talk to them when you pass them in the corridor, if both of you have time. If someone’s been trying to see you, can you grab them from a work area or make a few minutes before or after a lecture? Can you talk with them over lunch if you’re both really pressed for time? It’s one thing to have consulting hours but it’s another to make yourself totally unavailable outside of that time. When students are seeking help, it’s when they need help the most. Always convenient? No. Always impossible to manage? No. Probably useful? Yes.
  4. Don’t pretend to be perfect. Firstly, students generally know when you’re lying to them and especially when you’re fudging your answers. Don’t know the answer? Let them know, look it up and respond when you do. Don’t know much about the course itself? Well, finding out before you start teaching is a really good idea because otherwise you’re going to be saying “I don’t know a lot” and there’s a big, big gap between showing your humanity and obviously not caring about your teaching. Fix problems when they arise and don’t try to make it appear that it wasn’t a problem. Be as honest as you can about that in your particular circumstances (some teaching environments have more disciplinary implications than others and I do get that).
  5. Make fewer assumptions about your students and ask more questions. The demographics of our student body have shifted. More of my students are in part-time or full-time work. More are older. More are married. Not all of them have gone through a particular elective path. Not every previous course contains the same materials it did 10 years ago. Every time a colleague starts a sentence with “I would have thought” or “Surely”, they are (almost always) projecting their assumptions onto the student body, rather than asking “Have you?”, “Did you?” or “Do you know?”

Julie made the final point that sometimes we can’t get things done to the deadline. In her words:

You sometimes have to sacrifice a deadline in order to preserve something far more important — a relationship, a person’s well-being, the quality of the work

I completely agree because deadlines are a tool but, particularly in academia, the deadline is actually rarely as important as people. If our goal is to provide a good learning environment, working our students to zombie status because “that’s what happened to us” is bordering on a cycle of abuse, rather than a commitment to quality of education.

We all want to be human with our students because that’s how we’re most likely to get them to engage with us as humans too! I liked this article and I hope you enjoyed my take on it. Thank you, Julie Felner!


5 Things: Scientists

Another 5-pointer, inspired by a post I read about the stereotypes of scientists. (I know there are just as many about other professions but scientist is one of my current ones.)

  1. We’re not all “bushy-haired” confused old white dudes.

    It’s amazing that pictures of 19th Century scientists and Einstein have had such an influence on how people portray scientists. This link shows you how academics (researchers in general, but a lot of scientists are in here) are shown to children. I wouldn’t have as much of a problem with this if it wasn’t reinforcing a really negative stereotype about the potential uselessness of science (Professors who are not connected to the real world and who do foolish things) and the demography (it’s almost all men, and white ones at that), both of which are more than likely having a significant impact on how kids feel about going into science.

    It’s getting better, as we can see from a Google image search for scientists, which shows a very obvious “odd man out”, but that image search actually throws up our next problem. Can you see what it is?

    Sorry, Albert.

  2. We don’t all wear white coats!

    So we may have accepted that there is demographic diversity in science (but it still has to make it through to kids’ books) but that whole white coat thing is reinforced way too frequently. Those white coats are not a uniform, they’re protective clothing. When I was a winemaker, I wore heavy duty dark-coloured cotton clothing for work because I was expecting to get sprayed with wine, cleaning products and water on a regular basis. (Winemaking is like training an alcoholic elephant with a mean sense of humour.) When I was in the lab, if I was handling certain chemicals, I threw on a white coat as part of my protective gear but also to stop them getting on my clothes, because they would permanently stain or bleach them. Now I’m a computer scientist, I’ve hung up my white coat.

    Biological scientists, scientists who work with chemicals or pharmaceuticals – any scientists who work in labs – will wear white coats. Everyone else (and there are a lot of them) tends not to. Think of it like surgical scrubs – if your GP showed up wearing them in her office then you’d think “what?” and you’d be right.

  3. Science can be a job, a profession, a calling and a hobby – but this varies from person to person.

    There’s a perception of science as a job so all-consuming that it robs scientists of the ability to interact with ‘normal’ people, hence stereotypes like the absent-minded Professor or the inhuman, toxic personality of the Cold Scientific Genius. Let’s tear that apart a bit because the vast majority of people in science are just not like that.

    Some jobs can only be done when you are at work. You do the work, in the work environment, then you go home and you do something else. Some jobs can be taken home. The amount of work that you do on your job, outside of your actual required working time (including overtime), is usually an indicator of how interesting you find it. I didn’t have the facilities to make wine at home but I read a lot about it and tasted a lot of wine as part of my training and my job. (See how much cooler it sounds to say that you are ‘tasting wine’ rather than ‘I drink a lot’?) Some mechanics leave work and relax. Some work on stock cars. It doesn’t have to be any particular kind of job because people all have different interests and different hobbies, which will affect how they separate work and leisure – or blend them.

    Some scientists leave work and don’t do any thinking on things after hours. Some can think on things but not do anything because they don’t have the facilities at home. (The Large Hadron Collider cost close to USD 7 Billion, so no-one has one in their shed.) Some can think and do work at home, including Mathematicians, Computer Scientists, Engineers, Physicists, Chemists (to an extent) and others who will no doubt show up angrily in the comments. Yes, when I’m consumed with a problem, I’m thinking hard and I’m away with the pixies – but that’s because, as a Computer Scientist, I can build an entire universe to work with on my laptop and then test out interesting theories and approaches. But I have many other hobbies and, as anyone who has worked with me on art knows, I can go as deeply down the rabbit hole on selecting typefaces or colours.

    Everyone can appear absent-minded when they’re thinking about something deeply. Scientists are generally employed to think deeply about things but it’s rare that they stay in that state permanently. There are, of course, some exceptions which leads me to…

  4. Not every scientist is some sort of genius.

    Sorry, scientific community, but we all know it’s true. You have to be well-prepared, dedicated and relatively mentally agile to get a PhD but you don’t have to be crazy smart. I raise this because, all too often, I see people backing away from science and scientific books because “they wouldn’t understand it” or “they’re not smart enough for it”. Richard Feynman, an actual genius and great physicist, used to say that if he couldn’t explain it to college freshmen then the scientific community didn’t understand it well enough. Think about that – he’s basically saying that he expects to be able to explain every well-understood scientific principle to kids fresh out of school.

    The genius stereotype is not just a problem because it prevents people from coming into the field but because it puts so much demand on the people already in it. You could probably name three physicists, at a push, and you’d be talking about some of the ground-shaking members of the field. Involved in the work leading up to those discoveries, and beyond, are hundreds of thousands of scientists, going about their jobs, doing things that are valuable, interesting and useful, but perhaps not earth-shattering. Do you expect every soldier to be a general? Every bank clerk to become the general manager? Not every scientist will visibly change the world, although many (if not most) will make contributions that build together to change the world.

    Sir Isaac Newton, another famous physicist, referred to the words of Bernard of Chartres when he famously wrote:

    “If I have seen further it is by standing on the sholders [sic] of Giants”

    making the point even more clearly by referring to a previous person’s great statement to then make it himself! But there’s one thing about standing on the shoulders of giants…

  5. There’s often a lot of wrong to get to right.

    Science is evidence-based, which means that it’s what you observe occurring that validates your theories and allows you to develop further ideas about how things work. The problem is that you start from a position of not knowing much, make some suggestions, see if they work, find out where they don’t and then fix up your ideas. This has one difficult side-effect for non-scientists in that scientists can very rarely state certainty (because there may be something that they just haven’t seen yet) and they can’t prove a negative, as you just can’t say something won’t happen because it hasn’t happened yet. (Absence of evidence is not evidence of absence.) This can be perceived as weakness but it’s one of the great strengths of science. We work with evidence that contradicts our theories to develop our theories and extend our understanding. Some things happen rarely and under only very specific circumstances. The Large Hadron Collider was built to find evidence to confirm a theory and, because the correct tool was built, physicists now better understand how our universe works. This is a Good Thing as the last thing we want to do is void the warranty through incorrect usage.

    The more complicated the problem, the more likely it is that it will take some time to get it right. We’re very certain about gravity, in most practical senses, and we’re also very confident about evolution. And climate change, for that matter, which will no doubt get me some hate in the comments but the scientific consensus is settled. It’s happening. Can we say absolutely for certain? No, because we’re scientists. Again – strength, not weakness.

    When someone gets it wrong deliberately, and that sadly does happen occasionally, we take it very seriously because that whole “standing on the shoulders of giants” idea is so key to our approach. A disingenuous scientist, like Andrew Wakefield with his shamefully bad and manipulated study on vaccination that has caused so much damage, will take a while to be detected and then we have to deal with the repercussions. The good news is that most of the time we find these people and limit their impact. The bad news is that this can be spun in many ways, especially by compromised scientists, and humans can be swayed by argument rather than fact quite easily.

    The takeaway from this is that admitting that we need to review a model is something you should regard in the same light as your plane being delayed because of a technical issue. You’d rather we fixed it, immediately and openly, than tried to fly on something we knew might fail.


5 Things: Computers

In the interests of blogging more usefully, I’m trying some “5 point posts” in areas where I have some reasonable knowledge. Hope they’re useful!

  1. Computers neither like you nor hate you.

    If I had a dollar for every time I saw someone go through some sort of ritual like pleading with, patting or hitting a computer, I’d be a very rich man. We often talk about computers as if they understand what we’re talking about (a fallacy that can trip up novice programmers, thanks for the reminder, Mark!) and this assumes that there’s some kind of mind in there. I know that you all know that it’s not actually true but we have to stop acting like it’s true as well.

    If you have important documents on your computer – then back them up, somewhere. If you are writing large documents, save them every 5 minutes or so. And check regularly to make sure that they’re actually being saved. The amount of preparation you put into making sure that the computer doing something ‘bad’ won’t actually affect you will directly reduce the amount of stress that you feel when it does go wrong. The computer is neither your friend nor your enemy and it will do what the programs tell it to do – not what you want it to do or what any reasonable person would do. This is pretty much true across every computer and operating system. The computer can’t tell your vital photos from an old recipe copy you don’t need and it most certainly has no idea that you have a deadline – you’re just more likely to make mistakes because you’re under pressure.

    One of the best things you can ever install to stop your computer “behaving badly” is anti-virus software that you keep updated. Yes, it costs money (sorry) but how much is your time worth? If you can say “Yes, I lost the last 12 months’ work and it will take me a week to get my computer working again and I don’t care” then you can skip anti-virus. Everyone else – please install supported anti-virus software (look online for customer reviews and recommendations; I make none here). Having your computer hacked isn’t some jolly pirate image that pops up and goes “ho ho ho”. Modern attacks can wait, encrypt your backups and then charge you money to get at your own data – deleting it if you don’t pay. Computers don’t hate you but there are a lot of haters out there. One of the biggest threats is becoming part of a botnet, a collection of computers that are being used to conduct unauthorised or criminal activities without the knowledge of their owners. Not that worried? Botnets can be used to host all sorts of things, including child pornography chat servers and files. But don’t be worried! Install good anti-virus software instead and keep it up to date!

    Now, very, very few people are “bad with computers” but a lot of people have had unfortunate first encounters (and that is far more likely to have to do with the computer than with what you are doing) and have retreated to what is, essentially, a position of superstition. This wouldn’t be a problem, except that…

  2. Computers are everywhere.

    And this is why not being comfortable with computers is going to be more of a problem. I have now learned to program (in simplistic form but still) everything from cars to video recorders, including my vacuum cleaner, because all of them have little computers inside them. It will, sadly, get harder and harder to stay away from them. I’m not advocating some Butlerian evolution of the machine but it’s just happening anyway. Do they work exactly as we wish? No, but I’ll get back to that later, because they are close enough most of the time.

    So you probably already have one at home, in some form, which brings me to… 

  3. Computers need to be replaced and upgraded.

    This is a bit of a pain, particularly for those who don’t like change (or prefer to avoid it) or have no cash (or anything else that says “I don’t want to upgrade”). The computing hardware will eventually break down and the more active the life of the computer, the more likely it is that something will go wrong. Laptops tend to die before desktops because of vibration, dust and heat, and tablets and phones are easy to drop. That’s why the point I made about backing up is really important anyway and triply important for anything vaguely mobile.

    Companies regularly declare an end-of-life period for their software and hardware and you have to keep an eye out for this as, after this time, you will no longer get updates for the software and the hardware. An unsupported computer is a crash waiting to happen and a security hole that you could drive a truck through. So not only do you need to keep using something supported, you have to keep patching it (downloading updates from the company and installing them) to keep it safe. In 2008, an unpatched Windows XP box would be hacked within, on average, 4 minutes of connection time. XP itself was released in 2001 and it was officially declared end-of-life on April 8, 2014. That’s over 12 years, compared to the usual product cycle of 10 years. But now, unless something big happens or you happen to be running ATMs, you will not get any more support from Microsoft for this operating system. Which means that, soon enough, your machine will take but minutes to infect and become part of someone else’s network of compromised machines – if it hasn’t happened already.

    Hardware does change and removing old machines can be painful when you have a trusted companion that is still working. However, these sorts of changes (like Apple’s removal of support for the PowerPC chip) are advertised well in advance (it took 7 years for Apple to stop supporting the PowerPC) and there is at least one silver lining to the creep in hardware and system specifications. If you buy 12 months behind the release of new technology, you should still get 5-ish good years out of your machines and avoid paying full price – plus you can buy refurbished models from early adopters with more money than sense. However, be careful and don’t buy something from a discontinued line just because it is cheap – it will reach end-of-life much sooner than the low-end hardware from the new line.

    Yes, forced obsolescence sucks but we actually don’t have to buy the new shiny every time (not that many of us can afford to) and knowledge of the refresh/end-of-life cycle will help you to make a good decision. For those of you who are supporting older family members, I know it sucks but you’re going to have to broach the issue of operating system changeovers before they become part of a distributed denial-of-service attack on some government department or have all of their e-mails encrypted for a $500 decrypt fee.

  4. It doesn’t really matter which computer you use, if it works for you.

    I’ve used pretty much everything in the way of computers and I use what works for me, when I need to. Right now, I’m using a lot of Apple gear because I’m not doing as much gaming and it all does what I need. If I were working more in different areas, I might be doing a lot more in Linux. I’ve worked with Windows before and I’ll probably work with it again. In 10 years’ time, who knows?

    I have no strong opinions as to what is best and I’m certainly not going to lecture someone on their choice. If they’re obviously unhappy, then we might chat, but don’t let anyone tell you that you’re right or wrong just because you have this system or that. (Unless it’s horribly out of date or not backed up, in which case, please look into updating/upgrading/fixing!)

  5. Computers are here to stay and the computing profession has some work to do

    And that’s the truth of it. We have a long way to go in making computers work better with people, that’s for sure. It would be great if we could be more ambiguous and hand-wavy with a machine and get it to do what we want but there are a lot of things to get working before that happens. However, hand on my heart, it is so much easier to use computers now than it was 10 years ago, let alone 20 or 30. I genuinely think that we are going to see better and better ways to work with them as time goes on so, please, hang in there if you’re having trouble. That next upgrade might be just what you were looking for, even if it seems like a pain at the time.


When Does Collaborative Work Fall Into This Trap?

A recent study has shown that crowdsourcing activities are prone to bringing out the competitors’ worst competitive instincts.

“[T]he openness makes crowdsourcing solutions vulnerable to malicious behaviour of other interested parties,” said one of the study’s authors, Victor Naroditskiy from the University of Southampton, in a release on the study. “Malicious behaviour can take many forms, ranging from sabotaging problem progress to submitting misinformation. This comes to the front in crowdsourcing contests where a single winner takes the prize.” (emphasis mine)

You can read more about it here but it’s not a pretty story. Looks like a pretty good reason to be very careful about how we construct competitive challenges in the classroom!

We both want to build this but I WILL DO IT WITH YOUR BONES!


Proud to be a #PreciousPetal, built on a strong #STEM, @PennyWrites @SenatorMilne @adambandt

I am proud to be a Precious Petal. Let me explain why I think we should reclaim this term for ourselves.

Australia, apparently, does not have a need for a dedicated Science Minister, for the first time since the 1930s. Instead, it is a subordinate portfolio for our Minister for Industry, the Hon Ian Macfarlane, MP. Today, he was quoted in the Guardian, hitting out at “precious petals in the science industry” who are criticising the lack of a dedicated Science Minister. Macfarlane, whose Industry portfolio includes Energy, Skills and Science, went on to say:

“I’m just not going to accept that crap,” he said. “It really does annoy me. There’s no one more passionate about science than me, I’m the son and the grandson of a scientist. I hear this whinge constantly from the precious petals in the science industry.”

So I’m not putting words in his mouth – that’s a pretty directed attack on the sector that happens to underpin Energy and Industry because, while Macfarlane’s genetic advantage in his commitment to science may or may not be scientifically valid, the fact of the matter is that science, and innovation in science, have created pretty much all of what is referred to as industry in Australia. I’m not so one-eyed as to say that science is everything, because I recognise and respect the role of the arts and humanities in a well-constructed and balanced society, but if we’re going to talk about everything after the Industrial (there’s that word again) Revolution in terms of production industries – take away the science and we’re not far from poking things with sticks to work out which of the four elements (fire, air, earth, water) they belong to. Scientists of today stand on a tradition of thousands of years of accumulated knowledge that has survived many, many regimes and political systems. We tell people what the world is like, rather than what people want it to be, and that often puts us at odds with politicians, for some reason. (I feel for the ethicists and philosophers who have to do the same thing but can’t get industry implementation partnerships as easily and are thus, unfairly, regularly accused of not being ‘useful’ enough.)

I had the opportunity to be addressed by the Minister at Science Meets Parliament where, like something out of a David Williamson play, the genial ageing bloke stood up and, in real Strine, declaimed “No Minister for Science? I’m your Minister for Science!” as if this was enough for a room full of people who were dedicated to real evidence. But he obviously thought it was enough as he threw a few bones to the crowd. On the back of the cuts to CSIRO and many other useful scientific endeavours, these words ring even more hollow than they did at the time.

But rather than take offence at the Minister’s more recent deliberately inflammatory and pejorative words, let me take them and illustrate his own lack of grasp of his portfolio.

My discipline falls into STEM – Science, Technology, Engineering and Mathematics – and I am a scientist in that field. Personally, I like to add an A for Arts, as I am rather cross-disciplinary, and make it STEAM, because that conveys the amazing potential and energy in the area when we integrate across the disciplines. So, if Science is a flower, then we have a strong STEM in Australia, although it is currently under threat from a number of initiatives put in place by this very government.

But what of petals? If the Minister knew much botany, he’d know that petals are modified leaves that protect parts of the flower and attract, or deliberately drive away, certain pollinators, forming relationships with their pollinating community to build a strong ecosystem. When flowers have no petals, they are subject to the whim of the winds for pollination and this means that you have to be very wasteful with your resources to try to reach any other plants. When the petals are strong and well-defined, you can draw in the assistance of other creatures to help you use your resources more wisely and achieve the goals of the flower – to produce more flowers over time.

At a time when bee colony collapse is threatening agriculture across the globe, you would think that a Minister of Industry (and Science) would have actually bothered to pick up some of the facts on this very basic role of a mechanism that he is using to deride and attempt to humiliate a community for having the audacity to complain about a bad decision. Scientists have been speaking truth to power since the beginning, Minister, and we’re not going to stop now.

If the Minister understood his portfolio, then he would realise that calling Australia’s scientific community “precious petals” is actually a reflection of their vital role in making science work for all Australians and the world. It is through these petals, protecting and guiding the resources in their area, that we can take the promise of STEM and share it with the world.

But let’s not pretend that’s what he meant. Much like the staggering Uncle at a Williamson Wedding, these words were meant to sting and diminish – to make us appear hysterical and, somehow, less valid. In this anachronistic, and ignorant, attack, we have never seen a better argument as to why Australia should have a dedicated Science Minister, who actually understands science.

I’m proud to be a Precious Petal, Minister.

An open Nelumbo nucifera flower, from the Botanic Gardens in Adelaide. Via Wikipedia.


Knowing the Tricks Helps You To Deal With Assumptions

I teach a variety of courses, including one called Puzzle-Based Learning, where we try to teach thinking and problem-solving techniques through the use of simple puzzles that don’t depend on too much external information. These domain-free problems have most of the characteristics of more complicated problems but you don’t have to be an expert in the specific area of knowledge to attempt them. The other thing that we’ve noticed over time is that a good puzzle is fun to solve, fun to teach and gets passed on to other people – a form of infectious knowledge.

Some of the most challenging areas to try and teach into are those that deal with probability and statistics, as I’ve touched on before in this post. As always, when an area is harder to understand, it actually requires us to teach better but I do draw the line at trying to coerce students into believing me through the power of my mind alone. But there are some very handy ways to show students their assumptions about the nature of probability (and randomness), so that they are receptive to the idea that their models could need improvement (allowing us to work in that uncertainty) and can also start to understand probability correctly.

We are ferociously good pattern matchers and this means that we have some quite interesting biases in the way that we think about the world, especially when we try to think about random numbers, or random selections of things.

So, please humour me for a moment. I have flipped a coin five times and recorded the outcome here. But I have also made up three other sequences. Look at the four sequences for a moment and pick which one is most likely to be the one I generated at random – don’t think too much, use your gut:

  1. Tails Tails Tails Heads Tails
  2. Tails Heads Tails Heads Heads
  3. Heads Heads Tails Heads Tails
  4. Heads Heads Heads Heads Heads

Have you done it?

I’m just going to put a bit more working in here to make sure that you’ve written down your number…

I’ve run this with students and I’ve asked them to produce a sequence by flipping coins and then produce a false sequence by making subtle changes to the generated one (turning heads into tails, but only changing a couple along the way). They then write the two together on a board and people have to vote on which one is which. As it turns out, the chances of someone picking the right sequence are about 50/50, but I engineered that by starting from a generated sequence.

This is a fascinating article that looks at the overall behaviour of people. If you ask people to write down a five coin sequence that is random, 78% of them will start with heads. So, chances are, you’ve picked 3 or 4 as your starting sequence. When it comes to random sequences, most of us equate random with well-shuffled, and, on the large scale, 30 times as many people would prefer option 3 to option 4. (This is where someone leaps into the comments to say “A-ha” but, it’s ok, we’re talking about overall behavioural trends. Your individual experience and approach may not be the dominant behaviour.)

From a teaching point of view, this is a great way to break up the concepts of random sequences and some inherent notion that such sequences must be disordered. There are 32 different ways of flipping 5 coins in a strict sequence like this and all of them are equally likely. It’s only when we start talking about the likelihood of getting all heads versus not getting all heads that the aggregated event of “at least one tail” starts to be more likely.
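
To make that concrete, here is a minimal Python sketch (my own illustration, not from the original article) that enumerates all 32 sequences and compares the probability of any one specific sequence with the aggregated events:

    from itertools import product

    # All possible outcomes of five fair coin flips: 2^5 = 32 sequences,
    # each with exactly the same probability, 1/32.
    sequences = list(product("HT", repeat=5))
    print(len(sequences))            # 32
    print(1 / len(sequences))        # 0.03125 – the chance of any one specific sequence

    # Aggregated events behave differently:
    all_heads = [s for s in sequences if all(flip == "H" for flip in s)]
    at_least_one_tail = [s for s in sequences if any(flip == "T" for flip in s)]

    print(len(all_heads) / len(sequences))          # 1/32  = 0.03125
    print(len(at_least_one_tail) / len(sequences))  # 31/32 = 0.96875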

How can we use this? One way is getting students to write down their sequences and then asking them to stand up, then sit down when your ‘call’ (from a script) goes the other way. If almost everyone is still standing at heads then you’ve illustrated that you know something about how their “randomisers” work. A lot of people (if your class is big enough) should still be standing when the final coin is revealed and this is something we can address. Why do so many people think about it this way? Are we confusing random with chaotic?

The Law of Small Numbers (Tversky and Kahneman), also mentioned in the post, is basically the observation that people generalise too much from small samples and expect small samples to act like big ones. In your head, if the grand pattern over time could be resorted into “heads, tails, heads, tails,…” then small sequences must match that or they just don’t look right. This is an example of the logical fallacy called a “hasty generalisation” but with a mathematical flavour. We are strongly biased towards the validity of our experiences, so when we generate a random sequence (or pick a lucky door or win the first time at poker machines) then we generalise from this small sample and can become quite resistant to other discussions of possible outcomes.
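
A quick simulation (again my own sketch, not part of the original study) makes the point about small samples: a run of five flips wanders a long way from 50/50 far more often than people expect, while bigger samples settle down.

    import random

    def head_proportions(sample_size, trials=10_000):
        """Proportion of heads in many repeated samples of a given size."""
        proportions = []
        for _ in range(trials):
            heads = sum(random.random() < 0.5 for _ in range(sample_size))
            proportions.append(heads / sample_size)
        return proportions

    for n in (5, 50, 500):
        props = head_proportions(n)
        lopsided = sum(p <= 0.2 or p >= 0.8 for p in props) / len(props)
        print(f"sample size {n:3d}: {lopsided:.1%} of samples are 80/20 or worse")
    # Roughly 37% of five-flip samples are that lopsided; almost none of the 500-flip samples are.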

If you have really big classes (367 or more) then you can start a discussion on random numbers by asking people what the chances are that any two people in the room share a birthday. Given that there are only 366 possible birthdays, the Pigeonhole principle states that two people must share a birthday as, in a class of 367, there are only 366 birthdays to go around so one must be repeated! (Note for future readers: don’t try this in a class of clones.) There are lots of other interesting thinking examples in the link to Wikipedia that help you to frame randomness in a way that your students might be able to understand it better.
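
If you want the actual numbers behind that question, here is a small sketch (mine, assuming the 366 possible birthdays mentioned above are all equally likely) that computes the chance of at least one shared birthday for a class of a given size – it passes 50% at a class size of only 23 and reaches certainty at 367, exactly as the Pigeonhole principle demands.

    def shared_birthday_probability(class_size, days=366):
        """Chance that at least two people share a birthday,
        assuming every birthday is equally likely."""
        if class_size > days:
            return 1.0                     # pigeonhole: more people than birthdays
        p_all_distinct = 1.0
        for i in range(class_size):
            p_all_distinct *= (days - i) / days
        return 1 - p_all_distinct

    for size in (23, 50, 100, 367):
        print(size, round(shared_birthday_probability(size), 4))
    # 23 ≈ 0.51, 50 ≈ 0.97, 100 ≈ 1.00 (to four decimal places), 367 = exactly 1.0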

10 pigeons into 9 boxes? Someone has a roommate.

I’ve used a lot of techniques before, including the infamous card shouting, but the new approach from the podcast is a nice and novel angle to add some interest to a class where randomness can show up.


MOOCs and the on-line Masters Degree

There’s been a lot of interest in Georgia Tech’s new on-line masters degree in Computer Science, offered jointly with Udacity and AT&T. The first offering ran with 375 students, and there are 500 in the pipeline, but admissions opened again two days ago so this number has probably gone up. PBS published an article recently, written up on the ACM blog.

I think we’re all watching this with interest as, while it’s neither Massive at this scale nor Open (fee-paying and admission checked), if this works reasonably, let alone well, then we have something new to offer at the tertiary scale but without many of the problems that we’ve traditionally seen with existing MOOCs (retention, engagement, completion and accreditation).

Right now, there are some early observations: the students are older (11 years older on average) and most are working. In this way, we’re much closer to the standard MOOC demographic for success: existing degree, older and practised in work. We would expect this course to do relatively well, much as our own experiences with on-line learning at the 100s scale worked well for that demographic. This is, unlike ours, more tightly bound into Georgia’s learning framework and their progress pathways, so we are very keen to see how their success will translate to other areas.

We are still learning about where MOOCs (and their children, SPOCs and the Georgia Tech program) will end up in the overall scheme of education. With this program, we stand a very good chance of working out exactly what it means to us in the traditional higher education sector.

An inappropriate picture of a bricks-and-mortar campus for an article on on-line learning.

