SIGCSE 2014: Collecting and Analysing Student Data 1, Paper 3, Thursday 3:15 – 5:00pm (#SIGCSE2014)

Ok, this is the last paper; with any luck I can summarise it in four words so you don’t die of word poisoning. The final talk was “Using CodeBrowser to Seek Differences Between Novice Programmers” by Kenny Heinonen, Kasper Hirvikoski, Matti Luukkainen, and Arto Vihavainen, from the University of Helsinki. I regret to say that due to some battery issues, this post is probably going to be cut short. My apologies to the speakers!

The takeaway from the talk is that CodeBrowser is a fancy tool for identifying challenges that students face as they are learning to program. It uses your snapshot data and, if you have lots of students, course outcomes and other measures should be used to find a small number of students to analyse first. (Oh, and penguins are cool.)

Helsinki has hundreds of local students and thousands of MOOC participants learning to program, with their progress recorded as they go. The system is built on top of NetBeans and provides scaffolding for students as they learn to program. Ok, so we’re recording the students’ progress, but so what? Well, we have snapshots with time and source and we can use this to identify students at risk of dropping CS1 and a parallel maths course. (Retention and early drop-out? Subjects close to my heart!) It can also be used to seek insight into the problems that students are facing. There are not a great many systems that allow you to analyse and visualise code snapshots, apparently.

Looks interesting, I’ll have to go and check it out!

Sorry, battery is going, committing before this all goes away!


SIGCSE 2014: Research: Concept Inventories and Neo-Piagetian Theory, Thursday 1:45-3:00pm (#SIGCSE2014)

The first talk was “Developing a Pre- and Post- Course Concept Inventory to Gauge Operating Systems Learning” presented by Kevin Webb.

Kevin opened by talking about the difficulties we have in sharing and comparing measures of student learning behaviour and performance. Assessment should be practical, technical, comprehensive, and, most critically, comparable, so you can compare results across instructors, courses and institutions. It is, as we know, difficult to compare homework and lab assignments, student surveys and exam results, for a wide range of reasons. Concept inventories, according to Kevin, give us a mechanism for combining the technical and comparable aspects.

Concept inventories are short, standardised exams that deal with high-level conceptual take-aways to reveal systematic misconceptions, in MCQ format, deployed before and after courses. You can supplement your courses with this small exam to see how student learning is progressing and you can use this to compare performance and learning between classes. The one you’ve probably heard of is the Physics Force Concept Inventory, which Mazur talks about a lot as it was the big motivator for Peer Instruction to address shallow conceptual learning.

There are two Concept Inventories for CS but they’re not publicly available or even maintained anymore. When they were run, students were less successful than expected – only 40-60% of the key concepts were successfully learned AFTER the course. If your students were struggling with 40% of the key concepts, wouldn’t you like to know?

This work hopes to democratise CI development, using open source principles. (There is an ITiCSE paper coming soon, apparently.) This work has some preliminary development of a CI for Operating Systems.

Goals and challenges included dealing with the diversity of OS courses and trading off which aspects would best fit into the CI. The researchers also wanted it to be transparent and flexible, making questions available immediately and providing a path (via GitHub) for collaboration and iteration. From an accessibility perspective, developing questions for a universal pre-test is hard, and the questions are based in the real world where possible.

An example of this is paging/caching replacement, needed because of the limited capacity of some of these storage mechanisms, so the key concept is locality, with an “evict oldest” policy. What happens if the students don’t have the vocabulary of a page table or staleness yet? How about an example of books on your desk, via books on a shelf, via books in the library? (We used similar examples in our new course to explain memory structures in C++ with a supermarket and the various shelves.)
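
To make the “evict oldest” idea concrete, here’s a tiny sketch of my own (not from the paper), using the books-on-a-desk analogy: the desk has limited capacity, and fetching a new book from the shelf sends the oldest one on the desk back.

```python
from collections import deque

class Desk:
    """Toy 'evict oldest' (FIFO) cache: the desk holds a few books, and
    fetching a missing one sends the oldest book back to the shelf."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.books = deque()        # oldest book sits at the left

    def read(self, title):
        if title in self.books:     # already on the desk: a "hit"
            return f"{title}: already on the desk"
        evicted = None
        if len(self.books) == self.capacity:
            evicted = self.books.popleft()   # evict the oldest book
        self.books.append(title)    # fetch the book from the shelf
        return f"{title}: fetched" + (f", returned {evicted}" if evicted else "")

desk = Desk(capacity=3)
for title in ["A", "B", "C", "A", "D", "B"]:
    print(desk.read(title))
```

The same loop makes locality visible: re-reading a recent book is free, while jumping around forces constant trips to the shelf.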

Results so far indicate that taking the OS course improved performance (good) but not all concepts showed an equal increase – some concepts appear to be less intuitive than others. Student confidence increased, even where they weren’t getting the right answers. Scenario “word problems” appear to be challenging to students, who opted for similar, less efficient solutions. (This may be related to the “long document hard to read” problem that we’ve observed locally.)

The next example was on indirection with pointers where simplifying the pointer chain was something students intuitively did, even where the resulting solution was sub-optimal. This was tested by asking two similar questions on the exam, where the first was neutrally stated as a “should we” and the second asked them to justify the complexity of something, which gave them a tip as to where the correct answer lay.

Another example, using input/output and polling: presenting the device without a name deprived the students of the ability to use a common pattern. When, in an exam, the device was named (as a disk), the correct answer was chosen, but the reasoning behind the answer was still lacking – so they appear to be pattern matching, rather than thinking their way to the answer. From some more discussion, students unsurprisingly appear to choose solutions that match what they have already seen – so they will apply mutexes even in applications where they’re not needed, because we drown them in locks. Presenting the same problem without “constricting” names as a code example, the students could then solve the problem correctly, without locks, despite almost all of them wanting to use locks earlier.
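
As a rough illustration of the “locks by reflex” point (my own toy example, not the one from the talk): here the threads share a list, which pattern-matches to “use a mutex”, yet because each thread only ever writes its own indices, no lock is needed at all.

```python
import threading

# Each worker squares its own slice of the input. The slices never overlap,
# so no thread touches another thread's results and no mutex is required,
# even though "threads + shared list" looks like a locking problem.
data = list(range(8))
results = [None] * len(data)

def square_slice(start, stop):
    for i in range(start, stop):
        results[i] = data[i] * data[i]   # each index written by exactly one thread

threads = [threading.Thread(target=square_slice, args=(i, i + 4)) for i in (0, 4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)   # [0, 1, 4, 9, 16, 25, 36, 49]
```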

Interesting talk with a fair bit to think about. I need to read the paper! The concept inventory can be found at https://github.com/osconceptinventory and the group welcomes collaboration so go and … what’s the verb for “concept inventory” – inventorise? Anyway, go and do it! (There was a good reminder in question time to mine your TAs for knowledge about what students come to talk to them about – those areas of uncertainty might be ripe for redevelopment!)

The next talk was “Misconceptions and Concept Inventory Questions for Hash Tables and Binary Search Trees” presented by Kuba Karpierz (a senior Computer Science student at the University of British Columbia). Kuba reviewed the concept inventory concept for newcomers to the room. (Poor Kuba was slightly interrupted by a machine shutdown that nearly broke his presentation but carried on with little evidence of a problem and recovered well.) The core properties of concept inventories are that they must, at the very least, be brief and multiple choice.

Students found hash table resizing to be difficult so this was nominated as a CI question. Students would sketch the wrong graph for resizing, ignoring the resize cost and exaggerating the curve shape of what should be a linear increase. The team used think-aloud exercises to explore why students picked the wrong solution. Regrettably, the technical problems continued and made it harder to follow the presentation.
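
To see why the total cost of inserting n items should grow (roughly) linearly rather than sweeping upwards, here’s a little counting sketch of my own (not from the paper) for a table that doubles when full:

```python
def total_moves(n, initial_capacity=8):
    """Count element copies while inserting n items into a table that doubles
    when full. The total stays proportional to n, so a graph of 'cost of
    inserting n items' is roughly a straight line, not a dramatic curve."""
    capacity, size, moves = initial_capacity, 0, 0
    for _ in range(n):
        if size == capacity:      # table is full: resize
            moves += size         # every existing item is copied once
            capacity *= 2
        size += 1
    return moves

for n in (1_000, 10_000, 100_000):
    print(n, total_moves(n), total_moves(n) / n)   # ratio stays below ~2
```

The ratio of copies to inserts stays bounded, which is the amortised-cost argument in miniature: the occasional expensive resize is paid for by the many cheap inserts around it.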

A large number of students had no idea how to resize the hash table (for reasons I won’t explain) but this was immediately obvious after the concept inventory exam, rather than having to dig it out of the exams. The next example was on Binary Search Trees and the misconception that they are always balanced. (It turns out that students are conflating them with heaps.) Looking at the CI MCQs for this, it’s apparent that we were teaching with these exemplars in lectures, but not as an MCQ short exam. Food for thought. The example shown did make me think because it was deliberately ambiguous. I wondered if it would be better if it were slightly less challenging and the students could pick the right answer. Apparently they are looking at this in a different question.
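
As a quick illustration of the “BSTs are always balanced” misconception (my own toy code, not a CI question from the paper): inserting already-sorted keys into a plain, non-rebalancing BST produces a chain, not the nicely balanced tree students tend to imagine.

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    """Standard unbalanced BST insert: nothing here rebalances the tree."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def height(root):
    return 0 if root is None else 1 + max(height(root.left), height(root.right))

root = None
for key in range(1, 16):        # inserting already-sorted keys 1..15
    root = insert(root, key)
print(height(root))             # 15: a chain, not the balanced height of 4
```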

The final talk was “Neo-Piagetian Theory as a Guide to Curriculum Analysis”, presented by Claudia Szabo, from our Computer Science Education Research group. This is the work that we’re using as the basis for the course redesign of our local Object Oriented Programming course so I know this work quite well! (It’s nice to see theory being put into practice, though, isn’t it?)

Claudia started with a discussion of curriculum analysis – the systematic processes that we use to guide teachers to identify instructional goals and learning objectives. We develop, we teach, we observe and we refine, but this refinement may lead to divergence from the originally stated goals. The course loses focus and structure, and possibly even loses its scaffolding. Claudia’s paper has lots of good references for the various theory areas so I won’t reproduce them here but, to get back to the talk, Claudia covered the Piagetian stages of cognitive development in the child: sensorimotor, pre-operational, concrete operational and formal operational. In short, you can handle concepts in pre-operational, can perform logic and solve for specific situations in concrete operational, but only get to abstract thought and true problem-solving in the formal operational stage. (Pre-operational is ages 2-7, concrete is 7-11 and formal is 11-15 by the time it is achieved. This is not a short process but it also explains why we teach things differently at different age groups.)

Fundamentally, Neo-Piagetian theory starts from the premise that the cognitive developmental stages that humans go through during childhood are seen again, exhibited in the same stages, as we learn very new and different concepts in new contexts, including mathematics and computer science. Ultimately, this places limitations on the amount of abstraction versus concrete reasoning that students can apply. (Without trying to start an Internet battle, neo-Piagetian theory is one of the theories in this space, with the other two that I generally associate being Threshold Concepts and Learning Edge Momentum – we’re going to hold a workshop in Australia shortly to talk about how these intersect, conflict and agree, but I digress.)

So this paper is looking to analyse learning and teaching activities to determine the level at which we are teaching a concept and the level at which we are assessing it – this should allow us to determine prerequisite concepts (a concept is tested before being taught) and assessment leaps (a concept is assessed at a level higher than we taught it). The approach uses an ACM CS curriculum basis, combined with course-specific materials and a neo-Piagetian taxonomy, to classify teaching activities to work out if we have not provided the correct pre-requisite material or whether we are assessing at a higher level than we taught students (or provided a learning environment for them to reach that level, if we’re being precise). There’s a really good write-up in the paper to show you how conceptual handling and abstraction changes over the developmental stages.

For example, in representational systems a concrete explanation of memory allocation is “memory allocation is when you use the keyword new to create a variable”. In a familiar Single Abstraction, we could rely upon knowledge of the programming language and the framework to build upon the memory allocation knowledge to explain how memory allocation dynamically requests memory from the free store, initialises it and returns a pointer to the allocated space. If the student was able to carry out Single Abstraction at the global level, they would be able to map their knowledge of memory allocation in C++ into a new language such as Java. As the student develops further, they can map abstractions to a global level, so class hierarchies in C++ can be mapped into a similar understanding in Java, for example.

The course that was analysed, Object Oriented Programming, had a high failure rate, and students were struggling in the downstream course with fundamental concepts that we thought we had covered in OOP. So a concept definition document was produced to give a laundry list of concepts. (Pro tip: concept inventories get big quickly. Be ruthless in your trimming.) For the selected concepts, the authors looked to see where each was taught, how it was taught and then how it was assessed. This quickly identified problems that needed to be fixed. One example: for the important C++ concept of strings, assessment had been carried out before the concrete operational teaching had taken place! We start to see why the failure rate had been creeping up over time.

As the developer, in association with the speaker, of the new OOP course, I find this framework REALLY handy because you are always thinking “How am I teaching this? Can I assess it at this level yet?” If you do this up front then you can design a much better course, in my opinion, as you can move things around the course to get them in the right order at the right time and have enough time to rewrite materials to match the levels. It doesn’t actually take that long to run over the course and it clearly visualises where our pitfalls are.

Next on the table is looking at second and third year courses and improving the visualisation – but I suspect I may have to get involved in that one, personally.

Good session! Lots of great information. Seriously, if you’re not at SIGCSE, why aren’t you here?


SIGCSE Keynote #1 – Computational Thinking For All, Robert M. Panoff, Shodor Education Foundation

Bob Panoff is the winner of the 2014 SIGCSE Award for Outstanding Contribution to Computer Science Education and so he gets to give a keynote, which is a really good way to do it rather than delaying the award winners to the next year.

Bob kicked off with good humour, most of which I won’t be able to capture, but the subtext of his talk is “The Power and the Peril”, which is a good start to the tricky problem of computational thinking for all. What do we mean by computational thinking? Well, it’s not teaching programming; we can teach programming to enhance computational thinking, but thinking is the key word here. (You can find his slides here: http://shodor.org/talks/cta/)

Bob has faced the same problem we all have: that of being able to work on education when your institution’s focus is research. So he went and started an independent foundation where such CS Ed activities could be supported. Bob then started talking about expectation management, noting that satisfaction is reality divided by expectations – so if you lower your expectations, satisfaction goes up. (I like that and will steal it.)

Where did the name Shodor come from? Bob asked us if we knew and then told us a story that answered the question. As it turns out, the name came from a student’s ungenerous pattern characterisation of Bob, whose name he couldn’t remember, as “short and kinda dorky looking”.

I need to go and look at the Shodor program in detail because they have a layered apprenticeship model, teaching useful thinking and applied skills to high schoolers, which fills in the missing maths and 21st century skills that might prevent them from going further in education. Many of the Shodor apprentices end up going on as first-in-family to college, which is a great achievement.

Now, when we say Computational Science Education, is it Computational (Science Education) or (Computational Science) Education? (This is the second slide in the pack). The latter talks about solving the right problem, getting the problem solved in the right way and actually being right.

Right Answer = Wrong Answer + Corrections

This is one of the key issues in modelling over finite resources, because we have to take shortcuts in most systems to produce a model that will fit. Computationally, if we have a slightly wrong answer (because of digital approximations or so on), then many iterations will make it more and more wrong. If we remember to adjust for the corrections, we can still be right. How helpful is it to have an exact integral that you can’t evaluate, especially when approximations make that exact integral exceedingly unreliable? (The size of the Universe is not 17cm, for example.)
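
Here’s a small sketch of my own (not Bob’s) of “Right Answer = Wrong Answer + Corrections” in action: naively adding 0.1 millions of times drifts away from the true total, while carrying an explicit correction term (Kahan-style compensated summation) keeps the accumulated error in check.

```python
# Naive repeated addition of 0.1 drifts further from the true answer as the
# iteration count grows; tracking the correction we still "owe" (compensated
# summation) keeps the result essentially exact.
def naive_sum(x, n):
    total = 0.0
    for _ in range(n):
        total += x
    return total

def compensated_sum(x, n):
    total, correction = 0.0, 0.0
    for _ in range(n):
        y = x - correction            # apply the correction we still owe
        t = total + y                 # low-order bits of y may be lost here
        correction = (t - total) - y  # recover what was lost, for next time
        total = t
    return total

n = 10_000_000
print(abs(naive_sum(0.1, n) - n * 0.1))        # noticeable drift
print(abs(compensated_sum(0.1, n) - n * 0.1))  # essentially zero
```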

Elegant view of science: Expectation, Observation and Reflection. What do you expect to see? What do you see? What does it actually mean? Programming is a useful thought amplifier because we can get a computer to do something BUT, before you get to the computer, how do you expect the code to work and how will you know what it’s doing? Verification and validation are important job skills, along with testing, QA and being able to read design documents. Why? Because then you have to be able to Expect, Observe and Reflect. Keyboard skills do not teach you any of this and some programming ‘tests’ are more keyboard skills than anything else.

(If you ever have a chance to see Bob talk, get there. He’s a great speaker and very clever and funny at the same time.)

Oh dear.


Can we reformulate the scientific method and change the way that we explain science to people? What CAN I observe? What DO I observe? How do I know that it’s right? How am I sure? Why should I care? A lot of early work was driven by wonder (Hey, that’s cool) rather than hypothesis driven (which is generally what we’re supposed to be doing). (As a very bad grounded theorist, this appeals.)

How do we produce and evaluate models? Well, we can have an exact solution to an exact model, an exact solution to an approximate model (not real but assessable), an approximate solution to an exact model and an approximate solution to an approximate model. Some of the approximation in the model is the computing itself, with human frailty thrown into the mix.

What does Computational Thinking allow you to do? To build and explore a new world where new things are true and other things are false, because this new universe is interesting to us. “The purpose of computing is insight, not numbers” — R. Hamming; “If you can’t trust the numbers, you won’t get much insight” — R. Panoff. Because the computer is dumb, we have to do more work and more thinking to make up for the fast and accurate moron that does what we order it to do.

“Killing off the big lie” – in every Math class you take, you see something on page 17 showing a graph and an equation introduced with “as you can see from the graph”. Bob’s lament is that he CAN’T see it from the graph and not many other people can either. We just say that but, many times, it’s a big lie. Pattern recognition and characterisation are more important than purely manipulating numbers. (All of this is on the Shodor website.) Make something dynamic and interactive and students can explore, which allows them to think about what happens when they change things – set an expectation, observe and reflect, change conditions and do it again.

Going to teachers, they know that teaching mathematics is frequently teaching information repetitively with false rules so that simple assessment can be carried out. (Every histogram must have 10 bars and so many around the mean, etc) Using computing to host these sorts of problems allows us to change the world and then see what happens. Rather than worry about how long it takes students to produce one histogram on paper, they can make one in an on-line environment and play with it. There are better and worse ways to represent data so let’s use computational resources to allow everyone to do this, even when they’re learning. This all comes down to different models as well as different representations. (There is value to making kids work up a histogram by hand but there are many ways to do this and we can change the question and the support and remove the tedium of having to use paper and pen to do one, when we could use computing to do the dull stuff.)

Bob emphasised the importance of drawing pictures and telling stories, the hand-waving that communicates quite complicated concepts to people. “What’s this?” “I don’t know but here comes a whole herd of them!”

The four things we need for computational thinking are: Quantitative Reasoning, Algorithmic Thinking, Analogic Thinking, and Multi-scale Modelling. Bob showed an interesting example of calculating a known result when you don’t know the elements, by calculating the relative masses of the Earth and Pluto using Google and just typing “mass of the earth / mass of pluto”. Is this right? What is our reason for believing it? You would EXPECT things to be well-known but what do you OBSERVE? Hmm, time to REFLECT. (As the example showed, the Earth’s mass value varies dramatically between sources – Google tells you where it gets the information but a little digging reveals that things don’t align AND the values may change over time. The answer varies depending upon the model you use and how you measure it. All of the small differences add up.)

The next example is the boiling point of Radium, given as 1,140C by Google, but the matching source doesn’t even agree with this! If you can’t trust the numbers then this is yet another source of uncertainty and error in our equations.

Even “=” has different interpretations – F = ma is the statement that force occurs as mass accelerates. In nRT = PV, we are saying that energy is conserved in these reactions. dR/dT = bR – the number of rabbits having bunnies will affect the rate of change of rabbits. No wonder students have trouble with what “s=3” means, on occasion. Speaking of meaning, Bob played this as an audio clip, but I attach the text here:

The missile knows where it is at all times. It knows this because it knows where it isn’t. By subtracting where it is from where it isn’t, or where it isn’t from where it is (whichever is greater), it obtains a difference, or deviation. The guidance subsystem uses deviations to generate corrective commands to drive the missile from a position where it is to a position where it isn’t, and arriving at a position where it wasn’t, it now is. Consequently, the position where it is, is now the position that it wasn’t, and it follows that the position that it was, is now the position that it isn’t.

In the event that the position that it is in is not the position that it wasn’t, the system has acquired a variation, the variation being the difference between where the missile is, and where it wasn’t. If variation is considered to be a significant factor, it too may be corrected by the GEA. However, the missile must also know where it was.

The missile guidance computer scenario works as follows. Because a variation has modified some of the information the missile has obtained, it is not sure just where it is. However, it is sure where it isn’t, within reason, and it knows where it was. It now subtracts where it should be from where it wasn’t, or vice-versa, and by differentiating this from the algebraic sum of where it shouldn’t be, and where it was, it is able to obtain the deviation and its variation, which is called error.

Try reading that out loud! Bob then went on to show us some more models to see how we can experiment with factors (parameters) in dynamic visualisations in a way that allows us to problem solve. So schoolkids can reduce differential equations to simple statements relating change and then experiment – without having to know HOW to solve differential equations (what you have now is what you had then, modified by change). This is model building without starting with programming; it starts with modelling, showing what they can do and then exposing how this approach can be limited – which provides a motivation to learn how to program so you can fix the problems in the model.
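
A minimal sketch of what I think that looks like in practice (my example, not Shodor’s): the rabbit model dR/dt = bR from earlier, reduced to “now = then + change” and stepped forward, no calculus required.

```python
# "What you have now is what you had then, modified by change": a forward
# (Euler) step for the rabbit model dR/dt = b*R.
def simulate_rabbits(r0, birth_rate, dt, steps):
    rabbits = r0
    history = [rabbits]
    for _ in range(steps):
        change = birth_rate * rabbits * dt   # change over one small time step
        rabbits = rabbits + change           # now = then + change
        history.append(rabbits)
    return history

for step, population in enumerate(simulate_rabbits(r0=10, birth_rate=0.5, dt=1, steps=5)):
    print(step, round(population, 1))
```

Shrinking the time step dt brings the model closer to the exact exponential solution, which loops straight back to the “wrong answer plus corrections” discussion above.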

Overall, an excellent talk about an interesting project attacking the core issue of getting students to think in the right way, instead of just getting them to conform to some dry mechanistic programming approaches. The National Computational Science Institute is doing work across the US (if they come and do a workshop, you have to give them a mug and they have a lot of mugs). NCSI are looking for summer workshop hosts so, if you’re interested, you should contact them (not me!) Here’s one of the quotes from the end:

“It was once conjectured that a million monkeys typing on a million typewriters could eventually produce all of the works of Shakespeare. Now, thanks to the Internet, we know that this is not true” (Robert Wilensky, possible attribution)

What would happen if the Internet went away? That’s a big question and, sadly, Bob started to run out of time. Our world runs in parallel so we need to be able to think in parallel as well. Distributed computation requires us to think in different ways and that gets hard, quickly.

Bob wrapped it up by saying that Shodor was a village, a lot of fun and was built upon a lot of funding. Great talk!


Getting it wrong, offensively so. The scales ARE biased.


Mark Guzdial has put out some excellent posts recently on Barbara Ericson’s ongoing work on analysing AP CS exam attempts and results across the US. Unsurprisingly, to those of us who see the classrooms on a day-to-day basis, women are grossly underrepresented. In this interview, Barbara is quoted:

Barbara Ericson, director of computing outreach at Georgia Tech, has made a startling claim. She said not one female student in three states – Mississippi, Montana and Wyoming — took the Advanced Placement exam in computer science last year.

Ericson appeared on Weekend Express to discuss the gender gap and explains why more women aren’t interested in computer science.

Now, I’m not going to rehash all of these posts but I did want to pick on one blogger who took the AP data and then, as far as I’m concerned, not only got it wrong by making some fundamental interpretational errors  but managed to do so in a way that so heavily reeked of privilege that I’m going to call it out.

I hesitate to link to the article on the Huffington Post but it’s only fair that you should read it to see what you think, even though it will generate traffic. The article is called “Memo to Chicken Little: Female Scientists Do Roam Among Us, and Gasp! Some Even Wear Lipstick”. So before we’ve even started, we’ve got one good stereotype going in the title.

Look, I’m not planning to drag apart the whole article but I will pick on one point that the author makes because it really irritates me. Here’s the paragraph:

As a woman who likes science as a bystander but chose not to pursue it professionally, I’ve got a couple of problems with all this handwringing. Mostly, well-intentioned as it is, it implies that women need “help” choosing a field of study. High school girls are exposed to exactly the same science and math courses they need to graduate as boys are, but in the eyes of the handwringers, girls are either too shallow or simple to choose for themselves, or need to be socially engineered into the correct balance of male vs. female, regardless of their choices. I appreciate your concern, but frankly, it’s pretty demeaning.

Frankly, I’ve never seen a more disingenuous interpretation of attempts to undo and reverse the systematic anti-female bias that is built into our culture. I’ve never seen anyone who is trying to address this problem directly or indirectly label girls as shallow or too simple to choose – this is a very unpleasant strawman, constructed to make those of us who are trying to address a bias look like we’re the ones with the attitude problem. We don’t need to socially engineer girls into the correct balance, we need to engineer society to restore the balance and articles like this, which make it appear that women are deliberately choosing to avoid STEM, are unwelcome, unnecessary and unfair to the many young women who are being told that the way that our society works is the way that it should work.

Need I remind people of stereotype threat? The PNAS study that shows that women are as automatically likely to harshly judge women and lessen their rewards as their male colleagues? Looking at the AP attendance and performance doesn’t show equality, it shows the outcome of a systematically biased system.

To say that “High school girls are exposed to exactly the same science and math courses they need to graduate as boys are” is a difficult statement. Yes, women rack up roughly the same number of course credits but on the critical measurement of whether they choose to go on and pursue a profession? No, something breaks here. The AP test is a great measure because it is an Advanced Placement exam and your intention is to use this to go further.  Is there clear evidence of far fewer women, as a percentage, going on from high school to college in STEM despite scoring the same kinds of grades? Yes. Is there evidence that some of these problems (anxiety about maths, for example) can start with perceptions of teachers in primary school? Yes. Is there a problem?

Yes.

And the question is always, if your previous exposure has not been fair, then is it reasonable to pick an arbitrary level of course that would be fair to people who haven’t been discriminated against? For years, racism was justified by culturally-based testing that could not be performed at the same level by people outside the culture – which was then used to restrict their access to the culture.

To me, that statement about exposure summarises everything that is wrong with glib arguments about constructing equal opportunity. If we’re going for a big job and there’s a corporate ‘interview dinner’ for 20 people, then we’ll all be on our best behaviour at dinner. For someone to lose the job because nobody showed them how to use a finger bowl or because their family uses a knife in the ‘other’ way, is to provide an equal exposure in the present that is blatantly unfair because it doesn’t take into account the redress of previous bias to bring people up to the point where it is really equal opportunity.

I think history supports me in the statement that we have been proved wrong every other time we’ve tried to segregate human ability and talent based on fixed physical abilities that were assigned at birth. Isn’t it about time we started investing all of our effort into producing truly equal opportunity for everyone?


Start with good grapes, don’t mess them up.

“Make no little plans; they have no magic to stir men’s blood and probably themselves will not be realised.” Daniel Burnham

I was watching a film today called “Antiviral”, directed by Brandon Cronenberg, and one of the themes addressed was what we choose to do with technology. Celebrity cell reproduction is the theme of the movie and it is quite bizarre to see a technology that could be so useful (in building new organs and prolonging life) being used to allow people to have the same colds that their idols do. (Because of the rating of this blog, I must state that Antiviral is an adult film and there are themes that I will not discuss here.)

We have many technologies that are powerful and we are developing more of them daily. We have developed the ability to print human organs (in a limited fashion, although 40 days for a liver is another month of life for someone) and we are in the foothills of printing food. Our automated and autonomous systems become more capable and more effective on a daily basis, although Amazon’s drone network won’t be buzzing your house tomorrow.

One of the most profound reasons for education is the requirement to ensure that the operators of powerful things are reasoning, thinking, informed human beings. As humans, we tend to build amplification engines, it’s just what we do, but in so many cases a good intention is then amplified to a great one, and a malign intention can be amplified to a massive and evil result.

Our production processes for food and drink often take a similar form. To make good bread, you grow good wheat in good soil and then you use good yeast, clean conditions and control the oven. You start with good ingredients and you use technology and knowledge to make it better – or to transform it without damage. The same is true of wine. I can make good wine from just about anything but if you want me to make great wine? I have to start with good grapes and then not mess them up!

Good grapes!


Our technologies are, however, able to go either way. I could burn the bread, cook the yeast, freeze the wine, just as easily if I was poorly trained or if I had malicious intent. Education is not just about training, it’s about preparation for the world in which our students will live. This world is always changing but we have to move beyond thinking about “Driver’s Ed” as a social duty and think about “Resource Ed”, “The Ethics of Cloning” (for example) and all sorts of difficult and challenging issues when we try and teach. We don’t have to present a given viewpoint, by any means, but to ignore the debate and the atmosphere in which we (and I in particular) are training young tertiary students would be to do them a disservice.

This starts young. The sooner we can start trying to grow good students and the sooner that we make our educational systems transform these into wonderful people, the better off we’ll be. The least I would hope for, for any of my students, is that they will always at least think briefly of some of the issues before they do something. They may still choose to be malign, for whatever reason, but let it be then a choice and not from ignorance – but also, let the malign be few and far between and a dying breed!


The Bad Experience That Stays With You and the Legendary Bruce Springsteen.

I was talking with a friend of mine and we were discussing perceptions of maths and computing (yeah, I’m like this off duty, too) and she felt that she was bad at Maths. I commented that this was often because  of some previous experience in school and she nodded and told me this story, which she’s given me permission to share with you now. (My paraphrasing but in her voice)

“When I was five, we got to this point in Math where I didn’t follow what was going on. We got to this section and it just didn’t make any sense to me. The teacher gave us some homework to do and I looked at it and I couldn’t do it but I didn’t want to hand in nothing. So I scrunched it up and put it in the bin. When the teacher asked for it back, I told her that I didn’t have it.

It turns out that the teacher had seen me put it in the bin and so she punished me. And I’ve never thought of myself as good at math since.”

Wow. I’m hard-pressed to think of a better way to give someone a complex about a subject. Ok, yes, my friend did lie to the teacher about not having the work and, yes, it would have been better if she’d approached the teacher to ask for help – but given what played out, I’m not really sure how much it would have changed what happened. And, before we get too carried away, she was five.

Now this is all some (but not that many) years ago and a lot of things have changed in teaching, but all of us who stand up and call ourselves educators could do worse than remember Bruce Springsteen’s approach to concerts. Bruce plays a lot of concerts but, at each one, he tries to give his best because a lot of the people in the audience are going to their first and only Springsteen concert. It can be really hard to deal with activities that are disruptive, disobedient and possibly deliberately so, but they may be masking fear, uncertainty and a genuine desire for the problem to go away because someone is overwhelmed. Whatever we get paid, that’s really one of the things we get paid to do.

We’re human. We screw up. We get tired. But unless we’re thinking about and trying to give that Springsteen moment to every student, then we’re setting ourselves up to give a negative example. Somewhere down the line, someone’s going to find their life harder because of that – it may be us in the next week, it may be another teacher next year, but it will always be the student.

Bad experiences hang around for years. It would be great if there were fewer of them. Be awesome. Be Springsteen.


EMBRACE YOUR AWESOMENESS! Don’t make me come over and sing “Blinded by the Light!”


Enemies, Friends and Frenemies: Distance, Categorisation and Fun.

As Mario Puzo and Francis Ford Coppola wrote in “The Godfather Part II”:

… keep your friends close but your enemies closer.

(I bet you thought that was Sun Tzu, the author of “The Art of War”. So did I but this movie is the first use.)

I was thinking about this the other day and it occurred to me that this is actually a simple modelling problem. Can I build a model which will show the space around me and where I would expect to find friends and enemies? Of course, you might be wondering “why would you do this?” Well, mostly because it’s a little bit silly and it’s a way of thinking that has some fun attached to it. When I ask students to build models of the real world, where they think about how they would represent all of the important aspects of the problem and how they would simulate the important behaviours and actions seen with it, I often give them mathematical or engineering applications. So why not something a little more whimsical?

From looking at the quote, we would assume that there is some distance around us (let’s call it a circle) where we find everyone when they come up to talk to us, friend or foe, and let’s also assume that the elements “close” and “closer” refer to how close we let them get in conversation. (Other interpretations would have us living in a neighbourhood of people who hate us, while we have to drive to a different street to sit down for dinner with people who like us.) So all of our friends and enemies are in this circle, but enemies will be closer. That looks like this:


I have more friends than enemies because I’m popular!

So now we have a visual model of what is going on and, if we wanted to, we could build a simple program that says something like “if you’re in this zone, then you’re an enemy, but if you’re in that zone then you’re a friend”, where we define the zones in terms of nested circular regions. But, as we know, a friend always has your back and enemies stab you in the back, so now we need to add something to that “ME” in the middle – a notion of which way I’m facing – and make sure that I can always see my enemies. Let’s make the direction I’m looking an arrow. (If I could draw better, I’d put glasses on the front. If you’re doing this in the classroom, an actual 3D dummy head shows position really well.) That looks like this:


Same numbers but now I can keep an eye on those enemies!

Now our program has to keep track of which way we’re facing and then it checks the zones, on the understanding that either we’re going to arrange things to turn around if an enemy is behind us, or we can somehow get our enemies to move (possibly by asking nicely). This kind of exercise can easily be carried out by students and it raises all sorts of questions. Do I need all of my enemies to be closer than my friends or is it ok if the closest person to me is an enemy? What happens if my enemies are spread out in a triangle around me? If they won’t move, do I need to keep rotating to keep an eye on them or is it ok if I stand so that they get as much of my back as they can? What is an acceptable solution to this problem? You might be surprised how much variation students will suggest in possible solutions, as they tell you what makes perfect sense to them for this problem.
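
If you wanted to turn that into code, a minimal sketch might look like this (entirely my own toy, with made-up radii and names): an inner circle for enemies, an outer ring for friends, and a check of whether the person is in front of me or behind me based on which way I’m facing.

```python
import math

def classify(me, facing_degrees, person, enemy_radius=2.0, friend_radius=5.0):
    """Toy classifier for 'friends close, enemies closer': the inner disc is
    enemy territory, the ring outside it is for friends, and we also report
    whether the person is in front of us or behind us."""
    dx, dy = person[0] - me[0], person[1] - me[1]
    distance = math.hypot(dx, dy)
    if distance <= enemy_radius:
        zone = "enemy"
    elif distance <= friend_radius:
        zone = "friend"
    else:
        zone = "stranger"            # outside both circles
    angle_to_person = math.degrees(math.atan2(dy, dx))
    offset = abs((angle_to_person - facing_degrees + 180) % 360 - 180)
    side = "in front" if offset <= 90 else "behind me"
    return f"{zone}, {side}"

me = (0, 0)
print(classify(me, facing_degrees=0, person=(1, 0)))     # enemy, in front
print(classify(me, facing_degrees=0, person=(-4, 0)))    # friend, behind me
print(classify(me, facing_degrees=180, person=(-4, 0)))  # friend, in front
```

The questions above map directly onto the code: students can argue about the radii, about whether “stranger” should exist at all, and about what the program should do when an enemy ends up behind us.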

When we do this kind of thing with real problems, we are trying to specify the problem to a degree that we remove all of the unasked questions that would otherwise make the problem ambiguous. Of course, even the best specification can stumble if you introduce new information. Some of you will have heard of the term ‘frenemy’, which apparently:

can refer to either an enemy pretending to be a friend or someone who really is a friend but is also a rival (from Wikipedia and around since 1953, amazingly!)

What happens if frenemies come into the mix? Well, in either case, we probably want to treat them like an enemy. If they’re an enemy pretending to be a friend, and we know this, then we don’t turn our back on them and, even in academia, it’s never all that wise to turn your back on a rival, either. (Duelling citations at dawn can be messy.) In terms of our simple model, we can deal with extending the model because we clearly understand what the important aspects are of this very simple situation. It would get trickier if frenemies weren’t clearly enemies and we would have to add more rules to our model to deal with this new group.

This can be played out with students of a variety of ages, across a variety of curricula, with materials as simple as a board, a marker and some checkers. Yet this is a powerful way to explain models, specification and improvement, without having to write a single line of actual computer code or talk about mathematics or bridges! I hope you found it useful.


“Begrudgingly honest because we might be surveilled?”

A drawing of a prison built as a panopticon with all cells visible from the centre.

The Plans of the Panopticon

O’Reilly Community are hosting an online conference on “Data, Crime, and Conflict”, which I’m attending at the rather unhealthy hour of 3:30am on the morning of January the 8th (it’s better for you if you’re in the UK or US). Here’s an extract of the text:

A world of sensors gives us almost complete surveillance. Every mobile device tracks moves, forming a digital alibi or new evidence for the prosecution. And with the right data, predictions look frighteningly like guilt.

How does a data-driven, connected world deal with crime, conflict, and peacekeeping? Will we be prisoners in a global Panopticon, begrudgingly honest because we might be surveilled? Or will total transparency even the balance between the enforcer and the citizen?

Join a lineup of thinkers and technologists for this free online event as we look at the ways data is shaping how we police ourselves, from technological innovations to ethical dilemmas.

 I’ve been interested in the possible role and expansion (and the implications) of the panopticon since first reading about it. I even wrote a short story once to explore a global society where the removal of privacy had not been the trip down into dystopia that we always expect it to be. (This doesn’t mean that I believe that it is a panacea – I just like writing stories!) I’m looking forward to seeing what the speakers have to say. They claim that there are limited places but I managed to sign up today so it’s probably not too late.

 


Three Stories: #2 Why I Don’t Make New Year’s Resolutions

This is a story I’ve never told anyone before, but I hope that it will help to explain why I think many students struggle with making solid change in their academic and life practices. They focus on endpoints and set deadlines reactively, rather than focusing on process and finding a good time to change. Let me explain this in the narrative.

When I was younger, I was quite a bit heavier than I am now – by about 30% of my body mass. As I got older, this became more of a problem and my weight went up and down quite a lot as I tried to get a regular regime of exercise into my life and cut back on my eating. Unfortunately, when I get stressed, I tend to eat, and one of the things I used to get stressed about was … losing weight. It’s a common, vicious, circle. Anyway, one year, after a Christmas where I had found it difficult to fit into my ‘good’ clothes and just felt overstuffed and too hot most of the time, I decided that enough was enough. I would make a New Year’s Resolution to lose weight. Seriously. (As background, Christmas in Australia is in Summer, so we sing songs about snow and eat roast turkey while sitting around in 90-100F/32-38C heat – so if your clothes are squeezy, boy, are you going to feel it.)

I can’t remember the details of the New Year’s Eve party but I do remember waking up the next day and thinking “Ok, so now I lose weight”. But there were some problems.

  1. It was still very hot.
  2. Everything was closed because it was a public holiday.
  3. I was still stuffed from Christmas/NY indulgence.
  4. I was hungover.
  5. I had no actual plan.
  6. I hadn’t actually taken any steps towards either dietary change or exercise that I could implement.

So, instead of getting out of bed and doing anything healthy, I thought “Oh, ok, I’ll start tomorrow.” because it was just about impossible, to my mind, to get things started on that day. I made some plans as to what I’d do the next day and thought “Ok, that’s what I’ll do tomorrow.”

But then a friend called on the 2nd and they were in town so we caught up. Then I was back at work and it was really busy.

And… and… and…

When I finally did lose the weight, many years later, and get to a more stable point, it wasn’t through making a resolution – it was through developing a clear plan to achieve a goal. I set out to work up to walking 10 miles as loops around my block. Then, when I achieved that, I assessed myself and realised that I could replace that with running. So then, every time I went out, I ran a little at the start and walked the rest. Finally I was (slowly) running the whole distance. Years later, a couple of bad falls have stopped me from long-distance running, but I have three marathons and numerous halves under my belt.

Why didn’t it work before? Well, lack of preparation is always bad, but also because New Year’s is one of the worst possible times to try and make a massive change unless you’ve actually prepared for it and the timing works for you. Think about it:

  1. New Year’s Eve is a highly social activity for many people, as are the days after – any resolutions involving food, alcohol, sex or tobacco are going to be much harder to keep.
  2. It’s high stakes, especially if you make your resolution public. Suddenly, failure is looming over you, other people may be trying to force you into keeping your resolution – and some people will actively be trying to tempt you out of it.
  3. There’s just a lot going on around this time for most people and it’s not a time when you have lots of extra headspace. If your brain is already buzzing, making big change will make it harder.
  4. Setting your resolution as a goal is not the same as setting a strategy. This is really important if you fall off the wagon, so to speak. If you are trying to give up smoking but grab a quick cigarette on the 3rd, then your resolution is shot. If you have a plan to cut down, allowing for the occasional divergence, then you can be human without thinking “Oh, now I have to abandon the whole project.”
  5. New Year’s Resolutions tend to be tip of the mind things – if something had been really bothering you for months, why wait until NYE to do it? This means that you’re far less likely to think everything out.

After thinking over this for quite a long time, I’ve learned a great deal about setting goals for important changes and you have to try to make these changes:

  1. When you have a good plan as to what you’re trying to achieve or what you’re just trying to do as a regular practice.
  2. When you have everything you need to make it work.
  3. When you have enough headspace to think it through.
  4. When you won’t beat yourself up too badly if it goes wrong.

So have a Happy New Year and be gentle on yourself for a few days. If you really want to change something in your life, plan for it properly and you stand a much better chance of success. Don’t wait until a high stakes deadline to try and force change on yourself – it probably won’t work.



Three Stories: #1 What I Learned from Failure

It’s considered bad form to start ‘business stories’ with “Once upon a time” but there’s a strong edge of bard to my nature and it’s the end of a long year. (Let’s be generous.) So, are you sitting comfortably? (Ok, I’ll spare you ‘Once…’)

Many years ago, I went to university, after a relatively undistinguished career at school. I got into a course that was not my first preference but, rather than wonder why I had not set the world on fire academically, I assumed that it was because I hadn’t really tried. The companion to this sentiment is that I could achieve whatever I wanted academically, as long as I really wanted it and actually tried. This concept, that I could achieve anything academic I wanted if I tried, got a fairly good workout over the next few years, despite evidence that I was heading in a downward spiral academically. What I became good at was barely avoiding failure, rather than excelling, and while this is a skill, it’s a dangerous line to try and walk. If you’re genuinely aiming to excel, which includes taking the requisite planning steps and time commitment you need, and you fall short then you will probably still do quite well and pass. If you are focused lower down, then missing that bar means failure.

What I didn’t realise at the time was that I was almost doomed to fail when I tried to set my own interpretation of what constituted the right level of effort and participation. If you are a student who has a good knowledge of the whole course then you will have a pretty good idea of how you have answered questions in exams, what is required for assignments and, if you wanted to, you could choose to answer part of a question and have some idea of how many marks are involved. If you don’t know the material in detail, then your perception of your own performance is going to be heavily filtered by your own lack of knowledge. (A reminder of a previous post on this for those who are new here or are vague post-Christmas.)

After some years out in the workforce, and coming back to do postgraduate study, I finally learned something from what should have been quite clear to me, if it hadn’t been hidden by two things: my firm conviction that I could change things immediately if I wished to, and my completely incorrect assumption that my own performance in a subject could be assessed by someone with my level of knowledge!

I became a good student because I finally worked out three key things (with a lot of help and support from my teachers and my friends):

  1. There is no “lower threshold” of knowledge that allows you to predict if you’re going to pass. If you have enough grasp of the course to know how much you need to do to pass, then you probably know enough to do much better than that! (Terry Pratchett covers this beautifully in a book called “Moving Pictures”, where a student has to know the course better than the teachers to maintain a very specific grade over the years.)
  2. Telling yourself that you “could have done better” is almost completely useless unless you decide to do better and put a plan in place to achieve that. This excuse gets you off the hook but, unless it’s teamed with remedial action, it’s just an excuse.
  3. Setting yourself up for failure is just as effective as setting yourself up for success, but it can be far subtler and comprised of many small actions that you don’t take, rather than a few actions that you do take.

Knowing what is going wrong (or thinking you do) doesn’t change anything unless you actively try to change it. It’s a simple truth that, I hope, is a useful and interesting story.