CodeSpells! A Kickstarter to make a difference. @sesperu @codespells #codespells

I first met Sarah Esper a few years ago when she was demonstrating the earlier work in her PhD project with Stephen Foster on CodeSpells, a game-based project to start kids coding. In a pretty enjoyable fantasy game environment, you’d code up spells to make things happen and, along the way, learn a lot about coding. Their team has grown and things have come a long way since then for CodeSpells, and they’re trying to take it from its research roots into something that can be used to teach coding on a much larger scale. They now have a Kickstarter out, which I’m backing (full disclosure), to get the funds they need to take things to that next level.

Teaching kids to code is hard. Teaching adults to code can be harder. There’s a big divide these days between the roles of user and creator in the computing world and, while literacy in using computers is growing, we still have a long way to go to get more people creating. The future will be programmed and programming is, honestly, a new form of literacy that our children will benefit from.

If you’re one of my readers who likes the idea of new approaches to education, check this out. If you’re an old-timey Multi-User Dungeon/Shared Hallucination person like me, this is the creative stuff we used to be able to do on-line, but for everyone and with cool graphics in a multi-player setting. If you have kids, and you like the idea of them participating fully in the digital future, please check this out.

To borrow heavily from their page, 60% of jobs in science, technology, engineering and maths are computing jobs, but AP Computer Science is taught at only 5% of schools. We have a giant shortfall of software people coming up, and the crash, when it comes, will be ugly: all of the nice things we have become used to on the computing side will slow down and, in some cases, pretty much stop. Invest in the future!

I have no connection to the project apart from being a huge supporter of Sarah’s drive and vision and someone who would really like to see this project succeed. Please go and check it out!

The Earth Magic Sphere image, from the Kickstarter page.


Funding Education: Trust me, you want to. #stem #education #csed

Some very serious changes to the Australian Higher Education system are going to be discussed starting from October 28th: deregulating the University fee structure, which will most likely lead to increased fees and interest rates, and hence to much greater student debt. (Yes, there are some positives in there, but it’s hard to get away from the massive increase in student debt.) While some university representative organisations are in favour of this, with amendments and protections for some students, I am yet to be convinced that deregulating the Universities will do much while we labour under the idea that students will move around based on selected specialisations, the amount of “life lessons” they will accumulate, or their perception of value for money. We have no idea what price sensitivity is going to do to the Australian market. We do know what happened in the UK when they deregulated fees:

‘Professor Byrne agreed, but said fee deregulation would have to be “carefully thought through so as to avoid what happened in the UK when they did it there – initially, when the fees were uncapped, all the universities just charged the maximum amount. It’s been corrected now, but that was a complete waste of time because all it did was transfer university costing from the public to the private sphere.”’

But, don’t worry, Professor Byrne doesn’t think this will lead to a two-tier system, split between wealthy universities and less-well-off regionals:

“I’d call it an appropriately differentiated system, with any number of levels within it.”

We have four classes! That must be better than have/have not. That’s… wait…

The core of this argument is that, somehow, it is not the role of Universities to provide the same thing as every other university, which is a slashing of services more usually (coyly) referred to as “playing to your strengths”. What this really is, however, is geographical and social entrapment. You weren’t born in a city, you don’t want to be saddled with huge debt or your school wasn’t great so you didn’t get the marks to go to a “full” University? Well, you can go to a regional University which, playing to its strengths, offers you a range of courses that have been market-determined to be suitable. But it will be price competitive! This is great, because after 2-3 generations of this, the people near the regional University will not have the degree access to make the money to work anywhere other than their region or to go to a different University. And, of course, we have never seen a monopolised, deregulated market charging excessive fees when their consumer suffers from a lack of mobility…

There are some quite valid questions as to why we need to duplicate teaching capabilities in the same state, until we look at the Australian student, who tends to go to University near where they live, rather than moving into residential accommodation on campus, and, when you live in a city that spans 70km from North to South as Adelaide does, it suddenly becomes more evident why there might be repeated schools in the Universities that span this geographical divide. When you live in Sydney, where the commute can be diabolical and the city is highly divided by socioeconomic grouping, it becomes even more important. Duplication in Australian Universities is not redundancy, it’s equality.

The other minor thing to remember is that the word University comes from the Latin word for whole. The entire thing about a University is that it is most definitely not a vocational training college, focussed on one or two things. It is defined by, and gains strength from, its diversity and the nature of study and research that comes together in a place that isn’t quite like any other. We are at a point in history when the world is changing so quickly that predicting the jobs of the next 20 years is much harder, especially if we solve some key problems in robotics. Entire jobs, and types of job, will disappear almost overnight – if we have optimised our Universities to play to their strengths rather than keeping their ability to be agile and forward-looking, we will pay for it tomorrow. And we will pay dearly for it.

Education can be a challenging thing for some people to justify funding because you measure the money going in and you can’t easily measure the money that comes back to you. But we get so much back from an educated populace. Safety on the road: education. Safety in the skies: education. Art, literature, music, film: a lot of education. The Internet, your phone, your computer: education, Universities, progressive research funding and CSIRO.

Did you like a book recently? That was edited by someone who most likely had a degree that many wouldn’t consider worth funding. Just because it’s not obvious what people do with their degrees, and just because some jobs demand degrees when they don’t need them, it doesn’t mean that we need to cut down on the number of degrees or treat people who do degrees with a less directly vocational pathway as if they are parasites (bad) or mad (worse). Do we need to change some things about our society in terms of perceptions of worth and value? Yes – absolutely, yes. But let’s not blame education for how it gets mutated and used. And, please, just because we don’t understand someone’s job, let us never fall into the trap of thinking it’s easy or trivial.

The people who developed the first plane had never flown. The people who developed WiFi had never used a laptop. The people who developed the iPhone had never used one before. But they were educated and able to solve challenges using a combination of technical and non-technical knowledge. Steve Jobs may never have finished college (although he attributed the Mac’s type handling to time he spent in courses there) but he employed thousands of people who did – as did Bill Gates. As do all of the mining companies if they actually want to find ore bodies and attack them properly.

Education will define what Australia is for the rest of this century and for every century afterwards. To argue that we have to cut funding and force more debt on to students is to deny education to more Australians and, ultimately, to very much head towards a permanently divided Australia.

You might think, well, I’m ok, why should I worry? Ignoring any altruistic issues, what do you think an undereducated, effectively underclass, labour force is going to do when all of their jobs disappear? If there are still any History departments left, then you might want to look into the Luddites and the French Revolution. You can choose to do this for higher purposes, or you can do it for yourself, because education will help us all to adjust to an uncertain future and, whether you think so or not, we probably need the Universities running at full speed as cradles of research and ideas, working with industry to be as creative as possible to solve the problems that you will only read about in tomorrow’s paper.

Rage Against the Machine


I have a new book out: A Guide to Teaching Puzzle-based learning. #puzzlebasedlearning #education

Time for some pretty shameless self-promotion. Feel free to stop reading if that will bother you.

My colleagues, Ed Meyer from BWU, Raja Sooriamurthi from CMU and Zbyszek Michalewicz (emeritus from my own institution), and I have just released a new book, called “A Guide to Teaching Puzzle-based Learning.” What a labour of love this has been and, better yet, we are still talking to each other. In fact, we’re planning some follow-up events next year to run some workshops around the book, so it’ll be nice to work with the team again.

(How to get it? This is the link to Springer, paperback and e-Book. This is the link to Amazon, paperback only I believe.)

Here’s a slightly sleep-deprived and jet-lagged picture of me holding the book as part of my “wow, it got published” euphoria!

See how happy I am? And also so out of it.

The book is a resource for the teacher, written for teachers from primary to tertiary, and it should be quite approachable for the home-school environment as well. We spent a lot of time making it approachable, sharing tips for students and teachers alike, and trying to get all of our knowledge about how to teach well with puzzles down into the one volume. I think we pretty much succeeded. I’ve field-tested the material here at Universities, schools and businesses, with very good results across the board. We build on a good foundation and we love sound, practical advice. This is, very much, a book for the teaching coalface.

It’s great to finally have it all done and printed. The Springer team were really helpful and we’ve had a lot of patience from our commissioning editors as we discussed, argued and discussed again some of the best ways to put things into the written form. I can’t quite believe that we managed to get 350 pages down and done, even with all of the time that we had.

If you or your institution has a connection to SpringerLink then you can read it online as part of your subscription. Otherwise, if you’re keen, feel free to check out the preview on the home page and then you may find that there are a variety of prices available on the Web. I know how tight budgets are at the moment so, if you do feel like buying, please buy it at the best price for you. I’ve already had friends and colleagues ask what benefits me the most and the simple answer is “if people read it and find it useful”.

To end this disgraceful sales pitch, we’re actually quite happy to run workshops and the like, although we are currently split over two countries (sometimes three or even four), so some notice is always welcome.

That’s it, no more self-promotion to this extent until the next book!

 


Talking Ethics with the Terminator: Using Existing Student Experience to Drive Discussion

One of the big focuses at our University is the Small-Group Discovery Experience, an initiative from our overall strategy document, the Beacon of Enlightenment. You can read all of the details here, but the essence is that a small group of students and an experienced research academic meet regularly to start the students down the path of research, picking up skills in an active learning environment. In our school, I’ve run it twice as part of the professional ethics program. This second time around, I think it’s worth sharing what we did, as it seems to be working well.

Why ethics? Well, this is first year and it’s not all that easy to do research into Computing if you don’t have much foundation, but professional skills are part of our degree program so we looked at an exploration of ethics to build a foundation. We cover ethics in more detail in second and third year but it’s basically a quick “and this is ethics” lecture in first year that doesn’t give our students much room to explore the detail and, like many of the more intellectual topics we deal with, ethical understanding comes from contemplation and discussion – unless we just want to try to jam a badly fitting moral compass on to everyone and be done.

Ethical issues present the best way to talk about the area as an introduction, as much of the formal terminology can be quite intimidating for students who regard themselves as CS majors or Engineers first and may never have contemplated their role as moral philosophers. But the real-world situations where ethical practice is most illuminating are often quite depressing and, from experience, sessions on medical ethics and similar rapidly close down discussion because they can be very upsetting. We took a different approach.

The essence of any good narrative is the tension that is generated from the conflict it contains and, in stories that revolve around artificial intelligence, robots and computers, this tension often comes from what are fundamentally ethical issues: the machine kills, the computer goes mad, the AI takes over the world. We decided to ask the students to find two works of fiction, from movies, TV shows, books and games, to look into the ethical situations contained in anything involving computers, AI and robots. Then we provided them with a short suggested list of 20 books and 20 movies to start from and let them go. Further guidance asked them to look into the active ethical agents in the story – who was doing what and what were the ethical issues?

I saw the students after they had submitted their two short paragraphs on this and I was absolutely blown out of the water by their informed, passionate and, above all, thoughtful answers to the questions. Debate kept breaking out on subtle points. The potted summary of ethics that I had given them (follow the rules, aim for good outcomes or be a good person – sorry, ethicists) provided enough detail for the students to identify issues in rule-based approaches, utilitarianism and virtue ethics, but I could then introduce terms to label what they had already done, as they were thinking about them.

I had 13 sessions with a total of 250 students and it was the most enjoyable teaching experience I’ve had all year. As follow-up, I asked the students to enter all of their thoughts on their entities of choice by rating their autonomy (freedom to act), responsibility (how much we could hold them to account) and perceived humanity, using a couple of examples to motivate a ranking system of 0-5. A toddler is completely free to act (5) and completely human (5) but can’t really be held responsible for much (0-1 depending on the toddler). An aircraft autopilot has no humanity or responsibility but it is completely autonomous when actually flying the plane – although it will disengage when things get too hard. A soldier obeying orders has an autonomy around 5. Keanu Reeves in the Matrix has a humanity of 4. At best.
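As an illustration of the rating scheme, here is a small sketch of my own (in Python, which was not part of the exercise; the class used a shared database, and the scores below simply echo the examples above, with Neo's autonomy and responsibility invented for the sake of the example):

```python
# Hypothetical sketch: recording the three 0-5 ratings discussed above.
# The data structure is illustrative only, not the class's actual database.

RATING_SCALE = range(0, 6)  # each dimension is rated 0-5

def rate(entity, autonomy, responsibility, humanity):
    """Bundle one entity's ratings, checking each is on the 0-5 scale."""
    scores = {"autonomy": autonomy,
              "responsibility": responsibility,
              "humanity": humanity}
    for dimension, value in scores.items():
        if value not in RATING_SCALE:
            raise ValueError(f"{dimension} must be 0-5, got {value}")
    return {"entity": entity, **scores}

ratings = [
    rate("toddler", autonomy=5, responsibility=0, humanity=5),
    rate("aircraft autopilot", autonomy=5, responsibility=0, humanity=0),
    rate("Neo (The Matrix)", autonomy=4, responsibility=4, humanity=4),
]

# A simple discussion starter: sort by how "human" each entity seems.
by_humanity = sorted(ratings, key=lambda r: r["humanity"], reverse=True)
print([r["entity"] for r in by_humanity])
# → ['toddler', 'Neo (The Matrix)', 'aircraft autopilot']
```

Even a toy structure like this makes the discussion concrete: students argue about individual scores, and the disagreements are where the ethics teaching happens.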

They’ve now filled the database with their thoughts and next week we’re going to discuss all of their 0-5 ratings in small groups, then place the entities on a giant timeline of achievements in literature, technology and AI, also listing major events such as wars, to see if we can explain why authors presented the work that they did. When did we start to regard machines as potentially human, and what did the world seem like to the people who were there?

This was a lot of fun and, while it’s taken a little bit of setting up, this framework works well because students have already seen quite a lot; the trick is just getting them to think about it with an ethical lens. Highly recommended.

What do you think, Arnold? (Image from moviequotes.me)


ITiCSE 2014, Day 3, Final Session, “CS Ed Research”, #ITiCSE2014 #ITiCSE

The first paper in the final session was “Effect of a 2-week Scratch Intervention in CS1 on Learners with Varying Prior Knowledge”, presented by Shitanshu Mirha, from IIT Bombay. The CS1 course context is a single programming course for all freshman engineering students, so it has to work for both novice and advanced learners. It’s the usual problem: novices get daunted and advanced learners get bored. (We had this problem in the past.) The proposed solution is to use Scratch, because it’s low-floor (easy to get started), high-ceiling (can build complex projects) and wide-walls (applies to a wide variety of topics and themes). Thus it should work for both novice and advanced learners.

The theoretical underpinning is that novice learners reach cognitive overload while trying to learn techniques for programming and a language at the same time. One way to reduce cognitive load is to use visual programming environments such as Scratch. For advanced learners, Scratch can provide a sufficiently challenging set of learning material. From the perspective of Flow theory, students need to reach equilibrium between challenge level and perceived skill.

The research goal was to investigate the impact of a two-week intervention in a college course that will transition to C++. What would novices learn in terms of concepts and C++ transition? What would advanced students learn? What was the overall impact on students?

The cohort was 450 students, no CS majors, with a variety of advanced and novice learners, with a course objective of teaching programming in C++ across 14 weeks. The Scratch intervention took place over the first four weeks in terms of teaching and assessment. Novice scaffolding was achieved by ramping up over the teaching time. Engagement for advanced learners was achieved by starting the project early (second week). Students were assessed by quizzes, midterms and project production, with very high quality projects being demonstrated as Hall of Fame projects.

Students were also asked to generate questions on what they learned and these could be used for other students to practice with. A survey was given to determine student perception of usefulness of the Scratch approach.

The results for Novices were presented. While the Novices were able to catch up in basic Scratch comprehension (predicting output and debugging code), this didn’t translate into writing code in Scratch or debugging programs in C++. For question generation, Novices were comparable to advanced learners in the number of questions generated on sequences, conditionals and data. For threads, events and operators, Novices generated more questions – although I’m not sure I see the link that demonstrates that they definitely understood the material. Unsurprisingly, given the code-writing results, Novices were weaker in loops and similar programming constructs. More than 53% of Novices thought the Scratch framing was useful.

In terms of Advanced learner engagement, there were more Advanced projects generated. Unsurprisingly, Advanced projects were far more complicated. (I missed something about Most-Loved projects here. Clarification in the comments please!) I don’t really see how this measures engagement – it may just be measuring the greater experience.

Summarising, Scratch seemed to help Novices but not with actual coding or working with C++, but it was useful for basic concepts. The author claims that the larger complexity of Advanced user projects shows increased engagement but I don’t believe that they’ve presented enough here to show that. The sting in the tail is that the Scratch intervention did not help the Novices catch up to the Advanced users for the type of programming questions that they would see in the exam – hence, you really have to question its utility.

The next paper is “Enhancing Syntax Error Messages Appears Ineffectual” presented by Paul Denny, from The University of Auckland. Apparently we could only have one of Paul or Andrew Luxton-Reilly, so it would be churlish to say anything other than hooray for Paul! (Those in the room will understand this. Sorry we missed you, Andrew! Catch up soon.) Paul described this as the least impressive title in the conference but that’s just what science is sometimes.

Java is the teaching language at Auckland, about to switch to Python, which means no fancy IDEs like Scratch or Greenfoot. Paul started by discussing a Java statement with a syntax error in it, which gave two different (but equally unhelpful) error messages for the same error.

if (a < 0) || (a > 100)
  error = true;

// The error is on the top line: the whole condition needs a surrounding
// pair of parentheses, i.e. if ((a < 0) || (a > 100)).
// One compiler reports that a ';' is required at the ||, which doesn't
// solve the right problem. The other compiler says that another 'if'
// statement is required at the ||. Both of these messages are unhelpful,
// as well as being wrong: neither describes what we actually intended.

The conclusion (given early) is simple: enhancing the error messages, in a controlled empirical study, had no significant effect. This work came from thinking about an early programming exercise that was quite straightforward but seemed to cause students a lot of grief. For those who don’t know, programs won’t run until we fix the structural problems in how we put the program elements together: syntax errors have to be fixed before the program will run. Until the program runs, we get no useful feedback, just (often cryptic) error messages from the compiler. Students will give up if they don’t make progress in a reasonable interval, and a lack of feedback is very disheartening.

The hypothesis was that providing more useful error messages for syntax errors would “help” users, help being hard to quantify. These messages should be:

  • useful: simple language, informal language and targeting errors that are common in practice. Also providing example code to guide students.
  • helpful: reduce the total number of non-compiling submissions, reduce the number of consecutive non-compiling submissions AND reduce the number of attempts needed to resolve a specific error.

In related work, Kummerfeld and Kay (ACE 2003), “The neglected battle fields of Syntax Errors”, provided a web-based reference guide to search for the error text and then get some examples. (These days, we’d probably call this Stack Overflow. 🙂 ) Flowers, Carver and Jackson, 2004, developed Gauntlet to provide more informal error messages with user-friendly feedback and humour. The paper was published in Frontiers in Education, 2004, “Empowering Students and Building Confidence in Novice Programmers Through Gauntlet.” The next aspect of related work was from Tom Schorsch, SIGCSE 1995, with CAP, making specific corrections in an environment. Warren Toomey modified BlueJ to change the error subsystem but there’s no apparent published work on this. The final two were Dy and Rodrigo, Koli Calling 2010, with a detector for non-literal Java errors and Debugging Tutor: Preliminary evaluation, by Carter and Blank, KCSC, January 2014.

The work done by the authors was in CodeWrite (written up in SIGCSE 2011 and ITiCSE 2011, both under Denny et al). All students submit non-compiling code frequently, so maybe better feedback will help, and influence existing systems such as Nifty reflections (cloud bat) and CloudCoder. In the study, students had 10 problems they could choose from, each with a method, description and return result. The students were split in an A/B test, where half saw the raw feedback and half saw the enhanced message. The team built an error recogniser that analysed over 12,000 submissions with syntax errors from a 2012 course; the raw compiler message identified the error 78% of the time (“All Syntax Errors are Not Equal”, ITiCSE 2012). In other cases, static analysis was used to work out what the error was. Eventually, 92% of the errors were classifiable from the 2012 dataset. Anything not in that group was shown as a raw error message to the student.
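To make the mechanism concrete, here is a rough sketch of my own (in Python; the patterns and friendly wordings are invented for illustration and are not taken from the CodeWrite implementation) of how a recogniser might map raw compiler messages to enhanced ones, falling back to the raw text for the unclassified remainder:

```python
import re

# Hypothetical sketch of an error recogniser: match a raw compiler
# message against known patterns and, on success, return a friendlier
# enhanced message with an example. Unmatched messages fall through
# raw, as in the study (where ~92% of the 2012 errors were classifiable).

ERROR_PATTERNS = [
    (re.compile(r"';' expected"),
     "It looks like the condition of your 'if' needs to be fully wrapped "
     "in parentheses, e.g. if ((a < 0) || (a > 100))"),
    (re.compile(r"cannot find symbol"),
     "You are using a name the compiler doesn't know. Check the spelling "
     "and that the variable is declared before this line."),
]

def enhance(raw_message):
    """Return an enhanced message if the raw one is recognised, else raw."""
    for pattern, friendly in ERROR_PATTERNS:
        if pattern.search(raw_message):
            return friendly
    return raw_message  # unclassified: show the raw compiler message

print(enhance("error: ';' expected").startswith("It looks like"))
```

The real system was considerably more sophisticated (static analysis rather than just pattern matching), but the fall-through-to-raw behaviour is the part described in the talk.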

In the randomised controlled experiment, 83 students had to complete the 10 exercises (worth 1% each), using the measures of:

  • the number of consecutive non-compiling submissions for each exercise
  • the total number of non-compiling submissions
  • … and others.

Do students even read the error messages? That would explain the lack of impact. However, examining student code changes, there appears to be a response to the error messages received, although it can be a slow and piecemeal one. There was a difference between the groups, a 17% reduction in non-compiling submissions, but it wasn’t statistically significant.
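For readers who want to see what “a difference, but not significant” can look like in practice, here is a small self-contained sketch (in Python, with invented per-student counts, not the study’s data) of a permutation test comparing non-compiling submission counts between two groups:

```python
import random

# Illustration only: comparing non-compiling submission counts between
# a control group (raw messages) and a treated group (enhanced messages).
# The per-student counts below are invented, not the study's data.

random.seed(1)
control = [random.randint(0, 12) for _ in range(40)]            # raw messages
treated = [max(0, x - random.randint(0, 2)) for x in control]   # enhanced

def mean(xs):
    return sum(xs) / len(xs)

observed = mean(control) - mean(treated)  # the raw difference we saw

# Permutation test: shuffle the group labels many times and count how
# often a difference at least as large arises purely by chance.
pooled = control + treated
n = len(control)
extreme = 0
trials = 2000
for _ in range(trials):
    random.shuffle(pooled)
    if mean(pooled[:n]) - mean(pooled[n:]) >= observed:
        extreme += 1
p_value = extreme / trials

print(round(observed, 2), p_value)
```

The point is that a visible raw reduction only counts as evidence once a test like this says it would rarely arise by chance; with noisy counts and modest group sizes, a real-looking gap can easily fail that bar.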

I find this very interesting because the lack of significance is slightly unexpected, given that increased expressiveness and ease of reading should make it easier for people to find errors, especially with the provision of examples. I’m not sure that this is the last word on this (and I’m certainly not saying the authors are wrong because this work is very rigorous) but I wonder what we could be measuring to nail this one down into the coffin.

The final talk was “A Qualitative Think-Aloud Study of Novice Programmers’ Code Writing Strategies”, presented by Tony Clear on behalf of the authors. The aim of the work was to move beyond the notion of levels of development and attempt to explore the process of learning, building on the notion of schemas and plans. Assimilation (using existing schemas to understand new information) and accommodation (changing our schema when new information won’t fit) are common themes in the psychology of learning.

We’re really not sure how novice programmers construct new knowledge and we don’t fully understand the cognitive process. We do know that learning to program is often perceived as hard. (Shh, don’t tell anyone.) At early stages, novice programmers have very few schemas to draw on, their knowledge is fragile and the cognitive load is very high.

Woohoo, a Vygotsky reference to the Zone of Proximal Development – there are things students know, things they can learn with help, and then the stuff beyond that. Perkins talked about attitudinal factors: movers, tinkerers and stoppers. Stoppers stop and give up in the face of difficulty, tinkerers fiddle until it works, and movers actually make good progress and know what’s going on. The final aspect of the methodology was inductive theory construction, which I’ll let you look up.

The think-aloud protocol required the students to clearly vocalise what they were thinking about as they completed computational tasks on a computer, using retrospective interviews to address those points in the videos where silence, incomprehensibility or confused articulation made interpreting the result impossible. The scaffolding involved tutoring, task performance and follow-up. The programming tasks were set in a virtual-world-based programming environment, with tasks of increasing difficulty.

How did they progress? Jacquie uses the term redirection to mean that the student has been directed to re-examine their work, but is not given any additional information. They’re just asked to reconsider what they’ve done. Some students may need a spur and then they’re fine. We saw some examples of students showing their different progression through the course.

Jacquie has added a new category, PLANNERS, which indicates that we can go beyond the Movers to explain the kind of behaviour we see in advanced students in the top quartile. Movers who stretch themselves can become planners if they can make it into the Zone of Proximal Development and, with assistance, develop their knowledge beyond what they’d be capable of by themselves. The More Competent Other plays a significant role in helping people to move up to the next level.

Full marks to Tony. Presenting someone else’s work is very challenging and you’d have to be a seasoned traveller to even reasonably consider it! (It was very nice to see the lead author recognising that in the final slide!)

 


ITiCSE 2014, Day 2, Session4A, Software Engineering, #ITiCSE2014 #ITiCSE

The first talk, “Things Coming Together: Learning Experiences in a Software Studio”, was presented by Julia Prior, from UTS. (One of the nice things about conferences is catching up with people so Julia, Katrina and I got to have a great chat over breakfast before taxiing into the venue.)
Julia started with the conclusions. From their work, the group have evidence of genuine preparation for software practice; the approach works for complex technical problems and tools, encourages effective group work, builds self-confidence, builds the more elusive professional competencies, provides immersion in rich environments, and furnishes different paths to group development and success. Now for the details!
There are three different parts of a studio, based on the arts and architecture model:
  • People: the learning community of teachers and learners
  • Process: creative and reflective
    • interactions
    • physical space
    • collaboration
  • Product: the designed object – a single focus for the process
UTS have been working on designing and setting up a Software Development Studio for some time and have had a chance to refine their approach. The subject was project-based on a team project for parks and wildlife, using the Scrum development method. The room the students were working in was trapezoidal, with banks of computers up and down.
What happened? What made this experience different was that an ethnographer sat in and observed the class, as well as participating, for the whole class and there was also an industry mentor who spent 2-3 hours a week with the students. There were also academic mentors. The first week started with Lego where students had to build a mini-town based on a set of requirements, with colour and time constraints. Watching the two groups working at this revealed two different approaches: one planned up front, with components assigned to individuals, and finished well on time. The other group was in complete disarray, took pieces out as they needed it, didn’t plan or allocate roles. This left all the building to two members, with two members passing blocks, and the rest standing around. (This was not planned – it just happened.)
The group that did the Lego game well quickly took on Scrum and got going immediately (three members already knew about Scrum), including picking their team. The second group felt second-rate and this was reflected in their sprint – no one had done the required reading or had direction, so they needed a lot of mentor intervention. After some time, during presentations, the second group presented first and, while it was unadventurous, they had developed a good plan. The other group, with strong leadership, were not prepared for their presentation and it was muddled and incomplete. Some weeks after that presentation practice, the groups had started working together, with leaders communicating, which was at odds with the first part of the activity.
Finding 1: Group Relations.
  • Intra-Group Relations: Group 1 had lots of strong characters and appeared to be competent and performing well, with students in the group learning about Scrum from each other. Group 2 was more introverted, with no dominant or strong characters, but learned as a group together. Both groups ended up being successful despite the different paths. Collaborative learning inside each group occurred well, although differently.
  • Inter-Group Relations: There was good collaborative learning across and between groups after the middle of the semester, where initially the groups were isolated (and one group was strongly focused on winning a prize for best project). Groups learned good practices from observing each other.
Finding 2: Things Coming Together
The network linking the students together doesn’t start off being there but is built up over time – it is strongly relational. The methodologies, mentors and students are tangible components but all of the relationships are intangible. Co-creation becomes a really important part of the process.
Across the whole process, integration became a large focus: getting things working in a complex context. Group relations took more effort and the group had to be strategic in investing their efforts. Doing time was an important part of the process – more time spent together helped things to work better. This time was an investment in developing a catalyst for deep learning that improved the design and development of the product. (Student feedback suggested that students should be timetabled into the studio more.) This time was also spent developing the professional competencies that are often not developed in this kind of environment.
(Apologies, Julia, for a slightly sketchy write-up. I had Internet problems at the start of the process so please drop me a line if you’d like to correct or expand upon anything.)
The next talk was on “Understanding Students’ Preferences of Software Engineering Projects”, presented by Robert McCartney. The talk was about a maintenance-centred Software Engineering course (a close analogue to industry, where you rarely build new but you often patch old).
We often teach SE with project work, where the current project approach usually has a generative aspect based on planning, designing and building. In professional practice, however, most SE effort involves maintenance and evolution, so the authors developed a maintenance-focused SE course to shift the emphasis. Students start with an existing system and the project involves comprehending and documenting the existing code, proposing functional enhancements, and then implementing, testing and documenting the changes.
This is a second-year course, with small teams (often pairs), but each team has to pick a project, comprehend it, propose enhancements, describe and document, implement enhancements, and present their results. (Note: this would often be more of a third-year course in its generative mode.) Since the students are early on, they are pretty fresh in their knowledge. They’ll have some Object Oriented programming and Data Structures, experience with UML class diagrams and experience using Eclipse. (Interesting – we generally avoid IDEs but it may be time to revisit this.)
The key to this approach is to have enough projects of sufficient scope to work on, and the authors went out to the open source project community to grab existing open source code and work on it, but without the intention of releasing it back into the wild. This lifts the chances of having good, authentic code, but it’s important to make sure that the project code works. There are many pieces of O/S code out there, with a wide range of diversity, but teachers have to be involved in the vetting process for these things as there are many crap ones out there as well. (My wording. 🙂 )
The paper by Smith et al., “Selecting Open Source Software Projects to Teach Software Engineering”, was presented at SIGCSE 2014 and described the project search process. Starting from the 1,000 open source projects that were downloaded, 200 were an appropriate size and 20 were suitable (could build, had sensible structure and documentation). Getting to this number of projects takes a lot of time and is labour intensive.
Results in the first year: finding suitable projects was hard, having each team work on a different project was too difficult for staff (the lab instructor has to know about 20 separate projects), and small projects were often not as good as the larger ones. Projects of up to 10,000 lines of code were considered small, but these often turned out to be single-developer projects, which meant that there was no group communication structure and a lot of things didn’t get written down, so the software wouldn’t build because the single developer had never needed to let anyone else know the tricks and tips.
In the second year, the number of projects was cut down (to 10) to make it easier on the lab instructors, and the size of the projects went up (40-100k lines) in order to find genuine group development projects. The number of teams grew, and teams could then pick whichever project they wanted, rather than being assigned one project per team on a first-come first-served basis. (The first-come first-served approach meant students were picking based on the name and description of the project, which is very shallow.) To increase group knowledge, each group got a project description, with links to the source code and documentation, build instructions (which had been tested), the list of proposed enhancements and a screenshot of the working program. This gave the group a lot more information to make a deeper decision as to which project they wanted to undertake, and students could get a much better feeling for what they took on.
What the students provided, after reviewing the projects, was their top 3 projects and list of proposed enhancements, with an explanation of their choices and a description of the relationship between the project and their proposed enhancement. (Students would receive their top choice but they didn’t know this.)
The data were analysed with a thematic analysis, abstracting the codes into categories and then using axial coding to determine the relations between categories and combine the results into a single thematic diagram. The extracted categories were: Subject Appeal (considering the domain of interest – is it cool or flashy?), Value Added (the value of the enhancement, benefit to self or users), Difficulty (how easy or hard it is), and Planning (considering the match between team skills and the skills that the project required, and the effects of the project architecture). In the axial coding, centring on value-adding, the authors came up with a resulting thematic map.
Planning was seen as a sub-theme of difficulty, but both subject appeal and difficulty (although considered separately) were children of value-adding. (You can see elements of this in my notes above.) Among the themes there was a lot of linkage, leading to concepts such as weighing value-add against difficulty: enhancements still had to be achievable.
Looking at the most frequent choices, of 26 groups, 9 chose a daily calendar scheduler (Rapla), 7 chose an infrastructure for games (TripleA) and a few chose a 3D home layout program (Sweet Home). Value-add and subject-appeal were dominant features for all of these. The only top-four project that didn’t mention difficulty was a game framework. What this means is that if we propose projects that satisfy these categories, then we would expect them to be chosen preferentially.
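To give a feel for how dominant those top choices were, here’s a quick tally in Python over hypothetical data shaped like the numbers reported above (the “ProjX” names are placeholders I’ve made up for the remaining seven groups’ choices):

```python
from collections import Counter

# Hypothetical first choices shaped like the reported counts:
# 9 groups chose Rapla, 7 TripleA, 3 Sweet Home, and the remaining
# 7 groups are spread across made-up placeholder projects.
choices = (["Rapla"] * 9 + ["TripleA"] * 7 + ["Sweet Home"] * 3
           + ["ProjA", "ProjB", "ProjC", "ProjD", "ProjE", "ProjF", "ProjG"])

tally = Counter(choices)
total = len(choices)  # 26 groups in all

# Show how concentrated the choices are on the top few projects.
for project, count in tally.most_common(3):
    print(f"{project}: {count}/{total} groups ({count / total:.0%})")
```

Nothing deep here, but it makes the concentration obvious: the top three projects account for 19 of 26 groups.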
The bottom line is that the choices would have been the same if the selection pool had been 5 rather than 10 projects and there’s no evidence that there was that much collaboration and discussion between those groups doing the same projects. (The dreaded plagiarism problem raises its head.) The number of possible enhancements for such large projects were sufficiently different that the chance of accidentally doing the same thing was quite small.
Caveats: these results are based on the students’ top choices only and these projects dominate the data. (Top 4 projects discussed in 47 answers, remaining 4 discussed in 15.) Importantly, there is no data about why students didn’t choose a given project – so there may have been other factors in play.
In conclusion, the students did make the effort to look past the superficial descriptions in choosing projects. Value adding is a really important criterion, often in conjunction with subject appeal and perceived difficulty. Having multiple teams enhancing the same project (independently) does not necessarily lead to collaboration.
But, wait, there’s more! Larger projects meant that teams faced more uniform comprehension tasks and generally picked different enhancements from each other. Fewer projects means less stress on the lab instructor. UML diagrams, however, are not very helpful when trying to get the big-picture view of a project’s overall structure.
In the future, they’re planning to offer 10 projects to 30 teams, look at software metrics of the different projects, characterise the reasons that students avoid certain projects, and provide different tools to support the approach. Really interesting work and some very useful results that I suspect my demonstrators will be very happy to hear. 🙂
The questions were equally interesting, talking about the suitability of UML for large program representation (when it looks like spaghetti) and whether the position of projects in a list may have influenced the selection (did students download the software for the top 5 and then stop?). We don’t have answers to either of these but, if you’re thinking about offering a project selection for your students, maybe randomising the order of presentation might allow you to measure this!
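If you did want to try that, the mechanics are tiny. Here’s a sketch (my own, with made-up project names) of shuffling the presentation order independently for each team, seeded so the order each team saw can be reproduced and later compared against their actual choices:

```python
import random

# Hypothetical project list - only Rapla, TripleA and Sweet Home come
# from the talk; the rest are placeholders.
projects = ["Rapla", "TripleA", "Sweet Home", "ProjD", "ProjE",
            "ProjF", "ProjG", "ProjH", "ProjI", "ProjJ"]

def presentation_order(team_id: int, seed: int = 2014) -> list[str]:
    """Deterministic per-team shuffle, so the order shown is reproducible."""
    rng = random.Random(f"{seed}:{team_id}")  # string seed keyed to the team
    return rng.sample(projects, k=len(projects))  # shuffled copy, original untouched

# One independent ordering per team (say, 30 teams).
orders = {team: presentation_order(team) for team in range(1, 31)}

# Later: compare each team's chosen project against its position in
# orders[team] to test whether early-listed projects are over-chosen.
```

With the order logged per team, a simple check of chosen-project positions against a uniform distribution would tell you whether list position is doing the choosing.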

 


CSEDU, Day 1, Keynote 2, “Mathematics Teaching: is the future syncretic?” (#csedu14 #csedu #AdelEd)

This is an extension of the position paper that was presented this morning. I must be honest and say that I have a knee-jerk reaction when I run across titles like this. There’s always the spectre of Rand or Gene Ray in compact phrases of slightly obscure terminology. (You should probably ignore me; I also twitch every time I run across digital hermeneutics, and that’s perfectly legitimate.) The speaker is Larissa Fradkin, who is trying to improve the quality of mathematics teaching and overall interest in mathematics – which is a good thing, so I should probably be far more generous about “syncretic”. Let’s review the definition of syncretic:

Again, from Wikipedia, Syncretism /ˈsɪŋkrətɪzəm/ is the combining of different, often seemingly contradictory beliefs, while melding practices of various schools of thought. (The speaker specified this to religious and philosophical schools of thought.)

There’s a reference in the talk to gnosticism, which combined oriental mysticism, Judaism and Christianity. Apparently, in this talk we are going to have the myths of the Math Wars debunked, including traditionalist myths and constructivist myths, and then discuss the realities in the classroom.

Two fundamental theories of learning were introduced: traditionalist and constructivist. Apparently, these are drummed into poor schoolteachers and yet we academics are sadly ignorant of these. Urm. You have to be pretty confident to have a go at Piaget: “Piaget studied urchins and then tried to apply it to kids.” I’m really not sure what is being said here but the speaker has tried to tell two jokes which have fallen very flat and, regrettably, is making me think that she doesn’t quite grasp what discovery learning is. Now we are into Guided Teaching and scaffolding with Vygotsky, who apparently, as a language teacher, was slightly better than a teacher of urchins.

The first traditionalist myth is that intelligence = implicit memory (no conscious awareness) + basic pattern recognition. Oh, how nice, the speaker did a lot of IQ tests and went from 70 to 150 in 5 tests. I don’t think many people in the serious educational community place much weight on the assessment of intelligence through these sorts of tests – and the objection to standardised testing is coming from the edu research community for exactly those reasons. I commented on this speaker earlier and noted that I felt that she was having an argument that was no longer contemporary. Sadly, my opinion is being reinforced. The next traditionalist myth is that mathematics should be taught using poetry, other mnemonics and coercion.

What? If the speaker is referring to the memorisation of multiplication tables, we are talking about a definitional basis for further development that occupies a very short time in the learning phase. We are discussing a type of education that has already been identified as negative, given the realisation that mindless repetition and extrinsic motivational factors are counter-productive. Yes, coercion is an old method, but let’s get to what you’re proposing as an alternative.

Now we move on to the constructivist myths. I’m on the edge of my seat. We have a couple of cartoons which don’t do anything except recycle some old stereotypes. So, the first myth is “Only what students discover for themselves is truly learned.” The problem here is based on Rebar’s 2007 meta-study. Revelation: child-centred, cognitively focused and open classroom approaches tend to perform poorly.

Hmm, not our experience.

The second myth is both advanced and debunked by a single paper: that there are only two separate and distinct ways to teach mathematics, conceptual understanding and drills. Revelation: conceptual advances are invariably built on the bedrock of technique.

Myth 3: math concepts are best understood and mastered when presented in context, so that the underlying math concept will follow automatically. The speaker used to teach with engineering examples but abandoned them because of the problem of having to explain the engineering problem and the engineering language before the maths itself. Ah, another paper from Hung-Hsi Wu, UCB, “The Mathematician and Mathematics Education Reform.” No, I really can’t agree with this as a myth. Situated learning is valid and it works, provided that the context used is authentic and selected carefully.

Ok, I must confess that I have some red flags going up now – while I don’t know the work of Hung-Hsi Wu, depending on a single author, especially one whose revelatory heresy is close to 20 years old, is not the best basis for a complicated argument such as this. Any readers with knowledge in this should jump on to the comments and get us informed!

Looking at all of these myths, I don’t see myths, I see straw men. (A straw man is a deliberately weak argument chosen because it is easy to attack and based on a simplified or weaker version of the problem.)

I’m in agreement with many of the outcomes that Professor Fradkin is advocating. I want teachers to guide, but believe that they can do it in the construction of learning environments that support constructivist approaches. Yes, we should limit jargon. Yes, we should move away from death-by-test. Yes, Socratic dialogue is a great way to go.

However, as always, if someone says “Socratic dialogue is the way to go but I am not doing it now” then I have to ask “Why not?” Anyone who has been to one of my sessions knows that when I talk about collaboration methods and student value generation, you will be collaborating before your seat has had a chance to warm up. It’s a cornerstone of authentic teaching that we use the methods that we advocate, or explain why they are not suitable – cognitive apprenticeship requires us to expose ourselves as we go through the process we’re trying to teach!

Regrettably, I think my initial reaction of cautious mistrust of the title may have been accurate. (Or I am just hopelessly biased by an initial reaction, although I have been trying to be positive.) I am trying very hard to reinterpret what has been said, but there is a lot of anecdote and dependency upon one or two “visionary debunkers” to support a series of strawmen presented as giant barriers to sensible teaching.

Yes, listening to students and adapting is essential but this does not actually require one to abandon constructivist or traditionalist approaches because we are not talking about the pedagogy here, we’re talking about support systems. (Your take on that may be different.)

There is some evidence presented at the end which is, I’m sorry to say, a little confusing, although there has obviously been a great deal of success for an unlisted, uncounted number of students on an unknown level of course – success rates improved from 30% passing to 70% passing and no one had to be trained for the exam. I would very much like to get some more detail on this, as claiming that the syncretic approach is the only way to reach 70% is a big claim. Also, a 70% pass rate is not all that good – I would get called onto the carpet if I did that for a couple of offerings. (And, no, we don’t dumb down the course to improve the pass rate – we try to teach better.)

Now we move on to on-line techniques. Is the flipped classroom a viable approach? Can technology “humanise” the classroom? (These two statements are not connected, for me, so I’m hoping that this is not an attempt to entail one by the other.) We then moved on to a discussion of Khan, of whom Professor Fradkin is not a fan, and while her criticisms of Khan are semi-valid (he’s not a teacher and it shows), her final statement and dismissal of Khan as a cram-preparer is more than a little unfair and very much in keeping with the sweeping statements that we have been assailed by for the past 45 minutes.

I really feel that Professor Fradkin is conflating other mechanisms with blended and flipped learning – flipped learning is all about “me time” to allow students to learn at their own pace (as she notes) but then she notes a “Con” of the Khan method of an absence of “me time”. What if students don’t understand the recorded lectures at all? Well… how about we improve the material? The in-class activities will immediately expose faulty concept delivery and we adapt and try again (as the speaker has already noted). We most certainly don’t need IT for flipped learning (although it’s both “Con” point 3 AND 4 as to why Khan doesn’t work), we just need to have learning occur before we have the face-to-face sessions where we work through the concepts in a more applied manner.

Now we move onto MOOCs. Yes, we’re all cautious about MOOCs. Yes, there are a lot of issues. MOOCs will get rid of teachers? That particular strawman has been set on fire, pushed out to sea, brought back, set on fire again and then shot into orbit. Where they set it on fire again. Next point? Ok, Sebastian Thrun made an overclaim that the future will have only 10 higher ed institutions in 50 years. Yup. Fire that second strawman into orbit. We’ve addressed Professor Thrun before and, after all, he was trying to excite and engage a community over something new and, to his credit, he’s been stepping back from that ever since.

Ah, a Coursera course that came from a “high-quality” US university, full of imprecise language, saying how and not why, with a monster-generator approach. A quick ad hominem attack on the lecturer in the video (“he looked like he had been on drugs for 10 years”). Apparently, and with no evidence, Professor Fradkin can guarantee that no student picked up any idea of what a function was from this course.

Apparently some Universities are becoming more cautious about MOOCs. Really.

I’m sorry to have editorialised so badly during this session but this has been a very challenging talk to listen to, as so much of the underlying material has been, to my understanding, misrepresented at least. A very disappointing talk overall and one that could have been so much better – I agree with a lot of the outcomes but I really don’t think that this is the way to reach them.

Sadly, someone has already asked to translate the speaker’s slides into German so that they can send them to the government! Yes, textbooks are often bad and a lack of sequencing is a serious problem. Once again I agree with the conclusion but not the argument… Heresy is an important part of our development of thought, and stagnation is death, but I think that we always need to be cautious that we don’t sensationalise and seek strawmen in our desire to find the new truths that we have to reach through heresy.

 


Start with good grapes, don’t mess them up.

“Make no little plans; they have no magic to stir men’s blood and probably themselves will not be realised.” Daniel Burnham

I was watching a film today called “Antiviral”, directed by Brandon Cronenberg, and one of the themes addressed was what we choose to do with technology. Celebrity cell reproduction is the theme of the movie and it is quite bizarre to see a technology that could be so useful (in building new organs and prolonging life) being used to allow people to have the same colds that their idols do. (Because of the rating of this blog, I must state that Antiviral is an adult film and there are themes that I will not discuss here.)

We have many technologies that are powerful and we are developing more of them daily. We have developed the ability to print human organs (in a limited fashion, although 40 days for a liver is another month of life for someone) and we are in the foothills of printing food. Our automated and autonomous systems become more capable and more effective on a daily basis, although Amazon’s drone network won’t be buzzing your house tomorrow.

One of the most profound reasons for education is the requirement to ensure that the operators of powerful things are reasoning, thinking, informed human beings. As humans, we tend to build amplification engines, it’s just what we do, but in so many cases a good intention is then amplified to a great one, and a malign intention can be amplified to a massive and evil result.

Our production processes for food and drink often take a similar form. To make good bread, you grow good wheat in good soil and then you use good yeast, clean conditions and control the oven. You start with good ingredients and you use technology and knowledge to make it better – or to transform it without damage. The same is true of wine. I can make good wine from just about anything but if you want me to make great wine? I have to start with good grapes and then not mess them up!

Good grapes!

Our technologies are, however, able to go either way. I could burn the bread, cook the yeast, freeze the wine, just as easily if I were poorly trained or if I had malicious intent. Education is not just about training, it’s about preparation for the world in which our students will live. This world is always changing but we have to move beyond thinking about “Driver’s Ed” as a social duty and think about “Resource Ed”, “The Ethics of Cloning” (for example) and all sorts of difficult and challenging issues when we try to teach. We don’t have to present a given viewpoint, by any means, but to ignore the debate and the atmosphere in which we (and I in particular) are training young tertiary students would be to do them a disservice.

This starts young. The sooner we can start trying to grow good students and the sooner that we make our educational systems transform these into wonderful people, the better off we’ll be. The least I would hope for, for any of my students, is that they will always at least think briefly of some of the issues before they do something. They may still choose to be malign, for whatever reason, but let it be then a choice and not from ignorance – but also, let the malign be few and far between and a dying breed!


You want thinkers. Let us produce them.

I was at a conference recently where the room (about 1000 people from across the business and educational world) was asked what they would like to say to everyone in the room, if they had a few minutes. I thought about this a lot because, at the time, I had half an idea but it wasn’t in a form that would work on that day. A few weeks later, in a group of 100 or so, I was asked a similar question and I managed to come up with something coherent. What follows here is a more extended version of what I said, with relevant context.

If I could say anything to the parents and future employers of my students, it would be to STOP LOOKING AT GRADES as some meaningful predictor of the future ability of the student. While measures of true competency are useful, the current fine-grained but mostly arbitrary measurements of students, with rabid competitiveness and artificial divisions between grade bands, do not fulfil this purpose. When an employer demands a GPA of X, there is no guaranteed true measure of depth of understanding, quality of learning or anything real that you can use, except for conformity and an ability to colour inside the lines. Yes, there will be exceptional people with a GPA of X, but there will also be people whose true abilities languished as they focused their energies on achieving that false grail. The best person for your job may be the person who got slightly (or much) lower marks because they were out doing additional tasks that made them the best person.

Please. I waste a lot of my time giving marks when I could be giving far more useful feedback, in an environment where that feedback could be accepted and actual positive change could take place. Instead, if I hand back a 74 with comments, I’ll get arguments about the extra mark to get to 75 rather than discussions of the comments – but don’t blame the student for that attitude. We have created a world in which that kind of behaviour is both encouraged and sensible. It’s because people keep demanding As and Cs to somehow grade and separate people that we still use them. I couldn’t switch my degree over to “Competent/Not Yet Competent” tomorrow because, being frank, we’re not MIT or Stanford and people would assume that all of my students had just scraped by – because that’s how we’re all trained.

If you’re an employer, then I realise that this is very demanding, but please, where you can, look at the person, and ask the industrial bodies that feed back to education to focus on ensuring that we develop competent, thinking individuals who can practise in your profession, without forcing them to become grade-haggling bean counters who would cut a group member’s throat for an A.

If you’re a parent, then I would like to ask you to think about joining that group of parents who don’t ask what happened to that extra 1% when a student brings home a 74 or 84. I’m not going to tell you how to raise your children, it’s none of my business, but I can tell you, from my professional and personal perspective, that it probably won’t achieve what you want. Is your student enjoying the course, getting decent marks and showing a passion and understanding? That’s pretty good and, hopefully, if the educators, the parents and the employers all get it right, then that student can become a happy and fulfilled human being.

Do we want thinkers? Then we have to develop the learning environments in which we have the freedom and capability to let them think. But this means that this nonsense that there is any real difference between a mark of 84 and a mark of 85 has to stop and we need to think about how we develop and recognise true measures of competence and suitability that go beyond a GPA, a percentage or a single letter grade.

You cannot contain the whole of a person in a single number. You shouldn’t write the future of a student on such a flimsy structure.


The Bad Experience That Stays With You and the Legendary Bruce Springsteen.

I was talking with a friend of mine and we were discussing perceptions of maths and computing (yeah, I’m like this off duty, too) and she felt that she was bad at Maths. I commented that this was often because of some previous experience in school, and she nodded and told me this story, which she’s given me permission to share with you now. (My paraphrasing, but in her voice.)

“When I was five, we got to this point in Math where I didn’t follow what was going on. We got to this section and it just didn’t make any sense to me. The teacher gave us some homework to do and I looked at it and I couldn’t do it but I didn’t want to hand in nothing. So I scrunched it up and put it in the bin. When the teacher asked for it back, I told her that I didn’t have it.

It turns out that the teacher had seen me put it in the bin and so she punished me. And I’ve never thought of myself as good at math since.”

Wow. I’m hard-pressed to think of a better way to give someone a complex about a subject. Ok, yes, my friend did lie to the teacher about not having the work and, yes, it would have been better if she’d approached the teacher to ask for help – but given what played out, I’m not really sure how much that would have changed what happened. And, before we get too carried away, she was five.

Now this was all some (but not that many) years ago, and a lot of things have changed in teaching, but all of us who stand up and call ourselves educators could do worse than remember Bruce Springsteen’s approach to concerts. Bruce plays a lot of concerts but, at each one, he tries to give his best, because a lot of the people in the audience are going to their first and only Springsteen concert. It can be really hard to deal with behaviour that is disruptive, disobedient and possibly deliberately so, but it may be masking fear, uncertainty and a genuine desire for the problem to go away because someone is overwhelmed. Whatever we get paid, that’s really one of the things we get paid to do.

We’re human. We screw up. We get tired. But unless we’re thinking about and trying to give that Springsteen moment to every student, then we’re setting ourselves up to give a negative example. Somewhere down the line, someone’s going to find their life harder because of that – it may be us in the next week, it may be another teacher next year, but it will always be the student.

Bad experiences hang around for years. It would be great if there were fewer of them. Be awesome. Be Springsteen.

EMBRACE YOUR AWESOMENESS! Don’t make me come over and sing “Blinded by the Light!”