ICER 2012 Day 2 Research Session 3
Posted: September 15, 2012 Filed under: Education | Tags: collaboration, community, education, educational problem, educational research, feedback, higher education, icer, icer 2012, icer2012, in the student's head, shotgun debugging, teaching, teaching approaches, tools, universal principles of design

The session kicked off with “The Abstraction Transition Taxonomy: Developing Desired Learning Outcomes through the Lens of Situated Cognition” (Quintin Cutts (presenting), Sarah Esper, Marlena Fecho, Stephen Foster and Beth Simon) and the initial question: “Do our learning outcomes for programming classes match what we actually do as computational thinkers and programmers?” To answer this question, we looked at Eric Mazur’s Peer Instruction, an analysis of PI questions as applied to a CS Principles pilot course, and then applied the Abstraction Transition Taxonomy (ATT) to published exams, with a wrap of observations and ‘where to from here’.
Physicists noticed some time ago that their students could plug numbers into equations (turn the handle, so to speak) but couldn’t necessarily demonstrate that they understood things: they couldn’t demonstrate that they thought as physicists should. (The Force Concept Inventory was mentioned here and, if you’re not familiar with it, it’s a very interesting thing to look up.) To try to get students who thought as physicists, Mazur developed Peer Instruction (PI), which has pre-class prep work and in-class questions, followed by voting, discussion and re-voting, with an instructor leading class-wide discussion. These activities prime the students to engage with the correct explanations – that is, the way that physicists think about and explain problems.
Looking at Computer Science, many CS people use the delivery of a working program as a measure of the correct understanding and appropriate use of programming techniques.
Generating a program, however, is no guarantee of understanding, which is sad but true given the existence of the internet, other students and books. We could try to force a situation where students are isolated from these support factors, but this leads us back to permutation programming, voodoo code and shotgun debugging unless the students actually understand the task and how to solve it using our tools. In other words, unless they think as Computer Scientists.
UCSD had a CS Principles Pilot course that used programming to foster computational thinking that was aimed at acculturation into the CS ‘way’ rather than trying to create programmers. The full PI implementation asked students to reason about their programs, through exploratory homework and a PI classroom, with some limited time traditional labs as well. While this showed a very positive response, the fear was that this may have been an effect of the lecturers themselves so analysis was required!
By analysing the PI questions, a taxonomy was developed that identified abstraction levels and the programming concepts within them. The abstraction levels were “English”, “Computer Science Speak” and “Code”. The taxonomy was extended with the transitions between these levels: if English is abstraction level 1 and Code is level 3, then turning an English question into code is a 1-3 transition and, similarly, explaining code in English is a 3-1 transition. Finally, they considered mechanism (how does something work?) and rationale (why did we do it this way?).
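To make the transitions concrete, here is a small sketch of my own (not an example from the paper) of what the two directions might look like in practice:

```python
# A 1-3 transition: an English specification ("find the largest value in a
# list of numbers") is turned into code by the student.
def largest(numbers):
    """Return the largest value in a non-empty list of numbers."""
    best = numbers[0]
    for n in numbers[1:]:
        if n > best:
            best = n
    return best

# A 3-1 transition runs the other way: shown the loop above, the student is
# asked to explain in plain English that it tracks the best value seen so
# far and replaces it whenever a larger one appears.
```

A mechanism question about this code asks how the loop works; a rationale question asks why we track a running best rather than, say, sorting the list first.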
Analysing the assignment and assessment questions to determine what was being asked, in terms of abstraction level and transitions, and whether it was mechanism or rationale, revealed that 21% of the in-class multiple choice questions were ‘Why?’ questions but there were actually very few ‘Why?’ questions in the exam. Unsurprisingly, almost every question asked in the PI framework is a ‘Why?’ question, so there should be room for improvement in the corresponding examinations. PI emphasises the culture of the discipline through the ‘Why?’ framing because it requires acculturation and contextualisation to get yourself into the mental space where a rationale becomes logical.
The next paper “Subgoal-Labeled Instructional Material Improves Performance and Transfer in Learning to Develop Mobile Applications”, Lauren Margulieux, Mark Guzdial and Richard Catrambone, dealt with mental models and how the cognitive representation of an action will affect both the problem state and how well we make predictions. Students have so much to think about – how do they choose?
The problem with just waiting for a student to figure it out is high cognitive load, which I’ve referred to before as helmet fire. If students become overwhelmed they learn nothing, so we can explicitly tell students and/or provide worked examples. If we clearly label the subgoals in a worked example, students remember the subgoals and the transition from one to another. The example given here was an Android App Inventor worked example, one version of which had no labels, while the other had subgoal labels added as overlay callouts to the video as the only alteration. The subgoal points were identified by task analysis – so this was a very precise attempt to get students to identify the important steps required to understand and complete the task.
(As an aside, I found this discussion very useful. It’s a bit like telling a student that they need comments and so every line has things like “x=3; //x is set to 3” whereas this structured and deliberate approach to subgoal definition shows students the key steps.)
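To sketch that difference (my own invented example, not the App Inventor material from the paper): subgoal labels mark the steps of the plan, not the individual statements:

```python
def word_frequencies(text):
    """Count how often each word appears in a piece of text."""
    # Subgoal 1: normalise the input so that 'The' and 'the' match.
    words = text.lower().split()

    # Subgoal 2: tally each word into a dictionary.
    counts = {}
    for word in words:
        counts[word] = counts.get(word, 0) + 1

    # Subgoal 3: return the completed tally.
    return counts
```

Contrast the three subgoal labels with per-line commentary like “increment the count”, which restates the code without ever exposing the plan behind it.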
In the first experiment that was run, the students with the subgoals (and recall that this was the ONLY difference in the material) had attempted more, achieved more and done it in less time. A week later, they still got things right more often. In the second experiment, a talk-aloud experiment, the students with the subgoals discussed the subgoals more, tried random solution strategies less and wasted less effort than the other group. This is an interesting point. App Inventor allows you to manipulate blocks of code and the subgoal group were less likely to drag out a useless block to solve the problem. The question, of course, is why. Was it the video? Was it the written aspects? Was it both?
Students appear to be remembering and using the subgoals and, as was presented, if performance is improving, perhaps the exact detail of why it’s happening is something that we wish to pursue but, in the short term, we can still use the approach. However, we do have to be careful with how many labels we use as overloading visual cues can lead to confusion, thwarting any benefit.
The final paper in the session was “Using collaboration to overcome disparities in Java experience”, Colleen Lewis (presenting), Nathaniel Titterton and Michael Clancy. This presented the transformation of a standard course of 3 lecture hours, 2 hours of lab and 1 discussion hour into a 1 x 1 hour lecture with 2 x 3 hour labs, with the labs now holding the core of the pedagogy. Students are provided feedback through targeted tutoring, using on-line multiple choice questions for the students to give feedback and assist the TAs. Pair programming gives you someone to talk to before you talk to the TA, and the TA can monitor the MCQ space and see if everyone is struggling with a particular question.
This addressed a problem in a dual-speed entry course, where some students had AP CS and some didn’t: the second-year course was either a review for those students who had Java (from AP CS) or brand new material. Collaboration and targeted support were aimed at reducing the differences between the cohorts and eliminating disadvantage.
Now, the paper has a lot of detail on the different cohorts, by intake, by gender, by retention pattern, but the upshot is that the introduction of the new program reduced the differences between those students who did and did not have previous Java experience. In other words, whether you started at UCB in CS 1 (with no AP CS) or CS 1.5 (with AP CS), the gap between your cohorts shrank – which is an excellent result. Once this high level of collaboration was introduced, the only factor that retained any significant difference was the first exam, but this effect disappeared throughout the course as students received more exposure to collaboration.
I strongly recommend reading all three of these papers!
The Narrative Hunger: Stories That Meet a Need
Posted: September 15, 2012 Filed under: Education | Tags: authenticity, collaboration, community, curriculum, design, education, educational research, feedback, Generation Why, higher education, in the student's head, learning, principles of design, resources, student perspective, teaching, teaching approaches, tools, universal principles of design

I have been involved in on-line communities for over 20 years now and, apparently, people are rarely surprised when they meet me. “Oh, you talk just like you type” is the effective statement and I’m quite happy with this. While some people adopt completely different personae on-line, for a range of reasons, I seem to be the same. It then comes as little surprise that I am as much of a storyteller in person as I am online. I love facts, revel in truth, but I greatly enjoy putting them together into a narrative that conveys the information in a way that is neither dry nor dull. (This is not to say that the absence of a story guarantees that things must be dry and dull but, without a focus on those elements of narrative that appeal to common human experience, we always risk this outcome.)
One of Katrina’s recent posts referred to the use of story telling in education. As she says, this can be contentious because:
stories can be used to entertain students, to have them enjoy your lectures, but are not necessarily educational.
The shibboleth of questionable educational research is often a vaguely assembled study, supported by the conjecture that the “students loved it”, and it is very easy to see how story telling could fall into this. However, we as humans are fascinated by stories. We understand the common forms even where we have not read Greek drama or “The Hero With a Thousand Faces”. We know when stories ring true and when they fall flat. Searching the mental engines of our species for the sweet spots that resonate across all of us is one way to convey knowledge in a more effective and memorable way. Starting from this focus, we must then observe our due diligence in making sure that our story framework contains a worthy payload.
I love story telling and I try to weave together a narrative in most of my lectures, even down to leaving in sections where deliberate and tangential diversion becomes part of the teaching, to allow me to contrast a point or illuminate it further by stripping it of its formal context and placing it elsewhere. After all, an elephant next to elephants is hardly memorable but an elephant in a green suit, as King of a country, tends to stick in the mind.
The power of the narrative is that it involves the reader or listener in the story. A well-constructed narrative leads the reader to wonder about what is going to happen next and this is model formation. Very few of us read in a way where the story unfolds with us completely distant from it – in fact, maintaining distance from a story is a sign of a poor narrative. When the right story is told, or the right person is telling it, you are on the edge of your seat, hungry to know more. When it is told poorly, then you stifle a yawn and smile politely, discreetly peering at your watch as you attempt to work out the time at which you can escape.
Of course, this highlights the value of narrative for us in teaching, but it also reinforces the requirement that it be more than an assemblage of rambling anecdotes: it must be a constructed narration that weaves through points in a recognisable way and gives us the ability to conjecture on its direction. O. Henry endings, the classic twist endings, make no sense unless you have constructed a mental model that can be shaken by the revelations of the last paragraphs. Harry Potter book 7 makes even less sense unless one has a model of the world in which the events of the book can be situated.
As always, this stresses the importance of educational design, where each story, each fact, each activity is woven into the greater whole with a defined purpose and in full knowledge of how it will be used. There is nothing more distracting than someone who rambles during a lecture about things that not only seem irrelevant, but are irrelevant. By contrast, a musing on something that, at first glance, appears irrelevant can lead to exploration of the narrative by students. Suddenly, they are within a Choose Your Own Adventure book, trying to work out where each step will take them.
Stories are an excellent way to link knowledge and problems. They excite, engage and educate, when used correctly. We are all hungry for stories: we are players within our own stories, observers of those of the people around us and, eventually, will form part of the greater narrative by the deeds for which we are written up in the records to come. It makes sense to use this deep and very human aspect of our intellect to try and assist with the transfer of knowledge.
Our Influence: Prejudice As Predictor
Posted: September 14, 2012 Filed under: Education | Tags: advocacy, authenticity, community, education, educational problem, educational research, higher education, in the student's head, learning, measurement, principles of design, reflection, student perspective, teaching, teaching approaches, thinking

If you want to see Raymond Lister get upset, tell him that students fall into two categories: those who can program and those who can’t. If you’ve been reading much (anything) of what I’ve been writing recently, you’ll realise that I’ve been talking about things like cognitive development, self-regulation and dependence on authority, all of which have one thing in common: students can be at different stages when they reach us. There is no guarantee that students will be self-reliant, cognitively mature and completely capable of making reasoned decisions at the most independent level.
There was a question raised several times during the conference and it’s the antithesis of the infamous “double hump conjecture”, that students divide into two groups naturally and irrevocably because of some innate characteristic. The question is “Do our students demonstrate their proficiency because of what we do or in spite of what we do?” If the innate characteristic conjecture is correct, and this is a frequently raised folk pedagogy, then our role has no real bearing on whether a student will learn to program or not.
If we accept that students come to us at different stages in their development, and that these development stages will completely influence their ability to learn and form mental models, then the innate characteristic hypothesis withers and dies almost immediately. A student who does not have their abilities ready to display can no more demonstrate their ability to program than a three-year old child can write Shakespeare – they are not yet ready to be able to learn, assemble, reassemble or demonstrate the requisite concepts and related skills.
However, a prejudicial perspective that students who cannot demonstrate the requisite ability are innately and permanently lacking that skill will, unpleasantly, viciously and unnecessarily, cause that particular future to lock in. Of course a derisive attitude to these ‘stupid’ or ‘slow’ students will make them withdraw or undermine their confidence! As I will note from the conference, confidence and support have a crucial impact on students. Undermining a student’s confidence is worse than not teaching them at all. Walking in with the mental model that separates the world into programmers and non-programmers forces that model into being.
Since I’ve entered the area of educational research, I’ve been exposed to things that I can separate into the following categories:
- Fascinating knowledge and new views of the world, based on solid research and valid experience.
- Nonsense
- Damned nonsense
- Rank stupidity
Most of the latter come from other educators who react, out of fear or ignorance, to the lessons from educational research with disbelief, derision and resentment. “I don’t care what you say, or what that paper says, you’re wrong,” says the voice of “experience”.
There is no doubt that genuine and thoughtful experience is, has been, and will always be a strong and necessary sibling to the educational and psychological theory that is the foundation of educational research. However, shallow experience can often be built up into something that it is not, when it is combined with fallacious thinking, cherry picking, confirmation bias and any other permutation of fear, resentment and inertia. The influence of folk pedagogies, lessons claimed from tea room mutterings and the projection of a comfortable non-reality that mysteriously never requires the proponent to ever expend any additional effort or change what they do, is a malign shadow over the illumination of good learning and teaching practice.
The best educators explain their successes with solid theory, strive to find a solution to the problems that lead to failure, and listen to all sources in order to construct a better practice and experience for their students. I hope, one day, to achieve this level – but I do know that doubting everything new is not the path forward for me.
I am pleased to say that the knowledge and joy of this (to me) new field far outstrips most of the other things that I have seen, but I cannot stress enough how important it is that we choose our perspectives carefully. We, as educators, have disproportionately high influence: large shadows and big feet. Reading further into this discipline illustrates that we must very carefully consider the way that we think, the way that our students think, and the capability for reasoning and knowledge accumulation that our students actually have, before we make any rash or prejudicial statements about the innate capabilities of that most mythical of entities: the standard student.
ICER 2012 Research Paper Session 2
Posted: September 13, 2012 Filed under: Education | Tags: advocacy, collaboration, community, community sharing resources, education, educational research, higher education, icer, icer 2012, icer2012, teaching, teaching approaches, tools

Ok, true confession time. My (and Katrina’s) paper was in this session and I’ll write this up separately. So this session consisted of “Adapting Disciplinary Commons Model: Lessons and Results from Georgia” (Brianna Morrison, Lijun Ni and Mark Guzdial) and… another paper. 🙂 The goals of the Disciplinary Commons model, as presented, include:
- Documenting and sharing knowledge about student learning in CS classrooms
- Establishing practices for the scholarship of teaching by making it public, peer-reviewed and amenable for public use (the portfolio model)
- Creating community
- Sharing resources and knowledge of how things are taught in other contexts
- Supporting student recruitment within the high school environment
ICER 2012 Research Paper Session 1
Posted: September 13, 2012 Filed under: Education | Tags: curriculum, education, educational research, higher education, icer, icer2012, in the student's head, measurement, teaching, teaching approaches, thinking, tools

It would not be over-stating the situation to say that every paper presented at ICER led to some interesting discussion and, in some cases, some more… directed discussion than others. This session started off with a paper entitled “Threshold Concepts and Threshold Skills in Computing” (Kate Sanders, Jonas Boustedt, Anna Eckerdal, Robert McCartney, Jan Erik Moström, Lynda Thomas and Carol Zander), on whether threshold skills, as distinct from threshold concepts, existed and, if they did, what their characteristics would be. Threshold skills were described as transformative, integrative, troublesome knowledge, semi-irreversible (in that they’re never really lost), and requiring practice to keep current. The discussion that followed raised a lot of questions, including whether you could learn a skill by talking about it or asking someone – skill transfer questions versus environment. The consensus, as I judged it from the discussion, was that threshold skills didn’t follow from threshold concepts, but there was a very rapid and high-level discussion that I didn’t quite follow, so any of the participants should feel free to leap in here!
The next talk was “On the reliability of Classifying Programming Tasks Using a Neo-Piagetian Theory of Cognitive Development” (Richard Gluga, Raymond Lister, Judy Kay, Sabina Kleitman and Donna Teague), where Ray raised and extended a number of the points that he had originally shared with us in the workshop on Sunday. Ray described the talk as being a bit “Neo-Piagetian theory for dummies” (for which I am eternally grateful) and was seeking to address the question as to where students are actually operating when we ask them to undertake tasks that require a reasonable to high level of intellectual development.
Ray raised the three bad programming habits he’d discussed earlier:
- Permutation programming (where students just try small things randomly and iteratively in the hope that they will finally get the right solution – this is incredibly troublesome if the many small changes take you further away from the solution)
- Shotgun debugging (where a bug causes the student to put things in with no systematic approach and potentially fixing things by accident)
- Voodoo coding/Cargo cult coding (where code is added by ritual rather than by understanding)
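As a small invented illustration of the cargo-cult habit (mine, not Ray's): the sort below contributes nothing to a linear scan, but a student might retain it from a half-remembered binary-search example "because it worked last time":

```python
def contains(items, target):
    """Report whether target appears in items."""
    # Cargo-cult line: sorting is unnecessary for a linear scan, but it is
    # kept here by ritual rather than by understanding. The code still
    # 'works', which is exactly why the habit is hard to dislodge.
    items = sorted(items)
    for item in items:
        if item == target:
            return True
    return False
```

The danger is precisely that the ritual line is harmless here, so a passing test reinforces the student's belief that it was needed.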
These approaches show one very important thing: the student doesn’t understand what they’re doing. Why is this? Using a Neo-Piagetian framework we consider the student as moving through the same cognitive development stages that they did as a child (Piagetian) but that this transitional approach applies to new and significant knowledge frameworks, such as learning to program. Until they reach the concrete operational stage of their development, they will be applying poor or inconsistent models – logically inadequate models to use the terminology of the area (assuming that they’ve reached the pre-operational stage). Once a student has made the next step in their development, they will reach the concrete operational stage, characterised (among other things, but these were the ones that Ray mentioned) by:
- Transitivity: being able to recognise how things are organised if you can impose an order upon them.
- Reversibility: that we can reverse changes that we can impose.
- Conservation: realising that the numbers of things stay the same no matter how we organise them.
In coding terms, these can be interpreted in several ways, but the conservation idea is crucial to programming because understanding it frees the student from having to write the same code for the same algorithm every time. Grasping that conservation exists, and understanding it, means that you can alter the code without changing the algorithm that it implements – while achieving some other desirable result such as speeding the code up or moving to a different paradigm.
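A minimal sketch of this idea (my example): the two functions below are different code for the same underlying computation, and a student who grasps conservation sees them as interchangeable rather than as two separate things to memorise:

```python
def total_loop(numbers):
    """Sum a list with an explicit accumulator loop."""
    result = 0
    for n in numbers:
        result += n
    return result

def total_builtin(numbers):
    """Sum the same list with the built-in: different code, same computation."""
    return sum(numbers)
```

A pre-operational student may treat these as unrelated recipes; a concrete operational student can see that swapping one for the other changes the code without changing what is computed.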
Ray’s paper discussed the fact that a vast number of our students are still pre-operational for most of first and second year, which changes the way that we actually try to teach coding. If a student can’t understand what we’re talking about, or has to resort to magical thinking to solve a problem, then we’ve not really achieved our goals. If we do start classifying the programming tasks that we ask students to achieve by the developmental stages that we’re expecting, we may be able to match task to ability, making everyone happy(er).
The final paper in the session was “Social Sensitivity Correlations with the Effectiveness of team Process Performance: An Empirical Study”, (Luisa Bender (presenting), Gursimra Walia, Krishna Kambhampaty, Travis Nygard and Kendall Nygard), which discussed the impact of socially sensitive team members in programming teams. (Social sensitivity is the ability to correctly understand the feelings and the viewpoints of other people.)
The “soft skills” are essential to the teamwork process, and a successful team enhances learning outcomes. Bad teams hinder team formation and progress, and things go downhill from there. In Woolley et al’s study of nearly 700 participants, the collective intelligence of a team stemmed from how well the team worked together rather than from the individual intelligence of the participants. The groups whose members were more socially sensitive had a higher group intelligence.
Just to emphasise that point: a team of smart people may not be as effective as a team of people who can understand the feelings and perspectives of each other. (This may explain a lot!)
Social sensitivity is a good predictor of team performance and the effectiveness of team-oriented processes, as well as the satisfaction of the team members. However, it is also apparent that we in Science, Technology, Engineering and Mathematics (STEM) have lower social sensitivity readings (supporting Baron-Cohen’s assertion – no, not that one) than some other areas. Future work in this area is looking at the impact of a single high or low socially sensitive person in a group, a study that will be of great interest to anyone who is running teams made up of randomly assigned students. How can we construct these groups for the best results for the students?
More MOOCs! (Still writing up ICER, sorry!)
Posted: September 12, 2012 Filed under: Education | Tags: advocacy, community, curriculum, education, educational research, higher education, measurement, moocs, teaching, teaching approaches, tools

The Gates Foundation is offering grants for MOOCs in Introductory Classes. I mentioned in an earlier post that if we can show that MOOCs work, then generally available and cheap teaching delivery is a fantastically transformative technology. You can read the press release, but it’s obvious that this has some key research questions in it, much as we’ve all been raising:
The foundation wants to know, for instance, which students benefit most from MOOC’s (sic) and which kinds of courses translate best to that format.
Yes! If these courses do work then for whom do they work and which courses? There’s little doubt that the Gates have been doing some amazing things with their money and this looks promising – of course, now I have to find out if my University has been invited to join and, if so, how I can get involved. (Of course, if they haven’t, then it’s time to put on my dancing trousers and try to remedy that situation.)
However, money plus research questions is a good direction to go in.
A side post on MOOCs: angrymath Hates Statistics 101
Posted: September 11, 2012 Filed under: Education | Tags: blogging, community, education, educational problem, feedback, higher education, moocs, teaching approaches, udacity

A friend just forwarded me a rather scathing critique of one of the Udacity courses. The rather aptly named angrymath has published Udacity Statistics 101. To forewarn you, this is one of the leading quotes:
In brief, here is my overall assessment: the course is amazingly, shockingly awful.
As one of the commenters put it, hopefully the problems are growing pains and iteration towards perfection will continue. I haven’t seen the course in question so can’t comment, merely present.
ICER 2012 Day 1: Discussion Papers Session 1
Posted: September 11, 2012 Filed under: Education | Tags: blogging, community, education, educational research, higher education, icer, icer2012, measurement, principles of design, student perspective, teaching approaches, tools, universal principles of design

ICER contains a variety of sessions: research papers, discussion papers, lightning talks and elevator pitches. The discussion papers allow people to present ideas and early work in order to get the feedback of the community. This is a very vocal community, so opening yourself up to discussion is going to be a bit like drinking from the firehose: sometimes you quench your thirst for knowledge and sometimes you’re being water-cannoned.
Web-scale Data Gathering with BlueJ
Ian Utting, Neil Brown, Michael Kölling, Davin McCall and Philip Stevens
BlueJ is a very long-lived and widely used Java programming environment with a development environment designed to assist with the learning and teaching of object-oriented programming, as well as Java. The BlueJ project is now adding automated instrumentation to every single BlueJ installation and students can opt-in to a data reporting mechanism that will allow the collection and formation of a giant data repository: Project Blackbox. (As a note, that’s a bit of a super villain name, guys.)
Evaluating an Early Software Engineering Course with Projects and Tools from Open Source Software
Robert McCartney, Swapna Gokhale and Therese Smith
We tend to give Software Engineering students a project that requires them to undertake design and then, as a group, produce a large software artefact from scratch. In this talk, Robert discussed using existing projects that exercise a range of skills directly relevant to one of the most common activities our students will carry out in industry: maintenance and evolution.
Under a model of developing new features in an open-source system, the instructors provide a pre-selected set of projects and then the 2-person team:
- picks a project
- learns to comprehend code
- proposes enhancements
- describes and documents
- implements and presents
A Case Study of Environmental Factors Influencing Teaching Assistant Job Satisfaction
Elizabeth Patitsas
Elizabeth presented some interesting work on the impact of lecture theatres on what our TAs do. If the layout is hard to work with then, unsurprisingly, the TAs are less inclined to walk around and more inclined to disengage, sitting down the front checking e-mail. When we say ‘less inclined’, we mean that in closed lab layouts TAs spend 40% of their time interacting with students, versus 76% in an open layout. However, these effects are also seen in windowless spaces: make a space unpleasant and you reduce the time that people spend answering questions and engaging.
The value of a pair of TAs was stressed: a pair gives you a backup but doesn’t lead to decision problems when coming to consensus. However, the importance of training was also stressed, as already clearly identified in the literature.
Education and Research: Evidence of a Dual Life
Joe Miró Julià, David López and Ricardo Alberich
ICER 2012 Day 1 Keynote: How Are We Thinking?
Posted: September 10, 2012 Filed under: Education | Tags: community, curriculum, education, educational problem, educational research, higher education, icer, icer 2012, in the student's head, reflection, teaching, teaching approaches, thinking, threshold concepts, tools, workload

We started off today with a keynote address from Ed Meyer, from the University of Queensland, on the Threshold Concepts Framework (Also Pedagogy, and Student Learning). I am, regrettably, not as conversant with threshold concepts as I should be, so I’ll try not to embarrass myself too badly. Threshold concepts are central to the mastery of a given subject and are characterised by some key features (Meyer and Land):
- Grasping a threshold concept is transformative because it changes the way that we think about something. These concepts become part of who we are.
- Once you’ve learned the concept, you are very unlikely to forget it – it is irreversible.
- This new concept allows you to make new connections and allows you to link together things that you previously didn’t realise were linked.
- This new concept has boundaries – an area over which it applies. You need to be able to question within that area to work out where it applies. (Ultimately, this may identify the borders between schools of thought in an area.)
- Threshold concepts are ‘troublesome knowledge’. This knowledge can be counter-intuitive, even alien, and will make no sense to people until they grasp the new concept. This is one of the key problems with discussing these concepts with people – they will wish to apply their intuitive understanding, and fighting this tendency may take some considerable effort.
Meyer then discussed how we see with new eyes after we integrate these concepts. It can be argued that concepts such as these give us a new way of seeing that, because of inter-individual differences, students will experience in varying degrees as transformative, integrative, and (look out) provocative and troublesome. For this final one, a student experiences this in many ways: the world doesn’t work as I think it should! I feel lost! Helpless! Angry! Why are you doing this to me?
How do you introduce a student to one of these troublesome concepts and, more importantly, how can you describe what you are going to talk about when the concept itself is alien: what do you put in the course description given that you know that the student is not yet ready to assimilate the concept?
Meyer raised a really good point: how do we get someone to think inside the discipline? Do they understand the concept? Yes. Does this mean that they think along the right lines? Maybe, maybe not. If I don’t think like a Computer Scientist, I may not understand why a CS person sees a certain issue as a problem. We have plenty of evidence that people who haven’t dealt with the threshold concepts in CS Education find it alien to contemplate that the lecture is not the be-all and end-all of teaching – their resistance and reliance upon folk pedagogies is evidence of this wrestling with troublesome knowledge.
A great deal to think about from this talk, especially in treating key aspects of CS education as the threshold concepts that are causing so much trouble, both for our students and for those of our colleagues who are not oriented towards educational research.
ICER 2012: Day 0 (Workshops)
Posted: September 10, 2012 Filed under: Education | Tags: collaboration, community, design, education, educational problem, educational research, feedback, Generation Why, higher education, icer, icer 2012, in the student's head, learning, principles of design, student perspective, teaching, teaching approaches, workload 1 CommentWell, it’s Sunday so it must be New Zealand (or at least it was Sunday yesterday). I attended that rarest of workshops, one where every session was interesting and made me think – a very good sign for the conference to come.
We started with an on-line workshop on Bloom’s taxonomy, classifying exam questions, with Raymond Lister from UTS. One of the best things about this for me was the discussion about the questions where we disagreed: is this application or synthesis? It really made me think about how I write my examinations and how they could be read.
We then segued into a fascinating discussion of neo-Piagetian theory, in which the developmental stages we usually associate with children reappear in adults as they learn new areas of knowledge. In (very rough) detail, we ask whether the learner has enough working memory to carry out a task and, if not, weird things happen.
Students can indulge in some weird behaviours when they don’t understand what’s going on. For example, permutation programming, where they just type semi-randomly until their program compiles or works. Other examples include shotgun debugging and voodoo programming, and what these amount to is the student not having a good, consistent model of what works; as a result, they are basically dabbling in a semi-magical approach.
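To make the distinction concrete, here is a small, entirely hypothetical Python sketch (my example, not one from the workshop) contrasting shotgun tweaking with reasoning from a model of the code:

```python
# A hypothetical illustration of "permutation programming": the student
# mutates the code semi-randomly until the output looks right, without
# a model of why the change works.

def sum_first_n(n):
    """Intended: sum of 1..n. The student's first attempt is off by one."""
    total = 0
    for i in range(n):  # bug: i takes the values 0..n-1, not 1..n
        total += i
    return total

# A shotgun sequence a confused student might try, one random tweak at a
# time: range(n + 1)?  total += i + 1?  start total at 1?  ...
# A student with a consistent model instead reasons about the loop
# bounds directly:

def sum_first_n_fixed(n):
    """Sum of 1..n, with the bounds reasoned out rather than guessed."""
    total = 0
    for i in range(1, n + 1):  # i takes the values 1..n inclusive
        total += i
    return total

print(sum_first_n(5))        # the buggy version sums 0..4
print(sum_first_n_fixed(5))  # the reasoned version sums 1..5
```

The point is not the bug itself but the process: the reasoned fix comes from a model of what `range` does, while the shotgun fixes are indistinguishable from magic even when one happens to work.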
My notes from the session contain this following excerpt:
“Bizarro” novice programmer behaviours are actually normal stages of intellectual development. Accept this and then work with this to find ways of moving students from pre-op, to concrete op, to formal operational. Don’t forget the evaluation. Must scaffold this process!
What this translates to is that the strange things we see are just indications that students have not yet moved to what we would normally associate with an ‘adult’ (formal operational) understanding of the area. This shoots several holes in the old “You’re born a programmer” fallacy. Those students who are more able early may just have moved through the stages more quickly.
There was also an amount of derisive description of folk pedagogy: those theories that arise during pontification in the tea room, with no basis in educational theory and no grounding in genuinely empirical study. Yet these folk pedagogies are very hard to shake and are among the most frustrating things to deal with if you work in educational research. One “I don’t think so” can apparently wipe out the 70 years since Dewey called classrooms prisons.
The worst thought is that, if we’re not trying to help students make the transition, then maybe the move to concrete operations is happening despite us rather than because of us – a sobering prospect.
I thought that Ray Lister finished the session with a really good thought about why students sometimes struggle:
The problem is not a student’s swimming skill, it’s the strength of the torrent.
As I’ve said before, making hard things easier to understand is part of the job of the educator. Anyone will fail, regardless of their ability, if we make it hard enough for them.