A Year of Beauty

Plato: Unifying key cosmic values of Greek culture to a useful conceptual trinity.

Ever since education became something we discussed, teachers and learners alike have had strong opinions regarding the quality of education and how it can be improved. What is surprising, as you look at these discussions over time, is how often we seem to come back to the same ideas. We read Dewey and we hear echoes of Rousseau. So many echoes and so much careful thought, found as we built new modern frames with Vygotsky, Piaget, Montessori, Papert and so many more. But little of this should really be a surprise because we can go back to the writings of Marcus Fabius Quintilianus (Quintilian) and his twelve books of The Orator’s Education and we find discussion of small class sizes, constructive student-focused discussions, and the argument that more people were capable of thought and far-reaching intellectual pursuits than was popularly believed.

“… as birds are born for flying, horses for speed, beasts of prey for ferocity, so are [humans] for mental activity and resourcefulness.” Quintilian, Book I, page 65.

I used to say that it was stunning how contemporary education seemed to be slow in moving in directions first suggested by Dewey a hundred years ago, then I discovered that Rousseau had said it 150 years before that. Now I find that Quintilian wrote things such as this nearly 2,000 years ago. And Marcus Aurelius, among other Stoics, made much of approaches to thinking that, somehow, were put to one side as we industrialised education much as we had industrialised everything else.

This year I have accepted that we have had 2,000 years of thinking (and as much evidence, when we are bold enough to experiment) and yet we just have not seen enough change. Dewey’s critique of the University is still valid. Rousseau’s lament on attaining true mastery of knowledge stands. Quintilian’s distrust of mere imitation would not be quieted by much of modern repetitive examination practice.

What stops us from changing? We have more than enough evidence of discussion and thought, from some of the greatest philosophers we have seen. When we start looking at education, in varying forms, we wander across Plato, Hypatia, Hegel, Kant, Nietzsche, in addition to all of those I have already mentioned. But evidence, as it stands, does not appear to be enough, especially in the face of personal perception of achievement, contribution and outcomes, whether supported by facts or not.

Evidence of uncertainty is not enough. Evidence of the lack of efficacy of techniques, now that we can and do measure them, is not enough. Evidence that students who fail under one tutor or approach then mysteriously flourish under another is not enough.

Authority, by itself, is not enough. We can be told to do more or to do things differently but the research we have suggests that an externally applied control mechanism just doesn’t work very well for areas where thinking is required. And thinking is, most definitely, required for education.

I have already commented elsewhere on Mark Guzdial’s post that attracted so much attention and, yet, all he was saying was what we have seen repeated throughout history and is now supported in this ‘gilt age’ of measurement of efficacy. It still took local authority to stop people piling onto him (even under the rather shabby cloak of ‘scientific enquiry’ that masks so much negative activity). Mark is repeating the words of educators throughout the ages who have stepped back and asked “Is what we are doing the best thing we could be doing?” It is human to say “But, if I know that this is the evidence, why am I acting as if it were not true?” But it is quite clear that this is still challenging and, amazingly, heretical to an extent, despite these (apparently controversial) ideas pre-dating most of what we know as the trappings and establishments of education. Here is our evidence that evidence is not enough. This experience also shows that, while authority can halt a debate, authority cannot force people to alter such a deeply complex and cognitive practice in a useful manner. Nobody is necessarily agreeing with Mark, they’re just no longer arguing. That’s not helpful.

So, where to from here?

We should not throw out everything old simply because it is old, as that is meaningless without evidence, just as it is wrong to autocratically reject everything new simply because it is new.

The challenge is to find a way of explaining how things could change without forcing conflict between evidence and personal experience and without having to resort to an argument by authority, whether moral or experiential. And this is a massive challenge.

This year, I looked back to find other ways forward. I looked back to the three values of Ancient Greece, brought together as a trinity through Socrates and Plato.

These three values are: beauty, goodness and truth. Here, truth means seeing things as they are (non-concealment). Goodness denotes the excellence of something and often refers to a purpose or meaning for existence, in the sense of a good life. Beauty? Beauty is an aesthetic delight; pleasing to those senses that value certain criteria. It does not merely mean pretty, as we can have many ways that something is aesthetically pleasing. For Dewey, equality of access was an essential criterion of education; education could only be beautiful to Dewey if it was free and easily available. For Plato, the revelation of knowledge was good and beauty could arouse a love for this knowledge that would lead to such a good. By revealing good, and reality, to ourselves and our world, we are ultimately seeking truth: seeing the world as it really is.

In the Platonic ideal, a beautiful education leads us to fall in love with learning and gives us momentum to strive for good, which will lead us to truth. Is there any better expression of what we all would really want to see in our classrooms?

I can speak of efficiencies of education, of retention rates and average grades. Or I can ask you if something is beautiful. We may not all agree on details of constructivist theory but if we can discuss those characteristics that we can maximise to lead towards a beautiful outcome, aesthetics, perhaps we can understand where we differ and, even more optimistically, move towards agreement. Towards beautiful educational practice. Towards a system and methodology that makes our students as excited about learning as we are about teaching. Let me illustrate.

A teacher stands in front of a class, delivering the same lecture that has been delivered for the last ten years. From the same book. The classroom is half-empty. There’s an assignment due tomorrow morning. Same assignment as the last three years. The teacher knows roughly how many people will ask for an extension an hour beforehand, how many will hand up and how many will cheat.

I can talk about evidence, about pedagogy, about political and class theory, about all forms of authority, or I can ask you, in the privacy of your head, to think about these questions.

  • Is this beautiful? Which of the aesthetics of education are really being satisfied here?
  • Is it good? Is this going to lead to the outcomes that you want for all of the students in the class?
  • Is it true? Is this really the way that your students will be applying this knowledge, developing it, exploring it and taking it further, to hand on to other people?
  • And now, having thought about yourself, what do you think your students would say? Would they think this was beautiful, once you explained what you meant?

Over the coming year, I will be writing a lot more on this. I know that this idea is not unique (Dewey wrote on this, to an extent, and, more recently, several books in the dramatic arts have taken up the case of beauty and education) but it is one that we do not often address in science and engineering.

My challenge, for 2016, is to try to provide a year of beautiful education. Succeed or fail, I will document it here.


Teaching in Hong Kong, Day 10

Friday the 18th was a quieter day after such a busy week but I’m very glad to say that the students had not only looked at the podcasts but they had done the pre-quiz prior to attending! So we were able to go through two or three topics and discuss them in lectures, getting students to frame the ideas and then explore how correct they were, combining mental priming and good ol’ Vygotsky to drive understanding. Still, everyone was looking a little frazzled by the end of it and I was glad that we had rearranged the timetable a bit.

The quick quiz had 10 questions in it, versus the 5 from the previous one. As it turns out, later feedback indicated that this was preferred by some students because you could be a bit wrong and not lose so many marks – it’s easy to forget that 2 marks out of 5 can be high stakes to some people, so this was a pertinent reminder.

We finished up with the quiz and then I briefed all the students on their requirements for the next week. Rather than have a traditional lecture set on the Monday, they would do the podcasts and quizzes over the weekend, as they finished their assignments, and we would then go through that material in class as a tutorial/Socratic dialogue on the Tuesday. This gave them time to step back, work on their assignments, revise for Monday’s short-answer exam and also gave us time for more visits on Monday!

That was it for week 2! Come back soon for what happened in Week 3!


ITiCSE 2014, Day 3, Final Session, “CS Ed Research”, #ITiCSE2014 #ITiCSE

The first paper, in the final session, was the “Effect of a 2-week Scratch Intervention in CS1 on Learners with Varying Prior Knowledge”, presented by Shitanshu Mirha, from IIT Bombay. The CS1 course context is a single programming course for all freshman engineering students, thus it has to work for novice and advanced learners. It’s the usual problem: novices get daunted and advanced learners get bored. (We had this problem in the past.) The proposed solution is to use Scratch, because it’s low-floor (easy to get started), high-ceiling (can build complex projects) and wide-walls (applies to a wide variety of topics and themes). Thus it should work for both novice and advanced learners.

The theoretical underpinning is that novice learners reach cognitive overload while trying to learn techniques for programming and a language at the same time. One way to reduce cognitive load is to use visual programming environments such as Scratch. For advanced learners, Scratch can provide a sufficiently challenging set of learning material. From the perspective of Flow theory, students need to reach equilibrium between challenge level and perceived skill.

The research goal was to investigate the impact of a two-week intervention in a college course that will transition to C++. What would novices learn in terms of concepts and C++ transition? What would advanced students learn? What was the overall impact on students?

The cohort was 450 students, no CS majors, with a variety of advanced and novice learners, with a course objective of teaching programming in C++ across 14 weeks. The Scratch intervention took place over the first four weeks in terms of teaching and assessment. Novice scaffolding was achieved by ramping up over the teaching time. Engagement for advanced learners was achieved by starting the project early (second week). Students were assessed by quizzes, midterms and project production, with very high quality projects being demonstrated as Hall of Fame projects.

Students were also asked to generate questions on what they learned and these could be used for other students to practice with. A survey was given to determine student perception of usefulness of the Scratch approach.

The results for Novices were presented. While the Novices were able to catch up in basic Scratch comprehension (predict output and debug code), this didn’t translate into writing code in Scratch or debugging programs in C++. For question generation, Novices were comparable to advanced learners in terms of the number of questions generated on sequences, conditionals and data. For threads, events and operators, Novices generated more questions – although I’m not sure I see the link that demonstrates that they definitely understood the material. Unsurprisingly, given the code-writing results, Novices were weaker in loops and similar programming constructs. More than 53% of Novices thought the Scratch framing was useful.

In terms of Advanced learner engagement, there were more Advanced projects generated. Unsurprisingly, Advanced projects were far more complicated. (I missed something about Most-Loved projects here. Clarification in the comments please!) I don’t really see how this measures engagement – it may just be measuring the greater experience.

Summarising, Scratch seemed to help Novices but not with actual coding or working with C++, but it was useful for basic concepts. The author claims that the larger complexity of Advanced user projects shows increased engagement but I don’t believe that they’ve presented enough here to show that. The sting in the tail is that the Scratch intervention did not help the Novices catch up to the Advanced users for the type of programming questions that they would see in the exam – hence, you really have to question its utility.

The next paper is “Enhancing Syntax Error Messages Appears Ineffectual” presented by Paul Denny, from The University of Auckland. Apparently we could only have one of Paul or Andrew Luxton-Reilly, so it would be churlish to say anything other than hooray for Paul! (Those in the room will understand this. Sorry we missed you, Andrew! Catch up soon.) Paul described this as the least impressive title in the conference but that’s just what science is sometimes.

Java is the teaching language at Auckland, about to switch to Python, which means no fancy IDEs like Scratch or Greenfoot. Paul started by discussing a Java statement with a syntax error in it, which gave two different (but equally unhelpful) error messages for the same error.

if (a < 0) || (a > 100)
  error=true;

// The error is in the top line because there should be surrounding parentheses around conditions
// One compiler will report that a ';' is required at the ||, which doesn't solve the right problem.
// The other compiler says that another if statement is required at the ||
// Both of these are unhelpful - as well as being wrong. It wasn't what we intended.
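
For reference, here is the repaired statement, a minimal sketch of the fix that the comments describe (wrapping the whole condition in an outer pair of parentheses):

if ((a < 0) || (a > 100))
  error = true;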

The conclusion (given early) is simple: enhancing the error messages with a controlled empirical study found no significant effect. This work came from thinking about an early programming exercise that was quite straightforward but seemed to cause students a lot of grief. For those who don’t know, programs won’t run until we fix the structural problems in how we put the program elements together: syntax errors have to be fixed before the program will run. Until the program runs, we get no useful feedback, just (often cryptic) error messages from the compiler. Students will give up if they don’t make progress in a reasonable interval and a lack of feedback is very disheartening.

The hypothesis was that providing more useful error messages for syntax errors would “help” users, help being hard to quantify. These messages should be:

  • useful: simple language, informal language and targeting errors that are common in practice. Also providing example code to guide students.
  • helpful: reduce the number of non-compiling submissions in total, reduce number of consecutive non-compiling submissions AND reduce the number of attempts to resolve a specific error.

In related work, Kummerfeld and Kay (ACE 2003), “The neglected battle fields of Syntax Errors”, provided a web-based reference guide to search for the error text and then get some examples. (These days, we’d probably call this Stack Overflow. 🙂 ) Flowers, Carver and Jackson, 2004, developed Gauntlet to provide more informal error messages with user-friendly feedback and humour. The paper was published in Frontiers in Education, 2004, “Empowering Students and Building Confidence in Novice Programmers Through Gauntlet.” The next aspect of related work was from Tom Schorsch, SIGCSE 1995, with CAP, making specific corrections in an environment. Warren Toomey modified BlueJ to change the error subsystem but there’s no apparent published work on this. The final two were Dy and Rodrigo, Koli Calling 2010, with a detector for non-literal Java errors and Debugging Tutor: Preliminary evaluation, by Carter and Blank, KCSC, January 2014.

The work done by the authors was in CodeWrite (written up in SIGCSE 2011 and ITiCSE 2011, both under Denny et al). All students submit non-compiling code frequently. Maybe better feedback will help and influence existing systems such as Nifty reflections (cloud bat) and CloudCoder. In the study, students had 10 problems they could choose from, with a method, description and return result. The students were split in an A/B test, where half saw the raw feedback and half saw the enhanced message. The team built an error recogniser that analysed over 12,000 submissions with syntax errors from a 2012 course; the raw compiler message identified errors 78% of the time. (“All Syntax Errors are Not Equal”, ITiCSE 2012). In other cases, static analysis was used to work out what the error was. Eventually, 92% of the errors were classifiable from the 2012 dataset. Anything not in that group was shown as a raw error message to the student.
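
To make the mechanism concrete, here is a minimal sketch in Java of the general idea as I understand it (my illustration, not the authors’ actual CodeWrite implementation; the message table and its wording are assumptions): classify the raw compiler message against a set of recognised error classes and substitute a friendlier, example-bearing message, falling back to the raw compiler output for anything unclassified, as the study did.

import java.util.LinkedHashMap;
import java.util.Map;

public class EnhancedErrors {

    // Hypothetical table mapping fragments of raw javac messages to
    // friendlier explanations with example code.
    private static final Map<String, String> ENHANCED = new LinkedHashMap<>();
    static {
        ENHANCED.put("';' expected",
            "A condition may be missing its surrounding parentheses. "
          + "Example: if ((a < 0) || (a > 100)) { ... }");
        ENHANCED.put("cannot find symbol",
            "You are using a name that has not been declared. Check the "
          + "spelling and that the variable is declared before it is used.");
    }

    // Return the enhanced message when the raw one is classifiable;
    // otherwise fall back to the raw compiler message.
    static String enhance(String rawMessage) {
        for (Map.Entry<String, String> entry : ENHANCED.entrySet()) {
            if (rawMessage.contains(entry.getKey())) {
                return entry.getValue();
            }
        }
        return rawMessage;
    }

    public static void main(String[] args) {
        System.out.println(enhance("Foo.java:1: error: ';' expected"));
    }
}

In the study itself the classifier was much richer (92% coverage of the 2012 dataset, built from static analysis as well as the raw message text); the point of the sketch is only the classify-or-fall-through behaviour behind the A/B condition.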

In the randomised controlled experiment, 83 students had to complete the 10 exercises (worth 1% each), using the measures of:

  • number of consecutive non-compiling submissions for each exercise
  • Total number of non-compiling submissions
  • … and others.

Do students even read the error messages? This would explain the lack of impact. However, examining student code changes, there does appear to be a response to the error messages received, although this can be a slow and piecemeal process. There was a difference between the groups, a 17% reduction in non-compiling submissions, but it was not statistically significant.

I find this very interesting because the lack of significance is slightly unexpected, given that increased expressiveness and ease of reading should make it easier for people to find errors, especially with the provision of examples. I’m not sure that this is the last word on this (and I’m certainly not saying the authors are wrong because this work is very rigorous) but I wonder what we could be measuring to nail this one down into the coffin.

The final talk was “A Qualitative Think-Aloud Study of Novice Programmers’ Code Writing Strategies”, which was presented by Tony Clear, on behalf of the authors. The aim of the work was to move beyond the notion of levels of development and attempt to explore the process of learning, building on the notion of schemas and plans. Assimilation (using existing schemas to understand new information) and accommodation  (new information won’t fit so we change our schema) are common themes in psychology of learning.

We’re really not sure how novice programmers construct new knowledge and we don’t fully understand the cognitive process. We do know that learning to program is often perceived as hard. (Shh, don’t tell anyone.) At early stages, novice programmers have very few schemas to draw on, their knowledge is fragile and the cognitive load is very high.

Woohoo, Vygotsky reference to the Zone of Proximal Development – there are things students know, things they can learn with help, and then the stuff beyond that. Perkins talked about attitudinal factors – movers, tinkerers and stoppers. Stoppers stop and give up in the face of difficulty, tinkerers fiddle until it works and movers actually make good progress and know what’s going on. The final aspect of methodology was inductive theory construction, which I’ll let you look up.

Think-aloud protocol requires the student to clearly vocalise what they are thinking about as they complete computation tasks on a computer, using retrospective interviews to address those points in the videos where silence, incomprehensibility or confused articulation made interpreting the result impossible. The scaffolding involved tutoring, task performance and follow-up. The programming tasks were in a virtual world-based programming environment, solving tasks of increasing difficulty.

How did they progress? Jacquie uses the term redirection to mean that the student has been directed to re-examine their work, but is not given any additional information. They’re just asked to reconsider what they’ve done. Some students may need a spur and then they’re fine. We saw some examples of students showing their different progression through the course.

Jacquie has added a new category, PLANNERS, which indicates that we can go beyond the Movers to explain the kind of behaviour we see in advanced students in the top quartile. Movers who stretch themselves can become planners if they can make it into the Zone of Proximal Development and, with assistance, develop their knowledge beyond what they’d be capable of by themselves. The More Competent Other plays a significant role in helping people to move up to the next level.

Full marks to Tony. Presenting someone else’s work is very challenging and you’d have to be a seasoned traveller to even reasonably consider it! (It was very nice to see the lead author recognising that in the final slide!)



CSEDU, Day 3, Final Keynote, “Digital Age Learning – The Changing Face of Online Education”, (#csedu14 #AdelED @timbuckteeth)

Now, I should warn you all that I’ve been spending time with Steve Wheeler (@timbuckteeth) and we agree on many things, so I’m either going to be in furious agreement with him or I will be in shock because he suddenly reveals himself to be a stern traditionalist who thinks blended learning is putting a textbook in the Magimix. Only time will tell, dear reader, so let’s crack on, shall we? Steve is from the Plymouth Institute of Education, conveniently located in Plymouth University, and is a ferocious blogger and tweeter (see his handle above).

Erik introduced Steve by saying that Steve didn’t need much introduction and noted that Steve was probably one of the reasons that we had so many people here on the last day! (This is probably true, the afternoon on the last day of a European conference is normally notable due to the almost negative number of participants.)

“When you’re a distance educator, the back of the classroom can be thousands of miles away.” (Steve Wheeler)

Steve started with the idea that on-line learning is changing and that his presentation was going to be based on the idea that the future will be richly social and intensely personal. Paradoxical? Possibly but let’s find out. Oh, look, an Einstein quote – we should have had Einstein bingo cards. It’s a good one and it came with an anecdote (which was a little Upstairs Downstairs) so I shall reproduce it here.

“I never teach my students. I only provide the conditions in which they can learn.” Albert Einstein

There are two types of learning: shallow (rote) learning that we see when cramming, where understanding is negligible or shallow if there at all, and then there is fluid intelligence, the deeper kind of learning that draws on your previous learning and your knowledge structures. But what about strategic learning, where we switch quickly between the two? Poor pedagogy can suppress these transitions and lock people into one spot.

There are three approaches here: knowledge (knowing that, which is declarative), wisdom (knowing how, which is procedural) and transformation (knowing why, which is critical). I’ve written whole papers about the missing critical layer so I’m very happy to see Steve saying that the critical layer is the one that we often do the worst with. This ties back into Bloom’s taxonomy, where knowledge is cognitive, wisdom is application and transformation is analysis and evaluation. Learning can be messy but it’s transformative and it can be intrinsically hard to define. Learning is many things – sorry, Steve, not going to summarise that whole sentence.

We want to move through to the transformational stage of learning.

What was the first attempt at distance learning? St Paul’s name was tossed out, as was Moses’. St Paul’s epistles were noted as the first correspondence course offered. (What was the assessment model, I wonder, for Epistola?) More seriously, it was highly didactic and one-way, and it was Pitman who established a two-way correspondence course that was both laborious and asynchronous, but it worked. Then we had television and, in 1968, the Stanford Instructional Television Network popped up. In 1970, Steve saw an example of video conferencing that had previously been confined to Star Trek. I was around in the early 70s and we were all agog about the potential of the future – where is my moon base, by the way? But the tools were big and bulky – old video cameras were incredibly large and ridiculously short-lived in their battery life… but it worked! Then people saw uses for the relationship between this new technology and pedagogy. Reel-to-reel, copiers, projectors, videos: all of these technologies were effective for their teaching uses at the time.

Of course, we moved on to computer technology including the BBC Model B (hooray!) and the reliable but hellishly noisy dot matrix printer. The learning from these systems was very instructional, using text and a very simplistic multiple-choice question approach. Highly behaviouristic, but this is how things were done and the teaching approach matched the technology. Now, of course, we’ve gone to tablet-based, on-line gaming environments with non-touch technologies such as Kinect, but the principle remains the same: over the years we’ve adapted technology to pedagogy.

But it’s only now, after Sir Tim Berners-Lee gave us the World Wide Web, that on-line learning is available to everybody, where before it was sort-of available but not anywhere near as multiplicable. Now, for our sins, we have Learning Management Systems, the most mixed of blessings, and we still have to ask what are we using them for, how are we using them? Is our pedagogy changing? Is our connection with our students changing? Illich (1972) criticised educational funnels that had a one-directional approach and instead advocated educational webs that allow the transformation of each moment of living into one of learning, sharing and caring.

What about the Personal Learning Environment (PLE)? This is the interaction of tools such as blogs, twitters and e-Portfolios, then add in the people we interact with, and then the other tools that we use – and this would be strongly personal to an individual. If you’ve ever tried to use your partner’s iPad, you know how quickly personalisation changes your perception of a tool! Wheeler and Malik (2010) discuss the PLE that comprises the personal learning network and personal web tools, with an eye on more than the classroom, but as a part of life-long learning. Steve notes (as Stephen Heppell did) that you may as well get students to use their PLEs in the open because they’ll be using them covertly otherwise: the dreaded phone under the table becomes a learning tool when it’s on top of the table. Steve discussed the embedded MOOC that Hugh discussed yesterday, to see how on-line and f2f students can benefit from each other.

In the late ’80s, the future was “multi-media” and everything had every other medium jammed into it (and they don’t like it up ’em) and then the future was going to converge on the web. Internet take up is increasing: social, political and economic systems change incrementally, but technology changes exponentially. Steve thinks the future is smart mobile and pervasive, due to miniaturisation and capability of new devices. If you have WiFi then you have the world.

“Change is not linear, it’s exponential.” Kurzweil

Looking at the data, there are now more people in the world with mobile phones than people without, although some people have more than one. (Someone in the audience had four, perhaps he was a Telco?) Of course, some reasons for this are because mobile phones replace infrastructure: there are entire African banks that run over mobile networks, as an example. Given that we always have a computer in our pocket, how can we promote learning everywhere? We are using these all the time, everywhere, and this changes what we can do because we can mix leisure and learning without having to move to fixed spaces.

Steve then displayed the Intel infographic “What Happens In an Internet Minute”, but it’s scary to see how much paper is lagging these days. What will the future look like? What will future learning look like? If we think exponentially then things are changing fast. There is so much content being generated, there must be something that we can use (DOGE photos and Justin Bieber videos excepted) for our teaching and learning. But, given that 70% of what we learn is informal and outside of the institution, this is great! But we need to be able to capture this and this means that we should produce a personal learning network, because trying to drink down all that content by yourself is beyond our ability! By building a network, we build a collection of filters and aggregators that are going to help us to bring sense out of the chaos. Given that nobody can learn everything, we can store our knowledge in other people and know where to go when we need that knowledge. This is a plank of connectivist theory, leading into paragogy, where we learn from each other. This also leads us to distributed cognition, where we think across the group (a hive mind, if you will) but, more simply, you learn from one person, then another, and it becomes highly social.

Steve showed us a video on “How have you used your own technology to enhance your learning”, which you can watch on YouTube. Lucky old 21st Century you! This is a recording of some of Steve’s students answering the question and sharing their personal learning networks with us. There’s an interesting range of ideas and technologies in use so it’s well worth a look. Steve runs a Twitter wall in his classroom and advertises the hashtag for a given session so questions, challenges and comments go out on to that board and that allows Steve to see it but also retweet it to his followers, to allow the exponential explosion that we would want in a personal learning network. Students succeed when they harness the tools they need to solve their problems.

Steve showed us a picture of about 10,000 Germans taking pictures of the then-presidential candidate Barack Obama because he was speaking in Berlin and it was a historical moment that people wanted to share with other people. This is an example of the ubiquitous connection that we now enjoy and, in many ways, take for granted. It is a new way of thinking and it causes a lot of concern for people who want to stick to previous methods. (There will come a time when a paper exam for memorised definitions will make no sense because people have computers connected to their eyes – so let’s look at asking questions in ways that always require people to actually use their brains, shall we?) Steve then showed us a picture of students “taking notes” by taking pictures of the whiteboard: something that we are all very accustomed to now. Yes, some teachers are bothered by this but why? What is wrong with instantaneous capture versus turning a student into a slow organic photocopying machine? Let’s go to a Papert quote!

“I am convinced that the best learning takes place when the learner takes charge.” Seymour Papert

“We learn by doing.” Piaget, 1960

“We learn by making.” Papert, 1960.

Steve alluded to constructionist theory and pointed out how much we have to learn about learning by making. He, like many of us, doesn’t subscribe to generational or digital native/immigrant theory. It’s an easy way of thinking but it really gets in the way, especially when it makes teachers fearful of weighing in because they feel that their students know more than they do. Yes, they might, but there is no grand generational guarantee. It’s not about your age, it’s about your context. It’s about how we use the technology, it’s not about who we are and some immutable characteristics that define us as in or out. (WTF does not, for the record, mean “Welcome to Facebook”. Sorry, people.) There will be cultural differences but we are, very much, all in this together.

Steve showed us a second video, on the Future of Publishing, which you can watch again! Some of you will find it confronting that Gaga beats Gandhi but cultures change and evolve and you need to watch to the end of the video because it’s really rather clever. Don’t stop halfway through! As Steve notes, it’s about perception and, as I’ve noted before, I’m pretty sure that people put people into the categories that they were already thinking about – it’s one of the reasons I have such a strong interest in grounded theory. If you have a “Young bad” idea in your head then everything you see will tend to confirm this. Perception and preconception can heavily interfere with each other but using perception, and being open to change, is almost always a better idea.

Steve talked about Csíkszentmihályi’s Flow, the zone you’re in when the level of challenge roughly matches your level of skill and you balance anxiety and boredom. Then, for maximum Nick points, he got onto Vygotsky’s Zone of Proximal Development, where we build knowledge better and make leaps when we do it with other people, using the knowledgeable other to scaffold the learning. Steve also talked about mashing them up, and I draw the reader back to something I wrote on this a while ago on Repenning’s work.

We can do a lot of things with computers but we don’t have to do all the things that we used to do and slavishly translate them across to the new platform. Waters (2011) talks about new learners: learners who are more self-directed and able to make more and hence learn more.

There are many digital literacies: social networking, privacy management, identity management, creating content, organising content, reusing and repurposing, filtering and selection, self presentation, transliteracy (using any platform to get your ideas across). We build skills, that become competencies, that become literacies and, finally, potentially become masteries.

Steve finished by discussing the transportability of skills, using driving in the UK and the US as an example. The skill is pretty much the same but safe driving requires a new literacy when you make a large contextual change. Digital environments can be alien environments so you need to be able to take the skills that you have now and be able to put them into the new contexts. How do you know that THIS IS SHOUTING? It’s a digital literacy.

Steve presented a quote from Socrates, no, Socrates, no, Plato:

Knowledge that is acquired under compulsion obtains no hold on the mind.

and used the rather delightful neologism “Darwikianism” to illustrate the evolving improvement of on-line materials over time. (And illustrated it with humour and pictures.) Great talk with a lot of content! Now I have to go and work on my personal learning network!

This is not actually Socrates. Sorry!


Thanks for the exam – now I can’t help you.

I have just finished marking a pile of examinations from a course that I co-taught recently. I haven’t finalised the marks but, overall, I’m not unhappy with the majority of the results. Interestingly, and not overly surprisingly, one of the best answered sections of the exam was based on a challenging essay question I set as an assignment. The question spans many aspects of the course and requires the student to think about their answer and link the knowledge – which most did very well. As I said, not a surprise but a good reinforcement that you don’t have to drill students in what to say in the exam, but covering the requisite knowledge and practising the right skills is often helpful.

However, I don’t much like marking exams and it doesn’t come down to the time involved, the generally dull nature of the task or the repetitive strain injury from wielding a red pen in anger, it comes down to the fact that, most of the time, I am marking the student’s work at a time when I can no longer help him or her. Like most exams at my Uni, this was the terminal examination for the course, worth a substantial amount of the final marks, and was taken some weeks after teaching finished. So what this means is that any areas I identify for a given student cannot now be corrected, unless the student chooses to read my notes in the exam paper or come to see me. (Given that this campus is international, that’s trickier but not impossible thanks to the Wonders of Skypenology.) It took me a long time to work out exactly why I didn’t like marking, but when I did, the answer was obvious.

I was frustrated that I couldn’t actually do my job at one of the most important points: when lack of comprehension is clearly identified. If I ask someone a question in the classroom, on-line or wherever, and they give me an answer that’s not quite right, or right off base, then we can talk about it and I can correct the misunderstanding. My job, after all, is not actually passing or failing students – it’s about knowledge, the conveyance, construction and quality management thereof. My frustration during exam marking increases with every incomplete or incorrect answer I read, which illustrates that there is a section of the course that someone didn’t get. I get up in the morning with the clear intention of being helpful towards students and, when it really matters, all I can do is mark up bits of paper in red ink.

Quickly, Jones! Construct a valid knowledge framework! You’re in a group environment! Vygotsky, man, Vygotsky!

A student who, despite my sweeping, and seeping, liquid red ink of doom, manages to get a 50 Passing grade will not do the course again – yet this mark pretty clearly indicates that roughly half of the comprehension or participation required was not carried out to the required standard. Miraculously, it doesn’t matter which half of the course the student ‘gets’, they are still deemed to have attained the knowledge. (An interesting point to ponder, especially when you consider that my colleagues in Medicine define a Pass at a much higher level and in far more complicated ways than a numerical 50%, to my eternal peace of mind when I visit a doctor!) Yet their exam will still probably have caused me at least some gnashing of teeth because of points missed, pointless misstatement of the question text, obscure song lyrics, apologies for lack of preparation and the occasional actual fact that has peregrinated from the place where it could have attained marks to a place where it will be left out in the desert to die, bereft of the life-giving context that would save it from such an awful fate.

Should we move the exams earlier and then use this to guide the focus areas for assessment in order to determine the most improvement and develop knowledge in the areas in most need? Should we abandon exams entirely and move to a continuous-assessment competency based system, where there are skills and knowledge that must be demonstrated correctly and are practised until this is achieved? We are suffering, as so many people have observed before, from overloading the requirement to grade and classify our students into neatly discretised performance boxes onto a system that ultimately seeks to identify whether these students have achieved the knowledge levels necessary to be deemed to have achieved the course objectives. Should we separate competency and performance completely? I have sketchy ideas as to how this might work but none that survive under the blow-torches of GPA requirements and resource constraints.

Obviously, continuous assessment (practicals, reports, quizzes and so on) throughout the semester provide a very valuable way to identify problems but this requires good, and thorough, course design and an awareness that this is your intent. Are we premature in treating the exam as a closing-off line on the course? Do we work on that the same way that we do any assignment? You get feedback, a mark and then more work to follow-up? If we threw resourcing to the wind, could we have a 1-2 week intensive pre-semester program that specifically addressed those issues that students failed to grasp on their first pass? Congratulations, you got 80%, but that means that there’s 20% of the course that we need to clarify? (Those who got 100% I’ll pay to come back and tutor, because I like to keep cohorts together and I doubt I’ll need to do that very often.)

There are no easy answers here and shooting down these situations is very much in the fish/barrel plane, I realise, but it is a very deeply felt form of frustration that I am seeing the most work that any student is likely to put in but I cannot now fix the problems that I see. All I can do is mark it in red ink with an annotation that the vast majority will never see (unless they receive the grade of 44, 49, 64, 74 or 84, which are all threshold-1 markers for us).

Ah well, I hope to have more time in 2013 so maybe I can mull on this some more and come up with something that is better but still workable.


Thinking about teaching spaces: if you’re a lecturer, shouldn’t you be lecturing?

I was reading a comment on a philosophical post the other day and someone wrote this rather snarky line:

He is a philosopher in the same way that (celebrity historian) is a historian – he’s somehow got the job description and uses it to repeat the prejudices of his paymasters, flattering them into thinking that what they believe isn’t, somehow, ludicrous. (Grangousier, Metafilter article 123174)

Rather harsh words in many respects and it’s my alteration of the (celebrity historian)’s name, not his, as I feel that his comments are mildly unfair. However, the point is interesting, as a reflection upon the importance of job title in our society, especially when it comes to the weighted authority of your words. From January the 1st, I will be a senior lecturer at an Australian University and that is perceived differently where I am. If I am in the US, I reinterpret this title into their system, namely as a tenured Associate Professor, because that’s the equivalent of what I am – the term ‘lecturer’ doesn’t clearly translate without causing problems, not even dealing with the fact that more lecturers in Australia have PhDs, where many lecturers in the US do not. But this post isn’t about how people necessarily see our job descriptions, it’s very much about how we use them.

In many respects, the title ‘lecturer’ is rather confusing because it appears, like builder, nurse or pilot, to contain the verb of one’s practice. One of the big changes in education has been the steady acceptance of constructivism, where the learners have an active role in the construction of knowledge and we are facilitating learning, in many ways, to a greater extent than we are teaching. This does not mean that teachers shouldn’t teach, because this is far more generic than the binding of lecturers to lecturing, but it does challenge the mental image that pops up when we think about teaching.

If I asked you to visualise a classroom situation, what would you think of? What facilities are there? Where are the students? Where is the teacher? What resources are around the room, on the desks, on the walls? How big is it?

Take a minute to do just this and make some brief notes as to what was in there. Then come back here.

It’s okay, I’ll still be here!



Vitamin Ed: Can It Be Extracted?

Mmm. Taste the learnination.

There are a couple of ways to enjoy a healthy, balanced diet. The first is to actually eat a healthy, balanced diet made up from fresh produce across the range of sources, which requires you to prepare and cook foods, often changing how you eat depending on the season to maximise the benefit. The second is to eat whatever you dang well like and then use an array of supplements, vitamins, treatments and snake oil to try and beat your diet of monster burgers and gorilla dogs into something that will not kill you in 20 years. If you’ve ever bothered to look on the side of those supplements, vitamins, minerals or whatever, that most people have in their ‘medicine’ cabinets, you might see statements like “does not substitute for a balanced diet” or nice disclaimers like that. There is, of course, a reason for that. While we can be fairly certain about a range of deficiency disorders in humans, and we can prevent these problems with selective replacement, many other conditions are not as clear cut – if you eat a range of produce which contains the things that we know we need, you’re probably getting a slew of things that we also need but don’t make themselves as prominent.

In terms of our diet, while the debate rages about precisely which diet humans should be eating, we can have a fairly good stab at a sound basis from a dietician’s perspective built out of actual food. Recreating that from raw sugars, protein, vitamin and mineral supplements is technically possible but (a) much harder to manage and (b) nowhere near as satisfying as eating the real food, in most cases. Let’s not forget that very few of us in the western world are so distant from our food that we regard it purely as fuel, with no regard for its presentation, flavour or appeal. In fact, most of us could muster a grimace at the thought of someone telling us to eat something because it was good for us or for some real or imagined medical benefit. In terms of human nutrition, we have the known components that we have to eat (sugars, proteins, fats…) and we can identify specific vitamins and minerals that we need to balance to enjoy good health, yet there is no shortage of additional supplements that we also take out of concern for our health that may have little or no demonstrated benefit, yet still we take them.

There’s been a lot of work done in trying to establish an evidence base for medical supplements and far more of the supplements fail than pass this test. Willow bark, an old remedy for pain relief, has been found to have a reliable effect because it has a chemical basis for working – evidence demonstrated that and now we have aspirin. Homeopathic memory water? There’s no reliable evidence for this working. Does this mean it won’t work? Well, here we get into the placebo effect and this is where things get really complicated because we now have the notion that we have a set of replacements that will work for our diet or health because they contain useful chemicals, and a set of solutions that work because we believe in them.

When we look at education, where it’s successful, we see a lot of techniques being mixed in together in a ‘natural’ diet of knowledge construction and learning. Face-to-face and teamwork, sitting side-by-side with formative and summative assessment, as part of discussions or ongoing dialogues, whether physical or on-line. Exactly which parts of these constitute the “balanced” educational diet? We already know that a lecture, by itself, is not a complete educational experience, in the same way that a stand-alone multiple-choice question test will not make you a scholar. There is a great deal of work being done to establish an evidence basis for exactly which bits work but, as MIT said in the OCW release, these components do not make up a course. In dietary terms, it might be raw fuel but is it a desirable meal? Not yet, most likely.

Now let’s get into the placebo side of the equation, where students may react positively to something just because it’s a change, not because it’s necessarily a good change. We can control for these effects, if we’re cautious, and we can do it with full knowledge of the students but I’m very wary of any dependency upon the placebo effect, especially when it’s prefaced with “and the students loved it”. Sorry, students, but I don’t only (or even predominantly) care if you loved it, I care if you performed significantly better, attended more, engaged more, retained the information for longer and could achieve more, and all of these things can only be measured when we take the trouble to establish base lines, construct experiments, measure things, analyse with care and then think about the outcomes.

My major concern about the whole MOOC discussion is not whether MOOCs are good or bad, it’s more to do with:

  • What does everyone mean when they say MOOC? (Because there’s variation in what people identify as the components)
  • Are we building a balanced diet or are we constructing a sustenance program with carefully balanced supplements that might miss something we don’t yet value?
  • Have we extracted the essential Vitamin Ed from the ‘real’ experience?
  • Can we synthesise Vitamin Ed outside of the ‘real’ educational experience?

I’ve been searching for a terminological separation that allows me to separate ‘real’/’conventional’ learning experiences from ‘virtual’/’new generation’/’MOOC’ experiences and none of those distinctions are satisfying – one says “Restaurant meal” and the other says “Army ration pack” to me, emphasising the separation. Worse, my fear is that a lot of people don’t regard MOOC as ever really having Vitamin Ed inside, as the MIT President clearly believed back in 2001.

I suspect that my search for Vitamin Ed starts from a flawed basis, because it assumes a single silver bullet if we take a literal meaning of the term, so let me spread the concept out a bit and label Vitamin Ed as the essential educational components that define a good learning and teaching experience. Calling it Vitamin Ed gives me a flag to wave and an analogue to use, to explain why we should be seeking a balanced diet for all of our students, rather than a banquet for one and dog food for the other.


“We are not providing an MIT education on the web…”

I’ve been re-watching some older announcements that describe open courseware initiatives, starting from one of the biggest, the MIT announcement of their OpenCourseWare (OCW) initiative in April, 2001. The title of this post actually comes from the video, around the 5:20 mark, (Video quoted under a CC-BY-NC-SA licence, more information available at: http://ocw.mit.edu/terms)

“Let me be very clear, we are not providing an MIT education on the Web. We are, however, providing core materials that are the infrastructure that undergirds that information. Real education, in our view, involves interaction between people. It’s the interaction between faculty and students, in our classrooms and our living group, in our laboratories that are the heart, the real essence, of an MIT education. “

While the OCW was going to be produced and used on campus, the development of OCW was seen as something that would make more time available for student interaction, not less. President Vest then goes on to confidently predict that OCW will not make any difference to enrolment, which is hardly surprising given that he has categorically excluded anyone from achieving an MIT education unless they enrol. We see here exactly the same discussion that keeps coming up: these materials can be used as augmenting materials in these conventional universities but can never, in the view of the President or Vice Chancellor, replace the actual experience of obtaining a degree from that institution.

Now, don’t get me wrong. I still think that the OCW initiative was excellent, generous and visionary but we are still looking at two fundamentally different use cases: the use of OCW to augment an existing experience and the use of OCW to bootstrap a completely new experience, which is not of the same order. It’s a discussion that we keep having – what happens to my Uni if I use EdX courses from another institution? Well, ok, let’s ask that question differently. I will look at this from two sides with the introduction of a new skill and knowledge area that becomes ubiquitous: in my sphere, Computer Science and programming. Let’s look at this in terms of growth and success.

What happens if schools start teaching programming to first year level? 

Let’s say that we get programming into every single national curriculum for secondary school and we can guarantee that students come in knowing how to program to freshman level. There are two ways of looking at this and the first, which we have probably all seen to some degree, is to regard the school teaching as inferior and re-teach it. The net result of this will be bored students, low engagement and we will be wasting our time. The second, far more productive, approach is to say “Great! You can program. Now let’s do some Computer Science.” and we use that extra year or so to increase our discipline knowledge or put breadth courses back in so our students come out a little more well-rounded. What’s the difference between students learning it from school before they come to us, or through an EdX course on fundamental programming after they come to us?

Not much, really, as long as we make sure that the course meets our requirements – and, in fact, it gives us bricks-and-mortar-bound entities more time to do all that face-to-face interactive University stuff that we know students love and from which they derive great benefit. University stops being semi-vocational in some aspects and we leap into knowledge construction, idea generation, big projects and the grand dreams that we always talk about, yet often don’t get to because we have to train people in basic programming, drafting, and so on. Do we give them course credit? No, because they’re assumed knowledge, or barrier tested, and they’re not necessarily part of our structure anymore.

What happens if no-one wants to take my course anymore?

Now, we know that we can change our courses because we’ve done it so many times before over the history of the Academy – Latin, along with Greek, the language of scholarship, was only used in half of the University publications of 1800. Let me wander through a classical garden for a moment to discuss the nature of change from a different angle, that of decline. Languages had a special place in the degrees of my University, with Latin and Greek dominating and then with the daring possibility of allowing substitution of French or German for Latin or Greek from 1938. It was as recently as 1958 that Latin stopped being compulsory for high school graduation in Adelaide, although it was still required for the study of Law – student demand for Latin at school therefore plummeted and Latin courses started being dropped from the school curriculum. The Law Latin requirement was removed around 1969-1970, which then dropped demand for Latin even further. The reduction in the number of school teachers who could teach Latin required the introduction of courses at the University for students who had studied no Latin at all – Latin IA entered the syllabus. However, given that in 2007 only one student at all of the schools across the state of South Australia (roughly 1.2-1.4 million people) studied Latin in the final year of school, it is apparent that if this University wishes to teach Latin, it has to start by teaching all of Latin. This is a course, and a discipline, that is currently in decline. My fear is that, one day, someone will make the mistake of thinking that we no longer need scholars of this language. And that worries me, because I don’t know what people 30 years from now will actually want, or what they could add to the knowledge that we already have of one of our most influential civilisations.

This decline is not unique to Latin (or Greek, or classics in general) but a truly on-line course experience would allow us to actually pool those scholars we have left and offer scaled resources out for much longer than isolated pockets in real offices can potentially manage. But, as President Vest notes, a storehouse of Latin texts does not a course make. What reduced the demand for Latin? Possibly the ubiquity of the language that we use which is derived from Latin, combined with a change of focus away from a classical education towards a more job- and achievement-oriented (semi-vocational) style of education. If you ask me, programming could as easily go this way in about 20 years, once we have ways to let machines solve problems for us. A move towards a less go-go-go culture, smarter machines and a resurgence of the long leisure cycles associated with Science Fiction visions of the future, and suddenly it is the engineers and the computer scientists who are looking at shrinking departments and no support in the schools. Let me be blunt: course popularity and desirability rises, stabilises and falls, and it’s very hard to tell if we are looking at a parabola or a pendulum. With that in mind, we should be very careful about how we define our traditions and our conventions, especially as our cunning tools for supporting on-line learning and teaching get better and better. Yes, interaction is an essential part of a good education, no argument at all, but there is an implicit assumption of critical mass that we have seen, time and again, to implicitly support this interaction in a face-to-face environment that is as much a function of popularity and traditionally-associated prestige as it is of excellence.

What are MIT doing now?

I look at the original OCW release and I agree that, at the time of production, you could not reproduce the interaction between people that would give you an MIT education. But our tools are better now. They are quite probably not yet good enough to give you an “MIT of the Internet”, but should this be our goal? Not the production of a facsimile of the core materials that might, with MIT instructors, turn into a course, but the commitment to developing the tools that actually reproduce the successful components of the learning experience, with group and personal interaction, allowing the formation in a virtual space of what we used to require a physical interactive experience to achieve? That’s where I think the new MIT initiatives are showing us how these things can work now, starting from their original idealistic roots and adding the technology of the 21st Century. I hope that other, equally prestigious, institutions are watching this, carefully.


Leading the Innovation Charge: Research and Teachers (NESTA Report on Digital Education)

I’m currently reading the NESTA report “Decoding Learning: The Proof, Promise and Potential of Digital Education”, which talks about ways of learning with technology and sources of innovation. At the start, in scene setting, the two sources of innovation are identified as formal research efforts based on large amounts of gathered evidence (research-led) and informal literature such as blogs and teacher networks (teacher-led) – which means, woohoo, if anyone does anything based on what I’ve written in here, it’s a teacher-led innovation. (I realise that there is an argument for overlap here, but formal research publication appears to mark the division, and there seems to be no reason why a teacher-led initiative couldn’t be high quality if it was still evidence-based, even without strict formal publication.)

Looking across the world, the report started with 210 cases that were either research- or teacher-led and narrowed this down to a representative sample of 150. What’s interesting, to me, is the split by country between research- and teacher-led projects. The US has 65 ‘innovations’: 28 teacher-led and 37 research-led. The UK has 64: 45 teacher-led and 19 research-led. Australia has 9, all of which are teacher-led. Outside of the UK and Australia, educational innovation is most likely to be research-led. It appears that our relationship to the UK educational system may be even closer than we thought in this respect. However, to look in more detail at these innovations, we have to look at the breakdown of the ways in which we see students learning with technology. The learning themes in this document are:

  • Learning from Experts
  • Learning with Others
  • Learning through Making
  • Learning through Exploring
  • Learning through Inquiry
  • Learning through Practising
  • Learning from Assessment
  • Learning in and from Settings

Most of these are pretty self-explanatory (and highly constructivist, unsurprisingly) but they are based on the learners’ actions and include factors such as the resources employed and the structure – which gives a greater potential depth to the classification as you can’t just say you’re doing X, you have to support it with technological resources and learning design.

A very important point raised early on about the teacher-led/research-led dichotomy is that the requirement for large volumes of evidence in research publication can tend to make research-led initiatives more risk averse, in that much more information has to be gathered before recommendations can be adopted or conclusions can be drawn. The teacher-led initiatives can highlight serious innovations that are worth trying, but may not yet have the evidence behind them to provide a convincing argument. What a dilemma! I can either have evidence for something that I probably already thought of, or take a chance on something for which I have no evidence – and in the world of technology, where innovation often costs money, good luck getting a solid amount of cash on the strength of a good feeling about an innovation direction. I need to look further into the case of Australia, because I know a great number of excellent educational researchers here who are, as far as I know, proposing solid research-led innovations, but they aren’t showing up on this particular radar. And, being cynical, if it’s not showing up on NESTA’s radar, it’s probably not showing up at the government level and, hearts and minds, we want the government to be aware that the research approaches (often University-driven) are visible, viable and valuable. (Another thing for the to-do list, apart from finding alliterative phrases starting with ‘x’.)

In looking at the themes, I find it interesting to think about how they are both guidelines of good practice and cautionary tales. When we set up technology that enables us to Learn from Experts, which is one of the potential underlying principles of the MOOC, we have to make sure that we’re actually providing experts. There’s an interesting example of the statistics expert who tore apart an on-line stats course and, while it was rapidly corrected, we have that slight worry that the power to set up a course in no way correlates with the ability to actually provide the course information. Of course, I’m not a trained teacher, but my qualification in my academic discipline and my prior industry experience do provide me with a level of expected expertise in an area. I’m not allowed to get out in front of students unless I reach a certain bar of qualification – but that is most certainly not always the case elsewhere. Suddenly the technology innovation theme “Learning from Experts” becomes the source of a philosophical reflection on how we are doing this at all – do we even refer to experts in innovation, education or the discipline? If we want a combination of these, how does it work? As noted in the report, it’s not just access to the expert that learners need, it’s the supporting dialogue between them that assists in knowledge construction and learning. How can innovation in technology support this new dialogue in a way that works?

The future is not just about the provision of information; we solved that problem in the first instance with the book, refined it with the library and then did … something … with it when we developed Wikipedia (all joking aside, on-line resources have added immediacy and ubiquity to the information provision solution). The future is about successful learning, which involves the development of knowledge, and thus the arrangement, storage, organisation, retrieval and development of information in support of that newly constructed knowledge. There’s a lot of scope for the development of innovative technological tools in this space but, as the report clearly indicates through its themes, this involves thinking about how we learn, how we’re going to learn and how the tech can help us to achieve it.

There’s still a lot of research- and teacher-led innovation to come, which is great because we all love a challenge, but I’d like to finish by noting what is not one of the key themes from the NESTA report. There is no “Learning from watching dull videos of uninteresting material presented with the least effort possible, because that’s how it’s always been done” because this is, quite simply, not innovative. We already know how well that works and that’s why we have to innovate now. Viva the glorious fusion of cutting edge innovation and sufficient evidence to allow us to leap off the metaphorical cliff!

Oh good, it’s Monday.
(Photo by John Moore/Getty Images)


A Difficult Argument: Can We Accept “Academic Freedom” In Defence of Poor Teaching?

Let me frame this very carefully, because I realise that I am on very, very volatile ground with any discussion that raises the spectre of a right or a wrong way of teaching. The educational literature is equally careful about this and, very sensibly, you read about rates of transfer, load issues, qualitative aspects and quantitative outcomes, without any hard and fast statements such as “You must never lecture again!” or “You must use formative assessment or bees will consume your people!”

Not even your marching bands will be safe!

I am aware, however, that we are seeing a split between those who accept that educational research has something to tell them, which may possibly override personal experience or industry requirement, and those who don’t. But, and let me tread very carefully indeed, while those of us who accept that the traditional lecture is not always the right approach realise that the odd lecture (or even an entire course of lectures) won’t hurt our students, there is a far more damaging and fundamental disagreement.

Does education transform in the majority of cases or are most students ‘set’ by the time that they come to us?

This is a key question because it affects how we deal with our students. If there are ‘good’ and ‘bad’ students, ‘smart’ and ‘dumb’, or ‘hardworking’ and ‘lazy’, and this is an immutable characteristic, then a lot of what we are doing to engage students, to assist them in constructing knowledge and to place them into collaborative environments, is a waste of their time. They will either get it (if they’re smart and hardworking) or they won’t. Putting a brick next to a bee doesn’t double your honey-making capacity or your ability to build houses. Except, of course, that students are not bees or bricks. In fact, there appears to be a vast amount of evidence that such collaborative activities, if set up correctly in accordance with the established work in social constructivism and cognitive apprenticeship, will actually have the desired effect, and you will see positive transformations in students who take part.

However, there are still many activities and teachers who continue to treat students as if they are always going to be bricks or bees. Why does this matter? Let me digress for a moment.

I don’t care if vampires, werewolves or zombies actually exist or not and, for the majority of my life, it is unlikely to make any difference to me. However, if someone else is convinced that she is a vampire and she attacks me and drains my blood, I am just as dead as if she really were a vampire – of course, I will not now rise from the dead, but this is of little import to me. What matters is the impact upon me of someone else’s practice of their beliefs.

If someone strongly believes that students are either ‘smart enough’ to take their courses or not, doesn’t care who fails or how many, and holds that it is purely the role of the student to have or to spontaneously develop this characteristic, then their influence will likely be strong enough to have a negative impact on at least some students. We know about stereotype threat. We’re aware of inherent bias. In this case, we’re no longer talking about right or wrong teaching (thank goodness), we’re talking about a fundamentally self-fulfilling prophecy as a teaching philosophy. This will have as great an impact on those who fail or withdraw as the transformation pathway does on those who become better students and develop.

It is, I believe, almost never about the bright light of our most stellar successes. Perhaps we should always be held to answer (or at least to explain) for the number and nature of those who fall away. I have been looking for statements of student rights across Australia and the Higher Education sites all seem to talk about ‘fair assessment’ and ‘right of appeal’, as well as all of the student responsibilities. The ACARA (Australian Curriculum, Assessment and Reporting Authority) website talks a lot about opportunities and student needs in schools. What I haven’t yet found is something that I would like to see, along these lines:

“Education is transformational. Students are entitled to be assessed on their own performance, in the context of their opportunities.”

Curve grading, which I’ve discussed before, immediately forces a false division of students into good and bad, merely because ‘better’ students exist. It is hard to think of something fundamentally less fair or less appropriate to the task if we accept that our goal is improvement towards a higher standard, regardless of where people start. In a curve-graded system, the ‘best’ person can coast, because all they have to do is stay one step ahead of their competition and natural alignment and inflation will do the rest. This is not the motivational framework that we wish to establish, especially when the lowest-ranked realise that all is lost.
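To make the coasting problem concrete, here is a toy sketch comparing rank-based quota grading (one simple form of curve grading) with grading against fixed standards. The quota shares, cut-offs, names and scores are all invented for illustration – this is not drawn from any real grading policy.

```python
# A toy comparison of curve grading (rank-based quotas) versus
# criterion-referenced grading (fixed standards). All quotas,
# cut-offs and scores below are invented for illustration.

def curve_grades(scores, quotas=(("HD", 0.1), ("D", 0.2), ("C", 0.3),
                                 ("P", 0.3), ("F", 0.1))):
    """Assign grades by rank: the top 10% get an HD no matter what
    their absolute scores are, so the 'best' student only needs to
    stay one step ahead of everyone else."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    grades, start = {}, 0
    for grade, share in quotas:
        take = round(share * len(ranked))
        for name, _ in ranked[start:start + take]:
            grades[name] = grade
        start += take
    for name, _ in ranked[start:]:   # anyone left over by rounding
        grades[name] = quotas[-1][0]
    return grades

def criterion_grades(scores, cutoffs=((85, "HD"), (75, "D"),
                                      (65, "C"), (50, "P"))):
    """Assign grades against fixed standards: everyone who clears a
    bar earns that grade, regardless of what anyone else scored."""
    return {name: next((g for cut, g in cutoffs if mark >= cut), "F")
            for name, mark in scores.items()}

# A weak cohort: nobody has reached a high standard, but the curve
# still manufactures 'excellent' students at the top of the ranking.
cohort = {"Ava": 58, "Ben": 55, "Cam": 52, "Dee": 49, "Eli": 46,
          "Fay": 43, "Gus": 40, "Hal": 37, "Ivy": 34, "Jo": 31}

print(curve_grades(cohort))      # Ava earns an HD with 58/100
print(criterion_grades(cohort))  # nobody does better than a bare pass
```

The same cohort yields a top grade under the curve and nothing above a bare pass against a fixed standard, which is precisely the false division at issue: the ‘best’ student’s grade says more about the rest of the field than about their own mastery.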

I am a long distance runner and my performances will never set the world on fire. To come first in a race, I would have to be in a small race with very unfit people. But no-one can take away my actual times for my marathons, and it is those times that have been used to allow me to enter other events. You’ll see this in the Olympics, too: qualifying times are used because relative performance does not actually establish any set level of quality. The final race? Yes, there we have established competitiveness and ranking becomes more important – but then again, entering the final heat of an Olympic race is an Olympian achievement. Let’s not quibble on this, because it is the equivalent of the Nobel and Turing awards.

And here is the problem again. If I believe that education is transformative and set up all of my classes with collaborative work, intrinsic motivation and activities to develop self-regulation, then that’s great – but what if it’s in third year? If the ‘students were too dumb to get it’ people stand between me and my students for the first two years, then I will have lost a great number of possibly good students by that stage – not to mention the fact that the ones who get through may need some serious de-programming.

Is it an acceptable excuse that another academic should be free to do what they want, if what they want to do has an excluding and detrimental effect on students? Can we accept that, if it means that we have to swallow that philosophy? If I do, does it make me complicit? I would like nothing more than to let people do what they want – hey, I like that as much as the next person – but, in thinking about the effect of some of the decisions being made, is the notion of personal freedom in what is ultimately a public service role still a sufficiently good argument for not changing practice?