Putting it all together – discussing curriculum with students
Posted: August 15, 2012 Filed under: Education, Opinion | Tags: advocacy, authenticity, collaboration, community, curriculum, education, educational problem, educational research, ethics, grand challenge, higher education, in the student's head, popeye, principles of design, reflection, resources, student perspective, teaching, teaching approaches, thinking, tools, wimpy, workload

One of the nice things about my new grand challenges course is that the lecture slots are a pre-reading-based discussion of the grand challenges in my discipline (Computer Science), based on the National Science Foundation’s Taskforce report. Talking through this with students allows us to identify the strengths of the document and, perhaps more interestingly, some of its shortfalls. For example, there is much discussion of interdisciplinary and international collaboration as being vital, followed by statements along the lines of “We must regain the ascendancy in the discipline that we invented!” – because the NSF is, first and foremost, a US-funded organisation. There’s talk about providing the funds for sustainability and then identifying the NSF as the organisation giving the money, and hence calling the shots.
The areas of challenge are clearly laid out, as are the often conflicting issues surrounding the administration of these kinds of initiative. Too often, we see people talking about some amazing international initiative – only to see it fail because nobody wants to go first, or no country/government wants to put money up that other people can draw on until everyone does it at the same time.
In essence, this is a timing and trust problem. If we may quote Wimpy from the Popeye cartoons: “I’ll gladly pay you Tuesday for a hamburger today.”
The NSF document lays bare the problem we always have: those who have the hamburgers are happy to talk about sharing the meal but there are bills to be paid. The person who owns the hamburger stand is going to have words with you if you give everything away with nothing to show in return except a promise of payment on Tuesday.
Having covered what the NSF considered important in terms of preparing us for the heavily computerised and computational future, my students finished with a discussion of educational issues and virtual organisations. The educational issues were extremely interesting because, having looked at the NSF Taskforce report, we then looked at the ACM/IEEE 2013 Computer Science Strawman curriculum to see how many areas overlapped with the task force report. Then we looked at the current curriculum of our school, which is undergoing review at the moment but was last updated for the 2008 ACM/IEEE Curriculum.
What was pleasing, from the range of students, was how many of the areas were being addressed throughout our course and how much overlap there was between the highlighted areas of the NSF Report and the Strawman. However, one of the key issues from the task force report was the notion of greater depth and breadth – an incredible challenge in the time-constrained curriculum implementations of the 21st century. Adding a new Knowledge Area (KA) to the Strawman of ‘Platform Dependent Computing’ reflects the rise of the embedded and mobile device yet, as the Strawman authors immediately admit, we start to make it harder and harder to fit everything into one course. Combine this with the NSF requirement for greater breadth, including scientific and mathematical aspects that have traditionally been outside of Computing, and their parallel requirement for the development of depth… and it’s not easy.
The lecture slot where we discussed this had no specific outcomes associated with it – it was a place to discuss the issues arising but also to explain to the students why their curriculum looks the way that it does. Yes, we’d love to bring in Aspect X but where does it fit? My GC students were looking at the Ethics aspects of the Strawman and wondered if we could fit Ethics into its own 3-unit course. (I suspect that’s at least partially my influence although I certainly didn’t suggest anything along these lines.) “That’s fine,” I said, “But what do we lose?”
In my discussions with these students, they’ve identified one of the core reasons that we changed teaching languages, but I’ve also been able to talk to them about how we think as we construct courses. They’ve also started to see the many drivers that we consider, which I believe helps them work out how to give feedback in the form that is most useful for us to turn their needs and wants into improvements or developments in the course. I don’t expect the students to understand the details and practice of pedagogy but, unless I give them a good framework, it’s going to be hard for them to communicate with me in a way that leads most directly to an improved result for both of us.
I’ve really enjoyed this process of discussion and it’s been highly rewarding, again I hope for both sides of the group, to be able to discuss things without the usual level of reactive and (often) selfish thinking that characterises these exchanges. I hope this means that we’re on the right track for this course and this program.
In A Student’s Head – Mine From 26 Years Ago
Posted: August 11, 2012 Filed under: Education | Tags: advocacy, ALTA, blogging, collaboration, community, curriculum, design, education, educational problem, educational research, feedback, games, higher education, in the student's head, principles of design, student perspective, teaching, teaching approaches, thinking

I attended an Australian Council of Deans of ICT Learning and Teaching Academy event run by Elena Sitnikova from the University of South Australia. Elena is one of the (my fellow?) Fellows in ALTA and works in Cyber Security and Forensic Computing. Today’s focus was on discussing the issues in ICT education in Australia, based on the many surveys that have been run, presenting some early work on learning and teaching grants, and providing workshops on “Improving learning and teaching practice in ICT Education” and “Developing Teamwork that Works!”. The day was great (with lots of familiar faces presenting a range of interesting topics) and the first workshop was run by Sue Wright of the Graduate School of Education, University of Melbourne. This, as always, was a highly rewarding event because Sue forced me to go back and think about myself as a student.
This is a very powerful technique and I’m going to outline it here, for those who haven’t done it for a while. Drawing on Bourdieu’s ideas on social and cultural capital, Sue asked us to list our non-financial social assets and disadvantages when we first came to University. This included things like:
- Access to resources
- Physical appearance
- Educational background
- Life experiences
- Intellect and orientation to study
- Group membership
- Accent
- Anything else!
When you think about yourself in this way, you suddenly have to think about not only what you had, but what you didn’t have. What helped you stay in class? What meant that you didn’t show up? From a personal perspective, I had good friends and a great tan but I had very little life experience, a very poor study ethic, no real sense of consequences and a very poor support network in an academic sense. It really brought home how lucky I was to have a group of friends that kept me coming to University. Of course, in those pre-online days, you had to come to Uni to see your friends, so that was a good reason to keep people on campus – it allowed you to learn things by bumping into people, which I like to refer to as “Brownian Communication”.
This exercise made me think about my transition to being a successful student. In my case, it took more than one degree and a great deal more life experience before I was ready to come back and actually succeed. To be honest, if you looked at my base degree, you’d never have thought that I would make it all the way to a PhD and, yet, here I am, on a path where I am making a solid and positive difference.
Sue then reminded people of Hofstede’s work on cultural dimensions – power distance, individualism versus collectivism, and uncertainty avoidance. How do students work – do they need a large ‘respect gap’ between student and teacher? Do they put family before their own study? Do they do anything rather than explore the uncertain? It’s always worth remembering that, where “the other” exists for us, we exist as “the other” reciprocally. While it’s comfortable, as white, culturally English and English-speaking people, to assume that “the other” is transgressing with respect to our ‘dominant’ culture, we may be asking people to do something that is incredibly uncomfortable and goes far beyond learning another language.
One of the workshop participants was born and grew up in Korea and he made the observation that, when he was growing up, the teacher was held at the same level as the King and your father – and you don’t question the King or your father! He also noted that, on occasion, ‘respect’ had to be directed towards teachers whom the students did not actually respect. He had one bad teacher and, in that class, the students asked no questions and just let the teacher talk. As someone who works within a very small power-distance relationship with my students, I have almost never felt disrespected by anything that my students do, unless they are actively trying to be rude and disrespectful. If I have nobody following, or asking questions, then I always start to wonder if I’ve been tuned out and they are listening to the music in their heads. (Or on their iPhones, as it is the 21st Century!)
Australia is a low power distance/high individualism culture with a focus on the short term in many respects (as evidenced by the quarterly profit-and-loss focus and, to be frank, recent political developments). Bringing in people from a high PD/high collectivism culture, such as some of those found in South East Asia, will need some sort of management to ensure that we don’t accidentally split the class. It’s not enough to just say “These students do X” because we know that we can, with the right approach, integrate our student body. But it does take work.
As always, working with Sue (you never just listen to Sue, she always gets you working) was a very rewarding and reflective activity. I spent 20 minutes trying to learn enough about a colleague from UniSA, Sang, that I could answer questions about his life. While I was doing this, he was trying to become Nick. What emerged from this was how amazingly similar we actually are – different Unis, different degrees, different focus, one Anglo-origin, one Korean-origin – and it took us quite a while to find things where we were really so different that we could talk about the challenges if we had to take on each other’s lives.
It was great to see most of the Fellows again but, looking around a large room that wasn’t full to the brim, it reminded me that we are often talking to those people who already believe that what we’re doing is the right thing. The people that really needed to be here were the people who weren’t in the room.
I’m still thinking about how we can continue our work to reach out and bring more people into this very, very rewarding community.
Teaching Ethics in a Difficult World: Free Range and Battery Games
Posted: August 9, 2012 Filed under: Education | Tags: advocacy, blogging, community, education, educational problem, ethics, free range games, games, higher education, in the student's head, principles of design, teaching, teaching approaches, thinking, time banking, work/life balance, workload

(Note, this is not a post about the existing game company, Free Range Games, although their stuff looks cool!)
I enjoy teaching ethics or, to be more precise, getting the students to realise the ethical framework that they all live within. I’ve blogged before about this and how easy it is to find examples of unethical behaviour but, as we hear more stories about certain ‘game-related’ industries and the way that they treat testers, it becomes more and more apparent that we are reaching a point where the ethical burden of a piece of software may end up becoming something that we have to consider.
We’re already aware of the use of child labour in some products and people can make a decision not to shop at certain stores or buy certain products – but this requires awareness and tying the act to the brand.
In the areas I live in, it’s very hard to find a non-free range chicken, even in a chicken take-away shop (for various definitions of ‘free range’ but we pretty much do mean ‘neither battery nor force fed’) and eggs are clearly labelled. Does this matter to you? If so, you can make an informed decision. Doesn’t matter to you? Buy the cheapest or the tastiest or whichever other metric you’re using.
But what about games? You don’t have to look far (ea_spouse and the many other accounts available) to see that the Quality Assurance roles, vital to good games, are seeing a resurgence in the type of labour management that is rapidly approaching the Upton Sinclair Asymptote. Sinclair wrote a famous turn-of-the-20th-century novelisation of the conditions in the meat-packing industry, “The Jungle”, that, apart from a rather dour appeal to socialism at the end, is an amazing read. It changed conditions and workers’ rights because it made these invisible people visible. Once again, as we apparently fall in love with the ‘wealth creators’ (an Australian term that is rapidly becoming synonymous with ‘robber baron’) all over again, we are approaching this point despite knowing what the conditions are.
What I mean by this is that it is well known that large numbers of staff in the QA area in games tolerate terrible conditions – no job security, poor working conditions, malicious and incompetent management – and for what? To bring you a game. It’s not as if they are fighting to maintain democracy (or attack democracy, depending on what you consider to be more important) or staying up for days on end trying to bring the zombie infection under control. No, the people who are being forced into sweatboxes, occasionally made to work until they wet themselves, who are unceremoniously fired at ‘celebration’ events, are working to make sure that the people who wrote your game didn’t leave any unexplained holes in the map. Or that, when you hit a troll with an axe, it inflicts damage rather than spontaneously causing the NyanCat video to play on your phone.
This discussion of ethics completely ignores the ethics of computer games that demean or objectify women, glorify violence, or any of the other ongoing issues. Search for “ethics of video games” and it is violence and sexism that dominate the results. It’s only when you start searching for “employee abuse video game” that you start to get hits. Here are some quotes from one of them.
It seems as though the developers of L. A. Noire might have been under more pressure themselves than any of the interrogated criminals in their highly praised crime drama. Reports have surfaced about employees being forced to work excruciating hours, in some cases reaching 120 hour weeks and 22 hour days. In addition, a list has been generated of some 130 members of the Australian-based Team Bondi, the creators of L. A. Noire, whose names have been omitted from the game’s own credits.
…
On the subject of the unprecedented scope of the project for Australian developers, McNamara replied, “The expectation is slightly weird here, that you can do this stuff without killing yourself; well, you can’t, whether it’s in London or New York or wherever; you’re competing against the best people in the world at what they do, and you just have to be prepared to do what you have to do to compete against those people. The expectation is slightly different.”
The saddest thing, to me, is that everyone knows this. The same people who complain on my FB feed about how overworked they are and how little they see their family then go out and buy games that have been produced in electronic sweatshops. You didn’t buy L. A. Noire? Rockstar San Diego are on the “overworking staff” list for “Red Dead Redemption” and on the “not crediting everyone” list for “Manhunt 2”. (That last one might not be so bad!)
Everyone talks about the crunch as if it’s unavoidable. Well, yes, it is, if you intend to work people to the crunch. We’ve seen similar arguments for feedlot meat production and battery animals and, let’s not forget that there have always been “excellent” reasons for slavery in economic and social terms.
This is one of the hardest things to talk about to my students because they’re not dumb. They read, often more widely than I do in these areas. They know that for all my discussions of time management and ethics, if they get a certain kind of job they will work 7 days a week, 10-14 hours a day, in terrible conditions and maybe, just maybe, if they sell their soul enough they can get a full-time job, rather than being laid off indiscriminately. They know that the message coming down from these companies is “maximum profit, minimum spend” and, of course, most of these game companies aren’t profitable so that’s less about being mercenary and more about survival.
But, given that these products are not exactly… essential (forgive me, Deus Ex!), one has to wonder whether terms like ‘survival’ have any place in this discussion. Is it worth nearly killing people, destroying their social lives and so on, to bring a game to market? People often say “Well, they have a choice” and, in some ways, I suppose they do – but in an economic market where any job is better than no job, and people can make decisions at 15 that lead to outcomes they didn’t expect at 25, this seems both ungenerous and thoughtless.
Perhaps we need the equivalent of a ‘Free Range/Organic’ movement for games: All programmers and QA people were officially certified to have had at least 8 hours sleep a night, with a minimum break of 50 hours every 6 days and were kept at a maximum density of 2 programmers per 15 square metres, in a temperature and humidity controlled environment that meets recognised comfort standards.
(Yeah, I didn’t include management. I think they’re probably mostly looking after themselves on that one. 🙂 )
Then you can choose. If it matters to you, buy 21st century Labour Force Games – Ethically and sustainably produced. If it doesn’t matter, ignore it and game on.
Silk Purses and Pig’s Ears
Posted: August 6, 2012 Filed under: Education | Tags: advocacy, authenticity, community, curriculum, design, education, educational problem, educational research, ethics, feedback, grand challenge, higher education, in the student's head, learning, principles of design, resources, student perspective, teaching, teaching approaches, thinking, winemaking

There’s an old saying: “You can’t make a silk purse out of a pig’s (or sow’s) ear.” It’s the old chestnut that you can’t make something good out of something bad and, when you’re talking about bad grapes or rotten wood, it has some validity (but even then, not much, as I’ll note later). When it’s applied to people, for any of a large range of reasons, it tends to become an excuse to give up on them, or a reason why a lack of success on somebody’s part cannot be traced back to you.
I’m doing a lot of reading in medical and general ethics as part of my preparation for one of the Grand Challenge lectures. The usual names and experiments show up, of course, when you start looking at questionable or non-existent ethics: Milgram, the Nazis, the Stanford Prison Experiment, Unit 731, the Tuskegee Syphilis Experiment, Little Albert and David Reimer. What starts to come through from this study is that, in many of these cases, the people being experimented upon have reached a point in the experimenter’s eyes where they are not people, but merely ‘subjects’ – and all too often in the feudal sense, as serfs, without rights or the ability to challenge what is happening.
But even where the intention is, ostensibly, therapeutic, there is always the question of who is at fault when a therapeutic procedure fails. In the case of surgical malpractice or negligence, the cause is clear – the surgeon, or a member of her or his team, at some point made a poor decision or acted incorrectly and thus the fault lies with them. I have been reading up on early psychiatric techniques, as these are full of stories of questionable approaches that were later discredited, and it is interesting how easy it is for some practitioners to wash their hands of their subject because the subject lacked a “good previous personality” – you can’t make a silk purse out of a pig’s ear. In many cases, with this damning judgement, people with psychiatric problems would be shunted off to the wards of mental hospitals.
I refer, in this case, to William Sargant (1907-1988), a British psychiatrist who had an ‘evangelical zeal’ for psychosurgery, deep sleep treatment, electroconvulsive therapy (ECT) and insulin shock therapy. Sargant used narcosis – drug-induced deep sleep – extensively, as he could then carry out a range of procedures on the semi-conscious and unconscious patients that they would possibly have learned to dread had they received them while conscious. Sargant believed that anyone with psychological problems should be treated early and intensively with all available methods and, where possible, all these methods should be combined and applied as necessary. I am not a psychiatrist and I leave it to the psychiatric and psychotherapy community to assess the efficacy and suitability of Sargant’s methods (they disavow them, for the most part, for what it’s worth) but I mention him here because he did not regard failures as being his fault. It is his words that I am quoting in the previous paragraph. People for whom his radical, often discredited, zealous and occasionally lethal experimentation did not work were their own problem because they lacked a “good previous personality”. You cannot, as he was often quoted as saying, make a silk purse out of a pig’s ear.
How often I have heard similar ideas being expressed within the halls of academia and the corridors of schools. How easy a thing it is to say. Well, one might say, we’ve done all that we can with this particular pupil, but… They’re just not very bright. They daydream in class rather than filling out their worksheets. They sleep at their desks. They never do the reading. They show up too late. They won’t hang around after class. They ask too many questions. They don’t ask enough questions. They won’t use a pencil. They only use a pencil. They talk back. They don’t talk. They think they’re so special. Their kind never amounts to anything. They’re just like their parents. They’re just like the rest of them.
“We’ve done all we can but you can’t make a silk purse out of a sow’s ear.”
As always, we can look at each and every one of those problems and ask “Why?” and, maybe, we’ll get an answer that we can do something about. I realise that resources and time are both scarce commodities but, even if we can’t offer these students the pastoral care that they need (and most of those issues listed above are more likely to be social/behavioural than academic anyway), let us stop pretending that we can walk away, blameless, as Sargant did because these students are fundamentally unsalvageable.
Yeah, sorry, I know that I go on about this but it’s really important to keep on hammering away at this point, every time that I see how my own students could be exposed to it. They need to know that the man that they’re working with expects them to do things but that he understands how much of his job is turning complex things into knowledge forms that they can work with – even if all he does is start the process and then he hands it to them to finish.
Do you want to know how to make great wine? Start with really, really good grapes and then don’t mess it up. Want to know how to make good wine? Well, as someone who used to be a reasonable winemaker, you can give me just about anything – good fruit, ok fruit, bad fruit, mouldy fruit – and I could turn it into wine that you would happily drink. I hasten to point out that I worked for good wineries and the vast majority of what I did was making good wine from good grapes, but there were always the moments where you had something that, through someone else’s lack of care or inattention, had got into a difficult spot. Understanding the chemical processes, the nature of wine and working out how we could recover the wine? That is a challenge. It’s time consuming, it takes effort, it takes a great deal of scholarly knowledge and you have to try things to see if they work.
In the case of wine, while I could produce perfectly reasonable wine from bad grapes, simple chemistry prevents me from leaving in enough of the components that could make a wine great. That is because wine recovery is all about taking bad things out. I see our challenge in education as very different. When we find someone who is in need of our help, it is what we can put in that changes them. Because we are adding, mentoring, assisting and developing, we are not under the same restrictions as we are with wine – starting from anywhere, I should be able to help someone to become a great someone.
The pig’s ears are safe because I think that we can make silk purses out of just about anything that we set our minds to.
Wading In: No Time For Paddling
Posted: July 31, 2012 Filed under: Education | Tags: collaboration, community, curriculum, design, education, educational problem, feedback, Generation Why, grand challenge, grand challenges, higher education, in the student's head, principles of design, reflection, student perspective, teaching, teaching approaches, thinking, universal principles of design, work/life balance

I’m up to my neck in books on visualisation and data analysis at the moment. So up to my neck that this post is going to be pretty short – and you know how much I love to talk! I’ve spent most of the evening preparing for tomorrow’s visualising-data tutorial for Grand Challenges and one of the things I was looking for was bad visualisations. I took a lot away from Mark’s worked-examples posts, and I look forward to seeing the presentation, but visualisation is a particularly rich area for worked ‘bad’ examples. With code, it has to work to a degree or manifest its failure in interesting ways. A graphic can be completely finished and still fail to convey the right information.
(I’ve even thrown in some graphics that I did myself and wasn’t happy with – I’m looking forward to the feedback on those!) (Ssh, don’t tell the students.)
I had the good fortune to be given a copy of Visual Strategies (Frankel and DePace) which was designed by one of the modern heroes of design – the amazing Stefan Sagmeister. This is, without too much hyperbole, pretty much the same as being given a book on painting where Schiele had provided the layout and examples. (I’m a very big fan of Egon Schiele and Hundertwasser for that matter. I may have spent a little too much time in Austria.) The thing I like about this book is that it brings a lot of important talking and thinking points together: which questions should you ask when thinking about your graphic, how do you start, what do you do next, when do you refine, when do you stop?
Thank you, again, Metropolis Bookstore on Swanston Street in Melbourne! You had no real reason to give a stranger a book for free, except that you thought it would be useful for my students. It was, it is, and I thank you again for your generosity.
I really enjoy getting into a new area and I think that the students are enjoying it too, as the entire course is a new area for them. We had an excellent discussion of the four chapters of reading (the NSF CyberInfrastructure report on Grand Challenges), where some of it was a critique of the report itself – don’t write a report saying “community engagement and visualisation are crucial” and (a) make it hard to read, even for people inside the community or (b) make it visually difficult to read.
On the slightly less enthusiastic front, we get to the crux of the course this week – the project selection – and I’m already seeing some hesitancy. Remember that these are all very good students but some of them are not comfortable picking an area to do their analysis in. There could be any number of reasons so, one on one, I’m going to ask them why. If any of them say “Well, I could if I wanted to but…” then I will expect them to go and do it. There’s a lot of scope for feedback in the course so an early decision that doesn’t quite work out is not a death sentence, although I think that waiting for permission to leap is going to reduce the amount of ownership and enjoyment that the student feels when the work is done.
I have no time for paddling in the shallows, personally, and I wade on in. I realise, however, that this is a very challenging stance for many people, especially students, so while I would prefer people to jump in, I recognise my job as life guard in this area and I am happy to help people out.
However, these students are the Distinction/High Distinction crowd, the ones who got 95-100 on leaving secondary school and, as we thought might occur, some of them are at least slightly conditioned to seek my approval, a blessing for their project choice before they have expended any effort. Time to talk to people and work with them to help them move on to a more confident and committed stance – where that confidence is well-placed and the commitment is based on solid fact and thoughtful reasoning!
The 1-Year Degree – what’s your reaction?
Posted: July 30, 2012 Filed under: Education | Tags: constructive student pedagogy, curriculum, education, educational problem, educational research, higher education, in the student's head, learning, principles of design, reflection, resources, student perspective, teaching, teaching approaches, thinking, tools, workload

I’m going to pose a question and I’d be interested in your reaction.
“Is there a structure and delivery mechanism that could produce a competent professional graduate from a degree course such as engineering or computer science, which takes place over a maximum of 12 months including all assessment, without sacrificing quality or content?”
What was your reaction? More importantly, what is the reasoning behind your reaction?
For what it’s worth, my answer is “Not with our current structures but, apart from that, maybe”, which is why one of my side projects is an attempt to place an entire degree’s worth of work into a 12-month span as a practice exercise for discussing the second- and third-year curriculum review that we’re holding later on this year.
Our ‘standard’ estimate for any normal degree program is that a student is expected to carry a per-semester load of four courses (at 3 units a course, long story) and each of these courses will require 156 hours from start to finish. (This is based on 10 hours per week, including contact and non-contact time, and roughly 36 hours for revision towards examination or the completion of other projects.) Based on this estimate, and setting an upper bound of 40 hours/week for all of the good research-based reasons that I’ve discussed previously, there is no way that I can just pick up the existing courses and drop them into a year. A three-year program has six semesters, with four courses per semester, which gives an hour burden of 24*156 = 3,744. At 40 hours per week, we’d need 93.6 weeks (let’s call that 94), or 1.8 years.
But, hang on, we already have courses that are 6-unit and span two semesters – in fact, we have enormous projects for degree programs like Honours that are worth the equivalent of four courses. Interestingly, rather than having an exam every semester, these have a set of summative and formative assignments embedded to allow the provision of feedback and the demonstration of knowledge and skill acquisition – does this remove the need to have 36 hours for exam study for each semester if we build the assignments correctly?
Let’s assume that it does, which gives us a terminal set of examinations at the end of each year instead of every semester. That leaves 12 courses at 120 hours each and 12 at 156 hours each, and we’re down to 3,312 hours – only 1.6 years. Dang. Still not there. But it’s OK, I can see all of you who have just asked “Well, why are you so keen on using examinations if you’re happy with summative assignments testing concepts as you go and then building in the expectation of this knowledge in later modules?” Let’s drop the exam requirement even further, to a final set of professional-level assessment criteria, carried out at the end of the degree to test high-level concepts and advanced skills. Now, of the 24 courses that a student sits, almost all assessment work has moved into continuous assessment mode, rich in feedback, with summative checkpoints and a final set of examinations as part of the four capstone courses at the end. This gives us 3,024 hours – about 1.45 years.
But this also ignores that the first week of many of these courses is required revision, after some 6–18 weeks of inactivity as the students go away for summer break or home for various holidays. Let’s assume even further that, with the exception of the first four courses that they take, we build this continuously, so that skills and knowledge are reinforced as micro-slides scattered throughout the work, supported with recordings, podcasts, notes, guides and quick revision exercises in the assessment framework. Now I can slice maybe 5 hours off 20 of the courses (the last 20), cutting me down by another 100 hours – that’s half a month saved, and we’re down to 1.4 years.
Of course, I’m ignoring a lot of issues here. I’m ignoring the time it takes someone to digest information but, having raised that, can you tell me exactly how long it takes a student to learn a new concept? This is a trick question, as the answer generally depends on the question “how are you teaching them?” We know that lectures are one of the worst ways to transfer information, with A/V displays, lectures and listening all having a retention rate of less than 40%. If you’re not retaining, your chances of learning something are extremely low. At the same time, somewhere between 30 and 50% of the time that we allocate to the courses we already teach is spent in traditional lectures – at the time of writing. We can improve retention (of both knowledge and students) when we use group work (50% and higher for knowledge), get the students to practise (75%) or, even better, have them instruct someone else (up to 90%). If we can restructure the ‘empty’ or ‘low transfer’ times into other activities that foster collaboration or constructive student pedagogy, with a role transfer that allows students to instruct each other, then we can potentially greatly improve our usage of time.
If we use this notion and slice, say, 20 hours from each course, because we can get rid of that many contact hours that we were wasting and get the same, if not better, results, we’re down to 2,444 hours, about 1.18 years. And I haven’t even started looking at the notion of concept alignment, where similar concepts are taught across two different courses and could be put in one place, taught once, consistently, and then built upon for the rest of the program. Suddenly, with the same concepts and a potentially improved educational design, we’re looking the 1-year degree in the face.
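For readers who want to check the running arithmetic, here is a minimal Python sketch that reproduces each step. The per-course figures (156 hours with exam revision, 120 without) and the 40-hour week come from the discussion above; the variable names and structure are mine, purely for illustration.

```python
# Sanity check on the 1-year-degree arithmetic.
WEEK_HOURS = 40        # upper limit on student workload per week
WEEKS_PER_YEAR = 52

def years(total_hours):
    """Convert a total hour burden into years at 40 hours/week."""
    return total_hours / (WEEK_HOURS * WEEKS_PER_YEAR)

# Step 1: 24 courses at 156 hours each (per-semester exam revision included).
baseline = 24 * 156                      # 3,744 hours, roughly 1.8 years

# Step 2: yearly exams only; 12 courses keep 156 hours, 12 drop to 120.
yearly_exams = 12 * 156 + 12 * 120       # 3,312 hours, roughly 1.6 years

# Step 3: exams only in the four capstone courses at the end.
capstone_exams = 4 * 156 + 20 * 120      # 3,024 hours, roughly 1.45 years

# Step 4: continuous reinforcement saves 5 hours in each of the last 20.
reinforced = capstone_exams - 20 * 5     # 2,924 hours, roughly 1.4 years

# Step 5: replace 20 low-transfer lecture hours in each of the 24 courses.
restructured = reinforced - 24 * 20      # 2,444 hours, roughly 1.18 years

for label, hours in [("baseline", baseline),
                     ("yearly exams", yearly_exams),
                     ("capstone exams", capstone_exams),
                     ("reinforced", reinforced),
                     ("restructured", restructured)]:
    print(f"{label:15s} {hours:5d} hours  {years(hours):.2f} years")
```

Each step only changes one assumption, which makes it easy to see where the biggest savings come from: dropping per-semester examinations and reclaiming low-transfer lecture time.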
Now, there will be people who will say “Well, how does the student mature in this time? That’s only one year!” to which my response is “Well, how are you training them for maturity? Where are the developing exercises? The formative assessment based on careful scaffolding in societal development and intellectual advancement?” If the appeal of the three-year degree is that people will be 19-20 when they graduate, and this is seen as a good thing, then we solve this problem for the 1-year degree by waiting two years before they start!
Having said all of this, and believing that a high-quality 1-year degree is possible, let me conclude by saying that I think it is a terrible idea! University is more than a sequence of assessments and examinations; it is a culture, a place for intellectual exploration and the formation of bonds with like-minded friends. It is not a cram school to turn out a slightly shell-shocked engineer who has worked solidly, and without respite, for 52 weeks. However, my aim was never actually to run a degree in a year; it was to see how I could restructure a course to more easily modularise it, to break me out of the mental tyranny of a three- or four-year mandate and to focus on learning outcomes, educational design and sound pedagogy. The reason I am working on this is so that I can produce a sound course structure with which students can engage, regardless of whether they are full-time or not, clearly outlining dependencies and requirements. Yes, if we break this up into part-time study, we need to add revision modules back in – but if we teach it intensively (or on-line) then those aren’t required. This is a way to give students choice and the freedom to come in at any age, with whatever time they have, but without sacrificing the quality of the underlying program. This is a bootstrap program for a developing nation, a quick entry point for people who had to go to work – this is making up for decades of declining enrolments in key areas.
This is going on a war footing against the forces of ignorance.
There are many successful “Open” universities that use similar approaches, but I wanted to go through the exercise myself, to allow me the greatest level of intellectual freedom while looking at our curriculum review. Now I feel that I can focus on Knowledge Areas for my specifics and on the program as a whole, freed of the binding assumption that there is an inevitable three-year grind ahead for any student. Perhaps one of the greatest benefits for me is the thought that, for students who can come to us for three years, I can put much, much more into the course if they have the time – and these things of interest, of beauty, of intellectual pursuit can replace some of what we’ve lost in the last two decades of change in the University.
The Early-Career Teacher
Posted: July 24, 2012 Filed under: Education, Opinion | Tags: advocacy, authenticity, blogging, community, curriculum, design, education, educational problem, educational research, ethics, herdsa, higher education, learning, principles of design, reflection, resources, teaching, teaching approaches, thinking, tools, universal principles of design, work/life balance, workload

Recently, I mentioned the Australian Research Council (ARC) grant scheme, which regards people who have had their PhDs for less than five years as early-career researchers (ECRs). ECRs now have a separate grant scheme (previously they were simply handled differently within the general application process) that recognises that their track records – publications and activity relative to opportunity – will be slimmer than those of more seasoned individuals.
What is interesting about this is that someone who has just finished their PhD will have spent at least three years, more likely four, doing research – and, we hope, competent research under guidance for the last two of those years. So, having spent a couple of years doing competent research, we then accept that it can take up to five years for people to be recognised as being at the same level as their more seasoned colleagues.
But, for the most part, there is no corresponding recognition of the early-career teacher, which is puzzling given that there is no requirement to meet any teaching standards, or take part in any teaching activities at all, before you are put in front of a class. You do no teaching (or are not required to do any) during your PhD in Australia, yet we offer support and recognition of early status for the task that you HAVE been doing – and have no way to recognise the need to build up your teaching.
We discussed ideas along these lines at a high-level meeting that I attended this morning, and I brought up the early-career teacher (and a mentoring program to support it) because someone had brought up a similar idea for researchers. Mentoring is very important; it was one of the big HERDSA messages, and almost everywhere I go stresses it, so it’s no surprise that it’s proposed as a means to improve research. But, given the realities of the modern Australian University, where more of our budget comes from teaching than from research, it is indicative of the inherent focus on research that I need to propose teaching-specific mentoring in reaction to research-specific mentoring, rather than vice versa.
However, there are successful general mentoring schemes where senior staff are paired with more junior staff to give them help with everything that they need and I quite like this because it stresses the nexus of teaching and research, which is supposed to be one of our focuses, and it also reduces the possibility of confusion and contradiction. But let’s return to the teaching focus.
The impact of an early-career teacher program would be quite interesting because, much as you might not encourage a very raw PhD to leap in with a grant application before there was enough supporting track record, you might have to restrict the teaching activities of ECTs until they had demonstrated their ability, taken certain courses or passed some form of peer assessment. That, in any form, is quite confronting, and not what most people expect when they take up a junior lectureship. It is, however, a practical way to ensure that we stress the value of teaching by placing basic requirements on the ability to demonstrate skill within that area! In some areas, as well as practical skill, we need to develop scholarship in learning and teaching – can we do this in the first years of the ECT with a course of educational psychology, discipline-specific educational techniques and practica, to ensure that our lecturers have the fundamental theoretical basis that we would expect from a school teacher?
Are we dancing around the point? Should we, extending the heresy, require something much closer to the Diploma of Education to certify academics as teachers, moving the ECR and the ECT together to give us an Early Career Academic (ECA) – someone who spends their first three years being mentored in research and teaching, perhaps even ending up with (some sort of) teaching qualification at the end? (With the increasing focus on quality frameworks and external assessment, I keep waiting for one of our regulatory bodies to slip in a ‘must have a Dip Ed/Cert Ed or equivalent’ clause sometime in the next decade.)
To say that this would require a major restructure in our expectations would be a major understatement, so I suspect that this is a move too far. But I don’t think it’s too much to put limits on the ways that we expose our new staff to difficult or challenging teaching situations, when they have little training and less experience. This would have an impact on a lot of teaching techniques and accepted practices across the world. We don’t make heavy use of Teaching Assistants (TAs) at my Uni but, if we did, a requirement to reduce their load and exposure would immediately push more load back onto someone else. At a time when salary budgets are tight and people are already heavily loaded, this is just not an acceptable solution – so let’s look at this another way.
The way that we can at least start this, without breaking the bank, is to emphasise the importance of teaching and take it as seriously as we take our research: supporting and developing scholarship, providing mentoring and extending that mentoring until we’re sure that the new educators are adapting to their role. These mentors can then give feedback, in conjunction with the staff members, as to what the new staff are ready to take on. Of course, this requires us to carefully determine who should be mentored, and who should be the mentor, and that is a political minefield as it may not be your most senior staff that you want training your teachers.
I am a fairly simple man in many ways. I have a belief that the educational role that we play is not just staff-to-student, but staff-to-staff and student-to-student. Educating our new staff in the ways of education is something that we have to do, as part of our job. There is also a requirement for equal recognition and support across our two core roles: learning and teaching, and research. I’m seeing a lot of positive signs in this direction so I’m taking some heart that there are good things on the nearish horizon. Certainly, today’s meeting met my suggestions, which I don’t think were as novel as I had hoped they would be, with nobody’s skull popping out of their mouth. I take that as a positive sign.
The Extrinsic Reward: As Seen in the Wild.
Posted: July 20, 2012 Filed under: Education, Opinion | Tags: community, curriculum, design, education, educational problem, higher education, in the student's head, principles of design, reflection, research, teaching, teaching approaches, thinking

“Why should I do it? What’s in it for me?”
How many times have you heard, said or thought the above sentiment, in one form or another? I go to a lot of meetings, so I hear this one a lot. Reanalysing my interactions with people over the past 12 months or so, I have been struck by how many people are clearly focused on the payoff – and this payoff is usually not related to their intrinsic reward mechanisms.
We get it from students when they ask “Will this be on the test?” (Should I study this? What’s in it for me?) We get it from our colleagues when they look at a new suggestion and say “Well, no-one’s going to do that.” (Which usually means “I wouldn’t do it. What’s in it for me?”) We get it from ourselves when we don’t do something because something else becomes more important – and this is very interesting as it often gives an indicator of where you sit on the work/life balance scale. Where I work, there are a large number of occasions where the rewards mechanisms used can result in actions and thinking patterns that, as an observer, I find both interesting and disturbing.
Let me give you some (very brief) background on how research funding works in Australia. You have a research idea, or are inside a group that has some good research ideas. You do research. You discover something. You write it up and get it published in conferences and journals. Repeat until you have enough publications to have a credible track record. You can now apply for funding from various bodies, so you spend 3–4 weeks writing up your great grant idea, really well, attach your track record evidence as part of your CV, and then wait. In my discipline, ICT, our success rate is very low, and very few of the people who apply for Australian Research Council Discovery Grants receive them. Now this is, of course, not a lottery – this is a game of skill! Your grant is rated by other people, you get some feedback, you can respond to this feedback (the rejoinder), and the ratings that you originally received, plus your rejoinder, go forward to a larger panel. Regrettably, there is not much money to go around (of the 22% of grants that get through across the board, most are only funded at the 50% level), so an initial poor rating means that your grant is (effectively) dead.
This makes grants scarce and intrinsically competitive, as well as artificially inflated in their perceived value. Receiving a grant will also get you public congratulations, the money and gear (obviously) and an invitation to the best Christmas cocktail party in the University – the Winner’s Circle, in effect. The same is true if you bring in a heap of research cash of any other kind – public praise, money and networking opportunities.
Which, if you think about it, is rather curious because you have just been given a wodge (technical term) of cash that you can use to hire staff and buy gear, travel to conferences, and basically improve your chances of getting another grant – but you then get additional extrinsic rewards, including the chance to meet the other people who have risen to this level. This is, effectively, a double reward and I suppose I wouldn’t have much of a problem with it, except that we start to run into those issues of extrinsic motivation again which risks robbing people of their inclination to do research once those extrinsic rewards dry up. I note that we do have a scheme to improve the grant chances of people who just missed out on getting Australian Research Council (ARC) funding but it is literally for those people who just missed out.
Not getting a grant can be a very negative result, because the absence of success is also often accompanied by feedback that will force you to question the value of your performance to date, rather than just the work that has been submitted.
When an early career researcher looks at the ARC application process and thinks “What’s in it for me?”, the answer is far more likely to be “an opportunity to receive feedback of variable quality, for the investment of several weeks of your life, from people with whom you are actively competing” than an actual grant. So this is obviously a point where mentoring, support and (yes) seed funding become very important, as they provide the ability to develop skill, confidence and (hopefully) the quality of the work, leading to success in the future. The core here, however, is not to bribe the person into improving; it’s to develop the person so that they improve. Regrettably, a scheme that is (effectively) rewarding the rewarded does not have a built-in “lifting up those who aren’t there yet” component. In fact, taking on a less experienced researcher is far more likely to hinder a more capable applicant’s chances. When a senior researcher looks at assisting a more junior researcher under the current system, “What’s in it for me?” is mostly “a reduced chance of success”. Given that this may also cut you out of the Winner’s Circle – as funds dry up, you are no longer successful, and it then gets harder to do the research and hence to get grants – combined with the fact that you can only apply once a year… it’s a positive disincentive to foster emerging talent, unless that talent is so talented that it probably doesn’t need much help!
So the extrinsic manipulation here has a built-in feedback loop and is, regrettably, prone to splitting people into two groups (successful and not) very early on, at the risk of those groups staying separated for some time to come.
If the large body of work in the area is to be believed, most people don’t plan with long-term outcomes in mind (hence, being told that if you work hard you might get a grant in five years is unlikely to change anyone’s behaviour) and, on top of that, as Kohn posits, praising a successful person is more likely to cause envy and division than any real improvement. How does someone else being praised tell you how to improve from your current position?
So what does all of this hot air mean for my students?
I have just finished removing all ‘attendance-based’ incentive schemes from my courses – there are no marks given just for showing up, in any form; marks are only achieved when you demonstrate that you have acquired knowledge. Achievement will not generate any additional reward – the achievement will be the reward. Feedback is crucial but, and this will be challenging, everything I say or do must provide the students with a way to improve, without resorting to the vaguer territory of general praise. I will be interested to see whether this has any (anecdotal) effect on the number of times someone asks “What’s in it for me?”
A Design Challenge, a Grand Design Challenge, if you will.
Posted: July 18, 2012 Filed under: Education | Tags: education, educational problem, educational research, feedback, Generation Why, grand challenge, higher education, in the student's head, measurement, principles of design, student perspective, teaching, teaching approaches, thinking, tools, universal principles of design, vygotsky, work/life balance

Question: What is one semester long, designed as a course for students who perform very well academically, has no prerequisites and can be taken by students with no programming exposure and by students with a great deal of programming experience?
Answer: I don’t know but I’m teaching it on Monday.

While I talk about students who perform well academically, this is for the first instance of this course. My goal is that any student can take this course, in some form, in the future.
The new course in our School, Grand Challenges in Computer Science, is part of our new degree structure, the Bachelor of Computer Science (Advanced). This adds a lot more project work and advanced concepts, without disrupting the usual (and already excellent) development structure of the degree. One of the challenges of dealing with higher-performing students is keeping them in a sufficiently large and vibrant peer group while also addressing the minor problem that they’re moving at a different pace from many of the people they are friends with. Our solution has been to add courses that sit outside the main progression but still provide interesting material for these students, as well as encouraging them to take a more active role in the student and general community. They can spend time with their friends, carry on with their degrees and graduate at the same time, but also stretch themselves to greater depth and into areas that we often don’t have time to cover.
In case you’re wondering, I know that some of my students read this blog and I’m completely comfortable talking about the new course in this manner because (a) they know that I’m joking about the “I don’t know” from the Answer above and (b) I have no secrets regarding this course. There are some serious challenges facing us as a species. We are now in a position where certain technologies and approaches may be able to help us with this. One of these is the notion of producing an educational community that can work together to solve grand challenges and these students are very much a potential part of this new community.
The biggest challenge for me is that I have such a wide range of students. I have students who potentially have no programming background and students who have been coding for four years. I have students who are very familiar with the School’s practices and University, and people whose first day is Monday. Of course, my solution to this is to attack it with a good design. But, of course, before a design, we have to know the problem that we’re trying to solve.
The core elements of this course are the six grand challenges as outlined by the NSF, research methods that will support data analysis, the visualisation of large data sources as a grand challenge in itself, and community participation to foster grand challenge communities. I don’t believe that a traditional lecture-based design is going to support this very well, especially as the two characteristics that I most want to develop in the students are creativity and critical thinking. I really want all of my students to be able to think their way around, over or through an obstacle, and I think that this course is going to be an excellent place to concentrate on this.
I’ve started by looking at my learning outcomes for this course – what do I expect my students to know by the end of it? Well, I expect them to be able to tell me what the grand challenges are, describe them, and provide examples of each one. I expect them to be able to answer questions about key areas and, in the areas that we explore in depth, demonstrate this knowledge through the application of relevant skills, including the production of assignment materials to the best of their ability, given their previous experience. Of course, this means that every student may end up performing slightly differently, which immediately means that personalised (or banded) assessment work is going to be required. It also means that the materials I use will need to support a surface reading, a more detailed reading and a deep reading, so that students can work through the material at their own pace.
I don’t want the ‘senior’ students to dominate, so there’s going to have to be some very serious scaffolding, and work from me, to support role fluidity and mutual respect – where the people leading discussion rotate into supporting a point, critiquing a point, or taking notes on it – to make sure that everyone gets a say and that we don’t inhibit the creativity that I’m expecting to see in this course. I will be setting standards for projects that take into account the level of experience of each person, discussed and agreed with the student in advance, based on their prior performance and previous knowledge.
What delights me most about this course is that I will be able to encourage people to learn from each other. Because the major assessment items are all unique to a student, sharing knowledge will not actually lead to plagiarism or copying. Students will be actively discouraged from doing work for each other but, in this case, I have no problem with students helping each other out – as long as the lion’s share of the work is done by the main student. (The wording of this is going to look a lot more formal, but that’s a Uni requirement. To quote “The Castle”, “It’s about the vibe.”) Students will regularly present their work for critique and public discussion, with their response to that critique forming a part of their assessment.
I’m trying to start these students thinking about the problems that are out there, while at the same time giving them a set of bootstrapping tools that can set them on the path to investigation and (maybe) solution well ahead of the end of their degrees. This then feeds into their project work in second and third year. (And, I hope, for at least some of them, Honours and maybe PhD beyond.)
Writing this course has been a delight. I have never had so much excuse to buy books and read fascinating things about challenging issues and data visualisation. However, I think that it will be the students’ response to this that will give me something I can share with other people – their reactions and suggestions for improvement will put a seal of authenticity on it that I can then pack up, reorganise, and put out into the world as modules for general first year and high-school outreach.
I’m very much looking forward to Monday!