HERDSA 2012: Connecting with VET

For the final session I attended in the general conference, I went to a talk co-presented by A/Prof Anne Langworthy and Dr Susan Johns on “Why is it important for Higher Education to connect with the VET sector?” As a point of clarification, VET stands for Vocational Education and Training, as I’ve previously mentioned, and is generally concerned with trade qualifications, with some students going on to diplomas and advanced diplomas that may be recognised as equivalent to some or all of a first-year University course. If a student starts in a trade and progresses to a certain point, we can then readily accept them into an articulation program.

The Tasmanian context, a small state that is still heavily ruralised, provides a challenging backdrop to this work, as fewer than 18% of school leavers who could go on to University actually do. Combine this with the highest proportion of low socio-economic status (SES) residents of any Australian state, and many Tasmanians can be considered disadvantaged in both access to and participation in University-level studies. Family influence plays a strong part here – many families have no-one in their immediate view who has been to University at all, although VET is slightly more visible.

Among the sobering statistics presented: in a twelve-year schooling system, where Year 12 is the usual pre-requisite for University entry, 54% of Tasmanians aged 15 or older had Year 10 schooling or less. Over half the adult population had not completed secondary schooling, the usual stepping stone to higher education.

The core drivers for this research were the following:

  1. VET pathways are more visible and accessible to low SES/rural students because the entry requirements aren’t necessarily as high and someone in their family might be able to extol the benefits.
  2. There are very low levels of articulation from VET to Higher Ed – few people are even going on to the diploma.
  3. There is an overall decline in VET and HE participation.
  4. Even where skills shortages are identified, students aren’t studying in the areas of regional need.
  5. UTAS is partnering with the Tasmanian VET provider Tasmanian Polytechnic and Skills Institute.

An interesting component of this background data is that, while completion rates for full VET qualifications are dropping, individual module completion rates are still higher than those of the courses in which the modules sit. In other words, people are likely to finish a module that is of use to them, or to an employer, but don’t necessarily see the need to complete the whole course. However, across the board, the real problem is that VET, so often a pathway to higher ed for people who didn’t quite finish school, is dropping in percentage terms as a pathway to Uni. There has been a steady decline in VET to HE articulation in Tasmania over the last six years.

The researchers opted for an evidence-based approach, examining those students who had succeeded in articulating from VET to HE, investigating their perceptions and then mapping existing pathways to discover what could be learned. The profiles of VET and HE students from the low SES/rural areas in Tasmania are pretty similar, although the VET students who did articulate into Uni were less likely to seek pathways that rewarded them with specific credits and were more likely to seek general admission. Given that these targeted articulations, with credit transfer, are supposed to reflect and reward student desire, or encourage participation in a specific discipline, it appears that these pathways aren’t working as well as they could.

So what are the motivators for those students who do go from VET to Uni? Career and vocational aspirations, increased confidence from doing VET, building on their practical VET basis, the quality and enthusiasm of their VET teachers, the need to bridge an existing educational hurdle and satisfaction with their progress to date. While participation in VET generally increased a student’s positive attitude to study, the decision to articulate (or not to) often came down to money, time, a perceived lack of industry connection and the need for more transitional assistance.

It’s quite obvious that our students, and industry, can become fixated on the notion of industrial utility – job readiness – and this may be to the detriment of Universities if we are perceived as ivory towers. Greater Higher Ed participation is excellent for breaking the poverty cycle and developing social inclusion, and the VET to Higher Ed nexus offers great benefits in terms of good student outcomes, such as progression and retention, but it’s obvious that people coming from a practice-based area, especially in terms of competency training, are going to need bridging and some understanding to adapt to the University teaching model. Or, of course, our model has to change. (Don’t think I was going to sneak that one past you.)

The authors concluded that bridging support is essential, that articulation and credit transfer arrangements should be reviewed, and that better articulation agreements should be offered in areas of national and regional priority. The cost of going to University, which may appear very small to students who are happy to defer payment, can be an impediment to lower SES participants because, on average, people in these groups are highly debt averse. The relocation and support costs of moving away from a rural community to an urban centre for education are also significant. It’s easy sometimes to forget how much it costs to live in a city, especially when you deprive someone of their traditional support networks.

Of course, students can feel closer to industry when they undertake VET, and Universities can earn some disrespect, fairly or not, for being seen as ivory towers, too far away from industry. If you have no-one in your family who has been to Uni, there’s no advocate for the utility of ‘wasting’ three years of your life not earning money in order to get a better job or future. However, this is yet another driver for good industry partnerships and meaningful relationships between industry, VET and Higher Education.

It’s great to see so much work being done, both to understand and then to act on, some of the more persistent problems facing people who may never even see a University unless we’re dynamic and thoughtful in our outreach programs. On a personal note, while Tasmania has the lowest VET to HE conversion, I noticed that South Australia (my home state) has the second lowest and a similar decline. Time, I think, to take some of the lessons learned and apply them in our own back yard!


HERDSA 2012: Informal Communities of Practice: What are the advantages?

One of the talks I went to today was on “Money, Mountains and the Law: The powerful process of interdisciplinary collaboration”. I’m afraid that I can’t give you all of the names of the presenters, as there were two physical and three virtual presenters (Edit: The speakers were Leslie Almberg and Judy McGowan – thanks, Leslie!) – and the paper was submitted by Symons (spelling corrected!), Almberg, Goh and McGowan, from Curtin in Western Australia.

The academics in question all came from different disciplines, and different generations and cultures of academia, and found that they had a key thing in common: they considered themselves to be “constructive dissenters” – people who are not happy with how things are in their own patch but, rather than just grumbling, are looking to make positive change. In this case, these academics had to step outside of their own discipline, looking for a framework for embedding language elements into their courses, and their similarities were identified by a facilitator who said, effectively, “You’re all saying the same thing from your own discipline.”

The language expert, in this case, worked as an interdisciplinary hub – a meeting point for the other three academics. For me, what was most interesting here was how the community of practice was defined between people with similar ideas, rather than people from similar disciplines.

One of the academics, who self-described as an end-of-life academic, was musing on the difference between the modern academy and the one that she had originally entered. The new academy is competitive, full of Roosters (in the strutting sense, rather than the sitting-on-eggs sense [Edit: Roosters don’t sit on eggs, do they?]), and requires you to be constantly advertising your excellence. (I’ll speak more on this in another post.) This makes it harder to form an in-discipline community of practice, because there’s always the chance that you will think of the person across from you as a competitor first and a collaborator second.

I have a blog!

The advantage of the interdisciplinary community of practice, as outlined in the talk, is that it sits outside of your traditional hierarchies, formality and established spaces for competition. It doesn’t matter if you tell someone your amazing teaching secret – you won’t be competing against them for promotion. Better still, telling someone something good doesn’t have to make them (if they’re keyed this way) feel bad because you’re outperforming them on their home turf.

In the words of the speakers: these interdisciplinary communities of practice are “organic, outside of hierarchies and silos, provide support mechanism, remove the undercurrent of competitiveness and liberate”.

This forced me to carry out some reflection, because I am, to a great extent, someone who would be easy to describe as a rooster. I am a very visible, and mildly successful, early career academic who has a number of things to talk about. (Long-time readers will know that an absence of content hasn’t actually slowed me down yet, either.) I try very hard to be inclusive, to help, to be a mentor in the small circle of expertise where that would apply, and to shut up when I can’t help. I work with just about anyone who wants to work with me, but I do expect people to work. I collaborate inside my school, outside of my discipline and outside of my University and, to be honest, the feeling of liberation in working with someone else who has the same problems is fantastic – it makes you feel less alone. The fact that I can share ideas with someone and know that we’re building bridges, not being boastful or accidentally (but implicitly) belittling people who haven’t achieved the same things, is one of the best reasons to work outside your discipline.

But I realise that I would not be some people’s first choice to work with because I am still, by many interpretations, a rooster. Obviously, I have a lot of personal reflection left to do on this, to work out how I can still achieve and maintain a position as a positive role model while being fairly sure that I don’t end up as a point of division or someone who is seen as a glory hound. I would be slightly surprised if the last were felt widely, but it’s a good time to step back and think about my dealings with the people who are more successful than me (have I been resentful or subordinate?), the people who are on my level (are we helping each other?) and those people who might benefit from my help (am I helping them or am I being unapproachable?).


HERDSA 2012: Integrating concepts across common engineering first year courses

I attended a talk by Dr Andrew Guzzomi on “Interdisciplinary threshold concepts in engineering”, where he talked about the University of Western Australia’s reconfiguration of their first-year common engineering program in the face of their new 3+2 course roll-out across the University. Most Unis have a common engineering first year that is the basis for all disciplines. This is usually a collection of individual units, each focusing on one discipline, developed and taught by academics from that discipline. For example, civil engineers teach statics and mechs teach dynamics, but there is no guaranteed connection or conceptual linkage between the two areas. This is despite the fact that statics is effectively dynamics with some key variables set to zero. (Engineers, you may now all howl in dismay!)
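
To spell that claim out (my gloss, not the speaker’s): start from Newton’s second law for forces and moments, set the accelerations to zero, and the equations of dynamics collapse into the equilibrium equations of statics.

    % Dynamics: Newton's second law for forces and moments
    \sum \vec{F} = m\vec{a}, \qquad \sum \vec{M} = I\vec{\alpha}
    % Statics: the special case where \vec{a} = 0 and \vec{\alpha} = 0
    \sum \vec{F} = 0, \qquad \sum \vec{M} = 0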

This work looked at what the threshold concepts are for engineering. Threshold concepts are transformative, in that understanding them changes the way that you think about the discipline, but they are also troublesome: they need work to teach and effort to learn. In theory, though, if we identify the right threshold concepts then we can:

  • Focus teaching, learning and assessment activities
  • Renew otherwise crowded curricula

This is a big issue as we balance the requirements of our students, our discipline, our professional bodies and industry – we have to make sure that whatever we teach is appropriate and the most useful (in all of the objective spaces) thing that we can be teaching.

Dr Guzzomi then discussed the ALTC (Australian Learning and Teaching Council) project that supported the basic investigation: conducting an inventory of what all groups considered to be the core threshold concepts. UWA was the case study, with an aim of producing a guide for other educators and of adding back to threshold concept theory. This is one of the main contributions of the large-scale Australia-wide educational research support bodies: they can give enough money and influence to a project to allow change to occur.

(I picked up from the talk that, effectively, it helped to have a Chair of Engineering Education on board to get an initiative like this through. Even then, there was still resistance from some quarters. This isn’t surprising. If we all agreed with each other, I’d be shocked.)

The threshold concept identification required a very large set of workshops and consultative activities, across students and staff both within and without the discipline, starting with a diversification phase as concepts were added, and then moving to an integration phase that rationalised these concepts down into the set that really expressed the key threshold concepts of engineering for first year.

The implementation, in syllabus terms, required the implementers to:

  • Focus teaching and learning on the threshold concepts
  • Address troublesome features
  • Provide opportunities to experience variation (the motion unit was taught using variation theory, with students working at individual tables, doing different problems at different tables, then pooling similar answers for comparison to show the differences in approach and answer)

The team then developed concept maps for each unit, showing inclusion, requirements, examples, dependencies and so on.

This was then turned into a course implementation that had no lectures at all: courses were composed of four individual units that had readings, tutorial-like information sessions and a two-hour studio session comprising practicals and more interactive sessions. I did ask Andrew about the assessment mechanisms in use and, while they’ve been completely rebuilt for the new course, they are still reviewing these to make sure that they exercise the threshold concepts appropriately. (I’ll be sending him e-mail to get more detail on this.)

Their findings so far are that these concept identification exercises have revealed the connections between the disciplines and the application of the same concepts across the whole of the discipline. Three concepts were identified as good examples of concepts whose reach spreads across all disciplines (integrating threshold concepts):

  1. System identification: working out which system the problem fits into, to allow you to simplify analysis
  2. Modelling and abstraction: quantitative analysis is facilitated through translation into mathematical language, and students use judgement to break a system into salient components for modelling
  3. Dimensional reasoning: identifying the variables needed to describe a complex system and making sure that equations balance
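
As a tiny worked illustration of the third concept (my example, not one from the talk), consider checking that the kinematics equation v = u + at balances dimensionally:

    % Every term must carry the same dimensions.
    [v] = \mathrm{m\,s^{-1}}, \quad [u] = \mathrm{m\,s^{-1}}, \quad
    [a\,t] = (\mathrm{m\,s^{-2}})(\mathrm{s}) = \mathrm{m\,s^{-1}}
    % All three terms are velocities, so the equation is dimensionally consistent.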

The conclusions were relatively straightforward:

  • Rather than a traditional and relatively unlinked common foundation, teaching integrating concepts is showing promise
  • Threshold concepts provided the lens, and developed the approach, for integrating the disciplines
  • Teaching through variation supports student diversity in solutions
  • This approach reveals connections across engineering disciplines, beyond the one in which students later choose to specialise

UWA and the University of Melbourne run a very different degree program from the rest of us, so it’s always interesting to see what they are up to. In this case, there’s a lot going on. Not only have they done a great deal of surveying in order to find the threshold concepts upon which their courses are now built, but they’ve also completely changed their teaching style to support it, with much greater use of collaboration and team work. I’ll be very interested to see some more follow-up on this after it’s run for the full year.

HERDSA 2012: Session 1 notes – Student Wellbeing

I won’t be giving detailed comments on all sessions – firstly, I can’t attend everything and, secondly, I don’t want you all to die of word poisoning – but I’ve been to a number of talks and thought I’d discuss those here that really made me think. (My apologies for the delay. I seem to be coming down with a cold/flu and it’s slowing me down.)

In Session 1, I went to a talk entitled “Integrating teaching, learning, support and wellbeing in Universities”, presented by Dr Helen Stallman from the University of Queensland. The core of this talk was that, if we want to support our students academically, we have to support them in every other way as well. The more distressed students are, the less well they do academically. If we want good outcomes, we have to be able to support students’ wellbeing and mental health. We already provide counselling and support skill workshops, but very few students will go and access these resources until they actually need them.

This is a problem. Tell a student at the start of the course, when they are fine, where they can find help, and they won’t remember it when they actually need to know where that resource is. We have low participation in many of the counselling and support skill workshop activities – going to one of these courses is not on the student’s agenda; getting a good mark is. Pressured for time and facing competing demands, anything ‘optional’ is not a priority.

The student needs to identify that they have a problem, and then they have to be able to find the solution! Many University webpages are not actually useful in this regard, although they contain a lot of marketing information on the front page.

What if we had an at-risk profile that we could use to identify students? It’s not 100% accurate. Students who fit the ‘at risk’ profile may not have problems, but students who don’t fit the profile may still have problems! We don’t necessarily know what’s going on with our students. Where we have hundreds of students, how can we know all of them? (This is one of the big drivers for my work in submission management and elastic time – identifying students who are at risk as soon as they may be at risk.)
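
To make that concrete, here is a minimal sketch of the kind of early-warning flag I have in mind for my own submission-management work – the record fields and thresholds are invented for illustration, not taken from Dr Stallman’s tools or any real system:

    # A naive at-risk heuristic over submission behaviour (illustrative only).
    from dataclasses import dataclass

    @dataclass
    class StudentRecord:
        name: str
        late_submissions: int    # assignments submitted after the deadline
        missed_submissions: int  # assignments never submitted
        avg_mark: float          # running average, 0-100

    def at_risk(s: StudentRecord) -> bool:
        # Combining weak signals is still only a heuristic, which is exactly
        # the caveat above: profiles miss people in both directions.
        return (s.missed_submissions >= 1
                or s.late_submissions >= 2
                or s.avg_mark < 50.0)

    students = [StudentRecord("A", 0, 0, 72.0), StudentRecord("B", 3, 1, 48.0)]
    print([s.name for s in students if at_risk(s)])  # ['B'] - a prompt for human follow-up, not a verdict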

So let me reiterate the problem with the timing of information: we tend to mention support services once, at the start. People don’t access resources unless they’re relevant and useful at that particular time. Talk to people when they don’t have a problem and they’ll forget it.

So what are the characteristics of interventions that promote student success?

  • Inclusive of all students (and easy for them to find)
  • Encourages self-management skills (don’t smother them! Our goal is not dependency, it’s self-regulation)
  • Promotes academic achievement (the highest potential for each of our students)
  • Promotes wellbeing (not just professional capabilities but personal capabilities and competencies)
  • Minimally sufficient (students/academics/unis are not doing more work than they need to, and only provide the level of input required to achieve the goal)
  • Sustainable (easy for students and academics)

Dr Stallman then talked about two tools – the Learning Thermometer and The Desk. Student reflection plus a system interface gives us the Learning Thermometer; automated and personalised student feedback, entered by the academic, is then added, with web-based support and intervention forming a loop around that feedback. Student privacy is maintained and the student gets to choose the intervention that is appropriate. Effectively, the Learning Thermometer tells the student which services are available, as and when they are needed, based on their results, their feedback and the lecturer’s input.

This is designed to promote self-management skills and makes the student think “What can I do? What are the things that I can do? Who are the people who can help me?” It gives students knowledge of the resources they can access (and this resource is called “The Desk”).

What is being asked is: What are the issues that get in the way of achieving academic success?

About “The Desk”: it contains quizzes, related to all parts of The Desk, that give students personalised feedback and module suggestions as appropriate. There is a summary sheet of what you’ve done, so you can always refer back to it, a Tools section with short tips on how to fix things, and a Coffee House social media centre for sharing information and pictures (recipes and anything, really).

To allow teachers to work out what is going on, an addition to the Learning Thermometer can give the teacher feedback based on student reflection and the interface. Early feedback to academics allows us to improve learning outcomes and teaching practices. (Student satisfaction correlates poorly with final marks; this is about more than satisfaction.)

The final items in the talk focussed on:

  • A universal model of prevention
  • All students can be resilient
  • Resources need to be timely, relevant and useful
  • Multiple access points
  • Integrated within the learning environment

What are the implications?

  • Focus on prevention
  • Close the loop between learning, teaching, wellbeing and support
  • More resilient students
  • Better student graduate outcomes.

Overall, a very interesting talk, with a lot of things to think about. How can I position my support resources so that students know where to go as and when they need them? Is ‘resiliency’ an implicit or explicit goal inside my outcomes and syllabus structure? Do the mechanisms that I provide for assessment work within this framework?

With my Time Banking hat on, I am always thinking about how I can be fair but flexible, consistent but compassionate, and maintain quality while maintaining humanity. This talk is yet more information to consider as I look at alternative ways to work with students for their own benefit, while improving their performance at the same time.

Contact details and information on tools discussed:

h.stallman@uq.edu.au
http://www.thelearningthermometer.org.au
http://www.thedesk.org.au
thedesk@uq.edu.au


HERDSA Keynote: “Cultivating Connections Throughout the Academe: Learning to Teach by Learning to Learn”, Dr Kathy Takayama.

A very interesting keynote today and I took a lot of notes. Anyone who has read my SIGCSE blogs knows that I’m prone to being verbose so I hope that this is useful if wordy. (Any mistakes are, of course, mine.)

Dr Takayama started her talk with an art image: “Venn diagrams (under the spotlight)” by Amalia Pica. She stressed how the Venn diagram embodies simplicity, versatility, interaction, connection and commonality, and also transcends boundaries – the overlap of two colours produces a new colour. We can also consider this as an absence of difference (sharing) or as new knowledge (creation). However, the background of the artwork was the artist’s experience of the suppression of group theory and Venn diagrams in Argentina, as both were seen to encourage subversive forms of group activity and critical theory. Dr Takayama then followed this thread into ideas of inclusion and exclusion. What are the group dynamics of our structures? How do we group students and academics into exclusive and inclusive domains? What does this mean for our future? How does this limit learning?

She then talked about our future professoriate – those students who will go on to join us in the professorial ranks. She broke this into three aspects: disciplinary identity, dispositions for engagement and integrative communities. Our disciplinary identity reflects our acculturation to disciplinary practices and habits of mind, while our dispositions concern how we find ourselves in the learning – rather than focusing on what to learn.

“In the face of today’s hyper-accelerated ultra-competitive global society, the preservation of opportunities for self-development and autonomous reflection is a value we underestimate at our peril.” (Richard Wolin)

When we discuss the emergence of disciplinary identity, we are talking about expert thinking – the scholarly habits of a discipline that allow someone to identify themselves as a member of that discipline. What do we do that allows us to say “I am a microbiologist” or “I am an engineer”?

The development of expertise comes through iterative authentic experiences – truly appropriate activities carried out inside the discipline – where we have discipline-centred practices, including signature pedagogies (Shulman). The signature pedagogies of a discipline have four features: they must be pervasive, routine, habitual and deeply engaging.

Dr Takayama then discussed, at some length, a study in placing students into unfamiliar territory, where they were required to take on scholarly habits from another discipline. In this case, Dr Takayama (a microbiologist) exchanged scholarly habits with the students of David Reichart, a historian. Academics conform to the standard practices of their disciplines and students acculturate quickly, so Takayama and Reichart sought to borrow pedagogies from other areas to lead students into new thinking processes.

Students from the History course were required to use a science-style research poster to present their work, instead of a traditional report. The word “poster” was reacted to badly – students thought it was a cheapening of a year’s work. Students had to think outside of their norms and discovered new aspects of communication, voice and interpretation in the unfamiliar territory. This also added a challenge component and allowed a multi-dimensional exploration of the area.

The microbiology students had to document their research in a completely blank book and were allowed to create a narrative in that blank book. This was at odds with the usual structure for Science: accurate, reproducible, adhering to convention, no narrative, no first person, dates, signatures. While accuracy and reproducibility were still enforced, students were encouraged to explore much more widely in their blank books.

Student work started to resemble commonplace books (loci communes) – compiled works with annotations and narrative from the compiler. The new student books contained personalisation, reflection, narrative, collage, and moments of exhilaration and discovery – but they maintained fidelity and scientific accuracy.

This then led to the core idea of the work: (an) engagement with the unfamiliar as a means for further development of expertise.

Students’ understandings are deeply tied to existing and established practices – to the point that students feared that outside conventions would render their work invalid. Working in unfamiliar territory allows students to refine their understanding of their discipline and push its boundaries, as well as their own understanding. Lecturers had to take risks as well, to reach this realisation.

In our traditional dispositions for engagement, we have had a tendency to create a learning culture that is less interested in the unfamiliar, and we have implicitly driven a focus on understanding a discipline rather than developing an understanding of oneself. The nature of learning as situated in institutional cultures is something that we can see from the inside, but the student perspective is vital, as we want to know what the students think we look like. From the students’ perspective, learning is seen in terms of specialisation, globalisation, technology and collaboration. This is a critical forum through which students make sense of their own place in relation to the discipline.

Students identified two over-arching goals:

  • Routine expertise: the habits of mind and skills associated with efficiency and performance in familiar domains, and
  • Adaptive expertise (after Bransford): applying knowledge effectively to novel situations or unique problems.

Students discover themselves in the material – finding connection and allowing deep enquiry into their own nature. (Students’ awareness of themselves in the course or the curriculum (Barbazat, Amherst).)

Looking from our perspective, based on what our students want and how they succeed, Barnett (U London) identified the dispositions for learning as “venturing forward”:

  • A will to learn
  • A will to encounter the unfamiliar
  • A will to engage
  • A preparedness to listen
  • A willingness to be changed
  • A determination to keep going.

Dr Takayama then went on to talk about developing a strong learning and teaching community through courses such as Brown’s certification program, which has many benefits in enhancing the perception of value and practices in learning and teaching, as well as the overall enhancement of the new post-graduates. One of the core points identified was that many of the PhD students we produce will go on to teach in liberal arts colleges, institutions with an undergraduate teaching focus, and two-year colleges. If we don’t teach them how to teach then they will be woefully underprepared for the future that lies before them – being good at research doesn’t translate into skill at teaching, so teaching skill must be fostered, and well-organised certification programs are a good way to do this.

I hope to comment more on the certification program shortly, but it was a very interesting talk, with lots of ideas for me to take home and think about.

When the Stakes are High, the Tests Had Better Be Up to It.

(This is on the stronger opinion side but, in the case of standardised testing as it is currently practised, this will be a polarising issue. Please feel free to read the next article and not this one.)

If you make a mistake, please erase everything from the worksheet, and then leave the room, as you have just wasted 12 years of education.

A friend on FB (thanks, Julie!) linked me to an article in the Washington Post that some of you may have seen. The article is called “The Complete List of Problems with High-Stakes Standardised Tests” by Marion Brady, in the words of the article, a “teacher, administrator, curriculum designer and author”. (That’s attribution, not scare quotes.)

Brady provides a (rather long but highly interesting) list of problems with the now very widespread standardised testing regime that is an integral part of student assessment in some countries. Brady focuses on the US, but there is little doubt that the same problems exist in other places. From my readings and discussions with US teachers, he is covering issues that are well-known problems in the area, but they are slightly intimidating when presented as a block.

So many problems are covered here, from an incorrect focus on simplistic repetition of knowledge because it’s easier to assess, to the way that it encourages extrinsic motivations (bribery or punishment in the simplest form), to the focus on test providers as the stewards and guides of knowledge rather than the teachers. There are some key problems, and phrases, that I found most disturbing, and I quote some of them here:

[Teachers oppose the tests because they]

“unfairly advantage those who can afford test prep; hide problems created by margin-of-error computations in scoring; penalize test-takers who think in non-standard ways”

“wrongly assume that what the young will need to know in the future is already known; emphasize minimum achievement to the neglect of maximum performance; create unreasonable pressures to cheat.”

“are open to massive scoring errors with life-changing consequences”

“because they provide minimal to no useful feedback”

This is completely at odds with what we would consider to be reasonable education practice in any other area. If I had comments from students that identified that I was practising 10% of this, I would be having a most interesting discussion with my Head of School concerning what I was doing – and a carpeting would be completely fair! This isn’t how we should teach and we know it.

I spoke yesterday about an assault on critical thinking as being an assault on our civilisation, short-sightedly stabbing away at helping people to think as if it will really achieve what (those trying to undermine critical thinking) actually wanted. I don’t think that anyone can actually permanently stop information spreading, when that information can be observed in the natural world, but short-sightedness, malign manipulation of the truth and ignorance can certainly prevent individuals from gaining access to information – especially if we are peddling the lie that “everything which needs to be discovered is already known.”

We can, we have and we probably (I hope) always will work around these obstacles to information, these dark ages as I referred to them yesterday, but at what cost to the great minds who cannot be applied to important problems because they were born to poor families, in the ‘wrong’ state, in a district with no budget for schools, or had to compete against a system that never encouraged them to actually think?

The child who would have developed free safe power, starship drives, applicable zero-inflation stable economic models, or the “cure for cancer” may be sitting at the back of a poorly maintained, un-airconditioned, classroom somewhere, doodling away, and slowly drifting from us. When he or she encounters the standardised test, unprepared, untrained, and tries to answer it to the extent of his or her prodigious intellect, what will happen? Are you sufficiently happy with the system that you think that this child will receive a fair hearing?

We know that students learn from us, in every way. If we teach something in one way but we reward them for doing something else in a test, is it any surprise that they learn for the test and come to distrust what we talk about outside of these tests? I loathe the question “will this be in the exam” as much as the next teacher but, of course, if that is how we have prioritised learning and rewarded the student, then they would be foolish not to ask this question. If the standardised test is the one that decides your future, then, without doubt, this is the one that you must set as your goal, whether student, teacher, district or state!

Of course, it is the future of the child that is most threatened by all of this, as well as the future of the teaching profession. Poor results on a standardised test for a student may mean significantly reduced opportunity, and reduced opportunity, unless your redemptive mechanisms are first class, means limited pathways into the future. The most insidious thread through all of this is the idea that a standardised test can be easily manipulated through a strategy of learning what the answer should be, to a test question, rather than what it is, within the body of knowledge. We now combine the disadvantaged student having their future restricted, competing against the privileged student who has been heavily channeled into a mode that allows them to artificially excel, with no guarantee that they have the requisite aptitude to enjoy or take advantage of the increased opportunities. This means that both groups are equally in trouble, as far as realising their ambitions, because one cannot even see the opportunity while the other may have no real means for transforming opportunity into achievement.

The desire to control the world, to change the perception of inconvenient facts, to avoid hard questions, to never be challenged – all of these desires appear to be on the rise. This is the desire to make the world bend to our will, the real world’s actual composition and nature apparently not mattering much. It always helps me to remember that Cnut stood in the waves and commanded them not to come in order to prove that he could not control the waves – many people think that Cnut was defeated in his arrogance, when he was attempting to demonstrate his mortality and humility, in the face of his courtiers telling him that he had power above that of mortal men.

How unsurprising that so many people misrepresent this.


Who Knew That the Slippery Slope Was Real?

Take a look at this picture.

Dan Ariely. Photo: poptech/Flickr, via wired.com.

One thing you might have noticed, if you’ve looked carefully, is that this man appears to have had some reconstructive surgery on the right side of his face, and there is a colour difference, slightly accentuated by the lack of beard stubble. What if I were to tell you that this man was offered the chance to have fake stubble tattooed onto that section and, when he declined because he felt strange about it, received more pressure and, in his words, more of a guilt trip than for any other procedure during the extensive time he spent in hospital receiving skin grafts and burn treatments? Why was the doctor pressuring him?

Because he had already performed the tattooing remediation on two people and needed a third for the paper. In Dan’s words, again, the doctor was a fantastic physician, thoughtful, and he cared, but he had a conflict of interest that moved him to a different mode of behaviour. For me, I had to look a couple of times, because the asymmetry that the doctor referred to is not that apparent at first glance. Yet the doctor felt compelled, by interests that were not Dan’s, to make Dan self-conscious about the perceived problem.

A friend on Facebook (thanks, Bill!) posted a link to an excellent article in Wired, entitled “Why We Lie, Cheat, Go to Prison and Eat Chocolate Cake” by Dan Ariely, the man pictured above. Dan is a professor of behavioural economics and psychology at Duke and his new book explores the reasons that we lie to each other. I was interested in this because I’m always looking for explanations of student behaviour and I want to understand their motivations. I know that my students will rationalise and do some strange things but, if I’m forewarned, maybe I can construct activities and courses in a way that heads this off at the pass.

There were several points of interest to me. The first was the question of whether a cost/benefit analysis of dishonesty – do something bad, go to prison – actually has the effect that we intend. As Ariely points out, if you talk to the people who got caught, the long-term outcome of their actions was never something that they thought about. He also discusses the notion of someone taking small steps, a little each time, that move them from law-abiding, for want of a better word, to dishonest. Rather than setting out to do bad things in one giant leap, people tend to take small steps, rationalising each one, with each step opening up a range of darker and darker options.

Welcome to the slippery slope – beloved argument of rubicund conservative politicians since time immemorial. Except that, in this case, it appears that the slope is piecewise composed of tiny little steps. Yes, each step requires a decision, so there isn’t the momentum that we commonly associate with the slope, but each step, in some sense, takes you further and further from the honest place from which you started.

Ariely discusses an experiment where he gave two groups designer sunglasses, told one group that they had the real thing and the other that they had fakes, and then asked them to complete a test and gave them a chance to cheat. The people who had been randomly assigned to the ‘fake sunglasses’ group cheated more than the others. Now, there are many possible reasons for this. One of them, which is Ariely’s argument, is that if you know you are signalling your status deceptively to the world, you are in a mindset where you have already taken a step towards dishonesty, and cheating a little more is an easier step. I can see many interpretations of this because of the nature of the cheating, which was in reporting how many questions you completed on the test: self-esteem issues caused by being in the ‘fake’ group may lead you to over-promote your success on the quiz – but it’s still cheating. Ultimately, whatever is motivating people to take that step, the step appears to be easier if you are already inside the dishonest space, even to a degree.

[Note: Previous paragraph was edited slightly after initial publication due to terrible auto-correcting slipping by me. Thanks, Gary!]

Where does something like copying software or illicitly downloading music come into this? Does this constant reminder of your small, well-rationalised step into low-level lawlessness have any impact on the other decisions that you make? It’s an interesting question because, following the outline of Ariely’s sunglasses experiment, we would expect it to be more of a problem if the products became part of your projected image. We know that having developed a systematic technological solution for downloading is the first hurdle in terms of achieving downloads, but is it also the first hurdle in making steadily less legitimate decisions? I actually have no idea, but I would be very interested to see some research in this area. I feel it’s too glib to assume a relationship, because it is such a ‘slippery slope’ argument, but Ariely’s work now makes me wonder. Is it possible that, after downloading enough music or software, you could actually rationalise the theft of a car? Especially if you were only ‘borrowing’ it? (Personally, I doubt it, because I think that there are several steps in between.) I don’t have a stake in this fight – I have a personal code for behaviour in this sphere that I can live with – but I see some benefits in asking and trying to answer these questions from something other than personal experience.

Returning to the article, of particular interest to me was the discussion of an honour code, such as Princeton’s, where students sign a pledge. Ariely sees its benefit as a reminder that remains active for some time but, ultimately, has little value over several years because, as we’ve already discussed, people rationalise in small increments over the short term rather than constructing long-term models where the pledge would make a difference. Sign a pledge at the start of 2012 and it may just not have any impact on you by the middle of 2012, let alone at the end of 2015 when you’re trying to graduate. Potentially, at almost any cost.

In terms of ongoing reminders, and a signature on a piece of work saying (in effect) “I didn’t cheat”, Ariely asks what happens if you have to sign the honour clause after you’ve finished a test – well, if you’ve finished, then any cheating has already occurred and the honour clause is useless. If you remind people at the start of every assignment and every test, and get them to pledge at the beginning, then this should have an impact – a halo effect to an extent, or a reminder of expectation that will make it harder for you to rationalise your dishonesty.

In our school we have an electronic submission system that students are required to use to submit their assignments. It has boilerplate ‘anti-plagiarism’ text and you must accept the conditions to submit. However, this is your final act before submission, and you have already finished the code, which falls immediately into the trap mentioned in the previous paragraph. Dan Ariely’s answers have made me think about how we can change this to make it more of an upfront reminder, rather than an ‘after the fact – oh, it may be too late now’ auto-accept at the end of the activity. And, yes, reminder structures and behaviour modifiers in time banking are also being reviewed and added in the light of these new ideas.
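
As a minimal sketch of the change I have in mind (an illustration only – the function names and storage are invented, not our real submission system’s API), the pledge prompt moves from the final click-through to the moment the student starts the assignment:

    # Pledge-first workflow sketch (hypothetical; not an existing system).
    import time

    pledges = {}  # (student_id, assignment_id) -> timestamp of the pledge

    def start_assignment(student_id: str, assignment_id: str) -> None:
        # The honour pledge is made *before* any work is done, when it can
        # still influence behaviour -- Ariely's point about timing.
        print("I pledge that the work I submit will be my own.")
        pledges[(student_id, assignment_id)] = time.time()

    def submit(student_id: str, assignment_id: str, files: list) -> bool:
        # Refuse submission unless the up-front pledge was recorded; the
        # end-of-activity boilerplate click-through is gone entirely.
        if (student_id, assignment_id) not in pledges:
            print("No pledge on record - please start the assignment first.")
            return False
        print(f"Accepted {len(files)} file(s) from {student_id}.")
        return True

    start_assignment("a1234567", "assignment3")
    submit("a1234567", "assignment3", ["solution.py"])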

The Wired Q&A is very interesting and covers a lot of ground but, realistically, I think I have to go and buy Dan Ariely’s book(s), prepare myself for some harsh reflection and thought, and plan for a long weekend of reading.


Time Banking and Plagiarism: Does “Soul Destroying” Have An Ethical Interpretation?

Yesterday, I wrote a post on the 40 hour week, to give an industrial basis for the notion of time banking, and I talked about the impact of overwork. One of the things I said was:

The crunch is a common feature in many software production facilities and the ability to work such back-breaking and soul-destroying shifts is often seen as a badge of honour or mark of toughness. (Emphasis mine.)

Back-breaking is me being rather overly emphatic regarding the impact of work, although in manual industries workplace accidents caused by fatigue and overwork can and do break backs – and worse – on a regular basis.

Is it Monday morning already?

But soul-destroying? Am I just saying that someone will perform their tasks as an automaton or zombie, or am I saying something more about the benefit of full cognitive function – the soul as an amalgam of empathy, conscience, consideration and social factors? Well, the answer is that, when I wrote it, I was talking about mindlessness and the removal of the ability to take joy in work, which is on the zombie scale, but as I’ve reflected on the readings more, I am now convinced that there is an ethical dimension to fatigue-related cognitive impairment that is important to talk about. Basically, the more tired you get, the more likely you are to focus narrowly on the task itself, and this can have some serious professional and ethical consequences. I’ll provide a basis for this throughout the rest of this post.

The paper I was discussing, on why Crunch Mode doesn’t work, listed many examples from industry and one very interesting paper from the military. The paper, which had a broken link in the Crunch mode paper, may be found here and is called “Sleep, Sleep Deprivation, and Human Performance in Continuous Operations” by Colonel Gregory Belenky. Now, for those who don’t know, in 1997 I was a commissioned Captain in the Royal Australian Armoured Corps (Reserve), on detachment to the Training Group to set up and pretty much implement a new form of Officer Training for Army Reserve officers in South Australia. Officer training is a very arduous process and places candidates, the few who make it in, under a lot of stress and does so quite deliberately. We have to have some idea that, if terrible things happen and we have to deploy a human being to a war zone, they have at least some chance of being able to function. I had been briefed on most of the issues discussed in Colonel Belenky’s paper but it was only recently that I read through the whole thing.

And, to me today as an educator (I resigned my commission years ago), there are still some very important lessons, guidelines and warnings for all of us involved in the education sector. So stay with me while I discuss some of Belenky’s terminology and background. The first term I want to introduce is droning: the loss of cognitive ability through lack of useful sleep. As Belenky puts it, in the context of US Army Ranger training:

…the candidates can put one foot in front of another and respond if challenged, but have difficulty grasping their situation or acting on their own initiative.

What was most interesting, and may surprise people who have never served with the military, is that the higher the rank, the less sleep people got – and the higher level the formation, the less sleep people got. A Brigadier in charge of a Brigade is going to, on average, get less sleep than the more junior officers in the Brigade and a lot less sleep than a private soldier in a squad. As an officer, my soldiers were fed before me, rested before me and a large part of my day-to-day concern was making sure that they were kept functioning. This keeps on going up the chain and, as you go further up, things get more complex. Sadly, the people shouldering the most complex cognitive functions with the most impact on the overall battlefield are also the people getting the least fuel for their continued cognitive endeavours. They are the most likely to be droning: going about their work in an uninspired way and not really understanding their situation. So here is more evidence from yet another place: lack of sleep and fatigue lead to bad outcomes.

One of the key issues Belenky talks about is the loss of situational awareness caused by the accumulated sleep debt, fatigue and overwork suffered by military personnel. He gives an example of an Artillery Fire Direction Centre – this is where requests for fire support (big guns firing large shells at locations some distance away) come in; the human plotters take those requests and transform them into instructions that can be given to the gunners, and then firing starts. Let me give you a (to me) chilling extract from the report, which the Crunch Mode paper also quoted:

Throughout the 36 hours, their ability to accurately derive range, bearing, elevation, and charge was unimpaired. However, after circa 24 hours they stopped keeping up their situation map and stopped computing their pre-planned targets immediately upon receipt. They lost situational awareness; they lost their grasp of their place in the operation. They no longer knew where they were relative to friendly and enemy units. They no longer knew what they were firing at. Early in the simulation, when we called for simulated fire on a hospital, etc., the team would check the situation map, appreciate the nature of the target, and refuse the request. Later on in the simulation, without a current situation map, they would fire without hesitation regardless of the nature of the target. (All emphasis mine.)

Here, perhaps, is the first inkling of what I realised I meant by soul destroying. Yes, these soldiers are overworked to the point of droning and are now shuffling towards zombiedom. But, worse, they have no real idea of their place in the world and, perhaps most frighteningly, despite knowing that accidents happen when fire missions are requested and having direct experience of rejecting what would have resulted in accidental hospital strikes, these soldiers have moved to a point of function where the only thing that matters is doing the work and calling the task done. This is an ethical aspect because, from their previous actions, it is quite obvious that there was both a professional and ethical dimension to their job as the custodians of this incredibly destructive weaponry – deprive them of enough sleep and they calculate and fire, no longer having the cognitive ability (or perhaps the will) to be ethical in their delivery. (I realise a number of you will have choked on your coffee slightly at the discussion of military ethics but, in the majority of cases, modern military units have a strong ethical code, even to the point of providing a means for soldiers to refuse to obey illegal orders. Most failures of this system in the military can be traced to failures in a unit’s ethical climate or to undetected instability in the soldiers: much as in the rest of the world.)

The message, once again, is clear. Overwork, fatigue and sleeplessness reduce the ability to perform as you should. Belenky even notes that the ability to benefit from training quite clearly deteriorates as the fatigue levels increase. Work someone hard enough, or let them work themselves hard enough, and not only aren’t they productive, they can’t learn to do anything else.

The notion of situational awareness is important because it’s a measure of your sense of place, in an organisational sense, in a geographical sense, in a relative sense to the people around you and also in a social sense. Get tired enough and you might swear in front of your grandma because your social situational awareness is off. But it’s not just fatigue over time that can do this: overloading someone with enough complex tasks can stress cognitive ability to the point where similar losses of situational awareness can occur.

Helmet fire is a vivid description of what happens when you have too many tasks to do, under highly stressful situations, and you lose your situational awareness. If you are a military pilot flying on instruments alone, especially with low or zero visibility, then you have to follow a set of procedures, while regularly checking the instruments, in order to keep the plane flying correctly. If the number of tasks that you have to carry out gets too high, and you are facing the stress of effectively flying the plane visually blind, then your cognitive load limits will be exceeded and you are now experiencing helmet fire. You are now very unlikely to be making any competent contributions at all at this stage but, worse, you may lose your sense of what you were doing, where you are, what your intentions are, which other aircraft are around you: in other words, you lose situational awareness. At this point, you are now at a greatly increased risk of catastrophic accident.

To summarise, if someone gets tired, stressed or overworked enough, whether acutely or over time, their performance goes downhill, they lose their sense of place and they can’t learn. But what does this have to do with our students?

A while ago I posted thoughts on a triage system for plagiarists – allocating our resources to those students we have the most chance of bringing back to legitimate activity. I identified the three groups as: sloppy (unintentional) plagiarism, deliberate (but desperate and opportunistic) plagiarism, and systematic cheating. I think that, from the framework above, we can now see exactly where the majority of my ‘opportunistic’ plagiarists are coming from: sleep-deprived, fatigued and (by their own hands or not) over-worked students losing their sense of place within the course and becoming focused only on the outcome. Here, the sense of place is not just geographical; it is their role in the social and formal contracts that they have entered into with lecturers, other students and their institution – their place in the agreements for ethical behaviour that require doing the work yourself and submitting only that.

If professional soldiers who have received very large amounts of training can forget where their own forces are, sometimes to the tragic extent that they fire upon and destroy them, or become so cognitively impaired that they carry out the mission, and only the mission, with little of their usual professionalism or ethical concern, then it is easy to see how a student can become so task-focussed that they start to think only about ending the task, by any means, to reduce the cognitive load and to allow themselves to get the sleep that their body desperately needs.

As always, this does not excuse their actions if they resort to plagiarism and cheating – it explains them. It also provides yet more incentive for us to try to find ways to reach our students and help them form systems for planning and time management that bring them closer to the 40-hour ideal, that reduce the all-nighters and the caffeine binges, and that allow them to maintain full cognitive function as ethical, knowledgeable and professional practitioners.

If we want our students to learn, it appears that (for at least some of them) we first have to help them to marshal their resources more wisely and keep their awareness of exactly where they are, what they are doing and, in a very meaningful sense, who they are.


Time Banking: Aiming for the 40 hour week.

I was reading an article on MetaFilter about early-last-century perceptions of future leisure, and one of the commenters linked to a great article on “Why Crunch Mode Doesn’t Work: Six Lessons” via the International Game Developers Association. This article was partially in response to the quality-of-life discussions that ensued after ea_spouse outed the lifestyle (LiveJournal link) caused by her spouse’s ludicrous hours working for Electronic Arts, a game company. One of the key quotes from ea_spouse was this:

Now, it seems, is the “real” crunch, the one that the producers of this title so wisely prepared their team for by running them into the ground ahead of time. The current mandatory hours are 9am to 10pm — seven days a week — with the occasional Saturday evening off for good behavior (at 6:30pm). This averages out to an eighty-five hour work week. Complaints that these once more extended hours combined with the team’s existing fatigue would result in a greater number of mistakes made and an even greater amount of wasted energy were ignored.

[Image caption: The badge is fastened with two pins that go straight into your chest.]

This is an incredible workload and, as Evan Robinson notes in the “Crunch Mode” article, it is not just incredible but downright stupid, because every serious investigation into the effects of working more than 40 hours a week for extended periods, of reducing sleep, and of accumulating sleep deficit has come to the same conclusion: hours worked beyond a certain point are not just worthless, they subtract value from the hours already worked.

Robinson cites studies and practices from industrialists such as Henry Ford, who cut shift lengths to give a 40-hour work week in 1926, attracting huge criticism, because 12 years of his own research had shown that the shorter work week meant more output, not less. Such studies have been running since the nineteenth century and well into the 1960s at least, and they all show the same thing: working eight hours a day, five days a week, gives you more productivity, because you get fewer mistakes, less fatigue accumulation, and workers producing during their optimal production times (the first 4–6 hours of work) without sliding into their negatively productive zones.

As Robinson notes, the games industry doesn’t seem to have got the memo. The crunch is a common feature in many software production facilities and the ability to work such back-breaking and soul-destroying shifts is often seen as a badge of honour or mark of toughness. The fact that you can get fired for having the audacity to try and work otherwise also helps a great deal in motivating people to adopt the strategy.

Why spend so many hours in the office? Remember when I said that it’s sometimes hard for people to see what I’m doing because, when I’m thinking or planning, I can look like I’m sitting in the office doing nothing? Imagine what it looks like if, two weeks before a big deadline, someone walks into the office at 5:30pm and everyone’s gone home. What does this look like? Because of our conditioning, which I’ll talk about shortly, it looks like we’ve all decided to put our lives before the work – it looks like less than total commitment.

As a manager, if you can tell everyone above you that you have people at their desks 80+ hours a week, and will have for the next three months, then you’re saying that “this work is important and we couldn’t possibly be doing more.” The fact that those people were probably only useful for the first six hours of every day, and even then only for the first couple of months, doesn’t matter, because it is easy to mistake hours at a desk for output. Those 80+ hour weeks are probably only necessary now because everyone is so tired, so overworked and so cognitively impaired that they are taking four times as long to achieve anything.

Yes, that’s right. All the evidence says that after more than two months of that overtime, you would have been better off staying at 40 hours a week, in terms of both measurable output and quality of work.

Robinson lists six lessons, which I’ll summarise here because I want to talk about them in terms of students, and about why forward planning for assignments is good practice for smoother time management in the future. Here are the six lessons (I’ve put a small illustrative calculation of lessons 3 and 4 after the list):

  1. Productivity varies over the course of the workday, with greatest productivity in the first 4-6 hours. After enough hours, you become unproductive and, eventually, destructive in terms of your output.
  2. Productivity is hard to quantify for knowledge workers.
  3. Five-day weeks of eight-hour days maximise long-term output in every industry that has been studied in the past century.
  4. At 60 hours per week, the loss of productivity caused by working longer hours overwhelms the extra hours worked within a couple of months.
  5. Continuous work reduces cognitive function by 25% for every 24 hours awake; multiple consecutive all-nighters have a severe cumulative effect (two in a row and you may be operating at roughly half your normal capacity).
  6. Error rates climb with hours worked and especially with loss of sleep.
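
To make lessons 3 and 4 concrete, here is a toy calculation. The assumption that sustained overtime erodes overall efficiency by about 10% per week is mine, purely for illustration, and not a figure from Robinson’s article; under it, a 60-hour crunch schedule’s cumulative output falls behind a steady 40-hour week at around the two-month mark, which is the shape of the claim in lesson 4.

```python
# Toy model of cumulative output: steady 40-hour weeks versus 60-hour
# "crunch" weeks. The 10%-per-week efficiency decay is an illustrative
# assumption, not a figure from Robinson's article.
def cumulative_output(hours_per_week, weeks, decay_per_week=0.10):
    baseline = 40
    efficiency = 1.0
    total = 0.0
    for _ in range(weeks):
        total += hours_per_week * efficiency
        if hours_per_week > baseline:   # fatigue only accumulates in overtime
            efficiency = max(0.0, efficiency - decay_per_week)
    return total

for weeks in (4, 8, 12):
    print(weeks, cumulative_output(40, weeks), round(cumulative_output(60, weeks), 1))
# 4  160.0 204.0   (crunch still looks like a win)
# 8  320.0 312.0   (two months in, crunch has fallen behind)
# 12 480.0 330.0   (and it only gets worse)
```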

My students have approximately 40 hours of assigned work a week, consisting of contact time and assignments, but many of them never really think about that. Most plan other things around their ‘free time’ (they may need to work, they may play in a band, they may be looking after families or they may have an active social life) and fit the assignment work and other study into the gaps that are left. Immediately, they are over the 40-hour mark. If they have a part-time job, the three months of one of my semesters will, if not managed carefully, give them a lumpy schedule alternating between some work and far too much work.

Many of my students don’t know how they are spending their time. They switch on the computer, look at the assignment, Skype, browse, try something, compile, walk away, grab a bite, web surf, try something else – wow, three hours of programming! This assignment is really hard! That’s not all of them but it’s enough of them that we spend time on process awareness: working out what you do so you know how to improve it.
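
To give a concrete flavour of what I mean by process awareness, the simplest version of the exercise is a timestamped log. The sketch below is purely illustrative (it isn’t a tool we distribute): the point is that reviewing the log afterwards shows how much of a ‘three-hour programming session’ was actually programming.

```python
# A minimal timestamped work log for process awareness. Run it in a spare
# terminal and record a note each time you switch activities.
import time

def log_activities(logfile="worklog.txt"):
    """Append timestamped activity notes until the user types 'quit'."""
    with open(logfile, "a") as f:
        while True:
            note = input("Now doing (or 'quit'): ").strip()
            if note.lower() == "quit":
                break
            f.write(f"{time.strftime('%Y-%m-%d %H:%M')}\t{note}\n")

if __name__ == "__main__":
    log_activities()
```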

Many of my students see sports drinks, energy drinks and caffeine as a licence not to sleep. It doesn’t work long term, as most of us know, for exactly the reasons that long-term overwork and sleeplessness don’t work. Stimulants can keep you awake, but you will still be carrying most, if not all, of your cognitive impairment.

Finally, and most importantly, enough of my students don’t realise that everything I’ve said up until now means that they are trying to sit my course with half a brain from about the halfway point – sooner, if they didn’t rest much between semesters.

I’ve talked about the theoretical basis for time banking and the pedagogical basis for time banking: this is the industrial basis. One day I hope that at least some of my students will be running parts of their industries, and that we will have taught them enough about sensible time management and work/life balance that, as people in control of a company, they look at real measures of productivity, look at the mass of data supporting sensible, sustainable work rates, and champion and adopt these practices.

As Robinson says towards the end of the article:

Managers decide to crunch because they want to be able to tell their bosses “I did everything I could.” They crunch because they value the butts in the chairs more than the brains creating games. They crunch because they haven’t really thought about the job being done or the people doing it. They crunch because they have learned only the importance of appearing to do their best instead of really doing their best. And they crunch because, back when they were programmers or artists or testers or assistant producers or associate producers, that was the way they were taught to get things done. (Emphasis mine.)

If my students can see all of their requirements ahead of time, know what is expected, have been given enough process awareness, and have the will and the skill to undertake the activities, then we can potentially teach them a better way to get things done: focusing on time management in a self-regulated framework, rather than on imposed deadlines in a rigid, authority-based framework. Of course, I still have a lot of work to do to demonstrate that this will work but, from industrial experience, we have yet another very good reason to try.


Time Banking: Foresightedness and Reward

You may have noticed that I’ve stopped numbering the time banking posts – you may not have noticed that they were numbered in the first place! The reason is fairly simple and revolves around the fact that the numbers are actually meaningless. It’s not as if I have some huge plan for the final sequence of the time banking posts. I do have a general idea, but the order can change as one idea or another takes me, and I feel that numbering them makes it look as if there is some grand sequence.

There isn’t. That’s why they all tend to have subtitles after them so that they can be identified and classified in a cognitive sequence. So, why am I telling you this? I’m telling you this so that you don’t expect “Time Banking 13” to be something special, or (please, no) “Time Banking 100” to herald the apocalypse.

[Image caption: The Druids invented time banking but could never find a sufficiently good Oracle to make it work. The Greeks had the Oracle but not the bank. This is why the Romans conquered everywhere. True story!]

If I’m going to require students to self-regulate then, whether through operant or phenomenological mechanisms, the outcomes that they receive are going to have to be shaped to guide the student towards a self-regulating model. In simple terms, they should never feel that they have wasted their time, that they are under-appreciated or that they have been stupid to follow a certain path.

In particular, if we’re looking at time management, then we have to ensure that time spent in advance is never considered to be wasted time. What does that mean for me as a teacher? If I set an assignment in advance and students put work towards it, I can’t change the assignment arbitrarily. This is one of the core design considerations for time banking: deadlines are often seen as arbitrary (and extending them in case of power failures, or class-wide lack of submission, shows just how arbitrary they can be), so time banking lets students make movement around the original deadlines themselves, in a way that gives them control without giving us too much extra work. If I want my students to commit to planning ahead and doing work before the due date, then some heavy requirements fall on me (I’ve sketched one way the mechanics might look in code below the list):

  1. I have to provide the assignment work ahead of schedule and, preferably, for the entire course at the start of the semester.
  2. The assignments stay the same throughout that time. No last minute changes or substitutions.
  3. The oracle is tied to the assignment and is equally reliable.

This requires a great deal of forward planning and testing but, more importantly, it requires a commitment from me. If I am asking my students to commit, I have to commit my time, my planning and my attention to detail to my students. It’s that simple. Nobody likes to feel like a schmuck: like they invested time under false pretences, or worked on what they thought was a commitment only to discover that someone just hadn’t really thought things through.
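
Here is that sketch: a minimal, invented illustration (the class and method names are mine, not a system we actually run) of the commitment in the three requirements above. Assignments are fixed once released, and students spend banked days to move a deadline themselves, without staff involvement.

```python
# Illustrative sketch only: fixed assignments plus a student-controlled
# bank of extension days. Names and numbers are invented for this example.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass(frozen=True)        # frozen: no last-minute changes (rule 2)
class Assignment:
    name: str
    released: date             # rule 1: published at the start of semester
    due: date

class TimeBank:
    def __init__(self, balance_days=5):
        self.balance = balance_days

    def extend(self, assignment, days):
        """Spend banked days to move a deadline; returns the new due date."""
        if days > self.balance:
            raise ValueError("insufficient banked time")
        self.balance -= days
        return assignment.due + timedelta(days=days)

# A student moves Assignment 2 back by two days using their bank.
a2 = Assignment("Assignment 2", released=date(2012, 7, 23), due=date(2012, 9, 14))
bank = TimeBank()
new_due = bank.extend(a2, 2)   # 2012-09-16; three banked days remain
```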

Wasting time and effort discourages people. It makes them disengage. It makes them less trustful of you as an educator and less likely to trust you in the future. It reduces their desire to participate. This is the antithesis of what I’m after: increased self-regulation and the motivation that goes with it, which I group under the banner of my ‘time banking’ project.

But, of course, it’s not as if we’re not already labouring under this commitment to our students, at least implicitly. If we don’t follow the three requirements above then, at some stage, students will waste effort and, believe me, they’re going to question what they’re doing and why they’re bothering, and some of them will drop out, drift away and be lost to us forever. Never thinking that you’ve wasted your time, never feeling like a schmuck, seeing your ideas realised, achieving goals: that’s how we reward students, that’s what can motivate students and that’s how we can move them on to higher levels of function and achievement.