To code or not to code? Is that the education question

Co-authored by Tim Patson, Geelong Grammar School
While ‘coding’ and ‘everyone should learn to code’ are catchphrases of our digital age, both fail to capture the essence and challenge of adapting to a society infused with digital technology.

If we pull apart these phrases and explore the motivations behind them we soon discover that our true goal should be to ensure that the next generation is digitally competent: armed with the attitudes and behaviours that enable them to bricolage solutions using digital and non-digital tools when working in their chosen profession. This may require coding, but often it will not. The important unanswered question is then: what does this bricolage look like in particular domains, and how might we teach it?

In mid-2016, the authors were chatting about the rush to teach everyone how to code that seemed to be sweeping the nation. Every week new programmes were being pitched, promising to teach anyone – even children at the lower end of primary – how to code. Both media and vendors were tapping into the existential angst we all feel about the future, a tomorrow we assume will be shaped by technology thanks to the strong thread of technological determinism that runs through modern society. The assumption is that in this future everyone will face a choice: either learn how to code and control the machines, or become their subjects.

We weren’t so certain though.

Clearly computers are becoming more important, but that doesn’t necessarily imply that everyone must be a competent coder if they’re to remain relevant. Modern software development is a team sport, and coders can easily be in the minority.

It also seemed that different stakeholders (business folk, educators, parents and even coders) were implying different things when they reached for ‘everyone should learn how to code’, and ‘coding’ in particular. ‘Coding’ was being used as shorthand for some important but unformed and unarticulated desire to respond to the increasing prevalence of digital technology in society, rather than referring to a specific activity, knowledge or skill.

The conversation in the community had jumped directly from “computers will be important in the future” to “everyone should learn how to code” without any discussion of what ‘coding’ might mean.

At this point the Centre for the Edge and Geelong Grammar School decided to collaborate on a project to pull these concepts apart and help obtain some clarity.

One obvious thing to do was to write a well-considered and thorough think-piece, one where we carefully define ‘coding’ and explore the implications of a digital future, and try to get the community to adopt our definitions. This seemed counterproductive though. The important thing was for the community to come together, to enumerate and explore the possible meanings for ‘coding’ and the implications of ‘teaching everyone how to code’, and to come up with a common understanding. So this is what we did.

Late in 2016 we convened a national series of symposia under the Chatham House Rule: open discussions where the community could share points of view and opinions, and find common ground. (Our report of this project – To code or not to code, is that the question? – can be found here.)

Rather than participate ourselves, we simply used the obvious questions (‘what do we mean by coding?’, ‘should everyone learn how to code?’, ‘where does it fit in already crowded curricula?’, and so on) to nudge each conversation forward. Other than that, conversations were allowed to find their own paths.

The process was very encouraging.

All the conversations highlighted significant differences between stakeholders’ points of view, but in each instance those differences were managed respectfully. And while the conversations took different paths, they all ended up in the same place.

The results were also very interesting.

Yes, was the consensus, everyone should learn how to code, but a short compulsory course (a term or so) should be sufficient, possibly at the upper end of primary and/or the lower end of secondary. Our goal is to ensure that all students have experienced coding, providing them with the opportunity to form their own opinion on what it is and its relevance to themselves. Some students will enthusiastically take coding up, not having considered it before, while others will decide that it is not for them. Both outcomes are fine.

There should also be an optional coding stream through K-12, as we must support the students who want to pursue coding. Our state and national curriculum teams already have this well in hand.

However, what is really meant by ‘everyone should learn how to code’ is that formal education should ensure that all students are ‘digitally competent’. This is not the same as digital literacy, as many stakeholders feel literacy implies consumption of content via digital means. Nor does it imply coding.

Digital competency is the ability to integrate digital tools into your work: to bricolage new solutions from digital and non-digital tools, and to do so in non-digital domains such as geography and biology. This might involve coding, but often it will not. Like creativity, digital competency is really a set of attitudes and behaviours that spans domains, rather than a subject in its own right.

The symposia also highlighted the ‘digital native’ myth: the assumption that familiarity with digital technology confers competency. The evidence shows that it does not. Digital competency, like literacy, must be taught.

The challenge then is to understand what digital competency looks like in practice, which we intend to be the subject of our next collaboration.

If you’d like to find out more, you can download the full report here, and if you have any questions or would like to be involved in the follow-on project, please get in touch.
