“The pedagogy always comes first.”
As someone who has worked throughout my career in teaching and learning innovation at universities, I find it almost instinctive, in conversations with teachers, to place technology second. Particularly in the early days of emergent digital learning technologies – say, 1995 to 2010 – the anxiety, or even downright suspicion, that many of my academic colleagues felt about the value and future of technology-enhanced learning was palpable. As a consequence, I and like-minded innovators would default to the position of technological apologist, offering comments like ‘technology is merely the tool we use to help us achieve our learning goals’ and ‘fundamentally, online learning is no different from face-to-face learning; technology is merely an enabler’. I suspect that most of us who have worked in academic innovation over the past 20 years have heard, and expressed, such sentiments many times. And I suspect that, by and large, we have been wrong.
In this article I am going to take a controversial position: that technology and learning are inseparable and always have been, and that by understanding and embracing this connection – by driving change in our technologies – we can shape pedagogical behaviour and lead academic innovation in our institutions.
Scottish educational theorist Paul Maharg takes a radical view: technology is the curriculum, says Maharg, taking his lead from philosopher John Dewey. For Dewey and Maharg, it is impossible to separate learning, as an activity, or the ‘content’ of what is learned, from the world or environment in which it occurs.
In 2009, following some of Maharg’s educational ideas, the Australian National University College of Law radically reshaped its Graduate Diploma of Legal Practice. Starting from a conventional, semester-based online program, the degree was transformed into an immersive, simulation- and scenario-based learning environment. Students enrolled in the course as a whole and were assigned to virtual law firms of four to five students. Working in a virtual office space, they managed a number of competing legal transactions – litigation, conveyancing, contracts, negotiation – and had ethical issues sprung on them without warning. There were no lecturers – staff participated in the simulation in role, as managing partners, office managers and clients. Apart from an immersive initial residential week, the entire experience was online; this was crucial, as it allowed the virtual learning space to be shaped to control the simulation. The experience would have been far less rich face-to-face. The success of this innovation underlines the truth of Maharg’s point – the technology is the curriculum.
Behind the scenes, one of the key lessons from this initiative was how the virtual office space shaped the behaviours of academics. They logged into the learning management system and their traditional roles of lecturer, tutor and student disappeared into the background. What appeared instead were new simulation-based roles that the staff were asked to fill. Rather than expound on the theory of contracts, staff were asked to role-play a partner mentoring a new lawyer on the development of a contract. It created a wholly new learning and teaching relationship – a much more authentic and powerful one, in my view. This pedagogical change was managed through changing the technology and through reshaping the spaces in which teachers and learners interact.
This idea has an ancient lineage. In the fourth century BC, Plato’s academy was laid out as a courtyard. Learning happened through dialogue. Scholars and students (not that there was often a clear distinction) walked as they talked, and the notion of a group of collaborative learners in physical and intellectual motion was a central pedagogical idea. This is essentially a social constructivist model. Much of Socrates’ philosophy, as recorded by Plato, takes the form of these ambulatory tutorials. Nowadays we often celebrate the Socratic method as an admirable active learning strategy, but we can overlook the technology – the learning space and mode of interaction – that was at its heart.
The same ancient roots can be found in what is arguably the most pervasive higher education learning technology – the lecture, delivered in a hall or theatre. This technology of mass instruction is medieval in origin, deriving from the theological sermon, and has always been about the transmission of information. When it migrated into higher education practice in the 12th century, its purpose was simply to enable students to copy texts. Lecturers were hired on the basis of their ability to speak loudly and clearly, so that students could take dictation of the scholarly text being read. Moreover, lecturers were hired by the students directly, and were not re-hired the following term unless they were sufficiently comprehensible. This is why the word ‘lecture’ derives from the Latin for ‘to read’.
As Maharg points out, it was not for another two centuries that students began to employ lecturers with knowledge of the subject, so that they could provide their own explanatory notes (glossa) on the source text. Before this time, senior scholars were far too busy to waste their time on the technology of the lecture; the reading aloud of texts was a secondary task, best left to the medieval equivalent of specialist professional staff. Thus the 12th-century academic innovation that placed the lecture at the heart of university education was driven not by educational considerations, but by a combination of its efficiency as a technology and relationships of academic hierarchy and power. Arguably, little has changed. Far too many modern course sites consist of weekly topics, each of which is an information dump of PDFs, PowerPoints and lecture recordings, with weekly quizzes testing surface learning, often managed by administrative staff. Here, the technology makes it very easy to resist academic innovation and perpetuate poor learning practice nearly a millennium old.
I am entranced by what history can tell us about the relationships between technology and academic innovation (and resistance), and the relevance of those lessons for our modern practice as educators. The examples of Plato’s academy versus the medieval lecture, and of the immersive simulated virtual learning environment versus the weekly content dump, show how technologies and power relations can shape and drive academic practice, whether innovative or conservative.
Uncomfortable though it may be, a central point here is that the factors influencing academic innovation and change are not always – or even mostly – about best practice.
It would be attractive to think that, because we now know in theory which pedagogical approaches produce high-quality university learning and which produce poor outcomes, evidence-based best practice will automatically be adopted. Experience tells us this is far from the case. Resources (including the scarcest of all academic resources – time), roles and hierarchies will usually take precedence over other factors unless those leading and managing education explicitly drive behaviours in a different way. Notably, technology can play a crucial role in restructuring academic work to achieve better quality outcomes for students.
There is an Australian private higher education provider that delivers leadership training using a mainly synchronous online model, in which academics lead interactive online seminars of up to a dozen participants. Because every seminar is recorded, a crucial part of the quality-enhancement cycle becomes possible. After each seminar, an educational designer – who has specialist educational qualifications and experience, unlike the subject-matter expert who led the seminar – reviews the recording and provides constructive feedback to the seminar leader about what worked well and what could be improved. These feedback reports form part of the academic’s performance review. By all accounts, the student experience with this provider is outstanding. The technology – the captured tutorial – is what enables the change. If you consider the industrial implications of introducing such a scheme at a public Australian university, it becomes clear that the obstacle to this innovation is neither technological nor educational – it is cultural.
Things continue to change. Through technology, we now have a more closely observed picture of student behaviour than ever before. Learner analytics allow us to monitor nearly every aspect of students’ learning activity. We match the text of their assignments against other sources, measure their keystrokes to ascertain identity, check when and for how long they access their learning resources, and even use artificial intelligence (AI) to measure the cognitive complexity of their forum posts. In general, this is benign: if we can identify from these analytics that a student is likely to fail, we can intervene to support them before it happens.
The flip side of educational big data is teaching analytics. If, from the same analytics, we can see that half the class is about to fail, a different intervention is required – in that case, it is most likely the teacher who needs support and development. When these analytics are fully functional, technology will truly drive academic innovation and quality enhancement.
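To make the distinction between the two kinds of intervention concrete, here is a minimal sketch in Python. Everything in it is an invented illustration – the field names, the thresholds and the print-statement ‘interventions’ are assumptions, not the API of any real learning management system, and real learning-analytics platforms use far richer predictive models than these rules of thumb.

```python
from dataclasses import dataclass

# Hypothetical engagement record extracted from LMS activity logs.
# All field names and thresholds below are illustrative assumptions.
@dataclass
class StudentActivity:
    student_id: str
    logins_last_fortnight: int
    minutes_on_resources: int
    forum_posts: int

def at_risk(a: StudentActivity) -> bool:
    """Crude stand-in for a predictive model: flag disengaged students."""
    return a.logins_last_fortnight < 2 or a.minutes_on_resources < 30

def review_class(roster: list) -> None:
    flagged = [s for s in roster if at_risk(s)]
    # Learner analytics: intervene with individual students before they fail.
    for student in flagged:
        print(f"Learner intervention: contact {student.student_id}")
    # Teaching analytics: if half the class is flagged, the signal points
    # at the course and its teaching, not at the individual students.
    if flagged and len(flagged) >= len(roster) / 2:
        print("Teaching intervention: review course design and delivery")

if __name__ == "__main__":
    review_class([
        StudentActivity("s001", logins_last_fortnight=5, minutes_on_resources=240, forum_posts=7),
        StudentActivity("s002", logins_last_fortnight=1, minutes_on_resources=15, forum_posts=0),
        StudentActivity("s003", logins_last_fortnight=0, minutes_on_resources=5, forum_posts=1),
    ])
```

The design point of the sketch is simply that the same data stream feeds both loops: the per-student check drives individual support, while the class-level aggregate redirects attention to teaching quality.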
Change will come. The question is, will it be a good thing?