In my previous blog I recounted the work of my friend and colleague, Vince Ham – in particular, the focus on evidence to underpin our practice that was a feature of his work. I recalled the outcomes of the three-year intensive research undertaken by a team led by Vince, looking at the question ‘What makes for effective teacher professional development in ICT?’
One of the key findings of that research was the need for a shared understanding of what was meant by the integration of ICTs into teaching and learning. Our observations in classrooms and interviews with teachers and students at the time revealed a very wide range of opinions, with very little in the way of agreement – and more particularly, no common framework for assessing this. As a consequence, there was no useful way to actually ‘measure’ the impact of the effort being put into teacher professional development at the time, as any use of technology in the classroom might be regarded as either positive or negative, productive or time-wasting, supporting learning or hindering it, and so on.
From the study of hundreds of hours of transcripts, the team identified some common themes that began to provide an understanding of how this question might be addressed. Basically, the team identified two ‘threads’ evident in discussions in classrooms where ICTs were being used:
- ICT prominence – The extent to which the ICTs themselves were emphasised or focused on in the learning activities. (e.g. “we’re using Google Docs to write a story”, “I’m using iMovie to make a movie”, “I’m taking photos on my digital camera” etc.)
- Curriculum Connectedness – The extent to which the use of ICTs was integrated or connected with the curriculum and pedagogical focus. (e.g. “we’re collaborating on some research about early forms of transportation – and we’ll be searching the internet for information and recording our ideas in a Google Doc.” etc.)
The interplay of these two criteria was then developed into what is affectionately known as the ‘liquorice strap’, so-called because the twisting of the strap shows how the ‘maturity’ of use develops as the dominance of each criterion changes.
The “strap” is illustrated below:
At the left (ADDITION), the focus on technology (yellow) is prominent. At the next stage (INCORPORATION), technology is still prominent, but underpinned by a focus on curriculum/pedagogy. At the third stage (INTEGRATION), the curriculum/pedagogy emphasis is prominent, underpinned by the technology. At the far right (ASSIMILATION), the focus is almost exclusively on curriculum/pedagogy and the learning that is happening – the technology, although present, has become simply a part of ‘the way we do things’.
A summary of indicators at each step is illustrated below:
The slides below provide a (slightly) more interactive way of engaging with this…
Many will recognise the similarities between this framework and the SAMR model created by Ruben R. Puentedura just a few years later. It is important to note, however, that the ‘strap’ represents a strongly evidence-based framework for assessing and evaluating the integration of ICT/digital technologies. As such it provides an evidence-based logic (the relationship between technology and curriculum/pedagogy) that can be applied in any context, regardless of what new forms of technology may emerge and be used in schools – a shortcoming some have identified in the SAMR model, for example.
Coming back to the point of my previous blog: as we continue down the track of seeking to understand the impact of the use of digital technologies (ICT) to support/enhance/enable powerful learning, it is important to consider the research that has already been carried out and the evidence it provides to inform our decisions as we move forward. This research was very useful in the development of my own thinking about digital agency, for example (more here).
My regret about the 23 clusters research used to inform the development of the ‘strap’ is that for many years it remained locked in a ‘black hole’ somewhere, only appearing on the Education Counts website in more recent years, after much of the research commissioned by the MoE was curated in this way. Given the sheer size of the final report, perhaps it was felt it was not something that would connect easily with educators generally, and would be of interest mainly to researchers? Or perhaps it simply ‘fell into the gap’ between personnel at the MoE who were in a position to actively work with what the research demonstrated and weave it into the design decisions around future curriculum and PLD activity? We’re unlikely to know, and it’s not a productive use of time to try to find out. The more important thing is to understand that the evidence is there, and it remains useful as we seek answers to questions about the value and impact of digital technologies in education.
In my next post I will share more about other models and frameworks that were developed from this research – including ‘the island’ – providing an interactive view of the ‘ecology’ of elements in a school context that contribute to the effective integration of ICTs in learning.