
Implementation

Welcome to the Implementation section of the second module. Earlier you (1) identified a problem, (2) carried out some reconnaissance of the problem and (3) focused your enquiry.

We have already looked at (4) how to plan your evaluation. This work focused on tools for data collection, including interviews and focus groups, surveys, documentary evidence, pupils' work, observations, and field notes and diaries. We are now moving on to consider (5) the implementation of your enquiry.



Overview of Implementation

 

For you to do

Reflect on your implementation timetable:
What aspects of implementation are you most looking forward to?
What aspects of implementation most concern you?


After so much time spent on planning you are probably eager to start your implementation. In comparison to planning, the implementation seems easy. It is what you do all the time in your professional practice. You just get on with it. However, there are some considerations you need to bear in mind:

  1. The opportunities for trial runs
  2. Using your plans in your implementation
  3. The need for flexibility
  4. Remembering your plans for evaluation

Many innovations start with a trial run, usually in a smaller and more comfortable setting. This will give you the opportunity to make further changes in your planning and also to gain greater confidence in carrying out the innovation. Even if you are an experienced practitioner, comfortable within your working environment, be prepared for the extra demands an innovation can make - particularly if it does not go in the way you expect!

You can reduce the intensity of these demands by working at first in a controlled setting. Take the example of Sally, a teacher whose innovation concerned peer assessment techniques with a class of Y9 pupils. Sally trialled the strategies for peer assessment with a smaller class of 6th form pupils. This way she was able to focus on the innovation rather than on class management. When she came to work on her full implementation with her Y9s she was already aware of some of the difficulties she might face and could adjust her planning accordingly. This shows that action research follows a process of cyclic development, or circles within circles, rather than the clearly differentiated stages which we have set out, for reasons of clarity, in this support material. Make notes during the trial run, but there is no need to treat the data as rigorously as in the full innovation. For many practitioner researchers this trial run could have taken place in an extended reconnaissance phase.

To summarise, the advantages of a trial run are to:

  • familiarise yourself with your role in the innovation
  • get to grips with collecting data
  • temporarily adjust certain aspects of the implementation setting (e.g. class size) to focus on the innovation
  • expose unanticipated flaws - and new potential - of the innovation design
  • test any aspects of the innovation you are not sure about

It is difficult to think of any disadvantages of doing a trial run, though lack of time or access might seriously constrain you.

For you to do

If you have not already carried out a trial run, consider the opportunities for doing so.
 

 

Use your planning when implementing the innovation. For example, consider the resource and time allocations in your plans. These were set so that you could carry out a realistic and manageable project. Resist the temptation to attempt more than you can realistically achieve once you start working with the class.

Be flexible in interpreting your plan. If you can act on the feedback during implementation do so. Create sub-cycles of planning, implementation and evaluation within the implementation phase itself. For example, if the feedback is 'not working' adjust accordingly - but be aware that precisely what is not working may be far from clear!

Refer back to your plans for evaluation. This means keeping detailed notes of your observations of the class and carrying out the surveys or interviews that you said you would. Your report cannot rely on memory of past events and impressionistic jottings.

Many innovations are carried out more or less as planned. Others, however, may be altered by events outside the practitioner researcher's control: an unexpected increase in workload, a hardware failure, the intervention of senior management or the head of department. If this happens, try to put aside your natural disappointment and instead build these events into your research and evaluation. For example, what do these events tell you about the environment in which you are working?

Similar points can be made about the implementation of your evaluation strategy. First, look for opportunities to trial run your methods of data collection. You may be surprised, when you run your survey, by the ways in which pupils interpret your carefully worded questions. A practice interview with a colleague could provide valuable training in listening to, rather than leading, an interviewee. Trial running an observation schedule will tell you how manageable that schedule is. When you carry out your full data collection you will probably need to be flexible, for example changing who you speak to and when you speak to them. Try to reflect on your data collection as you go along. For example, do you feel you are moving from the role of class teacher to that of practitioner researcher? If not, how can you address this difficulty?

Try to be open about the difficulties you encounter. It might be disappointing if your class did not take your survey questionnaire as seriously as you hoped - but what did you learn from this and are there opportunities to address this difficulty next time around? As action researchers we are as interested in what did not 'work' as what 'worked'. The point is to reflect and act on what was found out.


Reporting on your implementation

We all approach writing in different ways. Some prefer to plan what will be reported in advance. Others try to build up a narrative as they go along.

Writing about implementation poses particular difficulties. For many action researchers there seems so little to report: we did this and this happened. One approach to developing your account of the implementation is to begin with a table of key events. For example, imagine the case of Sue, a teacher researcher who was interested in developing her use of the plenary. She might begin her account by explaining what she was aiming to achieve, the classes with which she carried out her innovation, and a brief description of what happened. The key events can easily be represented in a table (see below). You might also produce a time line to display these events. (Templates for producing time lines are available at the Microsoft Office templates website.)
Week 1: Quiz-style questions asked to assess and consolidate understanding of key learning points. Some children offered answers repeatedly and others not at all.
Week 2: Children were put into groups for the plenary quiz and each group was asked for an answer to each question.
Week 3: Groups were used again with quiz-style questions, but with different groupings from last week.
Week 4: Focus this week on plenaries which required writing before oral work with the children; this seemed to involve more children in the plenary.

These events can provide prompts for a short narrative of each lesson in which the innovation was tried. For example, the first entry prompted Sue to say more about the class she was working with, adding to the material she had presented in the planning section.

I began by working with a class of Y8 pupils. This was a mixed ability group with whom I had good relationships on the whole. I was aware of some special needs in the class and others who found it difficult to focus in class. The innovation concerned the plenary and this began in week 1 with introducing oral quizzes at the end of the lesson...

At this stage the table and the narrative provide few details of the data collection tools used. Sue could add a further column to the table to describe those tools.

Week 1: Quiz-style questions asked to assess and consolidate understanding of key learning points. Some children offered answers repeatedly and others not at all. Observation of the class focused on frequency of answers, keeping a tally of how often each pupil contributed.
Week 2: Children were put into groups for the plenary quiz and each group was asked for an answer to each question. Observation focused on interactions within a particular group of children who were generally shy to answer. Field notes were written up immediately after the lesson.
Week 3: Groups were used again with quiz-style questions, but with different groupings from last week. Observations again focused on interactions within the same group of shy children, and field notes were made immediately afterwards.
Week 4: Focus this week on plenaries which required writing before oral work with the children; this seemed to involve more children in the plenary. Observation of the class focused on frequency of answers, keeping a tally of how often each pupil contributed. Review of pupil comments during the written activity. First brief questionnaire survey carried out.

These data collection events can provide further prompts for a short account of how the data collection was carried out. For example:

I carried out a questionnaire survey with the class in week 4. All completed the survey and seemed able to comprehend what was being asked; there appeared to be consistency between the answers they gave. However, open-ended questions were answered only briefly or not at all, suggesting that time was too short or that children were simply unhappy about adding that level of detail.

You are aiming to give the reader of your report a clear account of what you did and what happened when you did it. Your account will rely initially on the field notes and diaries that you have kept. It need not be written only after you have completed all data collection; there are advantages to updating your report at regular intervals as you undertake the project. By writing regularly you will still have ideas fresh in mind. You are working towards a warts-and-all narrative in which you describe what happened, the unexpected events as well as the planned ones. You might want to extend the narrative by discussing some of the frustrations and pressures you encountered along the way. You might also draw attention to the successes you felt, as well as the constraints you experienced, such as lack of time, curriculum constraints and resource issues. Comment on the role and co-operation of others who helped shape the success of the project. These may include colleagues, advisers, students, parents and even your course tutors.

Your notes and diary entries provide a time-stamped chronological journey through the process of implementation and collecting data. It is likely that a time line or table of salient events can be drawn from them. Notes also provide an insight into the evolution of your own thinking over the duration of the project. What may have felt like a haphazard and chaotic research process will reveal threads of continuity and progress with the benefit of hindsight.

 

For you to do

Look back over your notes and reflect on your experiences to produce a short narrative of your project. Ensure that you cover:
A table and/or a time line.
A narrative of key events.
Comments on the data collection.
First impressions of what you felt had gone well or had been difficult.

 

You may find Sarah Fletcher's site, TeacherResearch.net, helpful at this stage. It covers all aspects of practitioner research. It contains a large selection of national and international pieces of practitioner research, details research mentoring as a methodology, advertises practitioner research events, and reviews relevant books and websites.


Notes

  1. Computer technology has recently made available genuine alternatives to traditional transcription. For example, recent improvements in voice recognition software such as Dragon NaturallySpeaking mean that the interview need not be typed at the keyboard. Once the software has been trained to your voice, transcription can be carried out by listening to the interview through headphones and speaking it into a microphone. However, you cannot simply play an audio recording and have it appear on screen, as the technology is not yet sufficiently advanced.
  2. Some software packages, such as Qualrus and Transana, eliminate the need to transcribe audio and video data at all. They enable coding and analysis to be performed directly on digital recordings. This saves time and avoids loss in the richness of the data. However, some people prefer to work with text, and others feel that the transcribing process is too valuable in itself to skip. The software package NVivo can be used to analyse transcribed data and is available for less than £10 from IT Services. Details on available software and the various issues relating to technology-supported analysis of qualitative data can be found at Online QDA.
  3. Sivan, A., 2000. The Implementation of Peer Assessment: an action research approach, Assessment in Education, 7(2), 193-213.
  4. Ballantyne, C., 1999. Improving University Teaching: Responding to feedback from students, in Zepke, N., Knight, M., Leach, L. & Viskovic, A. (Eds) Adult Learning Cultures: Challenges and Choices in times of change, 11, 155-165, WP Press, Wellington.
