How can pedagogical leaders structure the evaluation process to ensure it is meaningful and has an impact on student learning?
I thought a lot about this question throughout the PYP evaluation process. Having just finished a lengthy CIS accreditation process, the school now needed to prepare for another evaluative phase. Staff had just spent time in groups, gathering evidence and writing summaries of findings; another process built on the same structures would likely have disengaged them, reducing evaluation to an administrative task instead of an opportunity for school-based inquiry. To be meaningful, the PYP evaluation process needed to prove that it is far more than an exercise in coordination, which it is. As John MacBeath (1999) writes, evaluation gives schools the opportunity to improve teaching and learning, refine strategic planning and structure internal professional development opportunities for staff.
However, this opportunity is only as good as what we make of it. Think about the classroom: a poorly designed learning engagement turns students off, no matter how compelling the content. A task delivered through a modality that clashes with a child’s learning style ends in frustration. A discussion that values some students’ opinions over others’ creates a sense of ownership in a few students rather than in all of them. In the evaluation, just as in the classroom, our ability to construct meaningful experiences determines how successfully we achieve our aims with members of the school community. Pedagogical leaders must respond flexibly to the realities of their school context in order to turn the evaluation process into positive change.
So how did we attempt to remain agile during evaluation? A few key understandings guided our approach:
1. The PYP evaluation provides the opportunity to develop understanding of “inquiry as a stance” across the school.
Early in our evaluation, before we formed our inquiry groups, staff had the chance to tune in to the standards and develop descriptors for assessing the practices. Yet something was still missing: the emotional pull that encourages questioning and exploration in the inquiry process. We needed a provocation. To reinforce that the evaluation is about inquiring into student learning, I interviewed students on video using a single question: “What makes a good learner at our school?” The question, based on John Hattie’s work in Visible Learning (2008), allowed us to capture the diversity of student thinking about learning dispositions. Would they express what we hoped they would? Was there a clear picture across the school? What trends would emerge in their responses?
The video interviews did indeed document varying ideas about what makes a good learner. Some students said it meant “listening to the teacher” or “being quiet”. Others described the importance of being a thinker, an inquirer or someone “hungry to learn”. Here is a snippet from our “What makes a good learner?” interviews: [video]
The student responses sparked teachers’ interest and desire to find out more. In this way, the evaluation modeled the inquiry process: we started with a question hanging over our heads, namely “How are we really doing?” The evaluation allowed us to question, experiment with possibilities, make and test theories and defend positions, just as we do with students during a unit of inquiry. Focusing on the notion of “inquiry as a stance” throughout stressed that the evaluation is about the process, not the product.
2. Catering for different working styles among staff increases the likelihood of a meaningful PYP evaluation process.
Some teachers construct meaning through writing; not all do. Asking inquiry groups to write summaries of findings would therefore not ensure that every teacher felt personal ownership of the evaluation process. To remedy this, I turned to Colin Robson’s (1993) claim that what is important is “the usefulness of the data for the purposes of the evaluation, and not the method from which it is obtained”. If we wanted to create opportunities for deep, evidence-based discussion in order to improve student learning, we needed methods that support conversation. This meant doing away with a regimented, linear report format and allowing instead for iterative discussion captured in audio files. Each group recorded its conversations on a handheld device, noting the evidence that supported its argument for a particular rating. For visual thinkers, using the interactive whiteboard to reference files while speaking linked the discussion to concrete examples of practice. Here is an example of the kind of discussion we captured, in which a group discusses Standard A6, “The school promotes open communication based on understanding and respect”: [Audio: a group discusses A6]
It seemed only natural that this “documentation” of conversations would increase the quality of our self-study and lead to more meaningful action points for our strategic plan. In a second audio component, groups verbally summarized their findings to suggest next steps for school development. These summaries were then collated into a list of action points that informed our final action plan. Here, the same group discusses one area for development that came out of their process: [Audio: summary and action points]
3. A school’s approach to PYP evaluation determines whether it is perceived as an “add-on” or as integrated into the fabric of teaching and learning.
Another consideration while structuring our evaluation was ensuring that the process connected to what we already do as a school. Although a significant amount of time was dedicated to evaluation, we did not want it to be perceived as an “add-on” to our routine. To make connections between the PYP standards and practices and our mission statement, I aligned the C3 (Teaching and Learning) and C4 (Assessment) practices to our Teaching and Learning Principles, which define the school’s core pedagogical values and “unpack” what the mission statement looks like in the classroom.
Because we strive to embed these principles in our classroom practice, assessing our implementation of the C3/C4 practices alongside the Teaching and Learning Principles brought greater relevance to the evaluation process and addressed our unique school context.
To make the connection to teaching and learning even more evident, the C3 and C4 practices were assessed using classroom observations and student interviews. In a series of Learning Walks, groups of teachers, released for a day, gathered evidence around a particular focus. At the end of each day, reflections and possible action points were collated in a shared Google doc. Here is an example of how the Teaching and Learning Principles were matched with C3 and C4 practices for use in our Learning Walks.
4. Creating structures that gather multiple perspectives helps schools ensure that the PYP self-study process does not reflect a singular voice.
Lastly, when developing structures for our evaluation, it was important to ensure that everyone’s voice was present. Gustave Flaubert claimed, “There is no truth, only perception.” In the same way, we are more likely to arrive at a collective “truth” about our strengths and areas for development as a school if we look for trends across multiple perspectives. We did this, for instance, in developing our action plan. Each school year seems busier than the last; streamlining the action plan and being clear about school priorities makes implementation more likely to succeed. So, to uncover which action points were most pressing for teachers, I used a survey.
These survey results were then cross-referenced with data from parent focus groups held earlier in the year. In this way, the evaluation action plan reflects multiple perspectives and is owned by the entire school community.
MacBeath, J. (1999). Schools Must Speak for Themselves: The Case for School Self-Evaluation. London: Routledge.
Hattie, J. (2008). Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. London: Routledge.
Robson, C. (1993). Real World Research: A Resource for Social Scientists and Practitioner-Researchers. Malden: Blackwell Publishing.
Carla Marschall previously worked in Hong Kong at Quarry Bay School (English Schools Foundation) as well as in Germany as a PYP coordinator. Carla is a PYP workshop leader and field representative and a Lynn Erickson concept-based curriculum and instruction trainer. She is especially interested in the role of curriculum in helping students develop critical and creative thinking. Carla is currently involved in curriculum development for language in the PYP. She tweets as @carlamarschall.