The making of the Happy Teacher questions

Posted on 29th July 2016 by Joe

So how do you decide on a set of questions that will give teachers the best picture of what it’s like to work at a school? That was our goal. In this post I’ll tell the story of how we arrived at the questions you see on the site today.

Background Research

We started off by doing an extensive review of what is currently out there: wellbeing and workload surveys, stress surveys, job satisfaction surveys, and everything in between. These surveys are run by a wide range of organisations, such as teaching unions, individual schools, academic researchers, and job review websites. We looked not only at the questions they use, but also at their results. We also talked to teachers, wherever possible, about what they would expect from a website like Happy Teacher.

We decided early on that we wanted a different focus from a standard wellbeing or stress survey: we wanted the emphasis to be not so much on teachers evaluating their own feelings as on teachers evaluating their school. We thought that teachers arriving at the site would be less interested in whether, for example, individual teachers had found themselves a good work/life balance (which could be down to many personal factors), and more interested in whether “This school makes a good work/life balance possible.”

After a lot of reading and discussing, we put together 28 questions for a pilot survey, knowing that we’d want to prune this number down to be left with a review short enough even for time-stretched teachers.

Survey 1: Pilot

On an INSET day in the spring term I stood up in front of all the teachers at my school to introduce the Happy Teacher project and ask for their help. I was very grateful that my head teacher, Jonathan Wilden, saw the potential of gathering anonymous teacher feedback. My colleagues had lots of questions (“If it’s anonymous, why should we say anything positive?”, or “If it’s going to be public, why should we say anything negative?”) and in the end 34 of them completed the pilot survey.

We drew up a summary of results for my head teacher, who discussed them with SLT and forwarded them on to all staff. These results themselves were promising for us at Happy Teacher because there were clear peaks and troughs, highlighting specific ‘WWW’ (What Went Well) and ‘EBI’ (Even Better If) areas.

We analysed the results with help from my fiancée, Emily, who is currently studying for a PhD in behavioural genetics. We looked at things like whether any two questions’ responses correlated so highly that one of them was redundant and could be removed. We looked at consistency between the ratings to see whether all the questions in one Key Area were really getting at the same thing. For example, we had a Key Area called ‘Atmosphere & Environment’ which, it turned out, wasn’t really nailing down one coherent construct. We also looked at teacher feedback on which questions were too ambiguous or confusing, and adapted these. An example was: “Working at this school is stressful,” where teachers said they just weren’t clear on the definition of ‘stress’.
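For readers curious about the mechanics, the redundancy check described above can be sketched in a few lines. The question labels and ratings below are made up for illustration, not our real pilot data:

```python
import numpy as np

# Hypothetical 1-5 Likert ratings: rows = teachers, columns = questions
ratings = np.array([
    [4, 4, 2, 5],
    [3, 3, 1, 4],
    [5, 5, 3, 5],
    [2, 2, 2, 3],
    [4, 5, 1, 4],
])
questions = ["Q1 workload fair", "Q2 deadlines fair",
             "Q3 IT systems", "Q4 SLT supportive"]

# Question-by-question Pearson correlation matrix
corr = np.corrcoef(ratings, rowvar=False)

# Flag pairs that correlate so highly that one may be redundant
threshold = 0.9
for i in range(len(questions)):
    for j in range(i + 1, len(questions)):
        if corr[i, j] > threshold:
            print(f"{questions[i]!r} and {questions[j]!r} may be redundant "
                  f"(r = {corr[i, j]:.2f})")
```

With real data you would eyeball the flagged pairs rather than delete automatically, since two questions can correlate highly while still asking about genuinely different things.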

Teachers were also asked to judge whether the responses to each question would be useful to a teacher who was considering working at the school. One drawback of this survey was that teachers rated almost all questions as being very useful, yet also said our review needed to be shorter. It became clear that we needed to flip our perspective – to that of a teacher reading the review, who is interested in a job there and needs the insider’s scoop.

Survey 2: ‘What Do Teachers Want?’

So next we ran a survey designed to answer questions like: What do teachers desperately want to know about a school? What are their absolute top priorities when looking for a new job?

We got 74 responses from teachers at a range of schools. When asked to rank focus areas in terms of “What would matter to you the most?” teachers gave the following ‘Top 5’:

1. Teacher wellbeing.
2. Demands on teachers’ time.
3. The day-to-day running of the school.
4. Senior leadership.
5. The behaviour of students.

A few interesting things arose here.  One was that some areas we had originally included in our question set weren’t judged to be of high importance, such as ‘IT systems’ and ‘Career opportunities’. So we removed these from the question set. Another was that ‘Behaviour of students’ was clearly important to teachers, so we included a whole ‘Key Area’ for this.

Teachers also identified what specific questions they’d most want to see responses to. The Top 3 were:

1. “This school makes a good work/life balance possible.”
2. “The deadlines and time pressures imposed here are fair.”
3. “There is a positive atmosphere amongst colleagues.”

We made sure to include all questions with a really high ‘want rating’ such as these. We pruned away questions which teachers didn’t care so much about.

Survey 3: A Large Data Set

Once our question set had been edited, we realised we needed to run the revised questions past teachers once more so the set could be finalised. We got anonymous responses from 124 teachers, a sample large enough for a proper statistical analysis.

We used this analysis to edit the question set down to 4 Key Areas – ‘Atmosphere’, ‘Workload’, ‘Behaviour’ and ‘Leadership & Management’, each with 4 questions. By deleting questions and moving a couple around, we ensured that the responses to the 4 questions within a Key Area were consistent with each other. Without going into the statistics too much (I teach a lot of statistics, but I know it’s not everybody’s cup of tea!), this was indicated by a high ‘Cronbach’s alpha’ score for each of the Key Areas. This justified us grouping the questions together in this way.
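For anyone who does want a taste of the statistics, Cronbach’s alpha can be computed with nothing more than numpy. The four-question ‘Workload’ ratings below are invented for illustration:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: rows = respondents, columns = the questions in one Key Area."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each question
    total_var = items.sum(axis=1).var(ddof=1)  # variance of each teacher's summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-5 ratings from six teachers on four 'Workload' questions
workload = np.array([
    [4, 4, 5, 4],
    [2, 3, 2, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
    [4, 5, 4, 4],
])

alpha = cronbach_alpha(workload)
print(f"Cronbach's alpha = {alpha:.2f}")
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency; when the questions in a Key Area mostly rise and fall together, as here, alpha comes out high.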

Big Picture Questions

Some people have asked, “Why do you have an overall question at the end, why not just find an average of all the questions?”. Our rationale is that any averaging process would be misleading, since it would be based on the unwarranted assumption that all items are of equal importance. At the end of our review we ask, “Taking everything into consideration, how would you rate your experience of working at XYZ School?” allowing each teacher to judge the relative importance of each of the items themselves. Nevertheless, a pleasing result of this final survey was that the responses to this overall question correlated very well with the average of the responses to all the individual questions.
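That correlation check is simple to reproduce. The numbers below are invented, but they show the shape of the comparison: each teacher’s mean across the individual questions against their answer to the single overall question:

```python
import numpy as np

# Made-up data for five teachers: mean rating across the individual
# questions, and the answer to the single overall question
item_means = np.array([4.2, 2.5, 3.8, 1.9, 4.6])
overall = np.array([4, 2, 4, 2, 5])

# Pearson correlation between the two
r = np.corrcoef(item_means, overall)[0, 1]
print(f"r = {r:.2f}")
```

A high r here is reassuring but does not make the overall question redundant: it still lets each teacher apply their own weighting rather than the equal weighting an average imposes.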

The following may be of interest to stats enthusiasts out there. We ran a factor analysis on our results to see whether they revealed a number of distinct underlying factors important to general teacher job satisfaction. Apart from ‘Behaviour’ emerging as a factor of minor importance, we found nothing beyond one general factor, contributed to by many of our questions. So what does this mean? It means that our 4 Key Areas do not seem to be statistically separate factors, even though each has internal consistency.
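The flavour of this result can be sketched with an eigen-decomposition of the correlation matrix, a rough stand-in for proper factor extraction. The simulated data below is driven by a single ‘general satisfaction’ factor, so one eigenvalue dominates, just as in the pattern described above:

```python
import numpy as np

rng = np.random.default_rng(0)
n_teachers = 124

# Simulate 8 questions whose responses are all driven mostly by one
# 'general satisfaction' factor, plus individual noise
general = rng.normal(size=n_teachers)
responses = np.column_stack(
    [general + 0.5 * rng.normal(size=n_teachers) for _ in range(8)]
)

# Eigenvalues of the correlation matrix, largest first. The Kaiser rule
# keeps eigenvalues > 1; here only the first clears that bar.
corr = np.corrcoef(responses, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
print("Eigenvalues:", np.round(eigvals, 2))
```

When several distinct factors are present, several eigenvalues exceed 1; a single dominant eigenvalue is the signature of one general factor.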

We stuck with organising the questions into these Key Areas, primarily because the previous survey had identified them as the areas teachers said they would actually want to find out about when looking for a new job. Additionally, teachers have told us they want to be able to ‘drill down’ to the results of specific questions within each Key Area.

Finishing Touches

We made our final decisions on the question set by triangulating between this final statistical analysis, the ‘What Do Teachers Want?’ survey, and our desire for everything to come together to form one coherent review. If you have any further questions about any part of this process, please just ask.

We’re really excited to see what’s revealed as we gather more reviews and explore the results in more detail. For this to happen, we need as many reviews as possible. If you haven’t yet, do consider writing a review of your school.

Photo credit: Jared Cherup