Major evaluation of Teach for Australia

From May 2016 to March 2017, dandolopartners was contracted to evaluate a major national education intervention focused on new education pathways for teachers. Laura Williams, dandolo’s Operations Manager, spoke about this project with Joe Connell. Joe is one of the firm’s directors and the project manager on this evaluation.

LW: Joe, thanks for speaking to me about this project. Before we launch into dandolo’s response, can you share some information on the project’s background?

JC: Probably the first thing to understand is what Teach for Australia (TFA) is. It’s known as an alternative pathway into teaching. TFA:

  • Selects high-achieving graduates with no prior teaching experience,

  • Provides them with intensive training and support, and

  • Matches them with disadvantaged secondary schools where they are paid to teach for a two-year placement.

It’s similar to Teach for America, or Teach First in the UK.

TFA was, and still is, substantially funded by the Australian Government education department. The department commissioned us to evaluate TFA. This meant assessing the extent to which it was delivering on the department’s objectives for its investment, and whether there was scope for improvement.

Our evaluation took place around 2016, six years into TFA’s operations, so the model was reasonably mature and TFA was operating at scale.

LW: I’ve heard it said around the office that this is one of dandolo’s most iconic projects. How did it get that reputation?

JC: Well, there are a couple of responses to that question. For me personally, this was a real coming-of-age project. It was the first project I had worked on for a federal client, and the most substantial in scale, nature and duration that I’d tackled. It was also, importantly, when I started to explore some of the analytical tools that have become integral to dandolo’s toolkit.

From the perspective of the firm, this project epitomises the kind of work we love. It’s aligned with our values, high-profile and challenging. It was an opportunity to deep-dive into a program that is exciting and full of possibility, but that also had some serious critics and has, at times, been controversial. And it was a chance to cut through all of that to understand whether and how TFA was delivering public value, and how it could deliver more.

We’re also really proud of the quality of the advice we provided, the change it precipitated, and the client relationships we built, which have lasted well beyond the project’s formal conclusion.

LW: What was noteworthy about the approach you took to this project?

JC: One of the things that sticks in my mind is how we incorporated ways of thinking from traditional evaluations and management consulting. We started by coming up with a framework to describe our client’s objectives and their key drivers. Then we developed hypotheses – with our client – about which drivers were the most significant, and which needed the most interrogation. That allowed us to be really efficient in targeting our analysis for maximum value.

In a project like this, that meant spending relatively little time confirming that high-quality candidates were indeed attracted to the program, and more time figuring out whether graduates of the program stayed in teaching, and whether they kept teaching in disadvantaged schools.

LW: So what’s an example of an area where you really focussed your attention?

JC: As I said, the question of how long graduates of the TFA program stay in teaching – rather than joining public policy consulting firms, for example – and whether they stay teaching in disadvantaged schools, was critical to the project. The client was basically investing in high-quality teachers for lower socioeconomic schools. Each year a TFAer stayed in a school, the department got a return on its investment; whenever they moved on, it did not.

So we looked at the retention question in different ways.

  • We started with TFA alumni survey data, which at the time had quite low response rates, and sought to triangulate.

  • One jurisdiction was able to run an analysis of their payroll data. Payroll is great because it’s granular, quite intimate, and very rarely wrong!

  • We were also able to ‘cyber stalk’ TFA alums. Stalk is not really the right term, because we relied on public information only, but we found LinkedIn, school newsletters and various other online mentions to be a rich source of information.

Through all this we developed a comprehensive and nuanced view of the retention question.
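
As a purely illustrative aside, the sketch below shows one simple way that partial sources covering the same cohort can be combined into a single retention estimate. All identifiers, records and reliability rankings here are hypothetical placeholders; this is not the data or the logic used on the project.

```python
# Hypothetical triangulation sketch: combine partial sources covering the same
# cohort, preferring the source assumed to be most reliable for each person.

# Each record: alum id -> still teaching? (True/False)
survey = {"A01": True, "A02": False}                 # low response rate
payroll = {"A01": True, "A03": True, "A04": False}   # granular, rarely wrong
public_profiles = {"A02": False, "A05": True}        # LinkedIn, newsletters etc.

# Assumed reliability order: payroll first, then survey, then public mentions.
sources = [("payroll", payroll), ("survey", survey), ("public", public_profiles)]

cohort = {"A01", "A02", "A03", "A04", "A05", "A06"}

resolved = {}
for alum in cohort:
    for name, data in sources:
        if alum in data:
            resolved[alum] = (data[alum], name)  # take first source that covers them
            break

known = [still_teaching for still_teaching, _ in resolved.values()]
coverage = len(resolved) / len(cohort)
retention = sum(known) / len(known) if known else float("nan")

print(f"Coverage: {coverage:.0%} of cohort observed in at least one source")
print(f"Estimated retention among observed alums: {retention:.0%}")
```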

LW: What else did you do that was innovative in this project?

JC: A couple of things come to mind.

TFA is a nationwide program that aims to disperse its graduates, so it would have been really hard to conduct a meaningful number of stakeholder interviews in person. Using online focus groups, we were able to involve a greater number of participants than we could have met face-to-face, and we reached a broader, more representative cross-section of the cohort. Current and former TFAers were enthusiastic participants in this fieldwork. They logged in daily over the course of a week and engaged with our moderators, and each other, in a format that felt like a discussion forum. That was the first time we used this tool; we’ve run online discussion forums in a handful of projects since.

One of the most useful outputs of this project was a value-for-money analysis we conducted to quantify the benefit realised from the Department’s investment. This was important partly because we were talking about a spend of public money for which TFA needed to be accountable, but also because it provided a useful frame for considering changes to the model. If we could lower costs, or increase the number of additional teaching years created, that would improve value for money.
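
To make that frame concrete with purely hypothetical figures (not numbers from the evaluation), the core arithmetic is a cost-per-additional-teaching-year ratio, and the two levers Joe mentions both improve it:

```python
# Illustrative value-for-money arithmetic with hypothetical placeholder figures.

program_cost = 10_000_000          # hypothetical total investment ($)
additional_teaching_years = 400    # hypothetical teaching years attributable to the program

cost_per_additional_year = program_cost / additional_teaching_years
print(f"Cost per additional teaching year: ${cost_per_additional_year:,.0f}")

# The two levers: lowering cost or creating more additional teaching years.
improved = (program_cost * 0.9) / (additional_teaching_years * 1.1)
print(f"With 10% lower cost and 10% more teaching years: ${improved:,.0f}")
```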

LW: Finally, what impact did the report have?

JC: You can see a bit of the report for yourself. As is common on projects like this, we produced a public executive summary. That’s here. The report showed the extent to which the program was delivering on the Government’s objectives, and the Department was able to use this evidence to retender for services. I think it also helped the department reiterate its frame of reference for the investment.

In the short term, they renegotiated their contract with TFA, and some of the recommendations we had made ended up as provisions. Then, in the medium term, and consistent with our advice, the department chose to go to market for suppliers of alternative pathways into teaching. TFA was successful in that process, but so was another program based out of La Trobe University. And my understanding is that the TFA contract that followed the competitive process further reflected some of the changes we had proposed.