LGiU Scotland’s Kim Fellows is in conversation with David Martin, Chief Executive of Dundee City Council, and Emily Lynch of the Improvement Service, talking about the Local Government Benchmarking Framework.
It would be good to start with a little bit of the background on this. So what is the benchmarking framework?
The Local Government Benchmarking Framework [LGBF] has been around for about seven years now, and it effectively gives councils, and any other interested party, the ability to look at the performance of local government and partners on a broad range of functions. The framework helps give a sense of where local performance fits in relation to national performance.
The benchmarking framework provides comparative information for all 32 councils across all key service areas. There are two specific design features that might be helpful to highlight: One is that it really encourages a rounded view of performance and provides information on cost, performance and satisfaction – information which helps us move away from a simplistic league table approach. The second is that it focuses on all key local government services – helping councils and their partners look at the connections across services and supporting them in their focus on improving outcomes. If you’re focusing on outcomes, for example for children, there are so many different service areas that play a very important role in improving these whether it’s children’s services, culture and leisure services, or housing services. The framework really supports that local focus on the interconnections across local government services.
It would be helpful for people who aren’t familiar with the framework to know how it came about. What was the initial driver for the framework?
DM: So I think the initial driver for the framework is that local government has always been subject to review, external audit and inspection and regulation activity. Around eight years ago, local government had evolved to a place where there was a range of detailed individual statutory performance indicators and we’d been operating like that for a long time. It felt imposed on local government and it felt that we were feeding a machine of data that wasn’t being particularly well used diagnostically. It wasn’t of great interest to the media, it was of limited interest to politicians, and around that time we began to lobby and to suggest that local government could be trusted to do our own self-evaluation and self-determination, and that all improvement is local and needs to start and finish locally.
The idea was that local government could deal with the data in a more meaningful way, in a way that improved outcomes for communities and individuals. The principle was that local government needed to think again and the framework evolved into what it is today, from relatively limited beginnings. We didn’t immediately dispense with the Statutory Performance Indicators [SPIs] and we worked really hard with Accounts Commission and Audit Scotland to demonstrate that local government could be trusted to mark our own homework, that the data was robust, and that we could replicate comparative information across all Scotland’s local communities. Local government then worked through a period where the SPIs continued in one form or another as we grew the reach and the depth of the local government benchmarking framework.
We’ve continued to evolve – we started off with input measures about cost and we’ve gradually developed a much more rounded approach. Maybe the best example would be the children’s indicators. Looking back three or four years, it was about cost per pupil in primary and secondary. There was a lot of very interesting information, but it only provided a very partial measure of the service. A lot of feedback from chief executives and from politicians was that local government [LG] needed a rounded view. In a couple of steps we’ve grown to the 20-odd indicators existing today, looking at early years developmental milestones, performance of services for looked after children and young people, attainment and achievement for young people, post-school destinations in the short and longer term, etc. It also looks at the cost information. Reports provide a rounded view on how children’s services are performing, with efficiency and effectiveness data and a focus on outcomes, and that has worked well for the LGBF. We also produced a themed report on children’s services, which allowed us to go into a bit more depth around the issues. This demonstrates that local government takes this issue seriously, that we want to inform all stakeholders and members of the public about what we’re doing, and that challenge and support are welcome.
There is a little degree of “I’m a chief executive, one of 32, and I want to know how Dundee is performing even if that’s uncomfortable reading”. I want to know if we’re not performing particularly well in a given area, and from the framework I know who is performing well, so I can talk to them directly. It also allows me to dig into what’s behind the data. I would stress it’s not definitive – it gives you the direction of travel and allows you to ask the more detailed questions. I think politicians have found that approach really helpful.
EL: I think it is an important piece of information that can supplement other intelligence that people have locally to build that whole picture – and the LGBF gives you the comparative element. One of the striking features is seeing the benefits of the work being led by local government, because I think councils have embraced the flexibility this has afforded them in terms of being able to focus in on those areas that are of particular policy relevance and strategic importance to them. What we’ve seen is a real commitment across councils to contribute to that process in terms of informing the development of measures and ensuring that the data that’s provided is sufficiently robust for benchmarking. We’re about to publish our ninth year of data and we still have 32 councils signed up to this approach, providing robust and consistent data and, importantly, contributing to the ongoing development of the framework. We’re continuing on that development journey and councils are really driving and shaping that future.
How credible do you think the framework is amongst politicians and officers?
DM: I think you can never take anything for granted. Personally, I think the LGBF has grown in credibility. We’ve had discussions in parliamentary committees, the data has been referred to and used in First Minister’s Questions and much, much more. From the point of view of local interest, there’s a lot of regular public performance reporting information that councils are both required, and want, to embrace. The local government benchmarking framework is at the heart of that, so in council chambers, politicians are regularly looking at the web-based tool and in real time, asking questions about accountability and scrutiny on service data. It’s been used as intelligence and it’s been used to pose policy questions, both locally and nationally.
If we talk about inspectors and regulators, the best value audits that have been done by Audit Scotland have adopted a core dataset drawn from the local government benchmarking framework, so it’s being taken seriously by Scotland’s audit community. Accounts Commission regularly challenges us about where we’re going next, so we have a strategic plan, we’ve got a forward look, we’re thinking all the time about the relationship between how councils contribute to what our community looks like, and how other partners do so. There’s a really strong link between the local government benchmarking framework and information from wider community planning purposes.
I think that the LGBF has grown in credibility and we’ve moved forward. Seven years ago it was a nuisance, and now there’s more of an improvement culture. We’ve really grown. An example would be the family groups: groups of like-minded councils who regularly meet in particular areas and support and challenge each other based on what the data is telling them. The media are less focused on league tables and more interested in issues, and I think of that as a positive thing. To be frank, personally I have no fear of league tables – facts are facts and, at the end of the day, if we’ve got information that we need to know, the public should be scrutinising it and we should be working through our accountability loop to try and improve services. Culturally, we’ve become more confident about the LGBF.
EL: Conversations have most definitely developed – they are more nuanced now and there is more of a focus on what’s driving the difference and what can be learned from that, as opposed to who’s top and who’s bottom. It’s still a work in progress, and I think it continues to be an area of development both for ourselves and within each of the local councils in terms of how we use this information to actually develop the conversation about performance. I think COSLA are really keen to work with us on this as well, specifically to support elected members to engage effectively with the material so that they too are using it to have that more nuanced conversation.
DM: It’s important to stress that we can always improve LGBF and it’s never a question of a job done. We talked about the issue of relevance – public protection is a big issue in Scotland just now, whether that’s about child protection or adult support protection, or violent offenders. At the moment you can’t use the local government benchmarking framework to give you a dashboard for public protection, so we’re looking at whether that could be built into the LGBF, what that might look like, and how it might complement some of the political priorities and national performance framework priorities for Scotland.
It’s important that the LGBF is evolving through our strategic plan to make sure that we can keep up with the policy agenda of the day. What we’re not trying to do is duplicate work that has already been done elsewhere. We mentioned child protection and adult social care: there’s a range of performance data being developed on the back of national work in health and social care, and we need to complement that. There’s also a child protection dataset for the first time in Scotland, and it’s consistent across all of the public protection partnerships that exist. I was actually at an event yesterday where we were discussing using the Local Government Benchmarking Framework data around children as context, alongside the information around child protection. If you put these two things together you’ve got a really clear understanding of prevention and protection together, so it’s about trying to anticipate and add value all the time, not just drive a tick-box agenda.
What do you think the future holds? Going forward next year, around climate change and sustainable development for example, how can the framework evolve and help?
DM: LGBF has a board, on which COSLA are represented at a political level and I have the privilege of chairing, and we’ve got a combination of professional associations, Improvement Service, and a number of other interests who make it happen. That is a dynamic process and we scan and think about how we go ahead and deal with newer issues. There’s been a declaration of a climate emergency in Scotland and we’re now going to have to be looking at how and where we can add value in this area. We don’t start conversations with “the trouble with that is…”, although it’s very easy to do that – we’re looking at how can we offer something different. My sense is that a climate change dashboard is quite a challenge but we’ll see what we can do to play our part and get actively involved in debates about how that might look in Scotland.
Another important issue is that Community Justice partnerships need to get deeper because of resource constraints. That’s partnership between agencies and with communities, and we need to be thinking about how other agencies could perhaps play into this arena. I have regular dialogue with Police Scotland and Scottish Fire and Rescue Service about getting more fine-grained, comparative information from across divisions. It’s helpful to recount our journey, because if you’ve not done it before you worry about “how does D division of Police Scotland compare with B division” and it ends up being used in an unhelpful way. Our standpoint is just get out there on the front foot and use the information for the purposes of improving public relations and local scrutiny. Also, I’m really interested in what D division’s doing compared to B division but doing that within an organisation, which happens everywhere, is one thing. Doing it across a partnership is so much more powerful and it gets you into far more collaborative joint working. For national agencies that have got a regional geographic structure, there’s a lot of value in doing this, and it has certainly benefitted us in local government – but I understand why some people may be a little nervous about it.
EL: Related to that, in terms of our focus on the services that councils are delivering in partnership, we keep our sights on the link to the whole outcomes landscape. We are continuing to look at how to strengthen the links between the benchmarking framework, which obviously has a focus on local government, and outcomes through the Community Planning Outcomes Profile. The profile provides high level outcomes data and allows you to focus in on specific communities within your area to get to a much more geographically concentrated focus in terms of outcomes and inequalities. As we develop the benchmarking framework, we’re continuing to look at how the information that we have helps us understand the progress we’re making towards these outcomes. That, again, helps us to hook in those conversations with partners, because in terms of delivering on progress and outcomes these are things that we need to be working on very closely with partners.
So what are the key messages from what we’ve talked about here today?
DM: Go compare, I think is one key message. Chief Executives have a short attention span and need clear, succinct information to aid decision making. I think from the point of view of the data we’ve got, it allows you to frame questions and seek assurance. So, if I’m in Dundee it’s a tale of two cities – the challenge is that for all the good things that are happening in Dundee, you’ve still got high levels of child poverty, difficult issues with substance misuse and mental health and wellbeing. Having data around that, both from the community planning outcomes profile and the local government benchmarking framework, just gives me some context and something deeper that allows me to sense-check progress. That is particularly helpful, and there’s nothing to be afraid of.
EL: I would say that it is a success story for local government. Whilst there’s still work to do and local government is still committed to driving that forward, I think you just need to look at all the examples that councils provided in the recent report ‘How Councils are using the LGBF’ about the different ways that they are using this information, whether it is in performance, scrutiny and reporting to the local communities, or whether it’s about driving improvement. We’ve collected so many examples from authorities that show how they are making use of this information and how it’s helping them to drive improvement and will continue to do that to build this evidence base.
DM: That’s been one of the most interesting things to the council leaders and senior politicians, because they want to see the difference it makes. There’s actually an increase in the amount of source information and learning from that right across the range of activities and that learning’s been adopted by councils who perhaps wouldn’t have even known it existed, had they not had the data to compare.
Further information is available on the LGBF website: http://www.improvementservice.org.uk/benchmarking/explore-the-data.html
Thank you very much.