An IBM computer processes data, including from the police and NHS, about 170,000 residents of Bristol 24 hours a day. Photograph: Alamy

How Bristol assesses citizens' risk of harm – using an algorithm


Almost a quarter of the city's population is processed by an algorithm used to guide the actions of frontline staff

Day and night, an IBM computer algorithm whirrs through reams of data about the lives of 170,000 citizens of Bristol. The city council’s servers teem with information from the police, NHS, Department for Work and Pensions and the local authority about these individuals’ work, alcohol, drug and mental health problems, crime, antisocial behaviour, school absences, teenage pregnancies and domestic abuse.

Almost a quarter of the population of the West Country’s biggest city is processed by the algorithm, which gives these people scores out of 100 to show the likelihood of them behaving antisocially, abusing children or going missing, among other things. The system has even predicted which of the city’s 11- to 12-year-olds seem destined for a life as a Neet – not in employment, education or training – by analysing the characteristics people currently in that position showed at that age.
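The council has not published the model itself, but the Neet prediction described here, learning from the characteristics that people currently in that position showed at 11 or 12, resembles a conventional supervised classifier. A minimal sketch of that approach, using invented features and synthetic data rather than anything from Bristol's system:

```python
# Hypothetical sketch: predict later Neet status from characteristics at age 11-12.
# Feature names, example values and training data are invented for illustration;
# the council's actual model and inputs are not described in the article.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: absences per term, exclusions, free-school-meal flag, social-care referrals
X_historic = np.array([
    [12, 1, 1, 2],
    [ 2, 0, 0, 0],
    [20, 3, 1, 4],
    [ 5, 0, 1, 1],
])
y_historic = np.array([1, 0, 1, 0])  # 1 = became Neet, 0 = did not

model = LogisticRegression().fit(X_historic, y_historic)

# Score a current 11- to 12-year-old out of 100, as the article describes
current_pupil = np.array([[8, 1, 1, 1]])
score = int(round(model.predict_proba(current_pupil)[0, 1] * 100))
print(f"Neet risk score: {score}/100")
```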

Citizens do not automatically have access to the results from the integrated analytics hub but can request them under data protection laws. The council, however, is using them to guide the actions of frontline staff.

Often, the algorithm tells these workers what they already know, but its insights have already helped inform decisions to deploy £800,000 to fund more social workers and family support workers in parts of the poorer, socially deprived south of the city, where risk factors such as school exclusions, domestic violence and crime indicated emerging demand.

The project is part of a trend among cash-strapped local authorities to embrace algorithmic technology to analyse their citizens. It holds out the hope of reducing fraud or preventing costly social problems, but the systems are often opaque. Private companies are pouring into the market, but there have been problems. North Tyneside council recently ended a contract after an algorithm identified the wrong people as high-risk claimants, leading to their benefits being wrongly delayed.

Bristol has decided to be open about its system, which has largely been developed in-house. Citizen scoring is underpinned by analysis of the circumstances of people who have gone through problems before. For example, the algorithm determines the likelihood of a child being sexually exploited through analysis of 80 different risk factors from the records of people who were previously abused. Going missing more than once is the most reliable predictor, but data about domestic violence, gang membership, school discipline and truancy are also fed into the algorithm.
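The article does not say how the 80 factors are combined, but a simple weighted sum is one plausible reading of a score out of 100. The factors and weights below are invented for illustration, with going missing more than once weighted highest because the article calls it the most reliable predictor:

```python
# Hypothetical sketch of a weighted risk-factor score. The real system uses
# around 80 factors; these five and their weights are assumptions.
RISK_WEIGHTS = {
    "missing_more_than_once": 0.35,
    "domestic_violence_at_home": 0.20,
    "gang_association": 0.20,
    "school_discipline_issues": 0.15,
    "persistent_truancy": 0.10,
}

def exploitation_risk_score(flags: dict) -> int:
    """Return a 0-100 risk score from binary risk-factor flags."""
    raw = sum(weight for factor, weight in RISK_WEIGHTS.items() if flags.get(factor))
    return int(round(raw * 100))

print(exploitation_risk_score({"missing_more_than_once": True,
                               "persistent_truancy": True}))  # 45
```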

In seconds, the system can paint a picture of risk for an individual that would take social workers years of local knowledge to build up. New data arrives around the clock and when it creates a concerning pattern, the system can automatically email an individual’s social workers or family support workers and urge them to review their plans. Feedback from frontline officers on which risk factors are most germane is then used to “tune” the algorithm to improve its accuracy.
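A sketch of what that alerting step might look like, with the threshold, recipients and mail helper all assumed rather than drawn from the council's actual system:

```python
# Hypothetical sketch of the alerting step: when newly arrived data pushes an
# individual's score over a threshold, notify their assigned workers.
ALERT_THRESHOLD = 70

def send_email(recipient: str, subject: str, body: str) -> None:
    # Placeholder for whatever mail system is actually used.
    print(f"To: {recipient} | {subject} | {body}")

def check_and_alert(person_id: str, new_score: int, previous_score: int,
                    assigned_workers: list[str]) -> None:
    """Email assigned workers when a score newly crosses the alert threshold."""
    if new_score >= ALERT_THRESHOLD and previous_score < ALERT_THRESHOLD:
        for worker in assigned_workers:
            send_email(worker,
                       subject=f"Review plan for {person_id}",
                       body=f"Risk score rose from {previous_score} to {new_score}.")

check_and_alert("case-0001", new_score=82, previous_score=55,
                assigned_workers=["social.worker@example.org"])
```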

Gary Davies, the head of early intervention and targeted services for children and families, said: “It is not replacing professional judgment, it is not making any decisions on its own. What it does is give us information that has been sunk in organisations’ memories.”

A recent effectiveness check examined the sexual exploitation of five young people. It found that prior to them being victimised last year, the algorithm had placed three of them in the top 100 of young people in Bristol who were likely to become victims; a fourth was between 100th and 200th. This was out of more than 7,000 people. The fifth would have been high up as well if a piece of data about them had not become disconnected.
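The mechanics of such a check are straightforward to sketch: take the scores the system had produced before the incidents, rank everyone, and see where the later-confirmed victims sat. The data below is synthetic:

```python
# Hypothetical sketch of a retrospective ranking check, with synthetic data.
def retrospective_ranks(snapshot_scores: dict, confirmed_victims: list) -> dict:
    """Map each confirmed victim to their rank in the pre-incident scores (1 = highest risk)."""
    ranked = sorted(snapshot_scores, key=snapshot_scores.get, reverse=True)
    position = {person: rank for rank, person in enumerate(ranked, start=1)}
    return {victim: position.get(victim) for victim in confirmed_victims}

snapshot = {"p1": 91, "p2": 12, "p3": 77, "p4": 45, "p5": 88}
print(retrospective_ranks(snapshot, ["p1", "p3"]))  # {'p1': 1, 'p3': 3}
```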

This raises serious questions. If an algorithm predicts harm, does that place the council under a legal and moral obligation to prevent it? Should the council allow the algorithm to tune itself to account for when its predictions were right or wrong – in other words, to use machine learning to potentially improve its accuracy? This does not currently happen, but might give the machine greater influence over the question of who needs the council’s help most. If the algorithm is effective, should it automatically mandate that workers take action, rather than simply tipping them off about possible concerns?

For now, Davies, a former chief superintendent with Avon and Somerset police, is keen to retain the primacy of social workers, health visitors and school welfare officers, who spend time in schools and on citizens’ doorsteps. He believes people might resist the algorithm being allowed to develop machine learning.

“It is hard to know if it goes in that direction,” he said. “Public opinion could go against it.”

But the neatness with which the system categorises families and citizens with scores and rankings is likely to prove alluring to a resource-strapped public sector. At a click, officials in Bristol can download an individual’s digital “vulnerability profile” including percentage scores denoting the likelihood of a range of other harms. The risk of bad outcomes for whole families can be displayed as a single line graph showing change in risk over time.

It is a simplistic snapshot of the complexity of human life. Bristol has acknowledged misinterpretation of scores is a risk, particularly when it comes to child exploitation, and it has identified hacking of information or even misuse by those with authorised access as dangers.

“There is definitely a need to be mature in your thinking about what this tells you,” Davies said. “There is a benefit to be gained in allowing people with the right intentions to use modern technology to assess risk and vulnerability, and allocate resources to those that need it most. We have to stimulate a conversation with the public to get their consent.”
