Councils harvest data on hundreds of thousands of people to predict child abuse


Data on at least 377,000 UK residents is being amassed for use in predictive analytics schemes designed to flag potential child abuse, The Guardian reports.

The data will be analysed through algorithmic profiling to predict where child abuse might take place, allowing authorities to intervene before it occurs. The scheme is being billed as a means of assisting social workers.

Despite good intentions, the data-sharing endeavour is likely to stir up more than a little controversy, given its potential to infringe individual privacy.

The scheme is still relatively nascent, although The Guardian found that at least five local authorities have developed or implemented data-fuelled predictive analytics systems intended to protect children from abuse.

As for the kind of data sought, it's both eclectic and expansive. Councils have obtained information on everything from police records on antisocial behaviour and domestic violence to housing association repair and arrears data, as well as school attendance and exclusion records. Some of these datasets were, however, discarded from the final algorithmic profiling models.
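To make the idea of algorithmic profiling a little more concrete, here is a minimal, entirely hypothetical Python sketch of how such disparate datasets might be combined into a single risk score. The field names, weights and threshold are invented for illustration; the councils' actual models are not public, and a real system would learn its parameters from historical outcomes rather than hard-coding them.

```python
# Hypothetical illustration only: a toy risk-scoring sketch, NOT any council's
# actual model. All feature names, weights and thresholds are invented.

from dataclasses import dataclass

@dataclass
class HouseholdRecord:
    antisocial_behaviour_reports: int   # e.g. from police records
    domestic_violence_reports: int      # e.g. from police records
    rent_arrears_months: int            # e.g. from housing association data
    unauthorised_school_absences: int   # e.g. from school attendance data
    school_exclusions: int              # e.g. from school exclusion data

# Invented weights: a real predictive system would estimate these from data.
WEIGHTS = {
    "antisocial_behaviour_reports": 0.5,
    "domestic_violence_reports": 2.0,
    "rent_arrears_months": 0.3,
    "unauthorised_school_absences": 0.2,
    "school_exclusions": 1.0,
}

def risk_score(record: HouseholdRecord) -> float:
    """Combine heterogeneous indicators into a single illustrative score."""
    return sum(weight * getattr(record, name) for name, weight in WEIGHTS.items())

def flag_for_review(record: HouseholdRecord, threshold: float = 5.0) -> bool:
    """Flag a household for a social worker's attention above an arbitrary
    threshold; in practice, any decision would rest with human professionals."""
    return risk_score(record) >= threshold

if __name__ == "__main__":
    example = HouseholdRecord(
        antisocial_behaviour_reports=2,
        domestic_violence_reports=1,
        rent_arrears_months=4,
        unauthorised_school_absences=10,
        school_exclusions=0,
    )
    print(risk_score(example), flag_for_review(example))
```

The sketch also hints at why discarding some datasets matters: every extra field changes the score, so which inputs are kept directly shapes which families get flagged.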

On the question of legality, the endeavour falls under the oversight of the Information Commissioner's Office (ICO), which regulates how organisations, both public and private, use individuals' personal data. Speaking to The Guardian, an ICO spokesperson said the organisation would carry out the requisite checks to ensure local councils complied with data protection law while running the predictive analytics schemes.

Meanwhile, councils that already employ algorithmic profiling are reporting results: Hackney Council recently revealed that its system had flagged 350 families as potentially at risk and in need of protection, while Thurrock Council followed close behind with around 300 similar cases.

As the scheme snowballs, it's sure to generate further controversy. And while it's not without its (sizeable) caveats, in an age where data is routinely harvested to sell us phone cases, synthetic clothing and discount vouchers, using it to identify children at risk of abuse, and to act accordingly, perhaps doesn't sound quite so monstrous after all.