Exclusive: Age, disability, marital status and nationality influence decisions to investigate claims, prompting fears of ‘hurt first, fix later’ approach
An artificial intelligence system used by the UK government to detect welfare fraud shows bias relating to people's age, disability, marital status and nationality, the Guardian can reveal.
An internal assessment of a machine-learning programme used to vet thousands of claims for universal credit payments across England found it incorrectly selected people from some groups more than others when recommending whom to investigate for possible fraud.
{URL}https://www.theguardian.com/society/2024/dec/06/revealed-bias-found-in-ai-system-used-to-detect-uk-benefits{/URL}
{Author}Robert Booth, UK technology editor{/Author}