“…Algorithms are being deployed across the public sector, where they inform decisions about which child maltreatment cases to investigate [66,101], which families should be offered preventive services [106], which neighborhoods require more policing [17,55], and who is offered public housing and unemployment benefits [43,70], among others. However, audits of these systems show that they produce worse outcomes than the processes they replace [29,97], embed human biases in decision-making [107], and exacerbate racial and socio-economic disparities [43,68,94]. Algorithmic systems are causing real harm to vulnerable people, who are experiencing cuts to their unemployment and healthcare benefits [36,43], losing public housing [125], and facing unfair investigations by the child welfare system [59,114].…”