The Justice Department is reportedly examining an algorithm used by a Pennsylvania county child welfare agency to determine which child neglect allegations merit a formal investigation, following a series of complaints that the algorithm unfairly targets parents with disabilities. While the county says the algorithm is intended to reduce human error in child welfare screening decisions, critics say the tool puts parents with disabilities, who are already disproportionately investigated by child welfare agencies, at risk of unnecessary government intervention.
According to the Associated Press, in 2016 Allegheny County, home to Pittsburgh, began using the Allegheny Family Screening Tool (AFST), an algorithm designed to help social workers better identify which families should be investigated for child neglect, a broad term that covers everything from leaving children unattended and lack of food to frequent absences from school.
The tool pulls data from “Medicaid, substance abuse, mental health, jail and probation records, among other government data sets,” and generates a Family Screening Score. According to the county’s website, a high score indicates a high probability that the child will eventually be removed from the home by state authorities. “When the score is at the highest levels, meeting the threshold for a ‘mandatory review,’ the allegations in the call must be investigated,” the county’s website states.
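To make that description concrete, here is a minimal, purely illustrative sketch of how a records-based screening score with a mandatory-review cutoff might be structured. The feature names, weights, and threshold below are hypothetical assumptions, not the actual AFST model, and the 1-to-20 score range simply mirrors how such scores are commonly reported.

```python
# Illustrative sketch only: all feature names, weights, and the cutoff
# below are hypothetical. This is NOT the actual AFST model, which is
# trained on historical county data and is not reproduced here.

from dataclasses import dataclass


@dataclass
class FamilyRecord:
    # Counts of flags drawn from the kinds of government data sets the
    # AP describes (Medicaid, substance abuse, mental health, jail and
    # probation records), plus prior referrals to the hotline.
    medicaid_flags: int = 0
    substance_abuse_flags: int = 0
    mental_health_flags: int = 0
    justice_system_flags: int = 0
    prior_referrals: int = 0


# Hypothetical hand-set weights; a real tool would learn these from data.
WEIGHTS = {
    "medicaid_flags": 1.0,
    "substance_abuse_flags": 2.5,
    "mental_health_flags": 2.0,
    "justice_system_flags": 2.5,
    "prior_referrals": 3.0,
}

MAX_SCORE = 20                # scores of this kind are commonly reported on a 1-20 scale
MANDATORY_REVIEW_CUTOFF = 18  # hypothetical threshold for a "mandatory review"


def screening_score(record: FamilyRecord) -> int:
    """Map weighted flag counts onto a 1-20 score (illustrative only)."""
    raw = sum(WEIGHTS[name] * getattr(record, name) for name in WEIGHTS)
    return max(1, min(MAX_SCORE, round(raw)))


def requires_mandatory_review(record: FamilyRecord) -> bool:
    """At or above the cutoff, the allegation must be investigated."""
    return screening_score(record) >= MANDATORY_REVIEW_CUTOFF
```

The point of the sketch is the decision structure rather than any particular numbers: once a family’s score clears the cutoff, the screening worker loses discretion and an investigation becomes mandatory, which is why critics focus so heavily on which data sets feed the score in the first place.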
According to the AP, the Justice Department has been receiving complaints about the algorithm since at least last fall. The complaints primarily center on the algorithm’s use of disability-related data in calculating the Family Screening Score, a practice that could unfairly penalize parents with disabilities and potentially violate the Americans with Disabilities Act.
The county appears to confirm that its algorithm factors disabilities into the score, telling the AP that disability-related data is included because it “predicts outcomes,” and adding that “it should come as no surprise that parents with disabilities … may also have the need for additional support and services.”
The full extent of the Department of Justice’s involvement is not known. However, two anonymous sources told the AP that attorneys from the Justice Department’s Civil Rights Division urged them to file formal complaints detailing their concerns about how the algorithm could reinforce bias against people with disabilities, including families with mental health problems.
Allegheny County argues that its algorithm is simply a tool used to facilitate the screening of families for possible child welfare investigations, insisting that the tool was responsibly designed. “The design and implementation of AFST was a multi-year process that included careful procurement, community meetings, a validation study, and independent and rigorous process and impact assessments,” according to the county’s website. “In addition, the resulting model underwent ethical review prior to implementation.”
But critics argue that such algorithms often unfairly target families because of their race, income or disability. “When you have a technology that’s designed by humans, bias is going to show up in the algorithms,” Nico’Lee Biddle, a former Allegheny County child welfare worker, told the AP last year as part of its investigation into the Family Screening Tool. “If they designed a perfect tool, it really doesn’t matter, because it was designed from very imperfect data systems.” Last June, Oregon discontinued a similar algorithm over concerns that it was racially biased.
Parents with disabilities are already at increased risk of losing their children to state custody. While Allegheny County’s algorithm may be intended to help social workers make better decisions, it risks further entrenching prejudice against those parents.
“I think it’s important for people to be aware of their rights,” Robin Frank, a family law attorney representing an intellectually disabled man whose daughter was taken into state custody, told the AP. “And to the extent that we don’t have a lot of information when there are seemingly valid questions about the algorithm, it’s important to have some oversight.”