Children are abused, neglected or left in otherwise dangerous living situations every day in the United States. Unfortunately, not all of these cases are reported, and many of those that are reported go uninvestigated. Or, perhaps worse, nothing comes of the investigation.
In reported cases, social workers must gauge whether the children are in immediate danger and need intervention to protect and care for them. The human mind, however, naturally brings its own biases and opinions to individual circumstances, which unfortunately means some children are left in unsafe conditions.
In the past, it has been left to humans to screen each call and allegation of child abuse (often weighing the call against prior reports of abuse or neglect), but that may be changing. Allegheny County, Pennsylvania, which includes Pittsburgh, is the first place in the U.S.A. to "let a predictive-analytics algorithm — the same kind of sophisticated pattern analysis used in credit reports, the automated buying and selling of stocks and the hiring, firing and fielding of baseball players on World Series-winning teams — offer up a second opinion on every incoming call, in hopes of doing a better job of identifying the families most in need of intervention," the New York Times reported.
The algorithm weighs each reported instance, analyzing it for danger indicators and scoring the risk from low to high. While a human screener might rank a family as low risk (meaning an investigation would be unlikely), the program might rank that same family as high risk, necessitating an in-person investigation of the parents and children.
The program is useful because it reduces bias based on race, economic status or the social worker's personal experience with situations like drug use or child abuse. It's not that the screeners of abuse or neglect allegations are incompetent; the problem is that there are so many factors to consider: each child in the home, each person the child lives with, records of abuse, arrests, drug or alcohol use, and so on. It's too much for a person to take in the whole picture without error, according to the New York Times. Child welfare screening is subjective; with predictive analytics, it becomes more objective and therefore more effective.
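To make the idea concrete, here is a minimal, purely hypothetical sketch of how a predictive risk score can combine many case factors into a single low-to-high tier. The factor names, weights and thresholds below are invented for illustration; they are not the actual model or inputs used by the county's screening tool.

```python
# Hypothetical illustration only: a toy weighted risk score, not the
# real screening algorithm (whose factors and weights are not shown here).
# Each factor present in a case contributes its weight to the total.

WEIGHTS = {                         # invented factors and weights
    "prior_reports": 3,
    "prior_substantiated_abuse": 5,
    "caregiver_arrest_record": 2,
    "substance_use_in_home": 4,
    "child_under_five": 2,
}

def risk_score(case: dict) -> int:
    """Sum the weights of every factor flagged True in the case record."""
    return sum(w for factor, w in WEIGHTS.items() if case.get(factor))

def risk_tier(case: dict) -> str:
    """Map the raw score onto a coarse low/medium/high tier."""
    score = risk_score(case)
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# A case with prior reports and substance use in the home scores 3 + 4 = 7,
# landing in the "medium" tier even if a human screener's gut says "low".
case = {"prior_reports": True, "substance_use_in_home": True}
print(risk_score(case), risk_tier(case))
```

The point of the sketch is the design, not the numbers: because every factor is tallied mechanically, two identical case records always receive the identical score, which is exactly the consistency a human screener juggling dozens of factors cannot guarantee.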
Children are often helpless victims of sexual, physical, emotional or verbal abuse, or are neglected or living in otherwise unsafe circumstances.
The New York Times reported: "Nationally, 42 percent of the four million allegations received in 2015, involving 7.2 million children, were screened out, often based on sound legal reasoning but also because of judgment calls, opinions, biases and beliefs. And yet more United States children died in 2015 as a result of abuse and neglect — 1,670, according to the federal Administration for Children and Families; or twice that many, according to leaders in the field — than died of cancer."
We see the news of children who have been abused by a parent or caretaker. Broken bones, broken spirits and even death lace our news feeds with sorrow and hopelessness. Hopefully, this new algorithm can help fill in the gaps of human error to protect and save innocent lives.
How many cases of abuse go unreported? How many children live silently with abuse every day, with no one to stand up for them and protect them? As observant adults, we need to watch the children around us carefully. Listen to what they say and notice changes in behavior. If you see signs of abuse (sexual or physical) or signs of neglect, do something. If you suspect or know of abuse, most states require you to report it.
Yes, we don't want to pry into somebody else's business or jump to conclusions, but what if you don't act and regret it later? Err on the side of caution and report suspected child abuse. It could save a child's life.