Predictim’s New Babysitter Background Check Faces Racial Bias and FCRA Controversy

Will parents soon have a new way to screen their babysitters? A startup company called Predictim has created a technology that uses artificial intelligence to vet babysitter candidates. The system uses an algorithm to collect data on a person, pulling details from social media profiles, online criminal history databases, and other online sources. Predictim rates each candidate on a “risk level” scale. Parents can use this information to decide whether to hire a babysitter. The platform’s launch has been mired in controversy, from racial bias accusations to FCRA compliance challenges.

Predictim claims its AI algorithms collect and consider “billions of data points” when preparing risk assessment reports. Violent crime will significantly increase a candidate’s risk profile according to Predictim’s algorithm. However, the company says that its technology also focuses heavily on social media, looking for “content that is aggressive, abusive, explicit, or offensive.” The presence of this kind of content on a person’s Twitter or Facebook will result in a higher risk score.

When a parent or family member runs a potential babysitter through Predictim, the service will return scores in a variety of categories. These categories include “Bullying/Harassment,” “Disrespectful Attitude,” “Explicit Content,” and “Drug Abuse.” Scores range from 1 (very low risk) to 5 (very high risk) and fluctuate based on a person’s background and online activity. If someone’s social media account scores high in the “Bullying/Harassment” category, the person who ordered the check can ask Predictim to show the posts that prompted the high score.

A recent Gizmodo article called into question whether Predictim can do what it does without bias. The author of the article tested the software by using it to screen two people: his actual babysitter, who is a black woman, and a friend who “routinely spews vulgarities.” The former was deemed “riskier” than the latter, largely due to a few Twitter jokes. The author of the Gizmodo piece asked if Predictim’s algorithms had internalized systemic biases that might disadvantage prospective babysitters who are minorities. Racial bias aside, the Predictim algorithm didn’t seem to have a sense of context or sarcasm.

Predictim may have an even bigger problem on its hands: complying with the Fair Credit Reporting Act (FCRA). When the service initially launched, parents could initiate searches by doing nothing more than providing payment information and typing in the name of a prospective babysitter. Predictim would then pull huge quantities of personal data about the candidate: addresses, phone numbers, email addresses, social media accounts, names of relatives, and more.

At launch, this process would trigger an email to the babysitter asking him or her to consent to the background check. The creators of Predictim ultimately removed this feature because parents didn’t want to tell prospective sitters what they were doing.

The FCRA requires employers to notify their candidates (and obtain written consent) before conducting background checks. While there is some gray area to this requirement with babysitters since they are not formal employees of the families they work for, it is still best practice to obtain consent before vetting a prospective sitter. Predictim added a notice to its site warning users that they could not utilize the service “to make decisions about employment, admission, consumer credit, insurance, tenant screening, or any other purpose that would require FCRA compliance.” Predictim then paused its launch and pulled its service from the web, in part because of criticism from articles like the one on Gizmodo.

At backgroundchecks.com, we have a partnership with Peoplefinders.com that makes it easy to run thorough criminal history checks on other people, including babysitters. These checks won’t include social media information or other data that might be viewed as an invasion of privacy. We always recommend getting consent before using this service.

 

Sources: https://gizmodo.com/predictim-claims-its-ai-can-flag-risky-babysitters-so-1830913997

https://www.predictim.com


Michael Klazema

About the author

Michael Klazema is the lead author and editor for Dallas-based backgroundchecks.com, with a focus on human resource and employment screening developments.
