A Los Angeles entrepreneur developing technology for social media background checks sees value in trawling the Web to guess how people behave offline, a controversial topic after the San Bernardino terrorist attack led lawmakers to demand online scrubs of visa applicants.
Ben Mones’ Fama Technologies Inc. offers employers an online service that automatically finds profiles of job applicants and flags suspicious public posts. Employers can use the tool to pull phrases that show a bias against women, a bent toward violence or a penchant for bigotry. Each could suggest the person would clash with colleagues and customers, and the data often supplant what traditional background checks turn up, Mones said.
Advertisers, lenders, landlords and many employers agree. They’re among groups digging through online diatribes and adulations to more accurately judge people. Mones hadn’t considered visa authorities as a potential customer when creating his company a year ago, but that they could benefit is no surprise.
“Background checks look for absence of information to draw conclusions about people: They aren’t a drug addict. They aren’t a criminal,” he said. Examining “the absence of information to flesh out who someone is is fundamentally divergent from looking at patterns of historical behavior.”
The Times reported last week that immigration officials recently began testing ways to include social media vetting in their routine. For years, senior officials at the Department of Homeland Security debated whether checking people’s profiles would be too intrusive.
Reviewing apps such as Facebook, Twitter and Instagram may not have stopped the rampage by Tashfeen Malik and Syed Rizwan Farook at the Inland Regional Center -- chats Malik had about Islamic jihad were hidden in private messages. But it might have highlighted more subtle risk factors.
Fama works via machine learning, or feeding a computer tons of data and letting the virtual brains organize it. The computer associates word groupings with six issues for now -- like violence, alcoholism or racism -- that could affect a person’s job abilities. The computer also can analyze images, spotlight keywords such as company names and classify the severity of flagged posts. For example, maybe a company can tolerate moderate bigotry but wants to note every mention of violence. Fama checks Facebook, Twitter, Instagram and Google+.
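Fama's actual models are proprietary and not public, but the workflow described above, matching posts against word groupings for a handful of categories, weighting severity and letting a customer tolerate some categories while flagging others, can be sketched in miniature. The category names, keywords and severity weights below are hypothetical examples, not Fama's:

```python
# Toy illustration of category-based post flagging. Fama's real system is
# a trained machine-learning classifier; this sketch just matches keywords.
# All categories, terms, and severity weights here are invented examples.

# Hypothetical keyword sets per category, each paired with a severity weight.
CATEGORIES = {
    "violence": ({"fight", "attack", "hurt"}, 3),
    "bigotry": ({"slur_example"}, 2),
}

def flag_post(text, tolerate=frozenset()):
    """Return (category, severity) hits for a post, skipping tolerated categories."""
    words = set(text.lower().split())
    hits = []
    for category, (keywords, severity) in CATEGORIES.items():
        if category in tolerate:
            continue  # e.g., an employer willing to overlook this category
        if words & keywords:
            hits.append((category, severity))
    return hits

# A post mentioning violence is flagged; a benign post passes through.
print(flag_post("ready to fight anyone today"))   # [('violence', 3)]
print(flag_post("great coffee this morning"))     # []
```

The `tolerate` parameter mirrors the example in the article: a company could suppress one category entirely while still logging every mention of violence.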
Automatically filtering out benign posts is more efficient than hiring managers scouring by hand, as many do now, Mones said. A State Department official declined to say how social media vetting for visas is conducted.
At Fama, about 95% of searches generated flagged results, Mones said, citing data from the few customers he’s signed up in recent months.
Mones has just three rules. Fama’s technology will scan only public posts. People must be warned about the scan and be allowed to contest results. The requirements are in line with the Fair Credit Reporting Act, he said.
Mones doesn’t fear people racing to mark posts private.
“It’s people who don’t realize there’s something wrong with what they do and what they say -- that’s who our clients are looking for,” he said.
And he insists results should be conversation starters.
People should have a chance to say, “This was a slip of judgment. ... Just look at my other 40 contrasting comments,” Mones said. “You can’t make decisions from an ivory tower about people involving social media.”
Technology challenges loom. Machine learning can be finicky, and applying it to all of the world’s languages is tough. Mones claims progress, saying that, for a start, Fama’s program tracks down the right accounts more often than rival services do.
Teenagers also pose a problem. They’re ratcheting up privacy settings and spending increasing time on more impenetrable networks such as Snapchat. Mones points to Facebook’s continued growth as a sign that social media data mining remains lucrative. Investors are letting him prove it. Fama has raised $1.5 million from Los Angeles investment funds including Amplify.LA, Double M Partners and Wavemaker Partners.