
Column: Whatever you do, don’t say yes when this chatbot asks, ‘Can you hear me?’


It’s the most cunning robocall scam I’ve encountered — and the fact that I’ve fallen for it more than once tells you how successful it can be.

The phone rings. You pick it up and say “hello.” There’s a brief silence and then a woman’s voice says, “Oh, hi there!” She offers an embarrassed laugh. “I’m sorry, I was having a little trouble with my headset!”

I’ve gotten this call a number of times in recent weeks, at home and at work, and each time I’ve been suckered by the lifelike opening into staying on the line longer than I normally would for a robocall or a telemarketing pitch. It’s only when I recognize that I’ve heard the exact same thing before that I realize I’m listening to a recording.


This is a new and highly sophisticated racket known as the “can you hear me” scam, which involves tricking people into saying yes and using that recorded affirmation to sign them up for stuff they didn’t order.

It’s also an indication of what can be expected in the future from scammers and telemarketers as automated “conversational agents,” or chatbots, play an increasingly large role in interacting with humans.

I’ve spoken with a number of experts in the field, and they all say natural-speech technology is advancing so quickly that it may be only a few years until we won’t be able to tell if we’re speaking with a machine.

“On every front in development of conversational agents, there’s a huge emphasis on making them more sociable,” said Marilyn Walker, a professor of computer science at UC Santa Cruz with decades of experience in natural language processing.

“This stuff is all coming together now in a way that’s getting very close to artificial intelligence,” she said.

The “can you hear me” scam doesn’t seem to be using that level of technical achievement, but it displays a sneaky savviness about how to manipulate people.


Dan Weld, a professor of computer science and engineering at the University of Washington, said the techniques employed in the calls demonstrate “careful human engineering with an understanding of the human dynamics of conversations and what will sound natural.”

In other words, you won’t know it’s a robocall until it’s too late.

As the scam plays out, the recorded voice will raise the possibility of a vacation or cruise package, or maybe a product warranty. She’ll ask if you could answer a few questions. Or she’ll make it sound like her headset is still giving her trouble and say, “Can you hear me?”

Don’t say yes.

Police departments nationwide have warned recently that a recording of your affirmative response can be edited to make it seem you’ve given permission for a purchase or some other transaction. There haven’t been many reports of losses, but a Washington state man reportedly was bilked out of about $100.

A recorded “yes” could also be used to deny refunds to any consumer who complains.

“If someone calls and asks, ‘Can you hear me?’, do not answer yes,” advised the Better Business Bureau. “Just hang up. Scammers change their tactics as the public catches on, so be alert for other questions designed to solicit a simple yes answer.”

Walker, the UC Santa Cruz computer whiz, has been teaching computers how to speak since the 1980s, when she worked as a researcher for the Natural Language Project at Hewlett Packard Laboratories in Palo Alto. She’s also done stints at Mitsubishi Electric Research Laboratories in Cambridge, Mass., and AT&T Labs in New Jersey.

Talking machines have been epitomized for years by the automated switchboards that drive most consumers crazy. But Walker said we’re seeing the next iteration of speech technology in the likes of Apple’s Siri and Amazon’s Alexa — devices that can respond to users’ requests and, to a limited extent, give the impression of conversation.


The next step, she said, will be computers that respond to voice commands to perform multiple tasks across multiple websites or platforms. An agent could, for example, book airline seats, a hotel and a rental car without a human having to look at a screen or touch a keyboard.

“The vision right now for conversational agents is moving seamlessly among various tasks,” Walker said.

She acknowledged that as the technology improves and becomes more commonplace, it almost certainly will be embraced by telemarketers and scammers to try to dupe people into thinking they’re speaking with a real person, thus making a questionable sales pitch all the more believable.

“That’s clearly not out of the realm of possibility,” Walker said.

She said machines become more human-sounding the more they can be taught to pepper conversations with the occasional “um” or “uh-huh,” or to laugh at the right moment. They’ll soon convey what sounds like emotion and will adjust their vocal pitch to match the context of the discussion.

“These things are all being pursued,” Walker said.

She’s leading a team of grad students competing for the first Alexa Prize, an award Amazon is offering to the university team that can come up with a “socialbot” capable of genuine chitchat.

Each of 12 sponsored teams has received $100,000 from Amazon to fund its work. The team with the best-performing bot will win $500,000. An extra $1 million will go to the team’s school if its socialbot “converses coherently and engagingly with humans on popular topics and news events for 20 minutes.”


Obviously, any technical advances will be considered for future versions of Alexa.

I asked Art Pettigrue, an Amazon spokesman, if the ultimate goal of the contest is to produce a machine capable of speaking like HAL 9000 in “2001,” albeit without the homicidal tendencies.

He declined to go that far. But Pettigrue said that “we’re really at a tipping point for so many elements of the technology.

“We’re in a golden age of machine learning and AI,” he said. “We’re still a long way from being able to do things the way humans do things, but we’re solving unbelievably complex problems every day.”

Think the “can you hear me” scam sounds devious? Just you wait.

David Lazarus’ column runs Tuesdays and Fridays. He also can be seen daily on KTLA-TV Channel 5 and followed on Twitter @Davidlaz. Send your tips or feedback to david.lazarus@latimes.com.

