From dating sites to phishing emails: How AI is creating more realistic scams

Artificial intelligence is helping scammers create more realistic scams, targeting consumers through fake dating profiles and deceptive phishing emails.


From dating sites to phishing emails, bad actors are taking advantage of artificial intelligence to create more realistic scams. 

They are now infiltrating dating apps to build fake relationships and eventually trick victims into sending money. Scammers use bots at scale to create massive numbers of accounts, then use AI to chat with victims in a way that feels completely authentic, Kevin Gosschalk, CEO of cybersecurity company Arkose Labs, told FOX Business.


With AI, they "are able to… perfectly speak to a person to the point where they feel like the victim is kind of on the hook," Gosschalk said. "Then they hand it over to a human operator to kind of do the final half a mile in terms of figuring out how to scam the person into giving money." 

This is a trend that Arkose Labs, which helps businesses with bot prevention and account security, has seen pop up over the last few months. 

Gosschalk said these schemes are also emotionally devastating, because victims who believe they are deep in a relationship become comfortable sending money.


Another issue is that phishing scams have become more realistic. Before AI, phishing messages and emails were often written in broken English, which made them easier to spot as fake, according to Gosschalk.

"We're now seeing them use generative AI to actually craft better-looking messages," he said. "The grammar they use now is basically perfect."

Unethical sellers are also using the technology to generate large volumes of more realistic, hard-to-detect reviews to prop up their reputation and sales. On top of that, there have been instances of fake, AI-generated product listings on e-commerce marketplaces. 

These "fake sellers, fake products, fake images, that deceive a consumer into buying something that turns out to be much different than what was depicted – if they receive anything at all," he warned. 


Scams using AI aren't completely new. Scammers have already been using deepfakes created from consumers' voices, recorded from sources like YouTube videos or from calls in which the scammers posed as telemarketers, Gosschalk added.

It's become such an issue that companies are now concerned that their "CEO's voice will be leveraged from conferences, when they're on stage giving a speech, for example," Gosschalk said, adding that with AI, "scammers could transform that CEO's voice to social engineer employees." 


The problems are expected to proliferate in the new year. With 2024 an election year, the company projects that bad actors will try to leverage the technology "to run sophisticated influence campaigns, propagate misinformation, confuse and misdirect the public about issues and candidates."

One thing holding scammers back from using AI and generative AI now is the cost of compute, which in some cases is high enough to eat into their return on investment.

Those costs, however, are expected to fall, meaning "2024 will be the year of the AI-generated scam, at scale," Gosschalk said.
