Cybercriminals have found a way to leverage emerging technology to ensnare new prey: earnest and unsuspecting leaders and employees at companies everywhere. Bad actors are using artificial intelligence (AI) to craft sophisticated scams that fool employees and C-suite executives, impersonating their colleagues through spoofed emails, cloned voices, manipulated photos and AI-generated video.
The results are mind-blowing in both their effectiveness and their financial destructiveness.
In one shocking real-life example, fraudsters used AI to create fake likenesses of finance leaders on a multi-person video conference call. The result? A member of the finance team sent $25 million to the scammers because he believed he was on a real video call with the company’s CFO and other senior colleagues.
While most people say they’d never fall for such an obvious scam, the rise of AI technology is making it easier for cybercriminals to create elaborate “deepfake” schemes — and making it that much harder for people to distinguish the authentic from the deceptive.
The prevalence of these scams — and the financial fallout — continues to rise. Many insurance carriers say funds transfer fraud now rivals ransomware as their top cyber claim. In 2023, funds transfer fraud (FTF) and business email compromise (BEC) made up 56% of all cyber claims.
FTF claim frequency increased 15% and claim severity rose 24% between 2022 and 2023, with losses averaging $275,000 per claim. And with $2.9 billion in reported losses in 2023, business email scams were the second-costliest type of cybercrime, behind only investment scams.
Employees: The million-dollar target
Employees at any level of a company are conditioned to respond, often with lightning speed, when a request from a senior-level executive comes across their desk, and that makes them the perfect target. They take directions and act quickly without considering that the request could be a scam, especially when they believe they’re seeing and talking to their CEO or someone else in the C-suite.
The impact can be crushing. Not only can companies face a devastating financial blow or fines for releasing sensitive information, but deepfake technology can also sabotage deals or allow online fraudsters to gain access to a company’s network and data. In most cases, the scam isn’t discovered until it’s too late to avoid the consequences.
These hoaxes can also threaten someone’s brand, reputation and even their family’s physical and financial safety. From using AI to create fake nude images of celebrities and videos of well-known figures falsely endorsing products, to cloning famous voices for promotional use and leveraging generative AI to bully and harass, bad actors have found innumerable ways to create new headaches for individuals and businesses alike.
The HUB EDGE
The prevalence and sophistication of deepfake scams will only increase. Organizations need to take steps now to evaluate their exposures and create a risk management plan tailored to their industry and circumstances.
In evaluating and planning for risk, remember that:
- The transaction is the target. Educate employees to be skeptical of all requests to transfer sensitive data or funds or modify vendor banking information — and authenticate everything. If your CFO never contacts you and suddenly asks for a large wire transfer on a Zoom call, the alarm bells should be ringing. Be the person who calls the CFO’s cell phone while still on the Zoom call to ensure she made a funds transfer request. Phone the personal extension of the vendor who “called” to ask for a bank routing change. Verifying doesn’t have to be sophisticated, but it absolutely needs to happen.
- Prevent fraudsters from cracking the code. Use code words! Establish a secret phrase or password in an in-person meeting that’s required to authenticate transactions. While this could be applicable in some business settings, it’s most appropriate for high-net-worth families, family offices and smaller organizations, where only a small number of individuals will be privy to the code and able to verify a funds transfer.
- Protocols and policies are pivotal for prevention. Businesses should consider advanced security protocols and tools, including deepfake detection tools, and ensure their IT application and operating system providers have the latest resources for deepfake detection and deterrents. In addition, companies should maintain strict policies that outline internal controls and authentication procedures for all financial transactions, particularly those involving fund transfers, and require ongoing training for everyone on recognizing bad actors.
- Speak up if the subject looks suspicious. It’s nearly impossible to erase all signs of a deepfake. Pay attention to the person’s eyes and eyebrows. Strange shadows, too much glare on glasses or glare that doesn’t change when a person moves and blinking that appears too infrequent or too often could indicate that the individual on your screen is a deepfake.
- The right coverage is crucial. Even with the right policies, detection tools and training, no organization is immune to an employee falling for a deepfake. You need the right insurance to ensure you’re covered for funds transfer fraud, invoice manipulation or social engineering scams. You also need an experienced broker who can provide risk mitigation resources and crisis response coverage — including access to world-class vendors such as forensics firms and privacy attorneys — that can minimize the financial and reputational damage of cyber scams.
We are in the nascent stages of deepfake technology, and it’s only going to become more sophisticated and harder to distinguish the real from the fake. Do you think you can spot it? And, more importantly, are you confident your employees can?
If you have questions about how to protect your pharmacy operations with cybersecurity insurance, please contact the OPA HUB team at [email protected] or 1-855-672-7672.
This blog post contains content that was originally published by HUB and is used here with permission. The content has been modified from its original form for OPA’s insurance products.