Threat Library
AI impersonation fraud is no longer theoretical. Every scenario below has a documented real-world analogue — a company that lost money, data, or access because they trusted a voice, face, or email without a second channel of verification.
"Send this wire now" — urgent email from the CFO
Attackers compromise or spoof executive email accounts and create urgency to bypass normal approval processes.
[Email Compromise]

Vendor calls to update their banking details
A fraudster impersonates a known vendor over phone or email, redirecting future payments to a controlled account.
[Impersonation]

Real estate attorney sends new closing instructions
Email accounts of attorneys or title companies are compromised, and last-minute wiring instructions are changed.
[Wire Fraud]

AI-cloned CFO voice authorizes an emergency transfer
Using just minutes of public audio, AI tools can clone an executive's voice to authorize transactions over the phone.
[Voice Clone]

Deepfake CFO on video call approves a $25M wire
The Arup/Hong Kong incident: a finance worker was deceived by a full video call featuring deepfake recreations of colleagues.
[Deepfake Video]

IT manager requests admin credentials over Slack
Attackers compromise Slack or Teams accounts and use them to request privileged credentials from IT staff.
[Account Takeover]

"Your CEO" approves emergency system access on a video call
Real-time deepfake video tools allow attackers to impersonate executives during live video calls to authorize access.
[Deepfake Video]

Colleague asks you to share the client database — via text
Compromised phone numbers or messaging accounts are used to request sensitive data exports from trusted colleagues.
[Data Exfiltration]

Fake IT support requests remote desktop control
Attackers clone the voice of a known IT contact and call employees to request remote access under the guise of support.
[Voice Clone]

North Korean IT worker hired via deepfake job interview
Documented FBI-warned scheme: state-sponsored actors use AI face-swap during video interviews to gain insider access.
[Identity Fraud]

Fake employee gains system access for months undetected
Once inside, fraudulent employees exfiltrate data, install backdoors, or sabotage systems over extended periods.
[Insider Threat]

Executive voice memo authorizes a new hire or termination
AI-cloned audio of an executive can be used to issue HR directives — hiring, firing, or salary changes — without their knowledge.
[Voice Clone]

Attorney instructs you to sign and return a contract — by email
Compromised legal counsel email is used to send fraudulent contracts or redirect signed documents to attackers.
[Impersonation]

Manager approves sensitive personnel data release over phone
A cloned manager voice calls HR to authorize release of employee records, salary data, or personal information.
[Deepfake Audio]

Fake board member approves a policy change via video
Governance attacks target board-level decisions — using deepfake video to impersonate directors during remote meetings.
[Deepfake Video]

Deepfake face bypasses facial recognition at building entry
AI-generated faces or video loops are used to defeat facial recognition systems at physical access points.
[Biometric Fraud]

Synthetic identity used to open a business bank account
AI generates fully synthetic identities with realistic documents, photos, and credit histories to open fraudulent accounts.
[Identity Synthesis]

Cloned voice defeats bank voice authentication
Financial institutions using voice biometrics for authentication are increasingly vulnerable to AI voice cloning attacks.
[Voice Clone]

AI Video Generation — The Emerging Frontier
[EMERGING THREAT]

Real-time AI tools can now generate a live video feed of a fake co-worker using only a few photos. A fraudster can appear on a video call as your CFO, CEO, or trusted colleague — giving verbal and visual approval for a wire transfer, policy change, or system access grant — then disappear without a trace.
In 2024, a finance worker at engineering firm Arup paid out $25 million after a video call featuring deepfake recreations of his CFO and other colleagues. The scam was only discovered weeks later. In 2025, Singapore saw a $499K deepfake CEO video scam. These are no longer edge cases.
→ LiveLock's out-of-band challenge cannot be captured through a compromised video feed. The one-time word appears only in the real person's registered, device-bound app — never on any screen an attacker can see.
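The out-of-band pattern described above can be sketched in a few lines. This is an illustrative sketch only, not LiveLock's implementation: the word list, `issue_challenge`, `verify_response`, and the device registry are all hypothetical names, and real delivery would go through an authenticated push channel to the device-bound app.

```python
import secrets

# Hypothetical one-time challenge words; a real system would use a larger list.
WORDLIST = ["ember", "granite", "willow", "cobalt", "meadow", "onyx"]

# Hypothetical registry mapping a person to their registered, device-bound app.
REGISTERED_DEVICES = {"cfo@example.com": "device-token-abc123"}

def issue_challenge(requester: str) -> tuple[str, str]:
    """Pick a fresh one-time word and deliver it ONLY to the requester's
    registered device -- never over the channel (email, call, video)
    where the request arrived, so a deepfake on that channel never sees it."""
    word = secrets.choice(WORDLIST)
    device = REGISTERED_DEVICES[requester]
    # push_to_device(device, word)  # out-of-band delivery (hypothetical)
    return word, device

def verify_response(expected: str, spoken: str) -> bool:
    """The real person reads the word back on the original channel.
    Constant-time comparison avoids leaking partial matches."""
    return secrets.compare_digest(expected.lower(), spoken.strip().lower())
```

The design point is the channel split: the challenge travels over a path the attacker does not control, so intercepting or synthesizing the video call is not enough to answer it.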