Voice Impersonation and DeepFake Vishing in Realtime

by NCC Group

30 september 2025

Recent and ongoing advances in AI mean that the dangers of voice cloning attacks are greater than ever: an attacker can now create a realistic voice clone from as little as five minutes of recorded audio.
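To illustrate how low the barrier has become, the sketch below uses the open-source Coqui TTS library with its XTTS v2 voice-cloning model, which can synthesize speech in a target voice from a short reference recording. The model, file paths, and prompt are illustrative assumptions for this example, not the tooling used in the engagements described in this article.

```python
# Minimal voice-cloning sketch using the open-source Coqui TTS library
# (pip install TTS). XTTS v2 can clone a voice from a short reference
# clip; longer samples, such as the five minutes mentioned above,
# generally improve fidelity.
from TTS.api import TTS

# Download and load the multilingual XTTS v2 voice-cloning model.
tts = TTS(model_name="tts_models/multilingual/multi-dataset/xtts_v2")

# "target_voice.wav" is a hypothetical recording of the voice being cloned.
tts.tts_to_file(
    text="Hi, it's me. I'm locked out of my account. Can you reset it?",
    speaker_wav="target_voice.wav",  # reference audio of the target speaker
    language="en",
    file_path="cloned_output.wav",   # synthesized speech in the cloned voice
)
```

With nothing more than a consumer GPU and a few minutes of publicly available audio (a conference talk, a podcast, a voicemail greeting), this handful of lines produces audio convincing enough to support the vishing scenarios discussed below.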

This article demonstrates the very real risks that voice cloning poses in vishing attacks, describing the techniques and technologies we use in our attack simulation services.

For more information, see: https://www.nccgroup.com/technical-assurance/social-engineering-prevention/