Voice Impersonation and DeepFake Vishing in Realtime

By NCC Group

30 September 2025

Recent and ongoing advances in AI mean that the danger posed by voice cloning attacks is greater than ever: an attacker can create a realistic voice clone from as little as five minutes of recorded audio.
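
To illustrate how low the barrier to entry has become, the sketch below uses the open-source Coqui TTS library (XTTS v2), one of several freely available voice cloning models, to synthesize speech in a target's voice from a short reference recording. The file names and the spoken text are placeholders for illustration; this is not the tooling used in our engagements.

```python
# Minimal voice cloning sketch using the open-source Coqui TTS library.
# Requires: pip install TTS
from TTS.api import TTS

# Load the multilingual XTTS v2 model (downloads weights on first use).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Clone the voice from a short reference clip of the target speaker
# ("target_voice_sample.wav" is a placeholder) and synthesize new speech.
tts.tts_to_file(
    text="Hi, it's me. I'm locked out of my account, can you reset it for me?",
    speaker_wav="target_voice_sample.wav",
    language="en",
    file_path="cloned_output.wav",
)
```

With a consumer GPU, models of this kind can generate convincing output in seconds, which is what makes the real-time attacks described in this article practical.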

This article demonstrates the very real risks of voice cloning in the context of vishing (voice phishing) attacks, describing the techniques and technologies we use in our attack simulation services.

For more information, see: https://www.nccgroup.com/technical-assurance/social-engineering-prevention/