Generative AI’s Impact on Remote Identity Verification: New Threat Intel Report

The premier provider of science-based biometric identity solutions, iProov, has unveiled its latest publication, “The iProov Threat Intelligence Report 2024: The Impact of Generative AI on Remote Identity Verification.” This report delves into the landscape of remote identity verification threats, offering unique insights into digital injection attacks and exposing malicious actor techniques, trends, and consequences. The comprehensive findings are based on data and expert analysis conducted by the iProov Security Operations Center (iSOC).

With digital ecosystems continuing to expand to meet the demand for remote access and services, organizations and governments face an unintended consequence: an ever-growing attack surface. Combined with the ready availability of generative artificial intelligence (AI) tools that criminals can easily weaponize, this has heightened the need for robust remote identity verification. The iProov report sheds light on how bad actors are using advanced AI tools, such as convincing face swaps deployed in tandem with emulators and other metadata manipulation techniques, to create novel and largely uncharted threat vectors.

Generative AI tools make face swaps easy to produce, and because a face swap can manipulate the key traits of an image or video, it poses a significant challenge to identity verification systems. Even though video face-swapping software is available off the shelf, face swaps can still be countered by advanced biometric systems designed to resist such attacks.
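
To make the defensive idea concrete, below is a minimal sketch of one generic countermeasure pattern, challenge-response liveness, in Python. It is an illustration only, not iProov's method: the server issues an unpredictable, short-lived, session-bound challenge (in a real system the challenge would drive something observable in the captured video, such as an illumination sequence), so pre-recorded or pre-generated face-swap media cannot have anticipated it. All names, fields, and parameters are hypothetical.

```python
import hashlib
import hmac
import secrets
import time

# Minimal challenge-response sketch (hypothetical, not iProov's method).
# Unpredictable, single-use, session-bound challenges defeat replayed or
# pre-generated media because the attacker cannot know them in advance.

SERVER_KEY = secrets.token_bytes(32)   # per-deployment secret (assumption)
CHALLENGE_TTL = 30                     # seconds a challenge stays valid

def issue_challenge(session_id: str) -> str:
    """Issue an unpredictable, time-stamped challenge bound to one session."""
    nonce = secrets.token_hex(8)
    ts = str(int(time.time()))
    tag = hmac.new(SERVER_KEY, f"{session_id}:{nonce}:{ts}".encode(),
                   hashlib.sha256).hexdigest()[:16]
    return f"{nonce}:{ts}:{tag}"

def verify_challenge(session_id: str, challenge: str) -> bool:
    """Accept only fresh, untampered challenges for the claimed session."""
    try:
        nonce, ts, tag = challenge.split(":")
        fresh = time.time() - int(ts) <= CHALLENGE_TTL
    except ValueError:
        return False
    expected = hmac.new(SERVER_KEY, f"{session_id}:{nonce}:{ts}".encode(),
                        hashlib.sha256).hexdigest()[:16]
    return fresh and hmac.compare_digest(tag, expected)

if __name__ == "__main__":
    challenge = issue_challenge("session-123")
    print(verify_challenge("session-123", challenge))  # True
    print(verify_challenge("session-456", challenge))  # False: wrong session
```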

In 2023, however, malicious actors exploited a gap in some systems by using cyber tools such as emulators to conceal the presence of virtual cameras, making fraudulent feeds harder for biometric solution providers to detect. As a result, face swaps delivered through emulators became the preferred combination of tools for identity fraud.
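
As a simplified illustration of why concealing a virtual camera matters, the hypothetical heuristic below flags injection risk from client-reported metadata, which is exactly the kind of signal an emulator can rewrite. The signature lists and field names are assumptions made for this sketch, not any vendor's actual detection logic.

```python
# Hypothetical heuristic for flagging virtual cameras and emulators from
# client-reported metadata. Signature lists and field names are illustrative
# assumptions; an emulator can rewrite all of them, which is precisely why
# such checks alone are insufficient.

KNOWN_VIRTUAL_CAMERAS = {"obs virtual camera", "manycam", "xsplit vcam"}
EMULATOR_MODEL_HINTS = {"generic", "sdk_gphone", "emulator", "goldfish"}

def flag_injection_risk(metadata: dict) -> list[str]:
    """Return risk indicators found in client-reported device metadata."""
    indicators = []
    camera = metadata.get("camera_label", "").lower()
    model = metadata.get("device_model", "").lower()

    if any(sig in camera for sig in KNOWN_VIRTUAL_CAMERAS):
        indicators.append("virtual_camera_signature")
    if any(hint in model for hint in EMULATOR_MODEL_HINTS):
        indicators.append("emulator_fingerprint")
    return indicators

if __name__ == "__main__":
    sample = {"camera_label": "OBS Virtual Camera",
              "device_model": "sdk_gphone64_x86_64"}
    print(flag_injection_risk(sample))
    # ['virtual_camera_signature', 'emulator_fingerprint']
```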

Andrew Newell, Chief Scientific Officer at iProov, notes that generative AI tools hand threat actors a significant productivity boost and underscores the urgent need for highly secure remote identity verification. While the report highlights face swaps as the current deepfake of choice, Newell acknowledges that future threats are hard to predict, emphasizing the importance of continuously monitoring for and identifying new attack patterns.

The report also examines the evolution of digital injection attacks, with emulators and metadata spoofing seeing a significant uptick in 2023. In these attacks, the attacker mimics a user's device, most often a mobile phone; they have evolved rapidly and pose substantial new threats to mobile platforms.
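
One common defensive response, sketched below under assumed field names and rules, is to cross-check what the client claims about itself against signals the server observes independently; spoofed metadata often contradicts either itself or the network it arrives from.

```python
# Minimal consistency check between client-reported metadata and server-side
# observations. Field names and rules are hypothetical; production systems
# correlate far more signals than this sketch shows.

def find_inconsistencies(claims: dict, observed_network: str) -> list[str]:
    """Return mismatches between client claims and observed signals."""
    issues = []
    os_name = claims.get("os", "").lower()
    user_agent = claims.get("user_agent", "").lower()

    # A client claiming iOS but sending an Android user agent is suspect.
    if os_name == "ios" and "android" in user_agent:
        issues.append("os_vs_user_agent_mismatch")
    # A "mobile phone" connecting from a datacenter network is a red flag
    # often associated with emulator farms (illustrative rule).
    if claims.get("form_factor") == "mobile" and observed_network == "datacenter":
        issues.append("mobile_claim_from_datacenter")
    return issues

if __name__ == "__main__":
    claims = {"os": "iOS",
              "user_agent": "Mozilla/5.0 (Linux; Android 14)",
              "form_factor": "mobile"}
    print(find_inconsistencies(claims, observed_network="datacenter"))
    # ['os_vs_user_agent_mismatch', 'mobile_claim_from_datacenter']
```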

Threat actors also became markedly more collaborative and sophisticated between 2022 and 2023, with indiscriminate attacks occurring at rates of 50,000 to 100,000 per month. The number of threat actor groups grew substantially, as did the sophistication of the tools they use. Notably, nearly half of the identified groups (47%) were formed in 2023, underscoring the collaborative approach threat actors have adopted.

The report identifies two primary attack types: presentation attacks and digital injection attacks. New trends for 2023 include a significant increase in packaged AI imagery tools, which streamline the launch of attacks, and a remarkable 672% surge from H1 to H2 2023 in the deployment of deepfake media, such as face swaps, alongside metadata spoofing tools. Pairing presentation or digital injection attacks with traditional cyber attack tools like metadata manipulation poses a substantial threat.

The report concludes with case studies of prolific, anonymized threat actor personas, evaluating their methodologies, level of effort, and attack frequency. These case studies provide valuable intelligence that helps iProov harden its biometric platform and minimize the risk of exploitation for organizations engaged in present and future remote identity verification transactions.
