GenAI Attacks Emerge as a Growing Cybersecurity Challenge

Gartner’s latest survey reveals that 62% of organizations faced deepfake attacks and nearly a third experienced prompt-based or GenAI infrastructure attacks, underscoring how AI-driven threats are rapidly rising.

A new Gartner survey shows that generative AI (GenAI) is reshaping the cyber threat landscape at an alarming pace. The study, which polled 302 cybersecurity leaders in North America, EMEA, and Asia/Pacific between March and May 2025, highlights how organizations are increasingly vulnerable to GenAI-driven attacks.

According to the findings, 62% of organizations reported experiencing a deepfake attack in the past year. These incidents often involved social engineering or the exploitation of automated processes, signaling how quickly deepfakes have moved from novelty to mainstream risk. Additionally, 29% of cybersecurity leaders confirmed their organizations faced attacks targeting enterprise GenAI application infrastructure, while 32% reported prompt-based attacks on AI applications. The latter include adversarial techniques that manipulate large language models and multimodal systems into generating biased or malicious outputs.

Experts caution against overreacting with sweeping changes to cybersecurity strategies. “As adoption accelerates, attacks leveraging GenAI for phishing, deepfakes and social engineering have become mainstream, while other threats — such as attacks on GenAI application infrastructure and prompt-based manipulations — are emerging and gaining traction,” noted Prashast Gupta, Director Analyst at Gartner. He advises a balanced approach: strengthening existing controls while implementing targeted safeguards for new categories of risk.

The report underscores that as GenAI becomes embedded in business operations, cybersecurity teams must anticipate both current and emerging risks. Rather than relying on reactive measures, organizations are encouraged to adopt a proactive, layered defense that minimizes threats while maximizing AI's potential.
