The Generational Shift in Cyber Extortion | News | Brit

Article in a snapshot:

  • AI and lowering the barrier to entry
  • Sophisticated groups are getting leaner and faster
  • The dark web as a service economy
  • A generational shift to impatience and weaponised fear
  • What this means for you and your clients

Artificial intelligence is continuing to dominate the conversation around cyber security. The narrative is simple: AI is making attackers more capable, more efficient and harder to stop. But what does that look like in practice? 

To explore how the threat landscape is evolving, we spoke to Bill Hardin of Charles River Associates, who specialises in ransom negotiation and advises organisations navigating live cyber extortion events. Their insight reveals two parallel shifts that matter for you and your clients: the operational impact of AI, and a generational change in how threat actors behave. Both are reshaping the tempo and psychology of cyber attacks.

AI and lowering the barrier to entry

AI is not just making sophisticated groups stronger. It is enabling unsophisticated actors to enter the space. As our digital forensics expert explains: “If you lack programming skills, you can use AI to generate an encryption program or a usable script by asking it to ‘write an encryption program’ or ‘create a script I can run’. It can teach you cryptography. It can show you how to write a simple Python algorithm with encryption aspects to it.”

They go on to confirm that AI-generated encryptors are already being used in real-world attacks. In some cases, decryption works reliably. In others, particularly when complex virtual environments are involved, it becomes unpredictable. “When we get to complicated hypervisors like ESXi, the decryptor’s performance is unpredictable: it may succeed in some cases and fail in others. The AI-generated encryptors continue to learn from their mistakes, and they are improving with each iteration that is asked of them.” Recent examples include:


PromptLock: the first AI-powered ransomware prototype

Security researchers identified one of the first known AI-powered ransomware strains, called PromptLock, which uses a local large language model (LLM) to dynamically generate malicious scripts during an attack. The AI component allows the malware to produce slightly different outputs each time it runs, making traditional detection methods far less effective. PromptLock can automatically enumerate files, inspect data, selectively exfiltrate information and then encrypt systems, all using AI-generated code.


AI-generated ransomware hidden in developer tools

In another case, researchers discovered a malicious extension uploaded to Microsoft’s official Visual Studio Code marketplace. The extension contained ransomware-style functionality, encrypting files and uploading data, and investigators concluded that the code had likely been generated using AI tools.


AI-assisted ransomware groups using coding assistants

Threat intelligence analysis has also found ransomware groups using AI coding assistants to develop and refine their encryption tools. For example, the emerging ransomware group FunkSec relied heavily on AI tools to build its malware, lowering the technical barrier to entry for attackers. 

These examples reinforce that AI is expanding both ends of the threat spectrum, enabling inexperienced attackers to build working encryption tools while allowing organised groups to automate reconnaissance, exfiltration and negotiation. For your clients, this does not necessarily mean every attack is more sophisticated. It means that more attackers can attempt one.

Sophisticated groups are getting leaner and faster

While AI lowers the barrier to entry, established threat groups are using it differently. Some groups now deploy AI agents to scan open-source intelligence for vulnerable organisations at scale. AI is also being used post-breach. Once data is exfiltrated, threat actors are feeding thousands of documents into AI tools to extract leverage. 

Our expert gives their perspective: “Imagine your bad guy takes 100,000 documents and feeds those into an AI agent and asks, ‘pull out the most viable data points. What are the financials like? Is there anything I can use to hold the company hostage?’” This is a material shift. It reduces the manpower required to run complex campaigns: a smaller group can now achieve the output of a much larger team. It also increases pressure during negotiations.

The expert then notes that some groups now use AI bots in ransom discussions. “The language is very formulaic. It is percentage based. And we’ve seen the bot learn. At the beginning we were able to get sizable discounts. Now we’re not, because our adversary has tweaked the algorithm.”

For client conversations on incident response readiness, this means the human element is still critical, but the adversary may no longer be entirely human.

 

The dark web as a service economy

Alongside AI, there is a second enabler: accessibility. The dark web is not a shadowy, technical underworld reserved for elite hackers. It operates more like a marketplace, where services are openly offered. As our expert puts it: “It depends on what you, as the purchaser, are looking for and what you’re trying to accomplish.” Buyers can engage access brokers, purchase encryption tools or stolen data, or even hire attackers outright.

For your clients, this matters because cyber capability is no longer constrained by in-house skill. It can be purchased. More people are accessing the dark web: in 2025 alone, the number of daily users accessing it via Tor rose from 2 million to over 3 million.

 

A generational shift to impatience and weaponised fear

Alongside the greater availability of technology that has enabled the rise in cyber extortion, there has been a noticeable generational shift in how threat actors themselves operate. Historically, groups were often patient in ensuring their demands were met.

The digital forensics expert notes that many attacks now escalate rapidly. Instead of allowing a company to assess backups or consider options, some groups impose short deadlines: “Some threat groups might give you ten days. Others say you’ve got two days and if you don’t pay, I’m going to publicise your name. Once they send the ransom note, they’re calling employees on their work and mobile phones. They are reaching out to personal email addresses and family member contacts. They are using WhatsApp and other messaging services. They are sending emails to current and former employees. If those tactics do not work, they might post the victim’s name to a dark web listing that is then sent out to numerous messaging services. They’re trying to gain the attention of the victim organisation as quickly as possible. We call it giving the threat actor oxygen.”

Younger groups such as Scattered Spider and others have demonstrated an increased willingness to weaponise data immediately. Our expert is candid about the difference. “There’s just no standards. They don’t have a problem publicising data. It’s financially driven.”

Tactics now include contacting CEOs directly, calling family members, emailing staff, working with bloggers, and publicly accusing companies of negligence. The objective is to use speed and fear to get what they want quickly.

What this means for you and your clients

Two dynamics are converging: AI is enabling scale and efficiency, and a younger cohort of threat actors is driving impatience and aggressive escalation. For your clients, the result is a faster, noisier, more psychologically charged extortion environment. This means they should consider:

  • Backup integrity and recovery confidence
  • Clear communication protocols
  • Pre-agreed negotiation strategy
  • Executive preparedness for harassment tactics

The headline is not that AI has changed everything overnight. It is that the tempo of cyber events is increasing, and the behavioural profile of adversaries is shifting.

For brokers, understanding that shift is critical. The conversation with your clients must move beyond technical controls alone. It must include preparedness for speed, scale and psychological pressure.

While AI may be the headline, the defining change may be how quickly and aggressively today’s threat actors are prepared to act. If you would like to explore how these developments may impact your clients’ cyber programmes, speak to our Cyber team today.