The Coming “AI-Pocalypse”: Why Organizations Are About to Learn Hard Lessons About Networks

Reading Time: 5 minutes

by Mike Klein

The AI-driven workforce reduction is happening. Not tomorrow, not in some distant future—it’s happening now. What most organizations don’t realize is that they’re about to conduct one of the largest uncontrolled experiments in organizational dynamics in business history. And most of them are completely unprepared for what they’re about to discover.

As noted change and transformation expert Dr Leandro Herrero pointed out in our inaugural Strategic Conversation, “There will be less people maybe, I don’t know in some cases… but certainly the peer-to-peer element, the transversal connection is going to remain and continues to be the strongest source of power in changing organizations.”

The main problem is that most organizations approaching AI-driven reductions have no idea who they’re actually removing from their networks, what roles those people play in information flow and decision-making, or how their departure will cascade through the remaining system.

The Invisible Infrastructure

Every organization has two structures: the formal hierarchy that shows up on organization charts, and the informal network that actually gets things done.

The formal structure determines who reports to whom. The informal network determines who talks to whom, who trusts whom, and who influences whom.

Guess which one most AI-driven workforce reductions are designed around?

Organizations are making decisions about headcount reduction based on job titles, salary levels, performance ratings, and functional redundancies.

They’re using algorithms to identify “inefficiencies” and spreadsheets to calculate cost savings.

What they’re not doing is mapping the informal networks that actually drive organizational performance.

This is like performing surgery while blindfolded. You might successfully remove what you think is unnecessary tissue, but you have no idea what vital connections you’re severing in the process.

The Network Effect of Sudden Removal

When you remove people from an organizational network without understanding their role in that network, you don’t just lose their individual contribution—you potentially collapse entire information pathways, break trust relationships, and eliminate crucial knowledge bridges between different parts of the organization.

Consider what happens when you remove someone who isn’t a formal leader but serves as a key connector between departments. Or someone who doesn’t appear critical on paper but serves as the institutional memory for a crucial process. Or someone whose informal mentoring relationships are holding together a team of high performers.

These people don’t show up as “essential” in traditional organizational analysis.

But their removal can trigger cascading failures that are far more costly than their salaries ever were.
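
To see the mechanism in miniature, here is a rough sketch in Python using the networkx library, built on an entirely invented six-person network: two tight-knit departments connected only because one person in each happens to talk to the other. The names and the structure are illustrative assumptions, not data from any real organization.

```python
# Rough sketch with invented data: two clusters linked through a single person.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("ana", "ben"), ("ben", "carla"), ("carla", "ana"),    # one department
    ("dev", "erin"), ("erin", "farid"), ("farid", "dev"),  # another department
    ("carla", "dev"),                                      # the only bridge between them
])

# Betweenness centrality surfaces the people most information paths run through.
connectors = sorted(nx.betweenness_centrality(G).items(), key=lambda kv: -kv[1])
print(connectors[:2])  # "carla" and "dev" score highest, despite holding no formal title

# Remove one quiet connector and the informal network splits in two.
G.remove_node("carla")
print(nx.number_connected_components(G))  # 2: the departments no longer talk
```

A simple measure like betweenness centrality won’t capture trust or institutional memory, but even this crude lens shows how much weight a single informal connector can carry.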

As Dr Herrero noted, “The peer-to-peer network may be smaller, but year after year it continues to be the strongest network.”

The question is whether the networks that remain after AI-driven reductions will still be functional, or whether they’ll be so damaged that the organization can’t operate effectively.

The Trust Collapse

Beyond the immediate network effects, there’s a deeper problem: AI-driven workforce reductions are likely to fundamentally alter the relationship between employees and their organizations in ways that most leaders haven’t considered.

When people see their colleagues removed by algorithmic decision-making processes, when they realize that their own job security depends on metrics they may not understand or be able to influence, the psychological contract that enables organizational commitment begins to break down.

This isn’t just about fear, though fear is certainly part of it. It’s about agency—the sense that individuals have some control over their circumstances and some meaningful relationship between their actions and outcomes.

“The individual and collective agency,” Dr Herrero observed, “is the ability of the individual to take control, to have some autonomy, to see a more clear relationship between what I do and the effect that has on the organization.”

AI-driven workforce reductions, particularly when implemented without transparency or human judgment, fundamentally undermine this sense of agency. People begin to see themselves as variables in an equation rather than agents with the power to influence their circumstances.

The Engagement Death Spiral

This brings us to one of the great ironies of the current moment: organizations are implementing AI-driven efficiencies at the same time they’re investing heavily in “employee engagement” and “employee experience” initiatives. They’re trying to optimize human performance while simultaneously treating humans as optimizable resources.

The cognitive dissonance is staggering. You can’t create engagement, loyalty, or commitment in an environment where people understand that their value is being calculated by algorithms and their future is determined by spreadsheet optimization.

This doesn’t mean that AI-driven workforce changes are inherently wrong or impossible to manage well.

It does mean that most organizations are approaching them in ways that will destroy the very things they claim to value: agility, innovation, collaboration, and employee commitment.

The Contagion Effect

Perhaps the most dangerous aspect of current AI-driven reduction strategies is their tendency toward contagion.

As Dr Herrero noted, leaders are “copying each other. Look, ‘this company has now got rid of 300 people and another 400’. So it’s a contagious environment.”

This creates a particularly toxic dynamic where organizations are making workforce decisions not based on their own operational needs or strategic analysis, but because they feel they have to be seen as doing something in response to technological change.

The result is organizational decisions driven by panic rather than planning, mimicry rather than strategy.

This is exactly the wrong approach for changes that will fundamentally alter how organizations function.

The Path Forward

None of this means that organizations should avoid AI-driven optimization or refuse to adapt their workforce strategies to technological change.

It means they need to approach these changes with much more sophistication than most are currently demonstrating.

Organizations that want to successfully navigate AI-driven workforce changes need to:

Map their informal networks before making changes. Understand who the key connectors, influencers, and knowledge holders actually are, not just who the org chart says is important (a rough sketch of what this screening can look like follows this list).

Maintain transparency about decision-making processes. People can handle difficult decisions; they can’t handle arbitrary ones. If algorithms are involved in workforce decisions, people need to understand how those algorithms work and what they can do to influence outcomes.

Preserve network integrity during transitions. When you remove people, you need to consciously rebuild the connections and knowledge pathways they maintained. This isn’t something that happens automatically.

Focus on collective agency, not just individual performance. The organizations that thrive post-AI will be those that can mobilize collective intelligence and action, not just optimize individual productivity.
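
To ground the first of these recommendations, here is a rough sketch of what screening a proposed cut list against an informal network might look like, again in Python with networkx. Everything here is an assumption for illustration: the names, the collaboration edges, and the 0.3 betweenness threshold. A real organizational network analysis would draw on richer signals (peer nominations, meeting patterns, hand-off data) and plenty of human judgment.

```python
# Illustrative sketch: check a proposed cut list against the informal network first.
import networkx as nx

# Edges drawn from whatever collaboration signals can ethically be used.
G = nx.Graph([
    ("priya", "tom"), ("tom", "sam"), ("sam", "priya"),
    ("sam", "lena"), ("lena", "omar"), ("omar", "jo"), ("jo", "lena"),
])

proposed_cuts = {"sam", "jo"}

# Articulation points are people whose removal disconnects the network;
# betweenness flags the quiet connectors the org chart does not show.
critical = set(nx.articulation_points(G))
centrality = nx.betweenness_centrality(G)

for person in proposed_cuts:
    flags = []
    if person in critical:
        flags.append("sole bridge between groups")
    if centrality[person] > 0.3:  # arbitrary screening threshold for illustration
        flags.append(f"high betweenness ({centrality[person]:.2f})")
    print(person, "->", flags or ["no obvious network risk"])
```

The point is not the tooling; it is that a cut list checked against the informal network at least makes the trade-offs visible before they become cascading failures.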

The Coming Reckoning

The organizations that are approaching AI-driven workforce changes as simple cost optimization exercises are about to learn expensive lessons about the difference between formal structure and functional networks, between headcount reduction and capability preservation, between algorithmic efficiency and organizational effectiveness.

As Dr Herrero concluded, “We need to learn and use our imagination to say what do we do now? Just because we knew how to do it 20 years ago with all these case studies and all the things that were there, well that’s archaeology. So how do we do it now, for good and for the future?”

The organizations that figure this out—that understand how to combine AI capabilities with human network dynamics—will have enormous competitive advantages. Those that don’t will discover that optimizing for the wrong variables can destroy everything they were trying to preserve.

The AI culling is coming. The question is whether organizations will approach it as a strategic transformation or a spreadsheet exercise. The difference will determine which organizations emerge stronger and which discover they’ve automated themselves into irrelevance.
