Mar 8 / Ricky Tam

Will AI Make My Experience Irrelevant? What the Research Actually Says

A vintage compass resting on soft purple fabric, symbolising finding your own direction

The worry nobody says out loud

You have spent the better part of two decades becoming genuinely good at something. You have built the kind of judgement that cannot be acquired quickly — the pattern recognition, the hard-won client relationships, the instinct for what actually works versus what looks good in a presentation.

And now you find yourself reading about large language models that can produce a first draft of almost anything in thirty seconds. You sit in meetings where junior colleagues talk about AI as though it is simply a faster version of you. You wonder, privately, whether the thing you spent all those years building is about to become less useful. Less valued. Less necessary.

This is not imposter syndrome. It is a specific, legitimate concern — and it is affecting senior professionals more acutely than anyone in your organisation is likely to acknowledge.

Before we look at what the evidence actually suggests, it is worth pausing on why this particular anxiety tends to run quietly rather than loudly.

"This is not imposter syndrome. It is a specific, legitimate concern."


Why senior professionals feel this differently

When AI disruption is discussed publicly, the conversation tends to focus on entry-level job displacement — call centres, data entry, routine administrative tasks. That framing, while accurate in some respects, misses the subtler anxiety that sits in senior roles.

The fear at Director or Senior Manager level is not "will I lose my job tomorrow?" It is something harder to articulate: "Will the thing that makes me valuable — my accumulated knowledge, my contextual judgement, my professional reputation — gradually become a commodity that any reasonably capable AI can approximate?"

That is a different kind of worry. And it deserves a more precise answer than most AI commentary provides.

What the research actually shows

The short version: the evidence does not support the worst-case narrative, but it does point to a specific kind of adaptation.

The World Economic Forum's Future of Jobs Report 2025 identified the skills expected to grow most in value over the coming years. The list is notable for what it includes: analytical thinking, creative thinking, resilience, leadership, and the ability to work effectively with others in conditions of uncertainty. These are not skills that AI replicates well. They are, largely, the skills that accumulate with experience.

The report also found that the roles most vulnerable to displacement are those built around single, repeatable tasks — not roles that require integrating multiple domains of knowledge, managing relationships, or exercising judgement in ambiguous situations. Senior professional roles, in most sectors, are predominantly the latter.

This aligns with research published in Daedalus by economist Erik Brynjolfsson, who draws a distinction between AI designed to replace human workers and AI designed to augment them. His argument — supported by productivity data — is that augmentation produces better outcomes for both organisations and workers, particularly those whose value lies in complex problem-solving rather than task execution. Experienced professionals, in Brynjolfsson's analysis, benefit disproportionately from AI that amplifies their capacity rather than competing with it.

McKinsey's 2023 analysis of the economic potential of generative AI reached a similar conclusion: knowledge workers who learn to work with AI tools see productivity gains; those who treat AI as a threat and disengage see relative decline. The differentiating factor is not age, sector, or seniority level. It is whether someone learns to direct the technology rather than compete with it.

None of this means the transition is frictionless. But the headline finding is worth sitting with: experience does not become irrelevant. The question is whether it is applied in ways that make the most of what AI cannot do.

A more useful question

Most of the anxiety around AI and experience is actually several different concerns bundled together — and bundled concerns are hard to act on.

There is the concern about your organisation's specific decisions and how AI adoption might affect your role. There is the broader question of how your industry is changing and whether your skills map onto where it is heading. There is the identity question: if AI can produce a version of your output, does that change what your output means? And there is the practical question of whether you need to learn new tools, and if so, which ones.

These are distinct problems. They require different responses. And some of them — particularly the ones about your organisation's decisions or the pace of industry change — are not within your control to resolve today, regardless of how much mental energy you spend on them.

This is where The Sorting Method becomes useful. Not as a definitive answer, but as a way of separating the concerns that are yours to act on from the ones that are genuinely outside your control — and deserve to be set down, at least for now.


The Sorting Method: separating noise from what is yours to handle

The Sorting Method is a simple framework for untangling anxiety — it asks you to sort your concerns into three categories:
Within your control
Examples: Which AI tools you learn. How you frame your expertise. The conversations you initiate with your manager. The skills you choose to develop.
Useful response: Take one specific action this week. Not a plan — an experiment.

Can influence, but not control
Examples: How your team uses AI. How your organisation approaches upskilling. Your professional reputation in your field.
Useful response: Engage where you have genuine influence. Don't overestimate or underestimate it.

Outside your control entirely
Examples: The pace of AI development. Industry-wide structural change. Your organisation's strategic decisions about technology investment.
Useful response: Acknowledge these are real. Set them down. Return to them when there is actual new information.

Most AI anxiety, when you sort it carefully, concentrates in the third category. That does not make the worry irrational — the concerns are real. It simply means that spending mental energy on them today does not change the outcome. What does change outcomes is clear action in the first category.

"This is not optimism. It is applied clarity."


A note on the identity question

One concern that often sits beneath the surface of this conversation is harder to sort: what does it mean for your professional identity if AI can replicate aspects of your work?

This is worth its own attention — and it is the subject of a separate article in this series. But briefly: the value of your experience is not located in your outputs alone. It lives in your judgement about which outputs matter, your capacity to ask the right question rather than answer the wrong one efficiently, and your ability to earn trust in contexts where trust is not automatically granted.

AI does not replicate those things. It can produce a version of your deliverable. It cannot produce your perspective on whether that deliverable is the right response to the actual situation.

For a deeper exploration of this theme, see: Professional Identity in the Age of AI: The Question Nobody Is Asking

What to do with this

If you have arrived at this article because you are genuinely worried about your relevance in the AI era, the most useful next step is not a course, a certificate, or a reading list.

It is this: spend fifteen minutes this week sorting your specific concerns using the framework above. Write them down. Separate what you can act on from what you cannot. Then take one action — the smallest one that generates real information.

Not a plan. An experiment.

The Calm Action Cycle — the framework that underlies this approach — follows the same logic: Sort, See, Try. Each step is small enough to attempt without certainty. Each attempt generates something useful, regardless of outcome.

About the creator

Ricky is the creator of Embracing Imperfection Academy, a digital education platform for professionals navigating perfectionism, anxiety, burnout, and life transitions.

A former Hong Kong professional now based in the UK, Ricky brings lived experience of high-pressure careers, cultural transition, and the quiet work of building a calmer life. His work is evidence-based, anti-hustle, and always grounded in the belief that calm is a competitive advantage — including in the age of AI.

Embracing Imperfection Academy offers courses, resources, and a membership community for professionals ready to navigate disruption with clarity rather than panic.


Explore our Courses

Frequently Asked Questions

Will AI make experienced professionals irrelevant?

The evidence does not support the worst-case narrative. The World Economic Forum's Future of Jobs Report 2025 identifies analytical thinking, leadership, and working under uncertainty — all experience-built skills — as growing in value. AI tends to displace single, repeatable tasks, not complex judgement roles.

Is my professional experience still valuable in an AI-driven workplace?

Yes — particularly the aspects that AI cannot replicate well: contextual judgement, trust-building, knowing which problem is worth solving, and navigating ambiguous situations. Economist Erik Brynjolfsson's research shows experienced professionals benefit disproportionately from AI that augments rather than replaces.

Why do senior professionals feel AI anxiety more acutely than junior staff?

Junior staff worry about job loss. Senior professionals carry a subtler fear: that accumulated expertise — pattern recognition, client relationships, professional reputation — may become a commodity that AI can approximate. That is a different concern, and it deserves a more precise response than most AI commentary provides.

What does 'AI augmentation' mean for knowledge workers?

Augmentation means AI amplifies your capacity rather than competing with it. McKinsey's 2023 research found that knowledge workers who learn to direct AI tools see productivity gains; those who disengage see relative decline. The differentiating factor is not age or seniority — it is willingness to engage with the technology as a tool.

How do I separate genuine AI risk from anxiety noise?

Use a simple sorting approach: divide your concerns into what is within your control, what you can influence but not control, and what is genuinely outside your control. Most AI anxiety concentrates in the third category. Focusing mental energy there does not change outcomes. Clear action in the first category does.

What practical step can I take this week if I am anxious about AI?

Spend fifteen minutes writing down your specific AI concerns. Sort them into what you can act on today, what you can influence over time, and what is outside your control. Then take the smallest action in the first column — not a plan, an experiment. One action. This week.

Is AI anxiety at work a recognised issue for senior professionals?

Yes. Research and workforce data consistently show that AI disruption creates significant psychological pressure at senior levels — often experienced as identity threat rather than simple job insecurity. For a broader overview, see our guide to AI anxiety at work.

Want to think more clearly about AI and your career?

The Compass Letter is a fortnightly note for professionals navigating AI disruption without the panic. Each issue offers one evidence-based perspective and one practical starting point — nothing more.

Join the early access waitlist for the AI Anxiety Reset programme

Also exploring UK settlement?

Life in the UK: 20-Day Calm Sprint — for professionals preparing for UK settlement with calm confidence.

References

  • World Economic Forum. (2025). Future of Jobs Report 2025. World Economic Forum. https://www.weforum.org/publications/the-future-of-jobs-report-2025/
  • Brynjolfsson, E. (2022). The Turing Trap: The Promise and Peril of Human-Like Artificial Intelligence. Daedalus, 151(2), 272–287. https://doi.org/10.1162/daed_a_01915
  • McKinsey Global Institute. (2023). The economic potential of generative AI: The next productivity frontier. McKinsey & Company. https://www.mckinsey.com/capabilities/tech-and-ai/our-insights/the-economic-potential-of-generative-ai-the-next-productivity-frontier