The Future of AI at Work Is Hiding in Plain Sight
Reading Between the Lines of the 2025 WEF Jobs Report
Over the past couple of weeks, I’ve spent time digging through the 290-page 2025 WEF Future of Jobs Report. If you haven’t seen it, I’d encourage you to check it out, even though it is a doozy.
As I combed through the stats and details, I kept shaking my head and thinking, “The future of AI at work isn’t coming; it’s already here, and we’re not prepared to see it.”
Now, there’s plenty in it that reads like the rest of the noise out there. It highlights familiar themes like AI transformation, upskilling, and evolving roles. However, this one goes a lot deeper. And to further complicate things, many of the real insights aren’t in what it says but in what’s underneath: what it assumes and what it doesn’t say out loud.
When you read between the lines, an unsettling picture emerges: a workforce standing on the edge of massive disruption, organizations that lack a clear plan, no shared understanding, and very few people asking the right questions.
I talked through my raw thoughts on YouTube. However, per usual, this post is the polished version. This was where I stepped back, slowed down, and mapped out the core signals that matter, whether you’re a leader, strategist, or someone simply trying to make sense of what’s coming (or already here).
As a heads up, this week is a bit of a deviation from my usual four reflections. I’m unpacking six of the key signals buried in the report. I think you’ll find they aren’t just observations but warnings. And depending on how you choose to respond, they’ll either become your roadmap… or your blind spot.
With that, let’s get to it.
Key Signals
“It’s one thing to expect change; it’s another to be fully prepared for it.”
I was extremely encouraged to see 86% of companies now expect AI to radically transform their business by 2030. It’s a positive sign that companies and people are recognizing the magnitude of change ahead. And, despite how long the noise has been building, that’s a kind of acknowledgment that would’ve been rare just a year ago. We’re making our way through the denial and anger stages. However, recognition is by no means the same thing as readiness. Acknowledging that change is coming doesn’t mean you’re adequately prepared to lead through it.
This is where things very much start to unravel, and not just in the report. I frequently find that once you start defining what “prepared” actually looks like, the vast majority of organizations begin to recognize it as an “opportunity area.” People may nod in meetings about transformation, but many go right back to operating like nothing is actually changing. Countless strategy decks mention AI, but I see far fewer line items in the budget. Most teams are told to “experiment,” but few have guardrails, support, or clarity on what success even means. The result is a strange kind of cognitive dissonance. We seem to know it’s coming, but we’re still acting like it’s “down the road.”
We need to remember that success isn’t found in acknowledgment. It comes from action. And, action without alignment isn’t strategy; it’s just noise.
“Skill decay isn’t slowing down, but it is getting much harder to see.”
The report has quite a bit to say about skill instability. Casually defined, that’s the volume of job skills that are expected to change. If you dig in, you’ll see instability is projected to peak this year, with a staggering 55% of skills needing to change. However, in the chart it seems to get better, dropping to just over 40% by 2030. On paper, it’d be easy to interpret that as a shift toward a more stable landscape (if you want to call needing to transform 40% of your skills “stable”). The way it’s presented, it almost reads like a return to equilibrium. But that interpretation is not only misleading; it’s dangerous.
My concern is that what’s being presented as stability isn’t actually stability. It’s something very different: invisibility. Here’s why. As AI tools become more embedded in daily workflows, they don’t just augment work. They start masking the visible signs of struggle. People prop themselves up with AI assistants that churn out stuff that seems acceptable. Before you know it, the skill gaps don’t show up in a detectable form because they’re covered by the tool. When the output looks “good enough,” no one bothers to ask what’s underneath. Managers see results and assume competence. Teams move faster but think less. Capability quietly erodes under the surface while performance metrics suggest everything’s fine.
If we don’t course correct, we’re going to find ourselves with a workforce that looks effective and capable until catastrophe strikes. And, that’s a terrible time to discover you don’t have the resources required to put out the fire.
“If you’re going to cut alongside upskilling, you better do it with surgical precision and clear communication.”
The report is filled with a lot of mixed messages, but one in particular stood out. It highlights that 85% of organizations plan to invest in upskilling their workforce. And yet, a whopping 40% of companies are also anticipating reducing their workforce as a direct response to AI. Those two metrics appear side-by-side in the report with no real acknowledgment of the tension or complexity between them. That happens to also be one of the biggest gaps I see in my conversations with senior leaders. The desire to “do the right thing” is often there. However, when the pressure is on and there’s a lack of clarity, you end up with what looks like a contradiction.
I say “looks like a contradiction” for a reason. Reskilling and downsizing aren’t necessarily in contradiction with each other. There are moments where doing both is absolutely necessary, and there are plenty of those moments amidst a change of this magnitude. Some roles need to disappear as others are being reimagined. The problem is treating upskilling and downsizing like interchangeable levers you can easily pull without consequence. In practice, both require deep, deliberate effort and come at a cost, culturally, strategically, and emotionally. If you’re not crystal clear about why you’re doing what you’re doing, you will create chaos and mistrust.
So, if you find yourself in that position, it’s not that you can’t send both signals, but you have to send them with clarity, conviction, and a plan your people can believe in.
“AI’s not just coming for repetitive tasks; it’s coming for the creative, relational, and human work, too.”
There is still an inordinate amount of noise out there suggesting that “human skills” (often poorly defined) will always be safe. It’s usually some loud proclamation that “xyz skills” will go untouched. And then, some AI company releases its latest frontier model and blows it all up. Now, to be clear, in some ways, I think there is truth in “durable” skills. However, durability doesn’t mean “set it and forget it.” Every single thing you think is “durable” will be challenged. The report quietly points to that reality. Among the most at-risk roles are graphic designers, legal secretaries, and customer service professionals. These are jobs that aren’t just task-driven, but people-driven.
However, this quiet indicator reveals a twofold problem. First, it emphasizes something I say all the time, which is that nobody is immune. Every job, as it stands today, will be disrupted. However, there’s something else, which might be more concerning. It reveals how little we really understand about how work gets done. Again, there’s a recognition that AI will change unexpected roles, but almost no clarity on when AI should be used, how it should be applied, or what should be off limits (more on that later). That kind of change is much bigger than digital transformation, and it requires a level of data about how work actually happens that many have ignored for far too long. We can’t keep ignoring it.
Because, the most dangerous kind of disruption isn’t the kind you expect. It’s the kind that shows up quietly, in the places you thought were safe.
“Everyone’s chasing tech roles; however, the biggest growth is in places we’ve been trained to overlook.”
There was another really subtle but significant signal not getting enough attention. It would be very easy for someone to skim the report and walk away thinking the future belongs entirely to AI specialists, machine learning engineers, and other high-skill tech roles. I’d go so far as to say that conclusion wouldn’t be accidental. It’s literally how the report (and a lot out there) frames the narrative, highlighting growth percentages that make emerging digital roles seem dominant. However, when you pause and dig into the job volume numbers, putting all your chips there would be a mistake. The biggest volume of growing roles is in areas like delivery services, agriculture, and construction. They don’t sound futuristic or flashy; they’re almost unexpectedly ordinary and real.
Now, this kind of misdirect is problematic. It’s problematic not just because it shapes and influences how individuals think about their careers (a serious problem in and of itself). It’s problematic because it dramatically affects where organizations invest. I know because I’m in the room when decisions get made, often shouting, “Please don’t do that!” I can’t count how often I’ve seen companies make heavy time and budget investments into programs and initiatives tied to high-tech roles while completely ignoring the very functions that keep their business operational. I get that investing in the flashy stuff sounds strategic, and it is important. However, when that’s all you care about, you risk starving the systems that quietly sustain you.
We need to remember that sustainability doesn’t come from the top of the skills ladder. It’s heavily dependent on the parts of your business you’ve stopped looking at.
“The biggest risk isn’t the automation; it’s trusting the wrong people to decide what gets automated.”
If you’ve read this far, I hope you don’t need to be convinced that the report shows a dramatic shift ahead (technically already in motion). However, there was another stat that stood out. The report highlights that today humans still perform 66% of workplace tasks, with 29% handled by machines and 5% by augmentation. However, by 2030 those numbers are expected to shift to 55% human, 35% machine, and 9% augmentation. That’s a massive swing of roughly 11 percentage points in just five years, a major reallocation of how work gets done, and by whom. In my opinion, it’s too conservative, but it’s still a more surgical estimate of the volume of tasks that will be handed off or reimagined.
However, this opens another Pandora’s box many aren’t aware even exists. It forces us to examine the question, “Are the people responsible for making those decisions actually equipped to do it well?” That’s something I find almost no one is talking about. And from what I’ve observed, most organizations have no idea how to even begin assessing it. That’s actually what led me to create the AI Effectiveness Rating. It’s designed to help organizations, leaders, and individuals close the gap between awareness and wise action. Because automating a workflow is one thing. Trusting your team to pick the right one and do it well? That’s something else entirely.
We have to remember that the future of work isn’t just about shifting who does what. It’s about making sure people are equipped and can be trusted to decide.
Concluding Thoughts
I’ll completely understand if you don’t end up carving out time to read all 290 pages of the report. However, I hope this article at least gives you an idea of what’s in it and has helped you take more seriously what’s ahead of us.
Because, whether you accept it or not, we’re not looking out toward the future of work anymore. We’re in the middle of it, and it’s not going to slow down while we scramble to figure things out.
Now, if you’re feeling overwhelmed, you’re not alone. I led an advisory meeting with a group of senior leaders this week who were honest enough to say what a lot of people are feeling: “I didn’t sign up for this.” Every single person mentioned something along the lines of not going into their field to become digital transformation experts or navigate new waves of technology. However, that’s what’s being required of us all, regardless of title, industry, or background.
And I’m empathetic to the fact that it’s a steep learning curve. I’ve been doing it my whole career and frequently get overwhelmed. But, I promise it’s not impossible. In fact, once you get moving, you discover many of the most important skills are already in your toolbelt. You’re already using them today. You just need better ways, and better support, to transfer them into this new environment.
So if you’re someone who’s trying to find your footing, or you know someone who is, don’t go it alone. Reach out or send them my way. Even if a formal engagement isn’t the right path, I’m committed to helping as many people as I can navigate what’s next (or more accurately, what’s already here).
Because, you may not need to have it all figured out, but you do need to start moving.
With that, I’ll see you on the other side.