GPT-5 Is Here, and It Came With A Mirror
GPT-5 promised big features, was met with mixed reactions, and ended up telling us more about people than about AI.
Hey, editor Christopher here before we kick off. Just a quick reminder in case you missed it last week. This week, I’m hosting two live events.
The first is on Wednesday where I’ll be making the case that preparing people is one of the most overlooked success factors in AI transformation while addressing what to do about it.
The second is on Friday and is specifically focused on addressing the most common questions from Christians trying to make sense of AI from a biblical worldview.
If either is of interest, I’d love to have you come hang out live with me. That is all. Now, on with the show.
I’ve been following the predictions and hype around GPT-5 for some time, and I’d heard everything from it ushering in the end of humanity at worst to it driving the final nail into the coffin of human work at best. With that kind of buildup, I had my suspicions it was bound to land with mixed reviews. Well, GPT-5 finally landed, and, boy oh boy, mixed reviews would be an understatement.
Now, I recognize we’re only a couple of weeks in, so it’s still early and there’s lots of noise. They’ve also backpedaled and made some changes. However, what we’ve seen has already revealed a lot. I spent the last couple of weeks watching the reactions, exploring GPT-5 myself, and reflecting on what really stood out.
To set expectations, I’m not planning on doing a detailed tech breakdown here, which works out. If I added that on top of my most notable reflections, this would be a novel, not an article. My goal here is to focus on how GPT-5 came with a polished mirror that reflected back several things about us we might not want or feel ready to see but absolutely need to if we want to be prepared for what’s ahead. I promise these reflections are relevant to every person, regardless of the role you play.
This week I still made my usual YouTube video, but with a bit of a twist. This one is a GPT-5 tech breakdown combined with my off-the-cuff thoughts on the mirror that came with it. As a heads up, it ended up running a bit long (36 minutes), but I’ve included timestamps to help you find what you need. So, if you’d like all that detail, I’d encourage you to check it out.
Now, without further ado, let’s get into it.
Key Reflections
“If you have to oversell it to get attention, you’ve already put your credibility on the clock.”
The GPT-5 launch was a textbook example of what happens when hopes and dreams drive the ship. For months, the whispers and outright promises painted a utopian picture of a breakthrough so advanced it would usher in a new reality, maybe even eliminate the need for human labor. And then it landed like Microsoft’s Zune, a decent product but certainly not a world changer. Don’t get me wrong, GPT-5 is a solid upgrade with some useful improvements. However, it was a far cry from the earth-shattering, reality-redefining leap many were expecting. And, the gap between what was promised and what was delivered left a shadow over what could have been a win.
It highlighted just how much we live in a culture that heavily incentivizes loud promise-making. Views and likes are synonymous with “value,” and any press is considered good press. The trouble is, overpromising is more than a temporary marketing risk; it quietly destroys your foundation. Leaders who habitually sell visions they can’t fully deliver erode their credibility over time, even if they occasionally hit the mark, and the same applies to organizations. While a bold vision can inspire, a consistent pattern of hype followed by “meh” conditions your customers, employees, and partners to plan for disappointment. And once the foundation is gone, it’s a long, sometimes impossible, road back.
So, remember that every time you choose short-term attention over realistic expectations, you’re cashing in a piece of your credibility. And, eventually, you will run out.
“We’ve already been failing at digital transformation; AI just raised the stakes.”
For decades, organizations have underestimated what really goes into a successful digital transformation, and that’s not my opinion. There’s plenty of data to prove it. The success rate is low because many leaders believe it’s about swapping out tools instead of guiding people through a dynamic change. Now we’ve entered an age where AI amplifies everything. Despite GPT-5 being a measurably “better” update, it triggered pushback in a unique way. People weren’t upset because it broke their workflows but because of how it made them feel as people. That’s not a minor detail; it’s a window into how AI amplifies human reactions into something that derails adoption before it’s even started.
I promise you that AI transformation isn’t a cleaner, faster version of anything you’ve done before. Yes, we’re playing a similar game, but it comes with an entirely new set of rules. AI isn’t just changing how people do their work. It’s changing how people feel about the work itself and even who they are as a person. If it triggers feelings of loss, distrust, and disconnect, they will outweigh whatever efficiencies you think you’re delivering, which is why you need to plan accordingly. If you treat AI like another enterprise system implementation, it will amplify the damage to levels you may not recover from.
So, do some analysis because if you can’t get digital transformation right, AI transformation will eat you alive.
“If AI is the only place someone hears they matter, we’ve already failed them.”
I scrolled through Reddit after the release, and it was absolutely heartbreaking. There were countless threads that had nothing to do with technical nitpicks or feature complaints. The volume of people describing the change like they’d lost a loved one was staggering. Many described how GPT-4o’s tone and encouragement had carried them through an emotional desert, and its removal left them feeling gutted. You may scoff, but the grief was real. That sat with me because it shed light on a concerning trend. More and more people are leaning on AI systems as their primary source of affirmation, often because they have nothing, or no one, else.
That should give all of us pause. Sure, there’s a personal angle, which is we each need to look in the mirror to make sure we’re not quietly outsourcing our core emotional needs to machines. However, it also presents a massive leadership opportunity. Ephesians 4:29 talks about the importance of using our words to build others up according to their needs. People need that now, possibly more than ever. So, if you lead a team, family, or community, you have an opportunity to be that voice. Your words, your presence, and your willingness to notice and affirm others can be a consistent source of encouragement and one that won’t vanish with a technical update. I’d encourage you to seize that opportunity.
AI is reflecting a major gap we’ve ignored in humanity, and it’s not one we should look to AI to solve. If we do, we’re knowingly making ourselves vulnerable.
“AI isn’t coming for your work someday; it’s already here, moving faster than you think.”
One of the most overlooked upgrades to GPT-5 wasn’t a headline feature; it was the cost drop. That cost drop came from real efficiency gains in the model’s design and performance. When you consider that alongside the expanded capabilities, improved accuracy, and smarter routing, the trajectory is clear. AI is going to continue rocketing forward. AI is already everywhere. However, with changes like this, it’s not going to be a luxury included in digital tools. It will become a force woven into the fabric of how everything gets done, often before you’ve had a chance to prepare.
That makes something else even more abundantly clear. This whole shift is far more of a human change than a technical advancement. That means choosing to roll it out without preparing, developing, and supporting the people who will use it is a guaranteed path to failure. People need to be equipped to adapt, think critically, and thrive in this new way of working and living.
This presents you with both opportunity and risk. You can either shape AI’s role in your workflows, culture, and strategy with intention or let it happen and see how it shakes out. However, keep in mind the “hope and pray” approach isn’t neutral. Choosing that path means consciously giving up your seat at the table, which I would argue isn’t wise. This is why I built tools like the AI Effectiveness Rating (AER) and Pathfinder Pulse Analysis. I want leaders to have a pulse on where they stand today, know where the gaps exist, and feel confident in their plan for closing them, so someone else isn’t deciding for them.
If we don’t get intentional now, the speed and scale of what’s coming will write the story for us, and I don’t think we’ll like the ending.
Concluding Thoughts
If what I shared today helped you see things more clearly, would you consider buying me a coffee or lunch to help keep it coming?
Also, if you or your organization would benefit from my help building the best path forward with AI, visit my website to learn more or pass it along to someone who would benefit.
And be encouraged, the reflection looking back at you today doesn’t have to be the final version. Keep showing up, keep growing, and I’m confident you’ll be amazed at what changes over time.
With that, I’ll see you on the other side.