When I first came across the headlines that inspired this article, they didn’t seem all that uncomfortable or surprising.
One was an article about a grieving sister who decided to use AI to recreate her murdered brother’s voice and video likeness for a message played in court. The other was the same kind of approach used by a television network to bring back a well-known and beloved sports commentator with hopes of resurrecting his legacy. At first glance, both seemed reasonable enough, maybe even noble. Why not use technology in the name of justice, remembrance, or continuity?
Now, if you’re new here, I recognize my comfort thinking and talking about death may feel unorthodox. I have a few close friends who regularly rein me in. For context, I grew up around funeral service, so death has never felt taboo. That aside, I think we can all relate to the desire to hold on to someone we’ve lost. However, the longer I sat with these stories, the more the whole thing felt off. It raised a lot of questions for me. Who was this really for? Who was deciding what these people would say? And, if we could ask, how would they feel about being digitally resurrected, especially since they couldn’t speak for themselves?
Per usual, I made my off-the-cuff YouTube video sharing my raw thoughts on the matter. However, the more time I’ve spent with it, the more I’m questioning whether this was really about honor, justice, and legacy. That has really left me unsettled. And, given it’s something we’re all going to have to face at some point, it’s an ethical dilemma I wanted to explore. Because when the person being recreated can’t consent, we have to tread carefully.
With that, let’s get into it.
Key Reflections
“Once someone’s gone, they’re no longer available to correct the record.”
It’s easy to assume that knowing someone gives us the right to speak for them. “We were close. We shared life together. I know what they valued.” And, those statements may be true on a general level. However, knowing someone deeply isn’t the same as knowing what they’d say in a particular moment. I can’t count the number of times someone’s said, “You’d probably say this…” and I’ve had to gently say, “Actually, I wouldn’t.” That kind of misunderstanding happens all the time, and it doesn’t have to be malicious. It happens with the best intentions from people who love you. So, imagine how much easier it is to get things wrong when the person isn’t around to correct them.
Even when we’re quoting someone’s exact words, things can still go sideways. I regularly have people repeat something I’ve said but apply it in a situation where I wouldn’t have said it. The problem wasn’t that the quote was wrong; it was that the context was. Our words never exist in isolation. They carry intent, tone, timing, and nuance. And, if we use AI to amplify, enhance, or repackage someone’s voice, even their real words, we risk turning them into something they never wanted to be. You may be able to technically reconstruct a moment, but that doesn’t mean you’re preserving it accurately.
And when someone’s no longer here, they can’t push back. The agency to say, “That’s not me,” dies with them.
“Grief is real, but it isn’t a license to do whatever we believe will offer us healing.”
Growing up in a funeral home, I saw grief up close, often. One thing I learned from that is that grief makes people vulnerable. They’re vulnerable to reaching, rationalizing, and doing things they would never do under normal circumstances. When you’re grieving, the only question most people want answered is: Will this make me feel better? However, feeling better isn’t always the right path, especially when AI enters the equation. AI doesn’t naturally show up in our grief, but it promises a frictionless, high-tech shortcut to emotional relief. And, in a state of pain and disorientation, we’re far less likely to pause and ask the deeper question: Should I do this at all?
That is a question we can’t afford to skip. As much as grief is deeply human, so is regret. Just because something helps numb the ache of loss doesn’t mean it’s morally neutral or good for you. Grief is a necessary part of the process. It’s not a bug we need to patch or a gap we can fill with synthetic closeness. And AI, like any tool, doesn’t come with ethics. It accelerates the journey on whatever path we’re walking. That means it will make avoidance easier, amplify self-deception, and nudge us toward decisions we’d only consider when trying to escape pain.
So, as much as we want to avoid thinking about death, this is one place we have to. If we don’t, it will come for us. It will come not just in grief, but in the choices we live with after it.
“Someone’s legacy is something to honor and protect, not exploit for monetary gain.”
I don’t know that I’ve ever met someone who loves the idea of being forgotten. And, typically, the people who loved them don’t either. There’s something sacred in preserving a voice, a story, a presence, especially for someone who made a lasting impact. Given that, I completely understand the desire to help people “live on.” Whether digitally re-creating an iconic moment or a familiar voice, it feels like a tribute, a way to make sure they’re not lost to time. However, AI takes that impulse and brings it to “life.” In a constantly changing culture that’s already uncomfortable with letting go, that kind of power can disguise itself as love.
However, I’m going to be very direct. A lot of what we call “honoring legacy” is a dressed-up way of clinging to control. We’ll say it’s about remembering someone, but more often it’s about what we still want from them. It might be emotional comfort, but it also might be brand recognition or financial value. And, when you introduce AI, the lines blur even faster. You’re no longer remembering them, you’re reanimating them, editing them, and using them to serve your goals. We can tell ourselves it’s about keeping their memory alive. However, if we’re willing to look in the mirror, we’ll probably discover it’s not really about them at all; it’s about us.
Now, you might be able to convince others you’re carrying someone’s legacy for them. However, it might make your legacy “the person who used someone else’s memory for their own gain.”
“The more we normalize digital imitations, the easier it becomes to forget what real ever was.”
Each time we interact with an AI-generated human likeness, especially one tied to someone we loved or admired, we become a little more comfortable with synthetic presence. At first, we’re completely aware it’s not a human. We’re conscious of the fact it’s just a representation, a digital tool designed to help us feel connected or remember. However, it doesn’t take long before something shifts. I see it all the time in the work I do. The line between the artificial and the real begins to blur. Before long, we start relating to it like the real thing, and we don’t want to be reminded there’s a difference. We begin protecting our delusion.
And, I’d like to remind you that you don’t have to look far to see what happens when history is edited and diluted at scale. We know the cost when nations rewrite the past to serve their own narratives. Once truth gets distorted, it’s not long before trust erodes, an entire generation grows up with no memory of what actually happened, and we repeat the same mistakes. Why on earth would we think it’s any less dangerous on a personal level? When we allow AI to reshape how someone is remembered, we don’t just risk losing the facts. We risk forgetting that facts mattered in the first place.
The risk in all this isn’t that we’ll be deceived. It’s that we’ll stop caring that we are.
Concluding Thoughts
I recognize this is a lot to take in. I’m frequently reminded my comfort discussing matters related to death isn’t common. However, I’m not asking you to wrestle through it alone. In fact, I’d encourage you not to. I’ve talked with my wife, my older kids, and my close friends. They know my wishes. If something ever happens to me, my channel, publication, and content die with me. I don’t want them training an AI to keep my content flowing. I don’t want a glowing orb in the corner of the room talking to them in my voice and mimicking my mannerisms, pretending to keep me alive. It wouldn’t be me.
Now, they’re welcome to publish the hours of footage and unfinished projects that never made it out into the world. If they want to finally publish the book that’s been sitting in publisher-ready form on my desktop for the past three years, they have my blessing. However, I never want them to bring back an artificial version of me when the authentic one is no longer here. When I’m gone, I’m gone.
I share all this to encourage you to think deeply about these things before the moment comes. Because, like it or not, it will come. In Ecclesiastes, Solomon reminds us that wise people think about death often. Not in a morbid way, but in a grounded, present, intentional way. As uncomfortable as it might be, these are questions we’ll all need to answer because the technology isn’t slowing down.
And maybe, just maybe, thinking about death a little more won’t take you down a path of fear. Perhaps it’s the very thing that helps you show up more fully in the life you’re currently living.
With that, I’ll see you on the other side.