
AI Gave Me Someone Else's MBA
I don't have a degree. So AI gave me someone else's MBA. This is what AI hallucinations actually look like in the real world, and it's so absurd you couldn't make it up. But AI did.
By Tia A. Williams · Principal Systems-Thinking Architect
My work protects your expertise, reputation, and legacy from being lost, misattributed, or overwritten in AI search and discovery systems.
How It Started
After almost two years of struggling to understand why my content reach and authority had tanked, and after coaches, content strategy, and everything else I could think of, I finally realized that AI had fragmented my identity.
My career path is not linear. I don't have a degree, and I didn't fit the normal template AI expected for the roles I've held. So it treated my identity like it was questionable. And when AI has low confidence, it does one of three things: it ignores you, it blends you, or it fractures your identity.
That's exactly what happened to me. Different platforms, different data, different fragments of my story floating around like separate islands with no bridge connecting them.
I Fixed It. Then AI Created a New Problem.
I did the work to force what's called disambiguation. AI finally accepted that I'm not the fiction author, and I'm not the product leader. It accepted my roles and my career history.
I thought I was done.
Then something wild happened. AI pattern-matched me to an executive template, which assumes you have a degree. Since I don't have one, AI borrowed a Master's degree from an entirely different Tia Williams entity. I solved the first identity crisis, and AI created another one.
You might think that's harmless. It's not.
That kind of error can damage your reputation and destroy trust before you ever get a chance to build any. AI search doesn't just retrieve information about you. It interprets, contextualizes, and summarizes. It tries to make you fit the common pattern it expects. When you don't fit a clean pattern, it guesses. It fills in gaps with what it thinks makes sense, and it will confidently give you the wrong answer. That's called drift.
Why This Is an Identity Problem, Not a Visibility Problem
AI would rather deliver a clean, confident narrative than trust data that doesn't match its expected pattern. So if the system decides that executives have MBAs and your degree is missing, it treats the gap as an error and goes looking for the closest match. If you share a name with hundreds of people, it has plenty of close-enough options to choose from.
If AI gets your identity wrong, it can do three very expensive things:
It can make you invisible to the people searching for you.
It can make you look less credible than you are.
And the most dangerous one: it can make you look like someone you're not.
What You Should Do Right Now
Check your AI search results the same way you check your credit report; don't wait until your income depends on it.
Start here: check Google AI Overviews for your name and your niche. Check ChatGPT for how it summarizes your background. Check Perplexity for what it cites as proof.
When you see drift, especially around credentials, job history, titles, or affiliations, treat it like an identity breach. Because that's exactly what it is.
This is also why Anchors matter. Anchors are how you keep AI from misattributing someone else's information to you. They are how you keep the system confident about your entity, because the system will make up details if it decides your story is incomplete.
If AI can hallucinate a degree for me after I already fixed my identity, it can do it to you.
Watch the video: AI Search Identity Crisis | The Risk of Ignoring Your Professional Results by Tia A. Williams
Don't Just Build Authority. Protect It.
In the AI search world, your authority isn't just what you claim. It's what the system believes is trustworthy.
Have a common name? Check out this post on how to stand out even with a common name by Tia A. Williams
