People keep tagging me in screenshots. Marketing professionals calling it spam. SEO experts using it as a cautionary tale. One LinkedIn post got thousands of impressions asking people to be careful about sites like mine.
A site nobody had heard of, sitting at 4.1% of ChatGPT's global citations. Above Reuters. Above TechRadar. Above Cars.com. Right there with Wikipedia and Forbes. That does look strange from the outside.
But it didn't happen by accident, and it wasn't luck.
I've been doing this since 2005
Back when DigitalPoint was the forum for anyone serious about search, I was there under the name wissam, racking up posts and finding things nobody had written about yet. SEO was a different game then, but the core of it hasn't changed: the people who figure out something genuinely new before everyone else are the ones who win. The playbook is always changing. The principle stays the same.
2006. I found a platform called Swiki, a community-powered search engine by Eurekster that had just quietly added AdSense support. Nobody was talking about it. I set one up, pointed it at a niche I knew had traffic, and within days it was pulling 1,500 visitors a day, more than my actual websites at the time. I wrote about it on DigitalPoint and the thread ran to 11 pages. People couldn't believe it. The loophole closed eventually but that wasn't the point. The point was seeing it before anyone else did.
I've never been interested in following the steps. The steps are always written by someone who figured them out after the fact, usually after the window has already closed. Real SEO, the kind that actually works, comes from understanding what the underlying system actually values and then building for that before it becomes conventional wisdom.
By 2024 there was a lot of noise about AI search, how to rank in ChatGPT, how to get cited by LLMs. Most of it was speculation repackaged as advice. Nobody had real data. So I decided to get some.
The experiment
I built Toxigon as a research project. The question was specific: can a small site with no brand authority, no backlink history, no paid promotion, and no corporate backing earn AI citations alongside the biggest sources in the world, purely on the strength of what the content actually is?
The site generates content at scale using AI. Everyone figured that part out quickly, and it became the headline. What got ignored is what happens after generation. Every piece goes through fact-checking. I review high-traffic content personally when I have time. User feedback shapes corrections. The accuracy rate sits around 97%. That's not perfect, but it's better than a lot of what gets treated as authoritative.
The content is also genuinely trying to be useful. Not stuffed with keywords. Written to actually answer what someone is asking, clearly and correctly. That distinction matters more than people realize.
What happened
Profound ran a study of 10 million ChatGPT prompts. The result: Toxigon sat at 4.1% of ChatGPT's global citations, in the top ten alongside Wikipedia and Forbes, above Reuters, TechRadar, and Cars.com.
I held that position for close to a year before traditional search engines adjusted their signals. That was expected. The experiment was never about building a permanent loophole. It was about proving the hypothesis.
The reaction from the SEO and marketing community was loud. Some of it was funny.
"We found one of the most influential sources in ChatGPT Search."Daniel Drabo, Peec AI — LinkedIn, 2025
"Toxigon is the seventh most cited source" when searching for best electric cars, ranking above established outlets like Cars.com and Reuters.Debra Williamson — LinkedIn, 2025
"I had never even heard of this website."Nate Tower, after finding Toxigon at 4.1% alongside Wikipedia and Forbes — LinkedIn, 2025
"Toxigon appears in ChatGPT's top 10 citations at 4.1%, suggesting specialized relevance for certain query types."Eyeful Media — AI Search Citations Analysis, 2025
Why AI still cites it
Traditional search updated its signals. Traffic dropped. That's fine. But AI systems keep citing Toxigon. That's the part worth paying attention to.
AI doesn't rank authority the way traditional search does. It cites what's accurate, clear, and directly answers the question. Once content earns its place on correctness and clarity, that doesn't disappear because a ranking algorithm changes. The content got into training data by being good. That's a different kind of moat.
This is a meaningful distinction for anyone thinking about content strategy in an AI-first world. Traditional SEO has always been partially about signals: links, authority scores, trust metrics. AI citation is more direct. Is the answer correct and clear? Then it gets used.
What this actually proves
One person. No funding. No team. No corporate infrastructure. Custom models, custom approach, genuine investment in content quality.
Same list as Wikipedia.
The lesson is the same one I learned in 2005 and every year since. SEO is never about following the current list of best practices. It's about understanding what the system you're trying to rank in actually cares about, before the crowd figures it out, and building something that genuinely delivers that.
In 2024, for AI citation, it turned out to be quality. Not brand. Not links. Not authority scores from a decade ago. Just: is this answer correct and is it useful?
The answer was yes. The numbers showed it.
And that's not even all of it. But you didn't expect me to tell you everything for free, did you?