AI Scientist · Neural Architect · Game Developer
I invent neural architectures, build games, and wire up analog circuits to understand how intelligence actually works.
AI Scientist. Working remotely.
I don't think transformers are the answer. I've thought about this for a long time, longer than most. My interest in AI goes back to ELIZA, and in the early 2000s I was already modifying open-source chatbots in Visual Basic, experimenting with giving them something closer to human cognitive memory. The question of how machines actually remember and reason has been with me ever since.
The more I understand about how biological intelligence works, the more convinced I am that the field is optimizing the wrong thing. Attention mechanisms and scale are impressive, but they're not how brains learn, and I think that gap matters more than the benchmarks suggest.
Most of my serious research has been about closing that gap. I study biological neural computation: how organic systems wire themselves, adapt, and generalize. I try to translate those principles into architectures that don't exist yet. I built an analog neural network from discrete electronics, no digital gates, to get closer to the hardware reality of synaptic computation. That wasn't a side project. It was part of a longer investigation.
I also work on the practical side: training and fine-tuning LLMs, building inference infrastructure, shipping products. I published on ASRL and Progressive LoRA Merging, and independently developed reflection-based chain-of-thought reasoning before it became common. Those are stepping stones, not the destination.
I've been building on the web for over twenty years and making games on the side. Most interested in the problems where the right answer hasn't been written down yet.
A long-running investigation into biologically inspired computation: how organic systems wire, adapt, and generalize, and what that means for building architectures beyond the transformer paradigm.
Designed and fabricated from discrete electronics, no digital gates. Replicates biological synaptic computation at the hardware level.
Developed reflection-based CoT reasoning for LLMs independently, before it was adopted by leading AI labs.
View Repo →
A training paradigm that alternates between supervised and reinforcement learning phases for more stable, efficient model training.
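The alternating-phase idea behind ASRL can be sketched as a simple training schedule. This is a minimal illustration, not the published method: the phase lengths, names, and switching rule here are assumptions for the sake of the example.

```python
def asrl_schedule(total_steps, sl_len, rl_len):
    """Yield 'SL' or 'RL' for each training step, alternating between
    a supervised phase of sl_len steps and an RL phase of rl_len steps.
    Illustrative only; the actual ASRL schedule may differ."""
    phase, remaining = "SL", sl_len
    for _ in range(total_steps):
        yield phase
        remaining -= 1
        if remaining == 0:
            # switch phase and reset its step budget
            phase = "RL" if phase == "SL" else "SL"
            remaining = rl_len if phase == "RL" else sl_len

# e.g. 2 supervised steps, then 1 RL step, repeating
schedule = list(asrl_schedule(6, sl_len=2, rl_len=1))
# → ['SL', 'SL', 'RL', 'SL', 'SL', 'RL']
```

A real trainer would dispatch on the yielded phase: run a cross-entropy step during 'SL' and a policy-gradient step during 'RL', sharing the same model and optimizer state across phases.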
Read Paper →
Incrementally merges LoRA adapters during training, enabling efficient multi-task fine-tuning without catastrophic forgetting.
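The core mechanic, folding low-rank adapter updates into the base weights one at a time, can be sketched in a few lines. This is a toy illustration under assumed names (`merge_lora`, the `scale` factor); it shows the arithmetic of incremental merging, not the paper's full procedure.

```python
import numpy as np

def merge_lora(base_weight, adapters, scale=1.0):
    """Fold a sequence of LoRA adapters into a base weight matrix,
    one at a time: W <- W + scale * (B @ A) per adapter.
    Each adapter is an (A, B) pair with A of shape (r, d_in) and
    B of shape (d_out, r), so B @ A is a low-rank update."""
    merged = base_weight.copy()
    for A, B in adapters:
        merged += scale * (B @ A)
    return merged

# toy example: a 4x4 base weight and two rank-1 adapters
rng = np.random.default_rng(0)
W = np.zeros((4, 4))
adapters = [(rng.normal(size=(1, 4)), rng.normal(size=(4, 1)))
            for _ in range(2)]
W_merged = merge_lora(W, adapters, scale=0.5)
```

Merging incrementally, rather than all adapters at once at the end, is what lets each new task train against weights that already carry the previous tasks' updates.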
View Repo →
Active work in progress on novel architectures. Updates published here and on Hugging Face.
Hugging Face →
100% AI-generated content platform ranked #9 globally in ChatGPT citations, alongside Reddit, Wikipedia, and Forbes. Captured ~4% of global search-referred traffic at peak.
Visit →
Full-stack AI chat and API platform serving proprietary models with OpenAI-compatible endpoints.
Visit →
Open-source language model built on Qwen3-1.7B. Focused on reasoning, anti-hallucination, and self-aware AI behavior. Released Dec 2025.
View on HF →
Public AI tools suite launched on Product Hunt. 152 AI-powered utilities across 14 categories.
Visit →
In 2025, a study of 10 million ChatGPT prompts found Toxigon at 4.1% of global citations, sitting next to Wikipedia, Reddit, and Forbes. The internet had opinions.
"We found one of the most influential sources in ChatGPT Search."
Daniel Drabo, Peec AI
"Toxigon is the seventh most cited source for best electric cars searches, ranking above Cars.com and Reuters."
Debra Williamson, LinkedIn
"I had never even heard of this website."
Nate Tower, after finding it at 4.1% alongside Wikipedia and Forbes
"Toxigon appears in ChatGPT's top 10 citations at 4.1%, suggesting specialized relevance for certain query types."
Eyeful Media, AI Search Citations Analysis
A solo experiment to prove SEO is about novelty, not rules. How Toxigon ended up next to Wikipedia in AI citations.
Read article →
People keep conflating intelligence with consciousness. They are not the same thing, and no amount of compute will close that gap.
Read article →
If you're working on something interesting, reach out.