The Chipotle Support Bot Farce

March 14, 2026

Chipotle's support bot can reverse a linked list.

It will never be asked to. It should never be asked to. Do you expect your cashier at Chipotle to reverse a linked list? I know a few programmers who couldn't pull it off either.
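For anyone outside the interview circuit, the task in question is tiny. A standard iterative reversal in Python might look like this (the `Node` class is included only so the sketch is self-contained):

```python
class Node:
    """A minimal singly linked list node."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def reverse(head):
    """Reverse a singly linked list in place; return the new head."""
    prev = None
    while head is not None:
        nxt = head.next   # remember the rest of the list
        head.next = prev  # point the current node backwards
        prev = head       # advance prev to the current node
        head = nxt        # step forward into the unreversed remainder
    return prev
```

Four lines of pointer shuffling. That's the whole skill.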

This is happening because the bot is backed by a large foundational model trained on general world knowledge. And somehow, general intelligence also includes the ability to write code.

I don't think that's an accident. The best-quality training data for forming a worldview comes from software engineers. Our craft is unique in that we constantly translate domains into code — and code is formal, automatable, and verifiable. Every domain we touch becomes a structured artifact. That's exactly what you want in training data. We've been generating it by accident for decades.

So will every model be able to write code, regardless of what it's supposed to do?

In a limited sense, probably. My guess is we'll end up distilling these massive foundational models into smaller, domain-specific ones. Some logical capacity will likely carry over — the idea of a linked list as a container of ordered things, for instance. But in a well-distilled model, that probably won't come with a Python implementation attached.

The farce isn't that the bot can reverse a linked list. It's that we trained something so general that it learned skills its entire job description will never require. That's just a consequence of building general intelligence and then pointing it at your customer support queue.