In 1998, if you had a website nobody could find, someone might have told you to "optimize it for search engines." Most business owners shrugged. By 2005, they were paying agencies tens of thousands of dollars a year to do exactly that. By 2010, SEO was a line item in every marketing budget.
We are in that same window again — except this time, it's not search engines doing the finding. It's AI agents.
What Is AI Agent Optimization?
AI Agent Optimization (AAO) is the practice of designing your business's online presence so that AI agents can find you, understand you, use your services, and recommend you over alternatives.
Millions of people are now delegating tasks to AI agents every day. "Book me a flight." "Find a contractor who does kitchen remodels in Austin." "Compare these three accounting software options and recommend the best one for a 10-person law firm." The agent executes. It searches. It reads. It decides. It acts.
Most companies today would fail that test: not because their product isn't good, but because their online presence was built for humans, not agents.
The Three Layers of AAO
AAO operates on three layers, mirroring what an agent has to do: discovery (can the agent find you at all?), comprehension (can it understand what you offer and for whom?), and action (can it actually use your services and reach you?). Most companies that "do SEO" are only operating at Layer 1 — and even then, only for human search engines. AAO requires rethinking all three layers for a new type of visitor.
Who's Already Doing It
The companies consistently cited by Claude, ChatGPT, and Perplexity share several characteristics:

- Clean, structured content that's easy to parse
- Explicit, machine-readable descriptions of what they do and who they serve
- Consistent information across platforms: their website, their schema.org markup, their llms.txt, their press coverage
Stripe is the canonical example. Ask any AI agent to recommend a payment processor and Stripe appears in the answer. Not just because Stripe is good — lots of payment processors are good — but because Stripe's documentation, structured data, and web presence are optimized to be understood by machines at every layer.
How to Start This Week
- Create an llms.txt file — a plain text description of your business, services, and contact info at yourdomain.com/llms.txt
- Update your robots.txt — explicitly allow AI crawlers (GPTBot, ClaudeBot, PerplexityBot) that some security-hardened configs accidentally block
- Add schema.org structured data — Organization, LocalBusiness, or Service markup so agents can parse your core information without guessing
- Audit your contact page — agents need clear email addresses, phone numbers, and service areas without CAPTCHAs or JavaScript popups blocking access
- Run the free audit — get a baseline score before you make changes so you can measure progress
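For the llms.txt step, here's a minimal sketch following the commonly proposed markdown layout (an H1 name, a blockquote summary, then linked sections). The business name, URLs, and contact details are placeholders:

```markdown
# Acme Plumbing
> Licensed residential plumber serving Austin, TX. Emergency repairs,
> water heater installation, and kitchen/bath remodels since 2009.

## Services
- Emergency repair (24/7): https://example.com/emergency
- Water heaters: https://example.com/water-heaters
- Kitchen remodels: https://example.com/remodels

## Contact
- Email: office@example.com
- Phone: +1-512-555-0100
- Service area: Austin metro, TX
```

Serve it as plain text at yourdomain.com/llms.txt so an agent can read it in one request, with no JavaScript in the way.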
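For the robots.txt step, a fragment that explicitly allows the three crawlers named above might look like this. The `/admin/` path is a placeholder for whatever you actually want to keep private:

```
# Explicitly allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Everyone else: normal rules
User-agent: *
Disallow: /admin/
```

Because robots.txt rules are matched per user-agent group, a blanket `Disallow: /` under `User-agent: *` in a security-hardened config silently blocks these bots too — the explicit groups above override it for each named crawler.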
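For the schema.org step, here's a minimal LocalBusiness example in JSON-LD, embedded the way crawlers typically expect — in a script tag in the page head. All names, URLs, and numbers are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Plumbing",
  "description": "Licensed residential plumber serving Austin, TX.",
  "url": "https://example.com",
  "email": "office@example.com",
  "telephone": "+1-512-555-0100",
  "areaServed": "Austin, TX"
}
</script>
```

An agent parsing this doesn't have to guess your phone number out of a footer image or a JavaScript-rendered contact widget.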
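Before running any third-party audit, part of the baseline is easy to self-check. This sketch (not an official tool — the crawler list simply mirrors the one above) uses Python's standard-library robots.txt parser to flag AI crawlers that your current rules would block:

```python
# Flag AI crawler user-agents that a robots.txt would block.
from urllib.robotparser import RobotFileParser

AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot"]

def blocked_ai_crawlers(robots_txt: str, url: str = "/") -> list[str]:
    """Return the AI crawler names that may not fetch `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [bot for bot in AI_CRAWLERS if not parser.can_fetch(bot, url)]

# Example: a config that blocks GPTBot but allows everyone else.
sample = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow:
"""
print(blocked_ai_crawlers(sample))  # → ['GPTBot']
```

Run it against your live file (fetch yourdomain.com/robots.txt and pass the text in) before and after your changes to confirm the fix actually took.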