The web wasn't built for machines.

Browsers render pixels for human eyes. They paint colors, arrange layouts, load fonts. Every page is a visual performance, choreographed for a species that sees.

AI agents don't have eyes.


Every AI agent today borrows a human's browser. That's like using a telescope to read a book.

They spin up a full copy of Chrome - headless, bloated, fragile - and scrape the pixels away to find the meaning underneath. They parse thousands of DOM nodes to find a button. They consume megabytes to understand a paragraph.

It works the way anything works when you force a square peg through a round hole: badly, expensively, and at scale.


50,000 tokens of noise per average web page
300 MB of memory per headless browser instance
90% of what's fetched is thrown away - the benchmarks prove it

This is the cost of making intelligence pretend to be human just to read a web page.


What if there were a browser that spoke the language of intelligence?

Not a browser that renders and then explains. A browser that understands - that reads the web the way an agent thinks about it. Structure, meaning, actions. No pixels. No waste.

It's called a Semantic Object Model. Instead of a sprawling DOM tree with ten thousand nodes, you get a clean map of what's on the page: the content, the controls, the relationships. A 50,000-token page becomes 3,000 tokens of pure meaning.
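The reduction can be sketched in a few lines. The toy mapper below is purely illustrative and is not the SOM specification: the class name, node fields, and tracked tag set are all invented here. It walks an HTML fragment and keeps only roles, text, and actionable attributes, discarding everything presentational.

```python
# Illustrative sketch only: a toy reduction of an HTML fragment into a
# flat "semantic map" of content and controls. The real Semantic Object
# Model is defined by the SOM Specification; the names here are invented.
from html.parser import HTMLParser

class SemanticMapper(HTMLParser):
    """Collect headings, links, and buttons; ignore presentation."""
    def __init__(self):
        super().__init__()
        self.nodes = []    # finished semantic nodes, in document order
        self._stack = []   # tracked elements currently open

    def handle_starttag(self, tag, attrs):
        # Track only semantically meaningful elements.
        if tag in ("h1", "h2", "h3", "a", "button"):
            self._stack.append({"role": tag, "attrs": dict(attrs), "text": ""})

    def handle_data(self, data):
        # Text outside a tracked element (styles, layout filler) is dropped.
        if self._stack:
            self._stack[-1]["text"] += data.strip()

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1]["role"] == tag:
            self.nodes.append(self._stack.pop())

page = """
<html><head><style>body{color:#333}</style></head>
<body><div class="hero"><h1>Checkout</h1>
<a href="/cart">View cart</a>
<button id="pay">Pay now</button></div></body></html>
"""

mapper = SemanticMapper()
mapper.feed(page)
for node in mapper.nodes:
    print(node["role"], "-", node["text"])
```

The wrapper divs, the stylesheet, and the layout markup never make it into the map; what remains is exactly what an agent needs to read the page and act on it.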


It's open source. It's developed in a W3C Community Group. It's called Plasmate.

Built from the ground up for agents. Not a wrapper around Chrome. Not a scraping tool. A browser that was never meant for human eyes - and is better for it.

There's even a proposal for robots.txt that lets websites serve semantic content directly - a native handshake between sites and the agents that visit them. The web, redesigned for its newest readers.
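As a rough illustration only: a site opting in might advertise a machine-readable endpoint alongside its ordinary crawl rules. The `Semantic-content` directive below is invented for this sketch; the actual directive names and syntax are whatever the Robots.txt Proposal itself defines.

```
# Hypothetical robots.txt extension (illustrative, not the real syntax)
User-agent: *
Allow: /

# Invented directive: point visiting agents at a semantic endpoint
Semantic-content: /som/
```

The point of the handshake is that an agent never has to render or scrape at all: the site hands it structure directly.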

The ecosystem is already growing. Agents built with Plasmate can browse the web in a fraction of the time, at a fraction of the cost, with none of the fragility.

See what we built:
Source Code · Documentation · SOM Specification · W3C Community Group · Benchmarks · Robots.txt Proposal