Welcome Letter

The consumer internet has one purpose. All of it — social media, commerce, content, education — shares a single, transparent objective: to capture, occupy, and hold as much of your attention as possible, without regard to whether that attention is healthy or not.

This was not an accident. It was the output of a machine that was tuned and optimized over thirty years. One decision, one meeting, one A/B test at a time. The question “is this good for our users?” was asked constantly, in every design meeting, by every product manager, engineer, and growth lead who ever worked in tech. But it was never what we meant (yes, me included). What we meant was: does this increase engagement? Does it increase DAU? Is it good for revenue, for our investors, for the stock? We asked the first question and answered the second, over and over, until the two became indistinguishable. It worked. Until it didn't.

But what a machine we built. The average American now has over seven hours of screen time per day, more than two of them on social media alone. The CDC found that half of all teenagers report four or more hours of daily screen time outside of school, and among that group, more than a quarter show symptoms of anxiety or depression. Gen Z is twice as likely to report feeling lonely as older generations, despite being the most connected generation in history. It is normal to see three-year-olds in restaurants with iPads and headphones. Children learn how the dopamine engine works before they understand what dopamine is. Every person reading this carries a device engineered, by some of the most talented people on earth, specifically to be impossible to put down.

Optimizing the engagement machine meant compromises. Not compromises of profit, or growth, or ROI, but of your safety, your anonymity, and your mental health. To keep your attention, we realized we needed to know you. So alongside the engagement machine, a surveillance apparatus was built to feed it. The machine needed to learn: what you read, where you clicked, how long you lingered, who you knew, what you bought, where you were, what you said. Every interaction became a data point. Every data point sharpened the model. Every sharpened model extracted more. Privacy intrusions were not a casualty of the engagement mill. They were a design input. The more the platforms knew about you, the better they could hold you. Terms of service that nobody read quietly transferred ownership of your behavior, your relationships, and your attention history to companies whose business it was to sell that picture of you to the highest bidder. It was billed as personalization. It was just an ingredient in the drug.

We also learned you'll stick around longer if we upset you. Research compiled by the Center for Humane Technology found that moral-emotional language increases engagement by 17 to 24% per charged word. So creators who want visibility learn to make people angry. The algorithm selects for outrage the same way evolution selects for traits. Not through intent, but through differential survival. You cannot moderate your way out of a system that rewards anger. You can only change what the system rewards.

The bitter irony is that our participation — our clicks, our rage, our time — is what funds and improves the machine. The yachts and politicians were bought with our collective engagement. And now, with all the information and data the engaged have provided, the same industry has engineered our replacements: artificial intelligence with all the learned manipulation tricks of the engagement mill, and the added capacity to replace the means by which many earn their livelihood.

But now the machine faces a problem it created.

The consumer internet is losing what it promised at its outset: real human connection. Automated traffic now exceeds human traffic. AI-generated content floods every platform. The engagement mill does not distinguish between a real person and a synthetic agent; it only measures behavior. So the bots multiply, and the synthetic content multiplies, and the signal degrades. The assumption that you are talking to a person has quietly, without announcement, ceased to be safe. Is that product review real? Was that article typed by a human? Is that comment part of a coordinated campaign?

The platforms with the resources to act at scale have no incentive to do so. The engagement mill runs better on a combination of synthetic and human traffic: bot content upsets us, upvotes us, spikes our dopamine, and keeps us coming back. And when it accelerates, real people will be left wondering: who is left to talk to?

Which creates an opportunity.

The same forces that broke the internet have made it possible to build something different. The cost of building and running software has been declining for two decades and shows no sign of reversing. The tooling ecosystem has already eliminated engineering functions that once required dedicated teams. And the AI that is coming for our jobs will, in all likelihood, collapse what remains of the cost of building software. We don't know exactly when, but we choose to plan for that inevitability.

At the same time, the demand for something real is already visible, if you know where to look. Not in surveys or think pieces, but in behavior. People are retreating to private Slack channels, small Discords, near-forgotten bulletin boards, and video game forums. Places that were never designed to scale, places the algorithm cannot reach. They are imperfect and fragmented, but they have something the commercial internet has largely lost: the reasonable assumption that you are talking to a person. People are not waiting to be offered an alternative. They are already building their own, with whatever is at hand.

What is missing is something purpose-built. Not one thing, but many things. Each attempting a different version of what the internet could be if profit were not the organizing principle. Technology and applications built for human benefit, funded by people who believe it is worth building, measuring success by whether it actually serves the people who use it and delivers on the original promises of the internet — to lower the hurdles to education, connection, and understanding.

If we want proof another path exists, one that delivers human content and utility outside a profit motive, look no further than Wikipedia. One of the ten most-visited sites on earth, funded entirely by donations, with no advertising and no profit motive. Imperfect as it is, it has no engagement algorithm, no rage bait, no infinite scroll, no mass bot takeover. It was built to be useful, not to hold you. We built a working model of human-centric design for an encyclopedia. The question is why we stopped there.

The time to build is now. The same greed that built the engagement mill handed us, as an unintended consequence, the tools to build the alternative. We intend to use them.

This is the Enclave Project.

A non-profit dedicated to funding experimentation and exploration in a new category of internet applications, starting from different premises and bound to different commitments. Think of it as a venture fund for human utility instead of profit.

We call these applications enclaves. An enclave has no profit motive and no advertising. It exists for a stated human purpose such as connection, education, information, or discourse. Success is measured against that purpose. It is built to actively resist the design patterns the consumer internet has spent thirty years refining: no infinite scroll, no engagement metrics, no notifications engineered to pull you back, no personalization algorithm feeding you more of what made you angry last time. It protects the privacy of its users as a first principle. It is transparent about what it is doing and why. And it treats the people using it as the point of the thing instead of the grain to feed the mill.

There will not be one enclave. There will be many, each addressing a different failure of the commercial internet. Some will work. Some will not. That is the nature of experiment, and we are honest about it. What matters is that we run them and take seriously the question of what the internet could look like if the engagement mill were not the only model on offer.

The first of these is Airlock: a private, invite-only discussion platform built around the premise that every message is written by a human being. Its architecture is not a set of features added to a conventional platform. It is the consequence of removing everything the engagement mill depends on and discovering what is left when you do. A place where the reasonable assumption that you are talking to a person is safe to make again. And one that has no interest in keeping you longer than you want to be there.

Airlock is our first attempt to show that the model is possible. The Enclave Project exists to make sure it isn't the last.
