
Surveillance Capitalism.

Human experience claimed as raw material. Behavioural surplus refined into prediction products. Instrumentarian power as the terminal aim. The most rigorous existing account of what the platform substrate actually does.

Codex · Western Canon · ≈10 min read · Zuboff, The Age of Surveillance Capitalism · 2019
TL;DR

The platform is not selling ads. It is selling predictions about you. The ads are the delivery mechanism; the prediction product is the actual good. Your experience is the raw material; the behavioural surplus you generate while using the service is the input feedstock; the goal is not to know your beliefs but to make your behaviour reliable for whoever is paying for the prediction. This is the logic that built the contemporary platform economy — and the logic the Techno-Memetic Commons, Pañca Ṛṇa accounting, and an audit built against a different substrate are deliberately designed to refuse.

The book that gave the substrate a name

January 2019. Shoshana Zuboff, professor emerita at Harvard Business School, publishes The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Seven hundred dense pages, decades of research, the unmistakable register of a serious scholar who has decided the time for polite restraint is over. The book reframes the dominant business model of the contemporary internet — and gives it the only name that has stuck.

Zuboff's central claim is that what is happening at Google, Meta, Amazon, Microsoft, and the next generation of platform firms is not a particularly aggressive form of advertising-supported media. It is a new logic of accumulation — as historically distinct from industrial capitalism as industrial capitalism was from agrarian feudalism — that has very specific moves:

  1. Claim human experience as free raw material. Your queries, clicks, dwell times, location traces, voice recordings, photos, contacts, calendar, biometrics — everything generated by living a digital life — is collected without principled consent, on the legal theory that it was lying around free for the taking.
  2. Refine this raw material into prediction products. The data is processed into models that predict what people like you will do next — what you will click on, buy, vote for, hate, share, watch, leave, return to. Prediction is the actual product. The advertising is the delivery channel that monetises the predictions.
  3. Sell these prediction products on behavioural-futures markets. Advertisers buy access to people-like-you-who-will-do-X-next. Political campaigns buy access to people-who-could-be-persuaded. Insurers buy access to people-likely-to-claim. Employers buy access to people-likely-to-quit. The buyer is buying a guarantee on future behaviour.
  4. Tilt the environment to make the predicted behaviour more likely. A guaranteed prediction is worth more than a probabilistic one. So the recommendation, the notification, the feed, the search ranking, the price, the timing — all get adjusted to push the behaviour toward what was predicted. The prediction stops being passive observation and becomes active intervention. This is the move Zuboff calls economies of action, and it is what makes surveillance capitalism categorically different from prior data businesses.
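The four moves can be caricatured as a data pipeline. What follows is a purely illustrative toy sketch, not drawn from any real platform's code — every name, field, and threshold here is invented for the illustration:

```python
# Move 1: claim experience as raw material. Only `query` is needed to
# serve the user; everything else is behavioural surplus.
events = [
    {"query": "shoes", "dwell_ms": 4200, "clicked": True,  "hour": 22},
    {"query": "shoes", "dwell_ms": 900,  "clicked": False, "hour": 9},
    {"query": "rent",  "dwell_ms": 8000, "clicked": True,  "hour": 23},
]

# Move 2: refine the surplus into a prediction product — here, a crude
# click-rate model keyed on late-night browsing.
def click_rate(events, late=True):
    sample = [e for e in events if (e["hour"] >= 21) == late]
    return sum(e["clicked"] for e in sample) / max(len(sample), 1)

# Move 3: package the prediction for sale. The buyer is purchasing a
# claim on a behavioural future, not an impression.
prediction = {"segment": "late-night", "p_click": click_rate(events)}

# Move 4: economies of action — tilt the environment so the predicted
# behaviour becomes more likely (e.g. time delivery to the segment).
def schedule_ad(prediction):
    return "22:00" if prediction["segment"] == "late-night" else "12:00"

print(prediction["p_click"], schedule_ad(prediction))
```

The point of the caricature is the asymmetry it makes visible: nothing in moves 2–4 is required to deliver the service the user thinks they are using; the entire pipeline runs on data the service does not need.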

Behavioural surplus — the founding move

The historical inflection point in Zuboff's account is the early 2000s, in the wake of the dot-com crash, at a struggling startup called Google. Google had a great search product and no business model. Then the founders noticed that user interaction with search generated enormous amounts of incidental data — not just what people searched for, but how they searched, how they corrected typos, how long they dwelled, where they clicked. This data was not necessary to deliver the search service, which worked fine without it. But the data turned out to predict, with increasing accuracy, what advertising people would click on.

Zuboff calls this surplus data behavioural surplus. The word "surplus" is deliberate — an echo of Marx, but applied to a new substrate. In industrial capitalism, surplus is the value labour generates beyond what is paid in wages, captured by the owner of capital. In surveillance capitalism, surplus is the behavioural data generated beyond what the service needs, captured by the owner of the platform. The discovery that this surplus could be refined into prediction products and sold on a market was, in Zuboff's history, the founding move of the new logic of accumulation. Google's discovery became Meta's foundational design; Meta's discovery became the default playbook for the contemporary platform industry.

Crucially, behavioural surplus was extracted before there was any social, legal, or political framework for objecting to its extraction. There was no debate. There was no legislation. There was no public decision. The substrate was claimed and refined for years before most of the people generating it had any idea it existed. Zuboff calls this dispossession by surveillance — the twenty-first-century analogue of the enclosure of the commons that built early modern capitalism. Polanyi would have recognised the move.

The product is not the advertisement. The product is the prediction. You are not the customer. You are not the product. You are the raw material.

Instrumentarian power

The political shape of surveillance capitalism is what Zuboff calls instrumentarian power, and her distinction between it and totalitarian power is one of the most important moves in the book.

Totalitarian power wants your soul. It requires belief, fear, conversion, terror. It is interested in what you think. It rules by changing your mind from the inside, and where it cannot, it eliminates you. Twentieth-century totalitarianism — fascist, Stalinist, Maoist — was structurally about the interior of the human being.

Instrumentarian power is interested in none of this. It does not care about your beliefs, your values, your inner life. It cares about your behaviour — whether you click, buy, vote, comply, return. It rules by adjusting the choice architecture of your environment until your behaviour becomes reliable for whoever is paying for the prediction. You may hate the system. You may believe it is destroying democracy. You may delete the app. None of that matters as long as the aggregate behaviour of the population it serves remains predictable.

This is a categorically different kind of power, and the political vocabulary inherited from the twentieth century does not handle it well. There is no Stalin to overthrow. There is no Hitler to denounce. There are quarterly earnings reports and benevolent-sounding mission statements and a steadily eroding ability of populations to act in concert against systems whose mechanism is precisely to disable collective action by making each individual's behaviour a private negotiation with an opaque algorithm.

The right to the future tense

Zuboff's ethical centre is what she calls the right to the future tense — the human capacity to project, to plan, to commit, to surprise, to intend. This capacity is what surveillance capitalism principally extracts and quietly forecloses. Once your behaviour has been predicted with high accuracy and the environment has been tilted to fulfil the prediction, the space in which you might have done otherwise has been structurally narrowed — not by force, but by the steady erosion of the conditions under which an unpredictable choice was possible.

This is the deepest ethical bite of the book. The argument is not that surveillance capitalism makes you less free in some abstract philosophical sense. The argument is that the specific human capacity to author your own next move — to intend a thing and then do it, in surprise as much as in consistency — is the substrate the system is grinding down for raw material. Put sharply: the future tense is what is being mined.

The neighbours — and where they thicken

Zuboff is the most-cited contemporary voice on the substrate, but the diagnosis sits inside a broader conversation. The neighbours to know:

  • Jaron Lanier, Who Owns the Future: siren servers are central platforms that aggregate data while distributing risk. The information economy concentrates value at the centre while pushing precarity to the periphery. Lanier proposes a dignity-based information economy with micropayments and data dividends — under-discussed but a serious operational proposal.
  • Tim Wu, The Attention Merchants: a media-history-of-attention account of how human attention became a commodity, from the penny press through television to the contemporary feed. The historical sweep is useful; Zuboff's analytical depth is greater.
  • Byung-Chul Han, Psychopolitics: data-driven optimisation is not liberation but a deeper form of capture, because it replaces the disciplinary "you must" with the achievement-subject's "I can." You exploit yourself voluntarily in service of metrics whose ultimate beneficiary is not you. Han's writing is more lyrical and less empirical than Zuboff's; the diagnosis is the same.
  • Cory Doctorow, enshittification: the operational consequence of surveillance capitalism for the user experience — the platform's ability to twiddle the dials, hide behind opacity, and extract from locked-in users is precisely what behavioural-surplus extraction enables.
  • Karl Polanyi, fictitious commodities: the structural ancestor of Zuboff's analysis. The forcing of behavioural data into commodity form is a Polanyian move — taking something that was never produced for sale, defining property rights over it, building a market around it, and watching the social fabric reorganise around the extraction.

AI as the surveillance-capitalism multiplier

The five years since Zuboff's book have done one large thing to the analysis: the rise of generative AI has dramatically multiplied the capacity of the surveillance-capitalism stack. The same firms that pioneered behavioural-surplus extraction now own the most capable models. The training corpora for those models were built by scraping the world's textual, visual, and behavioural commons — at a scale and opacity that makes the early Google data grab look quaint. The prediction products are now far more capable. The economies of action are far more granular. The instrumentarian power is correspondingly larger.

The AI is the Audit essay in this Codex sits exactly on this hinge. The same technology, on a different substrate, performs a different audit. AI built and deployed by firms whose principal revenue model is behavioural-surplus extraction is structurally going to audit human experience for what is extractable. AI built and deployed against a Pañca-Ṛṇa ledger, on a Techno-Memetic Commons substrate, on federated rails — would audit for what is owed, what is owed back, what is being neglected. The choice is architectural, not technological, and the window in which it can be made is narrowing.

The Indic counter-frame — Manuṣya Ṛṇa and the commons

The Pañca Ṛṇa frame answers surveillance capitalism at the level of the substrate. Manuṣya Ṛṇa — the debt to fellow humans — names the extraction of behavioural surplus from one's contemporaries as a debt that has to be ledgered, not a free input. The platform that captures your dwell time is incurring an obligation, whether or not its capital structure recognises that obligation. The frame surfaces the obligation and forces the question of how it will be discharged.
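What would it mean, mechanically, to ledger the obligation rather than treat the surplus as a free input? A minimal hypothetical sketch — this is an invented data structure for illustration, not a specification from any Pañca Ṛṇa or TMC document:

```python
from dataclasses import dataclass, field

@dataclass
class LedgerEntry:
    debtor: str            # who extracted (the platform)
    creditor: str          # who generated the surplus
    basis: str             # what was taken, e.g. "dwell_time"
    amount: float          # accrued obligation, in whatever unit is chosen
    discharged: float = 0.0

@dataclass
class ManusyaRnaLedger:
    entries: list = field(default_factory=list)

    def accrue(self, debtor, creditor, basis, amount):
        """Extraction creates a debt entry instead of a free input."""
        self.entries.append(LedgerEntry(debtor, creditor, basis, amount))

    def outstanding(self, debtor):
        """Total undischarged obligation held against a debtor."""
        return sum(e.amount - e.discharged
                   for e in self.entries if e.debtor == debtor)

ledger = ManusyaRnaLedger()
ledger.accrue("platform", "user-7", "dwell_time", 3.5)
print(ledger.outstanding("platform"))  # 3.5
```

The unit of account and the discharge mechanism are exactly the open design questions the frame forces; the sketch only shows that "surfacing the obligation" is an accounting operation, not a metaphor.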

The Techno-Memetic Commons licence is the legal-engineering counterpart. The licence makes closed-source enclosure of commons-built infrastructure structurally illegal under the commons terms, while permitting the legitimate commercial use that funds the commons. Surveillance capitalism cannot operate on a substrate it cannot enclose. The TMC licence is one move that takes the enclosure off the table.

The federated unicorn architecture is the capital-structure counterpart. Where the surveillance-capitalism playbook requires concentration — one platform, one data layer, one shareholder structure — the federated unicorn distributes the same economic arithmetic across 10,000 federated proprietors. The aggregation of behavioural surplus into a single trillion-dollar prediction-product business depends on the existence of a single firm large enough to do the aggregating. Take the firm apart and the substrate of the playbook is taken apart with it.

What to do with this

Three operating heuristics for builders, funders, and policymakers in 2026:

  1. Notice when you are designing for behavioural surplus. If the principal monetisation depends on extracting and refining data that the service does not need to function, you are building a surveillance-capitalism business — under whatever marketing language is attached. The honest move is either to redesign the monetisation or to be precise about what is being extracted and who is paying the cost.
  2. Build on substrates that resist enclosure. Open protocols, federated architectures, commons-licensed code, member-owned cooperatives, public-good infrastructure. The substrate that cannot be enclosed cannot host the surveillance-capitalism playbook at scale. See the TMC licence for the legal engineering.
  3. Make the obligation explicit. Even within the existing playbook, naming the obligation surfaces the question. Stewardship Marks, data dignity contracts, dignity-based microeconomies, Pañca-Ṛṇa-shaped impact reporting. The audit is coming. The question is what it will be built against.

Quick answers

Is surveillance capitalism a moral category or an economic one?
Zuboff insists it is both — and that mixing them is exactly the point. It is a specific logic of accumulation (an economic category) that has specific moral and political consequences (an ethical category) for the substrate it operates on (humans). Trying to address the economics without the ethics misses the substrate; trying to address the ethics without the economics misses the structural mechanism. The Codex takes the same view.
Doesn't GDPR fix the problem?
It addresses one surface of it. GDPR raises the cost of certain extraction practices, requires consent flows that nominally constrain collection, and creates some right of access and deletion. The deeper mechanism — claiming behavioural surplus as raw material, refining it into prediction products, selling those products on behavioural-futures markets — operates largely undisturbed beneath the compliance layer, because the consent flows are designed to be granted and the prediction products do not require identifiable data to function. GDPR is necessary; it is not nearly sufficient.
Is the open-source movement enough to resist surveillance capitalism?
Necessary, not sufficient. The MIT-licensed library running inside a surveillance-capitalism stack does not stop the stack from operating. The reason the Techno-Memetic Commons licence exists is precisely that conventional open-source licences do not constrain downstream business models, and surveillance capitalism has been the principal beneficiary of permissive licensing at the substrate layer.
Where else should I read?
Zuboff's book is canonical. Jaron Lanier's Who Owns the Future is the cleanest constructive-proposal companion. Tim Wu's The Attention Merchants for the historical sweep. Byung-Chul Han's Psychopolitics for the phenomenological depth. Maria Farrell and Ian Brown's recent The Internet of Things and the Right to Repair for the hardware angle. And inside this Codex, the AI is the Audit essay sits exactly on the surveillance-capitalism/AI hinge.

Building against the audit?

If you're designing platforms, AI systems, or commons infrastructure on substrates that the surveillance-capitalism playbook cannot uniformly traverse — write in. That is the substrate the studio is working on.