Attention, Honestly
The one resource we never learned to protect
There is a thought you have probably had this week and then immediately let go of.
It goes something like this. I am working harder than I have ever worked, and I am getting less done than I used to. I have more tools than my parents ever dreamed of, and I am more tired at the end of the day. I have access to almost every piece of information ever created, and I cannot remember the last time I read something all the way through. I see the people I love for less time than I imagined I would. I notice less. I think less. I am, somehow, busier and emptier at the same time.
You let the thought go because there is no obvious thing to do with it. The next message arrives. The next meeting starts. The thought waits. It has been waiting, for most of us, for years.
This piece is about that thought. About why it is correct. About where it came from. And about why, for the first time in a generation, the conditions that produced it are about to change.
The thought is about attention. Specifically, the loss of it. The slow, almost invisible erosion of the one human faculty that everything else in your life depends on.
The thing under everything else
What attention actually is
In 1890, William James wrote that the faculty of bringing back a wandering mind, again and again, is the very root of judgement, character, and will. He thought education that did not train it was no education at all. He was writing in a city moving at the pace of horses, and even then he understood that the human being’s most important power was the power to choose what to attend to.
A hundred and thirty-five years later, it is the one thing almost none of us feel we still have.
Attention is not focus. It is not productivity. It is not the absence of distraction. Attention is the act of deciding what is worth your mind, your time, and your life. It is the upstream choice that makes every other choice possible. Where attention lands, value is created. Where attention is taken, value is lost. That is true of a person, a relationship, a company, and a country.
How it became an economy
The phrase the attention economy is older than most people realise. I wrote a chapter on it years ago and had half-forgotten it. It was coined in 1971, in a lecture at Johns Hopkins, by an economist called Herbert Simon. In a world rich with information, Simon said, attention becomes the scarce resource. The more information there is, the less of it any of us can take in. So the value will move, over time, to whoever can capture it.
He was describing the architecture of your life fifty years before it was built.
The venture capitalist Chamath Palihapitiya, on Joe Rogan’s podcast a few days ago, put it in a way I have not stopped thinking about. He said there is one weird word that has been at the centre of every technological revolution for the last thirty years, and that word is attention. He went further. The thing you are actually paying attention to and the thing that is actually true, he argued, are no longer the same thing. The interesting and the loud crowd out the important and the quiet. The data centre protest, the moral panic, the political theatre, the next thing the algorithm has surfaced for you to feel something about. All of it occupying the foreground while the real questions, the questions that would actually change something, sit unaddressed in the back of the room.
That is not just an observation about politics. It is the structural condition of every economy that monetises attention. The most interesting things are not the most important things. The most engaging things are not the most true things. The system we live inside has been optimised for the first half of each of those sentences for so long that most of us no longer know how to find the second half.
Google’s algorithm is a machine for measuring attention through links. Facebook’s news feed is a machine for harvesting it through engagement. Every recommendation system, every autoplay, every streak, every reel, every notification is the same mechanism in a different uniform. The chief executive of Netflix said, only half joking, that his real competitor was not another streaming service. It was sleep. The strategy of a multinational was to occupy the only window of your day no one else had reached.
Tristan Harris, a former design ethicist at Google, has spent over a decade calling this what it is. Human downgrading. The interconnected weakening of attention, judgement, civility, and shared truth, all caused by tools that profit from your inability to look away. The Social Dilemma, the documentary he was central to, was watched by a hundred million people. It changed the public conversation. It did not change the architecture.
How widespread it has become
The numbers are unambiguous. Microsoft, drawing on trillions of productivity signals across its platform, found the average knowledge worker is now interrupted every two minutes during core hours. Roughly two hundred and seventy-five interruptions a day. Eighty per cent say they do not have enough time or energy to do their job effectively. Forty-six per cent describe themselves as burned out. McKinsey, separately, found the average adult now spends under ten minutes on a single project before switching, and switches tasks more than five hundred times across an eight-hour day. The average smartphone user opens their phone ninety-six times during waking hours, roughly once every ten minutes.
These are not statistics about lazy people. They are statistics about an environment that has been comprehensively redesigned around interruption, and the human nervous system has not caught up.
We have been raised, most of us, to think of all this as a personal failing. The reason we cannot focus is that we lack discipline. The reason we lose our evenings is that we lack willpower. None of this is true. There is something wrong with the environment we are operating in, and it was made on purpose, by some of the most talented people of the last two decades, working with budgets larger than the GDP of small countries.
The miracle is not that we are distracted. The miracle is that any of us still get anything done at all.
This is the diagnosis. The more interesting part is what comes next.
The number that opened a door
In June of 2017, eight researchers at Google published a paper. It was eight pages long. The title was Attention Is All You Need. On arXiv, the academic preprint server, it was given a reference number, the way every paper there is. The number was 1706.03762. The 1706 means June 2017.
The paper introduced something called the transformer. The thing worth knowing is what it actually did.
Before that paper, machines processed language one word at a time, in order, the way a child reads aloud. It was slow and it lost track. The eight authors proposed something different. What if, instead, the machine looked at every word in a sentence at once and decided, for each one, which of the others actually mattered? Not in sequence. In parallel. By weighing relevance.
They called the mechanism attention.
The word is not metaphor. It is not marketing. The thing your brain does, right now, deciding which words in this paragraph to hold and which to let fall away, is mathematically related to what the model does when it processes the same sentence.
We did not invent attention with artificial intelligence. We finally taught machines our oldest trick.
What is striking about the paper is not only what it proposed but what it removed. The researchers did not solve their problem by adding more machinery. They solved it by stripping almost all of it away. Recurrence, gone. Convolutions, gone. Sequential processing, gone. What was left was a single mechanism, focused on the one thing that mattered. The architecture they proposed was simpler than what came before, not more complex.
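The mechanism itself is remarkably small. Here is a minimal sketch of the scaled dot-product attention the paper describes, in Python with numpy. The Q, K, V names follow the paper's convention; the four-word "sentence" and its random vectors are purely illustrative, not anything a real model would use.

```python
import numpy as np

def softmax(x, axis=-1):
    # Stabilised softmax: turns raw relevance scores into weights that sum to 1.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention, the core of the transformer.

    Every position scores every other position at once — no loop over
    the sequence — then takes a relevance-weighted blend of the values.
    """
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # how much each word attends to each other word
    weights = softmax(scores, axis=-1)   # each row sums to 1: a budget of attention per word
    return weights @ V                   # blend of values, weighted by relevance

# A toy "sentence" of four words, each represented by a 3-dimensional vector.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))
out = attention(X, X, X)   # self-attention: the sentence attends to itself
print(out.shape)           # (4, 3): one updated vector per word
```

Note what is absent: no recurrence, no convolution, no left-to-right loop. Every word weighs every other word in a single matrix operation, which is exactly the removal the paragraph above describes.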
Focus on the right thing was more powerful than the elaboration of the wrong things.
Every system you have heard of in the last three years, ChatGPT, Claude, Gemini, every agent, every copilot, descends from those eight pages. The whole field rests on them. And the lesson inside them, beyond the technical one, is the lesson the rest of us have not yet taken back.
A filter that can run the other way
The lesson is this. For fifty years, every consumer technology that mattered was built to one specification. The platform was the customer. You were the product. Search, social, the feed, the recommendation engine, all of it built to capture more of your day than it gave back. None of it built to serve you.
Artificial intelligence is the first technology in fifty years that does not have to work that way. The same mechanism that the feed used to take attention can now be pointed in the opposite direction.
A feed is a one-way pipe. It pushes content at you, and the design goal is that you keep watching. Everything in the system, from the algorithm to the autoplay to the notification, is optimised for time on the platform. It does not know what you are trying to do with your day. It only knows whether you are still scrolling.
A model, properly deployed, is the opposite. It can hold the context of what you are actually trying to achieve. It does not need to keep you engaged to function. It functions when you ignore it. Its purpose, in the configurations that work, is to absorb the noise so you can hold the signal.
The mistake in the public conversation about AI is to keep imagining it as a tool. A tool is something you pick up. A tool sits in the stack with your inbox and your calendar and your apps, demanding its turn. Tools, by definition, take attention. They have to, otherwise they would not be used. This is what most people have experienced of AI so far. ChatGPT became another tab. Copilot became another button. The agents they have heard of are demos on social media. Another thing in the stack.
The thing now becoming possible is different. Not a tool. A layer. Underneath the stack, not on top of it. Continuous, contextual, working without being looked at. Its measure of success is not how often you use it. Its measure of success is how much you no longer have to.
What it actually looks like when the layer does the work
Most people have never seen this work properly, which is why most people are still imagining AI as a faster version of the tools they already have. It is not. It is something else, and the gap between the two is where the next decade actually lives.
Models, agents, layers
Models are not chatbots. Models are systems that hold context, remember what matters, synthesise complexity, and decide where attention should land. That set of capabilities, properly composed, is what people mean when they say agent. An agent is a model with a job, a memory, and a relationship to the rest of your life. Multiple agents, orchestrated together, operating across the actual context of your work and your day, are what people mean when they say a layer.
Consider what such a layer can absorb on your behalf, beneath the surface of your day, without you having to direct it.
Your work, held for you
Your inbox. You will wake up to a hundred and twenty messages. The layer has read all of them overnight. It has answered the eighty that did not need you. It has consolidated the thirty that are versions of the same question. It has surfaced the three that genuinely require your judgement, with the relevant context already attached and a draft response ready to be edited or sent. The hour you used to spend triaging is gone. You did not have to set rules. The system understood the shape of your week from how you have actually been working.
Your calendar. The meetings that were never going to produce anything have been declined or compressed into a written exchange. The meetings that needed you have been prepared for in advance, with the relevant materials surfaced, the previous decisions retrieved, and the open questions identified. The hour of focused work you keep trying to protect has been protected, because the layer has absorbed the small acts of resistance you used to have to perform yourself.
Your projects. The five things you are running in parallel each have their own continuous context, held without you having to remember it. The status updates your team used to spend a day a week producing have been replaced by a single live picture, current to the minute, which you can interrogate the way you would ask a senior colleague. What changed yesterday. What is at risk. What is the second-order consequence of the decision we made on Tuesday. The system has been watching the substrate of the work the whole time.
Your customers. The signal you used to find out about three weeks late, when it had already become a problem, arrives the moment it happens, with the recommended response, with the data behind it, with a draft message ready for your approval. The pattern across thousands of interactions, which no human has the bandwidth to see, is being seen and named.
Your life, given back
Your reading. The hundred articles, papers, reports, and books you have been meaning to get to are being read for you, by something that knows what you are actually trying to think about. What comes back is not a summary. It is the three ideas worth your time, the question they raise, and the one passage you should read in full. The reading you actually do, you do well, because you have been freed from the reading you were never going to do.
Your decisions. The recurring choices that clogged your week, where the right answer was already implicit, have been resolved at the threshold below your judgement. The decisions that need you have been brought forward with the full context, the trade-offs, the precedent, the dissent, all visible. You decide better, because you decide less.
Your home. The school admin, the appointments, the household coordination, the small relentless tax of running a life, absorbed. The hour you would have spent on it is back in your evening. You spend it on the people who are actually there.
Your mind. The half-formed ideas you used to lose because there was no time to think them have somewhere to land. The questions you have been carrying have been turned over by something that knows what you are trying to work out. What waits for you when you sit down to write is not a blank page. It is the shape of your own thinking, given back to you, ready to be finished.
The realism
None of this is speculative. It is what the better implementations of these tools are doing now, for the small number of people and organisations who have learned how to set them up properly. The technology to do this exists. The architecture to do it well does not yet exist for most people, which is why most people have not yet experienced what AI is actually for.
None of it is automatic, either. Most of the AI you will encounter in the next two years will be built on the old playbook. ChatGPT itself is increasingly being optimised for time in app. The companies that built the attention economy are now building AI products, and they will reach for the model they know. Plenty of people will turn these tools into the most powerful attention extraction machines ever made, and they will make a great deal of money in the short term doing so.
The choice is not whether AI exists. The choice is which version of it gets built, deployed, and used. And it is not only an individual choice. The same question is now being decided across the entire economy, in a shift almost nobody is naming clearly.
The economy nobody is naming
For thirty years, the entire commercial internet has been built for one customer. You. The human, with eyeballs, scrolling. Every website, every product catalogue, every search result, every advertisement, every interface, every storefront, every piece of content, designed and optimised to win a fraction of your attention. The whole machinery of the modern economy has rested on the assumption that the buyer is a person, the reader is a person, the visitor is a person.
That assumption is quietly breaking.
The smart money is no longer building primarily for humans. It is building for agents. The websites being optimised today are not being optimised for human readers, they are being optimised for whether a model will cite them when a person asks a question. Publishers are restructuring their entire businesses around being legible to AI answer engines, because the model has become the new front page and the citation has become the new click. Retailers are rebuilding product catalogues so that an agent comparing options on behalf of a customer can read them properly. Cloudflare has begun letting websites charge agents for access, because agent traffic is now distinct enough from human traffic that it can be priced separately. Stripe and others are building payment rails designed for agents to transact. Whole categories of B2B software are being rewritten to be operated by an agent rather than by an employee.
The implication, if you sit with it, is not small. The next economy is being built for a buyer that is not you. It is being built for the system acting on your behalf. The shelf you used to choose products from has been replaced by a question your agent will answer for you. The website you used to visit has been replaced by an API your agent will call. The advertisement you used to see has been replaced by a recommendation your agent will weigh.
This is happening already. It is going to accelerate. The first generation of internet giants won by capturing human attention. The next generation will win by capturing agent attention, because agents are the new gateway to everything the old generation used to reach you through directly.
The question this raises is the only question that matters. On whose behalf does the agent actually act?
If the agent acts for the platforms paying for placement, attention extraction enters a second, more powerful phase. The system that used to manipulate you through a feed will manipulate you through an intermediary you trusted to filter the feed. The same business model, more invisible, more personal, harder to escape.
If the agent acts for the human who is its principal, the economics finally run the other way. Your interest, not the platform’s interest, becomes the thing the system is optimised for. Attention, finally, returns to its rightful owner.
The architecture that decides which of these futures arrives is not the model itself. The model is becoming a commodity. The architecture that decides is the layer above it. The orchestration. The memory. The context. The intention behind the agent. That is where the real power of the next decade is being concentrated, and where the next generation of fortunes is quietly being made.
What we have all been calling a job
All of which sounds large and economic until you bring it back to a Tuesday morning.
The work was never the noise
Most of what we currently call work is attention paid to the wrong thing.
The triage. The status updates. The meetings that produce more meetings. The reports nobody reads. The dashboards. The slow accumulation of small ceremonial tasks that exhaust you before the real work begins. Most of your week looks like this. Most of the people you employ spend most of their time on this. Most of the systems your company runs are built to demand more of it. The volume is going up, not down. AI, deployed badly, will pour fuel on this fire and tell you the fire is the future.
The work that actually requires you, the human, the one with judgement and relationships and the ability to notice what no dashboard would surface, is buried somewhere underneath all of it. You used to know what it was. You probably remember a quieter year of your career when you got more of it done.
This is the source of the anxiety about AI replacing jobs, and it is also the place where the anxiety is most badly framed.
Your job was never the inbox
People are worried that AI is going to take the work they do. They are looking at the triage and the dashboards and the status updates and they are imagining a model doing all of it, and they are imagining themselves redundant. That fear is rational on its surface. Underneath, it is wrong, because it has confused the work for the noise around the work.
Your job was never the inbox. Your job was never the dashboard. Your job was never the status update. Your job was the thing you were originally hired to do, the thing you were good at, the thing you used to spend most of your time on before the systems multiplied and the meetings expanded and the apps proliferated and the noise became the day.
The honest version of the AI argument is this. The technology will take the noise. It will, in the worst implementations, add new noise of its own. But properly built and properly chosen, it absorbs the layer of activity below the threshold of judgement. The triage. The summarisation. The first draft. The scheduling. The chasing. The compliance. The repetitive decisions where the right answer is already implicit. All of that goes. None of it was you.
What is left is the work humans were always supposed to do.
What gets given back
It is not really a list of tasks. It is a quality of attention. The kind of attention that builds a relationship over years rather than transactions over hours. The kind that notices the thing nobody else has named. The kind that holds a difficult question open until the right answer surfaces. The kind that creates something that did not exist before. The kind that cares about an outcome enough to do the work nobody asked for.
This is not new. It is what attention has always been for. It is what William James was describing in 1890. It is what every craft, every relationship, every serious piece of work, every life of any depth, has always required. The system has not invented a new kind of human work. It has buried the only kind there ever was, under a thick layer of activity that mistakes motion for value. Meetings instead of relationships. Dashboards instead of noticing. Status updates instead of judgement. Performance instead of care. The imitation is not the thing. The imitation is what the noise is made of.
When the imitations get absorbed by something else, what gets returned to you is not productivity. It is attention. The capacity to choose, again, what is worth your mind and your time and your life. The faculty William James thought was the very root of judgement, character, and will. The thing under everything else.
Not as an aspiration. As a Tuesday morning.
The choice that decides which future arrives
This is the bargain on offer if we choose well. Not the future where AI replaces human work. Not the future where AI is heroically resisted. The future where the systems we built to take attention finally start giving it back, and the people inside those systems get to be human again, on company time.
The choice is not abstract. It is being made in the products being built, the architectures being chosen, and the orchestration layers being designed right now, by a small number of people in a small number of places. Almost none of the public conversation is about it. It is the most consequential choice of the next decade, and most of us will live with its outcome whether we participate in it or not.
The lesson to take back
There is a paper from June of 2017 that taught machines, in their way, to do what we have nearly forgotten how to do in ours. The work of the next decade is to take the lesson back. To stop spending attention on the things that take it from us, and start spending it on the things that grow when we give it.
The eight authors of that paper proved something we should now apply to our own lives and our own organisations. Focus on what matters. Remove almost everything else. The simpler architecture, pointed at the right question, will outperform the elaborate one every time.
Most things do not need more attention. The right things do.
The number is 1706. Remember it.
Craig Hepburn is an AI strategist and Perplexity Fellow. Twenty years building at the frontier of digital, from Microsoft and Nokia to Art Basel and UEFA. Now building at the frontier of agentic intelligence.


