Bad bossing and the expertise byproduct
May 4, 2026

Imagine you own a restaurant and you replace your professional “dish pit man” with a robot trained to do one thing perfectly: wash dishes. Same throughput. No overtime. No sick days. No scheduling. A clear win.

But then your Saturday night shift goes sideways and you don’t know why. Servers stack plates in a way that makes inefficient use of the dishwasher, causing downtime that adds up over the night. Cooks run out of pots and pans. Servers run out of plates. New detergent leaves spots on the glasses, and a regular complains.

Your old, human dish man knew how to avoid all of this. Nobody ever asked him, because nobody hired him to know it. He had become the foremost expert on operational workflow in your kitchen, because humans who do a job for a while accumulate understanding of the job whether you pay them for it or not.

In both cases, you bought dishwashing. First with a human, then with a robot. With the human, you were also getting, for free, a kitchen sensor network with opinions. You didn’t know it was part of the deal, so you didn’t notice when you cut it out of the deal.

I adapted that dishwashing example from Cory Doctorow’s post on “process knowledge”: https://pluralistic.net/2026/04/08/process-knowledge-vs-bosses/

Here’s how I see this: Human effort produces an output as its product and expertise as its byproduct. Bad bosses pay for the product and pocket the byproduct without noticing. AI work produces the output and nothing else. No byproduct. No extra dividends for the boss and the company.

AI vibing is appealing to bad bosses, but it poses a risk to companies that lean hard into bad bossonomics.

In Doctorow’s post, he also talks about Marisol, a receptionist who was shown the near-final design for a new medical facility and spotted, in three seconds, that the reception desk couldn’t see the waiting room. The CEO and the architect had been working on the design for months. They were experts. She was the receptionist. She was also the one who had spent 40 hours a week for years actually watching a waiting room, and that byproduct – accumulated, invisible, unpaid-for – was worth more than the architect’s fee. She had the expertise that mattered.

Doctorow frames this as knowledge. We can look at it through the lens of decisions. Every task described in human language is ambiguous. “Design the building” as a prompt contains an enormous number of decisions that aren’t in the sentence. “Wash the dishes” contains more than you’d think. When you delegate a task, you’re transferring agency to make those decisions. Your prompt sets a ceiling on how much of your intent can show up in the output, and everything above that ceiling has to come from somewhere else.

Doctorow has called this out as communicative intent: https://pluralistic.net/2025/03/25/communicative-intent/

With a human worker, the missing-intent gap gets filled in by the expertise byproduct. The dishwasher makes a thousand decisions based on expertise you never asked about and didn’t know he had. Marisol would have made the sightline decision differently than the architect did because her byproduct contained information the architect’s didn’t.

Bad bosses have always been bad at seeing byproducts. They see salaries paid for outputs, and they see meetings and schmoozing and hallway conversations as overhead – the tax you pay to get the outputs produced.
They think their prompt – their mission statement, their strategy one-pager, their product spec – contained all the answers to every decision, and workers are just unpacking it, brainlessly contributing nothing to the equation.

Some employees are stubbornly hard to automate away, though. The New York Times had an article about how the people hardest to replace with AI right now are the ones who go to a lot of meetings: https://www.nytimes.com/2026/04/15/business/ai-jobs-human-work.html

A “fractional executive” named Dan Sirk has been working two jobs with AI’s help and might add a third, but three is the ceiling: “there are still human relationships.”

The article has a vaguely reassuring angle: Meetings and meatspace tasks will save us. Heh. ::sweat-smile::

It actually illustrates the same structural problem that causes companies to overlook value, but the article gets distracted by the job security angle. Meetings are where byproducts get harvested – where the dishwasher mentions the need to stack dishes efficiently, where the receptionist sees the blueprint and calls out the problem before it becomes expensive to fix, where the thing one person knows from doing things becomes a thing the organization knows and can act on.

Bad bosses protect meetings not because they finally understand byproducts, but because meetings are the one place byproducts become visible enough to mistake for product. Everything that happens outside the meeting – the reason the meeting has any substance – remains invisible and remains undervalued. If people aren’t out there doing things, they won’t have much to bring to the meeting room.

Which brings us to the self-own. Bad bosses have always wanted to pay only for product. Everything else – including the byproduct – was overhead they wished they could cut. Historically they couldn’t, because the byproduct came bundled with the worker and you couldn’t unbundle them.

AI finally unbundles. You can now buy pure product – the cleaned dishes, the generated marketing copy, the drafted brief – with no byproduct attached, at a price that reflects only the product. The bad boss’s instinct says: finally. The bad boss’s instinct is wrong in a way that’s going to be very hard to recover from, because of the next piece of this puzzle.

The New York Times Magazine published an article with a headline that basically says it all: “We Don’t Really Know How A.I. Works. That’s a Problem.” https://www.nytimes.com/2026/04/15/magazine/ai-black-box-interpretability-research.html

It contrasts Deep Blue, which was good at chess because humans programmed every decision into it, with AlexNet, which was good at recognizing images because it had been trained in ways no one can fully inspect. Deep Blue was delegation to a mechanism: all the decisions were pre-made by humans, then executed fast. AlexNet and modern LLMs are delegation to agents: decisions get made during execution, by processes that even the people who built the system don’t understand.

Map that onto the byproduct idea. When a human makes invisible decisions while doing their job, the invisibility is a failure of the boss to look. The dishwasher can be asked. Marisol can be consulted. The byproduct is always, in principle, recoverable. Good bosses do that: they harvest it, promote the person who has the most of it, move expertise to where it’s needed. Healthy organizations circulate byproduct.

When AI makes invisible decisions, the invisibility is built in. There is no dishwasher to ask. There is no Marisol to consult.
You can’t promote a dishwashing AI to front-of-house based on what it noticed about traffic patterns, because it didn’t notice anything in any sense that would survive beyond the output being generated. Ask it what it learned and it’ll make up something plausible. That’s not expertise. That’s another output.

When you delegate to a human, the decisions you didn’t make get made by someone whose expertise can be surfaced, refined, and aligned with your intent over time. When you delegate to AI, the decisions you didn’t make get made by a process that produces nothing that can grow toward anything. The choice you’ve made is to sever decisions from the substance – accumulated expertise – that lets organizations learn from the decisions they make.

AI vibing is bad bossing with no way to course-correct. A bad human boss can become a good one. They can start asking the dishwasher, consulting Marisol, treating meetings as harvest season rather than overhead. An AI-vibed company can’t make that correction, because it finally succeeded at paying only for product. There’s no process knowledge to tap into, no expertise byproduct left.