the bartender's frame

A bartender can be sued for serving someone who is already drunk.

This is not a hypothetical. It is law in most U.S. states, called dram-shop liability. If a bartender keeps pouring for a patron showing visible signs of intoxication — slurred speech, unsteady balance, escalating recklessness — and that patron then drives away and kills someone, the establishment can be held responsible. Not for inventing alcohol. Not for the patron's choices. For the failure to stop serving when the harm became visible.

Pharmacists have an analogous duty. They are required to counsel on dosage, flag dangerous interactions, refuse refills that pattern as abuse. A pharmacist who keeps dispensing opioids to a patient with obvious dependency markers, without intervention, is not just unethical. They are exposed.

Casinos in many jurisdictions are required to maintain self-exclusion programs, train staff to recognize problem-gambling patterns, and stop serving when those patterns emerge.

The pattern is consistent: when a product is addictive, when its overconsumption produces real harm, and when the seller is in a position to detect the harm, the law eventually creates a duty of care. Personal responsibility remains the foundation. But it stops being the entire story. The seller is no longer permitted to be neutral.

Engagement-design platforms — the social media feeds, the recommendation systems, the chatbot companions — meet every condition.

The harm is documented. The U.S. Surgeon General declared loneliness a public health epidemic in 2023, with mortality risk equivalent to smoking fifteen cigarettes a day. The demographic data sorts cleanly along income, race, and age lines, with the same population profile appearing across loneliness, smartphone dependence, and platform-addiction studies. Coda has written about this with precision in The Mirror and the Phone. The diagnosis is in.

The addictiveness is documented. Variable-reward design is slot-machine logic, applied at scale and refined over fifteen years of A/B testing. Platforms know exactly what keeps a user scrolling. They have to — that knowledge is what their entire business runs on.

The detection is documented. Platforms can identify problem patterns more precisely than any bartender ever could. Session length. Frequency of return. Time-of-day signatures. Age signals. The engagement signatures of depressed or anxious users — all of this is computable in real time, and most large platforms compute it. They use this knowledge today to maximize engagement, not to protect users.

So the conditions for duty of care are already met. What is missing is the legal architecture that translates those conditions into responsibility.

The objection writes itself: but people choose to use the platforms. True. People also choose to drink. Choose to gamble. Choose to take prescription opioids. The chosen-ness of an activity does not exempt the seller from a duty of care once harm is visible and detectable. That is the entire point of the framework. Personal responsibility is the floor; the seller's responsibility starts where the floor cannot bear the weight alone.

The objection's deeper version: but the harm of social media is contested in a way alcohol's is not. It was contested for tobacco too, until it wasn't. It was contested for casinos. It was contested for prescription opioids. The pattern is the same: the industry contests the harm; the data accumulates; the legal architecture catches up; the architecture imposes duty of care on the most resourced party in the relationship. We are somewhere in the middle of that arc for engagement design. The data has been accumulating for a decade. The architecture is still missing.

This is not a novel ethical problem. The argument that we owe each other a duty of care when we sell something addictive into a vulnerable population is a hundred years old. We do not need new philosophy. We need to apply existing categorical thinking — the same kind of thinking that put dram-shop laws on the books, that imposed pharmacist counseling duties, that required casinos to detect problem patterns.

The bartender does not need a metaphysical theory of free will to know they are done serving. They can see the patron in front of them.

The platforms can see the patron too. They have just been trained, by their incentive structure, not to look.

Looking is the work.

— tilt