Tristan Harris and the Center for Humane Technology, Explained
Former Google designer turned tech critic Tristan Harris argues your phone addiction isn't a willpower problem—it's a design problem. Here's what he actually means.
Your Instagram feed loaded fresh content 47 times during your last scroll session, even though you only pulled down to refresh once. That wasn't a glitch; it was a feature, designed by someone who understood exactly how your brain works. That someone might have been a colleague of Tristan Harris back when he worked at Google.
Harris spent three years as a design ethicist at Google before becoming one of Silicon Valley's most prominent critics. His central argument? The reason you can't put your phone down isn't a character flaw. It's the predictable result of technology designed to be irresistible.
But Harris isn't just another tech critic shouting into the void. Through his Center for Humane Technology, he's pushing for specific changes to the design patterns that make apps addictive, changes that could actually happen without requiring you to delete every app on your phone.
Key Takeaway: Tristan Harris frames tech addiction as a design problem, not a willpower problem. His Center for Humane Technology advocates for specific changes to make technology serve users rather than exploit them, though critics question whether voluntary reform can fix business models built on capturing attention.
What Tristan Harris Actually Argues About Humane Tech
Harris's core thesis sounds almost too simple: technology companies are in a "race to the bottom of the brainstem." They're competing to trigger the most primitive parts of your brain—the parts that respond to intermittent rewards, social approval, and fear of missing out.
This isn't accidental. Harris points to specific design choices that mirror casino tactics: variable reward schedules (you never know when you'll get a good post), social approval metrics (likes, comments, shares), and what he calls "infinite buffets" of content that never signal when to stop.
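To make the casino comparison concrete, here's a toy simulation of a variable-ratio reward schedule, the same pattern slot machines use. Nothing here is real platform code: `hit_rate` is an invented probability standing in for whatever a real feed's ranking system actually delivers.

```python
import random

def pull_to_refresh(hit_rate=0.3):
    """Simulate one feed refresh under a variable-ratio reward schedule.

    hit_rate is a made-up probability that this refresh surfaces
    something rewarding (a great post, a new like). Real platforms
    tune this dynamically; 0.3 is purely illustrative.
    """
    return random.random() < hit_rate

# On a fixed schedule you'd know exactly when the next reward arrives.
# On a variable schedule you don't, so the next pull always feels like
# it might be the one that pays off.
rewards = [pull_to_refresh() for _ in range(20)]
print(f"{rewards.count(True)} rewarding refreshes out of 20")
```

Behavioral psychologists have known since B.F. Skinner's pigeon experiments that variable-ratio schedules produce the most persistent behavior, which is exactly why Harris singles them out.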
The numbers back Harris up. According to research cited by the Center for Humane Technology, the average smartphone user receives 64 notifications per day and checks their phone 144 times, roughly once every six to seven minutes across a sixteen-hour waking day. That is hardly the behavior of someone making conscious choices about their attention.
Harris argues this creates what he calls "human downgrading"—the systematic erosion of our ability to think clearly, maintain relationships, and make good decisions. Unlike previous moral panics about new technology, his critique focuses on measurable psychological effects rather than vague cultural anxieties.
The Center's research highlights three specific mechanisms: attention extraction (pulling focus away from what matters), addiction (creating compulsive usage patterns), and polarization (amplifying extreme viewpoints because they generate more engagement). Each represents a different way that engagement-driven design harms users while benefiting platforms.
How the Center for Humane Technology Operates
The Center for Humane Technology, founded by Harris and Aza Raskin in 2018, operates more like a policy think tank than a typical nonprofit. Rather than telling individuals to use their phones less, they focus on changing how platforms work at the system level.
Their approach combines three strategies: research documentation, industry pressure, and regulatory advocacy. They publish detailed analyses of specific design patterns, brief lawmakers and regulators, and work directly with tech companies to implement less exploitative features.
One concrete example: the Center's research on "continuous scroll" led to conversations with YouTube about adding natural stopping points to video recommendations. While YouTube hasn't eliminated autoplay entirely, they did add features like "take a break" reminders and bedtime mode—changes that came directly from Center advocacy.
The organization also focuses on what Harris calls "design ethics education." They've created curricula for computer science programs, arguing that engineers need to understand the psychological impact of their work the same way doctors understand the side effects of medications.
Their most visible success was contributing research and interviews to "The Social Dilemma," the 2020 Netflix documentary that brought Harris's ideas to mainstream audiences. The film sparked congressional hearings and internal reviews at major tech companies, though critics argue it generated more awareness than actual change.
As of 2026, the Center operates with a staff of about 30 researchers and advocates, funded primarily by individual donors rather than tech companies—a deliberate choice to maintain independence from the industry they're trying to reform.
The Limits of Design Reform Without Regulatory Teeth
Here's where Harris's approach hits its biggest obstacle: voluntary reform only works if companies want to reform. And companies that make money from attention have limited incentives to make their products less engaging.
Consider Facebook's response to concerns about teen mental health. Internal research showed that Instagram worsens body image issues for teenage girls, but the platform's solution was to hide like counts rather than address the underlying comparison mechanics that drive usage. That's design reform, technically—just not the kind that addresses the root problem.
Harris acknowledges this tension. In recent speeches, he's argued that humane technology requires regulatory intervention, not just industry goodwill. The Center now advocates for specific policies: data portability (so users can leave platforms without losing their social connections), algorithmic transparency (so users understand how content is chosen), and what Harris calls "fiduciary duty" for platforms (legal obligation to act in users' best interests).
The European Union's Digital Services Act, passed in 2022, includes some of these provisions. Platforms must now provide "chronological feeds" as an alternative to algorithmic curation, and users have the right to understand why specific content was recommended to them. Early data suggests these changes reduce usage time by 12-15% without significantly impacting user satisfaction.
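To see what that alternative actually changes, here's a minimal sketch contrasting chronological ordering with engagement-ranked ordering. The `Post` fields and the engagement scores are invented for illustration; no platform's real ranking pipeline is this simple.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int               # Unix seconds (illustrative values)
    predicted_engagement: float  # model score; invented for this sketch

posts = [
    Post("alice", 1700000300, 0.12),
    Post("bob",   1700000100, 0.91),
    Post("cara",  1700000200, 0.40),
]

# Chronological feed: newest first, no model deciding what you see.
chronological = sorted(posts, key=lambda p: p.timestamp, reverse=True)

# Engagement-ranked feed: ordered by what a model predicts you'll react to.
ranked = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

print([p.author for p in chronological])  # ['alice', 'cara', 'bob']
print([p.author for p in ranked])         # ['bob', 'cara', 'alice']
```

The regulatory appeal of the first ordering is that anyone with a clock can audit it; the second depends on a model only the platform can inspect.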
But regulatory change moves slowly, and the attention economy moves fast. New platforms emerge constantly, each experimenting with novel ways to capture attention. TikTok's short-form video format, for instance, created engagement patterns that existing regulations didn't anticipate.
Harris's critics argue that his focus on design reform misses the fundamental issue: business models based on advertising revenue will always prioritize engagement over wellbeing. True humane technology, they suggest, requires different economic models—subscription services, public funding, or cooperative ownership structures that don't depend on capturing and selling user attention.
Real Examples of Humane Tech in Practice
Despite these limitations, some platforms have implemented changes that align with Harris's vision of humane technology. Understanding what works—and what doesn't—reveals both the potential and the constraints of design-based solutions.
YouTube's "Digital Wellbeing" features represent the most comprehensive attempt at humane design by a major platform. Users can set daily time limits, schedule "bedtime" modes that gray out the interface, and receive reminders to take breaks. Internal data shows these features reduce usage by an average of 23 minutes per day among users who enable them.
The catch? Only 8% of users turn these features on, and most who do turn them off within two weeks. This illustrates a core challenge in humane tech: features that reduce engagement often feel like obstacles rather than improvements to users who are already habituated to high-stimulation interfaces.
More successful are changes that improve the user experience while reducing addictive patterns. Instagram's "Quiet Mode," which delays notifications and adds friction to posting, actually increased user satisfaction scores while reducing daily usage by 15 minutes. The key difference: it felt like a premium feature rather than a restriction.
Apple's Screen Time controls offer another model. By showing users detailed data about their usage patterns without imposing restrictions, the feature helps some users make more conscious choices about their phone habits. Research from Stanford shows that simply seeing weekly usage reports leads to a 7% reduction in total screen time, though the effect diminishes over time.
The most promising examples come from smaller platforms designed with humane principles from the start. BeReal's once-daily posting window eliminates the infinite scroll problem. Mastodon's chronological, ad-free feeds remove algorithmic manipulation. These platforms sacrifice growth potential for user wellbeing—a tradeoff that mainstream platforms can't easily make without regulatory pressure.
What This Means for Your Daily Phone Use
Understanding Harris's framework changes how you think about your own technology habits. Instead of blaming yourself for a "lack of self-control," you can recognize the specific design patterns that make conscious usage difficult and work around them systematically.
Start by identifying which apps use the most aggressive engagement tactics. Apps that refresh content automatically, send frequent notifications, or use variable reward schedules (you never know what you'll find when you open them) are designed to be habit-forming. That's not a bug—it's the business model.
The most effective individual responses target these specific mechanisms. Turn off all non-essential notifications to break the intermittent reward cycle. Use apps like Freedom or Cold Turkey to add friction to opening social media. Switch to chronological feeds when platforms offer them (most don't make this easy to find, which tells you something).
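If you're wondering what "adding friction" amounts to in practice, the sketch below captures the core pattern in a few lines of Python. This is not how Freedom or Cold Turkey are actually built; it's a toy version of the delay-and-confirm idea such apps implement.

```python
import time
import webbrowser

def open_with_friction(url, pause_seconds=10):
    """Open a URL only after a deliberate pause and an explicit yes.

    A toy version of the friction pattern: the forced delay gives the
    impulsive "just check the feed" urge a chance to pass.
    """
    print(f"Pausing {pause_seconds} seconds before opening {url}...")
    time.sleep(pause_seconds)
    answer = input("Still want to open it? (y/n) ").strip().lower()
    if answer == "y":
        webbrowser.open(url)
    else:
        print("Skipped. That's the friction working.")

if __name__ == "__main__":
    open_with_friction("https://example.com", pause_seconds=3)
```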
But Harris's research suggests that individual solutions have inherent limits. Even users who successfully reduce their own usage often find that the people around them become more difficult to reach as platforms optimize for ever-higher engagement. Your friends post more extreme content because moderate content gets buried by algorithms. Group chats become more urgent because platforms train users to expect immediate responses.
This is why Harris focuses on system-level change rather than individual behavior modification. Personal digital wellness strategies can help, but they're swimming against a current designed to pull you back in. Real solutions require changing the current itself.
Frequently Asked Questions
What is Tristan Harris's "humane tech"? Tristan Harris advocates for "humane technology": tech designed to support human wellbeing rather than maximize engagement. His Center for Humane Technology pushes for design changes that respect users' time and attention.
Are these addictive features intentional? Yes. Harris argues that features like infinite scroll, variable reward schedules, and push notifications are deliberately designed to be habit-forming, following principles from casino design and behavioral psychology.
Can I turn these features off? Some can be disabled through settings (notifications, autoplay), but many core addictive elements are baked into an app's fundamental design and can't be turned off by users.
What has the Center for Humane Technology actually accomplished? They've influenced some design changes at major platforms, contributed to legislative hearings, and raised awareness through documentaries like "The Social Dilemma," but systemic change remains limited.
Does Tristan Harris think we should quit social media entirely? No. Harris focuses on reforming how platforms work rather than telling people to quit. He wants technology that serves users rather than exploiting them for profit.
Pick one app that you use daily and spend five minutes exploring its notification settings. Turn off everything except direct messages and essential alerts. This won't solve the attention economy, but it will give you a taste of what technology designed for your wellbeing might feel like.