
The Numbers Game


Opinion · Indie · Social Media

I caught myself doom-scrolling again.

Not looking for anything in particular. Just the habit of opening Twitter, seeing some random account with 47K followers post "good morning builders," and quietly wondering if I'm doing something wrong. Their bio says "founder, dad, entrepreneur." Their tweets are generic. Their replies are emojis. And yet: 47K. I've been posting for two years and I'm sitting at four hundred.

The easy conclusion: they know something you don't. They're doing something right. You're behind.

After a while you start to notice the conclusion is wrong. Not because the number is wrong, but because the number doesn't mean what you think it means anymore.

The numbers almost nobody wants to look at

Imperva publishes an annual bots-on-the-internet report. The 2025 edition, measuring 2024, found that 51% of all internet traffic is no longer human. It's bots. Of that 51%, 37% is what they call "bad bots": malicious automation, scraping, fraud, engagement manipulation. It's the first time in the decade Imperva has been tracking this that bots crossed humans.

Twitter (X) is no better. SparkToro and Followerwonk ran a public audit in 2022, while Musk was arguing in court that Twitter had less than 5% bots. The audit, across 44,058 randomly sampled active accounts, found that 19.42% were spam or fake. The @POTUS account at the time had 50% fake followers. And that was before cheap LLMs made bot creation trivial.

Cyabra ran another measurement in October 2024 on five Musk political posts and found that 20% of average engagement came from inauthentic accounts, with two specific posts hitting 40%. Bots voting in polls, bots replying, bots driving impression counts so the algorithm thinks something matters.

GitHub doesn't escape either. A team from Carnegie Mellon, Socket, and NC State published a paper in December 2024, "4.5 Million (Suspected) Fake Stars in GitHub", where they analyzed 20TB of GitHub data (6.7 billion events) and built a tool called StarScout. The numbers: over 4.5 million suspicious stars across more than 15,800 repos. In July 2024, the worst month on record, 16.66% of all repos with 50+ stars were involved in fake-star campaigns. One in six. And it's not just CMU hallucinating: GitHub has already deleted more than 90% of the repos the paper flagged. The most affected categories, per the paper, are crypto/sniper bot repos, pirated software, and, surprisingly, AI/LLM projects (~177,000 fake stars), where startups and academic-paper repos buy stars to look legit.

TikTok published its own transparency report for the first half of 2024 and the number is eye-opening: they blocked 36 billion fake likes in six months. They removed 940 million videos from fake accounts. They blocked 15 billion follow attempts. That's just what TikTok admits to blocking. The more interesting question is how much got past them. Next time you see a "🔥🔥🔥" or "❤️❤️❤️" comment on a video with 200K views, that's not an enthusiastic fan. That's a bot probing whether the account is alive.

The market for fake numbers

What makes this more than a technical problem is that there's a liquid, cheap market for it.

Naizop sells Twitter followers at $0.02 per follower. Other services charge $17.50 per thousand. GitHub stars sell by the package. There are public lists of "growth" services that are basically bot farms with pretty dashboards. Anything offering followers under $25 per thousand is, by simple arithmetic, bots: real human attention doesn't come that cheap.

Here's the part that hits: someone barely funded can buy 10,000 followers for $200. A startup wanting to look legit can buy 5,000 GitHub stars for less than what their CTO costs in a day. The wall between "looking established" and "being established" collapsed.
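The arithmetic above is worth making explicit. A minimal sketch, using only the prices already quoted in this post (Naizop's $0.02 per follower and the $17.50-per-thousand tier; nothing else here is vendor data):

```python
# Back-of-the-envelope math on fake-follower pricing.
# The unit prices come from the post; the function is just arithmetic.

def package_cost(count: int, price_per_unit: float) -> float:
    """Total cost to buy `count` fake followers at a flat unit price."""
    return count * price_per_unit

# "Someone barely funded can buy 10,000 followers for $200":
print(package_cost(10_000, 0.02))         # 200.0

# The $17.50-per-thousand tier undercuts even that:
print(package_cost(10_000, 17.50 / 1000)) # 175.0
```

Two hundred dollars for a follower count that, five years ago, would have read as a real audience. That's the entire collapse in one line of multiplication.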

I'm not saying most people do this. I'm saying that when you see a big number, the number is no longer a reliable signal. It's noise.

Why the algorithm rewards garbage

When Twitter open-sourced its algorithm in March 2023 (one of the few good moves Musk made, credit where due), the community pulled it apart in hours. What they found is offensive in its simplicity: a reply is weighted 27x a like. If the original author engages back with the replier, that reply is weighted 75x a like.

This explains something we've all felt but few articulate: why a post saying "React or Angular?" or "iPhone or Android?" beats by ten-to-one a post where someone shares three months of real building progress. The first invites debate. Debate generates replies. Replies carry 27x the weight. The algorithm doesn't distinguish a healthy debate from a stupid fight. Weight is weight.
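The dynamic is easy to see in a toy scorer. This is not Twitter's actual ranking code, just a sketch using the relative weights the community pulled out of the open-sourced algorithm (a reply at ~27x a like, an author-engaged reply at ~75x); the two example posts and their engagement counts are invented for illustration:

```python
# Toy engagement scorer using the leaked relative weights.
# Real ranking involves many more signals; this isolates just one mechanism.

WEIGHTS = {"like": 1, "reply": 27, "author_engaged_reply": 75}

def engagement_score(likes: int, replies: int, author_engaged: int) -> int:
    """Weighted engagement the way the leaked heuristics weigh it."""
    return (likes * WEIGHTS["like"]
            + replies * WEIGHTS["reply"]
            + author_engaged * WEIGHTS["author_engaged_reply"])

# Three months of real progress: plenty of likes, little to argue about.
substance = engagement_score(likes=120, replies=4, author_engaged=1)   # 303

# "React or Angular?": fewer likes, but everyone has an opinion.
bait = engagement_score(likes=30, replies=40, author_engaged=10)       # 1860

print(substance, bait)  # the bait post scores ~6x higher
```

The substance post gets four times the likes and still loses by a factor of six, because the system prices a reply like twenty-seven likes. Weight is weight.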

Frances Haugen, the Facebook whistleblower, said the same thing in her 2021 Senate testimony about Meta: the company's own internal research showed that engagement-based ranking systematically amplifies "angry, divisive, polarizing" content because outrage performs better than any other emotion. This isn't theory. It's Meta's own engineers saying it in internal documents.

So when you post your builder progress and nothing happens, it's not that your work isn't worth anything. It's that your work isn't debatable, isn't polarizing, doesn't generate replies. The system is designed for you to lose against "Tabs vs spaces?".

And the temptation to play the game

I've thought more than once about just playing this game. Doing threads about "5 prompts that changed my life." Posting "React or Vue?" every Monday. Screenshotting ChatGPT with the vaguest possible caption. It looks trivial to generate 10K followers in six months if you abandon any substance.

But here's the problem: even if you win the game, the prize is worthless. 10,000 followers who came in through farming bait don't buy your product, don't read your newsletter, don't remember you next week. Their like was the entry fee a bot, or a near-bot, paid for the chance to pitch you something later. It's a market of one-sided transactions.

And that's the mental flip worth installing: a high number isn't proof of an asset, it's proof of a cost. That person paid in time, in substance, or in money, to accumulate that number. The question is whether the number generates a return. It almost never does when it was accumulated via engagement bait.

The dead internet isn't a theory anymore

Five years ago, saying "dead internet theory" placed you in conspiracy-adjacent territory. The original Atlantic piece from 2021 treated it as "patently ridiculous" but captured a real pattern.

In 2025 it doesn't feel ridiculous anymore.

Graphite, an SEO company, published an analysis showing that in November 2024, AI-generated articles briefly surpassed human-written ones on the open web: 50.3% of newly published articles were AI-generated. It has since stabilized near 50/50. The good news is that only ~14% of AI articles rank in Google, versus 86% for human ones. But that's Google doing its job, not the web being healthy.

And then there's this tweet from Sam Altman, who is, to be clear, the CEO of OpenAI:

"i never took the dead internet theory that seriously but it seems like there are really a lot of LLM-run twitter accounts now"

When the guy selling the models that create the bots admits the social network is saturated with LLM accounts, you know it's serious.

What this does to builders

Here's where it gets personal. And where I want to be honest without sounding like I'm crying.

You spend three months building something. You launch it. You post an update with screenshots, real data, what you learned. You get 14 likes and a comment from your mom (love you, Mom). You open the feed and see someone you've been following for three weeks, an "AI builder" who only posts generic threads about "the 5 prompts that 10x your output", and they just crossed 30K followers. And the voice in your head says: I'm doing this wrong.

That voice is comparing your real work against a number that's probably inflated. And even if it isn't, even if that builder is genuine, what the algorithm rewarded was debatable content, not substance. You're not behind. You're playing a different game.

This doesn't mean you shouldn't get better at writing, at posting, at finding your voice. It means the metric you use to measure yourself has to change. If I had kept calibrating myself against follower counts each week, I would've quit two years ago. But there are other signals, and they're the ones that actually matter:

  • A DM from someone saying "your post about X made me understand Y for the first time"
  • A user who comes back to the product the next week without you nudging
  • Someone who quotes your work in their own post, without tagging, because they don't need the credit
  • A customer who pays without you running a six-step funnel

Those signals don't scale. They don't show up on a dashboard. But they're real. And when you accumulate them, they end up mattering more than any follower count.

Why this gets worse before it gets better

Imperva attributes the 2024 bot surge directly to generative AI: the marginal cost of creating a convincing account, posting convincing content, and faking convincing engagement, dropped to basically zero. The economics no longer defend humans from bots. Bots win on cost, on speed, and increasingly on surface quality.

What's coming in the next 12 to 24 months, the way I read it:

  • Platforms will have to surface stronger "verified human" signals. Not the buyable blue check, but actual identity (Worldcoin, biometric passkeys, attestations of some kind)
  • "Small and authentic" becomes paradoxically more valuable. Humans look for humans when the noise rises
  • Public metrics (followers, stars, likes) lose what little credibility they had left; private metrics (retention, conversion, revenue) become the only real standard
  • Some networks (Discord, paid communities, newsletters) grow in relevance precisely because they have enough friction to keep bots out

What to actually measure

If you're a builder and you're trying to stay sane, this is the only thing I'd recommend:

Never measure yourself against strangers' public numbers. You don't know how many are real. You don't know how many were bought. You don't know what algorithm pushed them. It's like measuring your health against someone else's Instagram filter.

Measure what happens in private. Real DMs. Newsletter subscribers who actually open. Users who come back without push notifications. People who pay. People who recommend without being asked. Those don't sell at $0.02.

Measure your output, not your reach. How many posts you wrote with substance this month. How many features you shipped. How many real conversations you had. Reach is noise. Output is a signal of your own commitment.

And when the voice in your head says "I'm behind" because you saw someone with 47K followers, remember that probably 9K of those are bots, another 5K are inactive accounts, another 8K are people who followed once and never came back, and the remaining 25K will never buy anything from them. And you still have your 400 real followers, reading, replying, and occasionally telling you that something you made worked.

That's a smaller number. But that number exists.


This post isn't selling you optimism. It's just reminding me, and you, that the game it looks like you're losing was never the real game.