AI enshittified Digg shuts down, temporarily?

They claim the bots were at fault

Maybe I should start a series on the enshittified community web. This AI-driven community world seems to be going just great. 🫠

I previously wrote about the enshittification of the community web and how we're allowing it to happen. In this case, Digg got hit hard by bots, resorted to layoffs, and has 'temporarily closed its doors'.

I have a hunch that the temporary shutdown will turn into a permanent one; time will tell. I checked them out when they launched, and a couple of times after that, and I struggled to see any real value in them. It felt like another low-value waste of space. Maybe people have time for that in the one life they have to live. Not me!

They allowed this to happen by not taking moderation seriously and by over-glorifying AI as the solution to a better community. Little did they realise that AI bots would be the thing taking them down rather than helping them. 🤷🏻‍♀️

A Hard Reset, and What Comes Next

Building on the internet in 2026 is different. We learned that the hard way. Today we're sharing difficult news: we've made the decision to significantly downsize the Digg team. This wasn't a decision made lightly, and it's important to say clearly: this is one of the strongest groups of people we've ever had the privilege of working with. This is not a reflection of their talent, their effort, or their belief in what we were building. It's a reflection of the brutal reality of finding product-market fit in an environment that has fundamentally changed.

We faced an unprecedented bot problem

When the Digg beta launched, we immediately noticed posts from SEO spammers noting that Digg still carried meaningful Google link authority. Within hours, we got a taste of what we'd only heard rumors about. The internet is now populated, in meaningful part, by sophisticated AI agents and automated accounts. We knew bots were part of the landscape, but we didn't appreciate the scale, sophistication, or speed at which they'd find us. We banned tens of thousands of accounts. We deployed internal tooling and industry-standard external vendors. None of it was enough. When you can't trust that the votes, the comments, and the engagement you're seeing are real, you've lost the foundation a community platform is built on.

This isn't just a Digg problem. It's an internet problem. But it hit us harder because trust is the product.

Building social is hard, incumbents are harder

We underestimated the gravitational pull of existing platforms. Network effects aren't just a moat, they're a wall. The loyalty users have to the communities they've already built elsewhere is profound. Getting people to move is a hard enough problem. Getting them to move and bring their people with them is something else entirely.

What's next

We're not giving up. Digg isn't going away.

A small but determined team is stepping up to rebuild with a completely reimagined angle of attack. Positioning Digg as simply an alternative to incumbents wasn't imaginative enough. That's a race we were never going to win. What comes next needs to be genuinely different.

We're also announcing something we're excited about: Kevin Rose, Digg's founder who started the company back in 2004, is returning to join the team full-time. Starting the first week of April, Kevin will be putting his focus back on the company he built twenty+ years ago. He'll continue as an advisor to True Ventures, but Digg will be his primary focus. We couldn't think of a better person to help figure out what Digg needs to become.

Lastly, Diggnation, our official Digg podcast, will continue recording monthly while we work on the re-reboot.

Lastly, and most importantly, thank you…

To the team members we're saying goodbye to today: thank you. You took a bet on a hard problem and showed up every day. The work you did laid the groundwork for what comes next, even if it doesn't feel that way right now.

To the community who came back to Digg, submitted links, argued in the comments, and emailed us with what you wanted: we haven't forgotten why we're doing this. We know how frustrating this is, and we hope you'll give us another look once we have something to show, we'll save your usernames!

Ultimately, the internet needs a place where we can trust the content and the people behind it. We're going to figure out how to build it.

More soon –@justin, CEO
A screenshot of the announcement, for posterity.

Of course, it's hard to know for sure the full story, but here's a snippet from The Verge:

When Digg announced its relaunch, Rose told The Verge that AI could “remove the janitorial work of moderators and community managers.” Now, the new Digg’s CEO Justin Mezzell writes in a note pinned to the homepage that, “We knew bots were part of the landscape, but we didn’t appreciate the scale, sophistication, or speed at which they’d find us. We banned tens of thousands of accounts. We deployed internal tooling and industry-standard external vendors. None of it was enough.”

I feel for Digg. Of course spammers were always more likely to pounce on them, given Digg's history and well-known past. But let's be real: assuming AI can handle all the moderation up front, perfectly, is quite frankly crazy thinking.

Other comments in The Verge's comment section pointed to a wider bot problem, not just for Digg but for Reddit and likely other spaces too:

Can't say I'm surprised. The bot problem on Reddit is probably even worse, and sometimes very obvious. I'm not sure Reddit actually cares to solve that problem, though.

And...

There was no content or community. I tried it out because I'm frustrated with Reddit for a lot of reasons but there was just not enough people/content there to have me return after a couple of days.

To me, this signifies a real and ongoing shift. I'm not quite sure where it's heading, but it is certainly influencing my decisions over at the MoTaverse. People are noticing too; I detect a shift in their willingness to continue spending time in these spaces.

The state of the web is that trust feels at an all-time low, and I can only assume it will get lower. But everything in community, for me, is about building on trust. I don't want to be part of anything that doesn't have a real connection or meaning. I refuse!

We all have the capacity to refuse!

Trust has to be the foundation. If we allow bots in, the trust is gone.

Of course, many people appreciate Reddit, but for me the largely anonymous aspect has always put me off. It encourages a certain type of content that is always going to lack context and depth. They've been trying to maintain the human aspect; I'm not sure how it's going, but I honestly can't see how they can prevent bots, especially as they get smarter. It's a hard nope for me. I've got one life, and I'm not going to spend it there.

All of a sudden, running free communities carries the real risk of not knowing whether the people signing up are human beings or bots. And then we need to do the enshittified work of constantly cleaning things up. The moderation work does not disappear because of AI; it just shifts elsewhere.
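To make that shifted work concrete, here's a minimal sketch of the kind of tooling a small community might end up maintaining: scoring each signup on a few cheap signals and routing borderline cases to a human review queue instead of auto-approving everyone. Everything here is hypothetical for illustration — the signals, the domain list, and the thresholds are made up, not any real platform's API.

```python
from dataclasses import dataclass

@dataclass
class Signup:
    email_domain: str
    filled_profile_instantly: bool   # profile completed suspiciously fast
    signups_from_ip_last_hour: int   # other accounts from the same IP

# Illustrative list only; real tooling would use a maintained feed.
DISPOSABLE_DOMAINS = {"mailinator.com", "tempmail.example"}

def bot_risk_score(s: Signup) -> int:
    """Crude additive risk score; weights are invented for this sketch."""
    score = 0
    if s.email_domain in DISPOSABLE_DOMAINS:
        score += 2
    if s.filled_profile_instantly:
        score += 2
    if s.signups_from_ip_last_hour > 5:
        score += 3
    return score

def triage(s: Signup) -> str:
    """Approve, reject, or queue for a human — the work doesn't disappear."""
    score = bot_risk_score(s)
    if score >= 5:
        return "reject"
    if score >= 2:
        return "human_review"
    return "approve"
```

Note the middle bucket: no threshold is clever enough to avoid it, which is exactly the ongoing human moderation work described above.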

We're made to believe that this has to be the way. That's also a hard nope from me.

Not only will people increasingly not want to spend their time checking whether they are speaking to humans and verifying that what they read is true, but the people leading communities should be doing good work, not constantly fighting bot spam.

The purpose of bots could become more sophisticated too. As community people, we're used to spammers posting content and having profiles that usually promote something. They have traditionally been easier to spot.

But what if they lurk behind the scenes, doing other things we are not aware of? What if there are agent-created profiles, built up slowly over time, that feel and look human?

What if agents pose as other people?

What if agents are instructed by the person? (I can see this as both a negative and, potentially, a positive way to engage more easily.)

How can people be sure they are speaking to real people? (It's important, you know, despite what others may have you think otherwise.)

There are many risks, both from user and business perspectives, and they are going to be harder to spot and deal with. I could quite easily see agentic tools being built around this.

It's wild and depressing. The web is becoming less trustworthy. Every interaction we have will be followed by the question of whether we can trust it. And again, I can only emphasise that this makes me want to build something that is truly more human-led.

But with it comes opportunity. And that's where my mind is right now. There is so much in flux, and with that comes the option to rethink what we build.

What's clear is that when we design for community, we now need to protect ourselves from AI slop and bots. It has to be baked into our strategy, and we can't expect the things that worked in the past to work today.

Are you even building community if you aren't protecting yourself from AI enshittification?

Learn more:

Digg shuts down for a ‘hard reset’ because it was flooded with bots
Digg has shut down, for now, just a few months after its open beta launched.
Digg’s open beta shuts down after just two months, blaming AI bot spam
Digg is going away again, but Kevin Rose is coming back.

Learn to build community through everyday actions

Sharing actionable community building insights since 2020.

Rosieland
