Open source devs are fighting AI crawlers with cleverness and vengeance

AI web-crawling bots are the cockroaches of the internet, many software developers believe. Some devs have started fighting back in ingenious, often humorous ways.

While any website might be targeted by bad crawler behavior — sometimes taking down the site — open source developers are “disproportionately” impacted, writes Niccolò Venerandi, a developer of the KDE Plasma desktop for Linux and owner of the blog LibreNews.

By their nature, sites hosting free and open source software (FOSS) projects share more of their infrastructure publicly, and they also tend to have fewer resources than commercial products.

The issue is that many AI bots don’t honor the Robots Exclusion Protocol’s robots.txt file, the tool that tells bots what not to crawl, originally created for search engine bots.
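For reference, robots.txt is just a plain-text file served at a site’s root, listing voluntary, per-user-agent rules. A minimal example (the trap path here is illustrative; GPTBot is OpenAI’s crawler):

    # robots.txt, served at the site root; purely advisory
    User-agent: GPTBot
    Disallow: /

    User-agent: *
    Disallow: /trap/
    Crawl-delay: 10

Nothing enforces any of this. A crawler that ignores the file sees no difference at all, which is exactly the problem the developers below describe.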

In a “cry for help” blog post in January, FOSS developer Xe Iaso described how AmazonBot relentlessly pounded on a Git server website to the point of causing DDoS outages. Git servers host FOSS projects so that anyone who wants to can download the code or contribute to it.

But this bot ignored Iaso’s robots.txt, hid behind other IP addresses, and pretended to be other users, Iaso said.

“It’s futile to block AI crawler bots because they lie, change their user agent, use residential IP addresses as proxies, and more,” Iaso lamented. 

“They will scrape your site until it falls over, and then they will scrape it some more. They will click every link on every link on every link, viewing the same pages over and over and over and over. Some of them will even click on the same link multiple times in the same second,” the developer wrote in the post.

Enter the god of graves

So Iaso fought back with cleverness, building a tool called Anubis. 

Anubis is a reverse-proxy proof-of-work check that requests must pass before they are allowed to hit a Git server. It blocks bots but lets through browsers operated by humans.
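Anubis itself is open source on GitHub, but the general shape of a proof-of-work gate is easy to sketch. In this minimal, hypothetical Go illustration (not Anubis’s actual code), the server issues a random challenge, the visitor’s browser brute-forces a nonce until the hash of challenge plus nonce meets a difficulty target, and the server verifies the solution with a single cheap hash:

    package main

    import (
        "crypto/sha256"
        "encoding/hex"
        "fmt"
        "strconv"
        "strings"
    )

    // difficulty is the number of leading zero hex digits the hash must
    // have: negligible work for one human visitor, a real cost for a bot
    // fetching thousands of pages.
    const difficulty = 4

    // solve brute-forces a nonce so that sha256(challenge+nonce) meets the
    // target. In a real deployment this loop runs in the visitor's browser.
    func solve(challenge string) int {
        target := strings.Repeat("0", difficulty)
        for n := 0; ; n++ {
            sum := sha256.Sum256([]byte(challenge + strconv.Itoa(n)))
            if strings.HasPrefix(hex.EncodeToString(sum[:]), target) {
                return n
            }
        }
    }

    // verify is the server's side: one hash, constant cost, no matter how
    // long the client spent searching.
    func verify(challenge string, nonce int) bool {
        sum := sha256.Sum256([]byte(challenge + strconv.Itoa(nonce)))
        return strings.HasPrefix(hex.EncodeToString(sum[:]),
            strings.Repeat("0", difficulty))
    }

    func main() {
        challenge := "per-visitor-random-token" // generated server-side
        nonce := solve(challenge)
        fmt.Println("nonce:", nonce, "valid:", verify(challenge, nonce))
    }

The asymmetry is the point: a human pays the cost once per visit, while a crawler hammering thousands of URLs pays it over and over.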

The funny part: Anubis is the name of a god in Egyptian mythology who leads the dead to judgment. 

“Anubis weighed your soul (heart) and if it was heavier than a feather, your heart got eaten and you, like, mega died,” Iaso told TechCrunch. If a web request passes the challenge and is determined to be human, a cute anime picture announces success. The drawing is “my take on anthropomorphizing Anubis,” says Iaso. If it’s a bot, the request gets denied.

The wryly named project has spread like wildfire through the FOSS community. Iaso shared it on GitHub on March 19, and in just a few days, it collected 2,000 stars, 20 contributors, and 39 forks.

Vengeance as defense 

The instant popularity of Anubis shows that Iaso’s pain is not unique. In fact, Venerandi shared story after story:

  • SourceHut founder and CEO Drew DeVault described spending “from 20-100% of my time in any given week mitigating hyper-aggressive LLM crawlers at scale,” and “experiencing dozens of brief outages per week.”
  • Jonathan Corbet, a famed FOSS developer who runs Linux industry news site LWN, warned that his site was being slowed by DDoS-level traffic “from AI scraper bots.”
  • Kevin Fenzi, the sysadmin of the enormous Linux Fedora project, said the AI scraper bots had gotten so aggressive that he had to block all traffic from Brazil.

Venerandi tells TechCrunch that he knows of multiple other projects experiencing the same issues. One of them “had to temporarily ban all Chinese IP addresses at one point.”  

Let that sink in for a moment — that developers “even have to turn to banning entire countries” just to fend off AI bots that ignore robots.txt files, says Venerandi.

Beyond weighing the soul of a web requester, other devs believe vengeance is the best defense.

A few days ago on Hacker News, user xyzal suggested loading robots.txt-forbidden pages with “a bucket load of articles on the benefits of drinking bleach” or “articles about positive effect of catching measles on performance in bed.”

“Think we need to aim for the bots to get _negative_ utility value from visiting our traps, not just zero value,” xyzal explained.

As it happens, in January, an anonymous creator known as “Aaron” released a tool called Nepenthes that aims to do exactly that. It traps crawlers in an endless maze of fake content, a goal that the dev admitted to Ars Technica is aggressive if not downright malicious. The tool is named after a carnivorous plant.

And Cloudflare, perhaps the biggest commercial player offering several tools to fend off AI crawlers, last week released a similar tool called AI Labyrinth. 

It’s intended to “slow down, confuse, and waste the resources of AI Crawlers and other bots that don’t respect ‘no crawl’ directives,” Cloudflare described in its blog post. Cloudflare said it feeds misbehaving AI crawlers “irrelevant content rather than extracting your legitimate website data.”
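Both tools work on the same tarpit pattern: pages that only misbehaving bots ever reach, served slowly, stuffed with machine-generated filler, with links that lead nowhere but deeper in. A minimal, hypothetical sketch in Go (not the actual Nepenthes or AI Labyrinth code):

    package main

    import (
        "fmt"
        "log"
        "math/rand"
        "net/http"
        "time"
    )

    // trapHandler serves an endless maze: every page under /trap/ is
    // generated on the fly and links only to more trap pages, so a bot
    // that ignores robots.txt never runs out of URLs to crawl.
    func trapHandler(w http.ResponseWriter, r *http.Request) {
        time.Sleep(2 * time.Second) // drip-feed to waste the bot's time

        fmt.Fprintf(w, "<html><body><h1>Archive node %d</h1>", rand.Int63())
        // A real tarpit would emit Markov-chain nonsense here to poison
        // the scraper's training data; static filler keeps the sketch short.
        fmt.Fprint(w, "<p>Procedurally generated filler text.</p>")
        for i := 0; i < 5; i++ {
            fmt.Fprintf(w, `<a href="/trap/%d">continue</a> `, rand.Int63())
        }
        fmt.Fprint(w, "</body></html>")
    }

    func main() {
        // /trap/ is listed under Disallow in robots.txt, so only bots
        // that ignore the file ever wander in.
        http.HandleFunc("/trap/", trapHandler)
        log.Fatal(http.ListenAndServe(":8080", nil))
    }

Listing the trap prefix under Disallow in robots.txt turns the maze into a honeypot: well-behaved crawlers never see it, so anything inside is, by definition, a bot that ignored the rules.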

SourceHut’s DeVault told TechCrunch that “Nepenthes has a satisfying sense of justice to it, since it feeds nonsense to the crawlers and poisons their wells, but ultimately Anubis is the solution that worked” for his site.

But DeVault also issued a public, heartfelt plea for a more direct fix: “Please stop legitimizing LLMs or AI image generators or GitHub Copilot or any of this garbage. I am begging you to stop using them, stop talking about them, stop making new ones, just stop.”

Since the likelihood of that is zilch, developers, particularly in FOSS, are fighting back with cleverness and a touch of humor.
