
It would be nice if this could get standardized HTTP headers so bots could still use sites but effectively pay for use. That seems like the best of all possible worlds to me; the whole point of HTML is that robots can read it, otherwise we'd just be emailing each other PDFs.
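Roughly, I'm imagining a 402 Payment Required flow. A minimal sketch in Python - the header names (X-Bot-Price, X-Bot-Payment) are entirely made up here, since no such standard exists yet:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    PRICE_MICROCENTS = 50  # hypothetical per-request price

    class PaywalledHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            token = self.headers.get("X-Bot-Payment")
            if token is None or not self.verify_payment(token):
                # No valid payment attached: quote a price and refuse.
                self.send_response(402)  # Payment Required
                self.send_header("X-Bot-Price", str(PRICE_MICROCENTS))
                self.end_headers()
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html>the actual content</html>")

        def verify_payment(self, token):
            # Placeholder - settlement would need a real payment rail.
            return False

    HTTPServer(("", 8000), PaywalledHandler).serve_forever()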


We already have a standardized system - robots.txt - and AI bots are already ignoring it. Why would more standardized headers matter? Bots will ignore them just as they do today, pretend to be regular users, and get content without paying.
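For reference, the existing "standard" is just a plain-text file the crawler is expected to honor voluntarily. Blocking OpenAI's crawler, for example, is two lines (GPTBot is their published user agent):

    User-agent: GPTBot
    Disallow: /

Nothing at all happens if the bot doesn't fetch it, or fetches it and ignores it.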

(A secondary point: AI bots have basically zero benefit for most websites, so unless you're some low-cost crappy content farm, it's in your interest to raise the price to the max so the bots are simply locked out. Which brings us back to point 1: bots ignoring the headers.)


Being indexed in search engines has zero benefit?

Also, robots.txt is a suggestion, but hashcash is enforced server-side. I agree it's a tragedy that people have started to completely ignore robots.txt, but you can't ignore server-side behavior.
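For anyone unfamiliar: hashcash makes the client burn CPU finding a stamp whose hash has N leading zero bits, while the server verifies it with a single hash. A minimal sketch of the server-side check (simplified - a real deployment would also validate the date and resource fields and reject reused stamps):

    import hashlib

    def verify_hashcash(stamp: str, required_bits: int = 20) -> bool:
        # Hashcash v1 stamps look like "1:bits:date:resource:ext:rand:counter".
        fields = stamp.split(":")
        if len(fields) != 7 or fields[0] != "1":
            return False
        try:
            if int(fields[1]) < required_bits:
                return False
        except ValueError:
            return False
        digest = hashlib.sha1(stamp.encode()).digest()
        # Count leading zero bits of the SHA-1 digest.
        zero_bits = 0
        for byte in digest:
            if byte == 0:
                zero_bits += 8
            else:
                zero_bits += 8 - byte.bit_length()
                break
        return zero_bits >= required_bits

The asymmetry is the point: producing a 20-bit stamp takes about a million hash attempts on average, while checking it takes one.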


Being indexed in search engines is very useful, but it does not need any filtering - Google and the other major search engines respect robots.txt, use well-known user agents, and even publish their crawler IP ranges.

AI bots are not search engines, and they have no benefit for website owners. You can see this very clearly from their behavior: they ignore robots.txt, pretend to be regular browsers, and use multiple proxies to avoid IP bans.
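This is also why you can safely allowlist the real search crawlers. Google documents forward-confirmed reverse DNS for verifying Googlebot; a quick sketch (simplified - a robust version would compare against all forward A records, or just check Google's published IP-range list):

    import socket

    def is_verified_googlebot(ip: str) -> bool:
        try:
            host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
            if not host.endswith((".googlebot.com", ".google.com")):
                return False
            return socket.gethostbyname(host) == ip  # forward-confirm
        except OSError:
            return False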


How do you propose the server distinguish between a bot and a human visitor?


They should have to set the evil bit



Formalizing it doesn't change the fact that it's already being used. If it doesn't work, it shouldn't be done; if it does, it should be formalized.


> bots could still use sites but they effectively pay for use. That seems like the best of all possible worlds to me

This would make the entire internet a maze of AI-slop content primarily made for other bots to consume. Humans may have to resort to emailing handwritten PDFs to avoid the thoroughly enshittified web.


As opposed to what we have now?


Yes - things could get worse from the status quo.

At the moment, ad networks don't pay for bot impressions when they detect them, so content farms tend to optimize for what passes for human. All bets are off if human and bot visitors offer the same economic value through proof-of-work payments - or worse, if bots turn out to be more profitable because humans are too impatient to sit through the work.

Imagine an internet optimized for bot visitors, and indifferent to humans. It would be a kind of refined brainrot aimed at a brainless audience.



