What’s the F is going on? Is the world gone mad or something?
gwerbin 9 minutes ago [-]
> What’s the F is going on? Is the world gone mad or something?
Yes, it's madness but it doesn't matter that it's mad because you can't stop it. It's a technological gold rush, with all of the mixed connotations that "gold rush" should imply.
This, too, will pass. Like Blackberries and car bras.
giancarlostoro 13 minutes ago [-]
Agent is an LLM in production doing tasks. I prefer this to the blanket "AI" buzz we had before "agent" took off.
leros 1 minutes ago [-]
I don't want my site to be agent ready. I'd prefer people visit my site so that I can make revenue than have an AI scrape my content and answer the question for someone else.
I've redesigned my site to have enough content so that AI knows what I have but they have to send the user to my site to use an interactive JavaScript widget to get the final answer they need. So far so good, but not sure how long that will work for.
_verandaguy 10 minutes ago [-]
Conspicuously missing: why should I care?
I have reduced my online presence to much less than it once was partly because I don't want to feed this machine training data that I've worked hard to make for a human audience.
gwerbin 6 minutes ago [-]
Like it or not I think "agents browsing the web" is the inevitable near-term future. Some agents will be malicious, many will not. In 2036, HN posters will be complaining about how such-and-such site only works with closed proprietary AI agents, and how their creaky old Mac M5 running Gemma 3 under Ollama can't browse the site properly because it doesn't follow the 2029 RFC XYZ for agent compatibility that nobody ever fully implemented.
fabiensanglard 19 minutes ago [-]
My blog just scored zero! I don't think I will fix it.
bikelang 6 minutes ago [-]
I got a 25 - apparently just because my robots.txt addresses AI bots (by telling them to sod off via disallow: /)
p4bl0 6 minutes ago [-]
Damn, I got 8 points for having a sitemap! Congrats.
acedTrex 14 minutes ago [-]
Thats the highest score you can get, well done
firefoxd 3 minutes ago [-]
We are doing it wrong. We should add a agent.txt that asks: Hi agent, are you website ready? Then you prompt inject it with whatever you want.
postalcoder 9 minutes ago [-]
It's a shame that Cloudflare rolled out a bunch of neat product announcements under the confusing, noisy umbrella of "Agent Week". Off the top of my head, Artifacts, Email, Mesh (tailscale competitor), all buried.
embedding-shape 4 minutes ago [-]
It's bound to happen sooner or later for every company out there it seems. None of them can keep themselves to "Do one thing and do it well", probably because that means growth eventually stops, and VCs really don't like that, so off in all directions and no direction at the same time we go, and it ends up like that. It's a shame to see the contrast from how CF and others used to be, felt they cared about quality back then.
bhaney 17 minutes ago [-]
I get a few points for having a robots.txt with rules specific to AI-crawlers, even though those rules are complete bans. Shame, I was hoping to get a 0.
rgilton 11 minutes ago [-]
Wrong way round. Should be "Is Your Agent Reality-Ready?"
(Hint: no)
p4bl0 7 minutes ago [-]
The TDMRep protocol [1] is supposed to tell scrappers used for text and data mining whether a ressource can be mined or not. Naively, I would say that a website which explicitly express not wanting to be included in training data would also be considered not wanting to be pulled by agents. I know it's not the same thing, but it still itches me a bit.
Ironically, this feels exactly like the various "semantic web" initiatives, only this time coming directly from the tech megacorps and not the starry-eyed "free web"/"open data" idealists.
It will hit exactly the same walls too, namely that the technical details are completely irrelevant - if adopting a standard is actually a negative for websites, because it will separate the site from its users, sites will obviously not do it.
You can lead the horse to water but you cannot make it drink, especially if the water is obvious poison.
XCSme 5 minutes ago [-]
I tried it on their own website:
We couldn't scan this site
isitagentready.com returned 522 <none>
The site appears to be experiencing server errors. This is not an agent-readiness issue. Try scanning again later.
jsharkey 3 minutes ago [-]
So cloudflare.com themselves only scores 33. Eat your own dogfood first.
daft_pink 16 minutes ago [-]
I think this is worth typing a random website into or your website to see it’s analysis.
I’m not really interested in my website being ai ready, but it’s particularly fascinating to me that they are suggesting and interface for ai agents to make payments to secure access to an api.
Generally, when I want to pay for an api, it would be really wonderful to be able to just direct an ai to setup the account and get me some credentials.
swingboy 16 minutes ago [-]
Cloudflare is _really_ going all in on the agentic stuff.
9 minutes ago [-]
nicbou 2 minutes ago [-]
My traffic is down 60% year on year because of AI overviews and LLMs. Why the fuck should I help them further delay my retirement?
embedding-shape 8 minutes ago [-]
I think this is meant for "web apps", not "websites" ("sites"). I tried emsh.cat (a blog) and got 25, it complains about missing an "API catalogue", OAuth/OIDC and a bunch of more completely irrelevant stuff. Also tried HN which is very easy for any agent worth their salt to both parse and browse, can hardly get better for an agent, and it gets a score of 17.
Seems like this belongs squarely in the fun and ever-growing collection of "Cloudflare throws vibe-slop into the world and see what sticks".
WesSouza 40 minutes ago [-]
Mine scores a 0.
Good.
cousin_it 21 minutes ago [-]
This seems like nonsense at any angle? Like, if the agent hype comes true, then agents will be just as good at using any website as humans are, and there's no need to make any changes to your site. And if the hype doesn't come true, then who cares if your site is agent ready.
Unless of course you want to expose some functionality only to AIs, not humans. Then sure. But why would you want to do that?
fhd2 17 minutes ago [-]
Yeah, plus it's a bit... single minded. A static single page site is _quite_ "agent ready". Scores 0 here. It's not like it'll need an MCP or whatever.
remywang 16 minutes ago [-]
Have a motherfucking website [1] and you’ll be ready for agents or whatever
Interestingly that site scores a 0. A perfect site without js yet not good enough for "agents".
Hamuko 20 minutes ago [-]
I feel pretty uncomfortable by this being a Cloudflare product. Cloudflare is the one that I'm expecting to keep bots out of my site with their AI bot blocking feature. Feels like I'm letting the fox guard my henhouse.
ndiddy 9 minutes ago [-]
Cloudflare has always operated this way. For example, they give DDoS protection to DDoS for hire services. This increases the supply of these services because it means they can't shut down their competitors by DDoSing each other, which in turn encourages more regular people to use Cloudflare so they won't get their sites DDoSed.
What’s the F is going on? Is the world gone mad or something?
Yes, it's madness but it doesn't matter that it's mad because you can't stop it. It's a technological gold rush, with all of the mixed connotations that "gold rush" should imply.
What’s the F is going on? Is the world gone mad or something?
This, too, will pass. Like Blackberries and car bras.I've redesigned my site to have enough content so that AI knows what I have but they have to send the user to my site to use an interactive JavaScript widget to get the final answer they need. So far so good, but not sure how long that will work for.
I have reduced my online presence to much less than it once was partly because I don't want to feed this machine training data that I've worked hard to make for a human audience.
(Hint: no)
[1] https://www.w3.org/community/reports/tdmrep/CG-FINAL-tdmrep-...
It will hit exactly the same walls too, namely that the technical details are completely irrelevant - if adopting a standard is actually a negative for websites, because it will separate the site from its users, sites will obviously not do it.
You can lead the horse to water but you cannot make it drink, especially if the water is obvious poison.
We couldn't scan this site isitagentready.com returned 522 <none>
The site appears to be experiencing server errors. This is not an agent-readiness issue. Try scanning again later.
I’m not really interested in my website being ai ready, but it’s particularly fascinating to me that they are suggesting and interface for ai agents to make payments to secure access to an api.
Generally, when I want to pay for an api, it would be really wonderful to be able to just direct an ai to setup the account and get me some credentials.
Seems like this belongs squarely in the fun and ever-growing collection of "Cloudflare throws vibe-slop into the world and see what sticks".
Good.
Unless of course you want to expose some functionality only to AIs, not humans. Then sure. But why would you want to do that?
[1]: https://motherfuckingwebsite.com/