Bishma (@Bishma@discuss.tchncs.de)

I mentioned this in another thread, but I do worry that Google is eventually going to infect the APIs that metasearch engines like DDG, Kagi, SearXNG, etc. depend on.

In my experience, a lot of the sysadmins who run high-traffic sites treat all bots as scrapers that have to be blocked or slowed to a crawl, and then make special allowances for Googlebot, Bingbot/MSNBot, and a few others. That means there's a massive uphill climb (beyond the technical one) to building a new search engine from scratch. With Google and MS both betting the farm on LLMs, I fear we're going to lose access to two of the most valuable reverse indexes of the web.
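
To make that concrete, here's a rough sketch of the kind of user-agent gating I mean. The names and regexes are purely illustrative (not anyone's actual config), but it's the pattern a lot of sites follow: allow the handful of "known good" crawlers, treat everything else that looks like a bot as a scraper.

```python
import re

# Hypothetical allowlist: only the big, well-known crawlers get a pass.
ALLOWED_CRAWLERS = re.compile(r"(Googlebot|bingbot|msnbot)", re.IGNORECASE)
# Anything else that self-identifies as a bot/crawler/spider gets treated as a scraper.
GENERIC_BOT = re.compile(r"(bot|crawler|spider|scraper)", re.IGNORECASE)

def classify(user_agent: str) -> str:
    """Return 'allow', 'block', or 'human' for a request's User-Agent string."""
    if ALLOWED_CRAWLERS.search(user_agent):
        return "allow"   # special allowance for Googlebot, Bingbot, etc.
    if GENERIC_BOT.search(user_agent):
        return "block"   # every other bot is assumed to be a scraper
    return "human"       # ordinary browsers pass through

if __name__ == "__main__":
    for ua in [
        "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
        "MyNewSearchEngineBot/0.1 (+https://example.org/bot)",  # a hypothetical newcomer
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Firefox/126.0",
    ]:
        print(f"{classify(ua):>5}  {ua}")
```

A brand-new search engine's crawler falls into the "block" bucket by default, which is exactly the non-technical hill it has to climb: getting itself onto everyone's allowlist.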
