- cross-posted to:
- technology@lemmy.world
- news@beehaw.org
- technology@beehaw.org
An update to Google’s privacy policy suggests that the entire public internet is fair game for its AI projects.
Here’s an example: https://www.google.com/robots.txt
Basically, it’s a file placed in the root directory of a domain to tell automated web crawlers which sections of the site they may access, and which kinds of crawlers are allowed at all.
It isn’t legally binding; it’s more of a courtesy. Some sites will block traffic if they detect crawlers ignoring the rules, so it gives your crawler an idea of what’s okay so it doesn’t get blocked. See the sketch below for how a crawler might check it.
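For illustration, here’s a minimal Python sketch using the standard library’s urllib.robotparser, showing how a well-behaved crawler could check robots.txt before fetching a page. The user agent string "MyCrawler" and the example paths are just placeholders, not anything from Google’s actual policy.

```python
from urllib import robotparser

# Load and parse Google's robots.txt (the example linked above)
rp = robotparser.RobotFileParser()
rp.set_url("https://www.google.com/robots.txt")
rp.read()

# Ask whether a hypothetical crawler ("MyCrawler") may fetch specific paths.
# robots.txt rules are matched per user agent and per URL path.
print(rp.can_fetch("MyCrawler", "https://www.google.com/search"))  # False if /search is disallowed
print(rp.can_fetch("MyCrawler", "https://www.google.com/maps"))    # depends on the current rules
```

Nothing stops a crawler from ignoring this check entirely; it only works if the crawler chooses to respect it.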
Got it. Thanks.