Paper | Prorock: Addressing the Limitations of Robots.txt in Controlling AI Crawlers
slides-aicontrolws-addressing-the-limitations-of-robotstxt-in-controlling-ai-crawlers-00
| Slides | IAB Workshop on AI-CONTROL (aicontrolws) Team |
|---|---|
| Title | Paper \| Prorock: Addressing the Limitations of Robots.txt in Controlling AI Crawlers |
| Abstract | The emergence of Generative AI and the surrounding ecosystem has introduced new challenges for the internet, highlighting the limitations of the Robots Exclusion Protocol (RFC 9309). The current mechanisms for controlling automated access are inadequate for both AI system operators and content creators. This paper explores the deficiencies of the robots.txt approach and proposes considerations for a more robust solution. |
| State | Active |
| Other versions | |
| Last updated | 2024-09-09 |