Adding RSL to robots.txt
RSL extends the limited Allow and Disallow directives of the Robots Exclusion Protocol (robots.txt) with granular controls that define machine-readable usage restrictions and licensing terms.
RSL-compliant robots.txt files must include one or more License directives that link to an RSL license file defining the permitted uses and licensing requirements for the website's content. The License directive is global (not tied to any specific user agent), and crawlers must retrieve and comply with the defined terms before accessing or processing the site's content.
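To make the crawler side concrete, here is a minimal sketch in Python of discovering License directives before any crawling begins. The find_license_urls helper and the example.com URL are assumptions for illustration, not part of the RSL specification.

import urllib.request

def find_license_urls(robots_txt: str) -> list[str]:
    """Collect the value of every License directive in a robots.txt body."""
    urls = []
    for line in robots_txt.splitlines():
        # Drop robots.txt comments, which start with '#'.
        line = line.split("#", 1)[0]
        # Directive names are matched case-insensitively; split on the first colon.
        name, _, value = line.partition(":")
        if name.strip().lower() == "license" and value.strip():
            urls.append(value.strip())
    return urls

robots = urllib.request.urlopen("https://example.com/robots.txt").read().decode("utf-8")
for url in find_license_urls(robots):
    # A compliant crawler retrieves the RSL license file and evaluates its
    # terms before accessing or processing any of the site's content.
    license_xml = urllib.request.urlopen(url).read()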
Usage: Add a License directive to robots.txt
License: [absoluteURL]

[absoluteURL] must point to the location of a valid RSL license file or feed. It must be a fully qualified URL, including the protocol and host, and does not need to be URL-encoded. The URL may be on a different host than the robots.txt file.
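The fully-qualified requirement is straightforward to check mechanically. The following sketch uses Python's standard library; the is_fully_qualified helper is hypothetical, shown only to make the rule concrete.

from urllib.parse import urlparse

def is_fully_qualified(value: str) -> bool:
    """True if a License value carries both a protocol (scheme) and a host."""
    parsed = urlparse(value)
    return bool(parsed.scheme and parsed.netloc)

print(is_fully_qualified("https://example.com/license.xml"))  # True
print(is_fully_qualified("/license.xml"))                     # False: relative path
print(is_fully_qualified("cdn.example.org/license.xml"))      # False: no protocol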
Example robots.txt file
License: https://example.com/license.xml
User-agent: Googlebot
Allow: /
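In this example, the License directive applies site-wide even though it appears alongside a group addressing only Googlebot: any crawler, Googlebot included, is expected to retrieve https://example.com/license.xml and comply with its terms before crawling the paths the Allow rule opens up.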