About LLMs.txt Directory
The LLMs.txt Directory is a community-driven project that catalogs websites implementing the LLMs.txt standard, making it easier for AI developers and users to discover and respect content creators' preferences.
What is LLMs.txt?
LLMs.txt is a proposed standard file, similar in spirit to robots.txt, that allows website owners to communicate how Large Language Models (LLMs) should interact with their content, including permissions for training, content generation, and other AI-related activities.
By implementing an LLMs.txt file, website owners can explicitly state their preferences, helping to establish clearer boundaries for AI systems and the companies that develop them.
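To make the idea concrete, a file like this would typically live at the root of a site (e.g. /llms.txt). The standard is still evolving, so the directive names below (User-Agent, Allow-Training, Allow-Generation, Policy) are purely illustrative assumptions, not part of any finalized specification:

```
# llms.txt — illustrative sketch only; directive names are hypothetical
User-Agent: *
Allow-Training: no
Allow-Generation: yes
Policy: https://example.com/ai-policy
```

The robots.txt-style layout here is an assumption chosen for familiarity; consult the current draft of the standard for the actual syntax before publishing a file.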
Our Mission
Our mission is to promote responsible AI development by:
- Creating a comprehensive directory of websites with LLMs.txt implementations
- Raising awareness about the importance of respecting content creators' preferences
- Providing resources for website owners to implement their own LLMs.txt files
- Fostering a community that values ethical AI development and usage
How to Participate
There are several ways you can participate in this project:
- Implement an LLMs.txt file on your own website and submit it to our directory
- Spread awareness about the LLMs.txt standard and its importance
- Contribute to the development of the LLMs.txt standard
- Provide feedback on our directory and suggest improvements
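Before submitting your site to the directory, it helps to confirm the file is actually reachable at your domain root. A minimal sketch of that check, using only the Python standard library (the function names and the assumption that the file lives at /llms.txt are ours, not part of the standard):

```python
from urllib.parse import urlsplit, urlunsplit
from urllib.request import urlopen

def llms_txt_url(site: str) -> str:
    """Build the root-level /llms.txt URL for a site.

    Accepts either a bare domain ("example.com") or a full URL;
    any path on the input is discarded, since the file is assumed
    to live at the domain root.
    """
    if "://" not in site:
        site = "https://" + site
    parts = urlsplit(site)
    return urlunsplit((parts.scheme, parts.netloc, "/llms.txt", "", ""))

def fetch_llms_txt(site: str, timeout: float = 10.0) -> str:
    """Fetch and decode a site's llms.txt; raises on HTTP errors."""
    with urlopen(llms_txt_url(site), timeout=timeout) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

For example, `fetch_llms_txt("example.com")` would request https://example.com/llms.txt and return its text, or raise an error if the file is missing.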