uncloseai.

Our Ethical Web Crawler

About Our Web Crawler

Our team uses an ethical web crawler for research and development on the open web. We believe in responsible data collection that respects website owners and follows industry best practices.

User Agent

Our crawler identifies itself with the following user agent:

uncloseai.com/1.42 (ethical web crawler; +https://uncloseai.com)

This clearly identifies who we are and provides a link back to this page for more information.

What We Do

Our crawler is used by our AI systems to fetch publicly available pages in support of research and to help answer user questions.

Our Ethical Practices

We follow strict ethical guidelines to ensure we're good web citizens:

1. Robots.txt Compliance

We always respect robots.txt files. If your site's robots.txt disallows our crawler, we won't access those pages.

# Example: Block our crawler from specific paths
User-agent: uncloseai.com
Disallow: /private/
Disallow: /admin/
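The rules above can be checked with Python's standard urllib.robotparser; this is an illustrative sketch of the compliance check, not our production code:

```python
from urllib import robotparser

# Parse the example rules from above (normally fetched from /robots.txt).
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: uncloseai.com",
    "Disallow: /private/",
    "Disallow: /admin/",
])

# /private/ is blocked for our crawler; other paths remain allowed.
rp.can_fetch("uncloseai.com", "https://example.com/private/page")  # False
rp.can_fetch("uncloseai.com", "https://example.com/blog/post")     # True
```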

2. Crawl Delays

We respect crawl delays specified in robots.txt. Our default is 2 seconds between requests to the same domain, but we'll honor any delay you specify:

# Example: Set custom crawl delay
User-agent: uncloseai.com
Crawl-delay: 5

3. Intelligent Caching

We cache fetched content for 7 days by default. This means repeated requests for the same page within that window are served from our cache rather than re-fetched from your server.

4. Smart Depth Crawling

We use intelligent depth-based crawling.

Our crawler automatically decides how deep to go based on whether it found relevant information, so we don't waste resources fetching unnecessary pages.
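The idea can be sketched as a breadth-first crawl that stops expanding any branch whose pages are judged irrelevant; `fetch`, `extract_links`, and `is_relevant` are hypothetical callbacks, not our actual interfaces:

```python
from collections import deque

def crawl(start_url, fetch, extract_links, is_relevant, max_depth=3):
    """Breadth-first crawl that never expands an irrelevant branch,
    so unnecessary pages are not fetched deeply."""
    seen = {start_url}
    queue = deque([(start_url, 0)])
    results = []
    while queue:
        url, depth = queue.popleft()
        page = fetch(url)
        if not is_relevant(page):
            continue  # stop going deeper on this branch
        results.append(url)
        if depth < max_depth:
            for link in extract_links(page):
                if link not in seen:
                    seen.add(link)
                    queue.append((link, depth + 1))
    return results
```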

5. Query-Aware Relevance

We score pages based on relevance to the user's question and prioritize the most relevant pages when deciding which links to follow.
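A toy scoring function illustrating the idea (our production scoring is more sophisticated than term overlap):

```python
def relevance_score(query: str, page_text: str) -> float:
    """Fraction of the query's terms that appear in the page text."""
    terms = set(query.lower().split())
    if not terms:
        return 0.0
    words = set(page_text.lower().split())
    return len(terms & words) / len(terms)
```

Pages scoring above a threshold get crawled deeper; low-scoring pages end the branch.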

Transparency

We believe in being open about our crawling practices, which is why we publish this page, identify our crawler clearly, and link back here from our user agent string.

How to Block Our Crawler

If you don't want our crawler accessing your site, you can block it using robots.txt:

# Block uncloseai.com crawler entirely
User-agent: uncloseai.com
Disallow: /

Or use your server's firewall/WAF to block our user agent string.
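For example, a minimal nginx sketch (placed inside a server block; adjust for your own configuration):

```nginx
# Reject requests whose User-Agent mentions our crawler.
if ($http_user_agent ~* "uncloseai\.com") {
    return 403;
}
```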

Questions or Concerns?

If you have questions about our crawler or want to discuss our access to your site, please reach out to us.

Try Our Discord Bot

Our ethical web crawler powers the research capabilities in our Discord bot.

Want to try it? Check out our Discord bot source code to see how our ethical crawler powers the bot's research capabilities.