Commit e4d3670

feat: add initial robots.txt file to manage web crawler access

1 parent 313827f

1 file changed

Lines changed: 21 additions & 0 deletions

File tree

public/robots.txt
@@ -0,0 +1,21 @@
+# robots.txt for techdiary.dev
+
+# 1. Global rules for all bots
+User-agent: *
+Disallow: /dashboard
+Disallow: /backdoor
+
+# Allow important resources
+Allow: /css/
+Allow: /js/
+Allow: /images/
+Allow: /static/
+
+# 2. Optional: Add crawl delay for non-Google bots to reduce server load
+# Useful for Bingbot, Yandex, etc.; Googlebot ignores this.
+User-agent: Bingbot
+Crawl-delay: 10
+
+# 3. Sitemap location(s)
+Sitemap: https://www.techdiary.dev/sitemaps/articles/sitemap.xml
+Sitemap: https://www.techdiary.dev/sitemaps/profiles/sitemap.xml
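The rules added above can be sanity-checked locally with Python's built-in `urllib.robotparser` before deploying. This is a minimal sketch: the parsed text is an excerpt of the committed file, and the sample URL path `/css/app.css` is a hypothetical example, not a real asset from the site.

```python
# Verify the robots.txt rules from the commit using the stdlib parser.
from urllib.robotparser import RobotFileParser

# Excerpt of the committed public/robots.txt.
rules = """\
User-agent: *
Disallow: /dashboard
Disallow: /backdoor
Allow: /css/

User-agent: Bingbot
Crawl-delay: 10
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# /dashboard is disallowed for all user agents.
print(parser.can_fetch("*", "/dashboard"))    # False
# /css/ is explicitly allowed (hypothetical asset path).
print(parser.can_fetch("*", "/css/app.css"))  # True
# Bingbot's crawl delay is picked up from its own record.
print(parser.crawl_delay("Bingbot"))          # 10
```

Note that `Crawl-delay` is a non-standard extension: Bing and Yandex honor it, while Googlebot ignores it, which is exactly why the file scopes it to a `User-agent: Bingbot` record rather than the global `*` group.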
