webclaw/packages/create-webclaw/package.json
Valerio 050b2ef463 feat: add allow_subdomains and allow_external_links to CrawlConfig
Crawls are same-origin by default. Enable allow_subdomains to follow
sibling and child subdomains (e.g. blog.example.com from example.com),
or enable allow_external_links for full cross-origin crawling.

Root domain extraction uses a heuristic that handles two-part TLDs
(co.uk, com.au). Includes 5 unit tests for root_domain().
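The policy described above can be sketched roughly as follows. This is an illustrative Python sketch, not webclaw's actual code: the function names (`root_domain`, `should_follow`), the flag defaults, and the two-part TLD list are assumptions (a production implementation would likely consult the full Public Suffix List).

```python
from urllib.parse import urlparse

# Hypothetical two-part TLD set; a real implementation would use a
# much fuller list (or the Public Suffix List).
TWO_PART_TLDS = {"co.uk", "com.au", "co.jp", "org.uk", "com.br"}

def root_domain(host: str) -> str:
    """Registrable root: blog.example.co.uk -> example.co.uk,
    blog.example.com -> example.com."""
    parts = host.lower().rstrip(".").split(".")
    # Keep three labels when the last two form a known two-part TLD.
    if len(parts) > 2 and ".".join(parts[-2:]) in TWO_PART_TLDS:
        return ".".join(parts[-3:])
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

def should_follow(seed_url: str, link_url: str,
                  allow_subdomains: bool = False,
                  allow_external_links: bool = False) -> bool:
    """Same-origin by default; widened by the two opt-in flags."""
    if allow_external_links:
        return True  # full cross-origin crawling
    seed = urlparse(seed_url).hostname or ""
    link = urlparse(link_url).hostname or ""
    if link == seed:
        return True  # same host is always in scope
    if allow_subdomains:
        # Sibling/child subdomains share the same registrable root.
        return root_domain(link) == root_domain(seed)
    return False
```

Under this sketch, `should_follow("https://example.com/", "https://blog.example.com/post")` is rejected by default but accepted with `allow_subdomains=True`.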

Bump to 0.3.12.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 19:33:06 +02:00

{
  "name": "create-webclaw",
  "version": "0.1.4",
  "mcpName": "io.github.0xMassi/webclaw",
  "description": "Set up webclaw MCP server for AI agents (Claude, Cursor, Windsurf, OpenCode, Codex, Antigravity)",
  "bin": {
    "create-webclaw": "./index.mjs"
  },
  "type": "module",
  "keywords": [
    "webclaw",
    "mcp",
    "mcp-server",
    "ai",
    "ai-agent",
    "scraping",
    "web-scraping",
    "scraper",
    "crawler",
    "extract",
    "markdown",
    "llm",
    "claude",
    "cursor",
    "windsurf",
    "opencode",
    "codex",
    "antigravity",
    "tls-fingerprint",
    "cloudflare-bypass"
  ],
  "author": "webclaw",
  "license": "AGPL-3.0",
  "repository": {
    "type": "git",
    "url": "https://github.com/0xMassi/webclaw"
  },
  "homepage": "https://webclaw.io",
  "engines": {
    "node": ">=18"
  }
}