# crawl

**Repository Path**: ipvb/crawl

## Basic Information

- **Project Name**: crawl
- **Description**: go crawl
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 1
- **Forks**: 0
- **Created**: 2018-04-27
- **Last Updated**: 2020-12-19

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# Go crawl

This is a really simple web spider. It fetches all local links on a website and follows them recursively. It is just a visitor and nothing else. To speed things up, the crawling is done concurrently.

Old project: [https://github.com/githubnemo/GoSpider](https://github.com/githubnemo/GoSpider)

## Usage

    $ go build
    $ ./crawl -workers=16 -url=http://localhost/
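
The README does not show the source, but as an illustration of the idea it describes (fetch all same-host links and follow them recursively with a pool of concurrent workers), here is a minimal sketch. The `-workers` and `-url` flag names are taken from the Usage section above; everything else, including the regexp-based link extraction and the worker-pool layout, is an assumption for the example and not the project's actual implementation.

```go
// Minimal sketch of a concurrent same-host link visitor (not the project's code).
package main

import (
	"flag"
	"fmt"
	"io"
	"net/http"
	"net/url"
	"regexp"
	"sync"
)

// Naive href extraction; a real crawler would use an HTML parser.
var hrefRe = regexp.MustCompile(`href="([^"]+)"`)

func main() {
	workers := flag.Int("workers", 4, "number of concurrent fetchers")
	start := flag.String("url", "http://localhost/", "start URL")
	flag.Parse()

	base, err := url.Parse(*start)
	if err != nil {
		panic(err)
	}

	queue := make(chan string)  // URLs waiting to be fetched
	var pending sync.WaitGroup  // counts URLs that are queued or in flight
	var mu sync.Mutex
	seen := map[string]bool{}

	// enqueue marks a URL as seen and schedules it without blocking the caller.
	enqueue := func(u string) {
		mu.Lock()
		defer mu.Unlock()
		if seen[u] {
			return
		}
		seen[u] = true
		pending.Add(1)
		go func() { queue <- u }()
	}

	// Worker pool: each worker fetches a page, prints it, and enqueues
	// every link that stays on the same host.
	for i := 0; i < *workers; i++ {
		go func() {
			for u := range queue {
				fmt.Println("visit:", u)
				for _, link := range fetchLinks(u) {
					abs, err := base.Parse(link)
					if err != nil || abs.Host != base.Host {
						continue // skip broken or external links
					}
					enqueue(abs.String())
				}
				pending.Done()
			}
		}()
	}

	enqueue(base.String())
	pending.Wait() // all discovered URLs have been visited
	close(queue)
}

// fetchLinks downloads a page and returns the raw href values found in it.
func fetchLinks(u string) []string {
	resp, err := http.Get(u)
	if err != nil {
		return nil
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		return nil
	}
	var links []string
	for _, m := range hrefRe.FindAllStringSubmatch(string(body), -1) {
		links = append(links, m[1])
	}
	return links
}
```

The WaitGroup tracks every queued URL, so the program can tell when the whole site has been visited and shut the worker pool down cleanly; this is one common way to terminate a recursive concurrent crawl.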