A web crawler, also known as a spider or bot, is an automated program that systematically browses the web, gathering and indexing information from websites. A crawler begins with a list of initial URLs, often called a seed list, which typically includes popular websites and other specific URLs worth indexing.
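As a rough sketch of the idea, the crawl can be modeled as a breadth-first traversal: seed URLs go into a frontier queue, each URL is visited once, and newly discovered links are appended to the frontier. The `get_links` callback below is a hypothetical stand-in for the network fetch and HTML parsing a real crawler would perform; the toy link graph is purely illustrative.

```python
from collections import deque

def crawl(seed_urls, get_links, max_pages=100):
    """Breadth-first crawl starting from a seed list.

    get_links(url) -> iterable of URLs found on that page
    (a real crawler would fetch and parse the page here;
    it is injected so this sketch stays self-contained).
    """
    frontier = deque(seed_urls)   # URLs waiting to be visited
    visited = set()               # URLs already processed
    order = []                    # visit order, for illustration
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:        # skip anything seen before
            continue
        visited.add(url)
        order.append(url)
        for link in get_links(url):
            if link not in visited:
                frontier.append(link)
    return order

# Toy link graph standing in for real fetched pages (assumption).
graph = {
    "https://example.com": ["https://example.com/a", "https://example.com/b"],
    "https://example.com/a": ["https://example.com/b"],
    "https://example.com/b": [],
}

print(crawl(["https://example.com"], lambda u: graph.get(u, [])))
# → ['https://example.com', 'https://example.com/a', 'https://example.com/b']
```

A production crawler adds much more on top of this skeleton (politeness delays, robots.txt handling, URL normalization, deduplication, and distributed storage), but the seed-list-driven frontier is the core loop.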