Definition: Crawler


A “crawler” is an automated program or script that search engines use to scan websites and collect information about their content. Its purpose is to help a search engine understand what a website is about and what information it provides, so the site can be added to the engine’s index and returned in relevant search results.

Crawlers visit a website, follow its links, and retrieve information about each page, including its title, meta description, and keywords. This information is then used to build an index of the site and to serve relevant search results to users.
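To make the process concrete, here is a minimal sketch of a crawler in Python. It assumes the widely used requests and BeautifulSoup libraries, and the starting URL is purely hypothetical; a real crawler would also respect robots.txt and rate limits.

import urllib.parse

import requests
from bs4 import BeautifulSoup

def crawl(start_url, max_pages=10):
    """Breadth-first crawl: fetch pages, record title and description, follow links."""
    seen = set()
    queue = [start_url]
    index = {}

    while queue and len(index) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)

        try:
            response = requests.get(url, timeout=5)
            response.raise_for_status()
        except requests.RequestException:
            continue  # skip pages that fail to load

        soup = BeautifulSoup(response.text, "html.parser")

        # Record the page title and meta description, as a search engine would.
        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        meta = soup.find("meta", attrs={"name": "description"})
        description = meta["content"].strip() if meta and meta.get("content") else ""
        index[url] = {"title": title, "description": description}

        # Follow the page's links, staying within the same site.
        for link in soup.find_all("a", href=True):
            next_url = urllib.parse.urljoin(url, link["href"])
            if urllib.parse.urlparse(next_url).netloc == urllib.parse.urlparse(start_url).netloc:
                queue.append(next_url)

    return index

# Hypothetical starting point; replace with a site you are permitted to crawl.
pages = crawl("https://example.com")
for url, info in pages.items():
    print(url, "-", info["title"])

The visited-set and queue mirror how real crawlers avoid revisiting pages while discovering new ones through links, and the resulting dictionary is a toy version of a search engine’s index.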

Crawlers can also be used by other types of software, such as site-analysis tools, to examine a website and gather information about its structure, performance, and other factors relevant to search engine optimization and website design.


By Ashley Bryan

Ashley Bryan is an Internet Strategist and an SEO Consultant based on the Sunshine Coast in Australia with over 19 years’ experience. He owns WebsiteStrategies, which serves small to medium businesses in Australia and New Zealand.