A crawler (also called a spider) is a computer program running on a machine connected to a network (e.g. the internet) that systematically "crawls" through the content of the network, fetching pages and following the links they contain. Crawlers are commonly used by search engines to build up a search index, which enables fast search results.
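To make the mechanism concrete, here is a minimal sketch of such a crawler in Python. It fetches a page, extracts its links, and follows them breadth-first, collecting raw page content the way a search engine would before indexing. The seed URL, page limit, and helper names are illustrative assumptions, not taken from the text; a real crawler would also respect robots.txt, rate limits, and politeness policies.

```python
# A minimal, hypothetical breadth-first crawler sketch using only the
# Python standard library. Names like crawl() and LinkExtractor are
# illustrative, not part of any established API.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href targets of all <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Visit up to max_pages pages starting from seed_url, breadth-first,
    and return a mapping from URL to raw page content -- the material a
    search engine would feed into its index."""
    seen = {seed_url}
    queue = deque([seed_url])
    index = {}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable or broken page: skip and keep crawling
        index[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return index


if __name__ == "__main__":
    # Example seed URL is a placeholder.
    pages = crawl("https://example.com", max_pages=3)
    print(list(pages))
```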