A bot (short for "robot") is a program that operates as an agent for a user or another program, or that simulates a human activity. On the Internet, the most ubiquitous bots are the programs, also called spiders or crawlers, that access Web sites and gather their content for search engine indexes.
Search engines use spiders, which crawl (or "spider") the web for information. They are software programs that request pages much like a regular browser does. In addition to reading the contents of pages for indexing, spiders also record links, for several reasons (a minimal crawler sketch follows this list):
- Link citations can be used as a proxy for editorial trust.
- Link anchor text may help describe what a page is about.
- Link co-citation data may be used to help determine which topical communities a page or website exists in.
- Additionally, links are stored to help search engines discover new documents to crawl later.
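To make the mechanics concrete, here is a minimal crawler sketch in Python using only the standard library. The starting URL is a placeholder; a real spider would also honor robots.txt, rate-limit its requests, and parse HTML far more defensively.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects (absolute URL, anchor text) pairs while parsing a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self._href = urljoin(self.base_url, href)
                self._text = []

    def handle_data(self, data):
        if self._href:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def crawl(start_url, max_pages=5):
    """Fetch pages much like a browser, then queue discovered links."""
    queue, seen = [start_url], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        html = urlopen(url).read().decode("utf-8", errors="replace")
        parser = LinkExtractor(url)
        parser.feed(html)
        # Anchor text hints at what the target page is about;
        # the discovered URLs feed the crawl frontier.
        for href, text in parser.links:
            print(f"{url} -> {href} ({text!r})")
            queue.append(href)

crawl("https://example.com")  # placeholder starting point
```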
A chatterbot is a program that can simulate a conversation with a human being. One of the first and most famous chatterbots (predating the Web) was Eliza, a program that pretended to be a psychotherapist and answered questions with other questions.
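Eliza's core technique was simple pattern matching: spot a keyword phrase in the user's input, reflect first-person words into second person, and turn the result back as a question. Here is a minimal sketch of that idea; the rules below are illustrative, not Eliza's actual script.

```python
import re

# Illustrative Eliza-style rules: a regex and a question template.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I),   "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I),     "Tell me more about your {0}."),
]

# Swap first- and second-person words so the reply reads naturally.
REFLECTIONS = {"my": "your", "me": "you", "i": "you", "am": "are"}

def reflect(fragment):
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(statement):
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."  # fallback when no rule matches

print(respond("I feel anxious about my job"))
# -> Why do you feel anxious about your job?
```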
Red and Andrette were the names of two early programs that could be customized to answer questions from users seeking service for a product. Such a program is sometimes called a virtual representative or a virtual service agent.
A shopbot is a program that shops around the Web on your behalf and locates the best price for a product you're looking for. There are also bots such as OpenSesame that observe a user's patterns in navigating a Web site and customize the site for that user.
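Once a shopbot has fetched offers, its core step is just a comparison. A minimal sketch, assuming the scraping has already happened; the retailer names and prices here are made up:

```python
# Hypothetical scraped results: retailer -> price in dollars.
prices = {
    "store-a.example": 24.99,
    "store-b.example": 22.50,
    "store-c.example": 26.00,
}

# Pick the retailer offering the lowest price.
best_store = min(prices, key=prices.get)
print(f"Best price: ${prices[best_store]:.2f} at {best_store}")
```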
A knowbot is a program that collects knowledge for a user by automatically visiting Internet sites and gathering information that meets certain specified criteria.
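A knowbot adds a relevance filter on top of a crawl like the one sketched earlier: pages are kept only when they meet the user's criteria. A minimal sketch, with hypothetical criteria and inline page text standing in for fetched documents:

```python
def matches_criteria(page_text, required_terms):
    """Keep a page only if it mentions every term the user specified."""
    text = page_text.lower()
    return all(term.lower() in text for term in required_terms)

# Hypothetical criteria a user might specify.
criteria = ["solar", "battery", "storage"]

collected = []
for url, text in [("https://example.com/a", "Solar battery storage guide"),
                  ("https://example.com/b", "Weekly gardening tips")]:
    if matches_criteria(text, criteria):
        collected.append(url)

print(collected)  # only pages meeting all criteria
```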