
Retry Settings

There are several reasons why this configuration can come in handy. For example, if you lose your internet connection in the middle of a task, the application will keep retrying the current operation until the connection is re-established.

Another advantage of these options appears when you assign proxies to a particular service: if a proxy fails to do its job, the application switches to a different proxy before retrying.

This works well because the web crawler has an internal raw validation method that checks whether a page was crawled correctly. This behavior applies to popular websites such as Google, Wikipedia, and Bing.
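To make the behavior concrete, here is a minimal sketch of a retry loop with proxy rotation and a stand-in validation check. The crawl_with_retries helper, the proxy list, and the validation condition are all hypothetical names for this illustration; they are not the application's actual API.

```python
import time
import urllib.request


def crawl_with_retries(url, proxies, max_retries=5):
    """Fetch a page, rotating proxies and retrying until validation passes."""
    for attempt in range(max_retries):
        # Switch to a different proxy on each attempt, as described above.
        proxy = proxies[attempt % len(proxies)]
        opener = urllib.request.build_opener(
            urllib.request.ProxyHandler({"http": proxy, "https": proxy})
        )
        try:
            with opener.open(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
            # Stand-in for the internal "raw validation" check: accept the
            # page only if it looks like real HTML.
            if "<html" in html.lower():
                return html
        except OSError:
            pass  # dropped connection or dead proxy; retry with the next one
        time.sleep(2 ** attempt)  # brief backoff before re-executing
    raise RuntimeError(f"Failed to crawl {url} after {max_retries} retries")
```

A higher max_retries makes the loop more resilient, but, as the caution below notes, it also slows things down when a target keeps failing.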

Caution: A higher value may slow down the associated task.

In short, this feature helps each running service perform at its best and return the expected results.

The following settings are available:
Web Crawler Retries

The maximum number of retry attempts if an error occurs while crawling web pages. The default value is 5, and it should be between 1 and 20.

Search Engine Crawler Retries

The maximum number of retry attempts if an error occurs while crawling search engine result pages. The default value is 5, and it should be between 0 and 100.

Domain Vanity Service Retries

The maximum number of retry attempts if an error occurs while sending a request to a WHOIS server (checking domain availability). The default value is 2, and it should be between 0 and 100.

Web 2.0 Vanity Service Retries

The maximum number of retry attempts if an error occurs while performing a Web 2.0 availability check. The default value is 5, and it should be between 0 and 100.

Statistic Engine Retries

The maximum number of retry attempts if an error occurs while gathering domain metrics. The default value is 5, and it should be between 0 and 100.
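As an illustration of the defaults and ranges listed above, the sketch below clamps a supplied value into each setting's allowed range. The setting keys and the resolve_retries helper are made up for this example and are not the application's real configuration names.

```python
# Hypothetical mapping of each retry setting to (default, minimum, maximum),
# mirroring the values documented above.
RETRY_SETTINGS = {
    "web_crawler_retries": (5, 1, 20),
    "search_engine_crawler_retries": (5, 0, 100),
    "domain_vanity_service_retries": (2, 0, 100),
    "web20_vanity_service_retries": (5, 0, 100),
    "statistic_engine_retries": (5, 0, 100),
}


def resolve_retries(name, value=None):
    """Return the default, or the given value clamped into the allowed range."""
    default, low, high = RETRY_SETTINGS[name]
    if value is None:
        return default
    return max(low, min(high, value))


print(resolve_retries("web_crawler_retries"))      # -> 5 (the default)
print(resolve_retries("web_crawler_retries", 50))  # -> 20 (clamped to maximum)
```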