
Development: Crawl and Work with API

If you want to use crawlers or other bots, keep the following rules in mind:

  1. Respect the robots.txt file.
  2. Always identify yourself via the User-Agent header, and name your bot correctly.
  3. When working with an API, add delays between requests. Don't DoS your own server.

We have an integrated security monitoring system that automatically blocks malicious and abnormal traffic that could destabilize a website. We don't provide whitelists, because even "legitimate" traffic from whitelisted IPs can cause website problems, and this would affect your customers.

Hint

If you have a problem, need help with tweaks, want a free consultation, or simply want to discuss your project with experts and estimate the outcome, we will help. Whatever solution you're looking for to reinforce your online business, let us know through your Scalesta account or by email.