Development: Crawling and Working with the API
If you want to run crawlers or bots, keep these rules in mind:

- Respect the robots.txt file.
- Always identify yourself via the User-Agent header, and name your agent correctly.
- When working with the API, add delays between requests. Don't DoS the server.
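The rules above can be sketched as a minimal "polite crawler" in Python using only the standard library. The site URL, paths, and the `MyCompanyBot/1.0` agent name are hypothetical placeholders, and the robots.txt content is an inline sample so the sketch stays self-contained; in real use you would fetch the site's actual robots.txt.

```python
# Minimal polite-crawler sketch: respect robots.txt, identify yourself,
# and pace your requests. All names and URLs here are illustrative.
import time
import urllib.robotparser

AGENT = "MyCompanyBot/1.0"  # product/version token per RFC 7231, section 5.5.3

# 1. Respect robots.txt: parse it before fetching anything.
rp = urllib.robotparser.RobotFileParser()
# In real use: rp.set_url("https://example.com/robots.txt"); rp.read()
# Here we parse an inline sample instead of fetching over the network.
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
    "Crawl-delay: 1",
])

print(rp.can_fetch(AGENT, "https://example.com/products"))     # True
print(rp.can_fetch(AGENT, "https://example.com/admin/login"))  # False

# 2. Identify yourself: send your agent name in the User-Agent header.
headers = {"User-Agent": AGENT}

# 3. Pace your requests: honor Crawl-delay, with a fallback pause.
delay = rp.crawl_delay(AGENT) or 1.0
for path in ("/products", "/news"):
    # the actual request would go here, e.g. urllib.request with `headers`
    time.sleep(delay)  # avoid hammering (DoS-ing) the server
```

The same pattern applies to API clients: reuse the parsed delay (or the API's documented rate limit) between calls instead of firing requests in a tight loop.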
We have an integrated security monitoring system that automatically blocks malicious and abnormal traffic that could destabilize the website. We do not provide whitelists, because even "legitimate" traffic from whitelisted IPs can cause website problems, and this would affect your customers.
If you have a problem, need assistance with tweaks or a free consultation, want to discuss your project with experts and estimate the outcome, or are looking for a solution that reinforces your online business, we will help. Let us know through MyCloud or email.
The syntax rules for naming your User-Agent are defined in RFC 7231, section 5.5.3 (User-Agent).