I can attest to this. In my experience, websites that ignore scrapers and just let me access the data on their public site are easier to deal with, both code-wise and time-wise. I make the request, they give me the data I need, and then I can piss off.
Websites that make it a cluster&&*& to get at the data achieve two things: they set up a challenge for me to break their idiotic `are you a bot?` check, and in most situations it is trivial anyway to just spin up a VM and run Chrome with Selenium and a Python script.
Granted, I don't hit AJAX APIs or anything like that. Instead, I've found that developers who natively embed a JSON string alongside the data within the HTML are the easiest to parse.
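As a minimal sketch of what I mean: many sites ship the page data as a JSON blob inside a script tag, which you can pull out with a regex and feed straight to a JSON parser, no headless browser needed. The HTML snippet, the `__DATA__` id, and the field names here are all made up for illustration.

```python
import json
import re

# Hypothetical page source: the data lives in a JSON blob inside a
# <script> tag (the id and structure here are invented for this example).
html = """
<html><body>
<script id="__DATA__" type="application/json">
{"listings": [{"address": "12 Example St", "rent": 450, "available": true}]}
</script>
</body></html>
"""

# Pull the JSON string out of the script tag, then parse it directly --
# no DOM traversal, no Selenium.
match = re.search(
    r'<script id="__DATA__" type="application/json">\s*(\{.*?\})\s*</script>',
    html,
    re.DOTALL,
)
data = json.loads(match.group(1))
print(data["listings"][0]["rent"])  # 450
```

One request, one regex, one `json.loads`, and you have structured data instead of scraping text out of markup.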
Reasons why I've set up bots/scrapers:
1) My local city rental market is very competitive and I hate wasting time emailing landlords who have already signed a lease.
2) House prices
3) Car prices
4) Stock prices
5) Banking
6) Water rates
7) Health insurance comparison
8) Automated billing and payment systems.