Scrapy - Use Website's Search Engine To Scrape Results

- 1 answer

I have to go through the results of a search on a website. The problem is that the URL doesn't change when you search for something on that site, so I can't use the URL to get the results I want.

My question is: can Scrapy set the filters that I need, run the search, and then go through all the results of the search?

If yes, how? And if not, do you know something that could do it, using Python or something else?




If the search term is not reflected in the URL, it is most likely transmitted to the server in the body of a POST request. This means your Scrapy code also needs to make a POST request in order to submit the desired search term.
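As a quick illustration (not part of Scrapy itself), the body of such a POST request is just the form fields serialized as URL-encoded key/value pairs, which is why nothing shows up in the address bar. Python's standard library shows what actually gets transmitted; the field names here (`query`, `category`) are made up for the example:

```python
from urllib.parse import urlencode

# A browser serializes the search form into a URL-encoded body
# and sends it as the POST payload, not as part of the URL.
body = urlencode({"query": "blue widgets", "category": "all"})
print(body)  # query=blue+widgets&category=all
```

Scrapy's FormRequest builds exactly this kind of body for you from the formdata dict.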

The Scrapy Request documentation includes an example of making a POST request that simulates a form submission:

return [FormRequest(url="http://www.example.com/post/action",
                    formdata={'name': 'John Doe', 'age': '27'},
                    callback=self.after_post)]