I would like to scrape the following website using Python and export the scraped data to a CSV file:
http://www.swisswine.ch/en/producer?search=&&
The relevant search on this website spans 154 pages. I need to request every page and scrape its data, but my script doesn't move on to the next page; it only scrapes one page's data.
Here I used the condition i < 153, but the script only returned data for a single page and gave me 10 entries. I need the data from the 1st page through the 154th.
How can I scrape the data from all the pages in a single run of the script, and how do I export the data to a CSV file?
My script is as follows:
import csv
import requests
from bs4 import BeautifulSoup
i = 0
while i < 153:
    url = ("http://www.swisswine.ch/en/producer?search=&&&page=" + str(i))
    r = requests.get(url)
    i=+1
    r.content
    soup = BeautifulSoup(r.content)
    print(soup.prettify())
    g_data = soup.find_all("ul", {"class": "contact-information"})
    for item in g_data:
        print(item.text)
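For reference, here is a sketch of the kind of structure I imagine: a helper that parses one page's HTML, and a loop that fetches every page and writes rows to a CSV. Note `i=+1` above just re-assigns 1 to `i`; the increment would need to be `i += 1` (or a `range`). The single `contact_information` column and the `producers.csv` filename are my own assumptions, since I don't know yet how the scraped text should be split into fields:

```python
import csv
import requests
from bs4 import BeautifulSoup

BASE_URL = "http://www.swisswine.ch/en/producer?search=&&&page="

def extract_contacts(html):
    """Return the text of every <ul class="contact-information"> in one page."""
    soup = BeautifulSoup(html, "html.parser")
    return [item.get_text(strip=True)
            for item in soup.find_all("ul", {"class": "contact-information"})]

def scrape_all(pages=154, out_path="producers.csv"):
    # "w" with newline="" is the documented way to open a file for csv.writer
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["contact_information"])  # assumed single-column layout
        for page in range(pages):  # range(154) covers pages 0..153
            r = requests.get(BASE_URL + str(page))
            for text in extract_contacts(r.content):
                writer.writerow([text])

if __name__ == "__main__":
    scrape_all()
```

Using `range(pages)` instead of a manually incremented counter avoids the off-by-one and the broken `i=+1` entirely, and writing rows inside the loop means nothing is lost if one page fails partway through.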