I have scraped some links using the following code, but I can't seem to store them in a single column of a CSV/Excel file. When I run the code, each link address gets split into its individual characters, which then end up spread across multiple columns.
for h1 in soup.find_all('h1', class_="title entry-title"):
    print(h1.find("a")['href'])
This yields all the links I need to find.
To store the links in a CSV file, I used:
import csv

with open('links1.csv', 'wb') as f:
    writer = csv.writer(f)
    for h1 in soup.find_all('h1', class_="title entry-title"):
        writer.writerow(h1.find("a")['href'])
I also tried storing the result in a variable first, for instance:
for h1 in soup.find_all('h1', class_="title entry-title"):
    dat = h1.find("a")['href']
and then tried passing dat to other CSV-writing snippets, but that did not work either.
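For reference, one of those other attempts looked roughly like this (a sketch from memory; the list name links and the file name links2.csv are just what I used locally). It collects all the hrefs into a list first, but writing them out still spreads each URL's characters across separate columns:

import csv

# Collect every href into a plain Python list first
links = []
for h1 in soup.find_all('h1', class_="title entry-title"):
    links.append(h1.find("a")['href'])

# Then try to write the collected links to a CSV file
with open('links2.csv', 'wb') as f:
    writer = csv.writer(f)
    for link in links:
        writer.writerow(link)  # this still puts each character in its own column

What I actually want is one link per row, all in a single column.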