
I created a scraper to get info from a website, using a for loop for each field and building a separate list on each pass. The results are what I expected, but I'm having a hard time putting this info together into a CSV file. My intention is to create four rows, one per listing, where the corresponding URL, location, etc. match up, but maybe the way I set up my loops is making it harder?

I thought the following syntax would do the trick, but no. It only writes all the URLs in the same row. I did get all the info written out, but it was still all in the same row.

with open('trademe.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['URL', 'Price', 'Location', 'Flatmates'])
    writer.writerow([link, price, loc, mates])

By the way, this is the code so far:

from urllib.request import urlopen
from bs4 import BeautifulSoup as soup
import csv

# url is the TradeMe listings page being scraped (defined earlier)
trademe = urlopen(url)
trademe_html = soup(trademe.read(), "html.parser")
trademe.close()

link=[]
for i in trademe_html.find_all('div', attrs={'class' : 'dotted'}):
    link.append('URL: www.trademe.co.nz'+i.a['href'])
    print('URL: www.trademe.co.nz'+i.a['href'])

price=[]
for i in trademe_html.find_all('div', attrs={'class' : 'flatmates-list-view-card-price'}):
    price.append('Price and availability: ' + i.text.strip())
    print('Price and availability: ' + i.text)

loc=[]
for i in trademe_html.find_all('div', attrs={'class' : 'flatmates-card-subtitle'}):
    loc.append('Location: ' + i.text)
    print('Location: ' + i.text)
    
mates=[]
for i in trademe_html.find_all('div', attrs={'class' : 'flatmates-card-existing-flatmates'}):
    mates.append(i.text.strip())
    print(i.text)
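An alternative to four separate loops, for what it's worth, is to loop over whatever div wraps a whole listing and pull the four fields out of each card, so the values stay grouped per listing from the start. The sketch below assumes such a wrapper exists: 'listing-card' is a made-up placeholder class, not one taken from the actual TradeMe markup, so the selector would need checking against the real page.

# Sketch only: 'listing-card' is a hypothetical class name standing in for the
# container div that wraps one whole listing; the inner class names reuse the
# ones from the loops above.
rows = []
for card in trademe_html.find_all('div', attrs={'class': 'listing-card'}):
    url_div = card.find('div', attrs={'class': 'dotted'})
    price_div = card.find('div', attrs={'class': 'flatmates-list-view-card-price'})
    loc_div = card.find('div', attrs={'class': 'flatmates-card-subtitle'})
    mates_div = card.find('div', attrs={'class': 'flatmates-card-existing-flatmates'})
    rows.append([
        'www.trademe.co.nz' + url_div.a['href'] if url_div and url_div.a else '',
        price_div.text.strip() if price_div else '',
        loc_div.text.strip() if loc_div else '',
        mates_div.text.strip() if mates_div else '',
    ])
# rows now holds one [url, price, location, flatmates] list per listing,
# ready for csv.writer(...).writerows(rows)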
  • Welcome to SO. You've set the newline to ''. Shouldn't it be \n? Commented Nov 24, 2020 at 1:20
  • hey @ewong thanks! I forgot to mention that I tried setting newline='\n' and it didn't work. I also tried adding \n at the end of my loop, and that works when I print inside my IDE, but somehow it doesn't work in the CSV file. Commented Nov 24, 2020 at 7:52

1 Answer

  • Insert your data into a pandas DataFrame
  • Use the to_csv function in pandas

Pandas makes it really easy to export a DataFrame to CSV with headers.
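As a rough sketch of that approach, assuming the four lists built in the question (link, price, loc, mates) are the same length and in the same order, the DataFrame lines the values up per listing and to_csv writes one row each:

import pandas as pd

# Sketch only: assumes link, price, loc and mates are equal-length, aligned lists
df = pd.DataFrame({
    'URL': link,
    'Price': price,
    'Location': loc,
    'Flatmates': mates,
})
df.to_csv('trademe.csv', index=False)  # index=False leaves out the row index column

If the lists ended up with different lengths, pandas raises a ValueError here, which doubles as a quick check that the scrape picked up the same number of items for each field.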


1 Comment

hi @topgunner, thanks for answering my question! I did use a pandas DataFrame, but I still can't get the desired result. The following code writes all the info under the corresponding headers, but still all together in the same row:

fields = ['URL', 'Price', 'Location', 'Flatmates']
rows = [link, price, loc, mates]

with open('agalu.csv', 'w', newline='\n') as csv_file:
    csvwriter = csv.writer(csv_file)
    csvwriter.writerow(fields)
    csvwriter.writerows(rows)
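The likely culprit in that snippet is that rows = [link, price, loc, mates] is a list of four whole lists, so writerows emits one row per field rather than one row per listing. A minimal sketch of the fix, assuming the four lists are the same length and aligned, is to transpose them with zip:

import csv

fields = ['URL', 'Price', 'Location', 'Flatmates']
# zip pairs up the i-th URL, price, location and flatmates entries,
# giving one tuple per listing instead of one list per field
rows = zip(link, price, loc, mates)

with open('agalu.csv', 'w', newline='') as csv_file:
    csvwriter = csv.writer(csv_file)
    csvwriter.writerow(fields)
    csvwriter.writerows(rows)

Note that newline='' (the empty string) is what Python's csv documentation recommends when opening the output file, so the original open call in the question was fine on that point.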
