I am trying to scrape lyrics from an API and print the responses directly to a CSV file, like so:
import codecs
import csv

import requests
from bs4 import BeautifulSoup

def scrape_genius_lyrics(urls):
    all_lyrics = []
    headers = {'Authorization': 'mytoken'}
    base_url = 'https://genius.com/'
    with codecs.open('genius.csv', 'ab', encoding='utf8') as outputfile:
        outwriter = csv.writer(outputfile)
        for url in urls:
            page_url = base_url + url
            try:
                page = requests.get(page_url, headers=headers)
                html = BeautifulSoup(page.text, "html.parser")
                # strip <script> tags so they don't leak into get_text()
                [h.extract() for h in html('script')]
                lyrics = html.find('div', class_='lyrics').get_text()
                # outwriter.writerow(lyrics)
                all_lyrics.append(lyrics)
                print(lyrics)
            except:
                print('could not find page for {}'.format(url))
However, I only see output if I comment out outwriter.writerow(lyrics); otherwise the program halts and does not print the lyrics.
How can I save each song's lyrics to its own row in the CSV file, at each iteration?
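A likely culprit: csv.writer.writerow expects a sequence of fields, not a bare string. Passing the lyrics string writes one character per column, and with a codecs-wrapped file under Python 2 the csv module can also choke on non-ASCII text; the bare except then swallows the error, so you never see the print. Wrapping the string in a one-element list puts each song in a single cell on its own row. A minimal sketch of just the writing step (write_lyrics_rows is a hypothetical helper name, not from your code; in Python 3 use the built-in open with newline='' rather than codecs.open):

```python
import csv

def write_lyrics_rows(lyrics_list, path):
    # csv.writer.writerow takes a sequence of fields; wrapping the
    # string in a one-element list keeps each song's lyrics in a
    # single cell instead of spreading one character per column.
    with open(path, 'a', newline='', encoding='utf8') as outputfile:
        outwriter = csv.writer(outputfile)
        for lyrics in lyrics_list:
            outwriter.writerow([lyrics])
```

In your loop that would mean outwriter.writerow([lyrics]) at each iteration; the writer quotes embedded newlines and commas, so multi-line lyrics survive a round trip through csv.reader.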
[h.extract() for h in html('script')] on its own discards the resulting list... Did you want to save that list?