I'm writing a web scraping script in Ruby that opens a used car website, searches for a make/model of car, loops over the pages of results, and then scrapes the data on each page.
The problem I'm having is that I don't necessarily know the maximum number of pages up front: the pagination only grows and reveals more pages as I iterate toward the last few known ones.
I've defined cleanpages as an array and populated it with the pages that are visible when the site first loads. I then iterate over those pages with cleanpages.each. Each time I land on a new page, I append all of the currently known pages to cleanpages and call cleanpages.uniq to remove duplicates. The problem seems to be that cleanpages.each only iterates as many times as the array's original length.
Can I make it so that, from inside the loop, the number of iterations grows as new pages are discovered?
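For context, here is a minimal sketch of the kind of loop I mean, using an explicit index so the length check is re-evaluated on every pass and newly discovered pages get visited. The discover_pages method is a hypothetical stand-in for scraping the pagination links; in the real script it would parse the results page.

```ruby
# Hypothetical stand-in for scraping pagination links: pretend each
# page reveals the next two page numbers, and the site has 5 pages total.
def discover_pages(page)
  [page + 1, page + 2].select { |p| p <= 5 }
end

cleanpages = [1, 2]   # pages known when the site is first opened
visited = []

i = 0
while i < cleanpages.length   # length is re-checked every iteration
  page = cleanpages[i]
  visited << page             # scrape this page's data here
  # Merge newly revealed pages; concat and uniq! mutate cleanpages
  # in place, so the loop's length check sees the additions.
  cleanpages.concat(discover_pages(page))
  cleanpages.uniq!
  i += 1
end
```

With the fake discover_pages above, the loop starts with two known pages but ends up visiting all five, which is the behavior I'm after.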