I was recently assisted in scraping data from a webpage by the guys at Stack Overflow. It's a great community. I was given a function that pulls data into Excel from a cell containing a URL. Unfortunately I'm running into some problems, because I need a looping macro so that Excel does not re-run all my functions every time I save or refresh the page.
So far I have tried to build this, but I am next to useless in VBA. I'm wondering if anyone can provide a little extra assistance.
Sub POSTPageViews()
    Dim InputSheet As Worksheet
    Dim i As Long
    Dim AllWords As Range
    Dim text As String
    Dim OutValue As String
    Dim driver As SeleniumWrapper.WebDriver
    On Error Resume Next
    Set driver = New SeleniumWrapper.WebDriver
    driver.Start "chrome", "https://re.po.st/"
    driver.Open strLocation
    Set InputSheet = Active
    Set WorkRng = Application.Selection
    WordListSheet.Range("E1") = "All Words"
    InputSheet.Activate
    r = 1
    Do While Cells(r, 1) <> ""
        Cells(r, 1).Value = txt
        OutValue = driver.findElementById("sguidtotaltable").findElementByTagName("span").text
        Next i
        r = r + 1
        driver.stop 'Stops the browser
    Loop
End Sub
But naturally it is not working... Can anybody see what is wrong? Basically, in column E I have all the URLs, and in column K I would like to see the accompanying values.
Thanks
You have a Next i without a For i, or any other reference to i for that matter. Removing On Error Resume Next, at least for the moment, will allow the editor to tell you where your code is going wrong and why. If strLocation is where your URL is, then I would put it inside your loop. Maybe something like this: driver.Open Cells(r, 5).Value. And you can place driver.stop outside of the loop. The results could be in column F, maybe, like Cells(r, 6) = driver.findElementById(...
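Putting those suggestions together, here is a minimal sketch of what the loop might look like. It assumes the SeleniumWrapper API already used in the question (driver.Start, driver.Open, findElementById, findElementByTagName, driver.stop) and follows the layout described in the question, reading URLs from column E and writing the scraped values to column K; the element id "sguidtotaltable" is simply carried over from the original code and may need adjusting for the actual page.

Sub POSTPageViews()
    Dim driver As SeleniumWrapper.WebDriver
    Dim r As Long

    Set driver = New SeleniumWrapper.WebDriver
    driver.Start "chrome", "https://re.po.st/"

    r = 1
    Do While Cells(r, 5).Value <> ""    'column E holds the URLs
        driver.Open Cells(r, 5).Value   'open the URL on row r inside the loop
        'write the scraped span text into column K of the same row
        Cells(r, 11).Value = driver.findElementById("sguidtotaltable") _
                                   .findElementByTagName("span").text
        r = r + 1
    Loop

    driver.stop 'Stops the browser once, after all rows are processed
End Sub

Keeping driver.stop outside the loop means the browser is started once and reused for every row, rather than being shut down after the first URL.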