I can't figure out where the error is in this code. It enters a VAT number in the search bar, clicks the search button, and extracts the results:
from seleniumwire import webdriver
import time
API_KEY = 'my_api_key'
proxy_options = {
    'proxy': {
        'https': f'http://scraperapi:{API_KEY}@proxy-server.scraperapi.com:8001',
        'no_proxy': 'localhost,127.0.0.1'
    }
}
url = 'https://www.ufficiocamerale.it/'
vats = ['06655971007', '05779661007', '08526440154']
for vat in vats:
    driver = webdriver.Chrome(seleniumwire_options=proxy_options)
    driver.get(url)
    time.sleep(5)
    item = driver.find_element_by_xpath('//form[@id="formRicercaAzienda"]//input[@id="search_input"]')
    item.send_keys(vat)
    time.sleep(1)
    button = driver.find_element_by_xpath('//form[@id="formRicercaAzienda"]//p//button[@type="submit"]')
    button.click()
    time.sleep(5)
    all_items = driver.find_elements_by_xpath('//ul[@id="first-group"]/li')
    for item in all_items:
        if '@' in item.text:
            print(item.text.split(' ')[1])
    driver.close()
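To clarify what the final loop is supposed to do, here is a standalone sketch of just the extraction step, run against made-up sample strings instead of the real `<li>` text (the actual lines come from the results page, so these examples are only illustrative):

```python
# Standalone sketch of the extraction step: keep lines containing '@'
# and take the second whitespace-separated token as the email address.
# The sample strings below are invented for illustration.
sample_items = [
    "Telefono 06 1234567",
    "PEC esempio@pec.it",
]

emails = []
for text in sample_items:
    if '@' in text:
        emails.append(text.split(' ')[1])

print(emails)  # ['esempio@pec.it']
```

So when the page loads correctly, the script should print one email-like token per matching line.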
Running the script (chromedriver.exe is saved in the same folder, and I'm working in a Jupyter Notebook, if that matters), I get
NoSuchElementException: Message: no such element: Unable to locate element: {"method":"xpath","selector":"//form[@id="formRicercaAzienda"]//input[@id="search_input"]"}
but this element does exist: running the same script without ScraperAPI raises no errors. Can anyone figure out what the problem is?