
I'm trying to get a value that is given by the website after a click on a button.

Here is the website: https://www.4devs.com.br/gerador_de_cpf

You can see that there is a button called "Gerar CPF"; clicking it generates a number that appears on the page.

My current script opens the browser and gets the value, but the value is read from a fresh copy of the page that never saw the click, so it is empty. I would like to know whether it is possible to get the value that appears after the button is clicked.

from selenium import webdriver
from bs4 import BeautifulSoup
from requests import get

url = "https://www.4devs.com.br/gerador_de_cpf"

def open_browser():
    driver = webdriver.Chrome("/home/felipe/Downloads/chromedriver")
    driver.get(url)
    driver.find_element_by_id('bt_gerar_cpf').click()

def get_cpf():
    response = get(url)

    page_with_cpf = BeautifulSoup(response.text, 'html.parser')

    cpf = page_with_cpf.find("div", {"id": "texto_cpf"}).text

    print("The value is: " + cpf)


open_browser()
get_cpf()

4 Answers


open_browser and get_cpf are absolutely not related to each other: get_cpf downloads a fresh copy of the page with requests, so it never sees the value generated by the click in the browser.

Actually you don't need get_cpf at all. Just wait for text after clicking the button:

from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait as wait

url = "https://www.4devs.com.br/gerador_de_cpf"

def open_browser():
    driver = webdriver.Chrome("/home/felipe/Downloads/chromedriver")
    driver.get(url)
    driver.find_element_by_id('bt_gerar_cpf').click()
    text_field = driver.find_element_by_id('texto_cpf')
    # Wait up to 10 seconds until the field no longer shows the
    # 'Gerando...' placeholder and is non-empty; until() returns
    # the truthy value of the condition, i.e. the text itself.
    text = wait(driver, 10).until(lambda driver: text_field.text != 'Gerando...' and text_field.text)
    return text

print(open_browser())
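The waiting condition itself can be sketched without a browser. Below is a minimal re-implementation of the `until` polling idea, run against a hypothetical stub element (the CPF value 123.456.789-09 is just an illustrative placeholder, not something the site returned):

```python
import time

class StubElement:
    """Hypothetical stand-in for a Selenium WebElement: its text
    flips from the placeholder to a generated CPF after a delay."""
    def __init__(self, delay=0.1):
        self._start = time.monotonic()
        self._delay = delay

    @property
    def text(self):
        if time.monotonic() - self._start < self._delay:
            return 'Gerando...'
        return '123.456.789-09'

def until(predicate, timeout=10, poll=0.05):
    """Minimal sketch of WebDriverWait.until: keep calling the
    predicate until it returns a truthy value or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = predicate()
        if result:
            return result
        time.sleep(poll)
    raise TimeoutError('condition never became truthy')

field = StubElement()
# Same condition as in the answer: wait until the text is no longer
# the 'Gerando...' placeholder and is non-empty.
cpf = until(lambda: field.text != 'Gerando...' and field.text)
print(cpf)  # → 123.456.789-09
```

Because `and` returns its second operand when the first is truthy, the condition evaluates to the text itself, which is what `until` hands back.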

Update

The same with requests:

import requests

# POST the same parameters the page sends when the button is clicked;
# 'pontuacao': 'S' asks for a punctuated CPF.
url = 'https://www.4devs.com.br/ferramentas_online.php'
data = {'acao': 'gerar_cpf', 'pontuacao': 'S'}
response = requests.post(url, data=data)
print(response.text)
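Since 'pontuacao': 'S' returns the CPF with punctuation, the formatted value can be normalized to bare digits locally. A small sketch (the example CPF is just a placeholder):

```python
import re

def strip_cpf_punctuation(cpf: str) -> str:
    """Remove the dots and dash from a formatted CPF, keeping digits only."""
    return re.sub(r'\D', '', cpf)

print(strip_cpf_punctuation('123.456.789-09'))  # → 12345678909
```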

7 Comments

You are right, they are not related. But I tried your code here and the value is still empty
@Felipe, did you try the initial answer or the latest update? Still no output? I tried it a couple of times and got the required output...
I tried the latest update; here is the output: <div class="output-txt" id="texto_cpf" onclick="fourdevs.selectText(this)"></div>
@Felipe, hmm... My code definitely cannot return a div node... I think you are using it incorrectly. Just do print(open_browser()) to get the value
@Felipe, I don't use Scrapy, so I can't provide you with appropriate code. Check the updated answer

You don't need to use requests and BeautifulSoup.

from selenium import webdriver
from time import sleep

url = "https://www.4devs.com.br/gerador_de_cpf"

def get_cpf():
    driver = webdriver.Chrome("/home/felipe/Downloads/chromedriver")
    driver.get(url)
    driver.find_element_by_id('bt_gerar_cpf').click()
    sleep(10)  # crude fixed wait for the generated value to appear
    text = driver.find_element_by_id('texto_cpf').text
    print(text)
get_cpf()



You can use a while loop that spins until the text changes:

from selenium import webdriver

url = "https://www.4devs.com.br/gerador_de_cpf"

def get_value():
    driver = webdriver.Chrome()
    driver.get(url)
    driver.find_element_by_id('bt_gerar_cpf').click()
    # Busy-wait until the placeholder is replaced by the generated CPF.
    while driver.find_element_by_id('texto_cpf').text == 'Gerando...':
        continue
    val = driver.find_element_by_id('texto_cpf').text
    driver.quit()
    return val

print(get_value())
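One caveat: a bare while ... continue loop spins at full CPU and never exits if the text never changes. The same polling can be sketched with a sleep between attempts and a deadline; here a stub reader stands in for the Selenium call, and the CPF value is a placeholder:

```python
import time

def poll_text(read_text, placeholder='Gerando...', timeout=10, interval=0.1):
    """Poll read_text() until it returns something other than the
    placeholder (or empty), sleeping between attempts instead of
    spinning, and give up after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        text = read_text()
        if text and text != placeholder:
            return text
        time.sleep(interval)
    raise TimeoutError('text never changed from placeholder')

# Stub standing in for driver.find_element_by_id('texto_cpf').text:
values = iter(['Gerando...', 'Gerando...', '111.222.333-44'])
print(poll_text(lambda: next(values)))  # → 111.222.333-44
```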



I recommend this website that does exactly the same thing.

https://4devs.net.br/gerador-cpf

To automate the "Gerar CPF" action with Selenium, you can inspect the button in the browser's developer tools and use "Copy XPath" on that element.

That is much simpler than searching for the elements in the page by hand.
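To illustrate what a copied XPath buys you: an id predicate like //*[@id='texto_cpf'] selects the output element directly. The sketch below runs that kind of query with Python's xml.etree.ElementTree (which supports a small XPath subset) against a trimmed, hypothetical fragment of the page's markup; with Selenium the equivalent would be driver.find_element(By.XPATH, "//*[@id='texto_cpf']"):

```python
import xml.etree.ElementTree as ET

# A trimmed, hypothetical fragment of the page's markup; the CPF
# value is an illustrative placeholder.
html = """
<div>
  <a id="bt_gerar_cpf">Gerar CPF</a>
  <div id="texto_cpf">123.456.789-09</div>
</div>
"""
root = ET.fromstring(html)
# ElementTree supports attribute predicates, enough for id lookups.
node = root.find(".//*[@id='texto_cpf']")
print(node.text)  # → 123.456.789-09
```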

1 Comment

Your answer could be improved with additional supporting information. Please edit to add further details, such as citations or documentation, so that others can confirm that your answer is correct. You can find more information on how to write good answers in the help center.
