
I wrote the code below to get all unique links from a URL:

include_once ('simple_html_dom.php');

$html = file_get_html('http://www.example.com');

foreach ($html->find('a') as $element) {
  $input = array($element->href = $element->href . '<br />');
  print_r(array_unique($input));
}

But I really can't understand why it still shows the duplicated links. Is there a problem with array_unique or Simple HTML DOM? There's another thing that I guess is related to the problem: when you run this, all of the extracted links end up under a single key, like this:

array(key  => all values)

Can anyone help me solve this?

Comments:

  • What is $element->href = $element->href supposed to do?
  • And you're overwriting $input each time.

1 Answer


I believe you want it more like this:

// Collect every href first...
$temp = array();
foreach ($html->find('a') as $element) {
    $temp[] = $element->href;
}

// ...then de-duplicate the whole list once, outside the loop.
echo '<pre>' . print_r(array_unique($temp), true) . '</pre>';
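
In your original code, array_unique only ever saw a one-element array inside the loop, so it had nothing to de-duplicate, and print_r ran on every iteration, which is why the duplicates still appeared. If you'd rather skip duplicates as you collect, a variant of the same idea (just a sketch, assuming the same simple_html_dom setup as above) is to use the href itself as the array key:

include_once('simple_html_dom.php');

$html = file_get_html('http://www.example.com');

$links = array();
foreach ($html->find('a') as $element) {
    // Duplicate hrefs overwrite the same key, so each link is stored only once
    $links[$element->href] = true;
}

// The unique URLs are the keys
echo '<pre>' . print_r(array_keys($links), true) . '</pre>';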

