
I'm starting to help a friend who runs a website with small bits of coding work, and all of the code required will be PHP. I'm a C# developer, so this is a new direction for me.
My first stand-alone task is as follows:
The website is informed of a new species of fish. The scientific name is entered into, say, two input controls: one for the genus (X) and another for the species (Y). These names need to be sent to a website in the format:
http://www.fishbase.org/Summary/speciesSummary.php?genusname=X&speciesname=Y&lang=English
Once on the resulting page, there are further links for common names and synonyms.
What I would like to do is find these links, call each URL (as it will contain all the parameters needed to get the particular data), and store some of the returned data.
I want to save data from both calls and, once that is complete, convert it all into XML, which can then be uploaded to the website's database.
All I'd like to know is (a) can this be done, and (b) how difficult is it? Thanks in advance. Martin
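The final step described above — turning the collected data into XML for upload — can be sketched with PHP's built-in SimpleXMLElement. This is only an illustration: the element names (species, entry) and the field names are invented here, and should be matched to whatever the website's database import actually expects.

```php
<?php
// Build an XML document from collected records with SimpleXMLElement.
// Element/field names are hypothetical; adapt them to the real import format.
function build_species_xml(array $records) {
    $root = new SimpleXMLElement('<species/>');
    foreach ($records as $record) {
        $entry = $root->addChild('entry');
        foreach ($record as $field => $value) {
            // htmlspecialchars() guards against '&' and '<' in the values,
            // which addChild() does not escape reliably on its own
            $entry->addChild($field, htmlspecialchars($value));
        }
    }
    return $root->asXML();
}
```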

  • Thanks all. Actually, I did get it wrong! The initial part is correct, as it will give me some info that I can then pass on to a page which will give me data in XML format. What I then need to do is parse the resulting XML (assuming the page exists; it looks like I can use cURL for this part) and store any data returned. I'll take a look at the examples above, and if I need any help I'll come back and ask. Martin Commented Jan 21, 2011 at 9:40
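The two steps described in this comment — fetch the XML page with cURL, then parse it — can be sketched as below. The XML element names (species, commonname) are made up for illustration; the real fishbase.org response will differ.

```php
<?php
// Fetch raw content over HTTP with cURL; returns the body, or false on failure.
function fetch_raw($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($ch);
    $ok = ($body !== false && curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200);
    curl_close($ch);
    return $ok ? $body : false;
}

// Parse an XML string and collect the common names.
// Assumes a hypothetical layout like <species><commonname>...</commonname></species>.
function extract_common_names($xmlString) {
    $xml = @simplexml_load_string($xmlString); // @: invalid XML returns false quietly
    if ($xml === false) return array();
    $names = array();
    foreach ($xml->commonname as $name) {
        $names[] = (string) $name;
    }
    return $names;
}
```

Keeping the fetch and the parse in separate functions makes the parsing step easy to test without hitting the network.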

5 Answers


If I understand you correctly, you want your script to download a page and process the downloaded data. If so, the answers are:

a) Yes. b) Not difficult.

:)

Okay... here is some more information. I would use the cURL extension; see: http://php.net/manual/en/book.curl.php

<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "example.com");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($ch);
curl_close($ch);
?>


I used a tool called Snoopy (http://sourceforge.net/projects/snoopy/) four years ago. I scraped about 500 customer profiles from a website that published them, in a few hours.



a) Yes. b) Not difficult once you have some experience.

Google for cURL first, or look into allow_url_fopen.



file_get_contents() will do the job:

$data = file_get_contents('http://www.fishbase.org/Summary/speciesSummary.php?genusname=X&speciesname=Y&lang=English');
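Once the page is fetched, the question's next step is to find the "common names" and "synonyms" links on it. A minimal sketch using PHP's built-in DOMDocument is below; the link text used for matching is an assumption about the page's markup, so inspect the real HTML first.

```php
<?php
// Return the href of every <a> whose text contains $needle (case-insensitive).
// The $needle values ('common names', 'synonyms') are assumptions about the
// real page's link text, not confirmed fishbase.org markup.
function find_links($html, $needle) {
    $doc = new DOMDocument();
    @$doc->loadHTML($html); // @: real-world HTML often triggers parser warnings
    $hrefs = array();
    foreach ($doc->getElementsByTagName('a') as $a) {
        if (stripos($a->textContent, $needle) !== false) {
            $hrefs[] = $a->getAttribute('href');
        }
    }
    return $hrefs;
}
```

The returned hrefs may be relative, so they may need to be resolved against http://www.fishbase.org/ before being fetched.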


// Send a URL request
function send_url($url, $type = false, $debug = false) { // $type = 'json' or 'xml'
    $result = '';
    if (function_exists('curl_init')) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        $result = curl_exec($ch);
        curl_close($ch);
    } else {
        if (($content = @file_get_contents($url)) !== false) $result = $content;
    }
    if ($type == 'json') {
        $result = json_decode($result, true);
    } elseif ($type == 'xml') {
        // simplexml_load_string(): $result holds the XML itself, not a file path
        if (($xml = @simplexml_load_string($result)) !== false) $result = $xml;
    }
    if ($debug) echo '<pre>' . print_r($result, true) . '</pre>';
    return $result;
}

$data = send_url('http://ip-api.com/json/212.76.17.140', 'json', true);

