I need a function or method in PHP to access multiple web pages in a loop, just as someone would manually load a page in a browser so that any scripts on it run. I'm not downloading any information; I just want the script to request each page so that any PHP code on it executes. It's a hack for a program I'm working on that needs cron jobs: a single cron job will run one script that loads multiple pages, e.g. http://localhost/program/script1 and http://localhost/program/script2. I can then dynamically add pages from a database as time goes on.
2 Answers
Here you would just separate the code you want shared into another file and then use
require("/path/to/filename.php");
The path, instead of being a URL, will be the filesystem path to where you saved the file.
A good starting point for referencing this file is $_SERVER["DOCUMENT_ROOT"], so you could say something like:
require($_SERVER["DOCUMENT_ROOT"]."/program/script1.php");
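As a minimal sketch of how that could run several such scripts in one pass (the script names are just the question's examples; a real list could be built from a database query):
$scripts = array("script1.php", "script2.php");
foreach ($scripts as $script) {
    // require() executes each file's PHP code directly on the server,
    // without making an HTTP request.
    require($_SERVER["DOCUMENT_ROOT"] . "/program/" . $script);
}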
3 Comments
Fabian Glace
I can't include or require the URLs. I need to access the pages so that the scripts on them run. I don't need any data from them, and they are blank because there is no output. I just need a script to automatically access them in a loop.
Orangepill
I got you... you need the results, not the code. You should be able to use the HTTP stream wrappers for that; just do this:
echo file_get_contents("http://localhost/program/script1");
Orangepill
If you need more control over the request (like POSTing data to the script), then you should look into curl_* (php.net/manual/en/book.curl.php) or HttpRequest (php.net/manual/en/class.httprequest.php).
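Putting that comment together with the loop the question asks for, here is a minimal sketch of the stream-wrapper approach, assuming the URL list is a plain array (the values are just the question's examples; in practice they could come from a database):
$urls = array(
    "http://localhost/program/script1",
    "http://localhost/program/script2",
);
foreach ($urls as $url) {
    // file_get_contents() issues a GET request via the HTTP stream wrapper
    // (this requires allow_url_fopen to be enabled in php.ini). The return
    // value is ignored; only the side effects of the remote script matter.
    $result = @file_get_contents($url);
    if ($result === false) {
        error_log("Failed to reach " . $url);
    }
}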
You could scan a directory, loop through each of the files in it, and then read or write to each one with fopen:
$files = scandir('folder/');
foreach ($files as $file) {
    // scandir() returns bare names, including '.' and '..'; skip those
    if ($file === '.' || $file === '..') continue;
    // do your work here; prepend the directory so fopen() finds the file
    $fhandle = fopen('folder/' . $file, 'r');
    fclose($fhandle);
}
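If the goal is to actually run each discovered script (as in the question) rather than read it, the same loop could require each file instead. A sketch, assuming all the scripts live in one folder under the document root; the folder name 'program' just matches the question's example paths:
$dir = $_SERVER["DOCUMENT_ROOT"] . "/program/";
foreach (scandir($dir) as $file) {
    // Only execute PHP files; this also skips the '.' and '..' entries.
    if (pathinfo($file, PATHINFO_EXTENSION) === 'php') {
        require_once($dir . $file);
    }
}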