
I am writing my first web application with JavaScript and WebGL. For now I am running the app on localhost from Apache. The app needs its data to be available immediately. Until now I have worked with AJAX calls during runtime, which no longer works for my purposes. So instead of serving individual files from server to client on request, I want the application to load all files from the server to the client side at initialization time (I want this to happen automatically at the start so I don't have to add every new file as a URL in the HTML index).

I understand I should do this with server-side scripting, probably with PHP since I have an Apache localhost? I have different folders which hold my necessary resources in uniform data formats (.txt, .png and .json). So what I want to do is, before the JavaScript app starts, look through each folder and send one object per folder that holds filenames as keys bound to file data.

Is my intuition right that I need to do this with PHP? If yes, where do I start to tell my application what to do when (first serve the files with PHP, then start the JavaScript app)? How do I do this on localhost? Should I already think about extending my toolset (e.g. using Node.js on the server side, locally for now)? If so, what lightweight tools do you propose for this kind of work? I feel I am missing some design principles here.

EDIT: Keep in mind that I don't want to specifically call a single file... I am already doing that. What I need is a script that automatically serves all the files of a certain folder on the server to the client side at init time of the app before the program logic of the actual application starts.

1 Answer


Your question is kind of broad, so I'll try my best. Why does AJAX not work for real-time data, but loading all the files once does? If you're working with real-time data, why not look into a WebSocket or, at the bare minimum, repeated AJAX queries?

If you want to pass data from the server to the client, you will need to use an HTTP request no matter what. A GET or POST request is necessary for the client to request data from the server and receive it as a response.

You could theoretically just pass the data from PHP straight to the view of the application (which is technically done through a GET request whenever the browser requests a .php page from the server), but this isn't as flexible as giving JavaScript access to the data. You can do some hacks and 'transfer' the data from the view to JavaScript with some .value methods, but this isn't ideal and can be prone to security holes. It also means the data is only passed once.
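For illustration only, that hack looks roughly like this on the JavaScript side (it assumes the PHP template has already echoed a JSON string into a hidden input with the made-up id initialData):

// Read the JSON string the server wrote into a hidden <input id="initialData">
var raw = document.getElementById('initialData').value;
var initialData = JSON.parse(raw); // now a plain JS object, but only refreshed on a full page reload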

So the data would need to be processed on the server upon initialization and then immediately transferred to the client by means of JavaScript and HTTP requests.

So if you want JavaScript to have access to the data, use it in variables or manipulate it further, then you'd need to use an HTTP request such as GET or POST issued from JavaScript. Otherwise, you need to immediately pass the data to the view upon initialization (through PHP), but this means you can't work with real-time data because the data is only passed once per page refresh.
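A minimal sketch of such a request issued from JavaScript (loadData.php is just a placeholder for whatever PHP script you end up writing; it is assumed to echo JSON):

// Classic AJAX-style GET request issued from JavaScript
var xhr = new XMLHttpRequest();
xhr.open('GET', 'loadData.php'); // placeholder endpoint
xhr.onload = function () {
    var data = JSON.parse(xhr.responseText); // assumes the PHP script echoes JSON
    // data is now available to the rest of your JavaScript
};
xhr.send();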

Example of scandir():

<?php
// scandir() returns filenames (including '.' and '..'), not the data inside the files
$fileArray = scandir('datafolder/'); // this is a relative path reference to the folder 'datafolder'
$finalArray = [];

foreach ($fileArray as $filename) {
    if ($filename === '.' || $filename === '..') {
        continue; // skip the directory entries scandir() always lists
    }
    // scandir() only retrieves the filenames, not the path, so prepend the folder again
    $file = fopen('datafolder/' . $filename, 'r');
    $tempArray = []; // temp array to hold the contents of each iteration of the foreach loop
    while (($row = fgetcsv($file, 1024)) !== false) {
        $tempArray[] = $row; // one entry per line of the file
    }
    fclose($file);
    $finalArray[$filename] = $tempArray; // this will store the data, keyed by filename, for later use
}

echo json_encode($finalArray); // pass everything back to the client as a single JSON object

Or the data can be used however you need, depending on what it is. Say, if you need to combine the data from multiple .csv files, you can read each file and append it to a single array. If you want to read multiple distinct files and preserve the independence of each file, you can create multiple arrays and then pass back a single JSON-encoded object that contains each file's data as a separate attribute of the object, such as:

{
    "dataOne": [0, 1, 2, 3, 4, 5, 6, ...],
    "dataTwo": ["new", "burger", "milkshake"],
    "dataThree": ["Mary", "Joe", "Pam", "Eric"]
}

This object can be created from a PHP associative array using one of the following methods:

// assuming $arrayOne is already assigned from reading a file and storing its contents
$data['dataOne'] = $arrayOne;
// or
$data = array_merge($data, ['dataTwo' => $arrayTwo]);
// or
$data += [
   'dataThree' => ['Mary', 'Joe', 'Pam', 'Eric']
];

Then $data, which is a single array containing all the different sets of data, can simply be passed back (with json_encode(), as above) if each set needs to be distinct.
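On the client side, reading that response back might look roughly like this (loadData.php and the attribute names are again only placeholders):

// Fetch the JSON object and pick out the individual data sets by key
fetch('loadData.php')
    .then(function (response) { return response.json(); })
    .then(function (data) {
        console.log(data.dataOne);   // [0, 1, 2, 3, 4, 5, 6, ...]
        console.log(data.dataThree); // ['Mary', 'Joe', 'Pam', 'Eric']
    });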


5 Comments

I think what you are suggesting is already what I do now with my app. However, I haven't found any info yet on how to get the content of a whole folder on the server with an HTTP request. The methods that I am using now request a specific path to the file that I want. This doesn't work for me now since the files I am requesting depend on each other in the application (so the asynchronous pipeline of AJAX conflicts with my program logic). My idea was to load all the data to the client BEFORE any program logic happens and buffer it in an object in the HTML context.
OK, I think it's a little more clear. Have you looked into scandir() in PHP? You can make an AJAX request to a file on the server which will then use scandir('/path/to/directory') to retrieve the filenames of all the files in that directory. Then a foreach loop over this array of filenames will open up each one, and you can pull the data from each file into a separate array to be manipulated further. This will scale as you add more files to the directory, so you don't have to keep adding new file names to the code. This array can also be filtered down.
Also, what kind of data is this? If the files depend on one another, a relational database might be ideal for you. Normally, raw data is just simply raw data that doesn't require anything else to be used (like financial data). If the data depends on other data files to be interpreted properly, a relational database like MySQL might help reduce some complexity that is needed to link the files together or to read them properly.
No, the dependency only occurs at program runtime... For example, a .txt file is used to build a shader program. This shader program is then used together with a .json file to build a mesh for rendering.
Hm, I guess my solution would be to have a self-executing anonymous function in your JS script that executes a GET request to the server that will automatically run when the window loads. This request can retrieve the data from the .txt files and pass it back as a JSON object that has all of the initial settings as attributes of the JSON object. Then JS can use the data to build the shaders and use it sequentially (so as to maintain the dependency chain). I'm afraid I'm not too familiar with WebGL and the whole process so I can't offer much except general advice.
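A rough sketch of that idea (loadData.php, buildShader, buildMesh and startApp are all stand-ins for the asker's own endpoint and functions):

// Self-executing function: fetch everything on window load, then run the app sequentially
(function () {
    window.addEventListener('load', function () {
        fetch('loadData.php') // placeholder endpoint returning the folder contents as JSON
            .then(function (response) { return response.json(); })
            .then(function (files) {
                // nothing below runs until the data has arrived, so the dependency chain is preserved
                var shader = buildShader(files['shader.txt']);     // stand-in for the shader setup
                var mesh = buildMesh(shader, files['mesh.json']);  // stand-in for the mesh construction
                startApp(shader, mesh);                            // stand-in for the app's entry point
            });
    });
})();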
