
I'm trying to read a .csv file over 40 MB (more than 20,000 lines) and display it as an HTML table. The system I am designing uses pure HTML + jQuery only. My .csv worksheet has this format:

=======================================================================
| rank ||  place           || population ||    lat    ||      lon     |
=======================================================================
|  1   || New York city    ||  8175133   || 40.71455  ||  -74.007124  |
-----------------------------------------------------------------------
|  2   || Los Angeles city ||  3792621   || 34.05349  ||  -118.245323 |
-----------------------------------------------------------------------
..........................Thousands of lines..........................

I have a textbox filter so that only the line whose rank matches the number entered in the field is shown. But when I upload the file, it crashes the browser. :( Here is my code:

var fileInput = document.getElementById('fileInput');
var fileDisplayArea = document.getElementById('fileDisplayArea');

fileInput.addEventListener('change', function (e) {
    var file = fileInput.files[0];
    var reader = new FileReader();

    reader.onload = function (e) {
        // Convert the .csv text into an HTML table
        var data = reader.result;
        var rank = $("#rank").val();

        if (rank === "") {
            alert("Type a rank number");
            fileInput.value = "";
        } else {
            // Mark line endings, then split the file into lines
            data = data.replace(/\s/, " #");
            data = data.replace(/([0-9]\s)/g, "$1#");
            var lines = data.split("#"),
                output = [],
                i;
            for (i = 0; i < lines.length; i++) {
                output.push("<tr><td>" + lines[i].slice(0, -1).split(",").join("</td><td>") + "</td></tr>");
            }
            output = "<table>" + output.join("") + "</table>";

            // Keep only the row whose first cell equals the typed rank
            var valuefinal = $(output).find('tr').filter(function () {
                return $(this).children('td').eq(0).text() == rank;
            });

            $('#fileDisplayArea').append(valuefinal);
        }
    };

    reader.readAsText(file); // readAsBinaryString is deprecated
});

Is it possible to optimize my code so that the browser doesn't crash while the file is being read? I tried splitting the file into pieces and then reassembling it (Blob), but that also kept crashing my browser. :(
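One fix that helps independently of any chunking: filter the raw lines first and build table markup only for the matching rows, so jQuery never has to parse a 20,000-row table. A minimal sketch, assuming the `#rank` field and `#fileDisplayArea` element from the question; the helper names `filterCsvByRank` and `rowsToTableHtml` are mine:

```javascript
// Pure helper: keep only the CSV lines whose first column equals `rank`.
// Splitting on /\r?\n/ replaces the fragile "#"-marker regexes.
function filterCsvByRank(csvText, rank) {
    return csvText
        .split(/\r?\n/)
        .filter(function (line) {
            return line.split(",")[0].trim() === String(rank).trim();
        });
}

// Build table markup only for the (few) matching rows.
function rowsToTableHtml(rows) {
    var cells = rows.map(function (line) {
        return "<tr><td>" + line.split(",").join("</td><td>") + "</td></tr>";
    });
    return "<table>" + cells.join("") + "</table>";
}

// Inside reader.onload this would replace the build-table-then-filter step:
//   var matches = filterCsvByRank(reader.result, $("#rank").val());
//   $("#fileDisplayArea").append(rowsToTableHtml(matches));
```

This still reads the whole 40 MB into one string, but the DOM work drops from 20,000 rows to the handful that match.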

(Excuse me if something in my question was not clear; my English is bad.) DEMO CODE: JSFiddle

  • My recommendation would be to design a much better UI that doesn't force the user to deal with a 20,000 line table. That's just not going to be a good UI even if you do succeed in reading the whole thing. Commented Jun 22, 2014 at 23:12
  • What you should do is to only load partial data at a time. Commented Jun 22, 2014 at 23:12
  • @Derek朕會功夫 Please note this part of my question: "I tried splitting the file into pieces and then reassembling it (Blob), but that also kept crashing my browser. :(" Commented Jun 22, 2014 at 23:27
  • @jfriend00 The 20,000-line table is only queried by another script I created. The user will not have to deal with 20,000 lines. Commented Jun 22, 2014 at 23:28
  • 1
    I was asking what part of my suggestions do you not understand? I've given you a direction to go and all you've said is how do I implement this? I want to know what part of implementing it do you not understand? The general idea is that you open the file, set up a few state variables, read and process N bytes of the file, then do a setTimeout() to let the browser breathe, then read and process N more bytes, repeat until done as is shown in the linked post. To query the file directly, you'd have to put it into a database (probably with a server behind it) that had that capability. Commented Jun 23, 2014 at 17:16
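The incremental approach described in the last comment can be sketched as below. The one subtle part is that a slice boundary can fall in the middle of a line, so the partial tail of each chunk has to be carried into the next one. `readChunked` and `splitCompleteLines` are hypothetical names of mine; `File.slice`, `FileReader`, and `setTimeout` are standard browser APIs:

```javascript
// Pure helper: prepend the carried-over remainder, split off the complete
// lines, and return the new remainder (text after the last newline).
function splitCompleteLines(remainder, chunkText) {
    var parts = (remainder + chunkText).split("\n");
    return { lines: parts.slice(0, -1), remainder: parts[parts.length - 1] };
}

// Read `file` slice by slice, yielding to the browser between slices via
// setTimeout so the UI never freezes on a 40 MB file.
function readChunked(file, onLine, onDone, chunkSize) {
    chunkSize = chunkSize || 1024 * 1024; // 1 MB per slice
    var offset = 0, remainder = "";
    function next() {
        var reader = new FileReader();
        reader.onload = function () {
            var r = splitCompleteLines(remainder, reader.result);
            remainder = r.remainder;
            r.lines.forEach(onLine);
            offset += chunkSize;
            if (offset < file.size) {
                setTimeout(next, 0); // let the browser breathe
            } else {
                if (remainder) onLine(remainder); // last line without newline
                onDone();
            }
        };
        reader.readAsText(file.slice(offset, offset + chunkSize));
    }
    next();
}
```

Caveat: `readAsText` on a byte slice can split a multi-byte UTF-8 character at a chunk boundary; for a plain-ASCII CSV like the one in the question this is not an issue.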

1 Answer


As I found this thread during my research on parsing a very big CSV file in the browser, I wanted to add that my final solution was to use Papa Parse:

http://papaparse.com/
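For completeness, Papa Parse can stream a `File` directly via its `step` callback, invoked once per row, so the 40 MB file never sits in memory as one string. A sketch of how the question's rank filter might look with it, assuming the `#fileDisplayArea` element from the question; the helper `matchesRank` and function `showRank` are names of mine, while `Papa.parse` with `step`/`complete` is the Papa Parse API:

```javascript
// Pure helper: does this parsed row match the rank typed into the filter box?
function matchesRank(row, wantedRank) {
    return String(row[0]).trim() === String(wantedRank).trim();
}

// Streaming parse: Papa Parse calls `step` once per row instead of building
// the whole 20,000-row result in memory at once.
function showRank(file, wantedRank) {
    var matches = [];
    Papa.parse(file, {
        step: function (results) {
            if (matchesRank(results.data, wantedRank)) {
                matches.push(results.data);
            }
        },
        complete: function () {
            matches.forEach(function (row) {
                $("#fileDisplayArea").append(
                    "<tr><td>" + row.join("</td><td>") + "</td></tr>");
            });
        }
    });
}
```

Note: the shape of `results.data` inside `step` has varied between Papa Parse versions (a single row array vs. an array of rows), so check the version you load.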
