
I'm working with a pair of PHP scripts. One script reads data from a MySQL database and exports it to a CSV file; a second script then uploads that CSV file to another MySQL database instance. The structures of database tables A (export) and B (import) are identical.

These scripts work fine for "normal" MySQL tables and column types. However, the import fails when we apply them to a MySQL table that stores a JSON object in one of the columns (MySQL column type is "json").

The script that exports the data works as expected, producing a CSV file with the JSON object surrounded by double quotes...just like the other values in the row.
The row in the exported CSV file looks like this (the fifth item is a complex JSON object, abbreviated for simplicity):

"894","Somebody","Related","2020-02-20","{"name1":"value1","name2":"value2","name3":"value3"}","expired"

The PHP script that exports the data essentially does this:

$rowStr = '"894","Somebody","Related","2020-02-20","{"name1":"value1","name2":"value2","name3":"value3"}","expired"';
file_put_contents($filepath, trim($rowStr) . PHP_EOL, FILE_APPEND);

No issues with the export. Row appears in the CSV file as expected (same format as above).

My code to read the csv into the other database looks like this:

    $allRows = array_map('str_getcsv', file($fp)); // read the exported csv file where $fp is the path to the file

    //get the col_names from the 2nd database table (identical to the first) where $ac-> is the class that handles data queries
    $col_names = $ac->get_table_column_names('databasename', $tablename);

    foreach ($allRows as $i => $row) {

        $update_arr = array();
        foreach ($col_names as $j => $cname) {
            $update_arr[$cname['COLUMN_NAME']] = $row[$j];
        }

        //and write the row to the 2nd db's table
        $ac->create_update_table($update_arr, $tablename, FALSE);
    }

And, if it matters, here are the queries used in the get_table_column_names and create_update_table functions:

get_table_column_names //Using PDO

SELECT COLUMN_NAME,COLUMN_DEFAULT,DATA_TYPE FROM information_schema.columns WHERE table_schema = :db AND table_name = :table

create_update_table

INSERT INTO `tablename` (field1, field2, field3, field4, json_object_column) VALUES ("894","Somebody","Related","2020-02-20","{"name1":"value1","name2":"value2","name3":"value3"}")

The problem is that, when importing, the row is converted to an array like this:

array (
    [0] => "894",
    [1] => "Somebody",
    [2] => "Related",
    [3] => "2020-02-20",
    [4] => "{name1":"value1",
    [5] => "name2:"value2",   //should be part of node 4
    [6] => "name3:"value3"}", //should be part of node 4
    [7] => "expired"
);

What's happening is that the commas inside the JSON object are being treated as field separators, so the JSON is broken up across array nodes. Other than writing a script to detect fields that start with "{" and end with "}", how can I read the entire JSON string as one field (as it is in the database)? Or, perhaps, is there a better way to output the string so that it can be read back as one item?

2 Answers

If, instead of writing out the data with something like file_put_contents(), you use the functions designed for CSV files, they will do most of the work for you...

To write the data, use fputcsv(), which escapes the enclosure character for you (each embedded " becomes "")...

$row = ["894","Somebody","Related","2020-02-20",'{"name1":"value1","name2":"value2","name3":"value3"}',"expired"];

$fh = fopen($filepath, "a");
fputcsv($fh, $row);
fclose($fh);

which will write to the file

894,Somebody,Related,2020-02-20,"{""name1"":""value1"",""name2"":""value2"",""name3"":""value3""}",expired

and then to read it back, use fgetcsv(), which reads and parses one line at a time...

$fh = fopen($filepath, "r");
print_r(fgetcsv($fh));   // This in a loop to read all lines
fclose($fh);

which shows

Array
(
    [0] => 894
    [1] => Somebody
    [2] => Related
    [3] => 2020-02-20
    [4] => {"name1":"value1","name2":"value2","name3":"value3"}
    [5] => expired
)
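To tie this back to the import side, here is a self-contained round trip. The hard-coded column list is only a stand-in for what the question's get_table_column_names() returns, and the commented-out line marks where the create_update_table() call from the question would go:

```php
// Hypothetical column list standing in for get_table_column_names()
$col_names = ['field1', 'field2', 'field3', 'field4', 'json_object_column', 'field6'];

$filepath = tempnam(sys_get_temp_dir(), 'csv');

// Export side: fputcsv() handles the quoting of the JSON field
$row = ["894", "Somebody", "Related", "2020-02-20",
        '{"name1":"value1","name2":"value2","name3":"value3"}', "expired"];
$fh = fopen($filepath, "a");
fputcsv($fh, $row);
fclose($fh);

// Import side: fgetcsv() undoes the quoting, keeping the JSON as one field
$fh = fopen($filepath, "r");
while (($vals = fgetcsv($fh)) !== false) {
    // pair each column name with the value in the same position
    $update_arr = array_combine($col_names, $vals);
    // $ac->create_update_table($update_arr, $tablename, FALSE); // as in the question
}
fclose($fh);
unlink($filepath);
```

After the loop, $update_arr holds the JSON string intact under the json_object_column key.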

2 Comments

I had tried some combination of these two tools (fputcsv and fgetcsv) but hadn't gotten them to work. With your clear solution, it is working as expected! Thank you! One small thing I discovered is that the 'a' setting for fopen on the export is critical. I had tried fopen using 'w' and that only writes one line in the file.
This is the mode argument to fopen(): w truncates the file before writing, while a opens it for writing and appends. Have a look at the mode section of php.net/manual/en/function.fopen.php; there are more options.
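A quick sketch of the difference between the two modes, using a scratch temp file (the line counts are captured just to show the effect):

```php
$f = tempnam(sys_get_temp_dir(), 'modes');

// "a": append mode, so each open adds to the end of the file
foreach (["row1", "row2"] as $line) {
    $fh = fopen($f, "a");
    fputcsv($fh, [$line]);
    fclose($fh);
}
$after_append = count(file($f, FILE_IGNORE_NEW_LINES)); // both rows present

// "w": write mode, so every open truncates the file first
foreach (["row1", "row2"] as $line) {
    $fh = fopen($f, "w");
    fputcsv($fh, [$line]);
    fclose($fh);
}
$after_write = count(file($f, FILE_IGNORE_NEW_LINES)); // only the last row survives

unlink($f);
```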

One way of solving this is to build a new copy of the array, adding the JSON back as a slice of the original array.

$allRows = array_map('str_getcsv', file($fp));

$new_arr = [];
$in_json = false;
$json_start = 0;

foreach ($allRows[0] as $key => $item) {

    // a field starting with "{" marks the first piece of the broken-up JSON
    if (!$in_json && substr($item, 0, 1) == '{') {
        $json_start = $key;
        $in_json = true;
    }

    if ($in_json) {
        // a field ending with '}"' marks the last piece
        if (substr($item, -2, 2) == '}"') {

            //Slice json-part from original array (in your case 4,5,6)
            $sl = array_slice($allRows[0], $json_start, ($key - $json_start) + 1);

            //Re-join with "," since str_getcsv consumed the commas,
            //and add the result at the key where the json started
            $new_arr[$json_start] = implode(',', $sl);
            $in_json = false;
        }
        continue; //skip the individual json pieces
    }

    $new_arr[] = $item;
}

And then you have your expected array in $new_arr.
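Here is a self-contained check of that slice-and-rejoin idea. The pieces are copied from the broken array shown in the question (what str_getcsv actually produces depends on the input line, so treat them as illustrative), and the pieces are rejoined with "," since str_getcsv consumed the commas as delimiters:

```php
// Pieces as str_getcsv produced them in the question
$row = ['894', 'Somebody', 'Related', '2020-02-20',
        '{name1":"value1', 'name2:"value2', 'name3:"value3"}"', 'expired'];

$new_arr = [];
$in_json = false;
$json_start = 0;

foreach ($row as $key => $item) {
    // a field starting with "{" marks the first piece of the JSON
    if (!$in_json && substr($item, 0, 1) == '{') {
        $json_start = $key;
        $in_json = true;
    }
    if ($in_json) {
        // a field ending with '}"' marks the last piece
        if (substr($item, -2, 2) == '}"') {
            $sl = array_slice($row, $json_start, ($key - $json_start) + 1);
            // rejoin with "," because str_getcsv removed the delimiters
            $new_arr[] = implode(',', $sl);
            $in_json = false;
        }
        continue; // skip the individual pieces
    }
    $new_arr[] = $item;
}
// $new_arr now has 6 entries, with the rejoined JSON piece in position 4
```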

