
My tool appends small JSON blocks, each ending with a comma, to a txt file whose first character is [, in order to build up a complete JSON file. Each block looks like this:

{data: text1},

and by the end of the day the txt file looks like this:

[{data: text1},{data: text2},{data: text3},

So I have to remove the last comma at the end of the text and append a ] to make it valid JSON, right? But the file is around 1.5 GB, and I couldn't figure out how to edit it in NodeJS to turn it into a valid JSON file.

  • A better question might be why you have a 1.5 GB JSON file that requires dynamic updates. What about a database solution, or splitting the file up? If this is essentially logging, then you could use a format that doesn't require a trailing ], or a command-line tool that would allow you to append to the file (while truncating the last n bytes). Commented Feb 8, 2024 at 19:08
  • I have an idea, though I don't know whether it can be implemented: if every character is a fixed size and there is something like a seek operation when reading the file, maybe you can seek to the end of the file minus 2 × the character size, write ], and delete the rest if needed (see the sketch after these comments). Commented Feb 8, 2024 at 19:13
  • Hi @Rogue, thanks for your interest. My browser-extension bot keeps pushing the JSON blocks into that txt file as I browse, and I process them with NodeJS at the end of the day. Over time the data gets bigger, so maybe I should switch to a proper database later, but for now I'm still trying to keep it in a txt file as long as I can work with it, because the data collected there is meant to be deleted after I process it with NodeJS. Commented Feb 8, 2024 at 19:14
  • @MilosStojanovic I am also thinking about somehow targeting the last line of the chunk so that only the final part of the whole text gets changed; every JSON block pushed to the txt file has a different length, by the way. Commented Feb 8, 2024 at 19:17
  • I guess you are using UTF-8, not ASCII, right? Commented Feb 8, 2024 at 19:29
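A minimal sketch of the seek-and-overwrite idea from the comment above, assuming the file always ends with exactly one trailing comma and no trailing newline (the closeJsonArray name is just an illustrative helper; jsfiles.txt is the path used in the answer below): it opens the file in r+ mode and overwrites the final byte with ], so the 1.5 GB of data is never copied or rewritten.

const fs = require('fs').promises;

async function closeJsonArray(path) {
    const handle = await fs.open(path, 'r+');
    try {
        const { size } = await handle.stat();
        // Overwrite the last byte (the trailing comma) with the closing bracket.
        // Assumes the file really ends with a single ',' and nothing after it.
        await handle.write(Buffer.from(']'), 0, 1, size - 1);
    } finally {
        await handle.close();
    }
}

closeJsonArray('jsfiles.txt').catch(console.error);

If a trailing newline might follow the comma, read the last few bytes first to locate the comma before overwriting it.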

1 Answer


Here is an additional solution for updating large files with a Transform stream, as you wanted, although it ran slower than Milos's code in my specific situation:

const fs = require('graceful-fs');
const { Transform } = require('stream');

const sourceFilePath = 'jsfiles.txt';
const destinationFilePath = 'new.txt';

// Create a transform stream to modify the data
const transformStream = new Transform({
    transform(chunk, encoding, callback) {
        // Hold each chunk back until the next one arrives, so that only
        // the very last chunk of the file is modified in flush()
        if (this.pending !== undefined) this.push(this.pending);
        this.pending = chunk.toString();
        callback();
    },
    flush(callback) {
        // Last chunk: strip the trailing comma and add the closing bracket
        this.push(`${(this.pending || '').replace(/,\s*$/, '')}]`);
        callback();
    }
});

// Create a readable stream from the source file
const readStream = fs.createReadStream(sourceFilePath, { encoding: 'utf8' });

// Create a writable stream to the destination file
const writeStream = fs.createWriteStream(destinationFilePath, { encoding: 'utf8' });

// Pipe the data through the transform stream before writing to the destination file
readStream.pipe(transformStream).pipe(writeStream);

// Handle events when the copying is complete
readStream.on('end', () => {
    console.log('File duplication completed.');
});

writeStream.on('finish', () => {
    console.log('Data has been written to the destination file.');
});

// Handle errors during the process
readStream.on('error', (err) => {
    console.error('Error reading from the source file:', err.message);
});

transformStream.on('error', (err) => {
    console.error('Error transforming data:', err.message);
});
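The key detail in the Transform above is that each chunk is held back until the next one arrives, so only flush() touches the true end of the data; trimming a comma and appending ] inside transform() itself would corrupt every chunk boundary, since a 1.5 GB file reaches the stream in many chunks.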
