
I'm trying to make a live JSON database of IDs and a tag. The database refreshes by reading from the JSON file, and I have traced my problem to Node.js not writing to disk, though I don't quite know why.

This is my reading operation, and yes there is a file there with proper syntax.

        let dbraw = fs.readFileSync('db.json');
        var db = JSON.parse(dbraw);

This is my writing operation, where I need to save the updates to disk.

                        var authorid = msg.author.id;
                        db[authorid] = "M";
                        fs.writeFileSync('db.json', JSON.stringify(db));

Am I doing something wrong? Is it a simple error I am just forgetting? Is there an easier/more efficient way to do this that I am forgetting about? I can't seem to figure out what exactly is going wrong, but it has something to do with these two bits. There are no errors in my console; the read operation just gets back the same blank JSON file every time.

1 Comment
Are you sure fs.writeFileSync() is writing to the correct directory? Search your project for other instances of db.json. And add console.log(path.resolve('db.json')) right before your fs.writeFileSync() to see what the full path of the file will be. Commented Sep 19, 2020 at 19:26

2 Answers


The problem is most likely with your JSON file's path.

Try using __dirname.

__dirname tells you the absolute path of the directory containing the currently executing file.
source (DigitalOcean)

Example:

If the JSON file is in the root directory:

let dbraw = fs.readFileSync(__dirname + '/db.json');
var db = JSON.parse(dbraw);

If the JSON file is in a subdirectory:

let dbraw = fs.readFileSync(__dirname + '/myJsonFolder/db.json');
var db = JSON.parse(dbraw);

Side note: I suggest you read about Google Firestore, as it can be a better fit for real-time updates.


3 Comments

Wouldn't it be better to use the join method on the path module to join the different path sections?
I'm interested in Google Firestore now, how would I implement this?
@WillEridian Firestore gives you the ability to store data as documents instead of tables like SQL. It is very user-friendly and well documented; you can start from their docs (firebase.google.com/docs/firestore/quickstart) or just watch any video you find easy on YouTube. Please note: Google has both 'Firebase Realtime Database' and 'Firestore'; Firestore is the newer version, and you can find the differences here: firebase.google.com/docs/database/rtdb-vs-firestore

Here's a simple block that does what is desired

const fs = require('fs');

const file_path = __dirname + '/db.json';
let dbraw = fs.readFileSync(file_path);
let db = JSON.parse(dbraw);
const authorid = 'abc';

console.log(db);
db[authorid] = "M";
fs.writeFileSync(file_path, JSON.stringify(db));

// Read the file back to confirm the write landed on disk
dbraw = fs.readFileSync(file_path);
db = JSON.parse(dbraw);
console.log(db);

I've added a couple of log statements for debugging. This works, so something else in your flow may be missing or incorrect. The most probable issue is a difference in path references, as pointed out by jfriend00 in the comment on your question.

As for better solutions, here are a few suggestions:

  1. If the file is small, use require for the JSON directly instead of a file read; require does the parsing for you (though the result is cached, so later changes on disk won't be re-read)
  2. Use the async fs functions so file I/O doesn't block the event loop
  3. Stream the file if it's big in size
  4. See if you can use a cache like Redis or a database as a storage means to reduce your app's serialization and deserialization overhead

2 Comments

It still isn't writing. I can't require it, because it needs to be updated and the updates read without reloading the file. I tried using MySQL but couldn't get it working either. How would I go about streaming the file or using MySQL?
Are you still facing the issue? If yes, can you print out the path for the file being read and the file being written? Initially, manually add some dummy data in the file, and after the read, check whether you're able to read back the same data that was added. If you need to stick with files and it's a big file, you'll need to read it in chunks and do a bit of processing. A cache like Redis should be quite straightforward to try; there's a Node module available for it, and you won't need to create a schema as you would for a database. Let me know and I'll give you sample code for it, if required.
