
I'm a newbie looking for a way to share a variable among files (modules) by reference. For example, here is my app.js:

const users = {},
    waitingQueue = []
module.exports = function(io){
    io.on('connection', (socket)=>{
        users[socket.id] = socket
        socket.on("Search", (d, ack)=>{
            waitingQueue.push(socket)
            //  logic with waitingQueue
            //  logic with users {}
        })
        socket.on("otherEvent", (d, ack)=>{
                     //  logic with waitingQueue
                     //  logic with users {}
             })
    })
}

Now I'd like to divide it into modules. Here is the new app.js:

const users = {},
    waitingQueue = []
const Search = require('./search')

module.exports = function(io){
    io.on('connection', (socket)=>{
        users[socket.id] = socket
        socket.on("Search", Search(users, waitingQueue))
    })
}

Now here is my trouble with the new version: in

socket.on("Search", ...)

the second argument must be a function. If I use a closure:

socket.on("Search", ()=>...)

then exporting the modified {users, waitingQueue} from Search.js is OK, but I need to overwrite the users & waitingQueue variables, and that is not working. I also need to share these variables with other modules, tightly coupled.

I also tried an event-emitter-based approach and Object.observe(), but was unable to fix the problem. Can anyone please help me fix this?

2 Answers

I would use Redis to store the shared values.

You can use the redis client to get/set the values; they will be stored in the Redis database. Those values can then be get/set from any code in any process on the same local machine (as the DB) or from a set of remote machines (that's why I mention scaling later on...).

Redis can be benchmarked, and in my experience it is amazingly fast compared to other DB solutions.

It's quite simple to use: An introduction to Redis data types and abstractions

  • Binary-safe strings.
  • Lists: collections of string elements sorted according to the order of insertion.
  • Sets: collections of unique, unsorted string elements.
  • Sorted sets (similar to Sets).
  • Hashes: maps composed of fields associated with values. Both the field and the value are strings. This is very similar to Ruby or Python hashes.

...and some more...

I suggest you install the database server and client and try to get and set some values, I'm sure it will be useful knowledge for the future.
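A minimal sketch of that get/set flow, assuming a local Redis server and the classic callback API of the node_redis client (the key names and the stored user shape are illustrations, not part of the question):

```javascript
// Hypothetical sketch: the two shared structures map naturally onto
// Redis types — users as a hash keyed by socket id, waitingQueue as a
// list used as a FIFO. Assumes Redis is running on 127.0.0.1:6379.
const redis = require('redis');
const client = redis.createClient(); // defaults to 127.0.0.1:6379

// store a user under its socket id
client.hset('users', 'socket123', JSON.stringify({ name: 'alice' }));

// push a socket id onto the waiting queue...
client.rpush('waitingQueue', 'socket123');

// ...and pop the oldest waiting socket id from the other end
client.lpop('waitingQueue', (err, socketId) => {
    if (err) throw err;
    client.hget('users', socketId, (err, json) => {
        if (err) throw err;
        console.log(JSON.parse(json)); // the user stored by any process
    });
});
```

Because every process talks to the same database, any module in any process sees the same users and waitingQueue.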

Extra reasons:

It will add even more power to your arsenal:

  • It's really good for scaling purposes (both vertical & horizontal scaling).
  • socket.io-redis (GitHub)

By running socket.io with the socket.io-redis adapter you can run multiple socket.io instances in different processes or servers that can all broadcast and emit events to and from each other.

If you need to emit events to socket.io instances from a non-socket.io process
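The wiring described above is small — a hypothetical sketch of the documented socket.io-redis usage, assuming Redis on localhost:

```javascript
// Hypothetical sketch: attach the socket.io-redis adapter so multiple
// socket.io instances (processes or servers) broadcast through Redis.
const io = require('socket.io')(3000);
const redisAdapter = require('socket.io-redis');
io.adapter(redisAdapter({ host: 'localhost', port: 6379 }));
// every io.emit(...) is now relayed via Redis to all other instances
```
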

EDIT: I know you want to share variables between your code modules, but since I saw you are using socket.io, you might also want to share those variables across your slaves...

Socket.IO : passing events between nodes


8 Comments

That's all well and good, but how does that help share data across modules? :/ Both those middlewares are related to scaling socket.io (which isn't the question).
Normally in a real-world application you will probably have many instances of your code, and you may then need to share the values between the instances. What I'm showing you is a good DB solution for this case without being expensive (for this use case), unlike MySQL.
Which is fine, but it's not what the question was. However, in reality, if you're storing large amounts of data over time, Redis would become expensive. Redis is best for transient or non-critical data, e.g. caching. Using it as a complete backend store is something I wouldn't recommend (comparing MySQL and Redis is like comparing apples and oranges).
Yes, it's another way of sharing variables among modules, with extra info about how useful it can be in real life. But as I said on two occasions, we are speaking about this use case, which is a waitingQueue and some user data, so the expensiveness is really decided by the OP; that's why I also mention benchmarking :)
Other than the ability to share data cross-process (which, I reiterate, wasn't the question), I'd argue something like Redis would be an unnecessary complication. Redis is memory based; if the goal is to keep the data in memory and the OP isn't concerned with multiple processes, then what benefit would using Redis bring? This was my original point: you've answered questions that weren't really asked. There are definitely benefits to using Redis, but these are all ifs, buts and maybes because there is no context.

Well, my presumption is that in a real-world application users & waitingQueue would come from some sort of persistent storage, e.g. a database, so the notion of "sharing" data across modules would be a non-issue, as you'd most likely fetch directly from the DB.

However, if I did have in-memory "global" data that I wanted to share across different modules, then I'd most likely move it into its own module, e.g.

data.js

module.exports = {
    users: {},
    waitingQueue: []
}

app.js

const Search = require('./search');
const data = require('./data');

module.exports = io => {
    io.on('connection', (socket) => {
        data.users[socket.id] = socket;
        socket.on('Search', Search(data.users, data.waitingQueue));
    });
}
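For this to work, search.js has to export a factory: called with the shared references, it returns the actual "Search" event handler, which closes over them. A hypothetical sketch (the handler body is an assumption, since the original logic was elided):

```javascript
// search.js — hypothetical sketch. Search is a factory that returns the
// real "Search" event handler; the handler closes over the shared
// users map and waitingQueue array.
function Search(users, waitingQueue) {
    return function onSearch(d, ack) {
        // Mutate the shared structures in place; never reassign the
        // parameters, or other modules keep seeing the old objects.
        waitingQueue.push(d);
        if (typeof ack === 'function') ack({ queued: waitingQueue.length });
        // ... further matching logic with users ...
    };
}

module.exports = Search;
```

This is also why the OP's closure attempt failed only when the module reassigned users or waitingQueue: sharing by reference requires mutating the one object everyone holds.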

As its own module, data.js can then be shared across various other modules.
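This works because Node caches a module after its first load: every require('./data') resolves to the same exported object, so a mutation made in one module is visible in all the others. A minimal self-contained illustration (plain functions stand in for the requiring modules):

```javascript
// Node returns the same cached object for every require of data.js,
// so all modules share one users map and one waitingQueue array.
const data = { users: {}, waitingQueue: [] }; // stands in for require('./data')

function connectionModule(shared, socketId) {
    shared.users[socketId] = { id: socketId };        // one module writes...
}
function searchModule(shared, socketId) {
    shared.waitingQueue.push(shared.users[socketId]); // ...another reads it
}

connectionModule(data, 'socket123');
searchModule(data, 'socket123');
console.log(data.waitingQueue.length); // 1 — both touched the same object
```

The same caveat applies here: modules must mutate data.users / data.waitingQueue in place, never reassign them.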

