
Description: I run a MySQL query against a table and get back some rows:

>   connection.query(q2, function(err, rows){ ... });

Let's say the rows look like {id:",,,", time:"...", etc:"cc"}. Next, for every row I query another table and, based on the id of the row, get back some rows, let's say coms. There can be 0 - 100 rows depending on the query:

> for (var i = 0; i < rows.length; i++){   
> connection.query(qc,function(err2,coms){

Let's say the coms look like this: {rowid:",,,", something:"ddd", smthelse:"ff"}. There can also be 0 - 100 coms for every row.

Question:

How can I add the coms inside the rows, so that for every row I have the coms I got back from the table? I want something like this:

> {id:"1", time:"...", etc:"cc", coms:{rowid:",,,", something:"ddd",
> smthelse:"ff"}}

Already tried:

- rows[i].comms=coms; (not working)
- rows[i].comms[j]=coms[j]; (not working)
- rows[i].push(coms) (not working)
- rows.comms=coms; (working, but it's not what I want; with this I get the same data multiple times)

The whole code I am using after the first query:

connection.query(q2, function(err, rows) {
    // connection.release();
    if (!err) {
        // rows["coms"] = [];
        for (var i = 0; i < rows.length; i++) {
            var qc = "select * from comms where id='" + rows[i].id + "' order by com_id";
            connection.query(qc, function(err2, coms) {
                if (!err2) {
                    // rows[i].cid = coms.com_id;
                    // rows[i].coms = [];
                    rows[i] = "coms";
                    for (var j = 0; j < coms.length; j++) {
                        // data[i][j] = coms[j];
                        // rows[i].coms[j] = coms[j];
                        // rows[i][j].coms = coms[j];
                    }
                    // data[i] = coms;
                    // rows[i]["coms"] = coms;
                    // console.log(rows);
                }
            });
        }
        // rows.push(data);
        console.log(rows);
        users[index].emit('data', rows);
    }
    connection.release();
});
  • Show us the whole code you are using after the first query. Your for loop should be inside the callback to the first query and have some way of telling the top-level function that all queries in the for loop are complete, since you are executing asynchronous code. You should probably use a library like async or promises to make this easier and less error-prone. Commented Nov 26, 2015 at 14:32
  • @kliron I edited the question with the code. Commented Nov 26, 2015 at 14:37
  • The problem is you are using synchronous code to handle asynchronous calls. The for loop executes multiple connection.query calls and then calls emit('data', rows) but when emit is called, some (most) of the queries have not yet had time to complete. As I write above, you need to use some way to handle the asynchronicity and collect the data you need before calling emit. I usually use promises for this task and really like the bluebird library. Commented Nov 26, 2015 at 15:08
  • I can still fix that later. I get both the rows data and the comms data (for example, if I emit them separately it works), but I want to merge them and do a single emit. The way I want to merge them is by attaching anywhere from zero to multiple comms to each object of rows. Commented Nov 26, 2015 at 15:18
  • That is exactly what I mean. The reason it works when you emit separately is that the specific query has resolved, so emit actually emits the data. But to merge them you need to wait until all queries have completed (see the sketch below). Commented Nov 26, 2015 at 15:32
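
One way to do what these comments describe without a promise library is to count the pending queries and only emit once the last callback fires. The following is only a rough sketch of that pattern, reusing the connection, rows, and users[index] names from the question; attachComs and its done callback are hypothetical names used for illustration, and forEach is used so each callback closes over its own row rather than a shared loop variable:

// Sketch only: merge the child rows using plain callbacks (no promise library).
// Assumes connection, rows and users[index] exist exactly as in the question;
// attachComs and done are made-up names for illustration.
function attachComs(rows, done) {
    var pending = rows.length;
    if (pending === 0) { return done(rows); }       // nothing to wait for

    rows.forEach(function(row) {                    // each callback keeps its own row
        var qc = "select * from comms where id='" + row.id + "' order by com_id";
        connection.query(qc, function(err2, coms) {
            row.coms = err2 ? [] : coms;            // attach this row's children
            pending--;
            if (pending === 0) { done(rows); }      // last query finished
        });
    });
}

// Inside the first query's callback, instead of emitting right away:
// attachComs(rows, function(merged) {
//     users[index].emit('data', merged);
// });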

2 Answers


Why don't you change your second query's WHERE id = row[i].id to WHERE id IN (ids), so you get your data in one pass? It's easier on the SQL server and the code is easier to handle. First get a list of ids to pass to the IN() condition. When the data comes back, you can iterate through the parent rows, filter the matching child rows, and append them to each parent as the property you wanted (see the sketch at the end of this answer). You may want to use the underscorejs library to help you with the collection operations.

Combining all of your subsequent selects into one select will speed up your queries, because it cuts the overhead of making connections and queries to the SQL server down to one connection and one query. Imagine if your first query returns 100, 1,000, or 10,000 rows: your SQL server will take a performance hit from that many subsequent queries.

I don't believe there's a practical limit on the number of ids you can put inside the IN() condition (the query length is only bounded by the server's max_allowed_packet setting), so you can go fairly crazy with it.
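
As a rough illustration of what this answer suggests, here is a sketch (not the answerer's exact code) that reuses the connection, rows, and users[index] names from the question and assumes the comms table's id column references the parent row, as in the question's WHERE clause; in real code you would let the mysql driver escape the values instead of concatenating them:

// Sketch of the single-pass IN() approach described above.
// Assumes rows is the result of the first query and that comms.id references the parent row.
var ids = rows.map(function(row) { return row.id; });

if (ids.length === 0) {
    users[index].emit('data', rows);                 // nothing to fetch
} else {
    var idList = ids.map(function(id) { return "'" + id + "'"; }).join(",");
    var qc = "select * from comms where id in (" + idList + ") order by com_id";

    connection.query(qc, function(err2, coms) {
        if (!err2) {
            rows.forEach(function(row) {
                // Keep only the children that belong to this parent row.
                row.coms = coms.filter(function(c) { return c.id === row.id; });
            });
        }
        users[index].emit('data', rows);
    });
}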


1 Comment

It's still in beta :) I'm trying to figure out all the communication between server and client, etc. Later I will fix things like this (it's not the only one; there are a lot of things to be done on the SQL side), but the way I see it, this might help a bit even at this stage.

I don't have any MySQL database available to test this, but I hope it gets the point across or gives some good ideas.

var Promise = require("bluebird");

// This is a "promisified" version of connection.query
function query(con, sql) {
  return new Promise(function(resolve, reject) {
    con.query(sql, function(err, rows) {
      if (err) { reject(new Error(err)); }
      else { resolve(rows); }
    });
  });
}

Your code would now translate to something like this:

query(connection, q2)
    .then(function(rows) {
        return Promise.map(rows, function(row) {
            var qc = "select * from comms where id='"+row.id+"' order by com_id";
            return query(connection, qc).then(function(coms) {
                // Do whatever you want with result of second query here
                row.coms = coms;
                return row;
            })
        })
    })
    .then(function(rows) {
        // All promises have resolved here, rows is an array of values
        users[index].emit('data', rows);
    })
    .catch(function(err) {
        // Handle errors here
    });

2 Comments

Of course, if you can combine your queries into a single query it will be much more efficient, as other people have commented. For large queries it will probably be orders of magnitude faster. But this pattern of handling asynchronicity in Node is so pervasive that you will use it (or something similar) everywhere.
I ended up doing a left join of the two queries, so instead of one query outside and several inside, I just ask the db once and get all the results I want in a single row object that I can then emit to the user (roughly as in the sketch below). Still haven't tried your solution (I don't have time this period), but it looks very promising. When I have time I will try it and accept it as the solution :P
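
For completeness, a sketch of what that comment describes, with the assumptions spelled out: rows_table is a placeholder for the parent table (its name isn't shown in the question), comms.id is assumed to reference the parent's id as in the question's WHERE clause, and connection and users[index] are the question's variables. The flat join rows are grouped back into one object per parent:

// Sketch of the LEFT JOIN approach mentioned in the comment above.
// rows_table is a placeholder name; the real parent table isn't shown in the question.
var q = "select r.id, r.time, r.etc, c.com_id, c.something, c.smthelse " +
        "from rows_table r left join comms c on c.id = r.id " +
        "order by r.id, c.com_id";

connection.query(q, function(err, flat) {
    if (err) { return console.error(err); }

    var byId = {};
    var merged = [];
    flat.forEach(function(r) {
        var row = byId[r.id];
        if (!row) {
            // First time this parent appears: create it with an empty coms array.
            row = { id: r.id, time: r.time, etc: r.etc, coms: [] };
            byId[r.id] = row;
            merged.push(row);
        }
        if (r.com_id !== null) {      // LEFT JOIN yields NULL child columns when a row has no coms
            row.coms.push({ com_id: r.com_id, something: r.something, smthelse: r.smthelse });
        }
    });

    users[index].emit('data', merged);
});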
