
OK, so I'm creating a web app with PHP and mysqli. I have a friends table with a simple setup:

f_id int(11)
uid int(11)
fids TEXT

Now it's basically a row for each user, with fids consisting of many numerical values (other user ids) separated by commas, like 1,2,3. I use this function to get each user's friends:

function getFriends($db, $userid)
{
    $q = $db->query("SELECT fids FROM friends WHERE uid='$userid'");
    $ar = $q->fetch_assoc();
    $friends = $ar['fids'];
    $fr = explode(",", $friends);

    return $fr;
}
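As a side note, the same lookup can be sketched with a prepared statement so the user id is never interpolated into the SQL string. This assumes the same friends table; the helper names (getFriendsSafe, parseFids) are made up for illustration:

```php
<?php
// Hypothetical helper: turn a "1,2,3" fids string into an array of int ids.
function parseFids(string $fids): array
{
    if ($fids === '') {
        return [];
    }
    return array_map('intval', explode(",", $fids));
}

// Sketch of getFriends() using a prepared statement instead of
// string interpolation (assumes the same friends table as above).
function getFriendsSafe(mysqli $db, int $userid): array
{
    $stmt = $db->prepare("SELECT fids FROM friends WHERE uid = ?");
    $stmt->bind_param("i", $userid);
    $stmt->execute();
    $row = $stmt->get_result()->fetch_assoc();

    return $row === null ? [] : parseFids($row['fids']);
}
```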

Each user posts comments that appear to each of their friends. My problem comes from trying to sort these comments by the time they were posted. Let's say my comments table is:

c_id int(11)
uid int(11)
c_text TEXT
c_time int(11)

I want to be able to get the comments posted by each friend, put them all into one array, sort them by their c_time value, and keep all the columns from each row in the comments table.
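As a sketch of that merge-and-sort in plain PHP (the rows below are made-up sample data standing in for the query results), usort can order the combined array by c_time descending:

```php
<?php
// Sketch: merge each friend's comment rows into one array and sort
// by c_time, newest first. The rows here are made-up sample data.
$comments = [
    ['c_id' => 1, 'uid' => 2, 'c_text' => 'hi',    'c_time' => 1403470000],
    ['c_id' => 2, 'uid' => 3, 'c_text' => 'hello', 'c_time' => 1403480000],
    ['c_id' => 3, 'uid' => 2, 'c_text' => 'later', 'c_time' => 1403475000],
];

usort($comments, function ($a, $b) {
    return $b['c_time'] - $a['c_time']; // descending by c_time
});

$latest = array_slice($comments, 0, 20); // keep the first 20 after sorting
```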

The problem comes from how I've set up my friends table. I'm using:

$fr = getFriends($db, $userid);
$updates = array();
$i = 0;
foreach ($fr as $friend)
{
    // Get updates from friends and from self
    $q = $db->query("SELECT up.*, u.* FROM updates up
        LEFT JOIN users u ON u.id = '$friend'
        WHERE (up.userid = '$userid') ORDER BY up.up_id DESC");
    while ($ar = $q->fetch_array(MYSQLI_BOTH))
    {
        $updates[$i] = $ar;
        $i++;
    }
}

$sortArray = array();

foreach ($updates as $update) {
    foreach ($update as $key => $value) {
        if (!isset($sortArray[$key])) {
            $sortArray[$key] = array();
        }
        $sortArray[$key][] = $value;
    }
}
$orderby = "up_id";
array_multisort($sortArray[$orderby], SORT_DESC, $updates);
$updates_limit = array_slice($updates, 0, 20);

to get the comments from each friend, sort them by time, then slice to the first 20. However, when I var_dump($updates_limit), it takes the last row in the comments table and makes it look like each friend posted the same comment.

Can anyone see the problem or a better way of addressing this issue?

  • Is refactoring your friends table an option? Commented Jun 22, 2014 at 21:52
  • I agree with FrankieTheKneeMan's refactoring question. It seems it would be better to have your user's friends in separate rows for each friend, rather than using a comma-separated TEXT column. For example, SELECT id FROM friends WHERE relationship = '$userid'; Commented Jun 22, 2014 at 21:56
  • For only 1000 rows? That shouldn't take long at all. Are you using proper indexes? Commented Jun 22, 2014 at 21:58
  • That's what a database is for. PHP will always be worse at this. Commented Jun 22, 2014 at 22:03
  • Also, you definitely don't need a separate id for the friendship. Commented Jun 22, 2014 at 22:06

1 Answer

I'd completely refactor the friends table to look something more like this. (Also, use English - characters are cheap :c))

CREATE TABLE friends (
    user_id int NOT NULL
    , friend_id int NOT NULL
    , PRIMARY KEY (user_id, friend_id)
    , FOREIGN KEY (user_id) REFERENCES user(id)
    , FOREIGN KEY (friend_id) REFERENCES user(id)
);

Then you can take essentially the same comment table:

CREATE TABLE comment (
    comment_id int PRIMARY KEY
    , user_id int NOT NULL
    , comment_text text
    , comment_time datetime
    , FOREIGN KEY (user_id) REFERENCES user(id)
);

And your "query for friend's comments" becomes:

SELECT comment_id, comment.user_id, comment_text, comment_time
  FROM friends
    INNER JOIN comment
      ON comment.user_id = friends.friend_id
  WHERE friends.user_id = ? #Target it
  ORDER BY comment_time DESC
  LIMIT 0, 20;

You can even speed this up by adding a few indexes - like comment(user_id).
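Run from PHP, that query might look like the following prepared-statement sketch. The function name is made up, and it assumes the refactored friends and comment tables from above plus a live mysqli connection:

```php
<?php
// Sketch: fetch the 20 newest comments from a user's friends using
// the refactored schema. Assumes the tables defined in this answer.
function getFriendUpdates(mysqli $db, int $userid): array
{
    $stmt = $db->prepare(
        "SELECT comment_id, comment.user_id, comment_text, comment_time
           FROM friends
             INNER JOIN comment
               ON comment.user_id = friends.friend_id
          WHERE friends.user_id = ?
          ORDER BY comment_time DESC
          LIMIT 20"
    );
    $stmt->bind_param("i", $userid);
    $stmt->execute();

    return $stmt->get_result()->fetch_all(MYSQLI_ASSOC);
}
```

Note that the database does the sorting and limiting, so PHP never has to buffer and re-sort all the rows itself.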


3 Comments

Thank you, I had a similar setup to this previously, but I had assumed a massive number of rows would lower the efficiency and speed.
I mean, it will, but it'll (almost) always be faster than doing it any other way. I've queried against tables in the billions of rows before, and I guarantee your RDBMS is the best choice for initial implementations. Let your database do what databases do - it's got years of very smart developers tackling these problems.
Thanks for the help, I'll refactor my friends table, set up the proper indexes, and go back to my previous way of sorting through them.
