
I have the below foreach loop, which iterates over the custom rules my users can apply.

public function parse(Document $document)
{
    $content = $document->text;
    $rules = $document->stream->fieldrules;

    foreach ($rules as $rule) {
        $class = $this->getClass($rule);
        $content = $class->apply($content);

        // Save to database.
        FieldRuleResult::create([
            'field_rule_id' => $rule->field_id,
            'document_id' => $document->id,
            'content' => $content,
        ]);
    }
}

As you can see, I am calling the database on each iteration. Some users may have up to 50 rules defined, which then results in 50 insert queries, so I believe I may have run into an n+1 problem.

I was wondering: what is the best practice here? I searched the Laravel documentation and found the createMany method, so I tried to refactor the iteration as below:

$result = [];
foreach ($rules as $rule) {
    $class = $this->getClass($rule);
    $content = $class->apply($content);

    $result[] = [
        'field_rule_id' => $rule->field_id,
        'document_id' => $document->id,
        'content' => $content,
    ];
}

return $rules->createMany($result);

This gives me the following error:

Method Illuminate\Database\Eloquent\Collection::createMany does not exist.

Now I imagine that is because $rules is an Eloquent collection. I tried changing it to:

return $document->stream->fieldrules()->createMany($result);

Which gives the following error:

Call to undefined method Illuminate\Database\Eloquent\Relations\HasManyThrough::createMany()

2 Answers


There is a method named insert available on Illuminate\Database\Query\Builder. It allows you to insert many rows in a single query. Note that, as with all bulk-query methods, it will not trigger Eloquent events.

As you have already done, store the data in an array and execute the query after your loop is done; this time, however, use insert.

$content = 'your content here...';
$data = [];

foreach ($rules as $rule) {
    $class = $this->getClass($rule);
    $content = $class->apply($content);

    $data[] = [
        // ...
        'content' => $content,
    ];
}

FieldRuleResult::insert($data);

I'm not 100% sure this works directly on the model; however, the documentation shows an example using the DB facade.
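For reference, the DB-facade variant mentioned above might look like this. This is a hedged sketch: the table name field_rule_results and the example values are assumptions, not taken from the question.

```php
use Illuminate\Support\Facades\DB;

// Bulk insert via the query builder directly; the table name
// 'field_rule_results' is an assumption based on the model name.
DB::table('field_rule_results')->insert([
    ['field_rule_id' => 1, 'document_id' => 10, 'content' => '...'],
    ['field_rule_id' => 2, 'document_id' => 10, 'content' => '...'],
]);
```

Like Eloquent's insert, this issues a single INSERT statement and fires no model events.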


4 Comments

Thanks Thomas! However, is there a way to insert the data more elegantly (without the n+1 issue) and still fire off events?
I think this is a good way to go. Before you prepare your $data, just apply the filter.
As @TheAlpha comments, it is only for use with the DB facade: DB::table('table')->insert([ [$data1], [$data2] ]);
@oliverbj We can fire Eloquent events manually, but I am not sure if we should. Depending on your needs, however, we could create a custom event and listener.
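The custom-event idea from the last comment could be sketched as below. The event class name FieldRuleResultsInserted is hypothetical; the thread does not name one.

```php
use Illuminate\Foundation\Events\Dispatchable;

// Hypothetical event fired once after the bulk insert,
// instead of one "created" event per row.
class FieldRuleResultsInserted
{
    use Dispatchable;

    public function __construct(public array $rows)
    {
    }
}

// After the bulk insert:
FieldRuleResult::insert($data);
FieldRuleResultsInserted::dispatch($data);
```

A listener registered for this event then receives all inserted rows at once, which keeps the single-query insert while still giving you a hook to react to.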

If Eloquent events need to be triggered for whatever reason, and you would like to mitigate the n+1 problem, I would suggest using a transaction. This still issues n insert queries, but because they share a single transaction the database only commits once, which is considerably faster than n separate commits. It has the extra benefit of data consistency: either all inserts succeed, or none of them do.

$result = getResult();

\DB::transaction(function() use ($result) {
    foreach($result as $item) {
        FieldRuleResult::create($item);
    }
});

If you do not care about Eloquent events being triggered and just want a bulk insert, you can use the insert method, as mentioned by Fabian. I would still advise wrapping it in a database transaction for data consistency.

$result = getResult();

\DB::transaction(function() use ($result) {
    FieldRuleResult::insert($result);
});

In this latter case, if the number of entries you're about to insert can be large, you might also want to chunk the inserts (e.g. 100 at a time rather than all at once).

2 Comments

Thanks, I will try the transaction as advised. Quick question: would it make sense to further move the actual DB insert into a job and queue it?
Depending on your project, that might make sense, especially if the database can be under heavy load or there is a lot of data to insert in that job. Just be aware that a queued job usually runs asynchronously, so if the user can view his submissions somewhere, he might be confused as to why they aren't there (yet!).
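The queued-job idea from this comment thread could be sketched as follows. The job class name InsertFieldRuleResults is an assumption for illustration.

```php
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;

// Hypothetical job: moves the bulk insert off the request path
// onto a queue worker.
class InsertFieldRuleResults implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function __construct(private array $rows)
    {
    }

    public function handle(): void
    {
        FieldRuleResult::insert($this->rows);
    }
}

// Dispatch from the controller or service after building $result:
InsertFieldRuleResults::dispatch($result);
```

As the comment notes, the insert then happens asynchronously, so the rows will not be visible until a queue worker has processed the job.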
