
Saturday, February 19, 2022

[FIXED] How to insert big data in Laravel?

Tags: bigdata, insert, laravel, laravel-5, laravel-5.6

Issue

I am using Laravel 5.6.

My script to insert big data looks like this:

...
$insert_data = [];
foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);
    $posting_date = $posting_date->format('Y-m-d');
    $data = [
        'item_no'                   => $value['Item_No'],
        'entry_no'                  => $value['Entry_No'], 
        'document_no'               => $value['Document_No'],
        'posting_date'              => $posting_date,
        // ...
    ];
    $insert_data[] = $data;
}
\DB::table('items_details')->insert($insert_data);

I have tried inserting 100 records with this script, and it works: the data is inserted successfully.

But when I try to insert 50,000 records, it becomes very slow. I waited about 10 minutes and it did not finish; instead I got an error like this:

504 Gateway Time-out

How can I solve this problem?


Solution

As was stated, chunks alone won't really help you in this case if it is an execution-time problem. I think the bulk insert you are trying to use cannot handle that amount of data in a single statement, so I see two options:

1 - Reorganise your code to properly use chunks. It will look something like this:

$insert_data = [];

foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);

    $posting_date = $posting_date->format('Y-m-d');

    $data = [
        'item_no'                   => $value['Item_No'],
        'entry_no'                  => $value['Entry_No'], 
        'document_no'               => $value['Document_No'],
        'posting_date'              => $posting_date,
        // ...
    ];

    $insert_data[] = $data;
}

$insert_data = collect($insert_data); // Make a collection to use the chunk method

// Chunk the dataset into smaller collections of 500 rows each.
// Play with the value to get the best result.
$chunks = $insert_data->chunk(500);

foreach ($chunks as $chunk) {
    \DB::table('items_details')->insert($chunk->toArray());
}

This way each bulk insert will contain less data, and the database will be able to process it rather quickly.
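
Optionally, the chunked inserts can be sped up further by wrapping them in a single transaction, so the database commits once at the end instead of after every insert statement. This is a sketch on top of the code above, not part of the original answer:

// Sketch: one commit at the end instead of one per chunk insert.
\DB::transaction(function () use ($chunks) {
    foreach ($chunks as $chunk) {
        \DB::table('items_details')->insert($chunk->toArray());
    }
});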

2 - In case your host supports runtime configuration overrides, you can add a directive right before the code starts to execute:

ini_set('max_execution_time', 120); // time in seconds

$insert_data = [];

foreach ($json['value'] as $value)
{
   ...
}

To read more, see the official PHP docs for max_execution_time.
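
For completeness, the two options can also be combined; a minimal sketch, assuming the same $insert_data array built above:

set_time_limit(120); // built-in helper, roughly equivalent to the ini_set() call above

$chunks = collect($insert_data)->chunk(500); // insert in batches of 500 rows

foreach ($chunks as $chunk) {
    \DB::table('items_details')->insert($chunk->toArray());
}

Also keep in mind that a 504 Gateway Time-out is produced by the web server or proxy in front of PHP, so its own timeout (for example nginx's fastcgi_read_timeout) may need to be raised alongside max_execution_time.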



Answered By - Vit Kos