I want to back up my Laravel application's entire MySQL database from within the application by clicking a button. Can anyone help?
8 Answers
Laravel Backup by Spatie.
This package provides you with a command (php artisan backup:run), which is not exactly what you need.
But you can create a button in your admin section that links to a route, and from there (either in the route closure or in a controller) you can call the command with
Artisan::call('backup:run');
which will trigger a backup.
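A minimal sketch of wiring that to a button (the route URI and name here are my own choice, not part of the package):

// routes/web.php
use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\Route;

Route::post('/admin/backup', function () {
    // Runs spatie/laravel-backup's backup:run command; the archive ends up on
    // whatever disk is configured in config/backup.php.
    Artisan::call('backup:run');

    return back()->with('status', 'Backup started.');
})->middleware('auth')->name('admin.backup');

The admin page then only needs a small form whose submit button posts to route('admin.backup').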
Here is a good package for that. You can then issue the artisan command on the click of a button, or automate it, which is much better (see the scheduling sketch below).
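For the automated variant, a minimal sketch using Laravel's scheduler (assuming the package exposes a backup:run artisan command, as spatie/laravel-backup does):

// app/Console/Kernel.php
protected function schedule(Schedule $schedule)
{
    // Nightly backup at 01:00; requires the usual
    // "* * * * * php artisan schedule:run" cron entry on the server.
    $schedule->command('backup:run')->dailyAt('01:00');
}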
Export only table data (no structure) as a backup, with a row limit
Keynote:
- API
- Zip download
- Data-only backup, no structure
- Order by one column
- No mysqldump permission needed
- DB data export with a row limit
- Exclude specific tables' data from the export
- In Route
Route::group(['prefix' => 'v1/db-migration', 'middleware'=>'auth:api'], function (){
Route::post('server-db-structure-backup', [DBController::class, 'serverDBStructureBackup']);
Route::post('server-db-data-backup', [DBController::class, 'serverDBDataBackup']);
Route::post('server-db-status', [DBController::class, 'serverDBStatus']);
});
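These routes assume they live in routes/api.php (so the auth:api guard applies) and that the controller is imported at the top of that file:
use App\Http\Controllers\DBController;
A sketch of the serverDBStructureBackup endpoint referenced here appears after the Postman example below.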
- In Controller (the method below relies on these imports at the top of the controller):
use Illuminate\Http\Request;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\File;
use Illuminate\Support\Facades\Response;
use Illuminate\Support\Str;
use ZipArchive;
public function serverDBDataBackup(Request $request) // db data backup only
{
$dir = $_SERVER['DOCUMENT_ROOT'].'/uploads/db/backups/';
File::ensureDirectoryExists($dir);
$file_name = 'db_backup_'.Str::slug(getNow(), '_').'.sql'; // getNow() is a project-specific helper returning the current timestamp; now()->toDateTimeString() works as well
$file_full_url = $dir.$file_name;
$file = fopen($file_full_url, 'w');
$databaseName = DB::getDatabaseName();
$table_names = DB::table('information_schema.tables')
->where('table_schema', $databaseName)
->where('TABLE_TYPE', 'BASE TABLE')
// ->where('table_name', 'users') // later remove
->when($request->filled('except_tables') && count($request->except_tables), function ($q){
$q->whereNotIn('table_name', request('except_tables'));
})
->pluck('table_name')
->toArray();
// Loop through the tables and export data into files
foreach ($table_names as $table_name)
{
$data = DB::table($table_name) // getting a table dataset using select
->when($request->filled('table_rules') && count($request->table_rules), function ($q) use($request, $table_name) // executing all rules
{
foreach ($request->table_rules as $key => $rule)
{
if (isset($rule['table_name']) && $rule['table_name']==$table_name && isset($rule['row_limit'])) // for table row limit
{
$q->limit($rule['row_limit']);
}
if (isset($rule['table_name']) && $rule['table_name']==$table_name && isset($rule['order_by']) && isset($rule['order_type'])) // for table order by
{
$q->orderBy($rule['order_by'], $rule['order_type']);
}
}
})
->get();
if ($data->count() > 0)
{
fwrite($file, PHP_EOL.PHP_EOL.'-- ==================Table: '.$table_name.'================== '.PHP_EOL.PHP_EOL);
foreach ($data as $row)
{
$insert = "INSERT INTO `$table_name` (";
$values = "VALUES (";
foreach ($row as $column => $value)
{
$insert .= "`$column`, ";
$values .= "'" . addslashes($value) . "', "; // NOTE: NULL values are written as empty strings and numbers are quoted; fine for re-import, but not a byte-exact dump
}
$insert = rtrim($insert, ', ') . ")";
$values = rtrim($values, ', ') . ");\n";
fwrite($file, $insert . " " . $values);
}
}
}
fclose($file); // close the file
$zip_file_full_url = substr($file_full_url, 0, -3).'zip'; // zip full url
$zip_file_name = substr($file_name, 0, -3).'zip'; // zip file name
$zip = new ZipArchive();
$zip->open($zip_file_full_url, ZipArchive::CREATE); // open the zip archive
$zip->addFile($file_full_url, $file_name); // add file to zip archive
$zip->close(); // close the file
File::delete($file_full_url); // after zip delete the sql file
return Response::download($zip_file_full_url, $zip_file_name, [ // download the zip file
'Content-Type' => 'application/zip',
]);
}
- In Postman request
//except_tables[0]:activity_log
except_tables[1]:audit_trail
table_rules[0][table_name]:users
table_rules[0][row_limit]:10
table_rules[1][table_name]:activity_log
table_rules[1][row_limit]:10
table_rules[1][order_by]:id
table_rules[1][order_type]:DESC
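The serverDBStructureBackup endpoint registered in the route group is not shown above; a minimal sketch of it (structure only, reusing the same imports, my own implementation) could be:

public function serverDBStructureBackup(Request $request) // db structure backup only
{
    $dir = $_SERVER['DOCUMENT_ROOT'].'/uploads/db/backups/';
    File::ensureDirectoryExists($dir);
    $file_full_url = $dir.'db_structure_backup_'.now()->format('Y_m_d_His').'.sql';
    $file = fopen($file_full_url, 'w');
    $table_names = DB::table('information_schema.tables')
        ->where('table_schema', DB::getDatabaseName())
        ->where('TABLE_TYPE', 'BASE TABLE')
        ->pluck('table_name')
        ->toArray();
    foreach ($table_names as $table_name)
    {
        // SHOW CREATE TABLE returns one row whose "Create Table" column holds the DDL
        $create = DB::select("SHOW CREATE TABLE `$table_name`")[0]->{'Create Table'};
        fwrite($file, PHP_EOL."-- Table: $table_name".PHP_EOL.$create.';'.PHP_EOL);
    }
    fclose($file);
    return Response::download($file_full_url)->deleteFileAfterSend(true);
}

This mirrors the data endpoint above but writes CREATE TABLE statements instead of INSERTs.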
Hi. If your database is large, the mysqldump process consumes a large amount of memory, and while the backup runs your site can become slow or even stop responding; I have faced this issue on my own site.
I created a db:backup command that pipes the dump through ionice and tee so the I/O load stays manageable. I now dump roughly 40 GB in about 4 minutes and the site does not slow down. Please check the example below.
1) Use the make:command command to create a new command. Simply pass in the command name, like so:
php artisan make:command DBBackup
2) Set the command's name by updating the $signature property, like this:
protected $signature = 'db:backup';
3) Update the $description property to match this:
protected $description = 'Backup the database';
4) Add a property to hold the process:
protected $process;
5) Add the imports below if they do not already exist:
use Symfony\Component\Process\Process;
use Symfony\Component\Process\Exception\ProcessFailedException;
6) The full command, app/Console/Commands/DBBackup.php:
<?php
namespace App\Console\Commands;
use Illuminate\Console\Command;
use Symfony\Component\Process\Process;
use Symfony\Component\Process\Exception\ProcessFailedException;
class DBBackup extends Command
{
/**
* The name and signature of the console command.
*
* @var string
*/
protected $signature = 'db:backup';
/**
* The console command description.
*
* @var string
*/
protected $description = 'Backup the database';
protected $process;
/**
* Create a new command instance.
*
* @return void
*/
public function __construct()
{
parent::__construct();
$fileName = 'sample_'.date("Y-m-d").'.sql';
// Pipe mysqldump through ionice and tee so the dump runs at low I/O priority and
// does not starve the web server. A string command containing pipes needs
// Process::fromShellCommandline() on Symfony Process 4.2+ (new Process() expects an array there).
$this->process = Process::fromShellCommandline(sprintf(
'mysqldump -u%s -p%s %s --single-transaction --quick | ionice -c2 -n 7 tee %s > /dev/null',
config('database.connections.mysql.username'),
config('database.connections.mysql.password'),
config('database.connections.mysql.database'),
storage_path('backups/'.$fileName)
));
}
/**
* Execute the console command.
*
* @return mixed
*/
public function handle()
{
try {
$this->process->setTimeout(null);
$this->process->setIdleTimeout(null);
$this->process->mustRun();
$this->info('The backup has completed successfully.');
} catch (ProcessFailedException $exception) {
$this->error('The backup process failed.');
}
}
}
and you can set up a cron entry as below:
Command: 17 20 * * * php /var/www/html/sample/artisan db:backup >> /dev/null 2>&1
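Since a dump of this size takes minutes, it is also worth triggering the command from a queued job rather than directly inside the HTTP request that a button fires; a minimal sketch (the job name is my own):

<?php
namespace App\Jobs;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Artisan;
class RunDatabaseBackup implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $timeout = 600; // give the dump enough time to finish

    public function handle()
    {
        Artisan::call('db:backup'); // runs the command defined above
    }
}

The button's controller then only needs RunDatabaseBackup::dispatch(); and returns immediately.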
Read this article: https://stackoverflow.com/a/65890498/14913109
return response()->download($latest_filename); // see the article above
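A minimal sketch of that idea (assuming backups land in storage/app/backup as in the other answers; the method name is my own):

use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\File;

public function downloadBackup()
{
    Artisan::call('database:backup'); // create a fresh dump first

    // Pick the newest file in the backup directory and stream it to the browser
    $files = File::files(storage_path('app/backup'));
    usort($files, fn ($a, $b) => $b->getMTime() <=> $a->getMTime());
    $latest_filename = $files[0]->getPathname();

    return response()->download($latest_filename);
}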
Create a command:
app/Console/Commands/DatabaseBackUp.php
<?php
namespace App\Console\Commands;
use Illuminate\Console\Command;
use Carbon\Carbon;
class DatabaseBackUp extends Command
{
/**
* The name and signature of the console command.
*
* @var string
*/
protected $signature = 'database:backup';
/**
* The console command description.
*
* @var string
*/
protected $description = 'Command description';
/**
* Create a new command instance.
*
* @return void
*/
public function __construct()
{
parent::__construct();
}
/**
* Execute the console command.
*
* @return int
*/
public function handle()
{
$filename = "backup-" . Carbon::now()->format('Y-m-d') . ".gz";
// NOTE: the storage/app/backup directory must already exist, and env() only works here
// when the config is not cached; config('database.connections.mysql.*') is safer.
$command = "mysqldump --user=" . env('DB_USERNAME') ." --password=" . env('DB_PASSWORD') . " --host=" . env('DB_HOST') . " " . env('DB_DATABASE') . " | gzip > " . storage_path() . "/app/backup/" . $filename;
$returnVar = NULL;
$output = NULL;
exec($command, $output, $returnVar);
}
}
and in app/Console/Kernel.php, update:
protected $commands = [
'App\Console\Commands\DatabaseBackUp'
];
and then run the php artisan database:backup command.
// Put this code in a custom command, e.g. app/Console/Commands/DatabaseBackUp.php
<?php
namespace App\Console\Commands;
use Illuminate\Console\Command;
use Carbon\Carbon;
class DatabaseBackUp extends Command
{
/**
* The name and signature of the console command.
*
* @var string
*/
protected $signature = 'database:backup';
/**
* The console command description.
*
* @var string
*/
protected $description = 'Command description';
/**
* Create a new command instance.
*
* @return void
*/
public function __construct()
{
parent::__construct();
}
/**
* Execute the console command.
*
* @return int
*/
public function handle()
{
$filename = "backup-" . Carbon::now()->format('Y-m-d') . ".sql.gz"; // the dump is piped through gzip, so use a .gz extension
$command = "mysqldump --user=" . env('DB_USERNAME') ." --password=" . env('DB_PASSWORD') . " --host=" . env('DB_HOST') . " " . env('DB_DATABASE') . " | gzip > " . storage_path() . "/app/backup/" . $filename;
$returnVar = NULL;
$output = NULL;
exec($command, $output, $returnVar);
}
}
// then register the command in app/Console/Kernel.php (the console kernel, not the HTTP kernel):
protected $commands = [
\App\Console\Commands\DatabaseBackUp::class,
];
Another approach, which I'm using here, is shell_exec; make sure it is enabled in your XAMPP/WAMP setup.
Note: I'm using PHP 7.4.29 on localhost
Check if shell_exec is enabled:
if(function_exists('shell_exec')) {
echo "shell_exec is enabled";
}
shell_exec("C:/xampp/mysql/bin/mysqldump -h localhost -u root test > C:/xampp/htdocs/projects/main.sql");
Syntax:
path-to-mysqldump -h [host] -u [username] -p[password] [database-name] > [path/dump-name].sql
Note: omit -p if the password is empty; when it is used, there is no space between -p and the password.
C:/xampp/htdocs/projects/main.sql is the path where the SQL file will be saved.
Also check this reference for help: https://stackoverflow.com/a/15294585/13804634
Make sure mysqldump is added to your PATH environment variable if you get:
'mysqldump' is not recognized as an internal or external command
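A minimal sketch of wiring this to a button via a Laravel route (same paths as above; the route URI is my own choice):

use Illuminate\Support\Facades\Route;

Route::get('/backup-db', function () {
    $dumpPath = 'C:/xampp/htdocs/projects/main.sql';
    // Same mysqldump call as above, then hand the file back to the browser
    shell_exec("C:/xampp/mysql/bin/mysqldump -h localhost -u root test > {$dumpPath}");

    return response()->download($dumpPath);
});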
