I've created a command in Laravel which downloads (copies) the database's .sql files from the staging server onto the production server (a requirement). I don't have SSH access to the production server, so I created a route for that command and execute it from a URL. Here is my code.
Route
Route::get('console/import_all', function () {
    Artisan::call('importDatabase:staging', ['tables' => 'all']);
});
Command Function
private function downloadFile($url, $path)
{
    // Pause five seconds between downloads
    sleep(5);

    // copy() streams the remote file straight to the local path;
    // returns true on success, false on failure
    return copy($url, $path);
}
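For what it's worth, copy() also accepts a stream context, so a variant of this helper could put an explicit timeout on each transfer. Here is a minimal sketch, assuming the staging files are served over HTTP; the function name and the 120-second value are just illustrative assumptions to tune for your file sizes:

private function downloadFileWithTimeout($url, $path)
{
    // Hypothetical variant: abort a stalled transfer instead of hanging
    $context = stream_context_create([
        'http' => ['timeout' => 120], // seconds; assumption, tune as needed
    ]);

    sleep(5); // keep the original pause between downloads

    return copy($url, $path, $context);
}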
Now there are roughly 30+ files, some of them more than 10 MB in size. The command works fine over SSH (run by an admin) and from the URL as well. The issue is that when I trigger the command from the URL, the page keeps loading until every download finishes. Is there any way to run this in the background? That way, when an admin hits the button in the admin panel, s/he doesn't have to wait for all the files to be copied, and I can display a message that the process has started and that they will be notified once it is done.
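One common Laravel 5 approach is to wrap the work in a queued job and dispatch it from the route, so the HTTP request returns immediately. A minimal sketch, assuming a queue driver other than sync is configured, a worker is running via php artisan queue:work, and Laravel 5.1+ so the dispatch() helper is available (the job name ImportStagingDatabase is hypothetical):

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Artisan;

class ImportStagingDatabase implements ShouldQueue
{
    use InteractsWithQueue, Queueable, SerializesModels;

    // Runs on the queue worker, not inside the HTTP request
    public function handle()
    {
        Artisan::call('importDatabase:staging', ['tables' => 'all']);

        // Notify the admin here once the import finishes
        // (mail, database notification, etc.)
    }
}

The route then returns straight away:

Route::get('console/import_all', function () {
    dispatch(new \App\Jobs\ImportStagingDatabase);

    return 'Import started; you will be notified once all files are copied.';
});

Alternatively, Artisan::queue('importDatabase:staging', ['tables' => 'all']) pushes the command itself onto the queue without a dedicated job class, though a job gives you a natural place to hook in the completion notification.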
from Newest questions tagged laravel-5 - Stack Overflow http://ift.tt/2AgNwOc via IFTTT