Sometimes, when I pass a large dataset to a Job, my queue worker exits abruptly.
// $taskmetas is an array of arrays; each inner array has 90 properties.
$this->dispatch(new ProcessExcelData($excel_data, $taskmetas, $iteration, $storage_path));
The ProcessExcelData job class creates an Excel file using the box/spout package.
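For context, here is a minimal sketch of what such a job class might look like; the actual class isn't shown in the question, so the constructor signature, the Spout 2.x writer calls, and the output file name below are assumptions:

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Contracts\Queue\ShouldQueue;
use Box\Spout\Common\Type;
use Box\Spout\Writer\WriterFactory;

class ProcessExcelData implements ShouldQueue
{
    use InteractsWithQueue, Queueable, SerializesModels;

    protected $excel_data;
    protected $taskmetas;
    protected $iteration;
    protected $storage_path;

    public function __construct($excel_data, $taskmetas, $iteration, $storage_path)
    {
        // Everything assigned here is serialized into the queue payload,
        // so a 10,000-row $taskmetas array travels through the queue in full.
        $this->excel_data   = $excel_data;
        $this->taskmetas    = $taskmetas;
        $this->iteration    = $iteration;
        $this->storage_path = $storage_path;
    }

    public function handle()
    {
        // Spout 2.x streaming writer; the output file name is hypothetical.
        $writer = WriterFactory::create(Type::XLSX);
        $writer->openToFile($this->storage_path . '/export_' . $this->iteration . '.xlsx');

        foreach ($this->taskmetas as $row) {
            $writer->addRow(array_values($row));
        }

        $writer->close();
    }
}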
- In the 1st example, $taskmetas has 880 rows - works fine
- In the 2nd example, $taskmetas has 10,000 rows - exits abruptly
1st example - queue output with a small dataset:
forge@user:~/myapp.com$ php artisan queue:work --tries=1
[2017-08-07 02:44:48] Processing: App\Jobs\ProcessExcelData
[2017-08-07 02:44:48] Processed: App\Jobs\ProcessExcelData
2nd example - queue output with a large dataset:
forge@user:~/myapp.com$ php artisan queue:work --tries=1
[2017-08-07 03:18:47] Processing: App\Jobs\ProcessExcelData
Killed
I don't get any error messages, the logs are empty, and the job doesn't appear in the failed_jobs
table as it does with other errors. The time limit is set to 1 hour and the memory limit to 2 GB.
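For reference, those limits are assumed (they aren't shown in the question) to come from the worker's --timeout flag and PHP's memory_limit setting, roughly:

php artisan queue:work --tries=1 --timeout=3600    # assumed 1-hour job timeout
memory_limit = 2048M                               ; assumed 2 GB limit in php.ini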
Why are my queues abruptly quitting?
from Newest questions tagged laravel-5 - Stack Overflow http://ift.tt/2fk82oG
via IFTTT