I would like to know whether re-queueing a Laravel job is a bad idea or not. I have a scenario where I need to pull a user's posts from Facebook once they have connected their Facebook account to my application, and I want to pull {x} days of historic data. Like most APIs, the Facebook API limits requests per minute. I keep track of the rate-limit response headers, and once the limit is reached I save that information in the database; then, on each re-queue, I check whether I am eligible to make another call to the Facebook API.
Here is a code snippet for better visualization:
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

class FacebookData implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    /**
     * The number of seconds the job can run before timing out.
     *
     * @var int
     */
    public $timeout = 120;

    public $userid;

    public function __construct($id)
    {
        $this->userid = $id;
    }

    // FacebookHelper is my own helper class; Laravel's container
    // injects it when the job is handled.
    public function handle(FacebookHelper $fbhelper)
    {
        // Only call the API if we are not currently rate limited.
        if ($fbhelper->canPullData()) {
            $res = $fbhelper->getData($this->userid);

            // 429 Too Many Requests: store the retry info and re-queue.
            if ($res['code'] == 429) {
                $fbhelper->storeRetryAfter($res);
                self::dispatch($this->userid);
            }
        }
    }
}
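For context, the helper methods referenced above (canPullData() and storeRetryAfter()) could be sketched roughly as follows. This is only an illustration of the bookkeeping described in the question; the FacebookHelper class name, the cache key, and the retry_after response field are assumptions, not part of the original code:

```php
<?php

namespace App\Helpers;

use Illuminate\Support\Facades\Cache;

// Hypothetical helper; the question only shows it being called as
// $fbhelper. All names below are illustrative assumptions.
class FacebookHelper
{
    // Cache key under which the "retry after" timestamp is stored.
    private const RETRY_KEY = 'fb_retry_after';

    // Eligible to call the API if no retry-after timestamp is stored,
    // or the stored timestamp has already passed.
    public function canPullData(): bool
    {
        $retryAt = Cache::get(self::RETRY_KEY);

        return $retryAt === null || now()->timestamp >= $retryAt;
    }

    // Persist the moment at which calls may resume, taken from the
    // rate-limited response (e.g. a Retry-After header or error body).
    public function storeRetryAfter(array $res): void
    {
        $seconds = $res['retry_after'] ?? 60; // assumed response field

        Cache::put(self::RETRY_KEY, now()->timestamp + $seconds, $seconds);
    }
}
```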
The above snippet is a rough idea. Is this a good approach? The reason I am posting this question is the self::dispatch(...) call: it looks like recursion, and the job will keep re-queueing itself until $fbhelper->canPullData() returns true, which will probably take around 6 minutes. I am worried about what impact this could have on my application. Thanks in advance.
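For what it's worth, the tight re-queue loop described above can also be expressed with Laravel's delayed dispatch, so the job does not churn through the queue while the rate limit is still active. A minimal sketch, assuming the same FacebookData job and an illustrative 60-second delay:

```php
// Inside handle(), instead of re-dispatching immediately, the job can be
// re-queued with a delay. dispatch() returns a PendingDispatch, and
// delay() is standard Laravel API on it.
self::dispatch($this->userid)->delay(now()->addSeconds(60));
```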
from Newest questions tagged laravel-5 - Stack Overflow https://ift.tt/2GpJrcx
via IFTTT