Tuesday, January 30, 2018

Fetching near-real-time data from an external API

I'm looking for a sustainable way to fetch data every x seconds (say, 20) and store it in a relational database using PHP. After some research I found a few options:

1) Cronjobs (with shell scripts)

See https://askubuntu.com/questions/800/how-to-run-scripts-every-5-seconds for more information. This basically comes down to running a shell script that loops and sleeps, since cron itself can't fire more often than once per minute.

This doesn't feel right: I couldn't catch exceptions, and race conditions might occur. Also, cron jobs themselves aren't made for this kind of task.
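For reference, the workaround from that askubuntu link looks roughly like the sketch below: a crontab entry fires once per minute, and the script itself loops and sleeps to hit the 20-second interval. Everything here (script name, DSN, endpoint, table) is a hypothetical placeholder.

    <?php
    // fetch-loop.php -- minimal sketch of the cron + loop/sleep workaround.
    // Cron can't fire more often than once per minute, so a per-minute entry
    //     * * * * * /usr/bin/php /path/to/fetch-loop.php
    // starts this script, which performs three fetches 20 seconds apart.

    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

    for ($i = 0; $i < 3; $i++) {
        $start = time();

        // file_get_contents stands in for whatever HTTP client you actually use.
        $payload = file_get_contents('https://api.example.com/data');

        $pdo->prepare('INSERT INTO samples (payload, fetched_at) VALUES (?, NOW())')
            ->execute([$payload]);

        // Sleep out the remainder of the 20-second slot so a slow response
        // doesn't push the schedule.
        sleep(max(0, 20 - (time() - $start)));
    }

Note that a run which overruns its minute will overlap with the next one, which is exactly the race condition mentioned above; you'd want a lock (e.g. flock on a pid file) to guard against that.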

2) Web-worker (with queued jobs)

Laravel provides a queue worker that can process new jobs (asynchronously) as they are pushed onto the queue. I could push a large number of jobs onto the queue at once, to be processed consecutively every x seconds.

This sounds like a more robust solution, as I could catch exceptions and make sure the worker is running (using observers). The downside: it's slower, and it might be over-engineered.
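A common variant of this idea is a job that re-dispatches itself with a delay, rather than pre-filling the queue. A minimal sketch against Laravel 5.5 (the class name, endpoint, and samples table are hypothetical), assuming a real queue driver such as database or redis, since the default sync driver would run the whole chain inline:

    <?php

    namespace App\Jobs;

    use Carbon\Carbon;
    use Illuminate\Bus\Queueable;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Foundation\Bus\Dispatchable;
    use Illuminate\Queue\InteractsWithQueue;
    use Illuminate\Queue\SerializesModels;
    use Illuminate\Support\Facades\DB;

    class FetchApiData implements ShouldQueue
    {
        use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

        public function handle()
        {
            // file_get_contents stands in for whatever HTTP client you use;
            // an exception thrown here surfaces through the queue's failure
            // handling instead of dying silently in a shell loop.
            $payload = file_get_contents('https://api.example.com/data');

            DB::table('samples')->insert([
                'payload'    => $payload,
                'fetched_at' => Carbon::now(),
            ]);

            // Queue the next run 20 seconds out. The interval drifts by the
            // processing time; subtract it if exact spacing matters.
            dispatch((new self)->delay(20));
        }
    }

Seed the chain once with dispatch(new FetchApiData) and keep php artisan queue:work running (e.g. under Supervisor); the worker's failure handling then gives you the exception visibility that the cron approach lacks.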

3) Web socket

I could use node.js to run a websocket client like socket.io and implement some kind of timing mechanism to store the data every x seconds.

This solution feels odd, as I was taught that sockets are used to push data to clients (in real time); I have never seen them used to insert data.


All help is appreciated.



from Newest questions tagged laravel-5 - Stack Overflow http://ift.tt/2np4gLW
via IFTTT
