Wednesday, January 27, 2021

file_get_contents() too slow on 50+ files

I need to copy existing private S3 files to another directory, but my process is too slow: on my local machine, each file_get_contents() call takes about 2 seconds per file.

My problem is that most batches I process contain 50+ files, so the total comes to 2 seconds × 50 = 100 seconds, and waiting that long for the process to finish is not a great user experience. What might be the best approach to refactor this? A queue is not really an option at the moment.

foreach ($sourceAttachmentFiles as $sourceAttachmentFile) {

    $newFullFileName = $newDirectory.$sourceAttachmentFile->filename;

    // Download the file through its pre-signed URL (~2 seconds every loop)
    $content = file_get_contents($sourceAttachmentFile->getS3SignedUrl());

    // Re-upload the downloaded bytes to the new location
    Storage::disk('s3')->put($newFullFileName, $content, 'private');
}
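
Since the source and destination live on the same S3 disk, one option worth considering is a server-side copy, so the file bytes never leave AWS: Flysystem's S3 adapter translates copy() into a single S3 CopyObject request. A minimal sketch, assuming each attachment can expose its S3 object key (shown here as a hypothetical $sourceAttachmentFile->path property; the posted code only shows a signed URL, so adapt the accessor as needed):

foreach ($sourceAttachmentFiles as $sourceAttachmentFile) {

    $newFullFileName = $newDirectory.$sourceAttachmentFile->filename;

    // Copy inside S3 instead of downloading and re-uploading
    Storage::disk('s3')->copy($sourceAttachmentFile->path, $newFullFileName);

    // copy() does not take a visibility option, so set it afterwards
    Storage::disk('s3')->setVisibility($newFullFileName, 'private');
}

If the download step is unavoidable, another option is to run the HTTP requests concurrently instead of one at a time, for example with Guzzle's async client. A sketch, assuming guzzlehttp/guzzle is installed (it is in most Laravel apps) and guzzlehttp/promises >= 1.4 for Utils::unwrap (older versions expose the same behavior as the \GuzzleHttp\Promise\unwrap() function):

use GuzzleHttp\Client;
use GuzzleHttp\Promise\Utils;

$client = new Client();
$promises = [];

foreach ($sourceAttachmentFiles as $sourceAttachmentFile) {
    $newFullFileName = $newDirectory.$sourceAttachmentFile->filename;

    // Start every download at once; each getAsync() returns a promise
    $promises[$newFullFileName] = $client->getAsync($sourceAttachmentFile->getS3SignedUrl());
}

// Wait for all downloads to finish (throws if any request fails)
foreach (Utils::unwrap($promises) as $newFullFileName => $response) {
    Storage::disk('s3')->put($newFullFileName, (string) $response->getBody(), 'private');
}

With 50 files this turns 50 sequential round trips into roughly one, at the cost of holding all of the file contents in memory at the same time.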


from Newest questions tagged laravel-5 - Stack Overflow https://ift.tt/2MaWZiv
via IFTTT
