Consider an application that collects orders for multiple customers, where each customer gets its own sequence of order numbers (local_id), so you end up with something like:
Customer1: Order1, Order2, Order3
Customer2: Order1, Order2, Order3
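For context, here is a rough sketch of the relevant part of the orders migration (other columns omitted; the composite unique index on customer_id and local_id is the assumed source of the duplicate key errors described below):

use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

Schema::create('orders', function (Blueprint $table) {
    $table->increments('id');
    $table->unsignedInteger('customer_id');
    $table->unsignedInteger('local_id'); // per-customer order number
    $table->unique(['customer_id', 'local_id']);
});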
With code like the following, I get duplicate key errors when there are simultaneous service calls:
function insertOrder($data) {
    DB::transaction(function () use ($data) {
        // Read the customer's current highest local_id...
        $newLocalId = DB::table('orders')->where('customer_id', 1)->max('local_id');

        // ...then insert the next one. A concurrent call can read the same max
        // and try to insert the same local_id, which hits the unique index.
        DB::table('orders')->insert(
            ['customer_id' => 1, 'local_id' => $newLocalId + 1]
        );
    });
}
To try to solve this, I changed the code to this:
function insertOrder($data) {
    DB::transaction(function () use ($data) {
        // Lock the customer's existing rows while reading the current max,
        // so a concurrent transaction has to wait before computing its own.
        $newLocalId = DB::table('orders')
            ->where('customer_id', 1)
            ->lockForUpdate()
            ->max('local_id');

        DB::table('orders')->insert(
            ['customer_id' => 1, 'local_id' => $newLocalId + 1]
        );
    });
}
I was hoping that locking the rows holding the current max local_id, and thus blocking concurrent reads of it, would completely solve the issue, but I'm now running into deadlocks very frequently. I don't see how this function can create a deadlock, though.
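One mitigation would be to simply retry on deadlock (a minimal sketch, assuming the optional attempts argument that Laravel 5's DB::transaction accepts for re-running the closure when a deadlock is detected), but that only works around the symptom and doesn't explain why the deadlock happens:

function insertOrder($data) {
    // Re-run the whole transaction a few times if the database reports a deadlock.
    // This is a band-aid, not a fix for the underlying lock conflict.
    DB::transaction(function () use ($data) {
        $currentMax = DB::table('orders')
            ->where('customer_id', 1)
            ->lockForUpdate()
            ->max('local_id');

        DB::table('orders')->insert(
            ['customer_id' => 1, 'local_id' => $currentMax + 1]
        );
    }, 5); // second argument = number of attempts when a deadlock is detected
}

So the question stands: why do two of these transactions deadlock against each other?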