In Laravel, I have a query that searches thousands of rows in a database, and I am trying to compile the results into a CSV file. To reduce memory usage, I fetch 500 rows at a time and write them out as CSV.
$callback = function () use ($query) {
    $file = fopen('php://output', 'w');

    $query->chunk(500, function ($rows) use ($file) {
        foreach ($rows as $key => $row) {
            fputcsv($file, array_map(...$rows...));
        }
        log_info("Memory used " . memory_get_usage());
    });

    fclose($file);
};

$headers = [ ... ];

return response()->stream($callback, 200, $headers);

The actual query is a bit more complex and involves fetching related models, which also need to be hydrated. When I run this, it begins generating the CSV file and, after a while, runs out of memory. This is in my log:
(59): Memory used 17208328
(59): Memory used 25105328
(59): Memory used 30601328
...
(59): Memory used 127380496
(59): Memory used 129352584
(59): Memory used 131207672
[2025-11-23 23:50:15] qa.ERROR: Allowed memory size of 134217728 bytes exhausted (tried to allocate 16384 bytes)
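For context, a stripped-down but self-contained version of what I'm running looks roughly like this. The Order model, the column list, and the header values are hypothetical stand-ins for the parts I elided above:

use Illuminate\Support\Facades\Log;

// Hypothetical stand-in: "Order" and its columns replace the real model
// and the elided array_map() mapping. chunk() needs an ordered query.
$query = Order::query()->with('customer')->orderBy('id');

$callback = function () use ($query) {
    $file = fopen('php://output', 'w');

    // Write a header row once, then stream the data 500 rows at a time.
    fputcsv($file, ['id', 'customer', 'total']);

    $query->chunk(500, function ($rows) use ($file) {
        foreach ($rows as $row) {
            fputcsv($file, [
                $row->id,
                $row->customer->name ?? '',
                $row->total,
            ]);
        }
        Log::info('Memory used ' . memory_get_usage());
    });

    fclose($file);
};

$headers = [
    'Content-Type'        => 'text/csv',
    'Content-Disposition' => 'attachment; filename="export.csv"',
];

return response()->stream($callback, 200, $headers);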
What I Tried

I tried putting the following inside the chunk callback, hoping that it would free memory. It had no effect on the memory consumption.
unset($rows);
flush();
gc_collect_cycles();
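Two possibilities I have not ruled out yet: Laravel's query log retaining every executed statement (if it happens to be enabled), and PHP's userland output buffer holding the whole CSV in memory until the script ends, since flush() on its own does not drain it. A sketch of both mitigations, untested against my real query:

use Illuminate\Support\Facades\DB;

// If the query log is enabled, every chunk's SQL statement is retained
// in memory for the lifetime of the request; disabling it is cheap.
DB::connection()->disableQueryLog();

$query->chunk(500, function ($rows) use ($file) {
    foreach ($rows as $row) {
        fputcsv($file, [$row->id /* ...rest of the mapping as above... */]);
    }

    // flush() only drains PHP's / the web server's write buffer. If output
    // buffering is active, the generated CSV also accumulates in the
    // userland output buffer, which ob_flush() drains.
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
});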