In a developer's life, there are situations where you need to handle and process a large amount of data. Often, when that data is processed in a loop, memory consumption keeps growing, and you may run into the “allowed memory size exhausted” error.
So in this article I will show you how to handle large collections of data in PHP, and especially in Laravel.

In Laravel you can handle large amounts of data using two methods:
1. Chunk
2. Cursor
Using these methods, you can significantly reduce your application's memory consumption when iterating through thousands of Eloquent model records.
Chunk
In Laravel, the Eloquent chunk method breaks a large dataset into smaller subsets, called chunks.
If you need to process thousands of Eloquent records, use the chunk method. It retrieves a “chunk” of Eloquent models at a time and feeds them to a given closure for processing, conserving memory when working with large result sets.
- Constant memory usage
- Splits the query into multiple queries with limit and offset
- Fetches each chunk by calling PDOStatement::fetchAll()
Example:
Product::chunk(2000, function ($products) {
    foreach ($products as $product) {
        // implement required logic
    }
});
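One caveat with chunk: because it pages with limit and offset, updating the very column you filter on inside the loop can cause rows to be skipped between chunks. For that case Laravel provides chunkById, which pages on the primary key instead. A minimal sketch, assuming a hypothetical is_active column on the Product model:

use App\Models\Product;

// chunkById pages on the primary key ("where id > {last id} limit 2000"),
// so modifying the filtered column inside the loop is safe
Product::where('is_active', true)->chunkById(2000, function ($products) {
    foreach ($products as $product) {
        // 'is_active' is an illustrative column, not part of the original example
        $product->update(['is_active' => false]);
    }
});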
Cursor
The cursor method allows you to iterate through your database records using a cursor, which will only execute a single query. When processing large amounts of data, the cursor method may be used to greatly reduce your memory usage.
Cursor is built on PHP generators; if you are not familiar with them, see the short sketch after the list below, or the generators page in the PHP manual.
- High speed
- Only a single query
- Fetches rows one at a time by calling PDOStatement::fetch()
- Minimizes Eloquent model memory overhead, since only one model is hydrated at a time
One caveat: with a buffered driver such as pdo_mysql (the default), PDO still buffers the full result set on the client, so cursor mainly saves the memory Eloquent would spend on hydrated models, not the memory used by the driver itself.
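To make the generator point concrete, here is a tiny standalone sketch in plain PHP (not Laravel-specific): the function yields one value at a time instead of building the whole array up front, which is the same mechanism cursor() relies on.

function numbers(int $limit): Generator
{
    for ($i = 1; $i <= $limit; $i++) {
        // only the current value lives in memory; nothing accumulates
        yield $i;
    }
}

foreach (numbers(1000000) as $n) {
    // consume one value at a time
}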
Example:
foreach (Product::cursor() as $product) {
    // process logic
}
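Because cursor() returns an Illuminate\Support\LazyCollection, you can also chain collection methods onto it and the pipeline stays lazy. A small sketch, assuming a hypothetical price column on the Product model:

use App\Models\Product;

// filter() and each() process one model at a time; the full result
// set is never loaded into a regular collection ('price' is illustrative)
Product::cursor()
    ->filter(fn ($product) => $product->price > 100)
    ->each(function ($product) {
        // handle only the matching products
    });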
Feel free to reach out to me if you have any further questions.