Reading large CSV files with a custom delimiter in PHP: file_get_contents() times out and exhausts memory
I'm stuck on something that should probably be simple. I'm trying to read large CSV files using PHP's `file_get_contents()` function. The files can be up to 100MB and use a custom delimiter (a semicolon `;`) instead of the standard comma. When I attempt to read and parse a file, it either times out or takes an excessive amount of time, often ending in a memory exhaustion error. I've tried increasing the memory limit in my `php.ini` to `512M`, but that doesn't resolve the issue.

Here's the relevant snippet of my code:

```php
$file = 'path/to/largefile.csv';
$data = file_get_contents($file);

if ($data === false) {
    die('Error reading the file.');
}

$rows = explode("\n", $data);
$parsedData = [];

foreach ($rows as $row) {
    $parsedData[] = str_getcsv($row, ';');
}
```

With this approach I'm running into a `Maximum execution time of 30 seconds exceeded` error. I considered using `fgetcsv()` instead, but I'm not sure how to implement it for large files without loading the entire file into memory.

Is there a more efficient way to read this large CSV file in chunks, or a recommended best practice for handling such cases in PHP? Am I missing something obvious? I'm on Windows 11 using the latest version of PHP.
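For reference, this is roughly the streaming approach I had in mind with `fgetcsv()` (just an untested sketch; `processRow()` is a placeholder for whatever I'd actually do with each row):

```php
$file = 'path/to/largefile.csv';

$handle = fopen($file, 'r');
if ($handle === false) {
    die('Error opening the file.');
}

// Read one line at a time so the whole file never sits in memory;
// length 0 means no line-length limit, ';' is the custom delimiter.
while (($row = fgetcsv($handle, 0, ';')) !== false) {
    processRow($row); // placeholder: handle each parsed row here
}

fclose($handle);
```

Is this the right direction for files of this size, or is there a better pattern?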