halaxa/json-machine
Efficiently parse huge JSON files and streams in PHP with low memory usage. json-machine provides an iterator-style API for incremental decoding: it lazily yields the items of large arrays and objects, supports JSON Pointer selection of nested subtrees and pluggable decoders, and works well for imports and ETL tasks.
Install via Composer with composer require halaxa/json-machine. The package provides a minimal yet powerful API for streaming large JSON documents without loading them entirely into memory. Start with the README's basic usage example: iterating JsonMachine\Items::fromFile('large_file.json') in a foreach yields decoded JSON items one by one. For quick validation, parse a small 10–20 item JSON file first to confirm installation and basic behavior. Key classes to know: JsonMachine\Items, JsonMachine\Parser, and the decoders in the JsonMachine\JsonDecoder namespace (e.g., ExtJsonDecoder, PassThruDecoder).
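The basic usage above can be sketched as follows. This is a minimal example assuming json-machine ^1.0 is installed via Composer and that large_file.json contains a top-level JSON array; the filename is illustrative.

```php
<?php
// Minimal streaming iteration with json-machine (assumes ^1.0).
// Only one decoded item is held in memory at a time.
require __DIR__ . '/vendor/autoload.php';

use JsonMachine\Items;

$items = Items::fromFile('large_file.json');

foreach ($items as $key => $item) {
    // $key is the array index (or object key); $item is a decoded
    // stdClass/scalar by default. Break early while experimenting.
    var_dump($key, $item);
}
```

By default items decode to objects, mirroring json_decode(); pass a decoder option such as new ExtJsonDecoder(true) to get associative arrays instead.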
- Replace file_get_contents() + json_decode() with JsonMachine\Items::fromFile('path/to/file.json') to process JSON item by item, drastically reducing memory usage.
- Combine Items::fromFile() with IteratorIterator or custom generators to filter, transform, or batch-process data (e.g., consume only "type":"user" entries).
- Run imports from Artisan commands (php artisan import:json) or queued jobs for ETL tasks — combine with Laravel's logging (Log::info()) and events.
- Use Items::fromString($jsonString) for JSON already in memory, or Items::fromStream() with any readable stream resource (e.g., HTTP responses or S3 objects opened via fopen() wrappers), enabling real-time parsing of streamed APIs.
- Reshape or validate each decoded item inside the foreach loop as it streams, or swap in a decoder such as ExtJsonDecoder(true) for associative arrays, avoiding post-processing overhead.
- Target nested collections with a JSON Pointer, e.g. Items::fromFile('file.json', ['pointer' => '/data/items']) — and guard optional keys with ?? to prevent "Undefined array key" warnings.
- Use dd($item) on streamed items sparingly in CLI contexts — instead, log to a file or break out of the foreach after N items.
- Parse compressed input by opening it as a stream (e.g., fopen('compress.zlib://file.json.gz', 'r')) and passing the resource to Items::fromStream().
- Prefer fromFile() over fromString() for large files — fromString() requires the whole document in memory, which defeats the purpose of streaming. Always benchmark with real data; the package shines most with files in the tens of megabytes and up.
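Several of the patterns above — JSON Pointer selection, generator-based filtering, and safe key access — can be combined in one sketch. The file name, pointer path, and field names ("results", "type", "name") are illustrative assumptions, not part of any real schema.

```php
<?php
// Sketch: stream only the "/results" subtree of a large export and
// keep just the "type":"user" entries, in constant memory.
// Assumes json-machine ^1.0; file/field names are hypothetical.
require __DIR__ . '/vendor/autoload.php';

use JsonMachine\Items;

/** Lazily filter a stream of decoded items down to user records. */
function users(iterable $items): \Generator
{
    foreach ($items as $item) {
        // Keys may be absent on some records; null coalescing avoids
        // "Undefined property" warnings.
        if (($item->type ?? null) === 'user') {
            yield $item;
        }
    }
}

$items = Items::fromFile('export.json', ['pointer' => '/results']);

$count = 0;
foreach (users($items) as $user) {
    echo ($user->name ?? '(no name)'), PHP_EOL;
    if (++$count >= 100) {
        break; // bail out early while testing, rather than dd()-ing
    }
}
```

The generator keeps filtering logic separate from I/O, so the same users() helper works unchanged whether the source is a file, a string, or a stream.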