I use the POSIX (Unix) open(), close(), read()... APIs to process stream files, much like the equivalent C standard library functions.
In general I allocate a 4 MB buffer, read the file in 4 MB blocks, and process each buffer. The tricky part is handling "records" that cross a 4 MB boundary: part of the record is in memory while the rest is still in the file.
The big problem I see is performance. Just as reading one record at a time from a database can be a performance hit, reading one byte at a time from a stream file is almost guaranteed to perform poorly.
This mailing list archive is Copyright 1997-2025 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].