Wouldn't it make more sense to use a data queue for this process instead of a 
physical file?

Then you wouldn't have to worry about deleted records...
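Kenneth's suggestion can be sketched in CL. Everything here is illustrative: the queue name WRKQ, library MYLIB, and the 512-byte entry size are assumptions, not anything stated in the thread.

```cl
/* Create the queue once; SEQ(*FIFO) preserves arrival order */
CRTDTAQ    DTAQ(MYLIB/WRKQ) MAXLEN(512) SEQ(*FIFO)

PGM
   DCL        VAR(&LEN)  TYPE(*DEC)  LEN(5 0) VALUE(512)
   DCL        VAR(&DATA) TYPE(*CHAR) LEN(512)
   DCL        VAR(&WAIT) TYPE(*DEC)  LEN(5 0) VALUE(-1)

   /* Producer side: any number of jobs append entries */
   CALL       PGM(QSNDDTAQ) PARM('WRKQ' 'MYLIB' &LEN &DATA)

   /* Consumer side: the dedicated job blocks until an entry */
   /* arrives (&WAIT = -1 means wait indefinitely). The      */
   /* entry is removed from the queue as it is received.     */
   CALL       PGM(QRCVDTAQ) PARM('WRKQ' 'MYLIB' &LEN &DATA &WAIT)
ENDPGM
```

Because QRCVDTAQ removes each entry as it hands it to the reader, nothing accumulates: there are no deleted records to squeeze out and no reorganization to schedule.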

Kenneth

-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx
[mailto:midrange-l-bounces@xxxxxxxxxxxx]On Behalf Of James H H Lampert
Sent: Monday, September 25, 2006 3:24 PM
To: midrange-l@xxxxxxxxxxxx
Subject: File that has records constantly being added and deleted


Here's the situation:

We have a file. Any arbitrary number of jobs can put 
records into the file; a single dedicated job reads the 
records, in arrival sequence, processes them, and deletes 
them. We thus have a file that rarely has more than a few 
active records, but accumulates lots and lots of deleted 
ones.

Is there a way to squeeze out deleted records without 
having to grab an exclusive lock on the file? Or would it 
be more sensible to set it to re-use deleted records, and 
modify the processing program to read by key? Or are there 
other ideas?
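The reuse alternative James mentions is a one-time change to the file (MYLIB/MYFILE is a hypothetical name):

```cl
/* Let new records reuse the slots of deleted ones, */
/* so the file stops growing                        */
CHGPF      FILE(MYLIB/MYFILE) REUSEDLT(*YES)
```

The catch is exactly why reading by key comes into it: with reuse on, a new record can land in a reused slot anywhere in the file, so arrival sequence no longer reflects insertion order. The reader job would need a keyed access path, for example over a sequence-number or timestamp field, to keep processing in FIFO order.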

--
JHHL


This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page.