|
Jack--
No, it's not a 'troubled' file-- the data is fine. It's a sign that whoever originally created the file didn't know how much data would be stored and underestimated the file size.
The problem is that every time the file fills up one 'increment' of records (i.e., every 100 records you add), the operating system asks what to do about it. Usually, the option to extend the file and continue is taken. This has happened over 42,000 times!
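A rough sanity check shows the extension count lines up with the record count. (The 4.2 million figure is an assumption for illustration; the post only says "more than 4 million.")

```python
# Rough sanity check: a file of ~4.2 million records, extended
# 100 records at a time from a small base, implies on the order of
# 42,000 extensions -- consistent with the operator-message count.
records = 4_200_000   # assumed current record count
increment = 100       # records added per extension
extensions = records // increment
print(extensions)     # → 42000
```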
The cure is to use CHGPF to increase the base size of the file and the size of its increments. The SIZE parameter takes three values: the initial number of records, the increment size, and the maximum number of increments. Since the file currently holds more than 4 million records, the new base size should be at least 5 million. The increment should be much larger than 100; I recommend the maximum, 32767. The maximum number of increments is already 32767, which the file has exceeded by thousands of increments!
CHGPF FILE(LIB/FILE) SIZE(5000000 32767 32767)
You can't change the file while it's busy-- pick a 'quiet time.'
Before you change the file, however-- how many deleted records does the file contain? Would reorganizing the file be a good starting point?
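If a reorganize is in order, one possible sequence looks like this (a sketch only-- LIB/FILE is a placeholder, and RGZPFM, like CHGPF, needs exclusive use of the member, so it also belongs in that 'quiet time'):

```
DSPFD  FILE(LIB/FILE) TYPE(*MBR)          /* check current vs. deleted records */
RGZPFM FILE(LIB/FILE)                     /* remove the deleted records        */
CHGPF  FILE(LIB/FILE) SIZE(5000000 32767 32767)
```

Reorganizing first means the new base size is judged against live records only, not live plus deleted.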
Paul E Musselman
PaulMmn@xxxxxxxxxxxxx