This is a topic I am not an expert on, but I have seen significant increases in AS/400 performance as a result of mucking around with how the software functions and how much ancient data it carries.

* I rewrote a MAPICS inventory application, which took me several months ... response time went from several minutes to several microseconds. This was one of my first major projects at my current employer, before we switched from MAPICS to BPCS.

* I got permission from the users to delete data that was more than 2 years old (a sketch of that kind of purge follows this list). Once that was done, response-time degradation went from noticeable to unnoticeable, and some reports that used the data, and had taken the better part of an hour to get done, now finished in under 5 minutes.

* I changed the logical files some programs used for access, and the reports involved finished in about 80% of the time they previously took.

* Some Query/400 users tried to run something that I had set up for JOBQ only ... via JOBQ it takes about half a minute to run ... interactively it takes several hours and kills performance for everyone else.
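
Here is a minimal CL sketch of the kind of purge I mean. Every name in it is hypothetical ... a history file ITEMHIST in library MYLIB with a transaction date TDATE in CYYMMDD format, an archive library ARCLIB, and a cutoff of Jan 1 2005 ... and it should run when nobody has the file open:

    /* 1. Save the old records into an archive library            */
    CPYF FROMFILE(MYLIB/ITEMHIST) TOFILE(ARCLIB/ITEMHIST) +
         MBROPT(*ADD) CRTFILE(*YES) +
         INCREL((*IF TDATE *LT 1050101))

    /* 2. Copy the records worth keeping to a work file           */
    CPYF FROMFILE(MYLIB/ITEMHIST) TOFILE(QTEMP/KEEP) +
         MBROPT(*REPLACE) CRTFILE(*YES) +
         INCREL((*IF TDATE *GE 1050101))

    /* 3. Clear the live file and copy the keepers back           */
    CLRPFM FILE(MYLIB/ITEMHIST)
    CPYF FROMFILE(QTEMP/KEEP) TOFILE(MYLIB/ITEMHIST) MBROPT(*ADD)

The clear-and-reload also rebuilds the file without deleted-record gaps, which is part of where the speed comes from.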

I got my data on how much faster stuff ran in 2 ways (a short example follows this list):
* DSPLOG, to see how long something took to execute via JOBQ
* WRKACTJOB, then F4 (not Enter), then F10
* on the response time limit ... make that 0.1
* Enter, then F11
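
For example, a sketch of the DSPLOG approach ... program and job names here are made up, and the PERIOD values are whatever brackets your run, in your job's date/time format:

    /* Submit the report to batch instead of running it interactively */
    SBMJOB CMD(CALL PGM(MYLIB/RPTPGM)) JOB(NIGHTRPT) JOBQ(QBATCH)

    /* Afterward, bracket the job's start and completion messages   */
    /* in the history log to see the elapsed time                   */
    DSPLOG LOG(QHST) PERIOD((080000 020107) (170000 020107))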

If you're not familiar with this stuff, I can share more info on how to interpret the screen, but you can learn a lot by putting the cursor on a column and pressing F1.

You might search the BPCS-L archives for "archive", since this topic has come up there from time to time; there are several competing add-on packages that archive BPCS data and improve BPCS performance in several areas. <http://archive.midrange.com/bpcs-l>

UPI is one of those suppliers, and the one I am more familiar with, since my employer uses them for tech support when we have a serious emergency.
A starting point for their site: http://www.unbeatenpathintl.com/
Info about their archiving product for BPCS: http://www.unbeatenpathintl.com/locksmith_bpcs/source/1.html

There are month-end tasks in BPCS whose run time is directly proportional to the volume of records in the file ... so a file with 10 million records takes 10 times longer to process than one with 1 million records.

Some of this may be specific to how BPCS is designed and would not apply to other vendors' software; each package has its own unique situations.

BPCS comes with a "sizing questionnaire" that asks things like how many users are on the system, how large the files are, and how active certain inputs are, and from that it computes how powerful an AS/400 you need. Over time a company evolves until it no longer matches what was on the questionnaire, and new OS/400 upgrades mean the basis for the questionnaire needs to be revised. So one solution to a slow AS/400 is to redo the questionnaire based on the evolved reality and see whether the answers have changed ... perhaps the company now needs a faster processor, more memory, more CPW, etc.

There's also a lot that can be done with the SQL optimizer's advice.
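One long-standing way to get that advice, sketched in CL: run the query under debug, and the optimizer writes its reasoning (the CPI43xx messages, e.g. suggesting an access path it wishes existed) to the job log. The query name here is hypothetical:

    STRDBG UPDPROD(*YES)   /* debug mode; *YES allows production files */
    /* ... run the slow query or program here, for example:            */
    /* RUNQRY QRY(MYLIB/SLOWQRY)                                       */
    ENDDBG
    DSPJOBLOG              /* look for CPI43xx optimizer messages      */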

Al,

This intrigues me because it just so happens that I'm working on a
project to unarchive history.  The current process is to back off the
sales history to a separate library at year end.


So far I've merged the last two years' with the current year's history
(in a test environment, of course).  There are no discernible
differences to me (but I'm blind in one eye and can't see too well out
of the other one) as far as inquiry goes.  The inquiries do ask the
requestor for the customer and, optionally, a starting date.
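(For reference, the merge can be as simple as one CPYF append per archived
year ... library and file names here are hypothetical:

    CPYF FROMFILE(ARCLIB/SHISTORY) TOFILE(TESTLIB/SHISTORY) +
         MBROPT(*ADD) FMTOPT(*MAP *DROP)

with FMTOPT(*MAP *DROP) to tolerate layout drift between releases.)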


Naturally, if I were reading the sales history as an Input Primary, I
would expect things to take longer.  Is this what you are talking about,
or is there some impact on the random searches (inquiries)?  What kind
of activity other than the I/P would adversely impact a situation such
as this?


Thanks.

Jerry C. Adams
IBM System i Programmer/Analyst
B&W Wholesale Distributors, Inc.
voice: 615.995.7024
fax: 615.995.1201
email: jerry@xxxxxxxxxxxxxxx



Al Mac wrote:

>   Depending on your applications, archiving can dramatically improve
>   performance.  Let's suppose for example, people are using an inventory
>   history inquiry program and the history has 10 years of data, but 99% of
>   the people only need to see what is in the last 2 weeks.
>
>   Having 500 weeks of data in the file can slow access for people who only
> need to see 2 weeks worth. Archiving solution makes the data available to
>   the 1% applications that need to see 500 weeks, while performance
>   dramatically improved for the 99% of your users.
>
>   There's bunch of people on these lists who are familiar with the various
>   archiving alternatives and can elucidate further if this interests you.
>
>   -
>   Al Macintyre
>   http://en.wikipedia.org/wiki/User:AlMac
>   BPCS/400 Computer Janitor ... see
>   http://radio.weblogs.com/0107846/stories/2002/11/08/bpcsDocSources.html


