On Thursday 22 April 2004 17:44, Mike Berman wrote:
> I would like to monitor the DSPLOG on a daily basis. I would run it to
> a spool file, change the spool file to a PF, and then read this PF for
> any problems. Has anyone already done this, or knows which type of
> error messages to check for? I recall some years ago that when a disk
> failed, for example, there had been some warning messages in the DSPLOG
> that we had never noticed. I am trying to write something along these
> lines. It would also include any failure in the night processing.

Hi Mike

You can go that route, but it can take a while to generate and process the
spool file. Depending on your situation (i.e. if you have the authority),
you might be able to read the log files directly. They are regular physical
files in QSYS. The name is QHSTYYDDDX, where YY is a two-digit year, DDD is
the Julian date (days from the start of the year) and X is a sequencing
letter (A, then B, etc. throughout the day). The file has just a single
field, SYSLOGFLD, where the first 10 bytes hold the date and time. As yet I
haven't figured out how that's encoded, but I'm sure someone else on the
list will know.

We have just the one application that parses the logs, written a long time
back. It uses the DSPLOG/CPYSPLF technique, which is why I know it takes
ages ;)  If someone knows the method for extracting the date and time, I
can rework the app to speed it up :)

Regards, Martin
--
martin@xxxxxxxxxx  AIM/Gaim: DBG400dotNet  http://www.dbg400.net
 /"\  DBG/400 - AS/400 & iSeries Open Source/Free Software utilities
 \ /  Debian GNU/Linux | ASCII Ribbon Campaign against HTML mail & news
  X
 / \
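For anyone wanting to try the spooled-file route Mike describes, a minimal
CL sketch might look like the one below. The library name MYLIB, the output
file DSPLOGPF, and the 132-byte record length are placeholders (not from the
original post), and the DSPLOG spooled file name (QPDSPLOG) may vary by
release; treat this as a starting point rather than a tested solution.

    /* Produce the history log as a spooled file                     */
    DSPLOG     LOG(QHST) OUTPUT(*PRINT)

    /* One-time setup: a flat, program-described PF to receive it    */
    /* (MYLIB/DSPLOGPF is a hypothetical name)                       */
    CRTPF      FILE(MYLIB/DSPLOGPF) RCDLEN(132)

    /* Copy the spooled output into the PF for a program to read     */
    CPYSPLF    FILE(QPDSPLOG) TOFILE(MYLIB/DSPLOGPF) JOB(*) +
                 SPLNBR(*LAST) MBROPT(*REPLACE)

As Martin notes, generating and copying the spooled file is the slow part,
which is why reading the QHSTYYDDDX files in QSYS directly is attractive
once the date/time encoding in SYSLOGFLD is worked out.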