|
Hi Dan
(I am not at work now) but I have copied the text from the email into
Notepad and removed the asterisks - the code looks like what I originally
had, but no absolute guarantees:
SELECT
  A.FROM_JOB,
  A.FROM_JOB_USER,
  A.FROM_JOB_NAME,
  A.FROM_JOB_NUMBER,
  REGEXP_SUBSTR(A.MESSAGE_SECOND_LEVEL_TEXT,
    '(saved on volumes\s+)(.+?)(\s)', 1, 1, '', 2) AS "Volume",
  DATE(A.MESSAGE_TIMESTAMP) AS START_DATE,
  TIME(A.MESSAGE_TIMESTAMP) AS START_TIME
FROM TABLE(QSYS2.HISTORY_LOG_INFO(
       START_TIME => CURRENT_TIMESTAMP - 24 HOURS)) A
WHERE A.FROM_JOB_NAME LIKE 'BACKUPJOB'
  AND A.MESSAGE_ID = 'CPC3701'
ORDER BY A.MESSAGE_TIMESTAMP DESC
FETCH FIRST 1 ROW ONLY;
On Fri, Mar 21, 2025 at 4:48 PM Dan Bale <dan.bale@xxxxxxxxxxxxxxxxxxxxx>
wrote:
Hi Evan,
Somewhere between your actual source code and what you pasted below,
asterisks got "inserted" into your email. Out of curiosity, are you copying
the source code from RSS and pasting it into the email? RSS does a lot of
colorizing of the code, and I wonder if that's where all the asterisks are
coming from. Would you please try copying your original source code and
pasting it into Notepad, then copying from Notepad into the email?
- Dan Bale
-----Original Message-----
From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of
Evan Harris
Sent: Thursday, March 20, 2025 5:32 PM
To: Midrange Systems Technical Discussion <midrange-l@lists.midrange.com>
Subject: Re: What was the last volume used on a save
Hi Don
I had a crack at this out of interest this morning using the
HISTORY_LOG_INFO SQL service. It might not be quite what you want, but
perhaps it provides a starting point or an alternative approach.
SELECT
A.FROM_JOB,
A.FROM_JOB_USER,
A.FROM_JOB_NAME,
A.FROM_JOB_NUMBER,
*REGEXP_SUBSTR*(MESSAGE_SECOND_LEVEL_TEXT, ( '(saved on
volumes\s+)(.+?)(\s)'), *1*, *1*,'',*2* ) as "Volume",
*DATE*(A.MESSAGE_TIMESTAMP) as START_DATE,
*TIME*(A.MESSAGE_TIMESTAMP) as START_TIME FROM
TABLE(QSYS2.HISTORY_LOG_INFO( START_TIME *=>* (*CURRENT_TIMESTAMP* -
*24* hours ) )) A
WHERE A.FROM_JOB_NAME LIKE 'BACKUPJOB'
AND A.MESSAGE_ID = 'CPC3701'
ORDER BY A.MESSAGE_TIMESTAMP DESC
FETCH FIRST *1* ROW ONLY;
Change the job name to whatever your backup job is called.
The selected fields are mostly there just to return something, except for
the REGEXP_SUBSTR, which extracts the volume ID from
MESSAGE_SECOND_LEVEL_TEXT.
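If you want to see what the pattern does before pointing it at the history
log, you can run it against a made-up string first. The sample text below is
only an illustration, not the exact wording of the CPC3701 second-level text:

SELECT REGEXP_SUBSTR(
         'Objects saved on volumes VOL001 VOL002.',  -- hypothetical sample text
         '(saved on volumes\s+)(.+?)(\s)',           -- same pattern as the query
         1, 1, '', 2) AS "Volume"
FROM SYSIBM.SYSDUMMY1;

Capture group 2 is what gets returned (that is why the last argument is 2);
for the sample text above that would be VOL001.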
You might need to tweak the start time and how many hours the query looks
back from current timestamp.
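For example, if the backup only runs weekly, you could widen the window by
changing the labelled duration (7 DAYS here is just an illustration). A quick
count is an easy way to check that the window actually catches the message:

SELECT COUNT(*) AS CPC3701_COUNT
FROM TABLE(QSYS2.HISTORY_LOG_INFO(
       START_TIME => CURRENT_TIMESTAMP - 7 DAYS)) A
WHERE A.MESSAGE_ID = 'CPC3701';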
I sorted by timestamp in descending order and fetched the first row only
so as to get the last record (that was my thinking anyway) - there might be
a better way to do this.
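One alternative (untested, and not necessarily better, since it invokes the
table function twice) would be to pick the row whose timestamp equals the
maximum:

SELECT REGEXP_SUBSTR(A.MESSAGE_SECOND_LEVEL_TEXT,
         '(saved on volumes\s+)(.+?)(\s)', 1, 1, '', 2) AS "Volume"
FROM TABLE(QSYS2.HISTORY_LOG_INFO(
       START_TIME => CURRENT_TIMESTAMP - 24 HOURS)) A
WHERE A.FROM_JOB_NAME LIKE 'BACKUPJOB'
  AND A.MESSAGE_ID = 'CPC3701'
  AND A.MESSAGE_TIMESTAMP = (
        SELECT MAX(B.MESSAGE_TIMESTAMP)
        FROM TABLE(QSYS2.HISTORY_LOG_INFO(
               START_TIME => CURRENT_TIMESTAMP - 24 HOURS)) B
        WHERE B.FROM_JOB_NAME LIKE 'BACKUPJOB'
          AND B.MESSAGE_ID = 'CPC3701');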
I have used similar queries with the db2 command in strqsh to return
specific values into (for example) a specific data area, so I will leave
that part as an exercise for you.
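(If you would rather stay in SQL instead of going through strqsh, one way -
purely a sketch, and the MYLIB/LASTVOL names are made up - is to call
QSYS2.QCMDEXC with a CHGDTAARA command, substituting the "Volume" value
returned by the query:)

-- Sketch only: assumes a *CHAR data area MYLIB/LASTVOL already exists.
-- In practice the VALUE(...) part would be built from the query's "Volume" column.
CALL QSYS2.QCMDEXC(
  'CHGDTAARA DTAARA(MYLIB/LASTVOL) VALUE(''VOL001'')');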