Jim--
I think you received 2 copies of my message because I sent one copy 
directly to you; the other came through the Midrange mailing list.
We wrote a program that can extract the data from any file.
You need to create a custom retrieval file for each data file whose 
records you plan to save.  This file has all the 'journal' fields 
that you get when retrieving a journal transaction, plus all the 
fields from the original file's DDS (or its retrieved SQL).
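For what it's worth, the skeleton of one of those custom files looks 
something like this in DDS.  The JO* names and sizes are from memory 
(they come from the DSPJRN model outfile-- do a DSPFFD over your work 
file to get the real list), and CUSTNO/CUSTNAME just stand in for 
whatever fields your data file actually contains:

     A          R SAVREC
     A*  All of the journal prefix fields from the DSPJRN outfile,
     A*  in the same order (abbreviated here-- the real list starts
     A*  with entry length, sequence number, journal code, etc.):
     A            JODATE         6A         TEXT('Entry date')
     A            JOTIME         6S 0       TEXT('Entry time')
     A            JOJOB         10A         TEXT('Job name')
     A            JOUSER        10A         TEXT('User profile')
     A            JONBR          6S 0       TEXT('Job number')
     A*  Then every field from the data file's own DDS:
     A            CUSTNO         7P 0       TEXT('Customer number')
     A            CUSTNAME      30A         TEXT('Customer name')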
You then retrieve the journal entries for a single database file to a 
work file.  This is a WIDE record-- it has the journal fields (date, 
time, job, user that created the transaction), copies of the database 
fields (the DDS), plus a lot of extraneous journal information that 
we're not interested in.
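The retrieve step is basically a DSPJRN to an outfile.  Something 
along these lines-- the library, journal, and file names are made up, 
and you'll want to check OUTFILFMT and ENTDTALEN against your 
release:

     DSPJRN     JRN(MYLIB/MYJRN) +
                FILE((MYLIB/CUSTMAST)) +
                RCVRNG(*CURCHAIN) +
                ENTTYP(*RCD) +
                OUTPUT(*OUTFILE) +
                OUTFILFMT(*TYPE1) +
                OUTFILE(QTEMP/JRNWORK) +
                ENTDTALEN(*CALC)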
Then you copy this work file to your custom file with CPYF 
FMTOPT(*NOCHK).  This truncates each record at the end of the DDS 
fields and results in much more compact data.  The resulting file can 
be analyzed with any database tools.
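The copy itself is a one-liner (same made-up names):

     CPYF       FROMFILE(QTEMP/JRNWORK) +
                TOFILE(MYLIB/CUSTSAV) +
                MBROPT(*ADD) +
                FMTOPT(*NOCHK)

FMTOPT(*NOCHK) just copies the data left to right without checking 
the record formats, which is why the custom file's DDS has to line up 
with the front of the work-file record.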
We've encapsulated the 'journal entry retrieve' and the CPYF into a 
program (rough skeleton below)-- we just pass parms for a whole mess 
of files.
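Stripped down to its shell, that program is little more than the two 
commands above with the names passed in as parms.  A rough sketch, 
not our production code:

     PGM        PARM(&DTALIB &DTAFILE &JRNLIB &JRN &SAVLIB &SAVFILE)
     DCL        VAR(&DTALIB)  TYPE(*CHAR) LEN(10)
     DCL        VAR(&DTAFILE) TYPE(*CHAR) LEN(10)
     DCL        VAR(&JRNLIB)  TYPE(*CHAR) LEN(10)
     DCL        VAR(&JRN)     TYPE(*CHAR) LEN(10)
     DCL        VAR(&SAVLIB)  TYPE(*CHAR) LEN(10)
     DCL        VAR(&SAVFILE) TYPE(*CHAR) LEN(10)
     /* Pull this file's entries out of the journal...            */
     DSPJRN     JRN(&JRNLIB/&JRN) FILE((&DTALIB/&DTAFILE)) +
                  RCVRNG(*CURCHAIN) ENTTYP(*RCD) +
                  OUTPUT(*OUTFILE) OUTFILFMT(*TYPE1) +
                  OUTFILE(QTEMP/JRNWORK) ENTDTALEN(*CALC)
     /* ...and squeeze them into the custom retrieval file.       */
     CPYF       FROMFILE(QTEMP/JRNWORK) TOFILE(&SAVLIB/&SAVFILE) +
                  MBROPT(*ADD) FMTOPT(*NOCHK)
     ENDPGM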
The problem with this technique is that you need to run this same 
process for every database file.  And retrieving journal entries this 
way means that for every file you need to scan the same journal 
receivers.  Over.  And over.  And over and over again.
Carsten Flensburg showed a more efficient technique using the 
RCVJRNE command.  This command piggybacks on the IBM journal logic. 
You tell the system, "While you're capturing transactions, if you see 
any for this list of files (a, b, c, e), give me a copy as well." 
This means that for a large number of files (see the command's 
limits), you have one job waiting for transactions to happen and an 
exit program that processes them as they arrive.
See 
http://www2.systeminetwork.com/artarchive/index.cfm?fuseaction=viewarticle&CO_ContentID=17415 
We mutilated his procedure for our needs, but it's a lot more 
efficient than retrieving journal entries file by file!
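For the archives, the blocking call ends up looking something like 
this.  The names are made up, the exit program (JRNEXIT here) is 
where the real work happens, and the parameters vary a bit by 
release, so trust Carsten's article and the command help over my 
memory:

     RCVJRNE    JRN(MYLIB/MYJRN) +
                FILE((MYLIB/FILEA) (MYLIB/FILEB) (MYLIB/FILEC)) +
                EXITPGM(MYLIB/JRNEXIT) +
                ENTTYP(*RCD) +
                DELAY(*NEXTENT)

That one command replaces all the per-file DSPJRN passes-- the 
receivers get read once, and JRNEXIT gets handed a copy of each entry 
for the listed files as it arrives.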
--Paul E Musselman
PaulMmn@xxxxxxxxxxxxxxxxxxxx