CPYFRMIMPF is a tool designed to generate /rows/ of described database data from previously exported _database_ data; that is, database data that was exported in either a delimited or a fixed format and that, irrespective of which format, has standard text line delimiters. Stream data is essentially incompatible with the concept of row data; the latter can only _conceptually_ represent the former, and generally only as varying-length rows. That is because when stream data is not delimited for representation as multiple fields, it is almost surely either a binary stream, for which any concept of a row is unnecessary complexity, or strings defined by some delimiter(s), which typically manifest as /lines/ of text data for presentation rather than as rows of database data. The distinction between lines and rows is that a row in a database file has no delimiters; its fixed or varying row size takes the place of the delimiters used to separate distinct strings of text within a stream.

The given snippet of data is indicative of a stream which does not have any obvious text /lines/ of data, so there is no direct copy into /rows/ of data. If there had been typical line-termination characters, as recognized by standard text-parsing utilities, the DSPF request would have presented the data as somewhat /readable/ strings of text, as separate /lines/ of data. The first requirement is to insert delimiters which define the /lines/ of data to such utilities. As someone suggested, it may be the tilde character which should be treated as the line delimiter, and thus what should be translated into a line feed or a carriage-return/line-feed combination. That action would allow the text copy utilities to recognize what should be treated as separate lines of text data. If the data is not first modified into such a /standard/ form, it should be processed by a user program rather than a text copy utility [which expects standard line delimiters].
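As a rough illustration of that user-program approach (a sketch only; the function name is made up, and treating the tilde as the segment terminator is an assumption taken from this thread, not anything the data guarantees):

```python
# Hypothetical sketch: append a line feed after each EDI segment
# terminator (assumed here to be the tilde, per the thread) so that
# standard text utilities will see separate /lines/ of data.

def add_line_delimiters(data: bytes, terminator: bytes = b"~") -> bytes:
    """Insert a line feed after every occurrence of the terminator."""
    return data.replace(terminator, terminator + b"\n")

sample = b"ISA*00*0000000000~GS*PD*6111470100~ST*852*045181099~"
print(add_line_delimiters(sample).decode())
```

Once the stream is in that form, an ordinary line-oriented copy should behave as expected.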

If the data need only be separated into 80-byte chunks, without regard to the intended delimiter, to establish /lines/ of text data, then use either FTP or CPYFRMSTMF, which as I recall can break the data up into fixed-length records without needing standard delimiters [via ENDLINFMT(*FIXED)].
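The effect of that fixed-length split can be sketched as follows (an illustrative sketch only; the function name is invented, and blank-padding the short final record is an assumption about how a fixed-record-length file would receive it):

```python
# Hypothetical sketch: cut a stream into fixed 80-byte records with
# no regard for any delimiter, blank-padding the final short record.

def fixed_length_records(data: bytes, reclen: int = 80) -> list[bytes]:
    """Split data into reclen-byte chunks; pad the last chunk with blanks."""
    recs = [data[i:i + reclen] for i in range(0, len(data), reclen)]
    if recs and len(recs[-1]) < reclen:
        recs[-1] = recs[-1].ljust(reclen, b" ")
    return recs
```

Each resulting chunk corresponds to one row of an 80-byte record-format file such as the FILE080 described below.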

Regards, Chuck

KHeeter@xxxxxxxxxxxxx wrote:
I am having a bugger of a time with getting a simple 80 character EDI
file from the IFS to a file called FILE080 in my library with one
field of 80 bytes. I will then ADD it to my Edi inbound queue. I have
done it a zillion times with .csv files, but not this EDI file, I do
not want delimiters. I just want the data to be copied exactly as it
is.

I tried a zillion variations of CPYFRMIMPF and CPYFRMSTMF to no
avail.

Can anyone help?

Here is a snip of the data:

Browse : /home/kathie/FILE.TXT
<<SNIP>>
....+....1....+....2....+....3....+....4....+....5....+....6....+
************Beginning of data**************
ISA*00*0000000000*00*0000000000*08*6111470100 *01*074240599
*080614*074
8*U*00401*000034236*0*P*>~GS*PD*6111470100*074240599*20080614*0748*747*X*004010~
ST*852*045181099~XQ*H*20080613*20080613~N9*DP*059~N9*IA*3094424~N1*RL*TARGET~
<<SNIP>>

Display Attributes

Object . . . . . . : /home/kathie/FILE.TXT
Type . . . . . . . . . . . . . . . . . : STMF
Owner . . . . . . . . . . . . . . . . : HEETERK
System object is on  . . . . . . . . . :   Local
Auxiliary storage pool . . . . . . . . :   1
Object overflowed  . . . . . . . . . . :   No
Coded character set ID . . . . . . . . : 819
<<SNIP>>
Browse  KATHIE/QDDSSRC          FILE080
SEU==>
*************** Beginning of data *********
0009.06 A R @FILE80
0009.07 A FLD080 80

Any help is greatly appreciated... Going Crazy!


