On 25-Dec-2014 20:14 -0600, Gary Kuznitz wrote:
Thank you very much for the reply...
Comments below.
On 25 Dec 2014 at 16:52, CRPence wrote:
On 25-Dec-2014 12:18 -0600, Gary Kuznitz wrote:
I'm trying to copy a file from the IFS to a DB file. I'm getting
errors on every record. The error is CPF2973
The from-program and to-program were not included in the
information below. The partial symptom string is msgCPF2973
F/QCPIMPRT FM/QCPIMPRT FP/Send_type_msg stmt/15 ... TM/QDBCTHTWRK
I am only working with one program that doesn't call any other
programs. I don't know why the from-program and to-program were not
included in the information.
  The implication is that the information presented, as taken from the 
joblog, was missing information.  The layout of that data is such that 
the Message Identifier (MSGID), the Message Type (MSGTYPE), the 
Severity (SEV), and a number of other details appear across one line, 
and then several lines follow showing more details; i.e. only the 
snippet of lines quoted just below was included:
  From module . . . . . . . . :   QCPIMPRT
  From procedure  . . . . . . :   Send_type_msg
  Statement . . . . . . . . . :   15
  <<SNIP>>
  Thus I was meaning to suggest that what was given as supporting 
information from the joblog was effectively truncated; headless, really.  
The above should instead have appeared in the joblog more like the 
following lines, in which I have included both the heading line and the 
missing detail line as wrapped lines; alternate formatting could be used 
to include all of that information, and I can understand the desire to 
convey the necessary information while avoiding what might be minutes of 
editing to further /beautify/ it:
MSGID      TYPE                    SEV  DATE      TIME
FROM PGM     LIBRARY     INST     TO PGM      LIBRARY     INST
CPF2973    Completion              00   12/26/14  17:54:52.072704 
QCPIMPRT     QSYS        *STMT    UsrPgm       UsrLib     ????
   From module . . . . . . . . :   QCPIMPRT
   From procedure  . . . . . . :   Send_type_msg
   Statement . . . . . . . . . :   15
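  For reference, one way to ensure the complete entries, including 
those heading lines, are available for later review is to log full 
second-level message text and then spool the joblog; e.g. [a sketch 
only, issued from within the job performing the copy]:

   CHGJOB     LOG(4 00 *SECLVL)      /* keep all msgs w/2nd-level text */
   CPYFRMIMPF ...                    /* the failing request            */
   DSPJOBLOG  JOB(*) OUTPUT(*PRINT)  /* spooled file QPJOBLOG          */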
An APAR for v5r2 described a similar issue; at least it noted the
missing replacement text for that message.
  The specific issue with the missing from-file replacement text for 
the truncation message CPF2973, as issued by Copy From Import File 
(CPYFRMIMPF), was not corrected until v6r1.  That specific issue 
originates from the use of FROMSTMF rather than FROMFILE in the given 
scenario.  The problem is that the messaging used for the copy utility's 
truncation error CPF2973 did not properly account for IFS naming [only 
10-byte naming was accommodated] until the correction in the IBM i 6.1 
changes, since which a different message is sent.
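  For anyone curious, the message description, including the 
replacement data fields that should have carried the from-file name, 
can be reviewed directly; e.g. [the CPF-prefixed messages are defined 
in message file QCPFMSG in QSYS]:

   DSPMSGD    RANGE(CPF2973) MSGF(QSYS/QCPFMSG)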
   In the CLP I have:
CHGJOB     CCSID(37)
CPYFRMIMPF
   FROMSTMF('/home/Payroll/ATU_PAYROLL_DATA_FY14_372_398.txt')
   TOFILE(GARY2/MBRPAYATU) MBROPT(*ADD) RCDDLM(*CRLF) STRDLM('"')
   FLDDLM('|')
   FROMRCD(2 *END) ERRRCDFILE(GARY2/MBRPAY1ERS) ERRRCDOPT(*ADD)
   RPLNULLVAL(*FLDDFT)
I am getting no records in the error file.
I don't have any double quotes in the file past the first record.
This is one record of the input file from the IFS:
372|7/12/2013|000027857|Last Name, Brian
J|3926|00001|A3|6/22/2013|000764|SIGNAL
INSPECTOR|A|AWD|AWARD|1|1|20|20
The above is all on one line, i.e. one record.
Does anyone have any idea why I would be getting the error?
PS: V5R3
Is the Copy From Import File (CPYFRMIMPF) running using the v5r2
support; i.e. is there a Data Area (DTAARA) named QCPFRMIMPF in
QSYS with the string value 'CPV5R2'? If so, then allowing the
feature to use the actual V5R3 support might assist; i.e. delete
the data area, or modify the value of the string data, because the
v5r2 code path had an almost identical symptom.
I don't have that DTAARA on the system.
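  [For reference, the check for that data area, and its removal if the 
V5R3 code path is wanted, might look like the following; just a sketch, 
and moot here given the data area was confirmed absent:]

   CHKOBJ     OBJ(QSYS/QCPFRMIMPF) OBJTYPE(*DTAARA)
   DSPDTAARA  DTAARA(QSYS/QCPFRMIMPF)  /* value 'CPV5R2' forces v5r2 path */
   DLTDTAARA  DTAARA(QSYS/QCPFRMIMPF)  /* or CHGDTAARA to alter the value */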
Note that the date data values are in *USA format, but the default
for the command tells the import feature that the date data values
should be *ISO.
Thank you for catching that. I'm not sure the date values are going
to work anyway.
  The feature understands all of the /standard/ formats irrespective of 
the specification, because that is what the SQL understands.  Thus if 
the date values are any of *EUR, *ISO, *USA, or *JIS, then they will be 
understood.  Only _one_ of the non-standard date formats can be 
understood by the SQL, and that choice must be explicitly specified on 
the Date Format (DATFMT) and Date Separator (DATSEP) parameters; i.e. 
if, for example, some of the date values in the separated-values records 
are *MDY and others are *DMY, then only those values consistent with the 
command specification [DATFMT+DATSEP] will be understood.
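  For example, had some of the date values instead been given in a 
non-standard form such as '7/12/13' [two-digit year], the command would 
need the explicit specification; a sketch only, and not applicable to 
the four-digit-year data shown in this scenario:

   CPYFRMIMPF FROMSTMF('...') TOFILE(...) +
              DATFMT(*MDY) DATSEP('/')  /* only *MDY with '/' understood */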
The dates in the input file may have different formats.
One may be: 7/12/2013
Another may be: 12/12/2013
I have changed the date data values to *USA.
  Those two examples are not "different formats" in the Date Format 
(DATFMT) sense; though formatted slightly differently as 'M/DD/YYYY' vs 
'MM/DD/YYYY', both are valid *USA literal [character string] date 
values.  The implication was to specify DATFMT(*USA) explicitly, though 
I suspect [per my prior comments, and even per a test on v5r3] the 
chosen specification should be moot if indeed all records use only *USA 
date literals; it should not matter whether DATFMT(*USA) or the 
defaulted DATFMT(*ISO) is used.
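  Spelled out, that explicit specification would simply be the original 
request with the date format parameter added; shown as a sketch only, 
given the expectation that the choice is moot for this data:

   CPYFRMIMPF FROMSTMF('/home/Payroll/ATU_PAYROLL_DATA_FY14_372_398.txt') +
              TOFILE(GARY2/MBRPAYATU) MBROPT(*ADD) RCDDLM(*CRLF) +
              STRDLM('"') FLDDLM('|') DATFMT(*USA) +
              FROMRCD(2 *END) ERRRCDFILE(GARY2/MBRPAY1ERS) +
              ERRRCDOPT(*ADD) RPLNULLVAL(*FLDDFT)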
In the DDS I didn't specify the Date fields as dates.
  A            TRANSDATE     10          TEXT('Transaction Date')
  A ten-byte alphanumeric should be valid to support any valid date 
literal.  I would not expect that changing it to a Date data type would 
assist.
The command suggests the strings are delimited with the
double quote, but the items in the given record clearly are not.
I wasn't sure what that meant. Does that mean I should use *NONE?
  Not "should use", but "might consider using, according to what is 
desirable".  The command defaults to STRDLM('"'), but the string data 
appears consistently [though only inferred with the one sample row as a 
given] to appear _without delimiters_.  However if some rows do include 
the double-quote as delimiter, for example to ensure that strings can 
contain embedded pipe ['|'] characters, thus to avoid those being 
mistaken as Separator characters, then the STRDLM(*NONE) specification 
would *not* be desirable.  The specified string delimiter designation is 
to suggest to the utility, that _if_ there is a delimiter for the data 
for a character string column, then the specified delimiter should be 
removed from the datum before becoming column data.
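  Put another way, the choice is between the following two 
specifications, and which is correct depends entirely on the data; a 
sketch:

   STRDLM('"')    /* default: a leading and trailing " is stripped from */
                  /* the field, and field delimiters inside the quotes  */
                  /* are treated as data rather than as separators      */
   STRDLM(*NONE)  /* no string delimiter; every " is ordinary data      */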
Performing the request from the command-line in a new job rather
than via the CLP and in a job that has performed other work might
exhibit a different effect; may be worth a test attempt.
That same request made with the given record on the PUB1.DE public
v5r3 system functioned without any errors.
If none of that is helpful, then ...
That was very helpful.  It's fine running now.
Thank you tremendously.  You are a life saver.
  Neither the change to a Date data type nor the string delimiter 
change seems likely to have been what assisted.  I am not sure whether 
running the request apart from any other activity performed earlier in 
the job or outside the CLP, or even a changed query [replacing a 
problematic cached query] due to the change in the target output 
database file, might have been what was necessary to progress; i.e. the 
positive effect may be more chance than anything else.
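  [If the symptom does return, one simple way to get that isolation is 
to submit just the copy request to a fresh batch job; a sketch only, 
with the job name IMPTEST purely illustrative, and with the CCSID 
parameter standing in for the CHGJOB CCSID(37) done in the CLP:]

   SBMJOB     JOB(IMPTEST) CCSID(37) LOG(4 00 *SECLVL) +
              CMD(CPYFRMIMPF +
                 FROMSTMF('/home/Payroll/ATU_PAYROLL_DATA_FY14_372_398.txt') +
                 TOFILE(GARY2/MBRPAYATU) MBROPT(*ADD) RCDDLM(*CRLF) +
                 STRDLM('"') FLDDLM('|') FROMRCD(2 *END) +
                 ERRRCDFILE(GARY2/MBRPAY1ERS) ERRRCDOPT(*ADD) +
                 RPLNULLVAL(*FLDDFT))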
The means to create the target of the import and the error record
file were not included, but those could be helpful to identify the
issue. Also helpful would be the first few stream records of the
STMF, regardless of the /first/ record having been asked to be skipped;
all the more relevant, given "every record" gets the error. A sample of
just a few redacted records that still exhibit the errors would be
ideal, along with the hexadecimal code points of the data; plus the
CCSID of the STMF and the CCSID of the fields of the DBF.
  If the problem returns and/or persists in another environment, then 
the additional information [e.g. the entire DDS, not just the Date 
field] describing the full scenario would be most beneficial, to enable 
others to assist with [understanding and recovering from] the failing 
scenario; a sketch of commands to gather some of that detail follows.
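  [For gathering those details, something like the following might 
serve; a sketch only, where the RAWDATA file name is purely illustrative 
and presumes a suitably wide source physical file already exists to 
receive the unconverted bytes:]

   DSPFFD     FILE(GARY2/MBRPAYATU) OUTPUT(*PRINT)  /* DBF field CCSIDs */
   WRKLNK     OBJ('/home/Payroll/ATU_PAYROLL_DATA_FY14_372_398.txt')
              /* option 8=Display Attributes shows the STMF CCSID       */
   CPYFRMSTMF FROMSTMF('/home/Payroll/ATU_PAYROLL_DATA_FY14_372_398.txt') +
              TOMBR('/QSYS.LIB/GARY2.LIB/RAWDATA.FILE/RAWDATA.MBR') +
              MBROPT(*REPLACE) CVTDTA(*NONE)
              /* then DSPPFM of RAWDATA, hexadecimal view, to review    */
              /* the code points of the raw data                        */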