On 10 Apr 2013 12:18, Raul A. Jager W. wrote:
It is very likely that the DDL files will initialize the field with
the default, eliminating the 4040 problem, or at least, it will fail
in the program that writes the bad data, making it easier to fix.
The latter must be the case, i.e. the WRITE will fail; otherwise there is
a defect in the definition of the file. For example, on v5r3 there was
a defect whereby columns defined as DECIMAL(32) through DECIMAL(63)
were being generated without the [proper] BCD validation code.
I /assume/ the problem has been corrected in a later release [and
perhaps in a more recent PTF for v5r3 than what I have], such that the LIC
DB column validation code would be generated for any new columns and
re-generated for any defective column definitions that already exist.
IIRC I alluded to something similar in a past message, but I do not
recall whether anyone tested and reported back whether the defect still
exists on any newer releases. Whether files that were created with the
defect and have remained on disk\DASD from a prior release still exhibit
it is much more difficult to test. Here is a script that anyone could use
to test [replace the ExecCmd of the CL request as appropriate; e.g. issue
the CPYF at a command line outside of the SQL environment]:
create table qtemp/bigdec (d decimal(33))
; -- Table BIGDEC created in QTEMP.
create table qtemp/bigdecbd (d char(17))
; -- Table BIGDECBD created in QTEMP.
insert into qtemp/bigdecbd values(' ')
; -- 1 rows inserted in BIGDECBD in QTEMP.
call execcmd('cpyf qtemp/bigdecbd qtemp/bigdec fmtopt(*nochk)
mbropt(*replace) fromrcd(1)')
; -- CALL statement complete.
select * from qtemp/bigdec
; -- report from above SELECT in STRSQL with *LOCAL connection:
D
++++++++++++++++++++++++++++++++++++++++++++
; -- the '+' plus symbols per: msgMCH1202 f/PackedEdpd t/QQURB
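For anyone curious why seventeen blanks trip the validation, here is a
rough sketch in Python (my own illustration, not IBM's actual LIC code)
of what packed-decimal (BCD) validation checks. A DECIMAL(33) column
occupies 17 bytes; every nibble except the last must be a decimal digit,
and the last nibble is the sign:

```python
def is_valid_packed(data: bytes) -> bool:
    """Return True if `data` is a well-formed packed-decimal value.

    Every nibble except the last must be a digit (0-9); the final
    nibble is the sign and must be 0xA-0xF (C = +, D = -, F = unsigned).
    """
    nibbles = []
    for b in data:
        nibbles.append(b >> 4)      # high nibble
        nibbles.append(b & 0x0F)    # low nibble
    *digits, sign = nibbles
    return all(d <= 9 for d in digits) and sign >= 0xA

# An EBCDIC blank is x'40', so seventeen blanks give digit nibbles
# 4,0,4,0,... -- all valid digits -- but a sign nibble of 0x0, which
# is invalid; hence msgMCH1202 when the value is interpreted.
blanks = b"\x40" * 17
print(is_valid_packed(blanks))       # False: sign nibble 0x0 is invalid
print(is_valid_packed(b"\x12\x3C"))  # True: digits 123, sign C (+)
```

With the defect present, this check was simply never generated for the
wide DECIMAL columns, so the blanks reached disk unchallenged and only
failed later when something tried to edit the value for display.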