Hi, Pete

The second-level text on the message gives you the possible reasons for
this error. It's worth taking a look (F1 is your best friend). In this
case, SELECT * over your join returns all of a's columns plus the four
columns from dups, so the column count doesn't match the table's.
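To see the mismatch (a sketch - "etc" stands for your remaining columns):

  -- ptmp408a expects:        tmlssn, tmljob, tmlped, tmldat, tmllin, etc
  -- select * over the join:  tmlssn, tmljob, tmlped, tmldat, tmllin, etc,
  --                          tmlssn, tmljob, tmlped, tmldat  <- extra, from dups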

There are at least four options:

1. Don't do the CRTDUPOBJ. Instead, from interactive SQL, press F13 and
take option 1 to change the output type to a file. The first time, set it
to create the file, then run your full SELECT by itself (not as a
subselect) - since the output file is created to match the result set,
the column counts take care of themselves. On later runs, set the option
to replace the member, or whatever works. A sketch follows.
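For instance, your own statement run on its own (I've used a.* so the
output file carries just the table's columns; plain * would also work
here, it would just drag dups' four columns along):

  select a.*
  from ppay408a a inner join
       (select tmlssn, tmljob, tmlped, tmldat
          from ppay408a
         group by tmlssn, tmljob, tmlped, tmldat
        having count(*) > 1) as dups
    on a.tmlssn = dups.tmlssn
   and a.tmljob = dups.tmljob
   and a.tmlped = dups.tmlped
   and a.tmldat = dups.tmldat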

2. Don't do the CRTDUPOBJ. Instead, put the SELECT, by itself, into a
*QMQRY. You can use STRQM in SQL mode rather than PROMPT mode, or put the
statement into a RCDLEN(91) source file and use CRTQMQRY. Then run it
with STRQMQRY, which has an OUTPUT(*OUTFILE) option and has
OUTMBR(*FIRST *REPLACE) as well, just like CPYF. A sketch follows.
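Assuming a source file QQMQRYSRC and member DUPS408A in a library MYLIB
(all names mine - substitute your own), it looks something like:

  CRTSRCPF FILE(MYLIB/QQMQRYSRC) RCDLEN(91)
  /* put the SELECT above into member DUPS408A, then: */
  CRTQMQRY QMQRY(MYLIB/DUPS408A) SRCFILE(MYLIB/QQMQRYSRC) SRCMBR(DUPS408A)
  STRQMQRY QMQRY(MYLIB/DUPS408A) OUTPUT(*OUTFILE) +
           OUTFILE(MYLIB/PTMP408A) OUTMBR(*FIRST *REPLACE)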

3. Don't use CRTDUPOBJ. (Is this getting boring? There's nothing wrong
with it, as far as it goes, but it is not necessary.) Get Martin Rowe's
excellent EXCSQL command at <www.dbg400.net/excsql.html>. It lets you run
an SQL statement from a command line, and it has OUTPUT options as well,
since it's based on QMQRYs. There are others - Midrange Computing
published one, I wrote my own once, and Buck Calabro has written one.

4. Use CRTDUPOBJ, and fix the statement rather than the table: change
SELECT * to SELECT a.* so only ppay408a's columns come through, then use
your INSERT INTO. (You could also run an ALTER TABLE in interactive SQL
to make the two sides match, but a.* is less surgery.) A sketch follows.
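The only change from your statement is the a.* (this assumes ptmp408a
really is a CRTDUPOBJ copy of ppay408a):

  insert into ptmp408a
  select a.* from ppay408a a inner join
  (select tmlssn, tmljob, tmlped, tmldat from ppay408a
   group by tmlssn, tmljob, tmlped, tmldat
   having count(*) > 1) as dups
  on a.tmlssn = dups.tmlssn
  and a.tmljob = dups.tmljob
  and a.tmlped = dups.tmlped
  and a.tmldat = dups.tmldat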

I strongly recommend QM queries. In the long term, they are more flexible
than RUNSQLSTM or Query/400 queries. There are QM forms that give you the
layout for reports, and the forms offer several options that QRYDFNs
don't. And, perhaps best of all, you can use substitution variables -
QRYDFNs have those too, but they are not callable from CL with the values
supplied. The secret of Martin's command, et al., is a QMQRY consisting
of nothing but substitution variables. A sketch follows.
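To give the flavor (names here are made up): the QMQRY source is nothing
but substitution variables, so any statement can be passed in at run time.

  Member RUNSQL in QQMQRYSRC, one line:
    &A &B &C

  Then from CL:
    STRQMQRY QMQRY(MYLIB/RUNSQL) +
             SETVAR((A 'DELETE FROM MYLIB.PTMP408A') (B ' ') (C ' '))

The values on SETVAR have a modest length limit (55 characters, if memory
serves), which is why the wrapper commands split a long statement across
several variables like this.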

HTH
Vern

At 08:51 PM 5/21/2004, you wrote:
This is the classic duplicate record issue where my customer posted data
twice and now has some duplicate data in a file.  The file is keyed such
that a "duplicate" really doesn't occur, because there is a line number in
the record that can be incremented by the RPG program when a duplicate is
entered (there are reasons for this... which have come back to haunt me).

So I have a table (ppay408a) that looks like:

tmlssn, tmljob, tmlped, tmldat, tmllin, etc.

A "duplicate" is when the first four columns have identical values and the
line number (tmllin) is different.  So I wrote an sql statement that shows
me the duplicate records:

select * from ppay408a a inner join
(select tmlssn,tmljob,tmlped,tmldat  from ppay408a
group by tmlssn,tmljob,tmlped,tmldat
having count(*) > 1) as dups
on a.tmlssn = dups.tmlssn
and a.tmljob = dups.tmljob
and a.tmlped = dups.tmlped
and a.tmldat = dups.tmldat

Sweet! All my duplicated records are there.  So I want to write them to a
file; I change the statement to:

insert into ptmp408a
select * from ppay408a a inner join
(select tmlssn,tmljob,tmlped,tmldat  from ppay408a
group by tmlssn,tmljob,tmlped,tmldat
having count(*) > 1) as dups
on a.tmlssn = dups.tmlssn
and a.tmljob = dups.tmljob
and a.tmlped = dups.tmlped
and a.tmldat = dups.tmldat

And I get an SQL error (SQL0117) saying "Statement contains wrong number of
values."

Is that because of the subselect?  Something else?  ptmp408a was created
using the CRTDUPOBJ command, so the tables are identical, aren't they?

Second question: is there a better way to go about this (without writing a
program to do it... which is my alternative)?

Thanks,

Pete Helgren
Value Added Software, Inc.
801.581.1154 x202


