I am sure I have done this before, but it escapes me and I cannot find the
code where I did it.



I have a file I need to create, stripping out duplicate records. The problem
is that my quote file contains overlapping quotes. Example:



Quote #  Vendor  Part Number  Dates from / to           Price
1234     abcd    Item1        01/01/2012 to 09/30/2012  10.00
2345     abcd    item1        07/01/2012 to 12/31/2012  11.00



For the application I have, I need to select ANY one of the two quotes
above; it does not matter which.
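
In other words, the output should contain one row per company/vendor/part
key, whichever of the overlapping quotes supplies it. As a sketch of the
idea only (it reuses the column names from my SQL further down; MIN() just
picks one value per key arbitrarily):

    -- one row per key, regardless of how many overlapping quotes exist
    SELECT QTCO, QTVEND, QTPN, MIN(QTDTEF) AS QTDTEF
    FROM myLib.QUOTE
    GROUP BY QTCO, QTVEND, QTPN;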



I have created a temporary file, QUOTE2, and am using an INSERT to fill
QUOTE2 with unique records.
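
(One way to create such a work file, shown here only as a sketch: it
assumes QUOTE2 should mirror QUOTE's record layout, which may not match my
actual file.)

    -- assumes QUOTE2 has the same columns as QUOTE
    CREATE TABLE QTEMP.QUOTE2 LIKE myLib.QUOTE;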



The command is:

    INSERT INTO QTEMP.QUOTE2 Q2
      ( SELECT A3.QTCO, A3.QTVEND, A3.QTPN, A3.QTPN
        FROM myLib.QUOTE A3
        WHERE CURRENT DATE BETWEEN A3.QTDTEF AND A3.QTDTEX
          AND A3.QTCO || A3.QTVEND || A3.QTPN NOT IN
            ( SELECT Q1.QTCO || Q1.QTVEND || Q1.QTPN
              FROM QTEMP.QUOTE2 Q1 ) );

The difference here is that I am trying to insert records into the same
file that I am checking against for duplicates (QTEMP.QUOTE2).



My result still contains the duplicates. It appears the system builds a
work file before it starts the INSERT. Because that work file starts out
empty, the NOT IN has nothing to check against, so every record gets
written into QUOTE2. I would like the check against QUOTE2 to be made for
every row as it is inserted.
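
One way around this, which avoids referencing the target file at all, is to
do the de-duplication inside the SELECT itself, for example with
ROW_NUMBER(). The following is only a sketch: it reuses the column names
from my statement above (including its four-column select list) and assumes
a release of DB2 for i with OLAP function support:

    INSERT INTO QTEMP.QUOTE2
      SELECT T.QTCO, T.QTVEND, T.QTPN, T.QTPN  -- select list as above
      FROM ( SELECT A3.QTCO, A3.QTVEND, A3.QTPN,
                    ROW_NUMBER() OVER
                      ( PARTITION BY A3.QTCO, A3.QTVEND, A3.QTPN
                        ORDER BY A3.QTDTEF ) AS RN
             FROM myLib.QUOTE A3
             WHERE CURRENT DATE BETWEEN A3.QTDTEF AND A3.QTDTEX ) T
      WHERE T.RN = 1;

Because each key gets row number 1 exactly once, every key appears only
once in the result set, and the INSERT never needs to look back at what it
has already written. A single statement like this also runs unchanged under
RUNSQLSTM.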



I need a new approach, or something similar to a DECLARE CURSOR with the
SENSITIVE option.



My code runs in a script through RUNSQLSTM, so I am limited in what the
script can do.





TIA

Darryl Freinkel | Assignment 400 Group, Inc.



