John,
Create a work file with the same layout and load it with SELECT DISTINCT
from the table with duplicates. CREATE TABLE ... AS does both steps in
one statement:

CREATE TABLE mylib/wkfile AS
  ( SELECT DISTINCT fld1, fld2, ... fldn
      FROM alib/dupfile )
  WITH DATA
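
If you then need the deduplicated rows back in the original file, clear
it and copy them in. A minimal sketch, assuming no other job has the
file open and the work file kept the same column order:

DELETE FROM alib/dupfile;       -- or clear the member with CLRPFM
INSERT INTO alib/dupfile
  SELECT * FROM mylib/wkfile;

Once you have verified the record counts, you can drop mylib/wkfile.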



-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of John McKee
Sent: Friday, January 20, 2012 7:45 AM
To: Midrange Systems Technical Discussion
Subject: Removing duplicate records

During testing, I was really careful to clear this work file before each run. I managed to have one of "those" moments and failed to clear it prior to a production run. The result is that every record that should be in the file once is in the file twice.

Is there a way to remove duplicate records with SQL? The duplicates are completely identical. However, if it makes the process any simpler, the account number is guaranteed not to be used twice. Now, of course, had that field been described with the DDS keyword UNIQUE, there wouldn't be an issue.
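
For example, since each account number appears at most twice, keeping
only the copy of each record with the lowest relative record number
might do it. A sketch, assuming the account number column is named
ACCTNO (the real column name is a guess):

-- Delete every copy of a record except the one with the lowest RRN
DELETE FROM alib/dupfile a
WHERE RRN(a) > (SELECT MIN(RRN(b))
                  FROM alib/dupfile b
                 WHERE b.acctno = a.acctno)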

I can see possible ways to fix it, but would rather not have to wonder if I actually made it worse.

Thanks,

John McKee
--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list.
To post a message, email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives at http://archive.midrange.com/midrange-l.



