1) If the table is journaled, make sure you use commitment control (or
stop journaling).
2) Don't write rows one at a time; use a multi-row INSERT instead (a
combined sketch follows the example below):
INSERT INTO <lib.file> (field1, field2, etc)
VALUES (value1a, value2a, etc),
       (value1b, value2b, etc),
       (value1c, value2c, etc),
       <....>
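A minimal sketch putting both suggestions together. The table name
MYLIB.BIGFILE and columns FIELD1/FIELD2 are hypothetical placeholders,
not anything from the thread; substitute your own library and file:

    -- Hypothetical table, for illustration only.
    -- Run under commitment control (e.g. compile with COMMIT(*CHG))
    -- so journal entries are grouped rather than forced to disk
    -- one row at a time.

    INSERT INTO MYLIB.BIGFILE (FIELD1, FIELD2)
      VALUES ('A', 1),        -- one statement...
             ('B', 2),        -- ...carrying many rows
             ('C', 3);

    COMMIT;                   -- commit periodically, e.g. every few
                              -- thousand rows, not after every insert

Batching the commits amortizes the journal force, and the multi-row
VALUES list cuts the per-statement overhead.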
Charles
On Tue, Feb 17, 2009 at 4:06 PM, Gqcy <gmufasa01@xxxxxxxxx> wrote:
I have a process that just inserts rows: INSERT INTO <lib.file>
(field1, field2, etc) VALUES (value1, value2, etc).
Things were great... until I got about 400k records in the file; then,
when I wanted to do a lot of writes at nearly the same time,
performance died.
I see something in the SQL monitor about doing a "table scan" before
each INSERT.
How do I NOT do this? (if I can)
I have no key on this physical.
Should I create this file differently?
Does someone have the "perfect example" of a process that only writes to
a BIG FILE?
Thanks
Gerald