Hi Larry,

You shouldn't have any problem if you use a standard forward-only ResultSet. The values are not accessible once you move to the next row and should be available for garbage collection with a decent driver (I'm generalizing here). There is a default fetch size, which determines the number of rows downloaded per trip to the database; you can change it to suit your needs. Use Statement.getFetchSize() and Statement.setFetchSize() for control. There are also other ways, such as connect, grab a set, disconnect, process, repeat, but unless connections are scarce, you probably don't need to go that route.

Joe Sam

Joe Sam Shirah - http://www.conceptgo.com
conceptGO - Consulting/Development/Outsourcing
Java Filter Forum: http://www.ibm.com/developerworks/java/
Just the JDBC FAQs: http://www.jguru.com/faq/JDBC
Going International? http://www.jguru.com/faq/I18N
Que Java400? http://www.jguru.com/faq/Java400

----- Original Message -----
From: "Larry" <larryhytail@xxxxxxxxx>
To: <java400-l@xxxxxxxxxxxx>
Sent: Tuesday, September 13, 2005 7:27 PM
Subject: SQL Question

> I want to do an SQL in a java program (JDBC). It's something I'd run
> once a month, but I know it will generate a lot of records in the
> result set. If I just did the query in one shot, I'd probably freeze
> my computer due to insufficient memory.
>
> Is there a way to just bring down 1000 or so records at a time, with
> each repeated call of the SQL getting the next 1000, and so on until
> I hit the end of what would have been one big result set?
>
> The method that lets you do this, is it efficient for the backend
> database (iSeries), or does it actually run the whole query each time
> but just return the 1000 records you need for that call?
>
> Any advice?
>
> Thanks,
>
> Larry
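[Archive note: a minimal sketch of the approach Joe Sam describes, assuming the IBM Toolbox for Java (jt400) JDBC driver; the connection URL, credentials, library, and table names below are placeholders, not values from the thread.]

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class MonthlyReport {
    public static void main(String[] args) throws SQLException {
        // Placeholder iSeries connection details.
        String url = "jdbc:as400://mysystem";

        try (Connection con = DriverManager.getConnection(url, "user", "password");
             // A forward-only, read-only cursor: rows already read can be
             // garbage collected once you advance past them.
             Statement stmt = con.createStatement(ResultSet.TYPE_FORWARD_ONLY,
                                                  ResultSet.CONCUR_READ_ONLY)) {

            // Hint to the driver to bring back roughly 1000 rows per trip
            // to the database; the driver may adjust this value.
            stmt.setFetchSize(1000);

            try (ResultSet rs = stmt.executeQuery("SELECT * FROM MYLIB.MYTABLE")) {
                while (rs.next()) {
                    process(rs); // handle one row at a time
                }
            }
        }
    }

    private static void process(ResultSet rs) {
        // Placeholder for whatever per-row work the report needs.
    }
}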