[This topic is migrated from our old forums. The original author name has been removed]
Hi Roger,
Long-time Oracle user just trying 9.1.6 with PostgreSQL for the first time. I am attempting to save a very large result set directly to a CSV file with @export, and have increased JVM memory to 1024m. DbVis quickly starts consuming memory while running the query and within a few seconds pops up the out of memory dialog box.
Was looking for a setting such as "buffer all rows" for MySQL, but didn't find one. Is there presently a solution to this problem?
Doug
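For reference, a minimal SQL Commander script of the kind described above might look like the following sketch; the file path and table name are placeholders, and the exact @export parameters are assumptions based on DbVisualizer's client-side command syntax:

@export on;
@export set filename="/tmp/big_result.csv" format="csv";
select * from big_table;
@export off;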
Hi Doug,
As far as I know, the PostgreSQL driver does not have any "buffer all rows" setting, so the @export command should only read one row at a time into memory. Does the query include BLOB/CLOB columns that may hold very large values?
Please enable debugging for DbVisualizer in Tools->Debug Window and mail or post the output from running the @export command.
Best Regards,
Hans
Roger Bjärevall said over 10 years ago:
[This reply is migrated from our old forums.]
Re: PostgreSQL @export query - out of memory
Doug,
The PostgreSQL driver buffers all rows in the result set before handing them over to DbVisualizer. To solve this, add the following driver property in Connection Properties / Driver Properties by adding a new row (parameter = value):
java.sql.statement.setFetchSize = 10
Click Apply and re-connect. In the SQL Commander where you are running the @export script, make sure it is not in Auto Commit mode, as the driver will still buffer the entire result set when auto commit is on.
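To illustrate what that driver property and the auto commit setting do inside the driver, here is a minimal standalone JDBC sketch; the connection URL, credentials, and table name are placeholders, not anything from this thread:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class FetchSizeDemo {
    public static void main(String[] args) throws Exception {
        // Placeholder URL and credentials for illustration only.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "secret")) {
            // With auto commit on, the PostgreSQL driver ignores the fetch
            // size and buffers the entire result set in memory.
            conn.setAutoCommit(false);
            try (Statement stmt = conn.createStatement()) {
                // Stream rows through a server-side cursor, 10 at a time --
                // the same effect as the driver property above.
                stmt.setFetchSize(10);
                try (ResultSet rs = stmt.executeQuery("select * from big_table")) {
                    while (rs.next()) {
                        // Process one row at a time; memory use stays flat.
                    }
                }
            }
            conn.commit();
        }
    }
}

The reason auto commit matters is that the driver only streams rows through a server-side cursor, and committing closes any open cursors; with auto commit on, the fetch size is therefore silently ignored.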
I hope this helps.
Regards
Roger
anonymous said over 10 years ago:
[This reply is migrated from our old forums. The original author name has been removed]