Oliver Kruse
I'm using DBVis 9.5 on MySQL databases.
To import a complete database, https://support.dbvis.com/support/discussions/topics/1000077756 recommends using the SQL Commander.
If I run my exported SQL file (~3 GB) in the SQL Commander using the @run command, a popup appears telling me to use "Continue w/o preprocessing", which ends in an endless import process (I canceled it after 45 minutes, when only 10 of 88 tables had been imported). Running the same file with "Continue normally" ends in ~15-20 minutes of preprocessing and then an "out of memory" exception. So, back to topic: is there any possibility to import a database correctly? Other DB management tools offer an import wizard for whole databases, not only for tables.
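For context, the @run command referred to here is the SQL Commander's client-side command for executing a script file; a call looks roughly like this (the path is a placeholder, and the exact syntax should be checked against the DbVisualizer users guide):

```
@run /path/to/dump.sql;
```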
Best Answer
Ulf Weiland said over 8 years ago
Hi, Oliver
So I interpret this as meaning you are no longer getting the "out of memory" exception. The issue now is that the export format should be more "effective", making the import quicker.
We agree that this would be a good improvement and will add your vote for it.
Best Regards, Ulf
Oliver Kruse
It seems like I didn't describe the problem correctly...
To make it easier: the mentioned way to import databases works fine for smaller database dump files. The import of a 3 GB file, however, doesn't really work, although I set the memory up to 4096 MB. Is there any possibility to import bigger dumps correctly?
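As background to the memory setting mentioned here: DbVisualizer runs on the JVM, and the limit being raised is the maximum heap size. In a standard install this is usually an -Xmx line in a .vmoptions file next to the launcher; the file name below follows the common install4j convention and may differ per installation:

```
# dbvis.vmoptions — one JVM option per line
-Xmx4096m
```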
Ulf Weiland
Hi, Oliver
Sorry for the late answer.
Have you tried the following before hitting the Run button?
In the Log tab, uncheck the "Preprocess script" checkbox. I think you did this implicitly by choosing "Continue w/o preprocessing", but try unchecking the checkbox from the beginning anyway.
In the Log tab, set logging to "Log to File".
I just tried this here with a 0.5 GB file and it worked fine. When using "Log to GUI" I was also getting an out of memory error.
Best Regards
Ulf
Oliver Kruse
Hi Ulf,
thanks for your answer. I have tried it, unchecking the checkbox right from the beginning (see attachment).
The import has now been running for roughly 60 minutes and is still importing one of the first (bigger) tables, with 5 more big tables to follow.
This can't be the solution, especially for a specialized DB management tool. We're talking about 3 GB, not about terabytes of data...
Oliver Kruse
To be more constructive:
It seems to be a "problem" with the export file.
The export generated by DbVis inserts every row as a single statement.
I've tried an export made with another tool that sums up inserts into bundled statements (each statement at most 1024 kB) and ran this file in DbVis. The whole import (executed in DbVis) finished in 25 minutes. This is logically comprehensible, because the server now has to execute fewer statements than before.
So the problem could be solved by offering an option, maybe called "sum up insert statements". Is such a feature planned for future updates?
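The batching Oliver describes can be sketched in a short script. This is an illustration only, not a DbVisualizer feature: the function name is mine, it assumes one complete statement per line (as in a row-per-line dump), and it batches by row count rather than by the 1024 kB statement size mentioned above.

```python
def batch_inserts(lines, max_rows=1000):
    """Merge consecutive single-row INSERTs into multi-row INSERTs.

    INSERT INTO t VALUES (1);     becomes    INSERT INTO t VALUES (1),(2);
    INSERT INTO t VALUES (2);
    """
    out = []
    prefix, values = None, []

    def flush():
        # Emit the batch collected so far as one multi-row statement.
        if prefix is not None:
            out.append(prefix + " VALUES " + ",".join(values) + ";")

    for line in lines:
        head, sep, tail = line.strip().rstrip(";").partition(" VALUES ")
        if sep and head.upper().startswith("INSERT INTO"):
            if head != prefix or len(values) >= max_rows:
                flush()                 # table changed or batch is full
                prefix, values = head, []
            values.append(tail)
        else:
            flush()                     # non-INSERT line: pass through as-is
            prefix, values = None, []
            out.append(line)
    flush()
    return out
```

Fewer, larger statements mean fewer per-statement round trips and less parsing overhead on the server, which matches the speedup reported above.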
Ulf Weiland
Hi, Oliver
So I interpret this as meaning you are no longer getting the "out of memory" exception. The issue now is that the export format should be more "effective", making the import quicker.
We agree that this would be a good improvement and will add your vote for it.
Best Regards, Ulf