Answered

Import Database - Not possible

I'm using DBVis 9.5 with MySQL databases.


To import a complete database, the recommendation in https://support.dbvis.com/support/discussions/topics/1000077756 seems to be to use the SQL Commander.


If I run my exported SQL file (~3 GB) in the Commander using the @run command, a popup appears telling me to use "Continue w/o preprocessing", which ends in a seemingly endless import process (I canceled it after 45 minutes, when only 10 of 88 tables had been imported). Running the same file with "Continue normally" ends in ~15-20 minutes of preprocessing and then an "out of memory" exception. So back to the topic: is there any possibility to import a database correctly? Other DB management tools offer an import wizard for databases, not only for tables.
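
For reference, the call in the SQL Commander looks roughly like this (the file path is just a placeholder for my actual dump file):

    -- load and execute the exported dump via the @run client-side command
    @run /path/to/database_dump.sql;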


Best Answer

Hi, Oliver


So I interpret this as: you are no longer getting the "out of memory" exception. The issue is now that the export format should be more "efficient", making the import quicker.


We do agree that this would be a good improvement and will add your vote for this.  


Best Regards, Ulf



It seems like I didn't describe the problem correctly...

To make it easier: the mentioned way to import databases works fine for smaller database dump files. The import of a 3 GB file, however, doesn't really work, although I set the memory up to 4096 MB. Is there any possibility to import bigger dumps correctly?
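
For clarity, by "memory" I mean the Java max heap of DbVisualizer. As far as I understand, raising it to 4096 MB comes down to an -Xmx setting along these lines (typically in a vmoptions file of the installation; the exact file name and location may differ per install and OS):

    -Xmx4096m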

Hi, Oliver


Sorry for the late answer.


Have you tried the following before hitting the run button?

  • In the Log tab, uncheck the "Preprocess script" check-box. I think you did this implicitly by choosing "Continue w/o preprocessing", but try unchecking the check-box from the beginning anyway.
  • In the Log tab, specify to log to file ("Log to file" instead of "Log to GUI").

I just tried here with a 0.5 GB file and it worked fine. When using "Log to GUI" I was also getting an out of memory error.

Best Regards
Ulf

Hi Ulf,


thanks for your answer. I have tried it with the check-box unchecked right from the beginning (see attachments).

The import has now been running for roughly 60 minutes and is still importing one of the first (bigger) tables; 5 more big tables are still to follow.

This can't be the solution, especially for a specialized DB management tool. We're talking about 3 GB, not about terabytes of data...

DBVis2.png (34.1 KB)
DBVis1.png (129 KB)

To be more constructive:

It seems to be a "problem" with the export file.

The export generated by dbvis inserts every row as a single statement.


I've tried an export made with another tool that bundles inserts into combined statements (each statement max 1024 kB) and ran this file in DbVis. The whole import (executed in DbVis) was now finished in 25 minutes. This is logically comprehensible because the server now has to execute far fewer statements than before.
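
To illustrate what I mean (table and column names are made up), the DbVis export produces one INSERT per row, while the other tool bundles many rows into one multi-row INSERT:

    -- one statement per row (what the DbVis export produces):
    INSERT INTO orders (id, customer_id, total) VALUES (1, 10, 99.90);
    INSERT INTO orders (id, customer_id, total) VALUES (2, 11, 12.50);

    -- bundled multi-row insert (what the other tool produces, up to ~1024 kB per statement):
    INSERT INTO orders (id, customer_id, total) VALUES
      (1, 10, 99.90),
      (2, 11, 12.50),
      (3, 12, 7.25);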


So the problem could be solved by offering an option, maybe called "sum up insert statements". Is such a feature planned for future updates?
