
Multi-row insert statements when exporting tables.

I saw a post from the distant past that requested the ability to generate multi-row insert statements when exporting data from a SQL Server table. This would be a very nice feature to have as it would save a tremendous amount of time when inserting large data sets.


Hi,


Thanks for the feedback. I have added your vote for this feature.


Best Regards,

Hans

Thank you. This is a feature I would use very often. I actually keep a copy of another database utility for exporting larger tables; it has an option to "ignore identity columns" that is also very useful. I'd like to start using DbVisualizer as my primary interface for database activity... it's a fantastic tool that I really enjoy using.

 

...this shouldn't even be something you need to do.  Get your employer to give you bulk/copy permission.

With no disrespect intended, I fail to see how you could possibly know what I need or what functionality would be beneficial for the tasks I perform. The functions I describe are available in RazorSQL and I make good use of them frequently. If DbVisualizer had said functionality, I would ditch RazorSQL altogether.

 

The only use case for this feature is when you don't have permissions for bulk and/or copy methods. A large file of insert statements is the absolute worst way to load data on nearly every SQL flavor.
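For readers who haven't used them, the bulk/copy methods referred to here are, on SQL Server, BULK INSERT and the bcp utility. A minimal sketch of a bulk load, with hypothetical table and file names and assuming the bulk-load permission mentioned above:

-- Hypothetical table and file names; requires bulk-load permission
-- (for example, membership in the bulkadmin server role).
BULK INSERT dbo.DefaultData
FROM 'C:\install\default_data.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column separator in the CSV file
    ROWTERMINATOR   = '\n',  -- row separator
    FIRSTROW        = 2      -- skip the header row
);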

Thanks Trevor, I think you've made your point. You don't want this feature... I get it!
:o)

 

Aksturgeon,


Your request is certainly appropriate and, as Hans said in his first reply, we've added your vote for it.


Are you looking to export from DbVisualizer and later import using DbVisualizer? If so, there is experimental support for bulk import: export the data in CSV format and then enable the experimental bulk import to load it.


If you are interested in testing this, let me know.


Regards


Roger

The product we sell has a set of tables that need to be initialized with default data. We have SQL scripts to create and populate these tables that must be executed during the installation process. A few of these tables hold thousands of records, and one in particular has around 9,000 rows. The default data often changes between release levels, so I typically script the data out of our QA database as insert statements for inclusion in the installation scripts. By inserting 1,000 records at a time I bring the time needed to populate these tables down from several minutes to a few seconds. Since this is part of an installation process, performance matters but isn't critical; the more important factor is keeping the installation process as simple as possible, with as few steps as possible.

I can certainly post-process the scripts I output from SQL Server so that they insert 1,000 records at a time, but it's much easier to use a tool that simply lets me specify how many records to include in each insert statement.
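To make the requested output concrete, here is a rough sketch using a hypothetical table and columns. SQL Server accepts at most 1,000 row value expressions in a single INSERT ... VALUES statement, so an exporter would start a new statement after every 1,000 rows:

-- Hypothetical table; SQL Server caps a single VALUES list at 1,000 rows,
-- so the exporter emits a new INSERT statement for each batch of 1,000.
INSERT INTO dbo.DefaultData (Id, Code, Description)
VALUES
    (1, 'A01', 'First default row'),
    (2, 'A02', 'Second default row'),
    (3, 'A03', 'Third default row');
-- ... up to 1,000 rows per statement, then a new batch begins:
INSERT INTO dbo.DefaultData (Id, Code, Description)
VALUES
    (1001, 'B01', 'Row 1,001'),
    (1002, 'B02', 'Row 1,002');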

 

That makes sense. Thanks for the clarification.


Regards


Roger

Thanks Roger. It's clear that not everyone sees a benefit in such a request, but the simple fact that SQL Server (and other database managers) allows up to 1,000 rows per insert statement suggests that the vendors certainly anticipated this kind of need.

 
